WorldWideScience

Sample records for validation methods included

  1. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  2. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

    The present work describes the methodology followed for the application of ISO 17025 standards in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to achieve the management part. For the technical requirements, internal method validation was applied through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty, while participation in IAEA proficiency tests assures the external method validation, especially since the gamma spectroscopy laboratory is a member of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). Some of these results are presented in this paper. (author)
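The minimum detectable activity (MDA) mentioned in this record is commonly estimated with Currie's formula. A minimal sketch of that calculation, with all numeric values purely illustrative (not taken from the paper):

```python
import math

def minimum_detectable_activity(background_counts, efficiency, intensity, live_time_s, mass_kg):
    """Currie's approximation for the minimum detectable activity (Bq/kg).

    LD = 2.71 + 4.65 * sqrt(B)   (detection limit in counts, ~95% confidence)
    MDA = LD / (efficiency * emission probability * live time * sample mass)
    """
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld_counts / (efficiency * intensity * live_time_s * mass_kg)

# Illustrative values: 400 background counts under the peak, 2% absolute
# efficiency, 85% emission probability, 60000 s count time, 0.5 kg sample.
mda = minimum_detectable_activity(400, 0.02, 0.85, 60_000, 0.5)
```

With these assumed inputs the MDA comes out below 0.2 Bq/kg; in practice the background, efficiency and count time are taken from the actual spectrum and calibration.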

  3. Human Factors methods concerning integrated validation of nuclear power plant control rooms; Metodutveckling foer integrerad validering

    Energy Technology Data Exchange (ETDEWEB)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia (Swedish Defence Research Agency, Information Systems, Linkoeping (Sweden))

    2010-02-15

    The frame of reference for this work comprised existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which measurement methods should be used, and how the methods are affected by changes in the control room. The report brings up different questions concerning the validation process for discussion. Supplementary measurement methods for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations on the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems.

  4. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required, and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  5. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation scheme to evaluate the reliability of land cover products, comprising two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation, which needs no reference data, can facilitate validation and reflect the change trends of reliabilities to some extent.
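The "traditional accuracy assessment" this abstract contrasts with is usually a confusion-matrix analysis. A minimal sketch of overall accuracy and Cohen's kappa from such a matrix (the sample counts below are hypothetical):

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = mapped class, columns = reference class)."""
    n = sum(sum(row) for row in confusion)
    diag = sum(confusion[i][i] for i in range(len(confusion)))
    overall = diag / n
    # Chance agreement estimated from the row and column marginals
    pe = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(len(confusion))
    ) / n**2
    kappa = (overall - pe) / (1 - pe)
    return overall, kappa

# Hypothetical 2-class validation sample (100 reference points)
cm = [[45, 5],
      [10, 40]]
overall, kappa = accuracy_and_kappa(cm)
```

Reliability indicators of the kind the paper proposes would supplement, not replace, these per-map accuracy figures.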

  6. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed, and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
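Step (vi), item analysis with difficulty and discrimination, can be sketched with classical test theory statistics. A minimal illustration on hypothetical 0/1-scored responses (the upper-minus-lower-thirds discrimination index is one common variant, not necessarily the one the authors recommend):

```python
def item_analysis(responses):
    """Classical item analysis for a 0/1-scored knowledge questionnaire.

    responses: list of respondents, each a list of 0/1 item scores.
    Returns per-item difficulty (proportion correct) and a simple
    discrimination index (upper-third minus lower-third proportion correct).
    """
    totals = [sum(r) for r in responses]
    order = sorted(range(len(responses)), key=lambda i: totals[i])
    third = max(1, len(responses) // 3)
    lower, upper = order[:third], order[-third:]
    n_items = len(responses[0])
    difficulty = [sum(r[j] for r in responses) / len(responses) for j in range(n_items)]
    discrimination = [
        sum(responses[i][j] for i in upper) / third
        - sum(responses[i][j] for i in lower) / third
        for j in range(n_items)
    ]
    return difficulty, discrimination

# Hypothetical scored responses: 6 respondents, 3 items
data = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]
difficulty, discrimination = item_analysis(data)
```

Items with near-zero (or negative) discrimination are candidates for removal during scale purification.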

  7. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    Science.gov (United States)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been applied in an extremely wide range of areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a binary Cu-Au alloy of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested to obtain accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.

  8. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. ... approach: 1) Identify needs & requirements for IAT. 2) Develop IAT conceptual framework. 3) Validate IAT methods. 4) Develop applications materials.

  9. OWL-based reasoning methods for validating archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around one fifth of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals different patterns of errors in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. A chromatographic method validation to quantify tablets Mephenesine of national production

    International Nuclear Information System (INIS)

    Suarez Perez, Yania; Izquierdo Castro, Adalberto; Milian Sanchez, Jana Daria

    2009-01-01

    The authors validated an analytical method based on high-performance liquid chromatography (HPLC) for the quantification of Mephenesine in recently reformulated 500 mg tablets. With regard to its application in quality control, validation included the following parameters: linearity, accuracy, precision, and selectivity. Results were satisfactory within the 50-150 % range. For its use in subsequent chemical stability studies, stability-indicating selectivity and sensitivity were assessed. The estimated detection and quantification limits were appropriate, and the method was selective with respect to the possible degradation products. (Author)
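The linearity parameter mentioned here is typically demonstrated by a least-squares fit of detector response against concentration over the working range. A minimal sketch with hypothetical calibration data (units and values are illustrative, not from the paper):

```python
def linearity_check(conc, response):
    """Least-squares fit and correlation coefficient for a calibration line,
    as typically assessed during HPLC method validation."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    syy = sum((y - my) ** 2 for y in response)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

# Hypothetical calibration at 50-150% of a nominal 0.5 mg/mL (conc vs peak area)
levels = [0.25, 0.375, 0.5, 0.625, 0.75]
areas = [1250, 1880, 2510, 3120, 3760]
slope, intercept, r = linearity_check(levels, areas)
```

Acceptance criteria commonly require the correlation coefficient to exceed a threshold such as 0.999 over the stated range.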

  11. Empirical Validation of a Thermal Model of a Complex Roof Including Phase Change Materials

    Directory of Open Access Journals (Sweden)

    Stéphane Guichard

    2015-12-01

    This paper deals with the empirical validation of a building thermal model of a complex roof including a phase change material (PCM). A mathematical model dedicated to PCMs, based on the apparent heat capacity method, was implemented in a multi-zone building simulation code, the aim being to increase the understanding of the thermal behavior of the whole building with PCM technologies. In order to empirically validate the model, the methodology is based on both numerical and experimental studies. A parametric sensitivity analysis was performed and a set of parameters of the thermal model was identified for optimization. The use of the generic optimization program GenOpt® coupled to the building simulation code made it possible to determine the set of adequate parameters. We first present the empirical validation methodology and the main results of previous work. We then give an overview of GenOpt® and its coupling with the building simulation code. Finally, once the optimization results are obtained, comparisons of the thermal predictions with measurements are found to be acceptable and are presented.
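The apparent heat capacity method named in this abstract folds the latent heat of the PCM into an effective, temperature-dependent heat capacity. A minimal sketch of one common formulation (a Gaussian spreading of the latent heat around the melting point; all material parameters below are assumed, not taken from the paper):

```python
import math

def apparent_heat_capacity(T, cp_solid=2000.0, cp_liquid=2200.0,
                           latent_heat=150_000.0, T_melt=26.0, dT=1.0):
    """Apparent (effective) heat capacity of a PCM in J/(kg.K).

    The latent heat is spread over the melting range as a Gaussian peak
    centred on T_melt, so that integrating cp over temperature recovers
    the latent heat -- the core idea of the apparent-capacity method.
    """
    sensible = cp_solid if T < T_melt else cp_liquid
    peak = (latent_heat / (dT * math.sqrt(2 * math.pi))) \
        * math.exp(-0.5 * ((T - T_melt) / dT) ** 2)
    return sensible + peak

cp_at_melt = apparent_heat_capacity(26.0)   # near the melting peak
cp_far = apparent_heat_capacity(40.0)       # fully liquid, peak negligible
```

A simulation code then uses this cp(T) in the ordinary sensible-heat energy balance, which is what makes the approach easy to embed in a multi-zone building model.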

  12. Human Factors methods concerning integrated validation of nuclear power plant control rooms

    International Nuclear Information System (INIS)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia

    2010-02-01

    The frame of reference for this work comprised existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which measurement methods should be used, and how the methods are affected by changes in the control room. The report brings up different questions concerning the validation process for discussion. Supplementary measurement methods for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations on the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems.

  13. Laboratory diagnostic methods, system of quality and validation

    Directory of Open Access Journals (Sweden)

    Ašanin Ružica

    2005-01-01

    It is known that laboratory investigations secure safe and reliable results that provide a final confirmation of the quality of work. Ideas, planning, knowledge, skills, experience, and environment, along with good laboratory practice, quality control and quality assurance, make the area of biological investigations very complex. In recent years, quality control, including the control of work in the laboratory, has been based on international standards and is applied at that level. The implementation of widely recognized international standards, such as ISO/IEC 17025 (1), and the implementation of the ISO/IEC 9000 series quality systems (2) have become the imperative on the grounds of which laboratories maintain a formal, visible and corresponding quality system. The diagnostic methods used must consistently yield results which identify the animal as positive or negative, and the precise status of the animal is determined with a predefined degree of statistical significance. Methods applied to a selected population reduce the risk of obtaining falsely positive or falsely negative results. A condition for this is well-conceived and documented methods, the use of the corresponding reagents, and work by professional and skilled staff. This process also requires consistent implementation of the most rigorous experimental plans, epidemiological and statistical data and estimations, with constant monitoring of the validity of the applied methods. Such an approach is necessary in order to reduce the number of misconceptions and accidental mistakes for a reference population of animals on which the validity of a method is tested. Once a valid method is included in daily routine investigations, constant monitoring for internal quality control is necessary in order to adequately evaluate its reproducibility and reliability.
    Consequently, it is necessary at least twice yearly to conduct

  14. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  15. Comparing the Validity of Non-Invasive Methods in Measuring Thoracic Kyphosis and Lumbar Lordosis

    Directory of Open Access Journals (Sweden)

    Mohammad Yousefi

    2012-04-01

    Background: The purpose of this article is to study the validity of each of the non-invasive methods (flexible ruler, spinal mouse, and image processing) versus X-ray radiation (the reference method) and to compare them with each other. Materials and Methods: To evaluate the validity of each of these non-invasive methods, the thoracic kyphosis and lumbar lordosis angles of 20 students of Birjand University (age: 26±2 years, weight: 72±2.5 kg, height: 169±5.5 cm) were measured with four methods: flexible ruler, spinal mouse, image processing and X-ray. Results: The results indicated that the validity of the flexible ruler, spinal mouse, and image processing methods in measuring the thoracic kyphosis and lumbar lordosis angles corresponded to adherence values of 0.81, 0.87, 0.73, 0.76, 0.83 and 0.89, respectively (p>0.05). Therefore, with regard to the validity obtained against the gold-standard X-ray method, it can be stated that the three non-invasive methods have adequate validity. In addition, a one-way analysis of variance indicated a significant relationship between the three methods of measuring thoracic kyphosis and lumbar lordosis, and according to Tukey's test, the image processing method is the most precise. Conclusion: This method could therefore be used along with other non-invasive methods as a valid measuring method.
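Validity figures of this kind are usually obtained by correlating each non-invasive measurement against the X-ray reference angles. A minimal sketch of that comparison (the angle values below are hypothetical, not the study's data):

```python
def pearson_r(x, y):
    """Pearson correlation, a common adherence measure for comparing a
    non-invasive measurement with reference (X-ray) angles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical thoracic kyphosis angles (degrees): X-ray vs flexible ruler
xray = [38.0, 42.0, 35.0, 47.0, 40.0]
ruler = [36.5, 43.0, 34.0, 45.5, 41.0]
r = pearson_r(xray, ruler)
```

The same computation repeated per method and per angle yields a table of validity coefficients like the six values reported in the abstract.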

  16. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity.
    On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
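The polygon-level planarity check described above can be sketched as a maximum vertex-to-plane distance test. This is a simplification of the CityDoctor checks (which also discuss tolerances); the plane here is simply taken through the first vertex triple, assumed non-degenerate:

```python
def planarity_deviation(vertices):
    """Maximum distance of polygon vertices from a fitted plane,
    a basic polygon-level validity check for building geometry.

    The plane passes through vertices[0] with the normal of the
    cross product of the first two edges (assumed non-degenerate).
    """
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0])
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    p0 = vertices[0]
    n = cross(sub(vertices[1], p0), sub(vertices[2], p0))
    norm = dot(n, n) ** 0.5
    n = tuple(c / norm for c in n)
    return max(abs(dot(n, sub(p, p0))) for p in vertices)

flat = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
warped = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0.2)]
dev_flat = planarity_deviation(flat)
dev_warped = planarity_deviation(warped)
```

A polygon passes when the deviation stays below a chosen tolerance (e.g. a few centimetres for building models); the tolerance choice is exactly the kind of issue the paper discusses.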

  17. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    Science.gov (United States)

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, 14 of them on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Decision limits were also calculated for declaring non-conformance (>0.9%) with 95% reliability. To estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' are important contributors to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.
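The variance-component idea behind RSD(r) and RSD(R) can be illustrated with the classical one-way ANOVA estimator for balanced data, a simplification of the REML analysis used in the paper (all measurement values below are hypothetical):

```python
def rsd_r_and_rsd_R(groups):
    """Repeatability and reproducibility RSDs (%) from measurements grouped
    by a between-run factor such as PCR day or DNA isolation.

    Uses the classical balanced one-way ANOVA estimator of variance
    components -- a simplification of REML, valid for balanced data.
    """
    k = len(groups)            # number of groups (e.g. PCR days)
    n = len(groups[0])         # replicates per group (balanced design)
    grand = sum(sum(g) for g in groups) / (k * n)
    means = [sum(g) / n for g in groups]
    ms_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means)) / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    var_within = ms_within                              # repeatability variance
    var_between = max(0.0, (ms_between - ms_within) / n)  # between-day variance
    rsd_r = var_within ** 0.5 / grand * 100
    rsd_R = (var_within + var_between) ** 0.5 / grand * 100
    return rsd_r, rsd_R

# Hypothetical GM content (%) measured on 3 days, 4 replicates each
days = [[0.92, 0.95, 0.90, 0.93],
        [0.99, 1.02, 0.98, 1.01],
        [0.94, 0.96, 0.93, 0.95]]
rsd_r, rsd_R = rsd_r_and_rsd_R(days)
```

When the between-day component dominates, RSD(R) is clearly larger than RSD(r), which is exactly the situation motivating the paper's conclusion that 'PCR day' and 'DNA isolation' must be part of the validation design.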

  18. Using method triangulation to validate a new instrument (CPWQ-com) assessing cancer patients' satisfaction with communication

    DEFF Research Database (Denmark)

    Ross, Lone; Lundstrøm, Louise Hyldborg; Petersen, Morten Aagaard

    2012-01-01

    Patients' perceptions of care, including the communication with health care staff, are recognized as an important aspect of the quality of cancer care. Using mixed methods, we developed and validated a short instrument assessing this communication.

  19. Development and Validation of a Dissolution Test Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a dissolution test method for the release of artemether and lumefantrine from tablets. Methods: A single dissolution method for evaluating the in vitro release of artemether and lumefantrine from tablets was developed and validated. The method comprised a dissolution medium of ...

  20. Practical procedure for method validation in INAA- A tutorial

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2015-01-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents step by step how to calculate the required method performance characteristics in a method validation process: the procedures, the adopted strategies and the acceptance criteria for the results; that is, how to carry out method validation in INAA. To exemplify the methodology applied, results are presented for the validation of the method for mass fraction determination of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)

  1. Practical procedure for method validation in INAA- A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: robsonpetroni@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents step by step how to calculate the required method performance characteristics in a method validation process: the procedures, the adopted strategies and the acceptance criteria for the results; that is, how to carry out method validation in INAA. To exemplify the methodology applied, results are presented for the validation of the method for mass fraction determination of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)
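The trueness check against a reference material mentioned in this record is often expressed as an E_n number. A minimal sketch (the mass fraction and uncertainty values below are hypothetical, not from the paper):

```python
def en_score(value, u_value, ref, u_ref):
    """E_n number for judging trueness against a reference material:
    |E_n| <= 1 means the result agrees with the reference value within
    the combined expanded (k=2) uncertainties."""
    return (value - ref) / (2 * (u_value ** 2 + u_ref ** 2) ** 0.5)

# Hypothetical Zn mass fraction in mussel tissue (mg/kg):
# measured 91.0 +/- 2.5 (standard uncertainty) vs reference 88.0 +/- 2.0
en = en_score(91.0, 2.5, 88.0, 2.0)
```

Each element of the method scope (Co, Cr, Fe, Rb, Se, Zn in this study) would get its own E_n score, with |E_n| ≤ 1 as the acceptance criterion.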

  2. Validation of the Rotation Ratios Method

    International Nuclear Information System (INIS)

    Foss, O.A.; Klaksvik, J.; Benum, P.; Anda, S.

    2007-01-01

    Background: The rotation ratios method describes rotations between pairs of sequential pelvic radiographs. The method seems promising but has not been validated. Purpose: To validate the accuracy of the rotation ratios method. Material and Methods: Known pelvic rotations between 165 radiographs obtained from five skeletal pelvises in an experimental material were compared with the corresponding calculated rotations to describe the accuracy of the method. The results from a clinical material of 262 pelvic radiographs from 46 patients defined the ranges of rotational differences compared. Repeated analyses, both on the experimental and the clinical material, were performed using the selected reference points to describe the robustness and repeatability of the method. Results: The reference points were easy to identify and barely influenced by pelvic rotations. The mean differences between calculated and real pelvic rotations were 0.0 deg (SD 0.6) for vertical rotations and 0.1 deg (SD 0.7) for transversal rotations in the experimental material. The intra- and interobserver repeatability of the method was good. Conclusion: The accuracy of the method was reasonably high, and the method may prove to be clinically useful.
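Accuracy figures like "mean difference 0.0 deg (SD 0.6)" are obtained by comparing calculated rotations with the known (true) rotations. A minimal sketch of that computation on hypothetical angle pairs:

```python
def accuracy_stats(calculated, true):
    """Mean difference and sample standard deviation between calculated
    and known rotations -- the accuracy measures reported for the method."""
    diffs = [c - t for c, t in zip(calculated, true)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean, sd

# Hypothetical vertical rotations (degrees) for five radiograph pairs
true_rot = [2.0, -1.5, 0.0, 3.0, -2.5]
calc_rot = [2.3, -1.2, -0.4, 3.1, -2.8]
mean_diff, sd_diff = accuracy_stats(calc_rot, true_rot)
```

A mean difference near zero indicates no systematic bias, while the SD quantifies the random error of the calculated rotations.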

  3. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  4. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    Science.gov (United States)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Because the atorvastatin patent has expired, the pharmaceutical industry produces copies of the drug, so methods for tablet quality testing, including determination of atorvastatin content in tablets, need to be developed. The purpose of this research was to develop and validate a simple analytical method for atorvastatin tablets by HPLC. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the reversed-phase stationary phase, a mixture of methanol-water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation parameters included selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method performed well on all of these parameters for analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, respectively, and the linearity range was 20-120 ng/mL.
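
LOD and LOQ figures for chromatographic methods like this one are commonly derived from the calibration line using the ICH convention LOD = 3.3·s/S and LOQ = 10·s/S, where s is the residual standard deviation of the regression and S its slope. A minimal sketch with hypothetical calibration data (not the paper's actual peak areas):

```python
import statistics

def calibration_lod_loq(conc, response):
    """Fit a least-squares calibration line and estimate LOD/LOQ via the
    ICH convention: LOD = 3.3*s_y/slope, LOQ = 10*s_y/slope, where s_y is
    the residual standard deviation of the regression."""
    n = len(conc)
    mx, my = statistics.mean(conc), statistics.mean(response)
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(conc, response))
    s_y = (ss_res / (n - 2)) ** 0.5
    return slope, intercept, 3.3 * s_y / slope, 10 * s_y / slope

# Hypothetical peak areas over the 20-120 ng/mL working range.
slope, intercept, lod, loq = calibration_lod_loq(
    [20, 40, 60, 80, 100, 120], [41, 79, 122, 159, 201, 238]
)
```

Signal-to-noise-based estimation is an equally common alternative; which convention a given paper used should be taken from the paper itself.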

  5. Validation of methods for the detection and quantification of engineered nanoparticles in food

    DEFF Research Database (Denmark)

    Linsinger, T.P.J.; Chaudhry, Q.; Dehalu, V.

    2013-01-01

    the methods apply equally well to particles of different suppliers. In trueness testing, information whether the particle size distribution has changed during analysis is required. Results are largely expected to follow normal distributions due to the expected high number of particles. An approach...... approach for the validation of methods for detection and quantification of nanoparticles in food samples. It proposes validation of identity, selectivity, precision, working range, limit of detection and robustness, bearing in mind that each “result” must include information about the chemical identity...

  6. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis. Type: Commentary

  7. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  8. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. NAA (neutron activation analysis) is a nuclear analytical technique with low detection limits, high specificity, and high precision and accuracy for the large majority of naturally occurring elements; it allows non-destructive, simultaneous multi-element determination and can handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be done. A standard reference material, SRM NIST 1648 Urban Particulate Matter, has been used to validate the NAA method, with accuracy and precision tests used as validation parameters. The particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviation of the measured elemental concentrations ranged from 2 to 14.8% for most of the elements analyzed, while HorRat values were in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for the characterization of particulate matter and other similar matrix samples to support air quality monitoring. (author)
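
The HorRat (Horwitz ratio) quoted above is the observed relative standard deviation divided by the RSD predicted by the Horwitz function, PRSD = 2·C^(-0.1505), with C the dimensionless mass fraction. A minimal sketch with hypothetical numbers (values around 1 indicate precision consistent with the Horwitz expectation):

```python
def horrat(observed_rsd_percent, mass_fraction):
    """Horwitz ratio: observed RSD (%) divided by the RSD (%) predicted by
    the Horwitz function PRSD = 2 * C**(-0.1505), with C the dimensionless
    mass fraction (e.g. 1 mg/kg -> 1e-6)."""
    predicted_rsd = 2.0 * mass_fraction ** (-0.1505)
    return observed_rsd_percent / predicted_rsd

# Hypothetical: 8% RSD observed at a level of 1 mg/kg (C = 1e-6);
# the Horwitz function predicts about 16%, giving a HorRat near 0.5.
hr = horrat(8.0, 1e-6)
```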

  9. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement on validation guidelines is important for quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCh) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. Validation of method in instrumental NAA for food products sample

    International Nuclear Information System (INIS)

    Alfian; Siti Suprapti; Setyo Purwanto

    2010-01-01

    NAA is a testing method that has not been standardized. To confirm that this method is valid, it must be validated using various standard reference materials. In this work, validation was carried out for food product samples using NIST SRM 1567a (wheat flour) and NIST SRM 1568a (rice flour). The results show that the method passed the accuracy and precision tests for nine elements (Al, K, Mg, Mn, Na, Ca, Fe, Se and Zn) in SRM 1567a and eight elements (Al, K, Mg, Mn, Na, Ca, Se and Zn) in SRM 1568a. It can be concluded that this method is able to give valid results in the determination of elements in food product samples. (author)

  11. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  12. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  13. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability be considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing the reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies, and data were extracted by the primary reviewer. The results were synthesized qualitatively using a level-of-evidence approach. 27 studies satisfied the eligibility criteria and were included in the review: sixteen investigated reliability, two validity, and nine both reliability and validity. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability, while the validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity in support of the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement; this should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. ASTM Validates Air Pollution Test Methods

    Science.gov (United States)

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. The aim of the project is to establish nationwide consistency in measuring pollutants by determining the precision, accuracy, and reproducibility of 35 standard measuring methods. (BL)

  15. The Validation of NAA Method Used as Test Method in Serpong NAA Laboratory

    International Nuclear Information System (INIS)

    Rina-Mulyaningsih, Th.

    2004-01-01

    NAA is a non-standard testing method. The testing laboratory shall validate the methods it uses to ensure and confirm that they are suitable for their application. The NAA methods have been validated with the parameters of accuracy, precision, repeatability and selectivity. NIST 1573a Tomato Leaves, NIES 10C Rice Flour Unpolished and standard elements were used in this testing program. The results of testing with NIST 1573a showed that the elements Na, Zn, Al and Mn met the acceptance criteria for accuracy and precision, whereas Co was rejected. The results of testing with NIES 10C showed that Na and Zn met the acceptance criteria for accuracy and precision, but Mn was rejected. The selectivity test showed that the measured quantities ranged from 0.1 to 2.5 μg, depending on the element. (author)

  16. Toward a Unified Validation Framework in Mixed Methods Research

    Science.gov (United States)

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  17. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, showing how it can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank possible approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
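
The idea of scoring an approximation against the original model's own stochastic variability can be sketched as follows. The toy models here are hypothetical stand-ins, not those of the paper: the "fine" model is simulated many times, and the analytical approximation's error is expressed in units of the simulated standard deviation.

```python
import random
import statistics

def approximation_error_score(stoch_model, approx_value, n_runs=200, seed=1):
    """Run the fine-grained stochastic model many times and express the
    analytical approximation's error in units of the stochastic model's
    own standard deviation."""
    rng = random.Random(seed)
    runs = [stoch_model(rng) for _ in range(n_runs)]
    return abs(approx_value - statistics.mean(runs)) / statistics.stdev(runs)

# Toy stand-in: stochastic model = number of successes in 100
# Bernoulli(0.3) trials; the "analytical approximation" is its
# expectation, 30. A small score means the approximation's error is
# negligible relative to the model's intrinsic variability.
z = approximation_error_score(
    lambda rng: sum(rng.random() < 0.3 for _ in range(100)), approx_value=30.0
)
```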

  18. Validation of innovative technologies and strategies for regulatory safety assessment methods: challenges and opportunities.

    Science.gov (United States)

    Stokes, William S; Wind, Marilyn

    2010-01-01

    Advances in science and innovative technologies are providing new opportunities to develop test methods and strategies that may improve safety assessments and reduce animal use for safety testing. These include high throughput screening and other approaches that can rapidly measure or predict various molecular, genetic, and cellular perturbations caused by test substances. Integrated testing and decision strategies that consider multiple types of information and data are also being developed. Prior to their use for regulatory decision-making, new methods and strategies must undergo appropriate validation studies to determine the extent that their use can provide equivalent or improved protection compared to existing methods and to determine the extent that reproducible results can be obtained in different laboratories. Comprehensive and optimal validation study designs are expected to expedite the validation and regulatory acceptance of new test methods and strategies that will support improved safety assessments and reduced animal use for regulatory testing.

  19. A method for the computation of turbulent polymeric liquids including hydrodynamic interactions and chain entanglements

    Energy Technology Data Exchange (ETDEWEB)

    Kivotides, Demosthenes, E-mail: demosthenes.kivotides@strath.ac.uk

    2017-02-12

    An asymptotically exact method for the direct computation of turbulent polymeric liquids that includes (a) fully resolved, creeping microflow fields due to hydrodynamic interactions between chains, (b) exact account of (subfilter) residual stresses, (c) polymer Brownian motion, and (d) direct calculation of chain entanglements, is formulated. Although developed in the context of polymeric fluids, the method is equally applicable to turbulent colloidal dispersions and aerosols. - Highlights: • An asymptotically exact method for the computation of polymer and colloidal fluids is developed. • The method is valid for all flow inertia and all polymer volume fractions. • The method models entanglements and hydrodynamic interactions between polymer chains.

  20. Validity of a questionnaire measuring motives for choosing foods including sustainable concerns.

    Science.gov (United States)

    Sautron, Valérie; Péneau, Sandrine; Camilleri, Géraldine M; Muller, Laurent; Ruffieux, Bernard; Hercberg, Serge; Méjean, Caroline

    2015-04-01

    Since the 1990s, the sustainability of diet has become an increasingly important concern for consumers. However, no validated multidimensional measure of food choice motives that includes sustainability concerns is currently available. In the present study, we developed a questionnaire that measures food choice motives during purchasing, and we tested its psychometric properties. The questionnaire included 104 items divided into four predefined dimensions (environmental, health and well-being, economic and miscellaneous). It was administered to 1000 randomly selected subjects participating in the NutriNet-Santé cohort study. Among 637 responders, one-third found the questionnaire complex or too long, while one-quarter found it difficult to fill in. Its underlying structure was determined by exploratory factor analysis and then internally validated by confirmatory factor analysis. Reliability was also assessed by internal consistency of selected dimensions and test-retest repeatability. After selecting the most relevant items, first-order analysis highlighted nine main dimensions, labeled: ethics and environment, local and traditional production, taste, price, environmental limitations, health, convenience, innovation and absence of contaminants. The model demonstrated excellent internal validity (adjusted goodness of fit index = 0.97; standardized root mean square residuals = 0.07) and satisfactory reliability (internal consistency = 0.96; test-retest repeatability coefficients ranged between 0.31 and 0.68 over a mean 4-week period). This study enabled precise identification of the various dimensions of food choice motives and proposes an original, internally valid tool applicable to large populations for assessing consumer food motivation during purchasing, particularly in terms of sustainability. Copyright © 2014 Elsevier Ltd. All rights reserved.
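
An internal-consistency figure like the 0.96 quoted above is typically a Cronbach's alpha. A minimal sketch of that statistic for one questionnaire dimension; the item data below are hypothetical, and sample variances are used throughout:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for one questionnaire dimension.
    items: one list of scores per item, all over the same respondents,
    in the same respondent order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    sum_item_var = sum(statistics.variance(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))

# Hypothetical 1-5 Likert answers from six respondents on three items
# of an "ethics and environment"-style dimension.
alpha = cronbach_alpha([
    [4, 5, 3, 2, 4, 5],
    [4, 4, 3, 1, 5, 5],
    [5, 4, 2, 2, 4, 4],
])
```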

  1. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Organization for Standardization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  2. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor-fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, among which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. It builds on the concept of analytical redundancy relations (ARRs).

  3. Examination of packaging materials in bakery products : a validated method for detection and quantification

    NARCIS (Netherlands)

    Raamsdonk, van L.W.D.; Pinckaers, V.G.Z.; Vliege, J.J.M.; Egmond, van H.J.

    2012-01-01

    Methods for the detection and quantification of packaging materials are necessary for the control of the prohibition of these materials according to Regulation (EC)767/2009. A method has been developed and validated at RIKILT for bakery products, including sweet bread and raisin bread. This choice

  4. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    Validated modified lycopodium spore method has been developed for simple and rapid quantification of herbal powdered drugs. Lycopodium spore method was performed on ingredients of Shatavaryadi churna, an ayurvedic formulation used as immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  5. NordVal: A Nordic system for validation of alternative microbiological methods

    DEFF Research Database (Denmark)

    Qvist, Sven

    2007-01-01

    NordVal was created in 1999 by the Nordic Committee of Senior Officials for Food Issues under the Nordic Council of Ministers. The Committee adopted the following objective for NordVal: NordVal evaluates the performance and field of application of alternative microbiological methods. This includes...... analyses of food, water, feed, animal faeces and food environmental samples in the Nordic countries. NordVal is managed by a steering group, which is appointed by the National Food Administrations in Denmark, Finland, Iceland, Norway and Sweden. The background for creation of NordVal was a Danish...... validation system (DanVal) established in 1995 to cope with a need to validate alternative methods to be used in the Danish Salmonella Action Program. The program attracted considerable attention in the other Nordic countries. NordVal has elaborated a number of documents, which describe the requirements...

  6. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
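
The "combined" cross-validation design described above, pooling held-out results across the CV loops before computing a single statistic, can be illustrated in a simplified, non-survival setting. The linear-regression stand-in below is an assumption for illustration only, not the authors' recursive peeling model:

```python
import random
import statistics

def combined_cv_mse(xs, ys, k=5, seed=0):
    """'Combined' cross-validation sketch: instead of averaging a statistic
    computed separately per fold, pool the held-out (prediction, truth)
    pairs across all folds and compute one statistic on the pooled set."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    pooled = []  # held-out pairs accumulated over the CV loop
    for fold in folds:
        train = [i for i in idx if i not in fold]
        mx = statistics.mean(xs[i] for i in train)
        my = statistics.mean(ys[i] for i in train)
        b = (sum((xs[i] - mx) * (ys[i] - my) for i in train)
             / sum((xs[i] - mx) ** 2 for i in train))
        a = my - b * mx
        pooled.extend((a + b * xs[i], ys[i]) for i in fold)
    return statistics.mean((p - t) ** 2 for p, t in pooled)

# On exactly linear data every fold recovers the line, so the pooled
# mean squared error is numerically zero.
err = combined_cv_mse(list(range(10)), [2 * x + 1 for x in range(10)])
```

The point of pooling is that rank-based or survival statistics, which can be unstable on a single small test fold, are computed once on a larger combined test set.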

  7. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Krause, Michael; Josefsen, Mathilde Hartmann

    2009-01-01

    of the published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation including comparative and collaborative trials, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal) of a same-day, non....... Partly based on results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and the performance was compared with the BAX Salmonella test on 39 pork samples artificially...... contaminated with Salmonella. There was no significant difference in the results obtained by the two methods. Conclusion: The real-time PCR method for detection of Salmonella in meat and carcass swabs was validated in comparative and collaborative trials according to NordVal recommendations. The PCR method...

  8. Method validation for strobilurin fungicides in cereals and fruit

    DEFF Research Database (Denmark)

    Christensen, Hanne Bjerre; Granby, Kit

    2001-01-01

    Strobilurins are a new class of fungicides that are active against a broad spectrum of fungi. In the present work a GC method for analysis of strobilurin fungicides was validated. The method was based on extraction with ethyl acetate/cyclohexane, clean-up by gel permeation chromatography (GPC......) and determination of the content by gas chromatography (GC) with electron capture (EC-), nitrogen/phosphorous (NP-), and mass spectrometric (MS-) detection. Three strobilurins, azoxystrobin, kresoxim-methyl and trifloxystrobin were validated on three matrices, wheat, apple and grapes. The validation was based...

  9. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Science.gov (United States)

    2011-05-18

    ... . d m = The mean of the paired sample differences. n = Total number of paired samples. 7.4.2 t Test... being compared to a validated test method as part of the Method 301 validation and an audit sample for... tighten the acceptance criteria for the precision of candidate alternative test methods. One commenter...

  10. The Value of Qualitative Methods in Social Validity Research

    Science.gov (United States)

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle school teachers' perceptions of the social validity of System 44®--a phonics-based reading…

  11. Validation of the Analytical Method for the Determination of Flavonoids in Broccoli

    Directory of Open Access Journals (Sweden)

    Tuszyńska Magdalena

    2014-09-01

    Full Text Available A simple, accurate and selective HPLC method was developed and validated for the determination of quercetin and kaempferol, the main flavonols in broccoli. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of methanol/water (60/40) and 0.2% phosphoric acid at a flow rate of 1.0 ml min-1. Detection was carried out on a DAD detector at 370 nm. The method was validated according to the requirements for new methods, which include selectivity, linearity, precision, accuracy, limit of detection and limit of quantitation. The method demonstrates good linearity, with R2 > 0.99. Recovery is within 98.07-102.15% and 97.92-101.83% for quercetin and kaempferol, respectively. The method is selective, in that quercetin and kaempferol are well separated from other compounds of broccoli with good resolution. The low limit of detection and limit of quantitation of quercetin and kaempferol enable the detection and quantitation of these flavonoids in broccoli at low concentrations.
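
    The recovery and linearity figures quoted above (R2 > 0.99, recoveries near 100%) can be checked with a short script. This is a minimal sketch; the spiked amounts, concentrations and peak areas below are illustrative, not the paper's data.

```python
# Recovery (%) and coefficient of determination (R2) for a calibration line,
# the two core figures of merit cited in the abstract above.

def recovery_percent(measured, spiked):
    """Measured amount as a percentage of the spiked (known) amount."""
    return 100 * measured / spiked

def linearity_r2(x, y):
    """R2 of a least-squares line fitted through (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

conc = [5, 10, 20, 40, 80]        # standard concentrations (illustrative)
area = [51, 103, 198, 405, 798]   # corresponding peak areas (illustrative)
print(f"R2 = {linearity_r2(conc, area):.4f}")
print(f"recovery = {recovery_percent(measured=9.85, spiked=10.0):.2f}%")
```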

  12. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  13. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. Also the theoretical background is described for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III hereof give practical and elaborated examples on how to use the Horwitz approach and formulae for estimating the target standard deviation towards acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)

  14. Method Development and Validation for the Determination of Caffeine: An Alkaloid from Coffea arabica by High-performance Liquid Chromatography Method.

    Science.gov (United States)

    Naveen, P; Lingaraju, H B; Deepak, M; Medhini, B; Prasad, K Shyam

    2018-01-01

    The present study was undertaken to develop and validate a reversed-phase high-performance liquid chromatography method for the determination of caffeine from bean material of Coffea arabica. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of water:methanol (50:50) at a flow rate of 1.0 ml min-1. Detection was carried out on a UV detector at 272 nm. The developed method was validated according to the International Conference on Harmonisation (ICH) guidelines, which include specificity, linearity, precision, accuracy, limit of detection and limit of quantitation. The method showed good linearity with an excellent correlation coefficient (R2 > 0.999). In repeatability and intermediate precision, the percentage relative standard deviation (% RSD) of peak area was less than 1%, indicating high precision. The recovery rate for caffeine was within 98.78% - 101.28%, indicating high accuracy. The low limit of detection and limit of quantitation of caffeine enable its detection and quantitation in C. arabica at low concentrations. The developed HPLC method is simple, rapid, precise, accurate and widely applicable, and is recommended for efficient assays in routine work. A simple, accurate, and sensitive high-performance liquid chromatography (HPLC) method for caffeine from Coffea arabica has been developed and validated for linearity, specificity, precision, recovery, limit of detection, and limit of quantification according to the International Conference on Harmonisation guidelines. The results revealed that the proposed method is highly reliable. This method could be successfully applied for routine quality work analysis. Abbreviations used: C. arabica: Coffea arabica, ICH: International Conference on Harmonisation, % RSD: Percentage Relative Standard Deviation, R2: Correlation Coefficient, ppm: Parts per million, LOD: Limits
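
    The repeatability criterion mentioned above (%RSD of replicate peak areas below 1%) reduces to a one-line calculation. A minimal sketch follows; the six replicate injection areas are illustrative, not the study's data.

```python
import statistics

def percent_rsd(values):
    """Percentage relative standard deviation of replicate measurements."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

areas = [1502, 1498, 1505, 1497, 1503, 1500]   # replicate peak areas (illustrative)
rsd = percent_rsd(areas)
print(f"%RSD = {rsd:.2f} -> {'precise' if rsd < 1.0 else 'not precise'}")
```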

  15. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors. → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M and S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been as a primary tool for the design and understanding of nuclear system behavior, while M and S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in

  16. General criteria for validation of dosimetry methods in the context of a quality system ISO / IEC 17025

    International Nuclear Information System (INIS)

    Martin Garcia, R.; Navarro Bravo, T.

    2011-01-01

    The accreditation of a testing laboratory in accordance with ISO/IEC 17025 recognizes the technical competence of a laboratory to perform certain tests. One of the requirements of that standard states that laboratories must demonstrate that the methods used are valid and appropriate for the intended use and customer needs. This demonstration is accomplished through the process of validation of methods, defined in the standard itself as 'confirmation by examination and the provision of objective evidence that the requirements for a particular purpose are fulfilled'. The process of validating a test method should be well planned and documented, including the requirements under the applicable rules and the criteria established by the laboratory to comply with these requirements.

  17. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of the 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is however imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows greater efficiency and effectiveness to be achieved by avoiding duplication of effort and leveraging limited resources. In view of achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, the USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and the major efforts carried out to achieve consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  18. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
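
    The framework described above can be illustrated with a toy example. In this sketch the "manufactured reality" is a quadratic function, the imperfect model is linear with one uncertain parameter, and we ask how often the calibrated model's uncertainty band covers new "experiments". Everything here is an illustrative stand-in, not the paper's particle-transport universe or its Gaussian process/Bayesian MARS machinery.

```python
import random
import statistics

random.seed(1)
reality = lambda x: 0.5 * x ** 2   # manufactured truth (unknown to the model)
noise = 0.05                       # experimental error (standard deviation)

# "Experiments" used to calibrate the uncertain parameter b of the model y = b*x.
cal_x = [0.5, 1.0, 1.5, 2.0]
b_samples = [(reality(x) + random.gauss(0, noise)) / x for x in cal_x]
b_mean, b_sd = statistics.mean(b_samples), statistics.stdev(b_samples)

# Assessment: does a 2-sigma prediction band cover new experiments?
val_x = [0.6, 1.2, 1.8, 2.4]
covered = sum(
    abs(reality(x) + random.gauss(0, noise) - b_mean * x) <= 2 * b_sd * x + 2 * noise
    for x in val_x
)
print(f"b = {b_mean:.2f} +/- {b_sd:.2f}; {covered}/{len(val_x)} new points covered")
```

    Because the model form is wrong (linear vs. quadratic), the inferred parameter absorbs model discrepancy and its spread inflates, exactly the kind of effect the framework is designed to expose.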

  19. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  20. Validation and further development of a novel thermal analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, E.H.; Shuttleworth, A.G.; Rousseau, P.G. [Pretoria Univ. (South Africa). Dept. of Mechanical Engineering

    1994-12-31

    The design of thermal and energy efficient buildings requires inter alia the investigation of the passive performance, natural ventilation, mechanical ventilation as well as structural and evaporative cooling of the building. Only when these fail to achieve the desired thermal comfort should mechanical cooling systems be considered. Few computer programs have the ability to investigate all these comfort regulating methods at the design stage. The QUICK design program can simulate these options with the exception of mechanical cooling. In this paper, QUICK's applicability is extended to include the analysis of basic air-conditioning systems. Since the design of these systems is based on indoor loads, it was necessary to validate QUICK's load predictions before extending it. This article addresses validation in general and proposes a procedure to establish the efficiency of a program's load predictions. This proposed procedure is used to compare load predictions by the ASHRAE, CIBSE, CARRIER, CHEETAH, BSIMAC and QUICK methods for 46 case studies involving 36 buildings in various climatic conditions. Although significant differences in the results of the various methods were observed, it is concluded that QUICK can be used with the same confidence as the other methods. It was further shown that load prediction programs usually under-estimate the effect of building mass and therefore over-estimate the peak loads. The details for the 46 case studies are available to other researchers for further verification purposes. With the confidence gained in its load predictions, QUICK was extended to include air-conditioning system analysis. The program was then applied to different case studies. It is shown that system size and energy usage can be reduced by more than 60% by using a combination of passive and mechanical cooling systems as well as different control strategies. (author)

  1. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  2. Validation of a dissolution method with RP-HPLC analysis for Perindopril erbumine and Indapamide combination tablet

    Directory of Open Access Journals (Sweden)

    Jain P.S.

    2012-01-01

    Full Text Available A dissolution method with high performance liquid chromatography (HPLC) analysis was validated for perindopril erbumine and indapamide in a combination tablet formulation. The method was validated to meet requirements for a global regulatory filing, and this validation included specificity, linearity, accuracy, precision, range, robustness and solution stability studies. The dissolution method uses USP apparatus 1 with the basket rotating at 100 rpm and 1000 ml of phosphate buffer pH 6.8 as the dissolution medium. Reversed-phase HPLC was carried out at 50 °C on a 4.6 mm × 250 mm, 5 μm cyano column that contained USP packing L1, with acetonitrile:buffer pH 2.8 (40:60, v/v) as the mobile phase. The UV detector was set at 225 nm. The method was found to be selective, linear, accurate and precise over the specified ranges. Intra-day and inter-day variability for the method was <2% RSD. This method was successfully used for quantification of perindopril erbumine and indapamide in combination tablet formulations.

  3. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high pressure liquid ..... equipment. DECLARATIONS ... high-performance liquid chromatography. J Chromatogr.

  4. A systematic review of validated methods to capture acute bronchospasm using administrative or claims data.

    Science.gov (United States)

    Sharifi, Mona; Krishanswami, Shanthi; McPheeters, Melissa L

    2013-12-30

    To identify and assess billing, procedural, diagnosis code, or pharmacy claim-based algorithms used to identify acute bronchospasm in administrative and claims databases. We searched the MEDLINE database from 1991 to September 2012 using controlled vocabulary and key terms related to bronchospasm, wheeze and acute asthma. We also searched the reference lists of included studies. Two investigators independently assessed the full text of studies against pre-determined inclusion criteria. Two reviewers independently extracted data regarding participant and algorithm characteristics. Our searches identified 677 citations, of which 38 met our inclusion criteria. In these 38 studies, the most commonly used ICD-9 code was 493.x. Only 3 studies reported any validation methods for the identification of bronchospasm, wheeze or acute asthma in administrative and claims databases; all were among pediatric populations and only 2 offered any validation statistics. Some of the outcome definitions utilized were heterogeneous and included other disease-based diagnoses, such as bronchiolitis and pneumonia, which are typically of an infectious etiology. One study offered validation of algorithms utilizing Emergency Department triage chief complaint codes to diagnose acute asthma exacerbations, with ICD-9 786.07 (wheezing) showing the highest sensitivity (56%), specificity (97%), PPV (93.5%) and NPV (76%). There is a paucity of studies reporting rigorous methods to validate algorithms for the identification of bronchospasm in administrative data. The scant validated data available are limited in their generalizability to broad-based populations. Copyright © 2013 Elsevier Ltd. All rights reserved.
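
    The four validation statistics quoted above (sensitivity, specificity, PPV, NPV) all derive from a 2x2 confusion matrix comparing the algorithm's classification against a gold standard. A minimal sketch; the counts below are illustrative, not the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV (as fractions) from 2x2 counts."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Illustrative counts chosen to resemble the magnitudes reported above.
sens, spec, ppv, npv = diagnostic_metrics(tp=56, fp=4, fn=44, tn=96)
print(f"sensitivity={sens:.0%} specificity={spec:.0%} ppv={ppv:.1%} npv={npv:.0%}")
```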

  5. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
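
    The core of the approach above is scoring experimental data under two hypotheses and comparing the resulting likelihood ratio against a cost-based decision threshold. The following is a minimal sketch of that idea, not the paper's method: the Gaussian likelihoods, the bias under the alternative hypothesis, and the unit threshold are illustrative assumptions.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Normal density, used as the likelihood of one observation."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(data, prediction, bias, sigma):
    """Product LR P(data|H0: model correct) / P(data|H1: model biased)."""
    lr = 1.0
    for x in data:
        lr *= gaussian_pdf(x, prediction, sigma) / gaussian_pdf(x, prediction + bias, sigma)
    return lr

data = [10.1, 9.8, 10.3]   # hypothetical experimental measurements
lr = likelihood_ratio(data, prediction=10.0, bias=1.0, sigma=0.5)

# Threshold derived from decision costs; here equal costs give a threshold of 1.
decision = "accept model" if lr > 1.0 else "reject model"
print(f"LR = {lr:.3g} -> {decision}")
```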

  6. Spacecraft early design validation using formal methods

    International Nuclear Information System (INIS)

    Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco

    2014-01-01

    The size and complexity of software in spacecraft is increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, with two of them having focused on the system-level of a satellite platform in development. Our efforts resulted in a significant advancement of validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested)

  7. Method validation for uranium content analysis using a potentiometer T-90

    International Nuclear Information System (INIS)

    Torowati; Ngatijo; Rahmiati

    2016-01-01

    An experimental method validation has been conducted for uranium content analysis using a Potentiometer T-90. The validation experiment was performed in the quality control laboratory of the Experimental Fuel Element Installation, PTBBN-BATAN. The objective was to determine the precision and accuracy of analytical results for uranium analysis with reference to the latest American Standard Test Method (ASTM), ASTM C1267-11, a modified reference method in which reagent consumption is reduced by 10% relative to the original method. ASTM C1267-11 is a new standard replacing the older ASTM C799, Vol. 12.01, 2003; it is therefore necessary to validate the renewed method. The instrument used for the uranium analysis was the Potentiometer T-90 and the material was standard uranium oxide powder, a certified reference material (CRM). The method was validated by analyzing the standard uranium powder in seven replicate weighings and seven replicate analyses. The results were used to determine the accuracy, precision, relative standard deviation (RSD), 2/3 Horwitz coefficient of variation, and the limits of detection and quantitation. The average uranium content obtained in this validation was 84.36%, with a standard deviation (SD) of 0.12%, an RSD of 0.14%, and a 2/3 Horwitz coefficient of variation (CV Horwitz) of 2.05%. The RSD is smaller than the (2/3) CV Horwitz value, which means that the method has high precision. The accuracy value obtained is 0.48%; since the acceptance limit for a high level of accuracy is <2.00%, the method is regarded as having a high degree of accuracy [1]. The limit of detection (LOD) and the limit of quantitation (LOQ) are 0.0145 g/L and 0.0446 g/L, respectively. It is concluded that the ASTM C1267-11 reference method is valid for use. (author)
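
    The precision criterion used above can be reproduced in a few lines: the measured RSD is compared against the Horwitz predicted RSD, 2^(1 - 0.5*log10(C)) for an analyte at mass fraction C, with the common single-laboratory rule that the measured RSD should not exceed 2/3 of the Horwitz value. The mean and SD are taken from the abstract; the 2/3 acceptance rule is a widely used convention, stated here as an assumption rather than a quote from the paper.

```python
import math

def horwitz_rsd(mass_fraction):
    """Horwitz predicted RSD (%) for an analyte at the given mass fraction."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

mean_u = 84.36            # mean uranium content, % (from the abstract)
sd = 0.12                 # standard deviation, %  (from the abstract)
rsd = 100 * sd / mean_u   # relative standard deviation, %

limit = (2 / 3) * horwitz_rsd(mean_u / 100)
print(f"RSD = {rsd:.2f}%, 2/3 Horwitz limit = {limit:.2f}% -> "
      f"{'high precision' if rsd <= limit else 'insufficient precision'}")
```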

  8. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2017-02-01

    Full Text Available The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of 5-12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They are a necessary asset for conducting validation experiments and setting up validation reports when validating LR methods used in forensic evidence evaluation. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in (D. Meuwly, D. Ramos, R. Haraksim, [1]), although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, to train the generation of validation reports of forensic methods according to [1]. Alongside the data, a justification and motivation for the choice of methods is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validating the LR methods, which are used to calculate the LR values from the data, and for the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared, but these images do not constitute the core data for the validation, unlike the LRs, which are shared.
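
    One widely used performance metric for validating LR methods of this kind is the log-likelihood-ratio cost, Cllr, which penalizes small LRs for same-source comparisons and large LRs for different-source comparisons; values well below 1 indicate informative, well-calibrated LRs. The sketch below shows the standard Cllr formula; the LR lists are illustrative, not the article's data.

```python
import math

def cllr(same_source_lrs, diff_source_lrs):
    """Log-likelihood-ratio cost: 0.5 * (mean log2(1 + 1/LR_ss) + mean log2(1 + LR_ds))."""
    pen_ss = sum(math.log2(1 + 1 / lr) for lr in same_source_lrs) / len(same_source_lrs)
    pen_ds = sum(math.log2(1 + lr) for lr in diff_source_lrs) / len(diff_source_lrs)
    return 0.5 * (pen_ss + pen_ds)

same_source = [120.0, 45.0, 300.0, 8.0]   # LRs for mated comparisons (illustrative)
diff_source = [0.02, 0.3, 0.001, 0.05]    # LRs for non-mated comparisons (illustrative)
print(f"Cllr = {cllr(same_source, diff_source):.3f}")
```

    A completely uninformative system (all LRs equal to 1) yields Cllr = 1, which makes the metric easy to interpret as a validation criterion.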

  9. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets ( http://compms.org/RefData ) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  10. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Full Text Available Oxcarbazepine (OXC) is an important anticonvulsant and mood-stabilizing drug. A pharmacopoeial monograph for OXC is not yet available and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. The various parameters, such as linearity, precision, accuracy and specificity, were studied according to the International Conference on Harmonisation guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour. (The record also carries a Portuguese abstract duplicating the above.)

  11. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety

  12. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Melius, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.
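The GIS screening step described above can be sketched as a rule-based filter over roof-plane attributes. The thresholds below (maximum slope, azimuth window, insolation floor) are illustrative assumptions for the sketch, not NREL's published criteria:

```python
# Rule-based rooftop suitability filter (illustrative thresholds only).

def roof_plane_suitable(slope_deg, azimuth_deg, annual_insolation_kwh_m2):
    """Return True if a roof plane passes simple slope/orientation/sunlight rules.

    Assumed thresholds: slope <= 45 degrees, azimuth within the east-to-west
    arc (90-270 degrees, i.e. not north-facing in the northern hemisphere),
    and at least 800 kWh/m2/yr of modeled insolation.
    """
    if slope_deg > 45:
        return False
    if not 90 <= azimuth_deg <= 270:
        return False
    return annual_insolation_kwh_m2 >= 800

# A gently sloped roof facing due south with ample sun passes:
print(roof_plane_suitable(10, 180, 1100))   # True
```

In a real pipeline these rules would run per LiDAR-derived roof plane, and the passing planes' areas would be summed per building.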

  13. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    Science.gov (United States)

    2009-01-01

    Background: One of the major sources of human Salmonella infections is meat, so efficient and rapid monitoring of Salmonella in the meat production chain is necessary. Validation of alternative methods is needed to prove that their performance is equal to that of established methods, yet very few published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation, comprising comparative and collaborative trials based on the recommendations of the Nordic organization for validation of alternative microbiological methods (NordVal), of a same-day, non-commercial real-time PCR method for detection of Salmonella in meat and carcass swabs. Results: The comparative trial was performed against a reference method (NMKL-71:5, 1999) using artificially and naturally contaminated samples (60 minced veal and pork meat samples, 60 poultry neck-skins, and 120 pig carcass swabs). The relative accuracy was 99%, relative detection level 100%, relative sensitivity 103% and relative specificity 100%. The collaborative trial included six laboratories testing minced meat, poultry neck-skins, and carcass swabs as un-inoculated samples and samples artificially contaminated with 1–10 CFU/25 g and 10–100 CFU/25 g. Valid results were obtained from five of the laboratories and used for the statistical analysis. Apart from one non-inoculated sample reported as false positive by PCR at one laboratory, no false positive or false negative results were reported. Partly based on the results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and its performance compared with the BAX Salmonella test on 39 pork samples artificially contaminated with Salmonella; there was no significant difference in the results obtained by the two methods. Conclusion: The real-time PCR method for detection of Salmonella in meat
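The relative performance figures quoted above (relative accuracy 99%, relative sensitivity 103%) are ratios between the alternative and reference methods; a minimal sketch of that arithmetic, using hypothetical counts rather than the trial's raw data:

```python
# NordVal-style comparative-trial metrics (assumed formulas; consult the
# NordVal protocol for the authoritative definitions).

def relative_accuracy(agree, total):
    """Per cent of samples where the alternative and reference methods agree."""
    return 100.0 * agree / total

def relative_sensitivity(alt_pos, ref_pos):
    """Alternative-method positives as a per cent of reference positives.
    Values above 100 mean the alternative method found more positives."""
    return 100.0 * alt_pos / ref_pos

def relative_specificity(alt_neg, ref_neg):
    """Alternative-method negatives as a per cent of reference negatives."""
    return 100.0 * alt_neg / ref_neg

# Hypothetical counts for a 240-sample trial (not the study's data):
print(relative_accuracy(agree=238, total=240))        # ~99.2
print(relative_sensitivity(alt_pos=103, ref_pos=100)) # 103.0
```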

  14. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    Science.gov (United States)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished using analytical expressions of Zernike polynomials and a power spectral density model, it does not introduce the aliasing and interpolation errors produced by conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial filtering. The method also automatically eliminates measurement noise and other measurement errors such as artificial discontinuities. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specifications on the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match the gridded data format of a model validation tool, and (2) eliminating surface measurement noise and measurement errors so that the resulting surface height map is continuous and smoothly varying. So far, the preferred method for re-sampling a surface map has been two-dimensional interpolation, whose main problem is that the same pixel can take different values depending on the interpolation scheme chosen ("nearest," "linear," "cubic," or "spline" fitting in Matlab). The conventional, FFT-based spatial filtering method used to
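The analytic re-sampling idea can be sketched as fitting a small Zernike-style basis to the measured map by least squares and then evaluating the fitted expansion on the new grid. This is a toy sketch: only five low-order terms are included, and the full method additionally uses higher orders plus a PSD model for the residual, as the text describes:

```python
import numpy as np

def zernike_basis(x, y):
    """Piston, tip, tilt, defocus, and astigmatism terms evaluated at
    unit-disk-normalized coordinates (a deliberately truncated basis)."""
    r2 = x**2 + y**2
    return np.column_stack([np.ones_like(x),   # piston
                            x,                 # tip
                            y,                 # tilt
                            2 * r2 - 1,        # defocus
                            x**2 - y**2])      # astigmatism

def resample(x, y, z, x_new, y_new):
    """Fit the basis to measured heights z(x, y), then evaluate the fitted
    analytic expansion on the new grid -- no pixel-level interpolation."""
    coeffs, *_ = np.linalg.lstsq(zernike_basis(x, y), z, rcond=None)
    return zernike_basis(x_new, y_new) @ coeffs
```

Because the new grid samples an analytic expression rather than interpolating between pixels, the result is independent of any interpolation scheme.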

  15. Moving beyond Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul wrote that it is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as…

  16. Method validation for simultaneous counting of Total α , β in Drinking Water using Liquid Scintillation Counter

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.

    2014-05-01

    In this work, a pulse-shape-analysis method for determining gross alpha and beta emitters in drinking water with a WinSpectral 1414 liquid scintillation counter was validated. Validation parameters included the method detection limit, method quantitation limit, repeatability limit, intermediate precision, trueness (bias), recovery, linearity and uncertainty budget. The results show that the method detection limit and method quantitation limit were 0.07 and 0.24 Bq/l for alpha emitters and 0.42 and 1.4 Bq/l for beta emitters, respectively. The relative standard deviation of the repeatability limit reached 2.81% for alpha emitters and 3.96% for beta emitters, while the relative standard deviation of the intermediate precision was 0.54% for alpha emitters and 1.17% for beta emitters. The trueness was -7.7% for alpha emitters and -4.5% for beta emitters. Recovery ranged between 87-96% and 88-101% for alpha and beta emitters, respectively, and linearity reached 1 for both. The uncertainty budget was 96.65% and 83.14% for alpha and beta emitters, respectively. (author)
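Detection and quantitation limits and the repeatability RSD reported above follow standard definitions; a minimal sketch, assuming the common 3σ/10σ blank-based conventions (the laboratory's exact formulas may differ):

```python
import statistics

def detection_limit(blank_results):
    """Assumed convention: MDL = 3 x SD of replicate blank measurements."""
    return 3 * statistics.stdev(blank_results)

def quantitation_limit(blank_results):
    """Assumed convention: MQL = 10 x SD of replicate blank measurements."""
    return 10 * statistics.stdev(blank_results)

def repeatability_rsd(replicates):
    """Relative standard deviation (%) of replicate results on one sample."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

blanks = [0.02, 0.03, 0.01, 0.02, 0.03, 0.02]   # hypothetical blank readings, Bq/l
reps = [4.9, 5.1, 5.0, 4.8, 5.2]                # hypothetical replicate results, Bq/l
print(detection_limit(blanks), quantitation_limit(blanks), repeatability_rsd(reps))
```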

  17. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  18. Alternative method to validate the seasonal land cover regions of the conterminous United States

    Science.gov (United States)

    Zhiliang Zhu; Donald O. Ohlen; Raymond L. Czaplewski; Robert E. Burgan

    1996-01-01

    An accuracy assessment method involving double sampling and the multivariate composite estimator has been used to validate the prototype seasonal land cover characteristics database of the conterminous United States. The database consists of 159 land cover classes, classified using time series of 1990 1-km satellite data and augmented with ancillary data including...

  19. A validated RP-HPLC method for the determination of Irinotecan hydrochloride residues for cleaning validation in production area

    Directory of Open Access Journals (Sweden)

    Sunil Reddy

    2013-03-01

    Full Text Available Introduction: Cleaning validation is an integral part of current good manufacturing practices in the pharmaceutical industry. Its main purpose is to prove the effectiveness and consistency of cleaning of pharmaceutical production equipment and thereby prevent cross-contamination and adulteration of drug products with other active ingredients. Objective: A rapid, sensitive and specific reverse-phase HPLC method was developed and validated for the quantitative determination of irinotecan hydrochloride in cleaning validation swab samples. Method: The method used a Waters SymmetryShield RP-18 column (250 mm × 4.6 mm, 5 µm) with an isocratic mobile phase containing a mixture of 0.02 M potassium dihydrogen orthophosphate (pH adjusted to 3.5 with orthophosphoric acid), methanol and acetonitrile (60:20:20 v/v/v). The flow rate of the mobile phase was 1.0 mL/min, with a column temperature of 25°C, a detection wavelength of 220 nm and a sample injection volume of 100 µL. Results: The calibration curve was linear over the concentration range 0.024 to 0.143 µg/mL with a correlation coefficient of 0.997. The intra-day and inter-day precision, expressed as relative standard deviation, were below 3.2%. The recoveries obtained from stainless steel, PCGI, epoxy, glass and Dacron cloth surfaces were more than 85%, and there was no interference from the cotton swab. The detection limit (DL) and quantitation limit (QL) were 0.008 and 0.023 µg/mL, respectively. Conclusion: The developed method was validated with respect to specificity, linearity, limits of detection and quantification, accuracy, precision and solution stability. The overall procedure can be used as part of a cleaning validation program in the pharmaceutical manufacture of irinotecan hydrochloride.
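DL and QL figures like those above are commonly derived from the calibration line via the ICH Q2 relations DL = 3.3σ/S and QL = 10σ/S, where S is the slope and σ the residual standard deviation; a sketch with hypothetical swab-extract data, not the study's values:

```python
# Least-squares calibration line and ICH-style DL/QL estimates.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def residual_sd(x, y, slope, intercept):
    """Standard deviation of calibration residuals (n - 2 degrees of freedom)."""
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    return (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5

conc = [0.025, 0.05, 0.075, 0.10, 0.125]   # ug/mL, hypothetical standards
area = [1010, 2020, 3050, 4010, 5060]      # detector response, hypothetical
slope, intercept = fit_line(conc, area)
sigma = residual_sd(conc, area, slope, intercept)
dl, ql = 3.3 * sigma / slope, 10 * sigma / slope   # ICH Q2 relations
```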

  20. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays, as well as advances in method-engineering procedures that have improved the efficiency of the process. The initiative has also allowed researchers to overcome many of the barriers that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling

  1. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

    Workplace injuries are costing the Australian coal mining industry and its communities $410 million a year. This ACARP study aims to address that cost by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands, and an overall PEFA score between 1 and 4 is given, with 1 being the best score. The reliability and validity studies were conducted concurrently. The reliability study examined test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, sufficient for comparison with injury data in determining the validity of the assessment. The overall assessment score and material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open-cut coal mine with their injury records. A predictive relationship was found between PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between a PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury when compared to other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment method.

  2. Amphenicols stability in medicated feed – development and validation of liquid chromatography method

    Directory of Open Access Journals (Sweden)

    Pietro Wojciech Jerzy

    2014-12-01

    Full Text Available A liquid chromatography-ultraviolet detection method for the determination of florfenicol (FF) and thiamphenicol (TAP) in feeds is presented. The method comprises the extraction of analytes from the matrix with a mixture of methanol and acetonitrile, drying of the extract, and its dissolution in phosphate buffer. The analysis was performed with a gradient programme of the mobile phase, composed of acetonitrile and buffer (pH = 7.3), on a Zorbax Eclipse Plus C18 (150 × 4.6 mm, 5 μm) analytical column with UV (λ = 220 nm) detection. The analytical procedure was successfully adopted and validated for quantitative determination of florfenicol and thiamphenicol in feed samples. Sensitivity, specificity, linearity, repeatability, and intralaboratory reproducibility were included in the validation. The mean recovery of amphenicols was 93.5% within the working range of 50-4000 mg/kg. Simultaneous determination of chloramphenicol, which is banned in feed, was also included within the same procedure for the FF and TAP stability studies. Storing the medicated feed at room temperature for up to one month decreased the concentration of the investigated drugs by up to 45%. These findings are relevant to the successful provision of therapy to animals.

  3. Development and Statistical Validation of Spectrophotometric Methods for the Estimation of Nabumetone in Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    A. R. Rote

    2010-01-01

    Full Text Available Three new, simple, economic spectrophotometric methods were developed and validated for the estimation of nabumetone in bulk and tablet dosage form. The first method uses determination at the absorption maximum of 330 nm, the second uses the area under the curve in the wavelength range 326-334 nm, and the third uses the first-order derivative spectrum with a scaling factor of 4. Beer's law was obeyed in the concentration range 10-30 μg/mL for all three methods. The correlation coefficients were 0.9997, 0.9998 and 0.9998 for the absorption maximum, area under the curve and first-order derivative methods, respectively. Results of the analysis were validated statistically and by performing recovery studies, and the mean percent recoveries were found satisfactory for all three methods. The developed methods were also compared statistically using one-way ANOVA. The proposed methods have been successfully applied for the estimation of nabumetone in bulk and pharmaceutical tablet dosage form.
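The area-under-curve variant integrates absorbance over the stated window (326-334 nm) rather than reading a single wavelength; a trapezoidal-integration sketch with hypothetical absorbance readings:

```python
# Trapezoidal area-under-curve over a wavelength window (hypothetical data).

def area_under_curve(wavelengths, absorbances):
    """Trapezoidal area between successive (wavelength, absorbance) points."""
    area = 0.0
    for i in range(len(wavelengths) - 1):
        width = wavelengths[i + 1] - wavelengths[i]
        area += width * (absorbances[i] + absorbances[i + 1]) / 2.0
    return area

wl = [326, 328, 330, 332, 334]        # nm, the integration window from the text
ab = [0.30, 0.38, 0.42, 0.37, 0.29]   # hypothetical absorbance readings
print(area_under_curve(wl, ab))
```

The resulting areas are regressed against standard concentrations exactly as single-wavelength absorbances would be.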

  4. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    Science.gov (United States)

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although content validation of programs is mostly conducted with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post-marriage training program provided for newly married couples. Content validity is a preliminary step toward obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels formed. Altogether 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom were shared with the first panel; and 10 executive experts in the last panel, organized to evaluate the CVR, CVI and face validity of 57 educational objectives. Results: The raw content of the post-marriage program had been written by professional experts of the Ministry of Health; through the qualitative expert panel, the content was further developed by generating three topics and refining one topic and its respective content. In the second panel, six further objectives were deleted: three for falling below the agreement cut-off point and three by experts' consensus. In the quantitative assessment, the validity of all items was above 0.8 and their content validity indices (0.8–1) were completely appropriate. Conclusion: This study provides good evidence for validation and accreditation of the national post-marriage program planned for newly married couples in the country's health centers in the near future. PMID:26056672
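The CVR and CVI figures cited above follow standard content-validity formulas (Lawshe's content validity ratio and the proportion-based item CVI); a sketch with hypothetical panel votes, not the study's data:

```python
# Lawshe's content validity ratio (CVR) and item-level content validity
# index (CVI). Standard formulas; the votes below are hypothetical.

def cvr(n_essential, n_panelists):
    """CVR = (ne - N/2) / (N/2); ranges from -1 to +1."""
    half = n_panelists / 2.0
    return (n_essential - half) / half

def cvi(n_relevant, n_panelists):
    """Item CVI = proportion of panelists rating the item relevant."""
    return n_relevant / n_panelists

# A 10-expert panel, as in the study's last panel; votes are hypothetical:
print(cvr(n_essential=9, n_panelists=10))   # 0.8
print(cvi(n_relevant=9, n_panelists=10))    # 0.9
```

Items whose CVR falls below the tabulated cut-off for the panel size are the ones dropped "for being below the agreement cut-off point."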

  5. Validation of verbal autopsy methods using hospital medical records: a case study in Vietnam.

    Science.gov (United States)

    Tran, Hong Thi; Nguyen, Hoa Phuong; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-05-18

    Information on causes of death (COD) is crucial for measuring the health outcomes of populations and progress towards the Sustainable Development Goals. In many countries such as Vietnam where the civil registration and vital statistics (CRVS) system is dysfunctional, information on vital events will continue to rely on verbal autopsy (VA) methods. This study assesses the validity of VA methods used in Vietnam, and provides recommendations on methods for implementing VA validation studies in Vietnam. This validation study was conducted on a sample of 670 deaths from a recent VA study in Quang Ninh province. The study covered 116 cases from this sample, which met three inclusion criteria: a) the death occurred within 30 days of discharge after last hospitalisation, and b) medical records (MRs) for the deceased were available from respective hospitals, and c) the medical record mentioned that the patient was terminally ill at discharge. For each death, the underlying cause of death (UCOD) identified from MRs was compared to the UCOD from VA. The validity of VA diagnoses for major causes of death was measured using sensitivity, specificity and positive predictive value (PPV). The sensitivity of VA was at least 75% in identifying some leading CODs such as stroke, road traffic accidents and several site-specific cancers. However, sensitivity was less than 50% for other important causes including ischemic heart disease, chronic obstructive pulmonary diseases, and diabetes. Overall, there was 57% agreement between UCOD from VA and MR, which increased to 76% when multiple causes from VA were compared to UCOD from MR. Our findings suggest that VA is a valid method to ascertain UCOD in contexts such as Vietnam. Furthermore, within cultural contexts in which patients prefer to die at home instead of a healthcare facility, using the available MRs as the gold standard may be meaningful to the extent that recall bias from the interval between last hospital discharge and death
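The validity measures used in the study (sensitivity, specificity, positive predictive value) reduce to simple ratios over a 2×2 cross-tabulation of VA diagnoses against the MR gold standard; a minimal sketch with hypothetical counts, not the study's data:

```python
# Confusion-matrix validity measures for one cause of death
# (all counts below are hypothetical).

def sensitivity(tp, fn):
    """Proportion of true cases that VA correctly identified."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of non-cases that VA correctly ruled out."""
    return tn / (tn + fp)

def positive_predictive_value(tp, fp):
    """Proportion of VA-assigned cases that were true cases."""
    return tp / (tp + fp)

# e.g. stroke as UCOD: 24 of 30 true cases detected, 4 false alarms among 86 non-cases
print(sensitivity(24, 6))                  # 0.8
print(round(specificity(82, 4), 2))        # 0.95
print(round(positive_predictive_value(24, 4), 2))  # 0.86
```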

  6. Determination of Modafinil in Tablet Formulation Using Three New Validated Spectrophotometric Methods

    International Nuclear Information System (INIS)

    Basniwal, P.K.; Jain, D.; Basniwal, P.K.

    2014-01-01

    In this study, three new UV spectrophotometric methods, viz. the linear regression equation (LRE), standard absorptivity (SA) and first-order derivative (FOD) methods, were developed and validated for the determination of modafinil in tablet form. The Beer-Lambert law was obeyed over the linear range of 10-50 μg/mL, and all the methods were validated for linearity, accuracy, precision and robustness. The methods were successfully applied for assay of modafinil drug content in tablets, giving results in the ranges 100.20-100.42%, 100.11-100.58% and 100.25-100.34%, respectively, with acceptable standard deviation (less than two) for all methods. The validated spectrophotometric methods may be successfully applied for assay, dissolution studies, bio-equivalence studies and routine analysis in pharmaceutical industries. (author)

  7. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect conventional design elements and can effectively connect requirements with design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail, with the hope of providing experience to other civil jet product designs.

  8. Validated method for the analysis of goji berry, a rich source of zeaxanthin dipalmitate.

    Science.gov (United States)

    Karioti, Anastasia; Bergonzi, Maria Camilla; Vincieri, Franco F; Bilia, Anna Rita

    2014-12-31

    In the present study, an HPLC-DAD method was developed for the determination of the main carotenoid, zeaxanthin dipalmitate, in the fruits of Lycium barbarum. The aim was to develop and optimize an extraction protocol allowing fast, exhaustive, and repeatable extraction suitable for the labile carotenoid content. Use of liquid N2 allowed the grinding of the fruit. A step of ultrasonication with water efficiently removed the polysaccharides and enabled the exhaustive extraction of carotenoids with hexane/acetone 50:50. The assay was fast and simple and permitted the quality control of a large number of commercial samples, including fruits, juices, and a jam. The HPLC method was validated according to ICH guidelines and satisfied the requirements. Finally, the overall method was validated for precision (% RSD ranging between 3.81 and 4.13) and accuracy at three concentration levels. The recovery was between 94 and 107% with RSD values <2%, within the acceptable limits, especially if the difficulty of the matrix is taken into consideration.

  9. Validation of a Smartphone Image-Based Dietary Assessment Method for Pregnant Women

    Directory of Open Access Journals (Sweden)

    Amy M. Ashman

    2017-01-01

    Full Text Available Image-based dietary records could lower the participant burden associated with traditional prospective methods of dietary assessment. They have been used in children, adolescents and adults, but have not been evaluated in pregnant women. The current study evaluated the relative validity of the DietBytes image-based dietary assessment method for assessing energy and nutrient intakes. Pregnant women collected image-based dietary records (via a smartphone application) of all food, drinks and supplements consumed over three non-consecutive days. Intakes from the image-based method were compared to intakes collected from three 24-h recalls, taken on random days, once per week, in the weeks following the image-based record. Data were analyzed using nutrient analysis software, and agreement between methods was ascertained using Pearson correlations and Bland-Altman plots. Twenty-five women (27 recruited, one withdrew, one incomplete; median age 29 years, 15 primiparas, eight Aboriginal Australians) completed image-based records for analysis. Significant correlations between the two methods were observed for energy, macronutrients and fiber (r = 0.58–0.84, all p < 0.05), and for micronutrients both including (r = 0.47–0.94, all p < 0.05) and excluding (r = 0.40–0.85, all p < 0.05) supplements in the analysis. Bland-Altman plots confirmed acceptable agreement with no systematic bias. The DietBytes method demonstrated acceptable relative validity for the assessment of nutrient intakes of pregnant women.
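Agreement in studies like this one is typically summarised by the mean difference (bias) and the 95% limits of agreement from the Bland-Altman analysis; a minimal sketch with hypothetical intake data, not the study's values:

```python
import statistics

def bland_altman_limits(method_a, method_b):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 x SD of the
    paired differences) between two measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical energy intakes (kJ) from image records vs 24-h recalls:
images = [8200, 9100, 7800, 8800, 9400]
recalls = [8000, 9300, 7600, 9000, 9100]
bias, (lo, hi) = bland_altman_limits(images, recalls)
print(bias, lo, hi)
```

No systematic bias corresponds to a mean difference near zero with the zero line inside the limits of agreement.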

  10. Validation of Field Methods to Assess Body Fat Percentage in Elite Youth Soccer Players.

    Science.gov (United States)

    Munguia-Izquierdo, Diego; Suarez-Arrones, Luis; Di Salvo, Valter; Paredes-Hernandez, Victor; Alcazar, Julian; Ara, Ignacio; Kreider, Richard; Mendez-Villanueva, Alberto

    2018-05-01

    This study determined the most effective field method for quantifying body fat percentage in male elite youth soccer players and developed prediction equations based on anthropometric variables. Forty-four male elite-standard youth soccer players aged 16.3-18.0 years underwent body fat percentage assessments, including bioelectrical impedance analysis and the calculation of various skinfold-based prediction equations. Dual X-ray absorptiometry provided a criterion measure of body fat percentage. Correlation coefficients, bias, limits of agreement, and differences were used as validity measures, and regression analyses were used to develop soccer-specific prediction equations. The equations from Sarria et al. (1998) and Durnin & Rahaman (1967) reached very large correlations and the lowest biases, and they reached neither the practically worthwhile difference nor the substantial difference between methods. The new youth soccer-specific skinfold equation included a combination of triceps and supraspinale skinfolds. None of the practical methods compared in this study are adequate for estimating body fat percentage in male elite youth soccer players, except for the equations from Sarria et al. (1998) and Durnin & Rahaman (1967). The new youth soccer-specific equation calculated in this investigation is the only field method specifically developed and validated in elite male players, and it shows potentially good predictive power. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Keywords: Ketotifen, Cetirizine, Stability indicating method, Stressed conditions, Validation. A stability-indicating HPLC method is reported for ketotifen; methods for its determination in biological fluids [13] are also reported.

  12. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    Science.gov (United States)

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method can bring to the validation of a multiplex method for GMO detection (the DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and decision-making. Fuzzy logic allows the information obtained from independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper and applied to validation data for different genetically modified elements, and remarks are drawn on the analytical results. The fuzzy-logic-based rules were shown to improve the interpretation of results and facilitate the overall evaluation of the multiplex method.
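The aggregation idea can be sketched as mapping each validation statistic to a [0, 1] "favourable" membership and combining the memberships into one indicator. The linear membership shapes and equal weights below are illustrative assumptions, not the paper's calibrated fuzzy rules:

```python
# Minimal fuzzy-style aggregation of validation statistics (illustrative).

def membership(value, worst, best):
    """Linear membership: 0 at `worst`, 1 at `best`, clipped in between."""
    if best == worst:
        raise ValueError("best and worst must differ")
    m = (value - worst) / (best - worst)
    return max(0.0, min(1.0, m))

def aggregate(memberships, weights):
    """Weighted-average aggregation into one overall performance indicator."""
    total = sum(weights)
    return sum(m * w for m, w in zip(memberships, weights)) / total

# Hypothetical statistics for one GMO element: sensitivity, specificity, accuracy
stats = [membership(0.97, 0.8, 1.0),
         membership(0.97, 0.8, 1.0),
         membership(0.99, 0.8, 1.0)]
overall = aggregate(stats, weights=[1.0, 1.0, 1.0])
print(overall)
```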

  13. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4 in their gas mixture

    Directory of Open Access Journals (Sweden)

    Oman Zuas

    2016-09-01

    Full Text Available An accurate gas chromatography-flame ionization detection (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture revealed that each target component was well separated with high selectivity. The method was also found to be precise and accurate, and its linearity was high, with good correlation coefficients (R2 ≥ 0.999) for all target components. It can be concluded that the developed GC-FID method is reliable and suitable for the determination of light C2-C4 hydrocarbons (including ethylene, propane, propylene, isobutane, and n-butane) in their gas mixture. The validated method has been successfully applied to the estimation of light C2-C4 hydrocarbons in natural gas samples, showing high repeatability, with relative standard deviation (RSD) less than 1.0%, and good selectivity, with no interference from other possible components observed.

  14. Method development and validation of potent pyrimidine derivative by UV-VIS spectrophotometer.

    Science.gov (United States)

    Chaudhary, Anshu; Singh, Anoop; Verma, Prabhakar Kumar

    2014-12-01

    A rapid and sensitive ultraviolet-visible (UV-VIS) spectroscopic method was developed for the estimation of the pyrimidine derivative 6-bromo-3-(6-(2,6-dichlorophenyl)-2-(morpholinomethylamino)pyrimidin-4-yl)-2H-chromen-2-one (BT10M) in bulk form. The pyrimidine derivative was monitored at 275 nm with UV detection, with no interference from diluents at this wavelength. The method was found to be linear in the range of 50 to 150 μg/mL. The accuracy and precision were determined and validated statistically, and the method was validated according to guidelines. The results showed that the proposed method is suitable for the accurate, precise, and rapid determination of the pyrimidine derivative. Graphical Abstract: Method development and validation of a potent pyrimidine derivative by UV spectroscopy.

  15. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    Science.gov (United States)

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    A validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of pre-determined criteria such as normality and homoscedasticity was also checked. To decide whether there are any trends in the time-variation of the analytical signal, the Neumann test of trend was also applied and evaluated. Finally, a comparison with similar validation data of the ETV-ICP-OES method was carried out.
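The Neumann test of trend compares the mean square successive difference of a series to its variance; values near 2 are consistent with random scatter, while markedly smaller values indicate a trend. A sketch of the test statistic (significance still requires comparison against tabulated critical values):

```python
# Von Neumann ratio, the statistic behind the Neumann test of trend.

def von_neumann_ratio(series):
    """Ratio of the mean square successive difference to the variance.
    E[ratio] is about 2 for a trend-free random series; ratios well
    below 2 suggest a trend in the time-ordered data."""
    n = len(series)
    mean = sum(series) / n
    mssd = sum((series[i + 1] - series[i]) ** 2 for i in range(n - 1)) / (n - 1)
    var = sum((x - mean) ** 2 for x in series) / (n - 1)
    return mssd / var

# A strongly trending calibration signal gives a small ratio:
print(von_neumann_ratio([1.0, 2.0, 3.0, 4.0, 5.0]))
```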

  16. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the degree of lumbar lordosis was measured via the AutoCAD software and flexible ruler methods. The study was carried out in two parts: intratester and intertester evaluations of the reliability, as well as the validity, of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method of measuring lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  17. Session-RPE Method for Training Load Monitoring: Validity, Ecological Usefulness, and Influencing Factors

    Directory of Open Access Journals (Sweden)

    Monoem Haddad

    2017-11-01

    Full Text Available Purpose: The aim of this review is to (1) retrieve all data validating the session rating of perceived exertion (session-RPE) method using various criteria, (2) highlight the rationale of this method and its ecological usefulness, and (3) describe factors that can alter RPE and that users of this method should take into consideration. Method: Search engines such as the SPORTDiscus, PubMed, and Google Scholar databases were consulted for English-language studies on the validity and usefulness of the session-RPE method published between 2001 and 2016. Studies were considered for further analysis when they used the session-RPE method proposed by Foster et al. in 2001. Participants were athletes of any gender, age, or level of competition. Studies in languages other than English were excluded from the analysis of the validity and reliability of the session-RPE method. Other studies were examined to explain the rationale of the session-RPE method and the origin of RPE. Results: A total of 950 studies cited the Foster et al. study that proposed the session-RPE method; 36 of these examined the validity and reliability of the method using the modified CR-10 scale. Conclusion: These studies confirmed the validity, good reliability, and internal consistency of the session-RPE method in several sports and physical activities, with men and women of different age categories (children, adolescents, and adults) and various expertise levels. This method can be used as a stand-alone method for training load (TL) monitoring purposes, although some recommend combining it with other physiological parameters such as heart rate.
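Foster's session-RPE method itself is arithmetically simple: training load in arbitrary units (AU) is the CR-10 rating, taken shortly after the session, multiplied by session duration in minutes. A minimal sketch with hypothetical session data:

```python
def session_rpe_load(rpe, duration_min):
    """Foster's session-RPE training load (AU): CR-10 rating x duration (min)."""
    if not 0 <= rpe <= 10:
        raise ValueError("CR-10 RPE must be between 0 and 10")
    return rpe * duration_min

# Hypothetical training week: (CR-10 RPE, minutes) per session
week = [(6, 60), (4, 45), (7, 90), (3, 30), (8, 75)]
loads = [session_rpe_load(r, d) for r, d in week]
print("session loads:", loads)           # [360, 180, 630, 90, 600]
print("weekly load:", sum(loads), "AU")  # 1860 AU
```

Derived quantities such as weekly monotony and strain are built from these per-session loads in the same framework.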

  18. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    OpenAIRE

    Oman Zuas; Harry budiman; Muhammad Rizky Mulyana

    2016-01-01

    An accurate gas chromatography coupled to a flame ionization detector (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of gas mixture revealed that each target comp...

  19. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  20. Validated RP-HPLC Method for Quantification of Phenolic ...

    African Journals Online (AJOL)

    Purpose: To evaluate the total phenolic content and antioxidant potential of the methanol extracts of aerial parts and roots of Thymus sipyleus Boiss and also to determine some phenolic compounds using a newly developed and validated reversed phase high performance liquid chromatography (RP-HPLC) method.

  1. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked plasma method for the extraction of diclofenac potassium that reduces the number of treatments with plasma sample, in order to minimize human error. Method: Instead of solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  2. Tidal breath eNO measurements in a cohort of unsedated hospitalized neonates-A method validation

    DEFF Research Database (Denmark)

    Schmidt, Birgitte J; Reim, Pauline S; Jensen, Andreas K

    2018-01-01

    to validate clinically feasible longitudinal online tidal eNO and V'NO in a real-life birth cohort of un-sedated, hospitalized preterm, and term neonates. METHOD: We included 149 newborns, GA 28-42 weeks. Each scheduled for six repeated, non-invasive, on-line eNO measurements with Ecomedics CLD 88sp and NO...

  3. Rationale and methods of the European Food Consumption Validation (EFCOVAL) Project

    NARCIS (Netherlands)

    Boer, de E.J.; Slimani, N.; Boeing, H.; Feinberg, M.; Leclerq, C.; Trolle, E.; Amiano, P.; Andersen, L.F.; Freisling, H.; Geelen, A.; Harttig, U.; Huybrechts, I.; Kaic-Rak, A.; Lafay, L.; Lillegaard, I.T.L.; Ruprich, J.; Vries, de J.H.M.; Ocke, M.C.

    2011-01-01

    Background/Objectives: The overall objective of the European Food Consumption Validation (EFCOVAL) Project was to further develop and validate a trans-European food consumption method to be used for the evaluation of the intake of foods, nutrients and potentially hazardous chemicals within the

  4. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
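As a rough illustration of the total-error idea discussed above, a naive point-estimate check combines bias and precision against a single acceptance limit. This is a sketch only; it is not the generalized pivotal quantity or β-content tolerance-interval procedure, which are precisely the more rigorous alternatives the paper studies, and the 15% limit is a hypothetical choice:

```python
import statistics

def total_error_check(measured, true_value, acceptance_limit_pct=15.0):
    """Point-estimate total-error check: |bias%| + 2*CV% <= acceptance limit.
    Illustrative only; it does not control consumer's risk the way
    tolerance-interval or GPQ-based tests do."""
    mean = statistics.mean(measured)
    bias_pct = abs(mean - true_value) / true_value * 100
    cv_pct = statistics.stdev(measured) / mean * 100
    total_error = bias_pct + 2 * cv_pct
    return total_error, total_error <= acceptance_limit_pct

# Hypothetical replicate results for a sample with true value 100
te, ok = total_error_check([98.5, 101.2, 99.8, 100.4, 98.9, 100.9], 100.0)
print(f"total error = {te:.2f}%, acceptable: {ok}")
```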

  5. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    Science.gov (United States)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-07-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, while "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.
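One performance metric widely used when validating LR methods of this kind is the log-likelihood-ratio cost (Cllr), which scores a set of LRs computed for known same-source and known different-source comparisons. The sketch below uses hypothetical LR values; whether Cllr is the metric a given validation report adopts depends on the protocol:

```python
import math

def cllr(lrs_same_source, lrs_diff_source):
    """Log-likelihood-ratio cost. Lower is better; Cllr = 1 for a system
    that always outputs LR = 1 (no information), Cllr < 1 means the
    LR method provides useful evidential information."""
    pen_ss = sum(math.log2(1 + 1 / lr) for lr in lrs_same_source) / len(lrs_same_source)
    pen_ds = sum(math.log2(1 + lr) for lr in lrs_diff_source) / len(lrs_diff_source)
    return 0.5 * (pen_ss + pen_ds)

# Hypothetical LRs from validation comparisons with known ground truth
ss = [120.0, 45.0, 8.0, 300.0]   # same-source pairs: LRs should be >> 1
ds = [0.02, 0.5, 0.001, 0.1]     # different-source pairs: LRs should be << 1
print(f"Cllr = {cllr(ss, ds):.3f}")
```

Cllr penalizes both misleading direction and miscalibrated magnitude of the LRs, which is why it is a natural candidate metric in an LR validation protocol.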

  6. Content validity of methods to assess malnutrition in cancer patients: a systematic review

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Ottery, Faith D.; van der Schans, Cees; Roodenburg, Jan L N; Jager-Wittenaar, Harriët

    Rationale: Inadequate operationalisation of the multidimensional concept of malnutrition may result in inadequate evaluation of nutritional status. In this review, we aimed to assess the content validity of methods

  7. Development and Validation of a Liquid Chromatographic Method ...

    African Journals Online (AJOL)

    A liquid chromatographic method for the simultaneous determination of six human immunodeficiency virus (HIV) protease inhibitors, indinavir, saquinavir, ritonavir, amprenavir, nelfinavir and lopinavir, was developed and validated. Optimal separation was achieved on a PLRP-S 100 Å, 250 x 4.6 mm I.D. column maintained ...

  8. A Validated Method for the Quality Control of Andrographis paniculata Preparations.

    Science.gov (United States)

    Karioti, Anastasia; Timoteo, Patricia; Bergonzi, Maria Camilla; Bilia, Anna Rita

    2017-10-01

    Andrographis paniculata is a herbal drug of Asian traditional medicine largely employed for the treatment of several diseases. Recently, it has been introduced in Europe for the prophylactic and symptomatic treatment of the common cold and as an ingredient of dietary supplements. The active principles are diterpenes, with andrographolide as the main representative. In the present study, an analytical protocol was developed for the determination of the main constituents in the herb and preparations of A. paniculata. Three different extraction protocols (methanol extraction using a modified Soxhlet procedure, maceration under ultrasonication, and decoction) were tested. Ultrasonication achieved the highest content of analytes. HPLC conditions were optimized in terms of solvent mixtures, time course, and temperature. A reversed-phase C18 column, eluted at 30 °C with a gradient system consisting of acetonitrile and acidified water and including an isocratic step, was used. The HPLC method was validated for linearity, limits of quantitation and detection, repeatability, precision, and accuracy. The overall method was validated for precision and accuracy over at least three different concentration levels. Relative standard deviation was less than 1.13%, whereas recovery was between 95.50% and 97.19%. The method also proved to be suitable for the determination of a large number of commercial samples and was proposed to the European Pharmacopoeia for the quality control of Andrographidis herba. Georg Thieme Verlag KG Stuttgart · New York.

  9. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  10. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
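The instability the authors describe can be reproduced in miniature: fit a Lasso by coordinate descent, choose the penalty by 10-fold cross-validation, and repeat with different random fold splits. The sketch below uses simulated data with sparse, weak signals; the exact counts depend on the seed and data, and typically differ between runs, which is the phenomenon in question:

```python
import numpy as np

def soft(a, t):
    """Soft-thresholding operator used in the Lasso coordinate update."""
    return np.sign(a) * np.maximum(np.abs(a) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=60):
    """Coordinate-descent Lasso: min (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    col_ss = (X ** 2).mean(axis=0)        # (1/n)||x_j||^2 per column
    b = np.zeros(p)
    r = y.copy()                          # residual for b = 0
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]           # remove column j's contribution
            b[j] = soft(X[:, j] @ r / n, lam) / col_ss[j]
            r -= X[:, j] * b[j]
    return b

def cv_selected_count(X, y, lams, m, rng):
    """Pick lam by m-fold CV, then count variables selected on the full data."""
    n = X.shape[0]
    folds = np.array_split(rng.permutation(n), m)
    cv_mse = []
    for lam in lams:
        errs = []
        for k in range(m):
            test = folds[k]
            train = np.concatenate([folds[i] for i in range(m) if i != k])
            b = lasso_cd(X[train], y[train], lam)
            errs.append(np.mean((y[test] - X[test] @ b) ** 2))
        cv_mse.append(np.mean(errs))
    best = lams[int(np.argmin(cv_mse))]
    return int(np.count_nonzero(lasso_cd(X, y, best)))

rng = np.random.default_rng(0)
n, p = 60, 30
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:4] = 0.3                            # sparse truth with weak signals
y = X @ beta + rng.standard_normal(n)

lams = [0.05, 0.1, 0.2, 0.4]
counts = [cv_selected_count(X, y, lams, m=10, rng=rng) for _ in range(4)]
print("variables selected across repeated 10-fold CV runs:", counts)
```

This is plain Lasso rather than SCAD or the Adaptive Lasso, but the randomness enters the same way: through the fold assignment used to pick the smoothing parameter.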

  11. Method validation for determination of heavy metals in wine and slightly alcoholic beverages by ICP-MS

    International Nuclear Information System (INIS)

    Voica, Cezara; Dehelean, Adriana; Pamula, A

    2009-01-01

    The Organisation Internationale de la Vigne et du Vin (OIV) has fixed maximum levels for some heavy metals in wine. Consequently, there is a need to determine the very low concentrations of elements that may be present in wine at trace and ultra-trace levels. Inductively coupled plasma mass spectrometry (ICP-MS) is considered an excellent tool for detailed characterization of the elemental composition of many samples, including beverage samples. In this study, a method for the quantitative determination of toxic metals (Cr, As, Cd, Ni, Hg, Pb) in wines and slightly alcoholic beverages by ICP-MS was validated. Several parameters were taken into account and evaluated for the validation of the method, namely: linearity, the minimum detection limit, the limit of quantification, accuracy, and uncertainty.

  12. Method validation for determination of heavy metals in wine and slightly alcoholic beverages by ICP-MS

    Energy Technology Data Exchange (ETDEWEB)

    Voica, Cezara; Dehelean, Adriana; Pamula, A, E-mail: cezara.voica@itim-cj.r [National Institute for Research and Development of Isotopic and Molecular Technologies, 65-103 Donath, 400293 Cluj-Napoca (Romania)

    2009-08-01

    The Organisation Internationale de la Vigne et du Vin (OIV) has fixed maximum levels for some heavy metals in wine. Consequently, there is a need to determine the very low concentrations of elements that may be present in wine at trace and ultra-trace levels. Inductively coupled plasma mass spectrometry (ICP-MS) is considered an excellent tool for detailed characterization of the elemental composition of many samples, including beverage samples. In this study, a method for the quantitative determination of toxic metals (Cr, As, Cd, Ni, Hg, Pb) in wines and slightly alcoholic beverages by ICP-MS was validated. Several parameters were taken into account and evaluated for the validation of the method, namely: linearity, the minimum detection limit, the limit of quantification, accuracy, and uncertainty.

  13. New clinical validation method for automated sphygmomanometer: a proposal by Japan ISO-WG for sphygmomanometer standard.

    Science.gov (United States)

    Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio

    2007-12-01

    Owing to fast or stepwise cuff deflation, or measurement at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, that is, two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing its immunity to LD and BPV with that of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N = 120, 45 ± 15.3 years (mean ± SD)]. An oscillometric automated monitor, the Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting their higher reproducibility of validation. The SD1 by method 2 significantly correlated with the participant's BP (P = 0.004), supporting our hypothesis that the increased SD of device error with method 2 is at least partially caused by essential BPV. Method 3 showed a significantly smaller inter-participant SD of device error (SD2; P = 0.0044), suggesting its higher inter-participant consistency of validation. Among the methods for validating the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest inter-participant consistency, can be proposed as the most appropriate.

  14. Determination of vitamin C in foods: current state of method validation.

    Science.gov (United States)

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern for both consumers and quality control agencies. However, the heterogeneity of food matrices and the potential degradation of this vitamin during analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and drawbacks of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with typical values between 81% and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.
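The recovery figures quoted (81-109%) come from spiking experiments: a known amount of analyte is added to a sample, and recovery is the fraction of that added amount found by the method. A minimal sketch with hypothetical numbers:

```python
def percent_recovery(spiked_result, unspiked_result, amount_added):
    """Spike recovery (%): fraction of a known added amount found by the method."""
    return (spiked_result - unspiked_result) / amount_added * 100.0

# Hypothetical example: a juice sample with 35.2 mg/100 g native vitamin C
# is spiked with 20.0 mg/100 g; the method then measures 54.0 mg/100 g.
rec = percent_recovery(54.0, 35.2, 20.0)
print(f"recovery = {rec:.1f}%")  # 94.0%
```

Recoveries near 100% across several fortification levels are the usual evidence of trueness in this kind of validation.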

  15. Method validation using weighted linear regression models for quantification of UV filters in water samples.

    Science.gov (United States)

    da Silva, Claudia Pereira; Emídio, Elissandro Soares; de Marchi, Mary Rosa Rodrigues

    2015-01-01

    This paper describes the validation of a method consisting of solid-phase extraction followed by gas chromatography-tandem mass spectrometry for the analysis of the ultraviolet (UV) filters benzophenone-3, ethylhexyl salicylate, ethylhexyl methoxycinnamate and octocrylene. The method validation criteria included evaluation of selectivity, analytical curve, trueness, precision, limits of detection and limits of quantification. The non-weighted linear regression model has traditionally been used for calibration, but it is not necessarily the optimal model in all cases. Because the assumption of homoscedasticity was not met for the analytical data in this work, a weighted least squares linear regression was used for the calibration method. The evaluated analytical parameters were satisfactory for the analytes and showed recoveries at four fortification levels between 62% and 107%, with relative standard deviations less than 14%. The detection limits ranged from 7.6 to 24.1 ng L⁻¹. The proposed method was used to determine the amount of UV filters in water samples from water treatment plants in Araraquara and Jau in São Paulo, Brazil. Copyright © 2014 Elsevier B.V. All rights reserved.
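A weighted least-squares calibration line of the kind used here can be computed in closed form. The sketch below uses hypothetical calibration data and the common empirical 1/x² weighting; the paper derives its weights from the observed variance structure, so this is illustrative only:

```python
def weighted_linear_fit(x, y, w):
    """Weighted least squares for y = a + b*x with weights w (e.g. 1/x^2 or
    1/s_i^2), which down-weights the noisier high-concentration points
    when the data are heteroscedastic."""
    sw = sum(w)
    swx = sum(wi * xi for wi, xi in zip(w, x))
    swy = sum(wi * yi for wi, yi in zip(w, y))
    swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * swxy - swx * swy) / (sw * swxx - swx ** 2)
    a = (swy - b * swx) / sw
    return a, b

# Hypothetical calibration: concentration (ng/L) vs. response ratio
x = [25.0, 50.0, 100.0, 250.0, 500.0]
y = [0.130, 0.252, 0.498, 1.260, 2.510]
w = [1.0 / xi ** 2 for xi in x]          # empirical 1/x^2 weighting
a, b = weighted_linear_fit(x, y, w)
print(f"intercept = {a:.4f}, slope = {b:.5f}")
```

With homoscedastic data and unit weights the formulas reduce to ordinary least squares, which is why weighting is a generalization rather than a different model.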

  16. [Data validation methods and discussion on Chinese materia medica resource survey].

    Science.gov (United States)

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilot surveys. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. To ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method of ensuring the validity, integrity, and accuracy of census data. This paper comprehensively introduces the data validation system of the database for the fourth national survey of Chinese materia medica resources, and further improves the design ideas and programs of data validation. The purpose of this study is to help the survey work proceed smoothly.

  17. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  18. In Vitro Dissolution Profile of Dapagliflozin: Development, Method Validation, and Analysis of Commercial Tablets

    Directory of Open Access Journals (Sweden)

    Rafaela Zielinski Cavalheiro de Meira

    2017-01-01

    Full Text Available Dapagliflozin was the first of its class (inhibitors of the sodium-glucose cotransporter) to be approved in Europe, the USA, and Brazil. As the drug was recently approved, there is a need for research on analytical methods, including dissolution studies, for the quality evaluation and assurance of the tablets. The dissolution methodology was developed with apparatus II (paddle), 900 mL of medium (simulated gastric fluid, pH 1.2), temperature set at 37 ± 0.5 °C, and a stirring speed of 50 rpm. For quantification, a spectrophotometric (λ = 224 nm) method was developed and validated. In the validation studies, the method proved to be specific and linear in the range from 0.5 to 15 μg·mL−1 (r² = 0.998). The precision results showed RSD values lower than 2%. Recoveries of 80.72, 98.47, and 119.41% proved the accuracy of the method. Through a systematic approach applying a 2³ factorial design, the robustness of the method was confirmed (p > 0.05). Studies of commercial tablets containing 5 or 10 mg demonstrated that they could be considered similar through f1, f2, and dissolution efficiency analyses. The developed method can be used for the quality evaluation of dapagliflozin tablets and can be considered a scientific basis for future official pharmacopoeial methods.
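The f1 and f2 values mentioned are the standard difference and similarity factors for comparing two dissolution profiles at common time points. A minimal sketch with hypothetical dissolution data:

```python
import math

def f1_difference(ref, test):
    """Difference factor f1 (%): values of roughly 0-15 indicate similarity."""
    return sum(abs(r - t) for r, t in zip(ref, test)) / sum(ref) * 100.0

def f2_similarity(ref, test):
    """Similarity factor f2: values of 50-100 indicate similar profiles
    (f2 = 100 for identical profiles)."""
    n = len(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n  # mean squared difference
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# Hypothetical % dissolved at common time points for reference and test tablets
ref = [18.0, 39.0, 62.0, 81.0, 93.0]
test = [16.0, 41.0, 59.0, 83.0, 91.0]
print(f"f1 = {f1_difference(ref, test):.1f}, f2 = {f2_similarity(ref, test):.1f}")
```

Regulatory guidance attaches conditions to using f2 (e.g. limits on the number of points after 85% dissolution), so the bare formulas above are only the core of the comparison.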

  19. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Full Text Available Abstract Background: Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction, and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (a pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable 5-step pre-validation PRO instrument methodology. Methods: The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: (1) generation of a pool of items; (2) item de-duplication (three phases); (3) item reduction (two phases); (4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. the ICF); and (5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results: The AGQ 'item pool' contained 725 items. Three de-duplication phases resulted in reductions of 91, 225, and 48 items, respectively. The two item-reduction phases discarded 70 and 208 items, respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration ('think aloud' study) resulted in the removal of a further 15 items and refinement of the wording of others. The resulting draft AGQ contained 68 items. Conclusions: This study presents a novel methodology for developing a PRO instrument based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument. By systematically accounting for all items dropped

  20. Validity and reliability of the session-RPE method for quantifying training load in karate athletes.

    Science.gov (United States)

    Tabben, M; Tourny, C; Haddad, M; Chaabane, H; Chamari, K; Coquart, J B

    2015-04-24

    To test the construct validity and reliability of the session rating of perceived exertion (sRPE) method by examining the relationship between RPE and physiological parameters (heart rate, HR, and blood lactate concentration, [La−]) and the correlations between sRPE and two HR-based methods for quantifying internal training load (Banister's method and Edwards's method) during a karate training camp. Eighteen elite karate athletes, ten men (age: 24.2 ± 2.3 y, body mass: 71.2 ± 9.0 kg, body fat: 8.2 ± 1.3%, height: 178 ± 7 cm) and eight women (age: 22.6 ± 1.2 y, body mass: 59.8 ± 8.4 kg, body fat: 20.2 ± 4.4%, height: 169 ± 4 cm), were included in the study. During the training camp, subjects participated in eight karate training sessions covering three training modes (4 tactical-technical, 2 technical-development, and 2 randori training), during which RPE, HR, and [La−] were recorded. Significant correlations were found between RPE and the physiological parameters (percentage of maximal HR: r = 0.75, 95% CI = 0.64-0.86; [La−]: r = 0.62, 95% CI = 0.49-0.75) and between sRPE and the HR-based methods for quantifying training load (r = 0.65-0.95), with good reliability of the same intensity across training sessions (Cronbach's α = 0.81, 95% CI = 0.61-0.92). This study demonstrates that the sRPE method is valid for quantifying internal training load and intensity in karate.
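Of the two HR-based comparators named above, Edwards's summated-heart-rate-zone method is the simpler: minutes spent in five %HRmax zones are weighted 1 through 5 and summed. A minimal sketch with hypothetical zone times (the zone boundaries are the conventional 50-60% through 90-100% HRmax bands):

```python
def edwards_trimp(minutes_in_zone):
    """Edwards's summated HR-zone training load: time (min) in the five
    %HRmax zones (50-60, 60-70, 70-80, 80-90, 90-100), weighted 1-5."""
    if len(minutes_in_zone) != 5:
        raise ValueError("expected minutes for exactly 5 HR zones")
    return sum(w * m for w, m in zip(range(1, 6), minutes_in_zone))

# Hypothetical 60-min karate session: mostly moderate, some high intensity
print(edwards_trimp([5, 10, 20, 15, 10]))  # 5 + 20 + 60 + 60 + 50 = 195 AU
```

Correlating such HR-zone loads with session-RPE loads per athlete is essentially the construct-validity analysis the study reports.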

  1. A Rapid, Simple, and Validated RP-HPLC Method for Quantitative Analysis of Levofloxacin in Human Plasma

    Directory of Open Access Journals (Sweden)

    Dion Notario

    2017-04-01

    Full Text Available To conduct a bioequivalence study of a copy product of levofloxacin (LEV), a simple and validated analytical method was needed, but previously developed methods were too complicated. For this reason, a simple and rapid high-performance liquid chromatography method was developed and validated for LEV quantification in human plasma. Chromatographic separation was performed under isocratic elution on a Luna Phenomenex® C18 (150 × 4.6 mm, 5 µm) column. The mobile phase comprised acetonitrile, methanol, and 25 mM phosphate buffer adjusted to pH 3.0 (13:7:80, v/v/v), pumped at a flow rate of 1.5 mL/min. Detection was performed with a UV detector at a wavelength of 280 nm. Samples were prepared by adding acetonitrile followed by centrifugation to precipitate plasma protein, and then successively by evaporation and reconstitution steps. The optimized method meets the requirements of the validation parameters, which included linearity (r = 0.995), sensitivity (LLOQ and LOD of 1.77 and 0.57 µg/mL, respectively), accuracy (%error ≤ 12% above the LLOQ and ≤ 20% at the LLOQ), precision (RSD ≤ 9%), and robustness over the range of 1.77-28.83 µg/mL. Therefore, the method can be used for routine analysis of LEV in human plasma as well as in bioequivalence studies of LEV.

  2. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

    Biogas produced from digesters or landfills contains impurities which can be harmful to components in contact with the biogas during its utilization. Among these, siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on heated surfaces in combustion equipment. Silicon dioxide is a solid compound and will remain in the engine and cause damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogas. In the first part of this report, a method for the analysis of siloxanes in biogas was validated. Sampling was performed directly at the plant by drawing a small volume of biogas through an adsorbent tube over a short period of time. These tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for purpose: the method, as used by the laboratory generating the data, must provide data that meet a set of criteria concerning precision and accuracy. Finally, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected at waste-water treatment plants, co-digestion plants, and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from waste-water treatment plants contained considerably higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low, regardless of which feedstock was digested and which upgrading technique was used.

  3. A new method for assessing content validity in model-based creation and iteration of eHealth interventions.

    Science.gov (United States)

    Kassam-Adams, Nancy; Marsac, Meghan L; Kohser, Kristen L; Kenardy, Justin A; March, Sonja; Winston, Flaura K

    2015-04-15

    The advent of eHealth interventions to address psychological concerns and health behaviors has created new opportunities, including the ability to optimize the effectiveness of intervention activities and then deliver these activities consistently to a large number of individuals in need. Given that eHealth interventions grounded in a well-delineated theoretical model for change are more likely to be effective, and that eHealth interventions can be costly to develop, assuring the match of final intervention content and activities to the underlying model is a key step. We propose to apply the concept of "content validity" as a crucial checkpoint to evaluate the extent to which proposed activities in an eHealth intervention program are valid (e.g., relevant and likely to be effective) for the specific mechanism of change each is intended to target and for the intended target population. The aims of this paper are to define content validity as it applies to model-based eHealth intervention development, to present a feasible method for assessing content validity in this context, and to describe the implementation of this new method during the development of a Web-based intervention for children. We designed a practical 5-step method for assessing content validity in eHealth interventions: (1) defining key intervention targets, (2) delineating intervention activity-target pairings, (3) identifying experts, (4) using a survey tool to gather expert ratings of the relevance of each activity to its intended target, its likely effectiveness in achieving that target, and its appropriateness for the specific intended audience, and (5) using quantitative and qualitative results to identify intervention activities that may need modification. We applied this method during our development of the Coping Coach Web-based intervention for school-age children. In the evaluation of Coping Coach content validity, 15 experts from five countries

  4. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
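
The Youden (Youden-Steiner) layout described above, seven factors varied over eight runs, is a 2^(7-4) fractional factorial: each factor's effect is the difference between the mean result of its four high-level runs and its four low-level runs. A sketch with invented radon results:

```python
import numpy as np

# 2^(7-4) fractional factorial (Youden-Steiner ruggedness layout):
# 7 analytical conditions, 8 etching runs; +1 = high level, -1 = low level.
design = np.array([
    [+1, +1, +1, +1, +1, +1, +1],
    [+1, +1, -1, +1, -1, -1, -1],
    [+1, -1, +1, -1, +1, -1, -1],
    [+1, -1, -1, -1, -1, +1, +1],
    [-1, +1, +1, -1, -1, +1, -1],
    [-1, +1, -1, -1, +1, -1, +1],
    [-1, -1, +1, +1, -1, -1, +1],
    [-1, -1, -1, +1, +1, +1, -1],
])

# Hypothetical indoor radon results (Bq/m^3) from the 8 runs -- illustrative only.
results = np.array([102.0, 98.5, 101.2, 99.8, 97.9, 100.4, 103.1, 99.1])

# Effect of each factor = mean(4 runs at high) - mean(4 runs at low);
# a large |effect| flags a condition the method is sensitive to.
effects = design.T @ results / 4.0
for i, e in enumerate(effects, start=1):
    print(f"factor {i}: effect = {e:+.2f}")
```

The balanced design means each effect estimate averages out the other six conditions, which is why eight runs suffice for seven factors.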

  5. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  6. Validation of a multi-residue method for the determination of several antibiotic groups in honey by LC-MS/MS.

    Science.gov (United States)

    Bohm, Detlef A; Stachel, Carolin S; Gowik, Petra

    2012-07-01

    The presented multi-method was developed for the confirmation of 37 antibiotic substances from six antibiotic groups: macrolides, lincosamides, quinolones, tetracyclines, pleuromutilins, and diaminopyrimidine derivatives. All substances were analysed simultaneously in a single analytical run with the same procedure, including extraction with buffer, clean-up by solid-phase extraction, and measurement by liquid chromatography-tandem mass spectrometry in ESI+ mode. The method was validated on the basis of an in-house validation concept with a factorial design combining seven factors to check robustness in a concentration range of 5-50 μg kg(-1). The honeys used were of different types with regard to colour and origin. The values calculated for the validation parameters (decision limit CCα: 7.5-12.9 μg kg(-1); detection capability CCβ: 9.4-19.9 μg kg(-1); within-laboratory reproducibility RSD(wR): up to 21.4% for tylvalosin; repeatability RSD(r): up to 21.1% for tylvalosin; recovery: 92-106%) were acceptable and in agreement with the criteria of Commission Decision 2002/657/EC. The validation results showed that the method is applicable to the residue analysis of antibiotics in honey for substances with and without recommended concentrations, although some changes had been tested during validation to determine the robustness of the method.
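
For a substance with a recommended or permitted concentration, Commission Decision 2002/657/EC defines CCα and CCβ from the within-laboratory reproducibility SD at that level. A minimal sketch with illustrative numbers, not the paper's:

```python
# Decision limit (CCalpha) and detection capability (CCbeta) per
# Commission Decision 2002/657/EC, permitted-limit case (alpha = beta = 5%).

def cc_alpha(limit, sd_reprod):
    """CCalpha = permitted limit + 1.64 * within-lab reproducibility SD."""
    return limit + 1.64 * sd_reprod

def cc_beta(limit, sd_reprod):
    """CCbeta = CCalpha + 1.64 * within-lab reproducibility SD."""
    return cc_alpha(limit, sd_reprod) + 1.64 * sd_reprod

limit, sd = 10.0, 1.5   # ug/kg -- hypothetical recommended concentration and SD
print(f"CCalpha = {cc_alpha(limit, sd):.1f} ug/kg")   # 12.5
print(f"CCbeta  = {cc_beta(limit, sd):.1f} ug/kg")    # 14.9
```

A result above CCα confirms non-compliance at 95% confidence; a true concentration at CCβ will be detected as above CCα with 95% probability.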

  7. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was below 50% on both occasions. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was shown by kappa values (0.53-0.86). With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement appeared unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that other growth indicators also be taken into consideration when evaluating adolescent skeletal maturation.
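
Cohen's weighted kappa, used above for intra- and interobserver agreement on ordinal CVM stages, can be computed from scratch; the stage assignments below are invented for illustration.

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_cat, weights="linear"):
    """Cohen's weighted kappa for ordinal ratings (e.g. CVM stages)."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    obs = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= obs.sum()                                    # observed proportions
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))    # chance agreement
    d = np.abs(np.arange(n_cat)[:, None] - np.arange(n_cat)[None, :])
    w = d if weights == "linear" else d ** 2            # disagreement weights
    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Hypothetical stage assignments (stages coded 0-5) by one observer, two sittings:
t1 = [0, 1, 1, 2, 3, 3, 4, 5, 2, 4]
t2 = [0, 1, 2, 2, 3, 4, 4, 5, 2, 4]
print(round(weighted_kappa(t1, t2, 6), 2))
```

Weighting matters for staging: a one-stage disagreement is penalized less than a three-stage one, which plain percent agreement ignores.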

  8. An analysis of normalization methods for Drosophila RNAi genomic screens and development of a robust validation scheme

    Science.gov (United States)

    Wiles, Amy M.; Ravi, Dashnamoorthy; Bhavani, Selvaraj; Bishop, Alexander J.R.

    2010-01-01

    Genome-wide RNAi screening is a powerful, yet relatively immature technology that allows investigation into the role of individual genes in a process of choice. Most RNAi screens identify a large number of genes with a continuous gradient in the assessed phenotype. Screeners must then decide whether to examine just those genes with the most robust phenotype or the full gradient of genes that cause an effect, and how to identify the candidate genes to be validated. We have used RNAi in Drosophila cells to examine viability in a 384-well plate format and compare two screens, an untreated control and a treatment. We compare multiple normalization methods, which take advantage of different features within the data, including quantile normalization, background subtraction, scaling, cellHTS2, and interquartile range measurement. Considering the false-positive potential that arises from RNAi technology, a robust validation method was designed for the purpose of gene selection for future investigations. In a retrospective analysis, we describe the use of validation data to evaluate each normalization method. While no normalization method worked ideally, we found that a combination of two methods, background subtraction followed by quantile normalization and cellHTS2, at different thresholds, captures the most dependable and diverse candidate genes. Thresholds are suggested depending on whether a few candidate genes are desired or a more extensive systems-level analysis is sought. In summary, our normalization approaches and experimental design for validation experiments are likely to apply to high-throughput screening systems attempting to identify genes for systems-level analysis. PMID:18753689
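
Of the normalization methods compared above, quantile normalization is the easiest to misread: it forces every plate (column) onto the common mean distribution, so only the within-plate ranking survives. A minimal NumPy sketch with a toy well-by-plate matrix:

```python
import numpy as np

def quantile_normalize(mat):
    """Quantile-normalize columns (e.g. plates) of a wells-by-plates matrix:
    each column is mapped onto the mean of the sorted columns."""
    order = np.argsort(mat, axis=0)        # per-column ranks
    ranked = np.sort(mat, axis=0)
    mean_dist = ranked.mean(axis=1)        # shared reference distribution
    out = np.empty_like(mat, dtype=float)
    for j in range(mat.shape[1]):
        out[order[:, j], j] = mean_dist    # re-place reference values by rank
    return out

# Toy 4-well x 3-plate viability matrix (arbitrary units):
plates = np.array([[5., 4., 3.],
                   [2., 1., 4.],
                   [3., 4., 6.],
                   [4., 2., 8.]])
print(quantile_normalize(plates))
```

In the screen above this step would follow background subtraction, the combination the authors found most dependable.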

  9. The construct validity of the Spanish version of the ABQ using a multi-trait/multi-method approach

    Directory of Open Access Journals (Sweden)

    Thomas D. Raedeke

    2013-10-01

    This study was designed to evaluate construct validity evidence for the Spanish version of the Athlete Burnout Questionnaire (ABQ) using a multi-trait/multi-method (MTMM) approach. The ABQ was administered to a sample of 302 Spanish athletes, along with two other questionnaires: the Maslach Burnout Inventory-General Survey (MBI-GS) and the Depression, Anxiety, Stress Scale (DASS-21), which respectively measure burnout in organizational settings and indicators of ill-being including depression, anxiety, and stress. A structural equation modeling approach to MTMM analysis was used. Comparative analysis of four models revealed that the Spanish version of the ABQ has convergent and internal discriminant validity, evidenced by high correlations between matching burnout subscales across the two measures and lower correlations between non-matching dimensions. In addition, the burnout measures exhibited external discriminant validity, as correlations between burnout dimensions were higher than those between conceptually related but distinct constructs.

  10. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    Science.gov (United States)

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  11. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and for understanding nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics, computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. The use of more multi-scale, multi-physics models introduces complexities into the validation of predictive tools, and traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow extension of results to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for

  12. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Williams, Brian [Los Alamos National Laboratory; Mc Clure, Patrick [Los Alamos National Laboratory; Nelson, Ralph A [IDAHO NATIONAL LAB

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and for understanding nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics, computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate models leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. The use of more multi-scale, multi-physics models introduces complexities into the validation of predictive tools, and traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow extension of results to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  13. Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy

    OpenAIRE

    Kolokitha, Olga-Elpis

    2007-01-01

    Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manu...

  14. The Validity of Dimensional Regularization Method on Fractal Spacetime

    Directory of Open Access Journals (Sweden)

    Yong Tao

    2013-01-01

    Svozil developed a regularization method for quantum field theory on fractal spacetime (1987). Such a method can be applied to the low-order perturbative renormalization of quantum electrodynamics, but it depends on a conjectural integral formula on non-integer-dimensional topological spaces. The main purpose of this paper is to construct a fractal measure so as to guarantee the validity of the conjectural integral formula.
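
For context, the kind of integral formula at stake is the standard dimensional-regularization continuation of the Gaussian integral to non-integer dimension D, a textbook identity stated here as background, not quoted from the paper:

```latex
\int d^{D}x \, e^{-\lambda x^{2}} = \left(\frac{\pi}{\lambda}\right)^{D/2}, \qquad \lambda > 0,\ D > 0.
```

Any fractal measure constructed to justify such a scheme must reproduce this identity when integrated against Gaussians.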

  15. Validity of a Simple Method for Measuring Force-Velocity-Power Profile in Countermovement Jump.

    Science.gov (United States)

    Jiménez-Reyes, Pedro; Samozino, Pierre; Pareja-Blanco, Fernando; Conceição, Filipe; Cuadrado-Peñafiel, Víctor; González-Badillo, Juan José; Morin, Jean-Benoît

    2017-01-01

    To analyze the reliability and validity of a simple computation method to evaluate force (F), velocity (v), and power (P) output during a countermovement jump (CMJ) suitable for use in field conditions, and to verify the validity of this computation method for determining the CMJ force-velocity (F-v) profile (including unloaded and loaded jumps) in trained athletes. Sixteen high-level male sprinters and jumpers performed maximal CMJs under 6 different load conditions (0-87 kg). A force plate sampling at 1000 Hz was used to record vertical ground-reaction force and derive vertical-displacement data during CMJ trials. For each condition, mean F, v, and P of the push-off phase were determined from both force-plate data (reference method) and simple computation measures based on body mass, jump height (from flight time), and push-off distance, and were used to establish the linear F-v relationship for each individual. Mean absolute bias values were 0.9% (± 1.6%), 4.7% (± 6.2%), 3.7% (± 4.8%), and 5% (± 6.8%) for F, v, P, and the slope of the F-v relationship (S Fv), respectively. Both methods showed high correlations for F-v-profile-related variables (r = .985-.991). Finally, all variables computed from the simple method showed high reliability (ICC > .980). These results suggest that the simple method is valid and reliable for field evaluation of F, v, P, and the F-v profile when body mass, push-off distance, and jump height are known.
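
The "simple computation measures" validated above follow Samozino-type equations: mean push-off force F = mg(h/d + 1) and mean push-off velocity v = sqrt(g·h/2), from body mass m, jump height h, and push-off distance d. A sketch with illustrative inputs:

```python
import math

def simple_fvp(body_mass, jump_height, push_off_dist, g=9.81):
    """Samozino-style field estimates of mean force (N), velocity (m/s) and
    power (W) during a CMJ push-off, from mass (kg), jump height (m)
    and push-off distance (m)."""
    F = body_mass * g * (jump_height / push_off_dist + 1.0)  # mean force
    v = math.sqrt(g * jump_height / 2.0)                     # mean velocity
    return F, v, F * v                                       # mean power

# Hypothetical athlete: 75 kg, 0.40 m jump, 0.35 m push-off distance.
F, v, P = simple_fvp(75.0, 0.40, 0.35)
print(f"F = {F:.0f} N, v = {v:.2f} m/s, P = {P:.0f} W")
```

Repeating this across loaded jumps and fitting F against v linearly gives the individual F-v profile and its slope (S Fv) discussed above.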

  16. The international validation of bio- and chemical-analytical screening methods for dioxins and dioxin-like PCBs: the DIFFERENCE project rounds 1 and 2

    NARCIS (Netherlands)

    Loco, van J.; Leeuwen, van S.P.J.; Roos, P.; Carbonnelle, S.; Boer, de J.; Goeyens, L.; Beernaert, H.

    2004-01-01

    The European research project DIFFERENCE focuses on the development, optimisation, and validation of screening methods for dioxin analysis, including bio-analytical and chemical techniques (CALUX, GC-LRMS/MS, GC x GC-ECD), and on the optimisation and validation of new extraction and clean-up

  17. Contribution to the validation of thermal ratchetting prevision methods in metallic structures; Contribution a la validation des methodes de prevision du rochet thermique dans les structures metalliques

    Energy Technology Data Exchange (ETDEWEB)

    Rakotovelo, A.M

    1998-03-01

    This work concerns the assessment of the steady state in metallic structures subjected to cyclic thermomechanical loadings in a biaxial stress state. The effect of short-time mechanical overloads is also investigated. The first chapter is devoted to a bibliographic survey of the behaviour of materials and structures under cyclic plasticity; works covering both the experimental and numerical aspects of steady-state assessment in such structures are presented. The experimental part of the study is presented in the second chapter. The experimental device was built to apply tension and torsion forces combined with cyclic thermal loading. A series of tests was then carried out, some of which included overloads in tension or torsion. The last chapter describes numerical calculations using different models (linear isotropic hardening, linear kinematic hardening, and the elasto-viscoplastic Chaboche model) and the application of simplified methods for ratchetting assessment in structures. We considered two categories of methods: the first is based on an elastic analysis (Bree's diagram, the 3 Sm rule, the efficiency rule), and the second combines an elastic analysis with an elastoplastic analysis of the first cycle (Gatt's and Taleb's methods). The results of this study have made it possible: to validate, in the biaxial stress state, an expression which takes into account the effect of short-time mechanical overloads; to test the ability of the considered models to describe the evolution of the structure during the first cycles and to account for the effect of short-time overloads (among the considered models, the elastoplastic Chaboche model appears the most accurate in describing the structure's behaviour during the first cycles); and to validate some simplified methods. Certain methods based only on elastic analysis (Bree's diagram and the efficiency rule) seem not

  18. "INTRODUCING A FULL VALIDATED ANALYTICAL PROCEDURE AS AN OFFICIAL COMPENDIAL METHOD FOR FENTANYL TRANSDERMAL PATCHES"

    Directory of Open Access Journals (Sweden)

    Amir Mehdizadeh

    2005-04-01

    A simple, sensitive, and specific HPLC method, together with a simple and fast extraction procedure, was developed for the quantitative analysis of fentanyl transdermal patches. Chloroform, methanol, and ethanol were tested as extracting solvents, with recoveries of 92.1%, 94.3%, and 99.4%, respectively. Fentanyl was extracted with ethanol, and the fentanyl eluted through the C18 column was monitored by UV detection at 230 nm. Linearity was obtained in the range of 0.5-10 µg/mL with a correlation coefficient (r2) of 0.9992. Both intra- and inter-day accuracy and precision were within acceptable limits. The detection limit (DL) and quantitation limit (QL) were 0.15 and 0.5 µg/mL, respectively. Other validation characteristics such as selectivity, robustness, and ruggedness were also evaluated. Following method validation, a system suitability test (SST) including capacity factor (k′), plate number (N), tailing factor (T), and RSD was defined for the routine test.
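
The SST quantities listed (capacity factor, plate number, tailing factor) are standard chromatographic formulas. A sketch using common USP-style definitions and invented retention data:

```python
# Common system-suitability (SST) calculations for an HPLC run.
# Formulas are the usual USP-style definitions; all numbers are illustrative.

def capacity_factor(t_r, t_0):
    """k' = (tR - t0) / t0, from analyte and void retention times (min)."""
    return (t_r - t_0) / t_0

def plate_number(t_r, w_half):
    """N = 5.54 * (tR / w0.5)^2, half-height peak-width method."""
    return 5.54 * (t_r / w_half) ** 2

def tailing_factor(w_005, f):
    """T = W0.05 / (2 f): peak width at 5% height over twice the front half-width."""
    return w_005 / (2.0 * f)

print(round(capacity_factor(6.2, 1.1), 2))   # 4.64
print(round(plate_number(6.2, 0.25)))
print(round(tailing_factor(0.42, 0.18), 2))
```

Typical acceptance criteria (e.g. T ≤ 2, RSD of replicate injections ≤ 2%) are set per method; the abstract does not state which limits were adopted here.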

  19. Development and Validation of a RP-HPLC Method for the ...

    African Journals Online (AJOL)

    Development and Validation of a RP-HPLC Method for the Simultaneous Determination of Rifampicin and a Flavonoid Glycoside - A Novel ... range, accuracy, precision, limit of detection, limit of quantification, robustness and specificity.

  20. Validation parameters of instrumental method for determination of total bacterial count in milk

    Directory of Open Access Journals (Sweden)

    Nataša Mikulec

    2004-10-01

    Flow cytometry, as a rapid, instrumental, routine microbiological method, is used for the determination of total bacterial count in milk. The results of flow cytometry are expressed as individual bacterial cell counts. Problems in interpreting total bacterial counts can be avoided by transforming the results of the flow cytometry method onto the scale of the reference method (HRN ISO 6610:2001). Like any analytical method, flow cytometry requires validation and verification according to the HRN EN ISO/IEC 17025:2000 standard. This paper describes the validation parameters accuracy, precision, specificity, range, robustness, and measurement uncertainty for the flow cytometry method.
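
Transforming flow-cytometry counts onto the reference-method scale is typically done by log10-log10 regression against paired reference results; a sketch with hypothetical paired data (the exact conversion model used in the paper is not given in this abstract):

```python
import numpy as np

# Hypothetical paired results: instrument individual bacterial counts (IBC/mL)
# vs. reference plate counts (CFU/mL).
ibc = np.array([2.1e4, 8.5e4, 3.2e5, 1.1e6, 4.0e6])
cfu = np.array([1.0e4, 4.1e4, 1.5e5, 6.0e5, 2.2e6])

# Fit log10(CFU) = slope * log10(IBC) + intercept.
slope, intercept = np.polyfit(np.log10(ibc), np.log10(cfu), 1)

def ibc_to_cfu(x):
    """Convert an instrument count onto the reference-method scale."""
    return 10 ** (slope * np.log10(x) + intercept)

print(f"log10(CFU) = {slope:.3f} * log10(IBC) {intercept:+.3f}")
print(f"{ibc_to_cfu(5.0e5):.3g} CFU/mL")
```

The residual scatter of such a fit feeds directly into the measurement-uncertainty estimate mentioned above.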

  1. Validation of a spectrophotometric method to determine ciprofibrate content in tablets

    Directory of Open Access Journals (Sweden)

    Guilherme Nobre Lima do Nascimento

    2011-03-01

    Ciprofibrate is a drug indicated in cases of hypertriglyceridemia and mixed hyperlipidemia, but no monographs are available in official compendia for the analysis of this substance in tablets. The objective of this work was to develop and validate a spectrophotometric method for routine analysis of ciprofibrate in tablets. In this study, commercial and standard ciprofibrate were used, as well as a placebo, in absolute ethanol, analyzed by UV spectrophotometry. All tests followed the rules of Resolution RE-899, 2003. The results showed that the developed and validated method offers low cost, easy implementation, precision, and accuracy, and may be included in the routine of quality control laboratories.

  2. Methods of producing adsorption media including a metal oxide

    Science.gov (United States)

    Mann, Nicholas R; Tranter, Troy J

    2014-03-04

    Methods of producing a metal oxide are disclosed. The method comprises dissolving a metal salt in a reaction solvent to form a metal salt/reaction solvent solution. The metal salt is converted to a metal oxide and a caustic solution is added to the metal oxide/reaction solvent solution to adjust the pH of the metal oxide/reaction solvent solution to less than approximately 7.0. The metal oxide is precipitated and recovered. A method of producing adsorption media including the metal oxide is also disclosed, as is a precursor of an active component including particles of a metal oxide.

  3. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    Science.gov (United States)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors (aligned with the EURO-CORDEX experiment), and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated, and publicly distributed through the VALUE validation portal, allowing for a comprehensive, community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a Europe-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including information both data
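
The Experiment 1 protocol, 5-fold cross-validation over consecutive 6-year blocks of 1979-2008, can be sketched as follows (each fold trains a downscaling method on the remaining 24 years and validates on the held-out block):

```python
# Blocked 5-fold cross-validation over 1979-2008, as in VALUE Experiment 1.
years = list(range(1979, 2009))                  # 30 calendar years
folds = [years[i:i + 6] for i in range(0, 30, 6)]  # five consecutive 6-year blocks

for k, test_years in enumerate(folds, start=1):
    train_years = [y for y in years if y not in test_years]
    print(f"fold {k}: validate on {test_years[0]}-{test_years[-1]}, "
          f"train on {len(train_years)} years")
```

Using contiguous blocks rather than random years preserves the temporal autocorrelation of the climate series, so validation scores are not optimistically inflated.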

  4. Microfluidic devices and methods including porous polymer monoliths

    Science.gov (United States)

    Hatch, Anson V; Sommer, Gregory J; Singh, Anup K; Wang, Ying-Chih; Abhyankar, Vinay V

    2014-04-22

    Microfluidic devices and methods including porous polymer monoliths are described. Polymerization techniques may be used to generate porous polymer monoliths having pores defined by a liquid component of a fluid mixture. The fluid mixture may contain iniferters and the resulting porous polymer monolith may include surfaces terminated with iniferter species. Capture molecules may then be grafted to the monolith pores.

  5. Validation of DRAGON side-step method for Bruce-A restart Phase-B physics tests

    International Nuclear Information System (INIS)

    Shen, W.; Ngo-Trong, C.; Davis, R.S.

    2004-01-01

    The DRAGON side-step method, developed at AECL, has a number of advantages over the all-DRAGON method that was used before. It is now the qualified method for reactivity-device calculations. Although the side-step-method-generated incremental cross sections have been validated against those previously calculated with the all-DRAGON method, it is highly desirable to validate the side-step method against device-worth measurements in power reactors directly. In this paper, the DRAGON side-step method was validated by comparison with the device-calibration measurements made in Bruce-A NGS Unit 4 restart Phase-B commissioning in 2003. The validation exercise showed excellent results, with the DRAGON code overestimating the measured ZCR worth by ∼5%. A sensitivity study was also performed in this paper to assess the effect of various DRAGON modelling techniques on the incremental cross sections. The assessment shows that the refinement of meshes in 3-D and the use of the side-step method are two major reasons contributing to the improved agreement between the calculated ZCR worths and the measurements. Use of different DRAGON versions, DRAGON libraries, local-parameter core conditions, and weighting techniques for the homogenization of tube clusters inside the ZCR have a very small effect on the ZCR incremental thermal absorption cross section and ZCR reactivity worth. (author)

  6. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725‐4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of an analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
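The tolerance-interval computation at the heart of an accuracy profile can be illustrated with a simplified sketch (the recovery values, β = 80%, and ±10% acceptability limits are invented for illustration; a Gaussian quantile stands in for Student's t, and the full ISO 5725-based computation additionally separates within- and between-series variance):

```python
from statistics import NormalDist, mean, stdev

def tolerance_interval(recoveries_pct, beta=0.80):
    """Simplified beta-expectation tolerance interval for recoveries (%)
    at one concentration level: mean +/- z * s * sqrt(1 + 1/n)."""
    n = len(recoveries_pct)
    m = mean(recoveries_pct)
    s = stdev(recoveries_pct)
    z = NormalDist().inv_cdf((1 + beta) / 2)   # two-sided Gaussian quantile
    half = z * s * (1 + 1 / n) ** 0.5
    return m - half, m + half

# The level belongs to the validity domain if the tolerance interval
# lies inside the end user's acceptability limits, here 100 +/- 10 %:
lo, hi = tolerance_interval([98.2, 101.5, 99.8, 100.9, 97.6])
inside_limits = 90.0 <= lo and hi <= 110.0
```

Repeating this over the studied concentration range, and joining the interval bounds, yields the accuracy profile to be compared against the acceptability limits.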

  7. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard, well-evidenced measurement methods that are quick and easy to apply. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years), were analyzed. Footprints were recorded in a static bipedal standing position using optical podography and digital photography. Three trials were performed for each participant. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated manually and by a computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was assessed using the intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software have been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  8. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
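A minimal numerical sketch of the POD idea (not the AOAC reference implementation): estimate POD at each spiking level from detection counts and attach a Wilson score interval, a common choice assumed here; the levels and counts below are invented:

```python
from math import sqrt

def pod_with_ci(detections, trials, z=1.96):
    """Probability of detection at one level, with a 95% Wilson score interval."""
    p = detections / trials
    denom = 1 + z ** 2 / trials
    centre = (p + z ** 2 / (2 * trials)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / trials + z ** 2 / (4 * trials ** 2))
    return p, max(0.0, centre - half), min(1.0, centre + half)

# POD response curve: concentration -> (detections, trials), invented data
levels = {0.0: (1, 60), 0.5: (25, 60), 1.0: (55, 60), 2.0: (60, 60)}
curve = {c: pod_with_ci(x, n) for c, (x, n) in levels.items()}
```

Plotting POD (with its interval) against concentration gives the response curve the model uses for comparing candidate and reference methods.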

  9. Validity of segmental bioelectrical impedance analysis for estimating fat-free mass in children including overweight individuals.

    Science.gov (United States)

    Ohta, Megumi; Midorikawa, Taishi; Hikihara, Yuki; Masuo, Yoshihisa; Sakamoto, Shizuo; Torii, Suguru; Kawakami, Yasuo; Fukunaga, Tetsuo; Kanehisa, Hiroaki

    2017-02-01

This study examined the validity of segmental bioelectrical impedance (BI) analysis for predicting the fat-free masses (FFMs) of the whole body and body segments in children, including overweight individuals. The FFM and impedance (Z) values of the arms, trunk, legs, and whole body were determined using dual-energy X-ray absorptiometry and segmental BI analyses, respectively, in 149 boys and girls aged 6 to 12 years, who were divided into model-development (n = 74), cross-validation (n = 35), and overweight (n = 40) groups. Simple regression analysis was applied to (length)²/Z (the BI index) for each of the whole body and the 3 segments to develop prediction equations for the measured FFM of the related body part. In the model-development group, the BI index of each of the 3 segments and the whole body was significantly correlated with the measured FFM (R² = 0.867-0.932, standard error of estimation = 0.18-1.44 kg (5.9%-8.7%)). There was no significant difference between the measured and predicted FFM values, without systematic error. The application of each equation derived in the model-development group to the cross-validation and overweight groups did not produce significant differences between the measured and predicted FFM values or systematic errors, with the exception that arm FFM in the overweight group was overestimated. Segmental bioelectrical impedance analysis is useful for predicting the FFM of the whole body and of body segments in children, including overweight individuals, although application to estimating arm FFM in overweight individuals requires a certain modification.
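The prediction-equation step described above can be sketched as an ordinary least-squares fit of measured FFM against the BI index (length²/Z); the numbers below are synthetic, not from the study:

```python
def fit_bi_index(lengths_cm, impedances_ohm, ffm_kg):
    """Ordinary least-squares fit of measured FFM against the BI index L^2/Z."""
    x = [l ** 2 / z for l, z in zip(lengths_cm, impedances_ohm)]
    n = len(x)
    mx, my = sum(x) / n, sum(ffm_kg) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ffm_kg))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic example constructed so that FFM = 0.5 * (L^2/Z) + 2 exactly:
slope, intercept = fit_bi_index([100, 120, 140], [500, 600, 700], [12, 14, 16])
```

In the study design, one such equation is fitted per body part in the model-development group and then applied unchanged to the cross-validation and overweight groups.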

  10. Validation of ultraviolet method to determine serum phosphorus level

    International Nuclear Information System (INIS)

    Garcia Borges, Lisandra; Perez Prieto, Teresa Maria; Valdes Diez, Lilliam

    2009-01-01

Validation of a spectrophotometric method applicable in clinical laboratories was proposed for the analytical assessment of serum phosphate using a UV-phosphorus kit of domestic production from the 'Carlos J. Finlay' Biologics Production Enterprise (Havana, Cuba). The analysis method is based on the reaction of phosphorus with ammonium molybdate at acid pH to create a complex measurable at 340 nm. Specificity and precision were measured considering the method's robustness, linearity, accuracy and sensitivity. The analytical method was linear up to 4.8 mmol/L and precise (CV 30 .999) over the concentration interval of clinical interest, where no matrix interferences were found. The detection limit was 0.037 mmol/L and the quantification limit 0.13 mmol/L; both were satisfactory for the intended use of the product

  11. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    Science.gov (United States)

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
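The exact-binomial reconstruction that Ayre and Scally propose can be sketched directly: under the null hypothesis that each of N panelists rates an item "essential" at random (p = 0.5), the critical CVR corresponds to the smallest "essential" count whose one-tailed probability is at most α = 0.05:

```python
from math import comb

def critical_cvr(n_panelists, alpha=0.05):
    """Smallest content validity ratio CVR = (n_e - N/2) / (N/2) whose
    'essential' count n_e would be improbable (one-tailed) if raters
    chose 'essential' at random (p = 0.5)."""
    def tail(k):  # P(X >= k) for X ~ Binomial(N, 0.5)
        return sum(comb(n_panelists, i)
                   for i in range(k, n_panelists + 1)) / 2 ** n_panelists
    for n_essential in range(n_panelists + 1):
        if tail(n_essential) <= alpha:
            return (n_essential - n_panelists / 2) / (n_panelists / 2)
    return 1.0
```

For example, with 8 panelists at least 7 must rate an item essential (P(X ≥ 7) = 9/256 ≈ 0.035), giving a critical CVR of 0.75.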

  12. Contribution to the validation of thermal ratchetting prevision methods in metallic structures

    International Nuclear Information System (INIS)

    Rakotovelo, A.M.

    1998-03-01

This work concerns steady-state assessment in metallic structures subjected to cyclic thermomechanical loadings in a biaxial stress state. The effect of short-time mechanical overloads is also investigated. The first chapter is devoted to a bibliographic survey of the behaviour of materials and structures in cyclic plasticity. Works relating to the experimental as well as the numerical aspects of steady-state assessment of such structures are presented. The experimental part of the study is presented in the second chapter. The experimental device was built to apply tension and torsion forces combined with cyclic thermal loading. Tests were then carried out, some of which included overloads in tension or torsion. The last chapter describes the numerical calculations using different models (linear isotropic hardening, linear kinematic hardening and the elasto-viscoplastic Chaboche model) and the application of some simplified methods for ratchetting assessment in structures. We considered two categories of methods: the first is based on an elastic analysis (Bree's diagram, 3 Sm rule, efficiency rule) and the second combines elastic analysis with an elastoplastic analysis of the first cycle (Gatt's and Taleb's methods). The results of this study have enabled: to validate, in the biaxial stress state, an expression which takes into account the effect of mechanical short-time overloads; to test the ability of the considered models to describe the evolution of the structure during the first cycles and to take into account the effect of short-time overloads (among the considered models, the elastoplastic Chaboche model seems the most accurate in describing the structure's behaviour during the first cycles); to validate some simplified methods. Certain methods based only on elastic analysis (Bree's diagram and efficiency rule) seem not suitable for the considered kind of

  13. Microscale validation of 4-aminoantipyrine test method for quantifying phenolic compounds in microbial culture

    International Nuclear Information System (INIS)

    Justiz Mendoza, Ibrahin; Aguilera Rodriguez, Isabel; Perez Portuondo, Irasema

    2014-01-01

Validation of test methods at microscale is currently of great importance owing to its economic and environmental advantages, and it constitutes a prerequisite for the performance of services and for assuring the quality of the results provided to the customer. This paper addresses the microscale validation of the 4-aminoantipyrine spectrophotometric method for the quantification of phenolic compounds in culture medium. The parameters linearity, precision, regression, accuracy, detection limit, quantification limit and robustness were evaluated, in addition to a comparison test with a non-standardized method for determining polyphenols (Folin-Ciocalteu). The results showed that both methods are feasible for determining phenols

  14. Validation of newly developed and redesigned key indicator methods for assessment of different working conditions with physical workloads based on mixed-methods design: a study protocol.

    Science.gov (United States)

    Klussmann, Andre; Liebers, Falk; Brandstädt, Felix; Schust, Marianne; Serafin, Patrick; Schäfer, Andreas; Gebhardt, Hansjürgen; Hartmann, Bernd; Steinberg, Ulf

    2017-08-21

The impact of work-related musculoskeletal disorders is considerable. The assessment of work tasks with physical workloads is crucial to estimate the work-related health risks of exposed employees. Three key indicator methods (KIMs) are available for risk assessment regarding manual lifting, holding and carrying of loads; manual pulling and pushing of loads; and manual handling operations. Three further KIMs, for risk assessment regarding whole-body forces, awkward body postures and body movement, have been developed de novo. In addition, the development of a newly drafted combined method for mixed exposures is planned. All methods will be validated regarding face validity, reliability, convergent validity, criterion validity and further aspects of utility under practical conditions. As part of the joint project MEGAPHYS (multilevel risk assessment of physical workloads), a mixed-methods study is being designed for the validation of KIMs and conducted in companies of different sizes and branches in Germany. Workplaces are documented and analysed by observations, applying KIMs, interviews and assessment of environmental conditions. Furthermore, a survey among the employees at the respective workplaces takes place with standardised questionnaires, interviews and physical examinations. It is intended to include 1200 employees at 120 different workplaces. For analysis of the quality criteria, recommendations of the COSMIN checklist (COnsensus-based Standards for the selection of health Measurement INstruments) will be taken into account. The study was planned and conducted in accordance with the German Medical Professional Code and the Declaration of Helsinki as well as the German Federal Data Protection Act. The design of the study was approved by ethics committees. We intend to publish the validated KIMs in 2018. Results will be published in peer-reviewed journals, presented at international meetings and disseminated to actual users for practical application.

  15. Validation of the method for investigation of radiopharmaceuticals for in vitro use

    International Nuclear Information System (INIS)

    Vranjes, S; Jovanovic, M.; Orlic, M.; Lazic, E. . E-mail address of corresponding author: sanjav@vin.bg.ac.yu

    2005-01-01

The aim of this study was to validate an analytical method for determination of the total radioactivity and radioactive concentration of 125I-triiodothyronine, a radiopharmaceutical for in vitro use. The analytical parameters selectivity, accuracy, linearity and range of this method were determined. The values obtained for all parameters are acceptable for analytical methods; therefore, this method can be used for further investigation. (author)

  16. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  17. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9)

  18. Experimental validation for calcul methods of structures having shock non-linearity

    International Nuclear Information System (INIS)

    Brochard, D.; Buland, P.

    1987-01-01

    For the seismic analysis of non-linear structures, numerical methods have been developed which need to be validated on experimental results. The aim of this paper is to present the design method of a test program which results will be used for this purpose. Some applications to nuclear components will illustrate this presentation [fr

  19. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  20. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  1. Validation of commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    DEFF Research Database (Denmark)

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold

    2014-01-01

BACKGROUND: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. OBJECTIVE: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). METHODS: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. RESULTS: Total imprecision was

  2. What Does It Cost to Prevent On-Duty Firefighter Cardiac Events? A Content Valid Method for Calculating Costs

    Directory of Open Access Journals (Sweden)

    P. Daniel Patterson

    2013-01-01

Cardiac arrest is a leading cause of mortality among firefighters. We sought to develop a valid method for determining the costs of a workplace prevention program for firefighters. In 2012, we developed a draft framework using human resource accounting and in-depth interviews with experts in the firefighting and insurance industries. The interviews produced a draft cost model with 6 components and 26 subcomponents. In 2013, we randomly sampled 100 fire chiefs out of >7,400 affiliated with the International Association of Fire Chiefs. We used the Content Validity Index (CVI) to identify the content-valid components of the draft cost model. This was accomplished by having fire chiefs rate the relevancy of cost components using a 4-point Likert scale (highly relevant to not relevant). We received complete survey data from 65 fire chiefs (65% response rate). We retained 5 components and 21 subcomponents based on CVI scores ≥0.70. The five main components include: (1) investment costs, (2) orientation and training costs, (3) medical and pharmaceutical costs, (4) education and continuing education costs, and (5) maintenance costs. Data from a diverse sample of fire chiefs produced a content-valid method for calculating the cost of a prevention program among firefighters.
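The CVI screening step used above can be sketched in a few lines (item names and expert ratings are invented; an item is kept when the share of "relevant" ratings, 3 or 4 on the 4-point scale, reaches the study's 0.70 threshold):

```python
def item_cvi(ratings):
    """Item-level CVI: share of experts rating the item 3 or 4 ('relevant')."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def retained(items, threshold=0.70):
    """Keep components whose CVI meets the threshold."""
    return {name: cvi for name, ratings in items.items()
            if (cvi := item_cvi(ratings)) >= threshold}

# Invented example: one clearly relevant component, one weak draft item
kept = retained({"investment costs": [4, 4, 3, 4], "draft item": [1, 2, 2, 3]})
```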

  3. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and no systematic bias was observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flaps' viability.
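An interrater reliability figure of the kind reported above can be sketched with a classical ANOVA-based intraclass correlation. This computes ICC(3,1) (two-way mixed, consistency, single measure), which may differ from the exact ICC form the authors used; the ratings are invented:

```python
def icc_consistency(ratings):
    """ICC(3,1): two-way mixed, consistency, single measure, computed from a
    subjects x raters table via the classical ANOVA decomposition."""
    n, k = len(ratings), len(ratings[0])          # subjects, raters
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

A constant offset between two raters (one always scoring one point higher) still yields perfect consistency, which is why the absolute-agreement ICC forms exist as an alternative.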

  4. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy, and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.

  5. Validation of a method for radionuclide activity optimize in SPECT

    International Nuclear Information System (INIS)

    Perez Diaz, M.; Diaz Rizo, O.; Lopez Diaz, A.; Estevez Aparicio, E.; Roque Diaz, R.

    2007-01-01

A discriminant method for optimizing the activity administered in nuclear medicine studies is validated by comparison with ROC curves. The method is tested in 21 SPECT studies performed with a cardiac phantom. Three cold lesions (L1, L2 and L3) were placed in the myocardium wall for each SPECT. Three activities (84 MBq, 37 MBq or 18.5 MBq) of Tc-99m diluted in water were used as background. Linear discriminant analysis was used to select the parameters that characterize image quality (Background-to-Lesion (B/L) and Signal-to-Noise (S/N) ratios). Two clusters with different image quality (p=0.021) were obtained from the selected variables. The first involved the studies performed with 37 MBq and 84 MBq, and the second included the studies with 18.5 MBq. The ratios B/L1, B/L2 and B/L3 are the parameters capable of constructing the function, with 100% of cases correctly classified into the clusters. The value of 37 MBq is the lowest tested activity for which good results for the B/Li variables were obtained, without significant differences from the results with 84 MBq (p>0.05). This result is coincident with the applied ROC analysis. A correlation between both methods of r=0.890 was obtained. (Author) 26 refs

  6. Use of the Method of Triads in the Validation of Sodium and Potassium Intake in the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil).

    Science.gov (United States)

    Pereira, Taísa Sabrina Silva; Cade, Nágela Valadão; Mill, José Geraldo; Sichieri, Rosely; Molina, Maria Del Carmen Bisi

    2016-01-01

Biomarkers are a good choice for use in the validation of a food frequency questionnaire due to the independence of their random errors. To assess the validity of the potassium and sodium intake estimated using the Food Frequency Questionnaire ELSA-Brasil. A subsample of participants in the ELSA-Brasil cohort was included in this study in 2009. Sodium and potassium intake were estimated using three methods: a semi-quantitative food frequency questionnaire, 12-hour nocturnal urinary excretion, and three 24-hour food records. Correlation coefficients were calculated between the methods, and the validity coefficient was calculated using the method of triads. The 95% confidence intervals for the validity coefficients were estimated using bootstrap sampling. Exact and adjacent agreement and disagreement of the estimated sodium and potassium intake quintiles were compared among the three methods. The sample consisted of 246 participants, aged 53±8 years, 52% women. The validity coefficients for sodium were considered weak (ρ(FFQ, actual intake) = 0.37 and ρ(biomarker, actual intake) = 0.21) or moderate (ρ(food records, actual intake) = 0.56). The validity coefficients were higher for potassium (ρ(FFQ, actual intake) = 0.60; ρ(biomarker, actual intake) = 0.42; ρ(food records, actual intake) = 0.79). Conclusions: The Food Frequency Questionnaire ELSA-Brasil showed good validity for estimating potassium intake in epidemiological studies. For sodium, validity was weak, likely due to the non-quantification of salt added to prepared food.
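The method of triads itself is a short computation: given the three pairwise correlations among questionnaire (Q), biomarker (B), and food records (R), the validity coefficient of each method against the unobserved true intake follows from products of correlations. A sketch (the example correlations are invented; values above 1, so-called Heywood cases, are capped at 1):

```python
from math import sqrt

def triads_validity(r_qb, r_qr, r_br):
    """Validity coefficients of Q (FFQ), R (records) and B (biomarker)
    against unobserved true intake, from the three pairwise correlations."""
    rho_q = sqrt(r_qb * r_qr / r_br)
    rho_r = sqrt(r_qr * r_br / r_qb)
    rho_b = sqrt(r_qb * r_br / r_qr)
    return tuple(min(rho, 1.0) for rho in (rho_q, rho_r, rho_b))

# Invented example correlations: Q-B = 0.20, Q-R = 0.50, B-R = 0.30
rho_q, rho_r, rho_b = triads_validity(r_qb=0.20, r_qr=0.50, r_br=0.30)
```

Bootstrap confidence intervals, as used in the study, are obtained by resampling participants and repeating this computation on each resample.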

  7. Quantitative determination and validation of octreotide acetate using 1 H-NMR spectroscopy with internal standard method.

    Science.gov (United States)

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange), in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high-performance liquid chromatography, as provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicated that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
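The internal-standard qNMR assay reduces to one equation relating integrals, proton counts, molar masses and weighed masses; a sketch with invented round numbers (not the octreotide/gemcitabine values from the study):

```python
def qnmr_purity(i_analyte, i_std, n_analyte, n_std,
                mw_analyte, mw_std, mass_std_mg, mass_analyte_mg, purity_std):
    """Internal-standard qNMR: analyte purity from the integral ratio (I),
    proton counts of the quantified signals (N), molar masses (MW) and
    weighed masses, scaled by the internal standard's known purity."""
    return (i_analyte / i_std) * (n_std / n_analyte) \
        * (mw_analyte / mw_std) * (mass_std_mg / mass_analyte_mg) * purity_std

# Invented numbers chosen so the arithmetic is easy to follow:
p = qnmr_purity(i_analyte=2.0, i_std=1.0, n_analyte=2, n_std=1,
                mw_analyte=300.0, mw_std=300.0,
                mass_std_mg=10.0, mass_analyte_mg=10.0, purity_std=0.999)
```

This is why qNMR needs no analyte-specific reference standard: traceability comes entirely from the internal standard's certified purity.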

  8. Development and Validation of HPLC-PDA Assay method of Frangula emodin

    Directory of Open Access Journals (Sweden)

    Deborah Duca

    2016-03-01

    Full Text Available Frangula emodin (1,3,8-trihydroxy-6-methyl-anthraquinone) is one of the anthraquinone derivatives found abundantly in the roots and bark of a number of plant families traditionally used to treat constipation and haemorrhoids. The present study describes the development and subsequent validation of a specific assay HPLC method for emodin. The separation was achieved on a Waters Symmetry C18 column, 4.6 × 250 mm, 5 μm particle size, at a temperature of 35 °C, with UV detection at 287 and 436 nm. An isocratic elution mode was used, with an aqueous mobile phase consisting of 0.1% formic acid and 0.01% trifluoroacetic acid, and methanol. The method was successfully and statistically validated for linearity, range, precision, accuracy, specificity and solution stability.

  9. Development and Validation of an LC-MS/MS Method and Comparison with a GC-MS Method to Measure Phenytoin in Human Brain Dialysate, Blood, and Saliva

    Directory of Open Access Journals (Sweden)

    Raphael Hösli

    2018-01-01

    Full Text Available Phenytoin (PHT is one of the most often used critical dose drugs, where insufficient or excessive dosing can have severe consequences such as seizures or toxicity. Thus, the monitoring and precise measuring of PHT concentrations in patients is crucial. This study develops and validates an LC-MS/MS method for the measurement of phenytoin concentrations in different body compartments (i.e., human brain dialysate, blood, and saliva and compares it with a formerly developed GC-MS method that measures PHT in the same biological matrices. The two methods are evaluated and compared based on their analytical performance, appropriateness to analyze human biological samples, including corresponding extraction and cleanup procedures, and their validation according to ISO 17025/FDA Guidance for Industry. The LC-MS/MS method showed a higher performance compared with the GC-MS method. The LC-MS/MS was more sensitive, needed a smaller sample volume (25 µL and less chemicals, was less time consuming (cleaning up, sample preparation, and analysis, and resulted in a better LOD ( 0.995 for all tested matrices (blood, saliva, and dialysate. For larger sample numbers as in pharmacokinetic/pharmacodynamic studies and for bedside as well as routine analyses, the LC-MS/MS method offers significant advantages over the GC-MS method.

  10. Validation of Experimental whole-body SAR Assessment Method in a Complex Indoor Environment

    DEFF Research Database (Denmark)

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter

    2012-01-01

    Assessing the whole-body specific absorption rate (SARwb) experimentally in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the line-of-sight as the specular path) to assess the whole-body SAR is validated by numerical simulation. An advantage of the proposed method is that it avoids the computational burden because it does not use any discretization. Results show good agreement between measurement and computation at 2.8 GHz, as long as the plane wave assumption is valid, i.e., for large distances from the transmitter. Relative deviations 0...

  11. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg

    We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers, as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicate in a flat-bottom 96-well microtiter plate ... available. In conclusion, we have set up a simple solution for the continuous validation of automated liquid handlers used for accredited work. The method is cheap, simple and easy to use for aqueous solutions, but requires a spectrophotometer that can read microtiter plates.

  12. Validation of ascorbic acid tablets of national production by high-performance liquid chromatography method

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Izquierdo Castro, Idalberto

    2009-01-01

    An analytical high-performance liquid chromatography method was validated to determine the ascorbic acid content in vitamin C tablets, designed as an alternative method for quality control and for follow-up of the chemical stability of the active principle, since the official techniques for quality control of ascorbic acid in tablets are not selective with respect to degradation products. The method was modified from that reported in USP 28, 2005 for analysis of the injectable product. We used an RP-18 column of 250 x 4.6 mm, 5 μm, with UV detection at 245 nm. Validation was necessary for both objectives, considering the parameters required for methods of categories I and II. The method was sufficiently linear, accurate, and precise in the range of 100-300 μg/mL. It was also selective with respect to the remaining matrix components and the possible degradation products obtained under stress conditions. Detection and quantification limits were estimated. Once validated, the method was applied to ascorbic acid quantification in two batches of expired tablets, and a marked influence of the container on degradation of the active principle was detected after 12 months at room temperature. (Author)

  13. USFDA-GUIDELINE BASED VALIDATION OF TESTING METHOD FOR RIFAMPICIN IN INDONESIAN SERUM SPECIMEN

    Directory of Open Access Journals (Sweden)

    Tri Joko Raharjo

    2010-06-01

    Full Text Available Under a new regulation from the Indonesian FDA (Badan POM-RI), all new non-patent drugs must show bioequivalence with the originator drug prior to registration. Bioequivalence (BE) testing has to be performed on people representative of the population to which the drug will be administered. BE testing needs a valid bioanalytical method for a given drug target and population group. This research reports the specific validation of the bioanalysis of rifampicin in Indonesian serum specimens for use in BE testing. The extraction was performed using acetonitrile, while the chromatographic separation was accomplished on an RP-18 column (250 × 4.6 mm i.d., 5 µm) with a mobile phase composed of KH2PO4 10 mM-acetonitrile (40:60, v/v) and UV detection set at 333 nm. The method showed specificity compared to blank serum specimens, with a retention time for rifampicin of 2.1 min. The lower limit of quantification (LLOQ) was 0.06 µg/mL, with a dynamic range up to 20 µg/mL (R>0.990). Precision of the method was very good, with coefficients of variation (CV) of 0.58, 7.40 and 5.56% at concentrations of 0.06, 5 and 15 µg/mL, respectively. Accuracies of the method were 3.22, 1.94 and 1.90% for concentrations of 0.06, 5 and 15 µg/mL, respectively. The average recoveries were 97.82, 95.50 and 97.31% for rifampicin concentrations of 1, 5 and 5 µg/mL, respectively. The method also gave reliable results in stability tests on freeze-thaw, short-term and long-term stability, as well as post-preparation stability. The validation results showed that the method is ready to be used for rifampicin BE testing with Indonesian subjects.   Keywords: Rifampicin, Validation, USFDA-Guideline
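Precision (CV) and recovery figures like those reported in this record come directly from replicate measurements at each spiked concentration level. A small sketch of the two calculations; the replicate values below are invented for illustration, not the study's measurements:

```python
from statistics import mean, stdev

def precision_cv(replicates):
    """Coefficient of variation (%) of replicate measurements."""
    return stdev(replicates) / mean(replicates) * 100.0

def recovery(replicates, nominal):
    """Mean recovery (%) against the nominal (spiked) concentration."""
    return mean(replicates) / nominal * 100.0

# Hypothetical replicate results at a 5 µg/mL spike level:
reps = [5.0, 5.1, 4.9]
cv = precision_cv(reps)    # 2.0 %
rec = recovery(reps, 5.0)  # 100.0 %
```

Guidelines such as the USFDA bioanalytical guidance cited in the title set acceptance limits on exactly these two quantities at each quality-control level.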

  14. Prospective validation of criteria, including age, for safe, nonsurgical management of the ruptured spleen

    Energy Technology Data Exchange (ETDEWEB)

    Smith, J.S. Jr.; Wengrovitz, M.A.; DeLong, B.S. (Pennsylvania State University College of Medicine, Hershey (United States))

    1992-09-01

    One hundred twelve cases of blunt splenic rupture were prospectively entered (October 1987-October 1991) into surgical or nonsurgical management groups, using these criteria for the nonsurgical group: hemodynamic stability; age less than 55 years; CT scan appearance of grade I, II, or III injury; absence of concomitant injuries precluding abdominal assessment; and absence of other documented abdominal injuries. All ages were included and AAST injury scaling was used. Patients were grouped from the trauma room. The surgical treatment group included 66 patients (49 splenectomies, 17 splenorraphies). These patients were generally older and more severely injured, required more transfused blood, and had a longer ICU stay. The nonsurgical group included 46 patients, 33 of them older than 14 years. Three patients over the age of 55 years were inappropriately included in this group, and nonsurgical therapy failed in all three. Statistical analysis (chi-square) showed that more splenic injuries were observed and more spleens were saved with these criteria applied prospectively compared with a previous retrospective series in the same institution. The series had a success rate of 93%, validates the criteria used for safe, nonsurgical management of the ruptured spleen, and adds a new criterion: a maximum age of 55 years.

  15. Prospective validation of criteria, including age, for safe, nonsurgical management of the ruptured spleen

    International Nuclear Information System (INIS)

    Smith, J.S. Jr.; Wengrovitz, M.A.; DeLong, B.S.

    1992-01-01

    One hundred twelve cases of blunt splenic rupture were prospectively entered (October 1987-October 1991) into surgical or nonsurgical management groups, using these criteria for the nonsurgical group: hemodynamic stability; age less than 55 years; CT scan appearance of grade I, II, or III injury; absence of concomitant injuries precluding abdominal assessment; and absence of other documented abdominal injuries. All ages were included and AAST injury scaling was used. Patients were grouped from the trauma room. The surgical treatment group included 66 patients (49 splenectomies, 17 splenorraphies). These patients were generally older and more severely injured, required more transfused blood, and had a longer ICU stay. The nonsurgical group included 46 patients, 33 of them older than 14 years. Three patients over the age of 55 years were inappropriately included in this group, and nonsurgical therapy failed in all three. Statistical analysis (chi-square) showed that more splenic injuries were observed and more spleens were saved with these criteria applied prospectively compared with a previous retrospective series in the same institution. The series had a success rate of 93%, validates the criteria used for safe, nonsurgical management of the ruptured spleen, and adds a new criterion: a maximum age of 55 years

  16. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
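The proposed approach can be illustrated as follows: each probabilistic sensitivity analysis (PSA) run either falls inside the pre-established accuracy interval around the empirical outcome or it does not, and a Beta-Binomial model then yields a posterior distribution for the probability that the model outcome is valid. This is a hypothetical sketch of that idea, not the authors' implementation:

```python
def validity_posterior(psa_outcomes, observed, rel_accuracy=0.25, a=1.0, b=1.0):
    """Posterior on the probability that model outcomes are valid.

    psa_outcomes : model outcomes, one per PSA iteration
    observed     : the empirical outcome being validated against
    rel_accuracy : half-width of the accuracy interval, relative to observed
    a, b         : Beta prior parameters (uniform prior by default)

    Returns (alpha, beta, posterior mean) of the Beta posterior on the
    probability that an outcome falls inside the accuracy interval.
    """
    lo = observed * (1.0 - rel_accuracy)
    hi = observed * (1.0 + rel_accuracy)
    k = sum(1 for x in psa_outcomes if lo <= x <= hi)  # runs inside interval
    n = len(psa_outcomes)
    alpha, beta = a + k, b + (n - k)
    return alpha, beta, alpha / (alpha + beta)
```

Because the count of in-interval runs drives the posterior, a model that predicts the mean correctly but with overly wide credible intervals is penalized: many PSA runs land outside the accuracy interval and the posterior probability of validity drops.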

  17. Literature research concerning alternative methods for validation of criticality calculation systems

    International Nuclear Information System (INIS)

    Behler, Matthias

    2016-05-01

    Besides radiochemical analysis of irradiated fuel and critical experiments, which have become a well-established basis for the validation of depletion codes and criticality codes respectively, the results of oscillation experiments or the operating conditions of power reactors and research reactors can also provide useful information for the validation of the above-mentioned codes. Based on a literature review, the potential of utilizing oscillation experiment measurements for the validation of criticality codes is estimated. It is found that the reactivity measurements for actinides and fission products within the CERES program on the reactors DIMPLE (Winfrith, UK) and MINERVE (Cadarache, France) can be a valuable addition to the commonly used critical experiments for criticality code validation. However, there are approaches, but as yet no generally satisfactory solution, for integrating the reactivity measurements into a quantitative bias determination for the neutron multiplication factor of typical application cases, including irradiated spent fuel outside reactor cores, calculated using common criticality codes.

  18. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

    Full Text Available Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated as per ICH guidelines for parameters such as precision, linearity, accuracy, solution stability, robustness, and limits of detection and quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied to the quantification of menthol in suppository formulations. Quality control departments and official pharmacopoeias can use the developed method for the analysis of menthol in pharmaceutical dosage formulations and raw materials.

  19. Transverse Crack Modeling and Validation in Rotor Systems, Including Thermal Effects

    Directory of Open Access Journals (Sweden)

    N. Bachschmid

    2003-01-01

    Full Text Available This article describes a model that allows the simulation of the static behavior of a transverse crack in a horizontal rotor under the action of weight and other possible static loads, and the dynamic behavior of the cracked rotating shaft. The crack breathes: the mechanism of the crack's opening and closing is ruled by the stress on the cracked section exerted by the external loads. In a rotor, the stresses are time-dependent with a period equal to the period of rotation; thus, the crack periodically breathes. An original, simplified model allows cracks of various shapes to be modeled and thermal stresses to be taken into account, as they may influence the opening and closing mechanism. The proposed method was validated using two criteria. First, the crack's breathing mechanism, simulated by the model, was compared with the results obtained by a nonlinear, three-dimensional finite element model calculation, and good agreement in the results was observed. Then the proposed model allowed the development of the equivalent cracked beam. The results of this model were compared with those obtained by the three-dimensional finite element model. In this case as well, there was good agreement in the results.

  20. Validation of Likelihood Ratio Methods Used for Forensic Evidence Evaluation: Application in Forensic Fingerprints

    NARCIS (Netherlands)

    Haraksim, Rudolf

    2014-01-01

    In this chapter the Likelihood Ratio (LR) inference model will be introduced, the theoretical aspects of probabilities will be discussed and the validation framework for LR methods used for forensic evidence evaluation will be presented. Prior to introducing the validation framework, following

  1. Development and validation of a dissolution method using HPLC for diclofenac potassium in oral suspension

    Directory of Open Access Journals (Sweden)

    Alexandre Machado Rubim

    2014-04-01

    Full Text Available The present study describes the development and validation of an in vitro dissolution method for evaluating the release of diclofenac potassium from an oral suspension. The dissolution test was developed and validated according to international guidelines. Parameters such as linearity, specificity, precision and accuracy were evaluated, as well as the influence of rotation speed and surfactant concentration in the medium. After selecting the best conditions, the method was validated using apparatus 2 (paddle), a 50-rpm rotation speed, and 900 mL of water with 0.3% sodium lauryl sulfate (SLS) as the dissolution medium at 37.0 ± 0.5°C. Samples were analyzed using an HPLC-UV (PDA) method. The results obtained were satisfactory for the parameters evaluated. The method developed may be useful in routine quality control for pharmaceutical industries that produce oral suspensions containing diclofenac potassium.

  2. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated

  3. Development and validation of an alternative titration method for the determination of sulfate ion in indinavir sulfate

    Directory of Open Access Journals (Sweden)

    Breno de Carvalho e Silva

    2005-02-01

    Full Text Available A simple and rapid precipitation titration method was developed and validated to determine the sulfate ion content in indinavir sulfate raw material. A 0.1 mol L⁻¹ lead nitrate volumetric solution was used as titrant, with potentiometric endpoint determination using a lead-specific electrode. The United States Pharmacopoeia Forum indicates a potentiometric method for sulfate ion quantitation using 0.1 mol L⁻¹ lead perchlorate as titrant. Both methods were validated concerning linearity, precision and accuracy, yielding good results. The sulfate ion contents found by the two validated methods were compared using Student's t-test, indicating that there was no statistically significant difference between the methods.
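The statistical comparison mentioned above (Student's t-test on the sulfate contents found by the two methods) can be sketched with the standard pooled two-sample statistic. The data below are invented for illustration, and a real analysis would compare the statistic against the t distribution's critical value at the chosen significance level:

```python
import math
from statistics import mean, variance

def pooled_t_statistic(a, b):
    """Two-sample Student's t statistic with pooled variance.

    a, b: independent samples (e.g., sulfate contents found by
    each titration method on replicate determinations).
    """
    na, nb = len(a), len(b)
    # Pooled estimate of the common variance (assumes equal variances):
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1.0 / na + 1.0 / nb))

# Hypothetical sulfate contents (%) found by the two titration methods:
t = pooled_t_statistic([8.52, 8.49, 8.55], [8.50, 8.53, 8.51])
```

A |t| below the critical value for na + nb - 2 degrees of freedom supports the abstract's conclusion of no statistically significant difference between the methods.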

  4. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    Science.gov (United States)

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  5. Puerto Rican understandings of child disability: methods for the cultural validation of standardized measures of child health.

    Science.gov (United States)

    Gannotti, Mary E; Handwerker, W Penn

    2002-12-01

    Validating the cultural context of health is important for obtaining accurate and useful information from standardized measures of child health adapted for cross-cultural applications. This paper describes the application of ethnographic triangulation for cultural validation of a measure of childhood disability, the Pediatric Evaluation of Disability Inventory (PEDI) for use with children living in Puerto Rico. The key concepts include macro-level forces such as geography, demography, and economics, specific activities children performed and their key social interactions, beliefs, attitudes, emotions, and patterns of behavior surrounding independence in children and childhood disability, as well as the definition of childhood disability. Methods utilize principal components analysis to establish the validity of cultural concepts and multiple regression analysis to identify intracultural variation. Findings suggest culturally specific modifications to the PEDI, provide contextual information for informed interpretation of test scores, and point to the need to re-standardize normative values for use with Puerto Rican children. Without this type of information, Puerto Rican children may appear more disabled than expected for their level of impairment or not to be making improvements in functional status. The methods also allow for cultural boundaries to be quantitatively established, rather than presupposed. Copyright 2002 Elsevier Science Ltd.

  6. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means for testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for the representation of the fault occurrence process with the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool - MESSALINE - is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE for testing two fault-tolerant systems possessing very dissimilar features, and the utilization of the experimental results obtained both as design feedback and for the evaluation of dependability measures, are used to illustrate the relevance of the method. (author) [fr

  7. Membrane for distillation including nanostructures, methods of making membranes, and methods of desalination and separation

    KAUST Repository

    Lai, Zhiping; Huang, Kuo-Wei; Chen, Wei

    2016-01-01

    In accordance with the purpose(s) of the present disclosure, as embodied and broadly described herein, embodiments of the present disclosure provide membranes, methods of making the membrane, systems including the membrane, methods of separation, methods of desalination, and the like.

  8. Membrane for distillation including nanostructures, methods of making membranes, and methods of desalination and separation

    KAUST Repository

    Lai, Zhiping

    2016-01-21

    In accordance with the purpose(s) of the present disclosure, as embodied and broadly described herein, embodiments of the present disclosure provide membranes, methods of making the membrane, systems including the membrane, methods of separation, methods of desalination, and the like.

  9. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  10. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database to be used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future, and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving the Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels, and (4) the guide for reviewing the core analysis codes and others. (author)

  11. Validation study of core analysis methods for full MOX BWR

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    JNES has been developing a technical database to be used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future, and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving the Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO{sub 2} and MOX fuel rods, (3) analysis of isotopic composition data for UO{sub 2} and MOX fuels, and (4) the guide for reviewing the core analysis codes and others. (author)

  12. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  13. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  14. A method for the statistical interpretation of friction ridge skin impression evidence: Method development and validation.

    Science.gov (United States)

    Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J

    2018-04-03

The forensic fingerprint community has faced increasing amounts of criticism by scientific and legal commentators, challenging the validity and reliability of fingerprint evidence due to the lack of an empirically demonstrable basis to evaluate and report the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method provides fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and report the findings in a more transparent and standardized fashion, with clearly defined criteria for conclusions and known error rate information, thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.
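The mated/non-mated evaluation described above can be illustrated with a short sketch. The scores, threshold, and decision rule here are hypothetical stand-ins for illustration, not FRStat's actual statistic:

```python
# Sketch: estimating a specificity rate from non-mated comparison scores,
# in the spirit of the mated/non-mated evaluation described in the abstract.
# Scores and threshold are invented; FRStat's real statistic differs.

def specificity(non_mated_scores, threshold):
    """Fraction of non-mated comparisons correctly scored below the threshold."""
    below = sum(1 for s in non_mated_scores if s < threshold)
    return below / len(non_mated_scores)

non_mated = [0.12, 0.08, 0.31, 0.05, 0.22, 0.41, 0.09, 0.15, 0.27, 0.11]
print(specificity(non_mated, threshold=0.40))  # 9 of 10 below -> 0.9
```

A reported specificity rate greater than 99% corresponds to fewer than 1 in 100 non-mated comparisons crossing the decision threshold.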

  15. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Development and Validation of a Stability-Indicating LC-UV Method for Simultaneous Determination of Ketotifen and Cetirizine in Pharmaceutical Dosage Forms. ... 5 μm) using an isocratic mobile phase that consisted of acetonitrile and 10 mM disodium hydrogen phosphate buffer (pH 6.5) in a ratio of 45:55 % v/v at a flow ...

  16. Comparison of the performances and validation of three methods for ...

    African Journals Online (AJOL)

    SARAH

    2014-02-28

    Feb 28, 2014 ... bacteria in Norwegian slaughter pigs. Int J. Food Microbiol 1, 301–309. [NCFA] Nordic Committee of Food Analysis (1996). Yersinia enterocolitica Detection in foods 117,. 3rd,edn,1-12. Nowak, B., Mueffling, T.V., Caspari, K. and Hartung, J. 2006 Validation of a method for the detection of virulent Yersinia ...

  17. Validation of Cyanoacrylate Method for Collection of Stratum Corneum in Human Skin for Lipid Analysis

    DEFF Research Database (Denmark)

    Jungersted, JM; Hellgren, Lars; Drachmann, Tue

    2010-01-01

Background and Objective: Lipids in the stratum corneum (SC) are of major importance for the skin barrier function. Many different methods have been used for the collection of SC for the analysis of SC lipids. The objective of the present study was to validate the cyanoacrylate method for the collection of SC…

  18. The predictive validity of a situational judgement test, a clinical problem solving test and the core medical training selection methods for performance in specialty training.

    Science.gov (United States)

    Patterson, Fiona; Lopes, Safiatu; Harding, Stephen; Vaux, Emma; Berkin, Liz; Black, David

    2017-02-01

    The aim of this study was to follow up a sample of physicians who began core medical training (CMT) in 2009. This paper examines the long-term validity of CMT and GP selection methods in predicting performance in the Membership of Royal College of Physicians (MRCP(UK)) examinations. We performed a longitudinal study, examining the extent to which the GP and CMT selection methods (T1) predict performance in the MRCP(UK) examinations (T2). A total of 2,569 applicants from 2008-09 who completed CMT and GP selection methods were included in the study. Looking at MRCP(UK) part 1, part 2 written and PACES scores, both CMT and GP selection methods show evidence of predictive validity for the outcome variables, and hierarchical regressions show the GP methods add significant value to the CMT selection process. CMT selection methods predict performance in important outcomes and have good evidence of validity; the GP methods may have an additional role alongside the CMT selection methods. © Royal College of Physicians 2017. All rights reserved.

  19. Stepwise Procedure for Development and Validation of a Multipesticide Method

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

The stepwise procedure for the development and validation of so-called multi-pesticide methods is described. Principles, preliminary actions, criteria for the selection of chromatographic separation, detection and performance verification of multi-pesticide methods are outlined. The long-term repeatability and reproducibility, as well as the necessity for the documentation of laboratory work, are also highlighted. Appendix I hereof describes in detail the calculation of calibration parameters, whereas Appendix II focuses on the calculation of the significance of differences of concentrations obtained on two different separation columns. (author)

  20. Fault-tolerant clock synchronization validation methodology. [in computer systems

    Science.gov (United States)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.
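The stochastic step described above, estimating the probability that the clock read error exceeds its assumed upper bound from measured data, can be sketched as a simple empirical tail estimate. The sample values and bound below are invented for illustration:

```python
# Sketch: empirical estimate of the probability that the measured clock read
# error exceeds the upper bound assumed in the design proof. The error
# samples (microseconds) and the bound are hypothetical.

def exceedance_probability(read_errors, bound):
    """Empirical fraction of measured read errors exceeding the bound."""
    return sum(1 for e in read_errors if e > bound) / len(read_errors)

errors_us = [1.2, 0.8, 2.1, 0.5, 3.4, 1.9, 0.7, 2.8, 1.1, 4.6]
print(exceedance_probability(errors_us, bound=4.0))  # 1 of 10 -> 0.1
```

In the validation approach above, this probability would then feed into the detailed reliability analysis of the system rather than stand alone.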

  1. Transverse Crack Modeling and Validation in Rotor Systems Including Thermal Effects

    Directory of Open Access Journals (Sweden)

    N. Bachschmid

    2004-01-01

Full Text Available In this article, a model is described that allows one to simulate the static behavior of a transversal crack in a horizontal rotor, under the action of the weight and other possible static loads, and the dynamical behavior of the rotating cracked shaft. The crack “breathes,” i.e., the mechanism of opening and closing of the crack is ruled by the stress acting on the cracked section due to the external loads; in a rotor the stress is time-dependent with a period equal to the period of rotation, thus the crack “periodically breathes.” An original simplified model is described that allows cracks of different shapes to be modeled and thermal stresses to be taken into account, since they may influence the opening and closing mechanism. The proposed method has been validated using two criteria. Firstly, the crack “breathing” mechanism, simulated with the model, has been compared with the results obtained by a nonlinear 3-D FEM calculation, and a good agreement in the results has been observed. Secondly, the proposed model allows the development of the equivalent cracked beam. The results of this model are compared with those obtained by the above-mentioned 3-D FEM. There is a good agreement in the results in this case as well.

  2. MR-based water content estimation in cartilage: design and validation of a method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sophie; Ringgaard, Steffen

Purpose: Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Methods and Materials: Cartilage tissue T1 map based water content MR sequences were used on a temperature-stable (37 °C) system. The T1 map intensity signal was analyzed on 6 cartilage samples from living animals (pig) and on 8 gelatin samples whose water content was already known. For the data analysis a T1 intensity signal map software analyzer was used. Finally, the method was validated by measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained T1 map based water content sequences can provide information that, after being analyzed using a T1-map analysis software, can be interpreted as the water contained inside a cartilage tissue. The amount of water estimated using this method was similar to the one obtained with the dry-freeze procedure…

  3. Development and validation of stability indicating UPLC assay method for ziprasidone active pharma ingredient

    Directory of Open Access Journals (Sweden)

    Sonam Mittal

    2012-01-01

Full Text Available Background: Ziprasidone, a novel antipsychotic, exhibits a potent, highly selective antagonistic activity on D2 and 5HT2A receptors. A literature survey for ziprasidone revealed several analytical methods based on different techniques, but no UPLC method has been reported so far. Aim: The aim of this research paper is to present a simple and rapid stability-indicating isocratic ultra performance liquid chromatographic (UPLC) method which was developed and validated for the determination of ziprasidone active pharmaceutical ingredient. Forced degradation studies of ziprasidone were carried out under acid, base, oxidative hydrolysis, thermal stress and photo stress conditions. Materials and Methods: The quantitative determination of ziprasidone was performed on a Supelco analytical column (100×2.1 mm i.d., 2.7 µm) with 10 mM ammonium acetate buffer (pH 6.7) and acetonitrile (ACN) as mobile phase in the ratio 55:45 (buffer:ACN) at a flow rate of 0.35 ml/min. For the UPLC method, UV detection was made at 318 nm and the run time was 3 min. The developed UPLC method was validated as per ICH guidelines. Results and Conclusion: Mild degradation of the drug substance was observed during oxidative hydrolysis and considerable degradation during basic hydrolysis. During method validation, parameters such as precision, linearity, ruggedness, stability, robustness, and specificity were evaluated, and all remained within acceptable limits. The developed UPLC method was successfully applied for evaluating the assay of ziprasidone active pharmaceutical ingredient.

  4. Clashing Validities in the Comparative Method? Balancing In-Depth Understanding and Generalizability in Small-N Policy Studies

    NARCIS (Netherlands)

    van der Heijden, J.

    2013-01-01

The comparative method receives considerable attention in political science. To some a main advantage of the method is that it allows for both in-depth insights (internal validity), and generalizability beyond the cases studied (external validity). However, others consider internal and external…

  5. Interlaboratory validation of an improved U.S. Food and Drug Administration method for detection of Cyclospora cayetanensis in produce using TaqMan real-time PCR

    Science.gov (United States)

    A collaborative validation study was performed to evaluate the performance of a new U.S. Food and Drug Administration method developed for detection of the protozoan parasite, Cyclospora cayetanensis, on cilantro and raspberries. The method includes a sample preparation step in which oocysts are re...

  6. Development, validation and evaluation of an analytical method for the determination of monomeric and oligomeric procyanidins in apple extracts.

    Science.gov (United States)

    Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A

    2017-04-28

There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples; one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin which also contained oligomeric procyanidins (Extract A), the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved using a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5-100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to dp3, trueness of the method was >95% but decreased with increasing dp. Within-laboratory precision showed RSD values <5 and 10% for monomers and oligomers, respectively. Between-laboratory precision was 4 and 15% (Extract A) and 7 and 30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.
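Three of the performance characteristics named above, linearity (correlation coefficient), precision (RSD) and trueness, reduce to short calculations. The epicatechin calibration and replicate data below are invented for illustration:

```python
# Sketch of three common validation figures: calibration linearity
# (Pearson correlation), precision (relative standard deviation), and
# trueness (percent of the reference value). All data are invented.
import math
import statistics

def correlation(x, y):
    """Pearson correlation coefficient of paired data."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def rsd_percent(values):
    """Relative standard deviation in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def trueness_percent(measured_mean, reference):
    """Measured mean as a percentage of the reference value."""
    return 100 * measured_mean / reference

conc = [5, 10, 25, 50, 100]                   # standards, ug/mL
area = [51, 99, 252, 498, 1003]               # detector response
replicates = [24.8, 25.3, 24.6, 25.1, 24.9]   # repeat injections, ug/mL

print(correlation(conc, area))                 # close to 1 -> good linearity
print(rsd_percent(replicates))                 # intra-day precision, %
print(trueness_percent(statistics.mean(replicates), 25.0))
```

Acceptance limits such as "correlation coefficient >0.999" or "RSD <5%" are then simple threshold checks on these figures.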

  7. High-performance liquid chromatography method validation for determination of tetracycline residues in poultry meat

    Directory of Open Access Journals (Sweden)

    Vikas Gupta

    2014-01-01

Full Text Available Background: In this study, a method for the determination of tetracycline (TC) residues in poultry with the help of high-performance liquid chromatography was validated. Materials and Methods: The principal step involved ultrasonic-assisted extraction of TCs from poultry samples by 2 ml of 20% trichloroacetic acid and phosphate buffer (pH 4), which gave a clearer supernatant and high recovery, followed by centrifugation and purification using 0.22 μm filter paper. Results: The validation study revealed that all obtained calibration curves showed good linearity (r2 > 0.999) over the range of 40-4500 ng. Sensitivity was found to be 1.54 and 1.80 ng for oxytetracycline (OTC) and TC, respectively. Accuracy was in the range of 87.94-96.20% and 72.40-79.84% for meat. Precision was lower than 10% in all cases, indicating that the method can be used as a validated method. The limit of detection was found to be 4.8 and 5.10 ng for OTC and TC, respectively. The corresponding values of the limit of quantitation were 11 and 12 ng. Conclusion: The method reliably identifies and quantifies the selected TC and OTC in reconstituted poultry meat in the low and sub-nanogram range and can be applied in any laboratory.
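The detection and quantitation limits reported above can be estimated from a calibration line. The sketch below uses the common ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S (σ: residual standard deviation, S: slope), which the abstract itself does not state, and invented calibration points; the paper's own LOD/LOQ values come from its data, not from this sketch:

```python
# Sketch: LOD and LOQ from a least-squares calibration line using the
# ICH-style formulas LOD = 3.3*sigma/S and LOQ = 10*sigma/S.
# Calibration points are invented for illustration.
import math
import statistics

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    mx, my = statistics.mean(x), statistics.mean(y)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

def lod_loq(x, y):
    """LOD and LOQ from the residual standard deviation and slope."""
    slope, intercept = fit_line(x, y)
    residuals = [b - (slope * a + intercept) for a, b in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in residuals) / (len(x) - 2))
    return 3.3 * sigma / slope, 10 * sigma / slope

ng = [40, 100, 500, 1000, 2000]         # amount injected, ng
resp = [410, 1020, 5050, 10010, 19980]  # peak area (invented)
lod, loq = lod_loq(ng, resp)
print(lod, loq)
```

By construction LOQ/LOD = 10/3.3, which matches the roughly 2-3x gap between the LOD and LOQ values quoted in the abstract.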

  8. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: compatibility with LC/MS (free of detergents, etc.); high protein integrity (minimal level of protein degradation and non-biological PTMs); compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, with primary use for sample preparation method development and optimization, and pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  9. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    Science.gov (United States)

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge on downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need of expensive dedicated LTE decoders. The method is validated both in laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  10. Validation of three new methods for determination of metal emissions using a modified Environmental Protection Agency Method 301

    Energy Technology Data Exchange (ETDEWEB)

    Catherine A. Yanca; Douglas C. Barth; Krag A. Petterson; Michael P. Nakanishi; John A. Cooper; Bruce E. Johnsen; Richard H. Lambert; Daniel G. Bivins [Cooper Environmental Services, LLC, Portland, OR (United States)

    2006-12-15

    Three new methods applicable to the determination of hazardous metal concentrations in stationary source emissions were developed and evaluated for use in U.S. Environmental Protection Agency (EPA) compliance applications. Two of the three independent methods, a continuous emissions monitor-based method (Xact) and an X-ray-based filter method (XFM), are used to measure metal emissions. The third method involves a quantitative aerosol generator (QAG), which produces a reference aerosol used to evaluate the measurement methods. A modification of EPA Method 301 was used to validate the three methods for As, Cd, Cr, Pb, and Hg, representing three hazardous waste combustor Maximum Achievable Control Technology (MACT) metal categories (low volatile, semivolatile, and volatile). The measurement methods were evaluated at a hazardous waste combustor (HWC) by comparing measured with reference aerosol concentrations. The QAG, Xact, and XFM met the modified Method 301 validation criteria. All three of the methods demonstrated precisions and accuracies on the order of 5%. The measurement methods should be applicable to emissions from a wide range of sources, and the reference aerosol generator should be applicable to additional analytes. EPA recently approved an alternative monitoring petition for an HWC at Eli Lilly's Tippecanoe site in Lafayette, IN, in which the Xact is used for demonstrating compliance with the HWC MACT metal emissions (low volatile, semivolatile, and volatile). The QAG reference aerosol generator was approved as a method for providing a quantitative reference aerosol, which is required for certification and continuing quality assurance of the Xact. 30 refs., 5 figs., 11 tabs.
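A minimal sketch of the core comparison above, paired measured versus reference aerosol concentrations summarized as bias and precision. EPA Method 301's full statistical treatment (paired t-tests, correction factors) is richer than this, and the numbers below are hypothetical:

```python
# Sketch: comparing monitor readings against reference aerosol
# concentrations via mean percent bias and RSD precision. This is a
# simplified stand-in for Method 301's statistics; data are invented.
import statistics

def bias_percent(measured, reference):
    """Mean signed percent deviation of paired measurements from reference."""
    return statistics.mean(100 * (m - r) / r for m, r in zip(measured, reference))

def precision_rsd(measured):
    """Relative standard deviation of repeated measurements, in percent."""
    return 100 * statistics.stdev(measured) / statistics.mean(measured)

ref = [10.0, 10.0, 10.0, 10.0]   # reference aerosol, ug/m^3 (hypothetical)
meas = [10.3, 9.8, 10.1, 9.9]    # paired monitor readings

print(bias_percent(meas, ref))   # accuracy on the order of a few percent
print(precision_rsd(meas))
```

Precisions and accuracies "on the order of 5%", as reported above, correspond to both of these figures staying within roughly that band.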

  11. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  12. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    Science.gov (United States)

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of selected foods and environmental surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability, and ruggedness studies were also conducted. Assurance GDS has 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method for selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  13. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

Full Text Available Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic industrial example, a gates control system, was employed to illustrate our strategies. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The results of the experiments demonstrated the effectiveness of our approach.

  14. The truncated Wigner method for Bose-condensed gases: limits of validity and applications

    International Nuclear Information System (INIS)

    Sinatra, Alice; Lobo, Carlos; Castin, Yvan

    2002-01-01

We study the truncated Wigner method applied to a weakly interacting spinless Bose-condensed gas which is perturbed away from thermal equilibrium by a time-dependent external potential. The principle of the method is to generate an ensemble of classical fields ψ(r) which samples the Wigner quasi-distribution function of the initial thermal equilibrium density operator of the gas, and then to evolve each classical field with the Gross-Pitaevskii equation. In the first part of the paper we improve the sampling technique over our previous work (Sinatra et al 2000 J. Mod. Opt. 47 2629-44) and we test its accuracy against the exactly solvable model of the ideal Bose gas. In the second part of the paper we investigate the conditions of validity of the truncated Wigner method. For short evolution times it is known that the time-dependent Bogoliubov approximation is valid for almost pure condensates. The requirement that the truncated Wigner method reproduces the Bogoliubov prediction leads to the constraint that the number of field modes in the Wigner simulation must be smaller than the number of particles in the gas. For longer evolution times the nonlinear dynamics of the noncondensed modes of the field plays an important role. To demonstrate this we analyse the case of a three-dimensional spatially homogeneous Bose-condensed gas and we test the ability of the truncated Wigner method to correctly reproduce the Beliaev-Landau damping of an excitation of the condensate. We have identified the mechanism which limits the validity of the truncated Wigner method: the initial ensemble of classical fields, driven by the time-dependent Gross-Pitaevskii equation, thermalizes to a classical field distribution at a temperature T_class which is larger than the initial temperature T of the quantum gas. When T_class significantly exceeds T, a spurious damping is observed in the Wigner simulation. This leads to the second validity condition for the truncated Wigner method, T_class - T …

  15. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

Full Text Available Introduction. This article is the result of the authors' research in the field of development of approaches to the validation of quantitative determination methods for purposes of forensic and toxicological analysis, and is devoted to the problem of forming acceptability criteria for the validation parameter «linearity/calibration model». The aim of research. The purpose of this paper is to analyse the present approaches to estimating the acceptability of the calibration model chosen for method description according to the requirements of the international guidance documents, and to form our own approaches to estimating the acceptability of the linear dependence when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. UV-spectrophotometric method of doxylamine quantitative determination in blood. Results. The approaches to estimating the acceptability of calibration models when carrying out the validation of bioanalytical methods stated in international papers, namely «Guidance for Industry: Bioanalytical Method Validation» (U.S. FDA, 2001), «Standard Practices for Method Validation in Forensic Toxicology» (SWGTOX, 2012), «Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens» (UNODC, 2009) and «Guideline on validation of bioanalytical methods» (EMA, 2011), have been analysed. It has been suggested to be guided by domestic developments in the field of validation of analysis methods for medicines and, particularly, by the approaches to validation of methods in the variant of the calibration curve method, for forming the acceptability criteria of the obtained linear dependences when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. The choice of the method of calibration curve is
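One widely used acceptability criterion from the bioanalytical guidelines cited above (FDA 2001; EMA 2011) is that back-calculated calibration standards fall within ±15% of their nominal concentration (±20% at the LLOQ). A minimal sketch of that check, with invented data:

```python
# Sketch: checking back-calculated calibration standards against the
# +/-15% (or +/-20% at the LLOQ) acceptance limits used in common
# bioanalytical validation guidelines. Concentrations are invented.

def back_calc_ok(nominal, measured, tol=0.15, lloq_tol=0.20):
    """Per-standard pass/fail for back-calculated concentrations."""
    lloq = min(nominal)
    results = []
    for n, m in zip(nominal, measured):
        limit = lloq_tol if n == lloq else tol
        results.append(abs(m - n) / n <= limit)
    return results

nominal = [0.5, 1, 5, 10, 50]            # nominal standards
measured = [0.58, 1.05, 4.7, 10.9, 49.0]  # back-calculated values
print(back_calc_ok(nominal, measured))    # all within limits here
```

Criteria of this kind judge the calibration model by its back-calculated residuals rather than by the correlation coefficient alone.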

  16. Pressurised liquid extraction of flavonoids in onions. Method development and validation

    DEFF Research Database (Denmark)

    Søltoft, Malene; Christensen, J.H.; Nielsen, J.

    2009-01-01

A rapid and reliable analytical method for quantification of flavonoids in onions was developed and validated. Five extraction methods were tested on freeze-dried onions and subsequently high performance liquid chromatography (HPLC) with UV detection was used for quantification of seven flavonoids… However, PLE was the preferred extraction method because the method can be highly automated, use only small amounts of solvents, provide the cleanest extracts, and allow the extraction of light- and oxygen-sensitive flavonoids to be carried out in an inert atmosphere protected from light… The …-step PLE method showed good selectivity, precision (RSDs = 3.1-11%) and recovery of the extractable flavonoids (98-99%). The method also appeared to be a multi-method, i.e. generally applicable to, e.g., phenolic acids in potatoes and carrots.

  17. Application of EU guidelines for the validation of screening methods for veterinary drugs

    NARCIS (Netherlands)

    Stolker, A.A.M.

    2012-01-01

Commission Decision (CD) 2002/657/EC describes detailed rules for method validation within the framework of residue monitoring programmes. The approach described in this CD is based on criteria. For (qualitative) screening methods, the most important criterion is that the CCβ has to be below any

  18. Validity of the CT to attenuation coefficient map conversion methods

    International Nuclear Information System (INIS)

    Faghihi, R.; Ahangari Shahdehi, R.; Fazilat Moadeli, M.

    2004-01-01

The most important commercialized methods of attenuation correction in SPECT are based on an attenuation coefficient map obtained from a transmission imaging method. The transmission imaging system can be a linear radionuclide source or an X-ray CT system. The image from the transmission imaging system is not useful without replacement of the CT number with the attenuation coefficient at the SPECT energy. In this paper we attempt to evaluate the validity, and estimate the error, of the most widely used methods for this transformation. The final results show that the methods which use a linear or multi-linear curve accept an error in their estimation. The tube current (mA) is not important, but the patient thickness is very important and can introduce an error of more than 10 percent in the final result.
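The linear and multi-linear conversion curves discussed above can be sketched as a piecewise-linear mapping from CT number to attenuation coefficient. The water attenuation value and the bone-segment slope below are illustrative placeholders, since the real values depend on the scanner and photon energy:

```python
# Sketch: a bilinear CT-number (HU) to attenuation-coefficient conversion.
# Below water (HU <= 0) the attenuation scales linearly between air and
# water; above water a second, shallower slope stands in for the bone-like
# tissue mix. MU_WATER and the bone slope are placeholder values.

MU_WATER = 0.154  # cm^-1 at a nominal SPECT energy (placeholder)

def hu_to_mu(hu, bone_slope=0.9e-4):
    """Piecewise-linear conversion of a CT number to mu in cm^-1."""
    if hu <= 0:
        return MU_WATER * (1 + hu / 1000.0)  # air..water segment
    return MU_WATER + bone_slope * hu        # water..bone segment

print(hu_to_mu(-1000))  # air -> 0.0
print(hu_to_mu(0))      # water -> 0.154
```

A multi-linear curve simply adds further breakpoints; the abstract's point is that any such curve carries an estimation error that grows with patient thickness.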

  19. Validation of methods for the determination of radium in waters and soil

    International Nuclear Information System (INIS)

    Decaillon, J.-G.; Bickel, M.; Hill, C.; Altzitzoglou, T.

    2004-01-01

    This article describes the advantages and disadvantages of several analytical methods used to prepare the alpha-particle source. As a result of this study, a new method combining commercial extraction and ion chromatography prior to a final co-precipitation step is proposed. This method has been applied and validated on several matrices (soil, waters) in the framework of international intercomparisons. The integration of this method in a global procedure to analyze actinoids and radium from a single solution (or digested soil) is also described

  20. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    Science.gov (United States)

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are the key to obtaining accurate information to enable the planning of CVD prevention programmes, a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library were searched for studies conducted between 2008 and 2012, in the English language and among humans, using the following databases: PubMed, CINAHL, PsycINFO and ProQuest. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. Methods identified for the development of lifestyle CVD risk factor questionnaires were: review of literature, either systematic or traditional; involvement of experts and/or the target population using focus group discussions/interviews; clinical experience of the authors; and deductive reasoning of the authors. For validation, the methods used were the involvement of an expert panel, the use of the target population and factor analysis. Combining methods produces questionnaires with good content validity and other psychometric properties.

  1. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical and validation studies were performed in this paper, with a view to using them in stability studies of future formulations of naproxen suppositories for children and adults. The factors most influential on naproxen stability were determined: the major degradation occurred in acid medium, in oxidative medium and under the action of light. A high-performance liquid chromatography-based method was evaluated, which proved adequate to quantify naproxen in suppositories and was selective against degradation products. The quantification limit was 3,480 μg, so it was valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which was formerly validated for quality control and showed satisfactory results. Nevertheless, the volumetric methods were not regarded as stability-indicating; therefore, this method will be used along with the chromatographic methods of choice, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products.

  2. Determination of total arsenic in fish by hydride-generation atomic absorption spectrometry: method validation, traceability and uncertainty evaluation

    Science.gov (United States)

    Nugraha, W. C.; Elishian, C.; Ketrin, R.

    2017-03-01

    Arsenic compounds in fish are an important indicator of arsenic contamination in water monitoring. The high level of arsenic in fish is due to absorption through the food chain and accumulation in the fish's habitat. Hydride generation (HG) coupled with atomic absorption spectrometric (AAS) detection is one of the most popular techniques employed for arsenic determination in a variety of matrices, including fish. This study aimed to develop a method for the determination of total arsenic in fish by HG-AAS. The sample preparation of Association of Official Analytical Chemists (AOAC) Method 999.10-2005 was adopted for acid digestion using a microwave digestion system, and that of AOAC Method 986.15-2005 for dry ashing. The method was developed and validated using the Certified Reference Material DORM-3 Fish Protein for trace metals to ensure the accuracy and traceability of the results. The sources of uncertainty of the method were also evaluated. Using the method, the total arsenic concentration in the fish was found to be 45.6 ± 1.22 mg kg-1 with a coverage factor equal to 2 at the 95% confidence level. The evaluation of uncertainty was dominated by the calibration curve. The result was also traceable to the international standard system through analysis of the Certified Reference Material DORM-3, with 97.5% recovery. In summary, the preparation methods and the HG-AAS technique for total arsenic determination in fish were shown to be valid and reliable.
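    The expanded-uncertainty reporting described in this record (coverage factor k = 2 at the 95% confidence level) follows the usual GUM-style combination of independent components in quadrature. A minimal sketch, with an entirely hypothetical uncertainty budget in which the calibration curve dominates, as the abstract indicates:

```python
import math

# Hypothetical uncertainty budget (all component values invented) for a
# total-arsenic result in mg/kg, showing how an expanded uncertainty with
# coverage factor k = 2 (~95 % confidence) is assembled.
result = 45.6
components = {
    "calibration curve": 0.55,     # dominant source, as the abstract notes
    "recovery": 0.20,
    "repeatability": 0.15,
    "sample mass and volume": 0.05,
}

# Independent standard uncertainties combine in quadrature.
u_c = math.sqrt(sum(u ** 2 for u in components.values()))
U = 2 * u_c                        # expanded uncertainty, k = 2
print(f"{result} ± {U:.2f} mg/kg (k = 2)")
```

    With this invented budget the expanded uncertainty comes out close to the reported value, illustrating why a dominant calibration component drives the final figure.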

  3. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Science.gov (United States)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of the measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of the software fragment that calculates the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results, and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve the results obtained by classic Monte Carlo tools. The Latin Hypercube Sampling (LHS) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
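    The LHS scheme contrasted with simple sampling in this record can be sketched in a few lines: stratify each input dimension into equal-probability intervals, draw one jittered point per interval, and permute the strata independently per dimension. The function name and all parameters below are illustrative:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin Hypercube Sampling: exactly one point per equal-probability
    stratum in each dimension, strata independently permuted per dimension."""
    rng = np.random.default_rng(seed)
    # One jittered point inside each of the n_samples strata of [0, 1).
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_dims))) / n_samples
    for d in range(n_dims):          # decouple the dimensions
        rng.shuffle(u[:, d])
    return u

lhs = latin_hypercube(100, 2, seed=0)
srs = np.random.default_rng(0).random((100, 2))   # plain Monte Carlo draw
# Unlike simple random sampling, every marginal stratum is hit exactly once:
hits = np.floor(lhs * 100).astype(int)
```

    Each column of `hits` is a permutation of 0..99; this guaranteed marginal coverage is what typically lowers the variance of Monte Carlo estimates compared with simple sampling.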

  4. Simple Methods for Scanner Drift Normalization Validated for Automatic Segmentation of Knee Magnetic Resonance Imaging

    DEFF Research Database (Denmark)

    Dam, Erik Bjørnager

    2018-01-01

    Scanner drift is a well-known magnetic resonance imaging (MRI) artifact characterized by gradual signal degradation and scan intensity changes over time. In addition, hardware and software updates may imply abrupt changes in signal. The combined effects are particularly challenging for automatic image analysis methods used in longitudinal studies. The implication is increased measurement variation and a risk of bias in the estimations (e.g. in the volume change for a structure). We proposed two quite different approaches for scanner drift normalization and demonstrated the performance for segmentation of knee MRI using the fully automatic KneeIQ framework. The validation included a total of 1975 scans from both high-field and low-field MRI. The results demonstrated that the pre-processing method denoted Atlas Affine Normalization significantly removed scanner drift effects and ensured...

  5. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  6. Comparison of validation methods for forming simulations

    Science.gov (United States)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre-reinforced thermoplastics could reduce development time and improve forming results. But to exploit the full potential of the simulations, it has to be ensured that the predictions of material behaviour are correct. For that reason, a thorough validation of the material model has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are, for example, the outer contour, the occurrence of defects and the fibre paths. Various methods are available to measure these features. Most relevant, and also most difficult to measure, are the emerging fibre orientations, so the focus of this study was on measuring this feature. The aim was to give an overview of the properties of different measuring systems and to select the most promising systems for a comparison survey. An optical system, an eddy-current system and a computer-assisted tomography system were selected, with the focus on measuring fibre orientations. Different formed 3D parts made of unidirectional glass-fibre- and carbon-fibre-reinforced thermoplastics were measured. Advantages and disadvantages of the tested systems were revealed. Optical measurement systems are easy to use but are limited to the surface plies. With an eddy-current system, lower plies can also be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system, all plies can be measured, but the system is limited to small parts and challenging to evaluate.

  7. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    Science.gov (United States)

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  8. Validation of histamine determination Method in yoghurt using High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    M Jahedinia

    2014-02-01

    Full Text Available Biogenic amines are organic, basic nitrogenous compounds of low molecular weight that are mainly generated by the enzymatic decarboxylation of amino acids by microorganisms. Dairy products are among the foods with the highest amine content. A wide variety of methods and procedures for the determination of histamine and biogenic amines have been established; amongst them, HPLC is considered the reference method. The aim of this study was to validate a reversed-phase HPLC method for the determination of histamine in yoghurt. The mobile phase consisted of acetonitrile/water (18:88 v/v) and the flow rate was set at 0.5 ml/min using isocratic HPLC. Detection was carried out at 254 nm using a UV detector. The calibration curve constructed from the peak areas of standards was linear, and the correlation coefficient (r2) was estimated at 0.998. Good recoveries were observed for histamine at all spiking levels, with an average recovery of 84%. The RSD value from the repeatability test was found to be 4.4%. The limit of detection and limit of quantitation were 0.14 and 0.42 µg/ml, respectively. The results of the validation tests showed that the method is reliable and rapid for the quantification of histamine in yoghurt.
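    The calibration-based figures of merit reported in this record (linearity r2, LOD, LOQ) are commonly derived from a least-squares fit of peak area against concentration, with LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the fit and S the slope. A sketch with made-up calibration data, not the study's measurements:

```python
import numpy as np

# Invented histamine calibration data: standard concentrations (ug/mL)
# versus HPLC-UV peak areas (arbitrary units).
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
area = np.array([12.1, 24.3, 48.9, 97.2, 195.0])

# Least-squares calibration line and coefficient of determination r^2.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# ICH-style limits from the residual standard deviation of the fit:
# LOD = 3.3*sigma/S, LOQ = 10*sigma/S.
sigma = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
```

    The fixed 10/3.3 ratio between LOQ and LOD is a direct consequence of the two formulas sharing the same σ and slope.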

  9. Development and Validation of HPLC-DAD and UHPLC-DAD Methods for the Simultaneous Determination of Guanylhydrazone Derivatives Employing a Factorial Design.

    Science.gov (United States)

    Azevedo de Brito, Wanessa; Gomes Dantas, Monique; Andrade Nogueira, Fernando Henrique; Ferreira da Silva-Júnior, Edeildo; Xavier de Araújo-Júnior, João; Aquino, Thiago Mendonça de; Adélia Nogueira Ribeiro, Êurica; da Silva Solon, Lilian Grace; Soares Aragão, Cícero Flávio; Barreto Gomes, Ana Paula

    2017-08-30

    Guanylhydrazones are molecules with great pharmacological potential in various therapeutic areas, including antitumoral activity. Factorial design is an excellent tool in the optimization of a chromatographic method, because factors such as temperature, mobile phase composition, mobile phase pH and column length can be changed quickly to establish the optimal conditions of analysis. The aim of the present work was to develop and validate HPLC and UHPLC methods for the simultaneous determination of guanylhydrazones with anticancer activity, employing experimental design. Precise, accurate, linear and robust HPLC and UHPLC methods were developed and validated for the simultaneous quantification of the guanylhydrazones LQM10, LQM14, and LQM17. The UHPLC method was more economical, with four times lower solvent consumption and a 20-fold smaller injection volume, which allowed better column performance. Comparing the empirical approach employed in the HPLC method development to the DoE approach employed in the UHPLC method development, we can conclude that the factorial design made method development faster, more practical and more rational. This resulted in methods that can be employed in the analysis, evaluation and quality control of these new synthetic guanylhydrazones.

  10. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    Science.gov (United States)

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method, processing up to 24 samples simultaneously, with analysis in the CLC Genomics Workbench and the AQME (AFDIL-QIAGEN mtDNA Expert) tool used to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein, from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Performance Validity Testing in Neuropsychology: Methods for Measurement Development and Maximizing Diagnostic Accuracy.

    Science.gov (United States)

    Wodushek, Thomas R; Greher, Michael R

    2017-05-01

    In the first column in this 2-part series, Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review, the authors introduced performance validity tests (PVTs) and their function, provided a justification for why they are necessary, traced their ongoing endorsement by neuropsychological organizations, and described how they are used and interpreted by ever increasing numbers of clinical neuropsychologists. To enhance readers' understanding of these measures, this second column briefly describes common detection strategies used in PVTs as well as the typical methods used to validate new PVTs and determine cut scores for valid/invalid determinations. We provide a discussion of the latest research demonstrating how neuropsychologists can combine multiple PVTs in a single battery to improve sensitivity/specificity to invalid responding. Finally, we discuss future directions for the research and application of PVTs.

  12. Optimization and validation of Folin-Ciocalteu method for the determination of total polyphenol content of Pu-erh tea.

    Science.gov (United States)

    Musci, Marilena; Yao, Shicong

    2017-12-01

    Pu-erh tea is a post-fermented tea that has recently gained popularity worldwide, due to potential health benefits related to the antioxidant activity resulting from its high polyphenolic content. The Folin-Ciocalteu method is a simple, rapid, and inexpensive assay widely applied for the determination of total polyphenol content. Over the past years, it has been subjected to many modifications, often without any systematic optimization or validation. In our study, we sought to optimize the Folin-Ciocalteu method, evaluate quality parameters including linearity, precision and stability, and then apply the optimized model to determine the total polyphenol content of 57 Chinese teas, including green tea and aged and ripened Pu-erh tea. Our optimized Folin-Ciocalteu method reduced analysis time and made it possible to analyse a large number of samples, to discriminate among the different teas, and to assess the effect of the post-fermentation process on polyphenol content.

  13. Validation method for determination of cholesterol in human urine with electrochemical sensors using gold electrodes

    Science.gov (United States)

    Riyanto, Laksono, Tomy Agung

    2017-12-01

    Electrochemical sensors for the determination of cholesterol, with gold (Au) as the working electrode, and their application to the analysis of urine are described. The gold electrode was prepared from pure gold (99.99%), 1.0 mm in length and width, connected to a silver wire using silver conductive paint. The validation of a method for the analysis of cholesterol in human urine using electrochemical sensors, i.e. the cyclic voltammetry (CV) method, has been investigated. The effects of electrolyte and uric acid concentration were determined to produce the optimum method. The validation parameters for cholesterol analysis in human urine using CV are precision, recovery, linearity, limit of detection (LOD) and limit of quantification (LOQ). The results showed that the correlation of cholesterol concentration to anodic peak current had a coefficient of determination of R2 = 0.916. The precision, recovery, linearity, LOD and LOQ were 1.2539%, 144.33%, 0.916, 1.49 × 10-1 mM and 4.96 × 10-1 mM, respectively. In conclusion, the Au electrode is a good electrode for electrochemical sensing in the determination of cholesterol in human urine.

  14. Optimization and validation of high-performance liquid chromatography method for analyzing 25-desacetyl rifampicin in human urine

    Science.gov (United States)

    Lily; Laila, L.; Prasetyo, B. E.

    2018-03-01

    A selective, reproducible, effective, sensitive, simple and fast high-performance liquid chromatography (HPLC) method was developed, optimized and validated to analyze 25-desacetyl rifampicin (25-DR) in urine from tuberculosis patients. The separation was performed on an Agilent Technologies HPLC with an Agilent Eclipse XDB-C18 column and a mobile phase of 65:35 v/v methanol : 0.01 M sodium phosphate buffer pH 5.2, at 254 nm and a flow rate of 0.8 ml/min. The mean retention time was 3.016 minutes. The method was linear from 2-10 μg/ml 25-DR with a correlation coefficient of 0.9978. The standard deviation, relative standard deviation and coefficient of variation for 2, 6 and 10 μg/ml 25-DR were 0-0.0829, 0-3.1752 and 0-0.0317%, respectively. The recovery of 5, 7 and 9 μg/ml 25-DR was 80.8661%, 91.3480% and 111.1457%, respectively. The limits of detection (LOD) and quantification (LOQ) were 0.51 and 1.7 μg/ml, respectively. The method fulfilled the validity guidelines of the International Conference on Harmonisation (ICH) for bioanalytical methods, which include the parameters specificity, linearity, precision, accuracy, LOD and LOQ. The developed method is suitable for pharmacokinetic analysis of various concentrations of 25-DR in human urine.

  15. Low-cost extrapolation method for maximal lte radio base station exposure estimation: Test and validation

    International Nuclear Information System (INIS)

    Verloock, L.; Joseph, W.; Gati, A.; Varsier, N.; Flach, B.; Wiart, J.; Martens, L.

    2013-01-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of down-link band occupation or service characteristics is required for the low-cost method, and it is applicable in situ. It only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2x2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. (authors)

  16. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). To review the available laryngeal palpation methods used in patients with MTD for assessment, diagnosis, or documentation of treatment outcomes, a systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. Five main palpation methods, as well as miscellaneous ones, were identified; they differed in target anatomical structures, judgment or grading system, and the tasks used. Only a few scales were available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension in both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence of muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  17. 11th GCC Closed Forum: cumulative stability; matrix stability; immunogenicity assays; laboratory manuals; biosimilars; chiral methods; hybrid LBA/LCMS assays; fit-for-purpose validation; China Food and Drug Administration bioanalytical method validation.

    Science.gov (United States)

    Islam, Rafiq; Briscoe, Chad; Bower, Joseph; Cape, Stephanie; Arnold, Mark; Hayes, Roger; Warren, Mark; Karnik, Shane; Stouffer, Bruce; Xiao, Yi Qun; van der Strate, Barry; Sikkema, Daniel; Fang, Xinping; Tudoroniu, Ariana; Tayyem, Rabab; Brant, Ashley; Spriggs, Franklin; Barry, Colin; Khan, Masood; Keyhani, Anahita; Zimmer, Jennifer; Caturla, Maria Cruz; Couerbe, Philippe; Khadang, Ardeshir; Bourdage, James; Datin, Jim; Zemo, Jennifer; Hughes, Nicola; Fatmi, Saadya; Sheldon, Curtis; Fountain, Scott; Satterwhite, Christina; Colletti, Kelly; Vija, Jenifer; Yu, Mathilde; Stamatopoulos, John; Lin, Jenny; Wilfahrt, Jim; Dinan, Andrew; Ohorodnik, Susan; Hulse, James; Patel, Vimal; Garofolo, Wei; Savoie, Natasha; Brown, Michael; Papac, Damon; Buonarati, Mike; Hristopoulos, George; Beaver, Chris; Boudreau, Nadine; Williard, Clark; Liu, Yansheng; Ray, Gene; Warrino, Dominic; Xu, Allan; Green, Rachel; Hayward-Sewell, Joanne; Marcelletti, John; Sanchez, Christina; Kennedy, Michael; Charles, Jessica St; Bouhajib, Mohammed; Nehls, Corey; Tabler, Edward; Tu, Jing; Joyce, Philip; Iordachescu, Adriana; DuBey, Ira; Lindsay, John; Yamashita, Jim; Wells, Edward

    2018-04-01

    The 11th Global CRO Council Closed Forum was held in Universal City, CA, USA on 3 April 2017. Representatives from international CRO members offering bioanalytical services were in attendance in order to discuss scientific and regulatory issues specific to bioanalysis. The second CRO-Pharma Scientific Interchange Meeting was held on 7 April 2017, which included Pharma representatives' sharing perspectives on the topics discussed earlier in the week with the CRO members. The issues discussed at the meetings included cumulative stability evaluations, matrix stability evaluations, the 2016 US FDA Immunogenicity Guidance and recent and unexpected FDA Form 483s on immunogenicity assays, the bioanalytical laboratory's role in writing PK sample collection instructions, biosimilars, CRO perspectives on the use of chiral versus achiral methods, hybrid LBA/LCMS assays, applications of fit-for-purpose validation and, at the Global CRO Council Closed Forum only, the status and trend of current regulated bioanalytical practice in China under CFDA's new BMV policy. Conclusions from discussions of these topics at both meetings are included in this report.

  18. Unsteady panel method for complex configurations including wake modeling

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2008-01-01

    Full Text Available Implementations of the DLM are, however, not very versatile in terms of the geometries that can be modeled. The ZONA6 code offers a versatile surface panel body model, including a separated wake model, but uses a pressure panel method for lifting surfaces. This paper...

  19. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were attained. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has proven efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Columbia River Stock Identification Study; Validation of Genetic Method, 1980-1981 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Milner, George B.; Teel, David J.; Utter, Fred M. (Northwest and Alaska Fisheries Science Center, Coastal Zone and Estuarine Studies Division, Seattle, WA)

    1981-06-01

    The reliability of a method for obtaining maximum likelihood estimates of component stocks in mixed populations of salmonids, based on the frequencies of genetic variants in a mixed population and in potentially contributing stocks, was tested in 1980. A data base of 10 polymorphic loci from 14 hatchery stocks of spring chinook salmon of the Columbia River was used to estimate the proportions of these stocks in four different "blind" mixtures whose true composition was revealed only after the estimates were obtained. The accuracy and precision of these blind tests have validated the genetic method as a valuable means of identifying components of stock mixtures. Properties of the genetic method were further examined by simulation studies using the pooled data of the four blind tests as a mixed fishery. Replicated tests with sample sizes between 100 and 1,000 indicated that actual standard deviations on estimated contributions were consistently lower than calculated standard deviations; this difference diminished as sample size increased. It is recommended that future applications of the method be preceded by simulation studies that identify the levels of sampling required for acceptable accuracy and precision. Variables in such studies include the stocks involved, the loci used, and the genetic differentiation among stocks. 8 refs., 6 figs., 4 tabs.
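    Maximum likelihood mixed-stock estimation of this kind is often computed with a conditional EM algorithm: given each stock's known genotype-class probabilities, iterate between posterior stock memberships (E-step) and updated mixture proportions (M-step). All numbers below are invented for illustration and are not the Columbia River data:

```python
import numpy as np

# Hypothetical baseline: 3 stocks with known probabilities over 4
# observable genotype classes (rows sum to 1).
P = np.array([[0.7, 0.1, 0.1, 0.1],
              [0.1, 0.7, 0.1, 0.1],
              [0.1, 0.1, 0.4, 0.4]])
true_pi = np.array([0.5, 0.3, 0.2])      # "blind" mixture composition

# Simulate a mixed sample of 1000 fish: draw a stock, then a genotype class.
rng = np.random.default_rng(1)
stocks = rng.choice(3, size=1000, p=true_pi)
obs = np.array([rng.choice(4, p=P[s]) for s in stocks])
counts = np.bincount(obs, minlength=4)

# EM iteration converging to the maximum likelihood proportions pi.
pi = np.full(3, 1 / 3)
for _ in range(200):
    w = pi[:, None] * P                   # joint prob, shape (stocks, classes)
    w /= w.sum(axis=0, keepdims=True)     # E-step: posterior stock membership
    pi = (w * counts).sum(axis=1) / counts.sum()   # M-step: reweight by counts

print(np.round(pi, 2))   # estimates land near true_pi for a sample this size
```

    With a large enough sample, the estimate tracks the true composition to within sampling error, which mirrors the record's recommendation to size samples via simulation before applying the method.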

  1. Shielding design method for LMFBR validation on the Phenix reactor

    International Nuclear Information System (INIS)

    Cabrillat, J.C.; Crouzet, J.; Misrakis, J.; Salvatores, M.; Rado, V.; Palmiotti, G.

    1983-05-01

    Shielding design methods developed at CEA for shielding calculations find a global validation by means of measurements on the Phenix power reactor (250 MWe). In particular, the secondary sodium activation of a pool-type LMFBR such as Super Phenix (1200 MWe), which is subject to strict safety limitations, is well calculated by the adapted scheme, i.e. a two-dimensional transport calculation of the shielding coupled to a Monte Carlo calculation of the secondary sodium activation

  2. The Validation of AAN Method Used by Rock Sample SRM 2780

    International Nuclear Information System (INIS)

    Rina Mulyaningsih, Th.

    2004-01-01

    The AAN method is a non-standard testing method. A testing laboratory must validate the methods it uses to ensure and confirm that they are suitable for the application. The analysis of SRM 2780 Hard Rock Mine Waste with 9 replicates has been carried out to test the accuracy of the AAN method. The results showed that the elements As, Ba, Mn, V, Zn and Na have good accuracy when evaluated against the acceptance criteria for accuracy at the 95% confidence level. The elements As, Co, Sc, Cr, Ba, Sb, Cs, Mn, V, Au, Zn and Na have a low relative bias between the analyst's value and the target value. Further testing must be done to assess the accuracy of the other certified elements. (author)

  3. A Validated, Rapid HPLC-ESI-MS/MS Method for the Determination of Lycopsamine.

    Science.gov (United States)

    Jedlinszki, Nikoletta; Csupor, Dezső

    2015-07-01

    The aim of the present work was to develop and validate an HPLC-MS/MS method for the determination of a major pyrrolizidine alkaloid of comfrey (lycopsamine) in aqueous samples, as a basis for the development of a method for determining the absorption of lycopsamine by human skin. A linear calibration curve was established in the range of 1.32-440 ng. The intraday precision during the 3-day validation period ranged between 0.57% and 2.48%, while the interday precision was 1.70% and 1.95% for quality control samples. The LOD was 0.014 ng and recovery was above 97%. The lycopsamine content of samples stored for 9 and 25 days at 22 °C, 10 °C and -25 °C did not vary. These results underline the good repeatability and accuracy of our method and allow the analysis of samples with very low lycopsamine content.

  4. Validation of a stability-indicating spectrometric method for the determination of sulfacetamide sodium in pure form and ophthalmic preparations

    Directory of Open Access Journals (Sweden)

    Sofia Ahmed

    2017-01-01

    Full Text Available Introduction: Sulfacetamide sodium is a widely used sulfonamide for ophthalmic infections. Objective: A number of analytical methods have been reported for the analysis of sulfacetamide, but they lack the ability to determine both the active drug and its major degradation product, sulfanilamide, simultaneously in a sample. Materials and Methods: In the present study a simple, rapid and economical stability-indicating UV spectrometric method has been validated for the simultaneous assay of sulfacetamide sodium and sulfanilamide in pure form and in ophthalmic preparations. Results: The method has been found to be accurate (recovery 100.03 ± 0.589%) and precise (RSD 0.587%), with detection and quantification limits of 1.67×10-6 M (0.04 mg%) and 5.07×10-6 M (0.13 mg%), respectively, for the assay of pure sulfacetamide sodium. The method is also robust to small changes in wavelength, pH and buffer concentration, as well as under forced degradation. The study further includes the validation of the method for the assay of pure sulfanilamide in solution, which has been found to be accurate, precise and robust. Conclusion: The results indicate that the proposed two-component spectrometric method is stability-indicating and can be used for the simultaneous assay of both sulfacetamide sodium and sulfanilamide in synthetic mixtures and degraded solutions.

  5. FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.

    Science.gov (United States)

    Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver

    2014-06-14

    Advances in sequencing technologies challenge the efficient importing and validation of FASTA-formatted sequence data, which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy are hampered. FastaValidator is a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software that needs to parse large amounts of sequence data quickly and accurately. For end-users, FastaValidator includes an interactive out-of-the-box validation of FASTA-formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualify it for large data sets such as those commonly produced by massively parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA-formatted sequence data.
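    A minimal illustration of the kind of well-formedness check such a validator performs (this is not the FastaValidator API; the function below is a hypothetical sketch):

```python
import re

def validate_fasta(text):
    """Hypothetical sketch of a FASTA well-formedness check: every record
    needs a '>' header followed by at least one sequence line of letters
    (plus '*' and '-' for stops and gaps)."""
    lines = [ln.strip() for ln in text.strip().splitlines() if ln.strip()]
    if not lines or not lines[0].startswith(">"):
        return False                      # data must begin with a header
    seq_seen = False
    for i, ln in enumerate(lines):
        if ln.startswith(">"):
            if i > 0 and not seq_seen:
                return False              # previous record had no sequence
            seq_seen = False
        elif re.fullmatch(r"[A-Za-z*-]+", ln):
            seq_seen = True
        else:
            return False                  # illegal characters in sequence
    return seq_seen                       # final record needs sequence too

print(validate_fasta(">seq1\nACGT"))       # True
print(validate_fasta(">seq1\n>seq2\nAC"))  # False: seq1 has no sequence
```

    A production validator would additionally stream the input rather than hold it in memory, which is the scalability concern the record raises for NGS-scale data.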

  6. Validity of a Simulation Game as a Method for History Teaching

    Science.gov (United States)

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  7. Validated spectrophotometric methods for simultaneous determination of troxerutin and carbazochrome in dosage form

    Science.gov (United States)

    Khattab, Fatma I.; Ramadan, Nesrin K.; Hegazy, Maha A.; Al-Ghobashy, Medhat A.; Ghoniem, Nermine S.

    2015-03-01

    Four simple, accurate, sensitive and precise spectrophotometric methods were developed and validated for the simultaneous determination of Troxerutin (TXN) and Carbazochrome (CZM) in their bulk powders, laboratory-prepared mixtures and pharmaceutical dosage forms. Method A is first derivative spectrophotometry (D1), in which TXN and CZM were determined at 294 and 483.5 nm, respectively. Method B is first derivative of ratio spectra (DD1), in which the peak amplitudes at 248 nm for TXN and 439 nm for CZM were used for their determination. Method C is ratio subtraction (RS), in which TXN was determined at its λmax (352 nm) in the presence of CZM, which in turn was determined by D1 at 483.5 nm. Method D is mean centering of the ratio spectra (MCR), in which the mean-centered values at 300 nm and 340 nm were used for the two drugs, respectively. The two compounds were simultaneously determined in the concentration ranges of 5.00-50.00 μg mL-1 for TXN and 0.5-10.0 μg mL-1 for CZM. The methods were validated according to the ICH guidelines and the results were statistically compared to those of the manufacturer's method.

  8. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Directory of Open Access Journals (Sweden)

    Nieciąg Halina

    2015-10-01

    Full Text Available Software is used in order to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses the method for conducting validation studies of a fragment of software to calculate the values of measurands. Due to the number and nature of the variables affecting the coordinate measurement results and the complex character and multi-dimensionality of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt at a possible improvement of the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
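Latin Hypercube Sampling improves on simple random sampling by placing exactly one sample in each of n equal-probability strata per input dimension, which typically reduces estimator variance for the same sample count. A minimal pure-Python sketch, assuming independent U(0,1) inputs (the function name and structure are illustrative):

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random):
    """LHS on the unit hypercube: one sample per equal-probability stratum
    in each dimension, with stratum order shuffled independently per dimension."""
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # pick one point inside each of the n strata, then shuffle their order
        points = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(points)
        for i in range(n_samples):
            samples[i][d] = points[i]
    return samples

random.seed(1)
lhs = latin_hypercube(10, 2)
# every dimension has exactly one point in each decile:
for d in range(2):
    strata = sorted(int(p[d] * 10) for p in lhs)
    assert strata == list(range(10))
```

In a simulation-based software validation study, these stratified points would be mapped through the inverse CDFs of the input quantities before being fed to the measurand model.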

  9. Survey and assessment of conventional software verification and validation methods

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-04-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation (V&V) of conventional software. The 134 methods so identified were classified according to their appropriateness for various phases of a development lifecycle -- requirements, design, and implementation; the last category was subdivided into static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease of use and four concerning the methods' power to detect defects. Based on these factors, two measures were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as for the system as a whole.
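The factor-based ranking described above can be sketched as a small scoring exercise: each V&V method gets four ease-of-use and four defect-detection ratings, which roll up into a benefit-per-cost score. The report's actual metric definitions are not given here, so the weighting formula, method names, and ratings below are all invented for illustration:

```python
# Hypothetical factor-based cost-benefit ranking of V&V methods.

methods = {
    # name: ([ease-of-use ratings 1-5], [detection-power ratings 1-5])
    "code inspection": ([4, 4, 3, 4], [3, 4, 3, 3]),
    "static analysis": ([5, 4, 4, 5], [3, 3, 2, 3]),
    "random testing":  ([3, 3, 4, 3], [4, 4, 3, 4]),
}

def cost_benefit(ease, power):
    """Benefit (mean detection power) per unit cost (inverse of mean ease).
    This roll-up formula is an assumption, not the report's definition."""
    cost = 6 - sum(ease) / len(ease)   # high ease-of-use => low cost
    benefit = sum(power) / len(power)
    return benefit / cost

ranked = sorted(methods, key=lambda m: cost_benefit(*methods[m]), reverse=True)
print(ranked)
```

With these made-up ratings, cheap-to-apply static analysis outranks the more powerful but costlier testing methods, illustrating why the report keeps cost-benefit and effectiveness as separate metrics.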

  10. Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study

    Science.gov (United States)

    Ogilvie, Emily; McCrudden, Matthew T.

    2017-01-01

    An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents…

  11. Validation of the quality control methods for active ingredients of Fungirex cream

    International Nuclear Information System (INIS)

    Perez Navarro, Maikel; Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania

    2014-01-01

    Fungirex cream is a product containing two active ingredients, undecylenic acid and zinc undecylenate, in a suitable base. Since the product is not documented in the official monographs of the pharmacopoeias, simple analytical methods were proposed for the quantitation of the analytes of interest in the cream, useful for the release of newly prepared cream batches. The objective of this work was to validate two volumetric methods for the quality control of the active ingredients in Fungirex cream.

  12. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    To predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry, we conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. UBCI can therefore mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  13. Catalyst support structure, catalyst including the structure, reactor including a catalyst, and methods of forming same

    Science.gov (United States)

    Van Norman, Staci A.; Aston, Victoria J.; Weimer, Alan W.

    2017-05-09

    Structures, catalysts, and reactors suitable for use for a variety of applications, including gas-to-liquid and coal-to-liquid processes and methods of forming the structures, catalysts, and reactors are disclosed. The catalyst material can be deposited onto an inner wall of a microtubular reactor and/or onto porous tungsten support structures using atomic layer deposition techniques.

  14. The use of Geographic Information System (GIS) and non-GIS methods to assess the external validity of samples postcollection.

    Science.gov (United States)

    Richardson, Esther; Good, Margaret; McGrath, Guy; More, Simon J

    2009-09-01

    External validity is fundamental to veterinary diagnostic investigation, reflecting the accuracy with which sample results can be extrapolated to a broader population of interest. Probability sampling methods are routinely used during the collection of samples from populations, specifically to maximize external validity. Nonprobability sampling (e.g., of blood samples collected as part of routine surveillance programs or laboratory submissions) may provide useful data for further post-hoc epidemiological analysis, adding value to the collection and submission of samples. As the sample has already been submitted, the analyst or investigator has no control over the sampling methodology, and external validity may therefore be compromised because routine probability sampling methods may not have been employed. The current study describes several Geographic Information System (GIS) and non-GIS methods, applied post hoc, to assess the external validity of samples collected using both probability and nonprobability sampling methods. These methods could equally be employed for inspecting other datasets. Mapping was conducted using ArcView 9.1. Based on this post-hoc assessment, results from the random field sample could provide an externally valid, albeit relatively imprecise, estimate of national disease prevalence, of disease prevalence in 3 of the 4 provinces (all but Ulster, in the north and northwest, where the sample size was small), and in beef and dairy herds. This study provides practical methods for examining the external validity of samples post collection.

  15. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms each containing eight items from......, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity...... questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC) and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses...

  16. Validation of an open-formula, diagnostic real-time PCR method for 20-hr detection of Salmonella in animal feeds

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Hoorfar, Jeffrey

    2012-01-01

    A comparative study of a 20-hr, non-commercial, open-formula PCR method and the standard culture-based method NMKL 187, for detection of Salmonella, was performed according to the validation protocol from the Nordic organization for validation of alternative microbiological methods (NordVal) on 81...

  17. Method validation and stability study of quercetin in topical emulsions

    Directory of Open Access Journals (Sweden)

    Rúbia Casagrande

    2009-01-01

    Full Text Available This study validated a high-performance liquid chromatography (HPLC) method for the quantitative evaluation of quercetin in topical emulsions. The method was linear within the 0.05-200 μg/mL range, with a correlation coefficient of 0.9997 and without interference in the quercetin peak. The detection and quantitation limits were 18 and 29 ng/mL, respectively. The intra- and inter-assay precisions presented RSD values lower than 2%. An average of 93% and 94% of quercetin was recovered for the non-ionic and anionic emulsions, respectively. The raw material and the anionic emulsion, but not the non-ionic emulsion, were stable in all storage conditions for one year. The method reported is a fast and reliable HPLC technique useful for quercetin determination in topical emulsions.
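Recovery and precision (%RSD), the workhorse validation statistics quoted throughout these abstracts, are simple to compute from spiked replicate determinations. A minimal sketch with made-up replicate data, not the study's numbers:

```python
from statistics import mean, stdev

def recovery_percent(found, added):
    """Percent of the spiked amount recovered by the assay."""
    return 100.0 * found / added

def rsd_percent(values):
    """Relative standard deviation (sample SD / mean, as percent)."""
    return 100.0 * stdev(values) / mean(values)

spiked = 50.0                                # ug/mL added (hypothetical)
found = [46.8, 47.1, 46.5, 47.0, 46.7]       # replicate results (hypothetical)

recoveries = [recovery_percent(f, spiked) for f in found]
print(round(mean(recoveries), 1), round(rsd_percent(found), 2))
```

Acceptance criteria (e.g. RSD below 2%) are then checked against these statistics at each concentration level.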

  18. A Validated RP-HPLC Method for the Determination of Atazanavir in Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    K. Srinivasu

    2011-01-01

    Full Text Available A validated RP-HPLC method is presented for the estimation of atazanavir in capsule dosage form on a YMC ODS 150 × 4.6 mm, 5 μm column, using a mobile phase of ammonium dihydrogen phosphate buffer (pH 2.5) with acetonitrile (55:45, v/v). The flow rate was maintained at 1.5 mL/min with UV detection at 288 nm. The retention time obtained for atazanavir was 4.7 min. The detector response was linear in the concentration range of 30-600 μg/mL. This method has been validated and shown to be specific, sensitive, precise, linear, accurate, rugged, robust and fast. Hence, this method can be applied for routine quality control of atazanavir in capsule dosage forms as well as in bulk drug.
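Linearity, LOD and LOQ claims like those above are commonly derived from an ordinary least-squares calibration line, with ICH-style estimates LOD = 3.3·σ/slope and LOQ = 10·σ/slope, where σ is the residual standard deviation of the line. A sketch with invented calibration data (not this method's values):

```python
from statistics import mean

def fit_line(x, y):
    """Ordinary least squares y = a*x + b; returns (slope, intercept, sigma_resid)."""
    n = len(x)
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    resid = [yi - (a * xi + b) for xi, yi in zip(x, y)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return a, b, sigma

conc = [30, 100, 200, 300, 450, 600]          # ug/mL (hypothetical standards)
area = [721, 2405, 4790, 7210, 10815, 14390]  # detector response (hypothetical)

slope, intercept, sigma = fit_line(conc, area)
lod = 3.3 * sigma / slope   # ICH Q2 signal-based estimate
loq = 10 * sigma / slope
```

The correlation coefficient alone does not prove linearity; residual inspection at each level is the stronger check.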

  19. The establishment of tocopherol reference intervals for Hungarian adult population using a validated HPLC method.

    Science.gov (United States)

    Veres, Gábor; Szpisjak, László; Bajtai, Attila; Siska, Andrea; Klivényi, Péter; Ilisz, István; Földesi, Imre; Vécsei, László; Zádori, Dénes

    2017-09-01

    Evidence suggests that a decreased α-tocopherol (the most biologically active substance in the vitamin E group) level can cause neurological symptoms, most notably ataxia. The aim of the current study was to provide the first reference intervals for serum tocopherols in the adult Hungarian population with an appropriate sample size, recruiting healthy control subjects and neurological patients suffering from conditions without symptoms of ataxia, myopathy or cognitive deficiency. A validated HPLC method applying a diode array detector and rac-tocol as internal standard was utilized for that purpose. Furthermore, serum cholesterol levels were determined as well for data normalization. The calculated 2.5-97.5% reference intervals for α-, β/γ- and δ-tocopherols were 24.62-54.67, 0.81-3.69 and 0.29-1.07 μM, respectively, whereas the corresponding tocopherol/cholesterol ratios were 5.11-11.27, 0.14-0.72 and 0.06-0.22 μmol/mmol. The establishment of these reference intervals may improve the diagnostic accuracy of tocopherol measurements in certain neurological conditions with decreased tocopherol levels. Moreover, the current study draws special attention to possible pitfalls in the complex process of determining reference intervals, including the selection of the study population, the application of an internal standard, method validation, and the calculation of tocopherol/cholesterol ratios. Copyright © 2017 John Wiley & Sons, Ltd.
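The 2.5-97.5% reference intervals above are typically obtained nonparametrically from the sorted reference sample. A minimal percentile sketch with linear interpolation between closest ranks; the sample values below are simulated, not study data:

```python
import random

def percentile(sorted_vals, p):
    """p in [0, 100]; linear interpolation between closest ranks."""
    k = (len(sorted_vals) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(sorted_vals) - 1)
    return sorted_vals[f] + (sorted_vals[c] - sorted_vals[f]) * (k - f)

def reference_interval(values, lo=2.5, hi=97.5):
    """Central 95% reference interval from the empirical distribution."""
    s = sorted(values)
    return percentile(s, lo), percentile(s, hi)

random.seed(42)
# simulated alpha-tocopherol-like concentrations (umol/L), hypothetical
sample = [random.gauss(38.0, 7.5) for _ in range(240)]
low, high = reference_interval(sample)
assert low < 38.0 < high
```

Guidelines generally recommend at least 120 reference subjects for nonparametric limits, which is one reason the abstract stresses sample size.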

  20. SU-F-J-86: Method to Include Tissue Dose Response Effect in Deformable Image Registration

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, J; Liang, J; Chen, S; Qin, A; Yan, D [Beaumont Health System, Royal Oak, MI (United States)]

    2016-06-15

    Purpose: An organ changes shape and size during radiation treatment due to both mechanical stress and radiation dose response. However, dose-response-induced deformation has not been considered in conventional deformable image registration (DIR). A novel DIR approach is proposed that includes both tissue elasticity and radiation-dose-induced organ deformation. Methods: Assuming that organ sub-volume shrinkage is proportional to radiation-dose-induced cell killing/absorption, the dose-induced organ volume change was simulated by applying a virtual temperature to each sub-volume. Hence, both stress and temperature heterogeneity induce organ deformation. A thermal-stress finite element method with an organ surface boundary condition was used to solve the deformation. The initial boundary correspondence on the organ surface was created from conventional DIR, and the boundary condition was updated by an iterative optimization scheme to minimize the elastic deformation energy. The registration was validated on a numerical phantom. Treatment dose was constructed applying both the conventional DIR and the proposed method, using daily CBCT images obtained from an HN treatment. Results: The phantom study showed 2.7% maximal discrepancy with respect to the actual displacement. Compared with conventional DIR, the sub-volume displacement differences in a right parotid had mean±SD (min, max) of 1.1±0.9 (−0.4∼4.8), −0.1±0.9 (−2.9∼2.4) and −0.1±0.9 (−3.4∼1.9) mm in the RL/PA/SI directions, respectively. Mean parotid dose and V30 constructed including the dose-response-induced shrinkage were 6.3% and 12.0% higher than those from the conventional DIR. Conclusion: A heterogeneous dose distribution in a normal organ causes non-uniform sub-volume shrinkage. A sub-volume in a high-dose region shrinks more than one in a low-dose region, causing more sub-volumes to move into the high-dose area during the treatment course. This leads to an unfavorable dose-volume relationship for the normal organ.

  1. A Sensitive Validated Spectrophotometric Method for the Determination of Flucloxacillin Sodium

    Directory of Open Access Journals (Sweden)

    R. Singh Gujral

    2009-01-01

    Full Text Available A simple and sensitive spectrophotometric method has been proposed for the determination of flucloxacillin sodium. The determination is based on a charge-transfer complexation reaction of the drug with iodine in methanol-dichloromethane medium. The absorbance was measured at 362 nm against the reagent blank. Under optimized experimental conditions, Beer's law is obeyed in the concentration range 1-9 μg/mL for flucloxacillin. The method was validated for specificity, linearity, precision and accuracy. The degree of linearity of the calibration curves, the percent recoveries, and the limits of detection and quantitation for the spectrophotometric method were determined. No interferences were observed from the additives commonly present in pharmaceutical formulations. The method was successfully applied to the in vitro determination of the drug in human urine samples with low RSD values. It is a simple, specific, accurate and sensitive spectrophotometric method.

  2. MR-based Water Content Estimation in Cartilage: Design and Validation of a Method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sofie; Ringgaard, Steffen

    2012-01-01

    Objective Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Material and Methods We modified and adapted to cartilage tissue the T1-map-based water content MR sequences commonly used in the neurology field. Using a 37 Celsius degree stable...... was customized and programmed. Finally, we validated the method after measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained data were analyzed and the water content calculated. Then, the same samples were freeze-dried (this technique allows removal of all the water that a tissue...... contains) and we measured the water they contained. Results We could reproduce the 37 Celsius degree system twice and could perform the measurements in a similar way. We found that the MR T1-map-based water content sequences can provide information that, after being analyzed with special software, can......

  3. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    Science.gov (United States)

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for the determination of methylmercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with ISO/IEC 17025 (ISO/IEC, 2005) and the Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. An estimation of the uncertainty contribution of each parameter and a demonstration of the traceability of measurement results were provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was achieved through participation in the IAEA-461 worldwide inter-laboratory comparison exercise. Copyright © 2014 Elsevier Ltd. All rights reserved.
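The "expanded uncertainty (15%, k=2)" quoted above follows the standard GUM recipe: relative standard uncertainties from the budget are combined in quadrature, then multiplied by a coverage factor k (k=2 for roughly 95% coverage). The component values in this sketch are invented for illustration, not the paper's budget:

```python
import math

def combined_relative_u(components):
    """Combine relative standard uncertainties in quadrature.
    components: iterable of relative uncertainties (e.g. 0.03 for 3%)."""
    return math.sqrt(sum(u * u for u in components))

# hypothetical budget: calibration, recovery, precision, moisture correction
budget = [0.04, 0.035, 0.03, 0.02]
u_c = combined_relative_u(budget)
U = 2 * u_c  # expanded uncertainty at k=2 (~95% coverage)
print(round(100 * U, 1), "%")
```

The quadrature sum assumes the components are independent; correlated contributions need covariance terms.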

  4. Software validation applied to spreadsheets used in laboratories working under ISO/IEC 17025

    Science.gov (United States)

    Banegas, J. M.; Orué, M. W.

    2016-07-01

    Several documents deal with software validation. Nevertheless, most are too complex to be applied to the validation of spreadsheets - surely the most widely used software in laboratories working under ISO/IEC 17025. The method proposed in this work is intended to be applied directly to the validation of spreadsheets. It includes a systematic way to document requirements, operational aspects of validation, and a simple method for keeping records of validation results and modification history. This method is currently being used in an accredited calibration laboratory, where it has proved practical and efficient.

  5. Softcopy quality ruler method: implementation and validation

    Science.gov (United States)

    Jin, Elaine W.; Keelan, Brian W.; Chen, Junqing; Phillips, Jonathan B.; Chen, Ying

    2009-01-01

    A softcopy quality ruler method was implemented for the International Imaging Industry Association (I3A) Camera Phone Image Quality (CPIQ) Initiative. This work extends ISO 20462 Part 3 by virtue of creating reference digital images of known subjective image quality, complementing the hardcopy Standard Reference Stimuli (SRS). The softcopy ruler method was developed using images from a Canon EOS 1Ds Mark II D-SLR digital still camera (DSC) and a Kodak P880 point-and-shoot DSC. Images were viewed on an Apple 30-inch Cinema Display at a viewing distance of 34 inches. Ruler images were made for 16 scenes. Thirty ruler images were generated for each scene, representing ISO 20462 Standard Quality Scale (SQS) values of approximately 2 to 31 at an increment of one just noticeable difference (JND), by adjusting the system modulation transfer function (MTF). A Matlab GUI was developed to display the ruler and test images side by side with a user-adjustable ruler level controlled by a slider. A validation study was performed at Kodak, Vista Point Technology, and Aptina Imaging, in which all three companies set up a similar viewing lab to run the softcopy ruler method. The results show that the three sets of data are in reasonable agreement with each other, with the differences within the range expected from observer variability. Compared to previous implementations of the quality ruler, the slider-based user interface allows approximately 2x faster assessments with 21.6% better precision.

  6. A gradient-based method for segmenting FDG-PET images: methodology and validation

    International Nuclear Information System (INIS)

    Geets, Xavier; Lee, John A.; Gregoire, Vincent; Bol, Anne; Lonneux, Max

    2007-01-01

    A new gradient-based method for segmenting FDG-PET images is described and validated. The proposed method relies on the watershed transform and hierarchical cluster analysis. To allow a better estimation of the gradient intensity, iteratively reconstructed images were first denoised and deblurred with an edge-preserving filter and a constrained iterative deconvolution algorithm. Validation was first performed on computer-generated 3D phantoms containing spheres, then on a real cylindrical Lucite phantom containing spheres of different volumes ranging from 2.1 to 92.9 ml. Moreover, laryngeal tumours from seven patients were segmented on PET images acquired before laryngectomy by the gradient-based method and by the thresholding method based on the source-to-background ratio developed by Daisne (Radiother Oncol 2003;69:247-50). For the spheres, the calculated volumes and radii were compared with the known values; for laryngeal tumours, the volumes were compared with the macroscopic specimens. Volume mismatches were also analysed. On computer-generated phantoms, the deconvolution algorithm reduced the misestimation of volumes and radii. For the Lucite phantom, the gradient-based method led to a slight underestimation of sphere volumes (by 10-20%), corresponding to negligible radius differences (0.5-1.1 mm); for laryngeal tumours, the volumes segmented by the gradient-based method agreed with those delineated on the macroscopic specimens, whereas the threshold-based method overestimated the true volume by 68% (p = 0.014). Lastly, the macroscopic laryngeal specimens were not totally encompassed by either the threshold-based or the gradient-based volumes. The gradient-based segmentation method applied on denoised and deblurred images proved to be more accurate than the source-to-background ratio method. (orig.)

  7. Validity of a structured method of selecting abstracts for a plastic surgical scientific meeting

    NARCIS (Netherlands)

    van der Steen, LPE; Hage, JJ; Kon, M; Monstrey, SJ

    In 1999, the European Association of Plastic Surgeons accepted a structured method to assess and select the abstracts that are submitted for its yearly scientific meeting. The two criteria used to evaluate whether such a selection method is accurate were reliability and validity. The authors

  8. Development and validation of a stability-indicating RP–HPLC method for estimation of atazanavir sulfate in bulk

    Directory of Open Access Journals (Sweden)

    S. Dey

    2017-04-01

    Full Text Available A stability-indicating reverse-phase high-performance liquid chromatography (RP-HPLC) method was developed and validated for the determination of atazanavir sulfate in tablet dosage forms using a Phenomenex C18 column (250 mm × 4.6 mm, 5 μm) with a mobile phase consisting of 900 mL of HPLC-grade methanol and 100 mL of HPLC-grade water. The pH was adjusted to 3.55 with acetic acid, and the mobile phase was sonicated for 10 min and filtered through a 0.45 μm membrane filter. The flow rate was 0.5 mL/min, detection was carried out at 249 nm, and the retention time of atazanavir sulfate was found to be 8.323 min. Linearity was observed from 10 to 90 μg/mL (coefficient of determination R2 = 0.999) with the equation y = 23.427x + 37.732. Atazanavir sulfate was subjected to stress conditions including acidic, alkaline, oxidative, photolytic and thermal degradation, and the results showed that it was most sensitive to acidic degradation. The method was validated as per ICH guidelines.

  9. Validated, Ultra Violet Spectroscopy method for the Dissolution study of Mycophenolate mofetil immediate release 500mg tablets

    OpenAIRE

    Surajpal P. Verma; Ozair Alam; Pooja Mullick; Nadeem Siddiqui; Suroor A. Khan

    2008-01-01

    A simple, selective and precise dissolution method was developed and validated for Mycophenolate mofetil immediate-release tablets. The method employed 0.1 N HCl (pH 1.2) as the dissolution medium, with a volume of 900 mL, using USP apparatus II (paddle). Detection was made by measuring the UV absorbance at the λmax of 250 nm. The method shows linearity in the concentration range of 5 μg/mL to 40 μg/mL with r2 = 0.999. The method was also validated as per the International Conference on Harmonizatio...

  10. Application of Bayesian Method in Validation of TTM Decisional Balance and Self-Efficacy Constructs to Improve Nutritional Behavior in Yazdian Prediabetes

    Directory of Open Access Journals (Sweden)

    Hossein Fallahzadeh

    2017-07-01

    Full Text Available Introduction: To introduce the Bayesian method for the validation of the transtheoretical model's (TTM's) Self-Efficacy and Decisional Balance constructs for nutritional behavior improvement among prediabetics, using ordinal data. Methods: This is an experimental trial with a parallel design; the sample included 220 prediabetics who participated in a screening program, were over 30 years old, had fasting blood glucose in the range 100-125, and had at least an elementary education. We used OpenBUGS 3.2.3 to fit a Bayesian ordinal factor analysis for the validation of the TTM's decisional balance and self-efficacy constructs. Results: All of the factor loadings corresponding to the mentioned constructs were significant at α = 0.05, which supports the validity of the constructs. The correlation between Pros and Cons was not significant (-0.076, 0.007). Furthermore, a specific statistical model for ordinal data was created that can estimate odds ratios and marginal probabilities for each choice of any item in the questionnaire. Conclusion: Thanks to the Bayesian method's ability to use prior information, such as meta-analyses and other resources, our results had good accuracy (with respect to the standard deviation) even with a lower sample size than similar studies that used standard or other factor analyses for ordinal data, so the results can be used in future clinical research.

  11. An Improved Neutral α-Glucosidase Assay for Assessment of Epididymal Function—Validation and Comparison to the WHO Method

    Directory of Open Access Journals (Sweden)

    Frank Eertmans

    2014-01-01

    Full Text Available Neutral α-glucosidase (NAG) activity in human seminal plasma is an important indicator of epididymal function. In the present study, the classic World Health Organization (WHO) method has been adapted to enhance assay robustness. Changes include a modified enzyme reaction buffer composition and the use of an alternative enzyme inhibitor for background correction (glucose instead of castanospermine). Both methods were tested in parallel on 144 semen samples, obtained from 94 patients/donors and 50 vasectomized men (negative controls), respectively. Passing-Bablok regression analysis demonstrated equal assay performance. In terms of assay validation, analytical specificity, detection limit, measuring range, precision, and cut-off values were calculated. These data confirm that the adapted method is a reliable, improved tool for NAG analysis in human semen.
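Passing-Bablok regression, used above to demonstrate method equivalence, is a robust technique: the slope is estimated as a shifted median of all pairwise slopes, making no normality assumption about the measurement errors. A compact sketch without confidence intervals (ties and extreme offsets are only loosely handled here); the assay values are invented, not the study's data:

```python
from statistics import median

def passing_bablok(x, y):
    """Slope = shifted median of pairwise slopes; intercept = median residual."""
    slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dx, dy = x[j] - x[i], y[j] - y[i]
            if dx == 0:
                continue          # undefined slope; ties ignored in this sketch
            s = dy / dx
            if s != -1:           # slopes of exactly -1 are excluded by the method
                slopes.append(s)
    slopes.sort()
    K = sum(1 for s in slopes if s < -1)   # offset correcting for negative slopes
    N = len(slopes)
    if N % 2:
        slope = slopes[(N - 1) // 2 + K]
    else:
        slope = 0.5 * (slopes[N // 2 - 1 + K] + slopes[N // 2 + K])
    intercept = median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# two hypothetical NAG assays run on the same eight samples (invented values)
old = [5.1, 7.9, 10.2, 12.8, 15.0, 18.1, 21.3, 24.9]
new = [5.0, 8.1, 10.0, 13.1, 14.8, 18.4, 21.0, 25.2]
slope, intercept = passing_bablok(old, new)
print(round(slope, 3), round(intercept, 3))
```

Equivalence is concluded when the confidence interval for the slope includes 1 and that for the intercept includes 0; the full procedure adds rank-based confidence bounds omitted here.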

  12. Using digital photography in a clinical setting: a valid, accurate, and applicable method to assess food intake.

    Science.gov (United States)

    Winzer, Eva; Luger, Maria; Schindler, Karin

    2018-06-01

    Regular monitoring of food intake is rarely integrated into clinical routine. Therefore, the aim was to examine the validity, accuracy, and applicability of an appropriate, quick and easy-to-use tool for recording food intake in a clinical setting. Two digital photography methods, the postMeal method (a photograph after the meal) and the pre-postMeal method (photographs before and after the meal), and a visual estimation method (plate diagram; PD) were compared against the reference method (weighed food records; WFR). A total of 420 dishes from lunch (over 7 weeks) were estimated with both photography methods and the visual method. Validity, applicability, accuracy, and precision of the estimation methods, and additionally food waste, macronutrient composition, and energy content, were examined. Tests of validity revealed stronger correlations for the photography methods (postMeal: r = 0.971, p < 0.001; pre-postMeal: r = 0.995, p < 0.001) than for the visual estimation method (r = 0.810; p < 0.001). The pre-postMeal method showed smaller variability (bias < 1 g) and smaller over- and underestimation, and it accurately and precisely estimated portion sizes in all food items. Furthermore, total food waste was 22% for lunch over the study period; the highest food waste was observed in salads and the lowest in desserts. The pre-postMeal digital photography method is valid, accurate, and applicable for monitoring food intake in a clinical setting, enabling quantitative and qualitative dietary assessment. Thus, nutritional care might be initiated earlier. This method might also be advantageous for the quantitative and qualitative evaluation of food waste, with a resulting reduction in costs.
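The validity coefficients reported above (e.g. r = 0.995 against weighed food records) are Pearson correlations between estimated and reference intakes. A minimal sketch with invented portion weights in grams, not the study's measurements:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

weighed = [120, 85, 200, 150, 60, 175]   # reference (WFR), hypothetical grams
photo   = [115, 90, 195, 155, 58, 180]   # photography estimate, hypothetical
r = pearson_r(weighed, photo)
print(round(r, 3))
```

Correlation alone does not capture systematic bias, which is why the study also reports difference-based measures such as over- and underestimation.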

  13. Validation of the Nuclear Design Method for MOX Fuel Loaded LWR Cores

    International Nuclear Information System (INIS)

    Saji, E.; Inoue, Y.; Mori, M.; Ushio, T.

    2001-01-01

    The actual batch loading of mixed-oxide (MOX) fuel in light water reactors (LWRs) is now ready to start in Japan. One of the efforts devoted to realizing this batch loading has been validation of the nuclear design methods that calculate the characteristics of MOX-fuel-loaded LWR cores. This paper summarizes the validation work on the applicability of the CASMO-4/SIMULATE-3 in-core fuel management code system to MOX-fuel-loaded LWR cores. This code system is widely used by a number of electric power companies for the core management of their commercial LWRs. The validation work was performed for both boiling water reactor (BWR) and pressurized water reactor (PWR) applications. Each validation consists of two parts: analyses of critical experiments and core tracking calculations of operating plants. For the critical experiments, we chose a series of experiments known as the VENUS International Program (VIP), performed at the SCK/CEN MOL laboratory in Belgium; VIP includes both BWR and PWR fuel assembly configurations. As for the core tracking calculations, the operating data of MOX-fuel-loaded BWR and PWR cores in Europe have been utilized.

  14. A sensitive multi-residue method for the determination of 35 micropollutants including pharmaceuticals, iodinated contrast media and pesticides in water.

    Science.gov (United States)

    Valls-Cantenys, Carme; Scheurer, Marco; Iglesias, Mònica; Sacher, Frank; Brauch, Heinz-Jürgen; Salvadó, Victoria

    2016-09-01

    A sensitive, multi-residue method using solid-phase extraction followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed to determine a representative group of 35 analytes, including corrosion inhibitors, pesticides and pharmaceuticals such as analgesic and anti-inflammatory drugs, five iodinated contrast media, β-blockers and some of their metabolites and transformation products in water samples. Few other methods are capable of determining such a broad range of contrast media together with other analytes. We studied the parameters affecting the extraction of the target analytes, including sorbent selection and extraction conditions, their chromatographic separation (mobile phase composition and column) and detection conditions using two ionisation sources: electrospray ionisation (ESI) and atmospheric pressure chemical ionisation (APCI). In order to correct matrix effects, a total of 20 surrogate/internal standards were used. ESI was found to have better sensitivity than APCI. Recoveries ranging from 79 to 134 % for tap water and 66 to 144 % for surface water were obtained. Intra-day precision, calculated as relative standard deviation, was below 34 % for tap water and below 21 % for surface water, groundwater and effluent wastewater. Method quantification limits (MQL) were in the low ng L⁻¹ range, except for the contrast agents iomeprol, amidotrizoic acid and iohexol (22, 25.5 and 17.9 ng L⁻¹, respectively). Finally, the method was applied to the analysis of 56 real water samples as part of the validation procedure. All of the compounds were detected in at least some of the water samples analysed. Graphical Abstract Multi-residue method for the determination of micropollutants including pharmaceuticals, iodinated contrast media and pesticides in waters by LC-MS/MS.
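
    Recovery and intra-day precision (RSD) figures of the kind reported above are simple to compute from replicate measurements; the sketch below uses invented replicate concentrations and a hypothetical spike level:

```python
# Sketch of spike-recovery (%) and relative standard deviation (%) for a
# set of replicate determinations. All numbers are hypothetical.
import statistics

def recovery_percent(measured, spiked):
    """Mean recovery (%) of replicate measurements of a spiked sample."""
    return 100.0 * statistics.fmean(measured) / spiked

def rsd_percent(values):
    """Relative standard deviation (%) = 100 * sample sd / mean."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

# Invented replicate concentrations (ng/L) for a sample spiked at 100 ng/L:
replicates = [92.0, 97.5, 88.0, 95.0, 90.5]
print(f"recovery = {recovery_percent(replicates, 100.0):.1f} %")
print(f"RSD      = {rsd_percent(replicates):.1f} %")
```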

  15. An optimized method for fatty acid analysis, including quantification of trans fatty acids, in human adipose tissue by gas-liquid chromatography

    DEFF Research Database (Denmark)

    Bysted, Anette; Cold, S; Hølmer, Gunhild Kofoed

    1999-01-01

    Considering the need for a quick direct method for measurement of the fatty acid composition, including trans isomers, of human adipose tissue, we have developed a procedure using gas-liquid chromatography (GLC) alone, which is thus suitable for validation of fatty acid status in epidemiological studies...... for 25 min, and finally raised at 25 degrees C/min to 225 degrees C. The trans and cis isomers of 18:1 were well separated from each other, as shown by silver-ion thin-layer chromatography. Verification by standards showed that the trans 18:1 isomers with a double bond in position 12 or lower were

  16. Validation of cleaning method for various parts fabricated at a Beryllium facility

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Cynthia M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  17. Validation of internal dosimetry protocols based on stochastic method

    International Nuclear Information System (INIS)

    Mendes, Bruno M.; Fonseca, Telma C.F.; Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R.

    2015-01-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results, and intercomparison of data from the literature with those produced by our IDPs is a suitable method for such validation. The aim of this study was to validate the IDPs following this intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5, and the specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the simulated SAF and the RV was 2.3 %. The largest differences were found for low-energy photons at 30 keV: the adrenals and thyroid, i.e. the lowest-mass organs, showed the highest discrepancies from the RV, at 7.2 % and 3.8 %, respectively. The statistical differences between the SAF obtained with our IDPs and the reference values were considered acceptable at 30, 100 and 1000 keV. We believe that the main reason for the discrepancies found in the lower-mass organs was our source definition methodology; improving the spatial distribution of the source in the voxels may yield outputs more consistent with the reference values for these organs. (author)
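
    The validation criterion described above, the percent difference between simulated SAF and published reference values, can be sketched as follows; the per-organ SAF numbers are invented placeholders, not the study's data:

```python
# Sketch of an SAF intercomparison: percent difference of simulated
# specific absorbed fractions vs. reference values. Values are invented.
def percent_diff(simulated, reference):
    """Unsigned percent difference of a simulated value from a reference."""
    return 100.0 * abs(simulated - reference) / reference

# Hypothetical SAF values (kg^-1) per organ at 30 keV: (simulated, reference)
cases = {
    "liver":    (1.02e-2, 1.00e-2),
    "adrenals": (5.36e-2, 5.00e-2),   # small organ: largest deviation
    "thyroid":  (3.11e-2, 3.00e-2),
}
for organ, (sim, ref) in cases.items():
    print(f"{organ:8s} {percent_diff(sim, ref):4.1f} %")
```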

  18. Validation of internal dosimetry protocols based on stochastic method

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Bruno M.; Fonseca, Telma C.F., E-mail: bmm@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R., E-mail: tprcampos@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2015-07-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results, and intercomparison of data from the literature with those produced by our IDPs is a suitable method for such validation. The aim of this study was to validate the IDPs following this intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5, and the specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the simulated SAF and the RV was 2.3 %. The largest differences were found for low-energy photons at 30 keV: the adrenals and thyroid, i.e. the lowest-mass organs, showed the highest discrepancies from the RV, at 7.2 % and 3.8 %, respectively. The statistical differences between the SAF obtained with our IDPs and the reference values were considered acceptable at 30, 100 and 1000 keV. We believe that the main reason for the discrepancies found in the lower-mass organs was our source definition methodology; improving the spatial distribution of the source in the voxels may yield outputs more consistent with the reference values for these organs. (author)

  19. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented, including error-band-based, multivariate, sum-of-squares-of-residuals, and optimization methods. After completion of the tutorial, a survey of the statistical model validation literature is presented and recommendations for future work are made.
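
    The Monte Carlo propagation-of-uncertainty idea can be sketched for a damped spring-mass system like the tutorial's example; the model form, input distributions, and sample size below are illustrative assumptions, not the report's actual inputs:

```python
# Monte Carlo uncertainty propagation sketch: sample uncertain inputs,
# push each draw through the model, summarize the output distribution.
import math
import random
import statistics

def damped_position(t, m, k, c, x0=1.0):
    """Simplified underdamped response x(t) with initial displacement x0
    (the small initial-velocity correction term is omitted)."""
    wn = math.sqrt(k / m)                  # natural frequency (rad/s)
    zeta = c / (2.0 * math.sqrt(k * m))    # damping ratio (assumed < 1)
    wd = wn * math.sqrt(1.0 - zeta ** 2)   # damped natural frequency
    return x0 * math.exp(-zeta * wn * t) * math.cos(wd * t)

random.seed(1)  # reproducible draws
samples = [
    damped_position(
        t=2.0,
        m=random.gauss(1.0, 0.05),   # uncertain mass (kg)
        k=random.gauss(10.0, 0.5),   # uncertain stiffness (N/m)
        c=random.gauss(0.4, 0.02),   # uncertain damping (N*s/m)
    )
    for _ in range(10_000)
]
print(f"x(2 s): mean = {statistics.fmean(samples):+.3f}, "
      f"std = {statistics.stdev(samples):.3f}")
```

    The sensitivity method mentioned in the abstract would instead linearize the model about the nominal inputs and combine the input variances analytically.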

  20. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of solid residue after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Due to the fuels used, the different composition and the location of creation of solid residue, it was necessary to develop two methods. The first method is used for identifying solid residue composition after fuel oil combustion (Method 1), while the second is used for identifying solid residue composition after the combustion of solid fuels, i.e. coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRM). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation the linearity, accuracy, precision and specificity were determined, and the measurement uncertainty of the methods was assessed for each analyte separately. The methods were applied in the analysis of real furnace residue samples. (Copyright © 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
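
    A calibration-linearity check like the one described, fitting instrument response against certified concentrations for a set of CRMs, might look like the following sketch; the analyte, concentrations, and count values are invented:

```python
# Sketch of a least-squares calibration line and its r^2, as used to
# assess linearity against certified reference materials. Data invented.
def linear_fit(x, y):
    """Ordinary least squares y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx
    a = my - b * mx
    r2 = sxy * sxy / (sxx * syy)
    return a, b, r2

# Hypothetical certified Fe content (wt%) vs. EDXRF counts:
conc   = [0.5, 1.0, 2.0, 4.0, 8.0, 12.0]
counts = [510, 995, 2010, 3985, 8040, 11950]
a, b, r2 = linear_fit(conc, counts)
print(f"slope = {b:.1f} counts per wt%, r^2 = {r2:.4f}")
```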

  1. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
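
    The basic observed statistic described above, the proportion of replicates identified (POI), can be sketched together with a simple Wilson 95% confidence interval; this is a generic illustration with invented counts, not the report's exact procedure:

```python
# Sketch of the POI statistic for a binary identification method, with a
# Wilson score interval on the observed proportion. Counts are invented.
import math

def poi_with_wilson_ci(identified, n, z=1.96):
    """Return (poi, lo, hi): observed probability of identification and a
    Wilson score confidence interval at the given z (default ~95%)."""
    p = identified / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return p, centre - half, centre + half

# Example: 11 of 12 replicates of the target material identified.
poi, lo, hi = poi_with_wilson_ci(11, 12)
print(f"POI = {poi:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

    Plotting POI against the proportion of nontarget material in the test sample gives the response curves the report describes.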

  2. Modelling of the activity system - development of an evaluation method for integrated system validation

    International Nuclear Information System (INIS)

    Norros, Leena; Savioja, Paula

    2004-01-01

    In this paper we present our recent research, which focuses on creating an evaluation method for the human-system interfaces of complex systems. The method is intended for use in the validation of modernised nuclear power plant (NPP) control rooms and other complex systems with high reliability requirements. The task in validation is to determine whether the human-system combination functions safely and effectively. This question can be operationalized as the selection of relevant operational features and their appropriate acceptance criteria, so there is a need to ensure that the results of the evaluation can be generalized to serve the purpose of integrated system validation. The definition of appropriate acceptance criteria provides a basis for judging the appropriateness of the system's performance. We propose that the operational situations and the acceptance criteria should be defined based on modelling of NPP operation, comprehended as an activity system. We have developed a new core-task modelling framework: a formative modelling approach that combines causal, functional and understanding-based explanations of system performance. In this paper we reason how modelling can be used as a medium to determine the validity of the emerging control room system. (Author)

  3. Quantification of imatinib in human serum: validation of a high-performance liquid chromatography-mass spectrometry method for therapeutic drug monitoring and pharmacokinetic assays

    Directory of Open Access Journals (Sweden)

    Rezende VM

    2013-08-01

    Full Text Available Vinicius Marcondes Rezende,1 Ariane Rivellis,1 Mafalda Megumi Yoshinaga Novaes,1 Dalton de Alencar Fisher Chamone,2 Israel Bendit1,2; 1Laboratory of Tumor Biology, 2Department of Hematology, School of Medicine, University of São Paulo, São Paulo, Brazil. Background: Imatinib mesylate has been a breakthrough treatment for chronic myeloid leukemia. It has become the ideal tyrosine kinase inhibitor and the standard treatment for chronic-phase leukemia. Striking results have recently been reported, but intolerance to imatinib and noncompliance with treatment remain to be solved. Molecular monitoring by quantitative real-time polymerase chain reaction is the gold standard for monitoring patients, and imatinib blood levels have also become an important tool for monitoring. Methods: A fast and cheap method was developed and validated using high-performance liquid chromatography-mass spectrometry for quantification of imatinib in human serum, with tamsulosin as the internal standard. Remarkable advantages of the method include the use of serum instead of plasma, less time spent on processing and analysis, simpler procedures, and reduced amounts of biological material, solvents, and reagents. Stability of the analyte was also studied. This research also intended to drive the validation scheme in clinical centers. The method was validated according to the requirements of the US Food and Drug Administration and the Brazilian National Health Surveillance Agency within the range of 0.500–10.0 µg/mL, with a limit of detection of 0.155 µg/mL. Stability data for the analyte are also presented. Conclusion: Given that the validated method has proved to be linear, accurate, precise, and robust, it is suitable for pharmacokinetic assays, such as bioavailability and bioequivalence, and is being successfully applied in routine therapeutic drug monitoring in the hospital service. Keywords: imatinib, high-performance liquid chromatography-mass spectrometry, therapeutic

  4. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    Directory of Open Access Journals (Sweden)

    Alistair Currie

    2011-11-01

    Full Text Available In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  5. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    Science.gov (United States)

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  6. Global Land Product Validation Protocols: An Initiative of the CEOS Working Group on Calibration and Validation to Evaluate Satellite-derived Essential Climate Variables

    Science.gov (United States)

    Guillevic, P. C.; Nickeson, J. E.; Roman, M. O.; camacho De Coca, F.; Wang, Z.; Schaepman-Strub, G.

    2016-12-01

    The Global Climate Observing System (GCOS) has specified the need to systematically produce and validate Essential Climate Variables (ECVs). The Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV), and in particular its subgroup on Land Product Validation (LPV), plays a key coordination role, leveraging the international expertise required to address actions related to the validation of global land ECVs. The primary objective of the LPV subgroup is to set standards for validation methods and reporting in order to provide traceable and reliable uncertainty estimates for scientists and stakeholders. The subgroup comprises 9 focus areas that encompass 10 land surface variables. The activities of each focus area are coordinated by two international co-leads and currently include leaf area index (LAI) and fraction of absorbed photosynthetically active radiation (FAPAR), vegetation phenology, surface albedo, fire disturbance, snow cover, land cover and land use change, soil moisture, land surface temperature (LST) and emissivity. Recent additions to the focus areas include vegetation indices and biomass. The development of best practice validation protocols is a core activity of CEOS LPV, with the objective of standardizing the evaluation of land surface products. LPV has identified four validation levels corresponding to increasing spatial and temporal representativeness of the reference samples used to perform validation. Best practice validation protocols (1) provide the definition of variables, ancillary information and uncertainty metrics, (2) describe available data sources and methods to establish reference validation datasets with SI traceability, and (3) describe evaluation methods and reporting. An overview of validation best practice components will be presented based on the LAI and LST protocol efforts to date.

  7. A frequency domain linearized Navier-Stokes method including acoustic damping by eddy viscosity using RANS

    Science.gov (United States)

    Holmberg, Andreas; Kierkegaard, Axel; Weng, Chenyang

    2015-06-01

    In this paper, a method for including damping of acoustic energy in regions of strong turbulence is derived for a linearized Navier-Stokes method in the frequency domain. The proposed method is validated and analyzed in 2D only, although the formulation is fully presented in 3D. The result is applied in a study of the linear interaction between the acoustic and the hydrodynamic field in a 2D T-junction, subject to grazing flow at Mach 0.1. Part of the acoustic energy at the upstream edge of the junction is shed as harmonically oscillating disturbances, which are conveyed across the shear layer over the junction, where they interact with the acoustic field. As the acoustic waves travel in regions of strong shear, there is a need to include the interaction between the background turbulence and the acoustic field. For this purpose, the oscillation of the background turbulence Reynolds stress, due to the acoustic field, is modeled using an eddy Newtonian model assumption. The time-averaged flow is first solved for using RANS along with a k-ε turbulence model. The spatially varying turbulent eddy viscosity is then added to the spatially invariant kinematic viscosity in the acoustic set of equations. The response of the 2D T-junction to an incident acoustic field is analyzed via a plane wave scattering matrix model, and the result is compared to experimental data for a T-junction of rectangular ducts. A strong improvement in the agreement between calculation and experimental data is found when the modification proposed in this paper is implemented. The remaining discrepancies are likely due to inaccuracies in the selected turbulence model, which is known to produce large errors for, e.g., flows with significant rotation, such as the grazing flow across the T-junction. A natural next step is therefore to test the proposed methodology together with more sophisticated turbulence models.
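
    The modification described, adding the spatially varying RANS eddy viscosity to the constant kinematic viscosity in the acoustic equations, can be sketched numerically: with a standard k-ε solution, the eddy viscosity per cell is ν_t = C_μ k²/ε. The k and ε values below are invented:

```python
# Sketch of the effective viscosity used in the acoustic equations:
# nu_eff = nu + nu_t, with nu_t from the standard k-epsilon relation
# nu_t = C_mu * k^2 / eps. Field values below are hypothetical.
C_MU = 0.09        # standard k-epsilon model constant
NU_AIR = 1.5e-5    # kinematic viscosity of air (m^2/s)

def effective_viscosity(k, eps, nu=NU_AIR):
    """Per-cell effective viscosity; k in m^2/s^2, eps in m^2/s^3."""
    nu_t = C_MU * k * k / eps
    return nu + nu_t

# In a strongly turbulent shear layer (invented values), nu_t dominates:
print(f"{effective_viscosity(k=0.5, eps=20.0):.2e} m^2/s")
```

    Away from the shear layer, k tends to zero and the effective viscosity falls back to the molecular value, so the added damping acts only where the turbulence is strong.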

  8. Quantification of imatinib in human serum: validation of a high-performance liquid chromatography-mass spectrometry method for therapeutic drug monitoring and pharmacokinetic assays.

    Science.gov (United States)

    Rezende, Vinicius Marcondes; Rivellis, Ariane; Novaes, Mafalda Megumi Yoshinaga; de Alencar Fisher Chamone, Dalton; Bendit, Israel

    2013-01-01

    Imatinib mesylate has been a breakthrough treatment for chronic myeloid leukemia. It has become the ideal tyrosine kinase inhibitor and the standard treatment for chronic-phase leukemia. Striking results have recently been reported, but intolerance to imatinib and noncompliance with treatment remain to be solved. Molecular monitoring by quantitative real-time polymerase chain reaction is the gold standard for monitoring patients, and imatinib blood levels have also become an important tool for monitoring. A fast and cheap method was developed and validated using high-performance liquid chromatography-mass spectrometry for quantification of imatinib in human serum, with tamsulosin as the internal standard. Remarkable advantages of the method include the use of serum instead of plasma, less time spent on processing and analysis, simpler procedures, and reduced amounts of biological material, solvents, and reagents. Stability of the analyte was also studied. This research also intended to drive the validation scheme in clinical centers. The method was validated according to the requirements of the US Food and Drug Administration and the Brazilian National Health Surveillance Agency within the range of 0.500-10.0 μg/mL, with a limit of detection of 0.155 μg/mL. Stability data for the analyte are also presented. Given that the validated method has proved to be linear, accurate, precise, and robust, it is suitable for pharmacokinetic assays, such as bioavailability and bioequivalence, and is being successfully applied in routine therapeutic drug monitoring in the hospital service.

  9. Methods and practices for verification and validation of programmable systems

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    Programmable systems differ in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications require new methods and practices. The safety assessment cannot be based on conventional probabilistic methods because of the difficulty of quantifying the reliability of the software and hardware; the reliability estimate of the system must instead be based on qualitative arguments linked to a conservative claim limit. Given the uncertainty of the quantitative reliability estimate, other means must be used to gain additional assurance about system safety. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as methods applicable to the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described, and the application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  10. Parallel Resolved Open Source CFD-DEM: Method, Validation and Application

    Directory of Open Access Journals (Sweden)

    A. Hager

    2014-03-01

    Full Text Available In the following paper the authors present a fully parallelized Open Source method for calculating the interaction of immersed bodies and the surrounding fluid. A combination of computational fluid dynamics (CFD) and a discrete element method (DEM) accounts for the physics of both the fluid and the particles. The objects considered are relatively big compared to the cells of the fluid mesh, i.e. they each cover several cells; this fictitious domain method (FDM) is therefore called resolved. The implementation is realized within the Open Source framework CFDEMcoupling (www.cfdem.com), which provides an interface between OpenFOAM®-based CFD solvers and the DEM software LIGGGHTS (www.liggghts.com). While both LIGGGHTS and OpenFOAM® were already parallelized, only a recent improvement of the algorithm permits the fully parallel computation of resolved problems. Alongside a detailed description of the method, its implementation and recent improvements, a number of application and validation examples are presented in the scope of this paper.

  11. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mA) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mA) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.
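
    The single-ascending-series idea can be sketched as a simple threshold search over increasing stimulus intensities; the stimulus/response pairs and the response criterion below are invented placeholders, not the studies' actual scoring rule:

```python
# Conceptual sketch of a "single ascending series" threshold estimate:
# step the stimulus upward and take the first intensity whose reflex
# response meets a criterion. All numbers here are hypothetical.
def peak1_threshold(series, criterion):
    """series: list of (stimulus_mA, response) pairs in ascending order.
    Returns the first stimulus intensity whose response meets the
    criterion, or None if no stimulation elicited a reflex."""
    for stimulus_ma, response in series:
        if response >= criterion:
            return stimulus_ma
    return None

# Hypothetical ascending series in 2-mA steps (stimulus, response):
ascending = [(2, 0.5), (4, 0.8), (6, 1.1), (8, 4.7), (10, 6.0)]
print(peak1_threshold(ascending, criterion=4.0))
```

    With coarser steps (e.g. 4 mA), the first suprathreshold intensity can overshoot the true threshold, which matches the abstract's caution about comparing thresholds across different increment sizes.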

  12. Methods and procedures for the verification and validation of artificial neural networks

    CERN Document Server

    Taylor, Brian J

    2006-01-01

    Neural networks are members of a class of software that have the potential to enable intelligent computational systems capable of simulating characteristics of biological thinking and learning. This volume introduces some of the methods and techniques used for the verification and validation of neural networks and adaptive systems.

  13. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    Science.gov (United States)

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  14. Validity of the remote food photography method against doubly labeled water among minority preschoolers

    Science.gov (United States)

    The aim of this study was to determine the validity of energy intake (EI) estimations made using the remote food photography method (RFPM) compared to the doubly labeled water (DLW) method in minority preschool children in a free-living environment. Seven days of food intake and spot urine samples...

  15. Validation of the Abdominal Pain Index using a revised scoring method.

    Science.gov (United States)

    Laird, Kelsey T; Sherman, Amanda L; Smith, Craig A; Walker, Lynn S

    2015-06-01

    Evaluate the psychometric properties of child- and parent-report versions of the four-item Abdominal Pain Index (API) in children with functional abdominal pain (FAP) and healthy controls, using a revised scoring method that facilitates comparisons of scores across samples and time. Pediatric patients aged 8-18 years with FAP and controls completed the API at baseline (N = 1,967); a subset of their parents (N = 290) completed the API regarding the child's pain. Subsets of patients completed follow-up assessments at 2 weeks (N = 231), 3 months (N = 330), and 6 months (N = 107). Subsets of both patients (N = 389) and healthy controls (N = 172) completed a long-term follow-up assessment (mean age at follow-up = 20.21 years, SD = 3.75). The API demonstrated good concurrent, discriminant, and construct validity, as well as good internal consistency. We conclude that the API, using the revised scoring method, is a useful, reliable, and valid measure of abdominal pain severity. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
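
    Internal consistency of a short scale such as the four-item API is typically summarized with Cronbach's alpha. The sketch below applies the standard formula to invented item scores; it is illustrative only and uses none of the study's data.

```python
# Cronbach's alpha for a k-item scale: alpha = k/(k-1) * (1 - sum(item
# variances) / variance(total score)). Illustrative data only.
from statistics import variance

def cronbach_alpha(items):
    """items: list of k lists, each holding one item's scores across respondents."""
    k = len(items)
    total = [sum(scores) for scores in zip(*items)]  # per-respondent sum score
    item_var = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / variance(total))

# Four hypothetical item-score columns (one list per item, rows = respondents):
items = [
    [3, 2, 4, 4, 1, 0],
    [2, 2, 3, 4, 1, 1],
    [3, 1, 4, 3, 0, 1],
    [2, 2, 4, 4, 1, 0],
]
print(round(cronbach_alpha(items), 2))  # -> 0.96, high internal consistency
```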

  16. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    Science.gov (United States)

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The validation strategy was applied to a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of qualitative screening methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method can reliably detect and identify 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.
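
    The acceptance logic of this kind of qualitative screening validation (spike samples at the level of interest, keep the false-negative rate within the target beta error) can be sketched as follows. The 20-sample minimum and 5% beta error are the commonly cited CRL figures, quoted here as assumptions; consult the guideline itself for the binding values.

```python
# Sketch of the core acceptance check in qualitative screening validation:
# spike blank samples at the level of interest (e.g. the MRL) and require
# the false-negative rate to stay within the target beta error.
# Sample-count and beta thresholds below are assumptions for illustration.

def screening_valid(detected_flags, beta_max=0.05, min_samples=20):
    """detected_flags: one bool per spiked sample (True = screened positive)."""
    n = len(detected_flags)
    if n < min_samples:
        raise ValueError(f"need at least {min_samples} spiked samples, got {n}")
    false_negatives = detected_flags.count(False)
    return false_negatives / n <= beta_max

# 20 spiked samples, one missed -> 5 % false negatives, still acceptable:
flags = [True] * 19 + [False]
print(screening_valid(flags))  # -> True
```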

  17. Construct Validity and Scoring Methods of the World Health Organization Health and Work Performance Questionnaire Among Workers With Arthritis and Rheumatological Conditions

    Science.gov (United States)

    AlHeresh, Rawan; LaValley, Michael P.; Coster, Wendy; Keysor, Julie J.

    2017-01-01

    Objective To evaluate construct validity and scoring methods of the World Health Organization Health and Work Performance Questionnaire (HPQ) for people with arthritis. Methods Construct validity was examined through hypothesis testing using the recommended guidelines of the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN). Results The HPQ using the absolute scoring method showed moderate construct validity, as 4 of the 7 hypotheses were met. The HPQ using the relative scoring method had weak construct validity, as only 1 of the 7 hypotheses was met. Conclusion The absolute scoring method for the HPQ is superior in construct validity to the relative scoring method in assessing work performance among people with arthritis and related rheumatic conditions; however, more research is needed to further explore other psychometric properties of the HPQ. PMID:28598938

  18. Analytical method (HPLC) validation used for identification and assay of the pharmaceutical active ingredient, Tylosin tartrate for veterinary use, and its finite product Tilodem 50, hydrosoluble powder

    Directory of Open Access Journals (Sweden)

    Maria Neagu

    2010-12-01

    At SC DELOS IMPEX ’96 SRL, the quality of the active pharmaceutical ingredient (API) for the finite product Tilodem 50 (hydrosoluble powder) was assessed according to the current European Pharmacopoeia. The analytical method used for this purpose was the compendial method “Tylosin tartrate for veterinary use” from the current edition of the Eur. Ph., in a variant developed and validated in-house. The parameters included in the validation of the chromatographic method were: selectivity, linearity, linearity range, detection and quantification limits, precision, repeatability (intra-day), inter-day reproducibility, accuracy, robustness, solution stability and system suitability. According to the European Pharmacopoeia, the active pharmaceutical ingredient is of acceptable quality if it contains a minimum of 80% Tylosin A and a minimum of 95% of Tylosins A, B, C and D combined. Identification and determination of each component separately (Tylosin A, B, C, D) is possible by chromatographic separation (HPLC). Validation of the analytical method is presented below.

  19. Validation and Clinical Evaluation of a Novel Method To Measure Miltefosine in Leishmaniasis Patients Using Dried Blood Spot Sample Collection

    Science.gov (United States)

    Rosing, H.; Hillebrand, M. J. X.; Blesson, S.; Mengesha, B.; Diro, E.; Hailu, A.; Schellens, J. H. M.; Beijnen, J. H.

    2016-01-01

    To facilitate future pharmacokinetic studies of combination treatments against leishmaniasis in remote regions in which the disease is endemic, a simple cheap sampling method is required for miltefosine quantification. The aims of this study were to validate a liquid chromatography-tandem mass spectrometry method to quantify miltefosine in dried blood spot (DBS) samples and to validate its use with Ethiopian patients with visceral leishmaniasis (VL). Since hematocrit (Ht) levels are typically severely decreased in VL patients, returning to normal during treatment, the method was evaluated over a range of clinically relevant Ht values. Miltefosine was extracted from DBS samples using a simple method of pretreatment with methanol, resulting in >97% recovery. The method was validated over a calibration range of 10 to 2,000 ng/ml, and accuracy and precision were within ±11.2% and ≤7.0% (≤19.1% at the lower limit of quantification), respectively. The method was accurate and precise for blood spot volumes between 10 and 30 μl and for Ht levels of 20 to 35%, although a linear effect of Ht levels on miltefosine quantification was observed in the bioanalytical validation. DBS samples were stable for at least 162 days at 37°C. Clinical validation of the method using paired DBS and plasma samples from 16 VL patients showed a median observed DBS/plasma miltefosine concentration ratio of 0.99, with good correlation (Pearson's r = 0.946). Correcting for patient-specific Ht levels did not further improve the concordance between the sampling methods. This successfully validated method to quantify miltefosine in DBS samples was demonstrated to be a valid and practical alternative to venous blood sampling that can be applied in future miltefosine pharmacokinetic studies with leishmaniasis patients, without Ht correction. PMID:26787691

  20. Content validity across methods of malnutrition assessment in patients with cancer is limited

    NARCIS (Netherlands)

    Sealy, Martine J.; Nijholt, Willemke; Stuiver, Martijn M.; van der Berg, Marit M.; Roodenburg, Jan L. N.; Schans, van der Cees P.; Ottery, Faith D.; Jager-Wittenaar, Harriet

    Objective: To identify malnutrition assessment methods in cancer patients and assess their content validity based on internationally accepted definitions for malnutrition. Study Design and Setting: Systematic review of studies in cancer patients that operationalized malnutrition as a variable,

  1. Content validity across methods of malnutrition assessment in patients with cancer is limited

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Roodenburg, Jan; Ottery, Faith D.; van der Schans, Cees; Jager, Harriët

    2016-01-01

    Objective To identify malnutrition assessment methods in cancer patients and assess their content validity based on internationally accepted definitions for malnutrition. Study Design and Setting Systematic review of studies in cancer patients that operationalized malnutrition as a variable,

  2. Testing and Validation of the Dynamic Inertia Measurement Method

    Science.gov (United States)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  3. Single-laboratory validation of a saponification method for the determination of four polycyclic aromatic hydrocarbons in edible oils by HPLC-fluorescence detection.

    Science.gov (United States)

    Akdoğan, Abdullah; Buttinger, Gerhard; Wenzl, Thomas

    2016-01-01

    An analytical method is reported for the determination of four polycyclic aromatic hydrocarbons (benzo[a]pyrene (BaP), benz[a]anthracene (BaA), benzo[b]fluoranthene (BbF) and chrysene (CHR)) in edible oils (sesame, maize, sunflower and olive oil) by high-performance liquid chromatography. Sample preparation is based on three steps including saponification, liquid-liquid partitioning and, finally, clean-up by solid phase extraction on 2 g of silica. Guidance on single-laboratory validation of the proposed analysis method was taken from the second edition of the Eurachem guide on method validation. The lower level of the working range of the method was determined by the limits of quantification of the individual analytes, and the upper level was equal to 5.0 µg kg⁻¹. The limits of detection and quantification of the four PAHs ranged from 0.06 to 0.12 µg kg⁻¹ and from 0.13 to 0.24 µg kg⁻¹. Recoveries of more than 84.8% were achieved for all four PAHs at two concentration levels (2.5 and 5.0 µg kg⁻¹), and expanded relative measurement uncertainties were below 20%. The performance of the validated method was in all aspects compliant with provisions set in European Union legislation for the performance of analytical methods employed in the official control of food. The applicability of the method to routine samples was evaluated based on a limited number of commercial edible oil samples.

  4. Gradient HPLC method development and validation for Simultaneous estimation of Rosiglitazone and Gliclazide.

    Directory of Open Access Journals (Sweden)

    Uttam Singh Baghel

    2012-10-01

    Objective: The aim of the present work was to develop a gradient RP-HPLC method for simultaneous analysis of rosiglitazone and gliclazide in a tablet dosage form. Method: The chromatographic system was optimized using a Hypersil C18 (250 mm × 4.6 mm, 5 µm) column with potassium dihydrogen phosphate (pH 7.0) and acetonitrile in the ratio of 60:40 as mobile phase, at a flow rate of 1.0 ml/min. Detection was carried out at 225 nm by an SPD-20A Prominence UV/Vis detector. Result: Rosiglitazone and gliclazide were eluted with retention times of 17.36 and 7.06 min, respectively. The Beer–Lambert law was obeyed over the concentration ranges of 5 to 70 µg/ml and 2 to 12 µg/ml for rosiglitazone and gliclazide, respectively. Conclusion: The high recovery and low coefficients of variation confirm the suitability of the method for simultaneous analysis of both drugs in a tablet dosage form. Statistical analysis proves that the method is sensitive and significant for the analysis of rosiglitazone and gliclazide in pure form and in pharmaceutical dosage form without any interference from the excipients. The method was validated in accordance with ICH guidelines. Validation revealed the method is specific, rapid, accurate, precise, reliable, and reproducible.
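
    A linearity check like the one reported (Beer-Lambert behaviour over the working range) reduces to an ordinary least-squares calibration line plus back-calculation of unknowns. A stdlib-only sketch with invented peak areas, not the paper's data:

```python
# Least-squares calibration over the linear (Beer-Lambert) range, then
# back-calculation of an unknown sample. Response values are hypothetical.

def fit_line(x, y):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [5, 10, 20, 40, 70]        # calibration standards, ug/ml
area = [51, 99, 202, 401, 699]    # made-up peak areas
slope, intercept = fit_line(conc, area)

unknown_area = 300.0
print(round((unknown_area - intercept) / slope, 1))  # -> 30.0 ug/ml
```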

  5. A validated HPTLC method for the quantification of friedelin in Putranjiva roxburghii Wall extracts and in polyherbal formulations

    Directory of Open Access Journals (Sweden)

    Kedar Kalyani Abhimanyu

    2017-06-01

    In the present study, an HPTLC method was developed and validated for the determination of friedelin in Putranjiva roxburghii Wall (family: Euphorbiaceae) leaf and bark extracts and in polyherbal formulations. Samples were analyzed on TLC aluminium precoated plates (60 F254) using toluene:chloroform (9:1 v/v) as mobile phase. Plates were derivatized with vanillin–sulphuric acid and scanned at 580 nm. The developed method gave a compact spot for friedelin at an Rf value of 0.43 ± 0.01. The method was validated following International Council for Harmonisation (ICH) guidelines, including linearity, precision, accuracy, and robustness. Friedelin was found in the leaf extract of Putranjiva roxburghii Wall (0.003% w/w), in the bark (0.04% w/w), in formulation 1 (0.002% w/w) and in formulation 2 (0.035% w/w). A good linear relationship was found over 100–500 ng spot⁻¹, with a correlation coefficient (r²) of 0.9892 for friedelin. The limit of detection and limit of quantitation were 32.15 and 97.44 ng/band, respectively. The developed method was accurate and precise, with %RSD of 0.78% and 0.9% for inter-day and intra-day precision. Accuracy was assessed by recovery studies at three concentration levels, and the average percentage recovery was 98.55% for friedelin. The proposed method for the quantitation of friedelin was found to be simple, specific, accurate and robust for Putranjiva roxburghii Wall and polyherbal formulations.
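
    LOD/LOQ figures in such validations usually follow the ICH Q2 convention (LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the calibration slope), and recovery is the ratio of measured to spiked amount. A sketch with hypothetical numbers, not the paper's data:

```python
# ICH-style detection/quantitation limits from a calibration line, plus
# percent recovery. All numeric inputs below are invented for illustration.

def lod_loq(sigma, slope):
    """Return (LOD, LOQ) per ICH Q2: 3.3*sigma/S and 10*sigma/S."""
    return 3.3 * sigma / slope, 10 * sigma / slope

def recovery_pct(measured, spiked):
    """Percent recovery of a spiked amount."""
    return 100.0 * measured / spiked

sigma, slope = 0.012, 0.0011          # hypothetical residual SD and slope
lod, loq = lod_loq(sigma, slope)
print(round(lod, 1), round(loq, 1))   # -> 36.0 109.1 (ng/band, say)
print(round(recovery_pct(492.8, 500.0), 2))  # -> 98.56
```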

  6. Validation of analytical method to quality control and the stability study of 0.025 % eyedrops Ketotiphen

    International Nuclear Information System (INIS)

    Troche Concepcion, Yenilen; Romero Diaz, Jacqueline Aylema; Garcia Penna, Caridad M

    2010-01-01

    Ketotiphen eyedrops are prescribed to relieve the signs and symptoms of allergic conjunctivitis due to their potent H1-antihistaminic effect, showing some ability to inhibit the release of histamine and other mediators in cases of mastocytosis. The aim of the present paper was to develop and validate a high-performance liquid chromatography method for the quality control and stability studies of 0.025% Ketotiphen eyedrops. The method was based on separation of the active principle on a Lichrosorb RP-18 column (5 μm, 250 × 4 mm), with UV detection at 296 nm, using as mobile phase a degassed mixture of methanol:phosphate buffer (75:25; pH 8.5) with 1 mL of isopropanol added per 1,000 mL of the mixture, at a flow rate of 1.2 mL/min. The analytical method was linear, precise, specific and accurate over the concentrations studied.

  7. Numerical Validation of the Delaunay Normalization and the Krylov-Bogoliubov-Mitropolsky Method

    Directory of Open Access Journals (Sweden)

    David Ortigosa

    2014-01-01

    A scalable second-order analytical orbit propagator programme based on modern and classical perturbation methods is being developed. As a first step in the validation and verification of part of our orbit propagator programme, we consider only the perturbation produced by zonal harmonic coefficients in the Earth’s gravity potential, so that it is possible to analyze in depth the behaviour of the mathematical expressions involved in the Delaunay normalization and the Krylov-Bogoliubov-Mitropolsky method and determine their limits.

  8. Validation of the quality control method for sodium dicloxacillin in Dicloxen capsules

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Perez Navarro, Maikel; Suarez Perez, Yania

    2014-01-01

    Sodium dicloxacillin is a semisynthetic derivative of the isoxazolyl penicillin group that is available as an oral suspension and in capsules. For the analysis of raw materials and finished products, high-performance liquid chromatography is recommended, but this method is unavailable for the routine analysis of the drug at the laboratory manufacturing Dicloxen capsules. The aim was therefore to develop and validate an ultraviolet spectrophotometric method suitable for the quality control of sodium dicloxacillin in Dicloxen capsules.

  9. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10¹³–10¹⁴ neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  10. A validation framework for microbial forensic methods based on statistical pattern recognition

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  11. Validity of a manual soft tissue profile prediction method following mandibular setback osteotomy.

    Science.gov (United States)

    Kolokitha, Olga-Elpis

    2007-10-01

    The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. To test the validity of the manual method the prediction tracings were compared to the actual post-operative tracings. The Dentofacial Planner software was used to develop the computerized post-surgical prediction tracings. Both manual and computerized prediction printouts were analyzed by using the cephalometric system PORDIOS. Statistical analysis was performed by means of t-tests. Comparison between manual prediction tracings and the actual post-operative profile showed that the manual method results in more convex soft tissue profiles; the upper lip was found in a more prominent position, upper lip thickness was increased and, the mandible and lower lip were found in a less posterior position than that of the actual profiles. Comparison between computerized and manual prediction methods showed that in the manual method upper lip thickness was increased, the upper lip was found in a more anterior position and the lower anterior facial height was increased as compared to the computerized prediction method. Cephalometric simulation of post-operative soft tissue profile following orthodontic-surgical management of mandibular prognathism imposes certain limitations related to the methods employed. However, both manual and computerized prediction methods remain a useful tool for patient communication.

  12. Convergent validity test, construct validity test and external validity test of the David Liberman algorithm

    Directory of Open Access Journals (Sweden)

    David Maldavsky

    2013-08-01

    The author first presents a complement to a previous test of convergent validity, then a construct validity test, and finally an external validity test of the David Liberman algorithm (DLA). The first part of the paper focuses on a complementary aspect, the differential sensitivity of the DLA: 1) in an external comparison (to other methods), and 2) in an internal comparison (between two ways of using the same method, the DLA). The construct validity test presents the concepts underlying the DLA, their operationalization, and some corrections emerging from several empirical studies we carried out. The external validity test examines the possibility of using the investigation of a single case and its relation with the investigation of a more extended sample.

  13. Construct Validity and Scoring Methods of the World Health Organization: Health and Work Performance Questionnaire Among Workers With Arthritis and Rheumatological Conditions.

    Science.gov (United States)

    AlHeresh, Rawan; LaValley, Michael P; Coster, Wendy; Keysor, Julie J

    2017-06-01

    To evaluate construct validity and scoring methods of the World Health Organization Health and Work Performance Questionnaire (HPQ) for people with arthritis. Construct validity was examined through hypothesis testing using the recommended guidelines of the Consensus-based Standards for the selection of health Measurement Instruments (COSMIN). The HPQ using the absolute scoring method showed moderate construct validity, as four of the seven hypotheses were met. The HPQ using the relative scoring method had weak construct validity, as only one of the seven hypotheses was met. The absolute scoring method for the HPQ is superior in construct validity to the relative scoring method in assessing work performance among people with arthritis and related rheumatic conditions; however, more research is needed to further explore other psychometric properties of the HPQ.

  14. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that adapt traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct.

  15. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, Luis; Vassileva, Emilia, E-mail: e.vasileva-veleva@iaea.org

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and
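
    When a method is checked against a certified reference material such as IAEA-405, agreement is often judged with an E_n-style score: the result agrees with the certified value when |E_n| ≤ 1, given expanded uncertainties stated at the same coverage factor. The numeric values below are invented for illustration and are not the paper's data.

```python
# E_n score for agreement between a measured and a certified value,
# using expanded uncertainties (same coverage factor for both).
# Inputs are hypothetical, chosen only to demonstrate the calculation.
from math import sqrt

def e_n(measured, u_meas, certified, u_cert):
    """u_meas, u_cert: expanded uncertainties of each value."""
    return (measured - certified) / sqrt(u_meas**2 + u_cert**2)

score = e_n(measured=5.32, u_meas=0.40, certified=5.49, u_cert=0.53)
print(round(score, 2), abs(score) <= 1)  # -> -0.26 True (agreement)
```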

  16. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    International Nuclear Information System (INIS)

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed

  17. An extended validation of the last generation of particle finite element method for free surface flows

    Science.gov (United States)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This interesting variant, previously described in the literature as PFEM-2, is able to use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems, and the main topic of this paper is a thorough validation of this methodology for a wider range of flows. To accomplish this task, different improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported, with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time steps. The results of the different free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  18. Validity of a New Quantitative Evaluation Method that Uses the Depth of the Surface Imprint as an Indicator for Pitting Edema.

    Science.gov (United States)

    Kogo, Haruki; Murata, Jun; Murata, Shin; Higashi, Toshio

    2017-01-01

    This study examined the validity of a practical evaluation method for pitting edema by comparing it to other methods, including circumference measurements and ultrasound image measurements. Fifty-one patients (102 legs) from a convalescent ward in Maruyama Hospital were recruited for study 1, and 47 patients (94 legs) from a convalescent ward in Morinaga Hospital were recruited for study 2. The relationship between the depth of the surface imprint and circumferential measurements, as well as the relationship between the depth of the surface imprint and the thickness of the subcutaneous soft tissue on an ultrasonogram, were analyzed using Spearman's rank correlation coefficient. There was no significant relationship between the surface imprint depth and circumferential measurements. However, there was a significant relationship between the depth of the surface imprint and the thickness of the subcutaneous soft tissue as measured on an ultrasonogram (correlation coefficient 0.736). Our findings suggest that our novel evaluation method for pitting edema, based on a measurement of the surface imprint depth, is both valid and useful.

  19. Lodenafil carbonate tablets: optimization and validation of a capillary zone electrophoresis method

    OpenAIRE

    Codevilla, Cristiane F; Ferreira, Pâmela Cristina L; Sangoi, Maximiliano S; Fröehlich, Pedro Eduardo; Bergold, Ana Maria

    2012-01-01

    A simple capillary zone electrophoresis (CZE) method was developed and validated for the analysis of lodenafil carbonate in tablets. Response surface methodology was used for optimization of the pH and concentration of the buffer, applied voltage and temperature. The method employed 50 mmol L-1 borate buffer at pH 10 as background electrolyte with an applied voltage of 15 kV. The separation was carried out in a fused-silica capillary maintained at 32.5 ºC and the detection wavelength was 214 ...

  20. Validating MEDIQUAL Constructs

    Science.gov (United States)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs through the different media users in help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, which are based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, the constructs show relatively high correlations between measures of the same construct using different methods and low correlations between measures of constructs that are expected to differ; and 2) the five MEDIQUAL constructs have a statistically significant effect on media users' satisfaction in help desk service, as shown by regression analysis.

  1. Validation of a high-performance liquid chromatography (HPLC) method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    Full Text Available A high-performance liquid chromatography method is described for the quantitative determination of histamine in fish and fishery product samples, together with its validation. Histamine is extracted from fish/fishery products by homogenizing with trichloroacetic acid, then separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed-phase column at ambient temperature (25°C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validated parameters were in good agreement with the method requirements, making it a useful tool for determining histamine in fish and fishery products.

  2. A validated HPTLC method for estimation of moxifloxacin hydrochloride in tablets.

    Science.gov (United States)

    Dhillon, Vandana; Chaudhary, Alok Kumar

    2010-10-01

    A simple HPTLC method having high accuracy, precision and reproducibility was developed for the routine estimation of moxifloxacin hydrochloride in the tablets available on the market and was validated for various parameters according to ICH guidelines. Moxifloxacin hydrochloride was estimated at 292 nm by densitometry using Silica gel 60 F254 as the stationary phase and a premix of methylene chloride:methanol:strong ammonia solution:acetonitrile (10:10:5:10) as the mobile phase. The method was found linear in the range of 9-54 nanograms with a correlation coefficient >0.99. The regression equation was: AUC = 65.57 × (Amount in nanograms) + 163 (r(2) = 0.9908).
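
    The reported calibration line can be inverted to back-calculate the applied amount from a measured AUC; a minimal sketch using the regression figures quoted above (the AUC inputs are invented for illustration):

```python
def amount_from_auc(auc, slope=65.57, intercept=163.0):
    """Back-calculate amount (ng) from densitometric AUC via the reported
    calibration line AUC = 65.57 * amount + 163."""
    return (auc - intercept) / slope

# Hypothetical AUC readings; the method's stated linear range is 9-54 ng.
for auc in (753.13, 1408.83):
    ng = amount_from_auc(auc)
    in_range = 9.0 <= ng <= 54.0
    print(round(ng, 2), in_range)
```

Any back-calculated amount outside the validated 9-54 ng range would require dilution or re-spotting rather than extrapolation of the calibration line.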

  3. Validation of the k0 standardization method in neutron activation analysis

    International Nuclear Information System (INIS)

    Kubesova, Marie

    2009-01-01

    The goal of this work was to validate the k0 standardization method in neutron activation analysis for use by the Nuclear Physics Institute's NAA Laboratory. The precision and accuracy of the method were examined using two types of reference materials: one comprised a set of synthetic materials and served to check the implementation of k0 standardization; the other consisted of NIST SRMs covering various matrices. In general, good agreement was obtained between the results of this work and the certified values, giving evidence of the accuracy of our results. In addition, the limits were evaluated for 61 elements

  4. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs that have been published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review the available evidence. Finally, editors and reviewers are encouraged to recognize

  5. Review of seismic tests for qualification of components and validation of methods

    International Nuclear Information System (INIS)

    Buland, P.; Gantenbein, F.; Gibert, R.J.; Hoffmann, A.; Queval, J.C.

    1988-01-01

    Seismic tests have been performed at CEA-DEMT for many years in order to: demonstrate the qualification of components, and provide experimental validation of the calculation methods used for the seismic design of components. The paper presents examples of these two types of tests, a description of the existing facilities, and details about the new facility TAMARIS under construction. (author)

  6. Review of seismic tests for qualification of components and validation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Buland, P; Gantenbein, F; Gibert, R J; Hoffmann, A; Queval, J C [CEA-CEN SACLAY-DEMT, Gif sur Yvette-Cedex (France)

    1988-07-01

    Seismic tests have been performed at CEA-DEMT for many years in order to: demonstrate the qualification of components, and provide experimental validation of the calculation methods used for the seismic design of components. The paper presents examples of these two types of tests, a description of the existing facilities, and details about the new facility TAMARIS under construction. (author)

  7. Validation of analytical methods for the quality control of Naproxen suppositories

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Hernandez Contreras, Orestes Yuniel

    2011-01-01

    The analysis methods to be used for the quality control of the future Cuban-made Naproxen suppositories for adults and children were developed for the first time in this paper. A method based on direct ultraviolet spectrophotometry was put forward, which proved to be specific, linear, accurate and precise for the quality control of Naproxen suppositories, taking into account the presence of chromophore groups in the drug's structure. Likewise, the direct semi-aqueous acid-base volumetric method used for the quality control of the Naproxen raw material was adapted to the quality control of suppositories. The validation process demonstrated the adequate specificity of this method with respect to the formulation components, as well as its linearity, accuracy and precision in the 1-3 mg/ml range. The final results were compared, and no statistically significant differences among the replicates at each dose were found for either method; therefore, both may be used in the quality control of Naproxen suppositories.

  8. A statistical method (cross-validation) for bone loss region detection after spaceflight

    Science.gov (United States)

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying the specific regions that undergo the greatest losses (e.g., the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Detecting such regions, however, remains an open problem. This paper focuses on statistical methods to detect such regions. We perform statistical parametric mapping to obtain t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
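
    The pipeline outlined above (a t-map, a suprathreshold that defines clusters, and permutation testing of labels) can be sketched on toy one-dimensional data; the threshold, signal location and sign-flipping scheme below are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def t_map(diff):
    """Paired t-statistic per pixel for subject-wise change maps (subjects x pixels)."""
    n = len(diff)
    t = []
    for p in range(len(diff[0])):
        col = [row[p] for row in diff]
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / (n - 1)
        t.append(mean / math.sqrt(var / n))
    return t

def max_cluster_size(tvals, thresh):
    """Largest run of contiguous pixels whose |t| exceeds the suprathreshold."""
    best = run = 0
    for t in tvals:
        run = run + 1 if abs(t) > thresh else 0
        best = max(best, run)
    return best

random.seed(0)
# Toy data: 8 subjects x 100 pixels; pixels 40-49 carry an injected loss signal.
diff = [[random.gauss(0.0, 1.0) - (2.5 if 40 <= p < 50 else 0.0)
         for p in range(100)] for _ in range(8)]

observed = max_cluster_size(t_map(diff), thresh=2.0)

# Permutation test: flip each subject's sign at random (exchangeable under H0).
null = []
for _ in range(500):
    signs = [random.choice([-1.0, 1.0]) for _ in diff]
    flipped = [[s * v for v in row] for s, row in zip(signs, diff)]
    null.append(max_cluster_size(t_map(flipped), thresh=2.0))
p_value = (1 + sum(s >= observed for s in null)) / (1 + len(null))

print(observed, p_value)
```

In the paper, the suprathreshold itself is chosen by cross-validation rather than fixed at 2.0 as here; the sketch only shows how cluster formation and permutation of longitudinal labels fit together.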

  9. Two Validated HPLC Methods for the Quantification of Alizarin and other Anthraquinones in Rubia tinctorum Cultivars

    NARCIS (Netherlands)

    Derksen, G.C.H.; Lelyveld, G.P.; Beek, van T.A.; Capelle, A.; Groot, de Æ.

    2004-01-01

    Direct and indirect HPLC-UV methods for the quantitative determination of anthraquinones in dried madder root have been developed, validated and compared. In the direct method, madder root was extracted twice with refluxing ethanol-water. This method allowed the determination of the two major native

  10. Force measuring valve assemblies, systems including such valve assemblies and related methods

    Science.gov (United States)

    DeWall, Kevin George [Pocatello, ID; Garcia, Humberto Enrique [Idaho Falls, ID; McKellar, Michael George [Idaho Falls, ID

    2012-04-17

    Methods of evaluating a fluid condition may include stroking a valve member and measuring a force acting on the valve member during the stroke. Methods of evaluating a fluid condition may include measuring a force acting on a valve member in the presence of fluid flow over a period of time and evaluating at least one of the frequency of changes in the measured force over the period of time and the magnitude of the changes in the measured force over the period of time to identify the presence of an anomaly in a fluid flow and, optionally, its estimated location. Methods of evaluating a valve condition may include directing a fluid flow through a valve while stroking a valve member, measuring a force acting on the valve member during the stroke, and comparing the measured force to a reference force. Valve assemblies and related systems are also disclosed.

  11. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Richard Schultz

    2012-09-01

    A recommended protocol for formulating numeric tool specifications and validation needs, in concert with practices accepted by regulatory agencies for advanced reactors, is described. The protocol is based on the plant type and the perceived transient and accident envelopes, which translate into boundary conditions for a process that gives: (a) the key phenomena and figures-of-merit which must be analyzed to ensure that the advanced plant can be licensed, (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties, and (c) the specification of the validation matrices and experiments, including the desired validation data. The result of applying the process enables a complete program to be defined, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.

  13. Single-laboratory validation of a method for the determination of select volatile organic compounds in foods by using vacuum distillation with gas chromatography/mass spectrometry.

    Science.gov (United States)

    Nyman, Patricia J; Limm, William; Begley, Timothy H; Chirtel, Stuart J

    2014-01-01

    Recent studies showed that headspace and purge and trap methods have limitations when used to determine volatile organic compounds (VOCs) in foods, including matrix effects and artifact formation from precursors present in the sample matrix or from thermal decomposition. U.S. Environmental Protection Agency Method 8261A liberates VOCs from the sample matrix by using vacuum distillation at room temperature. The method was modified and validated for the determination of furan, chloroform, benzene, trichloroethene, toluene, and styrene in infant formula, canned tuna (in water), peanut butter, and an orange beverage (orange-flavored noncarbonated beverage). The validation studies showed that the LOQ values ranged from 0.05 ng/g for toluene in infant formula to 5.10 ng/g for toluene in peanut butter. Fortified recoveries were determined at the first, second, and third standard additions, and concentrations ranged from 0.07 to 6.9 ng/g. When quantified by the method of standard additions, the recoveries ranged from 56 to 218% at the first standard addition and 89 to 117% at the third. The validated method was used to conduct a survey of the targeted VOCs in 18 foods. The amounts found ranged from none detected to 73.8 ng/g furan in sweet potato baby food.
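
    Quantification by the method of standard additions, as used for the recoveries above, extrapolates the analyte level from the responses to successive spikes; a minimal sketch with invented numbers (the 50-units-per-ng/g response is an assumption):

```python
def standard_additions(signals, added):
    """Method of standard additions: fit signal vs added amount by least squares;
    the analyte amount is the magnitude of the x-intercept, intercept / slope."""
    n = len(added)
    sx, sy = sum(added), sum(signals)
    sxx = sum(x * x for x in added)
    sxy = sum(x * y for x, y in zip(added, signals))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return intercept / slope

# Hypothetical responses for a sample containing 2.0 ng/g of a VOC,
# after spiking 0, 2, and 4 ng/g (assumed response: 50 units per ng/g).
added = [0.0, 2.0, 4.0]
signals = [100.0, 200.0, 300.0]
print(round(standard_additions(signals, added), 2))
```

Extrapolating to the x-intercept in this way compensates for matrix effects, which is why the abstract reports recoveries against the first, second, and third additions separately.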

  14. Development and Validation of a Rule-Based Strength Scaling Method for Musculoskeletal Modelling

    DEFF Research Database (Denmark)

    Oomen, Pieter; Annegarn, Janneke; Rasmussen, John

    2015-01-01

    performed maximal isometric knee extensions. A multiple linear regression analysis (MLR) resulted in an empirical strength scaling equation, accounting for age, mass, height, gender, segment masses and segment lengths. For validation purposes, 20 newly included healthy subjects performed a maximal isometric

  15. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    Directory of Open Access Journals (Sweden)

    Kranti P. Musmade

    2014-01-01

    Full Text Available A simple, precise, accurate, rapid, and sensitive reverse-phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in a novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most citrus plants and has a variety of pharmacological activities. Method optimization was carried out by considering various parameters such as the effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature under isocratic conditions using phosphate buffer pH 3.5:acetonitrile (75:25 % v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guideline Q2(R1). The method was found to be precise and accurate on statistical evaluation, with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust, and was successfully employed for the routine analysis of the said compound in the developed novel nanopharmaceuticals. The presence of excipients did not show any interference in the determination of NAR, indicating method specificity.

  16. Evaluation of the confusion matrix method in the validation of an automated system for measuring feeding behaviour of cattle.

    Science.gov (United States)

    Ruuska, Salla; Hämäläinen, Wilhelmiina; Kajava, Sari; Mughal, Mikaela; Matilainen, Pekka; Mononen, Jaakko

    2018-03-01

    The aim of the present study was to empirically evaluate confusion matrices in device validation. We compared the confusion matrix method to linear regression and error indices in the validation of a device measuring the feeding behaviour of dairy cattle. In addition, we studied how to extract additional information on classification errors with confusion probabilities. The data consisted of 12 h behaviour measurements from five dairy cows; feeding and other behaviour were detected simultaneously with the device and from video recordings. The resulting 216 000 pairs of classifications were used to construct confusion matrices and calculate performance measures. In addition, hourly durations of each behaviour were calculated and the accuracy of the measurements was evaluated with linear regression and error indices. All three validation methods agreed when the behaviour was detected very accurately or inaccurately. Otherwise, in the intermediate cases, the confusion matrix method and error indices produced relatively concordant results, but the linear regression method often disagreed with them. Our study supports the use of confusion matrix analysis in validation since it is robust to any data distribution and type of relationship, it makes a stringent evaluation of validity, and it offers extra information on the type and sources of errors. Copyright © 2018 Elsevier B.V. All rights reserved.
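
    The paired second-by-second classifications described above can be tabulated into a confusion matrix, summary measures and confusion probabilities; a small sketch with hypothetical labels (F = feeding, O = other behaviour):

```python
from collections import Counter

def confusion_matrix(reference, device, labels):
    """Counts of (reference label, device label) pairs from paired classifications."""
    pairs = Counter(zip(reference, device))
    return [[pairs[(r, d)] for d in labels] for r in labels]

def row_probabilities(matrix):
    """Confusion probabilities: each row normalized by its reference-label total."""
    return [[c / sum(row) if sum(row) else 0.0 for c in row] for row in matrix]

# Hypothetical paired per-second observations.
video = list("FFFFOOOOFF")   # reference (video recording)
device = list("FFFOOOOFFF")  # automated device

labels = ["F", "O"]
m = confusion_matrix(video, device, labels)
tp, fn = m[0][0], m[0][1]
fp, tn = m[1][0], m[1][1]

accuracy = (tp + tn) / len(video)
sensitivity = tp / (tp + fn)   # feeding detected when feeding occurred
specificity = tn / (tn + fp)
probs = row_probabilities(m)

print(m, accuracy, sensitivity, specificity, probs)
```

The row-normalized probabilities show where the device's errors concentrate, which is the "extra information on the type and sources of errors" the abstract refers to.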

  17. Validation of a Novel 3-Dimensional Sonographic Method for Assessing Gastric Accommodation in Healthy Adults

    NARCIS (Netherlands)

    Buisman, Wijnand J; van Herwaarden-Lindeboom, MYA; Mauritz, Femke A; El Ouamari, Mourad; Hausken, Trygve; Olafsdottir, Edda J; van der Zee, David C; Gilja, Odd Helge

    OBJECTIVES: A novel automated 3-dimensional (3D) sonographic method has been developed for measuring gastric volumes. This study aimed to validate and assess the reliability of this novel 3D sonographic method compared to the reference standard in 3D gastric sonography: freehand magneto-based 3D

  18. Composite materials and bodies including silicon carbide and titanium diboride and methods of forming same

    Science.gov (United States)

    Lillo, Thomas M.; Chu, Henry S.; Harrison, William M.; Bailey, Derek

    2013-01-22

    Methods of forming composite materials include coating particles of titanium dioxide with a substance including boron (e.g., boron carbide) and a substance including carbon, and reacting the titanium dioxide with the substance including boron and the substance including carbon to form titanium diboride. The methods may be used to form ceramic composite bodies and materials, such as, for example, a ceramic composite body or material including silicon carbide and titanium diboride. Such bodies and materials may be used as armor bodies and armor materials. Such methods may include forming a green body and sintering the green body to a desirable final density. Green bodies formed in accordance with such methods may include particles comprising titanium dioxide and a coating at least partially covering exterior surfaces thereof, the coating comprising a substance including boron (e.g., boron carbide) and a substance including carbon.

  19. Optimization, validation and application of UV-Vis spectrophotometric-colorimetric methods for determination of trimethoprim in different medicinal products

    Directory of Open Access Journals (Sweden)

    Goran Stojković

    2016-03-01

    Full Text Available Two simple, sensitive, selective, precise, and accurate methods for the determination of trimethoprim in different sulfonamide formulations intended for use in human and veterinary medicine were optimized and validated. The methods are based on the reaction of trimethoprim with bromcresol green (BCG) and 2,4-dinitro-1-fluorobenzene (DNFB). As extraction solvents we used 10% N,N-dimethylacetamide in methanol and acetone for the two methods, respectively. The colored products are quantified by visible spectrophotometry at their corresponding absorption maxima. The methods were validated for linearity, sensitivity, accuracy, and precision. We tested the applicability of the methods on four different medicinal products in tablet and powder forms containing sulfametrole and sulfamethoxazole in combination with trimethoprim. The results revealed that both methods are equally accurate, with recoveries within the range 95-105%. The between-day precision obtained for both methods, when applied to the four medicinal products, was within the range 1.08-3.20%. By applying the F-test (P<0.05), it was concluded that for three of the medicinal products tested both methods are applicable, with a statistically insignificant difference in precision. The optimized and validated BCG and DNFB methods could find application in the routine quality control of trimethoprim in various formulation forms, at different concentration levels, and in combination with different sulfonamides.
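
    The F-test comparison of precision mentioned above can be sketched with the sample variances of two recovery series; the recovery values and the tabulated two-sided critical value F(0.025; 5, 5) ≈ 7.15 are illustrative assumptions, not data from the study:

```python
import statistics

def f_test_statistic(a, b):
    """F statistic for equality of variances, larger sample variance on top."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return max(va, vb) / min(va, vb)

# Hypothetical between-day recoveries (%) of trimethoprim by the two methods.
bcg = [98.2, 99.1, 97.5, 98.8, 99.4, 98.0]
dnfb = [97.9, 99.5, 98.4, 97.2, 99.0, 98.6]

F = f_test_statistic(bcg, dnfb)
# Tabulated critical value F(0.025; 5, 5) ~= 7.15 for a two-sided test at P < 0.05.
equal_precision = F < 7.15
print(round(F, 2), equal_precision)
```

If F stayed below the critical value, as here, the difference in precision between the BCG and DNFB methods would be judged statistically insignificant at P < 0.05.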

  20. Initiation devices, initiation systems including initiation devices and related methods

    Energy Technology Data Exchange (ETDEWEB)

    Daniels, Michael A.; Condit, Reston A.; Rasmussen, Nikki; Wallace, Ronald S.

    2018-04-10

    Initiation devices may include at least one substrate, an initiation element positioned on a first side of the at least one substrate, and a spark gap electrically coupled to the initiation element and positioned on a second side of the at least one substrate. Initiation devices may include a plurality of substrates where at least one substrate of the plurality of substrates is electrically connected to at least one adjacent substrate of the plurality of substrates with at least one via extending through the at least one substrate. Initiation systems may include such initiation devices. Methods of igniting energetic materials include passing a current through a spark gap formed on at least one substrate of the initiation device, passing the current through at least one via formed through the at least one substrate, and passing the current through an explosive bridge wire of the initiation device.

  1. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

    Data to which the authors refer to throughout this article are likelihood ratios (LR) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LRs data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  2. FDIR Strategy Validation with the B Method

    Science.gov (United States)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation flying satellite system, the FDIR strategy (Failure Detection, Isolation and Recovery) is paramount. When a failure occurs, satellites should be able to take appropriate reconfiguration actions to obtain the best possible results given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation implements an FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by CNES, ClearSy is experimenting with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (2-satellite configuration) and PEGASE (3-satellite configuration) missions and potentially for other missions afterward. These radio frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  3. A proactive alarm reduction method and its human factors validation test for a main control room for SMART

    International Nuclear Information System (INIS)

    Jang, Gwi-sook; Suh, Sang-moon; Kim, Sa-kil; Suh, Yong-suk; Park, Je-yun

    2013-01-01

    Highlights: ► A proactive alarm reduction method improves the effectiveness of alarm reduction. ► The method suppresses alarms based on ECA rules and facts under an alarm flood situation. ► The alarm reduction logics are supplemented to achieve a high hit ratio during on-line operation. ► The method is validated by a human factors validation test based on regulatory requirements. -- Abstract: Conventional alarm systems tend to overwhelm operators during a transient because of a large number of nearly simultaneous annunciator activations with varying degrees of relevance to operator tasks. Alarm processing techniques have therefore been developed to support operators in coping with the volume of alarms, to identify which alarms are significant, and to reduce the need for operators to infer the plant conditions. This paper proposes a proactive alarm reduction method for SMART (System-integrated Modular Advanced ReacTor) whereby alarm reduction during the next transient is carried out based on the recorded effects of past operation. We designed and implemented the proactive alarm reduction system and constructed the environment for the human factors validation test. Eight subjects working in a nuclear power plant (NPP) then tested the practical effectiveness of the proposed proactive alarm reduction method according to the procedure of the human factors validation test under a dynamic partial-scope simulation of an NPP.
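
    The ECA-style (event-condition-action) suppression mentioned in the highlights can be sketched generically; the rules, alarm tags and plant facts below are invented for illustration and do not reflect the SMART implementation:

```python
from dataclasses import dataclass

@dataclass
class Alarm:
    tag: str
    cause: str

def suppress(alarms, rules, facts):
    """Keep only the alarms that no ECA rule suppresses given the current facts."""
    return [a for a in alarms if not any(rule(a, facts) for rule in rules)]

# Hypothetical rules: during a reactor trip, suppress the flood of expected
# consequential alarms so the primary-cause alarm stands out.
rules = [
    lambda a, f: f.get("reactor_trip") and a.cause == "consequential",
    lambda a, f: f.get("pump_stopped") and a.tag.startswith("FLOW_LOW"),
]

alarms = [
    Alarm("RX_TRIP", "primary"),
    Alarm("TURBINE_TRIP", "consequential"),
    Alarm("FLOW_LOW_1", "consequential"),
]
facts = {"reactor_trip": True, "pump_stopped": True}

print([a.tag for a in suppress(alarms, rules, facts)])
```

The "proactive" aspect described in the abstract would correspond to updating the rule and fact base from the effects recorded during past transients, which this sketch does not model.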

  4. Development and content validation of the information assessment method for patients and consumers.

    Science.gov (United States)

    Pluye, Pierre; Granikov, Vera; Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan Lm; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-02-18

    Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded and all inconsistencies in using a

  5. Validation and Recommendation of Methods to Measure Biogas Production Potential of Animal Manure

    Directory of Open Access Journals (Sweden)

    C. H. Pham

    2013-06-01

Full Text Available In developing countries, biogas energy production is seen as a technology that can provide clean energy in poor regions and reduce pollution caused by animal manure. Laboratories in these countries have little access to advanced gas measuring equipment, which may limit research aimed at improving locally adapted biogas production. They may also be unable to produce valid estimates of an international standard that can be used for articles published in international peer-reviewed science journals. This study tested and validated methods for measuring total biogas and methane (CH4) production using batch fermentation and for characterizing the biomass. The biochemical methane potential (BMP, CH4 NL kg−1 VS) of pig manure, cow manure and cellulose determined with the Moller and VDI methods was not significantly different in this test (p>0.05). The biodegradability, expressed as the ratio of BMP to theoretical BMP (TBMP), was slightly higher using the Hansen method, but differences were not significant. The degradation rate assessed by methane formation rate showed wide variation within the batch methods tested. The first-order kinetics constant k for the cumulative methane production curve was highest when the two animal manures were fermented using the VDI 4630 method, indicating that this method was able to reach steady conditions in a shorter time, reducing fermentation duration. In precision tests, the repeatability relative standard deviation (RSDr) for all batch methods was very low (4.8 to 8.1%), while the reproducibility relative standard deviation (RSDR) varied widely, from 7.3 to 19.8%. In determination of biomethane concentration, the values obtained using the liquid replacement method (LRM) were comparable to those obtained using gas chromatography (GC). This indicates that the LRM could be used to determine biomethane concentration in biogas in laboratories with limited access to GC.
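The first-order kinetics constant k mentioned in this abstract can be estimated from a cumulative methane curve of the form B(t) = BMP·(1 − e^(−kt)). A minimal Python sketch under stated assumptions: the function name and the batch data are hypothetical illustrations, not values from the study.

```python
import math

def first_order_k(times_d, cum_ch4, bmp):
    """Estimate the first-order constant k (d^-1) for a cumulative methane
    curve B(t) = BMP*(1 - exp(-k*t)) by linearising ln(1 - B/BMP) = -k*t
    and fitting a zero-intercept least-squares slope."""
    num = den = 0.0
    for t, b in zip(times_d, cum_ch4):
        if 0.0 < b < bmp:                 # skip points at or past the plateau
            y = math.log(1.0 - b / bmp)   # linearised response, equals -k*t
            num += t * y
            den += t * t
    return -num / den

# Hypothetical batch data: days vs. cumulative CH4 (NL kg-1 VS), BMP = 350
times = [2, 5, 10, 15, 20, 30]
bmp = 350.0
cum = [bmp * (1 - math.exp(-0.15 * t)) for t in times]
print(round(first_order_k(times, cum, bmp), 3))  # prints 0.15
```

With exact synthetic data the estimator recovers k; with real batch data, points at or beyond the BMP plateau must be excluded because the logarithm is undefined there.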

  6. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation.

    Science.gov (United States)

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), the urea breath test (UBT), and the stool antigen test (SAT), have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT have become the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicate that no single test can be considered the gold standard for the diagnosis of H. pylori infection and that one should consider each method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2-90.8% and 83.3-86.9% and a specificity of 97.7-98.8% and 95.1-97.2%, respectively, when using IHC as the gold standard. The sensitivity of serology is quite high, but that of the urine test is lower compared with that of the other methods. Thus, indirect test validation is important, although some commercial kits propose universal cut-off values.
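Sensitivity and specificity figures like those quoted above (index tests scored against IHC as the gold standard) come from a standard 2×2 contingency table. A minimal sketch; the counts are hypothetical, not taken from the cited studies.

```python
def sens_spec(tp, fn, fp, tn):
    """Sensitivity and specificity of an index test scored against a
    gold standard, from 2x2 contingency-table counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: an index test vs. IHC on 200 subjects
se, sp = sens_spec(tp=83, fn=17, fp=2, tn=98)
print(f"sensitivity={se:.1%}, specificity={sp:.1%}")
```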

  7. Checklists for external validity

    DEFF Research Database (Denmark)

    Dyrvig, Anne-Kirstine; Kidholm, Kristian; Gerke, Oke

    2014-01-01

to an implementation setting. In this paper, currently available checklists on external validity are identified, assessed and used as a basis for proposing a new improved instrument. METHOD: A systematic literature review was carried out in Pubmed, Embase and Cinahl on English-language papers without time restrictions. … The retrieved checklist items were assessed for (i) the methodology used in primary literature, justifying inclusion of each item; and (ii) the number of times each item appeared in checklists. RESULTS: Fifteen papers were identified, presenting a total of 21 checklists for external validity, yielding a total of 38 checklist items. Empirical support was considered the most valid methodology for item inclusion. Assessment of methodological justification showed that none of the items were supported empirically. Other kinds of literature justified the inclusion of 22 of the items, and 17 items were included …

  8. Pesticide applicators questionnaire content validation: A fuzzy delphi method.

    Science.gov (United States)

    Manakandan, S K; Rosnah, I; Mohd Ridhuan, J; Priya, R

    2017-08-01

The most crucial step in forming a survey questionnaire is deciding on the appropriate items in a construct. Retaining irrelevant items and removing important items will certainly mislead the direction of a particular study. This article demonstrates the Fuzzy Delphi method as a scientific analysis technique to consolidate consensus agreement within a panel of experts pertaining to each item's appropriateness. This method reduces the ambiguity, diversity, and discrepancy of the opinions among the experts and hence enhances the quality of the selected items. The main purpose of this study was to obtain experts' consensus on the suitability of the preselected items in the questionnaire. The panel consisted of sixteen experts from the Occupational and Environmental Health Unit of the Ministry of Health, the Vector-borne Disease Control Unit of the Ministry of Health and the Occupational and Safety Health Units of both public and private universities. A set of questionnaires related to noise and chemical exposure was compiled based on a literature search. There was a total of six constructs with 60 items: three constructs for knowledge, attitude, and practice of noise exposure and three constructs for knowledge, attitude, and practice of chemical exposure. The validation process replicated a recent Fuzzy Delphi method using the concept of Triangular Fuzzy Numbers and a defuzzification process. A 100% response rate was obtained from all sixteen experts, with an average Likert scoring of four to five. After the FDM analysis, the first prerequisite was fulfilled with a threshold value (d) ≤ 0.2, hence all six constructs were accepted. For the second prerequisite, three items (21%) from the noise-attitude construct and four items (40%) from the chemical-practice construct had expert consensus of less than 75%, amounting to about 12% of the total items in the questionnaire. The third prerequisite was used to rank the items within the constructs by calculating the average
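The Fuzzy Delphi prerequisites described above (threshold d ≤ 0.2, expert consensus ≥ 75%, ranking by defuzzified score) can be sketched for a single item as follows. The triangular-fuzzy-number data, the distance formula variant and the simple-centroid defuzzification are illustrative assumptions, not the authors' exact implementation.

```python
def fuzzy_delphi_item(expert_tfns, d_max=0.2, consensus_min=0.75):
    """Evaluate one questionnaire item from expert triangular fuzzy
    numbers (l, m, u) on a 0-1 scale.
    Returns (accepted, consensus_ratio, defuzzified_score)."""
    n = len(expert_tfns)
    avg = tuple(sum(t[i] for t in expert_tfns) / n for i in range(3))
    # Distance between each expert's TFN and the panel-average TFN
    dists = [(sum((t[i] - avg[i]) ** 2 for i in range(3)) / 3) ** 0.5
             for t in expert_tfns]
    consensus = sum(d <= d_max for d in dists) / n
    score = sum(avg) / 3                  # simple-centroid defuzzification
    return consensus >= consensus_min, consensus, score

# Hypothetical panel of 16 experts: 14 agree strongly, 2 dissent
tfns = [(0.6, 0.8, 1.0)] * 14 + [(0.2, 0.4, 0.6)] * 2
accepted, consensus, score = fuzzy_delphi_item(tfns)
print(accepted, round(consensus, 3), round(score, 2))  # prints: True 0.875 0.75
```

Items passing both thresholds are retained; the defuzzified score then ranks items within a construct.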

  9. Validated UV-Spectrophotometric Methods for Determination of Gemifloxacin Mesylate in Pharmaceutical Tablet Dosage Forms

    Directory of Open Access Journals (Sweden)

    R. Rote Ambadas

    2010-01-01

Full Text Available Two simple, economic and accurate UV spectrophotometric methods have been developed for the determination of gemifloxacin mesylate in pharmaceutical tablet formulation. The first UV-spectrophotometric method depends upon the measurement of absorbance at the wavelength 263.8 nm. In the second, area-under-curve method, the wavelength range 268.5-258.5 nm was selected for detection. Beer's law was obeyed in the range of 2 to 12 μg mL−1 for both methods. The proposed methods were validated statistically and applied successfully to the determination of gemifloxacin mesylate in pharmaceutical formulation.
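A Beer's-law calibration of the kind validated here is an ordinary least-squares line of absorbance against concentration, inverted to quantify an unknown. A sketch with hypothetical standards in the reported 2-12 μg/mL range (the absorbance values are invented for illustration):

```python
def calibration_line(conc, absorb):
    """Least-squares Beer's-law line A = slope*C + intercept."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(absorb) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, absorb))
             / sum((x - mx) ** 2 for x in conc))
    return slope, my - slope * mx

def predict_conc(slope, intercept, absorbance):
    """Invert the calibration line to quantify an unknown sample."""
    return (absorbance - intercept) / slope

# Hypothetical standards over the reported 2-12 ug/mL range
c = [2, 4, 6, 8, 10, 12]
a = [0.102, 0.201, 0.305, 0.398, 0.501, 0.603]
slope, intercept = calibration_line(c, a)
print(round(predict_conc(slope, intercept, 0.350), 2))
```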

  10. Validation of an analytical method for determining halothane in urine as an instrument for evaluating occupational exposure

    International Nuclear Information System (INIS)

    Gonzalez Chamorro, Rita Maria; Jaime Novas, Arelis; Diaz Padron, Heliodora

    2010-01-01

Occupational exposure to harmful substances may cause significant changes in the normal physiology of the organism when adequate safety measures are not taken in time at a workplace where the risk is present. Among the chemical risks that may affect workers' health are the inhalable anesthetic agents. With the objective of taking the first steps toward the introduction of an epidemiological surveillance system for this personnel, an analytical method for determining this anesthetic in urine was validated under the instrumental conditions available in our laboratory. To carry out this validation the following parameters were taken into account: specificity, linearity, precision, accuracy, detection limit and quantification limit, and the uncertainty of the method was calculated. In the validation procedure it was found that the technique is specific and precise; the detection limit was 0.118 μg/L and the quantification limit 0.354 μg/L. The global uncertainty was 0.243, and the expanded uncertainty 0.486. The validated method, together with the subsequent introduction of biological exposure limits, will serve as an auxiliary diagnostic tool allowing periodic control of personnel exposure

  11. Validate or falsify: Lessons learned from a microscopy method claimed to be useful for detecting Borrelia and Babesia organisms in human blood.

    Science.gov (United States)

    Aase, Audun; Hajdusek, Ondrej; Øines, Øivind; Quarsten, Hanne; Wilhelmsson, Peter; Herstad, Tove K; Kjelland, Vivian; Sima, Radek; Jalovecka, Marie; Lindgren, Per-Eric; Aaberge, Ingeborg S

    2016-01-01

A modified microscopy protocol (the LM-method) was used to demonstrate what was interpreted as Borrelia spirochetes, and later also Babesia sp., in peripheral blood from patients. The method gained much publicity, but was not validated prior to publication; such validation, using appropriate scientific methodology including a control group, became the purpose of this study. Blood from 21 patients previously interpreted as positive for Borrelia and/or Babesia infection by the LM-method and from 41 healthy controls without known history of tick bite was collected, blinded and analysed for these pathogens by microscopy in two laboratories (by the LM-method and the conventional method, respectively), by PCR methods in five laboratories and by serology in one laboratory. Microscopy by the LM-method identified structures claimed to be Borrelia and/or Babesia in 66% of the blood samples of the patient group and in 85% of the healthy control group. Microscopy by the conventional method, for Babesia only, did not identify Babesia in any samples. PCR analysis detected Borrelia DNA in one sample of the patient group and in eight samples of the control group, whereas Babesia DNA was not detected in any of the blood samples using molecular methods. The structures interpreted as Borrelia and Babesia by the LM-method could not be verified by PCR. The method was, thus, falsified. This study underlines the importance of doing proper test validation before new or modified assays are introduced.

  12. WIMS-AECL/RFSP code validation of reactivity calculations following a long shutdown using the simple-cell history-based method

    International Nuclear Information System (INIS)

    Ardeshiri, F.; Donnelly, J.V.; Arsenault, B.

    1998-01-01

    The purpose of this analysis is to validate the Reactor Fuelling Simulation Program (RFSP) using the simple-cell model (SCM) history-based method in a startup simulation following a reactor shutdown period. This study is part of the validation work for history-based calculations, using the WIMS-AECL code with the ENDF/B-V library, and the SCM linked to the RFSP code. In this work, the RFSP code with the SCM history-based method was used to track a 1-year period of the Point Lepreau reactor operating history, that included a 12-day reactor shutdown and subsequent startup. Measured boron and gadolinium concentrations were used in the RFSP simulations, and the predicted values of core reactivity were compared to the reference (pre-shutdown) value. The discrepancies in core reactivity are shown to be better than ±2 milli-k at any time, and better than about ±0.5 milli-k towards the end of the startup transient. The results of this analysis also show that the calculated maximum channel and bundle powers are within an acceptable range during both the core-follow and the reactor startup simulations. (author)

  13. Validation of a method for assessing resident physicians' quality improvement proposals.

    Science.gov (United States)

    Leenstra, James L; Beckman, Thomas J; Reed, Darcy A; Mundell, William C; Thomas, Kris G; Krajicek, Bryan J; Cha, Stephen S; Kolars, Joseph C; McDonald, Furman S

    2007-09-01

Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. We developed an instrument for assessing resident QI proposals, the Quality Improvement Proposal Assessment Tool (QIPAT-7), and determined its validity and reliability. QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. Seven raters used the instrument to assess 45 resident QI proposals. Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach's alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. QIPAT-7 items comprised a single factor (eigenvalue = 3.4), suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach's alpha = 0.87) were high. This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes.
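Cronbach's alpha, reported above as evidence of internal consistency among the QIPAT-7 items, compares the sum of per-item variances to the variance of total scores. A sketch with hypothetical ratings (the data are illustrative, not the study's):

```python
def sample_var(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """Cronbach's alpha: rows = one list of item scores per assessed
    proposal; alpha = k/(k-1) * (1 - sum(item variances)/variance(totals))."""
    k = len(rows[0])
    item_vars = sum(sample_var(list(col)) for col in zip(*rows))
    total_var = sample_var([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5 proposals rated on 3 items (5-point scale)
ratings = [[4, 4, 5], [3, 3, 3], [5, 4, 5], [2, 2, 3], [4, 5, 4]]
print(round(cronbach_alpha(ratings), 2))  # prints 0.91
```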

  14. Validation of an ultra-fast UPLC-UV method for the separation of antituberculosis tablets.

    Science.gov (United States)

    Nguyen, Dao T-T; Guillarme, Davy; Rudaz, Serge; Veuthey, Jean-Luc

    2008-04-01

A simple method using ultra performance LC (UPLC) coupled with UV detection was developed and validated for the determination of antituberculosis drugs in combined dosage form, i.e. isoniazid (ISN), pyrazinamide (PYR) and rifampicin (RIF). Drugs were separated on a short column (2.1 mm x 50 mm) packed with 1.7 μm particles, using an elution gradient procedure. At 30 °C, less than 2 min was necessary for the complete separation of the three antituberculosis drugs, while the original USP method was performed in 15 min. Further improvements were obtained with the combination of UPLC and high temperature (up to 90 °C), namely HT-UPLC, which allows the application of higher mobile phase flow rates. Therefore, the separation of ISN, PYR and RIF was performed in less than 1 min. After validation (selectivity, trueness, precision and accuracy), both methods (UPLC and HT-UPLC) have proven suitable for the routine quality control analysis of antituberculosis drugs in combined dosage form. Additionally, a large number of samples per day can be analysed due to the short analysis times.

  15. Reliable and valid assessment of performance in thoracoscopy

    DEFF Research Database (Denmark)

    Konge, Lars; Lehnert, Per; Hansen, Henrik Jessen

    2012-01-01

BACKGROUND: As we move toward competency-based education in medicine, we have lagged in developing competency-based evaluation methods. In the era of minimally invasive surgery, there is a need for a reliable and valid tool dedicated to measuring competence in video-assisted thoracoscopic surgery. … The purpose of this study is to create such an assessment tool, and to explore its reliability and validity. METHODS: An expert group of physicians created an assessment tool consisting of 10 items rated on a five-point rating scale. The following factors were included: economy and confidence of movement …

  16. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports from an actual system development project in Japan. We found a certain number of defects concerning missing or defective stimuli and responses that could have been detected with the proposed method had it been used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.

  17. Validity of the Remote Food Photography Method against Doubly Labeled Water among Minority Preschoolers

    OpenAIRE

    Nicklas, Theresa; Saab, Rabab; Islam, Noemi G.; Wong, William; Butte, Nancy; Schulin, Rebecca; Liu, Yan; Apolzan, John W.; Myers, Candice A.; Martin, Corby K.

    2017-01-01

Objective: To determine the validity of energy intake (EI) estimations made using the Remote Food Photography Method (RFPM) compared to the doubly-labeled water (DLW) method in minority preschool children in a free-living environment. Methods: Seven days of food intake and spot urine samples (excluding first-void collections) for DLW analysis were obtained from 39 3-to-5-year-old Hispanic and African American children. Using an iPhone, caregivers captured before and after pictures of the child's in...

  18. Measurement and data analysis methods for field-scale wind erosion studies and model validation

    NARCIS (Netherlands)

    Zobeck, T.M.; Sterk, G.; Funk, R.F.; Rajot, J.L.; Stout, J.E.; Scott Van Pelt, R.

    2003-01-01

    Accurate and reliable methods of measuring windblown sediment are needed to confirm, validate, and improve erosion models, assess the intensity of aeolian processes and related damage, determine the source of pollutants, and for other applications. This paper outlines important principles to

  19. Development and validation of a bioanalytical LC-MS method for the quantification of GHRP-6 in human plasma.

    Science.gov (United States)

    Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier

    2012-02-23

Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs like small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with a high yield. Quantification was achieved by LC-MS in positive full-scan mode on a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a calibration range from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to allow analysis of samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its IS was demonstrated under all conditions that the samples would experience during real analyses. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. Validation and User Evaluation of a Sensor-Based Method for Detecting Mobility-Related Activities in Older Adults.

    Directory of Open Access Journals (Sweden)

    Hilde A E Geraedts

Full Text Available Regular physical activity is essential for older adults to stay healthy and independent. However, daily physical activity is generally low among older adults and mainly consists of activities such as standing and shuffling around indoors. Accurate measurement of this low-energy-expenditure daily physical activity is crucial for the stimulation of activity. The objective of this study was to assess the validity of a necklace-worn sensor-based method for detecting time-on-legs and daily-life mobility-related postures in older adults. In addition, user opinion about the practical use of the sensor was evaluated. Twenty frail and non-frail older adults performed a standardized and a free movement protocol in their own home. Results of the sensor-based method were compared to video observation. Sensitivity, specificity and overall agreement of sensor outcomes compared to video observation were calculated. Mobility was assessed based on time-on-legs. Further assessment included the categories standing, sitting, walking and lying. Time-on-legs-based sensitivity, specificity and percentage agreement were good to excellent and comparable to laboratory outcomes in other studies. Category-based sensitivity, specificity and overall agreement were moderate to excellent. The necklace-worn sensor is considered an acceptably valid instrument for assessing home-based physical activity based upon time-on-legs in frail and non-frail older adults, but category-based assessment of gait and postures could be further developed.

  1. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

Full Text Available In research laboratories, whether for organic synthesis or the extraction of natural products, many products with potential biological activity are obtained every day. It is therefore necessary to have in vitro assays that provide reliable information for further evaluation in in vivo systems. From this point of view, the use of high-throughput screening assays has intensified in recent years. Such assays should be optimized and validated to yield accurate and precise, i.e. reliable, results. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as the detection system. It focuses particularly on the selection of the method, optimization to determine the best experimental conditions, validation, application of the optimized and validated method to real samples, and finally maintenance and possible transfer to a new laboratory.

  2. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias G

    2011-01-01

We have implemented a simple, inexpensive, and fast procedure for validation and verification of the performance of pipettes mounted on automated liquid handlers (ALHs), as necessary for laboratories accredited under ISO 17025. A six- or seven-step serial dilution of OrangeG was prepared in quadruplicate … are freely available. In conclusion, we have set up a simple, inexpensive, and fast solution for the continuous validation of ALHs used for accredited work according to the ISO 17025 standard. The method is easy to use for aqueous solutions but requires a spectrophotometer that can read microtiter plates.
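A serial-dilution check of this kind compares each measured absorbance with the value expected from the previous step and the dilution factor. A hedged sketch: the 1:2 factor, the 5% tolerance and the dye readings are illustrative assumptions, not the authors' published acceptance criteria.

```python
def check_serial_dilution(absorbances, dilution_factor=2.0, tol_pct=5.0):
    """Flag pipetting errors from a dye serial dilution: each measured
    absorbance should equal the previous one divided by the dilution
    factor, within tol_pct percent."""
    deviations = []
    for prev, cur in zip(absorbances, absorbances[1:]):
        expected = prev / dilution_factor
        deviations.append(100.0 * (cur - expected) / expected)
    return all(abs(d) <= tol_pct for d in deviations), deviations

# Hypothetical six-step 1:2 dilution read on a plate reader
ok, devs = check_serial_dilution([1.60, 0.81, 0.40, 0.198, 0.101, 0.050])
print(ok, [round(d, 1) for d in devs])
```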

  3. Rapid determination of anti-estrogens by gas chromatography/mass spectrometry in urine: Method validation and application to real samples

    Directory of Open Access Journals (Sweden)

    E. Gerace

    2012-02-01

Full Text Available A fast screening protocol was developed for the simultaneous determination of nine anti-estrogenic agents (aminoglutethimide, anastrozole, clomiphene, drostanolone, formestane, letrozole, mesterolone, tamoxifen, testolactone) plus five of their metabolites in human urine. After an enzymatic hydrolysis, these compounds can be extracted simultaneously from urine with a simple liquid–liquid extraction under alkaline conditions. The analytes were subsequently analyzed by fast gas chromatography/mass spectrometry (fast-GC/MS) after derivatization. The use of a short column, high carrier-gas velocity and fast temperature ramping produced an efficient separation of all analytes in about 4 min, allowing a processing rate of 10 samples/h. The present analytical method was validated according to UNI EN ISO/IEC 17025 guidelines for qualitative methods. The range of investigated parameters included the limit of detection, selectivity, linearity, repeatability, robustness and extraction efficiency. A high MS-sampling rate, using a benchtop quadrupole mass analyzer, resulted in accurate peak-shape definition under both scan and selected ion monitoring modes, and high sensitivity in the latter mode. Therefore, the performance of the method is comparable to that obtainable from traditional GC/MS analysis. The method was successfully tested on real samples arising from clinical treatments of hospitalized patients and could profitably be used for clinical studies on anti-estrogenic drug administration. Keywords: Anti-estrogens, Fast-GC/MS, Urine screening, Validation, Breast cancer

  4. A novel technique for including surface tension in PLIC-VOF methods

    Energy Technology Data Exchange (ETDEWEB)

    Meier, M.; Yadigaroglu, G. [Swiss Federal Institute of Technology, Nuclear Engineering Lab. ETH-Zentrum, CLT, Zurich (Switzerland); Smith, B. [Paul Scherrer Inst. (PSI), Villigen (Switzerland). Lab. for Thermal-Hydraulics

    2002-02-01

Various versions of Volume-of-Fluid (VOF) methods have been used successfully for the numerical simulation of gas-liquid flows with explicit tracking of the phase interface. Of these, Piecewise-Linear Interface Construction (PLIC-VOF) appears as a fairly accurate, although somewhat more involved, variant. Including effects due to surface tension remains a problem, however. The most prominent methods, the Continuum Surface Force (CSF) method of Brackbill et al. and the method of Zaleski and co-workers (both referenced later), both induce spurious or 'parasitic' currents, and achieve only moderate accuracy with regard to determining the curvature. We present here a new method to determine curvature accurately using an estimator function, which is tuned with a least-squares fit against reference data. Furthermore, we show how spurious currents may be drastically reduced using the reconstructed interfaces from the PLIC-VOF method. (authors)

  5. Validation of an analytical method for nitrous oxide (N2O) laughing gas by headspace gas chromatography coupled to mass spectrometry (HS-GC-MS): forensic application to a lethal intoxication.

    Science.gov (United States)

    Giuliani, N; Beyer, J; Augsburger, M; Varlet, V

    2015-03-01

Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Some gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been previously described. The main drawbacks of these methods include a lack of sensitivity for forensic applications, including an inability to determine quantitatively the concentration of gas present. The following study provides a validated HS-GC-MS method which incorporates hydrogen sulfide as a suitable internal standard, allowing the quantification of nitrous oxide. Upon analysis, sample and internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, therefore providing rapid data collection whilst preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, yielding quantitative concentrations in a mono-intoxication. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Adaptation, validation and application of the chemo-thermal oxidation method to quantify black carbon in soils

    International Nuclear Information System (INIS)

    Agarwal, Tripti; Bucheli, Thomas D.

    2011-01-01

The chemo-thermal oxidation method at 375 °C (CTO-375) has been widely used to quantify black carbon (BC) in sediments. In the present study, CTO-375 was tested and adapted for application to soil, accounting for some matrix-specific properties like high organic carbon (≤39%) and carbonate (≤37%) content. Average recoveries of standard reference material SRM-2975 ranged from 25 to 86% for nine representative Swiss and Indian samples, which is similar to literature data for sediments. The adapted method was applied to selected samples of the Swiss soil monitoring network (NABO). BC content exhibited different patterns in three soil profiles, while the contribution of BC to TOC was greatest below the topsoil at all three sites, although at different depths (60-130 cm). Six different NABO sites exhibited largely constant BC concentrations over the last 25 years, with short-term (6 months) prevailing over long-term (5 years) temporal fluctuations. - Research highlights: → The CTO-375 method was adapted and validated for BC analysis in soils. → Method validation figures of merit proved satisfactory. → Application is shown with soil cores and topsoil temporal variability. → BC content can be elevated in subsurface soils. → BC contents in surface soils were largely constant over the last 25 years. - Although widely used also for soils, the chemo-thermal oxidation method at 375 °C to quantify black carbon has never been properly validated for this matrix before.

  7. Validation and uncertainty estimation of fast neutron activation analysis method for Cu, Fe, Al, Si elements in sediment samples

    International Nuclear Information System (INIS)

    Sunardi; Samin Prihatin

    2010-01-01

Validation and uncertainty estimation of the Fast Neutron Activation Analysis (FNAA) method for the Cu, Fe, Al and Si elements in sediment samples has been conducted. The aim of the research was to confirm whether the FNAA method still conforms to the ISO/IEC 17025:2005 standard. The research covered verification, performance assessment, validation of FNAA and uncertainty estimation. The SRM 8704 standard and sediment samples were weighed to certain weights, irradiated with 14 MeV fast neutrons and then counted using gamma spectrometry. The validation results for the Cu, Fe, Al and Si elements showed that the accuracy was in the range of 95.89-98.68%, while the precision was in the range of 1.13-2.29%. The estimated uncertainties for Cu, Fe, Al and Si were 2.67, 1.46, 1.71 and 1.20%, respectively. From these data, it can be concluded that the FNAA method is still reliable and valid for the analysis of element contents in samples, because the accuracy is above 95% and the precision is below 5%, while the uncertainties are relatively small and suitable for the 95% confidence level, where the maximum acceptable uncertainty is 5%. (author)
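The figures of merit reported here follow standard definitions: accuracy as percent recovery against a certified value, precision as relative standard deviation of replicates, and expanded uncertainty as U = k·u with coverage factor k = 2 at roughly the 95% confidence level. A sketch with hypothetical replicate data (the numbers are illustrative, not from the study):

```python
def accuracy_pct(measured_mean, certified):
    """Accuracy as percent recovery against a certified reference value."""
    return 100.0 * measured_mean / certified

def rsd_pct(values):
    """Precision as relative standard deviation (sample SD over mean), %."""
    m = sum(values) / len(values)
    s = (sum((v - m) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return 100.0 * s / m

def expanded_uncertainty(combined_u, k=2):
    """Expanded uncertainty U = k * u_c; k = 2 gives ~95% confidence."""
    return k * combined_u

# Hypothetical replicate Cu results (mg/kg) against a certified value 61.0
replicates = [60.1, 59.8, 60.5, 60.0, 60.4]
mean = sum(replicates) / len(replicates)
print(round(accuracy_pct(mean, 61.0), 2), round(rsd_pct(replicates), 2))
print(expanded_uncertainty(1.46))
```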

  8. Model validation of solar PV plant with hybrid data dynamic simulation based on fast-responding generator method

    Directory of Open Access Journals (Sweden)

    Zhao Dawei

    2016-01-01

    In recent years, a significant number of large-scale solar photovoltaic (PV) plants have been put into operation or been under planning around the world. The model accuracy of a solar PV plant is the key factor in investigating the mutual influences between solar PV plants and a power grid. However, this problem has not been well solved, especially regarding how to apply real measurements to validate the models of solar PV plants. Taking the fast-responding generator method as an example, this paper presents a model validation methodology for solar PV plants via hybrid data dynamic simulation. First, an implementation scheme of hybrid data dynamic simulation suitable for DIgSILENT PowerFactory software is proposed, and an analysis model of solar PV plant integration based on the IEEE 9 system is established. Finally, model validation of the solar PV plant is achieved by employing hybrid data dynamic simulation. The results illustrate the effectiveness of the proposed method in solar PV plant model validation.

  9. Reliability and validity of risk analysis

    International Nuclear Information System (INIS)

    Aven, Terje; Heide, Bjornar

    2009-01-01

    In this paper we investigate to what extent risk analysis meets the scientific quality requirements of reliability and validity. We distinguish between two types of approaches within risk analysis: relative frequency-based approaches and Bayesian approaches. The former category includes both traditional statistical inference methods and the so-called probability of frequency approach. Depending on the risk analysis approach, the aim of the analysis differs, the results are presented in different ways and, consequently, the meaning of the concepts of reliability and validity is not the same.

  10. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis.

    NARCIS (Netherlands)

    Steultjens, M.P.M.; Dekker, J.; Baar, M.E. van; Oostendorp, R.A.B.; Bijlsma, J.W.J.

    1999-01-01

    Objective: To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Methods: Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results

  11. A photographic method to measure food item intake. Validation in geriatric institutions.

    Science.gov (United States)

    Pouyet, Virginie; Cuvelier, Gérard; Benattar, Linda; Giboreau, Agnès

    2015-01-01

    From both a clinical and research perspective, measuring food intake is an important issue in geriatric institutions. However, weighing food in this context can be complex, particularly when the items remaining on a plate (side dish, meat or fish and sauce) need to be weighed separately following consumption. A method based on photography that involves taking photographs after a meal to determine food intake consequently seems to be a good alternative. This method enables the storage of raw data so that unhurried analyses can be performed to distinguish the food items present in the images. Therefore, the aim of this paper was to validate a photographic method to measure food intake in terms of differentiating food item intake in the context of a geriatric institution. Sixty-six elderly residents took part in this study, which was performed in four French nursing homes. Four dishes of standardized portions were offered to the residents during 16 different lunchtimes. Three non-trained assessors then independently estimated both the total and specific food item intakes of the participants using images of their plates taken after the meal (photographic method) and a reference image of one plate taken before the meal. Total food intakes were also recorded by weighing the food. To test the reliability of the photographic method, agreements between different assessors and agreements among various estimates made by the same assessor were evaluated. To test the accuracy and specificity of this method, food intake estimates for the four dishes were compared with the food intakes determined using the weighed food method. To illustrate the added value of the photographic method, food consumption differences between the dishes were explained by investigating the intakes of specific food items. 
Although the assessors were not specifically trained for this purpose, the results demonstrated that the estimates agreed between assessors and among various estimates made by the same

  12. A comparison of accuracy validation methods for genomic and pedigree-based predictions of swine litter size traits using Large White and simulated data.

    Science.gov (United States)

    Putz, A M; Tiezzi, F; Maltecca, C; Gray, K A; Knauer, M T

    2018-02-01

    The objective of this study was to compare and determine the optimal validation method when comparing accuracy from single-step GBLUP (ssGBLUP) to traditional pedigree-based BLUP. Field data included six litter size traits. Simulated data included ten replicates designed to mimic the field data in order to determine the method that was closest to the true accuracy. Data were split into training and validation sets. The methods used were as follows: (i) theoretical accuracy derived from the prediction error variance (PEV) of the direct inverse (iLHS), (ii) approximated accuracies from the accf90(GS) program in the BLUPF90 family of programs (Approx), (iii) correlation between predictions and the single-step GEBVs from the full data set (GEBV_Full), (iv) correlation between predictions and the corrected phenotypes of females from the full data set (Y_c), (v) correlation from method iv divided by the square root of the heritability (Y_ch) and (vi) correlation between sire predictions and the average of their daughters' corrected phenotypes (Y_cs). Accuracies from iLHS increased from 0.27 to 0.37 (37%) in the Large White. Approximation accuracies were very consistent and close in absolute value (0.41 to 0.43). Both iLHS and Approx were much less variable than the corrected phenotype methods (ranging from 0.04 to 0.27). On average, simulated data showed an increase in accuracy from 0.34 to 0.44 (29%) using ssGBLUP. Both iLHS and Y_ch approximated the increase well, 0.30 to 0.46 and 0.36 to 0.45, respectively. GEBV_Full performed poorly in both data sets and is not recommended. Results suggest that for within-breed selection, theoretical accuracy using PEV was consistent and accurate. When direct inversion is infeasible to get the PEV, correlating predictions to the corrected phenotypes divided by the square root of heritability is adequate given a large enough validation data set. © 2017 Blackwell Verlag GmbH.
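
    Validation method (v) above, correlating predictions with corrected phenotypes in the validation set and scaling by the square root of heritability, can be sketched as follows (a hedged illustration with invented data; the function names are our own):

```python
# Accuracy estimate: cor(EBV, corrected phenotype) / sqrt(h2).
import math

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def accuracy_ych(ebv, y_corrected, h2):
    """Correlation of predictions with corrected phenotypes, scaled by sqrt(h2)."""
    return pearson(ebv, y_corrected) / math.sqrt(h2)

ebv = [0.4, -0.1, 0.9, 0.2, -0.5, 0.7]   # hypothetical predictions
y_c = [1.1, -0.3, 2.0, 0.1, -1.2, 1.4]   # hypothetical corrected phenotypes
print(round(accuracy_ych(ebv, y_c, h2=0.10), 3))
```

    As the abstract notes, this estimator is only adequate given a large enough validation set, since the sampling error of the correlation is inflated by the division by sqrt(h2).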

  13. Development of a validated HPLC method for the quantitative determination of trelagliptin succinate and its related substances in pharmaceutical dosage forms.

    Science.gov (United States)

    Luo, Zhiqiang; Chen, Xinjing; Wang, Guopeng; Du, Zhibo; Ma, Xiaoyun; Wang, Hao; Yu, Guohua; Liu, Aoxue; Li, Mengwei; Peng, Wei; Liu, Yang

    2018-01-01

    Trelagliptin succinate is a dipeptidyl peptidase IV (DPP-4) inhibitor used as a new long-acting drug for once-weekly treatment of type 2 diabetes mellitus (DM). In the present study, a rapid, sensitive and accurate high-performance liquid chromatography (HPLC) method was developed and validated for the separation and determination of trelagliptin succinate and its eight potential process-related impurities. The chromatographic separation was achieved on a Waters Xselect CSH™ C18 (250 mm × 4.6 mm, 5.0 μm) column. The mobile phases consisted of 0.05% trifluoroacetic acid in water and acetonitrile containing 0.05% trifluoroacetic acid. The compounds of interest were monitored at 224 nm and 275 nm. The stability-indicating capability of the method was evaluated by performing stress test studies. Trelagliptin succinate was found to degrade significantly under acid, base, oxidative and thermal stress conditions and to be stable only under photolytic degradation conditions. The degradation products were well resolved from the main peak and its impurities. In addition, the major degradation impurities formed under acid, base, oxidative and thermal stress conditions were characterized by ultra-high-performance liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap). The method was validated to fulfill International Conference on Harmonisation (ICH) requirements; the validation included specificity, linearity, limit of detection (LOD), limit of quantification (LOQ), accuracy, precision and robustness. The developed method could be applied for routine quality control analysis of trelagliptin succinate tablets, since there is no official monograph. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Experimental validation of the building energy performance (PEC) assessment methods with reference to occupied-space heating

    Directory of Open Access Journals (Sweden)

    Cristian PETCU

    2010-01-01

    This paper is part of a series of pre-standardization research aimed at analyzing the existing methods of calculating the Building Energy Performance (PEC) with a view to their correction or completion. The overall research activity aims to experimentally validate the PEC calculation algorithm, as well as to comparatively apply, in several case studies focused on buildings representative of the Romanian building stock, the PEC calculation methodology for buildings equipped with occupied-space heating systems. The goals of the report are the experimental testing of the calculation models known so far (NP 048-2000, Mc 001-2006, SR EN 13790:2009) on the CE INCERC Bucharest experimental building, together with the complex calculation algorithms specific to dynamic modeling, for the evaluation of the occupied-space heat demand in the cold season, specific both to traditional buildings and to modern buildings equipped with solar-radiation passive systems of the ventilated solar space type. The schedule of the measurements performed in the 2008-2009 cold season is presented, as well as the primary processing of the measured data and the experimental validation of the monthly heat-demand calculation methods, based on the CE INCERC Bucharest building. The calculation error per heating season (153 days of measurements) between the measured heat demand and the calculated one was 0.61%, an exceptional value confirming the phenomenological nature of the INCERC method, NP 048-2006. The mathematical model specific to the hourly thermal balance is recurrent-decisional with alternating steps. The experimental validation of the theoretical model is based on measurements performed on the CE INCERC Bucharest building over a period of 57 days (06.01-04.03.2009).
The measurements performed on the CE INCERC Bucharest building confirm the accuracy of the hourly calculation model by comparison to the values

  15. Lactose, galactose and glucose determination in naturally "lactose free" hard cheese: HPAEC-PAD method validation.

    Science.gov (United States)

    Monti, Lucia; Negri, Stefano; Meucci, Aurora; Stroppa, Angelo; Galli, Andrea; Contarini, Giovanna

    2017-04-01

    A chromatographic method by HPAEC-PAD was developed and in-house validated for the quantification of low sugar levels in hard cheese, specifically Grana Padano PDO cheese. Particular attention was paid to the extraction procedure, due to residual microbial and enzymatic activities. Specificity in detection and linearity were verified. Recoveries ranged from 93% for lactose to 98% for glucose and galactose. The obtained LOD and LOQ values were, respectively, 0.25 and 0.41 mg/100 g for lactose, 0.14 and 0.27 mg/100 g for galactose, and 0.16 and 0.26 mg/100 g for glucose. The method was applied to 59 samples of Grana Padano PDO cheese: galactose showed the highest concentration and variability among the samples (1.36±0.89), compared to both lactose (0.45±0.12) and glucose (0.46±0.13). Considering the very low levels of sugars detected, authentic PDO Grana Padano could be safely included in the diet of people suffering from lactose intolerance. Copyright © 2016 Elsevier Ltd. All rights reserved.
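
    LOD and LOQ values such as those above can be estimated in several ways; one common ICH-style estimator uses the residual standard deviation (sigma) and slope (S) of a low-level calibration line, with LOD = 3.3·sigma/S and LOQ = 10·sigma/S. A hedged sketch with invented calibration data (the abstract does not state which estimator the authors used):

```python
# LOD/LOQ from the residual standard deviation of a calibration line.
import math

def fit_line(x, y):
    """Least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_loq(x, y):
    """LOD = 3.3*sigma/S, LOQ = 10*sigma/S from calibration residuals."""
    slope, intercept = fit_line(x, y)
    resid = [b - (slope * a + intercept) for a, b in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / (len(x) - 2))
    return 3.3 * sigma / slope, 10 * sigma / slope

conc = [0.1, 0.2, 0.4, 0.8, 1.6]       # mg/100 g, hypothetical standards
area = [5.1, 10.3, 19.8, 40.5, 80.2]   # detector response, hypothetical
lod, loq = lod_loq(conc, area)
print(f"LOD = {lod:.3f} mg/100 g, LOQ = {loq:.3f} mg/100 g")
```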

  16. CosmoQuest:Using Data Validation for More Than Just Data Validation

    Science.gov (United States)

    Lehan, C.; Gay, P.

    2016-12-01

    It is often taken for granted that different scientists completing the same task (e.g. mapping geologic features) will get the same results, and data validation is often skipped or under-utilized due to time and funding constraints. Robbins et al. (2014), however, demonstrated that this is a needed step, as large variation can exist even among collaborating team members completing straightforward tasks like marking craters. Data validation should be much more than a simple post-project verification of results. The CosmoQuest virtual research facility employs regular data validation for a variety of benefits, including real-time user feedback, real-time tracking to observe user activity while it is happening, and using pre-solved data to analyze users' progress and to help them retain skills. Some creativity in this area can drastically improve project results. We discuss methods of validating data in citizen science projects and outline the variety of uses for validation, which, when used properly, improves the scientific output of the project and the user experience for the citizens doing the work. More than just a tool for scientists, validation can assist users in both learning and retaining important information and skills, improving the quality and quantity of data gathered. Real-time analysis of user data can give key information on the effectiveness of the project that a broad glance would miss, and properly presenting that analysis is vital. Training users to validate their own data, or the data of others, can significantly improve the accuracy of misinformed or novice users.

  17. Development and validation of a thin-layer chromatography method for stability studies of naproxen

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Rodriguez Borges, Tania

    2011-01-01

    An analytical method was validated for application to stability studies of future formulations of naproxen suppositories for infant and adult use. The factors that most influenced naproxen stability were determined; the greatest degradation occurred in oxidizing acid medium and under the action of light. The possible formation of esters between the free carboxyl group present in naproxen and the glyceryl monostearate present in the base was identified as one of the degradation paths in the new formulation. The results were satisfactory. A thin-layer chromatography-based method was developed and the best chromatographic conditions were selected: GF 254 silica gel plates and ultraviolet visualization at 254 nm were employed. Three solvent systems were evaluated, of which system A, composed of glacial acetic acid:tetrahydrofuran:toluene (3:9:90 v/v/v), allowed adequate resolution between the analyte and the possible degradation products, with a detection limit of 1 μg. The use of the suggested method was restricted to the identification of possible degradation products, for qualitative purposes only and not as a final assay. The method proved sensitive and selective enough to be applied for the stated objective, according to the validation results

  18. Development and validation of Ketorolac Tromethamine in eye drop formulation by RP-HPLC method

    Directory of Open Access Journals (Sweden)

    G. Sunil

    2017-02-01

    A simple, precise and accurate method was developed and validated for the analysis of Ketorolac Tromethamine in an eye drop formulation. Isocratic HPLC analysis was performed on a Kromasil C18 column (150 mm × 4.6 mm, 5 μm). The compound was separated with a mixture of methanol and ammonium dihydrogen phosphate buffer in the ratio of 55:45 v/v, with pH 3.0 adjusted with o-phosphoric acid, as the mobile phase at a flow rate of 1.5 mL min−1. UV detection was performed at 314 nm using a photodiode array detector. The retention time was found to be 6.01 min. The system suitability parameters, such as theoretical plate count, tailing and percentage RSD between six standard injections, were within limits. The method was validated according to ICH guidelines. Calibration was linear over the concentration range of 50–150 μg mL−1, as indicated by a correlation coefficient (r) of 0.999. The robustness of the method was evaluated by deliberately altering the chromatographic conditions. The developed method is applicable for routine quantitative analysis.

  19. Validation of micro-CT against the section method regarding the assessment of marginal leakage of sealants.

    NARCIS (Netherlands)

    Chen, X.; Cuijpers, V.M.J.I.; Fan, M.W.; Frencken, J.E.F.M.

    2012-01-01

    BACKGROUND: The aim of this study was to validate the micro-CT and related software against the section method using the stereomicroscope for marginal leakage assessment along the sealant-enamel interface. METHODS: Pits and fissures of the occlusal surface of 10 teeth were sealed with a

  20. Stability indicating method development and validation of assay method for the estimation of rizatriptan benzoate in tablet

    Directory of Open Access Journals (Sweden)

    Chandrashekhar K. Gadewar

    2017-05-01

    A simple, sensitive, precise and specific high-performance liquid chromatography method was developed and validated for the determination of rizatriptan in rizatriptan benzoate tablets. The separation was carried out using a mobile phase consisting of acetonitrile and pH 3.4 phosphate buffer in a ratio of 20:80. The column used was a Zorbax SB CN (250 mm × 4.6 mm, 5 μm) with a flow rate of 1 mL/min and UV detection at 225 nm. The retention times of rizatriptan and benzoic acid were found to be 4.751 and 8.348 min, respectively. A forced degradation study of rizatriptan benzoate in its tablet form was conducted under conditions of hydrolysis, oxidation, heat and photolysis. Rizatriptan was found to be stable in basic buffer, while in acidic buffer (water bath at 60 °C for 15 min) it was found to degrade. The detector response of rizatriptan is directly proportional to concentration over the range of 30% to 160% of the test concentration, i.e. 15.032 to 80.172 μg/mL. Results of the analysis were validated statistically and by recovery studies (mean recovery = 99.44%). The results of the study showed that the proposed method is simple, rapid, precise and accurate, and useful for the routine determination of rizatriptan in pharmaceutical dosage forms.

  1. Method for appraising model validity of randomised controlled trials of homeopathic treatment: multi-rater concordance study

    Science.gov (United States)

    2012-01-01

    Background A method for assessing the model validity of randomised controlled trials of homeopathy is needed. To date, only conventional standards for assessing intrinsic bias (internal validity) of trials have been invoked, with little recognition of the special characteristics of homeopathy. We aimed to identify relevant judgmental domains to use in assessing the model validity of homeopathic treatment (MVHT). We define MVHT as the extent to which a homeopathic intervention and the main measure of its outcome, as implemented in a randomised controlled trial (RCT), reflect 'state-of-the-art' homeopathic practice. Methods Using an iterative process, an international group of experts developed a set of six judgmental domains, with associated descriptive criteria. The domains address: (I) the rationale for the choice of the particular homeopathic intervention; (II) the homeopathic principles reflected in the intervention; (III) the extent of homeopathic practitioner input; (IV) the nature of the main outcome measure; (V) the capability of the main outcome measure to detect change; (VI) the length of follow-up to the endpoint of the study. Six papers reporting RCTs of homeopathy of varying design were randomly selected from the literature. A standard form was used to record each assessor's independent response per domain, using the optional verdicts 'Yes', 'Unclear', 'No'. Concordance among the eight verdicts per domain, across all six papers, was evaluated using the kappa (κ) statistic. Results The six judgmental domains enabled MVHT to be assessed with 'fair' to 'almost perfect' concordance in each case. For the six RCTs examined, the method allowed MVHT to be classified overall as 'acceptable' in three, 'unclear' in two, and 'inadequate' in one. Conclusion Future systematic reviews of RCTs in homeopathy should adopt the MVHT method as part of a complete appraisal of trial validity. PMID:22510227
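
    The concordance evaluation described above, eight verdicts per domain across six papers, is a multi-rater setting for which Fleiss' kappa is one standard form of the kappa statistic. A minimal sketch (the paper does not specify which kappa variant was computed; the verdict counts below are invented):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa. counts[i][j] = number of raters placing item i in
    category j; every item must have the same number of ratings."""
    n = len(counts)                  # items (here: the six RCT papers)
    m = sum(counts[0])               # raters per item (here: eight)
    k = len(counts[0])               # categories (Yes / Unclear / No)
    # overall proportion of assignments to each category
    p_j = [sum(row[j] for row in counts) / (n * m) for j in range(k)]
    # per-item observed agreement
    p_i = [(sum(c * c for c in row) - m) / (m * (m - 1)) for row in counts]
    p_bar = sum(p_i) / n             # mean observed agreement
    p_e = sum(p * p for p in p_j)    # chance agreement
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical verdict counts for one judgmental domain across six papers:
domain_counts = [
    [8, 0, 0], [7, 1, 0], [6, 2, 0], [8, 0, 0], [2, 1, 5], [0, 2, 6],
]
print(round(fleiss_kappa(domain_counts), 3))
```

    Values of kappa are then read against the usual qualitative bands ('fair', 'moderate', ..., 'almost perfect') mentioned in the abstract.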

  2. A simple HPLC method for the determination of halcinonide in lipid nanoparticles: development, validation, encapsulation efficiency, and in vitro drug permeation

    Directory of Open Access Journals (Sweden)

    Clarissa Elize Lopes

    2017-06-01

    Halcinonide is a high-potency topical glucocorticoid used to treat skin inflammation that presents toxic systemic effects. A simple and quick analytical method to quantify the amount of halcinonide encapsulated in lipid nanoparticles, such as polymeric lipid-core nanoparticles and solid lipid nanoparticles, was developed and validated, and applied to determine the drug's encapsulation efficiency and in vitro permeation. The development and validation of the analytical method were carried out using high-performance liquid chromatography with UV detection at 239 nm. The validation parameters were specificity, linearity, precision and accuracy, limits of detection and quantitation, and robustness. The method used an isocratic flow rate of 1.0 mL·min−1 and a methanol:water (85:15 v/v) mobile phase, with a retention time of 4.21 min. The method was validated according to international and national regulations. The halcinonide encapsulation efficiency in nanoparticles was greater than 99%, and the in vitro drug permeation study showed that less than 9% of the drug permeated through the membrane, indicating a nanoparticle reservoir effect, which can reduce halcinonide's toxic systemic effects. These studies demonstrated the applicability of the developed and validated analytical method to quantify halcinonide in lipid nanoparticles.

  3. Validated Spectrophotometric Methods for Simultaneous Determination of Food Colorants and Sweeteners

    Directory of Open Access Journals (Sweden)

    Fatma Turak

    2013-01-01

    Two simple spectrophotometric methods have been proposed for the simultaneous determination of two colorants (Indigotin and Brilliant Blue) and two sweeteners (Acesulfame-K and Aspartame) in synthetic mixtures and chewing gums without any prior separation or purification. The first method, zero-crossing derivative spectrophotometry (ZCDS), is based on recording the first-derivative curves (for Indigotin, Brilliant Blue, and Acesulfame-K) and the third-derivative curve (for Aspartame) and determining each component using the zero-crossing technique. The other method, ratio derivative spectrophotometry (RDS), depends on applying ratio spectra of first- and third-derivative spectrophotometry to resolve the interference due to spectral overlapping. Both colorants and sweeteners showed good linearity, with regression coefficients of 0.9992–0.9999. The LOD and LOQ values ranged from 0.05 to 0.33 μg mL−1 and from 0.06 to 0.47 μg mL−1, respectively. The intraday and interday precision tests produced good RSD% values (<0.81%); recoveries ranged from 99.78% to 100.67% for both methods. The accuracy and precision of the methods have been determined, and the methods have been validated by analyzing synthetic mixtures containing the colorants and sweeteners. Both methods were applied to the above combination, and satisfactory results were obtained. The results obtained by applying the ZCDS method were statistically compared with those obtained by the RDS method.
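
    The zero-crossing idea behind derivative spectrophotometry can be illustrated numerically: at a wavelength where the derivative spectrum of one component is zero, the derivative of the mixture spectrum is due to the other component alone. A hedged sketch with invented Gaussian absorption bands (not the paper's spectra):

```python
# Zero-crossing derivative spectrophotometry on synthetic Gaussian bands.
import numpy as np

wl = np.linspace(400, 700, 601)                 # wavelength grid, nm
band = lambda c, mu, s: c * np.exp(-((wl - mu) ** 2) / (2 * s ** 2))

a = band(1.0, 520.0, 30.0)                      # component A spectrum
b = band(0.8, 580.0, 40.0)                      # component B spectrum
mix = a + b                                     # overlapping mixture

d_mix = np.gradient(mix, wl)                    # first derivative of mixture
d_b = np.gradient(b, wl)                        # first derivative of B
d_a = np.gradient(a, wl)                        # first derivative of A

# B's derivative crosses zero at its band maximum; at that wavelength the
# mixture derivative is attributable to component A alone.
i0 = np.argmin(np.abs(d_b))
print(wl[i0], d_mix[i0], d_a[i0])
```

    In practice the zero-crossing wavelength is located from the pure-standard spectrum, and a calibration line is built from the mixture derivative measured at that wavelength.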

  4. Validation of the Five-Phase Method for Simulating Complex Fenestration Systems with Radiance against Field Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Geisler-Moroder, David [Bartenbach GmbH, Aldrans (Austria); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ward, Gregory J. [Anyhere Software, Albany, NY (United States)

    2016-08-29

    The Five-Phase Method (5-pm) for simulating complex fenestration systems with Radiance is validated against field measurements. The capability of the method to predict workplane illuminances, vertical sensor illuminances, and glare indices derived from captured and rendered high dynamic range (HDR) images is investigated. To be able to accurately represent the direct sun part of the daylight not only in sensor point simulations, but also in renderings of interior scenes, the 5-pm calculation procedure was extended. The validation shows that the 5-pm is superior to the Three-Phase Method for predicting horizontal and vertical illuminance sensor values as well as glare indices derived from rendered images. Even with input data from global and diffuse horizontal irradiance measurements only, daylight glare probability (DGP) values can be predicted within 10% error of measured values for most situations.

  5. Validity of the Modified Baecke Questionnaire: comparison with energy expenditure according to the doubly labeled water method

    Directory of Open Access Journals (Sweden)

    Peeters Petra HM

    2008-05-01

    Abstract Background In epidemiological research, physical activity is usually assessed by questionnaires. Questionnaires are suitable for large study populations since they are relatively inexpensive and not very time consuming. However, questionnaire information is by definition subjective and prone to recall bias, especially among elderly subjects. The Modified Baecke Questionnaire, developed by Voorrips and coworkers, measures habitual physical activity in the elderly. The questionnaire includes questions on household activities, sports, and leisure time activities over a time period of one year. The Modified Baecke Questionnaire results in a score used to classify people as high, moderate, or low in daily physical activity, based on tertiles. Methods The validity of the Modified Baecke Questionnaire score was assessed among 21 elderly men and women using the doubly labeled water method as the reference criterion. This method is considered to be the gold standard for measuring energy expenditure in free-living individuals. Energy expenditure on physical activity is estimated by the ratio of total energy expenditure measured by the doubly labeled water method and resting metabolic rate measured by indirect calorimetry. This ratio is called the physical activity ratio. Results The Spearman correlation coefficient between the questionnaire score and the physical activity ratio (PAR) was 0.54 (95% CI 0.22–0.66). Correct classification by the questionnaire occurred in 71% of participants who were in the lowest tertile of PAR, in 14% of participants in the middle tertile, and in 43% of participants in the highest tertile. Subjects were not wrongly classified in an opposite tertile. Conclusion The validity of the Modified Baecke Questionnaire is fair-to-moderate. This study shows that the questionnaire can correctly classify individuals as low or high active, but does a poor job for moderately active individuals.
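
    The two summary statistics used above, the Spearman rank correlation between questionnaire score and PAR and the tertile cross-classification, can be sketched as follows (invented data; this simple rank function does not handle ties):

```python
# Spearman correlation and tertile agreement between two measures.
def ranks(v):
    """Ranks 1..n by value; ties are broken arbitrarily (sketch only)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank + 1.0
    return r

def spearman(x, y):
    """Spearman rho via the classical rank-difference formula."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def tertile(v):
    """Assign each value to tertile 0, 1 or 2 of its own distribution."""
    srt = sorted(v)
    cut1, cut2 = srt[len(v) // 3], srt[2 * len(v) // 3]
    return [0 if x < cut1 else (1 if x < cut2 else 2) for x in v]

score = [9, 14, 11, 22, 17, 25, 8, 19, 13]                # questionnaire scores
par = [1.2, 1.5, 1.4, 1.9, 1.6, 2.1, 1.3, 1.7, 1.45]      # hypothetical PAR

rho = spearman(score, par)
agree = sum(s == p for s, p in zip(tertile(score), tertile(par))) / len(score)
print(round(rho, 2), round(agree, 2))
```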

  6. A Method for Ship Collision Damage and Energy Absorption Analysis and its Validation

    DEFF Research Database (Denmark)

    Zhang, Shengming; Pedersen, Preben Terndrup

    2016-01-01

For design evaluation there is a need for a method which is fast, practical and yet accurate enough to determine the absorbed energy and collision damage extent in ship collision analysis. The most well-known simplified empirical approach to collision analysis was probably made by Minorsky, and its limitation is also well recognized. The authors have previously developed simple expressions for the relation between the absorbed energy and the damaged material volume which take into account the structural arrangements, the material properties and the damage modes. The purpose of the present paper is to re-examine this method's validity and accuracy for ship collision damage analysis in ship design assessments by comprehensive validations with experimental results from the public domain. Twenty experimental tests have been selected, analysed and compared with the results calculated using the proposed method. It can

  7. Validation of an improved abnormality insertion method for medical image perception investigations

    Science.gov (United States)

    Madsen, Mark T.; Durst, Gregory R.; Caldwell, Robert T.; Schartz, Kevin M.; Thompson, Brad H.; Berbaum, Kevin S.

    2009-02-01

    The ability to insert abnormalities in clinical tomographic images makes image perception studies with medical images practical. We describe a new insertion technique and its experimental validation that uses complementary image masks to select an abnormality from a library and place it at a desired location. The method was validated using a 4-alternative forced-choice experiment. For each case, four quadrants were simultaneously displayed consisting of 5 consecutive frames of a chest CT with a pulmonary nodule. One quadrant was unaltered, while the other 3 had the nodule from the unaltered quadrant artificially inserted. 26 different sets were generated and repeated with order scrambling for a total of 52 cases. The cases were viewed by radiology staff and residents who ranked each quadrant by realistic appearance. On average, the observers were able to correctly identify the unaltered quadrant in 42% of cases, and identify the unaltered quadrant both times it appeared in 25% of cases. Consensus, defined by a majority of readers, correctly identified the unaltered quadrant in only 29% of 52 cases. For repeats, the consensus observer successfully identified the unaltered quadrant only once. We conclude that the insertion method can be used to reliably place abnormalities in perception experiments.

  8. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software in Young People with Down Syndrome.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2016-05-01

    People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.
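
    Test-retest reliability of this kind is commonly summarized with an intraclass correlation coefficient. A minimal sketch of a one-way random-effects ICC(1,1) on invented repeated footprint-index measurements (the abstract does not state which ICC form was used):

```python
# One-way random-effects ICC(1,1) for test-retest reliability.
def icc_oneway(data):
    """data[i] = list of k repeated measurements for subject i."""
    n = len(data)            # subjects (here: footprints)
    k = len(data[0])         # repeated measurements per subject
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    # between-subject and within-subject mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - row_means[i]) ** 2
              for i, row in enumerate(data) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical index values, three repeated measurements per footprint:
measurements = [
    [43.1, 43.3, 43.0],
    [38.7, 38.9, 38.8],
    [51.2, 51.0, 51.3],
    [45.6, 45.5, 45.8],
]
print(round(icc_oneway(measurements), 3))
```

    Values above roughly 0.9, like the 0.982-0.995 range reported in the abstract, are conventionally read as excellent reliability.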

  9. Development and validation of an automated, microscopy-based method for enumeration of groups of intestinal bacteria

    NARCIS (Netherlands)

    Jansen, GJ; Wildeboer-Veloo, ACM; Tonk, RHJ; Franks, AH; Welling, G

    An automated microscopy-based method using fluorescently labelled 16S rRNA-targeted oligonucleotide probes directed against the predominant groups of intestinal bacteria was developed and validated. The method makes use of the Leica 600HR. image analysis system, a Kodak MegaPlus camera model 1.4 and

  10. The Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE): Construct and Content Validation Using a Modified Delphi Method.

    Science.gov (United States)

    Paquette-Warren, Jann; Tyler, Marie; Fournie, Meghan; Harris, Stewart B

    2017-06-01

    In order to scale-up successful innovations, more evidence is needed to evaluate programs that attempt to address the rising prevalence of diabetes and the associated burdens on patients and the healthcare system. This study aimed to assess the construct and content validity of the Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE), a tool developed to guide the evaluation, design and implementation with built-in knowledge translation principles. A modified Delphi method, including 3 individual rounds (questionnaire with 7-point agreement/importance Likert scales and/or open-ended questions) and 1 group round (open discussion) were conducted. Twelve experts in diabetes, research, knowledge translation, evaluation and policy from Canada (Ontario, Quebec and British Columbia) and Australia participated. Quantitative consensus criteria were an interquartile range of ≤1. Qualitative data were analyzed thematically and confirmed by participants. An importance scale was used to determine a priority multi-level indicator set. Items rated very or extremely important by 80% or more of the experts were reviewed in the final group round to build the final set. Participants reached consensus on the content and construct validity of DEFINE, including its title, overall goal, 5-step evaluation approach, medical and nonmedical determinants of health schematics, full list of indicators and associated measurement tools, priority multi-level indicator set and next steps in DEFINE's development. Validated by experts, DEFINE has the right theoretic components to evaluate comprehensively diabetes prevention and management programs and to support acquisition of evidence that could influence the knowledge translation of innovations to reduce the burden of diabetes. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Validation of an Instrument to Measure High School Students' Attitudes toward Fitness Testing

    Science.gov (United States)

    Mercier, Kevin; Silverman, Stephen

    2014-01-01

    Purpose: The purpose of this investigation was to develop an instrument that has scores that are valid and reliable for measuring students' attitudes toward fitness testing. Method: The method involved the following steps: (a) an elicitation study, (b) item development, (c) a pilot study, and (d) a validation study. The pilot study included 427…

  12. Risk-based criteria to support validation of detection methods for drinking water and air.

    Energy Technology Data Exchange (ETDEWEB)

    MacDonell, M.; Bhattacharyya, M.; Finster, M.; Williams, M.; Picel, K.; Chang, Y.-S.; Peterson, J.; Adeshina, F.; Sonich-Mullin, C.; Environmental Science Division; EPA

    2009-02-18

    This report was prepared to support the validation of analytical methods for threat contaminants under the U.S. Environmental Protection Agency (EPA) National Homeland Security Research Center (NHSRC) program. It is designed to serve as a resource for certain applications of benchmark and fate information for homeland security threat contaminants. The report identifies risk-based criteria from existing health benchmarks for drinking water and air for potential use as validation targets. The focus is on benchmarks for chronic public exposures. The priority sources are standard EPA concentration limits for drinking water and air, along with oral and inhalation toxicity values. Many contaminants identified as homeland security threats to drinking water or air would convert to other chemicals within minutes to hours of being released. For this reason, a fate analysis has been performed to identify potential transformation products and removal half-lives in air and water so appropriate forms can be targeted for detection over time. The risk-based criteria presented in this report to frame method validation are expected to be lower than actual operational targets based on realistic exposures following a release. Note that many target criteria provided in this report are taken from available benchmarks without assessing the underlying toxicological details. That is, although the relevance of the chemical form and analogues are evaluated, the toxicological interpretations and extrapolations conducted by the authoring organizations are not. It is also important to emphasize that such targets in the current analysis are not health-based advisory levels to guide homeland security responses. This integrated evaluation of chronic public benchmarks and contaminant fate has identified more than 200 risk-based criteria as method validation targets across numerous contaminants and fate products in drinking water and air combined. The gap in directly applicable values is

  13. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard methods from image analysis non-optimal. Furthermore, validation of adapted computer vision methods proves to be difficult due to missing ground truth information. There is no widely accepted software phantom in the community and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy regarding the resulting textures.
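    The record does not specify the authors' speckle model, but a common simple choice for simulating tissue-dependent ultrasound speckle is multiplicative Rayleigh-distributed noise. The following sketch assumes that model; the tissue classes and echogenicity values are invented for illustration.

```python
import math
import random

random.seed(42)

def rayleigh(sigma):
    """Draw a Rayleigh-distributed sample via inverse-transform sampling."""
    u = random.random()
    return sigma * math.sqrt(-2.0 * math.log(1.0 - u))

def speckled_pixel(mean_echogenicity, sigma=1.0):
    """Multiplicative speckle: brighter tissue scatters more strongly."""
    return mean_echogenicity * rayleigh(sigma)

# Two illustrative tissue classes with different mean echogenicity
muscle = [speckled_pixel(0.4) for _ in range(50_000)]
fat = [speckled_pixel(0.8) for _ in range(50_000)]

# Rayleigh(sigma) has mean sigma*sqrt(pi/2), so the sample means differ ~2x
print(f"muscle mean = {sum(muscle) / len(muscle):.3f}")
print(f"fat mean    = {sum(fat) / len(fat):.3f}")
```

Statistical checks like the paper's (comparing moments of simulated and real textures) would operate on exactly such sample distributions.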

  14. The Signal Validation method of Digital Process Instrumentation System on signal conditioner for SMART

    International Nuclear Information System (INIS)

    Moon, Hee Gun; Park, Sang Min; Kim, Jung Seon; Shon, Chang Ho; Park, Heui Youn; Koo, In Soo

    2005-01-01

    The function of the PIS (Process Instrumentation System) for SMART is to acquire process data from sensors or transmitters. The PIS consists of a signal conditioner, an A/D converter, a DSP (Digital Signal Processor) and a NIC (Network Interface Card), so it is a fully digital system downstream of the A/D converter. In a commercial plant, the PI cabinet and the PDAS (Plant Data Acquisition System) are responsible for acquiring data from sensors and transmitters, including RTD, TC, level, flow, pressure and so on. The PDAS has software that processes each sensor's data, and the PI cabinet holds the signal conditioners needed for maintenance and testing. A signal conditioner has potentiometers to adjust span and zero for test and maintenance. The PIS of SMART also has a signal conditioner with span and zero adjustment, as in a commercial plant, because the signal conditioner conditions the signal for the A/D converter into a range such as 0∼10 Vdc. However, adjusting span and zero is a manual test and calibration procedure, so this paper presents a method of signal validation and calibration that exploits the digital features of SMART. The converter types include I/E (current to voltage), R/E (resistance to voltage), F/E (frequency to voltage) and V/V (voltage to voltage). This paper presents the signal validation and calibration only for the I/E converter, which converts level, pressure and flow signals of 4∼20 mA into 0∼10 Vdc signals for A/D conversion.

  15. The X-Ray Pebble Recirculation Experiment (X-PREX): Facility Description, Preliminary Discrete Element Method Simulation Validation Studies, and Future Test Program

    International Nuclear Information System (INIS)

    Laufer, Michael R.; Bickel, Jeffrey E.; Buster, Grant C.; Krumwiede, David L.; Peterson, Per F.

    2014-01-01

    This paper presents a facility description, preliminary results, and future test program of the new X-Ray Pebble Recirculation Experiment (X-PREX), which is now operational and being used to collect data on the behavior of slow dense granular flows relevant to pebble bed reactor core designs. The X-PREX facility uses digital x-ray tomography methods to track both the translational and rotational motion of spherical pebbles, which provides unique experimental results that can be used to validate discrete element method (DEM) simulations of pebble motion. The validation effort supported by the X-PREX facility provides a means to build confidence in analysis of pebble bed configuration and residence time distributions that impact the neutronics, thermal hydraulics, and safety analysis of pebble bed reactor cores. Preliminary experimental and DEM simulation results are reported for silo drainage, a classical problem in the granular flow literature, at several hopper angles. These studies include conventional converging and novel diverging geometries that provide additional flexibility in the design of pebble bed reactor cores. Excellent agreement is found between the X-PREX experimental and DEM simulation results. Finally, this paper discusses additional studies in progress relevant to the design and analysis of pebble bed reactor cores including pebble recirculation in cylindrical core geometries and evaluation of forces on shut down blades inserted directly into a packed pebble bed. (author)

  16. Summary of Validation of Multi-Pesticide Methods for Various Pesticide Formulations

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    The validation of multi-pesticide methods applicable for various types of pesticide formulations is treated. In a worked-out practical example, i.e. lambda cyhalothrin, the theoretical considerations outlined in the General Guidance section are put into practice. GC conditions, selection of an internal standard and criteria for an acceptable repeatability of injections are outlined, followed by sample preparation, calibration, batch analysis and confirmation of results through comparison using different separation columns. Complete sets of data are displayed in tabular form for other pesticide active ingredients and real formulations. (author)

  17. TRANSAT: a method for detecting the conserved helices of functional RNA structures, including transient, pseudo-knotted and alternative structures.

    Science.gov (United States)

    Wiebe, Nicholas J P; Meyer, Irmtraud M

    2010-06-24

    The prediction of functional RNA structures has attracted increased interest, as it allows us to study the potential functional roles of many genes. RNA structure prediction methods, however, assume that there is a unique functional RNA structure and also do not predict functional features required for in vivo folding. In order to understand how functional RNA structures form in vivo, we require sophisticated experiments or reliable prediction methods. So far, there exist only a few experimentally validated transient RNA structures. On the computational side, there exist several computer programs which aim to predict the co-transcriptional folding pathway in vivo, but these make a range of simplifying assumptions and do not capture all features known to influence RNA folding in vivo. We want to investigate if evolutionarily related RNA genes fold in a similar way in vivo. To this end, we have developed a new computational method, Transat, which detects conserved helices of high statistical significance. We introduce the method, present a comprehensive performance evaluation and show that Transat is able to predict the structural features of known reference structures including pseudo-knotted ones as well as those of known alternative structural configurations. Transat can also identify unstructured sub-sequences bound by other molecules and provides evidence for new helices which may define folding pathways, supporting the notion that homologous RNA sequences not only assume a similar reference RNA structure, but also fold similarly. Finally, we show that the structural features predicted by Transat differ from those assuming thermodynamic equilibrium. Unlike the existing methods for predicting folding pathways, our method works in a comparative way. This has the disadvantage of not being able to predict features as function of time, but has the considerable advantage of highlighting conserved features and of not requiring a detailed knowledge of the cellular

  18. Method for Determining Volumetric Efficiency and Its Experimental Validation

    Directory of Open Access Journals (Sweden)

    Ambrozik Andrzej

    2017-12-01

    Full Text Available Modern means of transport are basically powered by piston internal combustion engines. Increasingly rigorous demands are placed on IC engines in order to minimise the detrimental impact they have on the natural environment. That stimulates the development of research on piston internal combustion engines. The research involves experimental and theoretical investigations carried out using computer technologies. While being filled, the cylinder is considered to be an open thermodynamic system, in which non-stationary processes occur. To make calculations of thermodynamic parameters of the engine operating cycle, based on the comparison of cycles, it is necessary to know the mean constant value of cylinder pressure throughout this process. Because of the character of in-cylinder pressure pattern and difficulties in pressure experimental determination, in the present paper, a novel method for the determination of this quantity was presented. In the new approach, the iteration method was used. In the method developed for determining the volumetric efficiency, the following equations were employed: the law of conservation of the amount of substance, the first law of thermodynamics for open system, dependences for changes in the cylinder volume vs. the crankshaft rotation angle, and the state equation. The results of calculations performed with this method were validated by means of experimental investigations carried out for a selected engine at the engine test bench. A satisfactory congruence of computational and experimental results as regards determining the volumetric efficiency was obtained. The method for determining the volumetric efficiency presented in the paper can be used to investigate the processes taking place in the cylinder of an IC engine.

  19. Comparison of the Effects of Cross-validation Methods on Determining Performances of Classifiers Used in Diagnosing Congestive Heart Failure

    Directory of Open Access Journals (Sweden)

    Isler Yalcin

    2015-08-01

    Full Text Available Congestive heart failure (CHF) occurs when the heart is unable to provide sufficient pump action to maintain blood flow to meet the needs of the body. Early diagnosis is important since the mortality rate of the patients with CHF is very high. There are different validation methods to measure performances of classifier algorithms designed for this purpose. In this study, k-fold and leave-one-out cross-validation methods were tested for performance measures of five distinct classifiers in the diagnosis of the patients with CHF. Each algorithm was run 100 times and the average and the standard deviation of classifier performances were recorded. As a result, it was observed that average performance was enhanced and the variability of performances was decreased when the number of data sections used in the cross-validation method was increased.
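    The two validation schemes compared in this record differ only in how the data are partitioned: leave-one-out is k-fold cross-validation with k equal to the sample size. A minimal sketch of the index partitioning is below; the classifiers and CHF data of the study are not reproduced here.

```python
def k_fold_splits(n_samples, k):
    """Yield (train_idx, test_idx) pairs; leave-one-out is k = n_samples."""
    indices = list(range(n_samples))
    # Distribute the remainder so fold sizes differ by at most one
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

n = 12
for k in (3, n):                      # 3-fold vs leave-one-out
    folds = [test for _, test in k_fold_splits(n, k)]
    covered = sorted(i for fold in folds for i in fold)
    label = "leave-one-out" if k == n else f"{k}-fold"
    print(f"{label}: {len(folds)} folds, every sample tested once: {covered == list(range(n))}")
```

Increasing k gives each training set more data (consistent with the enhanced average performance reported above) at the cost of more model fits.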

  20. [Method for evaluating the competence of specialists--the validation of 360-degree-questionnaire].

    Science.gov (United States)

    Nørgaard, Kirsten; Pedersen, Juri; Ravn, Lisbeth; Albrecht-Beste, Elisabeth; Holck, Kim; Fredløv, Maj; Møller, Lars Krag

    2010-04-19

    Assessment of physicians' performance focuses on the quality of their work. The aim of this study was to develop a valid, usable and acceptable multisource feedback assessment tool (MFAT) for hospital consultants. Statements were produced on consultant competencies within non-medical areas like collaboration, professionalism, communication, health promotion, academics and administration. The statements were validated by physicians and later by non-physician professionals after adjustments had been made. In a pilot test, a group of consultants was assessed using the final collection of statements of the MFAT. They received a report with their personal results and subsequently evaluated the assessment method. In total, 66 statements were developed and after validation they were reduced and reformulated to 35. Mean scores for relevance and "easy to understand" of the statements were in the range between "very high degree" and "high degree". In the pilot test, 18 consultants were assessed by themselves, by 141 other physicians and by 125 other professionals in the hospital. About two thirds greatly benefited of the assessment report and half identified areas for personal development. About a third did not want the head of their department to know the assessment results directly; however, two thirds found a potential value in discussing the results with the head. We developed an MFAT for consultants with relevant and understandable statements. A pilot test confirmed that most of the consultants gained from the assessment, but some did not like to share their results with their heads. For these specialists other methods should be used.

  1. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazards-ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730

  2. Validation of the actuator line method using near wake measurements of the MEXICO rotor

    DEFF Research Database (Denmark)

    Nilsson, Karl; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2015-01-01

    The purpose of the present work is to validate the capability of the actuator line method to compute vortex structures in the near wake behind the MEXICO experimental wind turbine rotor. In the MEXICO project/MexNext Annex, particle image velocimetry measurements have made it possible to determine...

  3. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of

  4. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    Science.gov (United States)

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R[superscript 2] = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
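    For reference, the Bland-Altman limits of agreement discussed in this record are computed as the mean of the paired differences ± 1.96 standard deviations of those differences. A minimal sketch with hypothetical paired maximum-oxygen-uptake estimates (not the study's 1,158-man dataset):

```python
import math

def bland_altman(method_a, method_b):
    """Return (bias, lower_loa, upper_loa) from paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired VO2max estimates (mL/kg/min) from two equations
a = [42.1, 38.5, 51.0, 47.3, 44.8, 39.9, 49.2, 45.5]
b = [41.5, 39.2, 50.1, 48.0, 44.0, 40.5, 48.6, 46.1]
bias, lo, hi = bland_altman(a, b)
print(f"bias = {bias:.2f}, 95% limits of agreement = [{lo:.2f}, {hi:.2f}]")
```

The record's point is that when one "method" is a regression fitted to the other, the differences are correlated with the magnitude by construction, which biases these limits.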

  5. Development and validation of HPLC analytical method for quantitative determination of metronidazole in human plasma

    International Nuclear Information System (INIS)

    Safdar, K.A.; Shyum, S.B.; Usman, S.

    2016-01-01

    The objective of the present study was to develop a simple, rapid and sensitive reversed-phase high performance liquid chromatographic (RP-HPLC) analytical method with a UV detection system for the quantitative determination of metronidazole in human plasma. The chromatographic separation was performed using a C18 RP column (250 mm × 4.6 mm, 5 µm) as the stationary phase and 0.01 M potassium dihydrogen phosphate buffered at pH 3.0 and acetonitrile (83:17, v/v) as the mobile phase at a flow rate of 1.0 ml/min. The UV detection was carried out at 320 nm. The method was validated as per the US FDA guideline for bioanalytical method validation and was found to be selective, without interferences from mobile phase components, impurities and the biological matrix. The method was found to be linear over the concentration range of 0.2812 µg/ml to 18.0 µg/ml (r² = 0.9987) with an adequate level of accuracy and precision. The samples were found to be stable under various recommended laboratory and storage conditions. Therefore, the method can be used with an adequate level of confidence and assurance for bioavailability, bioequivalence and other pharmacokinetic studies of metronidazole in humans. (author)
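    The linearity figure reported in this record comes from an ordinary least-squares fit of detector response against concentration. A minimal sketch of such a calibration fit with invented data points (the study's actual peak areas are not given in the record):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope, intercept and r^2 for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical calibration standards (µg/mL) and detector peak areas
conc = [0.28, 1.0, 2.0, 4.5, 9.0, 18.0]
area = [15.1, 52.0, 104.8, 233.5, 470.2, 940.9]
slope, intercept, r2 = linear_fit(conc, area)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, r^2 = {r2:.4f}")
```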

  6. Validation and application of an high-order spectral difference method for flow induced noise simulation

    KAUST Repository

    Parsani, Matteo; Ghorbaniasl, Ghader; Lacor, C.

    2011-01-01

    The method is based on the Ffowcs Williams-Hawkings approach, which provides noise contributions for monopole, dipole and quadrupole acoustic sources. This paper will focus on the validation and assessment of this hybrid approach using different test cases

  7. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    Science.gov (United States)

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  8. Validation of uncertainty of weighing in the preparation of radionuclide standards by Monte Carlo Method

    International Nuclear Information System (INIS)

    Cacais, F.L.; Delgado, J.U.; Loayza, V.M.

    2016-01-01

    In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure the quantity activity by mass. The gravimetric method by elimination is applied to perform weighings with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin according to the ISO GUM for the method by elimination is validated by the Monte Carlo method. The results obtained by both uncertainty calculation methods were consistent, indicating that the conditions for the application of the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
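    The validation described here compares a GUM-style combined standard uncertainty against a Monte Carlo propagation of the input distributions. A minimal sketch for a simple mass-by-difference model; the readings and uncertainties are invented, not the values used by Lourenco and Bobin.

```python
import math
import random

random.seed(7)

# Illustrative model: dispensed mass m = r1 - r2 (weighing by elimination),
# with independent balance readings r1, r2 and standard uncertainties u1, u2.
r1, u1 = 10.52013, 2.0e-5   # g (hypothetical values)
r2, u2 = 8.31542, 2.0e-5    # g

# GUM: combined standard uncertainty of m by quadrature (linear model)
u_gum = math.sqrt(u1**2 + u2**2)

# Monte Carlo: propagate the input distributions through the model
samples = [random.gauss(r1, u1) - random.gauss(r2, u2) for _ in range(200_000)]
mean = sum(samples) / len(samples)
u_mc = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))

print(f"GUM u_c = {u_gum:.2e} g, Monte Carlo u = {u_mc:.2e} g")
```

For a linear model with Gaussian inputs the two estimates agree, which is the kind of consistency the record reports; Monte Carlo becomes decisive when the model is nonlinear or the inputs are non-Gaussian.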

  9. Fuzzy decision-making: a new method in model selection via various validity criteria

    International Nuclear Information System (INIS)

    Shakouri Ganjavi, H.; Nikravesh, K.

    2001-01-01

    Modeling is considered the first step in scientific investigations. Several alternative models may be candidates to express a phenomenon. Scientists use various criteria to select one model from among the competing models. Based on the solution of a Fuzzy Decision-Making problem, this paper proposes a new method of model selection. The method enables the scientist to apply all desired validity criteria systematically by defining a proper Possibility Distribution Function for each criterion. Finally, minimization of a utility function composed of the Possibility Distribution Functions determines the best selection. The method is illustrated through a modeling example for the Average Daily Time Duration of Electrical Energy Consumption in Iran
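    As a rough illustration of this kind of scheme, each validity criterion can be mapped through a membership function to a possibility value in [0, 1] and the candidates scored by an aggregate; the criteria, membership shapes, models and min-aggregation below are all hypothetical choices, not the paper's actual formulation.

```python
def membership_smaller_better(value, best, worst):
    """Map a criterion value to [0, 1]: 1 at `best`, 0 at `worst`, linear between."""
    if value <= best:
        return 1.0
    if value >= worst:
        return 0.0
    return (worst - value) / (worst - best)

# Hypothetical candidate models with two smaller-is-better validity criteria:
# prediction error and model complexity (number of parameters).
models = {
    "AR(2)": {"error": 0.12, "params": 2},
    "AR(5)": {"error": 0.09, "params": 5},
    "NN-10": {"error": 0.08, "params": 10},
}

def overall_possibility(m):
    # Min-aggregation: a model is only as good as its worst criterion
    return min(
        membership_smaller_better(m["error"], best=0.05, worst=0.20),
        membership_smaller_better(m["params"], best=1, worst=12),
    )

best_model = max(models, key=lambda name: overall_possibility(models[name]))
print(f"selected model: {best_model}")
```

Here the middle model wins: it trades a little accuracy for much lower complexity, which is exactly the balance a composed utility is meant to capture.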

  10. Development and validation of a stability-indicating capillary zone electrophoretic method for the assessment of entecavir and its correlation with liquid chromatographic methods.

    Science.gov (United States)

    Dalmora, Sergio Luiz; Nogueira, Daniele Rubert; D'Avila, Felipe Bianchini; Souto, Ricardo Bizogne; Leal, Diogo Paim

    2011-01-01

    A stability-indicating capillary zone electrophoresis (CZE) method was validated for the analysis of entecavir in pharmaceutical formulations, using nimesulide as an internal standard. A fused-silica capillary (50 µm i.d.; effective length, 40 cm) was used while being maintained at 25°C; the applied voltage was 25 kV. A background electrolyte solution consisted of a 20 mM sodium tetraborate solution at pH 10. Injections were performed using a pressure mode at 50 mbar for 5 s, with detection at 216 nm. The specificity and stability-indicating capability were proven through forced degradation studies, evaluating also the in vitro cytotoxicity test of the degraded products. The method was linear over the concentration range of 1-200 µg mL(-1) (r(2) = 0.9999), and was applied for the analysis of entecavir in tablet dosage forms. The results were correlated to those of validated conventional and fast LC methods, showing non-significant differences (p > 0.05).

  11. Validation of lumbar spine loading from a musculoskeletal model including the lower limbs and lumbar spine.

    Science.gov (United States)

    Actis, Jason A; Honegger, Jasmin D; Gates, Deanna H; Petrella, Anthony J; Nolasco, Luis A; Silverman, Anne K

    2018-02-08

    Low back mechanics are important to quantify to study injury, pain and disability. As in vivo forces are difficult to measure directly, modeling approaches are commonly used to estimate these forces. Validation of model estimates is critical to gain confidence in modeling results across populations of interest, such as people with lower-limb amputation. Motion capture, ground reaction force and electromyographic data were collected from ten participants without an amputation (five male/five female) and five participants with a unilateral transtibial amputation (four male/one female) during trunk-pelvis range of motion trials in flexion/extension, lateral bending and axial rotation. A musculoskeletal model with a detailed lumbar spine and the legs including 294 muscles was used to predict L4-L5 loading and muscle activations using static optimization. Model estimates of L4-L5 intervertebral joint loading were compared to measured intradiscal pressures from the literature and muscle activations were compared to electromyographic signals. Model loading estimates were only significantly different from experimental measurements during trunk extension for males without an amputation and for people with an amputation, which may suggest a greater portion of L4-L5 axial load transfer through the facet joints, as facet loads are not captured by intradiscal pressure transducers. Pressure estimates between the model and previous work were not significantly different for flexion, lateral bending or axial rotation. Timing of model-estimated muscle activations compared well with electromyographic activity of the lumbar paraspinals and upper erector spinae. Validated estimates of low back loading can increase the applicability of musculoskeletal models to clinical diagnosis and treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. A Validated Stability-Indicating HPLC Method for Simultaneous Determination of Amoxicillin and Enrofloxacin Combination in an Injectable Suspension

    Directory of Open Access Journals (Sweden)

    Nidal Batrawi

    2017-02-01

    Full Text Available The combination of amoxicillin and enrofloxacin is a well-known mixture of veterinary drugs; it is used for the treatment of Gram-positive and Gram-negative bacteria. In the scientific literature, there is no high-performance liquid chromatography (HPLC-UV) method for the simultaneous determination of this combination. The objective of this work is to develop and validate an HPLC method for the determination of this combination. In this regard, a new, simple and efficient reversed-phase HPLC method for simultaneous qualitative and quantitative determination of amoxicillin and enrofloxacin, in an injectable preparation with a mixture of inactive excipients, has been developed and validated. The HPLC separation was performed using a reversed-phase C18e (250 mm × 4.0 mm, 5 μm) column at room temperature, with a gradient mobile phase of acetonitrile and phosphate buffer containing methanol at pH 5.0, a flow rate of 0.8 mL/min and ultraviolet detection at 267 nm. This method was validated in accordance with the Food and Drug Administration (FDA) and the International Conference on Harmonisation (ICH) guidelines and showed excellent linearity, accuracy, precision, specificity, robustness, ruggedness, and system suitability results within the acceptance criteria. A stability-indicating study was also carried out and indicated that this method can also be used for purity and degradation evaluation of these formulations.

  13. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma

    Directory of Open Access Journals (Sweden)

    Ana Paula Barbosa do Carmo

    Full Text Available Abstract INTRODUCTION: Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in blood plasma was developed and validated. METHODS: After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d., 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min-1, the oven temperature was 50 °C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL-1, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg·kg-1 (oral) PQ diphosphate. RESULTS: By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than those of previously reported HPLC methods with DAD-UV detection. CONCLUSIONS: The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.

  14. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    International Nuclear Information System (INIS)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-01-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by a laboratory instrument under software control. This provides improved sensitivity and high frequency performance compared to conventional coils, so that swept frequency EC measurements up to 50 MHz can be made to achieve the smallest skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities, which were found to agree with those measured using the direct current potential drop (DCPD) method.
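
    The quoted 80 μm minimum skin depth can be reproduced from the standard eddy-current skin-depth formula δ = 1/√(π f μ σ). The conductivity used below (≈0.8 MS/m, with μr ≈ 1 for a paramagnetic nickel-based superalloy) is an assumed typical value, not one stated in the record; a minimal sketch:

```python
import math

def skin_depth(freq_hz, sigma, mu_r=1.0):
    """Eddy-current skin depth: delta = 1 / sqrt(pi * f * mu * sigma)."""
    mu = mu_r * 4e-7 * math.pi  # absolute permeability (H/m)
    return 1.0 / math.sqrt(math.pi * freq_hz * mu * sigma)

# Assumed conductivity ~0.8 MS/m for a nickel-based superalloy, at 50 MHz
delta = skin_depth(50e6, 0.8e6)
print(f"{delta * 1e6:.0f} um")  # ~80 um
```

    At lower sweep frequencies the same formula gives proportionally larger depths, which is what lets the swept measurement probe the conductivity profile layer by layer.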

  15. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    Science.gov (United States)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-03-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by a laboratory instrument under software control. This provides improved sensitivity and high frequency performance compared to conventional coils, so that swept frequency EC measurements up to 50 MHz can be made to achieve the smallest skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities, which were found to agree with those measured using the direct current potential drop (DCPD) method.

  16. Validated modified Lycopodium spore method development for standardisation of ingredients of an ayurvedic powdered formulation Shatavaryadi churna.

    Science.gov (United States)

    Kumar, Puspendra; Jha, Shivesh; Naved, Tanveer

    2013-01-01

    A validated modified Lycopodium spore method has been developed for simple and rapid quantification of herbal powdered drugs. The Lycopodium spore method was performed on the ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of the diagnostic characters of each ingredient of Shatavaryadi churna was carried out individually. Microscopic determination, counting of identifying characters, and measurement of the area, length and breadth of identifying characters were performed using a Leica DMLS-2 microscope. The method was validated for intraday precision, linearity, specificity, repeatability, accuracy and system suitability. The method is simple, precise, sensitive, and accurate, and can be used for routine standardisation of raw materials of herbal drugs. It gives the ratio of individual ingredients in the powdered drug, so that any adulteration of the genuine drug with its adulterant can be detected. The method shows very good linearity, with values between 0.988 and 0.999 for the number and the area of identifying characters. The percentage purity of a sample drug can be determined by using the linear equation of the standard genuine drug.

  17. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach

    Science.gov (United States)

    Petersen, D.; Naveed, P.; Ragheb, A.; Niedieker, D.; El-Mashtoly, S. F.; Brechmann, T.; Kötting, C.; Schmiegel, W. H.; Freier, E.; Pox, C.; Gerwert, K.

    2017-06-01

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and thereby in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of misplacement of the sample and spectral dilution of marker bands (especially in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods such as leave-one-sample-out and leave-one-spectrum-out cross-validation schemes were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.
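
    The leave-one-patient-out scheme described above holds out all spectra of one patient together, so patient-specific signatures cannot leak from the training set into the test set. A minimal sketch with synthetic data and a toy nearest-mean classifier (none of this is the study's actual data or model):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for the data set: 60 "spectra" x 50 wavenumber channels
# from 12 "patients" (all values synthetic)
X = rng.normal(size=(60, 50))
y = rng.integers(0, 2, size=60)          # 0 = normal, 1 = neoplastic (toy labels)
patients = np.repeat(np.arange(12), 5)   # 5 spectra per patient

def nearest_mean_predict(X_train, y_train, X_test):
    """Minimal classifier: assign each spectrum to the nearer class-mean spectrum."""
    means = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
    dists = ((X_test[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

# Leave-one-patient-out: every spectrum of one patient is held out together,
# so the same patient never appears in both the training and the test fold.
y_true, y_pred = [], []
for p in np.unique(patients):
    test = patients == p
    y_true.extend(y[test])
    y_pred.extend(nearest_mean_predict(X[~test], y[~test], X[test]))

y_true, y_pred = np.array(y_true), np.array(y_pred)
sensitivity = (y_pred[y_true == 1] == 1).mean()  # true-positive rate
specificity = (y_pred[y_true == 0] == 0).mean()  # true-negative rate
print(sensitivity, specificity)
```

    A leave-one-spectrum-out scheme would instead hold out single rows of X, which typically yields optimistic estimates whenever several spectra come from the same patient.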

  18. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach.

    Science.gov (United States)

    Petersen, D; Naveed, P; Ragheb, A; Niedieker, D; El-Mashtoly, S F; Brechmann, T; Kötting, C; Schmiegel, W H; Freier, E; Pox, C; Gerwert, K

    2017-06-15

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and thereby in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of misplacement of the sample and spectral dilution of marker bands (especially in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods such as leave-one-sample-out and leave-one-spectrum-out cross-validation schemes were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.

  19. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in the future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided, showing how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  20. Validation of methods for determination of free water content in poultry meat

    Directory of Open Access Journals (Sweden)

    Jarmila Žítková

    2007-01-01

    Full Text Available Methods for determination of free water content in poultry meat are described in Commission Regulation EEC No 1538/91 as amended and in ČSN 57 3100. Two of them (methods A and D) have been validated in the conditions of a Czech poultry processing plant. The slaughtering capacity was 6000 pieces per hour and carcasses were chilled by air with spraying. All determinations were carried out in the plant’s lab and in the lab of the Institute of Food Technology. Method A was used to detect the amount of water lost from frozen chicken during thawing in controlled conditions. Twenty carcasses from each of six weight groups (900 g–1400 g) were tested. The average values of thaw-loss water content ranged between 0.46% and 1.71%; the average value of the total of 120 samples was 1.16%. The results were compared with the required maximum limit value of 3.3%. The water loss content was in negative correlation with the weight of the chicken (r = –0.56). Method D (chemical test) has been applied to determine the total water content of certain poultry cuts. It involved the determination of the water and protein contents of 62 representative samples in total. The average values of the ratio of water weight to protein weight (WA/RPA) were 3.29 in breast fillets, 4.06 in legs with a portion of the back, 4.00 in legs, 3.85 in thighs and 4.10 in drumsticks. The results corresponded to the required limit values of 3.40 for breast fillets and 4.15 for leg cuts. The ratio of water weight to protein weight (WA/RPA) was correlated with the weight of the chicken negatively for breast fillets (r = –0.61) and positively for leg cuts (r = 0.70). The different correlations can be explained by the distribution of water, protein and fat in the carcasses. The evaluation of the methods in terms of the percentage ratio of the average value to the limit showed that method D (results were at the level of 97% of the limit) was more exact than method A (results were at the level of 32% of the limit), but it is more expensive. Both methods
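
    The Method D decision described above reduces to a ratio test of water weight to protein weight against a cut-specific limit. A minimal sketch; the composition values below are illustrative numbers chosen to reproduce the reported breast-fillet average ratio, not measured data:

```python
def method_d_conforms(water_pct, protein_pct, cut):
    """Method D decision: the water/protein weight ratio (WA/RPA) must not
    exceed the limit for the cut (limits as quoted in the abstract)."""
    limits = {"breast fillet": 3.40, "leg cut": 4.15}
    return water_pct / protein_pct <= limits[cut]

# Illustrative composition giving the reported average breast-fillet ratio of 3.29
print(method_d_conforms(69.1, 21.0, "breast fillet"))  # True (3.29 <= 3.40)
```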

  1. The Dynamic Similitude Design Method of Thin Walled Structures and Experimental Validation

    Directory of Open Access Journals (Sweden)

    Zhong Luo

    2016-01-01

    Full Text Available For the applicability of dynamic similitude models of thin walled structures, such as engine blades, turbine discs, and cylindrical shells, the dynamic similitude design of typical thin walled structures is investigated. The governing equation of typical thin walled structures is firstly unified, which guides the establishment of dynamic scaling laws of typical thin walled structures. Based on the governing equation, the geometrically complete scaling law of the typical thin walled structure is derived. In order to determine accurate distorted scaling laws of typical thin walled structures, three principles are proposed and theoretically proved by combining the sensitivity analysis and governing equation. Taking the thin walled annular plate as an example, geometrically complete and distorted scaling laws can be obtained based on the principles of determining dynamic scaling laws. Furthermore, accurate distorted scaling laws for the first five orders of thin walled annular plates are presented and numerically validated. Finally, the effectiveness of the similitude design method is validated by experiments on annular plates.

  2. Validated Reverse Phase HPLC Method for the Determination of Impurities in Etoricoxib

    Directory of Open Access Journals (Sweden)

    S. Venugopal

    2011-01-01

    Full Text Available This paper describes the development of a reverse phase HPLC method for etoricoxib in the presence of impurities and degradation products generated from forced degradation studies. The drug substance was subjected to stress conditions of hydrolysis, oxidation, photolysis and thermal degradation. Degradation of etoricoxib was observed under base and oxidation environments. The drug was found stable in the other stress conditions studied. Successful separation of the drug from the process-related impurities and degradation products was achieved on a Zorbax SB-CN (250 × 4.6 mm, 5 μm particle size) column using a reverse phase HPLC method. The isocratic method employed a mixture of buffer and acetonitrile in a ratio of 60:40. Disodium hydrogen orthophosphate (0.02 M) is used as the buffer, with the pH adjusted to 7.20 with 1 N sodium hydroxide solution. The HPLC method was developed and validated with respect to linearity, accuracy, precision, specificity and ruggedness.

  3. The validation of Huffaz Intelligence Test (HIT)

    Science.gov (United States)

    Rahim, Mohd Azrin Mohammad; Ahmad, Tahir; Awang, Siti Rahmah; Safar, Ajmain

    2017-08-01

    In general, a hafiz who has memorized the Quran shows many strengths, especially with respect to academic performance. In this study, the theory of multiple intelligences introduced by Howard Gardner is embedded in a developed psychometric instrument, namely the Huffaz Intelligence Test (HIT). This paper presents the validation and the reliability of HIT for some tahfiz students in Malaysian Islamic schools. A pilot study was conducted involving 87 huffaz who were randomly selected to answer the items in HIT. The analysis used the Partial Least Squares (PLS) approach to assess reliability, convergent validity and discriminant validity. The study has validated nine intelligences. The findings also indicated that the composite reliabilities for the nine types of intelligences are greater than 0.8. Thus, the HIT is a valid and reliable instrument to measure the multiple intelligences among huffaz.
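
    In PLS analyses, composite reliability for a construct is commonly computed from its standardized indicator loadings as CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)). A minimal sketch; the loadings below are hypothetical values chosen only to illustrate a result above the 0.8 threshold reported:

```python
def composite_reliability(loadings):
    """Composite reliability from standardized factor loadings:
    CR = (sum(l))^2 / ((sum(l))^2 + sum(1 - l^2))."""
    s = sum(loadings)
    error = sum(1.0 - l * l for l in loadings)
    return s * s / (s * s + error)

# Hypothetical standardized loadings for one intelligence construct
print(round(composite_reliability([0.82, 0.78, 0.75, 0.80]), 3))  # 0.867
```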

  4. Validated spectrophotometric methods for the assay of cinitapride hydrogen tartrate in pharmaceuticals

    Directory of Open Access Journals (Sweden)

    Satyanarayana K.V.V.

    2013-01-01

    Full Text Available Three simple, selective and rapid spectrophotometric methods have been established for the determination of cinitapride hydrogen tartrate (CHT) in pharmaceutical tablets. The proposed methods are based on the diazotization of CHT with sodium nitrite and hydrochloric acid, followed by coupling with resorcinol, 1-benzoylacetone and 8-hydroxyquinoline in alkaline medium for methods A, B and C, respectively. The formed azo dyes are measured at 442, 465 and 552 nm for methods A, B and C, respectively. The parameters that affect the reaction were carefully optimized. Under optimum conditions, Beer’s law is obeyed over the ranges 2.0–32.0, 1.0–24.0 and 1.0–20.0 μg·mL-1 for methods A, B and C, respectively. The calculated molar absorptivity values are 1.2853 × 10⁴, 1.9624 × 10⁴ and 3.92 × 10⁴ L·mol-1·cm-1 for methods A, B and C, respectively. The results of the proposed procedures were validated statistically according to ICH guidelines. The proposed methods were successfully applied to the determination of CHT in Cintapro tablets without interference from commonly encountered excipients.
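
    Quantification in methods like these rests on a linear Beer's-law calibration, A = εlc. A minimal sketch fitting a calibration line over the stated range for method A; the absorbance values are simulated placeholder data, not the paper's measurements:

```python
import numpy as np

# Simulated standards spanning the stated Beer's-law range for method A (ug/mL)
conc = np.array([2.0, 8.0, 16.0, 24.0, 32.0])
absorbance = np.array([0.064, 0.256, 0.512, 0.768, 1.024])  # linear, invented

# Least-squares calibration line: A = slope * c + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)

# Read back the concentration of an unknown sample from its absorbance
unknown_conc = (0.40 - intercept) / slope
print(round(unknown_conc, 1))  # 12.5 ug/mL
```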

  5. Methods for validating the performance of wearable motion-sensing devices under controlled conditions

    International Nuclear Information System (INIS)

    Bliley, Kara E; Kaufman, Kenton R; Gilbert, Barry K

    2009-01-01

    This paper presents validation methods for assessing the accuracy and precision of motion-sensing device (i.e. accelerometer) measurements. The main goals of this paper were to assess the accuracy and precision of these measurements against a gold standard, to determine if differences in manufacturing and assembly significantly affected device performance and to determine if measurement differences due to manufacturing and assembly could be corrected by applying certain post-processing techniques to the measurement data during analysis. In this paper, the validation of a posture and activity detector (PAD), a device containing a tri-axial accelerometer, is described. Validation of the PAD devices required the design of two test fixtures: a test fixture to position the device in a known orientation, and a test fixture to rotate the device at known velocities and accelerations. Device measurements were compared to these known orientations and accelerations. Several post-processing techniques were utilized in an attempt to reduce variability in the measurement error among the devices. In conclusion, some of the measurement errors due to the inevitable differences in manufacturing and assembly were significantly improved (p < 0.01) by these post-processing techniques
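
    The comparison against known fixture orientations, and the kind of post-processing correction the abstract alludes to, can be sketched as follows. The offset/gain correction model and all numbers are assumptions for illustration, not the PAD's actual calibration:

```python
import math

def tilt_angle_deg(ax, ay, az):
    """Tilt of the device z-axis from vertical, computed from a static
    accelerometer reading expressed in units of g."""
    return math.degrees(math.acos(az / math.sqrt(ax*ax + ay*ay + az*az)))

def offset_gain_correct(raw, offset, gain):
    """A simple per-axis correction model (assumed, for illustration) that
    post-processing could use to absorb manufacturing/assembly differences."""
    return (raw - offset) / gain

# Device lying flat in the positioning fixture: ideal output (0, 0, 1 g)
print(tilt_angle_deg(0.0, 0.0, 1.0))  # 0.0 degrees
# A hypothetical unit reading 1.07 g on z due to a +0.05 g offset and 1.02 gain
print(round(offset_gain_correct(1.07, 0.05, 1.02), 3))  # 1.0
```

    Measured angles and corrected outputs would then be compared with the fixture's known orientations and rotation rates to quantify accuracy and precision per device.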

  6. Validation of a method to determine methylmercury in fish tissues using gas chromatography

    International Nuclear Information System (INIS)

    Vega Bolannos, Luisa O.; Arias Verdes, Jose A.; Beltran Llerandi, Gilberto; Castro Diaz, Odalys; Moreno Tellez, Olga L.

    2000-01-01

    We validated a method to determine methylmercury in fish tissues using gas chromatography with an electron capture detector, as described by the Association of Official Analytical Chemists (AOAC) International. The linear curve range was 0.02 to 1 μg/ml and the linear correlation coefficient was 0.9979. A fish sample contaminated with 1 mg/kg methylmercury was analyzed 20 times to determine the repeatability of the method. The quantification limit was 0.16 mg/kg and the detection limit was 0.06 mg/kg. Fish samples contaminated with 0.2 to 10 mg/kg methylmercury showed recovery indexes from 94.66 to 108.8%.
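
    Recovery and detection/quantification limits of the kind reported above are simple derived quantities. The sketch below uses the ICH-style 3.3σ/S and 10σ/S formulas and invented σ/slope numbers; these are illustrative assumptions, not necessarily the AOAC procedure used in the study:

```python
def recovery_pct(measured, spiked):
    """Recovery index: measured concentration as a percentage of the spike."""
    return 100.0 * measured / spiked

def lod(sigma, slope):
    """ICH-style detection limit from calibration statistics: 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def loq(sigma, slope):
    """ICH-style quantification limit: 10 * sigma / S."""
    return 10.0 * sigma / slope

# A 1.0 mg/kg spike measured as 1.06 mg/kg falls inside the reported
# 94.66-108.8% recovery range
print(round(recovery_pct(1.06, 1.0), 1))  # 106.0
# sigma (response standard deviation) and slope here are invented numbers
print(round(lod(0.02, 1.1), 2), round(loq(0.02, 1.1), 2))  # 0.06 0.18
```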

  7. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations

    OpenAIRE

    Jihan M Badr

    2013-01-01

    Background: Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods were reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. Materials and Method: In the present work, a simple and sensitive high performance thin layer chromatographic method is developed for determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference of Ha...

  8. Strategies for the screening of antibiotic residues in eggs: comparison of the validation of the classical microbiological method with an immunobiosensor method.

    Science.gov (United States)

    Gaudin, Valérie; Rault, Annie; Hedou, Celine; Soumet, Christophe; Verdon, Eric

    2017-09-01

    Efficient screening methods are needed to control antibiotic residues in eggs. A microbiological kit (Explorer® 2.0 test (Zeu Inmunotech, Spain)) and an immunobiosensor kit (Microarray II (AM® II) on Evidence Investigator™ system (Randox, UK)) have been evaluated and validated for screening of antibiotic residues in eggs, according to the European decision EC/2002/657 and to the European guideline for the validation of screening methods. The e-reader™ system, a new automatic incubator/reading system, was coupled to the Explorer 2.0 test. The AM II kit can detect residues of six different families of antibiotics in different matrices including eggs. For both tests, a different liquid/liquid extraction of eggs had to be developed. Specificities of the Explorer 2.0 and AM II kits were equal to 8% and 0%, respectively. The detection capabilities were determined for 19 antibiotics, with representatives from different families, for Explorer 2.0 and 12 antibiotics for the AM II kit. For the nine antibiotics having a maximum residue limit (MRL) in eggs, the detection capabilities CCβ of Explorer 2.0 were below the MRL for four antibiotics, equal to the MRL for two antibiotics and between 1 and 1.5 MRLs for the three remaining antibiotics (tetracyclines). For the antibiotics from other families, the detection capabilities were low for beta-lactams and sulfonamides and satisfactory for dihydrostreptomycin (DHS) and fluoroquinolones, which are usually difficult to detect with microbiological tests. The CCβ values of the AM II kit were much lower than the respective MRLs for three detected antibiotics (tetracycline, oxytetracycline, tylosin). Concerning the nine other antibiotics, the detection capabilities determined were low. The highest CCβ was obtained for streptomycin (100 µg kg-1).

  9. Validation of a Russian Language Oswestry Disability Index Questionnaire.

    Science.gov (United States)

    Yu, Elizabeth M; Nosova, Emily V; Falkenstein, Yuri; Prasad, Priya; Leasure, Jeremi M; Kondrashov, Dimitriy G

    2016-11-01

    Study Design  Retrospective reliability and validity study. Objective  To validate a recently translated Russian language version of the Oswestry Disability Index (R-ODI) using standardized methods detailed in previous validations in other languages. Methods  We included all subjects who were seen in our spine surgery clinic, were over the age of 18, and were fluent in the Russian language. The R-ODI was translated by six bilingual people and combined into a consensus version. R-ODI and visual analog scale (VAS) questionnaires for leg and back pain were distributed to subjects during both their initial and follow-up visits. Test validity, stability, and internal consistency were measured using standardized psychometric methods. Results  Ninety-seven subjects participated in the study. No change in the meaning of the questions on the R-ODI was noted with translation from English to Russian. There was a significant positive correlation between R-ODI and VAS scores for both the leg and back during both the initial and follow-up visits. Conclusion  The R-ODI is a valid and reliable instrument for the Russian-speaking population in the United States.

  10. Validation on milk and sprouts of EN ISO 16654:2001 - Microbiology of food and animal feeding stuffs - Horizontal method for the detection of Escherichia coli O157.

    Science.gov (United States)

    Tozzoli, Rosangela; Maugliani, Antonella; Michelacci, Valeria; Minelli, Fabio; Caprioli, Alfredo; Morabito, Stefano

    2018-05-08

    In 2006, the European Committee for Standardisation (CEN)/Technical Committee 275 - Food analysis - Horizontal methods/Working Group 6 - Microbiology of the food chain (TC275/WG6), launched the project of validating the method ISO 16654:2001 for the detection of Escherichia coli O157 in foodstuff by evaluating its performance, in terms of sensitivity and specificity, through collaborative studies. Previously, a validation study had been conducted to assess the performance of the Method No 164 developed by the Nordic Committee for Food Analysis (NMKL), which also aims at detecting E. coli O157 in food and is based on a procedure equivalent to that of the ISO 16654:2001 standard. Therefore, CEN established that the validation data obtained for the NMKL Method 164 could be exploited for the ISO 16654:2001 validation project, integrated with new data obtained through two additional interlaboratory studies on milk and sprouts, run in the framework of the CEN mandate No. M381. The ISO 16654:2001 validation project was led by the European Union Reference Laboratory for Escherichia coli including VTEC (EURL-VTEC), which organized the collaborative validation study on milk in 2012, with 15 participating laboratories, and that on sprouts in 2014, with 14 participating laboratories. In both studies, a total of 24 samples were tested by each laboratory. Test materials were spiked with different concentrations of E. coli O157, and the 24 samples corresponded to eight replicates of three levels of contamination: zero, low and high spiking level. The results submitted by the participating laboratories were analyzed to evaluate the sensitivity and specificity of the ISO 16654:2001 method when applied to milk and sprouts. The performance characteristics calculated on the data of the collaborative validation studies run under the CEN mandate No. M381 returned sensitivity and specificity of 100% and 94.4%, respectively, for the milk study. As for the sprouts matrix, the sensitivity

  11. General criteria for validation of dosimetry methods in the context of a quality system ISO / IEC 17025; Criterios generales sobre validacion de metodos de dosimetria en el marco de un sistema de calidad ISO/IEC 17025

    Energy Technology Data Exchange (ETDEWEB)

    Martin Garcia, R.; Navarro Bravo, T.

    2011-07-01

    The accreditation of a testing laboratory in accordance with ISO/IEC 17025 recognizes the technical competence of a laboratory to perform certain tests. One of the requirements of that rule states that laboratories must demonstrate that the methods used are valid and appropriate for the intended use and customer needs. This demonstration is accomplished through the process of validation of methods, defined in the rule itself as 'confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled'. The process of validating a test method should be well planned and documented, including the requirements under the applicable rules and the criteria established by the laboratory to comply with these requirements.

  12. Validated stability-indicating spectrofluorimetric methods for the determination of ebastine in pharmaceutical preparations

    Directory of Open Access Journals (Sweden)

    Eid Manal

    2011-03-01

    Full Text Available Abstract Two sensitive, selective, economic, and validated spectrofluorimetric methods were developed for the determination of ebastine (EBS) in pharmaceutical preparations, depending on the reaction with its tertiary amino group. Method I involves condensation of the drug with mixed anhydrides (citric and acetic anhydrides), producing a product with intense fluorescence, which was measured at 496 nm after excitation at 388 nm. Method IIA describes quantitative fluorescence quenching of eosin upon addition of the studied drug, where the decrease in the fluorescence intensity was directly proportional to the concentration of ebastine; the fluorescence quenching was measured at 553 nm after excitation at 457 nm. This method was extended (Method IIB) to apply first- and second-derivative synchronous spectrofluorimetric methods (FDSFS and SDSFS) for the simultaneous analysis of EBS in the presence of its alkaline, acidic, and UV degradation products. The proposed methods were successfully applied for the determination of the studied compound in its dosage forms. The results obtained were in good agreement with those obtained by a comparison method. Both methods were utilized to investigate the kinetics of the degradation of the drug.

  13. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  14. A Study of Method Development, Validation, and Forced Degradation for Simultaneous Quantification of Paracetamol and Ibuprofen in Pharmaceutical Dosage Form by RP-HPLC Method

    OpenAIRE

    Jahan, Md. Sarowar; Islam, Md. Jahirul; Begum, Rehana; Kayesh, Ruhul; Rahman, Asma

    2014-01-01

    A rapid and stability-indicating reversed phase high-performance liquid chromatography (RP-HPLC) method was developed for simultaneous quantification of paracetamol and ibuprofen in their combined dosage form especially to get some more advantages over other methods already developed for this combination. The method was validated according to United States Pharmacopeia (USP) guideline with respect to accuracy, precision, specificity, linearity, solution stability, robustness, sensitivity, and...

  15. Validation of a digital photographic method for assessment of dietary quality of school lunch sandwiches brought from home

    DEFF Research Database (Denmark)

    Sabinsky, Marianne; Toft, Ulla; Andersen, Klaus K

    2013-01-01

    Background: It is a challenge to assess children’s dietary intake. The digital photographic method (DPM) may be an objective method that can overcome some of these challenges. Objective: The aim of this study was to evaluate the validity and reliability of a DPM to assess the quality of dietary....... The lunches were photographed using a standardised DPM. From the digital images, the dietary components were estimated by a trained image analyst using weights or household measures and the dietary quality was assessed using a validated Meal Index of Dietary Quality (Meal IQ). The dietary components...... and the Meal IQ obtained from the digital images were validated against the objective weighed foods of the school lunch sandwiches. To determine interrater reliability, the digital images were evaluated by a second image analyst. Results: Correlation coefficients between the DPM and the weighed foods ranged...

  16. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors

    NARCIS (Netherlands)

    Jongsma, Maikel; Florczyk, Urszula M.; Hendriks-Balk, Marieelle C.; Michel, Martin C.; Peters, Stephan L. M.; Alewijnse, Astrid E.

    2007-01-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative

  17. Validation of multivariate classification methods using analytical fingerprints – concept and case study on organic feed for laying hens

    NARCIS (Netherlands)

    Alewijn, Martin; Voet, van der Hilko; Ruth, van Saskia

    2016-01-01

    Multivariate classification methods based on analytical fingerprints have found many applications in the food and feed area, but practical applications are still scarce due to a lack of a generally accepted validation procedure. This paper proposes a new approach for validation of this type of

  18. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  19. Validation of a method to measure plutonium levels in marine sediments in Cuba

    International Nuclear Information System (INIS)

    Sibello Hernández, Rita Y.; Cartas Aguila, Héctor A.; Cozzella, María Letizia

    2008-01-01

    The main objective of this research was to develop and to validate a method of radiochemical separation of plutonium, suitable from the economic and practical point of view, in Cuba conditions. This method allowed to determine plutonium activity levels in the marine sediments from Cienfuegos Bay. The selected method of radiochemical separation was that of anionic chromatography and the measure technique was the quadrupole inductively coupled plasma mass spectrometry. The method was applied to a certified reference material, six repetitions were carried out and a good correspondence between the average measured value and the average certified value of plutonium was achieved, so the trueness of the method was demonstrated. It was also proven the precision of the method, since it was obtained a variation coefficient of 11% at 95% confidence level. The obtained results show that the presence of plutonium in the analyzed marine sediment samples is only due to the global radioactive fallout. (author)
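
    The trueness and precision checks described in this record follow a standard pattern: replicate measurements of a certified reference material are summarized as relative bias (trueness) and coefficient of variation (precision). The sketch below uses invented replicate values and an invented certified value, not the study's data.

    ```python
    import statistics

    # Hypothetical replicate measurements of a certified reference material;
    # the certified value is likewise illustrative.
    replicates = [0.95, 1.08, 1.12, 0.88, 1.02, 1.10]
    certified = 1.00

    mean = statistics.mean(replicates)
    cv_percent = 100 * statistics.stdev(replicates) / mean    # precision
    relative_bias = 100 * (mean - certified) / certified      # trueness

    print(round(cv_percent, 1), round(relative_bias, 1))
    ```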

  20. Validation of Housing Standards Addressing Accessibility

    DEFF Research Database (Denmark)

    Helle, Tina

    2013-01-01

    The aim was to explore the use of an activity-based approach to determine the validity of a set of housing standards addressing accessibility. This included examination of the frequency and the extent of accessibility problems among older people with physical functional limitations who used...... participant groups were examined. Performing well-known kitchen activities was associated with accessibility problems for all three participant groups, in particular those using a wheelchair. The overall validity of the housing standards examined was poor. Observing older people interacting with realistic...... environments while performing real everyday activities seems to be an appropriate method for assessing accessibility problems....

  1. Worldwide Protein Data Bank validation information: usage and trends.

    Science.gov (United States)

    Smart, Oliver S; Horský, Vladimír; Gore, Swanand; Svobodová Vařeková, Radka; Bendová, Veronika; Kleywegt, Gerard J; Velankar, Sameer

    2018-03-01

    Realising the importance of assessing the quality of the biomolecular structures deposited in the Protein Data Bank (PDB), the Worldwide Protein Data Bank (wwPDB) partners established Validation Task Forces to obtain advice on the methods and standards to be used to validate structures determined by X-ray crystallography, nuclear magnetic resonance spectroscopy and three-dimensional electron cryo-microscopy. The resulting wwPDB validation pipeline is an integral part of the wwPDB OneDep deposition, biocuration and validation system. The wwPDB Validation Service webserver (https://validate.wwpdb.org) can be used to perform checks prior to deposition. Here, it is shown how validation metrics can be combined to produce an overall score that allows the ranking of macromolecular structures and domains in search results. The ValTrends DB database provides users with a convenient way to access and analyse validation information and other properties of X-ray crystal structures in the PDB, including investigating trends in and correlations between different structure properties and validation metrics.

  2. A newly validated high-performance liquid chromatography method with diode array ultraviolet detection for analysis of the antimalarial drug primaquine in the blood plasma.

    Science.gov (United States)

    Carmo, Ana Paula Barbosa do; Borborema, Manoella; Ribeiro, Stephan; De-Oliveira, Ana Cecilia Xavier; Paumgartten, Francisco Jose Roma; Moreira, Davyson de Lima

    2017-01-01

    Primaquine (PQ) diphosphate is an 8-aminoquinoline antimalarial drug with unique therapeutic properties. It is the only drug that prevents relapses of Plasmodium vivax or Plasmodium ovale infections. In this study, a fast, sensitive, cost-effective, and robust method for the extraction and high-performance liquid chromatography with diode array ultraviolet detection (HPLC-DAD-UV) analysis of PQ in blood plasma was developed and validated. After plasma protein precipitation, PQ was obtained by liquid-liquid extraction and analyzed by HPLC-DAD-UV with a modified-silica cyanopropyl column (250 mm × 4.6 mm i.d., 5 μm) as the stationary phase and a mixture of acetonitrile and 10 mM ammonium acetate buffer (pH 3.80) (45:55) as the mobile phase. The flow rate was 1.0 mL·min⁻¹, the oven temperature was 50 °C, and absorbance was measured at 264 nm. The method was validated for linearity, intra-day and inter-day precision, accuracy, recovery, and robustness. The detection (LOD) and quantification (LOQ) limits were 1.0 and 3.5 ng·mL⁻¹, respectively. The method was used to analyze the plasma of female DBA-2 mice treated with 20 mg·kg⁻¹ (oral) PQ diphosphate. By combining a simple, low-cost extraction procedure with a sensitive, precise, accurate, and robust method, it was possible to analyze PQ in small volumes of plasma. The new method presents lower LOD and LOQ limits and requires a shorter analysis time and smaller plasma volumes than previously reported HPLC methods with DAD-UV detection. The new validated method is suitable for kinetic studies of PQ in small rodents, including mouse models for the study of malaria.
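
    Detection and quantification limits like those reported above are commonly derived from the calibration line as LOD = 3.3·σ/S and LOQ = 10·σ/S (the ICH convention), where σ is the residual standard deviation of the fit and S its slope. A sketch under that assumption, with illustrative calibration points rather than the paper's data:

    ```python
    # ICH-style estimation of detection/quantification limits from a
    # calibration line: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope, where
    # sigma is the residual standard deviation of the fit.
    # Concentrations (ng/mL) and peak areas below are illustrative only.

    def fit_line(x, y):
        """Ordinary least-squares slope and intercept."""
        n = len(x)
        sx, sy = sum(x), sum(y)
        sxx = sum(v * v for v in x)
        sxy = sum(a * b for a, b in zip(x, y))
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        intercept = (sy - slope * sx) / n
        return slope, intercept

    conc = [5, 10, 25, 50, 100]
    area = [51, 99, 252, 498, 1003]

    slope, intercept = fit_line(conc, area)
    residuals = [y - (slope * x + intercept) for x, y in zip(conc, area)]
    sigma = (sum(r * r for r in residuals) / (len(conc) - 2)) ** 0.5

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(round(lod, 2), round(loq, 2))
    ```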

  3. Validation of an HPLC–UV method for the determination of digoxin residues on the surface of manufacturing equipment

    Directory of Open Access Journals (Sweden)

    ZORAN B. TODOROVIĆ

    2009-09-01

    Full Text Available In the pharmaceutical industry, an important step consists in the removal of possible drug residues from the equipment and areas involved. The cleaning procedures must be validated, and methods to determine trace amounts of drugs therefore have to be considered with special attention. An HPLC–UV method for the determination of digoxin residues on stainless steel surfaces was developed and validated in order to control a cleaning procedure. Cotton swabs moistened with methanol were used to remove any residues of drugs from stainless steel surfaces, giving recoveries of 85.9, 85.2 and 78.7 % for three concentration levels. The precision of the results, reported as the relative standard deviation (RSD), was below 6.3 %. The method was validated over a concentration range of 0.05–12.5 µg mL⁻¹. Low quantities of drug residues were determined by HPLC–UV using a Symmetry C18 column (150 × 4.6 mm, 5 µm) at 20 °C with an acetonitrile–water (28:72, v/v) mobile phase at a flow rate of 1.1 mL min⁻¹ and an injection volume of 100 µL, with detection at 220 nm. A simple, selective and sensitive HPLC–UV assay for the determination of digoxin residues on stainless steel was developed, validated and applied.

  4. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Ana Claudia O.; Matoso, Erika, E-mail: anaclaudia.oliveira@marinha.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP/CEA), Iperó, SP (Brazil). Centro Experimental ARAMAR

    2017-07-01

    The problems caused by contamination of water bodies demand strict control of disposal in rivers, seas and oceans. Nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost option. This method consists of adding chromotropic acid to the sample in the presence of H₂SO₄. The absorbance related to the yellow colour produced can be measured by a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by the use of other reagents that eliminate interferences. The aim of this study was to validate the nitrate determination method in waters using chromotropic acid. This method is used in the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mg N/L) compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters used in this work. (author)
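
    A linearity figure such as the correlation coefficient of 0.9997 quoted above is the Pearson correlation between a standard series and the instrument response. A minimal sketch with invented nitrate standards and absorbances (not the LARE data):

    ```python
    import math

    # Linearity check for a spectrophotometric calibration: Pearson r between
    # nitrate standard concentration (mg N/L) and absorbance at 410 nm.
    # The standards and absorbances are illustrative values.
    conc = [0.1, 0.5, 1.0, 2.0, 4.0]
    absorbance = [0.021, 0.102, 0.199, 0.405, 0.801]

    n = len(conc)
    mx = sum(conc) / n
    my = sum(absorbance) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(conc, absorbance))
    r = cov / math.sqrt(sum((x - mx) ** 2 for x in conc) *
                        sum((y - my) ** 2 for y in absorbance))
    print(round(r, 4))
    ```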

  5. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    International Nuclear Information System (INIS)

    Santos, Ana Claudia O.; Matoso, Erika

    2017-01-01

    The problems caused by contamination of water bodies demand strict control of disposal in rivers, seas and oceans. Nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost option. This method consists of adding chromotropic acid to the sample in the presence of H₂SO₄. The absorbance related to the yellow colour produced can be measured by a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by the use of other reagents that eliminate interferences. The aim of this study was to validate the nitrate determination method in waters using chromotropic acid. This method is used in the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mg N/L) compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters used in this work. (author)

  6. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis

    NARCIS (Netherlands)

    Steultjens, M. P.; Dekker, J.; van Baar, M. E.; Oostendorp, R. A.; Bijlsma, J. W.

    1999-01-01

    To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results of self-report

  7. Novel Automated Morphometric and Kinematic Handwriting Assessment: A Validity Study in Children with ASD and ADHD

    Science.gov (United States)

    Dirlikov, Benjamin; Younes, Laurent; Nebel, Mary Beth; Martinelli, Mary Katherine; Tiedemann, Alyssa Nicole; Koch, Carolyn A.; Fiorilli, Diana; Bastian, Amy J.; Denckla, Martha Bridge; Miller, Michael I.; Mostofsky, Stewart H.

    2017-01-01

    This study presents construct validity for a novel automated morphometric and kinematic handwriting assessment, including (1) convergent validity, establishing reliability of automated measures with traditional manual-derived Minnesota Handwriting Assessment (MHA), and (2) discriminant validity, establishing that the automated methods distinguish…

  8. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    International Nuclear Information System (INIS)

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-01-01

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating for anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity for the BP regions examined between the 3-dimensional reconstructions generated and the dissected counterparts. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate for anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection

  9. An assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    The use of computer programs that employ relatively complex constitutive theories and analysis procedures to perform inelastic design calculations on fast reactor system components introduces questions of validation and acceptance of the analysis results. We may ask ourselves, "How valid are the answers?" These questions, in turn, involve the concepts of verification of computer programs as well as qualification of the computer programs and of the underlying constitutive theories and analysis procedures. This paper addresses the latter - the qualification of the analysis methods for inelastic design calculations. Some of the work underway in the United States to provide the necessary information to evaluate inelastic analysis methods and computer programs is described, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking ourselves how valid, or correct, the analytical predictions are, we might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case. (author)

  10. A Method for Ship Collision Damage and Energy Absorption Analysis and its Validation

    DEFF Research Database (Denmark)

    Zhang, Shengming; Pedersen, Preben Terndrup

    2017-01-01

    For design evaluation, there is a need for a method which is fast, practical and yet accurate enough to determine the absorbed energy and collision damage extent in ship collision analysis. The most well-known simplified empirical approach to collision analysis was made probably by Minorsky......, and its limitation is also well-recognised. The authors have previously developed simple expressions for the relation between the absorbed energy and the damaged material volume which take into account the structural arrangements, the material properties and the damage modes. The purpose of the present paper...... is to re-examine this method's validity and accuracy for ship collision damage analysis in ship design assessments by comprehensive validations with experimental results from the public domain. In total, 20 experimental tests have been selected, analysed and compared with the results calculated using......

  11. Validity and Interrater Reliability of the Visual Quarter-Waste Method for Assessing Food Waste in Middle School and High School Cafeteria Settings.

    Science.gov (United States)

    Getts, Katherine M; Quinn, Emilee L; Johnson, Donna B; Otten, Jennifer J

    2017-11-01

    Measuring food waste (ie, plate waste) in school cafeterias is an important tool to evaluate the effectiveness of school nutrition policies and interventions aimed at increasing consumption of healthier meals. Visual assessment methods are frequently applied in plate waste studies because they are more convenient than weighing. The visual quarter-waste method has become a common tool in studies of school meal waste and consumption, but previous studies of its validity and reliability have used correlation coefficients, which measure association but not necessarily agreement. The aims of this study were to determine, using a statistic measuring interrater agreement, whether the visual quarter-waste method is valid and reliable for assessing food waste in a school cafeteria setting when compared with the gold standard of weighed plate waste. To evaluate validity, researchers used the visual quarter-waste method and weighed food waste from 748 trays at four middle schools and five high schools in one school district in Washington State during May 2014. To assess interrater reliability, researcher pairs independently assessed 59 of the same trays using the visual quarter-waste method. Both validity and reliability were assessed using a weighted κ coefficient. For validity, as compared with the measured weight, 45% of foods assessed using the visual quarter-waste method were in almost perfect agreement, 42% of foods were in substantial agreement, 10% were in moderate agreement, and 3% were in slight agreement. For interrater reliability between pairs of visual assessors, 46% of foods were in perfect agreement, 31% were in almost perfect agreement, 15% were in substantial agreement, and 8% were in moderate agreement. These results suggest that the visual quarter-waste method is a valid and reliable tool for measuring plate waste in school cafeteria settings. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
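
    The weighted κ statistic used in this study credits partial agreement between ordinal waste categories instead of treating every mismatch as total disagreement. A compact sketch with linear weights, using invented ratings (coding the quarter-waste categories none/¼/½/¾/all as 0–4 is an assumption for illustration):

    ```python
    # Linearly weighted kappa for agreement between two ordinal ratings
    # (e.g. quarter-waste categories coded 0..4). Example ratings are
    # illustrative, not study data.

    def weighted_kappa(r1, r2, k):
        n = len(r1)
        # linear disagreement weights, 0 on the diagonal, 1 at max distance
        w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
        obs = [[0.0] * k for _ in range(k)]
        for a, b in zip(r1, r2):
            obs[a][b] += 1 / n
        p1 = [sum(row) for row in obs]                              # rater 1 marginals
        p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # rater 2 marginals
        d_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
        d_exp = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
        return 1 - d_obs / d_exp

    rater1 = [0, 1, 2, 2, 3, 4, 4, 1, 0, 3]
    rater2 = [0, 1, 2, 3, 3, 4, 3, 1, 0, 3]
    kappa = weighted_kappa(rater1, rater2, 5)
    print(round(kappa, 3))
    ```

    With quadratic weights (`(abs(i - j) / (k - 1)) ** 2`) the same function reproduces the other common variant.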

  12. Validation of QuEChERS method for the determination of some pesticide residues in two apple varieties.

    Science.gov (United States)

    Tiryaki, Osman

    2016-10-02

    This study was undertaken to validate the "quick, easy, cheap, effective, rugged and safe" (QuEChERS) method using Golden Delicious and Starking Delicious apple matrices spiked at 0.1 maximum residue limit (MRL), 1.0 MRL and 10 MRL levels of four pesticides (chlorpyrifos, dimethoate, indoxacarb and imidacloprid). For the extraction and cleanup, the original QuEChERS method was followed, then the samples were subjected to liquid chromatography-triple quadrupole mass spectrometry (LC-MS/MS) for chromatographic analyses. According to the t test, the matrix effect was not significant for chlorpyrifos in either sample matrix, but it was significant for dimethoate, indoxacarb and imidacloprid in both sample matrices. Thus, matrix-matched calibration (MC) was used to compensate for the matrix effect, and quantifications were carried out using MC. The overall recovery of the method was 90.15% with a relative standard deviation of 13.27% (n = 330). The estimated method detection limits of the analytes were below the MRLs. Other parameters of the method validation, such as recovery, precision, accuracy and linearity, were found to be within the required ranges.
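
    Summary figures like the 90.15% overall recovery and 13.27% RSD quoted above come from a spike-recovery calculation over fortified samples. A minimal sketch for one spiking level, with an illustrative spiked concentration and measured values rather than the study's data:

    ```python
    import statistics

    # Spike-recovery summary for a fortified sample set: mean recovery (%)
    # and relative standard deviation, as typically reported in QuEChERS
    # validation. Spiked level and measured values are illustrative.
    spiked = 0.10  # mg/kg
    measured = [0.092, 0.088, 0.095, 0.090, 0.086, 0.091]

    recoveries = [100 * m / spiked for m in measured]
    mean_rec = statistics.mean(recoveries)
    rsd = 100 * statistics.stdev(recoveries) / mean_rec

    print(round(mean_rec, 1), round(rsd, 1))
    ```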

  13. [The development and validation of the methods for the quantitative determination of sibutramine derivatives in dietary supplements].

    Science.gov (United States)

    Stern, K I; Malkova, T L

    The objective of the present study was the development and validation of methods for the determination of the demethylated sibutramine derivatives, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with flame ionization detection was used for the quantitative determination of the above substances in dietary supplements. Conditions for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, were proposed, allowing efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on an intra-day and inter-day basis), which indicates its good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.

  14. Validation of the analytical method for sodium dichloroisocyanurate aimed at drinking water disinfection

    International Nuclear Information System (INIS)

    Martinez Alvarez, Luis Octavio; Alejo Cisneros, Pedro; Garcia Pereira, Reynaldo; Campos Valdez, Doraily

    2014-01-01

    Cuba has developed its first effervescent 3.5 mg sodium dichloroisocyanurate tablets, in which the active principle is non-therapeutic. This ingredient releases a certain amount of chlorine when dissolved in a litre of water and can provide adequate disinfection of drinking water, which is ready to be drunk after 30 min. An analytical iodometric method applicable to the quality control of the effervescent 3.5 mg sodium dichloroisocyanurate tablets was developed and validated.

  15. [Validation of the telephone call as a method for measuring compliance to arterial hypertension treatment in Extremadura].

    Science.gov (United States)

    Espinosa-García, J; Cobaleda-Polo, J; González-Velasco, M; Fernández-Bergés, D

    2014-10-01

    Pharmacological non-compliance is a significant problem that can affect patient health. The main aim of this investigation was to validate the telephone call to the patient's home as a self-report method of counting the number of tablets taken by the patient, as an alternative to a simple tablet count in the clinic (gold standard). An observational, multicentre, prospective, longitudinal study was conducted by 25 researchers in different health centres in Extremadura, including 125 consecutively enrolled patients with uncontrolled arterial hypertension. Three visits were made: an enrollment visit, a follow-up visit at 4 weeks, and a final visit at 8 weeks. A telephone call was made prior to the enrollment and final visits to remind the patients of the next visit and, at the same time, to ask about the number of tablets remaining. A total of 121 patients completed the study. At the final visit, the phone-call method of compliance showed: 100% sensitivity, 86% specificity, 86.8% overall accuracy, 30.4% PPV, 100% NPV, CP+ 7.13, CP− 0.0, and a kappa index of 0.415 (P…). The phone call, as a therapeutic compliance method, can be a good alternative due to being almost universal, easy to use, and of reduced cost, without the need for patients to go to the medical centres. Copyright © 2013 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España. All rights reserved.
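
    The validation statistics reported (sensitivity, specificity, PPV, NPV, overall accuracy, likelihood ratios) all derive from the 2×2 table of phone-count versus clinic-count classifications. The counts below are an illustrative reconstruction consistent with n = 121 and the reported percentages, not the published table:

    ```python
    # Validation statistics for a binary compliance test against a gold
    # standard, from a 2x2 table. The counts are a hypothetical
    # reconstruction (n = 121), used only to show the arithmetic.
    tp, fp, fn, tn = 7, 16, 0, 98

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)                       # positive predictive value
    npv = tn / (tn + fn)                       # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio

    print(sensitivity, round(specificity, 3), round(ppv, 3), npv,
          round(accuracy, 3), round(lr_pos, 2))
    ```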

  16. Isotopic and molecular fractionation in combustion; three routes to molecular marker validation, including direct molecular 'dating' (GC/AMS)

    Science.gov (United States)

    Currie, L. A.; Klouda, G. A.; Benner, B. A.; Garrity, K.; Eglinton, T. I.

    The identification of unique isotopic, elemental, and molecular markers for sources of combustion aerosol has growing practical importance because of the potential effects of fine particle aerosol on health, visibility and global climate. It is urgent, therefore, that substantial efforts be directed toward the validation of assumptions involving the use of such tracers for source apportionment. We describe here three independent routes toward carbonaceous aerosol molecular marker identification and validation: (1) tracer regression and multivariate statistical techniques applied to field measurements of mixed source, carbonaceous aerosols; (2) a new development in aerosol 14C metrology: direct, pure compound accelerator mass spectrometry (AMS) by off-line GC/AMS ('molecular dating'); and (3) direct observation of isotopic and molecular source emissions during controlled laboratory combustion of specific fuels. Findings from the combined studies include: independent support for benzo(ghi)perylene as a motor vehicle tracer from the first (statistical) and second (direct 'dating') studies; a new indication, from the third (controlled combustion) study, of a relation between 13C isotopic fractionation and PAH molecular fractionation, also linked with fuel and stage of combustion; and quantitative data showing the influence of both fuel type and combustion conditions on the yields of such species as elemental carbon and PAH, reinforcing the importance of exercising caution when applying presumed conservative elemental or organic tracers to fossil or biomass burning field data as in the first study.

  17. Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.

    Directory of Open Access Journals (Sweden)

    Timothy B Hallett

    2008-04-01

    Full Text Available HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of the two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
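
    The decomposition idea behind method 1 can be sketched numerically: the prevalence change between surveys understates new infections by the number of infected people who died in the interval, so adding those deaths back and dividing by susceptible person-years gives an approximate incidence rate. This is a simplified illustration under strong assumptions (closed population, no HIV-negative mortality or migration), not the paper's estimator; all inputs are invented.

    ```python
    # Sketch of decomposing a prevalence change into new infections and
    # mortality among the infected, assuming a closed population with no
    # HIV-negative mortality or migration. All inputs are illustrative.

    def incidence_from_prevalence(n, p1, p2, mort_pos, years):
        """Approximate incidence rate per person-year among susceptibles."""
        deaths_pos = n * p1 * mort_pos * years           # HIV+ deaths removed
        new_infections = n * p2 - (n * p1 - deaths_pos)  # infections mortality hid
        py_at_risk = n * (1 - (p1 + p2) / 2) * years     # susceptible person-years
        return new_infections / py_at_risk

    # 5000 adults, prevalence 12% -> 13% over 3 years, 5%/yr HIV+ mortality
    rate = incidence_from_prevalence(5000, 0.12, 0.13, 0.05, 3)
    print(round(rate, 4))
    ```

    Note how mortality dominates the arithmetic: the raw prevalence rise accounts for only 50 of the 140 inferred infections in this example.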

  18. Quantification of endocrine disruptors and pesticides in water by gas chromatography-tandem mass spectrometry. Method validation using weighted linear regression schemes.

    Science.gov (United States)

    Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P

    2010-10-22

    A multi-residue methodology based on solid phase extraction followed by gas chromatography-tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
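    The weighting idea can be sketched with NumPy. The calibration data and the 1/x weighting below are illustrative assumptions, not the paper's values; note that `np.polyfit` applies weights to the unsquared residuals, so weighting by 1/x corresponds to a variance model proportional to x²:

```python
import numpy as np

# Hypothetical calibration data: concentration (ng/mL) vs. peak area.
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 250.0])
area = np.array([102.0, 498.0, 1015.0, 4950.0, 10100.0, 24800.0])

# Ordinary least squares treats all points equally, so under heteroscedastic
# noise the large concentrations dominate the fitted line.
ols = np.polyfit(conc, area, 1)

# Weighted least squares with 1/x weighting counteracts that influence:
# np.polyfit weights the unsquared residuals (w ~ 1/sigma), so w = 1/x
# corresponds to a standard deviation proportional to concentration.
w = 1.0 / conc
slope, intercept = np.polyfit(conc, area, 1, w=w)

# Back-calculate the lowest standard to check accuracy at the lower end.
recovered = (area[0] - intercept) / slope
```

With the weighted fit, the back-calculated low standard sits close to its nominal value, which is exactly the "improved accuracy at the lower end" the abstract describes.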

  19. Validation and comparison of dispersion models of RTARC DSS

    International Nuclear Information System (INIS)

    Duran, J.; Pospisil, M.

    2004-01-01

    RTARC DSS (Real Time Accident Release Consequences - Decision Support System) is a computer code developed at VUJE Trnava, Inc. (Stubna, M. et al, 1993). The code calculations include atmospheric transport and diffusion, dose assessment, evaluation and displaying of the affected zones, evaluation of the early health effects, concentration and dose rate time dependence at the selected sites, etc. The simulation of protective measures (sheltering, iodine administration) is also included. The aim of this paper is to present the process of validation of the RTARC dispersion models. RTARC includes models for calculation of releases over very short distances (Method Monte Carlo - MEMOC), short distances (Gaussian Straight-Line Model) and long distances (Puff Trajectory Model - PTM). Validation of the code RTARC was performed using the results of the comparisons and experiments summarized in Table 1 (experiment or comparison - distance - model): wind tunnel experiments (Universitaet der Bundeswehr, Muenchen) - area of the NPP - Method Monte Carlo; INEL (Idaho National Engineering Laboratory) multi-tracer atmospheric experiment - short/medium distances - Gaussian model and PTM; Model Validation Kit - short distances - Gaussian model; STEP II.b 'Realistic Case Studies' - long distances - PTM; ENSEMBLE comparison - long distances - PTM. (orig.)
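    The Gaussian straight-line model used for short distances has a standard closed form. The sketch below is a generic textbook implementation, not the RTARC code, and the dispersion parameters sigma_y and sigma_z are assumed inputs (in practice they come from stability-class correlations):

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian straight-line plume concentration.

    q                : source strength (g/s)
    u                : wind speed (m/s)
    y, z             : crosswind and vertical receptor coordinates (m)
    h                : effective release height (m)
    sigma_y, sigma_z : dispersion parameters at the downwind distance of
                       interest (m), assumed given here
    Returns concentration in g/m^3.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centreline, ground-level concentration for a 50 m stack, with sigma values
# roughly representative of ~1 km downwind in neutral conditions (assumed).
c = gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0, h=50.0, sigma_y=80.0, sigma_z=40.0)
```

Wind tunnel and tracer experiments of the kind listed in Table 1 validate exactly these ingredients: the dispersion parameters and the treatment of reflection near the source.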

  20. Concurrent Validation of Experimental Army Enlisted Personnel Selection and Classification Measures

    National Research Council Canada - National Science Library

    Knapp, Deirdre J; Tremble, Trueman R

    2007-01-01

    .... This report documents the method and results of the criterion-related validation. The predictor set includes measures of cognitive ability, temperament, psychomotor skills, values, expectations...

  1. A chiral capillary electrophoresis method for ropivacaine hydrochloride in pharmaceutical formulations : Validation and comparison with chiral liquid chromatography

    NARCIS (Netherlands)

    Sänger-Van De Griend, C. E.; Wahlström, H.; Gröningsson, K.; Widahl-Näsman, Monica E.

    A capillary electrophoresis method for the determination of the enantiomeric purity of the local anaesthetic ropivacaine hydrochloride in injection solutions has been validated. The method showed the required limit of quantitation of 0.1% enantiomeric impurity. Good performances were shown for

  2. Sensitivity and uncertainty analyses applied to criticality safety validation, methods development. Volume 1

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Childs, R.L.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The S/U methods that are presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the D parameters, which represent the differences by group of sensitivity profiles, and the ck parameters, which are the correlation coefficients for the calculational uncertainties between systems; each set of parameters gives information relative to the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application)
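    In S/U methods of this kind, the ck parameter is typically formed from the two systems' sensitivity vectors and the cross-section covariance matrix. The sketch below illustrates that construction with a toy 3-group covariance; it is an assumed illustration, not the SCALE implementation:

```python
import numpy as np

def ck(s1, s2, cov):
    """Correlation coefficient of calculational uncertainty between two systems.

    s1, s2 : group-wise sensitivity vectors of k-eff to the nuclear data
    cov    : relative covariance matrix of that nuclear data
    """
    var1 = s1 @ cov @ s1
    var2 = s2 @ cov @ s2
    return (s1 @ cov @ s2) / np.sqrt(var1 * var2)

# Toy 3-group covariance (symmetric, positive definite) and sensitivities.
cov = np.array([[4.0, 1.0, 0.0],
                [1.0, 9.0, 2.0],
                [0.0, 2.0, 16.0]]) * 1e-4
s_experiment = np.array([0.2, 0.5, 0.3])
s_application = np.array([0.2, 0.5, 0.3])

# Identical sensitivity profiles give ck = 1: the experiment and the
# application share all of their data-induced uncertainty.
similarity = ck(s_experiment, s_application, cov)
```

Values of ck near 1 indicate that a critical experiment exercises the same nuclear data, in the same way, as the real-world application, which is what makes it useful for establishing an area of applicability.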

  3. A Validated RP-HPLC Method for Simultaneous Estimation of Atenolol and Indapamide in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    G. Tulja Rani

    2011-01-01

    Full Text Available A simple, fast, precise, selective and accurate RP-HPLC method was developed and validated for the simultaneous determination of atenolol and indapamide from bulk and formulations. Chromatographic separation was achieved isocratically on a Waters C18 column (250×4.6 mm, 5 µm particle size) using a mobile phase of methanol and water (adjusted to pH 2.7 with 1% orthophosphoric acid) in the ratio of 80:20. The flow rate was 1 mL/min and the effluent was detected at 230 nm. The retention times of atenolol and indapamide were 1.766 min and 3.407 min, respectively. Linearity was observed in the concentration range of 12.5-150 µg/mL for atenolol and 0.625-7.5 µg/mL for indapamide. Percent recoveries obtained for the two drugs were 99.74-100.06% and 98.65-99.98%, respectively. The method was validated according to the ICH guidelines with respect to specificity, linearity, accuracy, precision and robustness. The method developed can be used for the routine analysis of atenolol and indapamide from their combined dosage form.

  4. Validation of Serious Games

    Directory of Open Access Journals (Sweden)

    Katinka van der Kooij

    2015-09-01

    Full Text Available The application of games for behavioral change has seen a surge in popularity but evidence on the efficacy of these games is contradictory. Anecdotal findings seem to confirm their motivational value whereas most quantitative findings from randomized controlled trials (RCT are negative or difficult to interpret. One cause for the contradictory evidence could be that the standard RCT validation methods are not sensitive to serious games’ effects. To be able to adapt validation methods to the properties of serious games we need a framework that can connect properties of serious game design to the factors that influence the quality of quantitative research outcomes. The Persuasive Game Design model [1] is particularly suitable for this aim as it encompasses the full circle from game design to behavioral change effects on the user. We therefore use this model to connect game design features, such as the gamification method and the intended transfer effect, to factors that determine the conclusion validity of an RCT. In this paper we will apply this model to develop guidelines for setting up validation methods for serious games. This way, we offer game designers and researchers handles on how to develop tailor-made validation methods.

  5. The Numerical Welding Simulation - Developments and Validation of Simplified and Bead Lumping Methods

    International Nuclear Information System (INIS)

    Baup, Olivier

    2001-01-01

    The aim of this work was to study the TIG multipass welding process on stainless steel by means of numerical methods, and then to work out simplified and bead-lumping methods in order to reduce the set-up and run times of these calculations. A simulation was used as a reference for the validation of these methods; after presentation of the test series that led to the modelling choices of this calculation (2D generalised plane strain, an elastoplastic model with isotropic hardening, hardening restoration at high temperatures), various simplifications were tried on a plate geometry. These simplifications concerned various modelling points while maintaining a correct representation of plastic flow in the plate. The use of a reduced number of thermal fields characterising the bead deposit and a small number of tensile curves gives interesting results while significantly decreasing computing times. In addition, various bead-lumping methods were studied, concerning both the shape and the thermal representation of the macro-deposits. The macro-deposit shapes studied are L-shaped, layer-shaped, or represent two beads one on top of the other. Among these three methods, only those using a very small number of lumped beads gave poor results, since the thermo-mechanical history was deeply modified near and inside the weld. Thereafter, the simplified methods were applied to a tubular geometry. On this new geometry, experimental measurements were made during welding, which allowed a validation of the reference calculation. The simplified and reference calculations gave approximately the same stress fields as found for the plate geometry. Finally, in the last part of this document, a procedure for automatic data set-up is presented, which significantly reduces the preparation of the calculation phase. It has been applied to the calculation of thick-pipe welding in 90 beads; the results are compared with a simplified simulation realised by Framatome and with experimental measurements. A bead by

  6. Comparison and validation of statistical methods for predicting power outage durations in the event of hurricanes.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Quiring, Steven M

    2011-12-01

    This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Being able to make accurate predictions of power outage durations is valuable because the information can be used by utility companies to plan their restoration efforts more efficiently. This information can also help inform customers and public agencies of the expected outage times, enabling better collective response planning, and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes with the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage duration times caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazard models (Cox PH)) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate additive regression splines). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy. © 2011 Society for Risk Analysis.
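    The held-out-storm design described above can be sketched generically. The data and the two stand-in models below (a climatological mean and a linear fit, not the article's AFT/Cox/BART models) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training storm: outage duration (h) vs. wind speed (m/s).
wind_train = rng.uniform(20, 60, 200)
dur_train = 0.8 * wind_train + rng.normal(0, 5, 200)

# A different storm held out entirely for validation, mirroring the
# article's design of validating against other hurricanes.
wind_test = rng.uniform(20, 60, 50)
dur_test = 0.8 * wind_test + rng.normal(0, 5, 50)

# Two stand-in "models": a climatological mean and a linear fit.
mean_pred = np.full_like(dur_test, dur_train.mean())
slope, intercept = np.polyfit(wind_train, dur_train, 1)
linear_pred = slope * wind_test + intercept

def mae(pred, actual):
    """Mean absolute error on the held-out storm."""
    return np.abs(pred - actual).mean()

# The model with the lower out-of-sample error is preferred.
mae_mean, mae_linear = mae(mean_pred, dur_test), mae(linear_pred, dur_test)
```

Scoring on a storm the model never saw is what distinguishes predictive accuracy from in-sample goodness of fit, which is the article's central methodological point.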

  7. Validation of asthma recording in electronic health records: a systematic review

    Directory of Open Access Journals (Sweden)

    Nissen F

    2017-12-01

    Full Text Available Francis Nissen,1 Jennifer K Quint,2 Samantha Wilkinson,1 Hana Mullerova,3 Liam Smeeth,1 Ian J Douglas1 1Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK; 2National Heart and Lung Institute, Imperial College, London, UK; 3RWD & Epidemiology, GSK R&D, Uxbridge, UK Objective: To describe the methods used to validate asthma diagnoses in electronic health records and summarize the results of the validation studies. Background: Electronic health records are increasingly being used for research on asthma to inform health services and health policy. Validation of the recording of asthma diagnoses in electronic health records is essential if these databases are to be used for credible epidemiological asthma research. Methods: We searched the EMBASE and MEDLINE databases for studies that validated asthma diagnoses detected in electronic health records up to October 2016. Two reviewers independently assessed the full text against the predetermined inclusion criteria. Key data including author, year, data source, case definitions, reference standard, and validation statistics (including sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) were summarized in two tables. Results: Thirteen studies met the inclusion criteria. Most studies demonstrated high validity using at least one case definition (PPV >80%). Ten studies used manual validation as the reference standard; each had at least one case definition with a PPV of at least 63%, up to 100%. We also found two studies using a second independent database to validate asthma diagnoses. The PPVs of the best performing case definitions ranged from 46% to 58%. We found one study which used a questionnaire as the reference standard to validate a database case definition; the PPV of the case definition algorithm in this study was 89%. Conclusion: Attaining high PPVs (>80%) is possible using each of the discussed validation

  8. Validity Evaluation of the Assessment Method for Postural Loading on the Upper Body in Printing Industry

    Directory of Open Access Journals (Sweden)

    Mohammad Khandan

    2016-07-01

    Full Text Available Background and Objectives: Musculoskeletal disorders and injuries are a global occupational challenge. These injuries are concentrated in the upper limb. There are several methods to assess this kind of disorder, each of which has a different efficiency for various jobs based on its strengths and weaknesses. This study aimed to assess the validity of the LUBA method for evaluating risk factors for musculoskeletal disorders in a printing industry in Qom province, 2014. Methods: In this descriptive cross-sectional study, all operational workers (n=94) were investigated in 2014. The Nordic Musculoskeletal Questionnaire (NMQ) was used to collect data on musculoskeletal disorders. We also used the LUBA method to analyze postures in four different parts of the body (neck, shoulder, elbow, and wrist). The obtained data were analyzed using Mann-Whitney, Kruskal-Wallis, and Kappa agreement tests. Results: The lumbar region of the back, with 35.1% prevalence, had the most problems. The LUBA method showed that most postures were located at the second corrective action level and need further study. Agreement between the assessment of shoulder posture and its disorders was significant (p<0.05). Conclusion: According to the results of this study on the reliability and predictive validity of the LUBA method in the printing industry, it can be concluded that this method is not a reliable method for posture assessment; further and more comprehensive studies are recommended.
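    The Kappa agreement statistic used above corrects the observed agreement between two classifications for the agreement expected by chance. A minimal sketch with hypothetical 2×2 counts (not the study's data):

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square agreement table of counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n
    # Chance agreement from the marginal distributions of the two ratings.
    p_chance = (table.sum(axis=0) * table.sum(axis=1)).sum() / n**2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical counts for 94 workers: rows = posture method high/low risk,
# columns = reported disorder yes/no.
kappa = cohens_kappa([[20, 10],
                      [15, 49]])
```

Kappa near 0 means the posture tool agrees with reported disorders no better than chance, which is the kind of evidence behind the study's conclusion about predictive validity.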

  9. Method development and validation for simultaneous determination of IEA-R1 reactor’s pool water uranium and silicon content by ICP OES

    Science.gov (United States)

    Ulrich, J. C.; Guilhen, S. N.; Cotrim, M. E. B.; Pires, M. A. F.

    2018-03-01

    IPEN’s research reactor, IEA-R1, is an open pool-type research reactor moderated and cooled by light water. High-quality water is a key factor in preventing corrosion of the spent fuel stored in the pool. Leaching of radionuclides from corroded fuel cladding may be prevented by an efficient water treatment and purification system. However, as a safety management policy, IPEN has adopted a water chemistry control programme which periodically monitors the levels of uranium (U) and silicon (Si) in the reactor’s pool, since IEA-R1 employs U3Si2-Al dispersion fuel. An analytical method was developed and validated for the determination of uranium and silicon by ICP OES. This work describes the validation process, in a context of quality assurance, including the parameters selectivity, linearity, quantification limit, precision and recovery.

  10. Development and Single-Laboratory Validation of a Liquid Chromatography Tandem Mass Spectrometry Method for Quantitation of Tetrodotoxin in Mussels and Oysters.

    Science.gov (United States)

    Turner, Andrew D; Boundy, Michael J; Rapkova, Monika Dhanji

    2017-09-01

    In recent years, evidence has grown for the presence of tetrodotoxin (TTX) in bivalve mollusks, leading to the potential for consumers of contaminated products to be affected by Tetrodotoxin Shellfish Poisoning (TSP). A single-laboratory validation was conducted for the hydrophilic interaction LC (HILIC) tandem MS (MS/MS) analysis of TTX in common mussels and Pacific oysters-the bivalve species that have been found to contain TTXs in the United Kingdom in recent years. The method consists of a single-step dispersive extraction in 1% acetic acid, followed by a carbon SPE cleanup step before dilution and instrumental analysis. The full method was developed as a rapid tool for the quantitation of TTX, as well as for the associated analogs 4-epi-TTX; 5,6,11-trideoxy TTX; 11-nor TTX-6-ol; 5-deoxy TTX; and 4,9-anhydro TTX. The method can also be run as the acquisition of TTX together with paralytic shellfish toxins. Results demonstrated acceptable method performance characteristics for specificity, linearity, recovery, ruggedness, repeatability, matrix variability, and within-laboratory reproducibility for the analysis of TTX. The LOD and LOQ were fit-for-purpose in comparison to the current action limit for TTX enforced in The Netherlands. In addition, aspects of method performance (LOD, LOQ, and within-laboratory reproducibility) were found to be satisfactory for three other TTX analogs (11-nor TTX-6-ol, 5-deoxy TTX, and 4,9-anhydro TTX). The method was found to be practical and suitable for use in regulatory testing, providing rapid turnaround of sample analysis. Plans currently underway on a full collaborative study to validate a HILIC-MS/MS method for paralytic shellfish poisoning toxins will be extended to include TTX in order to generate international acceptance, ultimately for use as an alternative official control testing method should regulatory controls be adopted.

  11. Validating a High Performance Liquid Chromatography-Ion Chromatography (HPLC-IC) Method with Conductivity Detection After Chemical Suppression for Water Fluoride Estimation.

    Science.gov (United States)

    Bondu, Joseph Dian; Selvakumar, R; Fleming, Jude Joseph

    2018-01-01

    A variety of methods, including the Ion Selective Electrode (ISE), have been used for estimating fluoride levels in drinking water, but as these methods suffer from many drawbacks, the newer IC method has replaced many of them. The study aimed (1) to validate IC for estimation of fluoride levels in drinking water and (2) to assess the drinking water fluoride levels of villages in and around Vellore district using IC. Forty-nine paired drinking water samples were measured using the ISE and IC methods (Metrohm). Water samples from 165 randomly selected villages in and around Vellore district were collected for fluoride estimation over 1 year. Standardization of the IC method showed good within-run precision, linearity and coefficient of variance, with correlation coefficient R² = 0.998. The limit of detection was 0.027 ppm and the limit of quantification was 0.083 ppm. Among the 165 villages, 46.1% recorded water fluoride levels >1.00 ppm, of which 19.4% had levels ranging from 1 to 1.5 ppm, 10.9% had levels of 1.5-2 ppm and about 12.7% had levels of 2.0-3.0 ppm. Three percent of villages had more than 3.0 ppm fluoride in the water tested. Most (44.42%) of these villages belonged to Jolarpet taluk, with moderate to high (0.86-3.56 ppm) water fluoride levels. The Ion Chromatography method has been validated and is therefore a reliable method for assessment of fluoride levels in drinking water. The residents of Jolarpet taluk (Vellore district) are at high risk of developing dental and skeletal fluorosis.
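    Calibration-based detection and quantification limits like those reported above are commonly derived per ICH Q2(R1) as 3.3σ/S and 10σ/S, where σ is the residual standard deviation of the calibration regression and S its slope. A sketch with hypothetical fluoride calibration data (not the study's):

```python
import numpy as np

# Hypothetical fluoride calibration: standard (ppm) vs. detector response.
conc = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0])
resp = np.array([0.52, 1.27, 2.49, 5.05, 9.98, 25.1])

# Straight-line calibration fit.
slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # residual standard deviation of the regression

# ICH Q2(R1) calibration-curve approach.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
```

By construction LOQ is about three times LOD (10/3.3), which is the same proportion seen in the reported 0.027 and 0.083 ppm figures.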

  12. Development and validity of a method for the evaluation of printed education material.

    Directory of Open Access Journals (Sweden)

    Castro MS

    2007-06-01

    Full Text Available Objectives: To develop and study the validity of an instrument for the evaluation of Printed Education Materials (PEM); to evaluate the use of acceptability indices; and to identify possible influences of professional characteristics. Methods: An instrument for PEM evaluation was developed which included three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposively introduced into the PEM, which served as extreme groups. An acceptability index was applied taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results: Many professionals identified the crude intentional errors. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required literature analysis. Physicians considered 95.8% of the items of the PEM acceptable, and nurses 29.2%. The differences between the scores were statistically significant for 27% of the items. In the overall evaluation, 66.6% were considered acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions: The use of instruments for the evaluation of printed education materials is required and may improve the quality of the PEM available to patients. The acceptability indices are not always totally correct, nor do they always represent high quality of information. The professional experience, practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material.

  13. Validation of the Gatortail method for accurate sizing of pulmonary vessels from 3D medical images.

    Science.gov (United States)

    O'Dell, Walter G; Gormaley, Anne K; Prida, David A

    2017-12-01

    Detailed characterization of changes in vessel size is crucial for the diagnosis and management of a variety of vascular diseases. Because clinical measurement of vessel size is typically dependent on the radiologist's subjective interpretation of the vessel borders, it is often prone to high inter- and intra-user variability. Automatic methods of vessel sizing have been developed for two-dimensional images but a fully three-dimensional (3D) method suitable for vessel sizing from volumetric X-ray computed tomography (CT) or magnetic resonance imaging has heretofore not been demonstrated and validated robustly. In this paper, we refined and objectively validated Gatortail, a method that creates a mathematical geometric 3D model of each branch in a vascular tree, simulates the appearance of the virtual vascular tree in a 3D CT image, and uses the similarity of the simulated image to a patient's CT scan to drive the optimization of the model parameters, including vessel size, to match that of the patient. The method was validated with a 2-dimensional virtual tree structure under deformation, and with a realistic 3D-printed vascular phantom in which the diameter of 64 branches were manually measured 3 times each. The phantom was then scanned on a conventional clinical CT imaging system and the images processed with the in-house software to automatically segment and mathematically model the vascular tree, label each branch, and perform the Gatortail optimization of branch size and trajectory. Previously proposed methods of vessel sizing using matched Gaussian filters and tubularity metrics were also tested. The Gatortail method was then demonstrated on the pulmonary arterial tree segmented from a human volunteer's CT scan. The standard deviation of the difference between the manually measured and Gatortail-based radii in the 3D physical phantom was 0.074 mm (0.087 in-plane pixel units for image voxels of dimension 0.85 × 0.85 × 1.0 mm) over the 64 branches

  14. Validity of Various Methods for Determining Velocity, Force, and Power in the Back Squat.

    Science.gov (United States)

    Banyard, Harry G; Nosaka, Ken; Sato, Kimitake; Haff, G Gregory

    2017-10-01

    To examine the validity of 2 kinematic systems for assessing mean velocity (MV), peak velocity (PV), mean force (MF), peak force (PF), mean power (MP), and peak power (PP) during the full-depth free-weight back squat performed with maximal concentric effort. Ten strength-trained men (26.1 ± 3.0 y, 1.81 ± 0.07 m, 82.0 ± 10.6 kg) performed three 1-repetition-maximum (1RM) trials on 3 separate days, encompassing lifts performed at 6 relative intensities including 20%, 40%, 60%, 80%, 90%, and 100% of 1RM. Each repetition was simultaneously recorded by a PUSH band and commercial linear position transducer (LPT) (GymAware [GYM]) and compared with measurements collected by a laboratory-based testing device consisting of 4 LPTs and a force plate. Trials 2 and 3 were used for validity analyses. Combining all 120 repetitions indicated that the GYM was highly valid for assessing all criterion variables while the PUSH was only highly valid for estimations of PF (r = .94, CV = 5.4%, ES = 0.28, SEE = 135.5 N). At each relative intensity, the GYM was highly valid for assessing all criterion variables except for PP at 20% (ES = 0.81) and 40% (ES = 0.67) of 1RM. Moreover, the PUSH was only able to accurately estimate PF across all relative intensities (r = .92-.98, CV = 4.0-8.3%, ES = 0.04-0.26, SEE = 79.8-213.1 N). PUSH accuracy for determining MV, PV, MF, MP, and PP across all 6 relative intensities was questionable for the back squat, yet the GYM was highly valid at assessing all criterion variables, with some caution given to estimations of MP and PP performed at lighter loads.
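    Validity statistics of the kind reported above (r, CV, ES, SEE) can be computed from paired device/criterion measurements. The data below are hypothetical, and the exact CV and ES definitions vary between studies; this sketch uses the typical-error CV, a Cohen's-d-style standardized mean bias, and a regression-based SEE:

```python
import numpy as np

def validity_stats(device, criterion):
    """Pearson r, CV%, effect size, and SEE for paired measurements."""
    device, criterion = np.asarray(device), np.asarray(criterion)
    r = np.corrcoef(device, criterion)[0, 1]
    diff = device - criterion
    # Typical-error CV: SD of the differences / sqrt(2), as % of the mean.
    cv = (diff.std(ddof=1) / np.sqrt(2)) / criterion.mean() * 100
    # Standardized mean bias (Cohen's-d-style effect size).
    pooled_sd = np.sqrt((device.std(ddof=1)**2 + criterion.std(ddof=1)**2) / 2)
    es = abs(diff.mean()) / pooled_sd
    # Standard error of estimate from the correlation.
    see = criterion.std(ddof=1) * np.sqrt(1 - r**2)
    return r, cv, es, see

# Hypothetical peak-force pairs (N): wearable device vs. laboratory force plate.
device = [2010, 2250, 1980, 2410, 2150, 2320, 2055, 2190]
plate = [1985, 2230, 2005, 2380, 2120, 2305, 2040, 2170]
r, cv, es, see = validity_stats(device, plate)
```

High r with a small CV, ES, and SEE is the combination the study treats as "highly valid"; a device can correlate well and still be biased, which is why all four are reported together.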

  15. Validity of Dietary Assessment in Athletes: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Louise Capling

    2017-12-01

    Full Text Available Dietary assessment methods that are recognized as appropriate for the general population are usually applied in a similar manner to athletes, despite the knowledge that sport-specific factors can complicate assessment and impact accuracy in unique ways. As dietary assessment methods are used extensively within the field of sports nutrition, there is concern that the validity of these methodologies has not undergone more rigorous evaluation in this unique population sub-group. The purpose of this systematic review was to compare two or more methods of dietary assessment, including dietary intake measured against biomarkers or reference measures of energy expenditure, in athletes. Six electronic databases were searched for English-language, full-text articles published from January 1980 until June 2016. The search strategy combined the following keywords: diet, nutrition assessment, athlete, and validity; where the reported outcomes included, but were not limited to: energy intake, macro- and/or micronutrient intake, food intake, nutritional adequacy, diet quality, or nutritional status. Meta-analysis was performed on studies with sufficient methodological similarity, with between-group standardized mean differences (or effect sizes) and 95% confidence intervals (CI) being calculated. Of the 1624 studies identified, 18 were eligible for inclusion. Studies comparing self-reported energy intake (EI) to energy expenditure assessed via doubly labelled water were grouped for comparison (n = 11) and demonstrated that mean EI was under-estimated by 19% (−2793 ± 1134 kJ/day). Meta-analysis revealed a large pooled effect size of −1.006 (95% CI: −1.3 to −0.7; p < 0.001). The remaining studies (n = 7) compared a new dietary tool or instrument to one or more reference methods (e.g., food record, 24-h dietary recall, biomarker) as part of a validation study. This systematic review revealed there are limited robust studies evaluating dietary assessment methods in athletes. Existing
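    A pooled effect size of this kind is computed by inverse-variance weighting of the per-study standardized mean differences. The per-study values below are hypothetical (the review pooled 11 EI/DLW comparisons), and a fixed-effect model is assumed purely for simplicity:

```python
import numpy as np

# Hypothetical per-study standardized mean differences (reported energy intake
# vs. doubly labelled water) and their variances; negative = under-reporting.
smd = np.array([-0.8, -1.2, -0.9, -1.4, -0.7])
var = np.array([0.10, 0.08, 0.12, 0.15, 0.09])

# Inverse-variance (fixed-effect) pooling: precise studies get more weight.
w = 1.0 / var
pooled = (w * smd).sum() / w.sum()
pooled_se = np.sqrt(1.0 / w.sum())
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
```

A pooled SMD around −1 with a confidence interval excluding zero is the pattern the review reports: consistent, large-magnitude under-reporting of energy intake by athletes.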

  16. Systems and Methods for Fabricating Structures Including Metallic Glass-Based Materials Using Low Pressure Casting

    Science.gov (United States)

    Hofmann, Douglas C. (Inventor); Kennett, Andrew (Inventor)

    2018-01-01

    Systems and methods to fabricate objects including metallic glass-based materials using low-pressure casting techniques are described. In one embodiment, a method of fabricating an object that includes a metallic glass-based material includes: introducing molten alloy into a mold cavity defined by a mold using a low enough pressure such that the molten alloy does not conform to features of the mold cavity that are smaller than 100 microns; and cooling the molten alloy such that it solidifies, the solid including a metallic glass-based material.

  17. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    Science.gov (United States)

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions, other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which impacts the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis. These factors included: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min) and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min, with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil-contaminated soil. A reduced analytical time would offer many benefits, including improved laboratory reporting times, and overall improved clean up
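    A six-factor, two-level screening like the one described need not run all 64 combinations. The sketch below builds a 2^(6-2) resolution IV fractional factorial (generators E = ABC and F = BCD, a standard but here assumed choice, not necessarily the paper's design), which screens six factors in 16 runs:

```python
import itertools
import numpy as np

# Full 2^4 design in coded units (-1/+1) for base factors A-D.
base = np.array(list(itertools.product([-1, 1], repeat=4)))
A, B, C, D = base.T

# Generate the remaining two factor columns from interactions: E = ABC, F = BCD.
# This yields a 2^(6-2) design: 16 runs instead of the full 64.
E = A * B * C
F = B * C * D

# Columns could map to injection volume, injection temperature, oven program,
# detector temperature, carrier gas flow rate, and solvent ratio.
design = np.column_stack([A, B, C, D, E, F])
```

Each column is balanced and orthogonal to every other, so main effects can be estimated independently; the price of the fraction is aliasing of main effects with higher-order interactions, which is acceptable at the screening stage.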

  18. A simple validated multi-analyte method for detecting drugs in oral fluid by ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS).

    Science.gov (United States)

    Zheng, Yufang; Sparve, Erik; Bergström, Mats

    2018-06-01

    A UPLC-MS/MS method was developed to identify and quantitate 37 commonly abused drugs in oral fluid. Drugs of interest included amphetamines, benzodiazepines, cocaine, opiates, opioids, phencyclidine and tetrahydrocannabinol. Sample preparation and extraction are simple, and analysis times are short. Validation showed satisfactory performance at relevant concentrations. The possibility of contaminated samples, as well as the interpretation in relation to well-known matrices such as urine, will demand further study. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Development of a validated HPLC method for the determination of sennoside A and B, two major constituents of Cassia obovata Coll.

    Directory of Open Access Journals (Sweden)

    Ghassemi-Dehkordi Nasrollah

    2014-04-01

    Full Text Available Introduction: Cassia obovata Coll is the only Senna species which grows wild in Iran. In the present study, an optimised reverse-phase High Performance Liquid Chromatography (HPLC) method was established and validated for quantification of sennosides A and B, the major constituents of C. obovata, in a simple and accurate manner. Methods: HPLC analysis was done using a Waters 515 pump on a Nova-Pak C18 (3.9 × 150 mm) column. Millennium software was used for the determination of sennosides A and B in Cassia species and for processing the information. The method was validated according to USP 32 requirements. Results: The impact of the solvent on the selectivity factor and partition coefficient parameters was evaluated. Using a conventional RP-18 L1 column, 3.9 × 150 mm, the mobile phase was selected after several trials with different mixtures of water and acetonitrile. Sennosides A and B were determined using the external standard calibration method. Using USP 35-NF 30, the LOD and LOQ were calculated. The reliability of the HPLC method for analysis of sennosides A + B was validated through its linearity, reproducibility, repeatability, and recovery. Finally, ethanol:water (1:1) extracts of Cassia obovata and Cassia angustifolia were standardized by assay of sennosides A and B through the above validated HPLC method. Conclusion: With the above method, determination of sennosides in Cassia species is entirely feasible. Moreover, comparison of the results shows that, although Cassia angustifolia is richer in sennosides, C. obovata could be considered an alternative source of sennosides A and B.
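    The external standard calibration and the LOD/LOQ computation mentioned above can be sketched as follows. All concentrations and peak areas are invented for illustration; LOD and LOQ are estimated from the residual standard deviation of the fit (3.3·s/slope and 10·s/slope), one common convention, not necessarily the exact USP 35-NF 30 procedure used in the paper.

```python
# External-standard calibration sketch with invented data.
# Concentrations (µg/mL) and peak areas are illustrative, not the paper's.
conc = [5, 10, 20, 40, 80]
area = [12.1, 24.3, 48.0, 96.9, 193.5]
n = len(conc)
mx, my = sum(conc) / n, sum(area) / n
# ordinary least-squares slope and intercept
slope = sum((x - mx) * (y - my) for x, y in zip(conc, area)) / sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx
# residual standard deviation of the fit, then LOD/LOQ estimates
resid = [y - (slope * x + intercept) for x, y in zip(conc, area)]
s = (sum(r * r for r in resid) / (n - 2)) ** 0.5
lod, loq = 3.3 * s / slope, 10 * s / slope
unknown_area = 60.0
print(round((unknown_area - intercept) / slope, 2))  # back-calculated concentration
```

    An unknown's concentration is read back through the fitted line; reporting below the LOQ would not be quantitative under this convention.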

  20. Solar cells, structures including organometallic halide perovskite monocrystalline films, and methods of preparation thereof

    KAUST Repository

    Bakr, Osman; Peng, Wei; Wang, Lingfei

    2017-01-01

    Embodiments of the present disclosure provide for solar cells including an organometallic halide perovskite monocrystalline film (see fig. 1.1B), other devices including the organometallic halide perovskite monocrystalline film, methods of making

  1. Validation of methods for measurement of insulin secretion in humans in vivo

    DEFF Research Database (Denmark)

    Kjems, L L; Christiansen, E; Vølund, A

    2000-01-01

    To detect and understand the changes in beta-cell function in the pathogenesis of type 2 diabetes, an accurate and precise estimation of prehepatic insulin secretion rate (ISR) is essential. There are two common methods to assess ISR, the deconvolution method (by Eaton and Polonsky), considered the ... of these mathematical techniques for quantification of insulin secretion have been tested in dogs, but not in humans. In the present studies, we examined the validity of both methods to recover the known infusion rates of insulin and C-peptide mimicking ISR during an oral glucose tolerance test. ISR from both ..., and a close agreement was found for the results of an oral glucose tolerance test. We also studied whether C-peptide kinetics are influenced by somatostatin infusion. The decay curves after bolus injection of exogenous biosynthetic human C-peptide, the kinetic parameters, and the metabolic clearance rate were ...

  2. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we face the problem of comparing the results obtained from simulations with what the experiments actually observed. One way to address this is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal
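    A consistency test of the kind described, comparing a simulated distribution against a measured one, can be sketched with a chi-square statistic. This is a simplified illustration with invented counts; the library described in the abstract supports more elaborate tests.

```python
# Simplified chi-square consistency test between two histograms with
# identical binning, as used when comparing simulation with experiment.
# For two independent count histograms the standard per-bin term is
# (o - s)^2 / (o + s). All counts below are invented.
def chi2_two_histograms(obs, sim):
    chi2, ndf = 0.0, 0
    for o, s in zip(obs, sim):
        if o + s > 0:
            chi2 += (o - s) ** 2 / (o + s)
            ndf += 1
    return chi2, ndf

observed = [102, 98, 110, 95, 89]
simulated = [100, 101, 105, 97, 92]
chi2, ndf = chi2_two_histograms(observed, simulated)
print(chi2 / ndf < 2.0)  # crude agreement check on reduced chi-square
```

    In a real validation suite the reduced chi-square would be converted to a p-value and compared against a preset significance level rather than a fixed cutoff.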

  3. Implementation of the 3Rs (refinement, reduction, and replacement): validation and regulatory acceptance considerations for alternative toxicological test methods.

    Science.gov (United States)

    Schechtman, Leonard M

    2002-01-01

    Toxicological testing in the current regulatory environment is steeped in a history of using animals to answer questions about the safety of products to which humans are exposed. That history forms the basis for the testing strategies that have evolved to satisfy the needs of the regulatory bodies that render decisions that affect, for the most part, virtually all phases of premarket product development and evaluation and, to a lesser extent, postmarketing surveillance. Only relatively recently have the levels of awareness of, and responsiveness to, animal welfare issues reached current proportions. That paradigm shift, although sluggish, has nevertheless been progressive. New and alternative toxicological methods for hazard evaluation and risk assessment have now been adopted and are being viewed as a means to address those issues in a manner that considers humane treatment of animals yet maintains scientific credibility and preserves the goal of ensuring human safety. To facilitate this transition, regulatory agencies and regulated industry must work together toward improved approaches. They will need assurance that the methods will be reliable and the results comparable with, or better than, those derived from the current classical methods. That confidence will be a function of the scientific validation and resultant acceptance of any given method. In the United States, to fulfill this need, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and its operational center, the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), have been constituted as prescribed in federal law. Under this mandate, ICCVAM has developed a process and established criteria for the scientific validation and regulatory acceptance of new and alternative methods. The role of ICCVAM in the validation and acceptance process and the criteria instituted toward that end are described. Also

  4. Validation of the LWR-EIR methods for the evaluation of compact beds

    International Nuclear Information System (INIS)

    Foskolos, K.; Grimm, P.; Maeder, C.; Paratte, J.M.

    1983-10-01

    The EIR code system for the calculation of light water reactors is presented and the methods used are briefly described. The application of the system to various types of critical experiments and benchmark problems demonstrates its good precision, even for heterogeneous configurations with strong neutron absorbers such as Boral. As the accuracy of the multiplication factor ksub(eff) is always better than 0.5% for normal LWR configurations, this code system is validated for the calculation of such configurations with a safety margin of 1.5% on ksub(eff). (Auth.)

  5. Development and validation of a spectrophotometry method for the determination of histamine in fresh tuna (Thunnus tunna)

    International Nuclear Information System (INIS)

    Chacon-Silva, Fainier; Barquero-Quiros, Miriam

    2002-01-01

    Histamine in foods can promote allergic reactions in sensitive persons. A colorimetric microscale method for histamine determination was developed and validated. Cu2+-histamine chelation occurs at pH 9.5. Dichloromethane extraction of the complex, as the salt with tetrabromophenolphthalein ethyl ester, allows photometric quantitation at 515 nm. The validation of the micro method was accomplished through its performance parameters: detection limit, quantitation limit, sensitivity, linearity, precision, and recovery. This methodology was applied to twenty raw tuna samples collected in the San Jose metropolitan area. It was found that 45% of the analyzed samples had a histamine content in the range of 100-200 mg/kg. This finding indicates bacterial contamination; 15% of the samples analyzed were over the 500 mg/kg FDA level of sanitary risk. (Author) [es

  6. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    Science.gov (United States)

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

    With the establishment by CODEX of a 200 ng/g limit of inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. Also, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, the simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from new and FDA methods.
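    The Horwitz ratio (HorRat) quoted above is the observed reproducibility RSD divided by the RSD predicted by the Horwitz equation. A minimal sketch, with an invented observed RSD at an assumed mass fraction of 150 ng/g:

```python
import math

# Horwitz ratio (HorRat) check as used in inter-laboratory validation.
# Predicted reproducibility RSD (%) from the Horwitz equation:
# PRSD = 2^(1 - 0.5*log10(C)), with C the analyte mass fraction
# (e.g. 150 ng/g iAs = 1.5e-7). The 11% observed RSD is invented.
def horwitz_ratio(observed_rsd_percent, mass_fraction):
    predicted = 2 ** (1 - 0.5 * math.log10(mass_fraction))
    return observed_rsd_percent / predicted

print(round(horwitz_ratio(11.0, 1.5e-7), 2))
```

    Values between roughly 0.5 and 2 are generally taken as acceptable inter-laboratory performance, which is why the reported average of 0.52 meets the validation criteria.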

  7. Electrode assemblies, plasma apparatuses and systems including electrode assemblies, and methods for generating plasma

    Science.gov (United States)

    Kong, Peter C; Grandy, Jon D; Detering, Brent A; Zuck, Larry D

    2013-09-17

    Electrode assemblies for plasma reactors include a structure or device for constraining an arc endpoint to a selected area or region on an electrode. In some embodiments, the structure or device may comprise one or more insulating members covering a portion of an electrode. In additional embodiments, the structure or device may provide a magnetic field configured to control a location of an arc endpoint on the electrode. Plasma generating modules, apparatus, and systems include such electrode assemblies. Methods for generating a plasma include covering at least a portion of a surface of an electrode with an electrically insulating member to constrain a location of an arc endpoint on the electrode. Additional methods for generating a plasma include generating a magnetic field to constrain a location of an arc endpoint on an electrode.

  8. Development and validation of a stability indicating HPTLC-densitometric method for lafutidine

    Directory of Open Access Journals (Sweden)

    Dinesh Dhamecha

    2013-01-01

    Full Text Available Background: A simple, selective, precise, and stability-indicating high-performance thin layer chromatographic method has been established and validated for analysis of lafutidine in bulk drug and formulations. Materials and Methods: The compounds were analyzed on aluminum-backed silica gel 60 F254 plates with chloroform:ethanol:acetic acid (8:1:1) as mobile phase. Densitometric analysis of lafutidine was performed at 230 nm. Result: Regression analysis data for the calibration plots were indicative of a good linear relationship between response and concentration over the range 100-500 ng per spot. The correlation coefficient (r2) was 0.998±0.002. Conclusion: Lafutidine was subjected to acid, base, peroxide, and sunlight degradation. In stability tests, the drug was susceptible to acid and base hydrolysis, oxidation, and photodegradation.

  9. Development and Validation of New Spectrophotometric Methods to Determine Enrofloxacin in Pharmaceuticals

    Science.gov (United States)

    Rajendraprasad, N.; Basavaiah, K.

    2015-07-01

    Four spectrophotometric methods, based on oxidation with cerium(IV), are investigated and developed to determine enrofloxacin (EFX) in pure form and in dosage forms. The first and second methods (A and B) are direct: after the oxidation of EFX with cerium(IV) in acid medium, the absorbance of the reduced and unreacted oxidant is measured at 275 and 320 nm, respectively. In the third (C) and fourth (D) methods, after the reaction between EFX and the oxidant is complete, the surplus oxidant is treated with either N-phenylanthranilic acid (NPA) or Alizarin Red S (ARS) dye and the absorbance of the oxidized NPA or ARS is measured at 440 or 420 nm. The methods showed good linearity over the concentration ranges of 0.5-5.0, 1.25-12.5, 10.0-100.0, and 6.0-60.0 μg/ml for methods A, B, C and D, respectively, with apparent molar absorptivity values of 4.42 × 10^4, 8.7 × 10^3, 9.31 × 10^2, and 2.28 × 10^3 l/(mol·cm). The limits of detection (LOD), quantification (LOQ), and Sandell's sensitivity values and other validation results are also reported. The proposed methods are successfully applied to determine EFX in pure form and in dosage forms.
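    The apparent molar absorptivity values reported above follow from the Beer-Lambert law, A = ε·c·l, with the mass concentration converted to mol/L. A small sketch with illustrative numbers: the 1 cm cell, the absorbance value, and the test concentration are assumptions, as is the EFX molar mass of about 359.4 g/mol.

```python
# Apparent molar absorptivity from the Beer-Lambert law, A = eps * c * l.
# Values are illustrative, not the paper's measurements.
def molar_absorptivity(absorbance, conc_ug_per_ml, molar_mass_g_per_mol, path_cm=1.0):
    # µg/mL -> g/L -> mol/L
    conc_mol_per_l = conc_ug_per_ml * 1e-3 / molar_mass_g_per_mol
    return absorbance / (conc_mol_per_l * path_cm)

# assumed: A = 0.246 for 2.0 µg/mL enrofloxacin (M ~ 359.4 g/mol) in a 1 cm cell
eps = molar_absorptivity(0.246, 2.0, 359.4)
print(f"{eps:.3g} L/(mol*cm)")  # on the order of 10^4, as for method A
```

    Higher ε means a steeper calibration line per mole of analyte, which is why method A covers the lowest concentration range.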

  10. Validation of GOSAT XCO2 and XCH4 retrieved by PPDF-S method and evaluation of sensitivity of aerosols to gas concentrations

    Science.gov (United States)

    Iwasaki, C.; Imasu, R.; Bril, A.; Yokota, T.; Yoshida, Y.; Morino, I.; Oshchepkov, S.; Rokotyan, N.; Zakharov, V.; Gribanov, K.

    2017-12-01

    The photon path length probability density function-simultaneous (PPDF-S) method is an effective algorithm for retrieving column-averaged concentrations of carbon dioxide (XCO2) and methane (XCH4) from Greenhouse gases Observing SATellite (GOSAT) spectra in the Short Wavelength InfraRed (SWIR) [Oshchepkov et al., 2013]. In this study, we validated XCO2 and XCH4 retrieved by the PPDF-S method through comparison with Total Carbon Column Observing Network (TCCON) data [Wunch et al., 2011] from 26 sites, including the additional site of the Ural Atmospheric Station at Kourovka [57.038°N, 59.545°E], Russia. Validation against TCCON data shows that the bias and its standard deviation for the PPDF-S data are 0.48 and 2.10 ppm for XCO2, and -0.73 and 15.77 ppb for XCH4, respectively. The results for XCO2 are almost identical to those of Iwasaki et al. [2017], for which the validation data were limited to 11 selected sites. However, the bias of XCH4 has the opposite sign to that of Iwasaki et al. [2017]. Furthermore, the data at Kourovka showed different features, particularly for XCH4. In order to investigate the causes of these differences, we carried out simulation studies focusing mainly on the effects of aerosols, which modify the light path length of solar radiation [O'Brien and Rayner, 2002; Aben et al., 2007; Oshchepkov et al., 2008]. Based on simulation studies using the multiple radiative transfer code based on the Discrete Ordinate Method (DOM), Polarization System for Transfer of Atmospheric Radiation3 (Pstar3) [Ota et al., 2010], the sensitivity of the retrieved gas concentrations to aerosols was examined.
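    Bias and its standard deviation against TCCON, as quoted above, are simply the mean and standard deviation of retrieved-minus-reference differences over collocated soundings. A sketch with synthetic XCO2 values (the numbers are invented; only the definition of the statistics mirrors the abstract):

```python
import statistics

# Bias and scatter of retrieved XCO2 (ppm) against a TCCON-style reference.
# The paired values are synthetic; real validation uses collocation criteria
# in space and time before forming these differences.
retrieved = [402.1, 399.8, 405.3, 398.7, 401.9]
reference = [401.5, 400.2, 403.9, 399.1, 401.0]
diff = [r - t for r, t in zip(retrieved, reference)]
bias = statistics.mean(diff)   # systematic offset
sd = statistics.stdev(diff)    # single-sounding scatter
print(round(bias, 2), round(sd, 2))
```

    A nonzero bias points to a systematic retrieval error (e.g. aerosol-induced light-path modification), while the standard deviation quantifies sounding-to-sounding precision.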

  11. Cleaning Validation of Fermentation Tanks

    DEFF Research Database (Denmark)

    Salo, Satu; Friis, Alan; Wirtanen, Gun

    2008-01-01

    Reliable test methods for checking cleanliness are needed to evaluate and validate the cleaning process of fermentation tanks. Pilot-scale tanks were used to test the applicability of various methods for this purpose. The methods found to be suitable for validation of the cleanliness were visual...

  12. Content validity and reliability of the Copenhagen social relations questionnaire

    DEFF Research Database (Denmark)

    Lund, Rikke; Nielsen, Lene Snabe; Henriksen, Pia Wichmann

    2014-01-01

    OBJECTIVE: The aim of the present article is to describe the face and content validity as well as reliability of the Copenhagen Social Relations Questionnaire (CSRQ). METHOD: The face and content validity test was based on focus group discussions and individual interviews with 31 informants ... from the interviews. Two additional themes not covered by CSRQ, on dynamics and reciprocity of social relations, were identified. DISCUSSION: CSRQ holds satisfactory face and content validity as well as reliability, and is suitable for measuring the structure and function of social relations, including...

  13. Validation of a capillary electrophoresis method for the enantiomeric purity testing of ropivacaine, a new local anaesthetic compound

    NARCIS (Netherlands)

    Sänger-Van De Griend, C. E.; Gröningsson, K.

    A capillary electrophoresis method for the determination of the enantiomeric purity of ropivacaine, a new local anaesthetic compound developed by Astra Pain Control AB, has been validated. The method showed the required limit of quantitation of 0.1% enantiomeric impurity and proved to be robust.

  14. A Validated HPLC-DAD Method for Simultaneous Determination of Etodolac and Pantoprazole in Rat Plasma

    Directory of Open Access Journals (Sweden)

    Ali S. Abdelhameed

    2014-01-01

    Full Text Available A simple, sensitive, and accurate HPLC-DAD method has been developed and validated for the simultaneous determination of pantoprazole and etodolac in rat plasma as a tool for therapeutic drug monitoring. Optimal chromatographic separation of the analytes was achieved on a Waters Symmetry C18 column using a mobile phase that consisted of phosphate buffer pH ~4.0 as eluent A and acetonitrile as eluent B in a ratio of A:B, 55:45 v/v for 6 min, pumped isocratically at a flow rate of 0.8 mL min−1. The eluted analytes were monitored using a photodiode array detector set to quantify samples at 254 nm. The method was linear with r2 = 0.9999 for PTZ and r2 = 0.9995 for ETD over concentration ranges of 0.1-15 and 5-50 μg mL−1 for PTZ and ETD, respectively. The limits of detection were found to be 0.033 and 0.918 μg mL−1 for PTZ and ETD, respectively. The method was statistically validated for linearity, accuracy, precision, and selectivity following the International Conference on Harmonization (ICH) guidelines. The reproducibility of the method was reliable, with intra- and interday precision (%RSD) < 7.76% for PTZ and < 7.58% for ETD.

  15. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    Directory of Open Access Journals (Sweden)

    Sofia Ahmed

    2015-01-01

    Full Text Available The present study has been carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents have been selected in which the drug is soluble. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10−4 M (2.62 mg%) were prepared for the assay. The method has been validated according to the guideline of the International Conference on Harmonization, and parameters like linearity, range, accuracy, precision, sensitivity, and robustness have been studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of statistical data 1-octanol, followed by ethanol and methanol, was found to be comparatively better than the other studied solvents. No change in the stock solution stability of TA was observed in any solvent for 24 hours when stored either at room (25±1°C) or at refrigerated (2-8°C) temperature. A shift in the absorption maxima was observed for TA in the various solvents, indicating drug-solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents.

  16. Validation of a computational method for assessing the impact of intra-fraction motion on helical tomotherapy plans

    Energy Technology Data Exchange (ETDEWEB)

    Ngwa, Wilfred; Meeks, Sanford L; Kupelian, Patrick A; Langen, Katja M [Department of Radiation Oncology, M D Anderson Cancer Center Orlando, 1400 South Orange Avenue, Orlando, FL 32806 (United States); Schnarr, Eric [TomoTherapy, Inc., 1240 Deming Way, Madison, WI 53717 (United States)], E-mail: wilfred.ngwa@orlandohealth.com

    2009-11-07

    In this work, a method for direct incorporation of patient motion into tomotherapy dose calculations is developed and validated. This computational method accounts for all treatment dynamics and can incorporate random as well as cyclical motion data. Hence, interplay effects between treatment dynamics and patient motion are taken into account during dose calculation. This allows for a realistic assessment of intra-fraction motion on the dose distribution. The specific approach entails modifying the position and velocity events in the tomotherapy delivery plan to accommodate any known motion. The computational method is verified through phantom and film measurements. Here, measured prostate motion and simulated respiratory motion tracks were incorporated in the dose calculation. The calculated motion-encoded dose profiles showed excellent agreement with the measurements. Gamma analysis using 3 mm and 3% tolerance criteria showed over 97% and 96% average of points passing for the prostate and breathing motion tracks, respectively. The profile and gamma analysis results validate the accuracy of this method for incorporating intra-fraction motion into the dose calculation engine for assessment of dosimetric effects on helical tomotherapy dose deliveries.
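    The gamma analysis mentioned above combines a dose-difference tolerance with a distance-to-agreement (DTA) criterion. The following is a deliberately minimal 1D sketch: real tomotherapy QA uses 2D/3D implementations, and this version normalizes the dose term locally to the evaluated dose, which may differ from the paper's convention. All profile values are invented.

```python
import math

# Minimal 1D gamma-index sketch with 3%/3 mm criteria, of the kind used to
# compare measured and calculated dose profiles. Illustrative only.
def gamma_index(x_eval, d_eval, ref_xs, ref_ds, dd=0.03, dta=3.0):
    """dd: fractional dose tolerance; dta: distance-to-agreement in mm."""
    return min(
        math.sqrt(((x_eval - x) / dta) ** 2
                  + ((d_eval - d) / (dd * d_eval)) ** 2)
        for x, d in zip(ref_xs, ref_ds)
    )

ref_xs = [0.0, 1.0, 2.0, 3.0]      # reference positions (mm), invented
ref_ds = [1.00, 0.98, 0.90, 0.70]  # reference relative dose, invented
g = gamma_index(1.5, 0.96, ref_xs, ref_ds)
print(g <= 1.0)  # the evaluated point passes if gamma <= 1
```

    The quoted "over 97% of points passing" corresponds to the fraction of evaluated points with gamma at or below 1 under the 3%/3 mm criteria.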

  17. Experimental validation of the intrinsic spatial efficiency method over a wide range of sizes for cylindrical sources

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Larroquette, Philippe [Departamento de Física, Facultad de Ciencias, Universidad de Chile (Chile); Camilla, S. [Departamento de Física, Universidad Tecnológica Metropolitana (Chile)

    2016-07-07

    The intrinsic spatial efficiency method is a new absolute method to determine the efficiency of a gamma spectroscopy system for any extended source. In the original work the method was experimentally demonstrated and validated for homogeneous cylindrical sources containing {sup 137}Cs, whose sizes varied over a small range (29.5 mm radius and 15.0 to 25.9 mm height). In this work we present an extension of the validation over a wide range of sizes. The dimensions of the cylindrical sources vary between 10 and 40 mm in height and 8 and 30 mm in radius. The cylindrical sources were prepared using the reference material IAEA-372, which had a specific activity of 11320 Bq/kg as of July 2006. The best results were obtained for the sources of 29 mm radius, showing relative bias of less than 5%, and for the sources of 10 mm height, showing relative bias of less than 6%. In comparison with the results obtained in the work in which the method was presented, the majority of these results show excellent agreement.
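    The relative bias figures above compare a measured activity against the reference value, which must be decay-corrected from its July 2006 certification date. A sketch in which the measured activity and the elapsed time are invented; only the 11320 Bq/kg IAEA-372 value and the Cs-137 half-life of about 30.08 years are taken as given.

```python
import math

# Relative bias against a decay-corrected reference activity, as used to
# judge the intrinsic spatial efficiency method. Illustrative numbers only.
CS137_HALF_LIFE_Y = 30.08

def decay_correct(a0_bq_per_kg, elapsed_years):
    return a0_bq_per_kg * math.exp(-math.log(2) * elapsed_years / CS137_HALF_LIFE_Y)

def relative_bias_percent(measured, reference):
    return 100.0 * (measured - reference) / reference

a_ref = decay_correct(11320.0, 10.0)  # assumed ~10 y after the July 2006 value
print(abs(relative_bias_percent(9050.0, a_ref)) < 5.0)  # within the reported 5%
```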

  18. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available Lattice composite fuselage structures are being developed as an alternative to conventional composite structures based on laminated skin and stiffeners. The layout of lattice structures allows the advantages of current composite materials to be realized to a maximal extent while minimizing their main shortcomings, providing higher weight efficiency than conventional analogues. The development of lattice composite structures requires novel methods of strength analysis, as conventional methods are, as a rule, aimed at thin-walled elements and do not give a confident estimate of the local strength of highly loaded unidirectional composite ribs. The present work presents a method for rapid strength analysis of lattice composite structures, based on specialized FE models of unidirectional composite ribs and their intersections. Within the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs, modeled by membrane finite elements. Parameters of the flanges and webs are calculated automatically from the condition that the stiffness characteristics of the real rib and of the model be equal. This allows local strength analysis of the highly loaded ribs of a lattice structure without the use of three-dimensional finite elements, which shortens calculation time and considerably simplifies the analysis of results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype shell of a lattice composite fuselage section were used. The prototype of the lattice section was manufactured at CRISM and tested at TsAGI within a number of Russian and international scientific projects.
    The validation results have shown that the suggested method provides highly efficient strength analysis, keeping

  19. Validation of the cleaning and sanitization method for radiopharmaceutical production facilities

    International Nuclear Information System (INIS)

    Robles, Anita; Morote, Mario; Moore, Mariel; Castro, Delcy; Paragulla, Wilson; Novoa, Carlos; Otero, Manuel; Miranda, Jesus; Herrera, Jorge; Gonzales, Luis

    2014-01-01

    A protocol for the cleaning and sanitization method for radiopharmaceutical production facilities has been designed and developed for the inner surfaces of the hot cells used for the production of Sodium Pertechnetate Tc-99m and Sm-153 EDTMP, considering an extreme situation for each hot cell. Cleaning is performed with double-distilled water and sanitization with two disinfectant solutions, 70% isopropyl alcohol and 3% hydrogen peroxide, in alternate weeks. Microbiological analyses of the sanitized surfaces were made after 20 minutes and 48 hours for the Tc-99m hot cell, and after 72 hours for the Sm-153 EDTMP hot cell, in 3 consecutive tests by the method of direct contact with plates containing culture medium, performed for each sampling point (six in the first and five in the second). The results showed that the microbial load on the sanitized surfaces was below the acceptable limits and that the lifetime of cleaning and sanitization is 48 hours for the Tc-99m hot cell and 72 hours for the Sm-153 EDTMP one. In conclusion, the method of cleaning and sanitization is effective in reducing or eliminating microbial contamination; therefore, the process is validated. (authors).

  20. A validated Fourier transform infrared spectroscopy method for quantification of total lactones in Inula racemosa and Andrographis paniculata.

    Science.gov (United States)

    Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil

    2012-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of FT-IR spectroscopy for the quantification of the total lactones present in Inula racemosa and Andrographis paniculata. The objective was to validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed results comparable with those of a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification were 1 µg and 10 µg for isoalantolactone, and 1.5 µg and 15 µg for andrographolide, respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable to the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.
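    Recovery and repeatability figures like those above are computed from spiked samples and replicate assays. A sketch in which all replicate and spike values are invented; only the over-98% recovery and RSD ≤ 2% acceptance figures come from the abstract.

```python
import statistics

# Spike-recovery and repeatability (%RSD) checks of the kind reported for
# the FT-IR lactone assay. All numbers below are invented for illustration.
def recovery_percent(measured_spiked, measured_unspiked, spike_added):
    return 100.0 * (measured_spiked - measured_unspiked) / spike_added

replicates = [8.61, 8.72, 8.58, 8.69]  # % total lactones, same sample re-assayed
rsd = 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
rec = recovery_percent(14.80, 8.65, 6.20)
print(rec >= 98.0, rsd <= 2.0)
```

    Recovery near 100% indicates the matrix does not suppress or inflate the signal, while a low %RSD across replicates demonstrates repeatability.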