WorldWideScience

Sample records for validation method scope

  1. A scoping review of rapid review methods.

    Science.gov (United States)

    Tricco, Andrea C; Antony, Jesmin; Zarin, Wasifa; Strifler, Lisa; Ghassemi, Marco; Ivory, John; Perrier, Laure; Hutton, Brian; Moher, David; Straus, Sharon E

    2015-09-16

    Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner. Although numerous centers are conducting rapid reviews internationally, few studies have examined the methodological characteristics of rapid reviews. We aimed to examine articles, books, and reports that evaluated, compared, used or described rapid reviews or methods through a scoping review. MEDLINE, EMBASE, the Cochrane Library, internet websites of rapid review producers, and reference lists were searched to identify articles for inclusion. Two reviewers independently screened literature search results and abstracted data from included studies. Descriptive analysis was conducted. We included 100 articles plus one companion report that were published between 1997 and 2013. The studies were categorized as 84 application papers, seven development papers, six impact papers, and four comparison papers (one was included in two categories). The rapid reviews took between 1 and 12 months to complete and were conducted predominantly in Europe (58 %) and North America (20 %). The included studies failed to report 6 % to 73 % of the specific systematic review steps examined. Fifty unique rapid review methods were identified; 16 methods occurred more than once. Streamlined methods that were used in the 82 rapid reviews included limiting the literature search to published literature (24 %) or one database (2 %), limiting inclusion criteria by date (68 %) or language (49 %), having one person screen and another verify or screen excluded studies (6 %), having one person abstract data and another verify (23 %), not conducting risk of bias/quality appraisal (7 %) or having only one reviewer conduct the quality appraisal (7 %), and presenting results as a narrative summary (78 %). Four case studies were identified that compared the results of rapid reviews to systematic reviews. Three studies found that the conclusions between

  2. Models & Methods in SCOPE A Status Report

    Science.gov (United States)

    2008-04-01

    Project title (April 2008): Soldaat Optreden Scope Consolidatie. Author(s): drs. E.M. Ubink, dr. W.A. Lotens, drs. ... Programme/project numbers: V707, 032.13 114. ...starvation. The fat and protein involved are handled through the muscle equation. Excess glucose production is converted to body fat. In theory body fat is

  3. Epicurean Meteorology: Sources, method, scope and organization

    NARCIS (Netherlands)

    Bakker, F.A.

    2016-01-01

    In Epicurean Meteorology Frederik Bakker discusses the meteorology as laid out by Epicurus (341-270 BCE) and Lucretius (1st century BCE). Although in scope and organization their ideas are clearly rooted in the Peripatetic tradition, their meteorology sets itself apart from this tradition by its

  4. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  5. Scoping reviews: time for clarity in definition, methods, and reporting.

    Science.gov (United States)

    Colquhoun, Heather L; Levac, Danielle; O'Brien, Kelly K; Straus, Sharon; Tricco, Andrea C; Perrier, Laure; Kastner, Monika; Moher, David

    2014-12-01

    The scoping review has become increasingly popular as a form of knowledge synthesis. However, a lack of consensus on scoping review terminology, definition, methodology, and reporting limits the potential of this form of synthesis. In this article, we propose recommendations to further advance the field of scoping review methodology. We summarize current understanding of scoping review publication rates, terms, definitions, and methods. We propose three recommendations for clarity in terminology, definition, and methodology. We recommend adopting the terms "scoping review" or "scoping study" and the use of a proposed definition. Until such time as further guidance is developed, we recommend the use of the methodological steps outlined in the Arksey and O'Malley framework and further enhanced by Levac et al. The development of reporting guidance for the conduct and reporting of scoping reviews is underway. Consistency in the proposed domains and methodologies of scoping reviews, along with the development of reporting guidance, will facilitate methodological advancement, reduce confusion, facilitate collaboration and improve knowledge translation of scoping review findings.

  6. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and up-dating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy, based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures.
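    The abstract above proposes a decision process for selecting validation measures after a method change. A minimal sketch of what such a process might look like in code follows; the change categories and their mapping to performance characteristics are illustrative assumptions, not the scheme from the paper.

```python
# Hypothetical re-validation decision table. The change types and the
# characteristics they affect are invented for illustration only.
AFFECTED = {
    "new_matrix":     {"scope", "recovery", "precision"},
    "new_analyte":    {"scope", "calibration", "recovery", "precision"},
    "new_instrument": {"calibration", "precision"},
    "minor_reagent":  set(),  # e.g. equivalent reagent lot: nothing affected
}

def validation_measures(change: str) -> list[str]:
    """Return the performance characteristics to re-validate for a change.

    Mirrors the rule stated in the abstract: any change that may affect
    method scope or a performance parameter triggers re-validation of the
    affected characteristics; otherwise a simple verification suffices.
    """
    affected = AFFECTED.get(change)
    if affected is None:
        raise ValueError(f"unknown change type: {change}")
    return sorted(affected) if affected else ["verification only"]

plan = validation_measures("new_instrument")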

  7. Evaluation of full-scope simulator testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Feher, M P; Moray, N; Senders, J W; Biron, K [Human Factors North Inc., Toronto, ON (Canada)

    1995-03-01

    This report discusses the use of full scope nuclear power plant simulators in licensing examinations for Unit First Operators of CANDU reactors. The existing literature is reviewed, and an annotated bibliography of the more important sources provided. Since existing methods are judged inadequate, conceptual bases for designing a system for licensing are discussed, and a method proposed which would make use of objective scoring methods based on data collection in full-scope simulators. A field trial of such a method is described. The practicality of such a method is critically discussed and possible advantages of subjective methods of evaluation considered. (author). 32 refs., 1 tab., 4 figs.

  8. Evaluation of full-scope simulator testing methods

    International Nuclear Information System (INIS)

    Feher, M.P.; Moray, N.; Senders, J.W.; Biron, K.

    1995-03-01

    This report discusses the use of full scope nuclear power plant simulators in licensing examinations for Unit First Operators of CANDU reactors. The existing literature is reviewed, and an annotated bibliography of the more important sources provided. Since existing methods are judged inadequate, conceptual bases for designing a system for licensing are discussed, and a method proposed which would make use of objective scoring methods based on data collection in full-scope simulators. A field trial of such a method is described. The practicality of such a method is critically discussed and possible advantages of subjective methods of evaluation considered. (author). 32 refs., 1 tab., 4 figs

  9. The importance of having a flexible scope ISO 15189 accreditation and quality specifications based on biological variation – the case of validation of the biochemistry analyzer Dimension Vista

    OpenAIRE

    Fernandez-Calle, Pilar; Pelaz, Sandra; Oliver, Paloma; Alcaide, Maria Jose; Gomez-Rioja, Ruben; Buno, Antonio; Iturzaeta, Jose Manuel

    2013-01-01

    Introduction: Technological innovation requires laboratories to ensure that modifications or incorporations of new techniques do not alter the quality of their results. In an ISO 15189-accredited laboratory, flexible scope accreditation facilitates the inclusion of these changes prior to accreditation body evaluation. A strategy for performing the validation of a biochemistry analyzer in an accredited laboratory with a flexible scope is shown. Materials and methods: A validation procedur...

  10. Validation of AEGIS/SCOPE2 system through actual core follow calculations with irregular operational conditions

    International Nuclear Information System (INIS)

    Tabuchi, M.; Tatsumi, M.; Ohoka, Y.; Nagano, H.; Ishizaki, K.

    2017-01-01

    This paper describes an overview of the AEGIS/SCOPE2 system, an advanced in-core fuel management system for pressurized water reactors, and its validation results from actual core follow calculations including irregular operational conditions. The AEGIS and SCOPE2 codes adopt more detailed and accurate calculation models than the current core design codes, while computational cost is minimized with various numerical and computational algorithms. Verification and validation of AEGIS/SCOPE2 has been performed intensively to confirm the validity of the system. As part of the validation, core follow calculations have been carried out mainly for typical operational conditions. After the Fukushima Daiichi nuclear power plant accident, however, all nuclear reactors in Japan underwent long suspensions and irregular operational conditions. In such situations, data measured during the restart and operation of the reactors provide good tests for validation of the codes. Therefore, core follow calculations were carried out with AEGIS/SCOPE2 for various cases, including zero-power reactor physics tests with irregular operational conditions. Comparisons between measured data and predictions by AEGIS/SCOPE2 demonstrated the validity and robustness of the system. (author)

  11. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it req...

  12. Building "Applied Linguistic Historiography": Rationale, Scope, and Methods

    Science.gov (United States)

    Smith, Richard

    2016-01-01

    In this article I argue for the establishment of "Applied Linguistic Historiography" (ALH), that is, a new domain of enquiry within applied linguistics involving a rigorous, scholarly, and self-reflexive approach to historical research. Considering issues of rationale, scope, and methods in turn, I provide reasons why ALH is needed and…

  13. SCoPE: an efficient method of Cosmological Parameter Estimation

    International Nuclear Information System (INIS)

    Das, Santanu; Souradeep, Tarun

    2014-01-01

    Markov Chain Monte Carlo (MCMC) samplers are widely used for cosmological parameter estimation from CMB and other data. However, due to the intrinsic serial nature of the MCMC sampler, convergence is often very slow. Here we present a fast and independently written Monte Carlo method for cosmological parameter estimation, named the Slick Cosmological Parameter Estimator (SCoPE), that employs delayed rejection to increase the acceptance rate of a chain, and pre-fetching, which helps an individual chain to run on parallel CPUs. An inter-chain covariance update is also incorporated to prevent clustering of the chains, allowing faster and better mixing. We use an adaptive method for covariance calculation to calculate and update the covariance automatically as the chains progress. Our analysis shows that the acceptance probability of each step in SCoPE is more than 95% and that the convergence of the chains is faster. Using SCoPE, we carry out cosmological parameter estimations with different cosmological models using WMAP-9 and Planck results. One of the current research interests in cosmology is quantifying the nature of dark energy. We analyze the cosmological parameters from two illustrative, commonly used parameterisations of dark energy models. We also assess whether the primordial helium fraction in the universe can be constrained by the present CMB data from WMAP-9 and Planck. The results from our MCMC analysis on the one hand help us to understand the workability of SCoPE better, and on the other hand provide a completely independent estimation of cosmological parameters from WMAP-9 and Planck data
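    The delayed-rejection idea described in this abstract can be illustrated with a toy sampler: when a bold first proposal is rejected, a second, smaller-step proposal is tried with an acceptance ratio that preserves detailed balance. This is only a sketch on a one-dimensional Gaussian target, not the SCoPE implementation; pre-fetching and the inter-chain covariance update are omitted.

```python
import math
import random

random.seed(0)

def log_target(x):
    # Standard normal target as a stand-in for a posterior density.
    return -0.5 * x * x

def log_q(d, step):
    # Log of the (unnormalized) symmetric Gaussian proposal density.
    return -0.5 * (d / step) ** 2

def dr_metropolis(n_steps, x0=0.0, step1=2.5, step2=0.5):
    """Random-walk Metropolis with one stage of delayed rejection.

    On rejection of the bold first proposal y1, a smaller-step second
    proposal y2 is tried and accepted with the standard delayed-rejection
    ratio for symmetric second-stage kernels (Tierney & Mira), which
    preserves detailed balance.
    """
    x = x0
    chain = []
    accepted = 0
    for _ in range(n_steps):
        y1 = x + random.gauss(0.0, step1)
        a1 = min(1.0, math.exp(log_target(y1) - log_target(x)))
        if random.random() < a1:
            x = y1
            accepted += 1
        else:
            # Second-stage proposal; its own density cancels by symmetry,
            # but the first-stage density ratio q1(y2->y1)/q1(x->y1) remains.
            y2 = x + random.gauss(0.0, step2)
            a1_rev = min(1.0, math.exp(log_target(y1) - log_target(y2)))
            if a1_rev < 1.0:
                num = math.exp(log_target(y2) + log_q(y1 - y2, step1)) * (1.0 - a1_rev)
                den = math.exp(log_target(x) + log_q(y1 - x, step1)) * (1.0 - a1)
                a2 = min(1.0, num / den)
            else:
                a2 = 0.0
            if random.random() < a2:
                x = y2
                accepted += 1
        chain.append(x)
    return chain, accepted / n_steps

chain, acc_rate = dr_metropolis(20000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

    The second stage rescues many rejections of the bold first proposal, so the overall acceptance rate rises above that of plain random-walk Metropolis at the same large step size.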

  14. Cost-of-illness studies: concepts, scopes, and methods

    Directory of Open Access Journals (Sweden)

    Changik Jo

    2014-12-01

    Liver diseases are one of the main causes of death, and their ever-increasing prevalence is threatening to cause significant damage both to individuals and society as a whole. This damage is especially serious for the economically active population in Korea. From the societal perspective, it is therefore necessary to consider the economic impacts associated with liver diseases, and identify interventions that can reduce the burden of these diseases. The cost-of-illness study is considered to be an essential evaluation technique in health care. By measuring and comparing the economic burdens of diseases to society, such studies can help health-care decision-makers to set up and prioritize health-care policies and interventions. Using economic theories, this paper introduces various study methods that are generally applicable to most disease cases for estimating the costs of illness associated with mortality, morbidity, disability, and other disease characteristics. It also presents concepts and scopes of costs along with different cost categories from different research perspectives in cost estimations. By discussing the epidemiological and economic grounds of the cost-of-illness study, the reported results represent useful information about several evaluation techniques at an advanced level, such as cost-benefit analysis, cost-effectiveness analysis, and cost-utility analysis.
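    The cost components described above (direct costs plus indirect costs from morbidity and from discounted future earnings lost to premature mortality) can be sketched with the human-capital approach. All figures and the 3% discount rate below are illustrative assumptions, not values from the paper.

```python
def present_value(annual_amount, years, discount_rate=0.03):
    """Discounted stream of future annual amounts (human-capital approach)."""
    return sum(annual_amount / (1.0 + discount_rate) ** t
               for t in range(1, years + 1))

def cost_of_illness(direct_medical, direct_nonmedical,
                    morbidity_days_lost, daily_earnings,
                    deaths, years_lost_per_death, annual_earnings):
    """Prevalence-based cost of illness: direct costs plus indirect costs
    from morbidity (work days lost) and mortality (discounted lost earnings)."""
    direct = direct_medical + direct_nonmedical
    morbidity = morbidity_days_lost * daily_earnings
    mortality = deaths * present_value(annual_earnings, years_lost_per_death)
    return {"direct": direct,
            "indirect": morbidity + mortality,
            "total": direct + morbidity + mortality}

# Purely illustrative numbers, in arbitrary currency units:
burden = cost_of_illness(direct_medical=5_000_000, direct_nonmedical=1_000_000,
                         morbidity_days_lost=20_000, daily_earnings=100,
                         deaths=50, years_lost_per_death=10,
                         annual_earnings=30_000)
```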

  15. Cost-of-illness studies: concepts, scopes, and methods

    Science.gov (United States)

    2014-01-01

    Liver diseases are one of the main causes of death, and their ever-increasing prevalence is threatening to cause significant damage both to individuals and society as a whole. This damage is especially serious for the economically active population in Korea. From the societal perspective, it is therefore necessary to consider the economic impacts associated with liver diseases, and identify interventions that can reduce the burden of these diseases. The cost-of-illness study is considered to be an essential evaluation technique in health care. By measuring and comparing the economic burdens of diseases to society, such studies can help health-care decision-makers to set up and prioritize health-care policies and interventions. Using economic theories, this paper introduces various study methods that are generally applicable to most disease cases for estimating the costs of illness associated with mortality, morbidity, disability, and other disease characteristics. It also presents concepts and scopes of costs along with different cost categories from different research perspectives in cost estimations. By discussing the epidemiological and economic grounds of the cost-of-illness study, the reported results represent useful information about several evaluation techniques at an advanced level, such as cost-benefit analysis, cost-effectiveness analysis, and cost-utility analysis. PMID:25548737

  16. ATWS thermal-hydraulic analysis for Krsko Full Scope Simulator validation

    International Nuclear Information System (INIS)

    Parzer, I.; Kljenak, I.

    2005-01-01

    The purpose of this analysis was to simulate an Anticipated Transient Without Scram (ATWS) for the Krsko NPP. The results of these calculations were used for the annual ANSI/ANS validation of the reactor coolant system thermal-hydraulic response predicted by the Krsko Full Scope Simulator. For the thermal-hydraulic analyses, the RELAP5/MOD3.3 code and the input model for NPP Krsko, delivered by NPP Krsko, were used. In the presented paper the most severe ATWS scenario has been analyzed, starting with the loss of Main Feedwater at both steam generators and the consequent gradual loss of the secondary heat sink. On top of that, the control rods were assumed not to scram, leaving the chain reaction to be controlled only by the inherent physical properties of the fuel and moderator and eventual actions of the BOP system. The primary system response has been studied assuming AMSAC availability. (author)

  17. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  18. The Strengths and Difficulties Questionnaire (SDQ) in Africa: a scoping review of its application and validation.

    Science.gov (United States)

    Hoosen, Nikhat; Davids, Eugene Lee; de Vries, Petrus J; Shung-King, Maylene

    2018-01-01

    Child and adolescent mental health in Africa remains largely neglected. Quick and cost-effective ways for early detection may aid early intervention. The Strengths and Difficulties Questionnaire (SDQ) is globally used to screen for mental health problems, but little is known about its use in Africa. We set out to perform a scoping review to examine existing studies that have used the SDQ in Africa. A comprehensive scoping review methodology was used to identify all peer-reviewed studies ever published that have used the SDQ in Africa. Data were extracted and analysed to assess the countries, languages and SDQ versions used, the purpose of the SDQ studies, psychometric properties of the SDQ, and to consider knowledge gaps for future in-country and cross-country studies. Fifty-four studies from 12 African countries were identified, most from South Africa. Many different languages were used, but authorized SDQs in those languages were not always available on the SDQinfo website. Authors frequently commented on challenges in the translation and backtranslation of mental health terminology in African languages. The SDQ was typically used to investigate internalisation/externalization disorders in different clinical populations, and was most frequently used in the evaluation of children and adolescents affected by HIV/AIDS. Sixteen studies (29.6%) administered the SDQ to participants outside the intended age range, only 4 (7.4%) used triangulation of all versions to generate assessments, and eight studies (14.8%) used only subscales of the SDQ. Only one study conducted thorough psychometric validation of the SDQ, including examination of internal consistency and factor analysis. Where 'caseness' was defined in studies, UK cut-off scores were used in all but one of the studies. 
The SDQ may be a very useful tool in an African setting, but the scoping review suggested that, where it was used in Africa, researchers did not always follow instrument guidelines, and highlighted

  19. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework
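    The idea of the framework, as described, is to "manufacture" a universe whose true parameters are known, generate synthetic experimental data from it, and check whether a UQ method's uncertainty statements hold up against the known truth. Below is a minimal sketch under an assumed linear model with known noise; the model and numbers are not from the paper.

```python
import random

random.seed(1)

THETA_TRUE = 2.0   # the "manufactured" truth, known by construction
NOISE_SD = 0.5

def manufactured_universe(n_obs=50):
    """Generate 'experimental' data y = theta * x + noise from a known universe."""
    xs = [i / n_obs for i in range(1, n_obs + 1)]
    ys = [THETA_TRUE * x + random.gauss(0.0, NOISE_SD) for x in xs]
    return xs, ys

def uq_estimate(xs, ys):
    """Least-squares estimate of theta with its exact sampling standard error."""
    sxx = sum(x * x for x in xs)
    theta_hat = sum(x * y for x, y in zip(xs, ys)) / sxx
    se = NOISE_SD / sxx ** 0.5
    return theta_hat, se

# Repeat the experiment in many manufactured universes and check that the
# nominal ~95% interval covers the known truth at roughly the claimed rate.
trials = 400
covered = 0
for _ in range(trials):
    xs, ys = manufactured_universe()
    th, se = uq_estimate(xs, ys)
    if th - 1.96 * se <= THETA_TRUE <= th + 1.96 * se:
        covered += 1
coverage = covered / trials
```

    A coverage rate far from the nominal 95% would expose a flaw in the UQ method or in its modeling assumptions, which is exactly the kind of diagnosis the framework is meant to support.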

  20. National programmes for validating physician competence and fitness for practice: a scoping review.

    Science.gov (United States)

    Horsley, Tanya; Lockyer, Jocelyn; Cogo, Elise; Zeiter, Jeanie; Bursey, Ford; Campbell, Craig

    2016-04-15

    To explore and categorise the state of existing literature for national programmes designed to affirm or establish the continuing competence of physicians. Scoping review. MEDLINE, ERIC, Sociological Abstracts, web/grey literature (2000-2014). Included when a record described a (1) national-level physician validation system, (2) recognised as a system for affirming competence and (3) reported relevant data. Using bibliographic software, title and abstracts were reviewed using an assessment matrix to ensure duplicate, paired screening. Dyads included both a methodologist and content expert on each assessment, reflective of evidence-informed best practices to decrease errors. 45 reports were included. Publication dates ranged from 2002 to 2014 with the majority of publications occurring in the previous six years (n=35). Country of origin (defined as that of the primary author) included the USA (N=32), the UK (N=8), Canada (N=3), Kuwait (N=1) and Australia (N=1). Three broad themes emerged from this heterogeneous data set: contemporary national programmes, contextual factors and terminological consistency. Four national physician validation systems emerged from the data: the American Board of Medical Specialties Maintenance of Certification Program, the Federation of State Medical Boards Maintenance of Licensure Program, the Canadian Revalidation Program and the UK Revalidation Program. Three contextual factors emerged as stimuli for the implementation of national validation systems: medical regulation, quality of care and professional competence. Finally, great variation among the definitions of key terms was identified. There is an emerging literature focusing on national physician validation systems. Four major systems have been implemented in recent years and it is anticipated that more will follow. Much of this work is descriptive, and gaps exist for the extent to which systems build on current evidence or theory. 
Terminology is highly variable across programmes

  1. ASTM Validates Air Pollution Test Methods

    Science.gov (United States)

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. The aim of the project is to establish nationwide consistency in measuring pollutants by determining the precision, accuracy, and reproducibility of 35 standard measuring methods. (BL)

  2. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    A validated modified Lycopodium spore method has been developed for simple and rapid quantification of powdered herbal drugs. The Lycopodium spore method was performed on ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  3. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

    The present work describes the methodology followed for the application of ISO 17025 standards in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to fulfil the management part. For the technical requirements, internal method validation was applied through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty; participation in IAEA proficiency tests assures the external method validation, especially as the gamma spectroscopy lab is a member of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). Some of these results are presented in this paper. (author)
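    As an illustration of one of the validation parameters mentioned above, the minimum detectable activity for a gamma line is commonly estimated with a Currie-type formula. The function below is a generic sketch, and the numerical inputs in the example are invented, not the laboratory's data.

```python
import math

def minimum_detectable_activity(background_counts, efficiency, emission_prob,
                                live_time_s, sample_mass_kg):
    """Currie-type minimum detectable activity (Bq/kg) for a gamma line.

    The detection limit in counts is L_D = 2.71 + 4.65 * sqrt(B), where B is
    the background counts under the peak region; it is converted to specific
    activity by dividing by detection efficiency, gamma emission probability,
    counting live time and sample mass.
    """
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * emission_prob * live_time_s * sample_mass_kg)

# Invented example: 400 background counts, 2% efficiency, 85% emission
# probability, ~17 h count, 0.5 kg sample.
mda = minimum_detectable_activity(400, 0.02, 0.85, 60000, 0.5)
```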

  4. Implementing Montessori Methods for Dementia: A Scoping Review.

    Science.gov (United States)

    Hitzig, Sander L; Sheppard, Christine L

    2017-10-01

    A scoping review was conducted to develop an understanding of Montessori-based programing (MBP) approaches used in dementia care and to identify optimal ways to implement these programs across various settings. Six peer-reviewed databases were searched for relevant abstracts by 2 independent reviewers. Included articles and book chapters were those available in English and published by the end of January 2016. Twenty-three articles and 2 book chapters met the inclusion criteria. Four approaches to implementing MBP were identified: (a) staff assisted (n = 14); (b) intergenerational (n = 5); (c) resident assisted (n = 4); and (d) volunteer or family assisted (n = 2). There is a high degree of variability with how MBP was delivered and no clearly established "best practices" or standardized protocol emerged across approaches except for resident-assisted MBP. The findings from this scoping review provide an initial road map on suggestions for implementing MBP across dementia care settings. Irrespective of implementation approach, there are several pragmatic and logistical issues that need to be taken into account for optimal implementation.

  5. Moving beyond Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul wrote that it is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as…

  6. Validation of the Rotation Ratios Method

    International Nuclear Information System (INIS)

    Foss, O.A.; Klaksvik, J.; Benum, P.; Anda, S.

    2007-01-01

    Background: The rotation ratios method describes rotations between pairs of sequential pelvic radiographs. The method seems promising but has not been validated. Purpose: To validate the accuracy of the rotation ratios method. Material and Methods: Known pelvic rotations between 165 radiographs obtained from five skeletal pelvises in an experimental material were compared with the corresponding calculated rotations to describe the accuracy of the method. A clinical material of 262 pelvic radiographs from 46 patients defined the ranges of rotational differences to be compared. Repeated analyses, on both the experimental and the clinical material, were performed using the selected reference points to describe the robustness and repeatability of the method. Results: The reference points were easy to identify and barely influenced by pelvic rotations. The mean differences between calculated and real pelvic rotations were 0.0 deg (SD 0.6) for vertical rotations and 0.1 deg (SD 0.7) for transversal rotations in the experimental material. The intra- and interobserver repeatability of the method was good. Conclusion: The accuracy of the method was reasonably high, and the method may prove to be clinically useful.

  7. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Consequently, these methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
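    A toy illustration of the ARR idea on a simplified tank model (not the algorithm from the report): each redundancy relation is sensitive to a known subset of sensors, so the pattern of violated relations logically narrows down the faulty sensor, without any prior fault probabilities.

```python
# Toy tank model: level sensor L, duplicate level sensor L2, inflow sensor
# F_in, outflow sensor F_out. The mass balance L(t+1) - L(t) = F_in - F_out
# gives one analytical redundancy relation; sensor duplication gives another.

THRESHOLD = 0.1  # residual magnitude above which an ARR counts as violated

# Which sensors each ARR depends on (a simple fault signature matrix).
ARR_SENSORS = {
    "arr_balance": {"L", "F_in", "F_out"},
    "arr_dup":     {"L", "L2"},
}

def residuals(l_prev, l_now, l2_now, f_in, f_out):
    """Evaluate both ARRs; each residual should be ~0 for healthy sensors."""
    return {
        "arr_balance": (l_now - l_prev) - (f_in - f_out),
        "arr_dup": l_now - l2_now,
    }

def suspect_sensors(res):
    """Single-fault candidates logically consistent with the violated ARRs.

    A candidate must appear in every violated ARR; under the usual
    exoneration assumption, sensors in a satisfied ARR are ruled out.
    """
    violated = {name for name, r in res.items() if abs(r) > THRESHOLD}
    if not violated:
        return set()
    candidates = set.intersection(*(ARR_SENSORS[v] for v in violated))
    for name in ARR_SENSORS:
        if name not in violated:
            candidates -= ARR_SENSORS[name]
    return candidates

# A biased level sensor L violates both relations, isolating it uniquely.
suspects = suspect_sensors(residuals(1.0, 1.8, 1.3, 0.5, 0.2))
```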

  8. PSA Level 2: Scope and Method of PSA Level 2 for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Widodo, Surip; Antariksawan, Anhar R.

    2001-01-01

    A study of the scope and method of PSA Level 2 has been conducted. The background of the study is the need to gain the capability to perform PSA Level 2 well for nuclear facilities. This study is a literature survey. The scope of PSA Level 2 consists of generating plant damage states, accident progression analysis, and grouping of source terms. For accident progression analysis, several methods are used, among them the event tree method, named the accident progression event tree (APET) or containment event tree (CET), and the fault tree method. The end results of PSA Level 2 are release end states, which are grouped into release bins. The results will be used for PSA Level 3

  9. Spacecraft early design validation using formal methods

    International Nuclear Information System (INIS)

    Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco

    2014-01-01

    The size and complexity of software in spacecraft are increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, two of which focused on the system level of a satellite platform in development. Our efforts resulted in a significant advancement in validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested).

  10. [Identification of the scope of practice for dental nurses with Delphi method].

    Science.gov (United States)

    Li, Yu-Hong; Lu, Yue-Cen; Huang, Yao; Ruan, Hong; Wu, Zheng-Yi

    2016-10-01

    To identify the scope of practice of dental nurses under the new situation. The draft scope of practice for dental nurses was based on theoretical analysis, literature review and consultation of an advisory panel, and the final scope of practice was established using the Delphi method. Statistical analysis was implemented using the coefficient of variation and Kendall's W with the SPSS 17.0 software package. Thirty experts were consulted twice using the Delphi method. The effective rates of the two rounds of questionnaires were 100% and 73.3%, respectively. The authority coefficient was 0.837, and the P value of the expert coordination coefficient W was less than 0.05. There were in total 116 suggestions from the experts, of which 96 were accepted. The scope of practice for dental nurses was finally established, including 4 primary indexes and 25 secondary indexes. The scope of practice for dental nurses under the new situation is established in China through scientific methods. It is favorable for position management of dental nurses and may promote the development of nurse specialists in dental clinics.
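
    The expert coordination coefficient reported above is Kendall's W, which can be computed directly from the experts' rankings. A minimal sketch for the tie-free case (the study's SPSS computation also handles ties); the example rankings are invented:

```python
def kendalls_w(rankings):
    """Kendall's coefficient of concordance W for m experts ranking n items.

    rankings: list of m lists, each a permutation of ranks 1..n (no ties).
    W = 1 means perfect agreement among the experts, W = 0 no agreement.
    """
    m = len(rankings)
    n = len(rankings[0])
    # Column totals: summed rank each item received across all experts.
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean_total = m * (n + 1) / 2
    s = sum((t - mean_total) ** 2 for t in totals)   # spread of rank totals
    return 12 * s / (m ** 2 * (n ** 3 - n))
```

    Three experts ranking three items identically give W = 1.0; two experts ranking in exactly opposite order give W = 0.0.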

  11. Development of an Assessment Method for Building Materials Under Euratom Scope.

    Science.gov (United States)

    de With, Govert

    2017-11-01

    In 2013, the European Commission published its basic safety standards for protection against the dangers arising from exposure to ionizing radiation (Council Directive 2013/59/Euratom), also known as the EU-BSS. As a result, the use of raw materials with potentially elevated activity concentrations such as fly ash, phosphogypsum, and slags will now fall under the EU-BSS scope when applied in building materials. In light of this new policy, a variety of tools are available to assess compliance with the 1 mSv y(-1) reference level for building materials. At the heart of these tools is a gamma-spectrometric determination of the naturally occurring radionuclides (226)Ra, (232)Th, and (40)K in the material of concern. As a large number of construction products contain a certain amount of the raw material that falls under the scope of the EU regulation, this policy will lead to substantial measurement of building materials that pose little radiation risk. For this reason, a method is developed to enable assessment against the 1-mSv value not on the basis of gamma-spectrometric analysis but rather based on the product's material composition. The proposed method prescribes a maximum permitted content of raw materials with potentially elevated activity concentrations in terms of a weight percentage of the end product, where the raw materials of concern are defined as those listed in Annex XIII of the EU-BSS. The permitted content is a function of the product's surface density. Therefore, a product with a low surface density of up to 25 kg m(-2) can consist of nearly 100% raw materials with potentially elevated activity concentrations, and this percentage drops to around 15% for products with a surface density of around 500 kg m(-2). Building materials that comply with these requirements on product composition are exempt from testing, while products that do not comply must undergo regular gamma-spectrometric analysis. A full validation and testing of the method is provided. In addition, the paper discusses
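
    The abstract quotes only two points of the limit curve (near 100% permitted content at a surface density of 25 kg m(-2), about 15% at 500 kg m(-2)). A purely illustrative compliance check can interpolate between those quoted anchors; the log-linear interpolation form is an assumption, since the paper's actual curve is not reproduced here.

```python
import math

# Purely illustrative limit curve built from the two points quoted in the
# abstract. The real method uses the curve defined in the paper, not this
# log-linear interpolation.

def permitted_norm_fraction(surface_density):
    """Assumed permitted weight fraction of Annex XIII raw materials
    as a function of product surface density (kg/m2)."""
    d1, f1 = 25.0, 1.00    # quoted anchor: low surface density
    d2, f2 = 500.0, 0.15   # quoted anchor: high surface density
    if surface_density <= d1:
        return f1
    if surface_density >= d2:
        return f2
    t = (math.log(surface_density) - math.log(d1)) / (math.log(d2) - math.log(d1))
    return f1 + t * (f2 - f1)

def exempt_from_testing(norm_weight_fraction, surface_density):
    """True if composition alone shows compliance, so no gamma
    spectrometry is required under this sketch."""
    return norm_weight_fraction <= permitted_norm_fraction(surface_density)
```

    A lightweight board at 25 kg/m2 passes even at very high NORM content, while a dense product must stay near the 15% level.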

  12. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator, there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data were compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether the proposed joint torque methodology can be applied to future space suit development contracts.
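
    A hedged sketch of the kind of operator-to-operator repeatability screen described above: flag joints whose torque values from two independently conducted runs disagree beyond a chosen relative tolerance. The joint names, torque values and the 15% tolerance are invented for illustration, not taken from the test reports.

```python
from statistics import mean

# Hypothetical repeatability check between two test operators: a joint is
# flagged when the two reported torques differ by more than rel_tol of
# their average.

def flag_discrepant_joints(run_a, run_b, rel_tol=0.15):
    """Return joints whose torques (N*m) disagree beyond rel_tol."""
    flagged = []
    for joint in run_a:
        a, b = run_a[joint], run_b[joint]
        if abs(a - b) / mean([a, b]) > rel_tol:
            flagged.append(joint)
    return flagged
```

    Flagged joints would then be candidates for the kind of variable-hunting third test round the abstract describes.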

  13. Study on the scope of fault tree method applicability

    International Nuclear Information System (INIS)

    Ito, Taiju

    1980-03-01

    In fault tree analysis of the reliability of nuclear safety systems, including reliability analysis of nuclear protection systems, there seem to be some documents in which the fault tree method is applied unreasonably. In the fault tree method, the addition rule and the multiplication rule are usually used, and these rules must hold exactly, or at least practically. The addition rule poses no problem, but the multiplication rule occasionally does. Whether or not the multiplication rule holds has been studied comprehensively for the unreliability, mean unavailability and instantaneous unavailability of the elements. Between the unreliabilities of elements without maintenance, the multiplication rule holds. Between the instantaneous unavailabilities of elements, with or without maintenance, the multiplication rule also holds. Between the unreliabilities of subsystems with maintenance, however, the multiplication rule does not hold, because the product value is larger than the unreliability of a parallel system consisting of the two subsystems with maintenance. Between the mean unavailabilities of elements without maintenance, the multiplication rule also does not hold, because the product value is smaller than the mean unavailability of a parallel system consisting of the two elements without maintenance. In these cases, therefore, the fault tree method may not be applied by rote for reliability analysis of the system. (author)
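
    The claim about mean unavailability can be checked numerically. For an element without maintenance and constant failure rate, the instantaneous unavailability is q(t) = 1 - exp(-lambda*t); the sketch below confirms that the product of two identical elements' mean unavailabilities over [0, T] is smaller than the mean unavailability of the parallel pair. The values of lambda and T are arbitrary.

```python
import math

# Numerical check: for two elements without maintenance, the multiplication
# rule is exact for INSTANTANEOUS unavailability (the parallel pair is
# q(t)**2) but understates the MEAN unavailability of the parallel pair,
# because q grows with time and the two factors are averaged jointly.

LAM, T, N = 1e-3, 1000.0, 10_000

def q(t):                        # instantaneous unavailability, no repair
    return 1.0 - math.exp(-LAM * t)

def time_average(f):             # trapezoidal mean of f over [0, T]
    h = T / N
    total = 0.5 * (f(0.0) + f(T)) + sum(f(i * h) for i in range(1, N))
    return total * h / T

mean_q = time_average(q)                           # mean unavailability, one element
mean_parallel = time_average(lambda t: q(t) ** 2)  # exact mean, parallel pair

# The naive multiplication of mean unavailabilities underestimates:
assert mean_q ** 2 < mean_parallel
```

    With lambda*T = 1 the product of means is about 0.135 while the true parallel-pair mean is about 0.168, which is exactly the direction of error the abstract warns about.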

  14. Widening the Scope of R-matrix Methods

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Ian J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dimitriou, Paraskevi [IAEA, Vienna (Austria); DeBoer, Richard J. [Nieuwland Science Hall, Notre Dame, IN (United States); Kunieda, Satoshi [Nuclear Data Center (JAEA), Tokai (Japan); Paris, Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Thompson, Ian [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Trkov, Andrej [IAEA, Vienna (Austria)

    2016-03-01

    A Consultant’s Meeting was held at the IAEA Headquarters, from 7 to 9 December 2015, to discuss the status of R-matrix codes currently used in calculations of charged-particle induced reaction cross sections at low energies. The ultimate goal was to initiate an international effort, coordinated by the IAEA, to evaluate charged-particle induced reactions in the resolved-resonance region. Participants reviewed the capabilities of the codes, the different implementations of R-matrix theory and translatability of the R-matrix parameters, the evaluation methods and suitable data formats for broader dissemination. The details of the presentations and technical discussions, as well as the actions that were proposed to achieve the goal of the meeting are summarized in this report.

  15. Comparison of validation methods for forming simulations

    Science.gov (United States)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre reinforced thermoplastics could reduce the development time and improve the forming results. But to take advantage of the full potential of the simulations it has to be ensured that the predictions for material behaviour are correct. For that reason, a thorough validation of the material model has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are for example the outer contour, the occurrence of defects and the fibre paths. To measure these features various methods are available. Most relevant and also most difficult to measure are the emerging fibre orientations. For that reason, the focus of this study was on measuring this feature. The aim was to give an overview of the properties of different measuring systems and select the most promising systems for a comparison survey. Selected were an optical, an eddy current and a computer-assisted tomography system with the focus on measuring the fibre orientations. Different formed 3D parts made of unidirectional glass fibre and carbon fibre reinforced thermoplastics were measured. Advantages and disadvantages of the tested systems were revealed. Optical measurement systems are easy to use, but are limited to the surface plies. With an eddy current system also lower plies can be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system all plies can be measured, but the system is limited to small parts and challenging to evaluate.

  16. Softcopy quality ruler method: implementation and validation

    Science.gov (United States)

    Jin, Elaine W.; Keelan, Brian W.; Chen, Junqing; Phillips, Jonathan B.; Chen, Ying

    2009-01-01

    A softcopy quality ruler method was implemented for the International Imaging Industry Association (I3A) Camera Phone Image Quality (CPIQ) Initiative. This work extends ISO 20462 Part 3 by virtue of creating reference digital images of known subjective image quality, complementing the hardcopy Standard Reference Stimuli (SRS). The softcopy ruler method was developed using images from a Canon EOS 1Ds Mark II D-SLR digital still camera (DSC) and a Kodak P880 point-and-shoot DSC. Images were viewed on an Apple 30in Cinema Display at a viewing distance of 34 inches. Ruler images were made for 16 scenes. Thirty ruler images were generated for each scene, representing ISO 20462 Standard Quality Scale (SQS) values of approximately 2 to 31 at an increment of one just noticeable difference (JND) by adjusting the system modulation transfer function (MTF). A Matlab GUI was developed to display the ruler and test images side-by-side with a user-adjustable ruler level controlled by a slider. A validation study was performed at Kodak, Vista Point Technology, and Aptina Imaging in which all three companies set up a similar viewing lab to run the softcopy ruler method. The results show that the three sets of data are in reasonable agreement with each other, with the differences within the range expected from observer variability. Compared to previous implementations of the quality ruler, the slider-based user interface allows approximately 2x faster assessments with 21.6% better precision.

  17. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
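
    A minimal, self-contained caricature of the framework (not the paper's particle-transport universe): manufacture a known reality, generate noisy experiments from it, calibrate an imperfect model using a crude accept/reject step in place of the paper's Gaussian process and Bayesian MARS machinery, and then judge the resulting uncertainty band against the manufactured truth. All functions and numbers below are invented.

```python
import random
from statistics import mean, stdev

random.seed(0)

# "Manufactured universe": we know the true law exactly, which is what
# lets us grade the UQ result at the end.
def truth(x):
    return 1.0 / (1.0 + x) + 0.05 * x

# Deliberately imperfect model with one uncertain parameter k.
def model(x, k):
    return 1.0 / (1.0 + k * x)

# "Experiments": manufactured truth plus measurement error.
xs = [0.2 * i for i in range(1, 6)]
data = [(x, truth(x) + random.gauss(0.0, 0.01)) for x in xs]

# Crude calibration: keep parameter samples that fit every datum to 0.05.
posterior = [k for k in (random.uniform(0.5, 1.5) for _ in range(20000))
             if all(abs(model(x, k) - y) < 0.05 for x, y in data)]

# Predict a NEW experiment at x = 1.5 with a mean +/- 2 sigma band, then
# assess whether the band covers the manufactured truth.
preds = [model(1.5, k) for k in posterior]
lo = mean(preds) - 2 * stdev(preds)
hi = mean(preds) + 2 * stdev(preds)
covered = lo <= truth(1.5) <= hi
```

    Because the reality was manufactured, `covered` is a direct verdict on the UQ procedure itself, which is the point of the framework: a model-form error or a too-tight band shows up as missed coverage.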

  18. Toward a Unified Validation Framework in Mixed Methods Research

    Science.gov (United States)

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  19. FDIR Strategy Validation with the B Method

    Science.gov (United States)

    Sabatier, D.; Dellandrea, B.; Chemouil, D.

    2008-08-01

    In a formation flying satellite system, the FDIR strategy (Failure Detection, Isolation and Recovery) is paramount. When a failure occurs, satellites should be able to take appropriate reconfiguration actions to obtain the best possible results given the failure, ranging from avoiding satellite-to-satellite collision to continuing the mission without disturbance if possible. To achieve this goal, each satellite in the formation has an implemented FDIR strategy that governs how it detects failures (from tests or by deduction) and how it reacts (reconfiguration using redundant equipment, avoidance manoeuvres, etc.). The goal is to protect the satellites first and the mission as much as possible. In a project initiated by CNES, ClearSy experimented with the B Method to validate the FDIR strategies, developed by Thales Alenia Space, of the inter-satellite positioning and communication devices that will be used for the SIMBOL-X (2-satellite configuration) and PEGASE (3-satellite configuration) missions, and potentially for other missions afterward. These radio-frequency metrology sensor devices provide satellite positioning and inter-satellite communication in formation flying. This article presents the results of this experiment.

  20. Optimization of axial enrichment distribution for BWR fuels using scoping libraries and block coordinate descent method

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Hsiung, E-mail: wstong@iner.gov.tw; Lee, Tien-Tso; Kuo, Weng-Sheng; Yaur, Shung-Jung

    2017-03-15

    Highlights: • An optimization method for axial enrichment distribution in a BWR fuel was developed. • Block coordinate descent method is employed to search for optimal solution. • Scoping libraries are used to reduce computational effort. • Optimization search space consists of enrichment difference parameters. • Capability of the method to find optimal solution is demonstrated. - Abstract: An optimization method has been developed to search for the optimal axial enrichment distribution in a fuel assembly for a boiling water reactor core. The optimization method features: (1) employing the block coordinate descent method to find the optimal solution in the space of enrichment difference parameters, (2) using scoping libraries to reduce the amount of CASMO-4 calculation, and (3) integrating a core critical constraint into the objective function that is used to quantify the quality of an axial enrichment design. The objective function consists of the weighted sum of core parameters such as shutdown margin and critical power ratio. The core parameters are evaluated by using SIMULATE-3, and the cross section data required for the SIMULATE-3 calculation are generated by using CASMO-4 and scoping libraries. The application of the method to a 4-segment fuel design (with the highest allowable segment enrichment relaxed to 5%) demonstrated that the method can obtain an axial enrichment design with improved thermal limit ratios and objective function value while satisfying the core design constraints and core critical requirement through the use of an objective function. The use of scoping libraries effectively reduced the number of CASMO-4 calculations, from 85 to 24, in the 4-segment optimization case. An exhaustive search was performed to examine the capability of the method in finding the optimal solution for a 4-segment fuel design. The results show that the method found a solution very close to the optimum obtained by the exhaustive search. The number of
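
    The search strategy in item (1), block coordinate descent over a discrete design space, can be sketched generically. The toy objective and candidate grids below stand in for the SIMULATE-3/CASMO-4 evaluation of an enrichment design; they are not the paper's model.

```python
# Generic block coordinate descent over a discrete design space: optimize
# one block of design variables at a time while holding the others fixed,
# sweeping until a full pass yields no improvement.

def block_coordinate_descent(objective, grids, start):
    x = list(start)
    improved = True
    while improved:
        improved = False
        for i, grid in enumerate(grids):       # one "block" per design variable
            best_v, best_f = x[i], objective(x)
            for v in grid:
                trial = x[:i] + [v] + x[i + 1:]
                f = objective(trial)
                if f < best_f:                 # keep the best candidate in this block
                    best_v, best_f = v, f
                    improved = True
            x[i] = best_v
    return x, objective(x)

# Toy objective with a mild coupling term; discrete minimum at (2, 3).
obj = lambda x: (x[0] - 2) ** 2 + (x[1] - 3) ** 2 + 0.1 * x[0] * x[1]
grids = [list(range(6)), list(range(6))]
best, fbest = block_coordinate_descent(obj, grids, start=[0, 0])
```

    In the paper's setting, each candidate evaluation is an expensive core simulation, which is exactly why the scoping-library shortcut that cuts CASMO-4 calls matters.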

  1. Development and Validation of a Dissolution Test Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a dissolution test method for the release of artemether and lumefantrine from tablets. Methods: A single dissolution method for evaluating the in vitro release of artemether and lumefantrine from tablets was developed and validated. The method comprised a dissolution medium of ...

  2. A scoping review of the potential for chart stimulated recall as a clinical research method.

    Science.gov (United States)

    Sinnott, Carol; Kelly, Martina A; Bradley, Colin P

    2017-08-22

    Chart-stimulated recall (CSR) is a case-based interviewing technique, which is used in the assessment of clinical decision-making in medical education and professional certification. Increasingly, clinical decision-making is a concern for clinical research in primary care. In this study, we review the prior application and utility of CSR as a technique for research interviews in primary care. Following Arksey & O'Malley's method for scoping reviews, we searched seven databases, grey literature, and reference lists, and contacted experts in the field. We excluded studies on medical education or competence assessment. Retrieved citations were screened by one reviewer and full texts were ordered for all potentially relevant abstracts. Two researchers independently reviewed full texts and performed data extraction and quality appraisal if inclusion criteria were met. Data were collated and summarised using a published framework on the reporting of qualitative interview techniques, which was chosen a priori. The preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines informed the review report. From an initial list of 789 citations, eight studies using CSR in research interviews were included in the review: six from North America, one from the Netherlands, and one from Ireland. The most common purpose of the included studies was to examine the influence of guidelines on physicians' decisions. The number of interviewees ranged from seven to twenty-nine, while the number of charts discussed per interview ranged from one to twelve. CSR gave insights into physicians' reasoning for actions taken or not taken; the unrecorded social and clinical influences on decisions; and discrepancies between physicians' real and perceived practice. Ethical concerns and the training and influence of the researcher were poorly discussed in most of the studies. Potential pitfalls included the risk of recall, selection and observation biases. Despite the proven validity

  3. The Value of Qualitative Methods in Social Validity Research

    Science.gov (United States)

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based reading…

  4. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis.
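
    The numerical core of such a protocol (accuracy as mean recovery, precision as relative standard deviation, both checked against pre-defined acceptance criteria) might look like this. The limits shown, 98-102% recovery and RSD of at most 2%, are typical examples and not the paper's; a real protocol fixes its own criteria, and the replicate values below are invented.

```python
from statistics import mean, stdev

# Sketch of the accuracy/precision checks in an HPLC method validation
# protocol. Replicates are assay results (e.g. percent of label claim or
# concentration) from a sample of known nominal content.

def accuracy_and_precision(replicates, nominal):
    recovery = 100.0 * mean(replicates) / nominal        # percent recovery
    rsd = 100.0 * stdev(replicates) / mean(replicates)   # percent RSD
    return recovery, rsd

def passes(replicates, nominal, rec_limits=(98.0, 102.0), max_rsd=2.0):
    """True if both acceptance criteria of this example protocol hold."""
    recovery, rsd = accuracy_and_precision(replicates, nominal)
    return rec_limits[0] <= recovery <= rec_limits[1] and rsd <= max_rsd
```

    Five replicates near 100% of nominal pass both criteria; a systematically low series fails on recovery.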

  5. AOAC Official Method(SM) Matrix Extension Validation Study of Assurance GDS(TM) for the Detection of Salmonella in Selected Spices.

    Science.gov (United States)

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDS(TM) for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of Analysis(SM) 2009.03, Performance Tested Method(SM) No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity, exclusivity, stability, and ruggedness studies were also conducted. Assurance GDS showed 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To extend the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method to selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  6. Example of severe accident management guidelines validation and verification using full scope simulator

    International Nuclear Information System (INIS)

    Krajnc, B.; Basic, I.; Spiler, J.

    2001-01-01

    The purpose of Severe Accident Management Guidelines (SAMG) is to provide guidance to mitigate and control beyond-design-basis accidents. These guidelines are to be used by the technical support center (TSC) that is established at the plant within one hour after the beginning of the accident as technical support for the main control room operators. Since some accidents can progress very fast, two guidelines are also provided for the main control room operators: the first is to be used if core damage occurs before the TSC is established, and the second after the technical support center becomes operational. After the steam generator replacement and power uprate in the year 2000, NPP Krsko developed Rev. 1 of these procedures, which was validated and verified during a one-week effort. A plant-specific simulator capable of simulating severe accidents was extensively used. (author)

  7. Invisible hand in the process of making economics or on the method and scope of economics

    Directory of Open Access Journals (Sweden)

    Yay Turan

    2010-01-01

    As a social science, economics cannot be reduced simply to an a priori science or an ideology; nor can it be solely an empirical or a historical science. Economics is a research field which studies only one dimension of human behavior, with the four fields of mathematics, econometrics, ethics and history intersecting one another. The purpose of this paper is to discuss the two parts of the above proposition in connection with the controversies surrounding the method and scope of economics: economics as applied mathematics and economics as a predictive/empirical science.

  8. The scope of application of incremental rapid prototyping methods in foundry engineering

    Directory of Open Access Journals (Sweden)

    M. Stankiewicz

    2010-01-01

    The article presents the scope of application of selected incremental Rapid Prototyping methods in the process of manufacturing casting models, casting moulds and casts. The Rapid Prototyping methods (SL, SLA, FDM, 3DP, JS) are predominantly used for the production of models and model sets for casting moulds. The Rapid Tooling methods, such as ZCast-3DP, ProMetalRCT and VoxelJet, enable the fabrication of casting moulds in an incremental process. The application of the RP methods in cast production makes it possible to speed up the prototype preparation process. This is particularly vital for elements of complex shapes. The time required for the manufacture of the model, the mould and the cast proper may vary from a few to several dozen hours.

  9. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high pressure liquid ...

  10. Validation of method in instrumental NAA for food products sample

    International Nuclear Information System (INIS)

    Alfian; Siti Suprapti; Setyo Purwanto

    2010-01-01

    NAA is a method of testing that has not been standardized. To affirm and confirm that this method is valid, validation must be carried out with various standard reference materials. In this work, the validation is carried out for food product samples using NIST SRM 1567a (wheat flour) and NIST SRM 1568a (rice flour). The results show that the method for testing nine elements (Al, K, Mg, Mn, Na, Ca, Fe, Se and Zn) in SRM 1567a and eight elements (Al, K, Mg, Mn, Na, Ca, Se and Zn) in SRM 1568a passes the tests of accuracy and precision. It can be concluded that this method is able to give valid results in the determination of elements in food product samples. (author)
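
    One common way to test accuracy against a certified reference material is a z-score computed from the combined uncertainties, with |z| at most 2 taken as passing. Whether this matches the acceptance criteria actually used in the paper is an assumption, and the numbers in the usage example are invented, not SRM certificate values.

```python
import math

# Sketch of checking a measured element concentration against a certified
# reference value: z-score from the combined standard uncertainties.

def z_score(measured, u_measured, certified, u_certified):
    """Signed z-score of a measurement against a certified value."""
    return (measured - certified) / math.hypot(u_measured, u_certified)

def element_passes(measured, u_measured, certified, u_certified, limit=2.0):
    """True if the result agrees with the certificate within |z| <= limit."""
    return abs(z_score(measured, u_measured, certified, u_certified)) <= limit
```

    For instance, a measured 10.4 +/- 0.3 against a certified 10.0 +/- 0.4 (arbitrary units) gives z = 0.8 and passes; 12.0 with the same uncertainties gives z = 4 and fails.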

  11. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: Schimadzu HPLC with LC solution software was used with Waters Spherisorb, C18 (5 μm, 150mm × 4.5mm) column. The mobile phase ...

  12. Questionnaires used to assess barriers of clinical guideline use among physicians are not comprehensive, reliable, or valid: a scoping review.

    Science.gov (United States)

    Willson, Melina L; Vernooij, Robin W M; Gagliardi, Anna R

    2017-06-01

    This study described the number and characteristics of questionnaires used to assess barriers of guideline use among physicians. A scoping review was conducted. MEDLINE and EMBASE were searched from 2005 to June 2016. English-language studies that administered a questionnaire to assess barriers of guideline use among practicing physicians were eligible. Summary statistics were used to report study and questionnaire characteristics. Questionnaire content was assessed with a checklist of 57 known barriers. Each of the 178 included studies administered a unique questionnaire. The number of questionnaires increased yearly from 2005 to 2015. Few were pilot-tested (50, 28.1%) or tested for psychometric properties (3, 1.7%). Two were based on theory. None probed for the full range of known barriers. Ten included a free-text option. The majority assessed professional barriers (177, 99.4%) but few of the 14 factors within this domain. Questionnaire characteristics did not change over time. Organizations administered questionnaires that were not reliable or valid and did not comprehensively assess barriers and may have selected interventions unlikely to promote guideline use. Research is needed to construct a questionnaire that is practical, adaptable, and robust and leads to the selection of interventions that support guideline use. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Human Factors methods concerning integrated validation of nuclear power plant control rooms (Method development for integrated validation)

    Energy Technology Data Exchange (ETDEWEB)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia (Swedish Defence Research Agency, Information Systems, Linkoeping (Sweden))

    2010-02-15

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was held with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which methods of measurement should be used, and how the methods are affected by changes in the control room. The report raises various questions for discussion concerning the validation process. Supplementary methods of measurement for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations for the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems.

  14. Validated high performance liquid chromatographic (HPLC) method ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-02-22

    Feb 22, 2010 ... specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes ... tional medicine as a cure for swelling, sores, loss of appetite and ... Receptor Activator for Nuclear Factor κ B Ligand .... The effect of ... be suitable for preclinical pharmacokinetic studies. The.

  15. Validation of qualitative microbiological test methods

    NARCIS (Netherlands)

    IJzerman-Boon, Pieta C.; van den Heuvel, Edwin R.

    2015-01-01

    This paper considers a statistical model for the detection mechanism of qualitative microbiological test methods with a parameter for the detection proportion (the probability to detect a single organism) and a parameter for the false positive rate. It is demonstrated that the detection proportion
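A minimal probabilistic sketch of such a detection mechanism (a hypothetical illustration, not the authors' exact model): if each of N organisms in a sample is detected independently with detection proportion p, on top of a false positive rate q, the probability of a positive result is 1 - (1 - q)(1 - p)^N.

```python
def prob_positive(n_organisms: int, detection_prop: float, false_pos_rate: float) -> float:
    """Probability that a qualitative test returns 'positive' for a sample
    containing n_organisms, each detected independently with probability
    detection_prop, on top of a background false positive rate."""
    p_miss_all = (1.0 - detection_prop) ** n_organisms
    return 1.0 - (1.0 - false_pos_rate) * p_miss_all

# A perfect method (p = 1, q = 0) always detects a contaminated sample:
assert prob_positive(1, 1.0, 0.0) == 1.0
```

Fitting such a model to spiked-sample data would let a laboratory estimate both parameters and hence characterize the method.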

  16. Validation Method of a Telecommunications Blackout Attack

    National Research Council Canada - National Science Library

    Amado, Joao; Nunes, Paulo

    2005-01-01

    ..., and to obtain the maximum disruptive effect over the services. The proposed method uses a top-down approach, starting on the service level and ending on the different network elements that can be identified in the end as the targets for the attack.

  17. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  18. A preliminary investigation of PSA validation methods

    Energy Technology Data Exchange (ETDEWEB)

    Unwin, S.D. [Science Applications International Corp. (United States)]

    1995-09-01

    This document has been prepared to support the initial phase of the Atomic Energy Control Board's program to review and evaluate Probabilistic Safety Assessment (PSA) studies conducted by nuclear generating station designers and licensees. The document provides (1) a review of current and prospective applications of PSA technology in the Canadian nuclear power industry; (2) an assessment of existing practices and techniques for the review of risk and hazard identification studies in the international nuclear power sector and other technological sectors; and (3) a proposed analytical framework in which to develop systematic techniques for the scrutiny and evaluation of a PSA model. This framework is based on consideration of the mathematical structure of a PSA model and is intended to facilitate the development of methods to evaluate a model relative to its intended end-uses. (author). 34 refs., 10 tabs., 3 figs.

  19. A preliminary investigation of PSA validation methods

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document has been prepared to support the initial phase of the Atomic Energy Control Board's program to review and evaluate Probabilistic Safety Assessment (PSA) studies conducted by nuclear generating station designers and licensees. The document provides (1) a review of current and prospective applications of PSA technology in the Canadian nuclear power industry; (2) an assessment of existing practices and techniques for the review of risk and hazard identification studies in the international nuclear power sector and other technological sectors; and (3) a proposed analytical framework in which to develop systematic techniques for the scrutiny and evaluation of a PSA model. This framework is based on consideration of the mathematical structure of a PSA model and is intended to facilitate the development of methods to evaluate a model relative to its intended end-uses. (author). 34 refs., 10 tabs., 3 figs.

  20. Qualitative-Geospatial Methods of Exploring Person-Place Transactions in Aging Adults: A Scoping Review.

    Science.gov (United States)

    Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri

    2017-06-01

    Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Method validation for strobilurin fungicides in cereals and fruit

    DEFF Research Database (Denmark)

    Christensen, Hanne Bjerre; Granby, Kit

    2001-01-01

    Strobilurins are a new class of fungicides that are active against a broad spectrum of fungi. In the present work a GC method for analysis of strobilurin fungicides was validated. The method was based on extraction with ethyl acetate/cyclohexane, clean-up by gel permeation chromatography (GPC) and determination of the content by gas chromatography (GC) with electron capture (EC-), nitrogen/phosphorous (NP-), and mass spectrometric (MS-) detection. Three strobilurins, azoxystrobin, kresoxim-methyl and trifloxystrobin were validated on three matrices, wheat, apple and grapes. The validation was based...

  2. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is however imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure a strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer-review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows achieving greater efficiency and effectiveness by avoiding duplication of effort and leveraging limited resources. In view of achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also includes Brazil and China as observers. This chapter describes the existing differences across world regions and major efforts carried out for achieving consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  3. Practical procedure for method validation in INAA- A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: robsonpetroni@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, along with the procedures, adopted strategies and acceptance criteria for the results; that is, how to carry out a method validation in INAA. To exemplify the methodology applied, results obtained for the validation of a method for mass fraction determination of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue, are presented. It was concluded that the methodology applied for validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025, and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)
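As an illustration of two of the characteristics listed (trueness expressed as relative bias against a reference value, and repeatability expressed as a relative standard deviation), here is a minimal sketch; the replicate values and the reference value are hypothetical, not data from the paper:

```python
import statistics

def relative_bias(measured, reference):
    """Trueness expressed as percent relative bias against a reference value
    (e.g. a certified or internal reference material value)."""
    return 100.0 * (statistics.mean(measured) - reference) / reference

def repeatability_rsd(measured):
    """Repeatability expressed as percent relative standard deviation."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Hypothetical replicate mass fractions (mg/kg) against a reference of 10.0:
replicates = [10.2, 9.8, 10.1, 10.0, 9.9]
bias = relative_bias(replicates, 10.0)   # close to 0 %
rsd = repeatability_rsd(replicates)      # about 1.6 %
```

In a real validation these figures would be compared against predefined acceptance criteria before the method is declared fit for purpose.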

  4. Practical procedure for method validation in INAA- A tutorial

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2015-01-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, along with the procedures, adopted strategies and acceptance criteria for the results; that is, how to carry out a method validation in INAA. To exemplify the methodology applied, results obtained for the validation of a method for mass fraction determination of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue, are presented. It was concluded that the methodology applied for validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025, and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)

  5. Evaluation methods for physical activity-promoting mobile technologies: an interdisciplinary scoping review

    Directory of Open Access Journals (Sweden)

    Claire McCallum

    2015-10-01

    There are many thousands of mobile apps, wearables and other technologies available to support and promote physical activity. However, the rapidly evolving nature of these technologies means that the methodologies traditionally used to evaluate the effectiveness of behaviour change interventions (such as the randomised controlled trial) may not be appropriate to evaluate their effectiveness. A scoping review was conducted to identify the methods currently being used to evaluate physical activity-promoting mobile technologies across health and computing science disciplines. In addition to the range of methods used, the review explored their strengths and weaknesses. The results improve understandings of when and why to use existing methods from health and computing science. Opportunities for combining and hybridising methods across the two disciplines are also identified. The review will be used to inform the development and piloting of novel, ‘fit-for-purpose’ research designs that will allow rigorous evaluation of the effectiveness of rapidly-evolving physical activity-promoting mobile technologies and their ‘active ingredients’ to build an evidence base of what works, why and for whom.

  6. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets (http://compms.org/RefData) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  7. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  8. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked plasma method for the extraction of diclofenac potassium that reduces the number of treatments with plasma sample, in order to minimize human error. Method: Instead of solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  9. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  10. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    Science.gov (United States)

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  11. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  12. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Science.gov (United States)

    2011-05-18

    ... d_m = the mean of the paired sample differences. n = total number of paired samples. 7.4.2 t Test ... being compared to a validated test method as part of the Method 301 validation and an audit sample for ... tighten the acceptance criteria for the precision of candidate alternative test methods. One commenter ...
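The quantities named in the excerpt (d_m, the mean of the paired sample differences, and n, the total number of paired samples) feed the t test it mentions. A minimal sketch, with hypothetical data, of how that statistic is formed:

```python
import math
import statistics

def paired_t_statistic(candidate, validated):
    """t statistic on the mean of paired differences between a candidate
    test method and the validated method it is compared against."""
    diffs = [c - v for c, v in zip(candidate, validated)]
    d_m = statistics.mean(diffs)    # mean of the paired sample differences
    s_d = statistics.stdev(diffs)   # standard deviation of the differences
    n = len(diffs)                  # total number of paired samples
    return d_m / (s_d / math.sqrt(n))

# Hypothetical paired measurements from the two methods:
candidate = [10.1, 9.9, 10.3, 10.0, 10.2, 9.8]
validated = [10.0, 10.0, 10.1, 9.9, 10.1, 9.9]
t = paired_t_statistic(candidate, validated)
```

The computed t would then be compared against a critical value at the chosen significance level with n - 1 degrees of freedom to decide whether the candidate method is biased relative to the validated one.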

  13. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. NAA (neutron activation analysis) is a nuclear analytical technique with low detection limits, high specificity, high precision and accuracy for the large majority of naturally occurring elements, the ability to determine multiple elements simultaneously and non-destructively, and the capacity to handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be performed. A standard reference material, SRM NIST 1648 Urban Particulate Matter, was used to validate the NAA method, with accuracy and precision tests as validation parameters. The particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviation of the measured elemental concentrations ranged from 2 to 14.8% for most of the elements analyzed, whereas the HorRat values were in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for the characterization of particulate matter and other samples of similar matrix to support air quality monitoring. (author)
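The HorRat value cited above is the observed relative standard deviation divided by the value predicted by the Horwitz equation. A minimal sketch (the example concentration is hypothetical, not from the study):

```python
def horwitz_prsd(mass_fraction):
    """Predicted %RSD from the Horwitz equation, PRSD = 2 * C**(-0.1505),
    with C a dimensionless mass fraction (e.g. 1 mg/kg -> 1e-6)."""
    return 2.0 * mass_fraction ** (-0.1505)

def horrat(observed_rsd_percent, mass_fraction):
    """HorRat = observed %RSD / predicted %RSD; values of roughly 0.3-1.3,
    as reported in the abstract, indicate acceptable precision."""
    return observed_rsd_percent / horwitz_prsd(mass_fraction)

# An observed 8 % RSD at a mass fraction of 1 mg/kg (C = 1e-6):
hr = horrat(8.0, 1e-6)   # about 0.5
```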

  14. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety.

  15. Development and Validation of a Liquid Chromatographic Method ...

    African Journals Online (AJOL)

    A liquid chromatographic method for the simultaneous determination of six human immunodeficiency virus (HIV) protease inhibitors, indinavir, saquinavir, ritonavir, amprenavir, nelfinavir and lopinavir, was developed and validated. Optimal separation was achieved on a PLRP-S 100 Å, 250 x 4.6 mm I.D. column maintained ...

  16. Validated RP-HPLC Method for Quantification of Phenolic ...

    African Journals Online (AJOL)

    Purpose: To evaluate the total phenolic content and antioxidant potential of the methanol extracts of aerial parts and roots of Thymus sipyleus Boiss and also to determine some phenolic compounds using a newly developed and validated reversed phase high performance liquid chromatography (RP-HPLC) method.

  17. Comparison of the performances and validation of three methods for ...

    African Journals Online (AJOL)

    SARAH

    2014-02-28

    Feb 28, 2014 ... bacteria in Norwegian slaughter pigs. Int. J. Food Microbiol. 1, 301–309. [NCFA] Nordic Committee of Food Analysis (1996). Yersinia enterocolitica: detection in foods, 3rd edn, 1–12. Nowak, B., Mueffling, T.V., Caspari, K. and Hartung, J. 2006. Validation of a method for the detection of virulent Yersinia ...

  18. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as important advances in method engineering procedures, which have improved the efficiency of the process. The initiative has also allowed researchers to hurdle many of the barricades that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling.

  19. The Validity of Dimensional Regularization Method on Fractal Spacetime

    Directory of Open Access Journals (Sweden)

    Yong Tao

    2013-01-01

    Svozil developed a regularization method for quantum field theory on fractal spacetime (1987). Such a method can be applied to the low-order perturbative renormalization of quantum electrodynamics but depends on a conjectural integral formula on non-integer-dimensional topological spaces. The main purpose of this paper is to construct a fractal measure so as to guarantee the validity of the conjectural integral formula.

  20. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; this would estimate the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, showing how it can be used to estimate whether an approximation over- or under-fits the original model, to invalidate an approximation, and to rank possible approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
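The validation idea described above, judging an approximation's error against the original model's own stochastic variability, can be sketched in miniature (a toy illustration under assumed names, not the authors' implementation):

```python
import random
import statistics

def validate_approximation(stochastic_run, approx_value, n_runs=200, seed=1):
    """Judge an analytical approximation against the original stochastic
    model: the approximation 'fits' if its deviation from the simulated mean
    is within the model's own run-to-run variability (here, +/- 2 SD)."""
    rng = random.Random(seed)
    outcomes = [stochastic_run(rng) for _ in range(n_runs)]
    mean = statistics.mean(outcomes)
    sd = statistics.stdev(outcomes)
    return abs(approx_value - mean) <= 2 * sd

# Toy stochastic model: outcome ~ Normal(5, 1).
# An approximation predicting 5.1 fits; one predicting 9.0 is invalidated.
fits = validate_approximation(lambda rng: rng.gauss(5, 1), 5.1)
fails = validate_approximation(lambda rng: rng.gauss(5, 1), 9.0)
```

Sweeping such a check over a grid of parameter sets, rather than a single one, is the generalization the abstract proposes.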

  1. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is however a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity.
On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
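Two of the polygon-level rules named above (a closed bounding linear ring, and planarity within a tolerance) can be sketched as follows; this is an illustrative implementation, not CityDoctor's code:

```python
def ring_is_closed(ring, tol=1e-9):
    """A polygon's bounding linear ring is closed when its first and last
    vertices coincide (within tolerance)."""
    return all(abs(a - b) <= tol for a, b in zip(ring[0], ring[-1]))

def ring_is_planar(ring, tol=1e-6):
    """Check planarity: every vertex must lie on the plane spanned by the
    first three (assumed non-collinear) vertices, within a distance tolerance."""
    (x0, y0, z0), p1, p2 = ring[0], ring[1], ring[2]
    u = (p1[0] - x0, p1[1] - y0, p1[2] - z0)
    v = (p2[0] - x0, p2[1] - y0, p2[2] - z0)
    # Plane normal n = u x v (cross product).
    n = (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
    norm = (n[0]**2 + n[1]**2 + n[2]**2) ** 0.5
    return all(
        abs((p[0]-x0)*n[0] + (p[1]-y0)*n[1] + (p[2]-z0)*n[2]) / norm <= tol
        for p in ring
    )

# A unit square in the z = 0 plane passes both checks:
square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0)]
assert ring_is_closed(square) and ring_is_planar(square)
```

The tolerance parameters mirror the paper's point that meaningful tolerance values are central to each check.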

  2. Shielding design method for LMFBR validation on the Phenix reactor

    International Nuclear Information System (INIS)

    Cabrillat, J.C.; Crouzet, J.; Misrakis, J.; Salvatores, M.; Rado, V.; Palmiotti, G.

    1983-05-01

    Shielding design methods developed at CEA for shielding calculations find a global validation by means of measurements at the Phenix power reactor (250 MWe). In particular, the secondary sodium activation of a pool-type LMFBR such as Super Phenix (1200 MWe), which is subject to strict safety limitations, is well calculated by the adapted scheme, i.e. a two-dimensional transport calculation of the shielding coupled to a Monte Carlo calculation of the secondary sodium activation.

  3. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

    Workplace injuries are costing the Australian coal mining industry and its communities $410 million a year. This ACARP study aims to meet those demands by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands and an overall PEFA score between 1 and 4 is given, with 1 being the best score. The reliability study and validity study were conducted concurrently. The reliability study examined test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, which was sufficient for comparison with injury data in determining the validity of the assessment. The overall assessment score and material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open-cut coal mine with their injury records. A predictive relationship was found between PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between a PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury compared with other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment method.

  4. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component of quality systems in laboratories, serving as a powerful tool for the standardization and reliability of outcomes. This paper presents a robustness study conducted as part of an SSNTD technique validation process, with the goal of performing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in the face of unexpected variations during the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Taking the ideal value for each condition as a reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, each with its own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimates of the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters of the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
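The Youden design described above (7 two-level analytical conditions screened in 8 runs) can be sketched in code. This is a generic illustration of the classic Youden/Steiner ruggedness design, not the authors' software; the synthetic radon results below are invented to show how factor effects are estimated.

```python
# Youden/Steiner ruggedness design: 7 two-level factors in 8 runs.
# Each row is one etching procedure; +1 = high level, -1 = low level.
# The design is orthogonal: every factor appears 4 times at each level.
DESIGN = [
    [+1, +1, +1, +1, +1, +1, +1],
    [+1, +1, -1, +1, -1, -1, -1],
    [+1, -1, +1, -1, +1, -1, -1],
    [+1, -1, -1, -1, -1, +1, +1],
    [-1, +1, +1, -1, -1, +1, -1],
    [-1, +1, -1, -1, +1, -1, +1],
    [-1, -1, +1, +1, -1, -1, +1],
    [-1, -1, -1, +1, +1, +1, -1],
]

def youden_effects(results):
    """Effect of each factor = mean(result at high) - mean(result at low)."""
    n_factors = len(DESIGN[0])
    effects = []
    for j in range(n_factors):
        high = [r for row, r in zip(DESIGN, results) if row[j] == +1]
        low = [r for row, r in zip(DESIGN, results) if row[j] == -1]
        effects.append(sum(high) / len(high) - sum(low) / len(low))
    return effects

# Hypothetical radon concentrations (Bq/m3) from the 8 runs, constructed so
# that only the first factor (e.g. etching solution concentration) matters.
results = [103, 103, 103, 103, 97, 97, 97, 97]
effects = youden_effects(results)
```

Because the design is orthogonal, a factor with no real influence averages out to a near-zero effect, which is how the critical etching parameters stand out.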

  5. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component of quality systems in laboratories, serving as a powerful tool for the standardization and reliability of outcomes. This paper presents a robustness study conducted as part of an SSNTD technique validation process, with the goal of performing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in the face of unexpected variations during the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Taking the ideal value for each condition as a reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, each with its own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimates of the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters of the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  6. How far is mixed methods research in the field of health policy and systems in Africa? A scoping review.

    Science.gov (United States)

    De Allegri, M; Sieleunou, I; Abiiro, G A; Ridde, V

    2018-04-01

    Both the academic and the policy community are calling for wider application of mixed methods research, suggesting that the combined use of quantitative and qualitative methods is most suitable to assess and understand the complexities of health interventions. In spite of recent growth in mixed methods studies, limited efforts have been directed towards appraising and synthesizing to what extent and how mixed methods have been applied specifically to Health Policy and Systems Research (HPSR) in low- and middle-income countries (LMICs). We aimed to fill this gap in knowledge by exploring the scope and quality of mixed methods research in the African context. We conducted a scoping review applying the framework developed by Arksey and O'Malley and modified by Levac et al. to identify and extract data from relevant studies published between 1950 and 2013. We limited our search to peer-reviewed HPSR publications in English, which combined at least one qualitative and one quantitative method and focused on Africa. Among the 105 studies that were retained for data extraction, over 60% were published after 2010. Nearly 50% of all studies addressed topics relevant to Health Systems, while Health Policy and Health Outcomes studies accounted for 40% and 10% of all publications, respectively. The quality of the application of mixed methods varied greatly across studies, with a relatively small proportion of studies stating clearly defined research questions and differentiating quantitative and qualitative elements, including sample sizes and analytical approaches. The methodological weaknesses observed could be linked to the paucity of specific training opportunities available to people interested in applying mixed methods to HPSR in LMICs, as well as to the limitations of word limits, scope and peer-review processes at the journal level. Increasing training opportunities and enhancing journal flexibility may result in more and better quality mixed methods publications.

  7. Stepwise Procedure for Development and Validation of a Multipesticide Method

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    A stepwise procedure for the development and validation of so-called multi-pesticide methods is described. Principles, preliminary actions, and criteria for the selection of chromatographic separation, detection and performance verification of multi-pesticide methods are outlined. Long-term repeatability and reproducibility, as well as the necessity of documenting laboratory work, are also highlighted. Appendix I hereof describes in detail the calculation of calibration parameters, whereas Appendix II focuses on the calculation of the significance of differences between concentrations obtained on two different separation columns. (author)
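A check like the one in Appendix II, whether concentrations obtained on two different separation columns differ significantly, is commonly carried out as a paired t-test on the per-sample differences. A minimal sketch of that idea follows; the concentration values and the 95% critical value are illustrative assumptions, not figures taken from the Appendix.

```python
import math

def paired_t_statistic(conc_a, conc_b):
    """t statistic for the mean of paired differences between two columns."""
    diffs = [a - b for a, b in zip(conc_a, conc_b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 degrees of freedom).
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical pesticide concentrations (mg/kg) from two GC columns.
col_1 = [0.52, 0.48, 0.61, 0.55, 0.50, 0.58]
col_2 = [0.50, 0.47, 0.60, 0.56, 0.49, 0.57]
t = paired_t_statistic(col_1, col_2)
# Two-sided critical value t(0.05, n-1) is about 2.571 for n = 6 pairs.
significant = abs(t) > 2.571
```

If |t| stays below the critical value, the two columns can be regarded as giving equivalent concentrations at the chosen significance level.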

  8. OWL-based reasoning methods for validating archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role in the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been an increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This makes it possible to combine the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning over those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals different patterns of errors in the two repositories. This result reinforces the need for serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Survey and assessment of conventional software verification and validation methods

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-04-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 134 methods so identified were classified according to their appropriateness for various phases of a developmental lifecycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as for the system as a whole.
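A rank-ordering by cost-benefit like the one described can be sketched as a simple score per method: the four ease-of-use ratings proxy cost, the four defect-detection ratings proxy benefit, and methods are sorted by benefit over cost. The method names, ratings, scales and the benefit/cost formula below are all invented for illustration; the report's actual metric definitions are not reproduced here.

```python
# Rank-order V&V methods by a simple cost-benefit score (illustrative).
# Each method: four ease-of-use ratings (higher = cheaper to apply) and
# four defect-detection-power ratings (higher = more defects found), 1..5.
methods = {
    "code inspection":      {"ease": [4, 3, 4, 4], "power": [4, 4, 3, 3]},
    "static analysis":      {"ease": [5, 4, 4, 5], "power": [3, 3, 3, 2]},
    "mutation testing":     {"ease": [2, 2, 3, 2], "power": [5, 4, 4, 5]},
    "requirements tracing": {"ease": [3, 4, 3, 3], "power": [2, 3, 2, 2]},
}

def cost_benefit(ratings):
    """Benefit/cost proxy: mean detection power over mean inverted ease."""
    ease = sum(ratings["ease"]) / len(ratings["ease"])
    power = sum(ratings["power"]) / len(ratings["power"])
    cost = 6 - ease  # invert the 1..5 ease scale into a cost
    return power / cost

ranked = sorted(methods, key=lambda m: cost_benefit(methods[m]), reverse=True)
```

With these made-up ratings, a cheap-to-apply method can outrank a more powerful but costly one, which is exactly the trade-off such a metric is meant to expose.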

  10. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means of testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for the representation of the fault occurrence process with the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool - MESSALINE - is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE to testing two fault-tolerant systems possessing very dissimilar features, and the utilization of the experimental results obtained - both as design feedback and for the evaluation of dependability measures - are used to illustrate the relevance of the method. (author) [fr]

  11. A New Method for Blood NT-proBNP Determination Based on a Near-infrared Point of Care Testing Device with High Sensitivity and Wide Scope.

    Science.gov (United States)

    Zhang, Xiao Guang; Shu, Yao Gen; Gao, Ju; Wang, Xuan; Liu, Li Peng; Wang, Meng; Cao, Yu Xi; Zeng, Yi

    2017-06-01

    To develop a rapid, highly sensitive, and quantitative method for the detection of NT-proBNP levels based on a near-infrared point-of-care testing (POCT) device with wide scope. The lateral flow assay (LFA) strip for NT-proBNP was first prepared to achieve rapid detection. Then, the antibody pairs for NT-proBNP were screened and labeled with the near-infrared fluorescent dye Dylight-800. The capture antibody was fixed on a nitrocellulose membrane by a scribing device. Serial dilutions of serum samples were prepared using an NT-proBNP-free serum series. The prepared test strips, combined with a near-infrared POCT device, were validated with known concentrations of clinical samples. The POCT device output the ratio of the intensity of the fluorescence signal of the detection line to that of the quality control line. The relationship between the ratio value and the concentration of the specimen was plotted as a working curve. The results of 62 clinical specimens obtained from our method were compared in parallel with those obtained from the Roche E411 kit. Based on the log-log plot, the new method demonstrated a good linear relationship between the ratio value and NT-proBNP concentrations ranging from 20 pg/mL to 10 ng/mL. The results of the 62 clinical specimens measured by our method showed a good linear correlation with those measured by the Roche E411 kit. The new LFA detection method for NT-proBNP levels based on the near-infrared POCT device was rapid and highly sensitive with wide scope, and was thus suitable for rapid and early clinical diagnosis of cardiac impairment. Copyright © 2017 The Editorial Board of Biomedical and Environmental Sciences. Published by China CDC. All rights reserved.
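The log-log working curve described above can be illustrated with a least-squares fit of log10(signal ratio) against log10(concentration), inverted to read unknown concentrations off the curve. The calibrator values below are invented to span the reported 20 pg/mL to 10 ng/mL range; they are not the paper's data, and the function names are ours.

```python
import math

def fit_loglog(concs, ratios):
    """Least-squares fit of log10(ratio) = slope * log10(conc) + intercept."""
    xs = [math.log10(c) for c in concs]
    ys = [math.log10(r) for r in ratios]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def ratio_to_conc(ratio, slope, intercept):
    """Invert the working curve: concentration for a measured signal ratio."""
    return 10 ** ((math.log10(ratio) - intercept) / slope)

# Hypothetical calibrators spanning the reported range (pg/mL) and the
# corresponding detection-line / control-line intensity ratios.
concs = [20, 100, 500, 2000, 10000]
ratios = [0.05, 0.21, 0.9, 3.2, 14.0]
slope, intercept = fit_loglog(concs, ratios)
```

Once fitted, `ratio_to_conc` maps a measured ratio from the reader back to an NT-proBNP concentration on the calibrated range.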

  12. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been established. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of the research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  13. Validation of ultraviolet method to determine serum phosphorus level

    International Nuclear Information System (INIS)

    Garcia Borges, Lisandra; Perez Prieto, Teresa Maria; Valdes Diez, Lilliam

    2009-01-01

    Validation of a spectrophotometric method applicable in clinical laboratories was proposed for the analytical assessment of serum phosphates using a UV-phosphorus kit of domestic production from the 'Carlos J. Finlay' Biologics Production Enterprise (Havana, Cuba). The analysis method was based on the reaction of phosphorus with ammonium molybdate at acid pH, creating a complex measurable at 340 nm. Specificity and precision were assessed, together with the method's robustness, linearity, accuracy and sensitivity. The analytical method was linear up to 4.8 mmol/L and precise (CV < 3%, r > 0.999) over the concentration interval of clinical interest, where there were no matrix interferences. The detection limit was 0.037 mmol/L and the quantification limit 0.13 mmol/L, both satisfactory for the intended use of the product.

  14. Validity of the CT to attenuation coefficient map conversion methods

    International Nuclear Information System (INIS)

    Faghihi, R.; Ahangari Shahdehi, R.; Fazilat Moadeli, M.

    2004-01-01

    The most important commercialized methods of attenuation correction in SPECT are based on an attenuation coefficient map from a transmission imaging method. The transmission imaging system can be a linear radionuclide source or an X-ray CT system. The image from the transmission imaging system is not useful unless the attenuation coefficient, or the CT number, is replaced with the attenuation coefficient at the SPECT energy. In this paper we evaluate the validity, and estimate the error, of the most widely used methods for this transformation. The final result shows that the methods which use a linear or multi-linear curve accept an error in their estimation. The value of mA is not important, but the patient thickness is very important and can introduce an error of more than 10 percent in the final result.
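The linear or multi-linear curves the paper evaluates are typically implemented as a bilinear mapping from CT number (HU) to the linear attenuation coefficient at the emission energy. The sketch below uses illustrative coefficients (a water value of about 0.154 cm^-1 near 140 keV and a reduced slope above 0 HU to approximate bone); the exact values are assumptions for the example, not taken from the paper.

```python
# Bilinear CT-number-to-attenuation-coefficient conversion (illustrative).
MU_WATER_140KEV = 0.154   # cm^-1, linear attenuation of water near 140 keV
BONE_SLOPE_SCALE = 0.5    # reduced slope above 0 HU (bone is not water-like)

def hu_to_mu(hu):
    """Map a CT number in Hounsfield units to mu (cm^-1) at the SPECT energy."""
    if hu <= 0:
        # Air (-1000 HU) to water (0 HU): simple linear interpolation.
        return max(0.0, MU_WATER_140KEV * (1.0 + hu / 1000.0))
    # Above water: a second, shallower segment approximates bone.
    return MU_WATER_140KEV * (1.0 + BONE_SLOPE_SCALE * hu / 1000.0)
```

Applying `hu_to_mu` voxel by voxel to the CT image yields the attenuation map used for SPECT correction; the paper's point is that any such piecewise-linear curve carries an intrinsic estimation error.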

  15. Methods and practices for verification and validation of programmable systems

    International Nuclear Information System (INIS)

    Heimbuerger, H.; Haapanen, P.; Pulkkinen, U.

    1993-01-01

    Programmable systems deviate in their properties and behaviour from conventional non-programmable systems to such an extent that their verification and validation for safety-critical applications requires new methods and practices. The safety assessment cannot be based on conventional probabilistic methods due to the difficulties in quantifying the reliability of the software and hardware. The reliability estimate of the system must be based on qualitative arguments linked to a conservative claim limit. Due to the uncertainty of the quantitative reliability estimate, other means must be used to gain more assurance about the system's safety. Methods and practices based on research done by VTT for STUK are discussed in the paper, as well as the methods applicable in the reliability analysis of software-based safety functions. The most essential concepts and models of quantitative reliability analysis are described. The application of software models in probabilistic safety analysis (PSA) is evaluated. (author). 18 refs

  16. The Validation of NAA Method Used as Test Method in Serpong NAA Laboratory

    International Nuclear Information System (INIS)

    Rina-Mulyaningsih, Th.

    2004-01-01

    The NAA method is a non-standard testing method. A testing laboratory shall validate the methods it uses to ensure and confirm that they are suitable for their application. The validation of NAA methods has been done with the parameters of accuracy, precision, repeatability and selectivity. NIST 1573a Tomato Leaves, NIES 10C unpolished rice flour and standard elements were used in this testing program. The results of testing with NIST 1573a showed that the elements Na, Zn, Al and Mn met the acceptance criteria for accuracy and precision, whereas Co did not. The results of testing with NIES 10C showed that the Na and Zn elements met the acceptance criteria for accuracy and precision, but the Mn element did not. The result of the selectivity test showed that the quantifiable amount is between 0.1 and 2.5 μg, depending on the element. (author)

  17. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  18. Parallel Resolved Open Source CFD-DEM: Method, Validation and Application

    Directory of Open Access Journals (Sweden)

    A. Hager

    2014-03-01

    Full Text Available In the following paper the authors present a fully parallelized Open Source method for calculating the interaction of immersed bodies and the surrounding fluid. A combination of computational fluid dynamics (CFD) and a discrete element method (DEM) accounts for the physics of both the fluid and the particles. The objects considered are large relative to the cells of the fluid mesh, i.e. they each cover several cells. Thus this fictitious domain method (FDM) is called resolved. The implementation is realized within the Open Source framework CFDEMcoupling (www.cfdem.com), which provides an interface between OpenFOAM® based CFD solvers and the DEM software LIGGGHTS (www.liggghts.com). While both LIGGGHTS and OpenFOAM® were already parallelized, only a recent improvement of the algorithm permits the fully parallel computation of resolved problems. Alongside a detailed description of the method, its implementation and recent improvements, a number of application and validation examples are presented in the scope of this paper.

  19. A proactive alarm reduction method and its human factors validation test for a main control room for SMART

    International Nuclear Information System (INIS)

    Jang, Gwi-sook; Suh, Sang-moon; Kim, Sa-kil; Suh, Yong-suk; Park, Je-yun

    2013-01-01

    Highlights: ► A proactive alarm reduction method improves the effectiveness of alarm reduction. ► The method suppresses alarms based on ECA rules and facts for alarm reduction under an alarm flood situation. ► The alarm reduction logics are supplemented to achieve a high hit ratio of the reduction logics during on-line operations. ► The method is validated by a human factors validation test based on regulatory requirements. -- Abstract: Conventional alarm systems tend to overwhelm operators during a transient because of a large number of nearly simultaneous annunciator activations with varying degrees of relevance to operator tasks. Alarm processing techniques have therefore been developed to support operators in coping with the volume of alarms, to identify which alarms are significant, and to reduce the need for operators to infer the plant conditions. This paper proposes a proactive alarm reduction method for SMART (System-integrated Modular Advanced ReacTor), whereby alarm reduction during the next transient is carried out based on the recorded effects of past operation. We designed and implemented the proactive alarm reduction system and constructed the environment for the human factors validation test. Eight subjects actually working in a nuclear power plant (NPP) then tested the practical effectiveness of the proposed proactive alarm reduction method according to the procedure of the human factors validation test, under a partial-scope dynamic simulation of an NPP.
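The ECA (event-condition-action) suppression mentioned in the highlights can be sketched as a small rule engine: each rule names a triggering event, a condition on current plant facts, and the alarms to suppress. The rule contents, alarm tags and fact names below are invented for illustration and do not reproduce the SMART logic.

```python
# Minimal event-condition-action (ECA) alarm-suppression sketch.
# Each rule: on EVENT, if CONDITION(facts) holds, suppress the listed alarms.
RULES = [
    # If a reactor trip is confirmed, downstream low-flow/low-level alarms
    # are expected consequences and can be suppressed as redundant.
    ("REACTOR_TRIP", lambda f: f.get("trip_confirmed", False),
     {"RCP_LOW_FLOW", "SG_LOW_LEVEL"}),
    ("LOSS_OF_OFFSITE_POWER", lambda f: f.get("diesel_started", False),
     {"BUS_UNDERVOLTAGE"}),
]

def reduce_alarms(event, facts, active_alarms):
    """Return the alarms left after applying every matching ECA rule."""
    suppressed = set()
    for rule_event, condition, targets in RULES:
        if rule_event == event and condition(facts):
            suppressed |= targets
    return [a for a in active_alarms if a not in suppressed]

alarms = ["REACTOR_TRIP", "RCP_LOW_FLOW", "SG_LOW_LEVEL", "PRZ_HIGH_PRESSURE"]
remaining = reduce_alarms("REACTOR_TRIP", {"trip_confirmed": True}, alarms)
```

Under an alarm flood, only the root-cause alarm and any genuinely independent alarms survive, which is the operator-workload effect the paper's method targets.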

  20. The Validity of International Sales Contracts: Irrelevance of the 'Validity Exception' in Article 4 Vienna Sales Convention and a Novel Approach to Determining the Convention's Scope

    OpenAIRE

    Schroeter, Ulrich

    2017-01-01

    in: Ingeborg Schwenzer and Lisa Spagnolo (eds.), Boundaries and Intersections: The 5th Annual MAA Schlechtriem CISG Conference, The Hague: Eleven International Publishing (2014), pp. 95-117 Throughout the history of uniform law for international sales, the rules governing the validity of cross-border sales contracts have proven particularly difficult to harmonize because they differ greatly between the various domestic laws. This dilemma inter alia resulted in the "validity exception" in Arti...

  1. A broad scope knowledge based model for optimization of VMAT in esophageal cancer: validation and assessment of plan quality among different treatment centers

    International Nuclear Information System (INIS)

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca

    2015-01-01

    To evaluate the performance of a broad-scope model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 previously treated patients from two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients from the same institution and from another clinic that did not provide patients for the training phase. The automated plans were compared against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3 %) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5 %) did the reference plans pass the criteria while the model-based plans failed. In 5.3 % of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad-scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application in clinical practice. The online version of this article (doi:10.1186/s13014-015-0530-5) contains supplementary material, which is available to authorized users

  2. Method validation and stability study of quercetin in topical emulsions

    Directory of Open Access Journals (Sweden)

    Rúbia Casagrande

    2009-01-01

    Full Text Available This study validated a high performance liquid chromatography (HPLC) method for the quantitative evaluation of quercetin in topical emulsions. The method was linear within the 0.05 - 200 μg/mL range, with a correlation coefficient of 0.9997 and no interference in the quercetin peak. The detection and quantitation limits were 18 and 29 ng/mL, respectively. The intra- and inter-assay precisions presented R.S.D. values lower than 2%. An average of 93% and 94% of quercetin was recovered for the non-ionic and anionic emulsions, respectively. The raw material and the anionic emulsion, but not the non-ionic emulsion, were stable under all storage conditions for one year. The method reported is a fast and reliable HPLC technique useful for quercetin determination in topical emulsions.
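Figures of merit like those reported (linearity, correlation coefficient, detection and quantitation limits) are commonly derived from the calibration line; one widespread convention (ICH Q2) takes LOD = 3.3·σ/S and LOQ = 10·σ/S, with σ the residual standard deviation and S the slope. The calibration numbers below are synthetic, not the paper's data, and this formula is only one of several accepted ways to estimate LOD/LOQ.

```python
import math

def calibration_stats(x, y):
    """Least-squares line with correlation coefficient and residual SD."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    resid_sd = math.sqrt(sum(r * r for r in residuals) / (n - 2))
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r, resid_sd

# Synthetic HPLC calibration: concentration (ug/mL) vs peak area.
conc = [0.05, 0.5, 5, 50, 100, 200]
area = [2.1, 20.4, 204, 2041, 4080, 8163]
slope, intercept, r, resid_sd = calibration_stats(conc, area)
lod = 3.3 * resid_sd / slope   # ICH-style detection limit
loq = 10 * resid_sd / slope    # ICH-style quantitation limit
```

The same regression also supplies the correlation coefficient used to document linearity over the working range.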

  3. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  4. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

    Full Text Available We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  5. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  6. Laboratory diagnostic methods, system of quality and validation

    Directory of Open Access Journals (Sweden)

    Ašanin Ružica

    2005-01-01

    Full Text Available It is known that laboratory investigations secure safe and reliable results that provide a final confirmation of the quality of work. Ideas, planning, knowledge, skills, experience, and environment, along with good laboratory practice, quality control and reliability of quality, make the area of biological investigations very complex. In recent years, quality control, including the control of work in the laboratory, has been based on international standards and is applied at that level. The implementation of widely recognized international standards, such as the International Standard ISO/IEC 17025 (1), and the implementation of the ISO/IEC 9000 series quality system (2), have become the imperative on the grounds of which laboratories have a formal, visible and corresponding system of quality. The diagnostic methods that are used must consistently yield results that identify the animal as positive or negative, and the precise status of the animal is determined with a predefined degree of statistical significance. Methods applied to a selected population reduce the risk of obtaining falsely positive or falsely negative results. A condition for this is well-conceived and documented methods, with the application of the corresponding reagents, and work by professional and skilled staff. This process also requires consistent implementation of the most rigorous experimental plans, epidemiological and statistical data and estimations, with constant monitoring of the validity of the applied methods. Such an approach is necessary in order to cut down the number of misconceptions and accidental mistakes for a referent population of animals on which the validity of a method is tested. Once a valid method is included in daily routine investigations, it is necessary to apply constant monitoring for the purpose of internal quality control, in order to adequately evaluate its reproducibility and reliability. Consequently, it is necessary at least twice yearly to conduct

  7. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database to be used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the following projects: (1) improving the Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  8. Validation study of core analysis methods for full MOX BWR

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    JNES has been developing a technical database to be used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the following projects: (1) improving the Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO{sub 2} and MOX fuel rods, (3) analysis of isotopic composition data for UO{sub 2} and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  9. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

    We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9).
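
The comparison reported in the abstract above reduces to per-animal percent deviations of one method from a reference method, then an average. A minimal sketch, with made-up values rather than the verdin data:

```python
# Illustrative only: percent deviation of single-sample doubly labeled water
# (DLW) estimates from a reference method. The numbers are assumptions.
haldane = [2.10, 1.85, 1.95]   # CO2 production by the reference technique (ml/min)
dlw_one = [2.08, 1.92, 1.90]   # single-sample DLW estimates for the same animals

pct_diff = [100.0 * (d - h) / h for d, h in zip(dlw_one, haldane)]
mean_pct_diff = sum(pct_diff) / len(pct_diff)
```

Reporting both the mean difference and the per-individual range, as the study does, guards against a small mean hiding large individual errors.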

  10. Validation and further development of a novel thermal analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, E.H.; Shuttleworth, A.G.; Rousseau, P.G. [Pretoria Univ. (South Africa). Dept. of Mechanical Engineering

    1994-12-31

    The design of thermal and energy efficient buildings requires inter alia the investigation of the passive performance, natural ventilation, mechanical ventilation as well as structural and evaporative cooling of the building. Only when these fail to achieve the desired thermal comfort should mechanical cooling systems be considered. Few computer programs have the ability to investigate all these comfort regulating methods at the design stage. The QUICK design program can simulate these options with the exception of mechanical cooling. In this paper, QUICK's applicability is extended to include the analysis of basic air-conditioning systems. Since the design of these systems is based on indoor loads, it was necessary to validate QUICK's load predictions before extending it. This article addresses validation in general and proposes a procedure to establish the efficiency of a program's load predictions. This proposed procedure is used to compare load predictions by the ASHRAE, CIBSE, CARRIER, CHEETAH, BSIMAC and QUICK methods for 46 case studies involving 36 buildings in various climatic conditions. Although significant differences in the results of the various methods were observed, it is concluded that QUICK can be used with the same confidence as the other methods. It was further shown that load prediction programs usually under-estimate the effect of building mass and therefore over-estimate the peak loads. The details for the 46 case studies are available to other researchers for further verification purposes. With the confidence gained in its load predictions, QUICK was extended to include air-conditioning system analysis. The program was then applied to different case studies. It is shown that system size and energy usage can be reduced by more than 60% by using a combination of passive and mechanical cooling systems as well as different control strategies. (author)

  11. Validation of internal dosimetry protocols based on stochastic method

    International Nuclear Information System (INIS)

    Mendes, Bruno M.; Fonseca, Telma C.F.; Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R.

    2015-01-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry fields. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV energies were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAFs obtained in IDP simulations and the RVs was 2.3 %. The largest SAF differences were found in situations involving low energy photons at 30 keV. The adrenals and thyroid, i.e. the lowest-mass organs, had the highest SAF discrepancies from the RVs, at 7.2 % and 3.8 %, respectively. The statistical differences of the SAFs obtained with our IDPs from reference values were considered acceptable for the 30, 100 and 1000 keV spectra. We believe that the main reason for the discrepancies found in the lower-mass organs was our source definition methodology. Improvements to the source spatial distribution in the voxels may provide outputs more consistent with reference values for lower-mass organs. (author)
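
The intercomparison step described above is, computationally, a per-organ percent difference of simulated values against published reference values. A minimal sketch with placeholder numbers (not the Golem/MCNP5 results):

```python
# Sketch of the intercomparison step: percent difference of simulated specific
# absorbed fractions (SAF) from published reference values (RV).
# Organ names and values are placeholders, not the study's data.
reference = {"liver": 1.2e-2, "thyroid": 5.0e-3, "adrenals": 2.1e-3}
simulated = {"liver": 1.22e-2, "thyroid": 5.19e-3, "adrenals": 2.25e-3}

diff_pct = {organ: 100.0 * abs(simulated[organ] - reference[organ]) / reference[organ]
            for organ in reference}
worst_organ = max(diff_pct, key=diff_pct.get)   # flags where the model departs most
```

In this toy example the smallest organ shows the largest relative discrepancy, mirroring the pattern the abstract attributes to source definition in low-mass organs.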

  12. Validation of internal dosimetry protocols based on stochastic method

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Bruno M.; Fonseca, Telma C.F., E-mail: bmm@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Almeida, Iassudara G.; Trindade, Bruno M.; Campos, Tarcisio P.R., E-mail: tprcampos@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2015-07-01

    Computational phantoms adapted to Monte Carlo codes have been applied successfully in radiation dosimetry fields. The NRI research group has been developing Internal Dosimetry Protocols (IDPs), addressing distinct methodologies, software and computational human simulators, to perform internal dosimetry, especially for new radiopharmaceuticals. Validation of the IDPs is critical to ensure the reliability of the simulation results. Intercomparison of data from the literature with those produced by our IDPs is a suitable method for validation. The aim of this study was to validate the IDPs following such an intercomparison procedure. The Golem phantom was reconfigured to run on MCNP5. The specific absorbed fractions (SAF) for photons at 30, 100 and 1000 keV energies were simulated based on the IDPs and compared with reference values (RV) published by Zankl and Petoussi-Henss, 1998. The average difference between the SAFs obtained in IDP simulations and the RVs was 2.3 %. The largest SAF differences were found in situations involving low energy photons at 30 keV. The adrenals and thyroid, i.e. the lowest-mass organs, had the highest SAF discrepancies from the RVs, at 7.2 % and 3.8 %, respectively. The statistical differences of the SAFs obtained with our IDPs from reference values were considered acceptable for the 30, 100 and 1000 keV spectra. We believe that the main reason for the discrepancies found in the lower-mass organs was our source definition methodology. Improvements to the source spatial distribution in the voxels may provide outputs more consistent with reference values for lower-mass organs. (author)

  13. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Full Text Available Oxcarbazepine (OXC) is an important anticonvulsant and mood stabilizing drug. A pharmacopoeial monograph for OXC is not yet available and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. Parameters such as linearity, precision, accuracy and specificity were studied according to International Conference on Harmonization guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour.
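
Of the validation parameters named above, linearity is the most mechanical to illustrate: fit absorbance against concentration and inspect the fit quality. The concentrations and absorbances below are invented for illustration, not data from this study:

```python
import numpy as np

# Hedged sketch of a linearity check in a UV method validation:
# calibration standards (assumed values) and their measured absorbances.
conc = np.array([5.0, 10.0, 15.0, 20.0, 25.0])          # microg/ml
absorbance = np.array([0.101, 0.198, 0.302, 0.399, 0.501])

slope, intercept = np.polyfit(conc, absorbance, 1)      # least-squares line
r = np.corrcoef(conc, absorbance)[0, 1]                 # correlation coefficient
predicted = slope * conc + intercept                    # back-calculated curve
```

Guideline-style acceptance is typically a correlation coefficient very close to 1 over the working range, plus residuals with no visible trend.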

  14. Global scope assessment: A novel method and its application to the Chinese paper industry

    International Nuclear Information System (INIS)

    Wang, X.W.; Hua, B.

    2007-01-01

    We propose an idea we call Global Scope Assessment (GSA), built on the basis of the popular Life Cycle Assessment and Environomics approaches. It is used to justify the industry development pattern in a region (country) considering the specific regional resource conditions. In terms of GSA, the greatest synthetic benefit is expected while taking full advantage of regional comparative superiority and the worldwide distribution of related industry links. As an example, we chose the Chinese paper industry as the subject for application of GSA, and the optimal distribution of industry links among related countries is obtained. Our study indicates that adjustment of the industry structure should be a fair approach to relieve the pressure on the environment and resources and to balance the contradiction between development and resources.

  15. Feeling validated yet? A scoping review of the use of consumer-targeted wearable and mobile technology to measure and improve sleep.

    Science.gov (United States)

    Baron, Kelly Glazer; Duffecy, Jennifer; Berendsen, Mark A; Cheung Mason, Ivy; Lattie, Emily G; Manalo, Natalie C

    2017-12-20

    The objectives of this review were to evaluate the use of consumer-targeted wearable and mobile sleep monitoring technology, identify gaps in the literature and determine the potential for use in behavioral interventions. We undertook a scoping review of studies conducted in adult populations using consumer-targeted wearable technology or mobile devices designed to measure and/or improve sleep. After screening for inclusion/exclusion criteria, data were extracted from the articles by two co-authors. Articles were included if they used wearable or mobile technology to estimate or evaluate sleep, were published in English and were conducted in adult populations. Our search returned 3897 articles and 43 met our inclusion criteria. Results indicated that the majority of studies focused on validating technology to measure sleep (n = 23) or were observational studies (n = 10). Few studies were used to identify sleep disorders (n = 2), evaluate response to interventions (n = 3) or deliver interventions (n = 5). In conclusion, the use of consumer-targeted wearable and mobile sleep monitoring technology has largely focused on validation of devices and applications compared with polysomnography (PSG), but opportunities exist for observational research and for delivery of behavioral interventions. Multidisciplinary research is needed to determine the uses of these technologies in interventions as well as their use in more diverse populations, including people with sleep disorders and other patient populations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Validation of a method for radionuclide activity optimize in SPECT

    International Nuclear Information System (INIS)

    Perez Diaz, M.; Diaz Rizo, O.; Lopez Diaz, A.; Estevez Aparicio, E.; Roque Diaz, R.

    2007-01-01

    A discriminant method for optimizing the activity administered in NM studies is validated by comparison with ROC curves. The method is tested in 21 SPECT studies performed with a cardiac phantom. Three different cold lesions (L1, L2 and L3) were placed in the myocardium wall for each SPECT study. Three activities (84 MBq, 37 MBq or 18.5 MBq) of Tc-99m diluted in water were used as background. Linear discriminant analysis was used to select the parameters that characterize image quality (background-to-lesion (B/L) and signal-to-noise (S/N) ratios). Two clusters with different image quality (p=0.021) were obtained from the selected variables. The first involved the studies performed with 37 MBq and 84 MBq, and the second included the studies with 18.5 MBq. The ratios B/L1, B/L2 and B/L3 are the parameters capable of constructing the function, with 100% of cases correctly classified into the clusters. The value of 37 MBq is the lowest tested activity for which good results for the B/Li variables were obtained, without significant differences from the results with 84 MBq (p>0.05). This result is coincident with the applied ROC analysis; a correlation between both methods of r=0.890 was obtained. (Author) 26 refs
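
The core of the abstract above is a two-class linear discriminant separating "adequate" from "inadequate" image quality using ratio features. A minimal Fisher discriminant sketch on hypothetical feature values (not the study's data or its exact implementation):

```python
import numpy as np

# Hypothetical image-quality features per study: [background-to-lesion, signal-to-noise].
good = np.array([[3.1, 12.0], [3.3, 11.5], [3.0, 12.4], [3.4, 11.8]])  # higher activities
poor = np.array([[2.1, 8.0],  [2.0, 8.5],  [2.2, 7.9],  [1.9, 8.2]])   # lowest activity

mean_g, mean_p = good.mean(axis=0), poor.mean(axis=0)
# Pooled within-class scatter matrix (sum of per-class scatter).
sw = np.cov(good.T) * (len(good) - 1) + np.cov(poor.T) * (len(poor) - 1)
w = np.linalg.solve(sw, mean_g - mean_p)          # Fisher discriminant direction
threshold = w @ (mean_g + mean_p) / 2.0           # midpoint decision boundary

# Classify every sample: projection above the threshold -> "good" cluster.
labels = [(w @ x) > threshold for x in np.vstack([good, poor])]
accuracy = np.mean(np.array(labels) == np.array([True] * 4 + [False] * 4))
```

With well-separated clusters, all cases fall on the correct side of the boundary, which is the "100% correctly classified" situation the abstract reports.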

  17. Testing and Validation of the Dynamic Inertia Measurement Method

    Science.gov (United States)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The Dynamic Inertia Measurement (DIM) method uses a ground vibration test setup to determine the mass properties of an object using information from frequency response functions. Most conventional mass properties testing involves using spin tables or pendulum-based swing tests, which for large aerospace vehicles becomes increasingly difficult and time-consuming, and therefore expensive, to perform. The DIM method has been validated on small test articles but has not been successfully proven on large aerospace vehicles. In response, the National Aeronautics and Space Administration Armstrong Flight Research Center (Edwards, California) conducted mass properties testing on an "iron bird" test article that is comparable in mass and scale to a fighter-type aircraft. The simple two-I-beam design of the "iron bird" was selected to ensure accurate analytical mass properties. Traditional swing testing was also performed to compare the level of effort, amount of resources, and quality of data with the DIM method. The DIM test showed favorable results for the center of gravity and moments of inertia; however, the products of inertia showed disagreement with analytical predictions.

  18. Method for Determining Volumetric Efficiency and Its Experimental Validation

    Directory of Open Access Journals (Sweden)

    Ambrozik Andrzej

    2017-12-01

    Full Text Available Modern means of transport are basically powered by piston internal combustion engines. Increasingly rigorous demands are placed on IC engines in order to minimise the detrimental impact they have on the natural environment. That stimulates the development of research on piston internal combustion engines. The research involves experimental and theoretical investigations carried out using computer technologies. While being filled, the cylinder is considered to be an open thermodynamic system in which non-stationary processes occur. To calculate the thermodynamic parameters of the engine operating cycle, based on the comparison of cycles, it is necessary to know the mean constant value of cylinder pressure throughout this process. Because of the character of the in-cylinder pressure pattern and the difficulty of determining the pressure experimentally, a novel method for the determination of this quantity is presented in this paper. In the new approach, the iteration method was used. In the method developed for determining the volumetric efficiency, the following equations were employed: the law of conservation of the amount of substance, the first law of thermodynamics for an open system, dependences for changes in the cylinder volume vs. the crankshaft rotation angle, and the state equation. The results of calculations performed with this method were validated by means of experimental investigations carried out for a selected engine at the engine test bench. A satisfactory congruence of computational and experimental results as regards determining the volumetric efficiency was obtained. The method for determining the volumetric efficiency presented in the paper can be used to investigate the processes taking place in the cylinder of an IC engine.
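
As a greatly simplified sketch of the final bookkeeping step only (not the authors' iterative scheme), volumetric efficiency can be expressed as the trapped air mass, obtained from the state equation, over the mass that would fill the displacement at ambient conditions. All numbers below are illustrative assumptions:

```python
# Simplified volumetric-efficiency bookkeeping via the ideal gas state equation
# p V = m R T. This is NOT the paper's iteration method; values are assumed.
R_AIR = 287.05          # J/(kg K), specific gas constant of air

def air_mass(p_pa, v_m3, t_k):
    """Mass of air from the state equation p V = m R T."""
    return p_pa * v_m3 / (R_AIR * t_k)

v_displacement = 4.5e-4                          # m^3, assumed single-cylinder displacement
m_trapped = air_mass(95_000.0, 5.0e-4, 330.0)    # assumed end-of-intake state
m_reference = air_mass(101_325.0, v_displacement, 293.0)  # ambient reference fill

eta_v = m_trapped / m_reference                  # volumetric efficiency
```

The paper's contribution is precisely the iterative determination of the mean in-cylinder pressure that feeds such a calculation; here that pressure is simply assumed.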

  19. Methods, strategies and technologies used to conduct a scoping literature review of collaboration between primary care and public health.

    Science.gov (United States)

    Valaitis, Ruta; Martin-Misener, Ruth; Wong, Sabrina T; MacDonald, Marjorie; Meagher-Stewart, Donna; Austin, Patricia; Kaczorowski, Janusz; O'Mara, Linda; Savage, Rachel

    2012-07-01

    This paper describes the methods, strategies and technologies used to conduct a scoping literature review examining primary care (PC) and public health (PH) collaboration. It presents challenges encountered as well as recommendations and 'lessons learned' from conducting the review with a large, geographically distributed team of researchers and decision-makers using an integrated knowledge translation approach. Scoping studies comprehensively map literature in a specific area guided by general research questions. This methodology is especially useful in researching complex topics, so the popularity of scoping studies is growing. Stakeholder consultations are an important strategy to enhance study results; therefore, information about how best to involve stakeholders throughout the process is necessary to improve the quality and uptake of reviews. This review followed Arksey and O'Malley's five stages: identifying research questions; identifying relevant studies; study selection; charting the data; and collating, summarizing and reporting results. Technological tools and strategies included: citation management software (Reference Manager®), qualitative data analysis software (NVivo 8), web conferencing (Elluminate Live!) and a PH portal (eHealthOntario), teleconferences, email and face-to-face meetings. Of 6125 papers identified, 114 were retained as relevant. Most papers originated in the United Kingdom (38%), the United States (34%) and Canada (19%). Of 80 papers that reported on specific collaborations, most were descriptive reports (51.3%). Research studies represented 34 papers: 31% were program evaluations, 9% were literature reviews and 9% were discussion papers. Key strategies to ensure rigor in conducting a scoping literature review while engaging a large geographically dispersed team are presented for each stage. The use of enabling technologies was essential to managing the process. Leadership in championing the use of technologies and a clear governance

  20. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  1. What is the most appropriate knowledge synthesis method to conduct a review? Protocol for a scoping review

    Directory of Open Access Journals (Sweden)

    Kastner Monika

    2012-08-01

    Full Text Available Abstract Background A knowledge synthesis attempts to summarize all pertinent studies on a specific question, can improve the understanding of inconsistencies in diverse evidence, and can identify gaps in research evidence to define future research agendas. Knowledge synthesis activities in healthcare have largely focused on systematic reviews of interventions. However, a wider range of synthesis methods has emerged in the last decade, addressing different types of questions (e.g., realist synthesis to explore mediating mechanisms and moderators of interventions). Many different knowledge synthesis methods exist in the literature across multiple disciplines, but locating these, particularly for qualitative research, presents challenges. There is a need for a comprehensive manual for synthesis methods (quantitative, qualitative or mixed), outlining how these methods are related and how to match the most appropriate knowledge synthesis method to a research question. The objectives of this scoping review are to: (1) conduct a systematic search of the literature for knowledge synthesis methods across multi-disciplinary fields; (2) compare and contrast the different knowledge synthesis methods; and (3) map out the specific steps to conducting the knowledge syntheses to inform the development of a knowledge synthesis methods manual/tool. Methods We will search relevant electronic databases (e.g., MEDLINE, CINAHL), grey literature, and discipline-based listservs. The scoping review will consider all study designs including qualitative and quantitative methodologies (excluding economic analysis or clinical practice guideline development), and identify knowledge synthesis methods across the disciplines of health, education, sociology, and philosophy. Two reviewers will pilot-test the screening criteria and data abstraction forms, and will independently screen the literature and abstract the data. A three-step synthesis process will be used to map the

  2. Pesticide applicators questionnaire content validation: A fuzzy delphi method.

    Science.gov (United States)

    Manakandan, S K; Rosnah, I; Mohd Ridhuan, J; Priya, R

    2017-08-01

    The most crucial step in forming a survey questionnaire is deciding the appropriate items in a construct. Retaining irrelevant items and removing important items will certainly mislead the direction of a particular study. This article demonstrates the Fuzzy Delphi method as a scientific analysis technique to consolidate consensus agreement within a panel of experts pertaining to each item's appropriateness. This method reduces the ambiguity, diversity, and discrepancy of the opinions among the experts and hence enhances the quality of the selected items. The main purpose of this study was to obtain experts' consensus on the suitability of the preselected items on the questionnaire. The panel consists of sixteen experts from the Occupational and Environmental Health Unit of the Ministry of Health, the Vector-borne Disease Control Unit of the Ministry of Health and the Occupational and Safety Health Units of both public and private universities. A set of questionnaires related to noise and chemical exposure was compiled based on the literature search. There were six constructs with a total of 60 items: three constructs for knowledge, attitude, and practice of noise exposure and three constructs for knowledge, attitude, and practice of chemical exposure. The validation process replicated a recent Fuzzy Delphi method using the concept of Triangular Fuzzy Numbers and the Defuzzification process. A 100% response rate was obtained from all sixteen experts, with an average Likert scoring of four to five. In the post-FDM analysis, the first prerequisite was fulfilled with a threshold value (d) ≤ 0.2; hence all six constructs were accepted. For the second prerequisite, three items (21%) from the noise-attitude construct and four items (40%) from the chemical-practice construct had expert consensus of less than 75%, amounting to about 12% of the total items in the questionnaire. The third prerequisite was used to rank the items within the constructs by calculating the average
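
The Fuzzy Delphi screening described above can be sketched end to end: Likert answers are mapped to triangular fuzzy numbers (TFNs), the expert-agreement distance d is checked against the 0.2 threshold, and the item is scored by defuzzification. The TFN scale and the sample responses below are illustrative assumptions, not the study's data:

```python
# Assumed 5-point Likert -> triangular fuzzy number (a, b, c) mapping.
TFN = {1: (0.0, 0.0, 0.2), 2: (0.0, 0.2, 0.4), 3: (0.2, 0.4, 0.6),
       4: (0.4, 0.6, 0.8), 5: (0.6, 0.8, 1.0)}

responses = [5, 4, 5, 5, 4, 5, 4, 5]          # one item, eight hypothetical experts
fuzzy = [TFN[r] for r in responses]
avg = tuple(sum(v[i] for v in fuzzy) / len(fuzzy) for i in range(3))

# d: mean distance between each expert's TFN and the group-average TFN.
d = sum(((sum((a - b) ** 2 for a, b in zip(f, avg)) / 3) ** 0.5) for f in fuzzy) / len(fuzzy)

defuzzified = sum(avg) / 3                    # simple centroid defuzzification
item_accepted = d <= 0.2 and defuzzified >= 0.5
```

The consensus-percentage prerequisite (≥ 75% of experts within the threshold) would be a further per-expert count on the same distances.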

  3. Validation of an electrophoretic method to detect albuminuria in cats.

    Science.gov (United States)

    Ferlizza, Enea; Dondi, Francesco; Andreani, Giulia; Bucci, Diego; Archer, Joy; Isani, Gloria

    2017-08-01

    Objectives The aims of this study were to validate a semi-automated high-resolution electrophoretic technique to quantify urinary albumin in healthy and diseased cats, and to evaluate its diagnostic performance in cases of proteinuria and renal disease. Methods Urine samples were collected from 88 cats (healthy; chronic kidney disease [CKD]; lower urinary tract disease [LUTD]; non-urinary tract diseases [OTHER]). Urine samples were routinely analysed and high-resolution electrophoresis (HRE) was performed. Within-assay and between-assay variability, linearity, accuracy, recovery and the lowest detectable and quantifiable bands were calculated. Receiver operating characteristic (ROC) curve analysis was also performed. Results All coefficients of variation were within acceptable limits. HRE allowed the visualisation of a faint band of albumin and a diffuse band between the alpha and beta zones in healthy cats, while profiles from diseased cats were variable. Albumin (mg/dl) and the urine albumin:creatinine ratio (UAC) differed significantly between groups. Conclusions HRE is an accurate and precise method that could be used to measure albuminuria in cats. UAC was useful to correctly classify proteinuria and to discriminate between healthy and diseased cats. HRE might also provide additional information on urine proteins, with a profile of all proteins (albumin and globulins) to aid clinicians in the diagnosis of diseases characterised by proteinuria.

  4. New validated method for piracetam HPLC determination in human plasma.

    Science.gov (United States)

    Curticapean, Augustin; Imre, Silvia

    2007-01-10

    A new method for the HPLC determination of piracetam in human plasma was developed and validated by a new approach. The simple determination by UV detection was performed on the supernatant obtained from plasma after protein precipitation with perchloric acid. The chromatographic separation of piracetam under gradient elution was achieved at room temperature with a RP-18 LiChroSpher 100 column and an aqueous mobile phase containing acetonitrile and methanol. The quantitative determination of piracetam was performed at 200 nm with a lower limit of quantification LLQ=2 microg/ml. At this limit, the calculated coefficient of variation and the difference between the mean and the nominal concentration were CV%=9.7 and bias%=0.9 for the intra-day assay, and CV%=19.1 and bias%=-7.45 for the between-days assay. For precision, the range was CV%=1.8/11.6 in the intra-day and between-days assays, and for accuracy, the range was bias%=2.3/14.9 in the intra-day and between-days assays. In addition, the stability of piracetam under different conditions was verified. Piracetam proved to be stable in plasma for 4 weeks at -20 degrees C and for 36 h at 20 degrees C in the supernatant after protein precipitation. The new proposed method was used for a bioequivalence study of two medicines containing 800 mg piracetam.
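The CV% and bias% figures quoted above follow the standard precision and accuracy definitions used in bioanalytical method validation. A minimal sketch, with illustrative replicate values rather than the study's data:

```python
# Standard precision (CV%) and accuracy (bias%) metrics for method validation,
# applied to hypothetical replicate measurements (values are illustrative).
from statistics import mean, stdev

def cv_percent(xs):
    """Coefficient of variation: relative standard deviation, in percent."""
    return 100.0 * stdev(xs) / mean(xs)

def bias_percent(xs, nominal):
    """Relative difference between the mean result and the nominal value."""
    return 100.0 * (mean(xs) - nominal) / nominal

replicates = [2.1, 1.9, 2.0, 2.2, 1.8]  # microg/ml, near the LLQ of 2 microg/ml
cv = cv_percent(replicates)
bias = bias_percent(replicates, 2.0)
```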

  5. Project Deep Drilling KLX02 - Phase 2. Methods, scope of activities and results. Summary report

    International Nuclear Information System (INIS)

    Ekman, L.

    2001-04-01

    Geoscientific investigations performed by SKB, including those at the Aespoe Hard Rock Laboratory, have so far comprised the bedrock horizon down to about 1000 m. The primary purposes of the c. 1700 m deep, φ76 mm, subvertical core borehole KLX02, drilled during autumn 1992 at Laxemar, Oskarshamn, were to test core drilling techniques at large depth and relatively large diameter, and to enable geoscientific investigations beyond 1000 m. Drilling of borehole KLX02 was completed very successfully. Results of the drilling commission and the borehole investigations conducted in conjunction with drilling have been reported earlier. The present report provides a summary of the investigations made during a five-year period after completion of drilling. Results as well as the methods applied are described. A variety of geoscientific investigations to depths exceeding 1600 m were successfully performed. However, the investigations were not entirely problem-free. For example, borehole equipment got stuck in the borehole on several occasions. Special investigations, among them a fracture study, were initiated in order to reveal the mechanisms behind this problem. Different explanations seem possible, e.g. breakouts from the borehole wall, which may be a specific problem related to the stress situation in deep boreholes. The investigation approach for borehole KLX02 followed, in general outline, the SKB model for site investigations, in which a number of key issues for site characterization are studied. For each of these, a number of geoscientific parameters are investigated and determined. One important aim is to establish a lithological-structural model of the site, which constitutes the basic requirement for modelling mechanical stability, thermal properties, groundwater flow, groundwater chemistry and transport of solutes. The investigations in borehole KLX02 resulted in a thorough lithological-structural characterization of the rock volume near the borehole. 
In order to

  6. Project Deep Drilling KLX02 - Phase 2. Methods, scope of activities and results. Summary report

    Energy Technology Data Exchange (ETDEWEB)

    Ekman, L. [GEOSIGMA AB/LE Geokonsult AB, Uppsala (Sweden)

    2001-04-01

    Geoscientific investigations performed by SKB, including those at the Aespoe Hard Rock Laboratory, have so far comprised the bedrock horizon down to about 1000 m. The primary purposes of the c. 1700 m deep, φ76 mm, subvertical core borehole KLX02, drilled during autumn 1992 at Laxemar, Oskarshamn, were to test core drilling techniques at large depth and relatively large diameter, and to enable geoscientific investigations beyond 1000 m. Drilling of borehole KLX02 was completed very successfully. Results of the drilling commission and the borehole investigations conducted in conjunction with drilling have been reported earlier. The present report provides a summary of the investigations made during a five-year period after completion of drilling. Results as well as the methods applied are described. A variety of geoscientific investigations to depths exceeding 1600 m were successfully performed. However, the investigations were not entirely problem-free. For example, borehole equipment got stuck in the borehole on several occasions. Special investigations, among them a fracture study, were initiated in order to reveal the mechanisms behind this problem. Different explanations seem possible, e.g. breakouts from the borehole wall, which may be a specific problem related to the stress situation in deep boreholes. The investigation approach for borehole KLX02 followed, in general outline, the SKB model for site investigations, in which a number of key issues for site characterization are studied. For each of these, a number of geoscientific parameters are investigated and determined. One important aim is to establish a lithological-structural model of the site, which constitutes the basic requirement for modelling mechanical stability, thermal properties, groundwater flow, groundwater chemistry and transport of solutes. The investigations in borehole KLX02 resulted in a thorough lithological-structural characterization of the rock volume near the borehole. 
In order

  7. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  8. Development of radionuclide inventory estimation method using scaling factor for the Korean NPPs: scope and status

    International Nuclear Information System (INIS)

    Hwang, Ki Ha; Lee, Sang Chul; Kang, Sang Hee; Lee, Kun Jai; Jeong, Chan Woo; Ahn, Sang Myeon; Kim, Tae Wook; Kim, Kyoung Doek; Herr, Y. H.

    2003-01-01

    Regulations and guidelines for radioactive waste disposal require detailed information about the characteristics of radioactive waste drums prior to transport to the disposal sites. It is therefore important to know the accurate radionuclide inventory of radioactive waste. However, estimation of radionuclide concentrations in drummed radioactive waste is difficult and unreliable. To overcome these difficulties, scaling factors have been used to assess the activities of radionuclides that cannot be directly analyzed. A radionuclide assay system has been operated at a Korean nuclear power plant (KORI site) since 1996, and the consolidated scaling factor concept has played a dominant role in the determination of radionuclide concentrations. For corrosion product radionuclides, generic scaling factors were used owing to the similar trend and better-characterized properties of the Korean analyzed data compared with the worldwide database. It is not easy to use generic scaling factors for fission product and TRU radionuclides; thus a simple model reflecting the operating history of the power plant and the nuclear fuel condition is applied. However, some problems still remain, for example the disparity between actual and ideal correlation pairs, inaccuracy of analyzed sample values, and uncertainty in the representativeness of derived scaling factor values. As a result, the correlation ratios are somewhat dispersive, so it is planned to establish an assay system using improved scaling factors. In this study, the scope of research is expanded as follows: 1) considering more assay target nuclides; 2) considering more target NPPs; 3) more reliable sampling and measurement techniques; 4) improvement of the accuracy and representativeness of derived scaling factor values; and 5) confirmation of correlation pairs based on Korean analyzed data. 
As this study progresses, it is possible to get more accurate and reliable prediction for the information of
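The underlying scaling-factor idea is to predict a difficult-to-measure (DTM) nuclide from an easily gamma-measured key nuclide through a ratio derived from paired sample data; a geometric (log) mean of the observed ratios is one common way to derive it. The nuclide pair and activities below are illustrative assumptions, not values from the study:

```python
# Sketch of the scaling-factor concept: a difficult-to-measure nuclide is
# estimated from a gamma-measurable key nuclide via the geometric mean of
# observed activity ratios. Nuclide pair and activities are illustrative.
from math import exp, log

def geometric_mean_sf(dtm_activities, key_activities):
    """Scaling factor = geometric (log) mean of DTM/key activity ratios."""
    ratios = [d / k for d, k in zip(dtm_activities, key_activities)]
    return exp(sum(log(r) for r in ratios) / len(ratios))

# Hypothetical paired drum samples (activities in Bq/g), e.g. Ni-63 vs Co-60:
ni63 = [4.0, 9.0, 2.5, 6.0]
co60 = [2.0, 3.0, 1.0, 2.0]
sf = geometric_mean_sf(ni63, co60)
# Predict the DTM activity of an unsampled drum from its measured Co-60:
predicted_ni63 = sf * 1.5
```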

  9. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n
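The relevance ratio R reported in part 1 appears to be the percentage of all ratings in which a given IAM item was selected. A minimal sketch under that assumption; the counts and the cut-off below are invented for illustration, not the study's stated decision rule:

```python
# Sketch of the relevance ratio R, read here as the percentage of all ratings
# in which a given item was selected. Counts and cut-off are assumptions.
def relevance_ratio(times_selected, total_ratings):
    return 100.0 * times_selected / total_ratings

R = relevance_ratio(300, 10000)   # invented counts
item_relevant = R >= 5.0          # assumed relevance cut-off
```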

  10. A method for the statistical interpretation of friction ridge skin impression evidence: Method development and validation.

    Science.gov (United States)

    Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J

    2018-04-03

    The forensic fingerprint community has faced increasing criticism from scientific and legal commentators challenging the validity and reliability of fingerprint evidence, due to the lack of an empirically demonstrable basis for evaluating and reporting the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. Its performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method gives fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and to report the findings in a more transparent and standardized fashion, with clearly defined criteria for conclusions and known error-rate information, thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.

  11. PROJECT SCOPE MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Yana Derenskaya

    2018-01-01

    Full Text Available The purpose of the article is to define the essence of the project scope management process and its components, and to develop an algorithm for project scope management in pharmaceutical production. Methodology. To carry out the study, available information sources on project management standards in general, and on the elements of project scope management in particular, were analysed. Methods of system and structural analysis and logical generalization were used to study the set of subprocesses of project scope management and their input and output documents. Methods of network planning were used to construct a precedence diagram of the project scope management process. Results of the research showed that the components of project scope management are managing the scope of the project product and managing the content of project work; it is the second component that is investigated in the present work. Accordingly, project scope management is defined as the process of substantiating and bringing to realization the amount of work necessary for the successful implementation of the project (achievement of its goal and of the objectives of individual project participants). It is also determined that managing the project scope encompasses planning, defining the project scope, creating the structure of project work, confirming the scope, and controlling the project scope. Participants in these subprocesses are: the customer, the investor, and other project participants – external organizations (contractors of the project); the project review committee; and the project manager and project team. It is revealed that the key element of planning the project scope is the formation of the structure of project work, the justification of the number of works, and the sequence of their implementation. It is recommended to use the following sequence of stages for creating the structure of project work

  12. Reliability and concurrent validity of an alternative method of lateral ...

    African Journals Online (AJOL)

    1 University of Northern Iowa, Division of Athletic Training, 003C Human. Performance Center, Cedar ... concurrent validity of the fingertip-to-floor distance test (FFD) ... in these protocols are spinal and extremity range of motion, pelvic control ...

  13. Alternative validation practice of an automated faulting measurement method.

    Science.gov (United States)

    2010-03-08

    A number of states have adopted profiler-based systems to automatically measure faulting in jointed concrete pavements. However, little published work exists which documents the validation process used for such automated faulting systems. This p...

  14. A mixed methods inquiry into the validity of data

    Directory of Open Access Journals (Sweden)

    Vaarst Mette

    2008-07-01

    Full Text Available Abstract Background Research in herd health management solely using a quantitative approach may present major challenges to the interpretation of the results, because the humans involved may have responded to their observations based on previous experiences and their own beliefs. This challenge can be met through increased awareness and dialogue between researchers and farmers or other stakeholders about the background for data collection related to management and changes in management. By integrating quantitative and qualitative research methods in a mixed methods research approach, researchers will improve their understanding of this potential bias in the observed data and farms, which will enable them to obtain more useful results from quantitative analyses. Case description An example is used to illustrate the potential of combining quantitative and qualitative approaches to herd-health-related data analyses. The example is based on two studies on bovine metritis. The first study was a quantitative observational study of risk factors for metritis in Danish dairy cows based on data from the Danish Cattle Database. The other study was a semi-structured interview study involving 20 practicing veterinarians, with the aim of gaining insight into veterinarians' decision making when collecting and processing data related to metritis. Discussion and Evaluation The relations between risk factors and metritis in the first project supported the findings of several other quantitative observational studies; however, the herd incidence risk was highly skewed. There may be simple practical reasons for this, e.g. underreporting and differences in the veterinarians' decision making. Additionally, the interviews in the second project identified several problems with the correctness and validity of data regarding the occurrence of metritis, because of differences between veterinarians regarding case definitions and thresholds for treatment. Conclusion Studies where

  15. Use of Mixed Methods Research in Research on Coronary Artery Disease, Diabetes Mellitus, and Hypertension: A Scoping Review.

    Science.gov (United States)

    Campbell, David J T; Tam-Tham, Helen; Dhaliwal, Kirnvir K; Manns, Braden J; Hemmelgarn, Brenda R; Sanmartin, Claudia; King-Shier, Kathryn

    2017-01-01

    Mixed methods research, the use of both qualitative and quantitative methods within 1 program of study, is becoming increasingly popular to allow investigators to explore patient experiences (qualitative) and also measure outcomes (quantitative). Coronary artery disease and its risk factors are some of the most studied conditions; however, the extent to which mixed methods studies are being conducted in these content areas is unknown. We sought to comprehensively describe the characteristics of published mixed methods studies on coronary artery disease and major risk factors (diabetes mellitus and hypertension). We conducted a scoping review of the literature indexed in PubMed, Medline, EMBASE, and CINAHL. We identified 811 abstracts for screening, of which 254 articles underwent full-text review and 97 reports of 81 studies met criteria for inclusion. The majority of studies in this area were conducted in the past 10 years by nurse researchers from the United States and United Kingdom. Diabetes mellitus was the most common content area for mixed methods investigation (compared with coronary artery disease and hypertension). Most authors described their rationale for using mixed methods as complementarity and did not describe study priority or how they reconciled differences in methodological paradigms. Some mixed methods study designs were more commonly used than others, including concurrent timing and integration at the interpretation stage. Qualitative strands were most commonly descriptive studies using interviews for data collection. Quantitative strands were most commonly cross-sectional observational studies, which relied heavily on self-report data such as surveys and scales. Although mixed methods research is becoming increasingly popular in the area of coronary artery disease and its risk factors, many of the more advanced mixed methods, qualitative, and quantitative techniques have not been commonly used in these areas. 
© 2016 American Heart Association

  16. Scoping review of response shift methods: current reporting practices and recommendations.

    Science.gov (United States)

    Sajobi, Tolulope T; Brahmbatt, Ronak; Lix, Lisa M; Zumbo, Bruno D; Sawatzky, Richard

    2018-05-01

    Response shift (RS) has been defined as a change in the meaning of an individual's self-evaluation of his/her health status and quality of life. Several statistical model- and design-based methods have been developed to test for RS in longitudinal data. We reviewed the uptake of these methods in the patient-reported outcomes (PRO) literature. CINAHL, EMBASE, Medline, ProQuest, PsycINFO, and Web of Science were searched to identify English-language articles about RS published until 2016. Data on year and country of publication, PRO measure adopted, RS detection method, type of RS detected, and testing of underlying model assumptions were extracted from the included articles. Of the 1032 articles identified, 101 (9.8%) were included in the study. While 54.5% of the articles reported on the Then-test, 30.7% reported on Oort's or Schmitt's structural equation modeling (SEM) procedure. Newer RS detection methods, such as relative importance analysis and random forest regression, have been used less frequently. Less than 25% reported on testing the assumptions underlying the adopted RS detection method(s). Despite rapid methodological advancements in RS research, this review highlights the need for further research on RS detection methods for complex longitudinal data and for standardized reporting guidelines.

  17. Methods for Alleviating Stress and Increasing Resilience in the Midwifery Community: A Scoping Review of the Literature.

    Science.gov (United States)

    Wright, Erin M; Matthai, Maude Theo; Warren, Nicole

    2017-11-01

    Work-related stress and exposure to traumatic birth have deleterious impacts on midwifery practice, the midwife's physiologic well-being, and the midwifery workforce. This is a global phenomenon, and the specific sources of this stress vary dependent on practice setting. This scoping review aims to determine which, if any, modalities help to reduce stress and increase resilience among a population of midwives. A scoping review of the literature published between January 2011 and September 2016 using PubMed, CINAHL, Embase, PsycINFO, and Cochrane databases was performed. Of the initial 796 reviewed records, 6 met inclusion criteria. Three of the 6 included studies were quantitative in nature, 2 were qualitative, and one used mixed methods. Countries where studies were conducted include Uganda, Iran, the United Kingdom, Israel, and Australia. Three of the studies used interventions for stress reduction and increased coping. Two of these 3 used a mindfulness-based stress reduction program resulting in improved stress levels and coping skills. In each study, midwives express a desire for work-based programs and support from colleagues and employers for increasing coping abilities. These studies focused on stress reduction and/or increasing resilience. While modalities such as mindfulness-based stress reduction show promise, further studies with a cohort of midwives should be conducted. These studies should include interventions aimed at addressing the needs of midwives to improve psychological outcomes related to employment-related stress on a global scale and specific to each health care context. © 2017 by the American College of Nurse-Midwives.

  18. Evaluating the effectiveness of behavior change techniques in health-related behavior: a scoping review of methods used.

    Science.gov (United States)

    Michie, Susan; West, Robert; Sheals, Kate; Godinho, Cristina A

    2018-03-01

    Behavior change interventions typically contain multiple potentially active components: behavior change techniques (BCTs). Identifying which specific BCTs or BCT combinations have the potential to be effective for a given behavior in a given context presents a major challenge. The aim of this study was to review the methods that have been used to identify effective BCTs for given behaviors in given contexts and evaluate their strengths and limitations. A scoping review was conducted of studies that had sought to identify effective BCTs. Articles referring to "behavio(u)r change technique(s)" in the abstract/text were located, and ones that involved identification of effective BCTs were selected. The methods reported were coded. The methods were analyzed in general terms using "PASS" criteria: Practicability (facility to apply the method appropriately), Applicability (facility to generalize from findings to contexts and populations of interest), Sensitivity (facility to identify effective BCTs), and Specificity (facility to rule out ineffective BCTs). A sample of 10% of the studies reviewed was then evaluated using these criteria to assess how far the strengths and limitations identified in principle were borne out in practice. One hundred and thirty-five studies were identified. The methods used in those studies were experimental manipulation of BCTs, observational studies comparing outcomes in the presence or absence of BCTs, meta-analyses of BCT comparisons, meta-regressions evaluating effect sizes with and without specific BCTs, reviews of BCTs found in effective interventions, and meta-classification and regression trees. The limitations of each method meant that only weak conclusions could be drawn regarding the effectiveness of specific BCTs or BCT combinations. Methods for identifying effective BCTs linked to target behavior and context all have important inherent limitations. 
A strategy needs to be developed that can systematically combine the strengths of

  19. Use and misuse of mixed methods in population oral health research: A scoping review.

    Science.gov (United States)

    Gupta, A; Keuskamp, D

    2018-05-30

    Despite the known benefits of a mixed methods approach in health research, little is known of its use in the field of population oral health. To map the extent of literature using a mixed methods approach to examine population oral health outcomes. For a comprehensive search of all the available literature published in the English language, databases including PubMed, Dentistry and Oral Sciences Source (DOSS), CINAHL, Web of Science and EMBASE (including Medline) were searched using a range of keywords from inception to October 2017. Only peer-reviewed, population-based studies of oral health outcomes conducted among non-institutionalised participants and using mixed methods were considered eligible for inclusion. Only nine studies met the inclusion criteria and were included in the review. The most frequent oral health outcome investigated was caries experience. However, most studies lacked a theoretical rationale or framework for using mixed methods, or supporting the use of qualitative data. Concurrent triangulation with a convergent design was the most commonly used mixed methods typology for integrating quantitative and qualitative data. The tools used to collect quantitative and qualitative data were mostly limited to surveys and interviews. With growing complexity recognised in the determinants of oral disease, future studies addressing population oral health outcomes are likely to benefit from the use of mixed methods. Explicit consideration of theoretical framework and methodology will strengthen those investigations. Copyright© 2018 Dennis Barber Ltd.

  20. The method validation step of biological dosimetry accreditation process

    International Nuclear Information System (INIS)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph.

    2006-01-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, by using radiation-induced changes in some biological parameters. The 'gold standard' is the yield of dicentrics observed in patients' lymphocytes, and this yield is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patients. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, by following the recommendations of both 17025 General Requirements for the Competence of Testing and Calibration Laboratories and 19238 Performance criteria for service laboratories performing biological dosimetry by cytogenetics. Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchases, the personnel department, and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardised; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdr (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties of the instruments used to measure them, i.e. pipettes, thermometers, test tubes. None of the factors has a significant impact on the yield of dicentrics. 
Therefore the uncertainty linked to their use was considered as
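A Plackett-Burman design like the one described screens seven two-level factors in eight runs. Below is a sketch of the standard N=8 construction and the usual main-effect contrast; the factor names match the abstract, but the levels and responses are illustrative, not the laboratory's values:

```python
# Sketch of an 8-run Plackett-Burman screening design for seven two-level
# factors, with the standard main-effect contrast. Responses are illustrative.
FACTORS = ["BUdr", "PHA", "colcemid", "culture_duration",
           "incubator_temp", "blood_volume", "medium_volume"]
GEN = [+1, +1, +1, -1, +1, -1, -1]  # standard first row of the N=8 design

def plackett_burman_8():
    """Seven cyclic shifts of the generator plus a final all-minus run."""
    rows = [GEN[i:] + GEN[:i] for i in range(7)]
    rows.append([-1] * 7)
    return rows

def main_effect(design, responses, j):
    """Contrast for factor j: mean response at +1 minus mean response at -1."""
    hi = [y for row, y in zip(design, responses) if row[j] == +1]
    lo = [y for row, y in zip(design, responses) if row[j] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

design = plackett_burman_8()
```

Because the design is orthogonal and balanced, each factor's effect is estimated independently of the others; a factor with no influence on the dicentric yield shows a contrast near zero.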

  1. Validation of pesticide multi-residue analysis method on cucumber

    International Nuclear Information System (INIS)

    2011-01-01

    In this study we aimed to validate a multi-pesticide residue analysis method on cucumber. Before real sample injection, a system suitability test was performed on the gas chromatograph (GC). For this purpose, a sensitive pesticide mixture was used for GC-NPD, and performance parameters such as the number of effective theoretical plates, resolution factor, asymmetry, tailing and selectivity were estimated. The system was found suitable for calibration and sample injection. Samples were fortified at levels of 0.02, 0.2, 0.8 and 1 mg/kg with a mixture of dichlorvos, malathion and chlorpyrifos. In the fortification step, 14C-carbaryl was also added to the homogenized analytical portions, making use of 14C-labelled pesticides to determine extraction efficiency. The basic analytical steps, ethyl acetate extraction, filtration, evaporation and cleanup, were then performed. GPC calibration using 14C-carbaryl and the fortification mixture (dichlorvos, malathion and chlorpyrifos) showed that the pesticide fraction came through the column in the 8-23 ml fractions. The recoveries of 14C-carbaryl after the extraction and cleanup steps were 92.63-111.73% and 74.83-102.22%, respectively. The stability of pesticides during analysis is an important factor. In this study, a stability test was performed including the matrix effect. Our calculations and t-test results showed that the above-mentioned pesticides were not stable during sample processing under our laboratory conditions, and it was found that sample comminution with dry ice may improve stability. In the other part of the study, 14C-chlorpyrifos was used to determine the homogeneity of analytical portions taken from laboratory samples. Use of 14C-labelled pesticides allows quick quantification of the analyte, even without clean-up. The analytical results show that after sample processing with a Waring blender, the analytical portions were homogeneous. 
Sample processing uncertainty depending on quantity of

  2. The method validation step of biological dosimetry accreditation process

    Energy Technology Data Exchange (ETDEWEB)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph. [Institut de Radioprotection et de Surete Nucleaire, LDB, 92 - Fontenay aux Roses (France)

    2006-07-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, by using radiation-induced changes in some biological parameters. The 'gold standard' is the yield of dicentrics observed in patients' lymphocytes, and this yield is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patients. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, by following the recommendations of both 17025 General Requirements for the Competence of Testing and Calibration Laboratories and 19238 Performance criteria for service laboratories performing biological dosimetry by cytogenetics. Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchases, the personnel department, and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardised; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdr (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties of the instruments used to measure them, i.e. pipettes, thermometers, test tubes. None of the factors has a significant impact on the yield of dicentrics. Therefore the uncertainty linked to their use was

  3. A comparison of two search methods for determining the scope of systematic reviews and health technology assessments.

    Science.gov (United States)

    Forsetlund, Louise; Kirkehei, Ingvild; Harboe, Ingrid; Odgaard-Jensen, Jan

    2012-01-01

    This study aims to compare two different search methods for determining the scope of a requested systematic review or health technology assessment. The first method (called the Direct Search Method) involved direct searches in the Cochrane Database of Systematic Reviews (CDSR), the Database of Abstracts of Reviews of Effects (DARE) and the Health Technology Assessments (HTA) database. With the comparison method (called the NHS Search Engine) we searched by means of the search engine of the British National Health Service, NHS Evidence. We used an adapted cross-over design with random allocation of fifty-five requests for systematic reviews. The main analyses were based on repeated measurements adjusted for the order in which the searches were conducted. The Direct Search Method generated on average fewer hits (48 percent [95 percent confidence interval {CI}, 6 percent to 72 percent]), had higher precision (0.22 [95 percent CI, 0.13 to 0.30]) and yielded more unique hits than searching by means of the NHS Search Engine (50 percent [95 percent CI, 7 percent to 110 percent]). On the other hand, the Direct Search Method took longer (14.58 minutes [95 percent CI, 7.20 to 21.97]) and was perceived as somewhat less user-friendly than the NHS Search Engine (-0.60 [95 percent CI, -1.11 to -0.09]). Although the Direct Search Method had some drawbacks, such as being more time-consuming and less user-friendly, it generated more unique hits than the NHS Search Engine, retrieved on average fewer references and returned fewer irrelevant results.
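    The two retrieval metrics the trial compares, precision and unique hits, reduce to simple set arithmetic. The sketch below uses hypothetical hit sets, not the study's data.

```python
# Retrieval metrics as set arithmetic; record IDs below are hypothetical.

def precision(retrieved, relevant):
    """Fraction of retrieved records that are relevant."""
    if not retrieved:
        return 0.0
    return len(retrieved & relevant) / len(retrieved)

def unique_hits(method_a, method_b, relevant):
    """Relevant records found by method A but not by method B."""
    return (method_a - method_b) & relevant

if __name__ == "__main__":
    relevant = {"r1", "r2", "r3", "r4"}
    direct = {"r1", "r2", "r3", "x1"}           # Direct Search Method hits
    nhs = {"r1", "r4", "x1", "x2", "x3", "x4"}  # NHS Evidence hits

    print("precision (direct):", precision(direct, relevant))  # 0.75
    print("precision (NHS)   :", precision(nhs, relevant))
    print("unique to direct  :", sorted(unique_hits(direct, nhs, relevant)))
```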

  4. Education in the responsible conduct of research in psychology: methods and scope.

    Science.gov (United States)

    DiLorenzo, Terry A; Becker-Fiegeles, Jill; Gibelman, Margaret

    2014-01-01

    In this mixed-method study of education in the responsible conduct of research (RCR) in psychology, phase-one survey respondents (n = 141) reported that faculty and students were familiar with RCR standards and that the procedures used to educate them were believed to be adequate. However, educational methods varied widely. In phase two, seven survey respondents completed in-depth interviews assessing RCR training and education and research review procedures. Educational methods through which RCR content was presented included traditional (lectures), technical (web-based) and experiential (internships) formats, but RCR was often minimally considered in the formal curriculum. Our results suggest that psychology training programs might benefit from more formal consideration of RCR education and training in the curriculum.

  5. A simple multi-residue method for the determination of pesticides in fruits and vegetables using a methanolic extraction and ultra-high-performance liquid chromatography-tandem mass spectrometry: optimization and extension of scope.

    Science.gov (United States)

    Hanot, V; Goscinny, S; Deridder, M

    2015-03-06

    In 2004, a multi-residue pesticide method using methanol as the extraction solvent was published. Our goal in this study was to optimize the analytical scheme while extending the compound scope from 19 to 200 pesticides. The main changes from the original method concern sample extraction and processing, with special attention to making the overall method fit for routine analysis at minimal cost. Hence, after a quick Ultra-Turrax homogenization with a methanolic solution, the sample is simply diluted before separation and detection by ultra-high-performance liquid chromatography with MS/MS detection for quantitative and confirmatory purposes. The performance of the method, including limits of quantification (LOQs), linearity, matrix effect and precision, was evaluated during validation in accordance with the European Union SANCO/12571/2013 regulatory guidelines. Two representative matrices, lettuce and orange, were selected and fortified at two concentration levels for these experiments. At the LOQ and ten times the LOQ, recoveries of the analytes were mostly within 70-120%, with coefficients of variation lower than 25% under intra-day repeatability conditions. These results demonstrate the suitability of the optimized method, which is both simple and fast, for the analysis of a large scope of pesticides in routine laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Lay and professional stakeholder involvement in scoping palliative care issues: Methods used in seven European countries.

    Science.gov (United States)

    Brereton, Louise; Ingleton, Christine; Gardiner, Clare; Goyder, Elizabeth; Mozygemba, Kati; Lysdahl, Kristin Bakke; Tummers, Marcia; Sacchini, Dario; Leppert, Wojciech; Blaževičienė, Aurelija; van der Wilt, Gert Jan; Refolo, Pietro; De Nicola, Martina; Chilcott, James; Oortwijn, Wija

    2017-02-01

    Stakeholders are people with an interest in a topic. Internationally, stakeholder involvement in palliative care research and health technology assessment requires development. Stakeholder involvement adds value throughout research (from prioritising topics to disseminating findings). Philosophies and understandings about the best ways to involve stakeholders in research differ internationally. Stakeholder involvement took place in seven countries (England, Germany, Italy, Lithuania, the Netherlands, Norway and Poland). Findings informed a project that developed concepts and methods for health technology assessment and applied these to evaluate models of palliative care service delivery. The aim was to report on stakeholder involvement in the INTEGRATE-HTA project and how the issues identified informed project development. Using stakeholder consultation or a qualitative research design, as appropriate locally, stakeholders in seven countries acted as 'advisors' to aid researchers' decision making. Thematic analysis was used to identify key issues across countries. A total of 132 stakeholders (82 professionals and 50 'lay' people) aged ⩾18 participated in individual face-to-face or telephone interviews, consultation meetings or focus groups. Different stakeholder involvement methods were used successfully to identify key issues in palliative care. A total of 23 issues common to three or more countries informed decisions about the intervention and comparator of interest, sub-questions and specific assessments within the health technology assessment. Stakeholders, including patients and families undergoing palliative care, can inform project decision making using various involvement methods according to the local context. Researchers should consider local understandings about stakeholder involvement, as views of appropriate and feasible methods vary. Methods for stakeholder involvement, especially consultation, need further development.

  7. A validated RP-HPLC method for the determination of Irinotecan hydrochloride residues for cleaning validation in production area

    Directory of Open Access Journals (Sweden)

    Sunil Reddy

    2013-03-01

    Introduction: cleaning validation is an integral part of current good manufacturing practices in the pharmaceutical industry. The main purpose of cleaning validation is to prove the effectiveness and consistency of cleaning of pharmaceutical production equipment, to prevent cross-contamination and adulteration of the drug product with other active ingredients. Objective: a rapid, sensitive and specific reverse-phase HPLC method was developed and validated for the quantitative determination of irinotecan hydrochloride in cleaning validation swab samples. Method: the method was validated using a Waters SymmetryShield RP-18 (250 mm x 4.6 mm, 5 µm) column with an isocratic mobile phase containing a mixture of 0.02 M potassium dihydrogen orthophosphate (pH adjusted to 3.5 with orthophosphoric acid), methanol and acetonitrile (60:20:20 v/v/v). The flow rate of the mobile phase was 1.0 mL/min, with a column temperature of 25°C and a detection wavelength of 220 nm. The sample injection volume was 100 µL. Results: the calibration curve was linear over a concentration range of 0.024 to 0.143 µg/mL with a correlation coefficient of 0.997. The intra-day and inter-day precision, expressed as relative standard deviation, were below 3.2%. The recoveries obtained from stainless steel, PCGI, epoxy, glass and Dacron cloth surfaces were more than 85%, and there was no interference from the cotton swab. The detection limit (DL) and quantitation limit (QL) were 0.008 and 0.023 µg/mL, respectively. Conclusion: the developed method was validated with respect to specificity, linearity, limit of detection and quantification, accuracy, precision and solution stability. The overall procedure can be used as part of a cleaning validation program in the pharmaceutical manufacture of irinotecan hydrochloride.
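    The linearity and DL/QL figures reported above come from a standard calibration-curve calculation. A minimal sketch follows, using ICH-style formulas (DL = 3.3σ/S, QL = 10σ/S from the residual standard deviation σ and slope S); the concentration points match the stated range, but the peak areas are hypothetical.

```python
# Calibration-curve linearity and ICH-style DL/QL; peak areas are hypothetical.
import math

def linear_fit(x, y):
    """Least-squares line: returns slope, intercept, correlation r, and
    residual standard deviation (n - 2 degrees of freedom)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    sd = math.sqrt(sum((yi - (slope * xi + intercept)) ** 2
                       for xi, yi in zip(x, y)) / (n - 2))
    return slope, intercept, r, sd

def detection_limits(slope, sd):
    """ICH Q2 formulas: DL = 3.3*sigma/S, QL = 10*sigma/S."""
    return 3.3 * sd / slope, 10 * sd / slope

if __name__ == "__main__":
    conc = [0.024, 0.048, 0.072, 0.096, 0.120, 0.143]  # ug/mL
    area = [1190, 2410, 3580, 4820, 6010, 7150]        # hypothetical peak areas
    slope, intercept, r, sd = linear_fit(conc, area)
    dl, ql = detection_limits(slope, sd)
    print(f"slope={slope:.1f}  r={r:.4f}  DL={dl:.4f}  QL={ql:.4f} ug/mL")
```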

  8. Invisible hand in the process of making economics or on the method and scope of economics

    OpenAIRE

    Yay Turan; Tastan Huseyin

    2010-01-01

    As a social science, economics cannot be reduced simply to an a priori science or an ideology. Nor can economics be solely an empirical or a historical science. Economics is a research field which studies only one dimension of human behavior, with the four fields of mathematics, econometrics, ethics and history intersecting one another. The purpose of this paper is to discuss the two parts of the proposition above, in connection with the controversies surrounding the method and the...

  9. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. The approach: 1) Identify needs and requirements for IAT. 2) Develop the IAT conceptual framework. 3) Validate IAT methods. 4) Develop applications materials.

  10. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    The mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research, however, has not been examined as extensively as validity in quantitative and qualitative research, despite being an important issue. Additionally, previous discussions of validity in mixed methods research focus on research…

  11. Integration of educational methods and physical settings: design guidelines for High/Scope methodology in pre-schools

    Directory of Open Access Journals (Sweden)

    Shirin Izadpanah

    2014-06-01

    Quality design and appropriate space organization in preschool settings can support preschool children's educational activities. Although the relationship between the well-being and development of children and their physical settings has been emphasized by many early childhood researchers, there is still a need for theoretical design guidelines geared towards this issue. This research focuses on High/Scope education and aims to shape a theoretical guideline that raises teachers' awareness of the potential of learning spaces and guides them in improving the quality of physical spaces. To create a theoretical framework, reliable sources are investigated in the light of High/Scope education and the requirements of preschool children's educational spaces. Physical space characteristics, the preschool child's requirements and High/Scope methodology are integrated to identify the design inputs, design considerations and recommendations that shape the final guideline for spatial arrangement in a High/Scope setting. The discussion and suggestions in this research benefit both designers and High/Scope teaching staff. The results help High/Scope teaching staff increase the quality of a space in an educational setting without having an architectural background, while the theoretical framework allows designers to consider key features and users' possible activities in High/Scope settings and shape their designs accordingly.

  12. Scoping Evaluation of the IRIS Radiation Environment by Using the FW-CADIS Method and SCALE MAVRIC Code

    International Nuclear Information System (INIS)

    Petrovic, B.

    2008-01-01

    IRIS is an advanced pressurized water reactor of integral configuration. This integral configuration, with its relatively large reactor vessel and thick downcomer (1.7 m), results in a significant reduction of the radiation field and material activation. It thus enables setting aggressive dose reduction objectives, but at the same time presents challenges for the shielding analysis, which needs to be performed over a large spatial domain and must account for flux attenuation by many orders of magnitude. The Monte Carlo method can accurately represent irregular geometry and potential streaming paths, but may require significant computational effort to reduce statistical uncertainty to an acceptable range. Variance reduction methods do exist, but they are designed to provide results at individual detectors and in limited regions, whereas in the scoping phase of the IRIS shielding analysis results are sought throughout the whole containment. To facilitate such analysis, SCALE MAVRIC was employed. Based on the recently developed FW-CADIS method, MAVRIC uses forward and adjoint deterministic transport calculations to generate effective biasing parameters for Monte Carlo simulations throughout the problem. Previous studies have confirmed the potential of this method for obtaining Monte Carlo solutions with acceptable statistics over large spatial domains. The objective of this work was to evaluate the capability of FW-CADIS/MAVRIC to efficiently perform the required shielding analysis of IRIS. For that purpose, a representative model was prepared, retaining the main problem characteristics, i.e., a large spatial domain (over 10 m in each dimension) and significant attenuation (over 12 orders of magnitude), but geometrically rather simplified.
The preliminary results obtained indicate that the FW-CADIS method, implemented through the MAVRIC sequence in SCALE, will enable determination of the radiation field throughout the large spatial domain of the IRIS nuclear
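    The deep-penetration problem that motivates FW-CADIS shows up in even a toy 1D example: through ~12 mean free paths of pure absorber, an analog Monte Carlo tally almost never scores, while carrying the attenuation factor as a particle weight gives the answer immediately. This sketch only illustrates that motivation for variance reduction; it is not the FW-CADIS/MAVRIC algorithm, and the slab model is hypothetical.

```python
# Toy 1D deep-penetration example: analog tallying vs. survival weighting
# for a purely absorbing slab. Illustrates why variance reduction is needed
# for ~12 orders of magnitude of attenuation; not the FW-CADIS algorithm.
import math
import random

def analog_transmission(thickness_mfp, histories, rng):
    """Analog tally: count particles whose sampled free path (in mean
    free paths) exceeds the slab thickness."""
    hits = sum(1 for _ in range(histories)
               if rng.expovariate(1.0) > thickness_mfp)
    return hits / histories

def weighted_transmission(thickness_mfp):
    """Survival weighting for a purely absorbing slab: the attenuation
    factor is carried as a weight, giving the answer with zero variance."""
    return math.exp(-thickness_mfp)

if __name__ == "__main__":
    rng = random.Random(42)
    t = 12.0  # slab thickness in mean free paths (~6e-6 transmission)
    print("weighted estimate:", weighted_transmission(t))
    print("analog estimate  :", analog_transmission(t, 100_000, rng))
```

    With scattering and a large 3D domain the weighting scheme is no longer trivial, which is where deterministic forward/adjoint importance maps of the FW-CADIS kind come in.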

  13. Developing Skills in Counselling and Psychotherapy: A Scoping Review of Interpersonal Process Recall and Reflecting Team Methods in Initial Therapist Training

    Science.gov (United States)

    Meekums, Bonnie; Macaskie, Jane; Kapur, Tricia

    2016-01-01

    The authors conducted a scoping review of the peer-reviewed literature associated with Interpersonal Process Recall (IPR) and Reflecting Team (RT) methods in order to find evidence for their use within skills development in therapist trainings. Inclusion criteria were: empirical research, reviews of empirical research, and responses to these; RT…

  14. Current Methods of Evaluating Speech-Language Outcomes for Preschoolers with Communication Disorders: A Scoping Review Using the ICF-CY

    Science.gov (United States)

    Cunningham, Barbara Jane; Washington, Karla N.; Binns, Amanda; Rolfe, Katelyn; Robertson, Bernadette; Rosenbaum, Peter

    2017-01-01

    Purpose: The purpose of this scoping review was to identify current measures used to evaluate speech-language outcomes for preschoolers with communication disorders within the framework of the International Classification of Functioning, Disability and Health-Children and Youth Version (ICF-CY; World Health Organization, 2007). Method: The review…

  15. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not previously been conducted. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006-2010), to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extracted from 64 articles measuring a variety of constructs published throughout the peer-reviewed medical education literature indicate significant errors in the translation of exploratory factor analysis best practices into current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods, mainly reliability statistics to support internal structure and support for test content. The instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. The findings suggest a need for further professional development within the medical education research community related to (1) appropriate factor analysis methodology and reporting and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from an instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and should carefully review the available evidence. 
Finally, editors and reviewers are encouraged to recognize

  16. SCOPING STUDIES TO DEVELOP A METHOD TO DETERMINE PARTICLE SIZE IN SIMULANT SLUDGE SLURRIES BY SIEVING

    International Nuclear Information System (INIS)

    DAMON, CLICK

    2005-01-01

    A physical separation method (i.e. sieving) was investigated to determine the particle size distribution in non-radioactive sludge slurry simulants, with the goal of implementation in the SRNL (Savannah River National Laboratory) shielded cells for use with radioactive sludge slurries. The investigation included obtaining the necessary experimental equipment, developing accessory equipment for use with the sieve shaker (to be able to sieve simulant slurries with aqueous solutions), sieving three different simulant slurries through a number of sieves and determining the particle size distribution gravimetrically, and developing a sufficient cleaning protocol so the sieves could be re-used. The experimental protocol involved successive sieving of a NIST standard (to check the particle size retention of the sieves) and three non-radioactive slurry simulants (Batch 3 Tank 40 Test 3, Tank 40 Drum 3 and CETL Sludge Batch 2, which had been previously characterized by Microtrac analysis) through smaller and smaller sieves (150 µm down to 5 µm) using the wet sieving system or by hand. For each of the three slurries, duplicate experiments were carried out using filtered supernate and DI water (to check the accuracy of the method versus Microtrac data) to sieve the slurry. Particle size determinations using the wet sieving system with DI water agree well with Microtrac data on a volume basis, and in some cases the sieving data may be more accurate, particularly if the material sieved had large particles. A correction factor had to be applied to data obtained from experiments done with supernate, due to the dissolved solids which dried onto the sieves in the drying stage of the experiments. Upon subtraction of the correction factors, the experimental results were very similar to those obtained with DI water. It should be noted that approximately 250 mL of each of the three simulant slurries was necessary to have enough filtered supernate available to carry out the experiments. The
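    The gravimetric reduction from per-sieve masses to a size distribution is straightforward; a minimal sketch follows. The sieve openings, masses and the per-sieve dissolved-solids correction below are hypothetical, not the SRNL data.

```python
# Gravimetric particle-size distribution from sieve masses; all numbers
# are hypothetical examples, not data from the SRNL study.

def size_distribution(retained_g, dissolved_solids_g=0.0):
    """retained_g: {sieve opening in um: dry mass (g) retained on that sieve}.
    dissolved_solids_g: per-sieve correction subtracted when supernate
    (rather than DI water) is used, mimicking the study's correction factor.
    Returns (mass fractions, cumulative percent finer than each opening)."""
    corrected = {um: max(m - dissolved_solids_g, 0.0)
                 for um, m in retained_g.items()}
    total = sum(corrected.values())
    fractions = {um: m / total for um, m in corrected.items()}
    finer, remaining = {}, 1.0
    for um in sorted(corrected, reverse=True):  # coarsest sieve first
        remaining -= fractions[um]
        finer[um] = 100.0 * max(remaining, 0.0)
    return fractions, finer

if __name__ == "__main__":
    retained = {150: 0.5, 75: 1.5, 38: 2.0, 20: 0.8, 5: 0.2}  # grams
    fractions, finer = size_distribution(retained)
    for um in sorted(retained, reverse=True):
        print(f"{um:>4} um: fraction={fractions[um]:.2f}  %finer={finer[um]:5.1f}")
```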

  17. Scope of harmonisation of pharmacopoeial liquid chromatography (LC) methods for diazepam and its related substances

    International Nuclear Information System (INIS)

    Shar, G.Q.; Jatoi, W.B.

    2015-01-01

    Drug analysis is an imperative activity to check the quality of a drug compound. Pharmacopoeial monographs provide important information about the quality of a drug substance. The expected quality of a medicine during its period of use is also explained in such monographs. Analytical tools such as spectroscopic and chromatographic methods have been developed for such investigations. We analysed the purity of a well-known anxiolytic drug, diazepam, using a liquid chromatographic (LC) technique. It was noticed that with a Zorbax Eclipse XDB-C8 (4.6 x 150 mm, 5 µm) column and the recommended mobile phase comprising acetonitrile - methanol - potassium dihydrogen phosphate (22+34+44 v/v), the results obtained did not accord with the chromatograms provided by the European Pharmacopoeia (EP); but using another column (ACE-5-C8, 4.6 x 150 mm, 5 µm), an extra peak of a diazepam degradant was obtained, which showed that with an appropriate mobile phase containing CH₃CN - CH₃OH - KH₂PO₄ (20+32+48 v/v) better results can be achieved. The mean retention time for diazepam analysis was 2.9 minutes. (author)

  18. Validity of the Demirjian method for dental age estimation for ...

    African Journals Online (AJOL)

    2015-02-04

    Feb 4, 2015 ... Dental age was calculated using Demirjian's method. Chronologic age was .... in order to avoid examiner bias at the time of collecting data. ... age using the Demirjian method for different age groups and the total sample.

  19. System Identification Methods for Aircraft Flight Control Development and Validation

    Science.gov (United States)

    1995-10-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. This paper discusses the use of frequency-domain system-identification methods for the development and ...

  20. Methods and validity of dietary assessments in four Scandinavian populations

    DEFF Research Database (Denmark)

    Bingham, S; Wiggins, H S; Englyst, H

    1982-01-01

    and duplicate collections of all food eaten, was validated by chemical analysis of the duplicates, by measuring 24-hour urine and fecal nitrogen excretion, and by comparing the constituents of the urine samples collected during the survey with similar collections 1-2 weeks later. There were good agreements...... between estimates of fat and protein intake obtained by food-table calculations of the 4-day weighed record and the chemically analyzed duplicates. Urinary plus fecal nitrogen excretion was equal to estimated nitrogen intake during the survey, and no discernable changes in urinary output occurred after...

  1. Scope Definition

    DEFF Research Database (Denmark)

    Bjørn, Anders; Owsianiak, Mikołaj; Laurent, Alexis

    2018-01-01

    The scope definition is the second phase of an LCA. It determines what product systems are to be assessed and how this assessment should take place. This chapter teaches how to perform a scope definition. First, important terminology and key concepts of LCA are introduced. Then, the nine items...... making up a scope definition are elaborately explained: (1) Deliverables. (2) Object of assessment, (3) LCI modelling framework and handling of multifunctional processes, (4) System boundaries and completeness requirements, (5) Representativeness of LCI data, (6) Preparing the basis for the impact...... assessment, (7) Special requirements for system comparisons, (8) Critical review needs and (9) Planning reporting of results. The instructions relate both to the performance and reporting of a scope definition and are largely based on ILCD....

  2. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    Science.gov (United States)

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    The validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. To decide whether there are any trends in the time-variation of the analytical signal, the von Neumann test of trend was also applied and evaluated. Finally, a comparison with similar validation data for the ETV-ICP-OES method was carried out.
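    The (von) Neumann trend test mentioned in the abstract is the ratio of the mean square successive difference to the variance: values near 2 indicate a trend-free series, values well below 2 indicate drift. A minimal sketch with hypothetical signal series follows; real use compares the ratio against tabulated critical values rather than the illustrative threshold implied here.

```python
# Von Neumann ratio (trend test): mean square successive difference over
# variance. The two example series are hypothetical.

def von_neumann_ratio(x):
    n = len(x)
    mean = sum(x) / n
    mssd = sum((x[i + 1] - x[i]) ** 2 for i in range(n - 1)) / (n - 1)
    var = sum((xi - mean) ** 2 for xi in x) / (n - 1)
    return mssd / var

if __name__ == "__main__":
    stable = [10.1, 9.9, 10.2, 9.8, 10.0, 10.1, 9.9, 10.2]       # no drift
    drifting = [10.0, 10.2, 10.4, 10.7, 10.9, 11.2, 11.4, 11.7]  # steady drift
    print("stable  :", round(von_neumann_ratio(stable), 2))
    print("drifting:", round(von_neumann_ratio(drifting), 2))    # well below 2
```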

  3. Content validity of methods to assess malnutrition in cancer patients: a systematic review

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Ottery, Faith D.; van der Schans, Cees; Roodenburg, Jan L N; Jager-Wittenaar, Harriët

    Rationale: Inadequate operationalisation of the multidimensional concept of malnutrition may result in inadequate evaluation of nutritional status. In this review we aimed to assess the content validity of methods

  4. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    Science.gov (United States)

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
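    Lawshe's ratio and the exact binomial critical values discussed above can be computed directly: CVR = (ne − N/2)/(N/2) for ne panelists rating an item "essential" out of N, with the critical ne being the smallest count significant under a one-tailed binomial null of p = 0.5. A minimal sketch (the alpha level shown is illustrative):

```python
# Lawshe's content validity ratio and an exact one-tailed binomial
# critical value, as revisited by Ayre and Scally.
from math import comb

def cvr(n_essential, n_panelists):
    """CVR = (ne - N/2) / (N/2); ranges from -1 to +1."""
    return (n_essential - n_panelists / 2) / (n_panelists / 2)

def critical_cvr(n_panelists, alpha=0.05):
    """Smallest CVR whose 'essential' count ne satisfies
    P(X >= ne | n, p = 0.5) <= alpha under the binomial null."""
    for ne in range(n_panelists + 1):
        tail = sum(comb(n_panelists, k) for k in range(ne, n_panelists + 1))
        if tail / 2 ** n_panelists <= alpha:
            return cvr(ne, n_panelists)
    return 1.0

if __name__ == "__main__":
    print("CVR(8 of 10 essential):", cvr(8, 10))  # 0.6
    print("critical CVR, N=10   :", critical_cvr(10))
```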

  5. A VALIDATED STABILITY INDICATED RP-HPLC METHOD FOR DUTASTERIDE

    OpenAIRE

    D. Pavan Kumar; Naga Jhansi; G. Srinivasa Rao; Kirti Kumar Jain

    2018-01-01

    ABSTRACT: A simple, stability-indicating, isocratic, reverse-phase High Performance Liquid Chromatographic (RPLC) related-substances method was developed for Dutasteride in API. This method separates the impurities which co-elute in the pharmacopoeial method. Successful separation of degradation impurities and synthetic impurities was achieved on a YMC-Triart phenyl column. Chromatography was carried out on a YMC-Triart phenyl (150 x 4.6 mm, 3.0 µm) column using 0.01 M Potassium Dihydrogen Pho...

  6. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Keywords: Ketotifen, Cetirizine, Stability indicating method, Stressed conditions, Validation. Tropical ... in biological fluids [13] are also reported. A stability-indicating HPLC method is reported for ketotifen where the drug is ..... paracetamol, cetirizine.

  7. Validation of two ribosomal RNA removal methods for microbial metatranscriptomics

    Energy Technology Data Exchange (ETDEWEB)

    He, Shaomei; Wurtzel, Omri; Singh, Kanwar; Froula, Jeff L; Yilmaz, Suzan; Tringe, Susannah G; Wang, Zhong; Chen, Feng; Lindquist, Erika A; Sorek, Rotem; Hugenholtz, Philip

    2010-10-01

    The predominance of rRNAs in the transcriptome is a major technical challenge in sequence-based analysis of cDNAs from microbial isolates and communities. Several approaches have been applied to deplete rRNAs from (meta)transcriptomes, but no systematic investigation of potential biases introduced by any of these approaches has been reported. Here we validated the effectiveness and fidelity of the two most commonly used approaches, subtractive hybridization and exonuclease digestion, as well as combinations of these treatments, on two synthetic five-microorganism metatranscriptomes using massively parallel sequencing. We found that the effectiveness of rRNA removal was a function of community composition and RNA integrity for these treatments. Subtractive hybridization alone introduced the least bias in relative transcript abundance, whereas exonuclease digestion, and in particular the combined treatments, greatly compromised mRNA abundance fidelity. Illumina sequencing itself can also compromise quantitative data analysis by introducing a G+C bias between runs.
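    The mRNA abundance fidelity assessed above can be quantified as the log2 change in each transcript's relative abundance between the untreated and depleted samples (0 meaning perfect fidelity). A minimal sketch with hypothetical read counts:

```python
# Log2 shift in relative mRNA abundance after rRNA depletion; read
# counts below are hypothetical, not data from the study.
import math

def relative_abundance(counts):
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def abundance_bias(before, after):
    """Log2 fold change of each transcript's relative abundance
    (0 = perfect fidelity of the depletion treatment)."""
    ra_before = relative_abundance(before)
    ra_after = relative_abundance(after)
    return {k: math.log2(ra_after[k] / ra_before[k]) for k in before}

if __name__ == "__main__":
    before = {"mRNA_a": 100, "mRNA_b": 300}   # untreated, rRNA reads excluded
    after = {"mRNA_a": 2600, "mRNA_b": 7000}  # after rRNA depletion
    for name, b in abundance_bias(before, after).items():
        print(f"{name}: log2 bias = {b:+.2f}")
```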

  8. Method for validating radiobiological samples using a linear accelerator

    International Nuclear Information System (INIS)

    Brengues, Muriel; Liu, David; Korn, Ronald; Zenhausern, Frederic

    2014-01-01

    There is an immediate need for rapid triage of the population in case of a large-scale exposure to ionizing radiation. Knowing the dose absorbed by the body will allow clinicians to administer medical treatment for the best chance of recovery for the victim. In addition, today's radiotherapy treatment could benefit from additional information regarding the patient's sensitivity to radiation before starting the treatment. As of today, there is no system in place to respond to this demand. This paper describes specific procedures to mimic the effects of human exposure to ionizing radiation, creating tools to optimize administered radiation dosimetry for radiotherapy and/or to estimate doses of radiation received accidentally during a radiation event that could pose a danger to the public. In order to obtain irradiated biological samples to study ionizing radiation absorbed by the body, we performed ex-vivo irradiation of human blood samples using a linear accelerator (LINAC). The LINAC was implemented and calibrated for irradiating human whole blood samples. To test the calibration, a 2 Gy test run was successfully performed on a tube filled with water, with an accuracy of 3% in dose distribution. To validate our technique, the blood samples were irradiated ex-vivo and the results were analyzed using a gene expression assay to follow the effect of the ionizing irradiation by characterizing dose-responsive biomarkers from radiobiological assays. The response of 5 genes was monitored, showing increased expression with the dose of radiation received. Blood samples treated with the LINAC can provide effective irradiated blood samples suitable for molecular profiling to validate radiobiological measurements via gene-expression-based biodosimetry tools. (orig.)

  9. Method for validating radiobiological samples using a linear accelerator.

    Science.gov (United States)

    Brengues, Muriel; Liu, David; Korn, Ronald; Zenhausern, Frederic

    2014-04-29

    There is an immediate need for rapid triage of the population in case of a large-scale exposure to ionizing radiation. Knowing the dose absorbed by the body will allow clinicians to administer medical treatment for the best chance of recovery for the victim. In addition, today's radiotherapy treatment could benefit from additional information regarding the patient's sensitivity to radiation before starting the treatment. As of today, there is no system in place to respond to this demand. This paper describes specific procedures to mimic the effects of human exposure to ionizing radiation, creating tools to optimize administered radiation dosimetry for radiotherapy and/or to estimate doses of radiation received accidentally during a radiation event that could pose a danger to the public. In order to obtain irradiated biological samples to study ionizing radiation absorbed by the body, we performed ex-vivo irradiation of human blood samples using a linear accelerator (LINAC). The LINAC was implemented and calibrated for irradiating human whole blood samples. To test the calibration, a 2 Gy test run was successfully performed on a tube filled with water, with an accuracy of 3% in dose distribution. To validate our technique, the blood samples were irradiated ex-vivo and the results were analyzed using a gene expression assay to follow the effect of the ionizing irradiation by characterizing dose-responsive biomarkers from radiobiological assays. The response of 5 genes was monitored, showing increased expression with the dose of radiation received. Blood samples treated with the LINAC can provide effective irradiated blood samples suitable for molecular profiling to validate radiobiological measurements via gene-expression-based biodosimetry tools.

  10. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new analytical method for the quantitative analysis of miconazole ... a simple, reliable and robust method for the characterization of a mixture of the drugs in a dosage form. ...

  11. Validated method for the detection and quantitation of synthetic ...

    African Journals Online (AJOL)

    These methods were applied to postmortem cases from the Johannesburg Forensic Pathology Services Medicolegal Laboratory (FPS-MLL) to assess the prevalence of these synthetic cannabinoids amongst the local postmortem population. Urine samples were extracted utilizing a solid phase extraction (SPE) method, ...

  12. Validation, verification and comparison: Adopting new methods in ...

    African Journals Online (AJOL)

    2005-07-03

Jul 3, 2005 ... chemical analyses can be assumed to be homogeneously distributed. When introduced ... For water microbiology this has been resolved with the publication of .... tion exercise can result in a laboratory adopting the method. If, however, the new ... For methods used for environmental samples, a range of ...

  13. Testing and Validation of the Dynamic Inertia Measurement Method

    Science.gov (United States)

    Chin, Alexander; Herrera, Claudia; Spivey, Natalie; Fladung, William; Cloutier, David

    2015-01-01

    This presentation describes the DIM method and how it measures the inertia properties of an object by analyzing the frequency response functions measured during a ground vibration test (GVT). The DIM method has been in development at the University of Cincinnati and has shown success on a variety of small scale test articles. The NASA AFRC version was modified for larger applications.

  14. Willingness to pay function for two fuel treatments to reduce wildfire acreage burned: A scope test and comparison of White and Hispanic households

    Science.gov (United States)

    John B. Loomis; Le Trong Hung; Armando Gonzalez-Caban

    2009-01-01

This research uses the Contingent Valuation Method to test whether willingness to pay increases for larger reductions in acres of forests burned by wildfires across the states of California, Florida and Montana. This is known as a test of scope, a measure of internal validity of the contingent valuation method (CVM). The scope test is conducted separately for White...

  15. The full size validation of remanent life assessment methods

    International Nuclear Information System (INIS)

    Hepworth, J.K.; Williams, J.A.

    1988-03-01

    A range of possible life assessment techniques for the remanent life appraisal of creeping structures is available in the published literature. However, due to the safety implications, the true conservatism of such methods cannot be assessed on operating plant. Consequently, the CEGB set up a four vessel programme in the Pressure Vessel Test Facility at the Marchwood Engineering Laboratories of the CEGB to underwrite and quantify the accuracy of these methods. The application of two non-destructive methods, namely strain monitoring and hardness measurement, to the data generated during about 12,000 hours of testing is examined. The current state of development of these methods is reviewed. Finally, the future CEGB programme relating to these vessels is discussed. (author)

  16. A Validated Method for the Detection and Quantitation of Synthetic ...

    African Journals Online (AJOL)

    NICOLAAS

    A LC-HRMS (liquid chromatography coupled with high resolution mass spectrometry) method for the ... its ease of availability, from head shops (shops selling predomi- ..... cannabinoids in whole blood in plastic containers with several common ...

  17. Validation of Standing Wave Liner Impedance Measurement Method, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Hersh Acoustical Engineering, Inc. proposes to establish the feasibility and practicality of using the Standing Wave Method (SWM) to measure the impedance of...

  18. Validation of EIA sampling methods - bacterial and biochemical analysis

    Digital Repository Service at National Institute of Oceanography (India)

    Sheelu, G.; LokaBharathi, P.A.; Nair, S.; Raghukumar, C.; Mohandass, C.

    to temporal factors. Paired T-test between pre- and post-disturbance samples suggested that the above methods of sampling and variables like TC, protein and TOC could be used for monitoring disturbance....

  19. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Methods: The optimum high performance capillary electrophoresis (HPCE) ... organic solvent, and were analyzed using HPLC ... quantified to 200 ml with water and centrifuged at ..... for the analysis of flavonoids in selected Thai plants by.

  20. Development and validation of bioanalytical UHPLC-UV method for simultaneous analysis of unchanged fenofibrate and its metabolite fenofibric acid in rat plasma: Application to pharmacokinetics

    Directory of Open Access Journals (Sweden)

    Rayan G. Alamri

    2017-01-01

A simple, precise, selective and fast ultra-high performance liquid chromatography (UHPLC-UV) method has been developed and validated for the simultaneous determination of the lipid-regulating agent fenofibrate and its metabolite fenofibric acid in rat plasma. The chromatographic separation was carried out on a reversed-phase Acquity® BEH C18 column using methanol–water (65:35, v/v) as the mobile phase. The isocratic flow was 0.3 ml/min with a rapid run time of 2.5 min, and UV detection was at 284 nm. The method was validated over a concentration range of 100–10000 ng/ml (r2 ⩾ 0.9993). The selectivity, specificity, recovery, accuracy and precision were validated for determination of fenofibrate/fenofibric acid in rat plasma. The lower limits of detection and quantitation of the method were 30 and 90 ng/ml for fenofibrate and 40 and 100 ng/ml for fenofibric acid, respectively. The within- and between-day coefficients of variation were less than 5%. The validated method was successfully applied to measure plasma concentrations in a pharmacokinetic study of fenofibrate in an animal model, illustrating the scope and application of the method.
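The detection and quantitation limits quoted in validations like this one are conventionally derived from the calibration curve. A minimal sketch of that calculation, assuming the common ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, with invented calibration numbers (not the paper's data):

```python
import numpy as np

# Hypothetical calibration data (concentration in ng/ml vs. peak area);
# illustrative values only, not taken from the study.
conc = np.array([100, 500, 1000, 2500, 5000, 10000], dtype=float)
area = np.array([12.1, 60.4, 119.8, 301.2, 603.5, 1204.0])

# Least-squares calibration line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # residual standard deviation of the fit

lod = 3.3 * sigma / slope   # limit of detection (ICH Q2 convention)
loq = 10.0 * sigma / slope  # limit of quantitation
r2 = np.corrcoef(conc, area)[0, 1] ** 2
print(f"r^2 = {r2:.4f}, LOD = {lod:.1f} ng/ml, LOQ = {loq:.1f} ng/ml")
```

With a near-linear calibration set like this one, r² comes out close to 1 and LOQ is roughly three times LOD by construction.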

  1. Validation of some FM-based fitness for purpose methods

    Energy Technology Data Exchange (ETDEWEB)

    Broekhoven, M J.G. [Ministry of Social Affairs, The Hague (Netherlands)

    1988-12-31

    The reliability of several FM-based fitness-for-purpose methods has been investigated on a number of objects for which accurate fracture data were available from experiments or from practice, viz. 23 wide plates, 30 mm thickness (surface and through thickness cracks, cracks at holes, with and without welds), 45 pipelines sections with cracks, pressure vessels and a T-joint. The methods applied mainly comprise ASME XI, PD 6493 and R6. This contribution reviews the results. (author). 11 refs.

  2. A mixed methods inquiry into the validity of data

    DEFF Research Database (Denmark)

    Kristensen, Erling Lundager; Nielsen, Dorthe B; Jensen, Laila N

    2008-01-01

    increased awareness and dialogue between researchers and farmers or other stakeholders about the background for data collection related to management and changes in management. By integrating quantitative and qualitative research methods in a mixed methods research approach, the researchers will improve...... greatly by adding a qualitative perspective to the quantitative approach as illustrated and discussed in this article. The combined approach requires, besides skills and interdisciplinary collaboration, also openness, reflection and scepticism from the involved scientists, but the benefits may be extended...

  3. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  4. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    Science.gov (United States)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-07-01

This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, and "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at the source level under the following propositions.
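Performance metrics of the kind such validation protocols rely on are often summarized with the log-likelihood-ratio cost (Cllr), a standard measure of LR-method performance. A minimal sketch, with invented LR values rather than any real casework data:

```python
import math

def cllr(lrs_same, lrs_diff):
    """Log-likelihood-ratio cost: a common performance metric for
    validating LR methods. Lower is better; Cllr < 1 indicates the
    LRs carry useful evidential information on average."""
    # Penalty for same-source comparisons with low LRs...
    p_same = sum(math.log2(1 + 1 / lr) for lr in lrs_same) / len(lrs_same)
    # ...and for different-source comparisons with high LRs.
    p_diff = sum(math.log2(1 + lr) for lr in lrs_diff) / len(lrs_diff)
    return 0.5 * (p_same + p_diff)

# Hypothetical LRs from same-source and different-source comparisons.
same_source = [120.0, 35.0, 8.0, 2.5]
different_source = [0.01, 0.2, 0.6, 0.05]
print(f"Cllr = {cllr(same_source, different_source):.3f}")
```

A well-calibrated method yields large LRs for same-source pairs and small LRs for different-source pairs, driving both penalty terms, and hence Cllr, toward zero.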

  5. Flight critical system design guidelines and validation methods

    Science.gov (United States)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  6. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    Science.gov (United States)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

The main goal of method validation is to demonstrate that a method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis is used across extremely wide areas, mainly in the field of materials science and for impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a Cu-Au binary alloy of different compositions, was used during the validation protocol following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested to obtain accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
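An uncertainty model combining systematic and random errors, as mentioned above, typically sums standard-uncertainty components in quadrature (the GUM approach). A minimal sketch with invented component values, not the paper's actual uncertainty budget:

```python
import math

# Illustrative standard-uncertainty components for a WDS composition
# measurement (wt.%); values are assumptions, not from the study.
u_repeatability = 0.15  # random: std. dev. of replicate WDS readings
u_reference = 0.10      # systematic: certified uncertainty of the SRM
u_calibration = 0.08    # systematic: calibration-curve contribution

# Combined standard uncertainty: root-sum-of-squares of the components.
u_combined = math.sqrt(u_repeatability**2 + u_reference**2 + u_calibration**2)
U_expanded = 2 * u_combined  # coverage factor k = 2 (~95 % confidence)
print(f"u_c = {u_combined:.3f} wt.%, U (k=2) = {U_expanded:.3f} wt.%")
```

Quadrature summation assumes the components are independent; correlated contributions would need covariance terms.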

  7. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2017-02-01

The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They present a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in D. Meuwly, D. Ramos, R. Haraksim [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the use of the methods is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validation of the LR methods, which are used to calculate the LR values from the data and the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared. But these images do not constitute the core data for the validation, unlike the LRs, which are shared.

  8. The Language Teaching Methods Scale: Reliability and Validity Studies

    Science.gov (United States)

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  9. A method to determine validity and reliability of activity sensors

    NARCIS (Netherlands)

    Boerema, Simone Theresa; Hermens, Hermanus J.

    2013-01-01

    METHOD Four sensors were securely fastened to a mechanical oscillator (Vibration Exciter, type 4809, Brüel & Kjær) and moved at various frequencies (6.67Hz; 13.45Hz; 19.88Hz) within the range of human physical activity. For each of the three sensor axes, the sensors were simultaneously moved for

  10. Reliability and Validity of the Research Methods Skills Assessment

    Science.gov (United States)

    Smith, Tamarah; Smith, Samantha

    2018-01-01

    The Research Methods Skills Assessment (RMSA) was created to measure psychology majors' statistics knowledge and skills. The American Psychological Association's Guidelines for the Undergraduate Major in Psychology (APA, 2007, 2013) served as a framework for development. Results from a Rasch analysis with data from n = 330 undergraduates showed…

  11. Validity of the Demirjian method for dental age estimation for ...

    African Journals Online (AJOL)

    2015-02-04

    Feb 4, 2015 ... Conclusions: It is appropriate to use the Demirjian method in southern Turkish children; however, a revision is needed in some ... Departments of Pediatric Dentistry and 1Orthodontics, Faculty of Dentistry, University of Akdeniz, Antalya, Turkey .... agenesis excluded from the study because dental anomalies.

  12. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via the AutoCAD software and flexible ruler methods. The study was conducted in two parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  13. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

Biogas produced from digesters or landfills contains impurities which can be harmful to components that come into contact with the biogas during its utilization. Among these, the siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on the heated surfaces of combustion equipment. Silicon dioxide is a solid compound and will remain in the engine and cause damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogases. In the first part of this report, a method for the analysis of siloxanes in biogases was validated. Sampling was performed directly at the plant by drawing a small volume of biogas onto an adsorbent tube over a short period of time. These tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for purpose, i.e. that the method, as used by the laboratory generating the data, will provide data that meet a set of criteria concerning precision and accuracy. Finally, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected at waste water treatment plants, co-digestion plants and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from waste water treatment plants contained markedly higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low regardless of which feedstock was digested and which upgrading technique was used.

  14. Human Factors methods concerning integrated validation of nuclear power plant control rooms

    International Nuclear Information System (INIS)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia

    2010-02-01

The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed by the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which measurement methods should be used, and how the methods are affected by changes in the control room. The report raises several questions for discussion concerning the validation process. Supplementary measurement methods for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for the identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations regarding the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems.

  15. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing the reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of the retrieved studies. Data were extracted by the primary reviewer. The results were synthesized qualitatively using a level-of-evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. Reliability, validity, and both reliability and validity were investigated by sixteen, two and nine studies, respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity in support of the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research.

  16. Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy

    OpenAIRE

    Kolokitha, Olga-Elpis

    2007-01-01

    Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manu...

  17. TESTING METHODS FOR MECHANICALLY IMPROVED SOILS: RELIABILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Ana Petkovšek

    2017-10-01

A possibility of in-situ mechanical improvement for reducing the liquefaction potential of silty sands was investigated using three different techniques: Vibratory Roller Compaction, Rapid Impact Compaction (RIC) and Soil Mixing. Material properties at all test sites were investigated before and after improvement with laboratory and in situ tests (CPT, SDMT, DPSH B, static and dynamic load plate tests, geohydraulic tests). Correlation between the results obtained by the different test methods gave inconclusive answers.

  18. A Virtual Upgrade Validation Method for Software-Reliant Systems

    Science.gov (United States)

    2012-06-01

This report is the first in a series of three reports developed by the SEI for the ASSIP on behalf of the Army Program Executive Office Aviation (PEO AVN). The work consists of the development of the VUV method, the subject of this report.

  19. The development and validation of control rod calculation methods

    International Nuclear Information System (INIS)

    Rowlands, J.L.; Sweet, D.W.; Franklin, B.M.

    1979-01-01

Fission rate distributions have been measured in the zero power critical facility, ZEBRA, for a series of eight different arrays of boron carbide control rods. Diffusion theory calculations have been compared with these measurements. The normalised fission rates differ by up to about 30% in some regions, between the different arrays, and these differences are well predicted by the calculations. A development has been made to a method used to produce homogenised cross sections for lattice regions containing control rods. Calculations show that the method also reproduces the reaction rate within the rod and the fission rate dip at the surface of the rod in satisfactory agreement with the more accurate calculations which represent the fine structure of the rod. A comparison between diffusion theory and transport theory calculations of control rod reactivity worths in the CDFR shows that for the standard design method the finite mesh approximation and the difference between diffusion theory and transport theory (the transport correction) tend to cancel and result in corrections to be applied to the standard mesh diffusion theory calculations of about ±2% or less. This result applies for mesh centred finite difference diffusion theory codes and for the arrays of natural boron carbide control rods for which the calculations were made. Improvements have also been made to the effective diffusion coefficients used in diffusion theory calculations for control rod followers and these give satisfactory agreement with transport theory calculations. (U.K.)

  20. Validation of the method for investigation of radiopharmaceuticals for in vitro use

    International Nuclear Information System (INIS)

Vranjes, S.; Jovanovic, M.; Orlic, M.; Lazic, E. (e-mail address of corresponding author: sanjav@vin.bg.ac.yu)

    2005-01-01

The aim of this study was to validate an analytical method for the determination of the total radioactivity and radioactive concentration of 125I-triiodothyronine, a radiopharmaceutical for in vitro use. The analytical parameters selectivity, accuracy, linearity and range of this method were determined. The values obtained for all parameters are reasonable for analytical methods; therefore this method could be used for further investigation. (author)

  1. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  2. Evaluating the Social Validity of the Early Start Denver Model: A Convergent Mixed Methods Study

    Science.gov (United States)

    Ogilvie, Emily; McCrudden, Matthew T.

    2017-01-01

    An intervention has social validity to the extent that it is socially acceptable to participants and stakeholders. This pilot convergent mixed methods study evaluated parents' perceptions of the social validity of the Early Start Denver Model (ESDM), a naturalistic behavioral intervention for children with autism. It focused on whether the parents…

  3. Rationale and methods of the European Food Consumption Validation (EFCOVAL) Project

    NARCIS (Netherlands)

    Boer, de E.J.; Slimani, N.; Boeing, H.; Feinberg, M.; Leclerq, C.; Trolle, E.; Amiano, P.; Andersen, L.F.; Freisling, H.; Geelen, A.; Harttig, U.; Huybrechts, I.; Kaic-Rak, A.; Lafay, L.; Lillegaard, I.T.L.; Ruprich, J.; Vries, de J.H.M.; Ocke, M.C.

    2011-01-01

    Background/Objectives: The overall objective of the European Food Consumption Validation (EFCOVAL) Project was to further develop and validate a trans-European food consumption method to be used for the evaluation of the intake of foods, nutrients and potentially hazardous chemicals within the

  4. Validation of Likelihood Ratio Methods Used for Forensic Evidence Evaluation: Application in Forensic Fingerprints

    NARCIS (Netherlands)

    Haraksim, Rudolf

    2014-01-01

    In this chapter the Likelihood Ratio (LR) inference model will be introduced, the theoretical aspects of probabilities will be discussed and the validation framework for LR methods used for forensic evidence evaluation will be presented. Prior to introducing the validation framework, following
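    The LR computation at the heart of such methods can be sketched with two assumed Gaussian comparison-score distributions. The distributions and parameters below are purely illustrative, not taken from the chapter:

```python
from statistics import NormalDist

# Hypothetical score distributions for same-source (H1) and
# different-source (H2) fingerprint comparisons; the means and
# standard deviations are illustrative assumptions.
same_source = NormalDist(mu=0.8, sigma=0.1)
diff_source = NormalDist(mu=0.3, sigma=0.15)

def likelihood_ratio(score: float) -> float:
    """LR = P(evidence | H1) / P(evidence | H2) for a comparison score."""
    return same_source.pdf(score) / diff_source.pdf(score)

# A high comparison score supports H1 (LR > 1); a low score supports H2.
print(likelihood_ratio(0.75))  # > 1: evidence favours same-source
print(likelihood_ratio(0.35))  # < 1: evidence favours different-source
```

    Validation frameworks for LR methods then ask how well such LRs are calibrated and how discriminating they are on known-ground-truth data.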

  5. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  6. Pressure Autoregulation Measurement Techniques in Adult Traumatic Brain Injury, Part I: A Scoping Review of Intermittent/Semi-Intermittent Methods.

    Science.gov (United States)

    Zeiler, Frederick A; Donnelly, Joseph; Calviello, Leanne; Menon, David K; Smielewski, Peter; Czosnyka, Marek

    2017-12-01

    The purpose of this study was to perform a systematic, scoping review of commonly described intermittent/semi-intermittent autoregulation measurement techniques in adult traumatic brain injury (TBI). Nine separate systematic reviews were conducted, one for each intermittent technique: computed tomographic perfusion (CTP)/xenon-CT (Xe-CT), positron emission tomography (PET), magnetic resonance imaging (MRI), the arteriovenous difference in oxygen (AVDO2) technique, the thigh cuff deflation technique (TCDT), the transient hyperemic response test (THRT), the orthostatic hypotension test (OHT), the mean flow index (Mx), and the transfer function autoregulation index (TF-ARI). MEDLINE®, BIOSIS, EMBASE, Global Health, Scopus, the Cochrane Library (inception to December 2016), and reference lists of relevant articles were searched. A two-tier filter of references was applied. The total number of articles utilizing each of the nine searched techniques for intermittent/semi-intermittent autoregulation assessment in adult TBI was: CTP/Xe-CT (10), PET (6), MRI (0), AVDO2 (10), ARI-based TCDT (9), THRT (6), OHT (3), Mx (17), and TF-ARI (6). The premise behind all of the intermittent techniques is manipulation of systemic blood pressure/blood volume via either chemical (such as vasopressors) or mechanical (such as thigh cuffs or carotid compression) means. The exceptions are Mx and TF-ARI, which are based on spontaneous fluctuations of cerebral perfusion pressure (CPP) or mean arterial pressure (MAP). The method for assessing the cerebral circulation during these manipulations varies, with both imaging-based techniques and TCD utilized. Despite the limited literature for intermittent/semi-intermittent techniques in adult TBI (Mx excepted), it is important to acknowledge the availability of such tests. They have provided fundamental insight into human autoregulatory capacity, leading to the development of continuous and more commonly applied techniques in the intensive care unit (ICU). Numerous methods of

  7. Experimental Validation for Hot Stamping Process by Using Taguchi Method

    Science.gov (United States)

    Fawzi Zamri, Mohd; Lim, Syh Kai; Razlan Yusoff, Ahmad

    2016-02-01

    The demand for reduced gas emissions, energy savings, and safer vehicles has driven the development of Ultra High Strength Steel (UHSS). To strengthen a UHSS material such as boron steel, it must undergo a hot stamping process, i.e. heating at a certain temperature for a certain time. In this paper, the Taguchi method is applied to determine the appropriate thickness, heating temperature, and heating time to achieve optimum strength of boron steel. The experiment was conducted using a flat, square hot stamping tool with a tensile dog-bone specimen as the blank product; tensile strength and hardness were then measured as responses. The results showed that lower thickness together with higher heating temperature and longer heating time gives higher strength and hardness in the final product. In conclusion, boron steel blanks are able to achieve up to 1200 MPa tensile strength and 650 HV hardness.

  8. Validation of a spectrophotometric method for quantification of carboxyhemoglobin.

    Science.gov (United States)

    Luchini, Paulo D; Leyton, Jaime F; Strombech, Maria de Lourdes C; Ponce, Julio C; Jesus, Maria das Graças S; Leyton, Vilma

    2009-10-01

    The measurement of carboxyhemoglobin (COHb) levels in blood is a valuable procedure to confirm exposure to carbon monoxide (CO) for either forensic or occupational purposes. A previously described method using spectrophotometric readings at 420 and 432 nm, after reduction of oxyhemoglobin (O2Hb) and methemoglobin with sodium hydrosulfite solution, leads to an exponential curve. This curve, used with pre-established factors, serves well for low concentrations (1-7%) or for high concentrations (>20%), but very rarely for both. The authors have observed that small variations in the previously described factors F1, F2, and F3, obtained from readings for 100% COHb and 100% O2Hb, translate into significant changes in COHb% results, and propose that these factors should be determined every time COHb is measured, by reading CO- and O2-saturated samples. This practice leads to an increase in accuracy and precision.
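    As a loose illustration of why per-run calibration references matter, the sketch below estimates COHb% by linearly interpolating the A420/A432 absorbance ratio between freshly measured 100% O2Hb and 100% COHb reference samples. This linear interpolation is a simplification for illustration only, not the published exponential calibration, and all readings are hypothetical:

```python
# Illustrative sketch (NOT the published calibration): estimate COHb%
# by interpolating the A420/A432 absorbance ratio between same-run
# 100% O2Hb and 100% COHb reference samples, echoing the authors'
# advice to re-derive calibration factors at every run.
def absorbance_ratio(a420: float, a432: float) -> float:
    return a420 / a432

def cohb_percent(sample, ref_o2hb, ref_cohb) -> float:
    """Linear interpolation of the sample ratio between the two
    same-run reference ratios; inputs are (A420, A432) pairs."""
    r = absorbance_ratio(*sample)
    r0 = absorbance_ratio(*ref_o2hb)    # ratio at 0% COHb
    r100 = absorbance_ratio(*ref_cohb)  # ratio at 100% COHb
    return 100.0 * (r - r0) / (r100 - r0)

# Hypothetical readings: the reference ratios bracket the sample ratio.
print(cohb_percent(sample=(0.60, 0.50),
                   ref_o2hb=(0.40, 0.50),
                   ref_cohb=(0.90, 0.50)))
```

    The point of the sketch is structural: if `ref_o2hb` and `ref_cohb` drift between runs, the estimated COHb% shifts with them, which is why the authors recommend re-measuring the saturated references with every batch.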

  9. VALIDATION OF THE ASSR TEST THROUGH COMPLEMENTARY AUDIOLOGICAL METHODS

    Directory of Open Access Journals (Sweden)

    C. Mârtu

    2016-04-01

    Introduction: Auditory steady-state response (ASSR) testing is an objective method for determining the auditory threshold, applicable and necessary especially in children, and extremely important when recommending cochlear implantation in children. The aim of the study was to compare pure-tone audiometry responses with auditory steady-state thresholds. Materials and method: The study group included both patients with normal hearing and patients with hearing loss. The main inclusion criteria admitted only patients with a normal otomicroscopic aspect, a normal tympanogram, the ability to respond to pure-tone audiometry, and air-conduction thresholds between 0 and 80 dB nHL. Patients with suppurative otic processes or ear malformations were excluded. Following the research protocol, the tests were performed in soundproofed rooms, starting with pure-tone audiometry followed, after a pause, by ASSR determinations at frequencies of 0.5, 1, 2, and 4 kHz. The audiological instruments were provided by a single manufacturer. ASSR was recorded at least twice at both borderline intensities, namely the one defining the auditory threshold and the first no-response intensity. The recorded responses were stored in a database and further processed in Excel. Discussion: The differences observed between pure-tone audiometry and ASSR thresholds are important at 500 Hz and insignificant at the other frequencies. Whatever the relation between the PTA and ASSR thresholds in one ear, the profile of the threshold gap keeps the same shape in the opposite ear. Conclusions: ASSR is a reliable objective test, although attention must be paid to the low frequencies, where some differences may occur.

  10. [Use of THP-1 for allergens identification method validation].

    Science.gov (United States)

    Zhao, Xuezheng; Jia, Qiang; Zhang, Jun; Li, Xue; Zhang, Yanshu; Dai, Yufei

    2014-05-01

    We sought an in vitro test method for evaluating sensitization using THP-1 cells, with changes in cytokine expression as more reliable markers for the identification of sensitizers. Monocyte-like THP-1 cells were induced and differentiated into THP-1 macrophages with PMA (0.1 microg/ml). Changes in cytokine expression were evaluated at different time points after the cells were treated, at various concentrations, with five known allergens, 2,4-dinitrochlorobenzene (DNCB), nickel sulfate (NiSO4), phenylenediamine (PPDA), potassium dichromate (K2Cr2O7) and toluene diisocyanate (TDI), and two non-allergens, sodium dodecyl sulfate (SDS) and isopropanol (IPA). IL-6 and TNF-alpha production was measured by ELISA; the secretion of IL-1beta and IL-8 was analyzed by Cytometric Bead Array (CBA). Secretion of IL-6, TNF-alpha, IL-1beta and IL-8 was highest when THP-1 cells were exposed to NiSO4, DNCB and K2Cr2O7 for 6 h, and to PPDA and TDI for 12 h. At the optimum time points and optimal concentrations, IL-6 production was approximately 40, 25, 20, 50 and 50 times that of the control group for the five chemical allergens NiSO4, DNCB, K2Cr2O7, PPDA and TDI, respectively. TNF-alpha expression was 20, 12, 20, 8 and 5 times that of the control group, respectively; IL-1beta secretion was 30, 60, 25, 30 and 45 times; and IL-8 production was approximately 15, 12, 15, 12 and 7 times. Both non-allergens, SDS and IPA, significantly induced IL-6 secretion in a dose-dependent manner, with SDS causing higher production levels, approximately 20 times the control; IL-6 may therefore not be a reliable marker for the identification of allergens. TNF-alpha, IL-1beta and IL-8 expression did not change significantly after exposure to the two non-allergens.
    The test method using THP-1 cells by detecting the productions of cytokines (TNF-alpha, IL-1beta and

  11. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated

  12. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    Science.gov (United States)

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although content validation of programs is mostly conducted with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post-marriage training program provided for newly married couples. Content validation is a preliminary step in obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels. Altogether 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom were shared with the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties (CVR, CVI, and face validity) of 57 educational objectives. Results: The raw data of the post-marriage program had been written by professional experts of the Ministry of Health; using the qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, six further objectives were deleted, three for falling below the agreement cut-off point and three by experts' consensus. In the quantitative assessment, the validity of all items was above 0.8 and their content validity indices (0.8-1) were fully appropriate. Conclusion: This study provides good evidence for the validation and accreditation of the national post-marriage program planned for newly married couples in health centers of the country in the near future. PMID:26056672
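    The quantitative indices mentioned, CVR and CVI, follow standard formulas (Lawshe's content validity ratio and the item-level content validity index). A minimal sketch with a hypothetical 10-expert panel, not the study's actual ratings:

```python
def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (ne - N/2) / (N/2), where ne is the number of
    experts rating the item 'essential' and N is the panel size."""
    half = n_experts / 2
    return (n_essential - half) / half

def item_cvi(ratings, relevant=(3, 4)) -> float:
    """I-CVI: share of experts rating the item 3 or 4 on a 4-point
    relevance scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Hypothetical panel of 10 experts: 9 rate the item essential/relevant.
print(content_validity_ratio(n_essential=9, n_experts=10))  # CVR = 0.8
print(item_cvi([4, 4, 3, 4, 3, 4, 4, 3, 4, 2]))             # I-CVI = 0.9
```

    With 10 experts, a CVR of 0.8 or an I-CVI of 0.9 would sit at or above the acceptance levels the study reports (validity above 0.8).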

  13. Content validity across methods of malnutrition assessment in patients with cancer is limited

    NARCIS (Netherlands)

    Sealy, Martine J.; Nijholt, Willemke; Stuiver, Martijn M.; van der Berg, Marit M.; Roodenburg, Jan L. N.; Schans, van der Cees P.; Ottery, Faith D.; Jager-Wittenaar, Harriet

    Objective: To identify malnutrition assessment methods in cancer patients and assess their content validity based on internationally accepted definitions for malnutrition. Study Design and Setting: Systematic review of studies in cancer patients that operationalized malnutrition as a variable,

  14. Content validity across methods of malnutrition assessment in patients with cancer is limited

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Roodenburg, Jan; Ottery, Faith D.; van der Schans, Cees; Jager, Harriët

    2016-01-01

    Objective To identify malnutrition assessment methods in cancer patients and assess their content validity based on internationally accepted definitions for malnutrition. Study Design and Setting Systematic review of studies in cancer patients that operationalized malnutrition as a variable,

  15. Validation and application of an high-order spectral difference method for flow induced noise simulation

    KAUST Repository

    Parsani, Matteo; Ghorbaniasl, Ghader; Lacor, C.

    2011-01-01

    The method is based on the Ffowcs Williams-Hawkings approach, which provides noise contributions for monopole, dipole and quadrupole acoustic sources. This paper focuses on the validation and assessment of this hybrid approach using different test cases

  16. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, the method does not affect the conventional design elements and can effectively connect the requirements with the design, realizing the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and validation method for the requirements are introduced in detail, in the hope of providing this experience to other civil jet product designs.

  17. Development and Validation of a RP-HPLC Method for the ...

    African Journals Online (AJOL)

    Development and Validation of a RP-HPLC Method for the Simultaneous Determination of Rifampicin and a Flavonoid Glycoside - A Novel ... range, accuracy, precision, limit of detection, limit of quantification, robustness and specificity.

  18. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
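    A POD model treats the probability of detection as a continuous function of concentration. A minimal sketch, assuming an illustrative logistic curve on log concentration for a candidate and a reference method (the parameters are hypothetical, not from the article):

```python
import math

def pod(conc: float, c50: float, slope: float) -> float:
    """Logistic POD curve: probability of detection vs. log10 concentration.
    c50 is the concentration giving POD = 0.5; parameters are illustrative."""
    return 1.0 / (1.0 + math.exp(-slope * (math.log10(conc) - math.log10(c50))))

# Hypothetical candidate vs. reference method parameters.
candidate = dict(c50=0.5, slope=4.0)
reference = dict(c50=0.4, slope=4.0)

# Comparing the two curves at shared concentrations is the graphical
# comparison the POD framework enables.
for c in (0.1, 0.5, 2.0):
    print(f"conc {c}: candidate POD {pod(c, **candidate):.2f}, "
          f"reference POD {pod(c, **reference):.2f}")
```

    In a real validation the curve parameters would be estimated from qualitative test results at several concentrations, with confidence bands, rather than assumed as here.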

  19. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow method validity to be assessed while taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, the choice between the generalized pivotal quantity and β-content (0.9) methods for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
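    The total-error idea behind "fit for purpose" criteria can be sketched in a few lines: given a method's bias and precision, compute the probability that a future result falls within an acceptance limit ±λ of the true value, and accept the method if that probability exceeds a target. All numbers below are illustrative, and this is the underlying probability statement only, not the paper's generalized pivotal quantity test:

```python
from statistics import NormalDist

def prob_within(bias: float, sd: float, lam: float) -> float:
    """P(|result - true value| < lam) when results ~ N(true + bias, sd).
    A method is 'fit for purpose' if this exceeds a target, e.g. 0.9."""
    z = NormalDist()
    return z.cdf((lam - bias) / sd) - z.cdf((-lam - bias) / sd)

# Illustrative: acceptance limit of +/-15 units and two candidate
# methods with different bias/precision trade-offs.
unbiased = prob_within(bias=0.0, sd=6.0, lam=15.0)
biased = prob_within(bias=8.0, sd=6.0, lam=15.0)
print(round(unbiased, 3), round(biased, 3))  # only the first exceeds 0.9
```

    The statistical subtlety the paper addresses is that `bias` and `sd` are estimated, not known, so the test procedure (pivotal quantities or tolerance intervals) must control the risk of wrongly accepting an unsuitable method.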

  20. Comparing the Validity of Non-Invasive Methods in Measuring Thoracic Kyphosis and Lumbar Lordosis

    Directory of Open Access Journals (Sweden)

    Mohammad Yousefi

    2012-04-01

    Background: The purpose of this article is to study the validity of each of three non-invasive methods (flexible ruler, spinal mouse, and image processing) against X-ray radiation (the reference method) and to compare them with each other. Materials and Methods: To evaluate the validity of each of these non-invasive methods, the thoracic kyphosis and lumbar lordosis angles of 20 students of Birjand University (mean ± standard deviation of age: 26±2 years, weight: 72±2.5 kg, height: 169±5.5 cm) were measured with the four methods of flexible ruler, spinal mouse, image processing, and X-ray. Results: The flexible ruler, spinal mouse, and image processing methods, in measuring the thoracic kyphosis and lumbar lordosis angles respectively, showed agreement of 0.81, 0.87, 0.73, 0.76, 0.83 and 0.89 with X-ray (p>0.05). Given the validity obtained against the gold-standard X-ray method, it can be stated that the three non-invasive methods have adequate validity. In addition, one-way analysis of variance indicated a meaningful relationship between the three methods of measuring thoracic kyphosis and lumbar lordosis, and according to the Tukey test result, the image processing method is the most precise. Conclusion: The image processing method can therefore be used along with other non-invasive methods as a valid measuring method.
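    The reported agreement values read as correlations of each non-invasive measure against the X-ray reference. A minimal sketch of that comparison with hypothetical angle data (the angles below are invented for illustration):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation, used here as the agreement of a
    non-invasive kyphosis measure with the X-ray reference."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical kyphosis angles (degrees): X-ray vs. image processing.
xray = [38, 42, 35, 50, 44, 47, 40, 36]
image = [37, 44, 34, 51, 43, 48, 42, 35]
print(round(pearson_r(xray, image), 2))  # high agreement with the reference
```

    In practice an intraclass correlation or Bland-Altman analysis would usually complement Pearson's r, since r measures association rather than absolute agreement.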

  1. Two Validated HPLC Methods for the Quantification of Alizarin and other Anthraquinones in Rubia tinctorum Cultivars

    NARCIS (Netherlands)

    Derksen, G.C.H.; Lelyveld, G.P.; Beek, van T.A.; Capelle, A.; Groot, de Æ.

    2004-01-01

    Direct and indirect HPLC-UV methods for the quantitative determination of anthraquinones in dried madder root have been developed, validated and compared. In the direct method, madder root was extracted twice with refluxing ethanol-water. This method allowed the determination of the two major native

  2. Determination of Modafinil in Tablet Formulation Using Three New Validated Spectrophotometric Methods

    International Nuclear Information System (INIS)

    Basniwal, P.K.; Jain, D.; Basniwal, P.K.

    2014-01-01

    In this study, three new UV spectrophotometric methods, viz. linear regression equation (LRE), standard absorptivity (SA) and first-order derivative (FOD), were developed and validated for the determination of modafinil in tablet form. The Beer-Lambert law was obeyed over the range of 10-50 μg/mL, and all the methods were validated for linearity, accuracy, precision and robustness. The methods were successfully applied for the assay of modafinil drug content in tablets, giving 100.20-100.42%, 100.11-100.58% and 100.25-100.34%, respectively, with acceptable standard deviation (less than two) for all methods. The validated spectrophotometric methods may be applied for assay, dissolution studies, bio-equivalence studies and routine analysis in pharmaceutical industries. (author)
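    A calibration line of the kind underlying the linear regression equation method can be sketched as ordinary least squares over absorbance standards in the validated 10-50 μg/mL range. The absorbance readings below are hypothetical, chosen only to behave linearly per the Beer-Lambert law:

```python
from statistics import mean

# Hypothetical calibration standards: absorbance assumed proportional
# to concentration (Beer-Lambert law) over the validated range.
conc = [10, 20, 30, 40, 50]                       # ug/mL standards
absorbance = [0.152, 0.298, 0.451, 0.602, 0.749]  # illustrative readings

# Ordinary least-squares slope and intercept of the calibration line.
mc, ma = mean(conc), mean(absorbance)
slope = (sum((c - mc) * (a - ma) for c, a in zip(conc, absorbance))
         / sum((c - mc) ** 2 for c in conc))
intercept = ma - slope * mc

def predict_conc(a: float) -> float:
    """Invert the calibration line to read concentration from absorbance."""
    return (a - intercept) / slope

print(round(predict_conc(0.45), 1))  # near the 30 ug/mL standard
```

    Linearity validation then amounts to checking the residuals and correlation coefficient of this fit across the stated range.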

  3. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.

  4. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    Science.gov (United States)

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method could bring into the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for validation of chemical, bio-chemical, pharmaceutical and genetic methods have been developed and ad hoc validation statistics are available and routinely used, for in-house and inter-laboratory testing, and decision-making. Fuzzy logic allows summarising the information obtained by independent validation statistics into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper, and applied to validation data for different genetically modified elements. Remarks were drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable to improve interpretation of results and facilitate overall evaluation of the multiplex method.
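    One possible shape of such a fuzzy aggregation is sketched below: each validation statistic is mapped to a [0, 1] membership value and the memberships are combined into one synthetic performance indicator. The statistics, membership bounds, and weights are all hypothetical, and the paper's actual rule base is richer than this weighted average:

```python
# Minimal fuzzy-style aggregation sketch: map each validation statistic
# onto a [0, 1] membership ("fully acceptable" = 1), then combine the
# memberships into a single synthetic indicator of method performance.
def membership(value, best, worst):
    """Linear membership: 1 at 'best', 0 at 'worst', clipped in between."""
    m = (value - worst) / (best - worst)
    return max(0.0, min(1.0, m))

# Hypothetical statistics for one GM element assay: (observed, best, worst).
stats_ = {
    "sensitivity": (0.98, 1.0, 0.8),
    "specificity": (0.92, 1.0, 0.8),
    "false_negative_rate": (0.03, 0.0, 0.2),
}
weights = {"sensitivity": 0.4, "specificity": 0.4, "false_negative_rate": 0.2}

indicator = sum(weights[k] * membership(*stats_[k]) for k in stats_)
print(round(indicator, 3))  # one number summarising overall performance
```

    The appeal for multiplex methods is that one such indicator can be computed per GM element and concentration, turning a large grid of statistics into a comparable set of scores.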

  5. Validation of Cyanoacrylate Method for Collection of Stratum Corneum in Human Skin for Lipid Analysis

    DEFF Research Database (Denmark)

    Jungersted, JM; Hellgren, Lars; Drachmann, Tue

    2010-01-01

    Background and Objective: Lipids in the stratum corneum (SC) are of major importance for the skin barrier function. Many different methods have been used for the collection of SC for the analysis of SC lipids. The objective of the present study was to validate the cyanoacrylate method for the collection of SC in human skin for lipid analysis.

  6. Microscale validation of 4-aminoantipyrine test method for quantifying phenolic compounds in microbial culture

    International Nuclear Information System (INIS)

    Justiz Mendoza, Ibrahin; Aguilera Rodriguez, Isabel; Perez Portuondo, Irasema

    2014-01-01

    Validation of test methods at microscale is currently of great importance owing to its economic and environmental advantages, and it is a prerequisite for performing services and assuring the quality of the results provided to customers. This paper addresses the microscale validation of the 4-aminoantipyrine spectrophotometric method for the quantification of phenolic compounds in culture medium. The parameters linearity, precision, regression, accuracy, detection limit, quantification limit and robustness were evaluated, in addition to a comparison test with a non-standardized method for determining polyphenols (Folin-Ciocalteu). The results showed that both methods are feasible for determining phenols

  7. Method development and validation of potent pyrimidine derivative by UV-VIS spectrophotometer.

    Science.gov (United States)

    Chaudhary, Anshu; Singh, Anoop; Verma, Prabhakar Kumar

    2014-12-01

    A rapid and sensitive ultraviolet-visible (UV-VIS) spectroscopic method was developed for the estimation of the pyrimidine derivative 6-bromo-3-(6-(2,6-dichlorophenyl)-2-(morpholinomethylamino)pyrimidin-4-yl)-2H-chromen-2-one (BT10M) in bulk form. The pyrimidine derivative was monitored at 275 nm with UV detection, with no interference from diluents at 275 nm. The method was found to be linear in the range of 50 to 150 μg/ml. The accuracy and precision were determined and validated statistically. The method was validated according to guidelines. The results showed that the proposed method is suitable for the accurate, precise, and rapid determination of the pyrimidine derivative.

  8. [Data validation methods and discussion on Chinese materia medica resource survey].

    Science.gov (United States)

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilot surveys. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. To ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method of ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the fourth national survey's database and further improves the design ideas and programs of data validation. The purpose of this study is to help the survey work proceed smoothly.

  9. Session-RPE Method for Training Load Monitoring: Validity, Ecological Usefulness, and Influencing Factors

    Directory of Open Access Journals (Sweden)

    Monoem Haddad

    2017-11-01

    Purpose: The aim of this review is to (1) retrieve all data validating the session rating of perceived exertion (session-RPE) method using various criteria, (2) highlight the rationale of this method and its ecological usefulness, and (3) describe factors that can alter RPE and that users of this method should take into consideration. Method: Search engines such as SPORTDiscus, PubMed, and Google Scholar were consulted for English-language studies on the validity and usefulness of the session-RPE method published between 2001 and 2016. Studies were considered for further analysis when they used the session-RPE method proposed by Foster et al. in 2001. Participants were athletes of any gender, age, or level of competition. Studies in languages other than English were excluded from the analysis of the validity and reliability of the session-RPE method; other studies were examined to explain the rationale of the session-RPE method and the origin of RPE. Results: A total of 950 studies cited the Foster et al. study that proposed the session-RPE method; 36 of them examined the validity and reliability of the method using the modified CR-10 scale. Conclusion: These studies confirmed the validity, good reliability and internal consistency of the session-RPE method in several sports and physical activities, with men and women of different age categories (children, adolescents, and adults) and various expertise levels. The method can be used as a stand-alone method for training load (TL) monitoring, though some authors recommend combining it with physiological parameters such as heart rate.
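    The session-RPE load itself is simply the CR-10 rating multiplied by session duration in minutes (Foster et al., 2001); the commonly derived weekly quantities can be sketched as follows, using a hypothetical training week:

```python
from statistics import mean, pstdev

def session_load(rpe: float, minutes: float) -> float:
    """Foster's session-RPE training load: CR-10 RPE x duration (min),
    in arbitrary units (AU)."""
    return rpe * minutes

# One hypothetical training week: (CR-10 RPE, minutes) per day;
# zeros are rest days.
week = [(6, 60), (4, 45), (7, 75), (0, 0), (5, 60), (8, 90), (0, 0)]
daily = [session_load(r, m) for r, m in week]

weekly_load = sum(daily)
monotony = mean(daily) / pstdev(daily)  # Foster's training monotony
strain = weekly_load * monotony         # Foster's training strain
print(weekly_load, round(monotony, 2), round(strain, 1))
```

    These derived quantities (weekly load, monotony, strain) are what practitioners typically monitor when using session-RPE as a stand-alone training-load method.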

  10. Development and validation of a dissolution method using HPLC for diclofenac potassium in oral suspension

    Directory of Open Access Journals (Sweden)

    Alexandre Machado Rubim

    2014-04-01

    The present study describes the development and validation of an in vitro dissolution method for evaluating the release of diclofenac potassium in oral suspension. The dissolution test was developed and validated according to international guidelines. Parameters such as linearity, specificity, precision and accuracy were evaluated, as well as the influence of rotation speed and surfactant concentration in the medium. After selecting the best conditions, the method was validated using apparatus 2 (paddle) at a rotation speed of 50 rpm, with 900 mL of water containing 0.3% sodium lauryl sulfate (SLS) as the dissolution medium at 37.0 ± 0.5°C. Samples were analyzed by the HPLC-UV (PDA) method. The results obtained were satisfactory for the parameters evaluated. The method developed may be useful in routine quality control for pharmaceutical industries that produce oral suspensions containing diclofenac potassium.

  11. Validation of innovative technologies and strategies for regulatory safety assessment methods: challenges and opportunities.

    Science.gov (United States)

    Stokes, William S; Wind, Marilyn

    2010-01-01

    Advances in science and innovative technologies are providing new opportunities to develop test methods and strategies that may improve safety assessments and reduce animal use for safety testing. These include high throughput screening and other approaches that can rapidly measure or predict various molecular, genetic, and cellular perturbations caused by test substances. Integrated testing and decision strategies that consider multiple types of information and data are also being developed. Prior to their use for regulatory decision-making, new methods and strategies must undergo appropriate validation studies to determine the extent that their use can provide equivalent or improved protection compared to existing methods and to determine the extent that reproducible results can be obtained in different laboratories. Comprehensive and optimal validation study designs are expected to expedite the validation and regulatory acceptance of new test methods and strategies that will support improved safety assessments and reduced animal use for regulatory testing.

  12. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

    This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s, and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.

  13. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    Science.gov (United States)

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola, and potato were validated in-house, of which 14 were validated on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy, and precision. Also, decision limits were calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood method). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' are important contributors to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that, for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.
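As a sketch of the variance-component idea described in this record, the repeatability and reproducibility RSDs can be assembled from the separate contributions of the PCR replicate, DNA isolation, and PCR day. All numbers below are hypothetical, not taken from the study:

```python
import math

# Hypothetical variance components on the measurement scale (illustrative only)
var_pcr = 0.010   # residual variance between PCR replicates
var_dna = 0.006   # variance contributed by DNA isolation
var_day = 0.004   # variance contributed by PCR day
mean_gm = 1.0     # mean measured GM content (%), illustrative

# Repeatability uses only the within-run component; reproducibility adds
# the DNA-isolation and PCR-day components, as the study argues it should.
sd_r = math.sqrt(var_pcr)
sd_R = math.sqrt(var_pcr + var_dna + var_day)
rsd_r = 100 * sd_r / mean_gm
rsd_R = 100 * sd_R / mean_gm
```

With these made-up components, reproducibility is noticeably larger than repeatability, which is exactly why the study argues that validating the PCR step alone understates the total variation.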

  14. Experimental validation of calculation methods for structures with shock non-linearity

    International Nuclear Information System (INIS)

    Brochard, D.; Buland, P.

    1987-01-01

    For the seismic analysis of non-linear structures, numerical methods have been developed which need to be validated against experimental results. The aim of this paper is to present the design method of a test program whose results will be used for this purpose. Some applications to nuclear components illustrate this presentation [fr]

  15. Examination of packaging materials in bakery products: a validated method for detection and quantification

    NARCIS (Netherlands)

    Raamsdonk, van L.W.D.; Pinckaers, V.G.Z.; Vliege, J.J.M.; Egmond, van H.J.

    2012-01-01

    Methods for the detection and quantification of packaging materials are necessary for the control of the prohibition of these materials according to Regulation (EC) 767/2009. A method has been developed and validated at RIKILT for bakery products, including sweet bread and raisin bread. This choice …

  16. Validation of a Novel 3-Dimensional Sonographic Method for Assessing Gastric Accommodation in Healthy Adults

    NARCIS (Netherlands)

    Buisman, Wijnand J; van Herwaarden-Lindeboom, MYA; Mauritz, Femke A; El Ouamari, Mourad; Hausken, Trygve; Olafsdottir, Edda J; van der Zee, David C; Gilja, Odd Helge

    OBJECTIVES: A novel automated 3-dimensional (3D) sonographic method has been developed for measuring gastric volumes. This study aimed to validate and assess the reliability of this novel 3D sonographic method compared to the reference standard in 3D gastric sonography: freehand magneto-based 3D …

  17. Validity of a structured method of selecting abstracts for a plastic surgical scientific meeting

    NARCIS (Netherlands)

    van der Steen, LPE; Hage, JJ; Kon, M; Monstrey, SJ

    In 1999, the European Association of Plastic Surgeons accepted a structured method to assess and select the abstracts that are submitted for its yearly scientific meeting. The two criteria used to evaluate whether such a selection method is accurate were reliability and validity. The authors …

  18. Application of EU guidelines for the validation of screening methods for veterinary drugs

    NARCIS (Netherlands)

    Stolker, A.A.M.

    2012-01-01

    Commission Decision (CD) 2002/657/EC describes detailed rules for method validation within the framework of residue monitoring programmes. The approach described in this CD is based on criteria. For (qualitative) screening methods, the most important criterion is that the CCβ has to be below any …

  19. Validity of the remote food photography method against doubly labeled water among minority preschoolers

    Science.gov (United States)

    The aim of this study was to determine the validity of energy intake (EI) estimations made using the remote food photography method (RFPM) compared to the doubly labeled water (DLW) method in minority preschool children in a free-living environment. Seven days of food intake and spot urine samples...

  20. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method, called "Patient Recursive Survival Peeling", is a rule-induction method that makes use of specific peeling criteria such as the hazard ratio or log-rank statistic. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e., decision box) and survival estimation. This alternative technique, called "combined" cross-validation, combines test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.

  1. Validation parameters of instrumental method for determination of total bacterial count in milk

    Directory of Open Access Journals (Sweden)

    Nataša Mikulec

    2004-10-01

    Flow cytometry, as a rapid, instrumental, routine microbiological method, is used for determination of the total bacterial count in milk. The results of flow cytometry are expressed as an individual bacterial cell count. Problems regarding the interpretation of the results of total bacterial count can be avoided by transformation of the results of the flow cytometry method onto the scale of the reference method (HRN ISO 6610:2001). The method of flow cytometry, like any analytical method, requires validation and verification according to the HRN EN ISO/IEC 17025:2000 standard. This paper describes the validation parameters: accuracy, precision, specificity, range, robustness, and measurement uncertainty for the flow cytometry method.
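The transformation onto the reference-method scale mentioned in this record is commonly a log-log regression against paired reference results. A minimal sketch with made-up calibration data (the pairings, values, and model form are illustrative assumptions, not taken from the study):

```python
import math

# Hypothetical paired calibration data on split milk samples:
# flow-cytometry individual bacterial counts vs. reference plate counts.
ibc = [25_000, 80_000, 310_000, 1_200_000, 4_800_000]   # cells/mL
cfu = [9_000, 30_000, 120_000, 500_000, 2_100_000]      # CFU/mL

# Fit log10(CFU) = a + b * log10(IBC) by ordinary least squares.
xs = [math.log10(v) for v in ibc]
ys = [math.log10(v) for v in cfu]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

def to_reference_scale(ibc_count):
    """Express a flow-cytometry count on the reference-method (CFU) scale."""
    return 10 ** (a + b * math.log10(ibc_count))
```

The fitted conversion then maps any instrument reading onto the scale on which regulatory limits are expressed.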

  2. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    Science.gov (United States)

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  3. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    Science.gov (United States)

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  4. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg

    We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers, as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicate in a flat-bottom 96-well microtiter plate. … In conclusion, we have set up a simple solution for the continuous validation of automated liquid handlers used for accredited work. The method is cheap, simple, and easy to use for aqueous solutions, but requires a spectrophotometer that can read microtiter plates.
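The acceptance logic behind such a dilution series can be sketched as a linearity check: absorbance should track the expected two-fold dilution pattern. All readings and the acceptance threshold below are invented for illustration:

```python
# Hypothetical absorbance readings for an 8-step, 2-fold dilution series.
concs = [100 / 2 ** i for i in range(8)]     # relative concentration
reads = [1.60, 0.81, 0.40, 0.199, 0.101, 0.050, 0.0251, 0.0124]

# Ordinary least-squares line and coefficient of determination (R^2).
n = len(concs)
mx = sum(concs) / n
my = sum(reads) / n
sxx = sum((x - mx) ** 2 for x in concs)
sxy = sum((x - mx) * (y - my) for x, y in zip(concs, reads))
slope = sxy / sxx
intercept = my - slope * mx
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(concs, reads))
ss_tot = sum((y - my) ** 2 for y in reads)
r2 = 1 - ss_res / ss_tot

dilution_ok = r2 >= 0.999   # example acceptance threshold for pipetting accuracy
```

A poorly calibrated pipette channel shows up as curvature or scatter in the series, pulling R^2 below the threshold.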

  5. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches towards pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. The theoretical background is also described for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III give practical, worked examples of how to use the Horwitz approach and formulae for estimating the target standard deviation for acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
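The Horwitz approach referred to here predicts a target reproducibility RSD from the analyte's mass fraction alone; a common within-laboratory target is two-thirds of that value. A minimal sketch (the 25% concentration is an arbitrary example, not from the record):

```python
import math

def horwitz_rsd(mass_fraction):
    """Horwitz predicted reproducibility RSD (%) for a mass fraction c (g/g):
    RSD_R = 2 ** (1 - 0.5 * log10(c))."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

# Example: a formulation declared at 25% (w/w) active ingredient.
c = 0.25
target_rsd_R = horwitz_rsd(c)           # between-laboratory target
target_rsd_r = (2 / 3) * target_rsd_R   # common within-lab (repeatability) target
```

The formula's key property is that the tolerated relative spread grows as the analyte concentration falls, which is why trace-level methods are judged against looser targets than formulation assays.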

  6. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment.
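A toy illustration of the likelihood-ratio decision this record describes. The Gaussian hypotheses, costs, and priors are all invented for the sketch; the paper's actual metric is derived from its own risk function:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(diff, sigma=1.0, bias=2.0):
    """LR of H0 (model adequate: prediction error ~ N(0, sigma)) against
    H1 (model biased: prediction error ~ N(bias, sigma))."""
    return normal_pdf(diff, 0.0, sigma) / normal_pdf(diff, bias, sigma)

# Bayes decision threshold from priors and decision costs (illustrative):
# accept the model when LR exceeds (cost of wrong acceptance * P(H1))
# divided by (cost of wrong rejection * P(H0)).
p_h0, p_h1 = 0.5, 0.5
cost_wrong_accept, cost_wrong_reject = 5.0, 1.0
threshold = (cost_wrong_accept * p_h1) / (cost_wrong_reject * p_h0)

accept_model = likelihood_ratio(0.1) > threshold
```

Raising the cost of wrongly accepting a bad model raises the threshold, which is the mechanism by which decision costs shape the validation outcome.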

  7. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years), were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was assessed using the intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  8. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations

    OpenAIRE

    Jihan M Badr

    2013-01-01

    Background: Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods were reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. Materials and Methods: In the present work, a simple and sensitive high performance thin layer chromatographic method is developed for determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference on Ha...

  9. Performance Validity Testing in Neuropsychology: Methods for Measurement Development and Maximizing Diagnostic Accuracy.

    Science.gov (United States)

    Wodushek, Thomas R; Greher, Michael R

    2017-05-01

    In the first column in this 2-part series, Performance Validity Testing in Neuropsychology: Scientific Basis and Clinical Application-A Brief Review, the authors introduced performance validity tests (PVTs) and their function, provided a justification for why they are necessary, traced their ongoing endorsement by neuropsychological organizations, and described how they are used and interpreted by ever increasing numbers of clinical neuropsychologists. To enhance readers' understanding of these measures, this second column briefly describes common detection strategies used in PVTs as well as the typical methods used to validate new PVTs and determine cut scores for valid/invalid determinations. We provide a discussion of the latest research demonstrating how neuropsychologists can combine multiple PVTs in a single battery to improve sensitivity/specificity to invalid responding. Finally, we discuss future directions for the research and application of PVTs.

  10. Method validation for preparing serum and plasma samples from human blood for downstream proteomic, metabolomic, and circulating nucleic acid-based applications.

    Science.gov (United States)

    Ammerlaan, Wim; Trezzi, Jean-Pierre; Lescuyer, Pierre; Mathay, Conny; Hiller, Karsten; Betsou, Fay

    2014-08-01

    Formal method validation for biospecimen processing in the context of accreditation in laboratories and biobanks is lacking. Serum and plasma processing protocols were validated for fitness-for-purpose in terms of key downstream endpoints, and this article demonstrates methodology for biospecimen processing method validation. Serum and plasma preparation from human blood was optimized for centrifugation conditions with respect to microparticle counts. Optimal protocols were validated for methodology and reproducibility in terms of acceptance criteria based on microparticle counts, DNA and hemoglobin concentration, and metabolomic and proteomic profiles. These parameters were also used to evaluate robustness for centrifugation temperature (4°C versus room temperature [RT]), deceleration (low, medium, high) and blood stability (after a 2-hour delay). Optimal protocols were 10-min centrifugation for serum and 20-min for plasma at 2000 g, medium brake, RT. Methodology and reproducibility acceptance criteria were met for both protocols except for reproducibility of plasma metabolomics. Overall, neither protocol was robust for centrifugation at 4°C versus RT. RT gave higher microparticles and free DNA yields in serum, and fewer microparticles with less hemolysis in plasma. Overall, both protocols were robust for fast, medium, and low deceleration, with a medium brake considered optimal. Pre-centrifugation stability after a 2-hour delay was seen at both temperatures for hemoglobin concentration and proteomics, but not for microparticle counts. We validated serum and plasma collection methods suitable for downstream protein, metabolite, or free nucleic acid-based applications. Temperature and pre-centrifugation delay can influence analytic results, and laboratories and biobanks should systematically record these conditions in the scope of accreditation.

  11. Validation of Experimental whole-body SAR Assessment Method in a Complex Indoor Environment

    DEFF Research Database (Denmark)

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter

    2012-01-01

    Assessing the whole-body specific absorption rate (SARwb) experimentally in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the line-of-sight as a specular path) to assess the whole-body SAR is validated by numerical simulation. An advantage of the proposed method is that it avoids a heavy computational burden because it does not use any discretization. Results show good agreement between measurement and computation at 2.8 GHz, as long as the plane-wave assumption is valid, i.e., for large distances from the transmitter. Relative deviations …

  12. Validation of a chromatographic method to quantify nationally produced Mephenesine tablets

    International Nuclear Information System (INIS)

    Suarez Perez, Yania; Izquierdo Castro, Adalberto; Milian Sanchez, Jana Daria

    2009-01-01

    The authors validated an analytical method based on high performance liquid chromatography (HPLC) for the quantification of Mephenesine in recently reformulated 500 mg tablets. With regard to its application in quality control, validation included the following parameters: linearity, accuracy, precision, and selectivity. Results were satisfactory within the 50-150% range. For its use in subsequent studies of chemical stability, selectivity with respect to stability and sensitivity were also assessed. The estimated detection and quantification limits were appropriate, and the method was selective against the possible degradation products. (Author)

  13. Validation of methods for the detection and quantification of engineered nanoparticles in food

    DEFF Research Database (Denmark)

    Linsinger, T.P.J.; Chaudhry, Q.; Dehalu, V.

    2013-01-01

    This paper proposes an approach for the validation of methods for detection and quantification of nanoparticles in food samples. It proposes validation of identity, selectivity, precision, working range, limit of detection and robustness, bearing in mind that each “result” must include information about the chemical identity and whether the methods apply equally well to particles of different suppliers. In trueness testing, information on whether the particle size distribution has changed during analysis is required. Results are largely expected to follow normal distributions due to the expected high number of particles.

  14. Quasi-experimental Methods in Empirical Regional Science and Policy Analysis – Is there a Scope for Application?

    DEFF Research Database (Denmark)

    Mitze, Timo; Paloyo, Alfredo R.; Alecke, Björn

    Applied econometrics has recently emphasized the identification of causal parameters for policy analysis. This revolution has yet to fully propagate to the field of regional science. We examine the scope for application of the matching approach – part of the modern applied econometrics toolkit – in regional science and highlight special features of regional data that make such an application difficult. In particular, our analysis of the effect of regional subsidies on labor-productivity growth in Germany indicates that such policies are effective, but only up to a certain maximum treatment intensity; these results have to be interpreted with some caution. The matching approach nevertheless can be of great value for regional policy analysis and should be the subject of future research efforts in the field of empirical regional science.

  15. Chemical analysis of solid residue from liquid and solid fuel combustion: Method development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Trkmic, M. [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Zagreb (Croatia); Curkovic, L. [University of Zagreb, Faculty of Chemical Engineering and Technology, Zagreb (Croatia); Asperger, D. [HEP-Proizvodnja, Thermal Power Plant Department, Zagreb (Croatia)

    2012-06-15

    This paper deals with the development and validation of methods for identifying the composition of solid residue after liquid and solid fuel combustion in thermal power plant furnaces. The methods were developed for energy dispersive X-ray fluorescence (EDXRF) spectrometer analysis. Because of the different fuels used, the differing composition, and the location where the solid residue forms, it was necessary to develop two methods. The first method is used for identifying solid residue composition after fuel oil combustion (Method 1), while the second method is used for identifying solid residue composition after the combustion of solid fuels, i.e., coal (Method 2). Method calibration was performed on sets of 12 (Method 1) and 6 (Method 2) certified reference materials (CRM). CRMs and analysis test samples were prepared in pellet form using a hydraulic press. For the purpose of method validation, the linearity, accuracy, precision, and specificity were determined, and the measurement uncertainty of the methods for each analyte separately was assessed. The methods were applied in the analysis of real furnace residue samples. (Copyright 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  16. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and no systematic bias was observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.
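The Bland-Altman part of such an analysis reduces to a mean difference (bias) and 95% limits of agreement over paired measurements. A sketch with invented viability percentages (not the study's data):

```python
import statistics as st

# Hypothetical paired flap-viability percentages from two methods.
method_a = [62.0, 55.5, 70.1, 48.9, 66.3, 59.7]
method_b = [61.2, 56.4, 69.0, 49.5, 65.1, 60.3]

diffs = [a - b for a, b in zip(method_a, method_b)]
bias = st.mean(diffs)          # systematic difference between the two methods
sd_diff = st.stdev(diffs)
limits = (bias - 1.96 * sd_diff, bias + 1.96 * sd_diff)  # 95% limits of agreement
```

A bias near zero with narrow limits of agreement is what "no systematic bias" means in the record's conclusion.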

  17. Development and validation of stability indicating UPLC assay method for ziprasidone active pharma ingredient

    Directory of Open Access Journals (Sweden)

    Sonam Mittal

    2012-01-01

    Background: Ziprasidone, a novel antipsychotic, exhibits potent, highly selective antagonistic activity at D2 and 5HT2A receptors. A literature survey for ziprasidone revealed several analytical methods based on different techniques, but no UPLC method has been reported so far. Aim: The aim of this paper is to present a simple and rapid stability-indicating isocratic ultra performance liquid chromatographic (UPLC) method that was developed and validated for the determination of ziprasidone active pharmaceutical ingredient. Forced degradation of ziprasidone was studied under acid, base, oxidative hydrolysis, thermal stress, and photo stress conditions. Materials and Methods: The quantitative determination of ziprasidone was performed on a Supelco analytical column (100×2.1 mm i.d., 2.7 µm) with 10 mM ammonium acetate buffer (pH 6.7) and acetonitrile (ACN) as mobile phase in the ratio 55:45 (buffer:ACN) at a flow rate of 0.35 mL/min. For the UPLC method, UV detection was made at 318 nm and the run time was 3 min. The developed UPLC method was validated as per ICH guidelines. Results and Conclusion: Mild degradation of the drug substance was observed during oxidative hydrolysis and considerable degradation was observed during basic hydrolysis. During method validation, parameters such as precision, linearity, ruggedness, stability, robustness, and specificity were evaluated and remained within acceptable limits. The developed UPLC method was successfully applied for evaluating the assay of ziprasidone active pharmaceutical ingredient.

  18. Validated UV-Spectrophotometric Methods for Determination of Gemifloxacin Mesylate in Pharmaceutical Tablet Dosage Forms

    Directory of Open Access Journals (Sweden)

    R. Rote Ambadas

    2010-01-01

    Two simple, economical, and accurate UV-spectrophotometric methods have been developed for the determination of gemifloxacin mesylate in a pharmaceutical tablet formulation. The first UV-spectrophotometric method depends upon the measurement of absorbance at the wavelength 263.8 nm. In the second, area-under-curve method, the wavelength range selected for detection was 268.5-258.5 nm. Beer’s law was obeyed in the range of 2 to 12 μg/mL for both methods. The proposed methods were validated statistically and applied successfully to the determination of gemifloxacin mesylate in a pharmaceutical formulation.
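The area-under-curve response in the second method is a numerical integration of the spectrum over the 258.5-268.5 nm window. A sketch using the trapezoidal rule with invented spectrum points (values are illustrative, not measured data):

```python
# Hypothetical absorbance readings across the 258.5-268.5 nm window.
wavelengths = [258.5, 261.0, 263.8, 266.0, 268.5]   # nm
absorbance = [0.210, 0.248, 0.262, 0.240, 0.205]    # illustrative values

# Trapezoidal area under the curve: the response used for calibration.
auc = sum(
    (absorbance[i] + absorbance[i + 1]) / 2 * (wavelengths[i + 1] - wavelengths[i])
    for i in range(len(wavelengths) - 1)
)
```

Integrating over a window rather than reading a single wavelength averages out small wavelength-setting errors, which is the usual motivation for the area-under-curve variant.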

  19. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms, each containing eight items from …, were administered by paper questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC), and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity …

  20. Method validation for uranium content analysis using a potentiometer T-90

    International Nuclear Information System (INIS)

    Torowati; Ngatijo; Rahmiati

    2016-01-01

    A method validation experiment has been conducted for uranium content analysis using a Potentiometer T-90. The experiment was performed in the quality control laboratory of the Experimental Fuel Element Installation, PTBBN - BATAN. The objective is to determine the precision and accuracy of analytical results for uranium analysis according to the latest American Standard Test Method (ASTM), ASTM C1267-11, which modifies the reference method by reducing reagent consumption by 10% of the amount used by the original method. ASTM C1267-11 is a new standard that replaces the older ASTM C799, Vol. 12.01, 2003. It is, therefore, necessary to validate the renewed method. The tool used for the analysis of uranium was a Potentiometer T-90, and the material used was standard uranium oxide powder CRM (Certified Reference Material). Validation of the method was done by analyzing the standard uranium powder with seven weighings and seven analyses. The analysis results were used to determine the accuracy, precision, Relative Standard Deviation (RSD), Horwitz coefficient of variation, and the limits of detection and quantitation. The average uranium content obtained in this method validation is 84.36%, with a Standard Deviation (SD) of 0.12%, a Relative Standard Deviation (RSD) of 0.14%, and 2/3 of the Horwitz coefficient of variation (CV Horwitz) of 2.05%. The results show that the RSD value is smaller than the (2/3) CV Horwitz value, which means that this method has high precision. The accuracy value obtained is 0.48%, and since the acceptance limit for a high level of accuracy is an accuracy value <2.00%, this method is regarded as having a high degree of accuracy [1]. The limit of detection (LOD) and the limit of quantitation (LOQ) are 0.0145 g/L and 0.0446 g/L, respectively. It is concluded that the ASTM C1267-11 reference method is valid for use. (author)
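The precision check in this record, RSD against the Horwitz criterion, can be reproduced from the reported mean and SD. The Horwitz function used below is the standard formula; whether the paper computes its 2/3 CV Horwitz in exactly this way is an assumption:

```python
import math

mean_u = 84.36   # reported mean uranium content (%)
sd = 0.12        # reported standard deviation (%)

rsd = 100 * sd / mean_u   # relative standard deviation, ~0.14% as reported

# Standard Horwitz prediction for a mass fraction c (g/g).
c = mean_u / 100
cv_horwitz = 2 ** (1 - 0.5 * math.log10(c))

precision_ok = rsd < (2 / 3) * cv_horwitz   # the record's acceptance criterion
```

With the reported numbers the RSD sits far below the Horwitz-derived limit, which is the basis for the record's "high precision" conclusion.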

  1. Methods and procedures for the verification and validation of artificial neural networks

    CERN Document Server

    Taylor, Brian J

    2006-01-01

    Neural networks are members of a class of software that have the potential to enable intelligent computational systems capable of simulating characteristics of biological thinking and learning. This volume introduces some of the methods and techniques used for the verification and validation of neural networks and adaptive systems.

  2. Review of seismic tests for qualification of components and validation of methods

    International Nuclear Information System (INIS)

    Buland, P.; Gantenbein, F.; Gibert, R.J.; Hoffmann, A.; Queval, J.C.

    1988-01-01

Seismic tests have been performed at CEA-DEMT for many years in order to demonstrate the qualification of components and to provide experimental validation of the calculation methods used for the seismic design of components. The paper presents examples of these two types of tests, a description of the existing facilities, and details about the new facility TAMARIS under construction. (author)

  3. A validation framework for microbial forensic methods based on statistical pattern recognition

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S P

    2007-11-12

    This report discusses a general approach to validating microbial forensic methods that attempt to simultaneously distinguish among many hypotheses concerning the manufacture of a questioned biological agent sample. It focuses on the concrete example of determining growth medium from chemical or molecular properties of a bacterial agent to illustrate the concepts involved.

  4. Validation of the DLW method in Japanese quail at different water fluxes using laser and IRMS

    NARCIS (Netherlands)

    van Trigt, R; Kerstel, E.R.T.; Neubert, R.E.M.; Meijer, H.A.J.; Mclean, M.; Visser, G.H.

    2002-01-01

In Japanese quail (Coturnix c. japonica; n = 9), the doubly labeled water (DLW) method (H-2, O-18) for estimating CO2 production (l/day) was validated. To evaluate its sensitivity to water efflux levels (r(H2Oe); g/day) and to assumptions of fractional evaporative water loss (x; dimensionless),

  5. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    NARCIS (Netherlands)

    Ramos, Daniel; Haraksim, Rudolf; Meuwly, Didier

    2017-01-01

The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. These data present a

  6. Validation of the ultraviolet spectrophotometry method for the quality control of ciprofloxacin chlorhydrate in Ciprecu tablets

    International Nuclear Information System (INIS)

    Perez Navarro, Maikel; Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania

    2014-01-01

Quinolones are a group of antimicrobials of high clinical significance. Ciprofloxacin hydrochloride monohydrate is a second-generation antibacterial fluoroquinolone for the treatment of several infections and is marketed as eye drops, injections, capsules and tablets. The objective was to develop and validate an ultraviolet spectrophotometric analytical method to be used in the quality control of ciprofloxacin hydrochloride monohydrate in newly manufactured Ciprecu tablets.

  7. Measurement and data analysis methods for field-scale wind erosion studies and model validation

    NARCIS (Netherlands)

    Zobeck, T.M.; Sterk, G.; Funk, R.F.; Rajot, J.L.; Stout, J.E.; Scott Van Pelt, R.

    2003-01-01

    Accurate and reliable methods of measuring windblown sediment are needed to confirm, validate, and improve erosion models, assess the intensity of aeolian processes and related damage, determine the source of pollutants, and for other applications. This paper outlines important principles to

  8. Validity of a Simulation Game as a Method for History Teaching

    Science.gov (United States)

    Corbeil, Pierre; Laveault, Dany

    2011-01-01

    The aim of this research is, first, to determine the validity of a simulation game as a method of teaching and an instrument for the development of reasoning and, second, to study the relationship between learning and students' behavior toward games. The participants were college students in a History of International Relations course, with two…

  9. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation

    NARCIS (Netherlands)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-01-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes’ inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of
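The central quantity of the guideline above, the likelihood ratio, can be sketched under a purely illustrative assumption of Gaussian score distributions for same-source and different-source comparisons; real forensic LR methods use calibrated, validated models, and all parameter values below are hypothetical:

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_ratio(score, mu_same, sd_same, mu_diff, sd_diff):
    """LR = p(score | same source) / p(score | different source)."""
    return normal_pdf(score, mu_same, sd_same) / normal_pdf(score, mu_diff, sd_diff)

# Hypothetical score models: same-source comparisons tend to score high
lr = likelihood_ratio(0.85, mu_same=0.9, sd_same=0.1, mu_diff=0.3, sd_diff=0.15)
```

An LR above 1 supports the same-source proposition; below 1, the different-source proposition.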

  10. Validation of the actuator line method using near wake measurements of the MEXICO rotor

    DEFF Research Database (Denmark)

    Nilsson, Karl; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2015-01-01

    The purpose of the present work is to validate the capability of the actuator line method to compute vortex structures in the near wake behind the MEXICO experimental wind turbine rotor. In the MEXICO project/MexNext Annex, particle image velocimetry measurements have made it possible to determine...

  11. Validity Argument for Assessing L2 Pragmatics in Interaction Using Mixed Methods

    Science.gov (United States)

    Youn, Soo Jung

    2015-01-01

    This study investigates the validity of assessing L2 pragmatics in interaction using mixed methods, focusing on the evaluation inference. Open role-plays that are meaningful and relevant to the stakeholders in an English for Academic Purposes context were developed for classroom assessment. For meaningful score interpretations and accurate…

  12. Review of seismic tests for qualification of components and validation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Buland, P; Gantenbein, F; Gibert, R J; Hoffmann, A; Queval, J C [CEA-CEN SACLAY-DEMT, Gif sur Yvette-Cedex (France)

    1988-07-01

Seismic tests have been performed at CEA-DEMT for many years in order to demonstrate the qualification of components and to provide experimental validation of the calculation methods used for the seismic design of components. The paper presents examples of these two types of tests, a description of the existing facilities, and details about the new facility TAMARIS under construction. (author)

  13. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was documented twice at less than 50%. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was indicated by kappa values (0.53-0.86). With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement appeared to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators be taken into consideration in evaluating adolescent skeletal maturation.
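The weighted kappa statistic used above can be computed directly from two raters' ordinal stagings. A minimal NumPy sketch (illustrative, not the study's code) supporting linear or quadratic disagreement weights:

```python
import numpy as np

def weighted_kappa(rater1, rater2, n_categories, weights="linear"):
    """Cohen's weighted kappa for two raters' ordinal ratings (0-indexed).
    Requires that, jointly, more than one category is used."""
    O = np.zeros((n_categories, n_categories))
    for a, b in zip(rater1, rater2):
        O[a, b] += 1
    O /= O.sum()                       # observed joint proportions
    i, j = np.indices((n_categories, n_categories))
    if weights == "linear":
        W = np.abs(i - j) / (n_categories - 1)
    else:                              # quadratic weights
        W = ((i - j) / (n_categories - 1)) ** 2
    E = np.outer(O.sum(axis=1), O.sum(axis=0))  # chance-expected proportions
    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields kappa = 1; agreement no better than chance yields kappa near 0.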

  14. Alternative method to validate the seasonal land cover regions of the conterminous United States

    Science.gov (United States)

    Zhiliang Zhu; Donald O. Ohlen; Raymond L. Czaplewski; Robert E. Burgan

    1996-01-01

    An accuracy assessment method involving double sampling and the multivariate composite estimator has been used to validate the prototype seasonal land cover characteristics database of the conterminous United States. The database consists of 159 land cover classes, classified using time series of 1990 1-km satellite data and augmented with ancillary data including...

  15. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

Various computer codes employed at Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With the growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey was performed of software testing methods, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method were selected as functional methods especially suitable for testing reactor analysis codes. A separate study of software analysis and specification methods and techniques was performed with the objective of establishing appropriate pre-test software specification methods. Two methods were selected as the most suitable for this purpose: data flow diagrams proved particularly valuable for functional/procedural software specification, while entity-relationship diagrams proved efficient for specifying the software data/information domain. The feasibility of these two methods was analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
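The two functional techniques the survey singles out, equivalence class partitioning and boundary value analysis, can be illustrated on a toy specification. The function and its acceptance range below are hypothetical, chosen only to show how test cases are derived from each technique:

```python
def acceptable_enrichment(pct):
    """Toy specification (hypothetical): accept a fuel enrichment
    between 0.7% and 5.0%, inclusive."""
    return 0.7 <= pct <= 5.0

# Equivalence class partitioning: one representative value per input class
assert not acceptable_enrichment(0.3)   # below-range class
assert acceptable_enrichment(3.0)       # in-range class
assert not acceptable_enrichment(8.0)   # above-range class

# Boundary value analysis: probe each edge of the range and its neighbours
for pct, expected in [(0.69, False), (0.7, True), (5.0, True), (5.01, False)]:
    assert acceptable_enrichment(pct) is expected
```

Both techniques derive tests from the specification alone, without inspecting the implementation, which is what makes them 'black box' methods.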

  16. A systematic review of validated methods to capture acute bronchospasm using administrative or claims data.

    Science.gov (United States)

    Sharifi, Mona; Krishanswami, Shanthi; McPheeters, Melissa L

    2013-12-30

To identify and assess billing, procedural, or diagnosis code, or pharmacy claim-based algorithms used to identify acute bronchospasm in administrative and claims databases. We searched the MEDLINE database from 1991 to September 2012 using controlled vocabulary and key terms related to bronchospasm, wheeze and acute asthma. We also searched the reference lists of included studies. Two investigators independently assessed the full text of studies against pre-determined inclusion criteria. Two reviewers independently extracted data regarding participant and algorithm characteristics. Our searches identified 677 citations, of which 38 met our inclusion criteria. In these 38 studies, the most commonly used ICD-9 code was 493.x. Only 3 studies reported any validation methods for the identification of bronchospasm, wheeze or acute asthma in administrative and claims databases; all were among pediatric populations and only 2 offered any validation statistics. Some of the outcome definitions utilized were heterogeneous and included other disease-based diagnoses, such as bronchiolitis and pneumonia, which are typically of an infectious etiology. One study offered the validation of algorithms utilizing Emergency Department triage chief-complaint codes to diagnose acute asthma exacerbations, with ICD-9 786.07 (wheezing) revealing the highest sensitivity (56%), specificity (97%), PPV (93.5%) and NPV (76%). There is a paucity of studies reporting rigorous methods to validate algorithms for the identification of bronchospasm in administrative data. The scant validated data available are limited in their generalizability to broad-based populations. Copyright © 2013 Elsevier Ltd. All rights reserved.
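The validation statistics reported in the one study above (sensitivity, specificity, PPV, NPV) follow from a standard 2x2 table comparing the algorithm's result against a reference standard. A minimal sketch with hypothetical counts:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table
    (algorithm result vs. a chart-review reference standard)."""
    return {
        "sensitivity": tp / (tp + fn),   # cases the algorithm catches
        "specificity": tn / (tn + fp),   # non-cases correctly excluded
        "ppv": tp / (tp + fp),           # flagged records that are true cases
        "npv": tn / (tn + fn),           # unflagged records that are true non-cases
    }

# Hypothetical counts from validating a claims-based case definition
stats = diagnostic_stats(tp=50, fp=5, fn=10, tn=100)
```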

  17. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

International agreement concerning validation guidelines is important for obtaining quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCh) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright © Bentham Science Publishers.

  18. MR-based water content estimation in cartilage: design and validation of a method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sophie; Ringgaard, Steffen

Purpose: Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Methods and Materials: Cartilage tissue T1-map-based water content MR sequences were used on a system stabilized at 37 °C. The T1 map intensity signal was analyzed on 6...... cartilage samples from living animals (pig) and on 8 gelatin samples whose water content was already known. For the data analysis, a T1 intensity signal map software analyzer was used. Finally, the method was validated by measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained...... map-based water content sequences can provide information that, after being analyzed using T1-map analysis software, can be interpreted as the water contained inside cartilage tissue. The amount of water estimated using this method was similar to the one obtained by the freeze-drying procedure...

  19. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered to be a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated as per ICH guidelines for various parameters such as precision, linearity, accuracy, solution stability, robustness, limit of detection, and limit of quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied to the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use the developed method in the analysis of menthol in pharmaceutical dosage formulations and raw materials.

  20. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    Science.gov (United States)

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method, and the method is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  1. Low-cost extrapolation method for maximal lte radio base station exposure estimation: Test and validation

    International Nuclear Information System (INIS)

    Verloock, L.; Joseph, W.; Gati, A.; Varsier, N.; Flach, B.; Wiart, J.; Martens, L.

    2013-01-01

An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method, and the method is applicable in situ. It requires only a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders. (authors)

  2. Validation of three new methods for determination of metal emissions using a modified Environmental Protection Agency Method 301

    Energy Technology Data Exchange (ETDEWEB)

    Catherine A. Yanca; Douglas C. Barth; Krag A. Petterson; Michael P. Nakanishi; John A. Cooper; Bruce E. Johnsen; Richard H. Lambert; Daniel G. Bivins [Cooper Environmental Services, LLC, Portland, OR (United States)

    2006-12-15

    Three new methods applicable to the determination of hazardous metal concentrations in stationary source emissions were developed and evaluated for use in U.S. Environmental Protection Agency (EPA) compliance applications. Two of the three independent methods, a continuous emissions monitor-based method (Xact) and an X-ray-based filter method (XFM), are used to measure metal emissions. The third method involves a quantitative aerosol generator (QAG), which produces a reference aerosol used to evaluate the measurement methods. A modification of EPA Method 301 was used to validate the three methods for As, Cd, Cr, Pb, and Hg, representing three hazardous waste combustor Maximum Achievable Control Technology (MACT) metal categories (low volatile, semivolatile, and volatile). The measurement methods were evaluated at a hazardous waste combustor (HWC) by comparing measured with reference aerosol concentrations. The QAG, Xact, and XFM met the modified Method 301 validation criteria. All three of the methods demonstrated precisions and accuracies on the order of 5%. The measurement methods should be applicable to emissions from a wide range of sources, and the reference aerosol generator should be applicable to additional analytes. EPA recently approved an alternative monitoring petition for an HWC at Eli Lilly's Tippecanoe site in Lafayette, IN, in which the Xact is used for demonstrating compliance with the HWC MACT metal emissions (low volatile, semivolatile, and volatile). The QAG reference aerosol generator was approved as a method for providing a quantitative reference aerosol, which is required for certification and continuing quality assurance of the Xact. 30 refs., 5 figs., 11 tabs.

  3. Validation of the quality control methods for active ingredients of Fungirex cream

    International Nuclear Information System (INIS)

    Perez Navarro, Maikel; Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania

    2014-01-01

Fungirex cream is a two-drug product containing undecylenic acid and zinc undecylenate in a suitable base. Since this product is not documented in the official monographs of the pharmacopoeias, simple analytical methods were suggested for quantitation of the analytes of interest in the cream, which are useful for the release of newly prepared cream batches. The objective was to validate two volumetric methods for the quality control of the active ingredients in Fungirex cream.

  4. A Newly Improved Modified Method Development and Validation of Bromofenac Sodium Sesquihydrate in Bulk Drug Manufacturing

    OpenAIRE

    Sunil Kumar Yelamanchi V; Useni Reddy Mallu; I. V Kasi Viswanath; D. Balasubramanyam; G. Narshima Murthy

    2016-01-01

The main objective of this study was to develop a simple, efficient, specific, precise, and accurate improved reverse-phase high-performance liquid chromatographic purity (related substances) method for the bromofenac sodium sesquihydrate active pharmaceutical ingredient. Validation of an analytical method is the confirmation by examination, and the provision of objective evidence, that the particular requirements for a specific intended use are fulfilled as per ICH, USP...

  5. Validation of the quality control method for sodium dicloxacillin in Dicloxen capsules

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Perez Navarro, Maikel; Suarez Perez, Yania

    2014-01-01

Sodium dicloxacillin is a semisynthetic derivative of the isoxazolyl penicillin group that may appear in oral suspension form and in caplets. For the analysis of the raw materials and the finished products, high-performance liquid chromatography is recommended, but this method is unavailable at the Dicloxen capsule manufacturing laboratory for the routine analysis of the drug. The objective was to develop and validate a useful ultraviolet spectrophotometric method for the quality control of sodium dicloxacillin in Dicloxen capsules.

  6. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1314 and Method 1315

    Science.gov (United States)

    This report summarizes the results of an interlaboratory study conducted to generate precision estimates for two leaching methods under review by the U.S. EPA’s OSWER for inclusion into the EPA’s SW-846: Method 1314: Liquid-Solid Partitioning as a Function of Liquid...

  7. Validation of methods for the determination of radium in waters and soil

    International Nuclear Information System (INIS)

    Decaillon, J.-G.; Bickel, M.; Hill, C.; Altzitzoglou, T.

    2004-01-01

    This article describes the advantages and disadvantages of several analytical methods used to prepare the alpha-particle source. As a result of this study, a new method combining commercial extraction and ion chromatography prior to a final co-precipitation step is proposed. This method has been applied and validated on several matrices (soil, waters) in the framework of international intercomparisons. The integration of this method in a global procedure to analyze actinoids and radium from a single solution (or digested soil) is also described

  8. Validation of uncertainty of weighing in the preparation of radionuclide standards by Monte Carlo Method

    International Nuclear Information System (INIS)

    Cacais, F.L.; Delgado, J.U.; Loayza, V.M.

    2016-01-01

In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure the quantity activity by mass. The gravimetric method by elimination is applied to perform weighings with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin according to the ISO GUM for the elimination method is validated by the Monte Carlo method. The results obtained by both uncertainty calculation methods were consistent, indicating that the conditions for the application of the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
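The comparison described above can be sketched for the simplest weighing model, a mass difference m = m_before − m_after: the ISO GUM propagates the two standard uncertainties analytically, while the Monte Carlo method samples the input distributions and takes the standard deviation of the results. The readings and uncertainties below are hypothetical:

```python
import math
import random

def gum_uncertainty(u_before, u_after):
    """ISO GUM propagation for m = m_before - m_after (uncorrelated inputs):
    combined standard uncertainty is the root sum of squares."""
    return math.sqrt(u_before ** 2 + u_after ** 2)

def monte_carlo_uncertainty(m_before, u_before, m_after, u_after,
                            n=200_000, seed=1):
    """Sample both readings from normal distributions and return the
    empirical standard deviation of the mass difference."""
    rng = random.Random(seed)
    deltas = [rng.gauss(m_before, u_before) - rng.gauss(m_after, u_after)
              for _ in range(n)]
    mean = sum(deltas) / n
    var = sum((d - mean) ** 2 for d in deltas) / (n - 1)
    return math.sqrt(var)

# Hypothetical balance readings (g) with 20 ug standard uncertainties
u_gum = gum_uncertainty(2e-5, 2e-5)
u_mc = monte_carlo_uncertainty(10.12345, 2e-5, 9.87654, 2e-5)
```

For this linear model the two approaches must agree; the Monte Carlo route becomes the reference check when the measurement model is nonlinear or the inputs are non-Gaussian.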

  9. Validation of ascorbic acid tablets of national production by igh-performance liquid chromatography method

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Izquierdo Castro, Idalberto

    2009-01-01

We validated an analytical method based on high-performance liquid chromatography to determine the ascorbic acid content in vitamin C tablets, designed as an alternative method for quality control and for follow-up of the chemical stability of the active principle, since the official techniques for quality control of ascorbic acid in tablets are not selective with respect to degradation products. The method was modified from that reported in USP 28 (2005) for the analysis of the injectable product. We used an RP-18 column (250 x 4.6 mm, 5 μm) with UV detection at 245 nm. Validation was necessary for both objectives, considering the parameters required for category I and II methods. The method was sufficiently linear, accurate, and precise in the range of 100-300 μg/mL. It was also selective with respect to the remaining components of the matrix and to the possible degradation products generated under stress conditions. Detection and quantification limits were estimated. Once validated, the method was applied to ascorbic acid quantification in two batches of expired tablets, and a marked influence of the container on degradation of the active principle was detected after 12 months at room temperature. (Author)
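Detection and quantification limits of the kind estimated above are commonly derived from the calibration line (the ICH Q2 approach: LOD = 3.3σ/S and LOQ = 10σ/S, with S the slope and σ the residual standard deviation). A sketch with hypothetical calibration data, not the paper's values:

```python
import numpy as np

def lod_loq(conc, response):
    """ICH Q2-style estimates from a linear calibration:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where S is the slope and
    sigma the residual standard deviation of the regression."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration (concentration in ug/mL vs. absorbance)
lod, loq = lod_loq([100, 150, 200, 250, 300],
                   [0.21, 0.33, 0.42, 0.54, 0.63])
```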

  10. High-performance liquid chromatography method validation for determination of tetracycline residues in poultry meat

    Directory of Open Access Journals (Sweden)

    Vikas Gupta

    2014-01-01

Background: In this study, a method for the determination of tetracycline (TC) residues in poultry with the help of a high-performance liquid chromatography technique was validated. Materials and Methods: The principal step involved ultrasonic-assisted extraction of TCs from poultry samples with 2 ml of 20% trichloroacetic acid and phosphate buffer (pH 4), which gave a clearer supernatant and high recovery, followed by centrifugation and purification using a 0.22 μm filter. Results: The validity study of the method revealed that all obtained calibration curves showed good linearity (r² > 0.999) over the range of 40-4500 ng. Sensitivity was found to be 1.54 and 1.80 ng for oxytetracycline (OTC) and TC. Accuracy was in the range of 87.94-96.20% and 72.40-79.84% for meat. Precision was lower than 10% in all cases, indicating that the method can be used as a validated method. The limit of detection was found to be 4.8 and 5.10 ng for OTC and TC, respectively; the corresponding values of the limit of quantitation were 11 and 12 ng. Conclusion: The method reliably identifies and quantifies the selected TC and OTC in reconstituted poultry meat in the low- and sub-nanogram range and can be applied in any laboratory.

  11. Method validation for simultaneous counting of Total α , β in Drinking Water using Liquid Scintillation Counter

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.

    2014-05-01

In this work, a method for the simultaneous determination of gross alpha and beta emitters in drinking water using a liquid scintillation counter (WinSpectral 1414) with pulse shape analysis was validated. The validation parameters include the method detection limit, method quantitation limit, repeatability limit, intermediate precision, trueness (bias), recovery coefficient, linearity, and the uncertainty budget of the analysis. The results show that the method detection limit and method quantitation limit were 0.07 and 0.24 Bq/l for alpha emitters, respectively, and 0.42 and 1.4 Bq/l for beta emitters, respectively. The relative standard deviation of the repeatability limit reached 2.81% for alpha emitters and 3.96% for beta emitters. In addition, the relative standard deviation of the intermediate precision was 0.54% for alpha emitters and 1.17% for beta emitters. Moreover, the trueness was -7.7% for alpha emitters and -4.5% for beta emitters. The recovery coefficient ranged between 87-96% and 88-101% for alpha and beta emitters, respectively. Linearity reached 1 for both alpha and beta emitters. On the other hand, the uncertainty budget for all components was 96.65% and 83.14% for alpha and beta emitters, respectively. (author)
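Counting-based detection limits like those reported above are often derived from Currie's detection limit; the sketch below assumes that convention (the record does not state the exact formulation used) with hypothetical counting conditions:

```python
import math

def currie_mda_bq_per_l(background_counts, count_time_s, efficiency, volume_l):
    """Minimum detectable activity concentration (Bq/L) from the Currie
    detection limit L_D = 2.71 + 4.65*sqrt(B), with B the background counts,
    divided by counting time, counting efficiency and sample volume."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (count_time_s * efficiency * volume_l)

# Hypothetical LSC conditions: 30 background counts in 1 h,
# 90% counting efficiency, 8 mL sample aliquot
mda = currie_mda_bq_per_l(background_counts=30, count_time_s=3600,
                          efficiency=0.90, volume_l=0.008)
```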

  12. Pressurised liquid extraction of flavonoids in onions. Method development and validation

    DEFF Research Database (Denmark)

    Søltoft, Malene; Christensen, J.H.; Nielsen, J.

    2009-01-01

A rapid and reliable analytical method for the quantification of flavonoids in onions was developed and validated. Five extraction methods were tested on freeze-dried onions, and subsequently high-performance liquid chromatography (HPLC) with UV detection was used for quantification of seven flavonoids...... extraction methods. However, PLE was the preferred extraction method because it can be highly automated, uses only small amounts of solvents, provides the cleanest extracts, and allows the extraction of light- and oxygen-sensitive flavonoids to be carried out in an inert atmosphere protected from light......-step PLE method showed good selectivity, precision (RSDs = 3.1-11%) and recovery of the extractable flavonoids (98-99%). The method also appeared to be a multi-method, i.e. generally applicable to, e.g., phenolic acids in potatoes and carrots....

  13. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
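Step (vi) above, item analysis, can be sketched numerically: item difficulty as the proportion of respondents answering correctly, and discrimination as the corrected item-total correlation (the item score against the total score excluding that item). The response matrix below is hypothetical:

```python
import numpy as np

def item_analysis(responses):
    """responses: 2-D array, rows = respondents, cols = items, entries 0/1.
    Returns (difficulty, discrimination) per item, where discrimination is
    the corrected item-total point-biserial correlation."""
    R = np.asarray(responses, dtype=float)
    difficulty = R.mean(axis=0)            # proportion answering correctly
    discrimination = []
    for j in range(R.shape[1]):
        rest = R.sum(axis=1) - R[:, j]     # total score excluding item j
        discrimination.append(np.corrcoef(R[:, j], rest)[0, 1])
    return difficulty, np.array(discrimination)

# Hypothetical 0/1 answers for 6 respondents on 3 knowledge items
diff, disc = item_analysis([[0, 0, 0], [0, 0, 1], [0, 1, 1],
                            [1, 1, 1], [1, 1, 1], [1, 1, 1]])
```

Items with very high or very low difficulty, or with low discrimination, are candidates for removal during scale purification.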

  14. A method for EIA scoping of wave energy converters—based on classification of the used technology

    International Nuclear Information System (INIS)

    Margheritini, Lucia; Hansen, Anne Merrild; Frigaard, Peter

    2012-01-01

    During the first decade of the 21st century, the world faced widespread concern over global warming caused by the rise of greenhouse gases produced mainly by the combustion of fossil fuels. Against this background, all renewable energies are being pursued in parallel to achieve sustainable development. Among them, wave energy has unequivocal potential, and the technology is ready to enter the market and contribute to the renewable energy sector. Yet frameworks and regulations for wave energy development are not fully in place, suffering a setback caused by a poor understanding of how the technologies interact with the marine environment, a lack of coordination among the competent authorities regulating device deployment, and conflicts over the use of maritime areas. The EIA within the consent process is central to the realization of full-scale devices and is often the meeting point for technology, politics and the public. This paper presents the development of a classification of wave energy converters based on the different impacts the technologies are expected to have on the environment. This innovative classification can be used to simplify the scoping process for developers and authorities.

  15. A method for EIA scoping of wave energy converters-based on classification of the used technology

    Energy Technology Data Exchange (ETDEWEB)

    Margheritini, Lucia, E-mail: lm@civil.aau.dk [Aalborg University, Department of Civil Engineering, Sohngardsholmsvej 57, DK - 9000, Aalborg (Denmark); Hansen, Anne Merrild, E-mail: merrild@plan.aau.dk [Aalborg University, Department of Planning and Development, Fibigerstraede 13, DK - 9220, Aalborg (Denmark); Frigaard, Peter, E-mail: pf@civil.aau.dk [Aalborg University, Department of Civil Engineering, Sohngardsholmsvej 57, DK - 9000, Aalborg (Denmark)

    2012-01-15

    During the first decade of the 21st century, the world faced widespread concern over global warming caused by the rise of greenhouse gases produced mainly by the combustion of fossil fuels. Against this background, all renewable energies are being pursued in parallel to achieve sustainable development. Among them, wave energy has unequivocal potential, and the technology is ready to enter the market and contribute to the renewable energy sector. Yet frameworks and regulations for wave energy development are not fully in place, suffering a setback caused by a poor understanding of how the technologies interact with the marine environment, a lack of coordination among the competent authorities regulating device deployment, and conflicts over the use of maritime areas. The EIA within the consent process is central to the realization of full-scale devices and is often the meeting point for technology, politics and the public. This paper presents the development of a classification of wave energy converters based on the different impacts the technologies are expected to have on the environment. This innovative classification can be used to simplify the scoping process for developers and authorities.

  16. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC, based on Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005, and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17), and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter... ...that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...

  17. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias G

    2011-01-01

    We have implemented a simple, inexpensive, and fast procedure for validation and verification of the performance of pipettes mounted on automated liquid handlers (ALHs), as necessary for laboratories accredited under ISO 17025. A six- or seven-step serial dilution of OrangeG was prepared in quadruplicate... ...are freely available. In conclusion, we have set up a simple, inexpensive, and fast solution for the continuous validation of ALHs used for accredited work according to the ISO 17025 standard. The method is easy to use for aqueous solutions but requires a spectrophotometer that can read microtiter plates.
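
    The linearity check underlying such a dilution-based verification can be sketched as follows: a correctly pipetted serial dilution should give absorbances linear in the expected concentration. The function names and the R-squared acceptance threshold of 0.99 are illustrative assumptions, not values from the paper.

```python
# Dilution-linearity sketch for pipette verification (hypothetical criteria).

def linear_r2(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy * sxy / (sxx * syy)

def passes_validation(expected_conc, absorbance, r2_min=0.99):
    """Accept the pipetting run if the dilution series is sufficiently linear."""
    return linear_r2(expected_conc, absorbance) >= r2_min
```

    A failed check would flag the pipetting channel for maintenance before accredited work continues.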

  18. MR-based Water Content Estimation in Cartilage: Design and Validation of a Method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sofie; Ringgaard, Steffen

    2012-01-01

    Objective: Design and validation of an MR-based method that allows calculation of the water content of cartilage tissue. Material and Methods: We modified and adapted to cartilage tissue the T1-map-based water-content MR sequences commonly used in the neurology field. Using a 37 degrees Celsius stable... ...was customized and programmed. Finally, we validated the method by measuring and comparing 3 more cartilage samples in a living animal (pig). The obtained data were analyzed and the water content calculated. Then the same samples were freeze-dried (a technique that removes all the water a tissue contains) and the water they contained was measured. Results: We could reproduce the 37 degrees Celsius system twice and perform the measurements in a similar way. We found that the MR T1-map-based water-content sequences can provide information that, after analysis with special software, can...

  19. A Validated RP-HPLC Method for the Determination of Atazanavir in Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    K. Srinivasu

    2011-01-01

    A validated RP-HPLC method was developed for the estimation of atazanavir in capsule dosage form on a YMC ODS 150 × 4.6 mm, 5 μm column, using a mobile phase of ammonium dihydrogen phosphate buffer (pH 2.5) and acetonitrile (55:45 v/v). The flow rate was maintained at 1.5 mL/min with UV detection at 288 nm. The retention time obtained for atazanavir was 4.7 min. The detector response was linear in the concentration range of 30-600 μg/mL. The method has been validated and shown to be specific, sensitive, precise, linear, accurate, rugged, robust and fast. Hence, it can be applied for routine quality control of atazanavir in capsule dosage forms as well as in the bulk drug.

  20. A Validated, Rapid HPLC-ESI-MS/MS Method for the Determination of Lycopsamine.

    Science.gov (United States)

    Jedlinszki, Nikoletta; Csupor, Dezső

    2015-07-01

    The aim of the present work was to develop and validate an HPLC-MS/MS method for the determination of a major pyrrolizidine alkaloid of comfrey (lycopsamine) in aqueous samples, as a basis for developing a method to determine the absorption of lycopsamine through human skin. A linear calibration curve was established in the range of 1.32-440 ng. The intraday precision during the 3-day validation period ranged between 0.57 and 2.48%, while the interday precision was 1.70% and 1.95% for quality control samples. The LOD was 0.014 ng and recovery was above 97%. The lycopsamine content of samples stored for 9 and 25 days at 22 degrees C, 10 degrees C and -25 degrees C did not vary. These results underline the good repeatability and accuracy of the method and allow the analysis of samples with very low lycopsamine content.
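
    Precision figures like the intraday and interday values above are conventionally reported as percent relative standard deviation (%RSD). A minimal sketch; the data below are invented for illustration, not taken from the validation study.

```python
# Percent relative standard deviation, the usual precision metric
# in analytical method validation (illustrative data only).
from statistics import mean, stdev

def rsd_percent(values):
    """Sample standard deviation as a percentage of the mean."""
    return 100.0 * stdev(values) / mean(values)
```

    Replicate injections of one quality-control sample within a day give the intraday %RSD; pooling one value per day across days gives the interday %RSD.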

  1. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    Science.gov (United States)

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    Our aim was to predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We conducted a comprehensive survey of purity methods and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. UBCI can therefore mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  2. Determination of vitamin C in foods: current state of method validation.

    Science.gov (United States)

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities, during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and handicaps of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of the precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.
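
    The recovery experiments summarised in the review compare a spiked sample against the unspiked matrix. A minimal sketch; the 81-109% acceptance window is the range reported in the review, while the function names and example values are illustrative assumptions.

```python
# Spike-recovery sketch for accuracy evaluation (illustrative values).

def spike_recovery_percent(measured_spiked, measured_unspiked, added):
    """Recovery (%) of a known spike: fraction of the added analyte found."""
    return 100.0 * (measured_spiked - measured_unspiked) / added

def recovery_acceptable(recovery, lo=81.0, hi=109.0):
    """Check a recovery value against the range typical of the reviewed methods."""
    return lo <= recovery <= hi
```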

  3. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical validation studies were performed in this paper, with a view to using the methods in stability studies of future formulations of naproxen suppositories for children and adults. The most influential factors on naproxen stability were determined: the greatest degradation occurred in acid medium, in oxidative medium, and under the action of light. A high-performance liquid chromatography-based method was evaluated, which proved adequate to quantify naproxen in suppositories and was selective against degradation products. The quantification limit was 3,480 μg, so the method is valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which had formerly been validated for quality control and showed satisfactory results. Nevertheless, volumetric methods are not regarded as stability-indicating; therefore, this method will be used along with the chromatographic methods of choice, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products

  4. Validation method for determination of cholesterol in human urine with electrochemical sensors using gold electrodes

    Science.gov (United States)

    Riyanto, Laksono, Tomy Agung

    2017-12-01

    Electrochemical sensing for the determination of cholesterol, with gold (Au) as the working electrode, and its application to the analysis of urine have been carried out. The gold electrode was prepared from pure gold (99.99%), 1.0 mm in both length and width, connected to a silver wire using silver conductive paint. The method was validated for the analysis of cholesterol in human urine using electrochemical sensing by cyclic voltammetry (CV). The effects of electrolyte and uric acid concentration were determined to establish the optimum method. The validation parameters for cholesterol analysis in human urine using CV were precision, recovery, linearity, limit of detection (LOD) and limit of quantification (LOQ). The correlation of cholesterol concentration with anodic peak current gave a coefficient of determination of R2 = 0.916. The precision, recovery, linearity, LOD and LOQ were 1.2539%, 144.33%, 0.916, 1.49 × 10-1 mM and 4.96 × 10-1 mM, respectively. In conclusion, the Au electrode is a good electrode for electrochemical sensing in the determination of cholesterol in human urine.
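
    LOD and LOQ figures like those above are commonly derived from the calibration slope and the response noise. A minimal sketch using the ICH-style 3.3σ/S and 10σ/S estimates; the paper does not state which convention it used, so both the formulas and the example numbers are assumptions for illustration.

```python
# Calibration slope and ICH-style LOD/LOQ sketch (assumed convention).

def slope_intercept(x, y):
    """Least-squares slope and intercept of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return b, my - b * mx

def lod_loq(slope, sd_response):
    """LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is the standard
    deviation of the response (e.g. of the blank) and S the slope."""
    return 3.3 * sd_response / slope, 10.0 * sd_response / slope
```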

  5. Market scope

    International Nuclear Information System (INIS)

    2002-01-01

    Nova Scotia's Energy Strategy is aimed at opening the electricity market to wholesale competition, allowing eligible wholesale customers (such as municipal distribution utilities) to buy their electricity from competitive sources. The Nova Scotia Electricity Marketplace Governance Committee (EMGC) is concerned that this will not promote long-term competition, because these eligible customers form only a very small fraction (1.6 per cent) of the total electricity demand in the province. This report examines possible extensions of competition beyond the minimum specified in the Energy Strategy. It also identifies approaches that the EMGC may consider, including other potential levels of competition and their associated issues. The report discusses the implementation of wholesale competition as it relates to unbundling the transmission tariff from the cost of energy supply and from the cost of distribution in retail marketing. The stages of expanding the market scope are also described with reference to large industrial customers, medium industrial and large commercial customers, and small commercial and residential customers. The report states that the transition to an open access transmission market is unlikely to need to be reversed, as it is likely to be an essential component of any further development. The EMGC feels it could minimize future transition costs and promote the future evolution of competition by recommending an institutional and market structure that is compatible with a broader competitive market.

  6. Contribution to the validation of thermal ratchetting prevision methods in metallic structures; Contribution a la validation des methodes de prevision du rochet thermique dans les structures metalliques

    Energy Technology Data Exchange (ETDEWEB)

    Rakotovelo, A.M

    1998-03-01

    This work concerns the assessment of the steady state in metallic structures subjected to cyclic thermomechanical loadings in a biaxial stress state. The effect of short-duration mechanical overloads is also investigated. The first chapter is devoted to a bibliographic survey of the behaviour of materials and structures under cyclic plasticity; works covering both the experimental and the numerical aspects of steady-state assessment for such structures are presented. The experimental part of the study is presented in the second chapter. An experimental device was built to apply tension and torsion forces combined with cyclic thermal loading. Tests were then carried out, some of which included overloads in tension or torsion. The last chapter describes numerical calculations using different models (linear isotropic hardening, linear kinematic hardening and the elasto-viscoplastic Chaboche model) and the application of simplified methods for ratchetting assessment in structures. Two categories of methods were considered: the first is based on an elastic analysis (Bree's diagram, the 3 Sm rule, the efficiency rule), and the second combines an elastic analysis with an elastoplastic analysis of the first cycle (the Gatt and Taleb methods). The results of this study have made it possible: to validate, in the biaxial stress state, an expression that takes into account the effect of short-duration mechanical overloads; to test the ability of the considered models to describe the evolution of the structure during the first cycles and to take into account the effect of short-duration overloads, among which the elastoplastic Chaboche model seems the most accurate; and to validate some simplified methods. Certain methods based only on elastic analysis (Bree's diagram and the efficiency rule) seem not...

  7. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    Science.gov (United States)

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. The results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method is the best in terms of accuracy and time saving. The optimized waiting time for recording the absorbance is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L(-1), limit of detection 20 mg L(-1), limit of quantification 61 mg L(-1). The method was applied to 43 unifloral honey samples from the Marche region, Italy. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Melius, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.
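
    The slope screening that GIS-based methods apply to LiDAR-derived elevation data can be illustrated with central differences on a small elevation grid. This is a generic sketch, not NREL's actual model; the 25-degree suitability cutoff and the function names are placeholder assumptions.

```python
# Slope screening sketch for rooftop PV suitability (assumed threshold).
import math

def cell_slope_deg(dem, i, j, cellsize):
    """Slope (degrees) at interior cell (i, j) of a row-major elevation
    grid, via central differences in x and y."""
    dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cellsize)
    dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cellsize)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

def pv_suitable(slope_deg, max_slope=25.0):
    """Flag a roof cell as PV-suitable if its slope is below the cutoff."""
    return slope_deg <= max_slope
```

    A full model would combine this with orientation (aspect) and modeled shading before summing the suitable area.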

  9. Development and validation of a multiresidue method for pesticide analysis in honey by UFLC-MS

    Directory of Open Access Journals (Sweden)

    Adriana M. Zamudio S.

    2017-05-01

    A method for the determination of pesticide residues in honey by ultra-fast liquid chromatography coupled with mass spectrometry was developed. For this purpose, different variations of the QuEChERS method were tested: (i) amount of sample, (ii) type of salt to control pH, (iii) buffer pH, and (iv) different mixtures for clean-up. In addition, to demonstrate that the method is reliable, different validation parameters were studied: accuracy, limits of detection and quantification, linearity and selectivity. The results showed that the changes introduced yielded a more selective method that improves the accuracy for about 19 pesticides selected from the original method. The method was found suitable for the analysis of 50 of the 56 pesticides. Furthermore, with the developed method, recoveries between 70 and 120% and relative standard deviations below 15% were obtained.

  10. Development and Statistical Validation of Spectrophotometric Methods for the Estimation of Nabumetone in Tablet Dosage Form

    Directory of Open Access Journals (Sweden)

    A. R. Rote

    2010-01-01

    Three new simple, economical spectrophotometric methods were developed and validated for the estimation of nabumetone in bulk and tablet dosage form. The first method determines nabumetone at its absorption maximum of 330 nm, the second uses the area under the curve in the wavelength range 326-334 nm, and the third uses the first-order derivative spectrum with a scaling factor of 4. Beer's law was obeyed in the concentration range of 10-30 μg/mL for all three methods. The correlation coefficients were 0.9997, 0.9998 and 0.9998 for the absorption maximum, area under the curve and first-order derivative methods, respectively. The results of the analyses were validated statistically and by recovery studies. The mean percent recoveries were satisfactory for all three methods, which were also compared statistically using one-way ANOVA. The proposed methods have been successfully applied to the estimation of nabumetone in bulk and pharmaceutical tablet dosage form.
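
    A one-way ANOVA comparison of several methods, as used above, reduces to an F statistic. A self-contained sketch with invented data, not the paper's measurements:

```python
# One-way ANOVA F statistic sketch (illustrative data only).

def one_way_anova_F(*groups):
    """F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    An F value near zero means the methods' mean results are indistinguishable relative to their scatter; a large F (judged against the F distribution) indicates a systematic difference between methods.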

  11. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of the neck and laryngeal muscles in muscle tension dysphonia (MTD). The aim was to review the available laryngeal palpation methods used in patients with MTD for assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. There were five main palpation methods, as well as miscellaneous ones, which differed according to the target anatomical structures, the judgment or grading system, and the tasks used. Only a few scales were available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension in both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence for muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Validity Evaluation of the Assessment Method for Postural Loading on the Upper Body in Printing Industry

    Directory of Open Access Journals (Sweden)

    Mohammad Khandan

    2016-07-01

    Background and Objectives: Musculoskeletal disorders and injuries are a global occupational challenge, and these injuries are concentrated in the upper limb. Several methods exist to assess such disorders, each with different efficiency for various jobs based on its strengths and weaknesses. This study aimed to assess the validity of the LUBA method for evaluating risk factors for musculoskeletal disorders in a printing industry in Qom province, 2014. Methods: In this descriptive cross-sectional study, all operational workers (n=94) were investigated in 2014. The Nordic Musculoskeletal Questionnaire (NMQ) was used to collect data on musculoskeletal disorders. We also used the LUBA method to analyze postures in four parts of the body (neck, shoulder, elbow, and wrist). The data were analyzed using Mann-Whitney, Kruskal-Wallis, and kappa agreement tests. Results: The lumbar region of the back, with 35.1% prevalence, had the most problems. The LUBA results showed that most postures were at the second corrective action level and need further study. Agreement between the assessment of shoulder posture and its disorders was significant (p < 0.05). Conclusion: Based on these results on the reliability and predictive validity of the LUBA method in the printing industry, the method is not a reliable method for posture assessment; further and more comprehensive studies are recommended.
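
    The kappa agreement test used in the analysis can be sketched for two raters' categorical judgments. A minimal Cohen's kappa implementation; the rating data are invented for illustration:

```python
# Cohen's kappa sketch: chance-corrected agreement between two raters.

def cohens_kappa(rater1, rater2):
    """kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)
```

    Kappa is 1 for perfect agreement, 0 for chance-level agreement, and negative when raters agree less often than chance.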

  13. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    Science.gov (United States)

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machining...

  14. Methods, metrics and research gaps around minimum data sets for nursing practice and fundamental care: A scoping literature review.

    Science.gov (United States)

    Muntlin Athlin, Åsa

    2018-06-01

    To examine and map research on minimum data sets linked to nursing practice and the fundamentals of care. Another aim was to identify gaps in the evidence, to suggest future research questions and to highlight the need for standardisation of terminology around nursing practice and fundamental care. Addressing fundamental care has been highlighted internationally as a response to missed nursing care. Systematic performance measurements are needed to capture nursing practice outcomes. Overview of the literature framed by the scoping study methodology. PubMed and CINAHL were searched using the following inclusion criteria: peer-reviewed empirical quantitative and qualitative studies related to minimum data sets and nursing practice published in English. No time restrictions were set. Exclusion criteria were as follows: no available full text, reviews and methodological and discursive studies. Data were categorised into one of the fundamentals of care elements. The review included 20 studies published in 1999-2016. Settings were mainly nursing homes or hospitals. Of 14 elements of the fundamentals of care, 11 were identified as measures in the included studies, but their frequency varied. The most commonly identified elements concerned safety, prevention and medication (n = 11), comfort (n = 6) and eating and drinking (n = 5). Studies have used minimum data sets and included variables linked to nursing practices and fundamentals of care. However, the relations of these variables to nursing practice were not always clearly described, and the main purpose of the studies was seldom to measure the outcomes of nursing interventions. More robust studies focusing on nursing practice and patient outcomes are warranted. Using minimum data sets can highlight nurses' work and the impact it has on direct patient care. Appropriate models, systems and standardised terminology are needed to facilitate the documentation of nursing activities. © 2017 John Wiley & Sons Ltd.

  15. Validity of the Remote Food Photography Method against Doubly Labeled Water among Minority Preschoolers

    OpenAIRE

    Nicklas, Theresa; Saab, Rabab; Islam, Noemi G.; Wong, William; Butte, Nancy; Schulin, Rebecca; Liu, Yan; Apolzan, John W.; Myers, Candice A.; Martin, Corby K.

    2017-01-01

    Objective: To determine the validity of energy intake (EI) estimations made using the Remote Food Photography Method (RFPM) compared to the doubly labeled water (DLW) method in minority preschool children in a free-living environment. Methods: Seven days of food intake data and spot urine samples (excluding first-void collections) for DLW analysis were obtained from 39 3- to 5-year-old Hispanic and African American children. Using an iPhone, caregivers captured before and after pictures of the child's in...

  16. A statistical method (cross-validation) for bone loss region detection after spaceflight

    Science.gov (United States)

    Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.

    2010-01-01

    Astronauts experience bone loss after long spaceflight missions. Identifying the specific regions that undergo the greatest losses (e.g., the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Detecting such regions, however, remains an open problem. This paper focuses on statistical methods for detecting them. We perform statistical parametric mapping to obtain t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
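
    The permutation testing of longitudinal labels can be illustrated with a paired sign-flip scheme on before/after values: under the null hypothesis of no change, each subject's difference is equally likely to be positive or negative. This is a generic sketch, not the authors' exact procedure; the data and parameters are illustrative.

```python
# Paired permutation (sign-flip) test sketch for longitudinal change.
import random

def permutation_pvalue(before, after, n_perm=2000, seed=0):
    """Two-sided p-value for the mean paired change, by randomly
    flipping the sign of each subject's difference."""
    rng = random.Random(seed)
    diffs = [b - a for a, b in zip(before, after)]
    observed = abs(sum(diffs))
    hits = 0
    for _ in range(n_perm):
        s = sum(d if rng.random() < 0.5 else -d for d in diffs)
        if abs(s) >= observed:
            hits += 1
    # Add-one correction keeps the p-value away from exactly zero.
    return (hits + 1) / (n_perm + 1)
```

    In the paper's setting the same idea is applied per candidate cluster, with the longitudinal scan labels permuted instead of simple sign flips.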

  17. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. It is therefore widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that follow traditional software and system approaches. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, that provides a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct.

  18. Modelling of the activity system - development of an evaluation method for integrated system validation

    International Nuclear Information System (INIS)

    Norros, Leena; Savioja, Paula

    2004-01-01

    In this paper we present our recent research, which focuses on creating an evaluation method for the human-system interfaces of complex systems. The method is intended for use in the validation of modernised nuclear power plant (NPP) control rooms and other complex systems with high reliability requirements. The task in validation is to determine whether the overall human-machine system functions safely and effectively. This question can be operationalized as the selection of relevant operational features and their appropriate acceptance criteria. Thus, there is a need to ensure that the results of the evaluation can be generalized so that they serve the purpose of integrated system validation. The definition of appropriate acceptance criteria provides a basis for judging the appropriateness of the system's performance. We propose that the operational situations and the acceptance criteria should be defined based on modelling of NPP operation, comprehended as an activity system. We developed a new core-task modelling framework. It is a formative modelling approach that combines causal, functional and understanding-based explanations of system performance. In this paper we explain how modelling can be used as a medium to determine the validity of the emerging control room system. (Author)

  19. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    Science.gov (United States)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Because the atorvastatin patent has expired, the pharmaceutical industry produces generic copies of the drug. Methods for tablet quality testing, including determination of the atorvastatin content of tablets, therefore need to be developed. The purpose of this research was to develop and validate a simple analytical HPLC method for atorvastatin tablets. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 μm) as the reversed-phase stationary phase, a mixture of methanol and water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation covered selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results indicate that the developed method performed well on all of these criteria for the analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, respectively, and the linear range was 20-120 ng/mL.
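Linearity, LOD, and LOQ figures of this kind are conventionally derived from an ordinary least-squares calibration fit. A minimal sketch, assuming the common ICH Q2-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope (the function name and data are illustrative):

```python
import statistics

def calibration_stats(conc, resp):
    # ordinary least-squares fit: resp = slope * conc + intercept
    n = len(conc)
    mx, my = statistics.mean(conc), statistics.mean(resp)
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
    s_resid = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    r2 = 1 - sum(r * r for r in resid) / sum((y - my) ** 2 for y in resp)
    # ICH-style detection and quantitation limits from the residual sd
    return {"slope": slope, "intercept": intercept, "r2": r2,
            "lod": 3.3 * s_resid / slope, "loq": 10 * s_resid / slope}
```

Feeding in the calibration standards (concentration vs. peak area) yields the slope, r², LOD, and LOQ reported in such validations.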

  20. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    Science.gov (United States)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished using analytical expressions of Zernike polynomials and a power spectral density model, it does not introduce the aliasing and interpolation errors produced by conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. This new method also automatically eliminates measurement noise and other measurement errors such as artificial discontinuities. The development cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specifications for the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match the gridded data format of a model validation tool, and (2) eliminating surface measurement noise and measurement errors so that the resulting surface height map is continuous or smoothly varying. So far, the preferred method for re-sampling a surface map has been two-dimensional interpolation. The main problem with this method is that the same pixel can take different values when the interpolation method is changed among options such as "nearest," "linear," "cubic," and "spline" fitting in Matlab.
The conventional, FFT-based spatial filtering method used to
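The idea of re-sampling via an analytic basis fit rather than interpolation can be illustrated with a few low-order Zernike terms. This is a simplified sketch, not the described software: the basis is truncated to six Cartesian-form terms and a plain least-squares solve stands in for the full Zernike/PSD machinery.

```python
import numpy as np

def zernike_basis(x, y):
    # first six Zernike polynomials on the unit disk in Cartesian form:
    # piston, x-tilt, y-tilt, defocus, oblique and vertical astigmatism
    r2 = x ** 2 + y ** 2
    return np.stack([np.ones_like(x), x, y, 2 * r2 - 1,
                     2 * x * y, x ** 2 - y ** 2], axis=-1)

def resample_surface(xm, ym, zm, xn, yn):
    # least-squares Zernike fit to the measured map, then analytic
    # evaluation on the new grid: there is no interpolation step, and
    # the fit averages out uncorrelated measurement noise
    B = zernike_basis(xm.ravel(), ym.ravel())
    coeffs, *_ = np.linalg.lstsq(B, zm.ravel(), rcond=None)
    return zernike_basis(xn.ravel(), yn.ravel()) @ coeffs
```

Because the evaluation is analytic, up-sampling onto a finer grid gives the same surface regardless of grid spacing, unlike "nearest" vs. "cubic" interpolation.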

  1. A Sensitive Validated Spectrophotometric Method for the Determination of Flucloxacillin Sodium

    Directory of Open Access Journals (Sweden)

    R. Singh Gujral

    2009-01-01

    Full Text Available A simple and sensitive spectrophotometric method has been proposed for the determination of flucloxacillin sodium. The method is based on the charge transfer complexation reaction of the drug with iodine in methanol-dichloromethane medium. The absorbance was measured at 362 nm against a reagent blank. Under optimized experimental conditions, Beer's law is obeyed over the concentration range 1-9 μg/mL for flucloxacillin. The method was validated for specificity, linearity, precision, and accuracy. The degree of linearity of the calibration curves, the percent recoveries, and the limits of detection and quantitation were determined for the spectrophotometric method. No interference was observed from the additives commonly present in pharmaceutical formulations. The method was successfully applied to the in vitro determination of flucloxacillin in human urine samples, with low RSD values. The method is simple, specific, accurate and sensitive.

  2. Validation of a method to measure plutonium levels in marine sediments in Cuba

    International Nuclear Information System (INIS)

    Sibello Hernández, Rita Y.; Cartas Aguila, Héctor A.; Cozzella, María Letizia

    2008-01-01

    The main objective of this research was to develop and validate a method of radiochemical separation of plutonium that is suitable, from an economic and practical point of view, for Cuban conditions. This method allowed the determination of plutonium activity levels in marine sediments from Cienfuegos Bay. The selected radiochemical separation method was anionic chromatography, and the measurement technique was quadrupole inductively coupled plasma mass spectrometry. The method was applied to a certified reference material; six repetitions were carried out, and good agreement between the average measured value and the certified value of plutonium was achieved, demonstrating the trueness of the method. The precision of the method was also demonstrated, with a coefficient of variation of 11% at the 95% confidence level. The results obtained show that the presence of plutonium in the analyzed marine sediment samples is due only to global radioactive fallout. (author)

  3. Experimental validation of the buildings energy performance (PEC assessment methods with reference to occupied spaces heating

    Directory of Open Access Journals (Sweden)

    Cristian PETCU

    2010-01-01

    Full Text Available This paper is part of a series of pre-standardization research aimed at analyzing the existing methods of calculating the Energy Performance of Buildings (PEC) with a view to their correction or completion. The research activity as a whole aims to experimentally validate the PEC calculation algorithm, as well as to comparatively apply, in several case studies focused on buildings representative of the Romanian building stock, the PEC calculation methodology for buildings equipped with occupied-space heating systems. The targets of the report are the experimental testing of the calculation models known so far (NP 048-2000, Mc 001-2006, SR EN 13790:2009), using the CE INCERC Bucharest experimental building, together with the complex calculation algorithms specific to dynamic modeling, for the evaluation of the occupied-space heat demand in the cold season, specific to traditional buildings and to modern buildings equipped with solar radiation passive systems of the ventilated solar space type. The schedule of the measurements performed in the 2008-2009 cold season is presented, together with the primary processing of the measured data and the experimental validation of the monthly heat demand calculation methods, based on CE INCERC Bucharest data. The calculation error per heating season (153 days of measurements) between the measured heat demand and the calculated one was 0.61%, an exceptional value confirming the phenomenological nature of the INCERC method, NP 048-2006. The mathematical model specific to the hourly thermal balance is recurrent-decisional with alternating steps. The experimental validation of the theoretical model is based on the measurements performed on the CE INCERC Bucharest building over a period of 57 days (06.01-04.03.2009).
The measurements performed on the CE INCERC Bucharest building confirm the accuracy of the hourly calculation model by comparison to the values

  4. In Vitro Dissolution Profile of Dapagliflozin: Development, Method Validation, and Analysis of Commercial Tablets

    Directory of Open Access Journals (Sweden)

    Rafaela Zielinski Cavalheiro de Meira

    2017-01-01

    Full Text Available Dapagliflozin was the first of its class (inhibitors of sodium-glucose cotransporter) to be approved in Europe, the USA, and Brazil. As the drug was recently approved, there is a need for research on analytical methods, including dissolution studies, for the quality evaluation and assurance of tablets. The dissolution methodology was developed with apparatus II (paddle) in 900 mL of medium (simulated gastric fluid, pH 1.2), with the temperature set at 37±0.5°C and a stirring speed of 50 rpm. For quantification, a spectrophotometric (λ=224 nm) method was developed and validated. In validation studies, the method proved to be specific and linear in the range from 0.5 to 15 μg·mL−1 (r2=0.998). The precision results showed RSD values lower than 2%. Recoveries of 80.72, 98.47, and 119.41% were obtained in the accuracy assessment. Through a systematic approach applying a 2³ factorial design, the robustness of the method was confirmed (p>0.05). Studies of commercial tablets containing 5 or 10 mg demonstrated that they could be considered similar through f1, f2, and dissolution-efficiency analyses. The developed method can be used for the quality evaluation of dapagliflozin tablets and can serve as a scientific basis for future official pharmacopoeial methods.
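The f1 (difference) and f2 (similarity) factors used for comparing dissolution profiles have standard closed forms over mean percent-dissolved values at matched time points; profiles are conventionally judged similar when f1 ≤ 15 and f2 ≥ 50. A minimal sketch (the example profiles are illustrative, not the paper's data):

```python
import math

def f1_f2(ref, test):
    # regulatory difference (f1) and similarity (f2) factors for two
    # dissolution profiles sampled at the same n time points
    n = len(ref)
    f1 = 100 * sum(abs(r - t) for r, t in zip(ref, test)) / sum(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n  # mean squared difference
    f2 = 50 * math.log10(100 / math.sqrt(1 + msd))
    return f1, f2
```

Identical profiles give f1 = 0 and f2 = 100; f2 falls toward 50 as the average point-wise difference approaches 10%.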

  5. Validated spectrophotometric methods for simultaneous determination of troxerutin and carbazochrome in dosage form

    Science.gov (United States)

    Khattab, Fatma I.; Ramadan, Nesrin K.; Hegazy, Maha A.; Al-Ghobashy, Medhat A.; Ghoniem, Nermine S.

    2015-03-01

    Four simple, accurate, sensitive and precise spectrophotometric methods were developed and validated for the simultaneous determination of Troxerutin (TXN) and Carbazochrome (CZM) in their bulk powders, laboratory-prepared mixtures and pharmaceutical dosage forms. Method A is first derivative spectrophotometry (D1), in which TXN and CZM were determined at 294 and 483.5 nm, respectively. Method B is first derivative of ratio spectra (DD1), in which the peak amplitudes at 248 nm for TXN and 439 nm for CZM were used for their determination. Method C is ratio subtraction (RS), in which TXN was determined at its λmax (352 nm) in the presence of CZM, which was determined by D1 at 483.5 nm. Method D is mean centering of the ratio spectra (MCR), in which the mean-centered values at 300 nm and 340.0 nm were used for the two drugs, respectively. The two compounds were simultaneously determined in the concentration ranges of 5.00-50.00 μg mL-1 and 0.5-10.0 μg mL-1 for TXN and CZM, respectively. The methods were validated according to the ICH guidelines and the results were statistically compared to those of the manufacturer's method.

  6. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To establish validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After the measurements were revised based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better in the modified version, whereas others appeared to be more reliable in the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and for indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.
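Test-retest agreement for categorical survey items is often summarized with a chance-corrected statistic such as Cohen's kappa; the abstract does not name the statistic used, so the following is an illustrative sketch only:

```python
def cohens_kappa(ratings1, ratings2):
    # chance-corrected agreement between two administrations of the
    # same categorical item: (observed - expected) / (1 - expected)
    n = len(ratings1)
    p_obs = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    cats = set(ratings1) | set(ratings2)
    p_exp = sum((ratings1.count(c) / n) * (ratings2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1.0 for perfect agreement and 0 when agreement is no better than chance, which makes it a common choice for comparing original versus modified survey items.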

  7. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: compatibility with LC/MS (free of detergents, etc.); high protein integrity (minimal levels of protein degradation and non-biological PTMs); compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, intended primarily for sample preparation method development and optimization; and pre-digested extracts (peptides), intended primarily for instrument validation and performance monitoring.

  8. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard image analysis methods suboptimal. Furthermore, validation of adapted computer vision methods proves difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
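A first-order multiplicative speckle model, of the kind such phantoms build on, can be sketched by scaling each pixel with a Rayleigh-distributed factor. This is an illustrative simplification of tissue-specific speckle modeling, not the paper's simulator; the function name and parameters are hypothetical:

```python
import math
import random

def add_speckle(image, sigma=0.5, seed=0):
    # first-order multiplicative speckle: each pixel of the echogenicity
    # map is scaled by an independent Rayleigh-distributed factor,
    # sampled here by inverting the Rayleigh CDF
    rng = random.Random(seed)

    def rayleigh():
        return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))

    return [[p * rayleigh() for p in row] for row in image]
```

Different tissue types would get different distribution parameters (or different distributions altogether), which is exactly the flexibility the proposed phantom argues for.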

  9. Validation of cleaning method for various parts fabricated at a Beryllium facility

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Cynthia M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-15

    This study evaluated and documented a cleaning process that is used to clean parts that are fabricated at a beryllium facility at Los Alamos National Laboratory. The purpose of evaluating this cleaning process was to validate and approve it for future use to assure beryllium surface levels are below the Department of Energy’s release limits without the need to sample all parts leaving the facility. Inhaling or coming in contact with beryllium can cause an immune response that can result in an individual becoming sensitized to beryllium, which can then lead to a disease of the lungs called chronic beryllium disease, and possibly lung cancer. Thirty aluminum and thirty stainless steel parts were fabricated on a lathe in the beryllium facility, as well as thirty-two beryllium parts, for the purpose of testing a parts cleaning method that involved the use of ultrasonic cleaners. A cleaning method was created, documented, validated, and approved, to reduce beryllium contamination.

  10. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Science.gov (United States)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software that calculates the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve on the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.
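The LHS scheme mentioned above stratifies each input dimension into n equal-probability intervals and draws exactly one sample per interval, permuting the strata independently per axis. A minimal sketch (illustrative, not the paper's implementation):

```python
import random

def latin_hypercube(n, dims, rng=None):
    # LHS on the unit hypercube: split each dimension into n strata of
    # equal probability and place exactly one point in each stratum,
    # shuffling stratum order independently for every dimension
    rng = rng or random.Random()
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        cols.append([(s + rng.random()) / n for s in strata])
    return [tuple(col[i] for col in cols) for i in range(n)]
```

Compared with simple random sampling, this guarantees coverage of every marginal stratum, which typically reduces the variance of Monte Carlo estimates for the same sample count.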

  11. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Directory of Open Access Journals (Sweden)

    Nieciąg Halina

    2015-10-01

    Full Text Available Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software that calculates the values of measurands. Due to the number and nature of the variables affecting coordinate measurement results and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt to improve on the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple sampling scheme of the classic algorithm.

  12. NordVal: A Nordic system for validation of alternative microbiological methods

    DEFF Research Database (Denmark)

    Qvist, Sven

    2007-01-01

    NordVal was created in 1999 by the Nordic Committee of Senior Officials for Food Issues under the Nordic Council of Ministers. The Committee adopted the following objective for NordVal: NordVal evaluates the performance and field of application of alternative microbiological methods. This includes...... analyses of food, water, feed, animal faeces and food environmental samples in the Nordic countries. NordVal is managed by a steering group, which is appointed by the National Food Administrations in Denmark, Finland, Iceland, Norway and Sweden. The background for creation of NordVal was a Danish...... validation system (DanVal) established in 1995 to cope with a need to validate alternative methods to be used in the Danish Salmonella Action Program. The program attracted considerable attention in the other Nordic countries. NordVal has elaborated a number of documents, which describe the requirements...

  13. Development and Validation of HPLC-PDA Assay method of Frangula emodin

    Directory of Open Access Journals (Sweden)

    Deborah Duca

    2016-03-01

    Full Text Available Frangula emodin (1,3,8-trihydroxy-6-methyl-anthraquinone) is one of the anthraquinone derivatives found abundantly in the roots and bark of a number of plant families traditionally used to treat constipation and haemorrhoids. The present study describes the development and subsequent validation of a specific HPLC assay method for emodin. The separation was achieved on a Waters Symmetry C18 column (4.6 × 250 mm, 5 μm particle size) at a temperature of 35 °C, with UV detection at 287 and 436 nm. An isocratic elution mode was used, with an aqueous mobile phase consisting of 0.1% formic acid and 0.01% trifluoroacetic acid, and methanol as the organic component. The method was successfully and statistically validated for linearity, range, precision, accuracy, specificity and solution stability.

  14. HPTLC Method Development and Validation of Zolpidem Tartrate in Bulk and Marketed Formulation

    OpenAIRE

    Abhay R. Shirode; Bharti G. Jadhav; Vilasrao J. Kadam

    2015-01-01

    High performance thin layer chromatography (HPTLC) offers many advantages over HPLC. It reduces the cost of analysis compared to HPLC: mobile phase consumption per sample is extremely low in HPTLC, reducing both acquisition and disposal costs. Considering the cost and suitability of analysis for the estimation of zolpidem tartrate in bulk and in its marketed formulation, an HPTLC method was developed and validated. The Camag HPTLC system, operated with winCATS software (ver. 1.4.1.8), was used ...

  15. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725‐4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of an analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
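The tolerance-interval computation behind an accuracy profile can be sketched for one concentration level. This simplified version uses a normal quantile as the β-expectation coverage factor instead of the exact ISO 5725-based factor, so it is an approximation for illustration only; names and limits are assumptions:

```python
from statistics import NormalDist, mean, stdev

def accuracy_profile_level(results, nominal, acceptance=0.10, beta=0.80):
    # relative bias and an approximate beta-expectation tolerance interval
    # for one concentration level of a validation design; the level is
    # declared valid when the interval lies inside the acceptance limits
    rel = [(x - nominal) / nominal for x in results]
    bias = mean(rel)
    s = stdev(rel)
    k = NormalDist().inv_cdf((1 + beta) / 2)  # normal approx. of the k factor
    lo, hi = bias - k * s, bias + k * s
    return bias, (lo, hi), (-acceptance <= lo and hi <= acceptance)
```

Repeating this at each concentration level and plotting the intervals against the ±10% (or ±15%) acceptance limits yields the accuracy profile and its validity domain.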

  16. Method Development and Validation for UHPLC-MS-MS Determination of Hop Prenylflavonoids in Human Serum

    OpenAIRE

    Yuan, Yang; Qiu, Xi; Nikolic, Dejan; Dahl, Jeffrey H.; van Breemen, Richard B.

    2012-01-01

    Hops (Humulus lupulus L.) are used in the brewing of beer, and hop extracts containing prenylated compounds such as xanthohumol and 8-prenylnaringenin are under investigation as dietary supplements for cancer chemoprevention and for the management of hot flashes in menopausal women. To facilitate clinical studies of hop safety and efficacy, a selective, sensitive, and fast ultra-high pressure liquid chromatography tandem mass spectrometry (UHPLC-MS-MS) method was developed and validated for t...

  17. The Bland-Altman Method Should Not Be Used in Regression Cross-Validation Studies

    Science.gov (United States)

    O'Connor, Daniel P.; Mahar, Matthew T.; Laughlin, Mitzi S.; Jackson, Andrew S.

    2011-01-01

    The purpose of this study was to demonstrate the bias in the Bland-Altman (BA) limits of agreement method when it is used to validate regression models. Data from 1,158 men were used to develop three regression equations to estimate maximum oxygen uptake (R² = 0.40, 0.61, and 0.82, respectively). The equations were evaluated in a…
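The bias the study demonstrates can be reproduced in a small simulation: when the "predicted" values come from a regression, the BA differences necessarily correlate with the means, so the limits of agreement mix proportional bias with random error. Names and the simulated data below are illustrative, not the study's data:

```python
import random
import statistics

def pearson_r(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) *
           sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def bland_altman(measured, predicted):
    # BA bias and 95% limits of agreement, plus the correlation of
    # differences with means; a non-zero correlation is the structural
    # bias that arises when 'predicted' comes from a regression
    diffs = [m - p for m, p in zip(measured, predicted)]
    means = [(m + p) / 2 for m, p in zip(measured, predicted)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd), pearson_r(diffs, means)
```

Because regression predictions have smaller variance than the criterion scores, the residuals trend with the means even when the model is correct, so BA plots flag "disagreement" that is an artifact of the method.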

  18. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Development and Validation of a Stability-Indicating LC-UV Method for Simultaneous Determination of Ketotifen and Cetirizine in Pharmaceutical Dosage Forms. ... 5 μm) using an isocratic mobile phase that consisted of acetonitrile and 10 mM disodium hydrogen phosphate buffer (pH 6.5) in a ratio of 45:55 % v/v at a flow ...

  19. Validation of the analytical method for sodium dichloroisocyanurate aimed at drinking water disinfection

    International Nuclear Information System (INIS)

    Martinez Alvarez, Luis Octavio; Alejo Cisneros, Pedro; Garcia Pereira, Reynaldo; Campos Valdez, Doraily

    2014-01-01

    Cuba has developed its first effervescent tablets containing 3.5 mg of sodium dichloroisocyanurate as a non-therapeutic active ingredient. This ingredient releases a certain amount of chlorine when dissolved in a litre of water and can provide adequate disinfection of drinking water, which is ready to drink after 30 min. The aim of this work was to develop and validate an iodometric analytical method applicable to the quality control of the effervescent 3.5 mg sodium dichloroisocyanurate tablets.

  20. Validation of verbal autopsy methods using hospital medical records: a case study in Vietnam.

    Science.gov (United States)

    Tran, Hong Thi; Nguyen, Hoa Phuong; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-05-18

    Information on causes of death (COD) is crucial for measuring the health outcomes of populations and progress towards the Sustainable Development Goals. In many countries, such as Vietnam, where the civil registration and vital statistics (CRVS) system is dysfunctional, information on vital events will continue to rely on verbal autopsy (VA) methods. This study assesses the validity of VA methods used in Vietnam and provides recommendations on methods for implementing VA validation studies in Vietnam. This validation study was conducted on a sample of 670 deaths from a recent VA study in Quang Ninh province. The study covered 116 cases from this sample, which met three inclusion criteria: a) the death occurred within 30 days of discharge after the last hospitalisation; b) medical records (MRs) for the deceased were available from the respective hospitals; and c) the medical record mentioned that the patient was terminally ill at discharge. For each death, the underlying cause of death (UCOD) identified from MRs was compared to the UCOD from VA. The validity of VA diagnoses for major causes of death was measured using sensitivity, specificity and positive predictive value (PPV). The sensitivity of VA was at least 75% in identifying some leading CODs such as stroke, road traffic accidents and several site-specific cancers. However, sensitivity was less than 50% for other important causes including ischemic heart disease, chronic obstructive pulmonary disease, and diabetes. Overall, there was 57% agreement between the UCOD from VA and MR, which increased to 76% when multiple causes from VA were compared to the UCOD from MR. Our findings suggest that VA is a valid method to ascertain UCOD in contexts such as Vietnam. Furthermore, within cultural contexts in which patients prefer to die at home instead of a healthcare facility, using the available MRs as the gold standard may be meaningful to the extent that recall bias from the interval between last hospital discharge and death
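The validity metrics used in such studies have standard definitions over a cross-tabulation of VA-assigned versus gold-standard (here, MR-assigned) causes; a minimal per-cause sketch (the cause labels are illustrative):

```python
def validity_metrics(gold, predicted, cause):
    # per-cause sensitivity, specificity, and PPV of predicted (VA)
    # causes of death against a gold standard (medical records)
    tp = sum(g == cause and p == cause for g, p in zip(gold, predicted))
    fn = sum(g == cause and p != cause for g, p in zip(gold, predicted))
    fp = sum(g != cause and p == cause for g, p in zip(gold, predicted))
    tn = len(gold) - tp - fn - fp
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,
        "specificity": tn / (tn + fp) if tn + fp else None,
        "ppv": tp / (tp + fp) if tp + fp else None,
    }
```

Running this for each major cause category reproduces the kind of per-cause sensitivity/PPV table the study reports.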

  1. New clinical validation method for automated sphygmomanometer: a proposal by Japan ISO-WG for sphygmomanometer standard.

    Science.gov (United States)

    Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio

    2007-12-01

    Owing to fast or stepwise cuff deflation, or measurement at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, namely two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing its immunity to LD and BPV with those of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N=120, 45+/-15.3 years (mean+/-SD)]. An oscillometric automated monitor, Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting higher reproducibility of validation. The SD1 from method 2 correlated significantly with participants' BP (P=0.004), supporting our hypothesis that the increased SD of device error with method 2 is at least partially caused by essential BPV. Method 3 showed a significantly (P=0.0044) smaller interparticipant SD of device error (SD2), suggesting higher interparticipant consistency of validation. Among the methods for validating the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest interparticipant consistency, can be proposed as the most appropriate.

  2. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    Directory of Open Access Journals (Sweden)

    Alistair Currie

    2011-11-01

    Full Text Available In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  3. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods.

    Science.gov (United States)

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-11-29

    In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches.

  4. Gradient HPLC method development and validation for Simultaneous estimation of Rosiglitazone and Gliclazide.

    Directory of Open Access Journals (Sweden)

    Uttam Singh Baghel

    2012-10-01

    Full Text Available Objective: The aim of the present work was to develop a gradient RP-HPLC method for simultaneous analysis of rosiglitazone and gliclazide in a tablet dosage form. Method: The chromatographic system was optimized using a Hypersil C18 (250 mm x 4.6 mm, 5 μm) column with potassium dihydrogen phosphate (pH 7.0) and acetonitrile in the ratio of 60:40 as mobile phase, at a flow rate of 1.0 ml/min. Detection was carried out at 225 nm by a SPD-20A Prominence UV/Vis detector. Result: Rosiglitazone and gliclazide were eluted with retention times of 17.36 and 7.06 min, respectively. The Beer-Lambert law was obeyed over the concentration ranges of 5 to 70 μg/ml and 2 to 12 μg/ml for rosiglitazone and gliclazide, respectively. Conclusion: The high recovery and low coefficients of variation confirm the suitability of the method for simultaneous analysis of both drugs in a tablet dosage form. Statistical analysis proves that the method is sensitive and significant for the analysis of rosiglitazone and gliclazide in pure and pharmaceutical dosage forms without any interference from the excipients. The method was validated in accordance with ICH guidelines. Validation revealed the method is specific, rapid, accurate, precise, reliable, and reproducible.
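
The linearity check behind a Beer-Lambert calibration like the one above is an ordinary least-squares fit of response against concentration; a minimal sketch with hypothetical calibration points (not values from the paper):

```python
import math

# Hypothetical calibration: concentration (ug/ml) vs. peak area.
conc = [5, 10, 20, 40, 70]
area = [51, 99, 202, 398, 701]

n = len(conc)
mx, my = sum(conc) / n, sum(area) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, area))
slope = sxy / sxx
intercept = my - slope * mx
# Correlation coefficient r, the usual linearity criterion
r = sxy / math.sqrt(sxx * sum((y - my) ** 2 for y in area))

def back_calc(a):
    """Concentration corresponding to a measured peak area."""
    return (a - intercept) / slope
```

In routine use, an unknown sample's peak area is converted to concentration with `back_calc`, and recovery is that result divided by the spiked concentration.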

  5. Validation of a Smartphone Image-Based Dietary Assessment Method for Pregnant Women

    Directory of Open Access Journals (Sweden)

    Amy M. Ashman

    2017-01-01

    Full Text Available Image-based dietary records could lower the participant burden associated with traditional prospective methods of dietary assessment. They have been used in children, adolescents and adults, but have not been evaluated in pregnant women. The current study evaluated the relative validity of the DietBytes image-based dietary assessment method for assessing energy and nutrient intakes. Pregnant women collected image-based dietary records (via a smartphone application) of all food, drinks and supplements consumed over three non-consecutive days. Intakes from the image-based method were compared to intakes collected from three 24-h recalls, taken on random days, once per week, in the weeks following the image-based record. Data were analyzed using nutrient analysis software. Agreement between methods was ascertained using Pearson correlations and Bland-Altman plots. Twenty-five women (27 recruited, one withdrew, one incomplete; median age 29 years, 15 primiparas, eight Aboriginal Australians) completed image-based records for analysis. Significant correlations between the two methods were observed for energy, macronutrients and fiber (r = 0.58–0.84, all p < 0.05), and for micronutrients both including (r = 0.47–0.94, all p < 0.05) and excluding (r = 0.40–0.85, all p < 0.05) supplements in the analysis. Bland-Altman plots confirmed acceptable agreement with no systematic bias. The DietBytes method demonstrated acceptable relative validity for assessment of nutrient intakes of pregnant women.
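
The agreement statistics used above (Pearson correlation plus Bland-Altman limits of agreement, bias ± 1.96 SD of the differences) can be sketched with hypothetical intakes for the two methods:

```python
from statistics import mean, stdev

# Hypothetical energy intakes (MJ/day): image-based method vs. 24-h recalls.
image_based = [8.2, 9.5, 7.1, 10.3, 8.8, 9.9]
recalls     = [8.0, 9.9, 7.4, 10.0, 8.5, 10.2]

# Bland-Altman: mean difference (bias) and limits of agreement
diffs = [a - b for a, b in zip(image_based, recalls)]
bias = mean(diffs)
sd = stdev(diffs)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)

def pearson_r(xs, ys):
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

r = pearson_r(image_based, recalls)
```

Acceptable agreement is typically read as a bias near zero with most differences inside the limits of agreement, alongside a significant correlation.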

  6. Validation of the Nuclear Design Method for MOX Fuel Loaded LWR Cores

    International Nuclear Information System (INIS)

    Saji, E.; Inoue, Y.; Mori, M.; Ushio, T.

    2001-01-01

    The actual batch loading of mixed-oxide (MOX) fuel in light water reactors (LWRs) is now ready to start in Japan. One of the efforts devoted to realizing this batch loading has been validation of the nuclear design methods that calculate the characteristics of MOX-fuel-loaded LWR cores. This paper summarizes the validation work for the applicability of the CASMO-4/SIMULATE-3 in-core fuel management code system to MOX-fuel-loaded LWR cores. This code system is widely used by a number of electric power companies for the core management of their commercial LWRs. The validation work was performed for both boiling water reactor (BWR) and pressurized water reactor (PWR) applications. Each validation consists of two parts: analyses of critical experiments and core tracking calculations of operating plants. For the critical experiments, we have chosen a series of experiments known as the VENUS International Program (VIP), which was performed at the SCK·CEN laboratory in Mol, Belgium. VIP consists of both BWR and PWR fuel assembly configurations. As for the core tracking calculations, the operating data of MOX-fuel-loaded BWR and PWR cores in Europe have been utilized.

  7. Optimization and validation of spectrophotometric methods for determination of finasteride in dosage and biological forms

    Science.gov (United States)

    Amin, Alaa S.; Kassem, Mohammed A.

    2012-01-01

    Aim and Background: Three simple, accurate and sensitive spectrophotometric methods for the determination of finasteride in pure, dosage and biological forms, and in the presence of its oxidative degradates, were developed. Materials and Methods: These methods are indirect; they involve the addition of a known excess of oxidant in acid medium to finasteride (potassium permanganate for method A, ceric sulfate [Ce(SO4)2] for method B, and N-bromosuccinimide (NBS) for method C), followed by determination of the unreacted oxidant from the decrease in absorbance of methylene blue for method A, chromotrope 2R for method B, and amaranth for method C, at maximum wavelengths (λmax) of 663, 528, and 520 nm, respectively. The reaction conditions for each method were optimized. Results: Regression analysis of the Beer plots showed good correlation in the concentration ranges of 0.12–3.84 μg mL–1 for method A, 0.12–3.28 μg mL–1 for method B, and 0.14–3.56 μg mL–1 for method C. The apparent molar absorptivity, Sandell sensitivity, detection and quantification limits were evaluated. The stoichiometric ratio between finasteride and the oxidant was estimated. The validity of the proposed methods was tested by analyzing dosage forms and biological samples containing finasteride, with relative standard deviation ≤ 0.95. Conclusion: The proposed methods could successfully determine the studied drug in the presence of varying excess of its oxidative degradation products, with recovery between 99.0 and 101.4, 99.2 and 101.6, and 99.6 and 101.0% for methods A, B, and C, respectively. PMID:23781478

  8. Numerical Validation of the Delaunay Normalization and the Krylov-Bogoliubov-Mitropolsky Method

    Directory of Open Access Journals (Sweden)

    David Ortigosa

    2014-01-01

    Full Text Available A scalable second-order analytical orbit propagator programme based on modern and classical perturbation methods is being developed. As a first step in the validation and verification of part of our orbit propagator programme, we only consider the perturbation produced by zonal harmonic coefficients in the Earth’s gravity potential, so that it is possible to analyze the behaviour of the mathematical expressions involved in Delaunay normalization and the Krylov-Bogoliubov-Mitropolsky method in depth and determine their limits.

  9. Optimisation and validation of methods to assess single nucleotide polymorphisms (SNPs) in archival histological material

    DEFF Research Database (Denmark)

    Andreassen, C N; Sørensen, Flemming Brandt; Overgaard

    2004-01-01

    only archival specimens are available. This study was conducted to validate protocols optimised for assessment of SNPs based on paraffin embedded, formalin fixed tissue samples.PATIENTS AND METHODS: In 137 breast cancer patients, three TGFB1 SNPs were assessed based on archival histological specimens...... precipitation).RESULTS: Assessment of SNPs based on archival histological material is encumbered by a number of obstacles and pitfalls. However, these can be widely overcome by careful optimisation of the methods used for sample selection, DNA extraction and PCR. Within 130 samples that fulfil the criteria...

  10. Validation of a method to determine methylmercury in fish tissues using gas chromatography

    International Nuclear Information System (INIS)

    Vega Bolannos, Luisa O.; Arias Verdes, Jose A.; Beltran Llerandi, Gilberto; Castro Diaz, Odalys; Moreno Tellez, Olga L.

    2000-01-01

    We validated a method to determine methylmercury in fish tissues using gas chromatography with an electron capture detector, as described by the Association of Official Analytical Chemists (AOAC) International. The linear curve range was 0.02 to 1 μg/ml and the linear correlation coefficient was 0.9979. A 1 mg/kg methylmercury-contaminated fish sample was analyzed 20 times to determine the repeatability of the method. The quantification limit was 0.16 mg/kg and the detection limit was 0.06 ppm. Fish samples contaminated with 0.2 to 10 mg/kg methylmercury showed recovery indexes from 94.66 to 108.8%
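
Detection and quantification limits like those reported above are commonly estimated from the calibration fit using the ICH-style formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation of the fit and S its slope. A minimal sketch with hypothetical calibration data:

```python
import math

# Hypothetical calibration: concentration (ug/ml) vs. detector response.
conc = [0.02, 0.1, 0.25, 0.5, 1.0]
resp = [0.021, 0.099, 0.252, 0.497, 1.003]

n = len(conc)
mx, my = sum(conc) / n, sum(resp) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, resp))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

# Residual standard deviation of the regression (n - 2 degrees of freedom)
resid = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
sigma = math.sqrt(sum(e * e for e in resid) / (n - 2))

lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantitation
```

By construction LOQ is about three times LOD, which is the usual relationship between the two reported figures of merit.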

  11. A validated HPTLC method for estimation of moxifloxacin hydrochloride in tablets.

    Science.gov (United States)

    Dhillon, Vandana; Chaudhary, Alok Kumar

    2010-10-01

    A simple HPTLC method having high accuracy, precision and reproducibility was developed for the routine estimation of moxifloxacin hydrochloride in the tablets available in the market and was validated for various parameters according to ICH guidelines. Moxifloxacin hydrochloride was estimated at 292 nm by densitometry using Silica gel 60 F254 as stationary phase and a premix of methylene chloride: methanol: strong ammonia solution and acetonitrile (10:10:5:10) as mobile phase. The method was found linear in a range of 9-54 nanograms with a correlation coefficient >0.99. The regression equation was: AUC = 65.57 × (Amount in nanograms) + 163 (r(2) = 0.9908).
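
In routine use, the reported regression equation is inverted to back-calculate the amount of drug from a measured densitometric AUC; using the coefficients given in the abstract:

```python
# Reported regression: AUC = 65.57 * (amount in ng) + 163
def amount_ng(auc, slope=65.57, intercept=163.0):
    """Back-calculate amount (ng) from a measured AUC."""
    return (auc - intercept) / slope

# e.g., a spot whose AUC corresponds to 30 ng on the calibration line:
x = amount_ng(65.57 * 30 + 163.0)
```

Amounts outside the validated 9-54 ng range would require dilution or re-spotting, since the linearity claim only covers that interval.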

  12. Lodenafil carbonate tablets: optimization and validation of a capillary zone electrophoresis method

    OpenAIRE

    Codevilla, Cristiane F; Ferreira, Pâmela Cristina L; Sangoi, Maximiliano S; Fröehlich, Pedro Eduardo; Bergold, Ana Maria

    2012-01-01

    A simple capillary zone electrophoresis (CZE) method was developed and validated for the analysis of lodenafil carbonate in tablets. Response surface methodology was used for optimization of the pH and concentration of the buffer, applied voltage and temperature. The method employed 50 mmol L-1 borate buffer at pH 10 as background electrolyte with an applied voltage of 15 kV. The separation was carried out in a fused-silica capillary maintained at 32.5 ºC and the detection wavelength was 214 ...

  13. Validation of the k0 standardization method in neutron activation analysis

    International Nuclear Information System (INIS)

    Kubesova, Marie

    2009-01-01

    The goal of this work was to validate the k0 standardization method in neutron activation analysis for use by the Nuclear Physics Institute's NAA Laboratory. The precision and accuracy of the method were examined by using two types of reference materials: one comprised a set of synthetic materials and served to check the implementation of k0 standardization; the other consisted of NIST SRMs covering various matrices. In general, good agreement was obtained between the results of this work and the certified values, giving evidence of the accuracy of our results. In addition, the limits were evaluated for 61 elements.

  14. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.
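
The single-ascending-series idea above, and its dependence on the stimulus increment, can be sketched as follows; the increment values mirror the 2 mA vs. 4 mA comparison in the abstract, while the participant model is hypothetical:

```python
# Sketch: raise the stimulus by a fixed increment until the reflex criterion
# is met; the threshold estimate is the first intensity that evokes the NFR.
def ascending_threshold(responds, start=0.0, step=2.0, max_ma=50.0):
    """responds(mA) -> bool; returns the first intensity evoking the reflex."""
    ma = start
    while ma <= max_ma:
        if responds(ma):
            return ma
        ma += step
    return None  # no reflex within the tested range

# Hypothetical participant whose true threshold is 17 mA:
thr_small_step = ascending_threshold(lambda ma: ma >= 17, step=2.0)
thr_large_step = ascending_threshold(lambda ma: ma >= 17, step=4.0)
```

Because the estimate always lands on the first grid point at or above the true threshold, larger increments bias it upward, consistent with the caution about comparing thresholds across studies that use different increments.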

  15. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Full Text Available Abstract Background Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (a pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable 5-step pre-validation PRO instrument methodology. Methods The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: (1) generation of a pool of items; (2) item de-duplication (three phases); (3) item reduction (two phases); (4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. the ICF); and (5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results The AGQ 'item pool' contained 725 items. Three de-duplication phases resulted in reductions of 91, 225 and 48 items respectively. The item reduction phases discarded 70 items and 208 items respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration ('think aloud' study) resulted in removal of a further 15 items and refinement to the wording of others. The resultant draft AGQ contained 68 items. Conclusions This study presents a novel methodology for developing a PRO instrument, based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument. 
By systematically accounting for all items dropped

  16. Validation of histamine determination Method in yoghurt using High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    M Jahedinia

    2014-02-01

    Full Text Available Biogenic amines are organic, basic nitrogenous compounds of low molecular weight that are mainly generated by the enzymatic decarboxylation of amino acids by microorganisms. Dairy products are among the foods with the highest amine content. A wide variety of methods and procedures for the determination of histamine and biogenic amines have been established. Among these, HPLC is considered the reference method. The aim of this study was to validate a reversed-phase HPLC method for the determination of histamine in yoghurt. The mobile phase consisted of acetonitrile/water (18:88, v/v) and the flow rate was set at 0.5 ml/min using isocratic HPLC. Detection was carried out at 254 nm using a UV detector. The calibration curve constructed using the peak areas of standards was linear, and the value of the correlation coefficient (r2) was estimated at 0.998. Good recoveries were observed for histamine at all spiking levels, and the average recovery was 84%. The RSD value from the repeatability test was found to be 4.4%. The limit of detection and limit of quantitation were 0.14 and 0.42 μg/ml, respectively. The results of the validation tests showed that the method is reliable and rapid for quantification of histamine in yoghurt.

  17. Validity of a Simple Method for Measuring Force-Velocity-Power Profile in Countermovement Jump.

    Science.gov (United States)

    Jiménez-Reyes, Pedro; Samozino, Pierre; Pareja-Blanco, Fernando; Conceição, Filipe; Cuadrado-Peñafiel, Víctor; González-Badillo, Juan José; Morin, Jean-Benoît

    2017-01-01

    To analyze the reliability and validity of a simple computation method to evaluate force (F), velocity (v), and power (P) output during a countermovement jump (CMJ) suitable for use in field conditions, and to verify the validity of this computation method for determining the CMJ force-velocity (F-v) profile (including unloaded and loaded jumps) in trained athletes. Sixteen high-level male sprinters and jumpers performed maximal CMJs under 6 different load conditions (0-87 kg). A force plate sampling at 1000 Hz was used to record vertical ground-reaction force and derive vertical-displacement data during CMJ trials. For each condition, mean F, v, and P of the push-off phase were determined from both force-plate data (reference method) and simple computation measures based on body mass, jump height (from flight time), and push-off distance, and used to establish the linear F-v relationship for each individual. Mean absolute bias values were 0.9% (± 1.6%), 4.7% (± 6.2%), 3.7% (± 4.8%), and 5% (± 6.8%) for F, v, P, and the slope of the F-v relationship (SFv), respectively. Both methods showed high correlations for F-v-profile-related variables (r = .985-.991). Finally, all variables computed from the simple method showed high reliability (ICC > .980); the simple method can thus be used in field conditions, provided body mass, push-off distance, and jump height are known.
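
A simple computation of this kind (commonly attributed to Samozino and colleagues; the exact equations are an assumption here, not taken from this abstract) derives mean push-off force, velocity, and power from body mass m, jump height h, and push-off distance h_po alone:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def fvp(m, h, h_po):
    """Assumed simple-method estimates for a CMJ push-off:
    mean force F = m*g*(h/h_po + 1), mean velocity v = sqrt(g*h/2), power P = F*v."""
    F = m * G * (h / h_po + 1.0)
    v = math.sqrt(G * h / 2.0)
    return F, v, F * v

# Hypothetical athlete: 75 kg, 0.40 m jump height, 0.35 m push-off distance
F, v, P = fvp(75.0, 0.40, 0.35)
```

Repeating the computation across loaded jumps gives (F, v) points from which the linear F-v relationship and its slope SFv are fitted, exactly as the reference force-plate method does with measured data.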

  18. Development and validation of HPLC analytical method for quantitative determination of metronidazole in human plasma

    International Nuclear Information System (INIS)

    Safdar, K.A.; Shyum, S.B.; Usman, S.

    2016-01-01

    The objective of the present study was to develop a simple, rapid and sensitive reversed-phase high performance liquid chromatographic (RP-HPLC) analytical method with a UV detection system for the quantitative determination of metronidazole in human plasma. The chromatographic separation was performed using a C18 RP column (250 mm x 4.6 mm, 5 μm) as stationary phase and 0.01 M potassium dihydrogen phosphate buffered at pH 3.0 and acetonitrile (83:17, v/v) as mobile phase, at a flow rate of 1.0 ml/min. The UV detection was carried out at 320 nm. The method was validated as per the US FDA guideline for bioanalytical method validation and was found to be selective, without interferences from mobile phase components, impurities or the biological matrix. The method was found to be linear over the concentration range of 0.2812 μg/ml to 18.0 μg/ml (r2 = 0.9987) with an adequate level of accuracy and precision. The samples were found to be stable under various recommended laboratory and storage conditions. Therefore, the method can be used with an adequate level of confidence and assurance for bioavailability, bioequivalence and other pharmacokinetic studies of metronidazole in humans. (author)

  19. Validation of the Analytical Method for the Determination of Flavonoids in Broccoli

    Directory of Open Access Journals (Sweden)

    Tuszyńska Magdalena

    2014-09-01

    Full Text Available A simple, accurate and selective HPLC method was developed and validated for the determination of quercetin and kaempferol, which are the main flavonols in broccoli. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of methanol/water (60/40) with 0.2% phosphoric acid, at a flow rate of 1.0 ml min-1. The detection was carried out with a DAD detector at 370 nm. This method was validated according to the requirements for new methods, which include selectivity, linearity, precision, accuracy, limit of detection and limit of quantitation. The current method demonstrates good linearity, with R2 > 0.99. The recovery is within 98.07-102.15% and 97.92-101.83% for quercetin and kaempferol, respectively. The method is selective, in that quercetin and kaempferol are well separated from other compounds of broccoli with good resolution. The low limits of detection and quantitation of quercetin and kaempferol enable the detection and quantitation of these flavonoids in broccoli at low concentrations.

  20. Validation and Recommendation of Methods to Measure Biogas Production Potential of Animal Manure

    Directory of Open Access Journals (Sweden)

    C. H. Pham

    2013-06-01

    Full Text Available In developing countries, biogas energy production is seen as a technology that can provide clean energy in poor regions and reduce pollution caused by animal manure. Laboratories in these countries have little access to advanced gas measuring equipment, which may limit research aimed at improving locally adapted biogas production. They may also be unable to produce valid estimates to an international standard that can be used for articles published in international peer-reviewed science journals. This study tested and validated methods for measuring total biogas and methane (CH4) production using batch fermentation and for characterizing the biomass. The biochemical methane potential (BMP; CH4 NL kg−1 VS) of pig manure, cow manure and cellulose determined with the Moller and VDI methods was not significantly different in this test (p>0.05). The biodegradability, expressed as the ratio of BMP to theoretical BMP (TBMP), was slightly higher using the Hansen method, but differences were not significant. Degradation rate, assessed by methane formation rate, showed wide variation within the batch methods tested. The first-order kinetic constant k for the cumulative methane production curve was highest when the two animal manures were fermented using the VDI 4630 method, indicating that this method was able to reach steady conditions in a shorter time, reducing fermentation duration. In precision tests, the repeatability relative standard deviation (RSDr) for all batch methods was very low (4.8 to 8.1%), while the reproducibility relative standard deviation (RSDR) varied widely, from 7.3 to 19.8%. In determination of biomethane concentration, the values obtained using the liquid replacement method (LRM) were comparable to those obtained using gas chromatography (GC). This indicates that the LRM could be used to determine biomethane concentration in biogas in laboratories with limited access to GC.
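
The first-order kinetic constant k mentioned above comes from the model B(t) = B0·(1 − exp(−k·t)) for cumulative methane production. With the ultimate yield B0 known, each observation yields a k estimate that can be averaged; a minimal sketch with hypothetical batch data:

```python
import math

# Hypothetical batch test: ultimate methane yield and cumulative measurements.
B0 = 300.0                      # NL CH4 per kg VS (assumed ultimate yield)
data = [(5, 118.0),             # (day, cumulative CH4 in NL kg^-1 VS)
        (10, 190.0),
        (20, 260.0),
        (30, 285.0)]

# Rearranging B(t) = B0*(1 - exp(-k*t)) gives k = -ln(1 - B/B0) / t per point.
ks = [-math.log(1.0 - b / B0) / t for t, b in data]
k = sum(ks) / len(ks)           # quick average-based estimate of k (per day)
```

A larger k means the cumulative curve approaches B0 sooner, which is how the abstract interprets the VDI 4630 method reaching steady conditions in a shorter time.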

  1. Inter-laboratory validation of an inexpensive streamlined method to measure inorganic arsenic in rice grain.

    Science.gov (United States)

    Chaney, Rufus L; Green, Carrie E; Lehotay, Steven J

    2018-05-04

    With the establishment by CODEX of a 200 ng/g limit on inorganic arsenic (iAs) in polished rice grain, more analyses of iAs will be necessary to ensure compliance in regulatory and trade applications, to assess quality control in commercial rice production, and to conduct research involving iAs in rice crops. Although analytical methods using high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) have been demonstrated for full speciation of As, this expensive and time-consuming approach is excessive when regulations are based only on iAs. We report a streamlined sample preparation and analysis of iAs in powdered rice based on heated extraction with 0.28 M HNO3 followed by hydride generation (HG) under control of acidity and other simple conditions. Analysis of iAs is then conducted using flow-injection HG and inexpensive ICP-atomic emission spectroscopy (AES) or other detection means. A key innovation compared with previous methods was to increase the acidity of the reagent solution with 4 M HCl (prior to reduction of As5+ to As3+), which minimized interferences from dimethylarsinic acid. An inter-laboratory method validation was conducted among 12 laboratories worldwide in the analysis of six shared blind duplicates and a NIST Standard Reference Material involving different types of rice and iAs levels. In addition, four laboratories used the standard HPLC-ICP-MS method to analyze the samples. The results between the methods were not significantly different, and the Horwitz ratio averaged 0.52 for the new method, which meets official method validation criteria. Thus, this simpler, more versatile, and less expensive method may be used by laboratories for several purposes to accurately determine iAs in rice grain. Graphical abstract: Comparison of iAs results from the new and FDA methods.
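
The Horwitz ratio (HorRat) cited above is the observed reproducibility RSD divided by the Horwitz-predicted RSD, PRSD = 2^(1 − 0.5·log10 C), with C expressed as a mass fraction; values up to about 2 are conventionally acceptable. A sketch (the RSD and concentration inputs are illustrative, not the study's data):

```python
import math

def horwitz_ratio(observed_rsd_percent, c_mass_fraction):
    """HorRat = observed among-lab RSD(%) / Horwitz-predicted RSD(%)."""
    prsd = 2.0 ** (1.0 - 0.5 * math.log10(c_mass_fraction))
    return observed_rsd_percent / prsd

# e.g., an 11.1% among-lab RSD at 150 ng/g iAs (C = 1.5e-7 mass fraction)
hr = horwitz_ratio(11.1, 1.5e-7)
```

Because PRSD grows as concentration falls, the same absolute scatter is judged more leniently at trace levels, which is why HorRat rather than raw RSD is used as the acceptance criterion.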

  2. Validity of a manual soft tissue profile prediction method following mandibular setback osteotomy.

    Science.gov (United States)

    Kolokitha, Olga-Elpis

    2007-10-01

    The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. To test the validity of the manual method the prediction tracings were compared to the actual post-operative tracings. The Dentofacial Planner software was used to develop the computerized post-surgical prediction tracings. Both manual and computerized prediction printouts were analyzed by using the cephalometric system PORDIOS. Statistical analysis was performed by means of t-test. Comparison between manual prediction tracings and the actual post-operative profile showed that the manual method results in more convex soft tissue profiles; the upper lip was found in a more prominent position, upper lip thickness was increased and, the mandible and lower lip were found in a less posterior position than that of the actual profiles. Comparison between computerized and manual prediction methods showed that in the manual method upper lip thickness was increased, the upper lip was found in a more anterior position and the lower anterior facial height was increased as compared to the computerized prediction method. Cephalometric simulation of post-operative soft tissue profile following orthodontic-surgical management of mandibular prognathism imposes certain limitations related to the methods implied. However, both manual and computerized prediction methods remain a useful tool for patient communication.

  3. A gradient-based method for segmenting FDG-PET images: methodology and validation

    International Nuclear Information System (INIS)

    Geets, Xavier; Lee, John A.; Gregoire, Vincent; Bol, Anne; Lonneux, Max

    2007-01-01

    A new gradient-based method for segmenting FDG-PET images is described and validated. The proposed method relies on the watershed transform and hierarchical cluster analysis. To allow a better estimation of the gradient intensity, iteratively reconstructed images were first denoised and deblurred with an edge-preserving filter and a constrained iterative deconvolution algorithm. Validation was first performed on computer-generated 3D phantoms containing spheres, then on a real cylindrical Lucite phantom containing spheres of different volumes ranging from 2.1 to 92.9 ml. Moreover, laryngeal tumours from seven patients were segmented on PET images acquired before laryngectomy by the gradient-based method and the thresholding method based on the source-to-background ratio developed by Daisne (Radiother Oncol 2003;69:247-50). For the spheres, the calculated volumes and radii were compared with the known values; for laryngeal tumours, the volumes were compared with the macroscopic specimens. Volume mismatches were also analysed. On computer-generated phantoms, the deconvolution algorithm decreased the mis-estimate of volumes and radii. For the Lucite phantom, the gradient-based method led to a slight underestimation of sphere volumes (by 10-20%), corresponding to negligible radius differences (0.5-1.1 mm); for laryngeal tumours, the segmented volumes by the gradient-based method agreed with those delineated on the macroscopic specimens, whereas the threshold-based method overestimated the true volume by 68% (p = 0.014). Lastly, macroscopic laryngeal specimens were totally encompassed by neither the threshold-based nor the gradient-based volumes. The gradient-based segmentation method applied on denoised and deblurred images proved to be more accurate than the source-to-background ratio method. (orig.)

  4. Using deuterated PAH amendments to validate chemical extraction methods to predict PAH bioavailability in soils

    International Nuclear Information System (INIS)

    Gomez-Eyles, Jose L.; Collins, Chris D.; Hodson, Mark E.

    2011-01-01

    Validating chemical methods to predict bioavailable fractions of polycyclic aromatic hydrocarbons (PAHs) by comparison with accumulation bioassays is problematic. Concentrations accumulated in soil organisms not only depend on the bioavailable fraction but also on contaminant properties. A historically contaminated soil was freshly spiked with deuterated PAHs (dPAHs). dPAHs have a similar fate to their respective undeuterated analogues, so chemical methods that give good indications of bioavailability should extract the fresh more readily available dPAHs and historic more recalcitrant PAHs in similar proportions to those in which they are accumulated in the tissues of test organisms. Cyclodextrin and butanol extractions predicted the bioavailable fraction for earthworms (Eisenia fetida) and plants (Lolium multiflorum) better than the exhaustive extraction. The PAHs accumulated by earthworms had a larger dPAH:PAH ratio than that predicted by chemical methods. The isotope ratio method described here provides an effective way of evaluating other chemical methods to predict bioavailability. - Research highlights: → Isotope ratios can be used to evaluate chemical methods to predict bioavailability. → Chemical methods predicted bioavailability better than exhaustive extractions. → Bioavailability to earthworms was still far from that predicted by chemical methods. - A novel method using isotope ratios to assess the ability of chemical methods to predict PAH bioavailability to soil biota.
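
The isotope-ratio evaluation described above reduces to comparing dPAH:PAH ratios: an extraction tracks bioavailability well if it recovers the fresh (deuterated) and aged fractions in the same proportion as the test organism accumulates them. A minimal sketch with hypothetical concentrations:

```python
# All concentrations hypothetical (mg/kg); d = freshly spiked deuterated PAH,
# undeuterated = historic, more recalcitrant PAH.
def d_ratio(d_pah, pah):
    """Ratio of fresh (deuterated) to aged (undeuterated) PAH recovered."""
    return d_pah / pah

earthworm    = d_ratio(4.2, 1.0)   # ratio accumulated in tissue
cyclodextrin = d_ratio(3.9, 1.1)   # ratio in a mild chemical extract
exhaustive   = d_ratio(1.2, 1.0)   # ratio in an exhaustive (total) extract

# The extraction whose ratio is closest to the tissue ratio best mimics
# what is actually bioavailable:
best = min([("cyclodextrin", cyclodextrin), ("exhaustive", exhaustive)],
           key=lambda kv: abs(kv[1] - earthworm))
```

In this toy example the mild extraction's ratio sits far closer to the tissue ratio than the exhaustive one, mirroring the paper's finding that cyclodextrin and butanol extractions outperform exhaustive extraction as bioavailability predictors.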

  5. Using deuterated PAH amendments to validate chemical extraction methods to predict PAH bioavailability in soils

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Eyles, Jose L., E-mail: j.l.gomezeyles@reading.ac.uk [University of Reading, School of Human and Environmental Sciences, Soil Research Centre, Reading, RG6 6DW Berkshire (United Kingdom); Collins, Chris D.; Hodson, Mark E. [University of Reading, School of Human and Environmental Sciences, Soil Research Centre, Reading, RG6 6DW Berkshire (United Kingdom)

    2011-04-15

    Validating chemical methods to predict bioavailable fractions of polycyclic aromatic hydrocarbons (PAHs) by comparison with accumulation bioassays is problematic. Concentrations accumulated in soil organisms not only depend on the bioavailable fraction but also on contaminant properties. A historically contaminated soil was freshly spiked with deuterated PAHs (dPAHs). dPAHs have a similar fate to their respective undeuterated analogues, so chemical methods that give good indications of bioavailability should extract the fresh more readily available dPAHs and historic more recalcitrant PAHs in similar proportions to those in which they are accumulated in the tissues of test organisms. Cyclodextrin and butanol extractions predicted the bioavailable fraction for earthworms (Eisenia fetida) and plants (Lolium multiflorum) better than the exhaustive extraction. The PAHs accumulated by earthworms had a larger dPAH:PAH ratio than that predicted by chemical methods. The isotope ratio method described here provides an effective way of evaluating other chemical methods to predict bioavailability. - Research highlights: > Isotope ratios can be used to evaluate chemical methods to predict bioavailability. > Chemical methods predicted bioavailability better than exhaustive extractions. > Bioavailability to earthworms was still far from that predicted by chemical methods. - A novel method using isotope ratios to assess the ability of chemical methods to predict PAH bioavailability to soil biota.

  6. The truncated Wigner method for Bose-condensed gases: limits of validity and applications

    International Nuclear Information System (INIS)

    Sinatra, Alice; Lobo, Carlos; Castin, Yvan

    2002-01-01

    We study the truncated Wigner method applied to a weakly interacting spinless Bose-condensed gas which is perturbed away from thermal equilibrium by a time-dependent external potential. The principle of the method is to generate an ensemble of classical fields ψ(r) which samples the Wigner quasi-distribution function of the initial thermal equilibrium density operator of the gas, and then to evolve each classical field with the Gross-Pitaevskii equation. In the first part of the paper we improve the sampling technique over our previous work (Sinatra et al 2000 J. Mod. Opt. 47 2629-44) and we test its accuracy against the exactly solvable model of the ideal Bose gas. In the second part of the paper we investigate the conditions of validity of the truncated Wigner method. For short evolution times it is known that the time-dependent Bogoliubov approximation is valid for almost pure condensates. The requirement that the truncated Wigner method reproduces the Bogoliubov prediction leads to the constraint that the number of field modes in the Wigner simulation must be smaller than the number of particles in the gas. For longer evolution times the nonlinear dynamics of the noncondensed modes of the field plays an important role. To demonstrate this we analyse the case of a three-dimensional spatially homogeneous Bose-condensed gas and we test the ability of the truncated Wigner method to correctly reproduce the Beliaev-Landau damping of an excitation of the condensate. We have identified the mechanism which limits the validity of the truncated Wigner method: the initial ensemble of classical fields, driven by the time-dependent Gross-Pitaevskii equation, thermalizes to a classical field distribution at a temperature T_class which is larger than the initial temperature T of the quantum gas. When T_class significantly exceeds T, a spurious damping is observed in the Wigner simulation. This leads to the second validity condition for the truncated Wigner method, T_class - T
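The basic machinery of the method, sampling an ensemble of classical fields with half a quantum of noise per mode and evolving each with the Gross-Pitaevskii equation, can be sketched in 1D. This is a schematic, not the paper's sampling of the full thermal Wigner distribution: the initial state here is a pure condensate mode plus vacuum noise, and the parameters are invented. The split-step integrator multiplies by pure phases, so it conserves the norm, which the sketch verifies.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, L, g, dt = 64, 1000, 10.0, 0.01, 1e-3   # modes, atoms, box, coupling, step
dx = L / M
k = 2 * np.pi * np.fft.fftfreq(M, d=dx)

def sample_field():
    """Condensate in k=0 plus half a quantum of Wigner noise per mode.

    Simplified stand-in for sampling the thermal Wigner distribution:
    each complex mode amplitude gets variance 1/2 (vacuum fluctuations).
    """
    a = (rng.normal(size=M) + 1j * rng.normal(size=M)) / 2.0
    a[0] += np.sqrt(N)
    # With sum_k |a_k|^2 = total atom number, psi(x) normalised so that
    # integral |psi|^2 dx equals that number:
    return np.fft.ifft(a) * M / np.sqrt(L)

def gpe_step(psi):
    """One split-step integration of the Gross-Pitaevskii equation."""
    phi = np.fft.fft(psi) * np.exp(-0.5j * k**2 * dt)   # kinetic half
    psi = np.fft.ifft(phi)
    return psi * np.exp(-1j * g * np.abs(psi)**2 * dt)  # nonlinear half

psi = sample_field()
n0 = np.sum(np.abs(psi)**2) * dx   # initial norm (atoms + noise quanta)
for _ in range(100):
    psi = gpe_step(psi)
n1 = np.sum(np.abs(psi)**2) * dx
```

Note that the sampled norm exceeds N by roughly M/2 noise quanta, which is exactly why the method requires the mode number M to stay well below the particle number N.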

  7. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    Science.gov (United States)

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  8. Clashing Validities in the Comparative Method? Balancing In-Depth Understanding and Generalizability in Small-N Policy Studies

    NARCIS (Netherlands)

    van der Heijden, J.

    2013-01-01

    The comparative method receives considerable attention in political science. To some a main advantage of the method is that it allows for both in-depth insights (internal validity), and generalizability beyond the cases studied (external validity). However, others consider internal and external

  9. [Method for evaluating the competence of specialists--the validation of 360-degree-questionnaire].

    Science.gov (United States)

    Nørgaard, Kirsten; Pedersen, Juri; Ravn, Lisbeth; Albrecht-Beste, Elisabeth; Holck, Kim; Fredløv, Maj; Møller, Lars Krag

    2010-04-19

    Assessment of physicians' performance focuses on the quality of their work. The aim of this study was to develop a valid, usable and acceptable multisource feedback assessment tool (MFAT) for hospital consultants. Statements were produced on consultant competencies within non-medical areas like collaboration, professionalism, communication, health promotion, academics and administration. The statements were validated by physicians and later by non-physician professionals after adjustments had been made. In a pilot test, a group of consultants was assessed using the final collection of statements of the MFAT. They received a report with their personal results and subsequently evaluated the assessment method. In total, 66 statements were developed and after validation they were reduced and reformulated to 35. Mean scores for relevance and "easy to understand" of the statements were in the range between "very high degree" and "high degree". In the pilot test, 18 consultants were assessed by themselves, by 141 other physicians and by 125 other professionals in the hospital. About two thirds greatly benefited from the assessment report and half identified areas for personal development. About a third did not want the head of their department to know the assessment results directly; however, two thirds found potential value in discussing the results with the head. We developed an MFAT for consultants with relevant and understandable statements. A pilot test confirmed that most of the consultants gained from the assessment, but some did not like to share their results with their heads. For these specialists, other methods should be used.

  10. An Anatomically Validated Brachial Plexus Contouring Method for Intensity Modulated Radiation Therapy Planning

    International Nuclear Information System (INIS)

    Van de Velde, Joris; Audenaert, Emmanuel; Speleers, Bruno; Vercauteren, Tom; Mulliez, Thomas; Vandemaele, Pieter; Achten, Eric; Kerckaert, Ingrid; D'Herde, Katharina; De Neve, Wilfried; Van Hoof, Tom

    2013-01-01

    Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity for the BP regions examined between the 3-dimensional reconstructions generated and the dissected counterparts. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection.

  11. The Validation of AAN Method Used by Rock Sample SRM 2780

    International Nuclear Information System (INIS)

    Rina Mulyaningsih, Th.

    2004-01-01

    The AAN method is a non-standard testing method. A testing laboratory must validate the methods it uses to ensure and confirm that they are suitable for their application. The analysis of SRM 2780 Hard rock mine waste with 9 replicates has been done to test the accuracy of the AAN method. The results showed that the elements As, Ba, Mn, V, Zn and Na had good accuracy when evaluated against the acceptance criteria for accuracy at a confidence level of 95%. The elements As, Co, Sc, Cr, Ba, Sb, Cs, Mn, V, Au, Zn and Na showed low relative bias between the analyst's value and the target value. Further testing must be done to assess the accuracy of the other certified elements. (author)

  12. Validated spectrofluorometric method for determination of gemfibrozil in self nanoemulsifying drug delivery systems (SNEDDS)

    Science.gov (United States)

    Sierra Villar, Ana M.; Calpena Campmany, Ana C.; Bellowa, Lyda Halbaut; Trenchs, Monserrat Aróztegui; Naveros, Beatriz Clares

    2013-09-01

    A spectrofluorometric method has been developed and validated for the determination of gemfibrozil. The method is based on the excitation and emission capacities of gemfibrozil, with excitation and emission wavelengths of 276 and 304 nm, respectively. This method allows the determination of the drug in a self-nanoemulsifying drug delivery system (SNEDDS) intended to improve its intestinal absorption. The results obtained showed linear relationships with good correlation coefficients (r2 > 0.999) and low limits of detection and quantification (LOD of 0.075 μg mL-1 and LOQ of 0.226 μg mL-1) in the range of 0.2-5 μg mL-1; the method likewise showed good robustness and stability. Thus, the amounts of gemfibrozil released from SNEDDS contained in gastro-resistant hard gelatine capsules were analysed, and release studies could be performed satisfactorily.
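Linearity, LOD and LOQ figures of the kind reported in such validations are typically derived from a calibration line. A generic sketch using the ICH-style 3.3σ/slope and 10σ/slope estimates follows; the calibration data are invented for illustration and are not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration points inside a 0.2-5 µg/mL working range.
conc = np.array([0.2, 0.5, 1.0, 2.0, 3.0, 5.0])       # µg/mL
signal = np.array([10.5, 26.0, 52.0, 103.0, 156.0, 259.0])  # fluorescence, a.u.

slope, intercept = np.polyfit(conc, signal, 1)
pred = slope * conc + intercept

# Residual standard deviation of the regression (n - 2 degrees of freedom).
residual_sd = np.sqrt(np.sum((signal - pred) ** 2) / (len(conc) - 2))

# ICH Q2-style detection/quantification limits from the calibration residuals.
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope

# Coefficient of determination for the linearity claim.
r2 = 1 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)
```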

  13. Validated Reverse Phase HPLC Method for the Determination of Impurities in Etoricoxib

    Directory of Open Access Journals (Sweden)

    S. Venugopal

    2011-01-01

    Full Text Available This paper describes the development of a reverse phase HPLC method for etoricoxib in the presence of impurities and degradation products generated from forced degradation studies. The drug substance was subjected to stress conditions of hydrolysis, oxidation, photolysis and thermal degradation. Degradation of etoricoxib was observed under basic and oxidative conditions. The drug was found to be stable under the other stress conditions studied. Successful separation of the drug from the process-related impurities and degradation products was achieved on a Zorbax SB CN (250 x 4.6 mm, 5 μm particle size) column using the reverse phase HPLC method. The isocratic method employed a mobile phase consisting of buffer and acetonitrile in a ratio of 60:40. Disodium hydrogen orthophosphate (0.02 M) was used as the buffer, with the pH adjusted to 7.20 with 1 N sodium hydroxide solution. The HPLC method was developed and validated with respect to linearity, accuracy, precision, specificity and ruggedness.

  14. Scope and limitations of the TEMPO/EPR method for singlet oxygen detection: the misleading role of electron transfer.

    Science.gov (United States)

    Nardi, Giacomo; Manet, Ilse; Monti, Sandra; Miranda, Miguel A; Lhiaubet-Vallet, Virginie

    2014-12-01

    For many biological and biomedical studies, it is essential to detect the production of (1)O2 and quantify its production yield. Among the available methods, detection of the characteristic 1270-nm phosphorescence of singlet oxygen by time-resolved near-infrared (TRNIR) emission constitutes the most direct and unambiguous approach. An alternative indirect method is electron paramagnetic resonance (EPR) in combination with a singlet oxygen probe. This is based on the detection of the TEMPO free radical formed after oxidation of TEMP (2,2,6,6-tetramethylpiperidine) by singlet oxygen. Although the TEMPO/EPR method has been widely employed, it can produce misleading data. This is demonstrated by the present study, in which the quantum yields of singlet oxygen formation obtained by TRNIR emission and by the TEMPO/EPR method are compared for a set of well-known photosensitizers. The results reveal that the TEMPO/EPR method leads to significant overestimation of singlet oxygen yield when the singlet or triplet excited state of the photosensitizer is efficiently quenched by TEMP, acting as electron donor. In such case, generation of the TEMP(+) radical cation, followed by deprotonation and reaction with molecular oxygen, gives rise to an EPR-detectable TEMPO signal that is not associated with singlet oxygen production. This knowledge is essential for an appropriate and error-free application of the TEMPO/EPR method in chemical, biological, and medical studies. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. The Dynamic Similitude Design Method of Thin Walled Structures and Experimental Validation

    Directory of Open Access Journals (Sweden)

    Zhong Luo

    2016-01-01

    Full Text Available For the applicability of dynamic similitude models of thin walled structures, such as engine blades, turbine discs, and cylindrical shells, the dynamic similitude design of typical thin walled structures is investigated. The governing equation of typical thin walled structures is first unified, which guides the establishment of dynamic scaling laws for typical thin walled structures. Based on the governing equation, the geometrically complete scaling law of the typical thin walled structure is derived. In order to determine accurate distorted scaling laws of typical thin walled structures, three principles are proposed and theoretically proved by combining sensitivity analysis with the governing equation. Taking the thin walled annular plate as an example, geometrically complete and distorted scaling laws can be obtained based on the principles for determining dynamic scaling laws. Furthermore, accurate distorted scaling laws for the first five orders of thin walled annular plates are presented and numerically validated. Finally, the effectiveness of the similitude design method is validated by experiments on annular plates.
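A geometrically complete scaling law of the kind derived here can be illustrated with the classical thin-plate frequency formula: for a model scaled by a factor λ in all dimensions and made of the same material, natural frequencies scale as 1/λ. The mode-shape constant and dimensions below are illustrative assumptions, not values from the paper.

```python
import math

def plate_frequency(h, a, E, rho, nu, k=10.2):
    """Thin-plate natural frequency: omega = (k / a^2) * sqrt(D / (rho * h)),
    with bending stiffness D = E h^3 / (12 (1 - nu^2)).
    k is a mode-shape constant (illustrative value for a clamped plate)."""
    D = E * h**3 / (12 * (1 - nu**2))
    return (k / a**2) * math.sqrt(D / (rho * h))

# Steel prototype plate and a geometrically complete 1:5 scale model.
proto = plate_frequency(h=0.005, a=0.5, E=210e9, rho=7850, nu=0.3)
model = plate_frequency(h=0.001, a=0.1, E=210e9, rho=7850, nu=0.3)

# Complete geometric scaling by lambda = 1/5 predicts model/proto = 5:
# omega ~ (h / a^2) * material constants, so scaling h and a together by
# lambda multiplies the frequency by 1/lambda.
ratio = model / proto
```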

  16. Method validation in plasma source optical emission spectroscopy (ICP-OES) - From samples to results

    International Nuclear Information System (INIS)

    Pilon, Fabien; Vielle, Karine; Birolleau, Jean-Claude; Vigneau, Olivier; Labet, Alexandre; Arnal, Nadege; Adam, Christelle; Camilleri, Virginie; Amiel, Jeanine; Granier, Guy; Faure, Joel; Arnaud, Regine; Beres, Andre; Blanchard, Jean-Marc; Boyer-Deslys, Valerie; Broudic, Veronique; Marques, Caroline; Augeray, Celine; Bellefleur, Alexandre; Bienvenu, Philippe; Delteil, Nicole; Boulet, Beatrice; Bourgarit, David; Brennetot, Rene; Fichet, Pascal; Celier, Magali; Chevillotte, Rene; Klelifa, Aline; Fuchs, Gilbert; Le Coq, Gilles; Mermet, Jean-Michel

    2017-01-01

    Even though ICP-OES (Inductively Coupled Plasma - Optical Emission Spectroscopy) is now a routine analysis technique, requirements for measuring processes impose complete control and mastery of the operating process and of the associated quality management system. The aim of this (collective) book is to guide the analyst through the entire measurement validation procedure and to help him guarantee the mastery of its different steps: administrative and physical management of samples in the laboratory, preparation and treatment of the samples before measurement, qualification and monitoring of the apparatus, instrument setting and calibration strategy, and exploitation of results in terms of accuracy, reliability and data covariance (with the practical determination of the accuracy profile). The most recent terminology is used in the book, and numerous examples and illustrations are given in order to facilitate understanding and to help with the elaboration of method validation documents.

  17. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Full Text Available Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic industrial example of a gates control system was employed to illustrate our strategies. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original systems. The experimental results demonstrate the effectiveness of our approach.
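The component-plus-monitor idea can be caricatured in a few lines: components with explicit states, a controller, and a requirement formalized as a monitor that is checked at every simulation step. This is a toy analogue for illustration only, not the BIP framework, and the gate/sensor scenario is an invented stand-in for the paper's gates control system.

```python
class Gate:
    """Component: a controllable gate with an explicit state."""
    def __init__(self):
        self.state = "closed"
    def open(self):
        self.state = "open"
    def close(self):
        self.state = "closed"

class Sensor:
    """Component: environment model reporting whether a vehicle is present."""
    def __init__(self):
        self.vehicle_present = False

class Monitor:
    """Requirement formalized as a monitor:
    the gate must never be open while a vehicle is present."""
    def check(self, gate, sensor):
        return not (sensor.vehicle_present and gate.state == "open")

def controller_step(gate, sensor):
    # Control logic under validation: close the gate on vehicle detection.
    if sensor.vehicle_present:
        gate.close()
    else:
        gate.open()

# Simulation loop: drive the environment, run the controller, check the monitor.
gate, sensor, monitor = Gate(), Sensor(), Monitor()
violations = 0
for present in [False, True, True, False, True, False]:
    sensor.vehicle_present = present
    controller_step(gate, sensor)
    if not monitor.check(gate, sensor):
        violations += 1
```

A buggy controller (e.g. one that opens the gate unconditionally) would be caught as a nonzero violation count, which is the same role the BIP monitors play during simulation.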

  18. A Method for Ship Collision Damage and Energy Absorption Analysis and its Validation

    DEFF Research Database (Denmark)

    Zhang, Shengming; Pedersen, Preben Terndrup

    2016-01-01

    For design evaluation there is a need for a method which is fast, practical and yet accurate enough to determine the absorbed energy and collision damage extent in ship collision analysis. The most well-known simplified empirical approach to collision analysis was made probably by Minorsky, and its limitation is also well recognized. The authors have previously developed simple expressions for the relation between the absorbed energy and the damaged material volume which take into account the structural arrangements, the material properties and the damage modes. The purpose of the present paper is to re-examine this method's validity and accuracy for ship collision damage analysis in ship design assessments by comprehensive validations with the experimental results from the public domain. Twenty experimental tests have been selected, analysed and compared with the results calculated using the proposed method. It can...

  19. A Method for Ship Collision Damage and Energy Absorption Analysis and its Validation

    DEFF Research Database (Denmark)

    Zhang, Shengming; Pedersen, Preben Terndrup

    2017-01-01

    For design evaluation, there is a need for a method which is fast, practical and yet accurate enough to determine the absorbed energy and collision damage extent in ship collision analysis. The most well-known simplified empirical approach to collision analysis was made probably by Minorsky, and its limitation is also well-recognised. The authors have previously developed simple expressions for the relation between the absorbed energy and the damaged material volume which take into account the structural arrangements, the material properties and the damage modes. The purpose of the present paper is to re-examine this method's validity and accuracy for ship collision damage analysis in ship design assessments by comprehensive validations with experimental results from the public domain. In total, 20 experimental tests have been selected, analysed and compared with the results calculated using...

  20. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
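A minimal reading of the proposed approach, counting how many probabilistic-sensitivity-analysis (PSA) outcomes fall inside the pre-established accuracy interval and forming a posterior over the probability of validity, can be sketched as follows. The PSA sample, observed value and Beta(1,1) prior are all invented for illustration; the paper's actual Bayesian construction may differ in detail.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical PSA: 1000 model runs for an outcome such as
# "number of patients on dialysis or with end-stage renal disease".
psa_outcomes = rng.normal(loc=105.0, scale=15.0, size=1000)

observed = 100.0            # empirical value the model is validated against
accuracy = 0.25             # pre-established accuracy interval: +/- 25%
lo, hi = observed * (1 - accuracy), observed * (1 + accuracy)

# Count PSA outcomes that fall inside the accuracy interval.
inside = int(np.sum((psa_outcomes >= lo) & (psa_outcomes <= hi)))
n = psa_outcomes.size

# Beta(1,1) prior on the probability that a model outcome is "valid";
# the Beta posterior summarises both model bias and parameter uncertainty.
posterior = stats.beta(1 + inside, 1 + n - inside)
expected_validity = posterior.mean()
interval_95 = posterior.interval(0.95)
```

A model that predicts the mean correctly but with very wide credible intervals spreads its PSA outcomes outside the accuracy interval, so it is penalised with a lower expected validity, which is the behaviour the abstract highlights.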

  1. USFDA-GUIDELINE BASED VALIDATION OF TESTING METHOD FOR RIFAMPICIN IN INDONESIAN SERUM SPECIMEN

    Directory of Open Access Journals (Sweden)

    Tri Joko Raharjo

    2010-06-01

    Full Text Available Regarding a new regulation from the Indonesian FDA (Badan POM-RI), all new non-patent drugs should show bioequivalence with the originator drug prior to registration. Bioequivalence testing (BE testing) has to be performed on people representative of the population to which the drug is to be administered. BE testing needs a valid bioanalytical method for the drug target and group of population concerned. This paper reports the specific validation of the bioanalysis of rifampicin in Indonesian serum specimens for use in BE testing. The extraction was performed using acetonitrile, while the chromatographic separation was accomplished on an RP 18 column (250 × 4.6 mm i.d., 5 µm) with a mobile phase composed of KH2PO4 10 mM-acetonitrile (40:60, v/v); UV detection was set at 333 nm. The method showed specificity compared to blank serum specimens, with a retention time for rifampicin of 2.1 min. The lower limit of quantification (LLOQ) was 0.06 µg/mL, with a dynamic range up to 20 µg/mL (R > 0.990). Precision of the method was very good, with coefficients of variation (CV) of 0.58, 7.40 and 5.56% at concentrations of 0.06, 5 and 15 µg/mL, respectively. Accuracies of the method were 3.22, 1.94 and 1.90% for concentrations of 0.06, 5 and 15 µg/mL, respectively. The average recoveries were 97.82, 95.50 and 97.31% for rifampicin concentrations of 1, 5 and 5 µg/mL, respectively. The method also showed reliable results in stability tests covering freeze-thaw, short-term and long-term stability, as well as post-preparation stability. The validation results showed that the method is ready to be used for rifampicin BE testing with Indonesian subjects.   Keywords: Rifampicin, Validation, USFDA-Guideline
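Precision (CV%) and accuracy (relative bias) figures of the kind reported above are computed from replicate quality-control measurements. A generic sketch with invented QC replicates, checked against the usual ±15% bioanalytical acceptance criteria from the USFDA guideline, follows; the numbers are not the paper's data.

```python
import statistics

# Hypothetical replicate measurements (µg/mL) at a 5 µg/mL QC level.
replicates = [4.9, 5.1, 5.0, 4.8, 5.2, 5.05]
nominal = 5.0

mean = statistics.mean(replicates)
cv_percent = 100 * statistics.stdev(replicates) / mean   # precision (CV%)
bias_percent = 100 * (mean - nominal) / nominal          # accuracy (relative bias)

# USFDA bioanalytical acceptance: CV and |bias| within 15%
# (20% at the LLOQ level).
acceptable = cv_percent < 15 and abs(bias_percent) < 15
```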

  2. Development and Validation of Spectrophotometric Methods for the Determination of Rasagiline in Pharmaceutical Preparations

    Directory of Open Access Journals (Sweden)

    Serife Evrim Kepekci Tekkeli

    2013-01-01

    Full Text Available This study presents three simple, rapid, and accurate spectrophotometric methods for the determination of Rasagiline (RSG) in pharmaceutical preparations. The determination procedures depend on the reaction of RSG with chloranilic acid for method A, tetrachloro-1,4-benzoquinone for method B, and 7,7,8,8-tetracyanoquinodimethane for method C. The colored products were quantitated spectrophotometrically at 524, 535, and 843 nm for methods A, B, and C, respectively. Different variables affecting the reaction were optimized. Linearity ranges of the methods, with good correlation coefficients (0.9988–0.9996), were 25–300 µg mL−1, 25–350 µg mL−1, and 50–500 µg mL−1 for methods A, B, and C, respectively. The formation of the products takes place through different mechanisms. The sites of interaction were confirmed by elemental analysis and by IR and 1H-NMR spectroscopy. The validation of the methods was carried out in terms of specificity, linearity, accuracy, precision, robustness, limit of detection, and limit of quantitation. No interference was observed from concomitants usually present in dosage forms. The methods were applied successfully to the determination of RSG in pharmaceutical preparations.

  3. Validated Spectrophotometric Methods for Simultaneous Determination of Food Colorants and Sweeteners

    Directory of Open Access Journals (Sweden)

    Fatma Turak

    2013-01-01

    Full Text Available Two simple spectrophotometric methods have been proposed for the simultaneous determination of two colorants (Indigotin and Brilliant Blue) and two sweeteners (Acesulfame-K and Aspartame) in synthetic mixtures and chewing gums without any prior separation or purification. The first method, derivative spectrophotometry (ZCDS), is based on recording the first-derivative curves (for Indigotin, Brilliant Blue, and Acesulfame-K) and the third-derivative curve (for Aspartame) and determining each component using the zero-crossing technique. The other method, ratio derivative spectrophotometry (RDS), depends on the application of ratio spectra of first- and third-derivative spectrophotometry to resolve the interference due to spectral overlapping. Both colorants and sweeteners showed good linearity, with regression coefficients of 0.9992–0.9999. The LOD and LOQ values ranged from 0.05 to 0.33 μg mL−1 and from 0.06 to 0.47 μg mL−1, respectively. The intraday and interday precision tests produced good RSD% values (<0.81%); recoveries ranged from 99.78% to 100.67% for both methods. The accuracy and precision of the methods have been determined, and the methods have been validated by analyzing synthetic mixtures containing the colorants and sweeteners. Both methods were applied to the above combination, and satisfactory results were obtained. The results obtained by applying the ZCDS method were statistically compared with those obtained by the RDS method.
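The zero-crossing technique rests on the fact that at a wavelength where one component's derivative spectrum crosses zero, the measured derivative amplitude depends only on the other component. A synthetic two-band illustration follows; the Gaussian band shapes, positions and concentrations are invented, not the actual spectra of these analytes.

```python
import numpy as np

wl = np.linspace(200, 320, 601)   # wavelength grid, nm

def band(center, width, conc):
    """Hypothetical Gaussian absorbance band (Beer-Lambert: linear in conc)."""
    return conc * np.exp(-((wl - center) / width) ** 2)

def first_derivative(spectrum):
    return np.gradient(spectrum, wl)

# Component B peaks at 260 nm, so its first derivative crosses zero there.
zc = np.argmin(np.abs(wl - 260.0))

# Mixtures with varying component A (band at 230 nm) and fixed component B:
# the derivative amplitude read at B's zero-crossing tracks A linearly.
amps = []
for c_a in [1.0, 2.0, 3.0]:
    mix = band(230, 15, c_a) + band(260, 15, 0.8)
    amps.append(first_derivative(mix)[zc])
```

Because B contributes (essentially) nothing at its own zero-crossing, the three amplitudes stand in the ratio 1:2:3, i.e. the calibration is linear in A regardless of B, which is exactly what makes the technique work for overlapped spectra.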

  4. Validation of different spectrophotometric methods for determination of vildagliptin and metformin in binary mixture

    Science.gov (United States)

    Abdel-Ghany, Maha F.; Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    New, simple, specific, accurate, precise and reproducible spectrophotometric methods have been developed and subsequently validated for the determination of vildagliptin (VLG) and metformin (MET) in binary mixture. A zero-order spectrophotometric method was the first method, used for determination of MET in the range of 2-12 μg mL-1 by measuring the absorbance at 237.6 nm. The second method was a derivative spectrophotometric technique, utilized for determination of MET at 247.4 nm in the range of 1-12 μg mL-1. A derivative ratio spectrophotometric method was the third technique, used for determination of VLG in the range of 4-24 μg mL-1 at 265.8 nm. The fourth and fifth methods, adopted for determination of VLG in the range of 4-24 μg mL-1, were ratio subtraction and mean centering spectrophotometric methods, respectively. All results were statistically compared with those of the reported methods using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to the analysis of the investigated drugs and proved to be specific and accurate for their quality control in pharmaceutical dosage forms.

  5. Validated stability-indicating spectrofluorimetric methods for the determination of ebastine in pharmaceutical preparations

    Directory of Open Access Journals (Sweden)

    Eid Manal

    2011-03-01

    Full Text Available Two sensitive, selective, economic, and validated spectrofluorimetric methods were developed for the determination of ebastine (EBS) in pharmaceutical preparations, depending on the reaction with its tertiary amino group. Method I involves condensation of the drug with mixed anhydrides (citric and acetic anhydrides), producing a product with intense fluorescence, which was measured at 496 nm after excitation at 388 nm. Method IIA describes quantitative fluorescence quenching of eosin upon addition of the studied drug, where the decrease in fluorescence intensity was directly proportional to the concentration of ebastine; the fluorescence quenching was measured at 553 nm after excitation at 457 nm. This method was extended (Method IIB) to apply first- and second-derivative synchronous spectrofluorimetric methods (FDSFS & SDSFS) for the simultaneous analysis of EBS in the presence of its alkaline, acidic, and UV degradation products. The proposed methods were successfully applied to the determination of the studied compound in its dosage forms. The results obtained were in good agreement with those obtained by a comparison method. Both methods were utilized to investigate the kinetics of the degradation of the drug.

  6. Development and validation of NIR-chemometric methods for chemical and pharmaceutical characterization of meloxicam tablets.

    Science.gov (United States)

    Tomuta, Ioan; Iovanov, Rares; Bodoki, Ede; Vonica, Loredana

    2014-04-01

    Near-Infrared (NIR) spectroscopy is an important component of a Process Analytical Technology (PAT) toolbox and is a key technology for enabling the rapid analysis of pharmaceutical tablets. The aim of this research work was to develop and validate NIR-chemometric methods not only for the determination of active pharmaceutical ingredient content but also of pharmaceutical properties (crushing strength, disintegration time) of meloxicam tablets. The development of the method for active content assay was performed on samples corresponding to 80%, 90%, 100%, 110% and 120% of meloxicam content, and the development of the methods for pharmaceutical characterization was performed on samples prepared at seven different compression forces (ranging from 7 to 45 kN), using NIR transmission spectra of intact tablets and PLS as a regression method. The results show that the developed methods have good trueness, precision and accuracy and are appropriate for direct active content assay in tablets (ranging from 12 to 18 mg/tablet) and also for predicting the crushing strength and disintegration time of intact meloxicam tablets. The comparative data show that the proposed methods are in good agreement with the reference methods currently used for the characterization of meloxicam tablets (HPLC-UV methods for the assay and European Pharmacopeia methods for determining the crushing strength and disintegration time). The results show the possibility of predicting both chemical properties (active content) and physical/pharmaceutical properties (crushing strength and disintegration time) directly, without any sample preparation, from the same NIR transmission spectrum of meloxicam tablets.
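PLS calibration on tablet spectra can be sketched with a single-component, NIPALS-style fit on simulated data; production chemometrics would use a full PLS implementation with several latent variables and cross-validation. The band shape, noise level and 12-18 mg content range below are invented stand-ins for real NIR transmission spectra.

```python
import numpy as np

rng = np.random.default_rng(3)
wl = np.linspace(800, 2500, 200)                 # hypothetical NIR grid, nm
pure = np.exp(-((wl - 1650) / 120.0) ** 2)       # hypothetical API band

# Simulated calibration tablets: 80-120% of a 15 mg label claim.
content = rng.uniform(12.0, 18.0, size=40)       # mg/tablet (reference values)
spectra = content[:, None] * pure[None, :] + 0.01 * rng.normal(size=(40, 200))

# One-component PLS (NIPALS-style) on mean-centred data.
Xc = spectra - spectra.mean(axis=0)
yc = content - content.mean()
w = Xc.T @ yc                                    # weight vector ~ covariance
w /= np.linalg.norm(w)
t = Xc @ w                                       # scores
b = (t @ yc) / (t @ t)                           # regression on scores

pred = content.mean() + b * t                    # predicted content, mg/tablet

# Calibration quality (R^2), the usual figure of merit for such models.
r2 = 1 - np.sum((content - pred) ** 2) / np.sum(yc ** 2)
```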

  7. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach

    Science.gov (United States)

    Petersen, D.; Naveed, P.; Ragheb, A.; Niedieker, D.; El-Mashtoly, S. F.; Brechmann, T.; Kötting, C.; Schmiegel, W. H.; Freier, E.; Pox, C.; Gerwert, K.

    2017-06-01

    Endoscopy plays a major role in the early recognition of cancer that is not externally accessible, and thereby in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant: instantaneous and precise differentiation between adenomas (precursor lesions for cancer) and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out in a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allowed us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of sample misplacement and spectral dilution of marker bands (especially in cancerous tissue), and thereby optimize the experimental design. Furthermore, other validation schemes, namely leave-one-sample-out and leave-one-spectrum-out cross-validation, were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally, the applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.
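
    The leave-one-patient-out scheme described above keeps all spectra from one patient together, so no patient contributes to both training and test data. A minimal, library-free sketch of such a group-wise split (the patient IDs are illustrative):

```python
def leave_one_group_out(groups):
    """Yield (train_idx, test_idx) pairs in which each unique group
    (e.g. a patient ID) is held out once. Samples from the same patient
    never appear in both the training and the test set."""
    for g in sorted(set(groups)):
        test = [i for i, gi in enumerate(groups) if gi == g]
        train = [i for i, gi in enumerate(groups) if gi != g]
        yield train, test

# One fold per patient, regardless of how many spectra each patient has:
patient_ids = ["p1", "p1", "p2", "p3", "p3", "p3"]
folds = list(leave_one_group_out(patient_ids))
```

    Leave-one-spectrum-out, by contrast, would let spectra from the same biopsy leak between training and test folds, which tends to give optimistically biased accuracy.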

  9. Validation of a food frequency questionnaire to determine vitamin D intakes using the method of triads.

    Science.gov (United States)

    Weir, R R; Carson, E L; Mulhern, M S; Laird, E; Healy, M; Pourshahidi, L K

    2016-04-01

    Dietary sources of vitamin D (both natural and fortified) are increasingly contributing to consumers' vitamin D intake and status. Therefore, the present study aimed to validate a vitamin D food frequency questionnaire (FFQ) for the assessment of habitual vitamin D intake. A total of 49 apparently healthy consenting adults (aged 18-64 years) from the local community were sampled at the end of winter. Dietary intakes were recorded using a 4-day weighed food record (4d-WFR) and a 17-item FFQ based on foods known to contribute to dietary vitamin D intake. Fasting vitamin D status was quantified by serum 25-hydroxyvitamin D [25(OH)D] using liquid chromatography tandem mass spectrometry. The method of triads was applied using these three measurements to determine the overall validity of the FFQ. Vitamin D intakes from 4d-WFR ranged between 0.42 and 31.65 μg day(-1), whereas intakes determined from the FFQ ranged from 1.03 to 36.08 μg day(-1). Serum 25(OH)D concentrations ranged between 12.89 and 279.00 nmol L(-1). The mean (SD) difference between the FFQ and 4d-WFR was +1.62 (3.86). There were strong correlations between the vitamin D intake estimated by the FFQ and that from the 4d-WFR (r = 0.562) and also with serum 25(OH)D concentrations (r = 0.567). Vitamin D intake estimated from the 4d-WFR was also strongly correlated with serum 25(OH)D concentrations (r = 0.411). The overall validity coefficient calculated using the method of triads was high (0.881). The vitamin D FFQ has been validated for use in future studies aiming to assess habitual vitamin D intake. © 2015 The British Dietetic Association Ltd.
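
    The method of triads estimates each instrument's validity coefficient against the unobserved "true" intake from the three pairwise correlations. A short sketch using the correlations reported above (Q = FFQ, F = 4d-WFR, B = biomarker) reproduces the 0.881 figure:

```python
from math import sqrt

def triad_validity(r_qf, r_qb, r_fb):
    """Method-of-triads validity coefficients for a questionnaire (Q),
    a food record (F) and a biomarker (B) against true intake, from the
    three pairwise correlation coefficients."""
    rho_q = sqrt(r_qf * r_qb / r_fb)
    rho_f = sqrt(r_qf * r_fb / r_qb)
    rho_b = sqrt(r_qb * r_fb / r_qf)
    return rho_q, rho_f, rho_b

# Pairwise correlations reported in the abstract above:
rho_q, rho_f, rho_b = triad_validity(r_qf=0.562, r_qb=0.567, r_fb=0.411)
# rho_q ≈ 0.881, matching the reported overall validity coefficient
```

    The formula assumes the three instruments have independent errors; correlated errors between the FFQ and the food record would inflate the coefficient.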

  10. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the design of development and validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis closely follow those used for the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts across botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
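
    The POI statistic itself is simply the proportion of replicates returning "Identified". A sketch that pairs it with a Wilson score interval (a common choice for binomial proportions; the report may well use a different interval):

```python
from math import sqrt

def poi_with_wilson_ci(identified, replicates, z=1.96):
    """Probability of identification (POI) = identified / replicates,
    with an approximate 95% Wilson score confidence interval."""
    p = identified / replicates
    denom = 1.0 + z ** 2 / replicates
    centre = (p + z ** 2 / (2 * replicates)) / denom
    half = z * sqrt(p * (1 - p) / replicates
                    + z ** 2 / (4 * replicates ** 2)) / denom
    return p, max(0.0, centre - half), min(1.0, centre + half)
```

    Plotting POI (with its interval) against the nontarget fraction gives the response curve the model describes.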

  11. Method validation using weighted linear regression models for quantification of UV filters in water samples.

    Science.gov (United States)

    da Silva, Claudia Pereira; Emídio, Elissandro Soares; de Marchi, Mary Rosa Rodrigues

    2015-01-01

    This paper describes the validation of a method consisting of solid-phase extraction followed by gas chromatography-tandem mass spectrometry for the analysis of the ultraviolet (UV) filters benzophenone-3, ethylhexyl salicylate, ethylhexyl methoxycinnamate and octocrylene. The method validation criteria included evaluation of selectivity, analytical curve, trueness, precision, limits of detection and limits of quantification. The non-weighted linear regression model has traditionally been used for calibration, but it is not necessarily the optimal model in all cases. Because the assumption of homoscedasticity was not met for the analytical data in this work, a weighted least squares linear regression was used for the calibration method. The evaluated analytical parameters were satisfactory for the analytes and showed recoveries at four fortification levels between 62% and 107%, with relative standard deviations less than 14%. The detection limits ranged from 7.6 to 24.1 ng L(-1). The proposed method was used to determine the amount of UV filters in water samples from water treatment plants in Araraquara and Jau in São Paulo, Brazil. Copyright © 2014 Elsevier B.V. All rights reserved.
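
    Because the calibration data above were heteroscedastic, a weighted fit was used instead of ordinary least squares. A minimal numpy sketch of a weighted straight-line calibration (the 1/x² weighting shown is one common choice, not necessarily the authors'):

```python
import numpy as np

def wls_line(x, y, weights):
    """Weighted least-squares fit of y = a + b*x. Weights are typically
    1/x**2 or 1/variance when calibration data are heteroscedastic
    (variance grows with concentration)."""
    x, y, w = (np.asarray(v, dtype=float) for v in (x, y, weights))
    W = np.diag(w)
    X = np.column_stack([np.ones_like(x), x])
    a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return a, b
```

    With heteroscedastic data, OLS lets the high-concentration points dominate the fit; down-weighting them improves accuracy near the limit of quantification, which matters for ng L(-1) level analytes like these.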

  12. Development and validation of a thin-layer chromatography method for stability studies of naproxen

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Rodriguez Borges, Tania

    2011-01-01

    The validation of an analytical method to be applied in stability studies of future formulations of naproxen suppositories for infant and adult use was carried out. The factors that most influenced naproxen stability were determined; the greatest degradation occurred in oxidizing acid medium and under the action of light. The possible formation of esters between the free carboxyl group of naproxen and the glyceryl monostearate present in the base was identified as one of the degradation paths in the new formulation. A thin-layer chromatography method was developed and the best chromatographic conditions were selected: GF 254 silica gel plates with ultraviolet detection at 254 nm. Three solvent systems were evaluated, of which system A, composed of glacial acetic acid:tetrahydrofuran:toluene (3:9:90 v/v/v), allowed adequate resolution between the analyte and the possible degradation products, with a detection limit of 1 μg. The use of the suggested method was restricted to the qualitative identification of possible degradation products and not intended as a final test. The method proved to be sensitive and selective enough to be applied for the stated objective, and the validation results were satisfactory.

  13. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    Directory of Open Access Journals (Sweden)

    Sofia Ahmed

    2015-01-01

    Full Text Available The present study was carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents in which the drug is soluble were selected. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10−4 M (2.62 mg%) were prepared for the assay. The method has been validated according to the guideline of the International Conference on Harmonization, and parameters such as linearity, range, accuracy, precision, sensitivity, and robustness have been studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of the statistical data 1-octanol, followed by ethanol and methanol, was found to be comparatively better than the other studied solvents. No change in the stability of the TA stock solutions was observed in any solvent for 24 hours, stored either at room (25 ± 1 °C) or refrigerated (2–8 °C) temperature. A shift in the absorption maxima was observed for TA in the various solvents, indicating drug–solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents.

  14. Validation of Field Methods to Assess Body Fat Percentage in Elite Youth Soccer Players.

    Science.gov (United States)

    Munguia-Izquierdo, Diego; Suarez-Arrones, Luis; Di Salvo, Valter; Paredes-Hernandez, Victor; Alcazar, Julian; Ara, Ignacio; Kreider, Richard; Mendez-Villanueva, Alberto

    2018-05-01

    This study determined the most effective field method for quantifying body fat percentage in male elite youth soccer players and developed prediction equations based on anthropometric variables. Forty-four male elite-standard youth soccer players aged 16.3-18.0 years underwent body fat percentage assessments, including bioelectrical impedance analysis and the calculation of various skinfold-based prediction equations. Dual-energy X-ray absorptiometry provided the criterion measure of body fat percentage. Correlation coefficients, bias, limits of agreement, and differences were used as validity measures, and regression analyses were used to develop soccer-specific prediction equations. The equations from Sarria et al. (1998) and Durnin & Rahaman (1967) reached very large correlations and the lowest biases, and they exceeded neither the practically worthwhile difference nor the substantial difference between methods. The new youth soccer-specific skinfold equation included a combination of triceps and supraspinale skinfolds. None of the practical methods compared in this study are adequate for estimating body fat percentage in male elite youth soccer players, except for the equations from Sarria et al. (1998) and Durnin & Rahaman (1967). The new youth soccer-specific equation calculated in this investigation is the only field method specifically developed and validated in elite male players, and it shows potentially good predictive power. © Georg Thieme Verlag KG Stuttgart · New York.
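
    The bias and limits of agreement used as validity measures above come from a Bland-Altman-style comparison of each field method against the criterion (DXA) measure. A short sketch (the values below are placeholders, not the study's data):

```python
import statistics

def bland_altman(method, criterion):
    """Bias (mean difference) and approximate 95% limits of agreement
    between a field method and a criterion measure such as DXA."""
    diffs = [m - c for m, c in zip(method, criterion)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical body-fat % readings for four players:
bias, lo, hi = bland_altman([10.0, 12.0, 14.0, 16.0],
                            [9.0, 11.0, 15.0, 15.0])
```

    A method can correlate strongly with the criterion yet still show wide limits of agreement, which is why both statistics are reported.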

  15. Development and validation of Ketorolac Tromethamine in eye drop formulation by RP-HPLC method

    Directory of Open Access Journals (Sweden)

    G. Sunil

    2017-02-01

    Full Text Available A simple, precise and accurate method was developed and validated for the analysis of Ketorolac Tromethamine in an eye drop formulation. An isocratic HPLC analysis was performed on a Kromosil C18 column (150 mm × 4.6 mm, 5 μm). The compound was eluted with a mixture of methanol and ammonium dihydrogen phosphate buffer in the ratio of 55:45 v/v (pH 3.0 adjusted with o-phosphoric acid) as the mobile phase at a flow rate of 1.5 mL min−1. UV detection was performed at 314 nm using photodiode array detection. The retention time was found to be 6.01 min. The system suitability parameters, such as theoretical plate count, tailing and percentage RSD between six standard injections, were within limits. The method was validated according to ICH guidelines. Calibration was linear over the concentration range of 50–150 μg mL−1, as indicated by a correlation coefficient (r) of 0.999. The robustness of the method was evaluated by deliberately altering the chromatographic conditions. The developed method is applicable for routine quantitative analysis.

  16. A Validated HPLC-DAD Method for Simultaneous Determination of Etodolac and Pantoprazole in Rat Plasma

    Directory of Open Access Journals (Sweden)

    Ali S. Abdelhameed

    2014-01-01

    Full Text Available A simple, sensitive, and accurate HPLC-DAD method has been developed and validated for the simultaneous determination of pantoprazole and etodolac in rat plasma as a tool for therapeutic drug monitoring. Optimal chromatographic separation of the analytes was achieved on a Waters Symmetry C18 column using a mobile phase that consisted of phosphate buffer pH ~4.0 as eluent A and acetonitrile as eluent B in a ratio of A:B 55:45 v/v for 6 min, pumped isocratically at a flow rate of 0.8 mL min−1. The eluted analytes were monitored using a photodiode array detector set to quantify samples at 254 nm. The method was linear, with r² = 0.9999 for PTZ and r² = 0.9995 for ETD over concentration ranges of 0.1–15 and 5–50 μg mL−1 for PTZ and ETD, respectively. The limits of detection were found to be 0.033 and 0.918 μg mL−1 for PTZ and ETD, respectively. The method was statistically validated for linearity, accuracy, precision, and selectivity following the International Conference on Harmonisation (ICH) guidelines. The reproducibility of the method was reliable, with intra- and interday precision (% RSD) <7.76% for PTZ and <7.58% for ETD.
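
    Detection limits such as those reported above are often derived per ICH Q2(R1) from the standard deviation of the response and the slope of the calibration line (LOD = 3.3σ/S, LOQ = 10σ/S). The abstract does not state which approach was used here, so the σ and slope values below are purely illustrative:

```python
def ich_limits(sigma, slope):
    """ICH Q2(R1)-style detection and quantification limits from the
    residual standard deviation (sigma) and slope (S) of a calibration:
    LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative values only (not taken from the study):
lod, loq = ich_limits(sigma=0.01, slope=1.0)
```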

  17. Bridging the Gap Between Validation and Implementation of Non-Animal Veterinary Vaccine Potency Testing Methods

    Science.gov (United States)

    Dozier, Samantha; Brown, Jeffrey; Currie, Alistair

    2011-01-01

    Simple Summary Many vaccines are tested for quality in experiments that require the use of large numbers of animals in procedures that often cause significant pain and distress. Newer technologies have fostered the development of vaccine quality control tests that reduce or eliminate the use of animals, but the availability of these newer methods has not guaranteed their acceptance by regulators or use by manufacturers. We discuss a strategic approach that has been used to assess and ultimately increase the use of non-animal vaccine quality tests in the U.S. and U.K. Abstract In recent years, technologically advanced high-throughput techniques have been developed that replace, reduce or refine animal use in vaccine quality control tests. Following validation, these tests are slowly being accepted for use by international regulatory authorities. Because regulatory acceptance itself has not guaranteed that approved humane methods are adopted by manufacturers, various organizations have sought to foster the preferential use of validated non-animal methods by interfacing with industry and regulatory authorities. After noticing this gap between regulation and uptake by industry, we began developing a paradigm that seeks to narrow the gap and quicken implementation of new replacement, refinement or reduction guidance. A systematic analysis of our experience in promoting the transparent implementation of validated non-animal vaccine potency assays has led to the refinement of our paradigmatic process, presented here, by which interested parties can assess the local regulatory acceptance of methods that reduce animal use and integrate them into quality control testing protocols, or ensure the elimination of peripheral barriers to their use, particularly for potency and other tests carried out on production batches. PMID:26486625

  18. Stability indicating method development and validation of assay method for the estimation of rizatriptan benzoate in tablet

    Directory of Open Access Journals (Sweden)

    Chandrashekhar K. Gadewar

    2017-05-01

    Full Text Available A simple, sensitive, precise and specific high performance liquid chromatography method was developed and validated for the determination of rizatriptan in rizatriptan benzoate tablets. The separation was carried out using a mobile phase consisting of acetonitrile:pH 3.4 phosphate buffer in a ratio of 20:80. The column used was a Zorbax SB CN, 250 mm × 4.6 mm, 5 μm, with a flow rate of 1 ml/min and UV detection at 225 nm. The retention times of rizatriptan and benzoic acid were found to be 4.751 and 8.348 min, respectively. A forced degradation study of rizatriptan benzoate in its tablet form was conducted under conditions of hydrolysis, oxidation, thermal stress and photolysis. Rizatriptan was found to be stable in basic buffer, while in acidic buffer it degraded (water bath at 60 °C for 15 min). The detector response of rizatriptan is directly proportional to concentration over the range of 30% to 160% of the test concentration, i.e. 15.032 to 80.172 mcg/ml. Results of the analysis were validated statistically and by recovery studies (mean recovery = 99.44%). The results of the study showed that the proposed method is simple, rapid, precise and accurate, and is useful for the routine determination of rizatriptan in pharmaceutical dosage forms.

  19. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    International Nuclear Information System (INIS)

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3%, and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg, and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed.

  1. Risk-based criteria to support validation of detection methods for drinking water and air.

    Energy Technology Data Exchange (ETDEWEB)

    MacDonell, M.; Bhattacharyya, M.; Finster, M.; Williams, M.; Picel, K.; Chang, Y.-S.; Peterson, J.; Adeshina, F.; Sonich-Mullin, C.; Environmental Science Division; EPA

    2009-02-18

    This report was prepared to support the validation of analytical methods for threat contaminants under the U.S. Environmental Protection Agency (EPA) National Homeland Security Research Center (NHSRC) program. It is designed to serve as a resource for certain applications of benchmark and fate information for homeland security threat contaminants. The report identifies risk-based criteria from existing health benchmarks for drinking water and air for potential use as validation targets. The focus is on benchmarks for chronic public exposures. The priority sources are standard EPA concentration limits for drinking water and air, along with oral and inhalation toxicity values. Many contaminants identified as homeland security threats to drinking water or air would convert to other chemicals within minutes to hours of being released. For this reason, a fate analysis has been performed to identify potential transformation products and removal half-lives in air and water so appropriate forms can be targeted for detection over time. The risk-based criteria presented in this report to frame method validation are expected to be lower than actual operational targets based on realistic exposures following a release. Note that many target criteria provided in this report are taken from available benchmarks without assessing the underlying toxicological details. That is, although the relevance of the chemical form and analogues are evaluated, the toxicological interpretations and extrapolations conducted by the authoring organizations are not. It is also important to emphasize that such targets in the current analysis are not health-based advisory levels to guide homeland security responses. This integrated evaluation of chronic public benchmarks and contaminant fate has identified more than 200 risk-based criteria as method validation targets across numerous contaminants and fate products in drinking water and air combined. The gap in directly applicable values is

  2. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Performance of the Tariff Method: validation of a simple additive algorithm for analysis of verbal autopsies

    Directory of Open Access Journals (Sweden)

    Murray Christopher JL

    2011-08-01

    Full Text Available Abstract Background Verbal autopsies provide valuable information for studying mortality patterns in populations that lack reliable vital registration data. Methods for transforming verbal autopsy results into meaningful information for health workers and policymakers, however, are often costly or complicated to use. We present a simple additive algorithm, the Tariff Method (termed Tariff), which can be used for assigning individual cause of death and for determining cause-specific mortality fractions (CSMFs) from verbal autopsy data. Methods Tariff calculates a score, or "tariff," for each cause, for each sign/symptom, across a pool of validated verbal autopsy data. The tariffs are summed for a given response pattern in a verbal autopsy, and this sum (score) provides the basis for predicting the cause of death in a dataset. We implemented this algorithm and evaluated the method's predictive ability, both in terms of chance-corrected concordance at the individual cause assignment level and in terms of CSMF accuracy at the population level. The analysis was conducted separately for adult, child, and neonatal verbal autopsies across 500 pairs of train-test validation verbal autopsy data. Results Tariff is capable of outperforming physician-certified verbal autopsy in most cases. In terms of chance-corrected concordance, the method achieves 44.5% in adults, 39% in children, and 23.9% in neonates. CSMF accuracy was 0.745 in adults, 0.709 in children, and 0.679 in neonates. Conclusions Verbal autopsies can be an efficient means of obtaining cause of death data, and Tariff provides an intuitive, reliable method for generating individual cause assignments and CSMFs. The method is transparent and flexible and can be readily implemented by users without training in statistics or computer science.
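
    The additive scoring at the heart of Tariff can be sketched in a few lines: per-cause tariffs for the endorsed signs/symptoms are summed, and the highest-scoring cause is assigned. The causes, symptoms, and tariff values below are hypothetical, not taken from the study:

```python
def tariff_predict(tariffs, responses):
    """Tariff Method sketch: sum the tariff of every endorsed
    sign/symptom for each cause and assign the cause with the
    highest total score."""
    scores = {cause: sum(t[s] for s in responses if s in t)
              for cause, t in tariffs.items()}
    return max(scores, key=scores.get), scores

# Hypothetical tariff table for two causes and three symptoms:
tariffs = {
    "cause_A": {"fever": 2.0, "cough": 5.0, "rash": -1.0},
    "cause_B": {"fever": 1.0, "cough": -2.0, "rash": 6.0},
}
cause, scores = tariff_predict(tariffs, ["fever", "rash"])
```

    Summing per-cause scores over a whole dataset of predictions is what yields the cause-specific mortality fractions (CSMFs) evaluated in the study.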

  4. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    Directory of Open Access Journals (Sweden)

    Kranti P. Musmade

    2014-01-01

    Full Text Available A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in a novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most citrus plants and has a variety of pharmacological activities. Method optimization was carried out by considering various parameters, such as the effect of pH and column. The analyte was separated on a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature under isocratic conditions using phosphate buffer pH 3.5:acetonitrile (75:25% v/v) as the mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guideline Q2(R1). The method was found to be precise and accurate on statistical evaluation, with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility, with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust, and was successfully employed for the routine analysis of the compound in the developed novel nanopharmaceuticals. The presence of excipients did not show any interference in the determination of NAR, indicating method specificity.

  5. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Krause, Michael; Josefsen, Mathilde Hartmann

    2009-01-01

    […] of the published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation, including comparative and collaborative trials, of a same-day, non-[…] method, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal). Partly based on results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and its performance was compared with the BAX Salmonella test on 39 pork samples artificially contaminated with Salmonella. There was no significant difference in the results obtained by the two methods. Conclusion: The real-time PCR method for detection of Salmonella in meat and carcass swabs was validated in comparative and collaborative trials according to NordVal recommendations. The PCR method […]
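    A paired "no significant difference" claim of this kind is commonly tested with McNemar's test on the discordant pairs. The sketch below uses hypothetical counts for the 39 paired samples; the study's actual statistical test is not specified in the excerpt.

```python
# McNemar's test sketch for a paired comparison of two detection methods.
# Discordant-pair counts are hypothetical.
b = 4   # samples positive by PCR, negative by BAX
c = 1   # samples negative by PCR, positive by BAX

chi2 = (abs(b - c) - 1) ** 2 / (b + c)   # continuity-corrected statistic, 1 df
significant = chi2 > 3.84                # 5% critical value of chi-squared(1)
```

    Only the discordant pairs carry information about a difference between the methods; concordant results cancel out of the statistic.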

  6. A photographic method to measure food item intake. Validation in geriatric institutions.

    Science.gov (United States)

    Pouyet, Virginie; Cuvelier, Gérard; Benattar, Linda; Giboreau, Agnès

    2015-01-01

    From both a clinical and research perspective, measuring food intake is an important issue in geriatric institutions. However, weighing food in this context can be complex, particularly when the items remaining on a plate (side dish, meat or fish and sauce) need to be weighed separately following consumption. A method based on photography that involves taking photographs after a meal to determine food intake consequently seems to be a good alternative. This method enables the storage of raw data so that unhurried analyses can be performed to distinguish the food items present in the images. Therefore, the aim of this paper was to validate a photographic method to measure food intake in terms of differentiating food item intake in the context of a geriatric institution. Sixty-six elderly residents took part in this study, which was performed in four French nursing homes. Four dishes of standardized portions were offered to the residents during 16 different lunchtimes. Three non-trained assessors then independently estimated both the total and specific food item intakes of the participants using images of their plates taken after the meal (photographic method) and a reference image of one plate taken before the meal. Total food intakes were also recorded by weighing the food. To test the reliability of the photographic method, agreements between different assessors and agreements among various estimates made by the same assessor were evaluated. To test the accuracy and specificity of this method, food intake estimates for the four dishes were compared with the food intakes determined using the weighed food method. To illustrate the added value of the photographic method, food consumption differences between the dishes were explained by investigating the intakes of specific food items. 
Although the assessors were not specifically trained for this purpose, the results demonstrated that estimates agreed between assessors and among the various estimates made by the same assessor.

  7. Possibilities and scope of the double isotope effect method in the elucidation of mechanisms of enzyme catalyzed reactions

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, H L; Medina, R [Technische Univ. Muenchen, Freising (Germany, F.R.). Lehrstuhl fuer Allgemeine Chemie und Biochemie

    1991-01-01

    Kinetic isotope effects on enzyme catalyzed reactions are indicative of the first irreversible step in a sequence of individual steps. Hints on the relative velocities of other steps can only be obtained from the partitioning factor R and its dependence on external reaction conditions. In general, the experimental data needed are obtained from isotope abundance measurements in a defined position of the substrate or product as a function of turnover. This method does not reveal events involving neighbouring atoms or preceding the main isotope-sensitive step. In the method presented here, the analytical measurement is extended to the second atom involved in a bond fission or formation (Double Isotope Effect Method). It is shown that the additional results obtained support the identification of the main isotopically sensitive step and its relative contribution to the overall reaction rate, the identification of other kinetically significant steps, and the differentiation between stepwise and concerted reaction mechanisms. The method and its advantages are demonstrated on reactions comprising C-N-bond splitting (urease and arginase reactions), C-C-bond fission (reactions catalyzed by pyruvate-dehydrogenase, pyruvate-formate-lyase and lactate-oxidase), C-O-bond formation (ribulose-bisphosphate-oxygenase reaction), and N-O-bond fission (nitrate- and nitrite-reductase reactions). (orig.).

  8. Development and Validation of New Spectrophotometric Methods to Determine Enrofloxacin in Pharmaceuticals

    Science.gov (United States)

    Rajendraprasad, N.; Basavaiah, K.

    2015-07-01

    Four spectrophotometric methods, based on oxidation with cerium(IV), are investigated and developed to determine EFX in pure form and in dosage forms. The first and second methods (method A and method B) are direct: after the oxidation of EFX with cerium(IV) in acid medium, the absorbance of the reduced and unreacted oxidant is measured at 275 and 320 nm, respectively. In the third (C) and fourth (D) methods, after the reaction between EFX and the oxidant is ensured to be complete, the surplus oxidant is treated with either N-phenylanthranilic acid (NPA) or Alizarin Red S (ARS) dye and the absorbance of the oxidized NPA or ARS is measured at 440 or 420 nm. The methods showed good linearity over the concentration ranges of 0.5-5.0, 1.25-12.5, 10.0-100.0, and 6.0-60.0 μg/mL for methods A, B, C and D, respectively, with apparent molar absorptivity values of 4.42 × 10⁴, 8.7 × 10³, 9.31 × 10², and 2.28 × 10³ L/(mol·cm). The limits of detection (LOD) and quantification (LOQ), Sandell's sensitivity values, and other validation results have also been reported. The proposed methods were successfully applied to determine EFX in pure form and in dosage forms.
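    LOD and LOQ figures of the kind reported here are conventionally derived, per ICH Q2(R1), from the calibration slope S and the residual standard deviation σ of the response. A small sketch with hypothetical values:

```python
# ICH Q2(R1)-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is
# the residual standard deviation of the calibration fit and S its slope.
# The numeric inputs below are hypothetical.
def lod_loq(sigma, slope):
    """Return (LOD, LOQ) in concentration units of the calibration."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# e.g. absorbance units of residual scatter vs. absorbance per ug/mL of slope
lod, loq = lod_loq(sigma=0.02, slope=0.44)
```

    Whichever of the methods A-D is considered, the same two fit statistics feed both limits, which is why LOD and LOQ always stand in a fixed 3.3:10 ratio under this convention.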

  9. Validation of the intrinsic spatial efficiency method for non cylindrical homogeneous sources using MC simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés [Departamento de Física, Facultad de Ciencias, Universidad de Chile (Chile)

    2016-07-07

    Monte Carlo simulation of gamma spectroscopy systems is common practice these days, with the MCNP and Geant4 codes being the most popular tools for the task. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it has only been demonstrated experimentally for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to extend the demonstration is by simulating the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. The simulation does not consider matrix effects (the self-attenuation effect), so these results are only preliminary. The MC simulation is carried out using the FLUKA code and the absolute efficiency of the detector is determined using two methods: the statistical count of the Full Energy Peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The obtained results show total agreement between the absolute efficiencies determined by the traditional method and the intrinsic spatial efficiency method. The relative bias is less than 1% in all cases.

  10. Validation of analytical method for quality control of B12 Vitamin-10 000 injection

    International Nuclear Information System (INIS)

    Botet Garcia, Martha; Garcia Penna, Caridad Margarita; Troche Concepcion, Yenilen; Cannizares Arencibia, Yanara; Moreno Correoso, Barbara

    2009-01-01

    The analytical method reported in the US Pharmacopeia was validated for quality control of injectable vitamin B12 (10 000 U) by UV spectrophotometry, a simple, low-cost method allowing quality control of the finished product. The calibration curve was graphed over the 60 to 140% interval, where it was linear with a correlation coefficient of 0.9999; statistical tests for intercept and slope were non-significant. Recovery was 99.7% over the studied concentration interval, where the Cochran (G) and Student (t) tests were likewise non-significant. The coefficient of variation in the repeatability study was 0.59% for the 6 assayed replicates, whereas in the intermediate precision analysis the Fisher and Student tests were not significant. The analytical method was linear, precise, specific and exact over the studied concentration interval.

  11. Validation and application of an improved method for the rapid determination of proline in grape berries.

    Science.gov (United States)

    Rienth, Markus; Romieu, Charles; Gregan, Rebecca; Walsh, Caroline; Torregrosa, Laurent; Kelly, Mary T

    2014-04-16

    A rapid and sensitive method is presented for the determination of proline in grape berries. Following acidification with formic acid, proline is derivatized by heating at 100 °C for 15 min with 3% ninhydrin in dimethyl sulfoxide, and the absorbance, which is stable for at least 60 min, is read at 520 nm. The method was statistically validated in the concentration range from 2.5 to 15 mg/L, giving a repeatability and intermediate precision of generally […] amino acid analyzer. In terms of sample preparation, a simple dilution (5-20-fold) is required, and sugars, primary amino acids, and anthocyanins were demonstrated not to interfere, as the latter are bleached by ninhydrin under the experimental conditions. The method was applied to the study of proline accumulation in the fruits of microvines grown in phytotrons, and it was established that proline accumulation and concentrations closely resemble those of field-grown macrovines.

  12. Validation of a residue method to determine pesticide residues in cucumber by using nuclear techniques

    International Nuclear Information System (INIS)

    Baysoyu, D.; Tiryaki, O.; Secer, E.; Aydin, G.

    2009-01-01

    In this study, a multi-residue method using ethyl acetate for extraction and gel permeation chromatography for clean-up was validated to determine chlorpyrifos, malathion and dichlorvos in cucumber by gas chromatography. For this purpose, homogenized cucumber samples were fortified with pesticides at 0.02, 0.2, 0.8 and 1 mg/kg levels. The efficiency and repeatability of the method in the extraction and cleanup steps were assessed using ¹⁴C-carbaryl by the radioisotope tracer technique. ¹⁴C-carbaryl recoveries after the extraction and cleanup steps were between 92.63-111.73% with a repeatability of 4.85% (CV) and 74.83-102.22% with a repeatability of 7.19% (CV), respectively. The homogeneity of analytical samples and the stability of pesticides during homogenization were determined using the radiotracer technique and chromatographic methods, respectively.
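    Recovery percentages and repeatability (CV) of the kind reported above reduce to simple statistics over spiked replicates. A sketch with hypothetical measurements:

```python
# Recovery (%) and repeatability (CV, %) from spiked-sample replicates.
# Measured values and the spike level below are hypothetical.
import statistics

def recovery_pct(measured, spiked):
    """Percent of the spiked amount recovered in one replicate."""
    return 100.0 * measured / spiked

def cv_pct(values):
    """Coefficient of variation, in percent, over replicate results."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

spike_level = 1.0                              # mg/kg added to the blank matrix
replicates = [0.93, 0.95, 0.92, 0.96, 0.94]    # mg/kg measured

recoveries = [recovery_pct(m, spike_level) for m in replicates]
mean_recovery = statistics.fmean(recoveries)
repeatability_cv = cv_pct(recoveries)
```

    The same two statistics apply whether the tracer is ¹⁴C activity or a chromatographic peak area; only the measured quantity changes.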

  13. A New Validated RP- HPLC Method for the Determination of Nevirapine in Human Plasma

    Directory of Open Access Journals (Sweden)

    C. H. Venkata Kumar

    2010-01-01

    Full Text Available A rapid, selective and sensitive high performance liquid chromatographic method for the estimation of nevirapine in human plasma has been developed. Chromatography was carried out on a Hypersil BDS C18 column using a mixture of ammonium acetate buffer (pH 4.0 ± 0.05) and acetonitrile (85:15, v/v) as the mobile phase. The eluents were monitored for the drug by UV detection at 254 nm. Oxcarbazepine was used as an internal standard for this study. The retention times for nevirapine and oxcarbazepine were found to be 7.2 and 14.7 min, respectively. The method was found to be linear in the concentration range of 50 ng/mL to 5003.7 ng/mL. The method was validated as per FDA guidelines and was found to be suitable for bioequivalence and pharmacokinetic studies.

  14. Analytical method validation for quality control and the study of the 50 mg Propylthiouracil stability

    International Nuclear Information System (INIS)

    Valdes Bendoyro, Maria Olga; Garcia Penna, Caridad Margarita; Fernandez, Juan Lugones; Garcia Borges, Lisandra; Martinez Espinosa, Vivian

    2010-01-01

    A high-performance liquid chromatography analytical method was developed and validated for the quality control and stability studies of 50 mg Propylthiouracil tablets. The method is based on separation of the active principle on a Lichrospher 100 RP-18 (5 μm, 250 × 4 mm) chromatographic column with UV detection at 272 nm, using a mobile phase composed of a degassed mixture of 0.025 M monobasic potassium phosphate buffer solution (pH 4.6) and acetonitrile in an 80:20 ratio at a flow rate of 0.5 mL/min. The analytical method was linear, precise, specific and exact over the studied concentration interval.

  15. Development and Validation of UV Spectrophotometric Method For Estimation of Dolutegravir Sodium in Tablet Dosage Form

    International Nuclear Information System (INIS)

    Balasaheb, B.G.

    2015-01-01

    A simple, rapid, precise and accurate spectrophotometric method has been developed for quantitative analysis of Dolutegravir sodium in tablet formulations. The initial stock solution of Dolutegravir sodium was prepared in methanol and subsequently diluted with water. The standard solution of Dolutegravir sodium in water showed maximum absorption at a wavelength of 259.80 nm. The drug obeyed Beer-Lambert's law in the concentration range of 5-40 μg/mL with a coefficient of correlation (R²) of 0.9992. The method was validated as per the ICH guidelines. The developed method can be adopted for routine analysis of Dolutegravir sodium in bulk or tablet dosage form; it involves relatively low-cost solvents and no complex extraction techniques. (author)

  16. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    Science.gov (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    An analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples such as rice, corn, rapeseed, fresh lettuce and pork were analyzed for method validation of recovery rate reproducibility and the minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. This shows that the method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Fuzzy decision-making: a new method in model selection via various validity criteria

    International Nuclear Information System (INIS)

    Shakouri Ganjavi, H.; Nikravesh, K.

    2001-01-01

    Modeling is considered the first step in scientific investigations. Several alternative models may be candidates to express a phenomenon, and scientists use various criteria to select one model from among the competing models. Based on the solution of a Fuzzy Decision-Making problem, this paper proposes a new method of model selection. The method enables the scientist to apply all desired validity criteria systematically, by defining a proper Possibility Distribution Function for each criterion. Finally, minimization of a utility function composed of the Possibility Distribution Functions determines the best selection. The method is illustrated through a modeling example for the Average Daily Time Duration of Electrical Energy Consumption in Iran.
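    A minimal sketch of the idea: each validity criterion contributes a possibility value in [0, 1] for each candidate model, and the model minimizing a utility composed of these values is selected. The models, criteria, scores, and the additive form of the utility below are all hypothetical; the paper's exact composition is not given in the abstract.

```python
# Fuzzy model-selection sketch; possibility values and the additive utility
# form are hypothetical illustrations of the approach described above.
mu = {
    "model_1": {"fit": 0.9, "parsimony": 0.6, "residual_whiteness": 0.8},
    "model_2": {"fit": 0.7, "parsimony": 0.9, "residual_whiteness": 0.9},
}

def utility(scores):
    # Lower is better: each criterion penalizes a low possibility value.
    return sum(1.0 - s for s in scores.values())

best = min(mu, key=lambda m: utility(mu[m]))
```

    Any monotone composition (weighted sum, product, minimum) could replace the additive penalty; the systematic part is scoring every model under every criterion before combining.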

  18. Methods for validating the performance of wearable motion-sensing devices under controlled conditions

    International Nuclear Information System (INIS)

    Bliley, Kara E; Kaufman, Kenton R; Gilbert, Barry K

    2009-01-01

    This paper presents validation methods for assessing the accuracy and precision of motion-sensing device (i.e. accelerometer) measurements. The main goals of this paper were to assess the accuracy and precision of these measurements against a gold standard, to determine whether differences in manufacturing and assembly significantly affected device performance, and to determine whether measurement differences due to manufacturing and assembly could be corrected by applying certain post-processing techniques to the measurement data during analysis. In this paper, the validation of a posture and activity detector (PAD), a device containing a tri-axial accelerometer, is described. Validation of the PAD devices required the design of two test fixtures: one to position the device in a known orientation, and one to rotate the device at known velocities and accelerations. Device measurements were compared to these known orientations and accelerations. Several post-processing techniques were utilized in an attempt to reduce variability in the measurement error among the devices. In conclusion, some of the measurement errors due to the inevitable differences in manufacturing and assembly were significantly reduced (p < 0.01) by these post-processing techniques.
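    For the static-orientation fixture, accuracy can be assessed by comparing tilt angles computed from the gravity vector measured by the tri-axial accelerometer against the fixture's known angle. A sketch with hypothetical readings (in g):

```python
# Static tilt validation sketch: compare accelerometer-derived tilt against a
# fixture's known orientation. Readings (in g) are hypothetical.
import math

def tilt_deg(ax, ay, az):
    """Angle between the measured gravity vector and the device z-axis."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

known_deg = 30.0                                # fixture orientation
readings = [(0.499, 0.0, 0.866), (0.502, 0.0, 0.864)]

errors = [tilt_deg(*r) - known_deg for r in readings]
bias = sum(errors) / len(errors)                # mean error (accuracy)
```

    Repeating this over many fixture angles and devices gives the per-device error distributions that the post-processing corrections are meant to tighten.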

  19. Experimental methods to validate measures of emotional state and readiness for duty in critical operations

    International Nuclear Information System (INIS)

    Weston, Louise Marie

    2007-01-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge to establish because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods, and the testing and statistical analysis procedures used to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  20. Development and Validation Dissolution Analytical Method of Nimesulide beta-Cyclodextrin 400 mg Tablet

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Carvalho Pereira

    2016-10-01

    Full Text Available Nimesulide (N-(4-nitro-2-phenoxyphenyl)methanesulfonamide) belongs to the class of non-steroidal anti-inflammatory drugs (NSAIDs) and to category II of the biopharmaceutical classification. The complexation of nimesulide with β-cyclodextrin is a pharmacological strategy to increase the solubility of the drug. The objective of this study was to develop and validate an analytical methodology for the dissolution of the nimesulide beta-cyclodextrin 400 mg tablet that meets the guidelines of ANVISA for drug registration purposes. Once developed, the dissolution methodology was validated according to the parameters of RE no. 899/2003. During method development it was established that the duration of the dissolution test was 60 minutes, and that the most suitable dissolution medium and volume was 900 mL of 1% (w/v) aqueous sodium lauryl sulfate solution. It was also noted that a rotation of 100 rpm with the paddle apparatus was the most appropriate to evaluate the dissolution of the drug. A spectrophotometric methodology was used to quantify the percentage of dissolved drug, with quantification carried out at a wavelength of 390 nm. In the validation of the methodology, the system suitability, specificity/selectivity, linearity, precision, accuracy and robustness parameters were satisfactory and proved that the developed dissolution methodology was duly executed. DOI: http://dx.doi.org/10.17807/orbital.v8i5.827

  1. Validation of a method for assessing resident physicians' quality improvement proposals.

    Science.gov (United States)

    Leenstra, James L; Beckman, Thomas J; Reed, Darcy A; Mundell, William C; Thomas, Kris G; Krajicek, Bryan J; Cha, Stephen S; Kolars, Joseph C; McDonald, Furman S

    2007-09-01

    Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. We developed an instrument for assessing resident QI proposals, the Quality Improvement Proposal Assessment Tool (QIPAT-7), and determined its validity and reliability. QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. Seven raters used the instrument to assess 45 resident QI proposals. Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach's alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. QIPAT-7 items comprised a single factor (eigenvalue = 3.4), suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach's alpha = 0.87) were high. This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes.
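    Cronbach's alpha, used above for internal consistency, can be computed directly from a ratings matrix. The ratings below are hypothetical (4 proposals × 4 items, not the study's 45 proposals × 7 items):

```python
# Cronbach's alpha sketch: rows are rated proposals, columns are instrument
# items. The ratings matrix is hypothetical.
import statistics

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(rows[0])
    item_vars = [statistics.variance(col) for col in zip(*rows)]
    total_var = statistics.variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

ratings = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
]
alpha = cronbach_alpha(ratings)
```

    Alpha rises when items covary (proposals scored high on one item tend to score high on the others), which is the "single assessment dimension" finding the factor analysis corroborates.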

  2. Absolute quantification method and validation of airborne snow crab allergen tropomyosin using tandem mass spectrometry

    International Nuclear Information System (INIS)

    Rahman, Anas M. Abdel; Lopata, Andreas L.; Randell, Edward W.; Helleur, Robert J.

    2010-01-01

    Measuring the levels of the major airborne allergens of snow crab in the workplace is very important in studying the prevalence of crab asthma in workers. Previously, snow crab tropomyosin (SCTM) was identified as the major aeroallergen in crab plants and a unique signature peptide was identified for this protein. The present study advances our knowledge of aeroallergens by developing a method of quantification of airborne SCTM using isotope dilution mass spectrometry. Liquid chromatography tandem mass spectrometry was developed for separation and analysis of the signature peptides. The tryptic digestion conditions were optimized to accomplish complete digestion. The validity of the method was studied using the International Conference on Harmonisation protocol, with a CV (precision) of 2-9% and accuracy of 101-110% at three different levels of quality control. Recovery of the spiked protein from PTFE and TopTip filters was measured to be 99% and 96%, respectively. To further demonstrate the applicability and validity of the method for real samples, 45 kg of whole snow crab was processed in an enclosed (simulated) crab processing line and air samples were collected. The levels of SCTM ranged between 0.36-3.92 μg m⁻³ and 1.70-2.31 μg m⁻³ for the butchering and cooking stations, respectively.

  3. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Experimental results and validation of a method to reconstruct forces on the ITER test blanket modules

    International Nuclear Information System (INIS)

    Zeile, Christian; Maione, Ivan A.

    2015-01-01

    Highlights: • An in-operation force measurement system for the ITER EU HCPB TBM has been developed. • The force reconstruction methods are based on strain measurements on the attachment system. • An experimental setup and a corresponding mock-up have been built. • A set of test cases representing ITER-relevant excitations has been used for validation. • The influence of modeling errors on the force reconstruction has been investigated. - Abstract: In order to reconstruct forces on the test blanket modules in ITER, two force reconstruction methods, the augmented Kalman filter and a model predictive controller, have been selected and developed to estimate the forces based on strain measurements on the attachment system. A dedicated experimental setup with a corresponding mock-up has been designed and built to validate these methods. A set of test cases has been defined to represent possible excitations of the system. It has been shown that the errors in the estimated forces mainly depend on the accuracy of the identified model used by the algorithms. Furthermore, it has been found that a minimum of 10 strain gauges is necessary to allow for a low error in the reconstructed forces.
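    A much simplified, static version of the reconstruction idea: with a calibrated linear model strain = S·f, the force vector f is recovered from measured strains by least squares. The sensitivity matrix and forces below are hypothetical, and the paper's augmented Kalman filter and model predictive controller add dynamics and noise handling that this sketch omits.

```python
# Static least-squares force reconstruction from redundant strain gauges.
# Sensitivity matrix S (microstrain per kN) and forces are hypothetical.

S = [[2.0, 0.5],          # 4 gauges x 2 force components
     [0.4, 1.8],
     [1.0, 1.0],
     [0.3, 2.2]]
true_force = [3.0, -1.0]  # kN, used here to synthesize noise-free strains
strains = [sum(S[i][j] * true_force[j] for j in range(2)) for i in range(4)]

# Normal equations (S^T S) f = S^T strains, solved directly for the 2x2 case.
a = sum(S[i][0] * S[i][0] for i in range(4))
b = sum(S[i][0] * S[i][1] for i in range(4))
c = sum(S[i][1] * S[i][1] for i in range(4))
r0 = sum(S[i][0] * strains[i] for i in range(4))
r1 = sum(S[i][1] * strains[i] for i in range(4))
det = a * c - b * b
force = [(c * r0 - b * r1) / det, (a * r1 - b * r0) / det]
```

    With more gauges than force components the system is overdetermined, which is why redundancy (the study's minimum of 10 gauges) suppresses the effect of individual measurement errors.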

  5. A Validated Method for the Quality Control of Andrographis paniculata Preparations.

    Science.gov (United States)

    Karioti, Anastasia; Timoteo, Patricia; Bergonzi, Maria Camilla; Bilia, Anna Rita

    2017-10-01

    Andrographis paniculata is a herbal drug of Asian traditional medicine largely employed for the treatment of several diseases. Recently, it has been introduced in Europe for the prophylactic and symptomatic treatment of common cold and as an ingredient of dietary supplements. The active principles are diterpenes, with andrographolide as the main representative. In the present study, an analytical protocol was developed for the determination of the main constituents in the herb and preparations of A. paniculata. Three different extraction protocols (methanol extraction using a modified Soxhlet procedure, maceration under ultrasonication, and decoction) were tested. Ultrasonication achieved the highest content of analytes. HPLC conditions were optimized in terms of solvent mixtures, time course, and temperature. A reversed phase C18 column eluted at 30 °C with a gradient system consisting of acetonitrile and acidified water, including an isocratic step, was used. The HPLC method was validated for linearity, limits of quantitation and detection, repeatability, precision, and accuracy. The overall method was validated for precision and accuracy over at least three different concentration levels. Relative standard deviation was less than 1.13%, whereas recovery was between 95.50% and 97.19%. The method also proved to be suitable for the determination of a large number of commercial samples and was proposed to the European Pharmacopoeia for the quality control of Andrographidis herba. Georg Thieme Verlag KG Stuttgart · New York.

  6. Validation of an ultra-fast UPLC-UV method for the separation of antituberculosis tablets.

    Science.gov (United States)

    Nguyen, Dao T-T; Guillarme, Davy; Rudaz, Serge; Veuthey, Jean-Luc

    2008-04-01

    A simple method using ultra performance LC (UPLC) coupled with UV detection was developed and validated for the determination of antituberculosis drugs in combined dosage form, i.e. isoniazid (ISN), pyrazinamide (PYR) and rifampicin (RIF). Drugs were separated on a short column (2.1 mm × 50 mm) packed with 1.7 μm particles, using a gradient elution procedure. At 30 °C, less than 2 min was necessary for the complete separation of the three antituberculosis drugs, whereas the original USP method takes 15 min. Further improvements were obtained by combining UPLC with high temperature (up to 90 °C), namely HT-UPLC, which allows the application of higher mobile phase flow rates; the separation of ISN, PYR and RIF was thereby performed in less than 1 min. After validation (selectivity, trueness, precision and accuracy), both methods (UPLC and HT-UPLC) proved suitable for the routine quality control analysis of antituberculosis drugs in combined dosage form. Additionally, a large number of samples per day can be analysed due to the short analysis times.

  7. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    Science.gov (United States)

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for determination of methylmercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with ISO 17025 (ISO/IEC, 2005) and the Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. An estimation of the uncertainty contribution of each parameter and a demonstration of the traceability of measurement results are provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was provided by participation in the IAEA-461 worldwide inter-laboratory comparison exercise. Copyright © 2014 Elsevier Ltd. All rights reserved.
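    The expanded uncertainty figure (15%, k = 2) follows the usual GUM-style recipe: combine relative standard uncertainties in quadrature, then multiply by the coverage factor. A sketch with hypothetical uncertainty components:

```python
# GUM-style combined and expanded uncertainty. The component names and
# relative standard uncertainties below are hypothetical.
import math

components = {
    "repeatability": 0.035,   # relative standard uncertainties
    "calibration": 0.040,
    "recovery": 0.045,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined   # coverage factor k = 2, ~95% coverage
```

    Quadrature addition assumes the components are independent; correlated contributions would need covariance terms.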

  8. Validated method for the analysis of goji berry, a rich source of zeaxanthin dipalmitate.

    Science.gov (United States)

    Karioti, Anastasia; Bergonzi, Maria Camilla; Vincieri, Franco F; Bilia, Anna Rita

    2014-12-31

    In the present study an HPLC-DAD method was developed for the determination of the main carotenoid, zeaxanthin dipalmitate, in the fruits of Lycium barbarum. The aim was to develop and optimize an extraction protocol to allow fast, exhaustive, and repeatable extraction, suitable for labile carotenoid content. Use of liquid N2 allowed the grinding of the fruit. A step of ultrasonication with water removed efficiently the polysaccharides and enabled the exhaustive extraction of carotenoids by hexane/acetone 50:50. The assay was fast and simple and permitted the quality control of a large number of commercial samples including fruits, juices, and a jam. The HPLC method was validated according to ICH guidelines and satisfied the requirements. Finally, the overall method was validated for precision (% RSD ranging between 3.81 and 4.13) and accuracy at three concentration levels. The recovery was between 94 and 107% with RSD values <2%, within the acceptable limits, especially if the difficulty of the matrix is taken into consideration.

  9. A Validated RP-HPLC Method for Simultaneous Estimation of Atenolol and Indapamide in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    G. Tulja Rani

    2011-01-01

    Full Text Available A simple, fast, precise, selective and accurate RP-HPLC method was developed and validated for the simultaneous determination of atenolol and indapamide in bulk and formulations. Chromatographic separation was achieved isocratically on a Waters C18 column (250×4.6 mm, 5 µm particle size) using a mobile phase of methanol and water (adjusted to pH 2.7 with 1% orthophosphoric acid) in the ratio of 80:20. The flow rate was 1 mL/min and the effluent was detected at 230 nm. The retention times of atenolol and indapamide were 1.766 min and 3.407 min, respectively. Linearity was observed in the concentration range of 12.5-150 µg/mL for atenolol and 0.625-7.5 µg/mL for indapamide. Percent recoveries obtained for the two drugs were 99.74-100.06% and 98.65-99.98%, respectively. The method was validated according to the ICH guidelines with respect to specificity, linearity, accuracy, precision and robustness. The method developed can be used for the routine analysis of atenolol and indapamide in their combined dosage form.

  10. Absolute quantification method and validation of airborne snow crab allergen tropomyosin using tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Anas M. Abdel, E-mail: anasar@mun.ca [Department of Chemistry, Memorial University of Newfoundland, St. John' s, Newfoundland A1B 3X7 (Canada); Lopata, Andreas L. [School of Applied Science, Marine Biomedical Sciences and Health Research Group, RMIT University, Bundoora, 3083 Victoria (Australia); Randell, Edward W. [Department of Laboratory Medicine, Memorial University of Newfoundland, Eastern Health, St. John' s, Newfoundland and Labrador A1B 3V6 (Canada); Helleur, Robert J. [Department of Chemistry, Memorial University of Newfoundland, St. John' s, Newfoundland A1B 3X7 (Canada)

    2010-11-29

    Measuring the levels of the major airborne allergens of snow crab in the workplace is very important in studying the prevalence of crab asthma in workers. Previously, snow crab tropomyosin (SCTM) was identified as the major aeroallergen in crab plants and a unique signature peptide was identified for this protein. The present study advances our knowledge of aeroallergens by developing a method for the quantification of airborne SCTM using isotope dilution mass spectrometry. A liquid chromatography tandem mass spectrometry method was developed for separation and analysis of the signature peptides. The tryptic digestion conditions were optimized to accomplish complete digestion. The validity of the method was studied using the International Conference on Harmonisation protocol, with a CV (precision) of 2-9% and an accuracy of 101-110% at three different levels of quality control. Recovery of the spiked protein from PTFE and TopTip filters was measured to be 99% and 96%, respectively. To further demonstrate the applicability and validity of the method for real samples, 45 kg of whole snow crab were processed in an enclosed (simulated) crab processing line and air samples were collected. The levels of SCTM ranged between 0.36-3.92 μg m⁻³ and 1.70-2.31 μg m⁻³ for the butchering and cooking stations, respectively.

  11. A validated high performance thin layer chromatography method for determination of yohimbine hydrochloride in pharmaceutical preparations.

    Science.gov (United States)

    Badr, Jihan M

    2013-01-01

    Yohimbine is an indole alkaloid used as a promising therapy for erectile dysfunction. A number of methods have been reported for the analysis of yohimbine in the bark or in pharmaceutical preparations. In the present work, a simple and sensitive high performance thin layer chromatographic method is developed for the determination of yohimbine (occurring as yohimbine hydrochloride) in pharmaceutical preparations and validated according to International Conference on Harmonisation (ICH) guidelines. The method employed thin layer chromatography aluminum sheets precoated with silica gel as the stationary phase, and the mobile phase consisted of chloroform:methanol:ammonia (97:3:0.2), which gave compact bands of yohimbine hydrochloride. Linear regression data for the calibration curves of standard yohimbine hydrochloride showed a good linear relationship over a concentration range of 80-1000 ng/spot with respect to the area, and the correlation coefficient (R²) was 0.9965. The method was evaluated for accuracy, precision, selectivity, and robustness. Limits of detection and quantitation were 5 and 40 ng/spot, respectively. The proposed method efficiently separated yohimbine hydrochloride from other components, even in complex mixtures containing powdered plant material. The amount of yohimbine hydrochloride ranged from 2.3 to 5.2 mg/tablet or capsule in preparations containing the pure alkaloid, while it varied from zero to 1.5-1.8 mg/capsule in dietary supplements containing powdered yohimbe bark. We conclude that this high performance thin layer chromatography (HPTLC) method for the quantitative determination of yohimbine hydrochloride in pharmaceutical preparations is efficient, simple, accurate, and validated.

  12. Contribution to the validation of thermal ratchetting prevision methods in metallic structures

    International Nuclear Information System (INIS)

    Rakotovelo, A.M.

    1998-03-01

    This work concerns the steady-state assessment of metallic structures subjected to cyclic thermomechanical loadings in a biaxial stress state. The effect of short-time mechanical overloads is also investigated. The first chapter is devoted to a bibliographic survey of the behaviour of materials and structures in cyclic plasticity; works on both the experimental and the numerical aspects of the steady-state assessment of such structures are presented. The experimental part of the study is presented in the second chapter. An experimental device was built in order to apply tension and torsion forces combined with cyclic thermal loading. Tests were then carried out, some of which included overloads in tension or torsion. The last chapter describes numerical calculations using different models (linear isotropic hardening, linear kinematic hardening and the elasto-viscoplastic Chaboche model) and the application of simplified methods for the assessment of ratchetting in structures. Two categories of methods were considered: the first is based on an elastic analysis (Bree's diagram, the 3 Sm rule, the efficiency rule), and the second combines elastic analysis with an elastoplastic analysis of the first cycle (Gatt's and Taleb's methods). The results of this study have made it possible: to validate, in the biaxial stress state, an expression which takes into account the effect of short-time mechanical overloads; to test the ability of the considered models to describe the evolution of the structure during the first cycles and to account for the effect of short-time overloads, among which the elastoplastic Chaboche model seems the most accurate; and to validate some simplified methods. Certain methods based only on elastic analysis (Bree's diagram and the efficiency rule) seem not suitable for the considered kind of

  13. Validation of quantitative 1H NMR method for the analysis of pharmaceutical formulations

    International Nuclear Information System (INIS)

    Santos, Maiara da S.

    2013-01-01

    The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of ¹H NMR for quantitative analyses and an example of the method validation according to Resolution RE N. 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample. (author)
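
    The "single measurement" quantification mentioned above rests on the standard internal-standard qNMR relation: the analyte purity follows from the ratio of signal integrals, the numbers of protons behind each signal, the molar masses, and the weighed masses. A minimal sketch (the argument names are mine; paracetamol's molar mass appears only as a placeholder):

```python
def qhnmr_purity(i_x, i_std, n_x, n_std, m_x, m_std, w_x, w_std, p_std):
    """Internal-standard qHNMR: purity of analyte x from the integral ratio.
    i: signal integrals, n: protons per signal, m: molar masses (g/mol),
    w: weighed masses (mg), p_std: purity of the standard (fraction)."""
    return (i_x / i_std) * (n_std / n_x) * (m_x / m_std) * (w_std / w_x) * p_std

# equal weighed masses, a 2:1 integral over a 2-proton vs 1-proton signal,
# identical molar masses, 99.9% pure standard -> purity equals p_std
p = qhnmr_purity(i_x=2.0, i_std=1.0, n_x=2, n_std=1,
                 m_x=151.16, m_std=151.16, w_x=10.0, w_std=10.0, p_std=0.999)
```

    Each factor cancels one experimental asymmetry between analyte and standard, which is why a single spectrum suffices for quantitation.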

  14. Analytical Method Development and Validation of Solifenacin in Pharmaceutical Dosage Forms by RP-HPLC

    OpenAIRE

    Shaik, Rihana Parveen; Puttagunta, Srinivasa Babu; Kothapalli Bannoth, Chandrasekar; Challa, Bala Sekhara Reddy

    2014-01-01

    A new, accurate, precise, and robust HPLC method was developed and validated for the determination of solifenacin in tablet dosage form. The chromatographic separation was achieved on an Inertsil ODS 3V C18 (150 mm × 4.6 mm, 5 μm) stationary phase maintained at ambient temperature with a mobile phase combination of monobasic potassium phosphate (pH 3.5) containing 0.1% triethylamine and methanol (gradient mode) at a flow rate of 1.5 mL/min, and the detection was carried out by using UV detect...

  15. Summary of Validation of Multi-Pesticide Methods for Various Pesticide Formulations

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    The validation of multi-pesticide methods applicable to various types of pesticide formulations is treated. In a worked practical example, lambda-cyhalothrin, the theoretical considerations outlined in the General Guidance section are put into practice. GC conditions, the selection of an internal standard and criteria for acceptable repeatability of injections are outlined, followed by sample preparation, calibration, batch analysis and confirmation of results through comparison using different separation columns. Complete sets of data are displayed in tabular form for other pesticide active ingredients and real formulations. (author)

  16. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1995-01-01

    This report is a compilation of the information submitted by AECL, CIAE, JAERI, ORNL and Siemens in response to a need identified at the 'Workshop on R and D Needs' at the IGORR-3 meeting. The survey compiled information on the national standards applied to the Safety Quality Assurance (SQA) programs undertaken by the participants. Information was assembled for the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods used to verify and validate the codes and libraries. Although the survey was not comprehensive, it provides a basis for exchanging information of common interest to the research reactor community

  17. Validation of the LWR-EIR methods for the evaluation of compact beds

    International Nuclear Information System (INIS)

    Foskolos, K.; Grimm, P.; Maeder, C.; Paratte, J.M.

    1983-10-01

    The EIR code system for the calculation of light water reactors is presented and the methods used are briefly described. The application of the system to various types of critical experiments and benchmark problems proves its good precision, even for heterogeneous configurations with strong neutron absorbers like Boral. As the accuracy of the multiplication factor k_eff is always better than 0.5% for normal LWR configurations, this code system is validated for the calculation of such configurations with a safety margin of 1.5% on k_eff. (Auth.)

  18. Validation of the EIR LWR calculation methods for criticality assessment of storage pools

    International Nuclear Information System (INIS)

    Grimm, P.; Paratte, J.M.

    1986-11-01

    The EIR code system for the calculation of light water reactors is presented and the methods used are briefly described. The application of the system to various types of critical experiments and benchmark problems proves its good accuracy, even for heterogeneous configurations containing strong neutron absorbers such as Boral. Since the multiplication factor k_eff is normally somewhat overpredicted and the spread of the results is small, this code system is validated for the calculation of storage pools, taking into account a safety margin of 1.5% on k_eff. (author)

  19. Validated spectophotometric methods for the assay of cinitapride hydrogen tartrate in pharmaceuticals

    Directory of Open Access Journals (Sweden)

    Satyanarayana K.V.V.

    2013-01-01

    Full Text Available Three simple, selective and rapid spectrophotometric methods have been established for the determination of cinitapride hydrogen tartrate (CHT) in pharmaceutical tablets. The proposed methods are based on the diazotization of CHT with sodium nitrite and hydrochloric acid, followed by coupling with resorcinol, 1-benzoylacetone and 8-hydroxyquinoline in alkaline medium for methods A, B and C, respectively. The formed azo dyes are measured at 442, 465 and 552 nm for methods A, B and C, respectively. The parameters that affect the reaction were carefully optimized. Under optimum conditions, Beer's law is obeyed over the ranges 2.0-32.0, 1.0-24.0 and 1.0-20.0 μg mL⁻¹ for methods A, B and C, respectively. The calculated molar absorptivity values are 1.2853 × 10⁴, 1.9624 × 10⁴ and 3.92 × 10⁴ L mol⁻¹ cm⁻¹ for methods A, B and C, respectively. The results of the proposed procedures were validated statistically according to ICH guidelines. The proposed methods were successfully applied to the determination of CHT in Cintapro tablets without interference from common excipients.
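
    For reference, the molar absorptivities quoted above follow directly from the Beer-Lambert law, A = ε·c·l. A one-line sketch; the absorbance and concentration below are hypothetical values chosen only to reproduce the order of magnitude reported for method C:

```python
def molar_absorptivity(absorbance, conc_mol_per_l, path_cm=1.0):
    # Beer-Lambert law: A = epsilon * c * l  =>  epsilon = A / (c * l)
    return absorbance / (conc_mol_per_l * path_cm)

# hypothetical reading: A = 0.392 for a 1.0e-5 mol/L solution in a 1 cm cell
epsilon = molar_absorptivity(0.392, 1.0e-5)  # L mol^-1 cm^-1
```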

  20. Validity studies among hierarchical methods of cluster analysis using cophenetic correlation coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Priscilla R.; Munita, Casimiro S.; Lapolli, André L., E-mail: prii.ramos@gmail.com, E-mail: camunita@ipen.br, E-mail: alapolli@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The literature presents many methods for partitioning a data base, and it is difficult to choose the most suitable one, since different combinations of methods and dissimilarity measures can lead to different grouping patterns and to false interpretations. Nevertheless, little effort has been expended in evaluating these methods empirically using an archaeological data base. The objective of this work is therefore a comparative study of different hierarchical cluster analysis methods in order to identify the most appropriate one. The study was carried out using a data base of the Archaeometric Studies Group at IPEN-CNEN/SP, in which 45 samples of ceramic fragments from three archaeological sites were analyzed by instrumental neutron activation analysis (INAA) to determine the mass fractions of 13 elements (As, Ce, Cr, Eu, Fe, Hf, La, Na, Nd, Sc, Sm, Th, U). The methods compared were single linkage, complete linkage, average linkage, centroid and Ward. Validation was done using the cophenetic correlation coefficient; on comparing these values, the average linkage method obtained the best results. A script with some functions was created in the statistical program R to obtain the cophenetic correlation. By means of these values it was possible to choose the most appropriate method for the data base. (author)
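
    The cophenetic validation described above can be sketched in a few lines. The following is a minimal pure-Python illustration (not the authors' R script): single-linkage clustering of a toy one-dimensional dataset, with the cophenetic correlation computed as the Pearson correlation between the original pairwise distances and the cophenetic distances (the merge heights at which pairs of points first join).

```python
import math
from itertools import combinations

def single_linkage_cophenetic(points):
    """Agglomerative single-linkage clustering of 1-D points.
    Returns (pairwise distances, cophenetic distances)."""
    n = len(points)
    d = {(i, j): abs(points[i] - points[j]) for i, j in combinations(range(n), 2)}
    clusters = {i: {i} for i in range(n)}
    coph = {}
    while len(clusters) > 1:
        # find the two closest clusters (single linkage: min inter-point distance)
        best = None
        for a, b in combinations(clusters, 2):
            dist = min(d[tuple(sorted((i, j)))]
                       for i in clusters[a] for j in clusters[b])
            if best is None or dist < best[0]:
                best = (dist, a, b)
        h, a, b = best
        for i in clusters[a]:
            for j in clusters[b]:
                coph[tuple(sorted((i, j)))] = h  # height at which i and j first join
        clusters[a] |= clusters.pop(b)
    return d, coph

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

d, coph = single_linkage_cophenetic([0.0, 0.4, 3.0, 3.5, 9.0])
keys = sorted(d)
ccc = pearson([d[k] for k in keys], [coph[k] for k in keys])  # cophenetic correlation
```

    In practice one computes this coefficient for each linkage method (single, complete, average, centroid, Ward) on the same distance matrix and retains the method with the highest value, which is what the study reports for average linkage.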

  1. Validity studies among hierarchical methods of cluster analysis using cophenetic correlation coefficient

    International Nuclear Information System (INIS)

    Carvalho, Priscilla R.; Munita, Casimiro S.; Lapolli, André L.

    2017-01-01

    The literature presents many methods for partitioning a data base, and it is difficult to choose the most suitable one, since different combinations of methods and dissimilarity measures can lead to different grouping patterns and to false interpretations. Nevertheless, little effort has been expended in evaluating these methods empirically using an archaeological data base. The objective of this work is therefore a comparative study of different hierarchical cluster analysis methods in order to identify the most appropriate one. The study was carried out using a data base of the Archaeometric Studies Group at IPEN-CNEN/SP, in which 45 samples of ceramic fragments from three archaeological sites were analyzed by instrumental neutron activation analysis (INAA) to determine the mass fractions of 13 elements (As, Ce, Cr, Eu, Fe, Hf, La, Na, Nd, Sc, Sm, Th, U). The methods compared were single linkage, complete linkage, average linkage, centroid and Ward. Validation was done using the cophenetic correlation coefficient; on comparing these values, the average linkage method obtained the best results. A script with some functions was created in the statistical program R to obtain the cophenetic correlation. By means of these values it was possible to choose the most appropriate method for the data base. (author)

  2. Estimating incidence from prevalence in generalised HIV epidemics: methods and validation.

    Directory of Open Access Journals (Sweden)

    Timothy B Hallett

    2008-04-01

    Full Text Available HIV surveillance of generalised epidemics in Africa primarily relies on prevalence at antenatal clinics, but estimates of incidence in the general population would be more useful. Repeated cross-sectional measures of HIV prevalence are now becoming available for general populations in many countries, and we aim to develop and validate methods that use these data to estimate HIV incidence. Two methods were developed that decompose observed changes in prevalence between two serosurveys into the contributions of new infections and mortality. Method 1 uses cohort mortality rates, and method 2 uses information on survival after infection. The performance of these two methods was assessed using simulated data from a mathematical model and actual data from three community-based cohort studies in Africa. Comparison with simulated data indicated that these methods can accurately estimate incidence rates and changes in incidence in a variety of epidemic conditions. Method 1 is simple to implement but relies on locally appropriate mortality data, whilst method 2 can make use of the same survival distribution in a wide range of scenarios. The estimates from both methods are within the 95% confidence intervals of almost all actual measurements of HIV incidence in adults and young people, and the patterns of incidence over age are correctly captured. It is possible to estimate incidence from cross-sectional prevalence data with sufficient accuracy to monitor the HIV epidemic. Although these methods will theoretically work in any context, we have been able to test them only in southern and eastern Africa, where HIV epidemics are mature and generalised. The choice of method will depend on the local availability of HIV mortality data.
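
    The decomposition idea, new infections inferred from the prevalence change after accounting for mortality, can be made concrete with a toy closed-cohort calculation. This is an illustrative sketch under stated assumptions, not Hallett et al.'s actual estimator; the function name and the crude person-years approximation are mine.

```python
def incidence_from_prevalence(p1, p2, years, surv_infected, surv_uninfected=1.0):
    """Toy decomposition for a closed cohort of size 1: infected people survive
    the inter-survey interval with probability surv_infected; the prevalence
    observed at the second survey then implies how many new infections
    must have occurred in between."""
    survivors_infected = p1 * surv_infected            # old infections still alive
    survivors_uninfected = (1 - p1) * surv_uninfected  # susceptibles still alive
    cohort = survivors_infected + survivors_uninfected
    new_infections = p2 * cohort - survivors_infected
    person_years = (1 - (p1 + p2) / 2) * years         # crude susceptible time at risk
    return new_infections / person_years               # infections per person-year

rate = incidence_from_prevalence(p1=0.10, p2=0.12, years=3.0, surv_infected=0.85)
```

    With 10% prevalence rising to 12% over three years and 85% survival among the infected, the sketch yields an incidence on the order of 1.2 per 100 person-years; the higher the mortality among the infected, the more new infections are needed to explain the same prevalence rise.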

  3. The Suzuki-Miyaura Cross-Coupling Reaction of Halogenated Aminopyrazoles: Method Development, Scope, and Mechanism of Dehalogenation Side Reaction.

    Science.gov (United States)

    Jedinák, Lukáš; Zátopková, Renáta; Zemánková, Hana; Šustková, Alena; Cankař, Petr

    2017-01-06

    The efficient Suzuki-Miyaura cross-coupling reaction of halogenated aminopyrazoles and their amides or ureas with a range of aryl, heteroaryl, and styryl boronic acids or esters has been developed. The method allowed incorporation of problematic substrates: aminopyrazoles bearing protected or unprotected pyrazole NH, as well as the free amino or N-amide group. Direct comparison of the chloro, bromo, and iodopyrazoles in the Suzuki-Miyaura reaction revealed that Br and Cl derivatives were superior to iodopyrazoles, as a result of reduced propensity to dehalogenation. Moreover, the mechanism and factors affecting the undesired dehalogenation side reaction were revealed.

  4. A sediment extraction and cleanup method for wide-scope multitarget screening by liquid chromatography-high-resolution mass spectrometry.

    Science.gov (United States)

    Massei, Riccardo; Byers, Harry; Beckers, Liza-Marie; Prothmann, Jens; Brack, Werner; Schulze, Tobias; Krauss, Martin

    2018-01-01

    Previous studies on organic sediment contaminants focused mainly on a limited number of highly hydrophobic micropollutants accessible to gas chromatography using nonpolar, aprotic extraction solvents. The development of liquid chromatography-high-resolution mass spectrometry (LC-HRMS) permits the spectrum of analysis to be expanded to a wider range of more polar and ionic compounds present in sediments and allows target, suspect, and nontarget screening to be conducted with high sensitivity and selectivity. In this study, we propose a comprehensive multitarget extraction and sample preparation method for characterization of sediment pollution covering a broad range of physicochemical properties that is suitable for LC-HRMS screening analysis. We optimized pressurized liquid extraction, cleanup, and sample dilution for a target list of 310 compounds. Finally, the method was tested on sediment samples from a small river and its tributaries. The results show that the combination of extraction at 100 °C with ethyl acetate-acetone (50:50, neutral extract) followed by extraction at 80 °C with acetone-formic acid (100:1, acidic extract) and methanol-10 mM sodium tetraborate in water (90:10, basic extract) offered the best extraction recoveries for 287 of 310 compounds. At a spiking level of 1 μg mL⁻¹, we obtained satisfactory cleanup recoveries of (93 ± 23)% for the neutral extract and (42 ± 16)% for the combined acidic/basic extracts after solvent exchange. Among the 69 compounds detected in environmental samples, we successfully quantified several pharmaceuticals and polar pesticides.

  5. Validation of a qualitative screening method for pesticides in fruits and vegetables by gas chromatography quadrupole-time of flight mass spectrometry with atmospheric pressure chemical ionization

    Energy Technology Data Exchange (ETDEWEB)

    Portolés, T. [Research Institute for Pesticides and Water, University Jaume I, 12071 Castellón (Spain); RIKILT Institute of Food Safety, Wageningen University and Research Centre, Akkermaalsbos 2, 6708 WB Wageningen (Netherlands); Mol, J.G.J. [RIKILT Institute of Food Safety, Wageningen University and Research Centre, Akkermaalsbos 2, 6708 WB Wageningen (Netherlands); Sancho, J.V.; López, Francisco J. [Research Institute for Pesticides and Water, University Jaume I, 12071 Castellón (Spain); Hernández, F., E-mail: hernandf@uji.es [Research Institute for Pesticides and Water, University Jaume I, 12071 Castellón (Spain)

    2014-08-01

    Highlights: • Applicability of GC-(APCI)QTOF MS as a new tool for wide-scope screening of pesticides in fruits and vegetables demonstrated. • Validation of the screening method according to SANCO/12571/2013. • Detection of the pesticides based on the presence of M+·/MH+ in most cases. • Screening detection limit of 0.01 mg kg⁻¹ for 77% of the pesticides investigated. • Successful identification at 0.01 mg kg⁻¹ for 70% of the pesticide/matrix combinations. - Abstract: A wide-scope screening method was developed for the detection of pesticides in fruit and vegetables. The method was based on gas chromatography coupled to a hybrid quadrupole time-of-flight mass spectrometer with an atmospheric pressure chemical ionization source (GC-(APCI)QTOF MS). A non-target acquisition was performed through two alternating scan events: one at low collision energy and another at a higher collision energy ramp (MSᴱ). In this way, both the protonated molecule and/or molecular ion and the fragment ions were obtained in a single run. Validation was performed according to SANCO/12571/2013 by analysing 20 samples (10 different commodities in duplicate), fortified with a test set of 132 pesticides at 0.01, 0.05 and 0.20 mg kg⁻¹. For screening, detection was based on one diagnostic ion (in most cases the protonated molecule). Overall, at the 0.01 mg kg⁻¹ level, 89% of the 2620 fortifications made were detected. The screening detection limit for individual pesticides was 0.01 mg kg⁻¹ for 77% of the pesticides investigated. The possibilities for identification according to the SANCO criteria, requiring two ions with a mass accuracy ≤ ±5 ppm and an ion-ratio deviation ≤ ±30%, were investigated. At the 0.01 mg kg⁻¹ level, identification was possible for 70% of the pesticides detected during screening. This increased to 87% and 93% at the 0.05 and 0.20 mg kg⁻¹ levels, respectively. Insufficient sensitivity for the second
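
    The SANCO-style identification criteria quoted above (two ions, mass accuracy within ±5 ppm, ion-ratio deviation within ±30%) reduce to two arithmetic checks. A generic sketch; the helper names are mine, and in any real check the m/z values would come from the instrument and an exact-mass table.

```python
def ppm_error(measured_mz, theoretical_mz):
    # signed mass accuracy in parts per million
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

def passes_identification(measured_mz, theoretical_mz, ion_ratio, ref_ratio,
                          ppm_tol=5.0, ratio_tol=0.30):
    # SANCO/12571/2013-style check: mass accuracy within +/-5 ppm and
    # ion ratio within +/-30% of the reference ratio
    return (abs(ppm_error(measured_mz, theoretical_mz)) <= ppm_tol
            and abs(ion_ratio - ref_ratio) / ref_ratio <= ratio_tol)
```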

  6. Validity and reliability of the session-RPE method for quantifying training load in karate athletes.

    Science.gov (United States)

    Tabben, M; Tourny, C; Haddad, M; Chaabane, H; Chamari, K; Coquart, J B

    2015-04-24

    To test the construct validity and reliability of the session rating of perceived exertion (sRPE) method by examining the relationship between RPE and physiological parameters (heart rate, HR, and blood lactate concentration, [La⁻]) and the correlations between sRPE and two HR-based methods for quantifying internal training load (Banister's method and Edwards's method) during a karate training camp. Eighteen elite karate athletes, ten men (age: 24.2 ± 2.3 y, body mass: 71.2 ± 9.0 kg, body fat: 8.2 ± 1.3%, height: 178 ± 7 cm) and eight women (age: 22.6 ± 1.2 y, body mass: 59.8 ± 8.4 kg, body fat: 20.2 ± 4.4%, height: 169 ± 4 cm), were included in the study. During the training camp, subjects participated in eight karate training sessions covering three training modes (4 tactical-technical, 2 technical-development, and 2 randori training), during which RPE, HR, and [La⁻] were recorded. Significant correlations were found between RPE and the physiological parameters (percentage of maximal HR: r = 0.75, 95% CI = 0.64-0.86; [La⁻]: r = 0.62, 95% CI = 0.49-0.75) and between sRPE and the two HR-based measures of internal training load (r = 0.65-0.95). Furthermore, the sRPE method showed good reliability at the same intensity across training sessions (Cronbach's α = 0.81, 95% CI = 0.61-0.92). This study demonstrates that the sRPE method is valid for quantifying internal training load and intensity in karate.
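
    For context, the three load measures compared in the study have simple closed forms: sRPE is the session RPE multiplied by its duration, Edwards's method weights time in five HR zones by factors 1 to 5, and Banister's TRIMP weights duration by an exponential of the fractional HR reserve. The sketch below uses the weighting constants commonly cited in the general training-load literature (0.64·e^1.92x for men, 0.86·e^1.67x for women); treat them as assumptions drawn from that literature, since this record does not reproduce the formulas.

```python
import math

def session_rpe(rpe, duration_min):
    # internal training load in arbitrary units: RPE (CR-10 scale) x duration
    return rpe * duration_min

def edwards_trimp(minutes_in_zone):
    # minutes spent in five HR zones (50-60% ... 90-100% HRmax), weighted 1..5
    return sum(w * m for w, m in zip(range(1, 6), minutes_in_zone))

def banister_trimp(duration_min, hr_avg, hr_rest, hr_max, male=True):
    # duration weighted by an exponential of the fractional heart-rate reserve
    x = (hr_avg - hr_rest) / (hr_max - hr_rest)
    b, k = (1.92, 0.64) if male else (1.67, 0.86)
    return duration_min * x * k * math.exp(b * x)
```

    Validity studies such as this one correlate the cheap field measure, `session_rpe`, against the two HR-based quantities across many sessions.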

  7. Chemometric approach for development, optimization, and validation of different chromatographic methods for separation of opium alkaloids.

    Science.gov (United States)

    Acevska, J; Stefkov, G; Petkovska, R; Kulevanova, S; Dimitrovska, A

    2012-05-01

    The excessive and continuously growing interest in the simultaneous determination of poppy alkaloids calls for the development and optimization of convenient high-throughput methods for assessing the qualitative and quantitative profile of alkaloids in poppy straw. Systematic optimization of two chromatographic methods (gas chromatography (GC)/flame ionization detection (FID)/mass spectrometry (MS) and reversed-phase (RP)-high-performance liquid chromatography (HPLC)/diode array detection (DAD)) for the separation of alkaloids from Papaver somniferum L. (Papaveraceae) was carried out. The effects of various conditions on the predefined chromatographic descriptors were investigated using chemometrics. A full factorial linear design of experiments was used to determine the relationship between chromatographic conditions and the retention behavior of the analytes, and a central composite circumscribed design was utilized for the final method optimization. By conducting the optimization of the methods in this rational manner, a great deal of excessive and unproductive laboratory work was avoided. The developed chromatographic methods were validated and compared with respect to resolving power, sensitivity, accuracy, speed, cost, ecological aspects, and compatibility with the poppy straw extraction procedure. The separation of the opium alkaloids using the GC/FID/MS method was achieved within 10 min, avoiding any derivatization step. This method has stronger resolving power, a shorter analysis time and a better cost-effectiveness factor than the RP-HPLC/DAD method, and is in line with the "green trend" of analysis. The RP-HPLC/DAD method, on the other hand, displayed better sensitivity for all tested alkaloids. The proposed methods provide both fast screening and an accurate content assessment of the six alkaloids in poppy samples obtained from the selection program of Papaver strains.
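
    The two experimental designs mentioned, a full factorial and a central composite circumscribed (CCC) design, can be generated mechanically in coded units. A minimal sketch (assuming coded factor levels; α = 1.414 gives a rotatable design for two factors):

```python
from itertools import product

def full_factorial(levels):
    """All combinations of the given factor levels, e.g. a 2x2x2 design."""
    return list(product(*levels))

def central_composite(k, alpha=1.414):
    """Central composite circumscribed design in k coded factors:
    2^k factorial 'cube' points, 2k axial points at +/-alpha, one center point."""
    cube = list(product([-1, 1], repeat=k))
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(tuple(pt))
    return cube + axial + [tuple([0.0] * k)]
```

    The factorial points screen main effects and interactions; the axial and center points added by the CCC design allow the curvature needed for final optimization to be fitted.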

  8. Validity of the t-plot method to assess microporosity in hierarchical micro/mesoporous materials.

    Science.gov (United States)

    Galarneau, Anne; Villemot, François; Rodriguez, Jeremy; Fajula, François; Coasne, Benoit

    2014-11-11

    The t-plot method is a well-known technique for determining the micro- and/or mesoporous volumes and the specific surface area of a sample by comparison with a reference adsorption isotherm of a nonporous material having the same surface chemistry. In this paper, the validity of the t-plot method is discussed in the case of hierarchical porous materials exhibiting both micro- and mesoporosity. Different hierarchical zeolites with MCM-41 type ordered mesoporosity are prepared using pseudomorphic transformation. For comparison, we also consider simple mechanical mixtures of microporous and mesoporous materials. We first show an intrinsic failure of the t-plot method: it does not capture the fact that, for a given surface chemistry and pressure, the thickness of the film adsorbed in micropores or small mesopores differs from that adsorbed on a nonporous reference surface. The ability of the t-plot method to estimate the micro- and mesoporous volumes of hierarchical samples is then discussed, and an abacus is given to correct the microporous volume underestimated by the t-plot method.
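
    Mechanically, a t-plot is a straight-line fit of adsorbed volume against the statistical film thickness t taken from the nonporous reference isotherm: the intercept is read as the micropore volume and the slope is proportional to the external (non-micropore) surface area. A minimal least-squares sketch on synthetic data (the numbers are illustrative, not from the paper):

```python
def tplot_fit(t_vals, v_ads):
    """Least-squares line V = slope * t + intercept over the chosen linear range.
    The intercept estimates the micropore volume; the slope is proportional
    to the external surface area."""
    n = len(t_vals)
    mt = sum(t_vals) / n
    mv = sum(v_ads) / n
    slope = (sum((t - mt) * (v - mv) for t, v in zip(t_vals, v_ads))
             / sum((t - mt) ** 2 for t in t_vals))
    return slope, mv - slope * mt

# synthetic isotherm: micropore volume 0.12 cm3/g, external-surface slope 0.20
t_vals = [0.40, 0.50, 0.60, 0.70]          # statistical film thickness, nm
v_ads = [0.12 + 0.20 * t for t in t_vals]  # adsorbed volume, cm3/g
slope, v_micro = tplot_fit(t_vals, v_ads)
```

    The paper's point is that the reference thickness t is biased inside narrow pores, so the intercept read this way underestimates the true micropore volume, hence the correction abacus.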

  9. The role of validated analytical methods in JECFA drug assessments and evaluation for recommending MRLs.

    Science.gov (United States)

    Boison, Joe O

    2016-05-01

    The Joint Food and Agriculture Organization and World Health Organization (FAO/WHO) Expert Committee on Food Additives (JECFA) is one of three Codex committees tasked with applying risk analysis and relying on independent scientific advice provided by expert bodies organized by FAO/WHO when developing standards. While not officially part of the Codex Alimentarius Commission structure, JECFA provides independent scientific advice to the Commission and its specialist committees, such as the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF), in setting maximum residue limits (MRLs) for veterinary drugs. Codex methods of analysis (Types I, II, III, and IV) are defined in the Codex Procedural Manual, as are criteria to be used for selecting methods of analysis. However, if a method is to be used under a single laboratory condition to support regulatory work, it must be validated according to an internationally recognized protocol, and the use of the method must be embedded in a quality assurance system in compliance with ISO/IEC 17025:2005. This paper examines the attributes of the methods used to generate residue depletion data for drug registration and/or licensing and for supporting regulatory enforcement initiatives that experts consider to be useful and appropriate in their assessment of methods of analysis. Copyright © 2016 Her Majesty the Queen in Right of Canada. Drug Testing and Analysis © 2016 John Wiley & Sons, Ltd.

  10. Validation of the Use of Dried Blood Spot (DBS) Method to Assess Vitamin A Status

    Science.gov (United States)

    Fallah, Elham; Peighambardoust, Seyed Hadi

    2012-01-01

    Background: Vitamin A deficiency is an important dietary deficiency in the world, so the necessity of screening for deficient populations is obvious. This paper introduces a fast, cheap and relatively reliable method called the “dried blood spot” (DBS) method for screening deficient populations. The validity of this method for retinol measurement was investigated. Method: The “precision” and “agreement” criteria of the DBS method were assessed. The precision was calculated and compared with that of plasma using an F-test. The agreement was evaluated using a Bland-Altman plot. Results: The imprecision of retinol measurements in dried spots was not significantly different from that of the control (plasma). A good correlation coefficient (r2=0.78) was obtained for dried spots’ retinol measurements versus plasma retinol analysis. Retinol in dried spots was stable for 90 days. Overall, the DBS method provided a precise measurement of retinol, showing results that were comparable with the measurement of retinol in plasma. PMID:24688932

  11. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazard ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset, in which we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730
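
The replicated cross-validation the abstract emphasizes can be sketched generically. This is a minimal stand-in, not the PRIMsrc implementation; the `evaluate` callback and the toy metric are placeholders for a real peeling-and-scoring step.

```python
import random

def replicated_kfold(n, k, replications, evaluate, seed=0):
    """Average a metric over several random K-fold splits (replicated CV).

    `evaluate(train_idx, test_idx)` is any user-supplied scoring callback;
    replication damps the variance caused by a single random partition.
    """
    rng = random.Random(seed)
    scores = []
    for _ in range(replications):
        idx = list(range(n))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]
        for test in folds:
            test_set = set(test)
            train = [i for i in idx if i not in test_set]
            scores.append(evaluate(train, test))
    return sum(scores) / len(scores)

# Toy metric: fraction of the sample held out in each test fold (~1/k)
mean_frac = replicated_kfold(100, 5, replications=10,
                             evaluate=lambda tr, te: len(te) / 100)
print(round(mean_frac, 2))  # -> 0.2
```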

  12. Validation of methods for determination of free water content in poultry meat

    Directory of Open Access Journals (Sweden)

    Jarmila Žítková

    2007-01-01

    Methods for determination of free water content in poultry meat are described in Commission Regulation EEC No 1538/91 as amended and in ČSN 57 3100. Two of them (methods A and D) have been validated under the conditions of a Czech poultry processing plant. The slaughtering capacity was 6000 pieces per hour and carcasses were chilled by air with spraying. All determinations were carried out in the plant’s lab and in the lab of the Institute of Food Technology. Method A was used to detect the amount of water lost from frozen chicken during thawing in controlled conditions. Twenty carcasses from each of six weight groups (900 g–1400 g) were tested. The average thaw-loss water contents ranged between 0.46% and 1.71%; the average value over all 120 samples was 1.16%. The results were compared with the required maximum limit of 3.3%. The water loss content was negatively correlated with the weight of the chicken (r = –0.56). Method D (chemical test) was applied to determine the total water content of certain poultry cuts. It involved the determination of water and protein contents of 62 representative samples in total. The average ratios of water weight to protein weight (WA/RPA) were 3.29 in breast fillets, 4.06 in legs with a portion of the back, 4.00 in legs, 3.85 in thighs and 4.10 in drumsticks. The results corresponded to the required limit values of 3.40 for breast fillets and 4.15 for leg cuts. The WA/RPA ratio was correlated with the weight of the chicken negatively for breast fillets (r = –0.61) and positively for leg cuts (r = 0.70); the different correlations can be explained by the distribution of water, protein and fat in carcasses. Evaluating the methods by the percentage ratio of the average value to the limit showed that method D (results at 97% of the limit) was more exact than method A (results at 32% of the limit), but it is more expensive. Both methods
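
The two checks described above reduce to simple arithmetic. A minimal Python sketch, with the regulatory limits taken from the abstract; the carcass weights and water/protein percentages below are invented for illustration.

```python
# Hedged sketch of the two regulatory checks (limits quoted in the abstract).
def thaw_loss_percent(frozen_g, thawed_g):
    """Method A: water lost on thawing as % of the frozen carcass weight."""
    return 100.0 * (frozen_g - thawed_g) / frozen_g

def water_protein_ratio(water_pct, protein_pct):
    """Method D: ratio of water weight to protein weight (WA/RPA)."""
    return water_pct / protein_pct

THAW_LOSS_LIMIT = 3.3            # % maximum for method A
LIMIT_BREAST, LIMIT_LEG = 3.40, 4.15  # WA/RPA limits for method D

# Hypothetical carcass: 1200 g frozen, 1186.1 g thawed
print(round(thaw_loss_percent(1200.0, 1186.1), 2))   # -> 1.16
# Hypothetical breast fillet: 69.1 % water, 21.0 % protein
print(round(water_protein_ratio(69.1, 21.0), 2))     # -> 3.29
```

Both illustrative values fall under their respective limits, mirroring the average results reported in the abstract.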

  13. The establishment of tocopherol reference intervals for Hungarian adult population using a validated HPLC method.

    Science.gov (United States)

    Veres, Gábor; Szpisjak, László; Bajtai, Attila; Siska, Andrea; Klivényi, Péter; Ilisz, István; Földesi, Imre; Vécsei, László; Zádori, Dénes

    2017-09-01

    Evidence suggests that a decreased α-tocopherol (the most biologically active substance in the vitamin E group) level can cause neurological symptoms, most likely ataxia. The aim of the current study was to provide, for the first time, reference intervals for serum tocopherols in the adult Hungarian population with an appropriate sample size, recruiting healthy control subjects and neurological patients suffering from conditions without symptoms of ataxia, myopathy or cognitive deficiency. A validated HPLC method applying a diode array detector and rac-tocol as internal standard was utilized for that purpose. Furthermore, serum cholesterol levels were determined as well for data normalization. The calculated 2.5-97.5% reference intervals for α-, β/γ- and δ-tocopherols were 24.62-54.67, 0.81-3.69 and 0.29-1.07 μM, respectively, whereas the tocopherol/cholesterol ratios were 5.11-11.27, 0.14-0.72 and 0.06-0.22 μmol/mmol, respectively. The establishment of these reference intervals may improve the diagnostic accuracy of tocopherol measurements in certain neurological conditions with decreased tocopherol levels. Moreover, the current study draws special attention to possible pitfalls in the complex process of determining reference intervals, including the selection of the study population, the application of the internal standard, method validation and the calculation of tocopherol/cholesterol ratios. Copyright © 2017 John Wiley & Sons, Ltd.
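
The 2.5-97.5% reference-interval computation is a standard nonparametric percentile estimate. A minimal sketch with hypothetical tocopherol and cholesterol values; the distribution parameters are invented, chosen only so the simulated values fall near the reported interval.

```python
import random

def reference_interval(values, low=2.5, high=97.5):
    """Nonparametric 2.5-97.5 percentile reference interval
    (linear interpolation between order statistics)."""
    xs = sorted(values)
    def pct(p):
        k = (len(xs) - 1) * p / 100.0
        f = int(k)
        c = min(f + 1, len(xs) - 1)
        return xs[f] + (xs[c] - xs[f]) * (k - f)
    return pct(low), pct(high)

rng = random.Random(1)
# Hypothetical alpha-tocopherol (uM) and cholesterol (mmol/L) values;
# parameters are illustrative, not the study's data.
tocopherol = [rng.gauss(39.0, 7.0) for _ in range(400)]
cholesterol = [rng.gauss(5.0, 0.8) for _ in range(400)]

lo, hi = reference_interval(tocopherol)
ratio_lo, ratio_hi = reference_interval(
    [t / c for t, c in zip(tocopherol, cholesterol)])
print(lo < 39.0 < hi)  # -> True
```

Normalizing by cholesterol, as the study does, is a one-line extension once both analytes are measured per subject.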

  14. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    International Nuclear Information System (INIS)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-01-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot-peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by a laboratory instrument under software control. This provides improved sensitivity and high-frequency performance compared with conventional coils, so that swept-frequency EC measurements up to 50 MHz can be made, achieving a minimum skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities, which were found to agree with those measured using the direct current potential drop (DCPD) method.
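
The quoted 80 μm skin depth at 50 MHz follows from the standard eddy-current skin-depth formula. A quick check, assuming a nonmagnetic Ni-base superalloy with a resistivity of about 1.25 μΩ·m; that resistivity is an assumed typical value, not one given in the record.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def skin_depth(resistivity_ohm_m, freq_hz, mu_r=1.0):
    """Standard eddy-current skin depth: delta = sqrt(rho / (pi * f * mu))."""
    return math.sqrt(resistivity_ohm_m / (math.pi * freq_hz * MU0 * mu_r))

# Assumed resistivity ~1.25e-6 ohm*m for a nonmagnetic Ni-base superalloy
delta = skin_depth(1.25e-6, 50e6)
print(round(delta * 1e6))  # micrometres -> 80
```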

  15. Validation of a Residual Stress Measurement Method by Swept High-Frequency Eddy Currents

    Science.gov (United States)

    Lee, C.; Shen, Y.; Lo, C. C. H.; Nakagawa, N.

    2007-03-01

    This paper reports on a swept high-frequency eddy current (SHFEC) measurement method developed for electromagnetic nondestructive characterization of residual stresses in shot-peened aerospace materials. In this approach, we regard shot-peened surfaces as modified surface layers of varying conductivity, and determine the conductivity deviation profile by inversion of the SHFEC data. The SHFEC measurement system consists of a pair of closely matched printed-circuit-board coils driven by a laboratory instrument under software control. This provides improved sensitivity and high-frequency performance compared with conventional coils, so that swept-frequency EC measurements up to 50 MHz can be made, achieving a minimum skin depth of 80 μm for nickel-based superalloys. We devised a conductivity profile inversion procedure based on the laterally uniform multi-layer theory of Cheng, Dodd and Deeds. The main contribution of this paper is the methodology validation. Namely, the forward and inverse models were validated against measurements on artificial layer specimens consisting of metal films with different conductivities placed on a metallic substrate. The inversion determined the film conductivities, which were found to agree with those measured using the direct current potential drop (DCPD) method.

  16. Validation of an improved abnormality insertion method for medical image perception investigations

    Science.gov (United States)

    Madsen, Mark T.; Durst, Gregory R.; Caldwell, Robert T.; Schartz, Kevin M.; Thompson, Brad H.; Berbaum, Kevin S.

    2009-02-01

    The ability to insert abnormalities in clinical tomographic images makes image perception studies with medical images practical. We describe a new insertion technique and its experimental validation that uses complementary image masks to select an abnormality from a library and place it at a desired location. The method was validated using a 4-alternative forced-choice experiment. For each case, four quadrants were simultaneously displayed consisting of 5 consecutive frames of a chest CT with a pulmonary nodule. One quadrant was unaltered, while the other 3 had the nodule from the unaltered quadrant artificially inserted. 26 different sets were generated and repeated with order scrambling for a total of 52 cases. The cases were viewed by radiology staff and residents who ranked each quadrant by realistic appearance. On average, the observers were able to correctly identify the unaltered quadrant in 42% of cases, and identify the unaltered quadrant both times it appeared in 25% of cases. Consensus, defined by a majority of readers, correctly identified the unaltered quadrant in only 29% of 52 cases. For repeats, the consensus observer successfully identified the unaltered quadrant only once. We conclude that the insertion method can be used to reliably place abnormalities in perception experiments.
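
The chance level in this 4-alternative forced-choice design is 25%, so the observers' 42% correct rate can be tested against guessing with an exact binomial tail. A sketch using only the numbers quoted above.

```python
from math import comb

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): exact upper-tail probability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_cases = 52                        # quadrant sets viewed (from the abstract)
chance = 0.25                       # 4-AFC guessing rate
observed = round(0.42 * n_cases)    # ~42 % correct -> 22 of 52

p_value = binom_tail(n_cases, observed, chance)
print(p_value < 0.05)  # detection is above chance, yet far from reliable
```

A small, significant departure from chance is consistent with the authors' conclusion: insertions are detectable slightly above guessing but cannot be reliably distinguished from real nodules.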

  17. Validation of a high performance liquid chromatography method for the stabilization of epigallocatechin gallate.

    Science.gov (United States)

    Fangueiro, Joana F; Parra, Alexander; Silva, Amélia M; Egea, Maria A; Souto, Eliana B; Garcia, Maria L; Calpena, Ana C

    2014-11-20

    Epigallocatechin gallate (EGCG) is a green tea catechin with potential health benefits, such as anti-oxidant, anti-carcinogenic and anti-inflammatory effects. In general, EGCG is highly susceptible to degradation and therefore presents stability problems. The present paper focused on the stability of EGCG in HEPES (N-2-hydroxyethylpiperazine-N'-2-ethanesulfonic acid) medium with respect to pH, storage temperature and the presence of ascorbic acid, a reducing agent. The evaluation of EGCG in HEPES buffer demonstrated that this molecule is not able to maintain its physicochemical properties and potential beneficial effects, since it is partially or completely degraded, depending on the EGCG concentration. The storage temperatures most suitable for maintaining its structure were the lower ones (4 or -20 °C). A pH of 3.5 provided greater stability than pH 7.4. However, the presence of a reducing agent (i.e., ascorbic acid) provided the greatest protection against degradation of EGCG. A validation method based on RP-HPLC with UV-vis detection was carried out for two media: water and a biocompatible physiological medium composed of Transcutol®P, ethanol and ascorbic acid. The quantification of EGCG requires a validated HPLC method, which could then be applied in pharmacokinetic and pharmacodynamic studies. Copyright © 2014. Published by Elsevier B.V.

  18. An extended validation of the last generation of particle finite element method for free surface flows

    Science.gov (United States)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This interesting variant, previously described in the literature as PFEM-2, is able to use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems; the main topic of this paper is a deep validation of this methodology for a wider range of flows. To accomplish this task, improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are covered, with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time steps. The results of the different free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  19. "INTRODUCING A FULL VALIDATED ANALYTICAL PROCEDURE AS AN OFFICIAL COMPENDIAL METHOD FOR FENTANYL TRANSDERMAL PATCHES"

    Directory of Open Access Journals (Sweden)

    Amir Mehdizadeh

    2005-04-01

    A simple, sensitive and specific HPLC method and a simple, fast extraction procedure were developed for the quantitative analysis of fentanyl transdermal patches. Chloroform, methanol and ethanol were used as extracting solvents, with recoveries of 92.1%, 94.3% and 99.4%, respectively. Fentanyl was extracted with ethanol, and the fentanyl eluted through the C18 column was monitored by UV detection at 230 nm. Linearity was obtained in the range of 0.5-10 µg/mL with a correlation coefficient (r2) of 0.9992. Both intra- and inter-day accuracy and precision were within acceptable limits. The detection limit (DL) and quantitation limit (QL) were 0.15 and 0.5 µg/mL, respectively. Other validation characteristics such as selectivity, robustness and ruggedness were also evaluated. Following method validation, a system suitability test (SST) including capacity factor (k´), plate number (N), tailing factor (T), and RSD was defined for routine testing.
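
A calibration line and its correlation coefficient, as reported for the 0.5-10 µg/mL range, reduce to ordinary least squares. A sketch with invented peak areas; the detector responses below are illustrative, not the paper's data.

```python
# Least-squares calibration sketch for an HPLC assay (invented responses).
def linfit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1.0 - ss_res / ss_tot   # slope, intercept, r^2

conc = [0.5, 1, 2, 4, 6, 8, 10]                       # ug/mL standards
area = [10.4, 20.1, 40.6, 80.3, 120.9, 160.2, 200.8]  # detector response
slope, intercept, r2 = linfit(conc, area)

# Back-calculate an unknown sample from its peak area
unknown = (95.0 - intercept) / slope
print(r2 > 0.999)  # acceptance similar in spirit to the reported r2 = 0.9992
```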

  20. Method validation to measure Strontium-90 in urine sample for internal dosimetry assessment

    International Nuclear Information System (INIS)

    Bitar, A.; Maghrabi, M.; Alhamwi, A.

    2010-12-01

    Occupationally exposed individuals at some scientific centers in the Syrian Arab Republic receive potentially significant intakes by ingestion or inhalation during the production of radiopharmaceutical compounds. The radioactive intake received varies with the amount of radionuclides released during the preparation processes, the working conditions, and the way radiation protection procedures are applied. TLDs (thermoluminescence dosimeters) are usually used for external radiation monitoring of workers in radioisotope centers. During the external monitoring programme, it was noticed that some workers were exposed to a high external dose resulting from a radiation accident in their laboratory while preparing Y-90 from Sr-90. For internal dose assessment, a chemical method to measure the amount of Sr-90 in urine samples was validated and is explained in detail in this study. Urine bioassays were carried out and the activities of 90 Sr were determined using a liquid scintillation counter. The validated method was then used for internal occupational monitoring through the design of an internal monitoring programme. The programme was established for four workers who each handle about 20 mCi twice per month. First, a theoretical study was done to assess the maximum risks for the workers. The calculated internal doses showed that it is necessary to apply a routine internal monitoring programme for these workers. (author)

  1. Validation of the Abdominal Pain Index using a revised scoring method.

    Science.gov (United States)

    Laird, Kelsey T; Sherman, Amanda L; Smith, Craig A; Walker, Lynn S

    2015-06-01

    Evaluate the psychometric properties of child- and parent-report versions of the four-item Abdominal Pain Index (API) in children with functional abdominal pain (FAP) and healthy controls, using a revised scoring method that facilitates comparisons of scores across samples and time. Pediatric patients aged 8-18 years with FAP and controls completed the API at baseline (N = 1,967); a subset of their parents (N = 290) completed the API regarding the child's pain. Subsets of patients completed follow-up assessments at 2 weeks (N = 231), 3 months (N = 330), and 6 months (N = 107). Subsets of both patients (N = 389) and healthy controls (N = 172) completed a long-term follow-up assessment (mean age at follow-up = 20.21 years, SD = 3.75). The API demonstrated good concurrent, discriminant, and construct validity, as well as good internal consistency. We conclude that the API, using the revised scoring method, is a useful, reliable, and valid measure of abdominal pain severity. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
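
The internal consistency reported above is conventionally quantified by Cronbach's alpha. A minimal sketch for a four-item scale such as the API; the respondent scores below are hypothetical, and this is a generic formula rather than the study's analysis pipeline.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns of equal length.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(c) for c in items) / var(totals))

# Four hypothetical API items scored by five respondents (0-4 scale)
items = [
    [1, 2, 3, 4, 0],
    [1, 2, 4, 4, 1],
    [0, 2, 3, 4, 1],
    [1, 3, 3, 4, 0],
]
alpha = cronbach_alpha(items)
print(0.8 < alpha <= 1.0)  # -> True (strongly correlated toy items)
```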

  2. Sensitivity and uncertainty analyses applied to criticality safety validation, methods development. Volume 1

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Childs, R.L.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The S/U methods that are presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the D parameters, which represent the differences by group of sensitivity profiles, and the ck parameters, which are the correlation coefficients for the calculational uncertainties between systems; each set of parameters gives information relative to the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application)
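
In S/U-based validation, the ck parameter is commonly computed as the correlation of calculational uncertainties between two systems, from their group-wise sensitivity profiles and a nuclear-data covariance matrix. The sketch below is written under that assumption; the 3-group covariance and sensitivity profiles are invented for illustration.

```python
import numpy as np

def c_k(s1, s2, cov):
    """Correlation of calculational uncertainties between two systems.

    s1, s2 : group-wise sensitivity profiles (1-D arrays)
    cov    : nuclear-data relative covariance matrix
    c_k = (s1 C s2) / sqrt((s1 C s1) * (s2 C s2))
    """
    num = s1 @ cov @ s2
    return num / np.sqrt((s1 @ cov @ s1) * (s2 @ cov @ s2))

cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])       # illustrative 3-group covariance
experiment = np.array([0.8, 0.5, 0.1])     # benchmark sensitivity profile
application = np.array([0.7, 0.6, 0.2])    # application sensitivity profile

ck_cross = float(c_k(experiment, application, cov))
ck_same = float(c_k(experiment, experiment, cov))
print(round(ck_cross, 3))  # -> 0.98 (high similarity between the two systems)
```

A ck near 1 indicates the benchmark covers the application's uncertainty sources, which is precisely how such parameters support area-of-applicability arguments.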

  3. Amphenicols stability in medicated feed – development and validation of liquid chromatography method

    Directory of Open Access Journals (Sweden)

    Pietro Wojciech Jerzy

    2014-12-01

    Full Text Available A liquid chromatography-ultraviolet detection method for the determination of florfenicol (FF and thiamphenicol (TAP in feeds is presented. The method comprises the extraction of analytes from the matrix with a mixture of methanol and acetonitrile, drying of the extract, and its dissolution in phosphate buffer. The analysis was performed with a gradient programme of the mobile phase composed of acetonitrile and buffer (pH = 7.3 on a Zorbax Eclipse Plus C18 (150 × 4.6 mm, 5 μm analytical column with UV (λ = 220 nm detection. The analytical procedure has been successfully adopted and validated for quantitative determination of florfenicol and thiamphenicol in feed samples. Sensitivity, specificity, linearity, repeatability, and intralaboratory reproducibility were included in the validation. The mean recovery of amphenicols was 93.5% within the working range of 50-4000 mg/kg. Simultaneous determination of chloramphenicol, which is banned in the feed, was also included within the same procedure of FF and TAP stability studies. Storing the medicated feed at room temperature for up to one month decreased concentration in the investigated drugs even by 45%. These findings are relevant to successful provision of therapy to animals.

  4. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    Directory of Open Access Journals (Sweden)

    Yamada Yoichi

    2012-12-01

    Background: We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm the “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings: We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified differentially expressed GO terms. Conclusions: MIMGO is a reliable method to identify differentially expressed GO terms comprehensively.

  5. Validated UV-spectrophotometric method for the evaluation of the efficacy of makeup remover.

    Science.gov (United States)

    Charoennit, P; Lourith, N

    2012-04-01

    A UV-spectrophotometric method for the analysis of makeup remover was developed and validated according to ICH guidelines. Three makeup removers, whose main ingredients consisted of vegetable oil (A), mineral oil and silicone (B), and mineral oil and water (C), were sampled in this study. Ethanol was the optimal solvent because it did not interfere with the maximum absorbance of the liquid foundation at 250 nm. Linearity was determined over a range of makeup concentrations from 0.540 to 1.412 mg mL⁻¹ (R² = 0.9977). The accuracy of the method was determined by analysing low, intermediate and high concentrations of the liquid foundation, giving 78.59-91.57% recoveries with acceptable relative standard deviation; the validated method was then applied to the evaluation of makeup remover efficacy. © 2011 The Authors. ICS © 2011 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  6. Development and validation of a stability indicating HPTLC-densitometric method for lafutidine

    Directory of Open Access Journals (Sweden)

    Dinesh Dhamecha

    2013-01-01

    Background: A simple, selective, precise, and stability-indicating high-performance thin layer chromatographic method has been established and validated for the analysis of lafutidine in bulk drug and formulations. Materials and Methods: The compounds were analyzed on aluminium-backed silica gel 60 F254 plates with chloroform:ethanol:acetic acid (8:1:1) as mobile phase. Densitometric analysis of lafutidine was performed at 230 nm. Results: Regression analysis data for the calibration plots indicated a good linear relationship between response and concentration over the range 100-500 ng per spot. The correlation coefficient (r2) was 0.998±0.002. Conclusion: Lafutidine was subjected to acid, base, peroxide, and sunlight degradation. In the stability tests, the drug was susceptible to acid and base hydrolysis, oxidation, and photodegradation.

  7. Numerical and experimental validation of a particle Galerkin method for metal grinding simulation

    Science.gov (United States)

    Wu, C. T.; Bui, Tinh Quoc; Wu, Youcai; Luo, Tzui-Liang; Wang, Morris; Liao, Chien-Chih; Chen, Pei-Yin; Lai, Yu-Sheng

    2018-03-01

    In this paper, a numerical approach with an experimental validation is introduced for modelling high-speed metal grinding processes in 6061-T6 aluminum alloys. The derivation of the present numerical method starts with an establishment of a stabilized particle Galerkin approximation. A non-residual penalty term from strain smoothing is introduced as a means of stabilizing the particle Galerkin method. Additionally, second-order strain gradients are introduced to the penalized functional for the regularization of damage-induced strain localization problem. To handle the severe deformation in metal grinding simulation, an adaptive anisotropic Lagrangian kernel is employed. Finally, the formulation incorporates a bond-based failure criterion to bypass the prospective spurious damage growth issues in material failure and cutting debris simulation. A three-dimensional metal grinding problem is analyzed and compared with the experimental results to demonstrate the effectiveness and accuracy of the proposed numerical approach.

  8. Development and Validation of High Performance Liquid Chromatographic Method for Determination of Lamivudine from Pharmaceutical Preparation

    Directory of Open Access Journals (Sweden)

    S. K. Patro

    2010-01-01

    A new, simple, specific, accurate and precise RP-HPLC method was developed for the determination of lamivudine in pure form and tablet formulations. A Thermo BDS C18 column was used in isocratic mode, with a mobile phase consisting of 0.01 M ammonium dihydrogen orthophosphate buffer, adjusted to pH 2.48 using formic acid, and methanol in the ratio of 50:50. The flow rate was set at 0.6 mL/min and UV detection was carried out at 264 nm. The retention times of lamivudine and nevirapine were 2.825 min and 4.958 min, respectively. The method was validated for linearity, precision, robustness and recovery. Linearity for lamivudine was found in the range of 50-175 μg/mL. Hence, the method can be applied for routine quality control of lamivudine in bulk and pharmaceutical formulations.

  9. Validation of the activity expansion method with ultrahigh pressure shock equations of state

    International Nuclear Information System (INIS)

    Rogers, F.J.; Young, D.A.

    1997-01-01

    Laser shock experiments have recently been used to measure the equation of state (EOS) of matter in the ultrahigh pressure region between condensed matter and a weakly coupled plasma. Some ultrahigh pressure data from nuclear-generated shocks are also available. Matter at these conditions has proven very difficult to treat theoretically. The many-body activity expansion method (ACTEX) has been used for some time to calculate EOS and opacity data in this region, for use in modeling inertial confinement fusion and stellar interior plasmas. In the present work, we carry out a detailed comparison with the available experimental data in order to validate the method. The agreement is good, showing that ACTEX adequately describes strongly shocked matter. copyright 1997 The American Physical Society

  10. Simple Methods for Scanner Drift Normalization Validated for Automatic Segmentation of Knee Magnetic Resonance Imaging

    DEFF Research Database (Denmark)

    Dam, Erik Bjørnager

    2018-01-01

    Scanner drift is a well-known magnetic resonance imaging (MRI) artifact characterized by gradual signal degradation and scan intensity changes over time. In addition, hardware and software updates may imply abrupt changes in signal. The combined effects are particularly challenging for automatic image analysis methods used in longitudinal studies. The implication is increased measurement variation and a risk of bias in the estimations (e.g. in the volume change for a structure). We proposed two quite different approaches for scanner drift normalization and demonstrated their performance for segmentation of knee MRI using the fully automatic KneeIQ framework. The validation included a total of 1975 scans from both high-field and low-field MRI. The results demonstrated that the pre-processing method denoted Atlas Affine Normalization significantly removed scanner drift effects and ensured…
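
The record does not spell out the Atlas Affine Normalization procedure, so the sketch below shows a generic affine intensity normalization against a reference atlas: a stand-in illustrating the idea of drift correction, not necessarily the paper's method.

```python
import numpy as np

def affine_intensity_normalize(scan, atlas):
    """Map scan intensities x -> a*x + b so that the normalized scan's
    mean and standard deviation match the reference atlas.

    This is a generic drift-correction stand-in; the paper's exact
    Atlas Affine Normalization procedure may differ."""
    a = atlas.std() / scan.std()
    b = atlas.mean() - a * scan.mean()
    return a * scan + b

rng = np.random.default_rng(0)
atlas = rng.normal(100.0, 20.0, 10_000)                  # reference intensities
drifted = 0.9 * rng.normal(100.0, 20.0, 10_000) + 15.0   # simulated drift
fixed = affine_intensity_normalize(drifted, atlas)
print(abs(fixed.mean() - atlas.mean()) < 1e-6)  # -> True
```

After normalization, downstream segmentation sees intensities on a common scale across time points, which is exactly what longitudinal volume-change analysis needs.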

  11. Validation of the activity expansion method with ultrahigh pressure shock equations of state

    Science.gov (United States)

    Rogers, Forrest J.; Young, David A.

    1997-11-01

    Laser shock experiments have recently been used to measure the equation of state (EOS) of matter in the ultrahigh pressure region between condensed matter and a weakly coupled plasma. Some ultrahigh pressure data from nuclear-generated shocks are also available. Matter at these conditions has proven very difficult to treat theoretically. The many-body activity expansion method (ACTEX) has been used for some time to calculate EOS and opacity data in this region, for use in modeling inertial confinement fusion and stellar interior plasmas. In the present work, we carry out a detailed comparison with the available experimental data in order to validate the method. The agreement is good, showing that ACTEX adequately describes strongly shocked matter.

  12. Validation of the activity expansion method with ultrahigh pressure shock equations of state

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, F.J.; Young, D.A. [Physics Department, Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94550 (United States)

    1997-11-01

    Laser shock experiments have recently been used to measure the equation of state (EOS) of matter in the ultrahigh pressure region between condensed matter and a weakly coupled plasma. Some ultrahigh pressure data from nuclear-generated shocks are also available. Matter at these conditions has proven very difficult to treat theoretically. The many-body activity expansion method (ACTEX) has been used for some time to calculate EOS and opacity data in this region, for use in modeling inertial confinement fusion and stellar interior plasmas. In the present work, we carry out a detailed comparison with the available experimental data in order to validate the method. The agreement is good, showing that ACTEX adequately describes strongly shocked matter. copyright 1997 The American Physical Society

  13. Validation of multivariate classification methods using analytical fingerprints – concept and case study on organic feed for laying hens

    NARCIS (Netherlands)

    Alewijn, Martin; Voet, van der Hilko; Ruth, van Saskia

    2016-01-01

    Multivariate classification methods based on analytical fingerprints have found many applications in the food and feed area, but practical applications are still scarce due to a lack of a generally accepted validation procedure. This paper proposes a new approach for validation of this type of

  14. Comparison of CORA and EN4 in-situ datasets validation methods, toward a better quality merged dataset.

    Science.gov (United States)

    Szekely, Tanguy; Killick, Rachel; Gourrion, Jerome; Reverdin, Gilles

    2017-04-01

    A comparison of the validation test results is now performed to find the best features of both datasets. The study shows the differences between the EN4 and CORA validation results. It highlights the complementarity between the EN4 and CORA higher-order tests. The design of the CORA and EN4 validation charts is discussed to understand how a different approach to the dataset scope can lead to differences in data validation. The new validation chart of the Copernicus Marine Service dataset is presented.

  15. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available Lattice composite fuselage structures are being developed as an alternative to conventional composite structures based on laminated skin and stiffeners. The layout of lattice structures exploits the advantages of current composite materials to the maximum extent while minimizing their main shortcomings, which provides higher weight efficiency for these structures in comparison with conventional analogues. The development of lattice composite structures requires novel methods of strength analysis, because conventional methods, as a rule, are aimed at thin-walled elements and do not give a confident estimate of the local strength of highly loaded unidirectional composite ribs. The present work presents a method for rapid strength analysis of lattice composite structures, based on specialized FE models of unidirectional composite ribs and their intersections. Within the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs, modeled by membrane finite elements. The parameters of the flanges and webs are calculated automatically from the condition that the stiffness characteristics of the real rib and of the model be equal. This approach permits local strength analysis of highly loaded ribs of a lattice structure without the use of three-dimensional finite elements, which shortens computation time and substantially simplifies the analysis of the results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype shell of a lattice composite fuselage section were used. The prototype section was manufactured at CRISM and tested at TsAGI within a number of Russian and international scientific projects. The results of validation have shown that the suggested method provides highly efficient strength analysis, keeping

  16. Development and validation of a simplified titration method for monitoring volatile fatty acids in anaerobic digestion.

    Science.gov (United States)

    Sun, Hao; Guo, Jianbin; Wu, Shubiao; Liu, Fang; Dong, Renjie

    2017-09-01

    The volatile fatty acid (VFA) concentration is considered one of the most sensitive process performance indicators in the anaerobic digestion (AD) process. However, accurate determination of the VFA concentration in AD processes normally requires advanced equipment and complex pretreatment procedures. A simplified method with fewer sample pretreatment steps and improved accuracy is greatly needed, particularly for on-site application. This report outlines improvements to the Nordmann method, one of the most popular titrations used for VFA monitoring. The influence of ionic and solid interfering subsystems in titrated samples on the accuracy of results was discussed. The total solids content of the titrated samples was the main factor affecting accuracy in VFA monitoring. Moreover, a high linear correlation was established between the total solids content and the difference in VFA measurements between the traditional Nordmann equation and gas chromatography (GC). Accordingly, a simplified titration method was developed and validated using a semi-continuous experiment of chicken manure anaerobic digestion at various organic loading rates. The good fit of the results obtained by this method with GC results strongly supports the potential application of this method to VFA monitoring. Copyright © 2017. Published by Elsevier Ltd.
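
The TS-based correction at the heart of such a simplified titration can be sketched as a linear regression; the calibration numbers below are hypothetical placeholders, not the authors' data.

```python
import numpy as np

# Hypothetical calibration data (not from the paper): total solids content
# (% w/w) of titrated samples vs. the difference between Nordmann-titration
# VFA and the GC reference value (g/L).
ts = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
diff = np.array([0.21, 0.39, 0.62, 0.80, 1.01])

# Fit the linear TS-vs-difference correlation reported in the study.
slope, intercept = np.polyfit(ts, diff, 1)

def corrected_vfa(vfa_titration, total_solids):
    """Subtract the TS-dependent bias from a raw titration result (g/L)."""
    return vfa_titration - (slope * total_solids + intercept)
```

Once the regression is calibrated against GC for a given substrate, only the titration and a routine TS measurement are needed on site.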

  17. European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.

    Science.gov (United States)

    Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario

    2014-08-01

    The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, and consequently there is a need for an alternative methodology for detection of this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can generate final results the day after sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, with pork meat selected as the food model. The limit of detection observed was down to 10 CFU per 25 g of sample, showing excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results obtained with the Real-Time PCR-based method were compared to those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard: it performs equally well, dramatically shortens the analytical process, and can easily be implemented routinely by Competent Authority and Food Industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Compatible validated spectrofluorimetric and spectrophotometric methods for determination of vildagliptin and saxagliptin by factorial design experiments

    Science.gov (United States)

    Abdel-Aziz, Omar; Ayad, Miriam F.; Tadros, Mariam M.

    2015-04-01

    Simple, selective and reproducible spectrofluorimetric and spectrophotometric methods have been developed for the determination of vildagliptin and saxagliptin in bulk and their pharmaceutical dosage forms. The first proposed spectrofluorimetric method is based on the dansylation reaction of the amino group of vildagliptin with dansyl chloride to form a highly fluorescent product. The formed product was measured spectrofluorimetrically at 455 nm after excitation at 345 nm. Beer's law was obeyed in a concentration range of 100-600 μg ml-1. The second proposed spectrophotometric method is based on the charge transfer complex of saxagliptin with tetrachloro-1,4-benzoquinone (p-chloranil). The formed charge transfer complex was measured spectrophotometrically at 530 nm. Beer's law was obeyed in a concentration range of 100-850 μg ml-1. The third proposed spectrophotometric method is based on the condensation reaction of the primary amino group of saxagliptin with formaldehyde and acetyl acetone to form a yellow colored product (known as the Hantzsch reaction), measured at 342.5 nm. Beer's law was obeyed in a concentration range of 50-300 μg ml-1. All the variables were studied to optimize the reaction conditions using factorial design. The developed methods were validated and proved to be specific and accurate for quality control of vildagliptin and saxagliptin in their pharmaceutical dosage forms.

  19. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    Science.gov (United States)

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
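
The weight-based GMO content implied by a conversion factor can be illustrated with a minimal sketch. The formula follows the usual copy-ratio convention for event-specific quantitative PCR; the numbers in the example are placeholders, not values from the study.

```python
def gmo_content_percent(event_copies, endogenous_copies, conversion_factor):
    """Weight-based GMO content (%) from real-time PCR copy-number estimates.

    conversion_factor is the copy-number ratio (event / endogenous reference)
    measured in 100% GM material; dividing the observed ratio by it converts
    a copy-number ratio into a weight-based fraction.
    """
    return event_copies / endogenous_copies / conversion_factor * 100.0
```

For example, if a sample's event/endogenous copy ratio equals the conversion factor itself, the sample is reported as 100% GM.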

  20. Development and validation of an HPLC method for tetracycline-related USP monographs.

    Science.gov (United States)

    Hussien, Emad M

    2014-09-01

    A novel reversed-phase HPLC method was developed and validated for the assay of tetracycline hydrochloride and the limit of the 4-epianhydrotetracycline hydrochloride impurity in tetracycline hydrochloride commercial bulk and pharmaceutical products. The method employed L1 (3 µm, 150 × 4.6 mm) columns, a mobile phase of 0.1% phosphoric acid and acetonitrile at a flow rate of 1.0 mL/min, and detection at 280 nm. The separation was performed in HPLC gradient mode. Forced degradation studies showed that tetracycline eluted as a spectrally pure peak and was well resolved from its degradation products. The fast degradation of tetracycline hydrochloride and 4-epianhydrotetracycline hydrochloride in solution was retarded by controlling the autosampler temperature at 4 °C and using 0.1% H3PO4 as diluent. The robustness of the method was tested starting with the maximum variations allowed in the US Pharmacopeia (USP) general chapter on chromatography. The method was linear over the range 80-120% of the assay concentration (0.1 mg/mL) for tetracycline hydrochloride and 50-150% of the acceptance criteria specified in the individual USP monographs for 4-epianhydrotetracycline hydrochloride. The limit of quantification for 4-epianhydrotetracycline hydrochloride was 0.1 µg/mL, 20 times lower than the acceptance criteria. The method was specific, precise, accurate and robust. Copyright © 2014 John Wiley & Sons, Ltd.

  1. Method validation to determine total alpha beta emitters in water samples using LSC

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.; Al-akel, B.; Saaid, S.

    2006-06-01

    In this work, a method to determine gross alpha and beta emitters in water samples using a liquid scintillation counter was validated. 200 ml of water from each sample were evaporated to 20 ml, and 8 ml of the concentrate were mixed with 12 ml of a suitable cocktail and measured on a Wallac Winspectral 1414 liquid scintillation counter. The lower detection limit (LDL) of this method was 0.33 DPM for total alpha emitters and 1.3 DPM for total beta emitters. The reproducibility limit was ±2.32 DPM and ±1.41 DPM for total alpha and beta emitters respectively, and the repeatability limit was ±2.19 DPM and ±1.11 DPM respectively. The method is easy and fast because of the simple preparation steps and the large number of samples that can be measured at the same time. In addition, many real samples and standard samples were analyzed by the method and showed accurate results, so it was concluded that the method can be used with various water samples. (author)
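
A lower detection limit of the kind quoted above is commonly derived from the Currie formulation; the sketch below uses that generic formula with made-up counting parameters, and is not necessarily the procedure the authors followed.

```python
import math

def lower_detection_limit_dpm(background_cpm, count_time_min, efficiency):
    """Currie-style lower detection limit for an LSC measurement, in DPM.

    Generic form: L_D = 2.71 + 4.65 * sqrt(B) in counts, where B is the
    expected background counts; converted to a rate and divided by the
    counting efficiency to express the limit in disintegrations per minute.
    """
    b_counts = background_cpm * count_time_min
    ld_counts = 2.71 + 4.65 * math.sqrt(b_counts)
    return ld_counts / (count_time_min * efficiency)
```

Longer count times and higher efficiency both push the detection limit down, which is why gross alpha/beta screening protocols fix a minimum counting time.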

  2. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports of an actual system development project in Japan. We found that there were a certain amount of defects regarding missing or defective stimuli and responses, which can be detected using our proposed method if this method is used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.
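
The cross-check between two actors' state transition tables can be sketched as set comparisons of observable stimuli and responses; the ATM-style actors below are invented for illustration, not taken from the paper's case study.

```python
# Hypothetical encoding of the proposed idea: each actor's state transition
# table lists the observable stimuli it sends and the responses it expects.
user_sends = {"insert_card", "enter_pin", "request_cash"}
user_expects = {"prompt_pin", "dispense_cash", "eject_card"}

atm_receives = {"insert_card", "enter_pin", "request_cash"}
atm_sends = {"prompt_pin", "dispense_cash"}  # "eject_card" missing -> defect

def interaction_defects(sends_a, expects_a, receives_b, sends_b):
    """Return stimuli/responses with no counterpart in the other actor's table."""
    return {
        "unhandled_stimuli": sends_a - receives_b,
        "missing_responses": expects_a - sends_b,
    }
```

Running the check over both directions of the interaction flags exactly the kind of missing or defective stimuli and responses the paper counts in its defect-report analysis.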

  3. Validation of analytical methods for the quality control of Naproxen suppositories

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar; Hernandez Contreras, Orestes Yuniel

    2011-01-01

    The analysis methods to be used for the quality control of the future Cuban-made Naproxen suppositories for adults and children were developed for the first time in this paper. A method based on direct ultraviolet spectrophotometry was put forward, which proved to be specific, linear, accurate and precise for the quality control of Naproxen suppositories, taking into account the presence of chromophore groups in the molecule. Likewise, the direct semi-aqueous acid-base volumetric method used for the quality control of the Naproxen raw material was adapted to the quality control of suppositories. The validation process demonstrated the adequate specificity of this method with respect to the formulation components, as well as its linearity, accuracy and precision in the 1-3 mg/ml range. The final results of the two methods were compared, and no statistically significant differences were found between the replicates at each dose; therefore, both methods may be used in the quality control of Naproxen suppositories.

  4. Optimization and Validation of Quantitative Spectrophotometric Methods for the Determination of Alfuzosin in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    M. Vamsi Krishna

    2007-01-01

    Full Text Available Three accurate, simple and precise spectrophotometric methods for the determination of alfuzosin hydrochloride in bulk drugs and tablets are developed. The first method is based on the reaction of alfuzosin with ninhydrin reagent in N,N'-dimethylformamide (DMF) medium, producing a colored product which absorbs maximally at 575 nm. Beer's law is obeyed in the concentration range 12.5-62.5 µg/mL of alfuzosin. The second method is based on the reaction of the drug with ascorbic acid in DMF medium, resulting in the formation of a colored product which absorbs maximally at 530 nm. Beer's law is obeyed in the concentration range 10-50 µg/mL of alfuzosin. The third method is based on the reaction of alfuzosin with p-benzoquinone (PBQ) to form a colored product with λmax at 400 nm. The products of the reactions were stable for 2 h at room temperature. The optimum experimental parameters for the reactions have been studied. The validity of the described procedures was assessed. Statistical analysis of the results has been carried out, revealing high accuracy and good precision. The proposed methods could be used for the determination of alfuzosin in pharmaceutical formulations. The procedures are rapid, simple and suitable for quality control application.
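
The Beer's-law calibration underlying such methods is a straight-line fit of absorbance against concentration. The sketch below uses the reported 12.5-62.5 µg/mL range of the ninhydrin method, but the absorbance values are invented for illustration.

```python
import numpy as np

# Hypothetical calibration standards across the reported Beer's-law range.
conc = np.array([12.5, 25.0, 37.5, 50.0, 62.5])          # ug/mL
absorbance = np.array([0.110, 0.221, 0.330, 0.442, 0.551])  # at 575 nm

slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]   # linearity check (r close to 1)

def concentration(a):
    """Back-calculate an unknown's concentration from its absorbance."""
    return (a - intercept) / slope
```

The same fit-then-invert pattern serves all three methods; only the wavelength and working range change.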

  5. Validation of methods for measurement of insulin secretion in humans in vivo

    DEFF Research Database (Denmark)

    Kjems, L L; Christiansen, E; Vølund, A

    2000-01-01

    To detect and understand the changes in beta-cell function in the pathogenesis of type 2 diabetes, an accurate and precise estimation of the prehepatic insulin secretion rate (ISR) is essential. There are two common methods to assess ISR: the deconvolution method (by Eaton and Polonsky), considered the … of these mathematical techniques for quantification of insulin secretion have been tested in dogs, but not in humans. In the present studies, we examined the validity of both methods to recover the known infusion rates of insulin and C-peptide mimicking ISR during an oral glucose tolerance test. ISR from both …, and a close agreement was found for the results of an oral glucose tolerance test. We also studied whether C-peptide kinetics are influenced by somatostatin infusion. The decay curves after bolus injection of exogenous biosynthetic human C-peptide, the kinetic parameters, and the metabolic clearance rate were …
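
The deconvolution idea can be sketched numerically: plasma C-peptide is modeled as the convolution of ISR with a decaying impulse response, and ISR is recovered by inverting the convolution matrix. The two-exponential kernel parameters and the secretion profile below are synthetic placeholders, not the Eaton/Polonsky per-subject values.

```python
import numpy as np

dt = 5.0                                   # sampling interval, min
t = np.arange(0.0, 120.0, dt)              # 24 sample times
# Synthetic two-exponential C-peptide impulse response (placeholder params).
impulse = 0.76 * np.exp(-0.14 * t) + 0.24 * np.exp(-0.02 * t)

# Forward model: lower-triangular (causal) convolution matrix.
n = len(t)
A = np.zeros((n, n))
for i in range(n):
    A[i, : i + 1] = impulse[: i + 1][::-1] * dt

true_isr = np.exp(-((t - 30.0) ** 2) / 400.0)  # synthetic secretion pulse
cpep = A @ true_isr                            # simulated C-peptide profile

# Recover ISR by linear inversion of the convolution (noise-free case).
isr_hat = np.linalg.lstsq(A, cpep, rcond=None)[0]
```

With noisy measured C-peptide, the inversion is regularized in practice; the validation question studied here is whether such a scheme recovers a known infusion rate in humans.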

  6. A validated stability-indicating UPLC method for desloratadine and its impurities in pharmaceutical dosage forms.

    Science.gov (United States)

    Rao, Dantu Durga; Satyanarayana, N V; Malleswara Reddy, A; Sait, Shakil S; Chakole, Dinesh; Mukkanti, K

    2010-02-05

    A novel stability-indicating gradient reverse-phase ultra-performance liquid chromatographic (RP-UPLC) method was developed for the determination of the purity of desloratadine in the presence of its impurities and forced degradation products. The method was developed using a Waters Acquity BEH C18 column with a mobile phase containing a gradient mixture of solvents A and B. The eluted compounds were monitored at 280 nm. The run time was 8 min, within which desloratadine and its five impurities were well separated. Desloratadine was subjected to oxidative, acid, base, hydrolytic, thermal and photolytic stress conditions. It was found to degrade significantly under oxidative and thermal stress and to be stable under acid, base, hydrolytic and photolytic conditions. The degradation products were well resolved from the main peak and its impurities, which proved the stability-indicating power of the method. The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of detection, limit of quantification, accuracy, precision and robustness. The method is also suitable for the assay determination of desloratadine in pharmaceutical dosage forms.

  7. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Ana Claudia O.; Matoso, Erika, E-mail: anaclaudia.oliveira@marinha.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP/CEA), Iperó, SP (Brazil). Centro Experimental ARAMAR

    2017-07-01

    The problems caused by contamination of water bodies demand strict control of disposal in rivers, seas and oceans. Nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost option. This method consists of adding chromotropic acid to the sample in the presence of H2SO4. The absorbance related to the yellow product can be measured with a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by using other reagents that eliminate interferences. The aim of this study was to validate the chromotropic acid method for nitrate determination in waters. This method is used in the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mgN/L) compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters used in this work. (author)

  8. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    International Nuclear Information System (INIS)

    Santos, Ana Claudia O.; Matoso, Erika

    2017-01-01

    The problems caused by contamination of water bodies demand strict control of disposal in rivers, seas and oceans. Nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost option. This method consists of adding chromotropic acid to the sample in the presence of H2SO4. The absorbance related to the yellow product can be measured with a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by using other reagents that eliminate interferences. The aim of this study was to validate the chromotropic acid method for nitrate determination in waters. This method is used in the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mgN/L) compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters used in this work. (author)

  9. Validation of statistical assessment method for the optimization of the inspection need for nuclear steam generators

    International Nuclear Information System (INIS)

    Wallin, K.; Voskamp, R.; Schmibauer, J.; Ostermeyer, H.; Nagel, G.

    2011-01-01

    The cost of steam generator inspections in nuclear power plants is high. A new quantitative assessment methodology for the accumulation of flaws due to stochastic causes such as fretting has been developed for cases where limited inspection data are available. Additionally, a new quantitative assessment methodology has been developed for the accumulation of environment-related flaws, caused e.g. by corrosion in steam generator tubes. The method, which combines deterministic information regarding flaw initiation and growth with stochastic elements connected to environmental aspects, requires only knowledge of the experimental flaw accumulation history. Combining both flaw types, the method provides a complete description of flaw accumulation, and it has several possible uses. It can be used to evaluate the total life expectancy of the steam generator, and simple statistically defined plugging criteria can be established based on flaw behaviour. In this way the inspection interval and inspection coverage can be optimized with respect to allowable flaws, and the method can recognize flaw-type subsets requiring more frequent inspection intervals. The method can also be used to develop statistically realistic safety factors accounting for uncertainties in inspection flaw sizing and detection. The statistical assessment method has been shown to be robust and insensitive to different assessments of plugged tubes. Because the procedure is re-calibrated after each inspection, it reacts effectively to possible changes in the steam generator environment. Validation of the assessment method is provided for real steam generators, both for stochastic damage and for environment-related flaws. (authors)

  10. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix, each with a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Organization for Standardization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  11. Validation of a pretreatment delivery quality assurance method for the CyberKnife Synchrony system

    Energy Technology Data Exchange (ETDEWEB)

    Mastella, E., E-mail: edoardo.mastella@cnao.it [Medical Physics Unit, CNAO Foundation—National Centre for Oncological Hadron Therapy, Pavia I-27100, Italy and Medical Physics Unit, IEO—European Institute of Oncology, Milan I-20141 (Italy); Vigorito, S.; Rondi, E.; Cattani, F. [Medical Physics Unit, IEO—European Institute of Oncology, Milan I-20141 (Italy); Piperno, G.; Ferrari, A.; Strata, E.; Rozza, D. [Department of Radiation Oncology, IEO—European Institute of Oncology, Milan I-20141 (Italy); Jereczek-Fossa, B. A. [Department of Radiation Oncology, IEO—European Institute of Oncology, Milan I-20141, Italy and Department of Oncology and Hematology Oncology, University of Milan, Milan I-20122 (Italy)

    2016-08-15

    Purpose: To evaluate the geometric and dosimetric accuracies of the CyberKnife Synchrony respiratory tracking system (RTS) and to validate a method for pretreatment patient-specific delivery quality assurance (DQA). Methods: An EasyCube phantom was mounted on the ExacTrac gating phantom, which can move along the superior–inferior (SI) axis of a patient to simulate a moving target. The authors compared dynamic and static measurements. For each case, a Gafchromic EBT3 film was positioned between two slabs of the EasyCube, while a PinPoint ionization chamber was placed in the appropriate space. There were three steps to their evaluation: (1) the field size, the penumbra, and the symmetry of six secondary collimators were measured along the two main orthogonal axes. Dynamic measurements with deliberately simulated errors were also taken. (2) The delivered dose distributions (from step 1) were compared with the planned ones, using the gamma analysis method. The local gamma passing rates were evaluated using three acceptance criteria: 3% local dose difference (LDD)/3 mm, 2%LDD/2 mm, and 3%LDD/1 mm. (3) The DQA plans for six clinical patients were irradiated in different dynamic conditions, to give a total of 19 cases. The measured and planned dose distributions were evaluated with the same gamma-index criteria used in step 2 and the measured chamber doses were compared with the planned mean doses in the sensitive volume of the chamber. Results: (1) A very slight enlargement of the field size and of the penumbra was observed in the SI direction (on average <1 mm), in line with the overall average CyberKnife system error for tracking treatments. (2) Comparison between the planned and the correctly delivered dose distributions confirmed the dosimetric accuracy of the RTS for simple plans. The multicriteria gamma analysis was able to detect the simulated errors, proving the robustness of their method of analysis. (3) All of the DQA clinical plans passed the tests, both in
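
The gamma analysis used for these DQA comparisons combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A toy one-dimensional version, assuming the local-dose-difference variant and invented profiles, looks like this:

```python
import numpy as np

def gamma_pass_rate(ref, meas, coords, dd=0.03, dta=1.0):
    """Toy 1D gamma analysis with a local-dose-difference criterion
    (e.g. 3% local dose difference / 1 mm distance-to-agreement).

    ref, meas: dose profiles sampled on the same grid; coords: positions (mm).
    Returns the fraction of reference points with gamma <= 1.
    """
    gammas = []
    for x, d_ref in zip(coords, ref):
        dose_term = ((meas - d_ref) / (dd * d_ref)) ** 2   # local normalization
        dist_term = ((coords - x) / dta) ** 2
        gammas.append(np.sqrt((dose_term + dist_term).min()))
    return float(np.mean(np.array(gammas) <= 1.0))
```

Clinical implementations work on 2D film or 3D dose grids and interpolate between points, but the pass-rate logic reported in the study is the same.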

  12. Determination of bacterial endotoxin (pyrogen) in radiopharmaceuticals by the gel clot method. Validation

    International Nuclear Information System (INIS)

    Fukumori, Neuza Taeko Okasaki

    2008-01-01

    Before the Limulus amebocyte lysate (LAL) test, the only available means of pirogenicity testing for parenteral drugs and medical devices was the United States Pharmacopoeia (USP) rabbit pyrogen test. Especially for radiopharmaceuticals, the LAL assay is the elective way to determine bacterial endotoxin. The aim of this work was to validate the gel clot method for some radiopharmaceuticals without measurable interference. The FDA's LALTest guideline defines interference as a condition that causes a significant difference between the endpoints of a positive water control and positive product control series using a standard endotoxin. Experiments were performed in accordance to the USP bacterial endotoxins test in the 131 I- m-iodobenzylguanidine; the radioisotopes Gallium-67 and Thallium-201; the lyophilized reagents DTPA, Phytate, GHA, HSA and Colloidal Tin. The Maximum Valid Dilution (MVD) was calculated for each product based upon the clinical dose of the material and a twofold serial dilution below the MVD was performed in duplicate to detect interferences. The labeled sensitivity of the used LAL reagent was 0.125 EU mL -1 (Endotoxin Units per milliliter). For validation, a dilution series was performed, a twofold dilution of control standard endotoxin (CSE) from 0.5 to 0.03 EU mL -1 , to confirm the labeled sensitivity of the LAL reagent being tested in sterile and non pyrogenic water, in quadruplicate. The same dilution series was performed with the CSE and the product in the 1:100 dilution factor, in three consecutive batches of each radiopharmaceutical. The products 131 I-m-iodobenzylguanidine, Gallium-67, Thallium-201, DTPA, HSA and Colloidal Tin were found compatible with the LAL test at a 1:100 dilution factor. Phytate and GHA showed some interference in the gel clot test. 
Other techniques to determine endotoxins, such as the chromogenic (color development) and turbidimetric (turbidity development) tests, were also assessed to obtain valuable quantitative and
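The MVD and dilution-series arithmetic described in this abstract can be sketched in a few lines. This is a minimal illustration of the USP gel-clot calculations, not code from the study; the endotoxin limit used below is a hypothetical value.

```python
import math

LAMBDA = 0.125  # labeled LAL reagent sensitivity, EU/mL (as in the abstract)

def max_valid_dilution(endotoxin_limit: float, lam: float = LAMBDA) -> float:
    """MVD = product endotoxin limit / reagent sensitivity (lambda)."""
    return endotoxin_limit / lam

def twofold_series(start: float, stop: float) -> list:
    """Twofold dilution series from `start` down to `stop` (EU/mL)."""
    series, c = [], start
    while c >= stop:
        series.append(c)
        c /= 2.0
    return series

def confirmed_sensitivity(endpoints: list) -> float:
    """Geometric mean of gel-clot endpoint concentrations; the labeled
    sensitivity is confirmed if this falls between 0.5*lambda and 2*lambda."""
    return math.exp(sum(math.log(e) for e in endpoints) / len(endpoints))

# CSE series used to confirm the labeled sensitivity: 0.5 down to ~0.03 EU/mL
cse_series = twofold_series(0.5, 0.03)
# Hypothetical quadruplicate endpoint concentrations
gm = confirmed_sensitivity([0.125, 0.125, 0.25, 0.125])
```

With a (hypothetical) endotoxin limit of 12.5 EU/mL, the MVD would be 1:100, matching the dilution factor validated in the study.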

  13. Reliability and Validity of 3 Methods of Assessing Orthopedic Resident Skill in Shoulder Surgery.

    Science.gov (United States)

    Bernard, Johnathan A; Dattilo, Jonathan R; Srikumaran, Uma; Zikria, Bashir A; Jain, Amit; LaPorte, Dawn M

Traditional measures for evaluating resident surgical technical skills (e.g., case logs) assess operative volume but not level of surgical proficiency. Our goal was to compare the reliability and validity of 3 tools for measuring surgical skill among orthopedic residents when performing 3 open surgical approaches to the shoulder. A total of 23 residents at different stages of their surgical training were tested for technical skill pertaining to 3 shoulder surgical approaches using the following measures: Objective Structured Assessment of Technical Skills (OSATS) checklists, the Global Rating Scale (GRS), and a final pass/fail assessment determined by 3 upper extremity surgeons. Adverse events were recorded. The Cronbach α coefficient was used to assess reliability of the OSATS checklists and GRS scores. Interrater reliability was calculated with intraclass correlation coefficients. Correlations among OSATS checklist scores, GRS scores, and pass/fail assessment were calculated with Spearman ρ. Validity of OSATS checklists was determined using analysis of variance with postgraduate year (PGY) as a between-subjects factor. Significance was set at p < 0.05 for all 3 shoulder approaches. Checklist scores showed superior interrater reliability compared with GRS and subjective pass/fail measurements. GRS scores were positively correlated across training years. The incidence of adverse events was significantly higher among PGY-1 and PGY-2 residents compared with more experienced residents. OSATS checklists are a valid and reliable assessment of technical skills across 3 surgical shoulder approaches. However, checklist scores do not measure quality of technique. Documenting adverse events is necessary to assess quality of technique and ultimate pass/fail status. Multiple methods of assessing surgical skill should be considered when evaluating orthopedic resident surgical performance. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
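The Cronbach α reliability statistic used for the OSATS checklists and GRS scores is a short computation. The sketch below is a generic implementation with made-up scores, not data from the study.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    `items` is a list of score lists, one list per checklist item,
    aligned across the same set of residents.
    """
    k = len(items)                     # number of items
    n = len(items[0])                  # number of residents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Two perfectly covarying items across three residents (hypothetical scores)
alpha = cronbach_alpha([[1, 2, 3], [2, 4, 6]])
```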

  14. Validation of a photography-based goniometry method for measuring joint range of motion.

    Science.gov (United States)

    Blonna, Davide; Zarkadas, Peter C; Fitzsimmons, James S; O'Driscoll, Shawn W

    2012-01-01

    A critical component of evaluating the outcomes after surgery to restore lost elbow motion is the range of motion (ROM) of the elbow. This study examined if digital photography-based goniometry is as accurate and reliable as clinical goniometry for measuring elbow ROM. Instrument validity and reliability for photography-based goniometry were evaluated for a consecutive series of 50 elbow contractures by 4 observers with different levels of elbow experience. Goniometric ROM measurements were taken with the elbows in full extension and full flexion directly in the clinic (once) and from digital photographs (twice in a blinded random manner). Instrument validity for photography-based goniometry was extremely high (intraclass correlation coefficient: extension = 0.98, flexion = 0.96). For extension and flexion measurements by the expert surgeon, systematic error was negligible (0° and 1°, respectively). Limits of agreement were 7° (95% confidence interval [CI], 5° to 9°) and -7° (95% CI, -5° to -9°) for extension and 8° (95% CI, 6° to 10°) and -7° (95% CI, -5° to -9°) for flexion. Interobserver reliability for photography-based goniometry was better than that for clinical goniometry. The least experienced observer's photographic goniometry measurements were closer to the reference measurements than the clinical goniometry measurements. Photography-based goniometry is accurate and reliable for measuring elbow ROM. The photography-based method relied less on observer expertise than clinical goniometry. This validates an objective measure of patient outcome without requiring doctor-patient contact at a tertiary care center, where most contracture surgeries are done. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
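The systematic error (bias) and 95% limits of agreement quoted in this abstract follow the standard Bland-Altman calculation, sketched below with hypothetical paired readings rather than the study's data.

```python
import math

def bland_altman(a, b):
    """Bias (mean of a - b) and 95% limits of agreement for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical photographic vs. clinical goniometry readings (degrees)
bias, lower, upper = bland_altman([95, 100, 105], [94, 99, 104])
```

A bias near 0° with narrow limits, as reported for the expert surgeon, indicates the photographic method agrees closely with direct clinical measurement.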

  15. Validated HPTLC methods for determination of some selected antihypertensive mixtures in their combined dosage forms

    Directory of Open Access Journals (Sweden)

    Rasha A. Shaalan

    2014-12-01

Full Text Available Simple and selective HPTLC methods were developed for the simultaneous determination of the antihypertensive drugs carvedilol and hydrochlorothiazide in their binary mixture (Mixture I), and of amlodipine besylate, valsartan, and hydrochlorothiazide in their combined ternary formulation (Mixture II). Effective chromatographic separation was achieved on Fluka TLC plates (20 × 20 cm aluminum cards, 0.2 mm thickness) through linear ascending development. For Mixture I, the mobile phase was chloroform–methanol in the ratio 8:2 (v/v). Detection was performed at 254 nm for both carvedilol and hydrochlorothiazide. For Mixture II, the mobile phase was chloroform–methanol–ammonia in the volume ratio 8:2:0.1. Detection was performed at 254 nm for valsartan and hydrochlorothiazide, and at 365 nm for amlodipine. Quantification was based on spectrodensitometric analysis. The analytical performance of the proposed HPTLC procedures was statistically validated with respect to linearity, range, precision, accuracy, specificity, robustness, and detection and quantification limits. The linearity ranges were 0.05–1.0 and 0.1–2.0 μg/spot for carvedilol and hydrochlorothiazide, respectively, in Mixture I, and 0.1–2.0, 0.1–2.0 and 0.2–4.0 μg/spot for amlodipine, hydrochlorothiazide and valsartan, respectively, in Mixture II, with correlation coefficients >0.9992. The validated HPTLC methods were applied to the analysis of the cited antihypertensive drugs in their combined pharmaceutical tablets. The proposed methods confirmed peak identity and purity.
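The linearity claim (correlation coefficients >0.9992 across each calibration range) rests on the Pearson correlation of a calibration series, which can be sketched as follows. The calibration points below are illustrative, not the paper's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for a calibration series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Hypothetical calibration: amount per spot (ug) vs. densitometric peak area
amounts = [0.1, 0.5, 1.0, 1.5, 2.0]
areas = [210, 1040, 2085, 3120, 4170]
r = pearson_r(amounts, areas)
```

A calibration would pass the linearity criterion here only if r exceeds the stated threshold over the whole working range.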

  16. Evaluation of bone formation in calcium phosphate scaffolds with μCT-method validation using SEM.

    Science.gov (United States)

    Lewin, S; Barba, A; Persson, C; Franch, J; Ginebra, M-P; Öhman-Mägi, C

    2017-10-05

There is a plethora of calcium phosphate (CaP) scaffolds used as synthetic substitutes for bone grafts. Scaffold performance is often evaluated from the quantity of bone formed within or in direct contact with the scaffold. Micro-computed tomography (μCT) allows three-dimensional evaluation of bone formation inside scaffolds. However, the almost identical x-ray attenuation of CaP and bone hampers the separation of these phases in μCT images. Commonly, segmentation of bone in μCT images is based on gray-scale intensity, with manually determined global thresholds. However, image analysis methods, and methods for manual thresholding in particular, lack standardization and may consequently suffer from subjectivity. The aim of the present study was to provide a methodological framework for addressing these issues. Bone formation in two types of CaP scaffold architectures (foamed and robocast), obtained from a larger animal study (a 12-week canine model), was evaluated by μCT. In addition, cross-sectional scanning electron microscopy (SEM) images were acquired as references to determine thresholds and to validate the result. μCT datasets were registered to the corresponding SEM reference. Global thresholds were then determined by quantitatively correlating the area fractions in the μCT image with the area fractions in the corresponding SEM image. For comparison, area fractions were also quantified using global thresholds determined manually by two different approaches. In the validation, the manually determined thresholds resulted in large average errors in area fraction (up to 17%), whereas for the evaluation using SEM references, the errors were estimated to be less than 3%. Furthermore, it was found that basing the thresholds on one single SEM reference gave lower errors than determining them manually. This study provides an objective, robust and less error prone method to determine global thresholds for the evaluation of bone formation in
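The threshold-selection idea, picking the global gray-level threshold whose resulting area fraction best matches the SEM reference, can be sketched as below. This is a simplified illustration of the principle, not the authors' pipeline; the pixel values are hypothetical.

```python
def area_fraction(pixels, t):
    """Fraction of pixels at or above gray level t (candidate 'bone' phase)."""
    return sum(1 for p in pixels if p >= t) / len(pixels)

def best_global_threshold(mu_ct_pixels, sem_fraction, candidates):
    """Choose the threshold whose segmented area fraction in the registered
    muCT slice is closest to the SEM reference area fraction."""
    return min(candidates,
               key=lambda t: abs(area_fraction(mu_ct_pixels, t) - sem_fraction))

# Hypothetical registered muCT slice (flattened gray levels) and SEM fraction
slice_pixels = [10, 20, 30, 40]
threshold = best_global_threshold(slice_pixels, 0.5, [15, 25, 35])
```

Anchoring the threshold to a registered SEM cross-section replaces the subjective manual choice that the study found could err by up to 17% in area fraction.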

  17. Validation of an HPLC method for the simultaneous determination of eletriptan and UK 120.413

    Directory of Open Access Journals (Sweden)

    LJILJANA ZIVANOVIC

    2006-11-01

Full Text Available A rapid and sensitive RP-HPLC method was developed for the routine control analysis of eletriptan hydrobromide and its organic impurity UK 120.413 in Relpax® tablets. The chromatography was performed at 20 °C using a C18 XTerra™ (5 μm, 150 × 4.6 mm) column at a flow rate of 1.0 ml/min. The drug and its impurity were detected at 225 nm. The mobile phase consisted of TEA (1 %)–methanol (67.2:32.8 v/v), the pH of which was adjusted to 6.8 with 85 % orthophosphoric acid. Quantification was accomplished by the internal standard method. The developed RP-HPLC method was validated by testing: accuracy, precision, repeatability, specificity, detection limit, quantification limit, linearity, robustness and sensitivity. High linearity of the analytical procedure was confirmed over the concentration ranges 0.05–1.00 mg/ml for eletriptan hydrobromide and 0.10–1.50 µg/ml for UK 120.413, with correlation coefficients greater than r = 0.995. The low values of the RSD reflected the good repeatability and precision of the method. Experimental design and a response surface method were used to test the robustness of the analytical procedure and to evaluate the effect of variation of the method parameters, namely the mobile phase composition, pH and temperature; they showed small deviations from the method settings. The good recovery and low RSD confirm the suitability of the proposed RP-HPLC method for the routine determination of eletriptan hydrobromide and its impurity UK 120.413 in Relpax® tablets.

  18. Validation of a cartridge method for the quality control determination of 99Tcm-HMPAO

    International Nuclear Information System (INIS)

    Pandos, G.; Penglis, S.; Tsopelas, C.; Royal Adelaide Hospital, Adelaide, SA

    1999-01-01

Full text: The manufacturer's method for assessing the radiochemical purity (RCP) of 99Tcm-HMPAO requires the use of three solvent types on two different stationary phases, and is time-consuming (∼15 min) in consideration of the short shelf-life (30 min). An impetus to develop a rapid quality control procedure for this product has led to the use of a single-strip Whatman 17 chromatography system with ethyl acetate as the developing solvent. This popular Whatman paper system was previously validated against the manufacturer's method. We have developed a new method to successfully determine the % RCP of 99Tcm-HMPAO, which employs a disposable, inexpensive and reusable Amprep C-18 cartridge with normal saline as a non-organic mobile phase. The Whatman paper system separates the primary lipophilic 99Tcm-HMPAO complex from 99TcmO2, 99TcmO4- and the secondary 99Tcm-HMPAO complex at the origin. By comparison, with the cartridge method the lipophilic portion was retained on the cartridge and the hydrophilic impurities were found in the saline eluent. The Whatman 17 paper system gave 95.1 ± 1.7% 99Tcm-HMPAO after 5 min and the cartridge method gave 95.5 ± 1.5% 99Tcm-HMPAO (n = 8) after 3 min. The % 99TcmO2 levels in 99Tcm-HMPAO were insignificant. When a failed kit was assessed for RCP at 2.5 h post-reconstitution, the Whatman paper system and the cartridge method correlated well, giving 63.1 ± 2.7% and 62.9 ± 2.1% 99Tcm-HMPAO (n = 3), respectively. Although the cartridge method may slightly overestimate the % RCP of 99Tcm-HMPAO, it was found to be simple, rapid and reliable for the quality control analysis of routine 99Tcm-HMPAO preparations.
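The %RCP figure in the cartridge method reduces to a ratio of the activity retained on the cartridge (lipophilic complex) to the total recovered activity. A minimal sketch, with hypothetical count values:

```python
def percent_rcp(lipophilic_counts: float, hydrophilic_counts: float) -> float:
    """Radiochemical purity: activity retained on the C-18 cartridge
    (lipophilic 99Tcm-HMPAO) as a percentage of total recovered activity."""
    total = lipophilic_counts + hydrophilic_counts
    return 100.0 * lipophilic_counts / total

# Hypothetical counts: cartridge-retained vs. saline eluate activity
rcp = percent_rcp(955.0, 45.0)
```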

  19. Validity of the Remote Food Photography Method Against Doubly Labeled Water Among Minority Preschoolers.

    Science.gov (United States)

    Nicklas, Theresa; Saab, Rabab; Islam, Noemi G; Wong, William; Butte, Nancy; Schulin, Rebecca; Liu, Yan; Apolzan, John W; Myers, Candice A; Martin, Corby K

    2017-09-01

The aim of this study was to determine the validity of energy intake (EI) estimations made using the remote food photography method (RFPM) compared to the doubly labeled water (DLW) method in minority preschool children in a free-living environment. Seven days of food intake and spot urine samples excluding first void collections for DLW analysis were obtained on thirty-nine 3- to 5-year-old Hispanic and African American children. Using an iPhone, caregivers captured before and after pictures of each child's intake, pictures were wirelessly transmitted to trained raters who estimated portion size using existing visual estimation procedures, and energy and macronutrients were calculated. Paired t tests, mean differences, and Bland-Altman limits of agreement were performed. The mean EI was 1,191 ± 256 kcal/d using the RFPM and 1,412 ± 220 kcal/d using the DLW method, resulting in a mean underestimate of 222 kcal/d (-15.6%; P < 0.0001) that was consistent regardless of intake. The RFPM underestimated EI by 28.5% in 34 children and overestimated EI by 15.6% in 5 children. The RFPM underestimated total EI when compared to the DLW method among preschoolers. Further refinement of the RFPM is needed for assessing the EI of young children. © 2017 The Obesity Society.
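The headline comparison, mean EI difference between the two methods expressed as a percentage of the DLW reference, is a simple paired calculation. The sketch below uses hypothetical per-child values, not the study's data.

```python
def ei_bias(rfpm, dlw):
    """Mean paired EI difference (RFPM - DLW, kcal/d) and that bias
    expressed as a percentage of the mean DLW energy intake."""
    n = len(rfpm)
    bias = sum(r - d for r, d in zip(rfpm, dlw)) / n
    pct = 100.0 * bias / (sum(dlw) / n)
    return bias, pct

# Hypothetical paired intakes for two children (kcal/d)
bias_kcal, bias_pct = ei_bias([1200, 1300], [1400, 1500])
```

A negative bias, as reported here (about -222 kcal/d, -15.6%), indicates systematic underestimation by the photography method relative to DLW.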

  20. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation.

    Science.gov (United States)

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT), have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT have become the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicate that no single test can be considered the gold standard for the diagnosis of H. pylori infection and that one should consider each method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2-90.8% and 83.3-86.9% and a specificity of 97.7-98.8% and 95.1-97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important, although some commercial kits propose universal cut-off values.
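The sensitivity and specificity figures quoted against the IHC gold standard come from the usual confusion-matrix definitions, sketched below with hypothetical counts.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP), each computed
    against the chosen gold standard (IHC in the studies cited above)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts for an indirect test validated against IHC
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=95, fp=5)
```

Validating an indirect test in a given population amounts to choosing the cut-off that gives acceptable values for both quantities, rather than relying on a kit's universal cut-off.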