WorldWideScience

Sample records for method validation parameters

  1. Validation parameters of instrumental method for determination of total bacterial count in milk

    Directory of Open Access Journals (Sweden)

    Nataša Mikulec

    2004-10-01

    The method of flow cytometry, as a rapid, instrumental, routine microbiological method, is used for determination of the total bacterial count in milk. The results of flow cytometry are expressed as individual bacterial cell counts. Problems in interpreting total bacterial count results can be avoided by transforming the results of the flow cytometry method onto the scale of the reference method (HRN ISO 6610:2001). The method of flow cytometry, like any analytical method, requires validation and verification according to the HRN EN ISO/IEC 17025:2000 standard. This paper describes the validation parameters accuracy, precision, specificity, range, robustness and measurement uncertainty for the flow cytometry method.

  2. Development and validity of methods for the estimation of temporal gait parameters from heel-attached inertial sensors in younger and older adults.

    Science.gov (United States)

    Misu, Shogo; Asai, Tsuyoshi; Ono, Rei; Sawa, Ryuichi; Tsutsumimoto, Kota; Ando, Hiroshi; Doi, Takehiko

    2017-09-01

    The heel is likely a suitable location for attaching inertial sensors to detect gait events. However, few studies have detected gait events and determined temporal gait parameters using sensors attached to the heels. We developed two methods to determine temporal gait parameters: detecting heel-contact from acceleration and toe-off from angular velocity data (acceleration-angular velocity method; A-V method), and detecting both heel-contact and toe-off from angular velocity data (angular velocity-angular velocity method; V-V method). The aim of this study was to examine the concurrent validity of the A-V and V-V methods against the standard method, and to compare their accuracy. Temporal gait parameters were measured in 10 younger and 10 older adults. The intra-class correlation coefficients of both methods against the standard method were excellent (0.80 to 1.00). The root mean square errors of stance and swing time in the A-V method were smaller than those of the V-V method in older adults, although there were no significant discrepancies in the other comparisons. Our study suggests that inertial sensors attached to the heels, using the A-V method in particular, provide a valid measurement of temporal gait parameters. Copyright © 2017 Elsevier B.V. All rights reserved.
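
    The following sketch illustrates the general idea behind heel-contact detection from a heel-mounted accelerometer, in the spirit of the A-V method described above. The signal model, sampling rate, and peak-detection thresholds are illustrative assumptions, not the authors' algorithm.

```python
# Sketch: detect heel-contact events as sharp acceleration peaks, then derive
# stride time as a temporal gait parameter. All signal parameters are assumed.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                             # sampling frequency in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Synthetic vertical heel acceleration: one sharp impact per ~1.1 s stride.
stride_time = 1.1
accel = np.zeros_like(t)
for t0 in np.arange(0.5, 10, stride_time):
    accel += 3.0 * np.exp(-((t - t0) ** 2) / (2 * 0.01 ** 2))
accel += 0.1 * np.random.default_rng(0).standard_normal(t.size)

# Heel-contact candidates: prominent peaks separated by at least 0.5 s.
peaks, _ = find_peaks(accel, height=1.0, distance=int(0.5 * fs))
heel_contacts = t[peaks]

stride_times = np.diff(heel_contacts)  # temporal gait parameter: stride time
print("detected heel contacts (s):", np.round(heel_contacts, 2))
print("mean stride time (s):", round(stride_times.mean(), 3))
```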

  3. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  4. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern: estimating whether an approximation over- or under-fits the original model; invalidating an approximation; and ranking candidate approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Pyrrolizidine alkaloids in the food chain: development, validation, and application of a new HPLC-ESI-MS/MS sum parameter method.

    Science.gov (United States)

    Cramer, Luise; Schiebel, Hans-Martin; Ernst, Ludger; Beuerle, Till

    2013-11-27

    Contamination of food and feed with pyrrolizidine alkaloids is currently discussed as a potential health risk. Here, we report the development of a new HPLC-ESI-MS/MS sum parameter method to quantitate the pyrrolizidine alkaloid content in complex food matrices. The procedure was validated for honey and culinary herbs. Isotopically labeled 7-O-9-O-dibutyroyl-[9,9-(2)H2]-retronecine was synthesized and utilized as an internal standard for validation and quantitation. The total pyrrolizidine alkaloid content of a sample is expressed as a single sum parameter: retronecine equivalents (RE). The limits of detection/quantification (LOD/LOQ) were 0.1/0.3 μg RE/kg for honey and, for culinary herbs, 1.0/3.0 μg RE/kg (dry weight, dw) and 0.1/0.3 μg RE/kg (fresh weight, fw), respectively. The new method was applied to analyze 21 herbal convenience products. Fifteen products (71%) were pyrrolizidine alkaloid positive, showing pyrrolizidine alkaloid concentrations ranging from 0.9 to 74 μg RE/kg fw.

  6. Automated extraction and validation of children's gait parameters with the Kinect.

    Science.gov (United States)

    Motiian, Saeid; Pergami, Paola; Guffey, Keegan; Mancinelli, Corrie A; Doretto, Gianfranco

    2015-12-02

    Gait analysis for therapy regimen prescription and monitoring requires patients to physically access clinics with specialized equipment. The timely availability of such infrastructure at the right frequency is especially important for small children. Besides being very costly, this is a challenge for many children living in rural areas. This is why this work develops a low-cost, portable, and automated approach for in-home gait analysis, based on the Microsoft Kinect. A robust and efficient method for extracting gait parameters is introduced, which copes with the high variability of noisy Kinect skeleton tracking data experienced across the population of young children. This is achieved by temporally segmenting the data with an approach based on coupling a probabilistic matching of stride template models, learned offline, with the estimation of their global and local temporal scaling. A preliminary study on healthy children between 2 and 4 years of age is performed to analyze the accuracy, precision, repeatability, and concurrent validity of the proposed method against the GAITRite when measuring several spatial and temporal children's gait parameters. The method has excellent accuracy and good precision in segmenting temporal sequences of body joint locations into stride and step cycles. Also, the spatial and temporal gait parameters, estimated automatically, exhibit good concurrent validity with those provided by the GAITRite, as well as very good repeatability. In particular, on a range of nine gait parameters, the relative and absolute agreements were found to be good and excellent, and the overall agreements were found to be good and moderate. This work enables and validates the automated use of the Kinect for children's gait analysis in healthy subjects. In particular, the approach is a step towards a low-cost, portable, parent-operated in-home tool for clinicians assisting young children.

  7. Computer code ENDSAM for random sampling and validation of the resonance parameters covariance matrices of some major nuclear data libraries

    International Nuclear Information System (INIS)

    Plevnik, Lucijan; Žerovnik, Gašper

    2016-01-01

    Highlights: • Methods for random sampling of correlated parameters. • Link to open-source code for sampling of resonance parameters in ENDF-6 format. • Validation of the code on realistic and artificial data. • Validation of covariances in three major contemporary nuclear data libraries. - Abstract: Methods for random sampling of correlated parameters are presented. The methods are implemented for sampling of resonance parameters in ENDF-6 format and a link to the open-source code ENDSAM is given. The code has been validated on realistic data. Additionally, consistency of covariances of resonance parameters of three major contemporary nuclear data libraries (JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0u2) has been checked.
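
    The core operation described above, random sampling of correlated parameters, is commonly implemented via a Cholesky factorization of the covariance matrix. A minimal sketch follows; the mean vector and covariance matrix are artificial stand-ins for ENDF-6 resonance data, not values from ENDSAM.

```python
# Sketch: draw correlated Gaussian samples and check that their sample
# covariance reproduces the input covariance, the kind of consistency check
# the paper applies to nuclear data libraries.
import numpy as np

mean = np.array([1.0, 0.5, 2.0])            # nominal parameter values (assumed)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])        # symmetric positive-definite covariance

L = np.linalg.cholesky(cov)                 # cov = L @ L.T
rng = np.random.default_rng(42)
z = rng.standard_normal((10000, mean.size))
samples = mean + z @ L.T                    # correlated Gaussian samples

# The sample covariance should match cov within statistical uncertainty.
print(np.round(np.cov(samples, rowvar=False), 3))
```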

  8. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as the international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright © Bentham Science Publishers.

  9. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    Science.gov (United States)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is that it translates into a level of confidence in the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used across extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house validation of a method for elemental composition determination by WDS are shown. SRM 482, a binary Cu-Au alloy of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed, including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.

  10. Validation of Simulation Models without Knowledge of Parameters Using Differential Algebra

    Directory of Open Access Journals (Sweden)

    Björn Haffke

    2015-01-01

    This study deals with the external validation of simulation models using methods from differential algebra. Without any system identification or iterative numerical methods, this approach provides evidence that the equations of a model can represent measured and simulated sets of data. This is very useful for checking whether a model is suitable in general. In addition, the application of this approach to verifying the similarity between the identifiable parameters of two models with different sets of input and output measurements is demonstrated. We present a discussion on how the method can be used to find parameter deviations between any two models. The advantage of this method is its applicability to nonlinear systems, as well as its algorithmic nature, which makes it easy to automate.

  11. Validation of the method for investigation of radiopharmaceuticals for in vitro use

    International Nuclear Information System (INIS)

    Vranjes, S.; Jovanovic, M.; Orlic, M.; Lazic, E. (e-mail address of corresponding author: sanjav@vin.bg.ac.yu)

    2005-01-01

    The aim of this study was to validate an analytical method for determination of the total radioactivity and radioactive concentration of 125I-triiodothyronine, a radiopharmaceutical for in vitro use. The analytical parameters selectivity, accuracy, linearity and range of this method were determined. The values obtained for all parameters are reasonable for analytical methods; therefore, this method can be used for further investigation. (author)

  12. The Validation of NAA Method Used as Test Method in Serpong NAA Laboratory

    International Nuclear Information System (INIS)

    Rina-Mulyaningsih, Th.

    2004-01-01

    NAA is a non-standard testing method. A testing laboratory shall validate the methods it uses to ensure and confirm that they are suitable for the application. The validation of NAA methods has been carried out with the parameters accuracy, precision, repeatability and selectivity. NIST 1573a Tomato Leaves, NIES 10C unpolished rice flour and standard elements were used in this testing program. The results of testing with NIST 1573a showed that the elements Na, Zn, Al and Mn meet the acceptance criteria for accuracy and precision, whereas Co is rejected. The results of testing with NIES 10C showed that the elements Na and Zn meet the acceptance criteria for accuracy and precision, but Mn is rejected. The selectivity test showed quantities between 0.1 and 2.5 μg, depending on the element. (author)

  13. Validity of gait parameters for hip flexor contracture in patients with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Lee Sang Hyeong

    2011-01-01

    Abstract. Background: Psoas contracture is known to cause abnormal hip motion in patients with cerebral palsy. The authors investigated the clinical relevance of hip kinematic and kinetic parameters and 3D-modeled psoas length in terms of discriminant validity, convergent validity, and responsiveness. Methods: Twenty-four patients with cerebral palsy (mean age 6.9 years) and 28 normal children (mean age 7.6 years) were included. Kinematic and kinetic data were obtained by three-dimensional gait analysis, and psoas lengths were determined using a musculoskeletal modeling technique. The validity of the hip parameters was evaluated. Results: In discriminant validity, maximum psoas length (effect size r = 0.740), maximum pelvic tilt (0.710), maximum hip flexion in late swing (0.728), maximum hip extension in stance (0.743), and hip flexor index (0.792) showed favorable discriminant ability between the normal controls and the patients. In convergent validity, maximum psoas length was not significantly correlated with maximum hip extension in stance in the control group, whereas it was correlated with maximum hip extension in stance in the patient group (r = -0.933). Conclusions: Maximum pelvic tilt, maximum psoas length, hip flexor index, and maximum hip extension in stance were found to be clinically relevant parameters in evaluating hip flexor contracture.

  14. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. NAA (neutron activation analysis) is a nuclear analytical technique with low detection limits, high specificity, high precision and accuracy for the large majority of naturally occurring elements, the ability of non-destructive and simultaneous multi-element determination, and the capacity to handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be done. A standard reference material, SRM NIST 1648 Urban Particulate Matter, was used to validate the NAA method. Accuracy and precision tests were used as validation parameters. The particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviations of the measured elemental concentrations ranged from 2 to 14.8% for most of the elements analyzed, whereas HorRat values were in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for the characterization of particulate matter and other similar matrix samples to support air quality monitoring. (author)
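
    The accuracy and precision figures quoted above (relative bias, RSD, HorRat) follow standard definitions; a minimal sketch with invented replicate data is shown below. The HorRat is the observed RSD divided by the RSD predicted by the Horwitz function; the concentrations and certified value are assumptions for illustration.

```python
# Sketch: relative bias against a certified value, RSD, and HorRat.
import numpy as np

def horwitz_prsd(mass_fraction: float) -> float:
    """Predicted reproducibility RSD (%) from the Horwitz function."""
    return 2.0 ** (1.0 - 0.5 * np.log10(mass_fraction))

measured = np.array([128.0, 131.5, 126.8, 130.2, 129.4])  # mg/kg, replicates (invented)
certified = 127.0                                          # mg/kg (invented)

relative_bias = 100.0 * (measured.mean() - certified) / certified
rsd = 100.0 * measured.std(ddof=1) / measured.mean()
mass_fraction = measured.mean() * 1e-6                     # mg/kg -> dimensionless
horrat = rsd / horwitz_prsd(mass_fraction)

print(f"relative bias: {relative_bias:.1f} %")
print(f"RSD: {rsd:.1f} %   HorRat: {horrat:.2f}")   # HorRat <= ~2 is typically acceptable
```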

  15. A New Method for Optimal Regularization Parameter Determination in the Inverse Problem of Load Identification

    Directory of Open Access Journals (Sweden)

    Wei Gao

    2016-01-01

    According to the regularization method in the inverse problem of load identification, a new method for determining the optimal regularization parameter is proposed. Firstly, a quotient function (QF) is defined by utilizing the regularization parameter as a variable, based on the least squares solution of the minimization problem. Secondly, the quotient function method (QFM) is proposed to select the optimal regularization parameter based on quadratic programming theory. In employing the QFM, the behavior of the QF values with respect to different regularization parameters is taken into consideration. Finally, numerical and experimental examples are utilized to validate the performance of the QFM, with the Generalized Cross-Validation (GCV) method and the L-curve method taken as comparison methods. The results indicate that the proposed QFM is adaptive to different measuring points, noise levels, and types of dynamic load.
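
    The paper's quotient function method is not reproduced here, but the problem setting can be illustrated with one of its comparison methods: Tikhonov regularization with the parameter chosen by generalized cross-validation (GCV). The ill-posed test problem below is an artificial smoothing kernel, not a load-identification model.

```python
# Sketch: SVD-based Tikhonov regularization with GCV parameter selection.
import numpy as np

rng = np.random.default_rng(1)
n = 80
x = np.linspace(0, 1, n)
G = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.005)   # ill-conditioned kernel
f_true = np.sin(2 * np.pi * x)                           # true "load"
b = G @ f_true + 1e-3 * rng.standard_normal(n)           # noisy measurements

U, s, Vt = np.linalg.svd(G)
beta = U.T @ b

def gcv(lam: float) -> float:
    filt = s**2 / (s**2 + lam**2)                # Tikhonov filter factors
    residual = np.sum(((1 - filt) * beta) ** 2)
    return residual / (n - np.sum(filt)) ** 2

lams = np.logspace(-8, 0, 200)
lam_opt = lams[np.argmin([gcv(l) for l in lams])]
f_est = Vt.T @ ((s / (s**2 + lam_opt**2)) * beta)        # regularized solution

print(f"GCV-optimal lambda: {lam_opt:.2e}")
print(f"relative error: {np.linalg.norm(f_est - f_true)/np.linalg.norm(f_true):.3f}")
```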

  16. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France)]; Guyon, G. [Electricite de France, Moret-sur-Loing (France)]

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  17. A practical iterative PID tuning method for mechanical systems using parameter chart

    Science.gov (United States)

    Kang, M.; Cheong, J.; Do, H. M.; Son, Y.; Niculescu, S.-I.

    2017-10-01

    In this paper, we propose a method of iterative proportional-integral-derivative (PID) parameter tuning for mechanical systems that may possess hidden mechanical resonances, using a parameter chart which visualises the closed-loop characteristics in a 2D parameter space. We employ the hypothetical assumption that the considered mechanical systems have an upper limit on the derivative feedback gain, which substantially reduces the feasible region in the parameter chart and thus greatly simplifies gain selection. Then, a two-directional parameter search is carried out within the feasible region in order to find the best set of parameters. Experimental results show the validity of the assumption used and of the proposed parameter tuning method.
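
    A minimal sketch of the parameter-chart idea follows: scan a 2D grid of gains for a simple servo plant and mark which pairs give a closed loop with acceptable overshoot. The plant G(s) = 1/(s(s+1)), the PD structure, and the 10% overshoot threshold are illustrative assumptions; the paper's resonance handling and search strategy are not modeled.

```python
# Sketch: build a feasibility map over (Kp, Kd) from simulated step responses.
import numpy as np
from scipy import signal

kp_grid = np.linspace(0.5, 20, 40)
kd_grid = np.linspace(0.0, 10, 40)
feasible = np.zeros((kd_grid.size, kp_grid.size), dtype=bool)

for i, kd in enumerate(kd_grid):
    for j, kp in enumerate(kp_grid):
        # Closed loop of C(s) = kp + kd*s with G(s) = 1/(s^2 + s):
        # T(s) = (kd*s + kp) / (s^2 + (1 + kd)*s + kp)
        sys = signal.TransferFunction([kd, kp], [1, 1 + kd, kp])
        if np.all(np.roots(sys.den).real < 0):       # stable closed loop
            t, y = signal.step(sys)
            overshoot = (y.max() - 1.0) * 100.0
            feasible[i, j] = overshoot < 10.0        # accept < 10 % overshoot

print(f"{np.count_nonzero(feasible)} of {feasible.size} gain pairs are feasible")
```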

  18. Generator Dynamic Model Validation and Parameter Calibration Using Phasor Measurements at the Point of Connection

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhenyu; Du, Pengwei; Kosterev, Dmitry; Yang, Steve

    2013-05-01

    Disturbance data recorded by phasor measurement units (PMU) offer opportunities to improve the integrity of dynamic models. However, manually tuning parameters through play-back events demands significant effort and engineering experience. In this paper, a calibration method using the extended Kalman filter (EKF) technique is proposed. The formulation of the EKF with parameter calibration is discussed. Case studies are presented to demonstrate its validity. The proposed calibration method is cost-effective and complementary to traditional equipment testing for improving dynamic model quality.
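
    The usual EKF formulation for parameter calibration appends the unknown parameter to the state vector and estimates both jointly. A minimal sketch on a scalar toy system follows; the system, noise levels, and input are invented stand-ins, not a generator model or the paper's formulation.

```python
# Sketch: joint state/parameter estimation with an EKF. The unknown model
# parameter "a" is part of the augmented state z = [x, a].
import numpy as np

rng = np.random.default_rng(7)
a_true, n_steps = 0.85, 300
q, r = 1e-4, 1e-2                      # process / measurement noise variances

# Simulate the "recorded disturbance": x_{k+1} = a*x_k + u_k + w_k, y_k = x_k + v_k
x, u = 1.0, np.sin(0.1 * np.arange(n_steps))
ys = []
for k in range(n_steps):
    x = a_true * x + u[k] + rng.normal(0, np.sqrt(q))
    ys.append(x + rng.normal(0, np.sqrt(r)))

z = np.array([0.0, 0.5])               # poor initial guess for [x, a]
P = np.diag([1.0, 1.0])
Q = np.diag([q, 1e-8])                 # tiny parameter drift keeps the filter alert
H = np.array([[1.0, 0.0]])

for k in range(n_steps):
    # Predict: f(z) = [a*x + u_k, a]; F is the Jacobian wrt [x, a] at the old z.
    F = np.array([[z[1], z[0]], [0.0, 1.0]])
    z = np.array([z[1] * z[0] + u[k], z[1]])
    P = F @ P @ F.T + Q
    # Update with measurement y_k.
    S = H @ P @ H.T + r
    K = (P @ H.T) / S
    z = z + (K * (ys[k] - z[0])).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated a = {z[1]:.3f} (true {a_true})")
```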

  19. Cleaning verification: A five parameter study of a Total Organic Carbon method development and validation for the cleaning assessment of residual detergents in manufacturing equipment.

    Science.gov (United States)

    Li, Xue; Ahmad, Imad A Haidar; Tam, James; Wang, Yan; Dao, Gina; Blasko, Andrei

    2018-02-05

    A Total Organic Carbon (TOC)-based analytical method to quantitate trace residues of clean-in-place (CIP) detergents CIP100® and CIP200® on the surfaces of pharmaceutical manufacturing equipment was developed and validated. Five factors affecting the development and validation of the method were identified: diluent composition, diluent volume, extraction method, location for TOC sample preparation, and oxidant flow rate. Key experimental parameters were optimized to minimize contamination and to improve the sensitivity, recovery, and reliability of the method. The optimized concentration of the phosphoric acid in the swabbing solution was 0.05 M, and the optimal volume of the sample solution was 30 mL. The swab extraction method was 1 min sonication. The use of a clean room, as compared to an isolated lab environment, was not required for method validation. The method was demonstrated to be linear with a correlation coefficient (R) of 0.9999. The average recoveries from stainless steel surfaces at multiple spike levels were >90%. The repeatability and intermediate precision results were ≤5% across the 2.2-6.6 ppm range (50-150% of the target maximum carryover, MACO, limit). The method was also shown to be sensitive, with a detection limit (DL) of 38 ppb and a quantitation limit (QL) of 114 ppb. The method validation demonstrated that the developed method is suitable for its intended use. The methodology developed in this study is generally applicable to the cleaning verification of any organic detergents used for the cleaning of pharmaceutical manufacturing equipment made of electropolished stainless steel. Copyright © 2017 Elsevier B.V. All rights reserved.
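
    DL and QL figures like those quoted above are conventionally derived from a linear calibration via the ICH formulas DL = 3.3·σ/slope and QL = 10·σ/slope, with σ the residual standard deviation. A minimal sketch with invented TOC calibration data:

```python
# Sketch: detection and quantitation limits from a linear calibration.
import numpy as np

conc = np.array([0.0, 1.0, 2.2, 4.4, 6.6, 8.0])           # ppm carbon (invented)
resp = np.array([0.8, 51.2, 110.5, 219.8, 330.1, 401.0])  # instrument response (invented)

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sigma = np.sqrt(np.sum(residuals**2) / (conc.size - 2))   # residual std, 2 fitted params

dl = 3.3 * sigma / slope
ql = 10.0 * sigma / slope
r = np.corrcoef(conc, resp)[0, 1]

print(f"slope={slope:.2f}, R={r:.5f}, DL={dl*1000:.0f} ppb, QL={ql*1000:.0f} ppb")
```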

  20. Validation of a chromatographic method to quantify mephenesin tablets of national production

    International Nuclear Information System (INIS)

    Suarez Perez, Yania; Izquierdo Castro, Adalberto; Milian Sanchez, Jana Daria

    2009-01-01

    The authors validated an analytical method based on high performance liquid chromatography (HPLC) for the quantification of mephenesin in recently reformulated 500 mg tablets. With regard to its application in quality control, validation included the following parameters: linearity, accuracy, precision, and selectivity. Results were satisfactory within the 50-150% range. For use in subsequent chemical stability studies, stability-indicating selectivity and sensitivity were assessed. The estimated detection and quantification limits were appropriate, and the method was selective with respect to the possible degradation products. (Author)

  1. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.

  2. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component of quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted during the validation of an SSNTD technique, with the goal of performing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in the face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. With the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each procedure presented its own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimates indicating the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters of the technique. Detector etching is a critical step in the SSNTD method, one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
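
    The 7-factor, 8-run design described above can be analysed by estimating each condition's effect as mean(high) minus mean(low) over the standard Youden-Steiner fractional factorial. A minimal sketch follows; the design matrix is the standard one, but the condition names beyond the three mentioned in the abstract and all radon results are invented.

```python
# Sketch: Youden robustness test, effect of each factor from 8 runs.
import numpy as np

# +1 = high level, -1 = low level; columns are the 7 conditions, rows the 8 runs.
design = np.array([
    [ 1,  1,  1,  1,  1,  1,  1],
    [ 1,  1, -1,  1, -1, -1, -1],
    [ 1, -1,  1, -1,  1, -1, -1],
    [ 1, -1, -1, -1, -1,  1,  1],
    [-1,  1,  1, -1, -1,  1, -1],
    [-1,  1, -1, -1,  1, -1,  1],
    [-1, -1,  1,  1, -1, -1,  1],
    [-1, -1, -1,  1,  1,  1, -1],
])
factors = ["etch concentration", "etch temperature", "immersion time",
           "exposure time", "detector batch", "reader setting", "humidity"]
results = np.array([102., 97., 110., 95., 108., 99., 104., 101.])  # Bq/m3 (invented)

effects = design.T @ results / 4.0        # mean(high) - mean(low) per factor
for name, eff in sorted(zip(factors, effects), key=lambda p: -abs(p[1])):
    print(f"{name:20s} effect = {eff:+.1f} Bq/m3")
```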

  3. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component of quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted during the validation of an SSNTD technique, with the goal of performing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in the face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. With the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each procedure presented its own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimates indicating the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters of the technique. Detector etching is a critical step in the SSNTD method, one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  4. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

    Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated as per ICH guidelines for parameters such as precision, linearity, accuracy, solution stability, robustness, and limits of detection and quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied for the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use the developed method for the analysis of menthol in pharmaceutical dosage formulations and raw materials.

  5. Software for validating parameters retrieved from satellite

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Sathe, P.V.; Pankajakshan, T.

    Parameters retrieved from the Multi-channel Scanning Microwave Radiometer (MSMR) onboard the Indian satellite Oceansat-1 during 1999-2001 were validated using this software as a case study. The program has several added advantages over the conventional method of validation that involves strenuous...

  6. Development and validation of a dissolution method using HPLC for diclofenac potassium in oral suspension

    Directory of Open Access Journals (Sweden)

    Alexandre Machado Rubim

    2014-04-01

    The present study describes the development and validation of an in vitro dissolution method for evaluating the release of diclofenac potassium in oral suspension. The dissolution test was developed and validated according to international guidelines. Parameters such as linearity, specificity, precision and accuracy were evaluated, as well as the influence of rotation speed and surfactant concentration in the medium. After selecting the best conditions, the method was validated using apparatus 2 (paddle), a 50-rpm rotation speed, and 900 mL of water with 0.3% sodium lauryl sulfate (SLS) as dissolution medium at 37.0 ± 0.5 °C. Samples were analyzed using an HPLC-UV (PDA) method. The results obtained were satisfactory for the parameters evaluated. The method developed may be useful in routine quality control for pharmaceutical industries that produce oral suspensions containing diclofenac potassium.

  7. 40 CFR 60.4410 - How do I establish a valid parameter range if I have chosen to continuously monitor parameters?

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, § 60.4410: How do I establish a valid parameter range if I have chosen to continuously monitor parameters? ... continuously monitored and recorded during each run of the initial performance test, to establish acceptable...

  8. Improved Battery Parameter Estimation Method Considering Operating Scenarios for HEV/EV Applications

    Directory of Open Access Journals (Sweden)

    Jufeng Yang

    2016-12-01

    This paper presents an improved battery parameter estimation method based on typical operating scenarios in hybrid electric vehicles and pure electric vehicles. Compared with conventional estimation methods, the proposed method takes both the constant-current charging and the dynamic driving scenarios into account, and two separate sets of model parameters are estimated through different parts of the pulse-rest test. The model parameters for the constant-charging scenario are estimated from the data in the pulse-charging periods, while the model parameters for the dynamic driving scenario are estimated from the data in the rest periods, and the length of the fitted dataset is determined by spectrum analysis of the load current. In addition, the unsaturated phenomenon caused by the long-term resistor-capacitor (RC) network is analyzed, and the initial voltage expressions of the RC networks in the fitting functions are improved to ensure higher model fidelity. Simulation and experimental results validated the feasibility of the developed estimation method.
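
    The kind of fitting this paper refines can be illustrated by estimating first-order RC parameters from the voltage relaxation in a rest period of a pulse-rest test. The single-RC model and the synthetic data below are assumptions; the paper's spectrum-based dataset-length selection is not reproduced.

```python
# Sketch: fit V(t) = V_inf - dV * exp(-t/tau) to a rest-period voltage curve.
import numpy as np
from scipy.optimize import curve_fit

def relaxation(t, v_inf, dv, tau):
    """Terminal voltage during rest: exponential recovery of RC polarization."""
    return v_inf - dv * np.exp(-t / tau)

rng = np.random.default_rng(3)
t = np.linspace(0, 600, 300)                       # 10-minute rest (s)
v_meas = relaxation(t, 3.70, 0.08, 90.0) + 1e-3 * rng.standard_normal(t.size)

popt, pcov = curve_fit(relaxation, t, v_meas, p0=[3.6, 0.05, 50.0])
v_inf, dv, tau = popt
i_pulse = 2.0                                      # preceding pulse current in A (assumed)
r1 = dv / i_pulse                                  # polarization resistance; tau = R1*C1
print(f"V_inf={v_inf:.3f} V, R1={r1*1000:.1f} mOhm, C1={tau/r1:.0f} F, tau={tau:.0f} s")
```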

  9. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
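
    A POD curve of the kind described above can be sketched by fitting detection proportions against concentration with a logistic model. The inoculation levels, trial counts, and the logistic parameterization below are illustrative assumptions, not the AOAC POD model's exact form.

```python
# Sketch: fit a logistic POD curve to qualitative detection data.
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])   # CFU/g (invented)
n_trials = np.array([12, 12, 12, 12, 12, 12])
n_detect = np.array([0, 3, 6, 10, 12, 12])
pod_obs = n_detect / n_trials

def pod_model(c, c50, slope):
    """Logistic POD curve; c50 is the concentration with 50 % detection."""
    return 1.0 / (1.0 + np.exp(-slope * (c - c50)))

popt, _ = curve_fit(pod_model, conc, pod_obs, p0=[1.0, 1.0])
print(f"c50 = {popt[0]:.2f} CFU/g, slope = {popt[1]:.2f}")
print("POD at 2 CFU/g:", round(pod_model(2.0, *popt), 2))
```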

  10. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
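
    The paper's observation can be reproduced in spirit with a short experiment: with weak, sparse signals, the number of variables selected by 10-fold cross-validated Lasso varies noticeably from run to run purely through the random fold splits. Data dimensions and effect sizes below are invented.

```python
# Sketch: variability of the selected-variable count under repeated 10-fold CV.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 0.3                                   # five weak true signals
y = X @ beta + rng.standard_normal(n)

counts = []
for seed in range(20):                           # repeat CV with different fold splits
    folds = KFold(n_splits=10, shuffle=True, random_state=seed)
    model = LassoCV(cv=folds).fit(X, y)
    counts.append(int(np.sum(model.coef_ != 0)))

print("selected-variable counts over 20 CV runs:", counts)
print(f"mean = {np.mean(counts):.1f}, sd = {np.std(counts):.1f}")
```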

  11. Identification of strategy parameters for particle swarm optimizer through Taguchi method

    Institute of Scientific and Technical Information of China (English)

    KHOSLA Arun; KUMAR Shakti; AGGARWAL K.K.

    2006-01-01

    Particle swarm optimization (PSO), like other evolutionary algorithms, is a population-based stochastic algorithm inspired by the metaphor of social interaction in birds, insects, wasps, etc. It has been used for finding promising solutions in complex search spaces through the interaction of particles in a swarm. It is a well-recognized fact that the performance of evolutionary algorithms depends to a great extent on the choice of appropriate strategy/operating parameters such as population size, crossover rate, mutation rate, and crossover operator. Generally, these parameters are selected through a hit-and-trial process, which is very unsystematic and requires rigorous experimentation. This paper proposes a systematic, Taguchi method-based reasoning scheme for rapidly identifying the strategy parameters of the PSO algorithm. The Taguchi method is a robust design approach that uses fractional factorial designs to study a large number of parameters with a small number of experiments. Computer simulations have been performed on two benchmark functions, the Rosenbrock and Griewank functions, to validate the approach.

  12. Microscale validation of 4-aminoantipyrine test method for quantifying phenolic compounds in microbial culture

    International Nuclear Information System (INIS)

    Justiz Mendoza, Ibrahin; Aguilera Rodriguez, Isabel; Perez Portuondo, Irasema

    2014-01-01

    Validation of test methods at microscale is currently of great importance due to the economic and environmental advantages involved, and it constitutes a prerequisite for the performance of services and for assuring the quality of the results provided to customers. This paper addresses the microscale validation of the 4-aminoantipyrine spectrophotometric method for the quantification of phenolic compounds in culture medium. The parameters linearity, precision, regression, accuracy, detection limit, quantification limit and robustness were evaluated, in addition to a comparison test with a non-standardized method for determining polyphenols (Folin-Ciocalteu). The results showed that both methods are feasible for determining phenols.

  13. Sensitivity and uncertainty analyses applied to criticality safety validation, methods development. Volume 1

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Childs, R.L.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the available S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently used by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The S/U methods that are presented in this volume are designed to provide a formal means of establishing the range (or area) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the D parameters, which represent the differences by group of sensitivity profiles, and the ck parameters, which are the correlation coefficients for the calculational uncertainties between systems; each set of parameters gives information relative to the similarity between pairs of selected systems, e.g., a critical experiment and a specific real-world system (the application)

  14. Accurate Lithium-ion battery parameter estimation with continuous-time system identification methods

    International Nuclear Information System (INIS)

    Xia, Bing; Zhao, Xin; Callafon, Raymond de; Garnier, Hugues; Nguyen, Truong; Mi, Chris

    2016-01-01

    Highlights: • Continuous-time system identification is applied in Lithium-ion battery modeling. • Continuous-time and discrete-time identification methods are compared in detail. • The instrumental variable method is employed to further improve the estimation. • Simulations and experiments validate the advantages of continuous-time methods. - Abstract: The modeling of Lithium-ion batteries usually utilizes discrete-time system identification methods to estimate parameters of discrete models. However, in real applications, there is a fundamental limitation of the discrete-time methods in dealing with sensitivity when the system is stiff and the storage resolutions are limited. To overcome this problem, this paper adopts direct continuous-time system identification methods to estimate the parameters of equivalent circuit models for Lithium-ion batteries. Compared with discrete-time system identification methods, the continuous-time system identification methods provide more accurate estimates of both fast and slow dynamics in battery systems and are less sensitive to disturbances. A case study of a 2nd-order equivalent circuit model shows that the continuous-time estimates are more robust to high sampling rates, measurement noise and rounding errors. In addition, the estimation by the conventional continuous-time least squares method is further improved in the case of noisy output measurement by introducing the instrumental variable method. Simulation and experimental results validate the analysis and demonstrate the advantages of the continuous-time system identification methods in battery applications.

  15. Correlation and agreement of a digital and conventional method to measure arch parameters.

    Science.gov (United States)

    Nawi, Nes; Mohamed, Alizae Marny; Marizan Nor, Murshida; Ashar, Nor Atika

    2018-01-01

    The aim of the present study was to determine the overall reliability and validity of arch parameters measured digitally compared to conventional measurement. A sample of 111 plaster study models of Down syndrome (DS) patients was digitized using a blue light three-dimensional (3D) scanner. Digital and manual measurements of defined parameters were performed using Geomagic analysis software (Geomagic Studio 2014 software, 3D Systems, Rock Hill, SC, USA) on digital models and with a digital calliper (Tuten, Germany) on plaster study models. Both measurements were repeated twice to validate the intraexaminer reliability based on intraclass correlation coefficients (ICCs), using the independent t test and Pearson's correlation, respectively. The Bland-Altman method of analysis was used to evaluate the agreement of the measurements between the digital and plaster models. No statistically significant differences (p > 0.05) were found between the manual and digital methods when measuring the arch width, arch length, and space analysis. In addition, all parameters showed a significant correlation coefficient (r ≥ 0.972) between digital and manual measurements. Furthermore, a positive agreement between digital and manual measurements of the arch width (90-96%) and of the arch length and space analysis (95-99%) was also demonstrated using the Bland-Altman method. These results demonstrate that 3D blue light scanning and measurement software are able to precisely produce a 3D digital model and measure arch width, arch length, and space analysis. The 3D digital model is valid for use in various clinical applications.
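
    The Bland-Altman analysis used in this study reduces to computing the bias and 95% limits of agreement of the paired differences. A minimal sketch follows; the arch-width values are invented.

```python
# Sketch: Bland-Altman bias and limits of agreement for paired measurements.
import numpy as np

digital = np.array([34.1, 35.8, 33.2, 36.5, 34.9, 35.1, 33.8, 36.0])  # mm (invented)
manual  = np.array([34.3, 35.6, 33.5, 36.4, 35.2, 34.9, 34.0, 36.3])  # mm (invented)

diff = digital - manual
mean_pair = (digital + manual) / 2.0            # x-axis of a Bland-Altman plot

bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:+.2f} mm, 95 % limits of agreement: "
      f"[{loa_low:+.2f}, {loa_high:+.2f}] mm")
```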

  16. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

    Introduction. This article is the result of the authors' research in the field of developing approaches to the validation of quantitative determination methods for the purposes of forensic and toxicological analysis, and is devoted to the problem of forming acceptability criteria for the validation parameter 'linearity/calibration model'. Aim of the research. The purpose of this paper is to analyse the present approaches to estimating the acceptability of the calibration model chosen for method description according to the requirements of international guidances, and to form our own approaches to estimating the acceptability of the linear dependence when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. A UV-spectrophotometric method for the quantitative determination of doxylamine in blood. Results. The approaches to estimating the acceptability of calibration models for the validation of bioanalytical methods stated in international papers, namely 'Guidance for Industry: Bioanalytical Method Validation' (U.S. FDA, 2001), 'Standard Practices for Method Validation in Forensic Toxicology' (SWGTOX, 2012), 'Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens' (UNODC, 2009) and 'Guideline on validation of bioanalytical methods' (EMA, 2011), have been analysed. It has been suggested that the formation of acceptability criteria for the obtained linear dependences, when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis, be guided by domestic developments in the field of validation of analysis methods for medicines and, particularly, by the approaches to validating methods in the variant of the calibration curve method. The choice of the method of calibration curve is

  17. Data Based Parameter Estimation Method for Circular-scanning SAR Imaging

    Directory of Open Access Journals (Sweden)

    Chen Gong-bo

    2013-06-01

    Circular-scanning Synthetic Aperture Radar (SAR) is a novel working mode whose image quality is closely related to the accuracy of the imaging parameters, especially considering the inaccuracy of the real speed of the motion. According to the characteristics of the circular-scanning mode, a new data-based method for estimating the velocity of the radar platform and the scanning angle of the radar antenna is proposed in this paper. By referring to the basic concepts of the Doppler navigation technique, the mathematical model and formulations for the parameter estimation are first improved. The optimal parameter approximation based on the least-squares criterion is then realized by solving the equations derived from the data processing. The simulation results verify the validity of the proposed scheme.

  18. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    Science.gov (United States)

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for the determination of methylmercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with the ISO 17025 (ISO/IEC, 2005) and Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. An estimation of the uncertainty contribution of each parameter and a demonstration of the traceability of measurement results are provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was achieved by participation in the IAEA-461 worldwide inter-laboratory comparison exercise. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Determination of vitamin C in foods: current state of method validation.

    Science.gov (United States)

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrices and the potential degradation of this vitamin during analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and drawbacks of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.

  20. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation, by examination and the provision of objective evidence, that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required, and their use must be agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  1. Validation of Essential Acoustic Parameters for Highly Urgent In-Vehicle Collision Warnings.

    Science.gov (United States)

    Lewis, Bridget A; Eisert, Jesse L; Baldwin, Carryl L

    2018-03-01

    Objective: The aim of this study was to validate the importance of key acoustic criteria for use in in-vehicle forward collision warning (FCW) systems. Background: Despite recent advances in vehicle safety, automobile crashes remain one of the leading causes of death. As automation allows for more control of noncritical functions by the vehicle, the potential for disengagement and distraction from the driving task also increases. It is, therefore, as important as ever that in-vehicle safety-critical interfaces are intuitive and unambiguous, promoting effective collision avoidance responses upon first exposure, even under divided-attention conditions. Method: The current study used a driving simulator to assess the effectiveness of two warnings, one that met all essential acoustic parameters and one that met only some, against a no-warning control in the context of a lead-vehicle-following task in conjunction with a cognitive distractor task and a collision event. Results: Participants receiving an FCW comprising five essential acoustic components had improved collision avoidance responses relative to a no-warning condition and an FCW missing essential elements on their first exposure. Responses to a consistently good warning (GMU Prime) improved with subsequent exposures, whereas continued exposure to the less optimal FCW (GMU Sub-Prime) resulted in poorer performance, even relative to receiving no warning at all. Conclusions: This study provides support for previous warning design studies and for the validity of five key acoustic parameters essential to the design of effective in-vehicle FCWs. Application: Results from this study have implications for the design of auditory FCWs and in-vehicle display design.

  2. A New Method for Determining Optimal Regularization Parameter in Near-Field Acoustic Holography

    Directory of Open Access Journals (Sweden)

    Yue Xiao

    2018-01-01

    The Tikhonov regularization method is effective in stabilizing the reconstruction process of near-field acoustic holography (NAH) based on the equivalent source method (ESM), and the selection of the optimal regularization parameter is a key problem that determines the regularization effect. In this work, a new method for determining the optimal regularization parameter is proposed. The transfer matrix relating the source strengths of the equivalent sources to the measured pressures on the hologram surface is augmented by adding a fictitious point source with zero strength. The minimization of the norm of this fictitious point source strength serves as the criterion for choosing the optimal regularization parameter, since the reconstructed value should tend to zero. The original inverse problem of calculating the source strengths is converted into a univariate optimization problem, which is solved by a one-dimensional search technique. Two numerical simulations, with a point-driven simply supported plate and a pulsating sphere, are investigated to validate the performance of the proposed method by comparison with the L-curve method. The results demonstrate that the proposed method can determine the regularization parameter correctly and effectively for reconstruction in NAH.

  3. State of charge estimation of lithium-ion batteries based on an improved parameter identification method

    International Nuclear Information System (INIS)

    Xia, Bizhong; Chen, Chaoren; Tian, Yong; Wang, Mingwang; Sun, Wei; Xu, Zhihui

    2015-01-01

    The SOC (state of charge) is the most important index of battery management systems. However, it cannot be measured directly with sensors and must be estimated with mathematical techniques. An accurate battery model is crucial for exact SOC estimation. In order to improve model accuracy, this paper presents an improved parameter identification method. Firstly, the concept of polarization depth is proposed based on an analysis of the polarization characteristics of lithium-ion batteries. Then, a nonlinear least squares technique is applied to determine the model parameters from data collected in pulsed discharge experiments. The results show that the proposed method reduces the model error compared with the conventional approach. Furthermore, a nonlinear observer presented in previous work is utilized to verify the validity of the proposed parameter identification method in SOC estimation. Finally, experiments with different levels of discharge current are carried out to investigate the influence of polarization depth on SOC estimation. Experimental results show that the proposed method improves the SOC estimation accuracy compared with the conventional approach, especially under conditions of large discharge current. Highlights: • The polarization characteristics of lithium-ion batteries are analyzed. • The concept of polarization depth is proposed to improve model accuracy. • A nonlinear least squares technique is applied to determine the model parameters. • A nonlinear observer is used as the SOC estimation algorithm. • The validity of the proposed method is verified by experimental results.
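
    The abstract names the technique (nonlinear least squares on pulsed-discharge data) but not the model equations. The sketch below assumes a common first-order Thevenin equivalent circuit for a single discharge pulse; the circuit values, current, and open-circuit voltage are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical first-order Thevenin model for one constant-current discharge pulse:
# v(t) = OCV - I*R0 - I*Rp*(1 - exp(-t/(Rp*Cp)))
def model(theta, t, I, ocv):
    R0, Rp, Cp = theta
    return ocv - I * R0 - I * Rp * (1.0 - np.exp(-t / (Rp * Cp)))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 200)           # s
I, ocv = 2.0, 3.7                          # A, V (assumed constants)
theta_true = (0.05, 0.03, 800.0)           # ohm, ohm, F (made up)
v_meas = model(theta_true, t, I, ocv) + 1e-3 * rng.normal(size=t.size)

# Nonlinear least squares fit of the pulse response.
fit = least_squares(lambda th: model(th, t, I, ocv) - v_meas,
                    x0=(0.01, 0.01, 500.0),
                    bounds=([1e-4, 1e-4, 1.0], [1.0, 1.0, 1e4]))
print("identified R0, Rp, Cp:", fit.x)
```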

  4. Support Vector Data Description Model to Map Specific Land Cover with Optimal Parameters Determined from a Window-Based Validation Set

    Directory of Open Access Journals (Sweden)

    Jinshui Zhang

    2017-04-01

    This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach, where the validation set includes target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint imposed by outlier pixels located adjacent to the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covered only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size, and the overall accuracies were higher than 88% at all window-size scales. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, tradeoff coefficient (C) and kernel width (s), in mapping homogeneous specific land cover.
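
    SVDD is not available in scikit-learn, but with an RBF kernel it is closely related to the one-class SVM, so the parameter-selection loop can be illustrated with OneClassSVM as a stand-in: train on target pixels only, then score candidate (nu, gamma) pairs on a labelled validation set of target and neighbouring outlier pixels. All data below are synthetic.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
# Hypothetical spectra: training pixels of the target class only.
X_train = rng.normal(0.0, 1.0, size=(200, 6))
# Window-based validation set: target pixels (+1) plus neighbouring outliers (-1).
X_val = np.vstack([rng.normal(0.0, 1.0, size=(50, 6)),
                   rng.normal(2.5, 1.0, size=(50, 6))])
y_val = np.hstack([np.ones(50), -np.ones(50)])

best = (None, -1.0)
for nu in (0.01, 0.05, 0.1, 0.2):          # plays the role of the tradeoff C
    for gamma in (0.01, 0.1, 1.0, 10.0):   # kernel width
        clf = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
        acc = np.mean(clf.predict(X_val) == y_val)
        if acc > best[1]:
            best = ((nu, gamma), acc)
print("best (nu, gamma):", best[0], "validation accuracy:", best[1])
```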

  5. Stepwise Procedure for Development and Validation of a Multipesticide Method

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    The stepwise procedure for the development and validation of so-called multi-pesticide methods is described. Principles, preliminary actions, criteria for the selection of chromatographic separation and detection, and performance verification of multi-pesticide methods are outlined. The long-term repeatability and reproducibility, as well as the necessity of documenting laboratory work, are also highlighted. Appendix I hereof describes in detail the calculation of calibration parameters, whereas Appendix II focuses on the calculation of the significance of differences between concentrations obtained on two different separation columns. (author)

  6. Session-RPE Method for Training Load Monitoring: Validity, Ecological Usefulness, and Influencing Factors

    Directory of Open Access Journals (Sweden)

    Monoem Haddad

    2017-11-01

    Purpose: The aim of this review is to (1) retrieve all data validating the session-rating of perceived exertion (session-RPE) method using various criteria, (2) highlight the rationale of this method and its ecological usefulness, and (3) describe factors that can alter RPE and that users of this method should take into consideration. Method: Search engines such as the SPORTDiscus, PubMed, and Google Scholar databases were consulted for English-language studies on the validity and usefulness of the session-RPE method published between 2001 and 2016. Studies were considered for further analysis when they used the session-RPE method proposed by Foster et al. in 2001. Participants were athletes of any gender, age, or level of competition. Studies in languages other than English were excluded from the analysis of the validity and reliability of the session-RPE method. Other studies were examined to explain the rationale of the session-RPE method and the origin of RPE. Results: A total of 950 studies cited the Foster et al. study that proposed the session-RPE method; 36 studies have examined the validity and reliability of the proposed method using the modified CR-10 scale. Conclusion: These studies confirmed the validity, good reliability, and internal consistency of the session-RPE method in several sports and physical activities, with men and women of different age categories (children, adolescents, and adults) and various expertise levels. This method could be used as a stand-alone method for training load (TL) monitoring purposes, though some recommend combining it with other physiological parameters such as heart rate.
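
    Foster's session-RPE method itself is simple arithmetic: the session training load is the CR-10 rating multiplied by the session duration in minutes, and weekly monotony and strain are commonly derived from the daily loads. A minimal sketch under that convention (the numbers are invented):

```python
import numpy as np

# One week of training: session duration (min) and CR-10 session-RPE scores.
duration_min = np.array([60, 45, 90, 30, 75])
session_rpe  = np.array([5,  4,  7,  3,  6])

load = session_rpe * duration_min           # session training load (AU)
weekly_load = load.sum()
# Common definitions; strictly, monotony is computed over all 7 days,
# with rest days entering as zero loads.
monotony = load.mean() / load.std(ddof=1)
strain = weekly_load * monotony

print(f"weekly load = {weekly_load} AU, monotony = {monotony:.2f}, strain = {strain:.0f}")
```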

  7. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Oxcarbazepine (OXC) is an important anticonvulsant and mood-stabilizing drug. A pharmacopoeial monograph for OXC is not yet available, and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. The various parameters, such as linearity, precision, accuracy and specificity, were studied according to International Conference on Harmonization guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour.

  8. Slope Stability Assessment Using Trigger Parameters and SINMAP Methods on Tamblingan-Buyan Ancient Mountain Area in Buleleng Regency, Bali

    Directory of Open Access Journals (Sweden)

    I Nengah Sinarta

    2017-10-01

    The mapping of soil movement was examined by comparing an extension of the deterministic Soil Stability Index Mapping (SINMAP) method and an overlay method with trigger parameters of soil movement. The SINMAP model used soil parameters in the form of the cohesion value (c), internal friction angle (φ), and hydraulic conductivity (ks) for the prediction of soil movement based on the factor of safety (FS), while the indirect method used a literature review and field observations. The weightings of the soil movement trigger parameters in the assessment were based on natural physical aspects: (1) slope inclination = 30%; (2) rock weathering = 15%; (3) geological structure = 20%; (4) rainfall = 15%; (5) groundwater potential = 7%; (6) seismicity = 3%; and (7) vegetation = 10%. The research area was located in the Buleleng district, in particular in the ancient mountain area of Buyan-Tamblingan, in the Sukasada sub-district. The hazard mapping used a high and very high hazard scale. The SINMAP model gave a validation accuracy of 14.29%, while the overlay method with seven trigger parameters produced an accuracy of 71.43%. Based on the analysis of the very high and high hazard classes and the validation against landslide occurrence points, the deterministic method using soil parameters and water absorption gave a much lower accuracy than the overlay method based on a study of soil movement trigger parameters.

  9. Dynamic Friction Parameter Identification Method with LuGre Model for Direct-Drive Rotary Torque Motor

    Directory of Open Access Journals (Sweden)

    Xingjian Wang

    2016-01-01

    Attainment of high-performance motion/velocity control objectives for the Direct-Drive Rotary (DDR) torque motor should fully consider practical nonlinearities in controller design, such as dynamic friction. The LuGre model has been widely utilized to describe nonlinear friction behavior; however, parameter identification for the LuGre model remains a challenge. A new dynamic friction parameter identification method for the LuGre model is proposed in this study. Static parameters are identified through a series of constant-velocity experiments, while dynamic parameters are obtained through a presliding process. A novel evolutionary algorithm (NEA) is utilized to increase identification accuracy. Experimental results gathered from the identification experiments conducted in the study for a practical DDR torque motor control system validate the effectiveness of the proposed method.
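
    For the static stage the abstract describes, the steady-state LuGre friction at constant velocity reduces to a Stribeck curve, F(v) = Fc + (Fs - Fc)·exp(-(v/vs)^2) + sigma2·v, so the four static parameters can be fitted to the constant-velocity friction map by nonlinear least squares. A sketch with synthetic measurements (all parameter values invented):

```python
import numpy as np
from scipy.optimize import curve_fit

# Steady-state LuGre friction at constant velocity v (one sliding direction):
# F(v) = Fc + (Fs - Fc) * exp(-(v / vs)**2) + sigma2 * v
def f_ss(v, Fc, Fs, vs, sigma2):
    return Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2) + sigma2 * v

rng = np.random.default_rng(3)
v = np.linspace(0.01, 1.0, 40)                 # rad/s, constant-velocity runs
true = (0.30, 0.50, 0.05, 0.40)                # Fc, Fs, vs, sigma2 (made up)
F_meas = f_ss(v, *true) + 0.005 * rng.normal(size=v.size)

popt, _ = curve_fit(f_ss, v, F_meas, p0=(0.1, 0.2, 0.1, 0.1))
print("identified Fc, Fs, vs, sigma2:", popt)
```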

  10. Voltage stability, bifurcation parameters and continuation methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarado, F L [Wisconsin Univ., Madison, WI (United States)

    1994-12-31

    This paper considers the importance of the choice of bifurcation parameter in the determination of the voltage stability limit and the maximum loadability of a system. When the bifurcation parameter is power demand, the two limits are equivalent. However, when other types of load models and bifurcation parameters are considered, the two concepts differ. The continuation method is considered as a method for the determination of voltage stability margins. Three variants of the continuation method are described: (1) the continuation parameter is the bifurcation parameter; (2) the continuation parameter is initially the bifurcation parameter, but is free to change; and (3) the continuation parameter is a new 'arc length' parameter. Implementations of voltage stability software using continuation methods are described. (author) 23 refs., 9 figs.
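
    The third variant can be illustrated on a toy example: for a lossless two-bus system with Q = 0, the power-flow residual g(V, P) = (PX)^2 + V^4 - (EV)^2 = 0 traces the familiar P-V "nose" curve with a fold at P = E^2/(2X). A pseudo-arclength continuation (predictor along the tangent, Newton corrector with an arc-length constraint) passes through the fold where a naive sweep in P would fail. This two-bus model is my own example, not the paper's test system.

```python
import numpy as np

E, X = 1.0, 0.5   # two-bus example: fold ("nose") at P = E**2 / (2*X) = 1.0

def g(V, P):                 # power-flow residual with Q = 0
    return (P * X) ** 2 + V ** 4 - (E * V) ** 2

def grad(V, P):              # [dg/dV, dg/dP]
    return np.array([4 * V ** 3 - 2 * E ** 2 * V, 2 * P * X ** 2])

u = np.array([E, 0.0])       # start on the upper branch at no load
tangent = np.array([0.0, 1.0])
ds, path = 0.02, [u.copy()]

for _ in range(120):
    # Predictor step along the current tangent.
    u_new = u + ds * tangent
    # Corrector: Newton on {g = 0, tangent . (u_new - u) = ds}.
    for _ in range(20):
        F = np.array([g(*u_new), tangent @ (u_new - u - ds * tangent)])
        J = np.vstack([grad(*u_new), tangent])
        u_new = u_new - np.linalg.solve(J, F)
        if abs(g(*u_new)) < 1e-12:
            break
    # New tangent: null direction of grad(g), oriented to keep moving forward.
    gv, gp = grad(*u_new)
    t_new = np.array([-gp, gv]) / np.hypot(gv, gp)
    if t_new @ tangent < 0:
        t_new = -t_new
    u, tangent = u_new, t_new
    path.append(u.copy())

path = np.array(path)
print(f"maximum loadability reached: P_max ≈ {path[:, 1].max():.4f} (analytic 1.0)")
```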

  11. On the validity of evolutionary models with site-specific parameters.

    Directory of Open Access Journals (Sweden)

    Konrad Scheffler

    Evolutionary models that make use of site-specific parameters have recently been criticized on the grounds that parameter estimates obtained under such models can be unreliable and lack theoretical guarantees of convergence. We present a simulation study providing empirical evidence that a simple version of the models in question does exhibit sensible convergence behavior and that additional taxa, despite not being independent of each other, lead to improved parameter estimates. Although it would be desirable to have theoretical guarantees of this, we argue that such guarantees would not be sufficient to justify the use of these models in practice. Instead, we emphasize the importance of taking the variance of parameter estimates into account rather than blindly trusting point estimates; this is standardly done by using the models to construct statistical hypothesis tests, which are then validated empirically via simulation studies.

  12. A Model Parameter Extraction Method for Dielectric Barrier Discharge Ozone Chamber using Differential Evolution

    Science.gov (United States)

    Amjad, M.; Salam, Z.; Ishaque, K.

    2014-04-01

    In order to design an efficient resonant power supply for an ozone gas generator, it is necessary to accurately determine the parameters of the ozone chamber. In the conventional method, information from the Lissajous plot is used to estimate the values of these parameters. However, the experimental setup for this purpose can only predict the parameters at one operating frequency, and there is no guarantee that it results in the highest ozone gas yield. This paper proposes a new approach to determine the parameters using a search and optimization technique known as Differential Evolution (DE). The desired objective function of DE is set at the resonance condition, and the chamber parameter values can be searched regardless of experimental constraints. The chamber parameters obtained from the DE technique are validated by experiment.
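
    The abstract does not give the chamber model, so the sketch below treats the chamber as a series R-L-C branch (a common simplification for dielectric-barrier-discharge loads) and uses scipy's differential_evolution to recover the branch parameters from simulated impedance measurements around resonance. The component values, frequency range, and objective are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
f = np.linspace(1e3, 50e3, 200)                        # Hz
w = 2 * np.pi * f
R_t, L_t, C_t = 50.0, 10e-3, 2e-9                      # "true" chamber parameters
Z_meas = R_t + 1j * (w * L_t - 1.0 / (w * C_t))        # series R-L-C impedance
Z_meas = Z_meas + rng.normal(0, 1.0, w.size)           # measurement noise

def objective(theta):
    R, L, C = theta
    Z = R + 1j * (w * L - 1.0 / (w * C))
    return np.mean(np.abs(Z - Z_meas) ** 2)

bounds = [(1.0, 500.0), (1e-4, 1e-1), (1e-10, 1e-7)]
result = differential_evolution(objective, bounds, seed=0, tol=1e-10)
R, L, C = result.x
print(f"R={R:.1f} ohm, L={L*1e3:.2f} mH, C={C*1e9:.2f} nF, "
      f"resonance ≈ {1/(2*np.pi*np.sqrt(L*C))/1e3:.1f} kHz")
```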

  13. Methods for measurement of durability parameters

    DEFF Research Database (Denmark)

    Hansen, Ernst Jan De Place

    1996-01-01

    Presents selected methods for the measurement of durability parameters relating to chlorides, corrosion, moisture and freeze-thaw, primarily on concrete. Advantages and drawbacks of the different methods are included.

  14. A New Uncertain Analysis Method for the Prediction of Acoustic Field with Random and Interval Parameters

    Directory of Open Access Journals (Sweden)

    Mingjie Wang

    2016-01-01

    For the frequency response analysis of an acoustic field with random and interval parameters, a nonintrusive uncertainty analysis method named the Polynomial Chaos Response Surface (PCRS) method is proposed. In the proposed method, the polynomial chaos expansion method is employed to deal with the random parameters, and the response surface method is used to handle the interval parameters. The PCRS method does not require efforts to modify the model equations, owing to its nonintrusive character. By means of the PCRS method combined with an existing interval analysis method, the lower and upper bounds of the expectation, variance, and probability density function of the frequency response can be efficiently evaluated. Two numerical examples are conducted to validate the accuracy and efficiency of the approach. The results show that the PCRS method is more efficient than direct Monte Carlo simulation (MCS) based on the original numerical model, without causing significant loss of accuracy.

  15. On-line validation of safety parameters and fault identification

    International Nuclear Information System (INIS)

    Tzanos, C.P.

    1985-01-01

    In many safety-significant off-normal events, the reliability of failure identification and corrective operator actions is limited greatly by the large amount of data that must be processed and analyzed mentally in a very short time and in a high-stress environment. A data-validation and fault-identification system that uses computers for continuous plant-information processing and analysis can enhance plant safety and also improve plant availability. A methodology has been developed that provides validation of safety-significant plant parameter measurements, plant state verification, and fault identification in the presence of many instrumentation failures (including multiple common-cause failures). This paper presents the methodology and summarizes some results of its application to a reference LMFBR plant.

  16. Validation of DRAGON side-step method for Bruce-A restart Phase-B physics tests

    International Nuclear Information System (INIS)

    Shen, W.; Ngo-Trong, C.; Davis, R.S.

    2004-01-01

    The DRAGON side-step method, developed at AECL, has a number of advantages over the all-DRAGON method that was used before. It is now the qualified method for reactivity-device calculations. Although the side-step-method-generated incremental cross sections have been validated against those previously calculated with the all-DRAGON method, it is highly desirable to validate the side-step method against device-worth measurements in power reactors directly. In this paper, the DRAGON side-step method was validated by comparison with the device-calibration measurements made in Bruce-A NGS Unit 4 restart Phase-B commissioning in 2003. The validation exercise showed excellent results, with the DRAGON code overestimating the measured ZCR worth by ∼5%. A sensitivity study was also performed in this paper to assess the effect of various DRAGON modelling techniques on the incremental cross sections. The assessment shows that the refinement of meshes in 3-D and the use of the side-step method are two major reasons contributing to the improved agreement between the calculated ZCR worths and the measurements. Use of different DRAGON versions, DRAGON libraries, local-parameter core conditions, and weighting techniques for the homogenization of tube clusters inside the ZCR have a very small effect on the ZCR incremental thermal absorption cross section and ZCR reactivity worth. (author)

  17. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  18. Validity and repeatability of inertial measurement units for measuring gait parameters.

    Science.gov (United States)

    Washabaugh, Edward P; Kalyanaraman, Tarun; Adamczyk, Peter G; Claflin, Edward S; Krishnan, Chandramouli

    2017-06-01

    Inertial measurement units (IMUs) are small wearable sensors that have tremendous potential to be applied to clinical gait analysis. They allow objective evaluation of gait and movement disorders outside the clinic and research laboratory, and permit evaluation on large numbers of steps. However, repeatability and validity data of these systems are sparse for gait metrics. The purpose of this study was to determine the validity and between-day repeatability of spatiotemporal metrics (gait speed, stance percent, swing percent, gait cycle time, stride length, cadence, and step duration) as measured with the APDM Opal IMUs and Mobility Lab system. We collected data on 39 healthy subjects. Subjects were tested over two days while walking on a standard treadmill, split-belt treadmill, or overground, with IMUs placed in two locations: both feet and both ankles. The spatiotemporal measurements taken with the IMU system were validated against data from an instrumented treadmill, or using standard clinical procedures. Repeatability and minimally detectable change (MDC) of the system was calculated between days. IMUs displayed high to moderate validity when measuring most of the gait metrics tested. Additionally, these measurements appear to be repeatable when used on the treadmill and overground. The foot configuration of the IMUs appeared to better measure gait parameters; however, both the foot and ankle configurations demonstrated good repeatability. In conclusion, the IMU system in this study appears to be both accurate and repeatable for measuring spatiotemporal gait parameters in healthy young adults. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Validation of ascorbic acid tablets of national production by igh-performance liquid chromatography method

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Izquierdo Castro, Idalberto

    2009-01-01

    We validated an analytical method based on high-performance liquid chromatography to determine the ascorbic acid content of vitamin C tablets, designed as an alternative method for quality control and for following the chemical stability of the active principle, since the official techniques for quality control of ascorbic acid in tablets are not selective with respect to degradation products. The method was adapted from that reported in USP 28 (2005) for the analysis of the injectable product. We used a 250 x 4.6 mm, 5 μm RP-18 column with UV detection at 245 nm. Validation was necessary for both objectives, considering the parameters required for methods of categories I and II. The method was sufficiently linear, accurate, and precise in the range of 100-300 μg/mL. It was also selective with respect to the remaining components of the matrix and the possible degradation products generated under stress conditions. Detection and quantification limits were estimated. Once validated, the method was applied to the quantification of ascorbic acid in two batches of expired tablets, and a marked influence of the container on the degradation of the active principle was detected after 12 months at room temperature. (Author)

  20. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database to be used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters of the initial core (one-third MOX core) through a full MOX core in the Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future, and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels, and (4) the guide for reviewing the core analysis codes, among others. (author)

  2. Validating carbonation parameters of alkaline solid wastes via integrated thermal analyses: Principles and applications

    International Nuclear Information System (INIS)

    Pan, Shu-Yuan; Chang, E.-E.; Kim, Hyunook; Chen, Yi-Hung; Chiang, Pen-Chi

    2016-01-01

    Highlights: • Key carbonation parameters of wastes are determined by integrated thermal analyses. • A modified TG-DTG interpretation is proposed, and validated by the DSC technique. • The modified TG-DTG interpretation is further verified by DTA, TG-MS and TG-FTIR. • Kinetics and thermodynamics of CaCO3 decomposition in solid wastes are determined. • Implications for the maximum carbonation conversion of various solid wastes are described. - Abstract: Accelerated carbonation of alkaline solid wastes is an attractive method for CO2 capture and utilization. However, the evaluation criteria for CaCO3 content in solid wastes and the way thermal analysis profiles are interpreted were found to be quite different among the literature. In this investigation, integrated thermal analyses for determining carbonation parameters in basic oxygen furnace slag (BOFS) were proposed based on thermogravimetric (TG), derivative thermogravimetric (DTG), and differential scanning calorimetry (DSC) analyses. A modified method of TG-DTG interpretation was proposed by considering the consecutive weight loss of the sample within 200-900 °C, because the decomposition of various hydrated compounds caused variances in estimates made using conventional methods of TG interpretation. Different quantities of reference CaCO3 standards, carbonated BOFS samples and synthetic CaCO3/BOFS mixtures were prepared for evaluating the data quality of the modified TG-DTG interpretation, in terms of precision and accuracy. The quantitative results of the modified TG-DTG method were also validated by DSC analysis. In addition, to confirm the TG-DTG results, evolved gas analysis was performed by mass spectrometry and Fourier transform infrared spectroscopy for detection of the gaseous compounds released during heating. Furthermore, the decomposition kinetics and thermodynamics of CaCO3 in BOFS were evaluated using the Arrhenius equation and the Kissinger equation. The proposed integrated thermal analyses thus provide a validated basis for determining the carbonation parameters of alkaline solid wastes.
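
    The Kissinger analysis mentioned at the end of the abstract is a one-line fit: ln(β/Tp²) = ln(AR/Ea) - Ea/(R·Tp), so plotting ln(β/Tp²) against 1/Tp for several heating rates β gives the activation energy from the slope. A sketch with invented DTG peak temperatures (the paper's data are not reproduced here):

```python
import numpy as np

R = 8.314  # J/(mol*K)

# Hypothetical DTG peak temperatures of CaCO3 decomposition at several heating rates.
beta = np.array([5.0, 10.0, 20.0, 40.0])           # K/min
Tp   = np.array([1005.0, 1023.0, 1042.0, 1062.0])  # K (illustrative values)

# Kissinger: ln(beta / Tp**2) = ln(A*R/Ea) - Ea / (R*Tp)
y = np.log(beta / Tp ** 2)
x = 1.0 / Tp
slope, intercept = np.polyfit(x, y, 1)

Ea = -slope * R                     # activation energy, J/mol
A = Ea / R * np.exp(intercept)      # pre-exponential factor, 1/min
print(f"Ea ≈ {Ea/1e3:.0f} kJ/mol, A ≈ {A:.2e} 1/min")
```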

  4. A validation methodology for fault-tolerant clock synchronization

    Science.gov (United States)

    Johnson, S. C.; Butler, R. W.

    1984-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating an experimental implementation of the Software Implemented Fault Tolerance (SIFT) clock synchronization algorithm. The design proof of the algorithm defines the maximum skew between any two nonfaulty clocks in the system in terms of theoretical upper bounds on certain system parameters. The quantile to which each parameter must be estimated is determined by a combinatorial analysis of the system reliability. The parameters are measured by direct and indirect means, and upper bounds are estimated. A nonparametric method based on an asymptotic property of the tail of a distribution is used to estimate the upper bound of a critical system parameter. Although the proof process is very costly, it is extremely valuable when validating the crucial synchronization subsystem.

  5. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical validation studies were performed with a view to their use in stability studies of future formulations of naproxen suppositories for infant and adult use. The factors with the greatest influence on naproxen stability were determined: the major degradation occurred in acidic and oxidative media and under the action of light. A high-performance liquid chromatography method was evaluated, which proved adequate for quantifying naproxen in suppositories and was selective with respect to degradation products. The quantification limit was 3,480 μg, so it was valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which was formerly validated for quality control and showed satisfactory results. Nevertheless, the volumetric methods were not regarded as stability-indicating; therefore, this method will be used along with the chromatographic methods of choice, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products.

  6. A validated HPTLC method for estimation of moxifloxacin hydrochloride in tablets.

    Science.gov (United States)

    Dhillon, Vandana; Chaudhary, Alok Kumar

    2010-10-01

    A simple HPTLC method with high accuracy, precision and reproducibility was developed for the routine estimation of moxifloxacin hydrochloride in tablets available on the market and was validated for various parameters according to ICH guidelines. Moxifloxacin hydrochloride was estimated at 292 nm by densitometry using silica gel 60 F254 as the stationary phase and a premix of methylene chloride:methanol:strong ammonia solution:acetonitrile (10:10:5:10) as the mobile phase. The method was found to be linear in the range of 9-54 ng with a correlation coefficient >0.99. The regression equation was: AUC = 65.57 × (amount in nanograms) + 163 (r² = 0.9908).
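
    Given the reported calibration line, the amount on the plate for an unknown spot follows by inverse prediction; a minimal sketch (the example AUC value is invented):

```python
# Inverse prediction from the reported calibration line AUC = 65.57*ng + 163.
def amount_ng(auc, slope=65.57, intercept=163.0):
    return (auc - intercept) / slope

# Example: a spot with AUC = 2100 corresponds to about 29.5 ng on the plate.
print(f"{amount_ng(2100):.1f} ng")
```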

  7. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques GC, HPLC and CE are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the different regulatory bodies.

  8. Detecting generalized synchronization of chaotic dynamical systems. A kernel-based method and choice of its parameter

    International Nuclear Information System (INIS)

    Suetani, Hiromichi; Iba, Yukito; Aihara, Kazuyuki

    2006-01-01

    An approach based on kernel methods for capturing the nonlinear interdependence between two signals is introduced. It is demonstrated with a simple, successful example that the proposed approach is useful for characterizing generalized synchronization. An attempt to choose an optimal kernel parameter based on cross-validation is also discussed. (author)
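
    As a generic illustration of cross-validated kernel parameter choice (not the authors' exact estimator), one can regress one signal on the other with an RBF kernel and pick the kernel width that maximizes cross-validated predictive skill; a high score indicates the kind of functional dependence that generalized synchronization implies. The coupled signals below are synthetic.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

# Two coupled signals: y is a noisy nonlinear function of x (drive-response toy case).
rng = np.random.default_rng(5)
x = rng.uniform(-2, 2, size=(300, 1))
y = np.sin(3 * x[:, 0]) + 0.1 * rng.normal(size=300)

# Choose the RBF kernel width by cross-validation.
search = GridSearchCV(KernelRidge(kernel="rbf", alpha=1e-2),
                      {"gamma": np.logspace(-2, 2, 9)}, cv=5)
search.fit(x, y)
print("selected gamma:", search.best_params_["gamma"],
      "CV R^2:", round(search.best_score_, 3))
```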

  9. Method for Determining the Time Parameter

    Directory of Open Access Journals (Sweden)

    K. P. Baslyk

    2014-01-01

    This article proposes a method for calculating one of the characteristics that defines the flight program of the first stage of a ballistic rocket, namely the time parameter of the attack-angle program. In simulating payload insertion, the first-stage flight program consists of three segments: a vertical climb, a segment of programmed turn by attack angle, and a segment of gravity turn with zero angle of attack. The programmed turn by attack angle is modelled as a rapidly decreasing and then increasing function, which depends on the attack angle amplitude, time, and the time parameter. Once the design and ballistic parameters and the attack angle amplitude are determined, this coefficient is calculated from the constraint that the rocket velocity equals 0.8 of the speed of sound (0.264 km/s) when the angle of attack becomes equal to zero. This constraint is transformed into a nonlinear equation, which can be solved using Newton's method. The attack angle amplitude is unknown at the design-analysis stage; exceeding some maximum admissible value of this parameter may lead to excessive trajectory foreshortening, which can be identified by the appearance of a negative trajectory angle. Consequently, it is necessary to compute the maximum value of the attack angle amplitude subject to the following constraints: the trajectory angle remains positive during the entire first-stage flight, and the rocket velocity equals 0.264 km/s by the end of the attack-angle program. The problem can be formulated as a nonlinear programming task, the minimization of a modified Lagrange function, which is solved using the method of multipliers. With the multipliers and penalty parameter held constant, an unconstrained optimization problem results; the coordinate descent method then allows solving the unconstrained minimization of the modified Lagrange function with fixed multipliers and penalty parameter.

  10. GPS User Devices Parameter Control Methods

    OpenAIRE

    Klūga, A; Kuļikovs, M; Beļinska, V; Zeļenkovs, A

    2007-01-01

    A wide assortment of GPS user devices is manufactured today. How can one verify that the parameters of a real device correspond to the parameters the manufacturer declares? How can one verify that the parameters have not changed during the operating life? The latter is especially important for aviation GPS systems, which must be verified before flight, and whose parameter values must be checked during repair work. This work analyses methods for controlling the parameters of GPS user devices.

  11. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    Science.gov (United States)

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  12. Validation of high-performance liquid chromatography (HPLC method for quantitative analysis of histamine in fish and fishery products

    Directory of Open Access Journals (Sweden)

    B.K.K.K. Jinadasa

    2016-12-01

    A high-performance liquid chromatography method is described for the quantitative determination of histamine in fish and fishery-product samples, together with its validation. Histamine is extracted from fish/fishery products by homogenizing with trichloroacetic acid and separated with Amberlite CG-50 resin and a C18-ODS Hypersil reversed-phase column at ambient temperature (25 °C). Linear standard curves with high correlation coefficients were obtained. An isocratic elution program was used; the total elution time was 10 min. The method was validated by assessing the following aspects: specificity, repeatability, reproducibility, linearity, recovery, limit of detection, limit of quantification and uncertainty. The validation parameters are in good agreement with the requirements of the method, which is a useful tool for determining histamine in fish and fishery products.

  13. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were attained. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Probing the parameter space of HD 49933: A comparison between global and local methods

    Energy Technology Data Exchange (ETDEWEB)

    Creevey, O L [Instituto de Astrofisica de Canarias (IAC), E-38200 La Laguna, Tenerife (Spain); Bazot, M, E-mail: orlagh@iac.es, E-mail: bazot@astro.up.pt [Centro de Astrofisica da Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal)

    2011-01-01

    We present two independent methods for studying the global stellar parameter space (mass M, age, chemical composition X0, Z0) of HD 49933 with seismic data. Using a local minimization and an MCMC algorithm, we obtain consistent results for the determination of the stellar properties: M ≈ 1.1-1.2 M_sun, age ≈ 3.0 Gyr, Z0 ≈ 0.008. A description of the error ellipses can be defined using Singular Value Decomposition techniques, and this is validated by comparing the errors with those from the MCMC method.

  15. Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters

    Science.gov (United States)

    Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to implement lunar surface sampling and to return the samples to Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a camera rotating platform. Optical images of the sampling area can be obtained by PCAM in the form of two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images. The lunar terrain can then be reconstructed based on photogrammetry. The installation parameters of PCAM with respect to the CE-5 lander are critical for the calculation of the exterior orientation elements (EO) of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined. Measurement methods combining camera images and optical coordinate observations are studied for this work. Research contents such as the observation program and specific solution methods for the installation parameters are then introduced. The parametric solution accuracy is analyzed using observations from the PCAM scientific validation experiment, which was used to test the authenticity of the PCAM detection process, ground data processing methods, product quality and so on. The analysis shows that the accuracy of the installation parameters affects the positional accuracy of corresponding image points of PCAM stereo images to within 1 pixel. Thus the measurement methods and parameter accuracy studied in this paper meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion

  16. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.
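
    The basic observed statistic is just a proportion, so a minimal example is the POI for one collaborator's replicates together with a binomial confidence interval (the Wilson interval is a common choice in the related probability-of-detection literature; the replicate data below are invented):

```python
import numpy as np
from statsmodels.stats.proportion import proportion_confint

# Replicate results from one collaborator testing the target material:
# 1 = Identified, 0 = Not Identified.
results = np.array([1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1])

poi = results.mean()
lo, hi = proportion_confint(results.sum(), results.size, alpha=0.05, method="wilson")
print(f"POI = {poi:.2f}, 95% Wilson CI = ({lo:.2f}, {hi:.2f})")
```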

  17. Model Optimization Identification Method Based on Closed-loop Operation Data and Process Characteristics Parameters

    Directory of Open Access Journals (Sweden)

    Zhiqiang GENG

    2014-01-01

    Output noise is strongly correlated with the input in a closed-loop control system, which makes closed-loop model identification difficult, and sometimes impossible, in practice. The forward channel model is chosen to isolate the disturbance entering the output noise from the input, and is identified by optimizing the dynamic characteristics of the process based on closed-loop operating data. The characteristic parameters of the process, such as dead time and time constant, are calculated and estimated from the PI/PID controller parameters and the closed-loop process input/output data. These characteristic parameters are adopted to define the search space of the optimization-based identification algorithm. A PSO-SQP optimization algorithm is applied to integrate the global search ability of PSO with the local search ability of SQP to identify the model parameters of the forward channel. The validity of the proposed method has been verified by simulation, and its practicability checked through PI/PID controller parameter tuning based on the identified forward channel model.
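
    The sketch below illustrates the overall identify-by-optimization scheme on an assumed first-order-plus-dead-time (FOPDT) forward-channel model, with scipy's differential evolution standing in for the PSO global stage and SLSQP for the SQP local refinement; the step-response data, bounds, and model structure are all illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# First-order-plus-dead-time (FOPDT) step response, a common forward-channel model:
# y(t) = K * (1 - exp(-(t - theta)/tau)) for t >= theta, else 0.
def fopdt(p, t):
    K, tau, theta = p
    y = K * (1.0 - np.exp(-(t - theta) / tau))
    return np.where(t >= theta, y, 0.0)

rng = np.random.default_rng(6)
t = np.linspace(0, 50, 250)
y_meas = fopdt((2.0, 8.0, 3.0), t) + 0.02 * rng.normal(size=t.size)

sse = lambda p: np.sum((fopdt(p, t) - y_meas) ** 2)

# Characteristic-parameter estimates would bound the search space; here the
# bounds are illustrative. DE stands in for the global PSO stage; SLSQP is
# the local SQP refinement.
bounds = [(0.1, 10.0), (0.5, 30.0), (0.0, 10.0)]
glob = differential_evolution(sse, bounds, seed=0)
loc = minimize(sse, glob.x, method="SLSQP", bounds=bounds)
print("identified K, tau, theta:", np.round(loc.x, 3))
```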

  18. The dispersal of contaminants in heterogeneous aquifers: a review of methods of estimating scale dependent parameters

    International Nuclear Information System (INIS)

    Farmer, C.L.

    1986-02-01

    The design and assessment of underground waste disposal options requires modelling the dispersal of contaminants within aquifers. The logical structure of the development and application of disposal models is discussed. In particular we examine the validity and interpretation of the gradient diffusion model. The effective dispersion parameters in such a model seem to depend upon the scale on which they are measured. This phenomenon is analysed and methods for modelling scale dependent parameters are reviewed. Specific recommendations regarding the modelling of contaminant dispersal are provided. (author)

  19. Validation method for determination of cholesterol in human urine with electrochemical sensors using gold electrodes

    Science.gov (United States)

    Riyanto, Laksono, Tomy Agung

    2017-12-01

    Electrochemical sensors for the determination of cholesterol with gold (Au) as the working electrode, and their application to the analysis of urine, have been developed. The gold electrode was prepared from pure gold (99.99%), 1.0 mm in length and width, connected to a silver wire with silver conductive paint. A validation of the method for the analysis of cholesterol in human urine using electrochemical sensors, i.e., the cyclic voltammetry (CV) method, was carried out. The effects of electrolyte and uric acid concentration were determined to establish the optimum method. The validation parameters for cholesterol analysis in human urine using CV were precision, recovery, linearity, limit of detection (LOD) and limit of quantification (LOQ). The correlation of cholesterol concentration with anodic peak current gave a coefficient of determination of R² = 0.916. The precision, recovery, linearity, LOD and LOQ were 1.2539%, 144.33%, 0.916, 1.49 × 10⁻¹ mM and 4.96 × 10⁻¹ mM, respectively. In conclusion, the Au electrode is a good electrode for electrochemical sensing in the determination of cholesterol in human urine.

  20. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    OpenAIRE

    Oman Zuas; Harry budiman; Muhammad Rizky Mulyana

    2016-01-01

    An accurate gas chromatography coupled to flame ionization detection (GC-FID) method was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated based on the ISO/IEC 17025 definition, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture revealed that each target compound was well separated.

  1. Evaluation of the Reference Numerical Parameters of the Monthly Method in ISO 13790 Considering S/V Ratio

    Directory of Open Access Journals (Sweden)

    Hee-Jeong Kwak

    2015-01-01

    Many studies have investigated the accuracy of the numerical parameters in the application of the quasi-steady-state calculation method. The aim of this study is to derive the reference numerical parameters of the ISO 13790 monthly method by reflecting the surface-to-volume (S/V) ratio and the characteristics of the structures. The calculation process was established, and the parameters necessary to derive the reference numerical parameters were calculated based on the input data prepared for the established calculation process. The reference numerical parameters were then derived through regression analyses of the calculated parameters and the time constant. The parameters obtained from an apartment building and the parameters of the international standard were both applied to the Passive House Planning Package (PHPP) and EnergyPlus programs, and the results were analyzed in order to evaluate the validity of the derivation. The analysis revealed that the calculation results based on the parameters derived in this study yielded lower error rates than those based on the default parameters in ISO 13790. However, the differences were shown to be negligible in the case of high heat capacity.

  2. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Water quality monitoring is prone to errors in its recording or measuring processes. Monitoring of river water quality aims not only to recognize water quality dynamics, but also to evaluate the data in order to create river management and water pollution policy, so as to maintain human health and sanitation requirements and to preserve biodiversity. Evaluation of water quality monitoring needs to start by identifying the important water quality parameters. This research aimed to identify the significant parameters by using two transformation/standardization methods on water quality data: the river Water Quality Index, WQI (Indeks Kualitas Air Sungai, IKAs) transformation method, and standardization to mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. Principal Component Analysis, PCA (Analisa Komponen Utama, AKU), with the help of Scilab software, was used to process the secondary data on the water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add-In software. The results showed that only 18 of the 35 water quality parameters had acceptable data quality. The two transformation/standardization methods gave different types and numbers of significant parameters. With standardization to mean 0 and variance 1, the significant water quality parameters, relative to the mean concentration of each parameter, were TDS, SO4, EC, TSS, NO3N, COD, BOD5, grease/oil and NH3N. With the river WQI transformation or standardization, the significant water quality parameters showed the level of
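
    The study used Scilab; as a language-neutral illustration of the mean-0/variance-1 route, the sketch below standardizes a (synthetic) monitoring matrix and reads the dominant loadings of the first principal component as the "significant" parameters. Parameter names are taken from the abstract; the data are random stand-ins.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical monitoring matrix: rows = sampling dates, columns = parameters.
rng = np.random.default_rng(7)
params = ["TDS", "SO4", "EC", "TSS", "NO3N", "COD", "BOD5", "OilGrease", "NH3N"]
X = rng.lognormal(mean=1.0, sigma=0.8, size=(120, len(params)))

# Standardization to mean 0, variance 1, then PCA.
Z = StandardScaler().fit_transform(X)
pca = PCA().fit(Z)

# Parameters with the largest absolute loadings on the first component are
# the "significant" ones under this standardization.
loadings = pca.components_[0]
for i in np.argsort(-np.abs(loadings))[:3]:
    print(f"{params[i]}: loading {loadings[i]:+.2f}")
print("explained variance ratio PC1:", round(pca.explained_variance_ratio_[0], 2))
```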

  3. Verification and validation of the safety parameter display system for nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Yuanfang

    1993-05-01

    During the design and development phase of the safety parameter display system for a nuclear power plant, a verification and validation (V and V) plan was implemented to improve the quality of the system design. The V and V activities, executed in the four stages of feasibility research, system design, code development, and system integration and regulation, are briefly introduced. The evaluation plan, the process of its implementation, and the conclusions of the final technical validation of the system are also presented in detail.

  4. Fault-tolerant clock synchronization validation methodology. [in computer systems

    Science.gov (United States)

    Butler, Ricky W.; Palumbo, Daniel L.; Johnson, Sally C.

    1987-01-01

    A validation method for the synchronization subsystem of a fault-tolerant computer system is presented. The high reliability requirement of flight-crucial systems precludes the use of most traditional validation methods. The method presented utilizes formal design proof to uncover design and coding errors and experimentation to validate the assumptions of the design proof. The experimental method is described and illustrated by validating the clock synchronization system of the Software Implemented Fault Tolerance computer. The design proof of the algorithm includes a theorem that defines the maximum skew between any two nonfaulty clocks in the system in terms of specific system parameters. Most of these parameters are deterministic. One crucial parameter is the upper bound on the clock read error, which is stochastic. The probability that this upper bound is exceeded is calculated from data obtained by the measurement of system parameters. This probability is then included in a detailed reliability analysis of the system.

  5. Development and validation of stability indicating UPLC assay method for ziprasidone active pharma ingredient

    Directory of Open Access Journals (Sweden)

    Sonam Mittal

    2012-01-01

    Full Text Available Background: Ziprasidone, a novel antipsychotic, exhibits potent, highly selective antagonistic activity at D2 and 5HT2A receptors. A literature survey for ziprasidone revealed several analytical methods based on different techniques, but no UPLC method has been reported so far. Aim: The aim of this paper is to present a simple and rapid stability-indicating isocratic ultra performance liquid chromatographic (UPLC) method developed and validated for the determination of the ziprasidone active pharmaceutical ingredient. Forced degradation of ziprasidone was studied under acid, base, oxidative hydrolysis, thermal stress and photo stress conditions. Materials and Methods: The quantitative determination of the ziprasidone drug was performed on a Supelco analytical column (100×2.1 mm i.d., 2.7 µm) with 10 mM ammonium acetate buffer (pH 6.7) and acetonitrile (ACN) as mobile phase in the ratio 55:45 (buffer:ACN) at a flow rate of 0.35 ml/min. For the UPLC method, UV detection was made at 318 nm and the run time was 3 min. The developed UPLC method was validated as per ICH guidelines. Results and Conclusion: Mild degradation of the drug substance was observed during oxidative hydrolysis, and considerable degradation was observed during basic hydrolysis. During method validation, parameters such as precision, linearity, ruggedness, stability, robustness, and specificity were evaluated and remained within acceptable limits. The developed UPLC method was successfully applied for evaluating the assay of the ziprasidone active pharmaceutical ingredient.

  6. Method validation for simultaneous counting of Total α , β in Drinking Water using Liquid Scintillation Counter

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.

    2014-05-01

    In this work, a pulse shape analysis method was validated for the determination of gross alpha and beta emitters in drinking water using the WinSpectral 1414 liquid scintillation counter. The validation parameters included the method detection limit, method quantitation limit, repeatability limit, intermediate precision, trueness (bias), recovery coefficient, linearity, and uncertainty budget. The results show that the method detection limit and method quantitation limit were 0.07 and 0.24 Bq/l for alpha emitters, and 0.42 and 1.4 Bq/l for beta emitters, respectively. The relative standard deviation of the repeatability limit reached 2.81% for alpha emitters and 3.96% for beta emitters. In addition, the relative standard deviation of the intermediate precision was 0.54% for alpha emitters and 1.17% for beta emitters. The trueness was -7.7% for alpha emitters and -4.5% for beta emitters. The recovery coefficient ranged between 87-96% and 88-101% for alpha and beta emitters, respectively. Linearity reached 1 for both alpha and beta emitters. The overall uncertainty budgets were 96.65% and 83.14% for alpha and beta emitters, respectively (author).

  7. Calibration and Validation Parameter of Hydrologic Model HEC-HMS using Particle Swarm Optimization Algorithms – Single Objective

    Directory of Open Access Journals (Sweden)

    R. Garmeh

    2016-02-01

    results show that the performance of the model is not desirable. The results emphasized the impossibility of obtaining unique parameters for a basin. This method of solution, because calibration does not have a single solution, could be helpful as an inverse problem that limits the number of candidates. The above analysis revealed the existence of different parameter sets that can all simulate verification events quite well, which shows the non-uniqueness feature of the calibration problem under study. However, the methodology has benefited from that feature by finding new parameter intervals that should be fine-tuned further in order to decrease input and model prediction uncertainties. The proposed methodology performed well in the automated calibration of an event-based hydrologic model; however, the authors are aware of a drawback of the presented analysis - this undertaking was not a completely fair validation procedure. It is because validation events represent possible future scenarios and thus are not available at the time of model calibration. Hence, an event selected as a validation event should not be used to receive any more feedback for adjusting parameter values and ranges. However, this remark was not fully taken into consideration, mostly because this calibration study was seriously short of observed events. Therefore, the proposed methodology, although sound and useful, should be validated in other case studies with more observed flood events.
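
    As a rough illustration of the single-objective particle swarm optimization used for this kind of calibration, the sketch below minimizes a generic objective (e.g., a sum of squared errors between simulated and observed flows) over a box-constrained parameter space. The objective function, bounds, and PSO constants are illustrative assumptions, not values from the study.

        import numpy as np

        def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
            """Minimal single-objective PSO over box constraints (illustrative)."""
            rng = np.random.default_rng(seed)
            lo, hi = bounds[:, 0], bounds[:, 1]
            dim = len(lo)
            x = rng.uniform(lo, hi, size=(n_particles, dim))       # positions
            v = np.zeros_like(x)                                   # velocities
            pbest = x.copy()
            pbest_f = np.apply_along_axis(objective, 1, x)
            g = pbest[np.argmin(pbest_f)].copy()                   # global best
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)                         # keep particles in bounds
                f = np.apply_along_axis(objective, 1, x)
                better = f < pbest_f
                pbest[better], pbest_f[better] = x[better], f[better]
                g = pbest[np.argmin(pbest_f)].copy()
            return g, pbest_f.min()

        # Toy stand-in for a calibration objective (e.g., SSE of simulated vs observed flow).
        sse = lambda p: np.sum((p - np.array([2.0, 0.5, 30.0])) ** 2)
        bounds = np.array([[0, 10], [0, 1], [0, 100]], dtype=float)
        best, best_f = pso(sse, bounds)
        print(best, best_f)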

  8. Development and Validation of a Dissolution Test Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a dissolution test method for dissolution release of artemether and lumefantrine from tablets. Methods: A single dissolution method for evaluating the in vitro release of artemether and lumefantrine from tablets was developed and validated. The method comprised of a dissolution medium of ...

  9. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
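
    A minimal sketch of the kind of calculation the method implies: given probabilistic sensitivity analysis (PSA) samples of a model outcome and a pre-established accuracy interval around the empirical value, count the samples falling inside the interval and form a Beta posterior for the probability of a valid outcome. The prior, interval width, and data below are illustrative assumptions, not the published model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        psa_outcomes = rng.normal(105.0, 15.0, size=1000)   # hypothetical PSA samples
        observed = 100.0                                     # empirical outcome
        deviation = 0.25                                     # 25% accuracy interval
        lo, hi = observed * (1 - deviation), observed * (1 + deviation)

        k = int(np.sum((psa_outcomes >= lo) & (psa_outcomes <= hi)))
        n = psa_outcomes.size

        # Beta(1, 1) prior on the probability that the model outcome is "valid".
        posterior = stats.beta(1 + k, 1 + n - k)
        print(f"expected validity: {posterior.mean():.2f}, "
              f"95% credible interval: {posterior.ppf([0.025, 0.975]).round(3)}")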

  10. Practical procedure for method validation in INAA- A tutorial

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2015-01-01

    This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) are outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, which procedures and strategies to adopt, and the acceptance criteria for the results - that is, how to carry out a method validation in INAA. To exemplify the methodology, results are presented for the validation of the determination of the mass fractions of Co, Cr, Fe, Rb, Se and Zn in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)
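
    As an illustration of a few of the listed performance characteristics, the sketch below computes repeatability (as relative standard deviation), trueness (relative bias against a reference value), recovery, and blank-based detection and quantification limits from replicate measurements. The data and the 3-sigma/10-sigma conventions are assumptions for illustration, not values from the tutorial.

        import numpy as np

        replicates = np.array([10.2, 9.8, 10.1, 10.4, 9.9, 10.0])  # hypothetical mass fractions
        reference = 10.0                                            # assumed certified value
        blanks = np.array([0.05, 0.02, 0.04, 0.03, 0.06, 0.04])    # hypothetical blank signals

        mean = replicates.mean()
        rsd = 100 * replicates.std(ddof=1) / mean          # repeatability, %RSD
        bias = 100 * (mean - reference) / reference        # trueness, relative bias %
        recovery = 100 * mean / reference                  # recovery, %
        lod = 3 * blanks.std(ddof=1)                       # common 3-sigma convention
        loq = 10 * blanks.std(ddof=1)                      # common 10-sigma convention

        print(f"RSD={rsd:.1f}%  bias={bias:+.1f}%  recovery={recovery:.1f}%  "
              f"LOD={lod:.3f}  LOQ={loq:.3f}")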

  12. Validation of the Rotation Ratios Method

    International Nuclear Information System (INIS)

    Foss, O.A.; Klaksvik, J.; Benum, P.; Anda, S.

    2007-01-01

    Background: The rotation ratios method describes rotations between pairs of sequential pelvic radiographs. The method seems promising but has not been validated. Purpose: To validate the accuracy of the rotation ratios method. Material and Methods: Known pelvic rotations between 165 radiographs obtained from five skeletal pelvises in an experimental material were compared with the corresponding calculated rotations to describe the accuracy of the method. The results from a clinical material of 262 pelvic radiographs from 46 patients defined the ranges of rotational differences compared. Repeated analyses, both on the experimental and the clinical material, were performed using the selected reference points to describe the robustness and the repeatability of the method. Results: The reference points were easy to identify and barely influenced by pelvic rotations. The mean differences between calculated and real pelvic rotations were 0.0 deg (SD 0.6) for vertical rotations and 0.1 deg (SD 0.7) for transversal rotations in the experimental material. The intra- and interobserver repeatability of the method was good. Conclusion: The accuracy of the method was reasonably high, and the method may prove to be clinically useful.

  13. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  14. Validation of a method for radionuclide activity optimize in SPECT

    International Nuclear Information System (INIS)

    Perez Diaz, M.; Diaz Rizo, O.; Lopez Diaz, A.; Estevez Aparicio, E.; Roque Diaz, R.

    2007-01-01

    A discriminant method for optimizing the activity administered in nuclear medicine studies is validated by comparison with ROC curves. The method is tested in 21 SPECT studies performed with a cardiac phantom. Three different cold lesions (L1, L2 and L3) were placed in the myocardium wall for each SPECT. Three activities (84 MBq, 37 MBq or 18.5 MBq) of Tc-99m diluted in water were used as background. Linear discriminant analysis was used to select the parameters that characterize image quality (background-to-lesion (B/L) and signal-to-noise (S/N) ratios). Two clusters with different image quality (p=0.021) were obtained from the selected variables. The first one involved the studies performed with 37 MBq and 84 MBq, and the second one included the studies with 18.5 MBq. The ratios B/L1, B/L2 and B/L3 are the parameters capable of constructing the discriminant function, with 100% of cases correctly classified into the clusters. The value of 37 MBq is the lowest tested activity for which good results for the B/Li variables were obtained, without significant differences from the results with 84 MBq (p>0.05). This result is coincident with the applied ROC analysis. A correlation between both methods of r=0.890 was obtained. (Author) 26 refs
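
    A rough sketch of the discriminant step described above: linear discriminant analysis separating image-quality clusters from background-to-lesion ratios. The feature values and labels below are synthetic stand-ins, not the study's data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical image-quality features per SPECT study: [B/L1, B/L2, B/L3].
        X = np.array([
            [0.35, 0.40, 0.45], [0.33, 0.38, 0.44], [0.36, 0.41, 0.47],   # 84 MBq
            [0.37, 0.42, 0.48], [0.38, 0.44, 0.49], [0.36, 0.43, 0.46],   # 37 MBq
            [0.60, 0.66, 0.72], [0.58, 0.65, 0.70], [0.62, 0.68, 0.74],   # 18.5 MBq
        ])
        # Cluster labels: 0 = acceptable image quality (37/84 MBq), 1 = degraded (18.5 MBq).
        y = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1])

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("training accuracy:", lda.score(X, y))          # fraction correctly classified
        print("predicted cluster:", lda.predict([[0.37, 0.43, 0.47]]))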

  15. Validation of an image quality index: its correlation with quality control parameters

    International Nuclear Information System (INIS)

    Cabrejas, M.L.C.; Giannone, C.A.; Arashiro, J.G.; Cabrejas, R.C.

    2002-01-01

    Objective and Rationale: To validate a new image quality index (the Performance Index, PI) that assesses the detectability of simulated lesions with a phantom. This index presumably depends markedly on quality control (QC) parameters such as tomographic uniformity (Unif), centre of rotation (COR) and spatial resolution (FWHM). The simultaneous effects of the QC parameters may explain much of the variation in the PIs; i.e., they may be predictors of the PI values. Methods: An overall performance phantom containing 3 sections was used. The first, uniform section was used to determine tomographic uniformity. From the analysis of the slices corresponding to the second section, containing 8 cold cylindrical simulated lesions of different diameters (range 7 mm - 17 mm), the numbers of true and false positives are determined, and from these a new Performance Index (PI) is defined as the ratio between the positive predictive value and the sensitivity (expressed as its complement, adding a constant to avoid a singularity). A point source located on the top of the phantom was used to determine the centre of rotation and the spatial resolution expressed by the FWHM in mm. 40 nuclear medicine labs participated in the survey. Standard multiple regression analysis, with the Performance Index as dependent variable and FWHM, COR and Unif as independent variables, was performed to evaluate the influence of the QC parameters on the PI values. Results: It is shown that resolution and COR are both predictors of the PIs, with statistical significance for the multiple correlation coefficient R. However, the addition of the variable tomographic uniformity to the model does not improve the prediction of PIs; moreover, the regression model then lacks overall statistical significance. A regression summary for the dependent variable Performance Index is presented. Conclusions: We confirm that the new Performance Index (PI) depends on QC parameters such as COR and spatial resolution. Those labs whose PIs are out

  16. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    Full Text Available The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure the product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis. Type: Commentary

  17. A Comparative Study of Distribution System Parameter Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Williams, Tess L.; Gourisetti, Sri Nikhil Gup

    2016-07-17

    In this paper, we compare two parameter estimation methods for distribution systems: residual sensitivity analysis and state-vector augmentation with a Kalman filter. These two methods were originally proposed for transmission systems, and are still the most commonly used methods for parameter estimation. Distribution systems have much lower measurement redundancy than transmission systems. Therefore, estimating parameters is much more difficult. To increase the robustness of parameter estimation, the two methods are applied with combined measurement snapshots (measurement sets taken at different points in time), so that the redundancy for computing the parameter values is increased. The advantages and disadvantages of both methods are discussed. The results of this paper show that state-vector augmentation is a better approach for parameter estimation in distribution systems. Simulation studies are done on a modified version of IEEE 13-Node Test Feeder with varying levels of measurement noise and non-zero error in the other system model parameters.
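
    To illustrate the state-vector augmentation idea on a toy problem, the sketch below treats an unknown line parameter as an extra (random-walk) state and updates it with a scalar Kalman filter across combined measurement snapshots. The single-parameter setup, system values, and noise levels are simplifications for illustration, not the paper's test feeder.

        import numpy as np

        rng = np.random.default_rng(2)
        r_true = 0.45                          # unknown line resistance (ohm), to be estimated
        currents = rng.uniform(20, 120, 50)    # snapshot currents (A) at different times
        z = r_true * currents + rng.normal(0, 0.5, 50)   # measured voltage drops (V)

        # Scalar Kalman filter with the parameter as the (augmented) state.
        x, P = 0.3, 1.0          # initial parameter guess and variance
        Q, R = 1e-6, 0.5**2      # random-walk process noise and measurement noise
        for i_k, z_k in zip(currents, z):
            P += Q                            # predict (parameter modeled as near-constant)
            H = i_k                           # measurement model: z = r * i
            K = P * H / (H * P * H + R)       # Kalman gain
            x += K * (z_k - H * x)            # update parameter estimate
            P *= (1 - K * H)

        print(f"estimated r = {x:.4f} ohm (true {r_true})")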

  18. Validity and reliability of the session-RPE method for quantifying training load in karate athletes.

    Science.gov (United States)

    Tabben, M; Tourny, C; Haddad, M; Chaabane, H; Chamari, K; Coquart, J B

    2015-04-24

    To test the construct validity and reliability of the session rating of perceived exertion (sRPE) method by examining the relationship between RPE and physiological parameters (heart rate, HR, and blood lactate concentration, [La-]) and the correlations between sRPE and two HR-based methods for quantifying internal training load (Banister's method and Edwards's method) during a karate training camp. Eighteen elite karate athletes: ten men (age: 24.2 ± 2.3 y, body mass: 71.2 ± 9.0 kg, body fat: 8.2 ± 1.3% and height: 178 ± 7 cm) and eight women (age: 22.6 ± 1.2 y, body mass: 59.8 ± 8.4 kg, body fat: 20.2 ± 4.4%, height: 169 ± 4 cm) were included in the study. During the training camp, subjects participated in eight karate training sessions covering three training modes (4 tactical-technical, 2 technical-development, and 2 randori training), during which RPE, HR, and [La-] were recorded. Significant correlations were found between RPE and the physiological parameters (percentage of maximal HR: r = 0.75, 95% CI = 0.64-0.86; [La-]: r = 0.62, 95% CI = 0.49-0.75), between sRPE and the HR-based measures of internal training load (r = 0.65-0.95), and good reliability of the same intensity across training sessions (Cronbach's α = 0.81, 95% CI = 0.61-0.92). This study demonstrates that the sRPE method is valid for quantifying internal training load and intensity in karate.

  19. Validation of method in instrumental NAA for food products sample

    International Nuclear Information System (INIS)

    Alfian; Siti Suprapti; Setyo Purwanto

    2010-01-01

    NAA is a testing method that has not been standardized. To affirm and confirm that the method is valid, it must be validated with various standard reference materials. In this work, the validation was carried out for food product samples using NIST SRM 1567a (wheat flour) and NIST SRM 1568a (rice flour). The results show that the validation of the method for testing nine elements (Al, K, Mg, Mn, Na, Ca, Fe, Se and Zn) in SRM 1567a and eight elements (Al, K, Mg, Mn, Na, Ca, Se and Zn) in SRM 1568a passes the tests of accuracy and precision. It can be concluded that this method is able to give valid results in the determination of elements in food product samples. (author)

  20. Human Factors methods concerning integrated validation of nuclear power plant control rooms (Method development for integrated validation)

    Energy Technology Data Exchange (ETDEWEB)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia (Swedish Defence Research Agency, Information Systems, Linkoeping (Sweden))

    2010-02-15

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was held with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which measurement methods should be used, and how the methods are affected by changes in the control room. The report raises various questions for discussion concerning the validation process. Supplementary measurement methods for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for the identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations concerning the responsibility of external participants in the validation process. The authors propose 12 measures to address the identified problems.

  1. Development and Validation Dissolution Analytical Method of Nimesulide beta-Cyclodextrin 400 mg Tablet

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Carvalho Pereira

    2016-10-01

    Full Text Available Nimesulide (N-(4-nitro-2-phenoxyphenyl)methanesulfonamide) belongs to the class of non-steroidal anti-inflammatory drugs (NSAIDs) and to category II of the biopharmaceutical classification system. The complexation of nimesulide with beta-cyclodextrin is a pharmacological strategy to increase the solubility of the drug. The objective of this study was to develop and validate an analytical methodology for the dissolution of the nimesulide beta-cyclodextrin 400 mg tablet that meets the ANVISA guidelines for drug registration purposes. Once developed, the dissolution methodology was validated according to the parameters of Resolution RE no. 899/2003. During development of the method it was found that a test duration of 60 minutes was adequate, and that the most suitable dissolution medium and volume was 900 mL of an aqueous solution of sodium lauryl sulfate 1% (w/v). A rotation speed of 100 rpm with the paddle apparatus was the most appropriate for evaluating the dissolution of the drug. A spectrophotometric methodology was used to quantify the percentage of dissolved drug, with quantification at a wavelength of 390 nm. In the validation of the methodology, the system suitability parameters, specificity/selectivity, linearity, precision, accuracy and robustness were satisfactory, demonstrating that the developed dissolution methodology was properly executed. DOI: http://dx.doi.org/10.17807/orbital.v8i5.827

  2. A Multiscale Finite Element Model Validation Method of Composite Cable-Stayed Bridge Based on Structural Health Monitoring System

    Directory of Open Access Journals (Sweden)

    Rumian Zhong

    2015-01-01

    Full Text Available A two-step response surface method for multiscale finite element model (FEM) updating and validation is presented with respect to the Guanhe Bridge, a composite cable-stayed bridge on National Highway G15 in China. Firstly, the state equations of both the multiscale and the single-scale FEM are established based on the basic equations of structural dynamics to update the multiscale coupling parameters and structural parameters. Secondly, based on measured data from the structural health monitoring (SHM) system, a Monte Carlo simulation is employed to analyze uncertainty quantification and transmission, where the uncertainties of the multiscale FEM and the measured data were considered. The results indicate that the relative errors between the calculated and measured frequencies are less than 2%, and the overlap ratio indexes of each modal frequency are larger than 80% without the average absolute value of relative errors. This demonstrates that the proposed method can be applied to validate the multiscale FEM, and that the validated FEM can reflect the current conditions of the real bridge; thus it can be used as the basis for bridge health monitoring, damage prognosis (DP), and safety prognosis (SP).

  3. Validity of a smartphone protractor to measure sagittal parameters in adult spinal deformity.

    Science.gov (United States)

    Kunkle, William Aaron; Madden, Michael; Potts, Shannon; Fogelson, Jeremy; Hershman, Stuart

    2017-10-01

    Smartphones have become an integral tool in the daily life of health-care professionals (Franko 2011). Their ease of use and wide availability often make smartphones the first tool surgeons use to perform measurements. This technique has been validated for certain orthopedic pathologies (Shaw 2012; Quek 2014; Milanese 2014; Milani 2014), but never for assessing sagittal parameters in adult spinal deformity (ASD). This study was designed to assess the validity, reproducibility, precision, and efficiency of using a smartphone protractor application to measure sagittal parameters commonly measured in ASD assessment and surgical planning. The study aimed to (1) determine the validity of smartphone protractor applications, (2) determine the intra- and interobserver reliability of smartphone protractor applications when used to measure sagittal parameters in ASD, (3) determine the efficiency of using a smartphone protractor application to measure sagittal parameters, and (4) elucidate whether a physician's level of experience impacts the reliability or validity of using a smartphone protractor application to measure sagittal parameters in ASD. An experimental validation study was carried out. Thirty standard 36″ standing lateral radiographs were examined. Three separate measurements were performed using a marker and protractor; then, at a separate time point, three separate measurements were performed using a smartphone protractor application for all 30 radiographs. The first 10 radiographs were then re-measured two more times, for a total of three measurements from both the smartphone protractor and the marker and protractor. The parameters included lumbar lordosis, pelvic incidence, and pelvic tilt. Three raters performed all measurements: a junior-level orthopedic resident, a senior-level orthopedic resident, and a fellowship-trained spinal deformity surgeon. All data, including the time to perform the measurements, were recorded, and statistical analysis was performed to

  4. Introducing conjoint analysis method into delayed lotteries studies: its validity and time stability are higher than in adjusting.

    Science.gov (United States)

    Białek, Michał; Markiewicz, Łukasz; Sawicki, Przemysław

    2015-01-01

    Delayed lotteries are much more common in everyday life than pure lotteries. Usually, we need to wait to find out the outcome of a risky decision (e.g., investing in a stock market, engaging in a relationship). However, most research has studied time discounting and probability discounting in isolation, using methodologies designed specifically to track changes in one parameter. The most commonly used method is adjusting, but its reported validity and time stability in research on discounting are suboptimal. The goal of this study was to introduce a novel method for analyzing delayed lotteries - conjoint analysis - which hypothetically is more suitable for analyzing individual preferences in this area. A set of two studies compared conjoint analysis with adjusting. The results suggest that individual parameters of discounting strength estimated with conjoint have higher predictive value (Studies 1 and 2) and are more stable over time (Study 2) compared to adjusting. We discuss these findings, despite the exploratory character of the reported studies, by suggesting that future research on delayed lotteries should be cross-validated using both methods.

  6. Method validation for determination of heavy metals in wine and slightly alcoholic beverages by ICP-MS

    International Nuclear Information System (INIS)

    Voica, Cezara; Dehelean, Adriana; Pamula, A

    2009-01-01

    The Organisation Internationale de la Vigne et du Vin (OIV) has fixed upper limits for some heavy metals in wine. Consequently, there is a need to determine the very low concentrations of elements that may be present in wine at trace and ultra-trace levels. Inductively coupled plasma mass spectrometry (ICP-MS) is considered an excellent tool for the detailed characterization of the elemental composition of many samples, including beverage samples. In this study, a quantitative method for the determination of toxic metals (Cr, As, Cd, Ni, Hg, Pb) in wines and slightly alcoholic beverages by ICP-MS was validated. Several parameters were taken into account and evaluated for the validation of the method, namely: linearity, the minimum detection limit, the limit of quantification, accuracy and uncertainty.

  8. ASTM Validates Air Pollution Test Methods

    Science.gov (United States)

    Chemical and Engineering News, 1973

    1973-01-01

    The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. Aim of the project is to establish nationwide consistency in measuring pollutants; determining precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  9. Parameter-free method for the shape optimization of stiffeners on thin-walled structures to minimize stress concentration

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yang; Shibutan, Yoji [Osaka University, Osaka (Japan); Shimoda, Masatoshi [Toyota Technological Institute, Nagoya (Japan)

    2015-04-15

    This paper presents a parameter-free shape optimization method for the strength design of stiffeners on thin-walled structures. The maximum von Mises stress is minimized subject to a volume constraint. The optimum design problem is formulated as a distributed-parameter shape optimization problem under the assumptions that a stiffener varies in the in-plane direction and that the thickness is constant. The issue of non-differentiability, which is inherent in this min-max problem, is avoided by transforming the local measure into a smooth, differentiable integral functional by using the Kreisselmeier-Steinhauser function. The shape gradient functions are derived using the material derivative method and the adjoint variable method and are applied to the H1 gradient method for shells to determine the optimal free-boundary shapes. With this method, a smooth optimal stiffener shape can be obtained without any shape design parameterization while minimizing the maximum stress. The validity of the method is verified through two practical design examples.
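
    The Kreisselmeier-Steinhauser (KS) aggregation mentioned above can be illustrated numerically: it replaces the non-differentiable maximum over local stresses with a smooth log-sum-exp bound that tightens as the aggregation parameter grows. The stress values and parameter choices below are arbitrary illustrative numbers.

        import numpy as np

        def ks_aggregate(values, rho):
            """Smooth, differentiable upper bound on max(values) via the KS function."""
            m = values.max()                       # shift for numerical stability
            return m + np.log(np.sum(np.exp(rho * (values - m)))) / rho

        stresses = np.array([180.0, 240.0, 310.0, 295.0, 150.0])  # hypothetical von Mises stresses
        for rho in (5, 50, 500):
            print(f"rho={rho:4d}: KS={ks_aggregate(stresses, rho):8.3f}  (max={stresses.max()})")
        # As rho increases, the KS value converges to the true maximum from above.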

  10. Simple method for quick estimation of aquifer hydrogeological parameters

    Science.gov (United States)

    Ma, C.; Li, Y. Y.

    2017-08-01

    The development of simple and accurate methods to determine aquifer hydrogeological parameters is of importance for groundwater resources assessment and management. Addressing the problem of estimating aquifer parameters from unsteady pumping test data, a fitting function for the Theis well function was proposed using a fitting optimization method, and a unitary linear regression equation was then established. The aquifer parameters can be obtained by solving for the coefficients of the regression equation. The application of the proposed method is illustrated using two published data sets. Error statistics and analysis of the pumping drawdown show that the method proposed in this paper yields quick and accurate estimates of the aquifer parameters, and can reliably identify them from both long-distance observed drawdowns and early drawdowns. It is hoped that the proposed method will be helpful for practicing hydrogeologists and hydrologists.
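
    The paper's specific fitting function is not reproduced in this record; as an illustration of the same idea - reducing the Theis solution to a linear regression whose coefficients yield the aquifer parameters - the sketch below uses the classical Cooper-Jacob straight-line approximation, s = (2.30Q/4*pi*T) * log10(2.25*T*t/(r^2*S)), fitting drawdown against log10(t). All data values are synthetic.

        import numpy as np

        Q = 0.032      # pumping rate (m^3/s), assumed
        r = 60.0       # distance to observation well (m), assumed

        # Synthetic late-time drawdowns s (m) observed at times t (s).
        t = np.array([600, 1200, 2400, 4800, 9600, 19200], dtype=float)
        s = np.array([0.52, 0.64, 0.77, 0.89, 1.02, 1.14])

        # Linear regression of s on log10(t): s = a*log10(t) + b.
        a, b = np.polyfit(np.log10(t), s, 1)

        T = 2.30 * Q / (4 * np.pi * a)        # transmissivity from slope per log cycle
        t0 = 10 ** (-b / a)                   # intercept time where s = 0
        S = 2.25 * T * t0 / r**2              # storativity from the intercept

        print(f"T = {T:.4e} m^2/s, S = {S:.3e}")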

  11. Toward a Unified Validation Framework in Mixed Methods Research

    Science.gov (United States)

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  12. Validation of qualitative microbiological test methods

    NARCIS (Netherlands)

    IJzerman-Boon, Pieta C.; van den Heuvel, Edwin R.

    2015-01-01

    This paper considers a statistical model for the detection mechanism of qualitative microbiological test methods with a parameter for the detection proportion (the probability to detect a single organism) and a parameter for the false positive rate. It is demonstrated that the detection proportion
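
    The two-parameter detection mechanism described here can be sketched as follows: with detection proportion p (per organism) and false positive rate q, the probability of a positive result for a sample containing n organisms is commonly modeled as P(pos) = 1 - (1 - q)(1 - p)^n. The parameter values below are illustrative assumptions, not estimates from the paper.

        import numpy as np

        def p_positive(n_organisms, p_detect, q_false_pos):
            """Probability of a positive test result for a sample with n organisms."""
            return 1.0 - (1.0 - q_false_pos) * (1.0 - p_detect) ** n_organisms

        p, q = 0.7, 0.01             # assumed detection proportion and false positive rate
        for n in (0, 1, 2, 5):
            print(f"n={n}: P(positive) = {p_positive(n, p, q):.3f}")
        # n=0 recovers the false positive rate; P rises quickly with the organism count.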

  13. Research on filter’s parameter selection based on PROMETHEE method

    Science.gov (United States)

    Zhu, Hui-min; Wang, Hang-yu; Sun, Shi-yan

    2018-03-01

    The selection of filter parameters in target recognition was studied in this paper. The PROMETHEE method was applied to the optimization problem of Gabor filter parameter decisions, and a correspondence model of the elementary relations between the two methods was established. Taking the identification of a military target as an example, the filter parameter decision problem was simulated and calculated with PROMETHEE. The results showed that using the PROMETHEE method for the selection of filter parameters is more scientific, since the human disturbance introduced by expert judgment and empirical methods can be avoided in this way. The method can serve as a reference for decisions on the parameter configuration scheme of the filter.
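
    A minimal sketch of PROMETHEE-style outranking applied to choosing among candidate filter parameter sets: score each pair of alternatives per criterion with a preference function, weight and aggregate into outranking flows, and rank by net flow. The criteria, weights, and the simple "usual" preference function are illustrative assumptions, not the paper's setup.

        import numpy as np

        # Rows: candidate Gabor parameter sets; columns: criteria (to be maximized),
        # e.g., recognition rate, robustness score, inverse of computation time.
        A = np.array([
            [0.91, 0.70, 0.55],
            [0.88, 0.82, 0.60],
            [0.93, 0.65, 0.40],
        ])
        w = np.array([0.5, 0.3, 0.2])          # assumed criterion weights, sum to 1

        n = A.shape[0]
        pi = np.zeros((n, n))                   # aggregated preference of a over b
        for a in range(n):
            for b in range(n):
                if a != b:
                    d = A[a] - A[b]
                    pref = (d > 0).astype(float)   # "usual" preference function
                    pi[a, b] = w @ pref

        phi_plus = pi.sum(axis=1) / (n - 1)     # positive outranking flow
        phi_minus = pi.sum(axis=0) / (n - 1)    # negative outranking flow
        net = phi_plus - phi_minus              # PROMETHEE II net flow
        print("ranking (best first):", np.argsort(-net))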

  14. Method validation using weighted linear regression models for quantification of UV filters in water samples.

    Science.gov (United States)

    da Silva, Claudia Pereira; Emídio, Elissandro Soares; de Marchi, Mary Rosa Rodrigues

    2015-01-01

    This paper describes the validation of a method consisting of solid-phase extraction followed by gas chromatography-tandem mass spectrometry for the analysis of the ultraviolet (UV) filters benzophenone-3, ethylhexyl salicylate, ethylhexyl methoxycinnamate and octocrylene. The method validation criteria included evaluation of selectivity, analytical curve, trueness, precision, limits of detection and limits of quantification. The non-weighted linear regression model has traditionally been used for calibration, but it is not necessarily the optimal model in all cases. Because the assumption of homoscedasticity was not met for the analytical data in this work, a weighted least squares linear regression was used for the calibration method. The evaluated analytical parameters were satisfactory for the analytes and showed recoveries at four fortification levels between 62% and 107%, with relative standard deviations less than 14%. The detection limits ranged from 7.6 to 24.1 ng L-1. The proposed method was used to determine the amount of UV filters in water samples from water treatment plants in Araraquara and Jau in São Paulo, Brazil. Copyright © 2014 Elsevier B.V. All rights reserved.
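
    A short sketch of the weighted least squares calibration described here: when the calibration variance grows with concentration (heteroscedasticity), each point is weighted by the inverse of its variance - a common empirical choice is w = 1/x^2 - before fitting the line. The data and the weighting choice below are illustrative, not the paper's values.

        import numpy as np

        # Hypothetical calibration data: concentration x (ng/L) vs instrument response y.
        x = np.array([10.0, 50.0, 100.0, 250.0, 500.0, 1000.0])
        y = np.array([0.0021, 0.0105, 0.0198, 0.0512, 0.0987, 0.2050])

        w = 1.0 / x**2                         # empirical weights for heteroscedastic data

        # Weighted least squares for y = b0 + b1*x via the normal equations.
        X = np.column_stack([np.ones_like(x), x])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        print(f"intercept = {beta[0]:.3e}, slope = {beta[1]:.3e}")

        # Back-calculate an unknown sample's concentration from its response.
        y_sample = 0.0400
        print(f"estimated concentration: {(y_sample - beta[0]) / beta[1]:.1f} ng/L")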

  15. Resonance Damping and Parameter Design Method for LCL-LC Filter Interfaced Grid-Connected Photovoltaic Inverters

    DEFF Research Database (Denmark)

    Li, Zipeng; Jiang, Aiting; Shen, Pan

    2016-01-01

    The LCL-LC filter has a strong high-frequency harmonics attenuation ability, but its resonance problem affects the system stability remarkably. In this paper, active damping based on the capacitor voltage feedback is proposed, using the concept of an equivalent virtual impedance in parallel with the capacitor. With the consideration of system delay, this paper presents a systematic design method for the LCL-LC filtered grid-connected photovoltaic (PV) system. With this method, controller parameters and the active damping feedback coefficient are easily obtained by specifying the system stability and dynamic performance indices, and it is more convenient to optimize the system performance according to the predefined satisfactory region. Finally, the simulation results are presented to validate the proposed design method and control scheme.

  16. Pose measurement method with six parameters for microassembly based on an optical micrometer

    Science.gov (United States)

    Ye, Xin; Wang, Qiang; Zhang, Zhi-jing; Sun, Yuan; Zhang, Xiao-feng

    2009-07-01

    This paper presents a new pose measurement method for microminiature parts that is capable of transforming one-dimensional (1D) contour sizes obtained by an optical micrometer into three-dimensional (3D) data with six parameters for microassembly. Pose measurement is one of the most important processes for microminiature part alignment and insertion in microassembly. During the past few years, researchers have developed microassembly systems focusing on visual identification to obtain two- or three-dimensional data with no more than three parameters; scanning electron microscopes (SEM), optical microscopes, and stereomicroscopes are applied in these systems. However, as the structures of microminiature parts become increasingly complex, six parameters are needed to represent their position and orientation. Firstly, the pose measurement model is established based on an introduction of the measuring objects and the measuring principle of the optical micrometer. The measuring objects are microminiature parts with complex 3D structure. Two groups of two-dimensional (2D) data are gathered at two different measurement positions, and the part pose with six parameters is then calculated, including three position parameters of a feature point of the part and three orientation parameters of the part axis. Secondly, the pose measurement process for a small shaft, the determination of vertical orientation, and the derivation of the position parameters are presented. 2D data are gathered by scanning the generatrix of the part, and valid data are extracted and saved in arrays. A vertical orientation criterion is proposed to determine whether the part is parallel to the Z-axis of the coordinate frame; if not, the 2D data are fitted to a linear equation using a least squares algorithm, and the orientation parameters are then calculated. The center of the part end (CPE) is selected as the feature point of the part, and its position parameters are extracted from the two groups of 2D data. Finally, a fast pose measurement device is developed and representative

  17. METAHEURISTIC OPTIMIZATION METHODS FOR PARAMETERS ESTIMATION OF DYNAMIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    V. Panteleev Andrei

    2017-01-01

    Full Text Available The article considers the use of the metaheuristic constrained global optimization methods "Big Bang - Big Crunch", "Fireworks Algorithm", and "Grenade Explosion Method" for the estimation of parameters of dynamic systems described by algebraic-differential equations. Parameter estimation is based upon observations of the mathematical model's behavior; the values are derived by minimizing a criterion that describes the total squared error of the state vector coordinates with respect to the precise values observed at different points in time. A parallelepiped-type restriction is imposed on the parameter values. The metaheuristic methods used for finding the constrained global extremum do not guarantee the result, but allow a solution of rather good quality to be obtained in an acceptable amount of time. The algorithm for applying the metaheuristic methods is given. Alongside the obvious methods for solving algebraic-differential equation systems, it is convenient to use implicit methods for solving ordinary differential equation systems. Two example problems of parameter estimation are given, differing in their mathematical models: in the first, a linear mathematical model describes the change of chemical reaction parameters, and in the second, a nonlinear predator-prey model describes the population dynamics of the two species. For each of the examples, calculation results from all three optimization methods are given, together with recommendations on how to choose the methods' parameters. The obtained numerical results demonstrate the efficiency of the proposed approach. The derived approximate parameter points differ only slightly from the best known solutions, which were obtained by other means. To refine the results, one should apply hybrid schemes that combine classical optimization methods of zeroth, first and second orders and

  18. Discrimination of Clover and Citrus Honeys from Egypt According to Floral Type Using Easily Assessable Physicochemical Parameters and Discriminant Analysis: An External Validation of the Chemometric Approach

    Directory of Open Access Journals (Sweden)

    Ioannis K. Karabagias

    2018-05-01

    Full Text Available Twenty-two honey samples, namely clover and citrus honeys, were collected from the greater Cairo area during the harvesting year 2014-2015. The main purpose of the present study was to characterize the aforementioned honey types and to investigate whether the use of easily assessable physicochemical parameters, including color attributes, in combination with chemometrics could differentiate honey floral origin. Parameters taken into account were: pH, electrical conductivity, ash, free acidity, lactonic acidity, total acidity, moisture content, total sugars (degrees Brix, °Bx), total dissolved solids and their ratio to total acidity, salinity, CIELAB color parameters, along with browning index values. Results showed that all honey samples analyzed met the European quality standards set for honey and had variations in the aforementioned physicochemical parameters depending on floral origin. Application of linear discriminant analysis showed that eight physicochemical parameters, including color, could classify Egyptian honeys according to floral origin (p < 0.05). The correct classification rate was 95.5% using the original method and 90.9% using the cross-validation method. The discriminatory ability of the developed model was further validated using unknown honey samples. The overall correct classification rate was not affected. Specific physicochemical parameter analysis in combination with chemometrics has the potential to enhance the differences in floral honeys produced in a given geographical zone.
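
    A sketch of the chemometric step: linear discriminant analysis on physicochemical features, with both resubstitution ("original method") and cross-validated classification rates. The feature values are synthetic placeholders, not the study's measurements.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(3)
        # Synthetic stand-ins for 8 physicochemical parameters of clover (0) and citrus (1) honeys.
        clover = rng.normal([4.0, 0.45, 0.15, 25, 5, 30, 17, 80], 0.5, size=(11, 8))
        citrus = rng.normal([3.6, 0.30, 0.10, 20, 4, 24, 18, 79], 0.5, size=(11, 8))
        X = np.vstack([clover, citrus])
        y = np.array([0] * 11 + [1] * 11)

        lda = LinearDiscriminantAnalysis()
        resub = lda.fit(X, y).score(X, y)                         # "original method" rate
        cv = cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()  # cross-validated rate
        print(f"resubstitution: {resub:.1%}, leave-one-out CV: {cv:.1%}")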

  19. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and identifies the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).

  20. Application of validation data for assessing spatial interpolation methods for 8-h ozone or other sparsely monitored constituents

    International Nuclear Information System (INIS)

    Joseph, John; Sharif, Hatim O.; Sunil, Thankam; Alamgir, Hasanat

    2013-01-01

    The adverse health effects of high concentrations of ground-level ozone are well-known, but estimating exposure is difficult due to the sparseness of urban monitoring networks. This sparseness discourages the reservation of a portion of the monitoring stations for validation of interpolation techniques precisely when the risk of overfitting is greatest. In this study, we test a variety of simple spatial interpolation techniques for 8-h ozone with thousands of randomly selected subsets of data from two urban areas with monitoring stations sufficiently numerous to allow for true validation. Results indicate that ordinary kriging with only the range parameter calibrated in an exponential variogram is the generally superior method, and yields reliable confidence intervals. Sparse data sets may contain sufficient information for calibration of the range parameter even if the Moran I p-value is close to unity. R script is made available to apply the methodology to other sparsely monitored constituents. -- Highlights: •Spatial interpolation methods were tested for thousands of sparse ozone data sets. •A particular single-parameter ordinary kriging was found to be generally superior. •A Moran I p-value in the training set is not helpful in selecting the method. •The sum of the squares of the residuals is helpful in selecting the method. •R script is available for application to other sites and constituents. -- Spatial interpolation methods were compared for thousands of subsets of data for 8-h ozone using R script applicable to other constituents as well, and available from the authors

  1. A proposal of parameter determination method in the residual strength degradation model for the prediction of fatigue life (I)

    International Nuclear Information System (INIS)

    Kim, Sang Tae; Jang, Seong Soo

    2001-01-01

    Static and fatigue tests have been carried out to verify the validity of a generalized residual strength degradation model, and a new method of parameter determination in the model is verified experimentally to account for the effect of tension-compression fatigue loading on spheroidal graphite cast iron. It is shown that the correlation between the experimental results and the theoretical prediction of the statistical distribution of fatigue life using the proposed method is very reasonable. Furthermore, the correlation between the theoretical prediction and the experimental fatigue life results for tension-tension fatigue data in composite material also appears to be reasonable. Therefore, the proposed method is better suited to the determination of the parameters than the maximum likelihood method and the minimization technique.

  2. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

    A validated modified Lycopodium spore method has been developed for the simple and rapid quantification of powdered herbal drugs. The Lycopodium spore method was performed on the ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  3. Statistics of Parameter Estimates: A Concrete Example

    KAUST Repository

    Aguilar, Oscar

    2015-01-01

    © 2015 Society for Industrial and Applied Mathematics. Most mathematical models include parameters that need to be determined from measurements. The estimated values of these parameters and their uncertainties depend on assumptions made about noise levels, models, or prior knowledge. But what can we say about the validity of such estimates, and the influence of these assumptions? This paper is concerned with methods to address these questions, and for didactic purposes it is written in the context of a concrete nonlinear parameter estimation problem. We will use the results of a physical experiment conducted by Allmaras et al. at Texas A&M University [M. Allmaras et al., SIAM Rev., 55 (2013), pp. 149-167] to illustrate the importance of validation procedures for statistical parameter estimation. We describe statistical methods and data analysis tools to check the choices of likelihood and prior distributions, and provide examples of how to compare Bayesian results with those obtained by non-Bayesian methods based on different types of assumptions. We explain how different statistical methods can be used in complementary ways to improve the understanding of parameter estimates and their uncertainties.

  4. DEM modeling of ball mills with experimental validation: influence of contact parameters on charge motion and power draw

    Science.gov (United States)

    Boemer, Dominik; Ponthot, Jean-Philippe

    2017-01-01

    Discrete element method simulations of a 1:5-scale laboratory ball mill are presented in this paper to study the influence of the contact parameters on the charge motion and the power draw. The position density limit is introduced as an efficient mathematical tool to describe and to compare the macroscopic charge motion in different scenarios, i.a. with different values of the contact parameters. While the charge motion and the power draw are relatively insensitive to the stiffness and the damping coefficient of the linear spring-slider-damper contact law, the coefficient of friction has a strong influence since it controls the sliding propensity of the charge. Based on the experimental calibration and validation by charge motion photographs and power draw measurements, the descriptive and predictive capabilities of the position density limit and the discrete element method are demonstrated, i.e. the real position of the charge is precisely delimited by the respective position density limit and the power draw can be predicted with an accuracy of about 5 %.
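
    To make the linear spring-slider-damper contact law concrete, the sketch below evaluates the normal force from a linear spring and dashpot and caps the tangential force by Coulomb friction - the sliding behavior that makes the coefficient of friction the dominant parameter. The stiffness, damping, and friction values are arbitrary illustrative numbers, and the tangential term is a deliberately simplified stand-in.

        import numpy as np

        def contact_force(overlap, d_overlap, v_tangential, k=1e5, c=50.0, mu=0.3):
            """Linear spring-slider-damper contact law (illustrative values)."""
            f_normal = max(k * overlap + c * d_overlap, 0.0)   # spring + dashpot, no adhesion
            f_tangential_trial = 0.8 * k * v_tangential        # simplified tangential term
            f_tangential = np.clip(f_tangential_trial, -mu * f_normal, mu * f_normal)
            return f_normal, f_tangential                      # sliding occurs when clipped

        fn, ft = contact_force(overlap=1e-4, d_overlap=0.01, v_tangential=0.05)
        print(f"normal: {fn:.2f} N, tangential (friction-capped): {ft:.2f} N")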

  5. Development and validation of a spectrophotometry method for the determination of histamine in fresh tuna (Thunnus tunna)

    International Nuclear Information System (INIS)

    Chacon-Silva, Fainier; Barquero-Quiros, Miriam

    2002-01-01

    Histamine in foods can promote allergic reactions in sensitive persons. A colorimetric microscale method for histamine determination was developed and validated. Cu2+-histamine chelation occurs at pH 9.5; dichloromethane extraction of the complex, as the salt with tetrabromophenolphthalein ethyl ester, allows photometric quantitation at 515 nm. The validation of the micro method was accomplished through its performance parameters: detection limit, quantitation limit, sensitivity, linearity, precision, and recovery. The methodology was applied to twenty raw tuna samples collected in the San Jose metropolitan area. It was found that 45% of the analyzed samples had a histamine content in the range of 100-200 mg/kg, a finding that indicates bacterial contamination, and 15% of the analyzed samples were over the 500 mg/kg FDA level of sanitary risk. (Author) [es]

  6. Parameter Estimation in Continuous Time Domain

    Directory of Open Access Journals (Sweden)

    Gabriela M. ATANASIU

    2016-12-01

    Full Text Available This paper presents the application of a continuous-time parameter estimation method for estimating the structural parameters of a real bridge structure. To illustrate the method, two case studies of a bridge pile located in a highly seismic risk area are considered, for which the structural parameters for mass, damping and stiffness are estimated. The estimation process is followed by validation of the analytical results and comparison with the measurement data. Further benefits and applications of the continuous-time parameter estimation method in civil engineering are presented in the final part of the paper.

  7. Development and validation of an ICP-MS method applied to the determination of uranium in urine

    International Nuclear Information System (INIS)

    Ferreira, Isabela M. de S.; Castro, Marcelo X. de; Fidelis, Valdir dos S.; Santos, Osvaldir P.

    2013-01-01

    The objective of this study is to propose a new methodology for uranium analysis in urine using inductively coupled plasma mass spectrometry (ICP-MS). The urine samples were provided by nuclear facility workers, specifically employees of Industrias Nucleares do Brasil - INB, the company responsible for the production of nuclear fuel in Brazil. At nuclear fuel facility sites, ensuring the safety of employees is essential due to the risks involved in such activity, mainly exposure to radiation. On account of that, there are federal laws that limit ionizing radiation exposure among workers, and verification of compliance with the regulatory requirements is performed through monitoring programs. In vitro bioassay is a method of monitoring the individual worker by the analysis of radionuclides in excreta samples (urine or feces). The determination of uranium in urine was performed without any sample pretreatment, and the method was validated by evaluating a series of operational parameters set by the INMETRO DOQ-CGCRE-008 guide - Guideline on Validation of Analytical Methods. A limit of detection of 0.9 ng L-1 was achieved. For the evaluation of the trueness/recovery of the method, the NIST Standard Reference Material SRM 2670a - Toxic Elements in Urine (Freeze-Dried) was used. The relative error calculated for the certified reference material was 1.96%. The reproducibility of the developed method was confirmed through intercomparison analyses. Through the evaluation of these performance parameters, the ability of the developed method to determine uranium with high sensitivity and reproducibility was verified, enabling its use for either environmental or occupational exposures. (author)

  8. Parameter estimation in X-ray astronomy

    International Nuclear Information System (INIS)

    Lampton, M.; Margon, B.; Bowyer, S.

    1976-01-01

    The problems of model classification and parameter estimation are examined, with the objective of establishing the statistical reliability of inferences drawn from X-ray observations. For testing the validity of classes of models, the procedure based on minimizing the χ² statistic is recommended; it provides a rejection criterion at any desired significance level. Once a class of models has been accepted, a related procedure based on the increase of χ² gives a confidence region for the values of the model's adjustable parameters. The procedure allows the confidence level to be chosen exactly, even for highly nonlinear models. Numerical experiments confirm the validity of the prescribed technique. The χ²_min + 1 error estimation method is evaluated and found unsuitable when several parameter ranges are to be derived, because it substantially underestimates their joint errors. The ratio-of-variances method, while formally correct, gives parameter confidence regions which are more variable than necessary.
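
    A quick numerical illustration of the χ² procedures described above: fit a one-parameter model by minimizing χ², then scan the parameter until χ² rises one unit above the minimum to obtain a 68% confidence interval. The data, model, and names below are invented for the sketch; this is not the paper's analysis.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar, brentq

    # Hypothetical observations with Gaussian errors (illustrative only)
    x = np.linspace(1.0, 10.0, 20)
    sigma = 0.5
    rng = np.random.default_rng(0)
    y = 5.0 / x + rng.normal(0.0, sigma, x.size)  # spectrum-like data, true norm = 5

    def chi2(norm):
        """Chi-square of the one-parameter model y = norm / x."""
        return np.sum(((y - norm / x) / sigma) ** 2)

    # Fit: minimize chi-square over the single parameter
    fit = minimize_scalar(chi2, bounds=(0.1, 20.0), method="bounded")
    norm_hat, chi2_min = fit.x, fit.fun

    # 68% confidence interval for ONE parameter: chi2 = chi2_min + 1
    upper = brentq(lambda n: chi2(n) - (chi2_min + 1.0), norm_hat, 20.0)
    lower = brentq(lambda n: chi2(n) - (chi2_min + 1.0), 0.1, norm_hat)
    print(f"norm = {norm_hat:.3f} (+{upper - norm_hat:.3f} / -{norm_hat - lower:.3f})")
    ```

    As the abstract cautions, the +1 threshold applies to one parameter at a time; joint confidence regions for several parameters require a larger Δχ² increment (about 2.30 for two parameters at 68% confidence).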

  9. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.

  10. Dual ant colony operational modal analysis parameter estimation method

    Science.gov (United States)

    Sitarz, Piotr; Powałka, Bartosz

    2018-01-01

    Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Unlike experimental modal analysis, the input signal is generated by the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some operate in the time domain, others in the frequency domain; the former use correlation functions, the latter spectral density functions. While some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.

  11. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

    The present work describes the methodology followed for the application of the ISO 17025 standard in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to achieve the management part. For the technical requirements, internal method validation was applied through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty; participation in IAEA proficiency tests assured the external method validation, especially as the gamma spectroscopy laboratory is a member of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). Some of these results are presented in this paper. (author)

  12. Method validation for strobilurin fungicides in cereals and fruit

    DEFF Research Database (Denmark)

    Christensen, Hanne Bjerre; Granby, Kit

    2001-01-01

    Strobilurins are a new class of fungicides that are active against a broad spectrum of fungi. In the present work a GC method for analysis of strobilurin fungicides was validated. The method was based on extraction with ethyl acetate/cyclohexane, clean-up by gel permeation chromatography (GPC) and determination of the content by gas chromatography (GC) with electron capture (EC-), nitrogen/phosphorous (NP-), and mass spectrometric (MS-) detection. Three strobilurins, azoxystrobin, kresoxim-methyl and trifloxystrobin, were validated on three matrices: wheat, apple and grapes. The validation was based...

  13. A novel method for the extraction of local gravity wave parameters from gridded three-dimensional data: description, validation, and application

    Directory of Open Access Journals (Sweden)

    L. Schoon

    2018-05-01

    Full Text Available For the local diagnosis of wave properties, we develop, validate, and apply a novel method based on the Hilbert transform, called Unified Wave Diagnostics (UWaDi). It provides the wave amplitude and three-dimensional wave number at any grid point for gridded three-dimensional data. UWaDi is validated for a synthetic test case comprising two different wave packets. In comparison with other methods, the performance of UWaDi is very good with respect to wave properties and their location. For a first practical application of UWaDi, a minor sudden stratospheric warming on 30 January 2016 is chosen. Specifying the diagnostics for hydrostatic inertia–gravity waves in analyses from the European Centre for Medium-Range Weather Forecasts, we detect the local occurrence of gravity waves throughout the middle atmosphere. The local wave characteristics are discussed in terms of vertical propagation using the diagnosed local amplitudes and wave numbers. We also note hints of local inertia–gravity wave generation by the stratospheric jet, based on the detection of shallow slow waves in the vicinity of its exit region.
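
    A one-dimensional toy version of such a Hilbert-transform diagnostic is easy to sketch: the analytic signal yields a local amplitude envelope, and the gradient of its phase yields a local wavenumber. This only illustrates the principle behind UWaDi, not the published three-dimensional algorithm; all data below are synthetic.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    # Synthetic wave packet: amplitude-modulated carrier (illustrative)
    x = np.linspace(0.0, 100.0, 2000)          # spatial coordinate, km
    k0 = 1.2                                   # carrier wavenumber, rad/km
    u = np.exp(-((x - 50.0) / 12.0) ** 2) * np.cos(k0 * x)

    analytic = hilbert(u)                      # u + i * H(u)
    amplitude = np.abs(analytic)               # local wave amplitude (envelope)
    phase = np.unwrap(np.angle(analytic))      # local phase
    wavenumber = np.gradient(phase, x)         # local wavenumber = d(phase)/dx

    # In the packet core the diagnosed wavenumber recovers the carrier value
    core = (x > 40) & (x < 60)
    print(f"mean diagnosed k in core: {wavenumber[core].mean():.3f} (true {k0})")
    ```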

  14. DBCG hypo trial validation of radiotherapy parameters from a national data bank versus manual reporting.

    Science.gov (United States)

    Brink, Carsten; Lorenzen, Ebbe L; Krogh, Simon Long; Westberg, Jonas; Berg, Martin; Jensen, Ingelise; Thomsen, Mette Skovhus; Yates, Esben Svitzer; Offersen, Birgitte Vrou

    2018-01-01

    The current study evaluates the data quality achievable using a national data bank for reporting radiotherapy parameters, relative to the classical manual reporting method for selected parameters. The comparison is based on 1522 Danish patients of the DBCG hypo trial with data stored in the Danish national radiotherapy data bank. In line with standard DBCG trial practice, selected parameters were also reported manually to the DBCG database. Categorical variables are compared using contingency tables, and continuous parameters are compared in scatter plots. For categorical variables, 25 differences between the data bank and the manual values were located. Of these, 23 were due to mistakes in the manually reported value, whilst the remaining two were wrong classifications in the data bank. The wrong classifications in the data bank were related to missing dose information: the two patients had been treated with an electron boost based on a manual calculation, so the data were not exported to the data bank, and this was not detected prior to comparison with the manual data. For a few database fields in the manual data, an ambiguity in the parameter definition of the specific field is seen in the data. This was not the case for the data bank, which extracts all data consistently. In terms of data quality the data bank is superior to manually reported values. However, resources need to be allocated to checking the validity of the available data and to ensuring that all relevant data are present. The data bank contains more detailed information, and thus facilitates research related to the actual dose distribution in the patients.

  15. Validation of an analytical method for determining halothane in urine as an instrument for evaluating occupational exposure

    International Nuclear Information System (INIS)

    Gonzalez Chamorro, Rita Maria; Jaime Novas, Arelis; Diaz Padron, Heliodora

    2010-01-01

    Occupational exposure to harmful substances may produce significant changes in the normal physiology of the organism when adequate safety measures are not taken in a workplace where the risk is present. Among the chemical risks that may affect workers' health are the inhalable anesthetic agents. With the objective of taking the first steps toward introducing an epidemiological surveillance system for this personnel, an analytical method for determining this anesthetic in urine was validated under the instrumental conditions available in our laboratory. To carry out this validation the following parameters were taken into account: specificity, linearity, precision, accuracy, detection limit and quantification limit; the uncertainty of the method was also calculated. In the validation procedure it was found that the technique is specific and precise; the detection limit was 0.118 μg/L and the quantification limit 0.354 μg/L. The global uncertainty was 0.243 and the expanded uncertainty 0.486. The validated method, together with the subsequent introduction of biological exposure limits, will serve as an auxiliary diagnostic tool allowing periodic control of personnel exposure.

  16. DEVELOPMENT AND VALIDATION OF NUMERICAL METHOD FOR STRENGTH ANALYSIS OF LATTICE COMPOSITE FUSELAGE STRUCTURES

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available Lattice composite fuselage structures are developed as an alternative to conventional composite structures based on laminated skin and stiffeners. The layout of lattice structures realizes the advantages of current composite materials to the maximal extent while minimizing their main shortcomings, providing higher weight efficiency than conventional analogues. The development and creation of lattice composite structures requires novel methods of strength analysis, since conventional methods, as a rule, are aimed at the strength analysis of thin-walled elements and do not provide confident estimation of the local strength of highly loaded unidirectional composite ribs. In the present work a method for the operative strength analysis of lattice composite structures is presented, based on specialized FE-models of unidirectional composite ribs and their intersections. Within the method, every rib is modeled by a caisson structure consisting of an arbitrary number of flanges and webs, modeled by membrane finite elements. The parameters of the flanges and webs are calculated automatically from the condition of equality between the stiffness characteristics of the real rib and the model. This allows local strength analysis of the highly loaded ribs of a lattice structure without the use of three-dimensional finite elements, which shortens calculation time and considerably simplifies the analysis of the calculation results. For validation of the suggested method, the results of experimental investigations of a full-scale prototype shell of a lattice composite fuselage section were used. The prototype of the lattice section was manufactured at CRISM and tested at TsAGI within a number of Russian and international scientific projects. The results of validation show that the suggested method provides highly operative strength analysis, keeping...

  17. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Science.gov (United States)

    2011-05-18

    ... d_m = the mean of the paired sample differences. n = total number of paired samples. 7.4.2 t-Test... being compared to a validated test method as part of the Method 301 validation and an audit sample for... tighten the acceptance criteria for the precision of candidate alternative test methods. One commenter...

  18. A Rapid, Simple, and Validated RP-HPLC Method for Quantitative Analysis of Levofloxacin in Human Plasma

    Directory of Open Access Journals (Sweden)

    Dion Notario

    2017-04-01

    Full Text Available To conduct a bioequivalence study for a copy product of levofloxacin (LEV), a simple and validated analytical method was needed, but the previously developed methods were too complicated. For this reason, a simple and rapid high performance liquid chromatography method was developed and validated for LEV quantification in human plasma. Chromatographic separation was performed under isocratic elution on a Luna Phenomenex® C18 (150 × 4.6 mm, 5 µm) column. The mobile phase comprised acetonitrile, methanol, and 25 mM phosphate buffer adjusted to pH 3.0 (13:7:80 v/v/v), pumped at a flow rate of 1.5 mL/min. Detection was performed with a UV detector at a wavelength of 280 nm. Samples were prepared by adding acetonitrile followed by centrifugation to precipitate plasma protein, then successively by evaporation and reconstitution steps. The optimized method meets the requirements of the validation parameters, which included linearity (r = 0.995), sensitivity (LLOQ and LOD of 1.77 and 0.57 µg/mL, respectively), accuracy (%error ≤ 12% above the LLOQ and ≤ 20% at the LLOQ), precision (RSD ≤ 9%), and robustness over the range of 1.77-28.83 µg/mL. Therefore, the method can be used for routine analysis of LEV in human plasma as well as in bioequivalence studies of LEV.
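
    Several of the validation figures quoted here (linearity, LOD, LOQ) are commonly derived from the calibration line. A minimal sketch of that calculation, using the conventional ICH 3.3·σ/S and 10·σ/S formulas and invented calibration data (not the authors' values), follows.

    ```python
    import numpy as np

    # Hypothetical calibration data: concentration (ug/mL) vs peak area
    conc = np.array([1.77, 5.0, 10.0, 15.0, 20.0, 28.83])
    area = np.array([12.1, 34.5, 68.9, 103.8, 137.0, 199.5])

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (conc.size - 2))

    r = np.corrcoef(conc, area)[0, 1]          # linearity check
    lod = 3.3 * residual_sd / slope            # ICH detection limit
    loq = 10.0 * residual_sd / slope           # ICH quantitation limit
    print(f"r = {r:.4f}, LOD = {lod:.2f}, LOQ = {loq:.2f} ug/mL")
    ```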

  19. A Voxel-Based Method for Automated Identification and Morphological Parameters Estimation of Individual Street Trees from Mobile Laser Scanning Data

    Directory of Open Access Journals (Sweden)

    Hongxing Liu

    2013-01-01

    Full Text Available As an important component of urban vegetation, street trees play an important role in maintenance of environmental quality, aesthetic beauty of urban landscape, and social service for inhabitants. Acquiring accurate and up-to-date inventory information for street trees is required for urban horticultural planning, and municipal urban forest management. This paper presents a new Voxel-based Marked Neighborhood Searching (VMNS method for efficiently identifying street trees and deriving their morphological parameters from Mobile Laser Scanning (MLS point cloud data. The VMNS method consists of six technical components: voxelization, calculating values of voxels, searching and marking neighborhoods, extracting potential trees, deriving morphological parameters, and eliminating pole-like objects other than trees. The method is validated and evaluated through two case studies. The evaluation results show that the completeness and correctness of our method for street tree detection are over 98%. The derived morphological parameters, including tree height, crown diameter, diameter at breast height (DBH, and crown base height (CBH, are in a good agreement with the field measurements. Our method provides an effective tool for extracting various morphological parameters for individual street trees from MLS point cloud data.
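
    The opening voxelization step of VMNS amounts to binning the point cloud on a regular 3-D grid and assigning each occupied voxel a value, here taken (as an assumption) to be the number of points it contains. A minimal NumPy sketch with synthetic points and an illustrative voxel size:

    ```python
    import numpy as np

    def voxelize(points, voxel_size=0.5):
        """Map an (N, 3) point cloud to voxel indices and per-voxel point counts."""
        indices = np.floor(points / voxel_size).astype(np.int64)   # voxel index per point
        voxels, counts = np.unique(indices, axis=0, return_counts=True)
        return voxels, counts                                      # occupied voxels, their "values"

    # Illustrative use on random MLS-like points (meters)
    rng = np.random.default_rng(1)
    cloud = rng.uniform(0.0, 10.0, size=(10_000, 3))
    voxels, counts = voxelize(cloud, voxel_size=0.5)
    print(f"{voxels.shape[0]} occupied voxels, max points per voxel: {counts.max()}")
    ```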

  20. The Value of Qualitative Methods in Social Validity Research

    Science.gov (United States)

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based reading…

  1. Calculation of Optical Parameters of Liquid Crystals

    Science.gov (United States)

    Kumar, A.

    2007-12-01

    Validation of a modified four-parameter model describing the temperature effect on liquid crystal refractive indices is reported in the present article. The model is based upon the Vuks equation. Experimental data of ordinary and extraordinary refractive indices for two liquid crystal samples, MLC-9200-000 and MLC-6608, are used to validate the above-mentioned theoretical model. Using these experimental data, the birefringence, order parameter, normalized polarizabilities, and temperature gradient of the refractive indices are determined. Two methods are adopted for the determination of the order parameter: direct use of birefringence measurements and Haller's extrapolation procedure. Both approaches to order parameter calculation are compared. The temperature dependences of all these parameters are discussed. A close agreement between theory and experiment is obtained.

  2. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  3. Development and validation of Ketorolac Tromethamine in eye drop formulation by RP-HPLC method

    Directory of Open Access Journals (Sweden)

    G. Sunil

    2017-02-01

    Full Text Available A simple, precise and accurate method was developed and validated for the analysis of Ketorolac Tromethamine in an eye drop formulation. Isocratic HPLC analysis was performed on a Kromasil C18 column (150 mm × 4.6 mm, 5 μm). The compound was separated with a mixture of methanol and ammonium dihydrogen phosphate buffer in the ratio 55:45 v/v (pH 3.0 adjusted with o-phosphoric acid) as the mobile phase at a flow rate of 1.5 mL min⁻¹. UV detection was performed at 314 nm using photodiode array detection. The retention time was found to be 6.01 min. The system suitability parameters, such as theoretical plate count, tailing and percentage RSD between six standard injections, were within limits. The method was validated according to ICH guidelines. Calibration was linear over the concentration range of 50-150 μg mL⁻¹, as indicated by a correlation coefficient (r) of 0.999. The robustness of the method was evaluated by deliberately altering the chromatographic conditions. The developed method is applicable for routine quantitative analysis.

  4. Validation of a multi-residue method for the determination of several antibiotic groups in honey by LC-MS/MS.

    Science.gov (United States)

    Bohm, Detlef A; Stachel, Carolin S; Gowik, Petra

    2012-07-01

    The presented multi-method was developed for the confirmation of 37 antibiotic substances from six antibiotic groups: macrolides, lincosamides, quinolones, tetracyclines, pleuromutilins and diaminopyrimidine derivatives. All substances were analysed simultaneously in a single analytical run with the same procedure, including extraction with buffer, clean-up by solid-phase extraction, and measurement by liquid chromatography tandem mass spectrometry in ESI+ mode. The method was validated on the basis of an in-house validation concept with factorial design, combining seven factors to check robustness in a concentration range of 5-50 μg kg⁻¹. The honeys used were of different types with regard to colour and origin. The values calculated for the validation parameters (decision limit CCα, range 7.5-12.9 μg kg⁻¹; detection capability CCβ, range 9.4-19.9 μg kg⁻¹; within-laboratory reproducibility RSD_wR, highest for tylvalosin at 21.4%; repeatability RSD_r, highest for tylvalosin at 21.1%; and recovery, range 92-106%) were acceptable and in agreement with the criteria of Commission Decision 2002/657/EC. The validation results showed that the method is applicable to the residue analysis of antibiotics in honey, for substances with and without recommended concentrations, although some changes were tested during validation to determine the robustness of the method.
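
    For substances with a permitted limit (PL), Commission Decision 2002/657/EC defines the decision limit and the detection capability essentially as one-sided confidence bounds built from the within-laboratory reproducibility s_wR at that limit. In simplified form (a common reading of the Decision, stated here as an assumption rather than the authors' exact calculation):

    ```latex
    CC_{\alpha} = \mathrm{PL} + 1.64\, s_{wR}, \qquad
    CC_{\beta}  = CC_{\alpha} + 1.64\, s_{wR}
    ```

    For banned substances without a permitted limit, CCα is instead derived from blank measurements at the 1% error level (factor 2.33 instead of 1.64).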

  5. Validation of quantitative 1H NMR method for the analysis of pharmaceutical formulations

    International Nuclear Information System (INIS)

    Santos, Maiara da S.

    2013-01-01

    The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of ¹H NMR for quantitative analyses and an example of the method validation according to Resolution RE N. 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample. (author)

  6. Validation of an activity optimization method for nuclear medicine in planar studies

    Energy Technology Data Exchange (ETDEWEB)

    Perez D, M. [Central University of Las Villas, CEETI, Camajuani Road Km 5.5, Santa Clara 54830 Villa Clara (Cuba); Diaz R, O. [Institute for Sciences and Advanced Technologies (Cuba); Farias L, F. [Federal University of Pernambuco (Brazil)]. e-mail: mperez@uclv.edu.cu

    2006-07-01

    A method for optimizing the administered activity in static nuclear medicine studies is validated by comparison with ROC curves. Linear discriminant analysis of image quality in gamma cameras was the statistical technique applied. The constructed linear discriminant function has as its dependent variable the differentiated levels of image quality obtained by observer criterion. The independent parameters in the function were physical variables, such as signal-to-background and signal-to-noise ratios, obtained from the selection of regions of interest in images of a Jaszczak phantom corresponding to lesion and background sites. The percentage of cases correctly classified by the discriminant analysis was analyzed to grade the proposed discriminant method. The minimum administered activity that permits good image quality (that is, good results for the parameters selected by the discriminant function) can be proposed as an optimized activity for planar nuclear medicine studies. The method was tested using images from a Jaszczak phantom acquired at four activities (1088 MBq, 962 MBq, 740 MBq and 562 MBq) with a gamma camera equipped with a high-resolution, low-energy, parallel-hole collimator. The gamma camera was tested according to a NEMA protocol. Image quality was graded by three expert observers, who also rated the images for ROC analysis. Of all the measured physical variables, two of the six background-to-signal ratios were able to construct a linear discriminant function with high correlation with the observer criterion. The value of 740 MBq was the optimum after application of the discriminant method in this particular experiment. The results were coincident with the application of ROC analysis. The optimal activity value obtained with the proposed discriminant procedure coincided with the activity value for which the area under the ROC...
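
    The discriminant step described above can be reproduced in outline with scikit-learn: fit a linear discriminant function with observer-assigned image-quality grades as class labels and measured physical ratios as predictors, then inspect the percentage of correctly classified cases. The data and feature choices below are invented stand-ins, not the study's measurements.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Illustrative features: [signal-to-background, signal-to-noise] per image ROI
    X = np.array([[2.1, 8.0], [2.3, 9.1], [1.2, 4.0], [1.1, 3.5],
                  [1.7, 6.2], [1.6, 5.9], [2.4, 9.5], [1.0, 3.2]])
    # Observer-assigned image-quality grades (0 = poor, 1 = good)
    y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

    lda = LinearDiscriminantAnalysis().fit(X, y)
    accuracy = lda.score(X, y)   # fraction of cases correctly classified
    print(f"correctly classified: {accuracy:.0%}")
    print("discriminant coefficients:", lda.coef_)
    ```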

  7. A simple HPLC method for the determination of halcinonide in lipid nanoparticles: development, validation, encapsulation efficiency, and in vitro drug permeation

    Directory of Open Access Journals (Sweden)

    Clarissa Elize Lopes

    2017-06-01

    Full Text Available Halcinonide is a high-potency topical glucocorticoid used for skin inflammation treatments that presents toxic systemic effects. A simple and quick analytical method to quantify the amount of halcinonide encapsulated in lipid nanoparticles, such as polymeric lipid-core nanoparticles and solid lipid nanoparticles, was developed and validated with regard to the drug's encapsulation efficiency and in vitro permeation. The development and validation of the analytical method were carried out using high performance liquid chromatography with UV detection at 239 nm. The validation parameters were specificity, linearity, precision and accuracy, limits of detection and quantitation, and robustness. The method used an isocratic flow rate of 1.0 mL·min⁻¹ and a methanol:water (85:15 v/v) mobile phase, with a retention time of 4.21 min. The method was validated according to international and national regulations. The halcinonide encapsulation efficiency in the nanoparticles was greater than 99%, and the in vitro drug permeation study showed that less than 9% of the drug permeated through the membrane, indicating a nanoparticle reservoir effect, which can reduce halcinonide's toxic systemic effects. These studies demonstrate the applicability of the developed and validated analytical method for quantifying halcinonide in lipid nanoparticles.

  8. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    Directory of Open Access Journals (Sweden)

    Kranti P. Musmade

    2014-01-01

    Full Text Available A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in a novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most citrus plants and has a variety of pharmacological activities. Method optimization was carried out by considering various parameters such as the effect of pH and column. The analyte was separated on a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature under isocratic conditions using phosphate buffer pH 3.5:acetonitrile (75:25 % v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guideline Q2(R1). The method was found to be precise and accurate on statistical evaluation, with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility, with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust, and was successfully employed for the routine analysis of the compound in the developed novel nanopharmaceuticals. The presence of excipients did not interfere with the determination of NAR, indicating method specificity.

  9. Using constitutive equation gap method for identification of elastic material parameters: Technical insights and illustrations

    KAUST Repository

    Florentin, Éric

    2011-08-09

    The constitutive equation gap method (CEGM) is a well-known concept which, until now, has been used mainly for the verification of finite element simulations. Recently, a CEGM-based functional has been proposed to identify local elastic parameters based on experimental full-field measurements. From a technical point of view, this approach requires a quick description of a space of statically admissible stress fields. We present here the technical insights, inspired by previous works in verification, that lead to the construction of such a space. The identification strategy is then implemented, and the obtained results are compared with the actual material parameters for numerically generated benchmarks. The demonstrated quality of the identification technique makes it a valuable tool for interactive design, as a way to validate local material properties. © 2011 Springer-Verlag.

  10. Validation of QuEChERS method for the determination of some pesticide residues in two apple varieties.

    Science.gov (United States)

    Tiryaki, Osman

    2016-10-02

    This study was undertaken to validate the "quick, easy, cheap, effective, rugged and safe" (QuEChERS) method using Golden Delicious and Starking Delicious apple matrices spiked at 0.1 MRL (maximum residue limit), 1.0 MRL and 10 MRL levels of four pesticides (chlorpyrifos, dimethoate, indoxacarb and imidacloprid). For the extraction and clean-up, the original QuEChERS method was followed, and the samples were then subjected to liquid chromatography-triple quadrupole mass spectrometry (LC-MS/MS) for chromatographic analysis. According to the t-test, the matrix effect was not significant for chlorpyrifos in either sample matrix, but it was significant for dimethoate, indoxacarb and imidacloprid in both. Thus, matrix-matched calibration (MC) was used to compensate for the matrix effect, and quantifications were carried out using MC. The overall recovery of the method was 90.15% with a relative standard deviation of 13.27% (n = 330). The estimated method detection limits of the analytes were below the MRLs. Other method validation parameters, such as recovery, precision, accuracy and linearity, were found to be within the required ranges.
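
    The matrix effect that motivates matrix-matched calibration is often quantified by comparing the slopes of calibration lines prepared in pure solvent and in blank matrix extract. A sketch with invented data follows; the slope-ratio definition is a common convention, not necessarily the authors' exact test.

    ```python
    import numpy as np

    conc = np.array([10.0, 25.0, 50.0, 100.0, 250.0])   # ng/g, illustrative
    resp_solvent = np.array([105.0, 260.0, 515.0, 1030.0, 2560.0])
    resp_matrix = np.array([82.0, 205.0, 401.0, 795.0, 2010.0])  # suppressed signal

    slope_solvent = np.polyfit(conc, resp_solvent, 1)[0]
    slope_matrix = np.polyfit(conc, resp_matrix, 1)[0]

    # Matrix effect (%): negative = ion suppression, positive = enhancement
    me = (slope_matrix / slope_solvent - 1.0) * 100.0
    print(f"matrix effect: {me:+.1f}%")
    ```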

  11. Model parameter updating using Bayesian networks

    International Nuclear Information System (INIS)

    Treml, C.A.; Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.

  12. Validated spectophotometric methods for the assay of cinitapride hydrogen tartrate in pharmaceuticals

    Directory of Open Access Journals (Sweden)

    Satyanarayana K.V.V.

    2013-01-01

    Full Text Available Three simple, selective and rapid spectrophotometric methods have been established for the determination of cinitapride hydrogen tartrate (CHT) in pharmaceutical tablets. The proposed methods are based on the diazotization of CHT with sodium nitrite and hydrochloric acid, followed by coupling with resorcinol, 1-benzoylacetone or 8-hydroxyquinoline in alkaline medium for methods A, B and C, respectively. The formed azo dyes are measured at 442, 465 and 552 nm for methods A, B and C, respectively. The parameters that affect the reaction were carefully optimized. Under optimum conditions, Beer's law is obeyed over the ranges 2.0-32.0, 1.0-24.0 and 1.0-20.0 μg·mL⁻¹ for methods A, B, and C, respectively. The calculated molar absorptivity values are 1.2853 × 10⁴, 1.9624 × 10⁴ and 3.92 × 10⁴ L·mol⁻¹·cm⁻¹ for methods A, B and C, respectively. The results of the proposed procedures were validated statistically according to ICH guidelines. The proposed methods were successfully applied to the determination of CHT in Cintapro tablets without interference from common excipients.

  13. Application of verification and validation on safety parameter display systems

    International Nuclear Information System (INIS)

    Thomas, N.C.

    1983-01-01

    Offers some explanation of how verification and validation (V&V) can support the development and licensing of Safety Parameter Display Systems (SPDS). Advocates that V&V can be more readily accepted within the nuclear industry if a better understanding exists of what the objectives of V&V are and should be. Includes a discussion of a reasonable balance of the costs and benefits of V&V as applied to the SPDS and to other digital systems. Represents the author's perception of the regulator's perspective, based on background information and experience, and on discussions with regulators about their current concerns and objectives. Suggests that the introduction of the SPDS into the control room is a first step towards a growing dependency on the use of computers.

  14. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of the 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is however imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows achieving greater efficiency and effectiveness by avoiding duplication of effort and leveraging limited resources. In view of achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and the major efforts carried out to achieve consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  15. A novel validation algorithm allows for automated cell tracking and the extraction of biologically meaningful parameters.

    Directory of Open Access Journals (Sweden)

    Daniel H Rapoport

    Full Text Available Automated microscopy is currently the only method to observe complex multi-cellular processes, such as cell migration, cell cycle, and cell differentiation, non-invasively and label-free. Extracting biological information from a time-series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automate this process have resulted in ever improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature has prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism; they lacked validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea of automatically inspecting the tracking results and accepting only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters...

  16. Nowcasting Surface Meteorological Parameters Using Successive Correction Method

    National Research Council Canada - National Science Library

    Henmi, Teizi

    2002-01-01

    The successive correction method was examined and evaluated statistically as a nowcasting method for surface meteorological parameters including temperature, dew point temperature, and horizontal wind vector components...

  17. Analytical method validation of GC-FID for the simultaneous measurement of hydrocarbons (C2-C4) in their gas mixture

    Directory of Open Access Journals (Sweden)

    Oman Zuas

    2016-09-01

    Full Text Available An accurate gas chromatography method with flame ionization detection (GC-FID) was validated for the simultaneous analysis of light hydrocarbons (C2-C4) in their gas mixture. The validation parameters were evaluated according to the ISO/IEC 17025 definitions, including method selectivity, repeatability, accuracy, linearity, limit of detection (LOD), limit of quantitation (LOQ), and ruggedness. Under the optimum analytical conditions, the analysis of the gas mixture showed that each target component was well separated, with high selectivity. The method was also found to be precise and accurate. Method linearity was high, with good correlation coefficients (R² ≥ 0.999) for all target components. It can be concluded that the developed GC-FID method is reliable and suitable for the determination of light C2-C4 hydrocarbons (including ethylene, propane, propylene, isobutane, and n-butane) in their gas mixture. The validated method has been successfully applied to the estimation of light C2-C4 hydrocarbons in natural gas samples, showing high repeatability (relative standard deviation (RSD) less than 1.0%) and good selectivity, with no interference from other possible components observed.

  18. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  19. OWL-based reasoning methods for validating archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role for the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been an increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This permits to combine the two levels of the dual model-based architecture in one modeling framework which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For such purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. The analysis of each repository reveals that different patterns of errors are found in both repositories. This result reinforces the need for making serious efforts in improving archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Development and validation of a multiresidue method for pesticide analysis in honey by UFLC-MS

    Directory of Open Access Journals (Sweden)

    Adriana M. Zamudio S.

    2017-05-01

    Full Text Available A method for the determination of pesticide residues in honey by ultra-fast liquid chromatography coupled with mass spectrometry was developed. For this purpose, different variations of the QuEChERS method were tested: (i) amount of sample, (ii) type of salt used to control pH, (iii) buffer pH, and (iv) different mixtures for clean-up. In addition, to demonstrate that the method is reliable, different validation parameters were studied: accuracy, limits of detection and quantification, linearity and selectivity. The results showed that the changes introduced produced a more selective method that improves the accuracy for about 19 of the pesticides in the original method. The method was found to be suitable for the analysis of 50 of the 56 pesticides. Furthermore, recoveries between 70 and 120% and relative standard deviations below 15% were obtained with the developed method.

  1. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  2. Catalytic hydrolysis of ammonia borane: Intrinsic parameter estimation and validation

    Energy Technology Data Exchange (ETDEWEB)

    Basu, S.; Gore, J.P. [School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907-2088 (United States); School of Chemical Engineering, Purdue University, West Lafayette, IN 47907-2100 (United States); Energy Center in Discovery Park, Purdue University, West Lafayette, IN 47907-2022 (United States); Zheng, Y. [School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907-2088 (United States); Energy Center in Discovery Park, Purdue University, West Lafayette, IN 47907-2022 (United States); Varma, A.; Delgass, W.N. [School of Chemical Engineering, Purdue University, West Lafayette, IN 47907-2100 (United States); Energy Center in Discovery Park, Purdue University, West Lafayette, IN 47907-2022 (United States)

    2010-04-02

    Ammonia borane (AB) hydrolysis is a potential process for on-board hydrogen generation. This paper presents isothermal hydrogen release rate measurements of dilute AB (1 wt%) hydrolysis in the presence of carbon-supported ruthenium catalyst (Ru/C). The ranges of investigated catalyst particle sizes and temperature were 20-181 μm and 26-56 °C, respectively. The obtained rate data included both kinetic and diffusion-controlled regimes, where the latter was evaluated using the catalyst effectiveness approach. A Langmuir-Hinshelwood kinetic model was adopted to interpret the data, with intrinsic kinetic and diffusion parameters determined by a nonlinear fitting algorithm. The AB hydrolysis was found to have an activation energy of 60.4 kJ mol⁻¹, pre-exponential factor of 1.36 × 10¹⁰ mol (kg-cat)⁻¹ s⁻¹, adsorption energy of −32.5 kJ mol⁻¹, and effective mass diffusion coefficient of 2 × 10⁻¹⁰ m² s⁻¹. These parameters, obtained under dilute AB conditions, were validated by comparing measurements with simulations of AB consumption rates during the hydrolysis of concentrated AB solutions (5-20 wt%), and also with the axial temperature distribution in a 0.5 kW continuous-flow packed-bed reactor. (author)
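
    With the fitted parameters above, the intrinsic rate can be evaluated by combining an Arrhenius rate constant with a Langmuir-Hinshelwood coverage term. The sketch below assumes the common single-site form r = k·K·C/(1 + K·C) and, because the adsorption pre-exponential is not quoted in this abstract, normalizes it to unity, so the numbers are purely illustrative.

    ```python
    import numpy as np

    R = 8.314          # gas constant, J mol^-1 K^-1

    def rate_ab_hydrolysis(C_ab, T,
                           A=1.36e10,      # pre-exponential, mol (kg-cat)^-1 s^-1 (from the paper)
                           Ea=60.4e3,      # activation energy, J mol^-1 (from the paper)
                           dH_ads=-32.5e3, # adsorption energy, J mol^-1 (from the paper)
                           K0=1.0):        # adsorption pre-exponential: ASSUMED, not reported here
        """Langmuir-Hinshelwood rate, r = k * K*C / (1 + K*C) (assumed form)."""
        k = A * np.exp(-Ea / (R * T))          # Arrhenius rate constant
        K = K0 * np.exp(-dH_ads / (R * T))     # adsorption equilibrium constant
        return k * K * C_ab / (1.0 + K * C_ab)

    print(rate_ab_hydrolysis(C_ab=0.3, T=313.15))  # ~1 wt% AB at 40 C, illustrative
    ```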

  3. Validation of methods for measurement of insulin secretion in humans in vivo

    DEFF Research Database (Denmark)

    Kjems, L L; Christiansen, E; Vølund, A

    2000-01-01

    To detect and understand the changes in beta-cell function in the pathogenesis of type 2 diabetes, an accurate and precise estimation of the prehepatic insulin secretion rate (ISR) is essential. There are two common methods to assess ISR: the deconvolution method (by Eaton and Polonsky), considered the... of these mathematical techniques for quantification of insulin secretion have been tested in dogs, but not in humans. In the present studies, we examined the validity of both methods to recover the known infusion rates of insulin and C-peptide mimicking ISR during an oral glucose tolerance test. ISR from both..., and a close agreement was found for the results of an oral glucose tolerance test. We also studied whether C-peptide kinetics are influenced by somatostatin infusion. The decay curves after bolus injection of exogenous biosynthetic human C-peptide, the kinetic parameters, and the metabolic clearance rate were...

  4. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high-pressure liquid ... high-performance liquid chromatography. J Chromatogr.

  5. Application of validation data for assessing spatial interpolation methods for 8-h ozone or other sparsely monitored constituents.

    Science.gov (United States)

    Joseph, John; Sharif, Hatim O; Sunil, Thankam; Alamgir, Hasanat

    2013-07-01

    The adverse health effects of high concentrations of ground-level ozone are well-known, but estimating exposure is difficult due to the sparseness of urban monitoring networks. This sparseness discourages the reservation of a portion of the monitoring stations for validation of interpolation techniques precisely when the risk of overfitting is greatest. In this study, we test a variety of simple spatial interpolation techniques for 8-h ozone with thousands of randomly selected subsets of data from two urban areas with monitoring stations sufficiently numerous to allow for true validation. Results indicate that ordinary kriging with only the range parameter calibrated in an exponential variogram is the generally superior method, and yields reliable confidence intervals. Sparse data sets may contain sufficient information for calibration of the range parameter even if the Moran I p-value is close to unity. R script is made available to apply the methodology to other sparsely monitored constituents. Copyright © 2013 Elsevier Ltd. All rights reserved.
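
    A minimal Python analogue of the recommended interpolator (the paper itself supplies R script) can be assembled with the pykrige package: ordinary kriging with an exponential variogram, holding stations out for true validation. The station coordinates and ozone values below are invented.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    # Invented station coordinates (km) and 8-h ozone (ppb)
    x = np.array([0.0, 3.2, 7.5, 12.1, 15.8, 20.3])
    y = np.array([1.0, 8.4, 3.3, 10.7, 5.1, 9.9])
    ozone = np.array([61.0, 58.0, 66.0, 55.0, 63.0, 59.0])

    # Hold the last station out for validation, krige with the rest
    train = slice(0, 5)
    ok = OrdinaryKriging(x[train], y[train], ozone[train],
                         variogram_model="exponential")
    z_pred, z_var = ok.execute("points", x[5:], y[5:])
    print(f"held-out prediction: {z_pred[0]:.1f} ppb "
          f"(95% CI +/- {1.96 * np.sqrt(z_var[0]):.1f}), observed {ozone[5]:.1f}")
    ```

    Reserving stations for prediction is exactly the "true validation" the abstract argues for; with sparse networks the same hold-out loop can be repeated over many random subsets, as the authors do.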

  6. Parameters estimation of the single and double diode photovoltaic models using a Gauss–Seidel algorithm and analytical method: A comparative study

    International Nuclear Information System (INIS)

    Et-torabi, K.; Nassar-eddine, I.; Obbadi, A.; Errami, Y.; Rmaily, R.; Sahnoun, S.; El fajri, A.; Agunaou, M.

    2017-01-01

    Highlights: • Comparative study of two methods: a Gauss–Seidel method and an analytical method. • Five models are implemented to estimate the five parameters of the single-diode model. • Two models are used to estimate the seven parameters of the double-diode model. • The parameters are estimated under changing environmental conditions. • The aim is to choose the method/model combination most adequate for each PV module technology. - Abstract: In the field of photovoltaic (PV) panel modeling, this paper presents a comparative study of two parameter estimation methods: the iterative Gauss–Seidel method, applied to the single-diode model, and the analytical method, used on the double-diode model. These parameter estimation methods are based on the manufacturers' datasheets. They are tested on three PV modules of different technologies: multicrystalline (Kyocera KC200GT), monocrystalline (Shell SQ80), and thin film (Shell ST40). For the iterative method, five existing mathematical models, numbered 1 to 5, are used to estimate the parameters of these PV modules under varying environmental conditions; only two of the models are used for the analytical method. Each model is based on the combination of expressions for the photocurrent and the reverse saturation current in terms of temperature and irradiance. The models' simulation results are compared with the experimental data obtained from the PV modules' datasheets in order to evaluate the accuracy of the models. The simulation shows that the obtained I-V characteristics match the experimental data. In order to validate the reliability of the two methods, both the Absolute Error (AE) and the Root Mean Square Error (RMSE) were calculated. The results suggest that the analytical method can be very useful for monocrystalline and multicrystalline modules, but for the thin-film module, the iterative method is the most suitable.
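
    The single-diode model referred to here writes the module current implicitly, I = Iph - I0·[exp((V + I·Rs)/(a·Vt·Ns)) - 1] - (V + I·Rs)/Rsh, which is exactly the kind of equation a Gauss–Seidel-style fixed-point iteration can solve. The sketch below solves for I at a given voltage with illustrative parameter values, not those fitted for the cited modules.

    ```python
    import numpy as np

    def single_diode_current(V, Iph, I0, Rs, Rsh, a, Vt=0.02585, n_cells=54,
                             tol=1e-9, max_iter=500):
        """Fixed-point iteration for I in the implicit single-diode equation."""
        aVt = a * Vt * n_cells      # modified ideality voltage for the module
        I = Iph                     # initial guess: photocurrent
        for _ in range(max_iter):
            I_new = Iph - I0 * (np.exp((V + I * Rs) / aVt) - 1.0) - (V + I * Rs) / Rsh
            if abs(I_new - I) < tol:
                return I_new
            I = 0.5 * (I + I_new)   # damping improves convergence
        return I

    # Illustrative five parameters (Iph, I0, Rs, Rsh, a) - not from the paper
    print(single_diode_current(V=26.0, Iph=8.2, I0=1e-9, Rs=0.35, Rsh=300.0, a=1.1))
    ```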

  7. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    Science.gov (United States)

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
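
    The arithmetic behind a mass-extraction-window is simple: a window of w ppm around a theoretical m/z spans m/z·(1 ± w·10⁻⁶). Whether a reported MEW denotes the half-width or the full width varies between vendors, so the ±w convention below is an assumption.

    ```python
    def extraction_window(mz, ppm):
        """Symmetric XIC window of +/- ppm around a theoretical m/z."""
        delta = mz * ppm * 1e-6
        return mz - delta, mz + delta

    # Illustrative: a +/- 10 ppm window around an arbitrary m/z value
    low, high = extraction_window(362.1510, 10.0)
    print(f"[{low:.4f}, {high:.4f}]  width = {(high - low):.4f} Da")
    ```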

  8. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
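
    The likelihood-ratio metric at the core of the method can be sketched very simply: compare the likelihood of the validation data under the model prediction with that under an alternative hypothesis, and accept the model when the ratio exceeds a decision threshold chosen to minimize expected risk. Everything below (Gaussian likelihoods, the threshold value, the data) is an illustrative assumption, not the paper's calibrated procedure.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Observed validation measurements (illustrative)
    data = np.array([10.2, 9.8, 10.5, 10.1])

    # H0: model prediction is correct (mean 10.0); H1: alternative mean from data
    sigma = 0.4                                   # assumed measurement uncertainty
    lik_h0 = norm.pdf(data, loc=10.0, scale=sigma).prod()
    lik_h1 = norm.pdf(data, loc=data.mean(), scale=sigma).prod()

    ratio = lik_h0 / lik_h1                        # likelihood ratio (Bayes-factor-like)
    threshold = 0.3                                # risk-minimizing threshold: assumed
    print(f"likelihood ratio = {ratio:.3f} -> "
          f"{'accept' if ratio > threshold else 'reject'} model")
    ```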

  9. A MACHINE-LEARNING METHOD TO INFER FUNDAMENTAL STELLAR PARAMETERS FROM PHOTOMETRIC LIGHT CURVES

    International Nuclear Information System (INIS)

    Miller, A. A.; Bloom, J. S.; Richards, J. W.; Starr, D. L.; Lee, Y. S.; Butler, N. R.; Tokarz, S.; Smith, N.; Eisner, J. A.

    2015-01-01

    A fundamental challenge for wide-field imaging surveys is obtaining follow-up spectroscopic observations: there are >10^9 photometrically cataloged sources, yet modern spectroscopic surveys are limited to ∼few × 10^6 targets. As we approach the Large Synoptic Survey Telescope era, new algorithmic solutions are required to cope with the data deluge. Here we report the development of a machine-learning framework capable of inferring fundamental stellar parameters (T_eff, log g, and [Fe/H]) using photometric-brightness variations and color alone. A training set is constructed from a systematic spectroscopic survey of variables with Hectospec/Multi-Mirror Telescope. In sum, the training set includes ∼9000 spectra, for which stellar parameters are measured using the SEGUE Stellar Parameters Pipeline (SSPP). We employed the random forest algorithm to perform a non-parametric regression that predicts T_eff, log g, and [Fe/H] from photometric time-domain observations. Our final optimized model produces a cross-validated root-mean-square error (RMSE) of 165 K, 0.39 dex, and 0.33 dex for T_eff, log g, and [Fe/H], respectively. Examining the subset of sources for which the SSPP measurements are most reliable, the RMSE reduces to 125 K, 0.37 dex, and 0.27 dex, respectively, comparable to what is achievable via low-resolution spectroscopy. For variable stars this represents a ≈12-20% improvement in RMSE relative to models trained with single-epoch photometric colors. As an application of our method, we estimate stellar parameters for ∼54,000 known variables. We argue that this method may convert photometric time-domain surveys into pseudo-spectrographic engines, enabling the construction of extremely detailed maps of the Milky Way, its structure, and history
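The regression step can be sketched in a few lines with scikit-learn; the features and labels below are random placeholders standing in for the paper's light-curve features and SSPP labels:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                          # placeholder time-domain features
y = 5000.0 + 400.0 * X[:, 0] + 50.0 * rng.normal(size=1000)  # placeholder T_eff labels (K)

rf = RandomForestRegressor(n_estimators=500, random_state=0, n_jobs=-1)
y_cv = cross_val_predict(rf, X, y, cv=10)   # out-of-fold predictions
rmse = np.sqrt(np.mean((y_cv - y) ** 2))    # cross-validated RMSE, as reported in the record
```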

  10. Spacecraft early design validation using formal methods

    International Nuclear Information System (INIS)

    Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco

    2014-01-01

    The size and complexity of software in spacecraft is increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, with two of them having focused on the system-level of a satellite platform in development. Our efforts resulted in a significant advancement of validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested)

  11. Performance analysis of pin fins with temperature dependent thermal parameters using the variation of parameters method

    Directory of Open Access Journals (Sweden)

    Cihat Arslantürk

    2016-08-01

    The performance of pin fins transferring heat by convection and radiation and having variable thermal conductivity, variable emissivity and variable heat transfer coefficient was investigated in the present paper. By nondimensionalizing the fin equation, the problem parameters that affect the fin performance were obtained. The dimensionless nonlinear fin equation was solved with the variation of parameters method, which is quite new in the solution of nonlinear heat transfer problems. The variation of parameters solution was compared with known analytical solutions and with numerical solutions, and the comparisons showed the solutions to be fully consistent. The effects of the problem parameters on the heat transfer rate and fin efficiency were investigated, and the results were presented graphically.
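For orientation, the constant-property special case of such a convecting-radiating pin fin reduces to the classical two-point boundary value problem (the paper's equation additionally lets the conductivity, heat transfer coefficient and emissivity vary with temperature):

$$\frac{d^{2}\theta}{dX^{2}} - N_{c}\,\theta - N_{r}\left(\theta^{4}-\theta_{s}^{4}\right) = 0, \qquad \theta(0)=1, \quad \left.\frac{d\theta}{dX}\right|_{X=1}=0,$$

where θ = T/T_b is the dimensionless temperature, X the dimensionless axial coordinate (base at X = 0, insulated tip at X = 1), and N_c and N_r are the convection-conduction and radiation-conduction parameters.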

  12. Method validation for uranium content analysis using a potentiometer T-90

    International Nuclear Information System (INIS)

    Torowati; Ngatijo; Rahmiati

    2016-01-01

    An experimental method validation has been conducted for uranium content analysis using a Potentiometer T-90. The method validation experiment was performed in the quality control laboratory of the Experimental Fuel Element Installation, PTBBN - BATAN. The objective is to determine the precision and accuracy of analytical results for uranium analysis referring to the latest American Standard Test Method (ASTM) C1267-11, which is a modified reference method reducing reagent consumption by 10% relative to the original method. The ASTM C1267-11 reference is a new ASTM standard that replaces the older ASTM C799, Vol. 12.01, 2003; it is therefore necessary to validate the renewed method. The tool used for the analysis of uranium was the Potentiometer T-90 and the material used was standard uranium oxide powder CRM (Certified Reference Material). The method was validated by weighing and analyzing the uranium standard seven times. The analysis results were used to determine the accuracy, precision, Relative Standard Deviation (RSD), 2/3 Horwitz coefficient of variation, and the limits of detection and quantitation. The average uranium content obtained in this method validation is 84.36% with a Standard Deviation (SD) of 0.12%, a Relative Standard Deviation (RSD) of 0.14% and a 2/3 Horwitz coefficient of variation (CV Horwitz) of 2.05%. The results show that the RSD value is smaller than the (2/3) CV Horwitz value, which means that this method has high precision. The accuracy value obtained is 0.48%, and since the acceptance limit for a high level of accuracy is an accuracy value <2.00%, this method is regarded as having a high degree of accuracy [1]. The limit of detection (LOD) and the limit of quantitation (LOQ) are 0.0145 g/L and 0.0446 g/L respectively. It is concluded that the ASTM C1267-11 reference method is valid for use. (author)
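The precision check in this record is straightforward to reproduce; a short sketch with hypothetical replicate values (the Horwitz function is the standard empirical one, and the 2/3 factor is the common within-laboratory criterion):

```python
import numpy as np

def horwitz_cv(mass_fraction):
    """Horwitz predicted RSD (%) for an analyte concentration given as a mass fraction."""
    return 2.0 ** (1.0 - 0.5 * np.log10(mass_fraction))

reps = np.array([84.21, 84.30, 84.45, 84.52, 84.28, 84.40, 84.36])  # hypothetical %U replicates
mean, sd = reps.mean(), reps.std(ddof=1)
rsd = 100.0 * sd / mean
limit = (2.0 / 3.0) * horwitz_cv(mean / 100.0)   # 2/3 of the Horwitz CV
print(f"RSD = {rsd:.2f}%, (2/3) CV Horwitz = {limit:.2f}%, precise: {rsd < limit}")
```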

  13. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2017-02-01

    Data to which the authors refer throughout this article are likelihood ratios (LR) computed from the comparison of 5-12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They constitute a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in (D. Meuwly, D. Ramos, R. Haraksim, [1]), although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the methods used is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validating the LR methods, which are used to calculate the LR values from the data and the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared; however, these images do not constitute the core data for the validation, unlike the LRs, which are shared.
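A metric commonly used in this literature for scoring a set of validation LRs is the log-likelihood-ratio cost Cllr (named here as a standard tool of the field, not necessarily the authors' exact criterion); a minimal implementation:

```python
import numpy as np

def cllr(lr_same_source, lr_diff_source):
    """Log-likelihood-ratio cost; lower is better, and Cllr < 1 beats an
    uninformative system that always reports LR = 1."""
    lr_ss = np.asarray(lr_same_source, dtype=float)
    lr_ds = np.asarray(lr_diff_source, dtype=float)
    pen_ss = np.mean(np.log2(1.0 + 1.0 / lr_ss))  # penalty for low LRs on same-source pairs
    pen_ds = np.mean(np.log2(1.0 + lr_ds))        # penalty for high LRs on different-source pairs
    return 0.5 * (pen_ss + pen_ds)
```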

  14. Metamodel-based inverse method for parameter identification: elastic-plastic damage model

    Science.gov (United States)

    Huang, Changwu; El Hami, Abdelkhalak; Radi, Bouchaïb

    2017-04-01

    This article proposes a metamodel-based inverse method for material parameter identification and applies it to elastic-plastic damage model parameter identification. An elastic-plastic damage model is presented and implemented in numerical simulation. The metamodel-based inverse method is proposed in order to overcome the computational cost drawback of the inverse method. In the metamodel-based inverse method, a Kriging metamodel is constructed from an experimental design in order to model the relationship between the material parameters and the objective function values of the inverse problem, and the optimization procedure is then executed on the metamodel. The application of the presented material model and the proposed parameter identification method to the standard A 2017-T4 tensile test proves that the presented elastic-plastic damage model is adequate to describe the material's mechanical behaviour and that the proposed metamodel-based inverse method not only enhances the efficiency of parameter identification but also gives reliable results.
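A compact sketch of the surrogate-then-optimize loop, with a Gaussian-process (Kriging) model from scikit-learn standing in for the paper's metamodel and an analytic stand-in for the expensive finite-element misfit J(theta); everything below is hypothetical:

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
# experimental design: 30 samples of two material parameters (hypothetical ranges)
theta = rng.uniform([100.0, 0.1], [400.0, 2.0], size=(30, 2))
# stand-in misfit; in practice each J value comes from a finite-element run
J = (theta[:, 0] - 250.0) ** 2 / 1e4 + (theta[:, 1] - 0.8) ** 2

kriging = GaussianProcessRegressor(ConstantKernel() * RBF([50.0, 0.5]),
                                   normalize_y=True).fit(theta, J)
# optimise on the cheap surrogate instead of the expensive simulation
best = differential_evolution(lambda t: float(kriging.predict(t[None, :])[0]),
                              bounds=[(100.0, 400.0), (0.1, 2.0)], seed=0)
```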

  15. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets (http://compms.org/RefData) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  16. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    Science.gov (United States)

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    The analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples such as rice, corn, rapeseed, fresh lettuce and pork were analyzed for method validation of recovery rate reproducibility and the minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. This shows that the method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazard ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low- and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730

  18. Human Factors methods concerning integrated validation of nuclear power plant control rooms

    International Nuclear Information System (INIS)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia

    2010-02-01

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which methods of measurement should be used, and how the methods are affected by changes in the control room. The report raises several questions for discussion concerning the validation process. Supplementary methods of measurement for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations on the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems

  19. Improved Cole parameter extraction based on the least absolute deviation method

    International Nuclear Information System (INIS)

    Yang, Yuxiang; Ni, Wenwen; Sun, Qiang; Wen, He; Teng, Zhaosheng

    2013-01-01

    The Cole function is widely used in bioimpedance spectroscopy (BIS) applications. Fitting the measured BIS data onto the model and then extracting the Cole parameters (R0, R∞, α and τ) is a common practice. Accurate extraction of the Cole parameters from the measured BIS data has great significance for evaluating the physiological or pathological status of biological tissue. The traditional least-squares (LS)-based curve fitting method for Cole parameter extraction is often sensitive to noise or outliers and becomes non-robust. This paper proposes an improved Cole parameter extraction based on the least absolute deviation (LAD) method. Comprehensive simulation experiments are carried out and the performance of the LAD method is compared with that of the LS method under conditions of outliers, random noise, and both disturbances. The proposed LAD method exhibits much better robustness under all circumstances, which demonstrates that the LAD method deserves consideration as an improved alternative to the LS method for Cole parameter extraction, owing to its robustness to outliers and noise. (paper)
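A minimal sketch of the LAD idea: fit the Cole impedance model by minimising the L1 norm of the complex residuals, which down-weights the influence of an outlier that would bias a least-squares fit (the spectrum values are synthetic):

```python
import numpy as np
from scipy.optimize import minimize

def cole_impedance(w, R0, Rinf, alpha, tau):
    """Cole model: Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)**alpha)."""
    return Rinf + (R0 - Rinf) / (1.0 + (1j * w * tau) ** alpha)

def lad_cost(p, w, Z):
    r = cole_impedance(w, *p) - Z
    return np.sum(np.abs(r.real)) + np.sum(np.abs(r.imag))  # L1 (LAD) objective

# hypothetical BIS spectrum: noise on the real part plus one gross outlier
w = 2.0 * np.pi * np.logspace(3, 6, 50)
Z = cole_impedance(w, 500.0, 100.0, 0.8, 1e-5)
Z = Z + np.random.default_rng(1).normal(0.0, 2.0, w.size)
Z[10] += 200.0  # outlier that would pull a least-squares fit

fit = minimize(lad_cost, x0=[450.0, 80.0, 0.7, 5e-6], args=(w, Z),
               method="Nelder-Mead")
R0, Rinf, alpha, tau = fit.x
```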

  20. Determination of the kinetic parameters of BeO using isothermal decay method

    International Nuclear Information System (INIS)

    Nieto, Juan Azorin; Vega, Claudia Azorin; Montalvo, Teodoro Rivera; Cabrera, Eugenio Torijano

    2016-01-01

    Most of the existing methods for obtaining the frequency factor make use of the trap depth (activation energy), making some assumptions about the order of the kinetics. This causes inconsistencies in the reported values of trapping parameters, because the values of the activation energy obtained by different methods differ appreciably among themselves. It is therefore necessary to use a method independent of the trap depth, namely the isothermal luminescence decay (ILD) method. The trapping parameters associated with the prominent glow peak of BeO (280 °C) are reported using the ILD method. As a check, the trap parameters are also calculated by the glow curve shape (Chen's) method after isolating the prominent glow peak by the thermal cleaning technique. Our results show a very good agreement between the trapping parameters calculated by the two methods. The ILD method was used for determining the trapping parameters of BeO. Results obtained applying this method are in good agreement with those obtained using other methods, except in the value of the frequency factor. - Highlights: • Kinetic parameters of BeO were determined. • The isothermal decay method was used. • The frequency factor does not agree with those obtained by other methods.
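For first-order kinetics the isothermal decay is I(t) = I0 exp(-p t) with p = s exp(-E/(kT)), so decay constants measured at a few storage temperatures yield E and s from a ln p versus 1/T line; a sketch with hypothetical numbers:

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant, eV/K
T = np.array([473.0, 523.0])        # isothermal hold temperatures, K (hypothetical)
p = np.array([2.1e-4, 3.9e-3])      # decay constants from I(t) = I0*exp(-p*t), 1/s

# first-order kinetics: ln p = ln s - E/(k_B*T), linear in 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(p), 1)
E = -slope * k_B        # activation energy, eV
s = np.exp(intercept)   # frequency factor, 1/s
```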

  1. Quality data validation: Comprehensive approach to environmental data validation

    International Nuclear Information System (INIS)

    Matejka, L.A. Jr.

    1993-01-01

    Environmental data validation consists of an assessment of three major areas: analytical method validation; field procedures and documentation review; evaluation of the level of achievement of data quality objectives based in part on PARCC parameters analysis and expected applications of data. A program utilizing matrix association of required levels of validation effort and analytical levels versus applications of this environmental data was developed in conjunction with DOE-ID guidance documents to implement actions under the Federal Facilities Agreement and Consent Order in effect at the Idaho National Engineering Laboratory. This was an effort to bring consistent quality to the INEL-wide Environmental Restoration Program and database in an efficient and cost-effective manner. This program, documenting all phases of the review process, is described here

  2. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety

  3. Intelligent methods for the process parameter determination of plastic injection molding

    Science.gov (United States)

    Gao, Huang; Zhang, Yun; Zhou, Xundao; Li, Dequn

    2018-03-01

    Injection molding is one of the most widely used material processing methods for producing plastic products with complex geometries and high precision. The determination of process parameters is important in obtaining qualified products and maintaining product quality. This article reviews recent studies and developments of the intelligent methods applied in the process parameter determination of injection molding. These intelligent methods are classified into three categories: case-based reasoning methods, expert system-based methods, and data fitting and optimization methods. A framework of process parameter determination is proposed after comprehensive discussions. Finally, conclusions and future research topics are discussed.

  4. High Frequency Asymptotic Methods for Traveltimes and Anisotropy Parameter Estimation in Azimuthally Varying Media

    KAUST Repository

    Masmoudi, Nabil

    2014-05-01

    Traveltimes are conventionally evaluated by solving the zero-order approximation of the Wentzel, Kramers and Brillouin (WKB) expansion of the wave equation. This high frequency approximation is good enough for most imaging applications and provides us with a traveltime equation called the eikonal equation. The eikonal equation is a non-linear partial differential equation which can be solved by any of the familiar numerical methods. Among the most popular of these methods are the method of characteristics, which yields the ray tracing equations, and the finite difference approaches. In the first part of the Master Thesis, we use the ray tracing method to solve the eikonal equation to get P-wave traveltimes for orthorhombic models with arbitrary orientation of symmetry planes. We start with a ray tracing procedure specified in a curvilinear coordinate system valid for anisotropy of arbitrary symmetry. The coordinate system is constructed so that the coordinate lines are perpendicular to the symmetry planes of an orthorhombic medium. Advantages of this approach are the conservation of orthorhombic symmetry throughout the model and a reduction of the number of parameters specifying the model. We combine this procedure with first-order ray tracing and dynamic ray tracing equations for P waves propagating in smooth, inhomogeneous, weakly anisotropic media. The first-order ray tracing and dynamic ray tracing equations are derived from the exact ones by replacing the exact P-wave eigenvalue of the Christoffel matrix by its first-order approximation. In the second part of the Master Thesis, we compute traveltimes using the fast marching method and we develop an approach to estimate the anisotropy parameters. The idea is to relate them analytically to traveltimes, which is challenging in inhomogeneous media. Using perturbation theory, we develop traveltime approximations for transversely isotropic media with horizontal symmetry axis (HTI) as explicit functions of the
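The fast-marching step mentioned in the record can be sketched with the scikit-fmm package for the isotropic case (the thesis treats anisotropic media, which need more machinery); the velocity model below is hypothetical:

```python
import numpy as np
import skfmm  # scikit-fmm: fast marching solver for the isotropic eikonal equation

nz, nx, dx = 201, 201, 10.0
speed = np.full((nz, nx), 2000.0)   # m/s background (hypothetical model)
speed[100:, :] = 3000.0             # faster half-space in the lower part of the grid
phi = np.ones((nz, nx))
phi[100, 100] = -1.0                # zero level set surrounds the source point
t = skfmm.travel_time(phi, speed, dx=dx)  # first-arrival traveltimes, s
```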

  5. Is the smile line a valid parameter for esthetic evaluation? A systematic literature review.

    Science.gov (United States)

    Passia, Nicole; Blatz, Markus; Strub, Jörg Rudolf

    2011-01-01

    The "smile line" is commonly used as a parameter to evaluate and categorize a person's smile. This systematic literature review assessed the existing evidence on the validity and universal applicability of this parameter. The latter was evaluated based on studies on smile perception by orthodontists, general clinicians, and laypeople. A review of the literature published between October 1973 and January 2010 was conducted with the electronic database Pubmed and the search terms "smile," "smile line," "smile arc," and "smile design." The search yielded 309 articles, of which nine studies were included based on the selection criteria. The selected studies typically correlate the smile line with the position of the upper lip during a smile while, on average, 75 to 100% of the maxillary anterior teeth are exposed. A virtual line that connects the incisal edges of the maxillary anterior teeth commonly follows the upper border of the lower lip. Average and parallel smile lines are most common, influenced by the age and gender of a person. Orthodontists, general clinicians, and laypeople have similar preferences and rate average smile lines as most attractive. The smile line is a valid tool to assess the esthetic appearance of a smile. It can be applied universally as clinicians and laypersons perceive and judge it similarly.

  6. Method for Determining Volumetric Efficiency and Its Experimental Validation

    Directory of Open Access Journals (Sweden)

    Ambrozik Andrzej

    2017-12-01

    Modern means of transport are basically powered by piston internal combustion engines. Increasingly rigorous demands are placed on IC engines in order to minimise their detrimental impact on the natural environment. This stimulates the development of research on piston internal combustion engines, involving experimental and theoretical investigations carried out using computer technologies. While being filled, the cylinder is considered to be an open thermodynamic system in which non-stationary processes occur. To calculate the thermodynamic parameters of the engine operating cycle, based on the comparison of cycles, it is necessary to know the mean constant value of cylinder pressure throughout this process. Because of the character of the in-cylinder pressure pattern and the difficulties in determining the pressure experimentally, a novel method for the determination of this quantity is presented in this paper. The new approach uses an iteration method. In the method developed for determining the volumetric efficiency, the following equations were employed: the law of conservation of the amount of substance, the first law of thermodynamics for an open system, dependences for changes in the cylinder volume vs. the crankshaft rotation angle, and the state equation. The results of calculations performed with this method were validated by experimental investigations carried out for a selected engine at the engine test bench. Satisfactory agreement between computational and experimental results for the volumetric efficiency was obtained. The method for determining the volumetric efficiency presented in the paper can be used to investigate the processes taking place in the cylinder of an IC engine.
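The quantity itself has the classical definition below (a sketch of the definition only, not of the paper's iterative first-law procedure); the numbers are hypothetical:

```python
def volumetric_efficiency(m_cycle, p_amb, T_amb, V_disp, R=287.05):
    """Classical definition: trapped air mass per cycle divided by the mass
    that would fill the displacement volume at ambient density (ideal gas)."""
    rho_amb = p_amb / (R * T_amb)        # kg/m^3
    return m_cycle / (rho_amb * V_disp)

eta_v = volumetric_efficiency(m_cycle=5.8e-4, p_amb=101325.0,
                              T_amb=293.15, V_disp=5.0e-4)  # hypothetical values
```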

  7. Optimization the machining parameters by using VIKOR and Entropy Weight method during EDM process of Al–18% SiCp Metal matrix composit

    Directory of Open Access Journals (Sweden)

    Rajesh Kumar Bhuyan

    2016-06-01

    The objective of this paper is to optimize the process parameters by a combined approach of VIKOR and the Entropy weight measurement method during the Electrical discharge machining (EDM) process of an Al-18wt.%SiCp metal matrix composite (MMC). The central composite design (CCD) method is used to evaluate the effect of three process parameters, namely pulse on time (Ton), peak current (Ip) and flushing pressure (Fp), on responses such as material removal rate (MRR), tool wear rate (TWR), radial overcut (ROC) and surface roughness (Ra). The Entropy weight measurement method evaluates the individual weights of each response and, using the VIKOR method, the multi-objective responses are optimized to obtain a single numerical index known as the VIKOR Index. The Analysis of Variance (ANOVA) technique is then used to determine the significance of the process parameters on the VIKOR Index. Finally, the result of the VIKOR Index is validated by a confirmation test using the linear mathematical model equation developed by response surface methodology, to identify the effectiveness of the proposed method.
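The entropy-weight and VIKOR steps are easy to reproduce; a self-contained sketch on a hypothetical decision matrix (rows = EDM runs; columns = MRR as a benefit response and TWR, ROC, Ra as cost responses):

```python
import numpy as np

def entropy_weights(X):
    """Objective response weights from the Shannon entropy of the decision matrix."""
    P = X / X.sum(axis=0)
    plogp = np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -plogp.sum(axis=0) / np.log(X.shape[0])   # entropy per response
    d = 1.0 - e                                   # degree of diversification
    return d / d.sum()

def vikor_index(X, w, benefit, v=0.5):
    """VIKOR Q index per run; a lower Q indicates a better compromise solution."""
    f_best = np.where(benefit, X.max(axis=0), X.min(axis=0))
    f_worst = np.where(benefit, X.min(axis=0), X.max(axis=0))
    D = (f_best - X) / (f_best - f_worst)         # normalised regret per response
    S, R = (w * D).sum(axis=1), (w * D).max(axis=1)
    return (v * (S - S.min()) / (S.max() - S.min())
            + (1.0 - v) * (R - R.min()) / (R.max() - R.min()))

# hypothetical runs: MRR, TWR, ROC, Ra
X = np.array([[12.1, 0.9, 0.12, 3.2],
              [15.4, 1.3, 0.15, 3.9],
              [10.2, 0.7, 0.10, 2.8]])
Q = vikor_index(X, entropy_weights(X), benefit=np.array([True, False, False, False]))
```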

  8. Validation of the method for determination of the thermal resistance of fouling in shell and tube heat exchangers

    International Nuclear Information System (INIS)

    Markowski, Mariusz; Trafczynski, Marian; Urbaniec, Krzysztof

    2013-01-01

    Highlights: • Heat recovery in a heat exchanger network (HEN). • A novel method for on-line determination of the thermal resistance of fouling is presented. • Details are developed for shell and tube heat exchangers. • The method was validated and a sensitivity analysis was carried out. • The developed approach allows long-term monitoring of changes in the HEN efficiency. - Abstract: A novel method for on-line determination of the thermal resistance of fouling in shell and tube heat exchangers is presented. It can be applied under the condition that data on pressure, temperature, mass flowrate and thermophysical properties of both heat-exchanging media are continuously available. The calculation algorithm used in the novel method is robust and ensures reliable determination of the thermal resistance of fouling even if the operating parameters fluctuate. The method was validated using measurement data retrieved from the operation records of a heat exchanger network connected with a crude distillation unit rated 800 t/h. A sensitivity analysis of the method was carried out and the calculated values of the thermal resistance of fouling were critically reviewed considering the results of qualitative evaluation of fouling layers in the exchangers inspected during plant overhaul
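The core relation is the classical one, R_f = 1/U_dirty - 1/U_clean, with the dirty coefficient backed out of plant data through the duty and the log-mean temperature difference; a sketch under a counter-current assumption, with hypothetical numbers:

```python
import numpy as np

def lmtd(dT1, dT2):
    """Log-mean temperature difference between terminal temperature differences."""
    return (dT1 - dT2) / np.log(dT1 / dT2)

def fouling_resistance(m_hot, cp_hot, Th_in, Th_out, Tc_in, Tc_out, A, U_clean):
    """Thermal resistance of fouling from plant data: R_f = 1/U_dirty - 1/U_clean."""
    Q = m_hot * cp_hot * (Th_in - Th_out)                     # duty from the hot stream, W
    U_dirty = Q / (A * lmtd(Th_in - Tc_out, Th_out - Tc_in))  # counter-current flow
    return 1.0 / U_dirty - 1.0 / U_clean                      # m^2*K/W

# hypothetical operating point of one shell-and-tube unit
Rf = fouling_resistance(m_hot=55.0, cp_hot=2300.0, Th_in=180.0, Th_out=140.0,
                        Tc_in=90.0, Tc_out=120.0, A=350.0, U_clean=450.0)
```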

  9. Validating carbonation parameters of alkaline solid wastes via integrated thermal analyses: Principles and applications.

    Science.gov (United States)

    Pan, Shu-Yuan; Chang, E-E; Kim, Hyunook; Chen, Yi-Hung; Chiang, Pen-Chi

    2016-04-15

    Accelerated carbonation of alkaline solid wastes is an attractive method for CO2 capture and utilization. However, the evaluation criteria for CaCO3 content in solid wastes and the way thermal analysis profiles are interpreted were found to be quite different across the literature. In this investigation, integrated thermal analyses for determining carbonation parameters in basic oxygen furnace slag (BOFS) were proposed based on thermogravimetric (TG), derivative thermogravimetric (DTG), and differential scanning calorimetry (DSC) analyses. A modified method of TG-DTG interpretation was proposed that considers the consecutive weight loss of the sample within 200-900 °C, because the decomposition of various hydrated compounds causes variances in estimates made with conventional methods of TG interpretation. Different quantities of reference CaCO3 standards, carbonated BOFS samples and synthetic CaCO3/BOFS mixtures were prepared for evaluating the data quality of the modified TG-DTG interpretation, in terms of precision and accuracy. The quantitative results of the modified TG-DTG method were also validated by DSC analysis. In addition, to confirm the TG-DTG results, evolved gas analysis was performed by mass spectrometry and Fourier transform infrared spectroscopy for detection of the gaseous compounds released during heating. Furthermore, the decomposition kinetics and thermodynamics of CaCO3 in BOFS were evaluated using the Arrhenius equation and the Kissinger equation. The proposed integrated thermal analyses for determining CaCO3 content in alkaline wastes were precise and accurate, thereby enabling effective assessment of the CO2 capture capacity of alkaline wastes for mineral carbonation. Copyright © 2015 Elsevier B.V. All rights reserved.
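The quantification step rests on the stoichiometry CaCO3 -> CaO + CO2, so the weight loss across the decarbonation window scales to CaCO3 by 100.09/44.01; a sketch (the exact window is precisely what the modified TG-DTG interpretation refines, so the 500-900 °C bounds here are only illustrative):

```python
import numpy as np

def caco3_content(temp, mass, t_lo=500.0, t_hi=900.0):
    """CaCO3 mass fraction (%) from a TG curve: the mass lost across the
    decarbonation window is CO2 (44.01 g/mol) released from CaCO3 (100.09 g/mol)."""
    temp, mass = np.asarray(temp, float), np.asarray(mass, float)
    m1 = mass[np.argmin(np.abs(temp - t_lo))]   # sample mass at window start
    m2 = mass[np.argmin(np.abs(temp - t_hi))]   # sample mass at window end
    return (m1 - m2) * (100.09 / 44.01) / mass[0] * 100.0
```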

  10. Identification of metabolic system parameters using global optimization methods

    Directory of Open Access Journals (Sweden)

    Gatzke Edward P

    2006-01-01

    Abstract Background The problem of estimating the parameters of dynamic models of complex biological systems from time series data is becoming increasingly important. Methods and results Particular consideration is given to metabolic systems that are formulated as Generalized Mass Action (GMA) models. The estimation problem is posed as a global optimization task, for which novel techniques can be applied to determine the best set of parameter values given the measured responses of the biological system. The challenge is that this task is nonconvex. Nonetheless, deterministic optimization techniques can be used to find a global solution that best reconciles the model parameters and measurements. Specifically, the paper employs branch-and-bound principles to identify the best set of model parameters from observed time course data and illustrates this method with an existing model of the fermentation pathway in Saccharomyces cerevisiae. This is a relatively simple yet representative system with five dependent states and a total of 19 unknown parameters whose values are to be determined. Conclusion The efficacy of the branch-and-reduce algorithm is illustrated by the S. cerevisiae example. The method described in this paper is likely to be widely applicable in the dynamic modeling of metabolic networks.
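A toy version of the estimation task, using a stochastic global optimizer from SciPy as a stand-in for the deterministic branch-and-reduce search described in the record (the one-state power-law model and the data are synthetic):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

def simulate(p, t, x0=1.0):
    """Toy one-state GMA-style rate law dx/dt = a*x^g - b*x^h (hypothetical)."""
    a, b, g, h = p
    rhs = lambda _, x: a * np.maximum(x, 1e-9) ** g - b * np.maximum(x, 1e-9) ** h
    return solve_ivp(rhs, (t[0], t[-1]), [x0], t_eval=t).y[0]

t = np.linspace(0.0, 5.0, 25)
x_obs = simulate([2.0, 1.0, 0.5, 1.5], t) \
        + 0.01 * np.random.default_rng(0).normal(size=t.size)

# sum-of-squared-residuals objective; global search over bounded parameters
sse = lambda p: float(np.sum((simulate(p, t) - x_obs) ** 2))
fit = differential_evolution(sse, bounds=[(0.1, 5.0)] * 2 + [(0.1, 2.0)] * 2,
                             seed=0)
```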

  11. Moving beyond Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul wrote that it is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as…

  12. Development and Validation of a Precise Method for Determination of Benzalkonium Chloride (BKC Preservative, in Pharmaceutical Formulation of Latanoprost Eye Drops

    Directory of Open Access Journals (Sweden)

    J. Mehta

    2010-01-01

    A simple and precise reversed phase high performance liquid chromatographic method has been developed and validated for the quantification of the benzalkonium chloride (BKC) preservative in a pharmaceutical formulation of latanoprost eye drops. The analyte was chromatographed on a Waters Spherisorb CN (4.6 × 250 mm) column packed with 5 μm particles. The mobile phase, optimized through an experimental design, was a 40:60 (v/v) mixture of potassium dihydrogen orthophosphate buffer (pH 5.5) and acetonitrile, pumped at a flow rate of 1.0 mL/min with the column temperature maintained at 30 °C. UV detection was performed at the absorption maximum of 210 nm. The method was validated in terms of linearity, repeatability, intermediate precision and accuracy. The method was shown to be robust, resisting small deliberate changes in pH, flow rate and composition (organic ratio) of the mobile phase. The method was successfully applied for the determination of BKC in a pharmaceutical formulation of latanoprost ophthalmic solution without any interference from common excipients and the drug substance. All the validation parameters were within the acceptance range, in accordance with ICH guidelines.

  13. An Improved Interferometric Calibration Method Based on Independent Parameter Decomposition

    Science.gov (United States)

    Fan, J.; Zuo, X.; Li, T.; Chen, Q.; Geng, X.

    2018-04-01

    Interferometric SAR is sensitive to earth surface undulation. The accuracy of interferometric parameters plays a significant role in precise digital elevation model (DEM) generation. The purpose of interferometric calibration is to obtain a high-precision global DEM by calculating the interferometric parameters using ground control points (GCPs). However, interferometric parameters are always calculated jointly, making them difficult to decompose precisely. In this paper, we propose an interferometric calibration method based on independent parameter decomposition (IPD). Firstly, the parameters related to the interferometric SAR measurement are determined based on the three-dimensional reconstruction model. Secondly, the sensitivity of the interferometric parameters is quantitatively analyzed after the geometric parameters are completely decomposed. Finally, each interferometric parameter is calculated based on IPD and an interferometric calibration model is established. We take Weinan in Shanxi province as an example and choose 4 TerraDEM-X image pairs to carry out an interferometric calibration experiment. The results show that the elevation accuracy of all SAR images is better than 2.54 m after interferometric calibration. Furthermore, the proposed method can obtain an accuracy of DEM products better than 2.43 m in the flat area and 6.97 m in the mountainous area, which proves the correctness and effectiveness of the proposed IPD-based interferometric calibration method. The results provide a technical basis for topographic mapping at 1:50000 and even larger scales in flat and mountainous areas.

  14. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  15. Validation of a UV Spectrometric Method for the Assay of Tolfenamic Acid in Organic Solvents

    Directory of Open Access Journals (Sweden)

    Sofia Ahmed

    2015-01-01

    The present study was carried out to validate a UV spectrometric method for the assay of tolfenamic acid (TA) in organic solvents. TA is insoluble in water; therefore, a total of thirteen commonly used organic solvents in which the drug is soluble were selected. Fresh stock solutions of TA in each solvent at a concentration of 1 × 10−4 M (2.62 mg%) were prepared for the assay. The method was validated according to the International Conference on Harmonization guideline, and parameters such as linearity, range, accuracy, precision, sensitivity, and robustness were studied. Although the method was found to be efficient for the determination of TA in all solvents, on the basis of the statistical data 1-octanol, followed by ethanol and methanol, was found to be comparatively better than the other solvents studied. No change in the stability of the TA stock solutions was observed in any solvent for 24 hours when stored either at room temperature (25 ± 1 °C) or refrigerated (2–8 °C). A shift in the absorption maxima was observed for TA in various solvents, indicating drug-solvent interactions. The studied method is simple, rapid, economical, accurate, and precise for the assay of TA in different organic solvents.

  16. Development and validation of an ionic chromatography method for the determination of nitrate, nitrite and chloride in meat.

    Science.gov (United States)

    Lopez-Moreno, Cristina; Perez, Isabel Viera; Urbano, Ana M

    2016-03-01

    The purpose of this study was to develop and validate a method for the analysis of certain preservatives in meat and to obtain a suitable Certified Reference Material (CRM) for this task. The preservatives studied were NO3(-), NO2(-) and Cl(-), as they serve as important antimicrobial agents in meat, inhibiting the growth of spoilage bacteria. The meat samples were prepared using a treatment that allowed the production of a CRM of known concentration that is highly homogeneous and stable in time. Matrix effects were also studied to evaluate their influence on the analytical signal for the ions of interest, showing that the matrix does not affect the final result. An assessment of the signal variation in time was carried out for the ions. In this regard, although the chloride and nitrate signals remained stable for the duration of the study, the nitrite signal decreased appreciably with time. A mathematical treatment of the data gave a stable nitrite signal, yielding a method suitable for the validation of these anions in meat. A statistical study was needed for the validation of the method, in which precision, accuracy, uncertainty and other mathematical parameters were evaluated, with satisfactory results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. A validated RP-HPLC method for the determination of Irinotecan hydrochloride residues for cleaning validation in production area

    Directory of Open Access Journals (Sweden)

    Sunil Reddy

    2013-03-01

    Introduction: cleaning validation is an integral part of current good manufacturing practices in the pharmaceutical industry. The main purpose of cleaning validation is to prove the effectiveness and consistency of cleaning of given pharmaceutical production equipment, to prevent cross contamination and adulteration of the drug product with other active ingredients. Objective: a rapid, sensitive and specific reverse phase HPLC method was developed and validated for the quantitative determination of irinotecan hydrochloride in cleaning validation swab samples. Method: the method was validated using a Waters Symmetry Shield RP-18 (250 mm × 4.6 mm, 5 µm) column with an isocratic mobile phase containing a mixture of 0.02 M potassium dihydrogen orthophosphate, pH adjusted to 3.5 with orthophosphoric acid, methanol and acetonitrile (60:20:20 v/v/v). The flow rate of the mobile phase was 1.0 mL/min with a column temperature of 25 °C and detection wavelength at 220 nm. The sample injection volume was 100 µL. Results: the calibration curve was linear over a concentration range from 0.024 to 0.143 µg/mL with a correlation coefficient of 0.997. The intra-day and inter-day precision, expressed as relative standard deviation, were below 3.2%. The recoveries obtained from stainless steel, PCGI, epoxy, glass and decron cloth surfaces were more than 85% and there was no interference from the cotton swab. The detection limit (DL) and quantitation limit (QL) were 0.008 and 0.023 µg/mL, respectively. Conclusion: the developed method was validated with respect to specificity, linearity, limits of detection and quantification, accuracy, precision and solution stability. The overall procedure can be used as part of a cleaning validation program in the pharmaceutical manufacture of irinotecan hydrochloride.
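The DL/QL figures quoted are consistent with the usual ICH Q2-style estimates from the calibration line; a minimal helper (the sigma and slope values are hypothetical):

```python
def ich_detection_limits(sigma, slope):
    """ICH Q2-style estimates from the residual standard deviation of the
    calibration line (sigma) and its slope S: DL = 3.3*sigma/S, QL = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

dl, ql = ich_detection_limits(sigma=0.0012, slope=0.52)  # hypothetical values
```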

  18. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as important advances in the method engineering procedures which have improved the efficiency of the process. The initiative has also allowed researchers to hurdle many of the barricades that have hindered accurate analysis such as the lack of reference standards and comparative data. As the availability for nutraceutical products continues to increase these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling

  19. Determination of the kinetic parameters of Be O using isothermal decay method

    Energy Technology Data Exchange (ETDEWEB)

    Azorin N, J.; Torijano C, E. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, 09340 Mexico D. F. (Mexico); Azorin V, C.; Rivera M, T., E-mail: azorin@xanum.uam.mx [IPN, Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada, Av. Legaria 694, Col. Irrigacion, 11500 Mexico D. F. (Mexico)

    2015-10-15

    Most of the existing methods for obtaining the frequency factor make use of the trap depth (activation energy), making some assumptions about the order of the kinetics. This causes inconsistencies in the reported values of trapping parameters, because the values of the activation energy obtained by different methods differ appreciably among themselves. It is therefore necessary to use a method independent of the trap depth, namely the isothermal luminescence decay method. The trapping parameters associated with the prominent glow peak of BeO (280 °C) are reported using the isothermal luminescence decay method. As a check, the trap parameters are also calculated by the glow curve shape (Chen's) method after isolating the prominent glow peak by the thermal cleaning technique. Our results show a very good agreement between the trapping parameters calculated by the two methods. The isothermal luminescence decay method was used for determining the trapping parameters of BeO. Results obtained applying this method are in good agreement with those obtained using other methods, except in the value of the frequency factor. (Author)

  20. A method for generating subgroup parameters from resonance tables

    International Nuclear Information System (INIS)

    Devan, K.; Mohanakrishnan, P.

    1993-01-01

    A method for generating subgroup or band parameters from resonance tables is described. A computer code, SPART, was written using this method. This code generates the subgroup parameters for any number of bands within the specified broad groups at different temperatures by reading the required input data from the binary cross section library in the Cadarache format. The results obtained with the SPART code for two bands were compared with those obtained from the GROUPIE code, and good agreement was obtained. Results of the generation of subgroup parameters in four bands for a sample case of 239Pu from the resonance tables of the Cadarache Ver. 2 library are also presented. (author). 8 refs., 2 tabs

  1. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

    Workplace injuries are costing the Australian coal mining industry and its communities $410 million a year. This ACARP study aims to address this problem by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands and an overall PEFA score between 1 and 4 is given, with 1 being the better score. The reliability study and validity study were conducted concurrently. The reliability study examined the test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, which was sufficient for comparison with injury data to determine the validity of the assessment. The overall assessment score and the material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open cut coal mine with their injury records. A predictive relationship was found between the PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between a PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury when compared to other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment method.

  2. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Seol, Hae Young [Korea University Guro Hospital, Department of Radiology, Seoul (Korea, Republic of); Noh, Kyoung Jin [Soonchunhyang University, Department of Electronic Engineering, Asan (Korea, Republic of); Shim, Hackjoon [Toshiba Medical Systems Korea Co., Seoul (Korea, Republic of)

    2017-05-15

    We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, which is a measure of permeability of capillaries), of brain tumors were generated by a commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors that showed consistent perfusion trends with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limit of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates perfusion parameters of brain tumors. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP. (orig.)

  3. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    Science.gov (United States)

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, which is a measure of permeability of capillaries), of brain tumors were generated by a commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors that showed consistent perfusion trends with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limit of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates perfusion parameters of brain tumors. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.
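Lin's concordance correlation coefficient used in these two records has a simple closed form; a minimal implementation:

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements:
    2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    s_xy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * s_xy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```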

  4. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    Science.gov (United States)

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

    Background: Although the content validity of programs is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post marriage training program provided for newly married couples. Content validation is a preliminary step toward obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, with three expert panels. Altogether 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom were also on the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties CVR and CVI and the face validity of 57 educational objectives. Results: The raw content of the post marriage program had been written by professional experts of the Ministry of Health; using the qualitative expert panel, the content was further developed by generating three topics and refining one topic and its respective content. In the second panel, a total of six other objectives were deleted, three for falling outside the agreement cut-off point and three by experts' consensus. The validity of all items was above 0.8 and their content validity indices (0.8–1) were completely appropriate in the quantitative assessment. Conclusion: This study provided good evidence for the validation and accreditation of the national post marriage program planned for newly married couples in health centers of the country in the near future. PMID:26056672

  5. AN IMPROVED INTERFEROMETRIC CALIBRATION METHOD BASED ON INDEPENDENT PARAMETER DECOMPOSITION

    Directory of Open Access Journals (Sweden)

    J. Fan

    2018-04-01

    Interferometric SAR is sensitive to earth surface undulation. The accuracy of interferometric parameters plays a significant role in precise digital elevation model (DEM) generation. The purpose of interferometric calibration is to obtain a high-precision global DEM by calculating the interferometric parameters using ground control points (GCPs). However, interferometric parameters are always calculated jointly, making them difficult to decompose precisely. In this paper, we propose an interferometric calibration method based on independent parameter decomposition (IPD). Firstly, the parameters related to the interferometric SAR measurement are determined based on the three-dimensional reconstruction model. Secondly, the sensitivity of the interferometric parameters is quantitatively analyzed after the geometric parameters are completely decomposed. Finally, each interferometric parameter is calculated based on IPD and an interferometric calibration model is established. We take Weinan in Shanxi province as an example and choose 4 TerraDEM-X image pairs to carry out an interferometric calibration experiment. The results show that the elevation accuracy of all SAR images is better than 2.54 m after interferometric calibration. Furthermore, the proposed method can obtain an accuracy of DEM products better than 2.43 m in the flat area and 6.97 m in the mountainous area, which proves the correctness and effectiveness of the proposed IPD-based interferometric calibration method. The results provide a technical basis for topographic mapping at 1:50000 and even larger scales in flat and mountainous areas.

  6. Mathematical correlation of modal-parameter-identification methods via system-realization theory

    Science.gov (United States)

    Juang, Jer-Nan

    1987-01-01

    A unified approach is introduced using system-realization theory to derive and correlate modal-parameter-identification methods for flexible structures. Several different time-domain methods are analyzed and treated. A basic mathematical foundation is presented which provides insight into the field of modal-parameter identification for comparison and evaluation. The relation among various existing methods is established and discussed. This report serves as a starting point to stimulate additional research toward the unification of the many possible approaches for modal-parameter identification.

  7. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses: the reliability and the relevance of the method.

  8. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    Science.gov (United States)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.

  9. Statistical methods of parameter estimation for deterministically chaotic time series

    Science.gov (United States)

    Pisarenko, V. F.; Sornette, D.

    2004-03-01

    We discuss the possibility of applying some standard statistical methods (the least-squares method, the maximum likelihood method, and the method of statistical moments for parameter estimation) to a deterministically chaotic low-dimensional dynamical system (the logistic map) containing observational noise. A “segmentation fitting” maximum likelihood (ML) method is suggested to estimate the structural parameter of the logistic map along with the initial value x1, considered as an additional unknown parameter. The segmentation fitting method, called “piece-wise” ML, is similar in spirit to, but simpler than and with smaller bias than, the “multiple shooting” method previously proposed. Comparisons with different previously proposed techniques on simulated numerical examples give favorable results (at least for the investigated combinations of sample size N and noise level). Moreover, unlike some suggested techniques, our method does not require a priori knowledge of the noise variance. We also clarify the nature of the inherent difficulties in the statistical analysis of deterministically chaotic time series and the status of previously proposed Bayesian approaches. We note the trade-off between the need to use a large number of data points in the ML analysis to decrease the bias (to guarantee consistency of the estimation) and the unstable nature of dynamical trajectories, with exponentially fast loss of memory of the initial condition. The method of statistical moments for the estimation of the parameter of the logistic map is also discussed. This method appears to be the only method whose consistency for deterministically chaotic time series has so far been proved theoretically (not only numerically).
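
    A minimal sketch of the fitting problem described above: with Gaussian observational noise, least squares over the structural parameter a and the initial value x1 is the ML estimate, and the fit uses a short segment because chaotic sensitivity makes long-trajectory fits ill-conditioned. This is an illustration, not the authors' exact "piece-wise" implementation; all settings are invented:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def trajectory(a, x1, n):
    """Iterate the logistic map x_{t+1} = a * x_t * (1 - x_t)."""
    x = np.empty(n)
    x[0] = x1
    for t in range(n - 1):
        x[t + 1] = a * x[t] * (1.0 - x[t])
    return x

# Simulated observations y_t = x_t + noise on a short segment.
a_true, x1_true, n, sigma = 3.8, 0.3, 12, 0.01
y = trajectory(a_true, x1_true, n) + rng.normal(0.0, sigma, n)

# Least squares in (a, x1); for Gaussian noise this is the ML estimate.
def sse(theta):
    a, x1 = theta
    return np.sum((y - trajectory(a, x1, n)) ** 2)

fit = minimize(sse, x0=[3.5, 0.5], method="Nelder-Mead")
print(fit.x)  # close to (3.8, 0.3)
```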

  10. Homogenization of metamaterials: Parameters retrieval methods and intrinsic problems

    DEFF Research Database (Denmark)

    Andryieuski, Andrei; Malureanu, Radu; Lavrinenko, Andrei

    2010-01-01

    Metamaterials (MTMs) claim a lot of attention worldwide. Description of MTMs in terms of effective parameters is a simple and useful tool for characterising their electromagnetic properties, so a reliable effective-parameter restoration method is in demand. In this paper we report on effective parameter retrieval methods and their intrinsic problems.

  11. Determination of Modafinil in Tablet Formulation Using Three New Validated Spectrophotometric Methods

    International Nuclear Information System (INIS)

    Basniwal, P.K.; Jain, D.; Basniwal, P.K.

    2014-01-01

    In this study, three new UV spectrophotometric methods, viz. linear regression equation (LRE), standard absorptivity (SA) and first order derivative (FOD), were developed and validated for the determination of modafinil in tablet form. The Beer–Lambert law was obeyed in the range of 10–50 μg/mL, and all the methods were validated for linearity, accuracy, precision and robustness. These methods were successfully applied for the assay of modafinil drug content in tablets, in the ranges of 100.20–100.42%, 100.11–100.58% and 100.25–100.34%, respectively, with acceptable standard deviation (less than two) for all the methods. The validated spectrophotometric methods may be successfully applied for assay, dissolution studies and bio-equivalence studies, as well as for routine analysis in pharmaceutical industries. (author)

  12. An Efficient Data Partitioning to Improve Classification Performance While Keeping Parameters Interpretable.

    Directory of Open Access Journals (Sweden)

    Kristjan Korjus

    Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier's generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach that we term "Cross-validation and cross-testing" improving this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do.
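
    For contrast with the proposal above, a sketch of the standard approach it improves on: cross-validation on the training portion to choose a classifier parameter, then a single evaluation on the held-out test set. (The "cross-validation and cross-testing" scheme itself re-uses the test data and is not reproduced here.) Data are synthetic:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Inner cross-validation selects C; the test set is touched exactly once.
search = GridSearchCV(SVC(kernel="rbf"), {"C": [0.1, 1, 10, 100]}, cv=5)
search.fit(X_tr, y_tr)
print(search.best_params_, search.score(X_te, y_te))
```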

  13. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and validation method for the requirements are introduced in detail in this study, with the hope of providing this experience to other civil jet product designs.

  14. Development of a validated HPLC method for the determination of sennoside A and B, two major constituents of Cassia obovata Coll.

    Directory of Open Access Journals (Sweden)

    Ghassemi-Dehkordi Nasrollah

    2014-04-01

    Introduction: Cassia obovata Coll is the only Senna species which grows wild in Iran. In the present study, an optimised, validated reversed-phase high performance liquid chromatography (HPLC) method was established for the simple and accurate quantification of sennosides A and B, the major constituents of C. obovata. Methods: HPLC analysis was done using a Waters 515 pump on a Nova-Pak C18 column (3.9 × 150 mm). Millennium software was used for the determination of sennosides A and B in Cassia species and for processing the information. The method was validated according to USP 32 requirements. Results: The impact of the solvent on the selectivity factor and partition coefficient parameters was evaluated. Using a conventional RP-18 L1 column, 3.9 × 150 mm, the mobile phase was selected after several trials with different mixtures of water and acetonitrile. Sennosides A and B were determined using the external standard calibration method. Using USP 35-NF 30, the LOD and LOQ were calculated. The reliability of the HPLC method for the analysis of sennosides A + B was validated through its linearity, reproducibility, repeatability and recovery. Finally, ethanol:water (1:1) extracts of Cassia obovata and Cassia angustifolia were standardized by assay of sennosides A and B through the above validated HPLC method. Conclusion: With the above method, determination of sennosides in Cassia species is entirely possible. Moreover, comparison of the results shows that, even though Cassia angustifolia is richer in sennosides, C. obovata could be considered an alternative source of sennosides A and B.

  15. Mathematical correlation of modal parameter identification methods via system realization theory

    Science.gov (United States)

    Juang, J. N.

    1986-01-01

    A unified approach is introduced using system realization theory to derive and correlate modal parameter identification methods for flexible structures. Several different time-domain and frequency-domain methods are analyzed and treated. A basic mathematical foundation is presented which provides insight into the field of modal parameter identification for comparison and evaluation. The relation among various existing methods is established and discussed. This report serves as a starting point to stimulate additional research towards the unification of the many possible approaches for modal parameter identification.

  16. Temporal parameter change of human postural control ability during upright swing using recursive least square method

    Science.gov (United States)

    Goto, Akifumi; Ishida, Mizuri; Sagawa, Koichi

    2010-01-01

    The purpose of this study is to derive quantitative assessment indicators of human postural control ability. An inverted pendulum model is applied to the standing human body, controlled by ankle joint torque according to a PD control method in the sagittal plane. The torque control parameters (KP: proportional gain, KD: derivative gain) and the pole placements of the postural control system are estimated over time from the inclination angle variation using the fixed-trace method, a recursive least-squares technique. Eight young healthy volunteers participated in the experiment, in which they were asked to incline forward as far and as fast as possible 10 times, with 10 s stationary intervals, keeping their neck, hip and knee joints fixed, and then return to the initial upright posture. The inclination angle was measured by an optical motion capture system. Three conditions were introduced to simulate unstable standing postures: 1) an eyes-open posture as the healthy condition, 2) an eyes-closed posture for visual impairment, and 3) a one-legged posture for lower-extremity muscle weakness. The estimated parameters KP, KD and the pole placements were subjected to a multiple comparison test across all stability conditions. The test results indicate that KP, KD and the real pole reflect the effect of lower-extremity muscle weakness, and that KD also represents the effect of visual impairment. This suggests that the proposed method is valid for the quantitative assessment of standing postural control ability.
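
    The estimator named above, recursive least squares, updates the gain estimates sample by sample. A minimal sketch using standard exponential-forgetting RLS (the fixed-trace variant used in the study differs in how the covariance matrix is rescaled); the torque model tau = KP*theta + KD*dtheta and all signals are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
KP, KD = 12.0, 1.5                               # "true" gains to recover

theta = np.cumsum(rng.normal(0.0, 0.01, 500))    # inclination angle (synthetic)
dtheta = np.gradient(theta)                      # angular velocity
tau = KP * theta + KD * dtheta + rng.normal(0.0, 0.005, 500)

lam = 0.99                                       # forgetting factor
w = np.zeros(2)                                  # [KP_hat, KD_hat]
P = np.eye(2) * 1e3                              # large initial covariance

for t in range(len(tau)):
    phi = np.array([theta[t], dtheta[t]])        # regressor
    k = P @ phi / (lam + phi @ P @ phi)          # RLS gain vector
    w = w + k * (tau[t] - phi @ w)               # update the estimates
    P = (P - np.outer(k, phi @ P)) / lam         # update the covariance

print(w)  # approaches [12.0, 1.5]
```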

  17. Validation of nitrogen-nitrate analysis by the chromotropic acid method

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Ana Claudia O.; Matoso, Erika, E-mail: anaclaudia.oliveira@marinha.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP/CEA), Iperó, SP (Brazil). Centro Experimental ARAMAR

    2017-07-01

    The problems caused by contamination of water bodies demand strict control of disposal in rivers, seas and oceans. The nitrate ion is present in agricultural inputs, which are applied to the soil to boost plant growth. However, excess or indiscriminate use of these products contaminates water bodies, triggering eutrophication of aquatic ecosystems. Furthermore, due to diseases that can be caused by the ingestion of high levels of nitrate, such as methaemoglobinaemia, nitrate levels should be controlled in drinking waters and effluents. There are several methods for the determination of nitrate, the chromotropic acid method being a simple and low-cost solution. This method consists of adding chromotropic acid to the sample in the presence of H2SO4. The absorbance related to the yellow color produced can be measured by a UV-Vis spectrophotometer at 410 nm. In a modified form, this method can be applied to different aqueous matrices by using other reagents that eliminate interferences. The aim of this study was to validate the nitrate determination method in waters using chromotropic acid. This method is used in the Laboratório Radioecológico (LARE) to analyze effluent in compliance with the Wastewater Controlling Program of Centro Tecnológico da Marinha em São Paulo – Centro Experimental ARAMAR (CTMSP-CEA). The correlation coefficient for the linearity test was 0.9997. The evaluated detection limit was relatively high (LD = 0.045 mg N/L) compared to ion chromatography, for example, but sufficient to determine the presence of this ion, considering the maximum limit proposed by the current legislation. The chromotropic acid method proved to be robust, accurate and precise, according to the parameters used in this work. (author)
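
    A sketch of the calibration arithmetic behind linearity and detection-limit figures like those above, using the common estimates LOD = 3.3*s/slope and LOQ = 10*s/slope from the residual standard deviation of the calibration line; the standards and absorbances are invented for illustration:

```python
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])             # mg N/L (hypothetical)
absb = np.array([0.021, 0.102, 0.199, 0.405, 1.001])   # absorbance at 410 nm

slope, intercept = np.polyfit(conc, absb, 1)
pred = slope * conc + intercept
r = np.corrcoef(conc, absb)[0, 1]

# Residual standard deviation of the calibration line (n - 2 degrees of freedom).
s_res = np.sqrt(np.sum((absb - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
print(f"r = {r:.4f}, LOD = {lod:.3f} mg N/L, LOQ = {loq:.3f} mg N/L")
```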

  19. Assessment of the impact of a parameter estimation method for the Nash Model on selected parameters of a catchment discharge hydrograph

    Directory of Open Access Journals (Sweden)

    Kołodziejczyk Katarzyna

    2017-01-01

    An analysis of the usefulness of two parameter calculation methods (for the N and k parameters) of the Nash Model was performed to transform effective rainfall into discharge, based on two rainfall episodes gauged at the Kostrze gauging station, urban development data for the city of Cracow for 2014, and data obtained from a soil and agriculture map. The methods were the Rao et al. method and the Bajkiewicz-Grabowska method, based on regression relationships between instantaneous unit hydrograph model parameters and the physiographic parameters of a catchment. Effective rainfall was calculated for each rainfall episode using the SCS-CN method. A direct discharge hydrograph was calculated from the effective rainfall hyetograph using the Nash Model. The research found that both studied methods yield comparable results, which indicates that both methods of transforming effective rainfall into discharge are useful. In addition, it was shown that the impact of the Nash Model parameter estimation method on the discharge hydrographs is minimal.
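
    The transformation step both methods feed into is the Nash Model's instantaneous unit hydrograph, u(t) = (t/k)^(N-1) * exp(-t/k) / (k * Gamma(N)) for a cascade of N linear reservoirs, convolved with the effective rainfall to give direct discharge. A minimal sketch; N, k and the hyetograph below are illustrative values only:

```python
import numpy as np
from scipy.special import gamma

def nash_iuh(t, N, k):
    """Nash instantaneous unit hydrograph for a cascade of N linear reservoirs."""
    return (t / k) ** (N - 1) * np.exp(-t / k) / (k * gamma(N))

dt = 0.5                                    # time step (h)
t = np.arange(0.0, 48.0, dt)
u = nash_iuh(t, N=3.2, k=2.5)

p_eff = np.zeros_like(t)                    # effective rainfall (mm per step)
p_eff[:6] = [2.0, 5.0, 8.0, 6.0, 3.0, 1.0]

q = np.convolve(p_eff, u)[: len(t)] * dt    # direct discharge hydrograph
print(f"peak at t = {t[np.argmax(q)]:.1f} h")
```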

  20. Optimization and validation of a method using UHPLC-fluorescence for the analysis of polycyclic aromatic hydrocarbons in cold-pressed vegetable oils.

    Science.gov (United States)

    Silva, Simone Alves da; Sampaio, Geni Rodrigues; Torres, Elizabeth Aparecida Ferraz da Silva

    2017-04-15

    Among the different food categories, oils and fats are important sources of exposure to polycyclic aromatic hydrocarbons (PAHs), a group of organic chemical contaminants. The use of a validated method is essential to obtain reliable analytical results, since legislation establishes maximum limits in different foods. The objective of this study was to optimize and validate a method for the quantification of four PAHs [benzo(a)anthracene, chrysene, benzo(b)fluoranthene, benzo(a)pyrene] in vegetable oils. The samples were submitted to liquid-liquid extraction, followed by solid-phase extraction, and analyzed by ultra-high performance liquid chromatography. Under the optimized conditions, the validation parameters were evaluated according to the INMETRO Guidelines: linearity (r² > 0.99), selectivity (no matrix interference), limits of detection (0.08–0.30 μg kg⁻¹) and quantification (0.25–1.00 μg kg⁻¹), recovery (80.13–100.04%), and repeatability and intermediate precision. The method proved suitable for the analysis of PAHs in the vegetable oils evaluated. Copyright © 2016. Published by Elsevier Ltd.

  1. Parameter identification and model validation for the piezoelectric actuator in an inertia motor

    International Nuclear Information System (INIS)

    Hunstig, Matthias; Hemsel, Tobias

    2010-01-01

    Piezoelectric inertia motors make use of the inertia of a slider to drive it by friction contact in a series of small steps, generally composed of a stick phase and a slip phase. If the best electrical drive signal for the piezoelectric actuator in an inertia motor is to be determined, the actuator's dynamical behaviour must be known. A classic dynamic lumped-parameter model for piezoelectric actuators is valid only at resonance and is therefore not suitable for modelling the actuator in an inertia motor. A reduced dynamic model is used instead. Its parameters are identified using a step response measurement. This model is used to predict the movement of the actuator in response to a velocity-optimized signal introduced in a separate contribution. Results show that the model cannot represent the dynamical characteristics of the actuator completely. For determining voltage signals that let piezoelectric actuators follow a calculated movement pattern exactly, the model can therefore only be used with limitations.

  2. Penalty parameter of the penalty function method

    DEFF Research Database (Denmark)

    Si, Cheng Yong; Lan, Tian; Hu, Junjie

    2014-01-01

    The penalty parameter of the penalty function method is systematically analyzed and discussed. For the problem that Deb's feasibility-based rule does not give detailed instruction on how to rank two solutions when they have the same constraint violation, an improved Deb's feasibility-based rule is proposed.
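
    The role of the penalty parameter is easiest to see in a small sketch: the constrained problem min f(x) subject to g(x) <= 0 is replaced by an unconstrained one, and the parameter mu is increased until the constraint violation vanishes. The problem, schedule and starting point below are illustrative choices, not the paper's analysis:

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2   # objective
g = lambda x: x[0] + x[1] - 2.0                       # constraint g(x) <= 0

def penalized(x, mu):
    """Quadratic exterior penalty: feasible points are not penalized."""
    return f(x) + mu * max(0.0, g(x)) ** 2

x, mu = np.array([0.0, 0.0]), 1.0
for _ in range(8):
    x = minimize(lambda z: penalized(z, mu), x, method="Nelder-Mead").x
    mu *= 10.0                                        # stiffen the penalty
print(x, g(x))  # approaches the constrained optimum (1.5, 0.5), with g -> 0
```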

  3. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Keywords: Ketotifen, Cetirizine, Stability indicating method, Stressed conditions, Validation.

  4. Mycotoxin and fungicide residues in wheat grains from fungicide-treated plants measured by a validated LC-MS method.

    Science.gov (United States)

    da Luz, Suzane Rickes; Pazdiora, Paulo Cesar; Dallagnol, Leandro José; Dors, Giniani Carla; Chaves, Fábio Clasen

    2017-04-01

    Wheat (Triticum aestivum) is an annual crop, cultivated in the winter and spring and susceptible to several pathogens, especially fungi, which are managed with fungicides. It is also one of the most consumed cereals, and can be contaminated by mycotoxins and fungicides. The objective of this study was to validate an analytical method by LC-MS for simultaneous determination of mycotoxins and fungicide residues in wheat grains susceptible to fusarium head blight treated with fungicides, and to evaluate the relationship between fungicide application and mycotoxin production. All parameters of the validated analytical method were within AOAC and ANVISA limits. Deoxynivalenol was the prevalent mycotoxin in wheat grain and epoxiconazole was the fungicide residue found in the highest concentration. All fungicidal treatments induced an increase in AFB2 production when compared to the control (without application). AFB1 and deoxynivalenol, on the contrary, were reduced in all fungicide treatments compared to the control. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    Science.gov (United States)

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method can bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and for decision-making. Fuzzy logic allows summarising the information obtained by independent validation statistics into one synthetic indicator of overall method performance. The microarray technology, introduced for the simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper and applied to validation data for different genetically modified elements. Remarks are made on the analytical results. The fuzzy-logic-based rules were shown to be applicable to improve the interpretation of results and to facilitate the overall evaluation of the multiplex method.

  6. Method development and validation of potent pyrimidine derivative by UV-VIS spectrophotometer.

    Science.gov (United States)

    Chaudhary, Anshu; Singh, Anoop; Verma, Prabhakar Kumar

    2014-12-01

    A rapid and sensitive ultraviolet-visible (UV-VIS) spectroscopic method was developed for the estimation of the pyrimidine derivative 6-bromo-3-(6-(2,6-dichlorophenyl)-2-(morpholinomethylamino)pyrimidin-4-yl)-2H-chromen-2-one (BT10M) in bulk form. The pyrimidine derivative was monitored at 275 nm with UV detection, with no interference from diluents at that wavelength. The method was found to be linear in the range of 50 to 150 μg/mL. The accuracy and precision were determined and validated statistically. The method was validated according to standard guidelines. The results showed that the proposed method is suitable for the accurate, precise and rapid determination of the pyrimidine derivative.

  7. A New Filled Function Method with One Parameter for Global Optimization

    Directory of Open Access Journals (Sweden)

    Fei Wei

    2013-01-01

    The filled function method is an effective approach to find the global minimizer of multidimensional multimodal functions. Conventional filled functions are numerically unstable due to exponential or logarithmic terms and are sensitive to their parameters. In this paper, a new filled function with only one parameter is proposed, which is continuously differentiable and proved to satisfy all conditions of the filled function definition. Moreover, this filled function is not sensitive to its parameter, and overflow cannot occur for this function. On this basis, a new filled function method is proposed that is numerically stable with respect to the initial point and the parameter value. Computer simulations indicate that the proposed filled function method is efficient and effective.

  8. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    Science.gov (United States)

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    A validation procedure based on the ISO 17025 standard has been used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. To decide whether there are any trends in the time variation of the analytical signal, the Neumann trend test was also applied and evaluated. Finally, a comparison with similar validation data for the ETV-ICP-OES method was carried out.

  9. Comparing the Validity of Non-Invasive Methods in Measuring Thoracic Kyphosis and Lumbar Lordosis

    Directory of Open Access Journals (Sweden)

    Mohammad Yousefi

    2012-04-01

    Background: The purpose of this article is to study the validity of each of the non-invasive methods (flexible ruler, spinal mouse, and image processing) against X-ray radiography (the reference method) and to compare them with each other. Materials and Methods: To evaluate the validity of each of these non-invasive methods, the thoracic kyphosis and lumbar lordosis angles of 20 students of Birjand University (age: 26 ± 2 years, weight: 72 ± 2.5 kg, height: 169 ± 5.5 cm) were measured using the four methods of flexible ruler, spinal mouse, image processing and X-ray. Results: The results indicated that the flexible ruler, spinal mouse, and image processing methods, in measuring the thoracic kyphosis and lumbar lordosis angles, respectively showed agreements of 0.81, 0.87, 0.73, 0.76, 0.83 and 0.89 with radiography (p > 0.05). As a result, regarding the validity obtained against the gold-standard X-ray method, it can be stated that the three non-invasive methods have adequate validity. In addition, one-way analysis of variance indicated a meaningful relationship between the three methods of measuring thoracic kyphosis and lumbar lordosis, and, with respect to the Tukey test result, the image processing method is the most precise. Conclusion: The image processing method could therefore be used along with other non-invasive methods as a valid measuring method.

  10. Validation of choice and determination of geotechnology parameters with regard to stress–strain state of rocks

    Science.gov (United States)

    Freidin, AM; Neverov, SA; Neverov, AA; Konurin, AI

    2018-03-01

    The paper illustrates how the type of rock mass stress state, conditioned by the geological and structural features of the rocks, bears on the efficiency and reliability of the design, selection and validation of geotechnology parameters. The authors present calculations of stresses in the rock mass under sublevel stoping, depending on the type of geosphere and on the depth of ore body occurrence.

  11. Program for searching for semiempirical parameters by the MNDO method

    International Nuclear Information System (INIS)

    Bliznyuk, A.A.; Voityuk, A.A.

    1987-01-01

    The authors describe a program for optimizing atomic model parameters in the MNDO method, which varies not only the parameters but also allows simple changes to the calculation scheme. The target function covers properties such as formation enthalpies, dipole moments, ionization potentials, and geometrical parameters. The software used to minimize the target function is based on the Nelder-Mead simplex method and on the Fletcher variable-metric method. The program is written in FORTRAN IV and implemented on the ES computer.

  12. Development and validation of a RP–HPLC method for the quantization studies of metronidazole in tablets and powders dosage forms

    Directory of Open Access Journals (Sweden)

    Elena Gabriela Oltean,

    2011-12-01

    An isocratic high-performance liquid chromatography (HPLC) procedure was developed for the quantitative determination of metronidazole in tablets and powders. HPLC separation was carried out by reversed-phase chromatography on a Kromasil C18 column (250 mm × 4.6 mm i.d.; 5 μm particle size), thermostatted at 25 °C. The mobile phase consisted of methanol/0.1% aqueous phosphoric acid (20/80, v/v), with a flow rate of 1 mL/min and UV detection at 317 nm. In order to validate the method, the following parameters were investigated: linearity (r² = 0.9999), range, precision, accuracy, specificity, limit of detection and limit of quantification. The described method can be successfully applied for the analysis of the active pharmaceutical compound in tablets and powders. This paper aimed to develop and validate a sensitive, applicable HPLC method to determine the quantity of metronidazole in tablets and powders, contributing to the quality and safety control of these types of pharmaceutical preparations.

  13. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via the AutoCAD software and flexible ruler methods. The study was accomplished in 2 parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  14. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    Science.gov (United States)

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

    To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, of which 14 on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Decision limits were also calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' contribute substantially to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations, so that for similar GMO methods in the future only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.
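
    The RSD(r) and RSD(R) figures come from variance components. A minimal sketch for a balanced one-way layout (grouping factor: PCR day), where the classical ANOVA estimators coincide with REML; the paper's model includes additional factors such as DNA isolation, and the measurements below are invented:

```python
import numpy as np

# rows = PCR days, columns = replicate GMO measurements (%), hypothetical
x = np.array([[0.92, 0.95, 0.90],
              [1.01, 0.98, 1.04],
              [0.88, 0.91, 0.86],
              [0.97, 0.99, 0.94]])
a, n = x.shape                                     # a days, n replicates per day

ms_within = np.mean(np.var(x, axis=1, ddof=1))     # pooled within-day MS
ms_between = n * np.var(x.mean(axis=1), ddof=1)    # between-day MS

s2_r = ms_within                                   # repeatability variance
s2_day = max(0.0, (ms_between - ms_within) / n)    # between-day component
s2_R = s2_r + s2_day                               # reproducibility variance

m = x.mean()
print(f"RSD_r = {100 * np.sqrt(s2_r) / m:.1f}%, RSD_R = {100 * np.sqrt(s2_R) / m:.1f}%")
```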

  15. Multi-Objective Parameter Selection for Classifiers

    Directory of Open Access Journals (Sweden)

    Christoph Mussel

    2012-01-01

    Setting the free parameters of classifiers to different values can have a profound impact on their performance. For some methods, specialized tuning algorithms have been developed. These approaches mostly tune parameters according to a single criterion, such as the cross-validation error. However, it is sometimes desirable to obtain parameter values that optimize several concurrent (often conflicting) criteria. The TunePareto package provides a general and highly customizable framework to select optimal parameters for classifiers according to multiple objectives. Several strategies for sampling and optimizing parameters are supplied. The algorithm determines a set of Pareto-optimal parameter configurations and leaves the ultimate decision on the weighting of objectives to the researcher. Decision support is provided by novel visualization techniques.
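
    The core selection step is non-dominated filtering: evaluate each parameter configuration on all objectives and keep the Pareto-optimal set, leaving the weighting to the researcher. A minimal sketch with hypothetical configurations, both objectives to be minimized:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

configs = {               # parameter setting -> (CV error, model complexity)
    "C=0.1": (0.18, 1.0),
    "C=1":   (0.12, 2.0),
    "C=10":  (0.11, 4.0),
    "C=100": (0.11, 8.0),
}

pareto = [name for name, obj in configs.items()
          if not any(dominates(other, obj)
                     for other in configs.values() if other is not obj)]
print(pareto)   # ['C=0.1', 'C=1', 'C=10']; 'C=100' is dominated by 'C=10'
```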

  16. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    The mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research, however, as an important issue, has not been examined as extensively as that of quantitative and qualitative research. Additionally, previous discussions of validity in mixed methods research focus on research…

  17. Validation of Cloud Parameters Derived from Geostationary Satellites, AVHRR, MODIS, and VIIRS Using SatCORPS Algorithms

    Science.gov (United States)

    Minnis, P.; Sun-Mack, S.; Bedka, K. M.; Yost, C. R.; Trepte, Q. Z.; Smith, W. L., Jr.; Painemal, D.; Chen, Y.; Palikonda, R.; Dong, X.; hide

    2016-01-01

    Validation is a key component of remote sensing that can take many different forms. The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) is applied to many different imager datasets, including those from the geostationary satellites Meteosat, Himawari-8, INSAT-3D, GOES, and MTSAT, as well as from the low-Earth-orbiting satellite imagers MODIS, AVHRR, and VIIRS. While each of these imagers has a similar set of channels with wavelengths near 0.65, 3.7, 11, and 12 micrometers, many differences among them can lead to discrepancies in the retrievals. These differences include spatial resolution, spectral response functions, viewing conditions, and calibrations, among others. Even when the imagers are analyzed with nearly identical algorithms, those discrepancies make it necessary to validate the results from each imager separately in order to assess the uncertainties in the individual parameters. This paper presents comparisons of various SatCORPS-retrieved cloud parameters with independent measurements and retrievals from a variety of instruments. These include surface- and space-based lidar and radar data from CALIPSO and CloudSat, respectively, to assess cloud fraction, height, base, optical depth, and ice water path; satellite and surface microwave radiometers to evaluate cloud liquid water path; surface-based radiometers to evaluate optical depth and effective particle size; and airborne in-situ data to evaluate ice water content, effective particle size, and other parameters. The results of these comparisons are contrasted, and the factors influencing the differences are discussed.

  18. Ensemble Kalman filter regularization using leave-one-out data cross-validation

    KAUST Repository

    Rayo Schiappacasse, Lautaro Jerónimo

    2012-09-19

    In this work, the classical leave-one-out cross-validation method for selecting a regularization parameter for the Tikhonov problem is implemented within the EnKF framework. Following the original concept, the regularization parameter is selected such that it minimizes the predictive error. Some ideas about the implementation, suitability and conceptual interest of the method are discussed. Finally, what will be called the data cross-validation regularized EnKF (dCVr-EnKF) is implemented in a 2D 2-phase synthetic oil reservoir experiment and the results analyzed.
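
    A minimal sketch of the selection rule described above, with plain ridge regression on synthetic data standing in for the EnKF update: the Tikhonov parameter is chosen to minimize the leave-one-out predictive error:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(30, 8))
x_true = rng.normal(size=8)
b = A @ x_true + rng.normal(0.0, 0.5, 30)

def ridge(A, b, lam):
    """Tikhonov-regularized least squares."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def loo_error(lam):
    err = 0.0
    for i in range(len(b)):                    # leave observation i out
        mask = np.arange(len(b)) != i
        x = ridge(A[mask], b[mask], lam)
        err += (b[i] - A[i] @ x) ** 2
    return err / len(b)

lams = 10.0 ** np.arange(-4, 3)
best = min(lams, key=loo_error)                # minimize the predictive error
print(f"selected lambda = {best}")
```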

  19. Validation of the method for determination of plutonium isotopes in urine samples and its application in a nuclear facility at Otwock

    Directory of Open Access Journals (Sweden)

    Rzemek Katarzyna

    2015-03-01

    Studies aimed at determining low activities of alpha-emitting radionuclides are widely recognized as essential for human health because of their high radiotoxicity in the case of internal contamination. Some groups of workers at the nuclear facility at Otwock are potentially exposed to contamination with plutonium isotopes. For this reason, a method for the determination of plutonium isotopes has been introduced and validated in the Radiation Protection Measurements Laboratory (LPD) of the National Centre for Nuclear Research (NCBJ). In this method, plutonium is isolated from a sample by coprecipitation with phosphates and separated on an AG 1-X2 resin. After electrodeposition, the sample is measured by alpha spectrometry. Validation was performed in order to assess parameters such as selectivity, accuracy (trueness and precision) and linearity of the method. The results of plutonium determination in urine samples of persons potentially exposed to internal contamination are presented in this work.

  20. A method for model identification and parameter estimation

    International Nuclear Information System (INIS)

    Bambach, M; Heinkenschloss, M; Herty, M

    2013-01-01

    We propose and analyze a new method for the identification of a parameter-dependent model that best describes a given system. This problem arises, for example, in the mathematical modeling of material behavior where several competing constitutive equations are available to describe a given material. In this case, the models are differential equations that arise from the different constitutive equations, and the unknown parameters are coefficients in the constitutive equations. One has to determine the best-suited constitutive equations for a given material and application from experiments. We assume that the true model is one of the N possible parameter-dependent models. To identify the correct model and the corresponding parameters, we can perform experiments, where for each experiment we prescribe an input to the system and observe a part of the system state. Our approach consists of two stages. In the first stage, for each pair of models we determine the experiment, i.e. system input and observation, that best differentiates between the two models, and measure the distance between the two models. Then we conduct N(N − 1) or, depending on the approach taken, N(N − 1)/2 experiments and use the result of the experiments as well as the previously computed model distances to determine the true model. We provide sufficient conditions on the model distances and measurement errors which guarantee that our approach identifies the correct model. Given the model, we identify the corresponding model parameters in the second stage. The problem in the second stage is a standard parameter estimation problem and we use a method suitable for the given application. We illustrate our approach on three examples, including one where the models are elliptic partial differential equations with different parameterized right-hand sides and an example where we identify the constitutive equation in a problem from computational viscoplasticity. (paper)

  1. Development of computer code for determining prediction parameters of radionuclide migration in soil layer

    International Nuclear Information System (INIS)

    Ogawa, Hiromichi; Ohnuki, Toshihiko

    1986-07-01

    A computer code (MIGSTEM-FIT) has been developed to determine the prediction parameters of radionuclide migration in a soil layer (retardation factor, water flow velocity, dispersion coefficient, etc.) from the concentration distribution of the radionuclide in the soil layer or in the effluent. In this code, the solution of the predictive equation for radionuclide migration is compared with the measured concentration distribution, and the most adequate parameter values are determined by the flexible tolerance method. The validity of the finite difference method, one of the methods used to solve the predictive equation, was confirmed by comparison with the analytical solution, and the validity of the fitting method was confirmed by fitting concentration distributions calculated from known parameters. Examination of the errors showed that the error of the parameters obtained with this code was smaller than that of the measured concentration distribution. (author)
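
    A sketch of the fitting loop such a code performs: a model concentration profile is compared with a measured one, and the parameters are searched with a simplex-type method. Here a Gaussian pulse solution of 1D advection-dispersion with retardation stands in for the predictive equation, and scipy's Nelder-Mead simplex stands in for the flexible tolerance method; all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def profile(x, t, R, D, v=0.5, M=1.0):
    """Pulse after time t with retarded velocity v/R and dispersion D/R."""
    vr, Dr = v / R, D / R
    return M / np.sqrt(4 * np.pi * Dr * t) * np.exp(-((x - vr * t) ** 2) / (4 * Dr * t))

x = np.linspace(0.0, 50.0, 120)                      # depth (cm)
t = 100.0                                            # elapsed time (h)
obs = profile(x, t, R=4.0, D=0.2)                    # pretend measurement
obs += np.random.default_rng(3).normal(0.0, 0.002, x.size)

fit = minimize(lambda p: np.sum((obs - profile(x, t, *p)) ** 2),
               x0=[2.0, 0.1], method="Nelder-Mead")
print(fit.x)   # recovers approximately (R, D) = (4.0, 0.2)
```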

  2. Development and validation of a sensitive HPLC method for the quantification of HI-6 in guinea pig plasma and evaluated in domestic swine.

    Science.gov (United States)

    Bohnert, Sara; Vair, Cory; Mikler, John

    2010-05-15

    A rapid, small-volume assay to quantify HI-6 in plasma was developed to further the development and licensing of an intravenous formulation of HI-6. The objective was to develop a sensitive and rapid assay that clearly resolved HI-6 and an internal standard in saline and plasma matrices. A fully validated method using ion-pair HPLC and 2-PAM as the internal standard fulfilled these requirements. Small plasma samples of 35 μL were extracted using acidification, filtration and neutralization. Linearity was shown from 4 μg/mL to 1 mg/mL, with accuracy and precision within 6% relative error at the lower limit of detection. This method was utilized in the pharmacokinetic analysis of HI-6 dichloride (2Cl) and HI-6 dimethanesulfonate (DMS) in anaesthetized guinea pigs and domestic swine following an intravenous bolus administration. From the resultant pharmacokinetic parameters, a target plasma concentration of 100 μM was established and maintained in guinea pigs receiving an intravenous infusion. This validated method allows for the analysis of low-volume samples and increased sample numbers, and is applicable to the determination of pharmacokinetic profiles and parameters. Copyright (c) 2010. Published by Elsevier B.V.

  3. Laboratory diagnostic methods, system of quality and validation

    Directory of Open Access Journals (Sweden)

    Ašanin Ružica

    2005-01-01

    Full Text Available It is known that laboratory investigations secure safe and reliable results that provide a final confirmation of the quality of work. Ideas, planning, knowledge, skills, experience, and environment, along with good laboratory practice, quality control and reliability of quality, make the area of biological investigations very complex. In recent years, quality control, including the control of work in the laboratory, is based on international standards and is used at that level. The implementation of widely recognized international standards, such as the International Standard ISO/IEC 17025 (1 and the implementing of the quality system series ISO/IEC 9000 (2 have become the imperative on the grounds of which laboratories have a formal, visible and corresponding system of quality. The diagnostic methods that are used must constantly yield results which identify the animal as positive or negative, and the precise status of the animal is determined with a predefined degree of statistical significance. Methods applied on a selected population reduce the risk of obtaining falsely positive or falsely negative results. A condition for this are well conceived and documented methods, with the application of the corresponding reagents, and work with professional and skilled staff. This process requires also a consistent implementation of the most rigorous experimental plans, epidemiological and statistical data and estimations, with constant monitoring of the validity of the applied methods. Such an approach is necessary in order to cut down the number of misconceptions and accidental mistakes, for a referent population of animals on which the validity of a method is tested. Once a valid method is included in daily routine investigations, it is necessary to apply constant monitoring for the purpose of internal quality control, in order adequately to evaluate its reproducibility and reliability. Consequently, it is necessary at least twice yearly to conduct

  4. Development and Validation of RP-HPLC Method for the Determination of Adefovir Dipivoxil in Bulk and in Pharmaceutical Formulation

    Directory of Open Access Journals (Sweden)

    Zaheer Ahmed

    2009-01-01

    A rapid and sensitive RP-HPLC method with UV detection (262 nm) for the routine analysis of adefovir dipivoxil in bulk and in pharmaceutical formulation was developed. Chromatography was performed with a mobile phase containing a mixture of acetonitrile and phosphate buffer (50:50, v/v) at a flow rate of 1.0 mL min⁻¹. In the range of 5.0–100 μg/mL, the linearity of adefovir dipivoxil shows a correlation coefficient of 0.9999. The proposed method was validated by determining sensitivity, accuracy, precision, robustness, stability, specificity, selectivity and system suitability parameters.

  5. Validation of an HPLC method for determination of chemical purity of [18F]fluoromisonidazole ([18F]FMISO)

    International Nuclear Information System (INIS)

    Nascimento, Natalia C.E.S.; Oliveira, Mércia L.; Lima, Fernando R.A.; Silveira, Marina B.; Ferreira, Soraya Z.; Silva, Juliana B.

    2017-01-01

    [18F]Fluoromisonidazole ([18F]FMISO) is a nitroimidazole derivative labelled with fluorine-18 that selectively binds to hypoxic cells. It has been shown to be a suitable PET tracer for imaging hypoxia in tumors as well as in noncancerous tissues. [18F]FMISO was prepared using a TRACERlab MX FDG® module (GE) with cassettes, software sequence and reagent kits from ABX. In this work, we aimed to develop and validate a new high performance liquid chromatography (HPLC) method for the determination of the chemical purity of [18F]FMISO. Analyses were performed with an Agilent chromatograph equipped with radioactivity and UV detectors. [18F]FMISO and impurities were separated on a C18 column by gradient elution with water and acetonitrile. Selectivity, linearity, detection limit (DL), quantification limit (QL), precision, accuracy and robustness were assessed to demonstrate that the HPLC method is adequate for its intended purpose. The HPLC method showed good precision, as all RSD values were lower than 5%. Robustness was evaluated considering variations in parameters such as the mobile phase gradient and the flow rate. The results showed that the HPLC method is validated and suitable for the purity evaluation of [18F]FMISO under the operational conditions of our laboratory. As an extension of this work, other analytical methods used for [18F]FMISO quality control should be evaluated, in compliance with good manufacturing practice. (author)

  6. Validation of photosynthetic-fluorescence parameters as biomarkers for isoproturon toxic effect on alga Scenedesmus obliquus

    International Nuclear Information System (INIS)

    Dewez, David; Didur, Olivier; Vincent-Heroux, Jonathan; Popovic, Radovan

    2008-01-01

    Photosynthetic-fluorescence parameters were investigated as valid biomarkers of toxicity when the alga Scenedesmus obliquus was exposed to isoproturon [3-(4-isopropylphenyl)-1,1-dimethylurea]. Chlorophyll fluorescence induction of algal cells treated with isoproturon showed inactivation of photosystem II (PSII) reaction centers and strong inhibition of PSII electron transport. A linear correlation was found (R² ≥ 0.861) between the change in cell density affected by isoproturon and the change in the effective PSII quantum yield (ΦM'), photochemical quenching (qP) and relative photochemical quenching (qP(rel)) values. Cell density was also linearly dependent (R² = 0.838) on the relative unquenched fluorescence parameter (UQF(rel)). A non-linear correlation was found (R² = 0.937) only between cell density and the energy transfer efficiency from absorbed light to the PSII reaction center (ABS/RC). The order of sensitivity determined by the EC-50% was: UQF(rel) > ΦM' > qP > qP(rel) > ABS/RC. Correlations between cell density and these photosynthetic-fluorescence parameters provide supporting evidence for their use as biomarkers of toxicity for environmental pollutants. - Photosynthetic-fluorescence parameters are reliable biomarkers of isoproturon toxicity
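
    A sketch of how an EC-50% figure like those ranked above can be obtained: fit a log-logistic dose-response curve to a normalized fluorescence parameter and read off the concentration producing a 50% effect. The data points below are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(c, ec50, hill):
    """Response falling from 1 to 0 with increasing concentration c."""
    return 1.0 / (1.0 + (c / ec50) ** hill)

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])      # ug/L (hypothetical)
resp = np.array([0.97, 0.93, 0.80, 0.52, 0.30, 0.15, 0.05])  # normalized yield

(ec50, hill), _ = curve_fit(log_logistic, conc, resp, p0=[5.0, 1.0])
print(f"EC50 = {ec50:.2f} ug/L, Hill slope = {hill:.2f}")
```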

  7. Validated RP-HPLC Method for Quantification of Phenolic ...

    African Journals Online (AJOL)

    Purpose: To evaluate the total phenolic content and antioxidant potential of the methanol extracts of aerial parts and roots of Thymus sipyleus Boiss and also to determine some phenolic compounds using a newly developed and validated reversed phase high performance liquid chromatography (RP-HPLC) method.

  8. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked-plasma method for the extraction of diclofenac potassium that reduces the number of treatment steps applied to the plasma sample, in order to minimize human error. Method: Instead of a solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  9. OPTIMIZATION OF LASER CUTTING MACHINE PARAMETERS FOR ROUGHNESS AND CUTTING RATE ON SUS 316L USING THE TAGUCHI GREY RELATIONAL ANALYSIS METHOD

    Directory of Open Access Journals (Sweden)

    Rakasita R

    2016-06-01

    Parameter optimization is used in manufacturing to produce the best possible product. This paper studies the optimization of CNC laser cutting parameters, namely the focus of the laser beam, the cutting gas pressure and the cutting speed, to reduce variation in the surface roughness and cutting rate responses on material SUS 316L. Each parameter has three levels, arranged in an L9(3⁴) orthogonal array, and the experimental data are analyzed with ANOVA based on the Taguchi method. Grey relational analysis is used to jointly optimize for minimum surface roughness and maximum cutting rate in the laser cutting process. Confirmation experiments are used to validate the optimal settings obtained by the Taguchi Grey relational analysis. The results show that Taguchi Grey relational analysis is effective for optimizing the machining parameters of the laser cutting process with multiple responses.
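
    The aggregation at the heart of the method: normalize each response, convert deviations from the ideal into grey relational coefficients, and average them into a grade used to rank the L9 runs. A minimal sketch; the nine response pairs are hypothetical (roughness smaller-is-better, cutting rate larger-is-better):

```python
import numpy as np

rough = np.array([3.2, 2.8, 3.5, 2.6, 3.0, 2.4, 2.9, 3.3, 2.7])   # um
rate = np.array([410, 455, 390, 470, 430, 500, 445, 400, 465])    # mm/min

norm_r = (rough.max() - rough) / (rough.max() - rough.min())   # smaller-is-better
norm_c = (rate - rate.min()) / (rate.max() - rate.min())       # larger-is-better

def grc(norm, zeta=0.5):
    """Grey relational coefficient with distinguishing coefficient zeta."""
    delta = 1.0 - norm                       # deviation from the ideal sequence
    return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

grade = (grc(norm_r) + grc(norm_c)) / 2.0    # equal weights for both responses
print(f"best run: #{np.argmax(grade) + 1}")  # run 6: lowest Ra, highest rate
```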

  10. Rationale and methods of the European Food Consumption Validation (EFCOVAL) Project

    NARCIS (Netherlands)

    Boer, de E.J.; Slimani, N.; Boeing, H.; Feinberg, M.; Leclerq, C.; Trolle, E.; Amiano, P.; Andersen, L.F.; Freisling, H.; Geelen, A.; Harttig, U.; Huybrechts, I.; Kaic-Rak, A.; Lafay, L.; Lillegaard, I.T.L.; Ruprich, J.; Vries, de J.H.M.; Ocke, M.C.

    2011-01-01

    Background/Objectives: The overall objective of the European Food Consumption Validation (EFCOVAL) Project was to further develop and validate a trans-European food consumption method to be used for the evaluation of the intake of foods, nutrients and potentially hazardous chemicals within Europe.

  11. Generalized dislocated lag function projective synchronization of fractional order chaotic systems with fully uncertain parameters

    International Nuclear Information System (INIS)

    Wang, Cong; Zhang, Hong-li; Fan, Wen-hui

    2017-01-01

    In this paper, we propose a new method to improve the safety of secure communication. The method combines generalized dislocated lag projective synchronization and function projective synchronization to form a new generalized dislocated lag function projective synchronization. The fractional-order Chen and Lü systems with uncertain parameters are taken as illustrations. As the parameters of the two systems are uncertain, the nonlinear controller and the parameter update algorithms are designed based on fractional stability theory and the adaptive control method. This synchronization scheme and control method are then applied to secure communication via chaotic masking modulation; many information signals can be recovered and validated. Finally, simulations are used to show the validity and feasibility of the proposed scheme.

  12. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or the β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
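
    As a rough illustration of the generalized-pivotal-quantity idea (not the paper's exact procedure), one can simulate GPQs for the mean and variance of normally distributed assay errors and bound the proportion of future results falling within a total-error limit. The sample statistics, the limit lambda and the 90% coverage criterion below are made-up numbers.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Summary statistics from a (hypothetical) validation experiment
n, xbar, s = 20, 0.8, 1.5      # sample size, mean bias, SD of results
lam = 4.0                      # total-error acceptance limit (same units)

# Generalized pivotal quantities for sigma^2 and mu under a normal model
B = 100_000
gpq_var = (n - 1) * s**2 / rng.chisquare(n - 1, B)
gpq_mu = xbar - rng.standard_normal(B) * np.sqrt(gpq_var / n)

# For each draw, the proportion of future results within +/- lambda of truth
p_within = (stats.norm.cdf((lam - gpq_mu) / np.sqrt(gpq_var))
            - stats.norm.cdf((-lam - gpq_mu) / np.sqrt(gpq_var)))

# Lower generalized confidence bound on that proportion; accept the method
# if, say, at least 90% of future results are expected inside the limit.
lower_bound = np.quantile(p_within, 0.05)
print(f"95% lower bound on coverage: {lower_bound:.3f}; accept: {lower_bound >= 0.90}")
```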

  13. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    Science.gov (United States)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-07-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, and "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.
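
    One performance metric widely used when validating LR methods of this kind is the log-likelihood-ratio cost (Cllr), which penalizes LRs that point away from the true proposition. A minimal sketch with invented LR values:

```python
import numpy as np

def cllr(lr_same, lr_diff):
    """Log-likelihood-ratio cost.
    lr_same: LRs computed for same-source comparison pairs.
    lr_diff: LRs computed for different-source comparison pairs."""
    lr_same = np.asarray(lr_same, dtype=float)
    lr_diff = np.asarray(lr_diff, dtype=float)
    return 0.5 * (np.mean(np.log2(1 + 1 / lr_same)) +
                  np.mean(np.log2(1 + lr_diff)))

# Hypothetical validation sets: a well-calibrated method gives Cllr << 1
print(cllr(lr_same=[20, 150, 8, 60], lr_diff=[0.05, 0.2, 0.01, 0.5]))
```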

  14. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    Science.gov (United States)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption, and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models, including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results for optimal solutions of multiple instances were found efficiently.
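
    For the simplest of the compared designs, a 2-level full factorial, the tuning model is an ordinary linear regression on coded factor levels and their interactions. A sketch under assumed GA factors and made-up tardiness values:

```python
import itertools
import numpy as np

# Coded levels (-1/+1) for three assumed GA parameters: population size,
# crossover rate, mutation rate -- a 2^3 full factorial design.
runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical response: total weighted tardiness after each tuned GA run
y = np.array([812, 790, 805, 760, 798, 772, 801, 741], dtype=float)

# Design matrix with intercept, main effects and two-way interactions
X = np.column_stack([np.ones(len(runs)), runs,
                     runs[:, 0] * runs[:, 1],
                     runs[:, 0] * runs[:, 2],
                     runs[:, 1] * runs[:, 2]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predict over the design corners and pick the best setting (minimization)
best = runs[np.argmin(X @ coef)]
print(f"effects: {np.round(coef, 2)}; best coded setting: {best}")
```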

  15. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
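
    Two of the polygon-level checks described here, closure of the bounding linear ring and planarity within a tolerance, can be sketched as follows; the tolerance values are illustrative choices, not CityDoctor defaults.

```python
import numpy as np

def ring_is_closed(pts, tol=1e-6):
    """A bounding linear ring must end where it starts."""
    return np.linalg.norm(pts[0] - pts[-1]) <= tol

def polygon_is_planar(pts, tol=0.01):
    """Fit a plane through the vertices (SVD) and check the largest
    out-of-plane distance against a tolerance (metres, assumed)."""
    pts = np.asarray(pts, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]                       # direction of least variance
    return np.abs(centered @ normal).max() <= tol

ring = np.array([[0, 0, 0], [4, 0, 0], [4, 3, 0.005], [0, 3, 0], [0, 0, 0]])
print(ring_is_closed(ring), polygon_is_planar(ring))
```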

  16. Content validity of methods to assess malnutrition in cancer patients: a systematic review

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Ottery, Faith D.; van der Schans, Cees; Roodenburg, Jan L N; Jager-Wittenaar, Harriët

    Content validity of methods to assess malnutrition in cancer patients: A systematic review. Rationale: Inadequate operationalisation of the multidimensional concept of malnutrition may result in inadequate evaluation of nutritional status. In this review we aimed to assess the content validity of methods

  17. Development and Validation of a Liquid Chromatographic Method ...

    African Journals Online (AJOL)

    A liquid chromatographic method for the simultaneous determination of six human immunodeficiency virus (HIV) protease inhibitors, indinavir, saquinavir, ritonavir, amprenavir, nelfinavir and lopinavir, was developed and validated. Optimal separation was achieved on a PLRP-S 100 Å, 250 x 4.6 mm I.D. column maintained ...

  18. Deterministic flows of order-parameters in stochastic processes of quantum Monte Carlo method

    International Nuclear Information System (INIS)

    Inoue, Jun-ichi

    2010-01-01

    In terms of the stochastic process of the quantum-mechanical version of the Markov chain Monte Carlo method (MCMC), we analytically derive macroscopically deterministic flow equations of order parameters, such as the spontaneous magnetization, in infinite-range (d = ∞ dimensional) quantum spin systems. By means of the Trotter decomposition, we consider the transition probability of Glauber-type dynamics of microscopic states for the corresponding (d + 1)-dimensional classical system. Under the static approximation, differential equations with respect to the macroscopic order parameters are explicitly obtained from the master equation that describes the microscopic law. In the steady state, we show that the equations are identical to the saddle-point equations for the equilibrium state of the same system. The equation for the dynamical Ising model is recovered in the classical limit. We also check the validity of the static approximation by making use of computer simulations for finite-size systems and discuss several possible extensions of our approach to disordered spin systems for statistical-mechanical informatics. In particular, we use our procedure to evaluate the decoding process of Bayesian image restoration. With the assistance of the concept of dynamical replica theory (DRT), we derive the zero-temperature flow equation of the image restoration measure, which shows 'non-monotonic' behaviour in its time evolution.

  19. Combustion Model and Control Parameter Optimization Methods for Single Cylinder Diesel Engine

    Directory of Open Access Journals (Sweden)

    Bambang Wahono

    2014-01-01

    Full Text Available This research presents a method to construct a combustion model and a method to optimize some control parameters of a diesel engine in order to develop a model-based control system. The purpose of the model is to manage control parameters appropriately so as to obtain the target values of fuel consumption and emissions as the engine output objectives. A stepwise method accounting for multicollinearity was applied to construct the combustion model as a polynomial model. Using experimental data from a single-cylinder diesel engine, models of power, BSFC, NOx and soot for a multiple-injection diesel engine were built. The proposed method successfully developed a model that describes the control parameters in relation to the engine outputs. Although many control devices can be mounted on a diesel engine, an optimization technique is required to utilize this model and find optimal engine operating conditions efficiently, alongside the existing development of individual emission control methods. Particle swarm optimization (PSO) was used to calculate control parameters that optimize fuel consumption and emissions based on the model. The proposed method is able to calculate control parameters efficiently to optimize the evaluation items based on the model. Finally, the model combined with PSO was compiled on a microcontroller.
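
    The optimization step can be illustrated with a bare-bones particle swarm; the quadratic objective below is only a stand-in for the fitted combustion model, and the swarm constants are common textbook choices rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    # Stand-in for the fitted engine model: weighted fuel consumption
    # plus emissions as a function of two coded control parameters.
    return (x[..., 0] - 0.3) ** 2 + 2 * (x[..., 1] + 0.5) ** 2

n, d, iters = 30, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5                   # inertia and acceleration weights
x = rng.uniform(-1, 1, (n, d))              # particle positions (coded units)
v = np.zeros((n, d))
pbest, pbest_f = x.copy(), cost(x)

for _ in range(iters):
    gbest = pbest[np.argmin(pbest_f)]       # best position found so far
    r1, r2 = rng.random((2, n, d))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, -1, 1)
    f = cost(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]

print("optimum ~", pbest[np.argmin(pbest_f)])
```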

  20. Validation of the MCNP-DSP Monte Carlo code for calculating source-driven noise parameters of subcritical systems

    International Nuclear Information System (INIS)

    Valentine, T.E.; Mihalczo, J.T.

    1995-01-01

    This paper describes calculations performed to validate MCNP-DSP, the modified version of the MCNP code, with respect to: the neutron and photon spectra of the spontaneous fission of californium-252; the representation of the detection processes for scattering detectors; the timing of the detection process; and the calculation of the frequency-analysis parameters.

  1. An improved method to estimate reflectance parameters for high dynamic range imaging

    Science.gov (United States)

    Li, Shiying; Deguchi, Koichiro; Li, Renfa; Manabe, Yoshitsugu; Chihara, Kunihiro

    2008-01-01

    Two methods are described to accurately estimate diffuse and specular reflectance parameters for colors, gloss intensity and surface roughness, over the dynamic range of the camera used to capture input images. Neither method needs to segment color areas on an image, or to reconstruct a high dynamic range (HDR) image. The second method improves on the first, bypassing the requirement for specific separation of diffuse and specular reflection components. For the latter method, diffuse and specular reflectance parameters are estimated separately, using the least squares method. Reflection values are initially assumed to be diffuse-only reflection components, and are subjected to the least squares method to estimate diffuse reflectance parameters. Specular reflection components, obtained by subtracting the computed diffuse reflection components from reflection values, are then subjected to a logarithmically transformed equation of the Torrance-Sparrow reflection model, and specular reflectance parameters for gloss intensity and surface roughness are finally estimated using the least squares method. Experiments were carried out using both methods, with simulation data at different saturation levels, generated according to the Lambert and Torrance-Sparrow reflection models, and the second method, with spectral images captured by an imaging spectrograph and a moving light source. Our results show that the second method can estimate the diffuse and specular reflectance parameters for colors, gloss intensity and surface roughness more accurately and faster than the first one, so that colors and gloss can be reproduced more efficiently for HDR imaging.
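
    The second method's two-stage estimation can be sketched as follows: treat all reflection values as diffuse and fit a Lambertian term by least squares, subtract it, then fit the log-transformed specular lobe linearly. The simplified Torrance-Sparrow lobe and the synthetic angles and intensities are assumptions made for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic measurements: incidence angle, half-angle alpha, observed intensity
theta_i = rng.uniform(0, np.pi / 3, 200)
alpha = rng.uniform(0, 0.5, 200)
kd_true, ks_true, sigma_true = 0.6, 0.9, 0.15
I = kd_true * np.cos(theta_i) + ks_true * np.exp(-alpha**2 / (2 * sigma_true**2))

# Stage 1: assume diffuse-only reflection and fit kd by least squares
A = np.cos(theta_i)[:, None]
kd_hat = np.linalg.lstsq(A, I, rcond=None)[0][0]

# Stage 2: subtract the diffuse part; fit the log-transformed specular lobe,
# log(spec) = log(ks) - alpha^2 / (2 sigma^2), which is linear in alpha^2
spec = I - kd_hat * np.cos(theta_i)
mask = spec > 1e-3                     # keep points with measurable specularity
b, a = np.polyfit(alpha[mask] ** 2, np.log(spec[mask]), 1)
ks_hat, sigma_hat = np.exp(a), np.sqrt(-1 / (2 * b))

print(kd_hat, ks_hat, sigma_hat)
```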

  2. Ensemble Kalman filter regularization using leave-one-out data cross-validation

    KAUST Repository

    Rayo Schiappacasse, Lautaro Jerónimo; Hoteit, Ibrahim

    2012-01-01

    In this work, the classical leave-one-out cross-validation method for selecting a regularization parameter for the Tikhonov problem is implemented within the EnKF framework. Following the original concept, the regularization parameter is selected
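
    The underlying idea, selecting the Tikhonov regularization parameter by leave-one-out cross-validation, can be sketched outside the EnKF setting using the closed-form LOO residuals of a linear smoother; the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 10))
x_true = rng.standard_normal(10)
b = A @ x_true + 0.5 * rng.standard_normal(40)

def loo_error(lam):
    # Hat matrix of the Tikhonov/ridge solution
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T)
    resid = b - H @ b
    return np.mean((resid / (1 - np.diag(H))) ** 2)  # closed-form LOO residuals

lams = np.logspace(-3, 2, 30)
best = lams[np.argmin([loo_error(l) for l in lams])]
print(f"selected regularization parameter: {best:.4f}")
```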

  3. Estimation of Critical Parameters in Concrete Production Using Multispectral Vision Technology

    DEFF Research Database (Denmark)

    Hansen, Michael Edberg; Ersbøll, Bjarne Kjær; Carstensen, Jens Michael

    2005-01-01

    We analyze multispectral reflectance images of concrete aggregate material and design computational measures of the important and critical parameters used in concrete production. The features extracted from the images are exploited as explanatory variables in regression models and used to predict aggregate type, water content, and size distribution. We analyze and validate the methods on five representative aggregate types commonly used in concrete production. Using cross-validation, the generated models prove to have a high performance in predicting all of the critical parameters.

  4. Validity criteria for the diagnosis of fatty liver by M probe-based controlled attenuation parameter.

    Science.gov (United States)

    Wong, Vincent Wai-Sun; Petta, Salvatore; Hiriart, Jean-Baptiste; Cammà, Calogero; Wong, Grace Lai-Hung; Marra, Fabio; Vergniol, Julien; Chan, Anthony Wing-Hung; Tuttolomondo, Antonino; Merrouche, Wassil; Chan, Henry Lik-Yuen; Le Bail, Brigitte; Arena, Umberto; Craxì, Antonio; de Lédinghen, Victor

    2017-09-01

    Controlled attenuation parameter (CAP) can be performed together with liver stiffness measurement (LSM) by transient elastography (TE) and is often used to diagnose fatty liver. We aimed to define the validity criteria of CAP. CAP was measured by the M probe prior to liver biopsy in 754 consecutive patients with different liver diseases at three centers in Europe and Hong Kong (derivation cohort, n = 340; validation cohort, n = 414; 101 chronic hepatitis B, 154 chronic hepatitis C, 349 non-alcoholic fatty liver disease, 37 autoimmune hepatitis, 49 cholestatic liver disease, 64 others; 277 F3-4; age 52 ± 14; body mass index 27.2 ± 5.3 kg/m²). The primary outcome was the diagnosis of fatty liver, defined as steatosis involving ≥5% of hepatocytes. The area under the receiver-operating characteristic curve (AUROC) for CAP diagnosis of fatty liver was 0.85 (95% CI 0.82-0.88). The interquartile range (IQR) of CAP had a negative correlation with CAP (r = -0.32, p < 0.001). The accuracy of CAP for the diagnosis of steatosis was lower among patients with body mass index ≥30 kg/m² and F3-4 fibrosis. The validity of CAP for the diagnosis of fatty liver is lower if the IQR of CAP is ≥40 dB/m. Lay summary: Controlled attenuation parameter (CAP) is measured by transient elastography (TE) for the detection of fatty liver. In this large study, using liver biopsy as a reference, we show that the variability of CAP measurements, based on its interquartile range, can reflect the accuracy of fatty liver diagnosis. In contrast, other clinical factors such as adiposity and liver enzyme levels do not affect the performance of CAP. Copyright © 2017 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  5. New clinical validation method for automated sphygmomanometer: a proposal by Japan ISO-WG for sphygmomanometer standard.

    Science.gov (United States)

    Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio

    2007-12-01

    Owing to fast or stepwise cuff deflation, or measuring at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, that is, two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for the suspect accuracy in lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing its immunity to LD and BPV with those of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N = 120, 45 ± 15.3 years (mean ± SD)]. An oscillometric automated monitor, Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting their higher reproducibility of validation. The SD1 for method 2 correlated significantly with the participant's BP (P = 0.004), supporting our hypothesis that the increased SD of device error under method 2 is at least partially caused by essential BPV. Method 3 showed a significantly (P = 0.0044) smaller inter-participant SD of device error (SD2), suggesting its higher inter-participant consistency of validation. Among the methods of validation of the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest inter-participant consistency, can be proposed as the most appropriate.

  6. Planet Candidate Validation in K2 Crowded Fields

    Science.gov (United States)

    Rampalli, Rayna; Vanderburg, Andrew; Latham, David; Quinn, Samuel

    2018-01-01

    In just three years, the K2 mission has yielded some remarkable outcomes, with the discovery of over 100 confirmed planets and 500 reported planet candidates to be validated. One challenge for this mission is the search for planets located in star-crowded regions. Campaign 13 is one such example, located towards the galactic plane in the constellation of Taurus. We subject the potential planetary candidates to a validation process involving spectroscopy to derive certain stellar parameters. Seeing-limited on/off imaging follow-up is also utilized in order to rule out false positives due to nearby eclipsing binaries. Using Markov chain Monte Carlo analysis, the best-fit parameters for each candidate are generated. These are suitable for finding a candidate's false positive probability through methods such as feeding the parameters into the Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA). These techniques and results serve as important tools for conducting candidate validation and follow-up observations for space-based missions such as the upcoming TESS mission, since TESS's large camera pixels will produce star-crowded fields resembling K2's.

  7. Validation of an HPLC-UV method for the identification and quantification of bioactive amines in chicken meat

    Directory of Open Access Journals (Sweden)

    D.C.S. Assis

    2016-06-01

    Full Text Available A high-performance liquid chromatography with ultraviolet detection (HPLC-UV) method was validated for the study of bioactive amines in chicken meat. A gradient elution system with an ultraviolet detector was used after extraction with trichloroacetic acid and pre-column derivatization with dansyl chloride. Putrescine, cadaverine, histamine, tyramine, spermidine and spermine standards were used for the evaluation of the following performance parameters: selectivity, linearity, precision, recovery, limit of detection, limit of quantification and ruggedness. The results indicated excellent selectivity and separation of all amines, a coefficient of determination greater than 0.99, and recovery from 92.25 to 102.25% at a concentration of 47.2 mg kg⁻¹, with a limit of detection of 0.3 mg kg⁻¹ and a limit of quantification of 0.9 mg kg⁻¹ for all amines, with the exception of histamine, which exhibited a limit of quantification of 1 mg kg⁻¹. In conclusion, the performance parameters demonstrated the adequacy of the method for the detection and quantification of bioactive amines in chicken meat.

  8. Validation of methods for determination of free water content in poultry meat

    Directory of Open Access Journals (Sweden)

    Jarmila Žítková

    2007-01-01

    Full Text Available Methods for the determination of free water content in poultry meat are described in Commission Regulation EEC No 1538/91 as amended and in ČSN 57 3100. Two of them (methods A and D) were validated under the conditions of a Czech poultry processing plant. The slaughter capacity was 6,000 birds per hour and carcasses were air-chilled with spraying. All determinations were carried out in the plant's laboratory and in the laboratory of the Institute of Food Technology. Method A was used to detect the amount of water lost from frozen chickens during thawing under controlled conditions. Twenty carcasses from each of six weight groups (900 g–1,400 g) were tested. The average thaw-loss water contents ranged between 0.46% and 1.71%; the average value over all 120 samples was 1.16%. The results were compared with the required maximum limit value of 3.3%. The water loss content was negatively correlated with chicken weight (r = –0.56). Method D (chemical test) was applied to determine the total water content of certain poultry cuts. It involved the determination of the water and protein contents of 62 representative samples in total. The average ratios of water weight to protein weight (WA/RPA) were 3.29 in breast fillets, 4.06 in legs with a portion of the back, 4.00 in legs, 3.85 in thighs and 4.10 in drumsticks. The results corresponded to the required limit values of 3.40 for breast fillets and 4.15 for leg cuts. The WA/RPA ratio was correlated with chicken weight negatively for breast fillets (r = –0.61) and positively for leg cuts (r = 0.70). The different correlations can be explained by the distribution of water, protein and fat in the carcasses. The evaluation of the methods in terms of the percentage ratio of the average value to the limit showed that method D (results at 97% of the limit) was more exact than method A (results at 32% of the limit), but it is more expensive. Both methods

  9. Validation of Bayesian analysis of compartmental kinetic models in medical imaging.

    Science.gov (United States)

    Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M

    2016-10-01

    Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed form of the posterior distribution of the kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and the covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using the methods proposed in this work are accurate, as no significant deviation from the expected shape of the posterior was found (one-sided P > 0.08). It is demonstrated that the results obtained by the standard non-linear least squares method fail to provide an accurate estimation of uncertainty for the same data set (P < 0.0001). The results of this work validate the new methods using computer simulations of FDG kinetics. The results show that, in situations where the classical approach fails to estimate uncertainty accurately, Bayesian estimation provides accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended to different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
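
    A toy version of the sampling step (much simpler than the paper's hierarchical model): random-walk Metropolis sampling of the posterior of a one-compartment washout model, with a flat positivity prior and a known noise level, all assumed here for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic time-activity curve: C(t) = A * exp(-k t) + noise
t = np.linspace(0.5, 30, 25)
A_true, k_true, noise_sd = 10.0, 0.15, 0.4
y = A_true * np.exp(-k_true * t) + noise_sd * rng.standard_normal(t.size)

def log_post(theta):
    amp, k = theta
    if amp <= 0 or k <= 0:
        return -np.inf                    # flat prior on the positive quadrant
    resid = y - amp * np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / noise_sd**2

# Random-walk Metropolis
theta = np.array([8.0, 0.1])
lp = log_post(theta)
chain = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [0.2, 0.01])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

chain = np.array(chain[5000:])            # discard burn-in
print("posterior mean:", chain.mean(axis=0), "SD:", chain.std(axis=0))
```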

  10. [Data validation methods and discussion on Chinese materia medica resource survey].

    Science.gov (United States)

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    From the beginning of the fourth national survey of the Chinese materia medica resources, 22 provinces have conducted pilot surveys. The survey teams have reported an immense volume of data, which places very high demands on the construction of the database system. In order to ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method to ensure the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the fourth national survey of the Chinese materia medica resources database system, and further improves the design ideas and programs of data validation. The purpose of this study is to promote the smooth progress of the survey work.

  11. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches, such as dynamical downscaling, statistical downscaling and bias correction, have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of the research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  12. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Full Text Available Abstract Background Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable 5-step pre-validation PRO instrument methodology. Methods The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: (1) generation of a pool of items; (2) item de-duplication (three phases); (3) item reduction (two phases); (4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. ICF); and (5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results The AGQ 'item pool' contained 725 items. Three de-duplication phases resulted in reductions of 91, 225 and 48 items respectively. The two item-reduction phases discarded 70 and 208 items respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration ('think aloud' study) resulted in removal of a further 15 items and refinement of the wording of others. The resultant draft AGQ contained 68 items. Conclusions This study presents a novel methodology for developing a PRO instrument, based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument. By systematically accounting for all items dropped

  13. Validation of photosynthetic-fluorescence parameters as biomarkers for isoproturon toxic effect on alga Scenedesmus obliquus.

    Science.gov (United States)

    Dewez, David; Didur, Olivier; Vincent-Héroux, Jonathan; Popovic, Radovan

    2008-01-01

    Photosynthetic-fluorescence parameters were investigated as valid biomarkers of toxicity when the alga Scenedesmus obliquus was exposed to isoproturon [3-(4-isopropylphenyl)-1,1-dimethylurea]. Chlorophyll fluorescence induction of algal cells treated with isoproturon showed inactivation of photosystem II (PSII) reaction centers and strong inhibition of PSII electron transport. A linear correlation was found (R² ≥ 0.861) between the change of cell density affected by isoproturon and the change of the effective PSII quantum yield (ΦM'), photochemical quenching (qP) and relative photochemical quenching (qP(rel)) values. The cell density was also linearly dependent (R² = 0.838) on the relative unquenched fluorescence parameter (UQF(rel)). A non-linear correlation was found (R² = 0.937) only between cell density and the energy-transfer efficiency from absorbed light to the PSII reaction center (ABS/RC). The order of sensitivity determined by the EC-50% was: UQF(rel) > ΦM' > qP > qP(rel) > ABS/RC. Correlations between cell density and these photosynthetic-fluorescence parameters provide supporting evidence for their use as biomarkers of toxicity for environmental pollutants.

  14. A simple method for determining the lattice parameter and chemical composition in ternary bcc-Fe rich nanocrystals

    International Nuclear Information System (INIS)

    Moya, Javier A.; Gamarra Caramella, Soledad; Marta, Leonardo J.; Berejnoi, Carlos

    2015-01-01

    Highlights: • A method for determining composition in ternary nanocrystals is presented. • X-ray diffraction and Mössbauer spectroscopy data were employed. • We provide theoretical charts of the lattice parameter of Fe-rich ternary alloys. • A linear relationship in lattice parameter for binary alloys is evaluated. • A parabolic relationship is proposed for the Fe–Co–Si alloy. - Abstract: Charts containing lattice parameters of Fe₁₋ₓ(M,N)ₓ ternary systems, with M and N = Si, Al, Ge or Co and 0 ⩽ x ⩽ ∼0.3, were developed by applying a linear relationship between the respective binary alloys at the same solute content as the ternary one. The charts were validated with experimental data obtained from the literature. For the Fe–Co–Si system, the linear relationship does not fit the experimental data. For the other systems (except Fe–Co–Ge, for which no experimental data were found), the linear relationship constitutes a very good approximation. Using these charts, the lattice parameter obtained from the X-ray diffraction technique, combined with the solute content obtained from the Mössbauer spectroscopy technique, makes it possible to determine the chemical composition of nanograins in soft magnetic nanocomposite materials; some examples are provided.

  15. A simple method for determining the lattice parameter and chemical composition in ternary bcc-Fe rich nanocrystals

    Energy Technology Data Exchange (ETDEWEB)

    Moya, Javier A., E-mail: jmoya.fi.uba@gmail.com [Grupo Interdisciplinario en Materiales-IESIING, Universidad Católica de Salta, INTECIN UBA-CONICET, Salta (Argentina); Gamarra Caramella, Soledad; Marta, Leonardo J. [Grupo Interdisciplinario en Materiales-IESIING, Universidad Católica de Salta, INTECIN UBA-CONICET, Salta (Argentina); Berejnoi, Carlos [Universidad Nacional de Salta, Facultad de Ingeniería, Salta (Argentina)

    2015-05-15

    Highlights: • A method for determining composition in ternary nanocrystals is presented. • X-ray diffraction and Mössbauer spectroscopy data were employed. • We provide theoretical charts of the lattice parameter of Fe-rich ternary alloys. • A linear relationship in lattice parameter for binary alloys is evaluated. • A parabolic relationship is proposed for the Fe–Co–Si alloy. - Abstract: Charts containing lattice parameters of Fe₁₋ₓ(M,N)ₓ ternary systems, with M and N = Si, Al, Ge or Co and 0 ⩽ x ⩽ ∼0.3, were developed by applying a linear relationship between the respective binary alloys at the same solute content as the ternary one. The charts were validated with experimental data obtained from the literature. For the Fe–Co–Si system, the linear relationship does not fit the experimental data. For the other systems (except Fe–Co–Ge, for which no experimental data were found), the linear relationship constitutes a very good approximation. Using these charts, the lattice parameter obtained from the X-ray diffraction technique, combined with the solute content obtained from the Mössbauer spectroscopy technique, makes it possible to determine the chemical composition of nanograins in soft magnetic nanocomposite materials; some examples are provided.
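
    The chart construction described above amounts to a linear mix of two binary lattice-parameter curves at the same total solute content, which can then be inverted to recover the composition. A sketch with invented binary calibration slopes (the real charts rely on published binary-alloy data; only the pure-iron value 2.8664 Å is a known constant):

```python
def a_FeSi(x):
    """Hypothetical Fe-Si binary lattice parameter in angstroms."""
    return 2.8664 - 0.12 * x   # illustrative slope

def a_FeAl(x):
    """Hypothetical Fe-Al binary lattice parameter in angstroms."""
    return 2.8664 + 0.18 * x   # illustrative slope

def a_ternary(x_total, frac_si):
    """Linear interpolation between the binaries at the same total
    solute content, as proposed for most Fe-rich ternary systems."""
    return frac_si * a_FeSi(x_total) + (1 - frac_si) * a_FeAl(x_total)

def solute_fraction(a_measured, x_total):
    """Invert: given a lattice parameter (from XRD) and the total solute
    content (from Moessbauer spectroscopy), recover the Si fraction."""
    return (a_measured - a_FeAl(x_total)) / (a_FeSi(x_total) - a_FeAl(x_total))

print(solute_fraction(a_ternary(0.2, 0.4), 0.2))  # recovers 0.4
```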

  16. SCoPE: an efficient method of Cosmological Parameter Estimation

    International Nuclear Information System (INIS)

    Das, Santanu; Souradeep, Tarun

    2014-01-01

    The Markov chain Monte Carlo (MCMC) sampler is widely used for cosmological parameter estimation from CMB and other data. However, due to the intrinsically serial nature of the MCMC sampler, convergence is often very slow. Here we present a fast, independently written Monte Carlo method for cosmological parameter estimation named the Slick Cosmological Parameter Estimator (SCoPE), which employs delayed rejection to increase the acceptance rate of a chain, and pre-fetching that helps an individual chain run on parallel CPUs. An inter-chain covariance update is also incorporated to prevent clustering of the chains, allowing faster and better mixing. We use an adaptive method for covariance calculation to compute and update the covariance automatically as the chains progress. Our analysis shows that the acceptance probability of each step in SCoPE is more than 95% and that the chains converge faster. Using SCoPE, we carry out cosmological parameter estimation with different cosmological models using WMAP-9 and Planck results. One of the current research interests in cosmology is quantifying the nature of dark energy. We analyze the cosmological parameters for two illustrative, commonly used parameterisations of dark energy models. We also assess whether the primordial helium fraction in the universe can be constrained by the present CMB data from WMAP-9 and Planck. The results from our MCMC analysis on the one hand help us to understand the workability of SCoPE better; on the other hand they provide a completely independent estimation of cosmological parameters from WMAP-9 and Planck data.
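
    One ingredient described here, updating the proposal covariance automatically as the chains progress, can be sketched with the standard recursive mean/covariance update; the target distribution and the adaptation comment below are illustrative, not SCoPE's actual schedule.

```python
import numpy as np

def update_running_cov(n, mean, cov, x):
    """Recursively update the sample mean and (unbiased) covariance with a
    new draw x; n is the number of draws already absorbed."""
    delta = x - mean
    new_mean = mean + delta / (n + 1)
    new_cov = ((n - 1) * cov + np.outer(delta, x - new_mean)) / n if n >= 1 else cov
    return n + 1, new_mean, new_cov

# Usage inside an adaptive sampler: rescale the proposal every few hundred
# steps with the running covariance (the 2.38^2/d scaling is a usual choice).
rng = np.random.default_rng(5)
n, mean, cov = 0, np.zeros(2), np.eye(2)
for _ in range(1000):
    draw = rng.multivariate_normal([1.0, -2.0], [[1.0, 0.6], [0.6, 2.0]])
    n, mean, cov = update_running_cov(n, mean, cov, draw)
print(np.round(cov, 2))  # approaches the target covariance
```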

  17. Optimization and validation of high-performance liquid chromatography method for analyzing 25-desacetyl rifampicin in human urine

    Science.gov (United States)

    Lily; Laila, L.; Prasetyo, B. E.

    2018-03-01

    A selective, reproducible, effective, sensitive, simple and fast high-performance liquid chromatography (HPLC) method was developed, optimized and validated to analyze 25-desacetyl rifampicin (25-DR) in the urine of tuberculosis patients. The separation was performed on an Agilent Technologies HPLC with an Agilent Eclipse XDB-C18 column and a mobile phase of 65:35 v/v methanol : 0.01 M sodium phosphate buffer pH 5.2, at 254 nm and a flow rate of 0.8 mL/min. The mean retention time was 3.016 minutes. The method was linear from 2–10 μg/mL 25-DR with a correlation coefficient of 0.9978. The standard deviation, relative standard deviation and coefficient of variation for 2, 6 and 10 μg/mL 25-DR were 0–0.0829, 0–3.1752 and 0–0.0317%, respectively. The recovery of 5, 7 and 9 μg/mL 25-DR was 80.8661, 91.3480 and 111.1457%, respectively. The limits of detection (LoD) and quantification (LoQ) were 0.51 and 1.7 μg/mL, respectively. The method fulfils the International Conference on Harmonization (ICH) guidelines for bioanalytical method validation, which include the parameters of specificity, linearity, precision, accuracy, LoD and LoQ. The developed method is suitable for the pharmacokinetic analysis of various concentrations of 25-DR in human urine.
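
    Linearity and detection-limit figures of the kind reported here follow from a standard calibration-curve computation; a sketch using the ICH formulas LoD = 3.3 σ/S and LoQ = 10 σ/S, with made-up calibration data:

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. peak area
conc = np.array([2, 4, 6, 8, 10], dtype=float)
area = np.array([41.1, 80.3, 122.5, 160.9, 201.8])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, area)[0, 1]

lod = 3.3 * resid_sd / slope   # ICH Q2 limit of detection
loq = 10.0 * resid_sd / slope  # ICH Q2 limit of quantification
print(f"r = {r:.4f}, LoD = {lod:.2f} ug/mL, LoQ = {loq:.2f} ug/mL")
```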

  18. Development and validation of reversed-phase HPLC gradient method for the estimation of efavirenz in plasma.

    Directory of Open Access Journals (Sweden)

    Shweta Gupta

    Full Text Available Efavirenz is an antiviral agent of the non-nucleoside reverse transcriptase inhibitor category, used as part of highly active antiretroviral therapy for the treatment of infections with human immunodeficiency virus type 1. A simple, sensitive and rapid reversed-phase high performance liquid chromatographic gradient method was developed and validated for the determination of efavirenz in plasma. The method used a Waters X-Terra Shield RP18 50 x 4.6 mm, 3.5 μm column and a mobile phase consisting of phosphate buffer pH 3.5 and acetonitrile. The eluate was monitored with a UV-visible detector at 260 nm at a flow rate of 1.5 mL/min. Tenofovir disoproxil fumarate was used as the internal standard. The method was validated for linearity, precision, accuracy, specificity and robustness, and the data obtained were statistically analyzed. The calibration curve was found to be linear over the concentration range of 1-300 μg/mL. The retention times of efavirenz and tenofovir disoproxil fumarate (internal standard) were 5.941 min and 4.356 min, respectively. The regression coefficient value was found to be 0.999. The limit of detection and the limit of quantification obtained were 0.03 and 0.1 μg/mL, respectively. The developed HPLC method can be useful for the quantitative determination of pharmacokinetic parameters of efavirenz in plasma.

  19. An Automatic Parameter Identification Method for a PMSM Drive with LC-Filter

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Christensen, Jeppe Haals; Weber, Magnus L.

    2016-01-01

    of the PMSM fed through an LC-filter. Based on the measured current response, model parameters for both the filter (L, R, C) and the PMSM (L and R) are estimated: first, the frequency response of the system is estimated using the Welch modified periodogram method, and then an optimization algorithm is used to find the parameters of an analytical reference model that minimize the model error. To demonstrate the practical feasibility of the method, a fully functional drive including an embedded real-time controller has been built. In addition to modulation, data acquisition and control, the whole parameter identification method is also implemented on the real-time controller. Based on laboratory experiments on a 22 kW drive, it is concluded that the embedded identification method can estimate the five parameters in less than ten seconds.

  20. Iterative method of the parameter variation for solution of nonlinear functional equations

    International Nuclear Information System (INIS)

    Davidenko, D.F.

    1975-01-01

    The iteration method of parameter variation is used for solving nonlinear functional equations in Banach spaces. The authors consider some methods for the numerical integration of ordinary first-order differential equations and construct the corresponding iteration methods of parameter variation, both single- and multi-factor. They also discuss the mathematical substantiation of the method, study the conditions and rate of convergence, and estimate the error. The paper considers the application of the method to specific functional equations.

  1. Development, validation and evaluation of an analytical method for the determination of monomeric and oligomeric procyanidins in apple extracts.

    Science.gov (United States)

    Hollands, Wendy J; Voorspoels, Stefan; Jacobs, Griet; Aaby, Kjersti; Meisland, Ane; Garcia-Villalba, Rocio; Tomas-Barberan, Francisco; Piskula, Mariusz K; Mawson, Deborah; Vovk, Irena; Needs, Paul W; Kroon, Paul A

    2017-04-28

    There is a lack of data for individual oligomeric procyanidins in apples and apple extracts. Our aim was to develop, validate and evaluate an analytical method for the separation, identification and quantification of monomeric and oligomeric flavanols in apple extracts. To achieve this, we prepared two types of flavanol extracts from freeze-dried apples: one was an epicatechin-rich extract containing ∼30% (w/w) monomeric (-)-epicatechin which also contained oligomeric procyanidins (Extract A); the second was an oligomeric procyanidin-rich extract depleted of epicatechin (Extract B). The parameters considered for method optimisation were HPLC columns and conditions, sample heating, mass of extract and dilution volumes. The performance characteristics considered for method validation included standard linearity, method sensitivity, precision and trueness. Eight laboratories participated in the method evaluation. Chromatographic separation of the analytes was best achieved using a HILIC column with a binary mobile phase consisting of acidic acetonitrile and acidic aqueous methanol. The final method showed linearity for epicatechin in the range 5–100 μg/mL with a correlation coefficient >0.999. Intra-day and inter-day precision of the analytes ranged from 2 to 6% and 2 to 13%, respectively. Up to a degree of polymerization (dp) of 3, the trueness of the method was >95%, but it decreased with increasing dp. Within-laboratory precision showed RSD values <5% and <10% for monomers and oligomers, respectively. Between-laboratory precision was 4 and 15% (Extract A) and 7 and 30% (Extract B) for monomers and oligomers, respectively. An analytical method for the separation, identification and quantification of procyanidins in an apple extract was developed, validated and assessed. The results of the inter-laboratory evaluation indicate that the method is reliable and reproducible. Copyright © 2017. Published by Elsevier B.V.

  2. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

    Biogas produced from digesters or landfills contains impurities which can be harmful to components that come into contact with the biogas during its utilization. Among these, the siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on the heated surfaces of combustion equipment. Silicon dioxide is a solid compound that remains in the engine and causes damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogases. In the first part of this report, a method for the analysis of siloxanes in biogases was validated. Sampling was performed directly at the plant by drawing a small volume of biogas onto an adsorbent tube over a short period of time. The tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for purpose. This means that the method, as used by the laboratory generating the data, will provide data that meet a set of criteria concerning precision and accuracy. Finally, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected at wastewater treatment plants, co-digestion plants and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from wastewater treatment plants contained considerably higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low, regardless of which feedstock was digested and which upgrading technique was used.

  3. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case. Empirical validation has a residual character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model, and it can also serve as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: it can be performed with a DSA, differential sensitivity analysis, or with a MCSA, Monte-Carlo sensitivity analysis. Search for the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis has been carried out in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings in Spain is presented, studying the behavior of building components in a test cell at the LECE of CIEMAT. (Author) 17 refs

  4. Validation of photosynthetic-fluorescence parameters as biomarkers for isoproturon toxic effect on alga Scenedesmus obliquus

    Energy Technology Data Exchange (ETDEWEB)

    Dewez, David; Didur, Olivier; Vincent-Heroux, Jonathan [University of Quebec in Montreal, Department of Chemistry, Environmental Toxicology Research Center - TOXEN, 2101, Jeanne-Mance, Montreal, Quebec H2X 2J6 (Canada); Popovic, Radovan [University of Quebec in Montreal, Department of Chemistry, Environmental Toxicology Research Center - TOXEN, 2101, Jeanne-Mance, Montreal, Quebec H2X 2J6 (Canada)], E-mail: popovic.radovan@uqam.ca

    2008-01-15

    Photosynthetic-fluorescence parameters were investigated as valid biomarkers of toxicity when the alga Scenedesmus obliquus was exposed to isoproturon [3-(4-isopropylphenyl)-1,1-dimethylurea]. Chlorophyll fluorescence induction of algal cells treated with isoproturon showed inactivation of photosystem II (PSII) reaction centers and strong inhibition of PSII electron transport. A linear correlation was found (R² ≥ 0.861) between the change of cell density affected by isoproturon and the change of the effective PSII quantum yield (ΦM'), photochemical quenching (qP) and relative photochemical quenching (qP(rel)) values. The cell density was also linearly dependent (R² = 0.838) on the relative unquenched fluorescence parameter (UQF(rel)). A non-linear correlation was found (R² = 0.937) only between cell density and the energy-transfer efficiency from absorbed light to the PSII reaction center (ABS/RC). The order of sensitivity determined by the EC-50% was: UQF(rel) > ΦM' > qP > qP(rel) > ABS/RC. Correlations between cell density and these photosynthetic-fluorescence parameters provide supporting evidence for their use as biomarkers of toxicity for environmental pollutants. - Photosynthetic-fluorescence parameters are reliable biomarkers of isoproturon toxicity.

  5. A Novel Nonlinear Parameter Estimation Method of Soft Tissues

    Directory of Open Access Journals (Sweden)

    Qianqian Tong

    2017-12-01

    Full Text Available The elastic parameters of soft tissues are important for medical diagnosis and virtual surgery simulation. In this study, we propose a novel nonlinear parameter estimation method for soft tissues. Firstly, an in-house data acquisition platform was used to obtain external forces and their corresponding deformation values. To provide highly precise data for estimating nonlinear parameters, the measured forces were corrected using a weighted combination forecasting model based on a support vector machine (WCFM_SVM). Secondly, a tetrahedral finite element parameter estimation model was established to describe the physical characteristics of soft tissues, using substitution parameters for Young's modulus and Poisson's ratio to avoid solving complicated nonlinear problems. To improve the robustness of the model and avoid poor local minima, the initial parameters solved by a linear finite element model were introduced into the parameter estimation model. Finally, a self-adapting Levenberg-Marquardt (LM) algorithm was presented, which is capable of adaptively adjusting the iteration parameters to solve the established parameter estimation model. The maximum absolute error of our WCFM_SVM model was less than 0.03 N, resulting in more accurate forces in comparison with the other correction models tested. The maximum absolute error between the calculated and measured nodal displacements was less than 1.5 mm, demonstrating that our nonlinear parameters are precise.
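
    The self-adapting damping idea in LM can be sketched with the classical accept/reject schedule for the damping factor; the power-law force-deformation model below is only a stand-in for the paper's finite element model, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# Stand-in nonlinear force-deformation model: F = a * d**b
d = np.linspace(0.5, 5, 30)
a_true, b_true = 2.0, 1.6
F = a_true * d**b_true * (1 + 0.02 * rng.standard_normal(d.size))

def residuals(p):
    return F - p[0] * d ** p[1]

def jacobian(p):
    return np.column_stack([-d ** p[1], -p[0] * d ** p[1] * np.log(d)])

# Levenberg-Marquardt with a self-adapting damping factor lam
p, lam = np.array([1.0, 1.0]), 1e-3
for _ in range(50):
    r, J = residuals(p), jacobian(p)
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if np.sum(residuals(p + step) ** 2) < np.sum(r ** 2):
        p, lam = p + step, lam / 3       # accept step, trust the model more
    else:
        lam *= 3                         # reject step, increase damping
print("estimated parameters:", np.round(p, 3))
```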

  6. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data was extracted by the primary reviewer. The results were synthesized qualitatively using a level of evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. The reliability, validity and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity support the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.
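
    The reliability levels quoted (ICC ≥ .7 and ≥ .9) refer to intraclass correlation. As a minimal sketch, a one-way random-effects ICC(1,1) computed from an ANOVA decomposition of invented repeated kyphosis measurements:

```python
import numpy as np

# rows: subjects, columns: repeated kyphosis measurements (made-up degrees)
x = np.array([[42.0, 43.5], [55.0, 54.0], [38.5, 40.0], [61.0, 59.5], [47.0, 46.0]])
n, k = x.shape

grand = x.mean()
msb = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)               # between subjects
msw = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))  # within subjects

icc = (msb - msw) / (msb + (k - 1) * msw)   # one-way random effects, ICC(1,1)
print(f"ICC(1,1) = {icc:.3f}")
```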

  7. Development and Validation of a Precise and Stability Indicating LC Method for the Determination of Benzalkonium Chloride in Pharmaceutical Formulation Using an Experimental Design

    Directory of Open Access Journals (Sweden)

    Harshal K. Trivedi

    2010-01-01

    Full Text Available A simple, precise, short-runtime and stability-indicating reversed-phase high performance liquid chromatographic method has been developed and validated for the quantification of the benzalkonium chloride (BKC) preservative in a pharmaceutical formulation of sparfloxacin eye drops. The method was successfully applied to the determination of benzalkonium chloride in various ophthalmic formulations, such as latanoprost, timolol, dexamethasone, gatifloxacin, norfloxacin, a combination of moxifloxacin and dexamethasone, a combination of naphazoline HCl, zinc sulphate and chlorpheniramine maleate, a combination of tobramycin and dexamethasone, and a combination of phenylephrine HCl, naphazoline HCl, menthol and camphor. The RP-LC separation was achieved on a Purospher Star RP-18e column (75 mm × 4.0 mm, 3.0 μm) in isocratic mode using buffer:acetonitrile (35:65, v/v) as the mobile phase at a flow rate of 1.8 mL/min. Detection was performed at 215 nm, and quantification was achieved with PDA detection over the concentration range of 50 to 150 μg/mL. The method effectively separates the four homologs with good resolution, in the presence of excipients, sparfloxacin, and degradation products of sparfloxacin and BKC, within five minutes. The method was validated and the results were compared statistically; it was found to be simple, accurate, precise and specific. The proposed method was validated in terms of specificity, precision, recovery, solution stability, linearity and range. All the validation parameters were within the acceptance ranges and concordant with ICH guidelines.

  8. Determination of fuel irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Mas, P.

    1977-01-01

    This paper reports on the present status of the main methods used to determine the nuclear parameters of fuel irradiation in testing reactors (nuclear power, burn-up, ...). The different methods (theoretical or experimental) are reviewed: neutron measurements and calculations, gamma scanning, heat balance, ... . The required accuracies are reviewed: they are 3-5% on flux, fluences, nuclear power, burn-up, and conversion factor. These required accuracies are compared with the accuracies actually available, which are at the present time of the order of 5-20% on these parameters

  9. Determination of material irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Cerles, J.M.; Mas, P.

    1978-01-01

    In this paper, the author reports the main methods used to determine the nuclear parameters of material irradiation in testing reactors (nuclear power, burn-up, fluxes, fluences, ...). The different methods (theoretical or experimental) are reviewed: neutronics measurements and calculations, gamma scanning, thermal balance, ... The required accuracies are reviewed: they are 3-5% on flux, fluences, nuclear power, burn-up, conversion factor, ... These required accuracies are compared with the accuracies actually available, which are at the present time of the order of 5-20% on these parameters

  10. Optimizing Methods of Obtaining Stellar Parameters for the H3 Survey

    Science.gov (United States)

    Ivory, KeShawn; Conroy, Charlie; Cargile, Phillip

    2018-01-01

    The Stellar Halo at High Resolution with Hectochelle Survey (H3) is in the process of observing and collecting stellar parameters for stars in the Milky Way's halo. With a goal of measuring radial velocities for fainter stars, it is crucial that we have optimal methods of obtaining this and other parameters from the data for these stars. The method currently employed is The Payne, named after Cecilia Payne-Gaposchkin, a code that uses neural networks and Markov Chain Monte Carlo methods to utilize both spectra and photometry to obtain values for stellar parameters. This project investigated the benefit of fitting both spectra and spectral energy distributions (SEDs). Mock spectra using the parameters of the Sun were created and noise was inserted at various signal-to-noise values. The Payne then fit each mock spectrum with and without a mock SED also generated from solar parameters. The result was that at high signal-to-noise, the spectrum dominated and the effect of fitting the SED was minimal. But at low signal-to-noise, the addition of the SED greatly decreased the standard deviation of the data and resulted in more accurate values for temperature and metallicity.
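
    The joint fit can be pictured as a summed log-likelihood over the two data sets, so the photometric term gains relative weight as the spectral noise grows. A minimal numpy sketch under that assumption (not the actual Payne code; the model callables are hypothetical):

```python
import numpy as np

def ln_likelihood(theta, spec_obs, spec_err, sed_obs, sed_err,
                  spec_model, sed_model):
    """theta: stellar parameters, e.g. (Teff, logg, [Fe/H]); spec_model and
    sed_model are assumed callables returning a synthetic spectrum and
    synthetic photometry for theta."""
    chi2_spec = np.sum(((spec_obs - spec_model(theta)) / spec_err) ** 2)
    chi2_sed = np.sum(((sed_obs - sed_model(theta)) / sed_err) ** 2)
    # as spec_err grows (low S/N), chi2_spec flattens with theta and the
    # SED term increasingly shapes the posterior
    return -0.5 * (chi2_spec + chi2_sed)
```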

  11. Optimizing parameters of a technical system using quality function deployment method

    Science.gov (United States)

    Baczkowicz, M.; Gwiazda, A.

    2015-11-01

    The article shows the practical use of Quality Function Deployment (QFD) on the example of a mechanized mining support. First it gives a short description of this method and shows what the designing process looks like from the constructor's point of view. The proposed method allows construction parameters to be optimized and compared, as well as adapted to customer requirements. QFD helps to determine the full set of crucial construction parameters, and then their importance and the difficulty of their execution. Second, it presents the chosen technical system and its construction, with figures of the existing model and the future optimized one. The construction parameters were selected from the designer's point of view. The method helps to specify a complete set of construction parameters from the point of view of the designed technical system and customer requirements. The QFD matrix can be adjusted depending on design needs, and not every part of it has to be considered: designers can choose which parts are the most important. This makes QFD a very flexible tool. The most important step is to define the relationships occurring between parameters, and that part cannot be eliminated from the analysis.
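
    The core QFD computation is a weighted relationship matrix: each construction parameter's technical importance is the customer-weighted sum of its relationship strengths. A tiny illustration with made-up numbers (the 0/1/3/9 strength scale is the common QFD convention):

```python
import numpy as np

weights = np.array([5, 3, 4])            # importance of 3 customer requirements
relation = np.array([[9, 3, 0, 1],       # rows: customer requirements
                     [1, 9, 3, 0],       # cols: 4 construction parameters
                     [0, 3, 9, 3]])      # entries: relationship strength
importance = weights @ relation
print(importance / importance.sum())     # relative importance of each parameter
```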

  12. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. The approach: 1) identify needs and requirements for IAT; 2) develop the IAT conceptual framework; 3) validate IAT methods; 4) develop applications materials.

  13. Validated TLC-densitometric analysis for determination of carotenoids in fancy carp (Cyprinus carpio serum and the application for pharmacokinetic parameter assessment

    Directory of Open Access Journals (Sweden)

    Bundit Yuangsoi

    2008-09-01

    Full Text Available A densitometric thin-layer chromatographic (TLC) method for carotenoids such as astaxanthin, lutein, and β-carotene has been established and validated for the quantitative determination of carotenoids in fancy carp serum. This study can be used in the evaluation of pharmacokinetic parameters of carotenoids in fancy carp serum. Analyses of carotenoids were performed on TLC glass plates pre-coated with silica gel 60 as the stationary phase. Linear ascending development was carried out in a twin-trough glass chamber saturated with a mobile phase consisting of petroleum ether-diethyl ether-acetone (75:15:10, v/v/v) at a temperature of 25±2°C. A TLC scanner was used for spectrodensitometric scanning and analysis in absorbance mode at 450 nm. The system was found to give compact spots for astaxanthin, lutein, and β-carotene (Rf values of 0.21, 0.17 and 0.97, respectively). The method was validated for linearity, precision, accuracy, LOD, LOQ and HORRAT value. The linear regression analysis data of astaxanthin, lutein, and β-carotene for the calibration plots showed a good linear relationship, with r2 = 0.999, 0.998 and 0.998, respectively, in a concentration range of 0.01-6.50 μg/spot with respect to the peak area. Precision (% RSDr) of astaxanthin, lutein, and β-carotene was 2.93, 3.34, and 2.61, respectively. The limit of detection (LOD) was 0.011, 0.023 and 0.026 μg/spot, respectively, and the limit of quantitation (LOQ) was 0.036, 0.075 and 0.085 μg/spot, respectively. The recoveries of astaxanthin, lutein, and β-carotene spiked into blank samples averaged 91.70% for astaxanthin (0.3-2.0 mg/ml), 90.47% for lutein (0.2-3.0 mg/ul), and 102.25% for β-carotene (0.1-1.0 mg/ul). For all carotenoids, the HORRAT values were below the critical value. Therefore, this method enables the simple, rapid, economical and precise quantitative determination of carotenoids in fancy carp serum for the evaluation of pharmacokinetic parameters
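
    Validation parameters such as LOD and LOQ are commonly derived from the calibration line via the ICH formulas 3.3σ/S and 10σ/S (σ: residual standard deviation of the line, S: its slope). A generic sketch with hypothetical calibration data, not the study's values:

```python
import numpy as np

conc = np.array([0.05, 0.5, 1.0, 2.0, 4.0, 6.5])              # ug/spot
area = np.array([120., 1150., 2300., 4570., 9140., 14890.])   # peak areas
slope, intercept = np.polyfit(conc, area, 1)                  # calibration line
resid = area - (slope * conc + intercept)
sigma = np.sqrt(resid @ resid / (len(conc) - 2))              # residual std dev
lod, loq = 3.3 * sigma / slope, 10 * sigma / slope            # ICH formulas
print(f"LOD = {lod:.3f} ug/spot, LOQ = {loq:.3f} ug/spot")
```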

  14. Statistical Validation of Engineering and Scientific Models: Background

    International Nuclear Information System (INIS)

    Hills, Richard G.; Trucano, Timothy G.

    1999-01-01

    A tutorial is presented discussing the basic issues associated with propagation of uncertainty analysis and statistical validation of engineering and scientific models. The propagation of uncertainty tutorial illustrates the use of the sensitivity method and the Monte Carlo method to evaluate the uncertainty in predictions for linear and nonlinear models. Four example applications are presented: a linear model, a model for the behavior of a damped spring-mass system, a transient thermal conduction model, and a nonlinear transient convective-diffusive model based on Burgers' equation. Correlated and uncorrelated model input parameters are considered. The model validation tutorial builds on the material presented in the propagation of uncertainty tutorial and uses the damped spring-mass system as the example application. The validation tutorial illustrates several concepts associated with the application of statistical inference to test model predictions against experimental observations. Several validation methods are presented, including error-band-based, multivariate, sum-of-squares-of-residuals, and optimization methods. After completion of the tutorial, a survey of the statistical model validation literature is presented and recommendations for future work are made
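
    A minimal Monte Carlo propagation sketch in the spirit of the tutorial's damped spring-mass example: uncertain parameters (assumed Gaussian here, with made-up means and spreads) are sampled and pushed through the free-decay response, yielding an empirical uncertainty band on the prediction:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10000
m = rng.normal(1.0, 0.05, N)      # mass [kg]
c = rng.normal(0.4, 0.04, N)      # damping [N s/m]
k = rng.normal(25.0, 1.0, N)      # stiffness [N/m]

wn = np.sqrt(k / m)               # natural frequency
zeta = c / (2 * np.sqrt(k * m))   # damping ratio (underdamped here)
wd = wn * np.sqrt(1 - zeta**2)    # damped frequency

t, x0 = 1.0, 0.01                 # evaluation time [s], initial displacement [m]
# free decay from rest: x(t) = x0 e^(-zeta wn t) (cos wd t + (zeta wn/wd) sin wd t)
x = x0 * np.exp(-zeta * wn * t) * (np.cos(wd * t)
                                   + (zeta * wn / wd) * np.sin(wd * t))
print(x.mean(), x.std(), np.percentile(x, [2.5, 97.5]))  # 95% prediction band
```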

  15. Identification of hydrological model parameters for flood forecasting using data depth measures

    Science.gov (United States)

    Krauße, T.; Cullmann, J.

    2011-03-01

    The development of methods for estimating the parameters of hydrological models considering uncertainties has been of high interest in hydrological research over the last years. Besides the very popular Markov Chain Monte Carlo (MCMC) methods, which estimate the uncertainty of model parameters in the setting of a Bayesian framework, the development of depth-based sampling methods, also entitled robust parameter estimation (ROPE), has attracted increasing research interest. These methods treat the estimation of model parameters as a geometric search for a set of robustly performing parameter vectors by application of the concept of data depth. Recent studies showed that the parameter vectors estimated by depth-based sampling perform more robustly in validation. One major advantage of this kind of approach over the MCMC methods is that the formulation of a likelihood function within a Bayesian uncertainty framework becomes obsolete, and arbitrary purpose-oriented performance criteria defined by the user can be integrated without any further complications. In this paper we present an advanced ROPE method entitled the Advanced Robust Parameter Estimation by Monte Carlo algorithm (AROPEMC). The AROPEMC algorithm is a modified version of the original robust parameter estimation algorithm ROPEMC developed by Bárdossy and Singh (2008). AROPEMC performs by merging iterative Monte Carlo simulations, identifying well-performing parameter vectors, sampling robust parameter vectors according to the principle of data depth, and applying a well-founded stopping criterion used in supervised machine learning. The principles of the algorithm are illustrated by means of the Rosenbrock and Rastrigin functions, two well-known performance benchmarks for optimisation algorithms. Two case studies demonstrate the advantage of AROPEMC compared to state-of-the-art global optimisation algorithms. A distributed process-oriented hydrological model is calibrated and
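
    A highly simplified, illustrative ROPE-style iteration (not the AROPEMC algorithm itself): Monte Carlo sampling, retention of the best-performing vectors, and a depth-based selection step, with Mahalanobis distance standing in for the halfspace data depth used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def rope_step(objective, mean, cov, n=2000, keep=0.1, deep=0.5):
    pars = rng.multivariate_normal(mean, cov, n)          # Monte Carlo sampling
    perf = np.array([objective(p) for p in pars])
    good = pars[np.argsort(perf)[: int(keep * n)]]        # best performers
    mu, C = good.mean(axis=0), np.cov(good.T)
    # squared Mahalanobis distance as a cheap depth surrogate
    d = np.einsum('ij,jk,ik->i', good - mu, np.linalg.inv(C), good - mu)
    robust = good[d <= np.quantile(d, deep)]              # keep the "deep" vectors
    return robust.mean(axis=0), np.cov(robust.T)

mean, cov = np.zeros(2), np.eye(2) * 4.0
rosenbrock = lambda p: (1 - p[0])**2 + 100 * (p[1] - p[0]**2)**2
for _ in range(10):
    mean, cov = rope_step(rosenbrock, mean, cov)
print(mean)   # should approach the optimum (1, 1)
```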

  16. Parameter estimation method that directly compares gravitational wave observations to numerical relativity

    Science.gov (United States)

    Lange, J.; O'Shaughnessy, R.; Boyle, M.; Calderón Bustillo, J.; Campanelli, M.; Chu, T.; Clark, J. A.; Demos, N.; Fong, H.; Healy, J.; Hemberger, D. A.; Hinder, I.; Jani, K.; Khamesra, B.; Kidder, L. E.; Kumar, P.; Laguna, P.; Lousto, C. O.; Lovelace, G.; Ossokine, S.; Pfeiffer, H.; Scheel, M. A.; Shoemaker, D. M.; Szilagyi, B.; Teukolsky, S.; Zlochower, Y.

    2017-11-01

    We present and assess a Bayesian method to interpret gravitational wave signals from binary black holes. Our method directly compares gravitational wave data to numerical relativity (NR) simulations. In this study, we present a detailed investigation of the systematic and statistical parameter estimation errors of this method. This procedure bypasses approximations used in semianalytical models for compact binary coalescence. In this work, we use the full posterior parameter distribution for only generic nonprecessing binaries, drawing inferences away from the set of NR simulations used, via interpolation of a single scalar quantity (the marginalized log likelihood, ln L ) evaluated by comparing data to nonprecessing binary black hole simulations. We also compare the data to generic simulations, and discuss the effectiveness of this procedure for generic sources. We specifically assess the impact of higher order modes, repeating our interpretation with both l ≤2 as well as l ≤3 harmonic modes. Using the l ≤3 higher modes, we gain more information from the signal and can better constrain the parameters of the gravitational wave signal. We assess and quantify several sources of systematic error that our procedure could introduce, including simulation resolution and duration; most are negligible. We show through examples that our method can recover the parameters for equal mass, zero spin, GW150914-like, and unequal mass, precessing spin sources. Our study of this new parameter estimation method demonstrates that we can quantify and understand the systematic and statistical error. This method allows us to use higher order modes from numerical relativity simulations to better constrain the black hole binary parameters.

  17. Analytical method (HPLC) validation used for identification and assay of the pharmaceutical active ingredient, Tylosin tartrate for veterinary use, and its finite product Tilodem 50, hydrosoluble powder

    Directory of Open Access Journals (Sweden)

    Maria Neagu

    2010-12-01

    Full Text Available At SC DELOS IMPEX '96 SRL, the quality of the active pharmaceutical ingredient (API) for the finite product Tilodem 50 - hydrosoluble powder was assessed in accordance with the latest European Pharmacopoeia. The method of analysis used for this purpose was the compendial method "Tylosin tartrate for veterinary use" from the edition of the Eur. Ph. in force, and represents a variant developed and validated "in house". The parameters included in the validation methodology for the chromatographic method are the following: selectivity, linearity, linearity range, detection and quantification limits, precision, repeatability (intra-day), inter-day reproducibility, accuracy, robustness, solution stability and system suitability. According to the European Pharmacopoeia, the active pharmaceutical ingredient is consistent, in terms of quality, if it contains minimum 80% Tylosin A, and the sum of Tylosins A, B, C and D is minimum 95%. Identification and determination of each component separately (Tylosin A, B, C, D) is possible by chromatographic separation (HPLC). Validation of the analytical method is presented below.

  18. Some properties of 2-D dielectric-based ENG/MNG material parameters extracted using the S-parameter method

    DEFF Research Database (Denmark)

    Wu, Yunqiu; Arslanagic, Samel

    This work presents a systematic investigation of material parameters for two-dimensional epsilon-negative (ENG) and mu-negative (MNG) materials as obtained by the scattering parameter method. The unit cell consists of infinite dielectric cylinders whose sizes and permittivities are chosen to enable the ENG and MNG behaviors. For both configurations, the permittivity and the permeability are reported. The influence of several effects on the extracted material parameters is examined, including the loss inside the cylinders and the size of the unit cells...
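
    For context, a generic slab-homogenization version of the S-parameter extraction (the standard Smith-style inversion, not the authors' 2-D cylinder-specific procedure) recovers the wave impedance z and refractive index n from S11 and S21, then ε = n/z and μ = nz; the branch of the complex logarithm must be fixed separately:

```python
import numpy as np

def extract_eps_mu(S11, S21, k0, d, branch=0):
    """k0: free-space wavenumber; d: slab thickness; branch: log branch index."""
    z = np.sqrt(((1 + S11)**2 - S21**2) / ((1 - S11)**2 - S21**2))
    if z.real < 0:
        z = -z                                    # enforce passivity, Re(z) >= 0
    gamma = (z - 1) / (z + 1)                     # interface reflection
    X = S21 / (1 - S11 * gamma)                   # = exp(i n k0 d)
    n = (-1j * np.log(X) + 2 * np.pi * branch) / (k0 * d)
    return n / z, n * z                           # epsilon, mu
```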

  19. Validation of Plutonium Radioisotopes Analysis Using Alpha Spectrometry

    International Nuclear Information System (INIS)

    Noor Fadzilah Yusof; Jalal Sharib; Mohd Tarmizi Ishak; Zulkifli Daud; Abdul Kadir Ishak

    2016-01-01

    This paper presents the validation of an established method used to detect plutonium (Pu) radioisotopes in marine environment samples. The separation method consists of sample digestion, anion exchange, purification, electroplating and counting by alpha spectrometry. Applying the method to standard reference materials from the marine environment, the results were validated using seven parameters, namely specificity, linearity, bias or accuracy, detection limit, precision/repeatability, reproducibility/ruggedness and robustness, in accordance with International Organization for Standardization (ISO) guidelines. The results obtained were in good agreement with, and satisfactory compared to, the values provided in the certificates of the reference materials. (author)

  20. River routing at the continental scale: use of globally-available data and an a priori method of parameter estimation

    Directory of Open Access Journals (Sweden)

    P. Naden

    1999-01-01

    Full Text Available Two applications of a river routing model based on the observed river network and a linearised solution to the convection-diffusion equation are presented. One is an off-line application to part of the Amazon basin (catchment area 2.15 M km2) using river network data from the Digital Chart of the World and GCM-generated runoff at a grid resolution of 2.5 degrees latitude and 3.75 degrees longitude. The other application is to the Arkansas (409,000 km2) and Red River (125,500 km2) basins as an integrated component of a macro-scale hydrological model, driven by observed meteorology and operating on a 17 km grid. This second application makes use of the US EPA reach data to construct the river network. In both cases, a method of computing parameter values a priori has been applied and shows some success, although some interpretation is required to derive 'correct' parameter values and further work is needed to develop guidelines for use of the method. The applications, however, do demonstrate the possibility of applying the routing model at the continental scale, with globally-available data and a priori parameter estimation, and its value for validating GCM output against observed flows.

  1. An impact analysis of forecasting methods and forecasting parameters on bullwhip effect

    Science.gov (United States)

    Silitonga, R. Y. H.; Jelly, N.

    2018-04-01

    The bullwhip effect is an increase in the variance of demand fluctuations from the downstream to the upstream end of a supply chain. Forecasting methods and forecasting parameters are recognized as factors that affect the bullwhip phenomenon. To study these factors, simulations can be developed. There are several ways to simulate the bullwhip effect in previous studies, such as mathematical equation modelling, information control modelling, computer programs, and more. In this study a spreadsheet program named Bullwhip Explorer was used to simulate the bullwhip effect. Several scenarios were developed to show the change in the bullwhip effect ratio caused by differences in forecasting methods and forecasting parameters. The forecasting methods used were mean demand, moving average, exponential smoothing, demand signalling, and minimum expected mean squared error. The forecasting parameters were the moving average period, smoothing parameter, signalling factor, and safety stock factor. The simulations showed that decreasing the moving average period, increasing the smoothing parameter, and increasing the signalling factor can create a bigger bullwhip effect ratio, whereas the safety stock factor had no impact on the bullwhip effect. A simple replication of this behaviour is sketched below.
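
    The moving-average effect can be reproduced with a toy order-up-to model (assumptions are hypothetical: i.i.d. normal demand, fixed lead time, base-stock policy); the bullwhip ratio is Var(orders)/Var(demand):

```python
import numpy as np

rng = np.random.default_rng(42)

def bullwhip_ratio(ma_period, lead_time=2, z=1.65, T=20000):
    d = rng.normal(100, 10, T)                 # i.i.d. customer demand
    orders, S_prev = [], None
    for t in range(ma_period, T):
        window = d[t - ma_period:t]            # moving-average forecast window
        S = lead_time * window.mean() + z * window.std()   # order-up-to level
        if S_prev is not None:
            orders.append(d[t] + S - S_prev)   # replenishment order
        S_prev = S
    return np.var(orders) / d.var()

for p in (2, 5, 10, 20):
    print(p, round(bullwhip_ratio(p), 2))      # ratio shrinks as period grows
```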

  2. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was documented twice at less than 50%. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was demonstrated by kappa values (0.53-0.86). With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators should be taken into consideration in evaluating adolescent skeletal maturation.
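
    The weighted kappa used here penalizes disagreements by their ordinal distance between stages; for illustration, it can be computed with scikit-learn (the ratings below are made up, not study data):

```python
from sklearn.metrics import cohen_kappa_score

gold = [1, 2, 2, 3, 4, 4, 5, 6, 3, 2]    # CVM stages, gold standard
rater = [1, 2, 3, 3, 4, 5, 5, 6, 2, 2]   # one observer's staging
# 'linear' weights penalize a 2-stage miss twice as much as a 1-stage miss
print(cohen_kappa_score(gold, rater, weights="linear"))
```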

  3. A novel method for identification of lithium-ion battery equivalent circuit model parameters considering electrochemical properties

    Science.gov (United States)

    Zhang, Xi; Lu, Jinling; Yuan, Shifei; Yang, Jun; Zhou, Xuan

    2017-03-01

    This paper proposes a novel parameter identification method for the lithium-ion (Li-ion) battery equivalent circuit model (ECM) that takes electrochemical properties into account. An improved pseudo-two-dimensional (P2D) model is established on the basis of partial differential equations (PDEs): the electrolyte potential is simplified from a nonlinear to a linear expression, while the terminal voltage is decomposed into the electrolyte potential, open circuit voltage (OCV), electrode overpotentials, internal resistance drop, and so on. The model order reduction is carried out by simplifying the PDEs using the Laplace transform, inverse Laplace transform, Padé approximation, etc. A unified second-order transfer function between cell voltage and current is obtained for comparability with that of the ECM. The final objective is to relate the ECM resistances and capacitances to the electrochemical parameters so that, under various conditions, ECM precision can be improved by incorporating the battery's interior properties for further applications, e.g., SOC estimation. Finally, simulation and experimental results prove the correctness and validity of the proposed methodology.
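
    A generic discrete-time two-RC Thevenin ECM, consistent with a second-order voltage-current transfer function of the kind derived above (parameter values are placeholders, not ones identified from the P2D model):

```python
import numpy as np

def simulate_ecm(I, dt, ocv, R0, R1, C1, R2, C2):
    """I: current array (A, discharge positive); returns terminal voltage (V)."""
    v1 = v2 = 0.0
    a1, a2 = np.exp(-dt / (R1 * C1)), np.exp(-dt / (R2 * C2))
    V = np.empty_like(I, dtype=float)
    for k, i in enumerate(I):
        v1 = a1 * v1 + R1 * (1 - a1) * i    # RC branch 1 polarization
        v2 = a2 * v2 + R2 * (1 - a2) * i    # RC branch 2 polarization
        V[k] = ocv - R0 * i - v1 - v2       # terminal voltage
    return V

I = np.r_[np.zeros(10), np.full(100, 2.0), np.zeros(100)]  # 2 A discharge pulse
V = simulate_ecm(I, dt=1.0, ocv=3.7, R0=0.05, R1=0.02, C1=500, R2=0.04, C2=5000)
```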

  4. Do subjective assessments of running patterns reflect objective parameters?

    Science.gov (United States)

    Lussiana, Thibault; Gindre, Cyrille; Mourot, Laurent; Hébert-Losier, Kim

    2017-08-01

    Running patterns are often categorized into subgroups according to common features before data analysis and interpretation. The Volodalen® method is a simple field-based tool used to classify runners as aerial or terrestrial using a 5-item subjective rating scale. We aimed to validate the Volodalen® method by quantifying the relationship between its subjective scores and 3D biomechanical measures. Fifty-four runners ran 30 s on a treadmill at 10, 12, 14, 16, and 18 km h-1 while their kinematics were assessed subjectively using the Volodalen® method and objectively using 3D motion capture. For each runner and speed, two researchers scored the five Volodalen® items on a 1-to-5 scale, which addressed vertical oscillation, upper-body motion, pelvis and foot position at ground contact, and footstrike pattern. Seven 3D biomechanical parameters reflecting the subjective items were also collected and correlated with the subjective scores. Twenty-eight runners were classified as aerial and 26 as terrestrial. Runner classification did not change with speed, but the relative contribution of the biomechanical parameters to the subjective classification was speed dependent. The magnitude of correlations between subjective and objective measures ranged from trivial to very large. Five of the seven objective parameters differed significantly between aerial and terrestrial runners, and these parameters demonstrated the strongest correlations with the subjective scores. Our results support the validity of the Volodalen® method, whereby the visual appreciation of running gait reflected quantifiable objective parameters. Two minor modifications to the method are proposed to simplify its use and improve agreement between subjective and objective measures.

  5. Comparison of Two Methods Used to Model Shape Parameters of Pareto Distributions

    Science.gov (United States)

    Liu, C.; Charpentier, R.R.; Su, J.

    2011-01-01

    Two methods are compared for estimating the shape parameters of Pareto field-size (or pool-size) distributions for petroleum resource assessment. Both methods assume mature exploration in which most of the larger fields have been discovered. Both methods use the sizes of larger discovered fields to estimate the numbers and sizes of smaller fields: (1) the tail-truncated method uses a plot of field size versus size rank, and (2) the log-geometric method uses data binned in field-size classes and the ratios of adjacent bin counts. Simulation experiments were conducted using discovered oil and gas pool-size distributions from four petroleum systems in Alberta, Canada and using Pareto distributions generated by Monte Carlo simulation. The estimates of the shape parameters of the Pareto distributions, calculated by both the tail-truncated and log-geometric methods, generally stabilize where discovered pool numbers are greater than 100. However, with fewer than 100 discoveries, these estimates can vary greatly with each new discovery. The estimated shape parameters of the tail-truncated method are more stable and larger than those of the log-geometric method where the number of discovered pools is more than 100. Both methods, however, tend to underestimate the shape parameter. Monte Carlo simulation was also used to create sequences of discovered pool sizes by sampling from a Pareto distribution with a discovery process model using a defined exploration efficiency (in order to show how biased the sampling was in favor of larger fields being discovered first). A higher (more biased) exploration efficiency gives better estimates of the Pareto shape parameters. © 2011 International Association for Mathematical Geosciences.
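
    Two generic shape estimators illustrate the flavor of the compared methods (simplified stand-ins, not the paper's exact tail-truncated and log-geometric procedures): a log-log rank-size regression on the largest discoveries, and the maximum-likelihood (Hill-type) estimator:

```python
import numpy as np

rng = np.random.default_rng(7)
shape, xmin = 1.2, 1.0
x = xmin * (1 - rng.random(500)) ** (-1 / shape)   # Pareto(shape=1.2) sample

# (1) regression of log(size) on log(rank) for the largest discoveries:
#     log x ~ const - (1/shape) * log rank, so shape = -1/slope
s = np.sort(x)[::-1]
k = 200                                            # use the 200 largest
slope, _ = np.polyfit(np.log(np.arange(1, k + 1)), np.log(s[:k]), 1)
print("rank-size estimate:", -1 / slope)

# (2) maximum-likelihood estimate: n / sum(log(x / xmin))
print("ML estimate:", len(x) / np.log(x / xmin).sum())
```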

  6. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
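
    A compact sketch of the two nested procedures with scikit-learn (shown as a stand-in for the authors' cloud-based implementation): repeated grid-search cross-validation tunes the hyperparameter, and repeated nested cross-validation assesses the resulting model; the spread of the outer scores exposes the split-induced variance the paper warns about:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=20, noise=10, random_state=0)
inner = RepeatedKFold(n_splits=5, n_repeats=5, random_state=1)  # tuning splits
outer = RepeatedKFold(n_splits=5, n_repeats=5, random_state=2)  # assessment splits

# repeated grid-search cross-validation for parameter tuning
gs = GridSearchCV(Ridge(), {"alpha": np.logspace(-3, 3, 13)}, cv=inner)

# repeated nested cross-validation for model assessment
scores = cross_val_score(gs, X, y, cv=outer)
print(scores.mean(), scores.std())   # spread reflects split-induced variance
```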

  7. Signal-to-noise assessment for diffusion tensor imaging with single data set and validation using a difference image method with data from a multicenter study

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhiyue J., E-mail: jerry.wang@childrens.com [Department of Radiology, Children's Medical Center, Dallas, Texas 75235 and Department of Radiology, University of Texas Southwestern Medical Center, Dallas, Texas 75390 (United States); Chia, Jonathan M. [Clinical Science, Philips Healthcare, Cleveland, Ohio 44143 (United States); Ahmed, Shaheen; Rollins, Nancy K. [Department of Radiology, Children's Medical Center, Dallas, TX 75235 and Department of Radiology, University of Texas Southwestern Medical Center, Dallas, TX 75390 (United States)

    2014-09-15

    Purpose: To describe a quantitative method for determination of SNR that extracts the local noise level using a single diffusion data set. Methods: Brain data sets came from a multicenter study (eight sites; three MR vendors). The data acquisition protocol required b = 0, 700 s/mm², fov = 256 × 256 mm², acquisition matrix size 128 × 128, reconstruction matrix size 256 × 256, 30 gradient encoding directions, and voxel size 2 × 2 × 2 mm³. Regions-of-interest (ROI) were placed manually on the b = 0 image volume on transverse slices, and the signal was recorded as the mean value of the ROI. The noise level of the ROI was evaluated using Fourier-transform-based Butterworth high-pass filtering. Patients were divided into two groups, one for filter parameter optimization (N = 17) and one for validation (N = 10). Six white matter areas (the genu and splenium of the corpus callosum, right and left centrum semiovale, right and left anterior corona radiata) were analyzed. The Bland–Altman method was used to compare the resulting SNR with that from the difference image method. The filter parameters were optimized for each brain area, and a set of “global” parameters, representing an average over all regions, was also obtained. Results: The Bland–Altman analysis on the validation group using the “global” filter parameters revealed that the 95% limits of agreement of percent bias between the SNR obtained with the new and the reference methods were −15.5% (median of the lower limits, range [−24.1%, −8.9%]) and 14.5% (median of the upper limits, range [12.7%, 18.0%]) for the 6 brain areas. Conclusions: An FT-based high-pass filtering method can be used for local-area SNR assessment using only one DTI data set. This method could be used to evaluate SNR for patient studies in a multicenter setting.
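
    The essence of the single-data-set estimate can be sketched as follows: a Butterworth high-pass filter applied in the Fourier domain isolates the high-frequency content of the ROI, whose standard deviation serves as the local noise level (the cutoff and order below are hypothetical; the study optimized them per brain region, effectively absorbing any bandwidth correction):

```python
import numpy as np

def roi_snr(roi, cutoff=0.25, order=4):
    """roi: 2-D array of pixel values; cutoff in cycles/pixel."""
    ny, nx = roi.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    f = np.hypot(fy, fx)                                   # radial frequency
    # Butterworth high-pass: ~0 at f -> 0, ~1 well above the cutoff
    H = 1.0 / (1.0 + (cutoff / np.maximum(f, 1e-12)) ** (2 * order))
    noise = np.real(np.fft.ifft2(np.fft.fft2(roi) * H))    # high-pass residue
    return roi.mean() / noise.std()
```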

  8. A proposed method for fast determination of plasma parameters

    International Nuclear Information System (INIS)

    Braams, B.J.; Lackner, K.

    1984-09-01

    The method of function parametrization, developed and applied by H. Wind for fast data evaluation in high-energy physics, is presented in the context of controlled fusion research. This method relies on statistical analysis of a database of simulated experiments in order to obtain a functional representation of the intrinsic physical parameters of a system in terms of the values of the measurements. Some variations on Wind's original procedure are suggested. A specific application to tokamak experiments would be the determination of certain global parameters of the plasma characterizing the current profile, the shape of the cross-section, the plasma pressure, and the internal inductance. The relevant measurements for this application include values of the poloidal field and flux external to the plasma, and a diamagnetic measurement. These may be combined with other diagnostics, such as electron-cyclotron emission and laser interferometry, in order to also obtain density and temperature profiles. There appears to be a capability for on-line determination of basic physical parameters, on a millisecond timescale on a minicomputer instead of in seconds on a large mainframe. (orig.)
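
    A miniature version of function parametrization: regress the physical parameter of a database of simulated experiments on the simulated measurements, then apply the fitted map to new measurements for near-instant recovery. All quantities below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0.5, 2.0, (5000, 1))             # hidden physical parameter
meas = np.hstack([np.sin(theta), theta**2, 1 / theta]) \
       + rng.normal(0, 0.01, (5000, 3))              # simulated diagnostics

# fit an inverse map: measurements -> parameter, via quadratic features
A = np.hstack([np.ones((5000, 1)), meas, meas**2])
coef, *_ = np.linalg.lstsq(A, theta, rcond=None)

m_new = np.array([[np.sin(1.3), 1.3**2, 1 / 1.3]])   # a "measured" case
a_new = np.hstack([np.ones((1, 1)), m_new, m_new**2])
print(a_new @ coef)                                  # fast recovery, ~1.3
```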

  9. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    Science.gov (United States)

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  10. An extended L-curve method for choosing a regularization parameter in electrical resistance tomography

    International Nuclear Information System (INIS)

    Xu, Yanbin; Pei, Yang; Dong, Feng

    2016-01-01

    The L-curve method is a popular regularization parameter choice method for the ill-posed inverse problem of electrical resistance tomography (ERT). However, the method cannot always determine a proper parameter for all situations. An investigation into the situations where the L-curve method fails shows that a new corner point appears on the L-curve, and the parameter corresponding to this new corner point can yield a satisfactory reconstructed solution. Thus an extended L-curve method, which determines the regularization parameter associated with either the global corner or the new corner, is proposed. Furthermore, two strategies are provided to determine the new corner: one is based on the second-order differential of the L-curve, and the other is based on the curvature of the L-curve. The proposed method is examined by both numerical simulations and experimental tests, and the results indicate that the extended method can handle the parameter choice problem even in cases where the typical L-curve method fails. Finally, in order to reduce the running time of the method, the extended method is combined with a projection method based on the Krylov subspace, which is able to accelerate the extended L-curve method. The results verify that the speed of the extended L-curve method is distinctly improved. The proposed method extends the application of the L-curve in the field of regularization parameter choice with an acceptable running time and can also be used in other kinds of tomography. (paper)
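
    The curvature-based strategy can be sketched generically: with residual norms ρ(λ) and solution norms η(λ), the corner is the point of maximum signed curvature of the (log ρ, log η) curve (a minimal sketch; the extended method additionally searches for the second, new corner when the global one fails):

```python
import numpy as np

def lcurve_corner(rho, eta, lam):
    """rho: residual norms, eta: solution norms, lam: regularization params."""
    x, y = np.log(rho), np.log(eta)
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    # signed curvature of the parametric log-log curve
    kappa = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
    return lam[np.argmax(kappa)]      # parameter at the sharpest corner
```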

  11. Interpolation decoding method with variable parameters for fractal image compression

    International Nuclear Information System (INIS)

    He Chuanjiang; Li Gaoping; Shen Xiaona

    2007-01-01

    The interpolation fractal decoding method, introduced by [He C, Yang SX, Huang X. Progressive decoding method for fractal image compression. IEE Proc Vis Image Signal Process 2004;3:207-13], generates the decoded image progressively by means of an interpolation iterative procedure with a constant parameter. It is well known that the majority of image details are added in the first steps of iteration in conventional fractal decoding; hence the constant parameter for the interpolation decoding method must be set to a small value in order to achieve good progressive decoding. However, it then takes an extremely large number of iterations to converge. It is thus reasonable for some applications to slow down the iterative process in the first stages of decoding and then accelerate it afterwards (e.g., at whatever iteration is needed). To achieve this goal, this paper proposes an interpolation decoding scheme with variable (iteration-dependent) parameters and proves the convergence of the decoding process mathematically. Experimental results demonstrate that the proposed scheme achieves the above-mentioned goal

  12. An automatic and effective parameter optimization method for model tuning

    Directory of Open Access Journals (Sweden)

    T. Zhang

    2015-11-01

    simulation results show that the optimum combination of these parameters determined using this method is able to improve the model's overall performance by 9 %. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially regarding unavoidable comprehensive parameter tuning during the model development stage.

  13. Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy

    OpenAIRE

    Kolokitha, Olga-Elpis

    2007-01-01

    Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manu...

  14. The Validity of Dimensional Regularization Method on Fractal Spacetime

    Directory of Open Access Journals (Sweden)

    Yong Tao

    2013-01-01

    Full Text Available Svozil developed a regularization method for quantum field theory on fractal spacetime (1987). Such a method can be applied to the low-order perturbative renormalization of quantum electrodynamics, but depends on a conjectural integral formula on non-integer-dimensional topological spaces. The main purpose of this paper is to construct a fractal measure so as to guarantee the validity of the conjectural integral formula.
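
    For context, the conjectural integral formula at issue generalizes the standard dimensional-regularization prescription for radially symmetric integrands, which for non-integer D reads:

\[
\int d^{D}x \, f(|x|) \;=\; \frac{2\pi^{D/2}}{\Gamma(D/2)} \int_{0}^{\infty} r^{D-1} f(r)\, dr .
\]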

  15. Seasonal evolution of soil and plant parameters on the agricultural Gebesee test site: a database for the set-up and validation of EO-LDAS and satellite-aided retrieval models

    Science.gov (United States)

    Truckenbrodt, Sina C.; Schmullius, Christiane C.

    2018-03-01

    Ground reference data are a prerequisite for the calibration, update, and validation of retrieval models facilitating the monitoring of land parameters based on Earth Observation data. Here, we describe the acquisition of a comprehensive ground reference database which was created to test and validate the recently developed Earth Observation Land Data Assimilation System (EO-LDAS) and products derived from remote sensing observations in the visible and infrared range. In situ data were collected for seven crop types (winter barley, winter wheat, spring wheat, durum, winter rape, potato, and sugar beet) cultivated on the agricultural Gebesee test site, central Germany, in 2013 and 2014. The database contains information on hyperspectral surface reflectance factors, the evolution of biophysical and biochemical plant parameters, phenology, surface conditions, atmospheric states, and a set of ground control points. Ground reference data were gathered at an approximately weekly resolution and on different spatial scales to investigate variations within and between acreages. In situ data collected less than 1 day apart from satellite acquisitions (RapidEye, SPOT 5, Landsat-7 and -8) with a cloud coverage ≤ 25 % are available for 10 and 15 days in 2013 and 2014, respectively. The measurements show that the investigated growing seasons were characterized by distinct meteorological conditions causing interannual variations in the parameter evolution. Here, the experimental design of the field campaigns, and methods employed in the determination of all parameters, are described in detail. Insights into the database are provided and potential fields of application are discussed. The data will contribute to a further development of crop monitoring methods based on remote sensing techniques. The database is freely available at PANGAEA (https://doi.org/10.1594/PANGAEA.874251).

  16. Development and validation of a RP- HPLC method for the quantitation studies of bromadiolone in Ratitox F

    Directory of Open Access Journals (Sweden)

    Elena Gabriela Oltean

    2011-12-01

    Full Text Available An isocratic high-performance liquid chromatography (HPLC) procedure was developed for the quantitative determination of bromadiolone (a hydroxycoumarin) in the rodenticide product Ratitox F. HPLC separation was carried out by reversed-phase chromatography on a Hypersil ODS 2 C18 column (250 mm × 4.6 mm i.d.; 5 μm particle size) thermostatted at 25°C. The mobile phase consisted of methanol/0.1% aqueous phosphoric acid solution (90:10, v/v), with a flow rate of 1 ml/min and UV detection at 265 nm. In order to validate the method, the following parameters were investigated: linearity (r2 = 0.9999), range, precision, accuracy, specificity, limit of detection and limit of quantification. The described method can be successfully applied to the analysis of the rodenticide Ratitox F.

  17. Development and validation of analytical method for the estimation of nateglinide in rabbit plasma

    Directory of Open Access Journals (Sweden)

    Nihar Ranjan Pani

    2012-12-01

    Full Text Available Nateglinide has been widely used in the treatment of type-2 diabetes as an insulin secretagogue. A reliable, rapid, simple and sensitive reversed-phase high performance liquid chromatography (RP-HPLC) method was developed and validated for the determination of nateglinide in rabbit plasma. The method was developed on a Hypersil BDS C18 column (250 mm × 4.6 mm, 5 μm) using a mobile phase of 10 mM phosphate buffer (pH 2.5) and acetonitrile (35:65, v/v). The eluate was monitored with a UV–vis detector at 210 nm at a flow rate of 1 mL/min. The calibration curve was linear over the concentration range of 25–2000 ng/mL. The retention times of nateglinide and the internal standard (gliclazide) were 9.608 min and 11.821 min, respectively. The developed RP-HPLC method can be successfully applied to the determination of quantitative pharmacokinetic parameters of nateglinide in a rabbit model. Keywords: HPLC, Nateglinide, Rabbit plasma, Pharmacokinetics

  18. [Validation of an in-house method for the determination of zinc in serum: Meeting the requirements of ISO 17025].

    Science.gov (United States)

    Llorente Ballesteros, M T; Navarro Serrano, I; López Colón, J L

    2015-01-01

    The aim of this report is to propose a scheme for the validation of an analytical technique according to ISO 17025. Following ISO 17025, the fundamental parameters tested were: selectivity, calibration model, precision, accuracy, uncertainty of measurement, and analytical interference. A protocol has been developed and applied successfully to quantify zinc in serum by atomic absorption spectrometry. It is demonstrated that the method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.

  19. Intelligent tuning method of PID parameters based on iterative learning control for atomic force microscopy.

    Science.gov (United States)

    Liu, Hui; Li, Yingzi; Zhang, Yingxu; Chen, Yifu; Song, Zihang; Wang, Zhenyu; Zhang, Suoxin; Qian, Jianqiang

    2018-01-01

    Proportional-integral-derivative (PID) parameters play a vital role in the imaging process of an atomic force microscope (AFM). Traditional parameter tuning methods require a lot of manpower, and it is difficult to set PID parameters in unattended working environments. In this manuscript, an intelligent tuning method for PID parameters based on iterative learning control is proposed to self-adjust the PID parameters of the AFM according to the sample topography. The method gathers sufficient information about the PID controller output signals and the tracking error, later used to calculate the proper PID parameters, by repeated line scanning until convergence before normal scanning, thereby learning the topography. Subsequently, the appropriate PID parameters are obtained by a fitting method and then applied to the normal scanning process. The feasibility of the method is demonstrated by a convergence analysis. Simulations and experimental results indicate that the proposed method can intelligently tune the PID parameters of the AFM for imaging different topographies and thus achieve good tracking performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
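
    A toy P-type iterative learning control loop illustrates the repeated-line-scan idea (a simplified stand-in for the AFM procedure; the first-order plant and learning gain are hypothetical): the correction signal is refined from the previous pass's tracking error until the line is tracked well, after which PID gains could be fitted from the converged signals:

```python
import numpy as np

def plant(u, a=0.9, b=0.1):
    """Simple first-order discrete plant; output sample k responds to u[k]."""
    y = np.zeros(len(u) + 1)
    for k in range(len(u)):
        y[k + 1] = a * y[k] + b * u[k]
    return y[1:]

ref = np.sin(np.linspace(0, 2 * np.pi, 200))   # desired trace of one scan line
u = np.zeros_like(ref)
gamma = 1.0                                    # learning gain (hypothetical)
for trial in range(30):                        # repeated line scans
    e = ref - plant(u)                         # tracking error of this pass
    u = u + gamma * e                          # P-type ILC update
print(np.abs(e).max())                         # error shrinks toward zero
```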

  20. Method validation for preparing serum and plasma samples from human blood for downstream proteomic, metabolomic, and circulating nucleic acid-based applications.

    Science.gov (United States)

    Ammerlaan, Wim; Trezzi, Jean-Pierre; Lescuyer, Pierre; Mathay, Conny; Hiller, Karsten; Betsou, Fay

    2014-08-01

    Formal method validation for biospecimen processing in the context of accreditation in laboratories and biobanks is lacking. Serum and plasma processing protocols were validated for fitness-for-purpose in terms of key downstream endpoints, and this article demonstrates a methodology for biospecimen processing method validation. Serum and plasma preparation from human blood was optimized for centrifugation conditions with respect to microparticle counts. Optimal protocols were validated for methodology and reproducibility in terms of acceptance criteria based on microparticle counts, DNA and hemoglobin concentration, and metabolomic and proteomic profiles. These parameters were also used to evaluate robustness for centrifugation temperature (4°C versus room temperature [RT]), deceleration (low, medium, high) and blood stability (after a 2-hour delay). Optimal protocols were 10-min centrifugation for serum and 20-min for plasma at 2000 g, medium brake, RT. Methodology and reproducibility acceptance criteria were met for both protocols except for the reproducibility of plasma metabolomics. Overall, neither protocol was robust for centrifugation at 4°C versus RT: RT gave higher microparticle and free DNA yields in serum, and fewer microparticles with less hemolysis in plasma. Both protocols were robust for fast, medium, and low deceleration, with a medium brake considered optimal. Pre-centrifugation stability after a 2-hour delay was seen at both temperatures for hemoglobin concentration and proteomics, but not for microparticle counts. We validated serum and plasma collection methods suitable for downstream protein, metabolite, or free nucleic acid-based applications. Temperature and pre-centrifugation delay can influence analytic results, and laboratories and biobanks should systematically record these conditions in the scope of accreditation.

  1. Development of a new compound method to extract the five parameters of PV modules

    International Nuclear Information System (INIS)

    Bai, Jianbo; Liu, Sheng; Hao, Yuzhe; Zhang, Zhen; Jiang, Meng; Zhang, Yu

    2014-01-01

    Highlights: • A compound method to extract the five parameters of the five-parameter PV model. • A piecewise curve-fitting method to obtain the differential values at the short and open circuit points. • Simulated and experimental I–V and P–V curves at any operating conditions show excellent agreement. • Prediction of the generation output of a PV power station has high accuracy. - Abstract: The five-parameter photovoltaic (PV) mathematical model has been considered a reliable and accurate method for simulating the performance of PV modules. This paper puts forth a new compound method to extract the five parameters of the model from basic manufacturer datasheet values. The two differential values of the I–V curve at the short circuit and open circuit points at standard testing conditions (STC) are fundamental data for obtaining the five parameters but are not normally available from the datasheet, so a piecewise I–V curve-fitting method combined with the four-parameter PV model is used to calculate them; an explicit extraction method is then presented that extracts the five parameters at STC from five individual algebraic equations. Furthermore, the five parameters are revised according to the actual operating conditions. In order to evaluate the effectiveness of the proposed method, the simulated I–V characteristic curves for three types of PV modules over a range of operating conditions are compared with measured data. The experimental results demonstrate that the method has high accuracy. The method is also used to predict the generation power of an actual PV power station, and the simulation results show good agreement with the field data. The proposed method is easy to carry out and is especially useful for simulating the actual performance of PV modules or arrays at various operating conditions and predicting the output power of real PV power stations
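
    The five-parameter single-diode model itself is the implicit relation I = Iph − I0[exp((V + I·Rs)/a) − 1] − (V + I·Rs)/Rsh, with a the modified ideality factor. A generic sketch that solves it pointwise for the I–V curve (parameter values are illustrative, not ones extracted by the compound method):

```python
import numpy as np
from scipy.optimize import brentq

Iph, I0, a, Rs, Rsh = 8.2, 1e-9, 1.6, 0.3, 250.0   # a = n*Ns*Vt [V]

def current(V):
    # implicit single-diode equation, solved for I at a given terminal voltage
    f = lambda I: Iph - I0 * (np.exp((V + I * Rs) / a) - 1) \
                  - (V + I * Rs) / Rsh - I
    return brentq(f, -1.0, Iph + 1.0)

V = np.linspace(0, 36, 200)          # up to just below Voc for these values
I = np.array([current(v) for v in V])
P = V * I
print("Isc ~", current(0.0), " Pmax ~", P.max())
```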

  2. Possible checking of technical parameters in nondestructive materials and products testing

    International Nuclear Information System (INIS)

    Kesl, J.

    1987-01-01

    The requirements are summed up for partial technical parameters of instruments and facilities for nondestructive testing by ultrasound, radiography, by magnetic, capillary and electric induction methods. The requirements and procedures for testing instrument performance are presented for the individual methods as listed in domestic and foreign standards, specifications and promotional literature. The parameters to be tested and the methods of testing, including the testing and calibration instruments are shown in tables. The Czechoslovak standards are listed currently valid for nondestructive materials testing. (M.D.)

  3. A new approach to the extraction of single exponential diode model parameters

    Science.gov (United States)

    Ortiz-Conde, Adelmo; García-Sánchez, Francisco J.

    2018-06-01

    A new integration method is presented for the extraction of the parameters of a single-exponential diode model with series resistance from the measured forward I-V characteristics. The extraction is performed using auxiliary functions based on integration of the data, which make it possible to isolate the effects of each of the model parameters. A differentiation method is also presented for data with a low level of experimental noise. Measured and simulated data are used to verify the applicability of both proposed methods. Physical insight about the validity of the model is also obtained by using the proposed graphical determinations of the parameters.

  4. Application of Muskingum routing method with variable parameters in ungauged basin

    Directory of Open Access Journals (Sweden)

    Xiao-meng Song

    2011-03-01

    Full Text Available This paper describes a flood routing method applied in an ungauged basin, utilizing the Muskingum model with variable parameters of wave travel time K and weight coefficient of discharge x, based on the physical characteristics of the river reach and the flood, including the reach slope, length, width, and flood discharge. Three formulas for estimating the parameters of wide rectangular, triangular, and parabolic cross sections are proposed, and the influence of the flood on the channel flow routing parameters is taken into account. The HEC-HMS hydrological model and the geospatial hydrologic analysis module HEC-GeoHMS were used to extract channel and watershed characteristics and to divide sub-basins. In addition, the initial and constant-rate method, user synthetic unit hydrograph method, and exponential recession method were used to estimate runoff volumes, the direct runoff hydrograph, and the baseflow hydrograph, respectively. The Muskingum model with variable parameters was then applied in the Louzigou Basin in Henan Province, China. In 24 flood events, the percentages of flood events with a relative error of peak discharge less than 20% and of runoff volume less than 10% are both 100%, and the percentages of flood events with coefficients of determination greater than 0.8 are 83.33%, 91.67%, and 87.5% for rectangular, triangular, and parabolic cross sections, respectively. Therefore, this method is applicable to ungauged basins.
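
    The routing step uses the standard Muskingum coefficients, recomputed at each step when K and x vary; a minimal sketch (here K and x are supplied as per-step arrays; in the paper they follow from reach geometry and the current discharge):

```python
import numpy as np

def muskingum_route(inflow, K, x, dt, O0=None):
    """inflow: hydrograph array; K, x: per-step parameter arrays; dt: step."""
    O = np.empty_like(inflow, dtype=float)
    O[0] = inflow[0] if O0 is None else O0
    for t in range(len(inflow) - 1):
        denom = 2 * K[t] * (1 - x[t]) + dt
        c0 = (dt - 2 * K[t] * x[t]) / denom
        c1 = (dt + 2 * K[t] * x[t]) / denom
        c2 = (2 * K[t] * (1 - x[t]) - dt) / denom   # c0 + c1 + c2 = 1
        O[t + 1] = c0 * inflow[t + 1] + c1 * inflow[t] + c2 * O[t]
    return O
```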

  5. Development and Validation of a RP-HPLC Method for the ...

    African Journals Online (AJOL)

    Development and Validation of a RP-HPLC Method for the Simultaneous Determination of Rifampicin and a Flavonoid Glycoside - A Novel ... range, accuracy, precision, limit of detection, limit of quantification, robustness and specificity.

  6. Application of Modal Parameter Estimation Methods for Continuous Wavelet Transform-Based Damage Detection for Beam-Like Structures

    Directory of Open Access Journals (Sweden)

    Zhi Qiu

    2015-02-01

    Full Text Available This paper presents a hybrid damage detection method based on the continuous wavelet transform (CWT) and modal parameter identification techniques for beam-like structures. First, two kinds of mode shape estimation methods, herein referred to as the quadrature peaks picking (QPP) and rational fraction polynomial (RFP) methods, are used to identify the first four mode shapes of an intact beam-like structure from a hammer/accelerometer modal experiment. The results are compared and validated using a numerical simulation with ABAQUS software. To determine the relative damage detection effectiveness of the QPP-based and RFP-based methods when applying the CWT technique, the first two mode shapes calculated by the QPP and RFP methods are analyzed using the CWT. Experiments performed on different damage scenarios involving beam-like structures show that, owing to the outstanding denoising characteristic of the RFP-based (RFP-CWT) technique, the RFP-CWT method gives a clearer indication of the damage location than the conventionally used QPP-based (QPP-CWT) method. Finally, an overall evaluation of the damage detection is outlined, as the identification results suggest that the newly proposed RFP-CWT method is accurate and reliable in terms of detection of damage locations on beam-like structures.

  7. Validação de métodos cromatográficos para a determinação de resíduos de medicamentos veterinários em alimentos Validation of chromatographic methods for the determination of residues of veterinary drugs in foods

    OpenAIRE

    Jonas Augusto Rizzato Paschoal; Susanne Rath; Flavia Pereira da Silva Airoldi; Felix G. R. Reyes

    2008-01-01

    Different agencies that supply validation guidelines worldwide establish almost the same parameters to be evaluated in the validation process of bioanalytical methods. However, they recommend different procedures, as well as establish different acceptance criteria. The present review delineates and discusses the stages involved in the validation procedures of bioanalytical methods designed for determining veterinary residues in food, explaining the main differences in the guidelines establish...

  8. Quality of dried cauliflower according to the methods and drying parameters

    Directory of Open Access Journals (Sweden)

    Łapczyńska-Kordon Bogusława

    2018-01-01

    Full Text Available The quality of food products is a complex concept that can be defined in many ways; the common element of most definitions is the condition of meeting the requirements of consumers. Quality determines product compliance with the requirements set by standards and regulations. The paper attempts to determine the optimal method and parameters for cauliflower drying. In addition, a qualitative assessment of the obtained product was made. The results show that the method and parameters of drying significantly affect the quality of the dried cauliflower. Convection drying guarantees higher quality of the dried product with respect to the color of the sample (higher brightness), taste and odor. Of the drying parameters used in the experiment, the most positive effect on the tested parameters was obtained with convection drying at a flow rate of 0.2 m s-1, while microwave drying at 170 or 210 W was least favorable.

  9. Evaluation of Hydraulic Parameters Obtained by Different Measurement Methods for Heterogeneous Gravel Soil

    Directory of Open Access Journals (Sweden)

    Chen Zeng

    2012-01-01

    Full Text Available Knowledge of soil hydraulic parameters for the van Genuchten function is important to characterize soil water movement for watershed management. Accurate and rapid prediction of soil water flow in heterogeneous gravel soil has become a hot topic in recent years. However, it is difficult to precisely estimate hydraulic parameters in a heterogeneous soil with rock fragments. In this study, the HYDRUS-2D numerical model was used to evaluate hydraulic parameters for heterogeneous gravel soil that was irregularly embedded with rock fragments in a grape production base. The centrifugal method (CM), tensiometer method (TM) and inverse solution method (ISM) were compared for various parameters in the van Genuchten function. The soil core method (SCM), disc infiltration method (DIM) and inverse solution method (ISM) were also investigated for measuring saturated hydraulic conductivity. Simulation with the DIM approach revealed a problem of overestimating soil water infiltration, whereas simulation with the SCM approach revealed a problem of underestimating water movement as compared to actual field observation. The ISM approach produced the best simulation result even though this approach slightly overestimated soil moisture by ignoring the impact of rock fragments. This study provides useful information on the overall evaluation of soil hydraulic parameters attained with different measurement methods for simulating soil water movement and distribution in heterogeneous gravel soil.
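
    For reference, the van Genuchten retention function into which the estimated parameters enter can be evaluated in a few lines; the parameter values below are generic illustrative numbers, not those fitted in the study:

        import numpy as np

        def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
            """Water content theta(h) for the van Genuchten model.

            h is matric suction (positive, in the same length unit as
            1/alpha); m is tied to n by the usual Mualem constraint
            m = 1 - 1/n.
            """
            m = 1.0 - 1.0 / n
            se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)  # effective saturation
            return theta_r + (theta_s - theta_r) * se

        # Illustrative parameter values only (not the paper's estimates).
        h = np.logspace(0, 4, 50)  # suction, cm
        theta = van_genuchten_theta(h, theta_r=0.05, theta_s=0.43,
                                    alpha=0.036, n=1.56)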

  10. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
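
    As a minimal sketch of the item-analysis step (vi), the snippet below computes classical item difficulty and a corrected item-total discrimination index for a 0/1-scored test; the data are toy values, and this is not the authors' code:

        import numpy as np

        def item_analysis(responses):
            """Classical item analysis for a 0/1-scored knowledge test.

            responses: (n_respondents, n_items) array of 0/1 scores.
            Returns per-item difficulty (proportion correct) and
            discrimination (corrected item-total correlation; items
            with zero variance yield nan).
            """
            responses = np.asarray(responses, dtype=float)
            difficulty = responses.mean(axis=0)
            total = responses.sum(axis=1)
            discrimination = np.empty(responses.shape[1])
            for j in range(responses.shape[1]):
                rest = total - responses[:, j]  # exclude the item itself
                discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
            return difficulty, discrimination

        # Toy data: 6 respondents, 4 items.
        rng = np.random.default_rng(0)
        data = (rng.random((6, 4)) > 0.4).astype(int)
        diff, disc = item_analysis(data)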

  11. Parameter Selection for Ant Colony Algorithm Based on Bacterial Foraging Algorithm

    Directory of Open Access Journals (Sweden)

    Peng Li

    2016-01-01

    Full Text Available The optimal performance of the ant colony algorithm (ACA) mainly depends on suitable parameters; therefore, parameter selection for ACA is important. We propose a parameter selection method for ACA based on the bacterial foraging algorithm (BFA), considering the effects of coupling between different parameters. Firstly, parameters for ACA are mapped into a multidimensional space, using a chemotactic operator to ensure that each parameter group approaches the optimal value, speeding up the convergence for each parameter set. Secondly, the operation speed for optimizing the entire parameter set is accelerated using a reproduction operator. Finally, the elimination-dispersal operator is used to strengthen the global optimization of the parameters, which avoids falling into a local optimal solution. In order to validate the effectiveness of this method, the results were compared with those using a genetic algorithm (GA) and particle swarm optimization (PSO), and simulations were conducted using different grid maps for robot path planning. The results indicated that parameter selection for ACA based on BFA was the superior method, able to determine the best parameter combination rapidly, accurately, and effectively.

  12. Application of decomposition method and inverse prediction of parameters in a moving fin

    International Nuclear Information System (INIS)

    Singla, Rohit K.; Das, Ranjan

    2014-01-01

    Highlights: • Adomian decomposition is used to study a moving fin. • Effects of different parameters on the temperature and efficiency are studied. • Binary-coded GA is used to solve an inverse problem. • Sensitivity analyses of important parameters are carried out. • Measurement error up to 8% is found to be tolerable. - Abstract: The application of the Adomian decomposition method (ADM) is extended to study a conductive–convective and radiating moving fin having variable thermal conductivity. Next, through an inverse approach, ADM in conjunction with a binary-coded genetic algorithm (GA) is also applied for estimation of unknown properties in order to satisfy a given temperature distribution. ADM being one of the widely-used numerical methods for solving non-linear equations, the required temperature field has been obtained using a forward method involving ADM. In the forward problem, the temperature field and efficiency are investigated for various parameters such as the convection–conduction parameter, radiation–conduction parameter, Peclet number, convection sink temperature, radiation sink temperature, and dimensionless thermal conductivity. Additionally, in the inverse problem, the effects of random measurement errors, iterative variation of parameters, and sensitivity coefficients of unknown parameters are investigated. The performance of GA is compared with a few other optimization methods as well as with different temperature measurement points. It is found from the present study that the results obtained from ADM are in good agreement with the results of the differential transformation method available in the literature. It is also observed that for satisfactory reconstruction of the temperature field, the measurement error should be within 8%, and that the temperature field depends more strongly on the speed than on the thermal parameters of the moving fin.

  13. Parameters estimation for reactive transport: A way to test the validity of a reactive model

    Science.gov (United States)

    Aggarwal, Mohit; Cheikh Anta Ndiaye, Mame; Carrayrou, Jérôme

    The chemical parameters used in reactive transport models are not known accurately due to the complexity and the heterogeneous conditions of a real domain. We present an efficient algorithm to estimate the chemical parameters using a Monte-Carlo method. Monte-Carlo methods are very robust for the optimisation of the highly non-linear mathematical models describing reactive transport. Reactive transport of tributyltin (TBT) through natural quartz sand at seven different pHs is taken as the test case. Our algorithm is used to estimate the chemical parameters of the sorption of TBT onto the natural quartz sand. By testing and comparing three models of surface complexation, we show that the proposed adsorption model cannot explain the experimental data.
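
    A generic version of such a Monte-Carlo search (not the authors' algorithm) can be sketched as follows; here `simulate` stands for any forward reactive-transport model and the parameter bounds are placeholders:

        import numpy as np

        def monte_carlo_fit(simulate, observed, bounds, n_draws=10000, seed=1):
            """Plain Monte-Carlo search for chemical parameters.

            simulate(params) -> model output comparable with `observed`;
            bounds: list of (low, high) per parameter. Returns the draw
            with the smallest sum-of-squares misfit.
            """
            rng = np.random.default_rng(seed)
            best_params, best_cost = None, np.inf
            for _ in range(n_draws):
                params = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
                cost = np.sum((simulate(params) - observed) ** 2)
                if cost < best_cost:
                    best_params, best_cost = params, cost
            return best_params, best_cost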

  14. Parameters of explosives detection through tagged neutron method

    Energy Technology Data Exchange (ETDEWEB)

    Bagdasaryan, Kh.E.; Batyaev, V.F.; Belichenko, S.G., E-mail: consul757@mail.ru; Bestaev, R.R.; Gavryuchenkov, A.V.; Karetnikov, M.D.

    2015-06-01

    The potentialities of tagged neutron method (TNM) for explosives detection are examined on the basis of an idealized geometrical model. The model includes ING-27 14 MeV neutron generator with a built-in α-detector, a LYSO γ-detector and samples of material to be identified of approximately 0.3 kg each: explosives imitators (trinitrotoluene - TNT, tetryl, RDX and ammonium nitrate), legal materials (sugar, water, silk and polyethylene). The samples were unshielded or shielded by a paper layer of various thicknesses. The experimental data were interpreted by numerical simulation using a Poisson distribution of signals with the statistical parameters defined experimentally. The detection parameters were obtained by a pattern classification theory and a Bayes classifier.

  15. Reliability and validity of a smartphone-based assessment of gait parameters across walking speed and smartphone locations: Body, bag, belt, hand, and pocket.

    Science.gov (United States)

    Silsupadol, Patima; Teja, Kunlanan; Lugade, Vipul

    2017-10-01

    The assessment of spatiotemporal gait parameters is a useful clinical indicator of health status. Unfortunately, most assessment tools require controlled laboratory environments which can be expensive and time consuming. As smartphones with embedded sensors are becoming ubiquitous, this technology can provide a cost-effective, easily deployable method for assessing gait. Therefore, the purpose of this study was to assess the reliability and validity of a smartphone-based accelerometer in quantifying spatiotemporal gait parameters when attached to the body or in a bag, belt, hand, and pocket. Thirty-four healthy adults were asked to walk at self-selected comfortable, slow, and fast speeds over a 10-m walkway while carrying a smartphone. Step length, step time, gait velocity, and cadence were computed from smartphone-based accelerometers and validated with GAITRite. Across all walking speeds, smartphone data had excellent reliability (ICC(2,1) ≥ 0.90) for the body and belt locations, with bag, hand, and pocket locations having good to excellent reliability (ICC(2,1) ≥ 0.69). Correlations between the smartphone-based and GAITRite-based systems were very high for the body (r = 0.89, 0.98, 0.96, and 0.87 for step length, step time, gait velocity, and cadence, respectively). Similarly, Bland-Altman analysis demonstrated that the bias approached zero, particularly in the body, bag, and belt conditions under comfortable and fast speeds. Thus, smartphone-based assessments of gait are most valid when placed on the body, in a bag, or on a belt. The use of a smartphone to assess gait can provide relevant data to clinicians without encumbering the user and allow for data collection in the free-living environment. Copyright © 2017 Elsevier B.V. All rights reserved.
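
    A much-simplified sketch of how temporal gait parameters can be extracted from a vertical-acceleration trace by peak picking (a generic approach, not the validated smartphone algorithm; thresholds and the toy signal are illustrative):

        import numpy as np
        from scipy.signal import find_peaks

        def temporal_gait_parameters(acc_vertical, fs):
            """Estimate step times and cadence from a vertical-acceleration
            trace by generic peak picking."""
            # Heel contacts appear as acceleration peaks; enforce a minimum
            # spacing of 0.3 s so one step is not counted twice.
            peaks, _ = find_peaks(acc_vertical,
                                  height=np.mean(acc_vertical),
                                  distance=int(0.3 * fs))
            step_times = np.diff(peaks) / fs      # seconds per step
            cadence = 60.0 / np.mean(step_times)  # steps per minute
            return step_times, cadence

        # Toy signal: ~1.8 steps/s walking sampled at 100 Hz.
        fs = 100
        t = np.arange(0, 10, 1 / fs)
        acc = np.sin(2 * np.pi * 1.8 * t) + 0.05 * np.random.randn(t.size)
        steps, cad = temporal_gait_parameters(acc, fs)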

  16. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    Directory of Open Access Journals (Sweden)

    Henk W. R. Schreuder

    2014-01-01

    Full Text Available Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for use in training of robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n=42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P<0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialists starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery.

  17. Optimization and Validation of Quantitative Spectrophotometric Methods for the Determination of Alfuzosin in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    M. Vamsi Krishna

    2007-01-01

    Full Text Available Three accurate, simple and precise spectrophotometric methods for the determination of alfuzosin hydrochloride in bulk drugs and tablets are developed. The first method is based on the reaction of alfuzosin with ninhydrin reagent in N,N'-dimethylformamide (DMF) medium, producing a colored product which absorbs maximally at 575 nm. Beer's law is obeyed in the concentration range 12.5-62.5 µg/mL of alfuzosin. The second method is based on the reaction of the drug with ascorbic acid in DMF medium, resulting in the formation of a colored product which absorbs maximally at 530 nm. Beer's law is obeyed in the concentration range 10-50 µg/mL of alfuzosin. The third method is based on the reaction of alfuzosin with p-benzoquinone (PBQ) to form a colored product with λmax at 400 nm. The products of the reactions were stable for 2 h at room temperature. The optimum experimental parameters for the reactions have been studied. The validity of the described procedures was assessed. Statistical analysis of the results has been carried out, revealing high accuracy and good precision. The proposed methods could be used for the determination of alfuzosin in pharmaceutical formulations. The procedures are rapid, simple and suitable for quality control application.
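
    For illustration, a Beer's-law calibration of this kind reduces to a straight-line fit; the absorbance values below are invented, not the paper's data:

        import numpy as np

        # Hypothetical calibration of the ninhydrin method at 575 nm:
        # absorbance vs. alfuzosin concentration within the Beer's-law range.
        conc = np.array([12.5, 25.0, 37.5, 50.0, 62.5])        # µg/mL
        absorbance = np.array([0.10, 0.21, 0.31, 0.42, 0.52])  # made-up values

        slope, intercept = np.polyfit(conc, absorbance, 1)
        r = np.corrcoef(conc, absorbance)[0, 1]

        # Quantify an unknown sample from its measured absorbance.
        a_sample = 0.27
        c_sample = (a_sample - intercept) / slope
        print(f"r = {r:.4f}, c = {c_sample:.1f} µg/mL")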

  18. Validation of an analytical method applicable to study of 1 mg/mL oral Risperidone solution stability

    International Nuclear Information System (INIS)

    Abreu Alvarez, Maikel; Garcia Penna, Caridad Margarita; Martinez Miranda, Lissette

    2010-01-01

    A validated analytical method by high-performance liquid chromatography (HPLC) was applied to study the stability of a 1 mg/mL Risperidone oral solution. The method was linear, precise, specific and accurate. A stability study of the 1 mg/mL Risperidone oral solution was conducted to determine its expiry date. The shelf-life study was conducted for 24 months at room temperature, whereas the accelerated stability study was conducted with the product under the influence of humidity and temperature; analysis was made over 3 months. The formula fulfilled the quality specifications described in the Pharmacopeia. The shelf-life results after 24 months showed that the product maintains the parameters determining its quality during this time, and in the accelerated studies there was no significant degradation (p > 0.05) of the product. Under the mentioned conditions the expiry date was 2 years.

  19. An extended validation of the last generation of particle finite element method for free surface flows

    Science.gov (United States)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This interesting variant, previously described in the literature as PFEM-2, is able to use larger time steps when compared to other similar numerical tools which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems, being the main topic of this paper a deep validation of this methodology for a wider range of flows. To accomplish this task, different improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time-steps. The results of the different free-surface problems solved, which include: Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well validated numerical alternatives or experimental measurements obtaining accurate approximations for such complex flows.

  20. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725‐4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of an analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
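
    A deliberately simplified sketch of a beta-expectation tolerance interval at a single concentration level is shown below; the full accuracy-profile computation pools within- and between-series variance components per ISO 5725-4, which is not reproduced here:

        import numpy as np
        from scipy import stats

        def tolerance_interval(measured, true_value, beta=0.80):
            """Simplified beta-expectation tolerance interval at one
            concentration level (single-series case), expressed as
            relative bias and relative tolerance limits in percent."""
            x = np.asarray(measured, float)
            n = x.size
            mean, s = x.mean(), x.std(ddof=1)
            k = stats.t.ppf((1.0 + beta) / 2.0, df=n - 1) * np.sqrt(1.0 + 1.0 / n)
            rel = lambda v: 100.0 * (v - true_value) / true_value
            return rel(mean), rel(mean - k * s), rel(mean + k * s)

        # Toy replicates at a nominal 10 µg/L level: relative bias and
        # tolerance limits to compare against, e.g., ±10% acceptability.
        bias, low, high = tolerance_interval([9.8, 10.3, 10.1, 9.7, 10.2], 10.0)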

  1. Online In-Core Thermal Neutron Flux Measurement for the Validation of Computational Methods

    International Nuclear Information System (INIS)

    Mohamad Hairie Rabir; Muhammad Rawi Mohamed Zin; Yahya Ismail

    2016-01-01

    In order to verify and validate the computational methods for neutron flux calculation in RTP, a series of thermal neutron flux measurements has been performed. A Self Powered Neutron Detector (SPND) was used to measure the thermal neutron flux and verify the calculated neutron flux distribution in the TRIGA reactor. Measurement results were obtained online for different power levels of the reactor. The experimental results were compared to calculations performed with the Monte Carlo code MCNP using a detailed geometrical model of the reactor. The calculated and measured thermal neutron fluxes in the core are in very good agreement, indicating that the material and geometrical properties of the reactor core are modelled well. In conclusion, one can state that our computational model describes very well the neutron flux distribution in the reactor core. Since the computational model properly describes the reactor core, it can be used for calculations of reactor core parameters and for optimization of RTP utilization. (author)

  2. General methods for modified projective synchronization of hyperchaotic systems with known or unknown parameters

    Science.gov (United States)

    Tang, Yang; Fang, Jian-an

    2008-03-01

    This work is concerned with the general methods for modified projective synchronization of hyperchaotic systems. A systematic method of active control is developed to synchronize two hyperchaotic systems with known parameters. Moreover, by combining the adaptive control and linear feedback methods, general sufficient conditions for the modified projective synchronization of identical or different chaotic systems with fully unknown or partially unknown parameters are presented. Meanwhile, the speed of parameters identification can be regulated by adjusting adaptive gain matrix. Numerical simulations verify the effectiveness of the proposed methods.

  3. Seasonal evolution of soil and plant parameters on the agricultural Gebesee test site: a database for the set-up and validation of EO-LDAS and satellite-aided retrieval models

    Directory of Open Access Journals (Sweden)

    S. C. Truckenbrodt

    2018-03-01

    Full Text Available Ground reference data are a prerequisite for the calibration, update, and validation of retrieval models facilitating the monitoring of land parameters based on Earth Observation data. Here, we describe the acquisition of a comprehensive ground reference database which was created to test and validate the recently developed Earth Observation Land Data Assimilation System (EO-LDAS) and products derived from remote sensing observations in the visible and infrared range. In situ data were collected for seven crop types (winter barley, winter wheat, spring wheat, durum, winter rape, potato, and sugar beet) cultivated on the agricultural Gebesee test site, central Germany, in 2013 and 2014. The database contains information on hyperspectral surface reflectance factors, the evolution of biophysical and biochemical plant parameters, phenology, surface conditions, atmospheric states, and a set of ground control points. Ground reference data were gathered at an approximately weekly resolution and on different spatial scales to investigate variations within and between acreages. In situ data collected less than 1 day apart from satellite acquisitions (RapidEye, SPOT 5, Landsat-7 and -8) with a cloud coverage ≤ 25 % are available for 10 and 15 days in 2013 and 2014, respectively. The measurements show that the investigated growing seasons were characterized by distinct meteorological conditions causing interannual variations in the parameter evolution. Here, the experimental design of the field campaigns and the methods employed in the determination of all parameters are described in detail. Insights into the database are provided and potential fields of application are discussed. The data will contribute to a further development of crop monitoring methods based on remote sensing techniques. The database is freely available at PANGAEA (https://doi.org/10.1594/PANGAEA.874251).

  4. Assessing different parameters estimation methods of Weibull distribution to compute wind power density

    International Nuclear Information System (INIS)

    Mohammadi, Kasra; Alavi, Omid; Mostafaeipour, Ali; Goudarzi, Navid; Jalilvand, Mahdi

    2016-01-01

    Highlights: • Effectiveness of six numerical methods is evaluated to determine wind power density. • The more appropriate method for computing the daily wind power density is estimated. • Four windy stations in the southern part of Alberta, Canada are investigated. • The more appropriate parameter estimation method was not identical among all examined stations. - Abstract: In this study, the effectiveness of six numerical methods is evaluated to determine the shape (k) and scale (c) parameters of the Weibull distribution function for the purpose of calculating the wind power density. The selected methods are the graphical method (GP), empirical method of Justus (EMJ), empirical method of Lysen (EML), energy pattern factor method (EPF), maximum likelihood method (ML) and modified maximum likelihood method (MML). The purpose of this study is to identify the more appropriate method for computing the wind power density at four stations distributed in the Alberta province of Canada, namely Edmonton City Center Awos, Grande Prairie A, Lethbridge A and Waterton Park Gate. To provide a complete analysis, the evaluations are performed on both daily and monthly scales. The results indicate that the precision of the computed wind power density values changes when different parameter estimation methods are used to determine the k and c parameters. The four methods EMJ, EML, EPF and ML present very favorable efficiency, while the GP method shows weak ability for all stations. However, it is found that the most effective method is not the same among stations, owing to differences in the wind characteristics.
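
    As an example of one of the six estimators, the empirical method of Justus and the resulting Weibull mean wind power density can be coded directly (toy wind speeds; the k formula assumes the method's usual validity range of roughly 1 ≤ k ≤ 10):

        import numpy as np
        from scipy.special import gamma

        def weibull_emj(v):
            """Empirical method of Justus: Weibull k and c from the mean
            and standard deviation of wind-speed observations v (m/s)."""
            v = np.asarray(v, float)
            v_mean, v_std = v.mean(), v.std(ddof=1)
            k = (v_std / v_mean) ** (-1.086)
            c = v_mean / gamma(1.0 + 1.0 / k)
            return k, c

        def wind_power_density(k, c, rho=1.225):
            """Mean wind power density (W/m^2) of a Weibull wind regime."""
            return 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)

        # Toy wind-speed sample (m/s).
        v = np.array([4.1, 5.3, 6.0, 3.2, 7.5, 5.8, 4.9, 6.6])
        k, c = weibull_emj(v)
        wpd = wind_power_density(k, c)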

  5. On-line scheme for parameter estimation of nonlinear lithium ion battery equivalent circuit models using the simplified refined instrumental variable method for a modified Wiener continuous-time model

    International Nuclear Information System (INIS)

    Allafi, Walid; Uddin, Kotub; Zhang, Cheng; Mazuir Raja Ahsan Sha, Raja; Marco, James

    2017-01-01

    Highlights: •Off-line estimation approach for continuous-time domain for non-invertible function. •Model reformulated to multi-input-single-output; nonlinearity described by sigmoid. •Method directly estimates parameters of nonlinear ECM from the measured data. •Iterative on-line technique leads to smoother convergence. •The model is validated off-line and on-line using NCA battery. -- Abstract: The accuracy of identifying the parameters of models describing lithium ion batteries (LIBs) in typical battery management system (BMS) applications is critical to the estimation of key states such as the state of charge (SoC) and state of health (SoH). In applications such as electric vehicles (EVs) where LIBs are subjected to highly demanding cycles of operation and varying environmental conditions leading to non-trivial interactions of ageing stress factors, this identification is more challenging. This paper proposes an algorithm that directly estimates the parameters of a nonlinear battery model from measured input and output data in the continuous time-domain. The simplified refined instrumental variable method is extended to estimate the parameters of a Wiener model where there is no requirement for the nonlinear function to be invertible. To account for nonlinear battery dynamics, in this paper, the typical linear equivalent circuit model (ECM) is enhanced by a block-oriented Wiener configuration where the nonlinear memoryless block following the typical ECM is defined to be a sigmoid static nonlinearity. The nonlinear Wiener model is reformulated in the form of a multi-input, single-output linear model. This linear form allows the parameters of the nonlinear model to be estimated using any linear estimator such as the well-established least squares (LS) algorithm. In this paper, the recursive least square (RLS) method is adopted for online parameter estimation. The approach was validated on experimental data measured from an 18650-type Graphite
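
    A generic recursive least-squares update of the kind mentioned (standard RLS with a forgetting factor, not the paper's full simplified refined instrumental variable scheme) looks like this:

        import numpy as np

        def rls_update(theta, P, phi, y, lam=0.999):
            """One recursive least-squares step for a linear-in-parameters
            model y = phi . theta + e, with forgetting factor lam."""
            phi = phi.reshape(-1, 1)
            denom = lam + (phi.T @ P @ phi).item()
            k = P @ phi / denom               # gain vector
            err = y - (phi.T @ theta).item()  # prediction error
            theta = theta + k * err
            P = (P - k @ phi.T @ P) / lam
            return theta, P

        # Toy use: track two parameters online from streaming data.
        rng = np.random.default_rng(0)
        theta = np.zeros((2, 1))
        P = 1e3 * np.eye(2)
        true = np.array([[0.8], [-0.2]])
        for _ in range(200):
            phi = rng.standard_normal(2)
            y = (phi @ true).item() + 0.01 * rng.standard_normal()
            theta, P = rls_update(theta, P, phi, y)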

  6. Analytical method development and validation for quantification of uranium by Fourier Transform Infrared Spectroscopy (FTIR) for routine quality control analysis

    International Nuclear Information System (INIS)

    Pereira, Elaine; Silva, Ieda de S.; Gomide, Ricardo G.; Pires, Maria Aparecida F.

    2015-01-01

    This work presents a low-cost, simple and new methodology for the direct determination of uranium in different uranium matrices: organic phase (UO₂(NO₃)₂·2TBP - uranyl nitrate complex) and aqueous phase (UO₂(NO₃)₂ - NTU - uranyl nitrate), based on Fourier Transform Infrared spectroscopy (FTIR) using the KBr pellet technique. Analytical validation is essential to define whether a developed methodology is completely adjusted to the objectives for which it is intended, and is considered one of the main instruments of quality control. The parameters used in the validation process were: selectivity, linearity, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision), accuracy and robustness. The method for uranium in the organic phase (UO₂(NO₃)₂·2TBP in hexane, embedded in KBr) was linear (r = 0.9989) over the range of 1.0 g L⁻¹ to 14.3 g L⁻¹, with LD of 92.1 mg L⁻¹ and LQ of 113.1 mg L⁻¹, precise (RSD < 1.6% and p-value < 0.05) and accurate (recovery of 100.1% - 102.9%). The method for uranium in the aqueous phase (UO₂(NO₃)₂ embedded in KBr) was linear (r = 0.9964) over the range of 5.4 g L⁻¹ to 51.2 g L⁻¹, with LD of 835 mg L⁻¹ and LQ of 958 mg L⁻¹, precise (RSD < 1.0% and p-value < 0.05) and accurate (recovery of 99.1% - 102.0%). The FTIR method is robust regarding most of the variables analyzed, as the differences between results obtained under nominal and modified conditions were lower than the critical value for all analytical parameters studied. Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results across all three methods. The statistical tests (Student-t and Fisher) showed that the techniques are equivalent. (author)

  7. Method Development and Validation for the Determination of Caffeine: An Alkaloid from Coffea arabica by High-performance Liquid Chromatography Method.

    Science.gov (United States)

    Naveen, P; Lingaraju, H B; Deepak, M; Medhini, B; Prasad, K Shyam

    2018-01-01

    The present study was undertaken to develop and validate a reversed-phase high-performance liquid chromatography method for the determination of caffeine from bean material of Coffea arabica. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of water:methanol (50:50) at a flow rate of 1.0 mL min⁻¹. The detection was carried out on a UV detector at 272 nm. The developed method was validated according to the requirements of the International Conference on Harmonisation (ICH) guidelines, which include specificity, linearity, precision, accuracy, limit of detection and limit of quantitation. The developed method showed good linearity with an excellent correlation coefficient (R² > 0.999). In repeatability and intermediate precision, the percentage relative standard deviation (% RSD) of peak area was less than 1%, showing the high precision of the method. The recovery rate for caffeine was within 98.78% - 101.28%, indicating the high accuracy of the method. The low limit of detection and limit of quantitation of caffeine enable the detection and quantitation of caffeine from C. arabica at low concentrations. The developed HPLC method is simple, rapid, precise, accurate and widely accepted, and it is recommended for efficient assays in routine work. A simple, accurate, and sensitive high-performance liquid chromatography (HPLC) method for caffeine from Coffea arabica has been developed and validated. The developed HPLC method was validated for linearity, specificity, precision, recovery, limits of detection, and limits of quantification by the International Conference on Harmonization guidelines. The results revealed that the proposed method is highly reliable. This method could be successfully applied for routine quality work analysis. Abbreviation Used: C. arabica: Coffea arabica, ICH: International Conference on Harmonisation, % RSD: Percentage Relative Standard Deviation, R²: Correlation Coefficient, ppm: Parts per million, LOD: Limits

  8. τ lifetime with the impact parameter difference method

    International Nuclear Information System (INIS)

    Andreazza, A.

    1995-01-01

    The impact parameter difference method for measuring the τ lifetime has been used since 1990 in the ALEPH and DELPHI collaborations at LEP. This paper is mainly devoted to the description of the method. The most recent preliminary results, τ_τ = 288.1 ± 5.4 (stat.) ± 1.2 (syst.) fs obtained by the ALEPH collaboration on 1992 data and τ_τ = 292.8 ± 5.0 (stat.) ± 3.7 (syst.) fs from the combined DELPHI analysis of 1992-93 data, are still statistically limited; therefore a global error on the τ lifetime of less than 1% per experiment should be attainable with this method at the end of LEP-1 running. ((orig.))

  9. Discussion of the experimental methods of the estimation of the reaction impact parameter

    International Nuclear Information System (INIS)

    Muryn, B.; Dziunikowska, K.; Eskreys, A.; Coghen, T.

    1978-01-01

    Two methods of determination of the reaction impact parameter, one proposed by Webber and the other by Henyey and Pumplin, are compared and discussed. It is shown that the lower limits of the impact parameter b_L obtained by means of these methods are comparable and are always very low (approximately < 0.5 fm). Using the example of the Henyey-Pumplin method, it is argued that the experimentally obtained values of b_L may be very unreliable estimates of the reaction impact parameter and that any comparison of different reactions or reaction channels may be meaningless. (author)

  10. A need for determination of arsenic species at low levels in cereal-based food and infant cereals. Validation of a method by IC-ICPMS.

    Science.gov (United States)

    Llorente-Mirandes, Toni; Calderón, Josep; Centrich, Francesc; Rubio, Roser; López-Sánchez, José Fermín

    2014-03-15

    The present study arose from the need to determine inorganic arsenic (iAs) at low levels in cereal-based food. Validated methods with a low limit of detection (LOD) are required to analyse these kinds of food. An analytical method for the determination of iAs, methylarsonic acid (MA) and dimethylarsinic acid (DMA) in cereal-based food and infant cereals is reported. The method was optimised and validated to achieve low LODs. Ion chromatography-inductively coupled plasma mass spectrometry (IC-ICPMS) was used for arsenic speciation. The main quality parameters were established. To expand the applicability of the method, different cereal products were analysed: bread, biscuits, breakfast cereals, wheat flour, corn snacks, pasta and infant cereals. The total and inorganic arsenic content of 29 cereal-based food samples ranged between 3.7-35.6 and 3.1-26.0 μg As kg⁻¹, respectively. The present method could be considered a valuable tool for assessing inorganic arsenic contents in cereal-based foods. Copyright © 2013 Elsevier Ltd. All rights reserved.
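
    For illustration, ICH-style LOD and LOQ estimates from the residual standard deviation of a calibration line can be computed as below (the calibration data are made up, not the paper's):

        import numpy as np

        # Hypothetical iAs calibration by IC-ICPMS: signal vs. µg As kg-1.
        conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
        signal = np.array([12.0, 410.0, 1010.0, 2005.0, 3990.0])  # made-up

        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        s_y = np.sqrt(np.sum(resid ** 2) / (conc.size - 2))  # residual SD

        lod = 3.3 * s_y / slope   # ICH-style limit of detection
        loq = 10.0 * s_y / slope  # limit of quantitation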

  11. Radon decay product in-door behaviour - parameter, measurement method, and model review

    International Nuclear Information System (INIS)

    Scofield, P.

    1988-01-01

    This report reviews parameters used to characterize indoor radon daughter behavior and concentrations. Certain parameters that affect indoor radon daughter concentrations are described, and the values obtained experimentally or theoretically are summarized. Radon daughter measurement methods are reviewed, such as PAEC, unattached daughters, particle size distributions, and plateout measurement methods. In addition, certain radon pressure-driven/diffusion models and indoor radon daughter models are briefly described. (orig.)

  12. A Parameter Robust Method for Singularly Perturbed Delay Differential Equations

    Directory of Open Access Journals (Sweden)

    Erdogan Fevzi

    2010-01-01

    Full Text Available Uniform finite difference methods are constructed via nonstandard finite difference methods for the numerical solution of singularly perturbed quasilinear initial value problem for delay differential equations. A numerical method is constructed for this problem which involves the appropriate Bakhvalov meshes on each time subinterval. The method is shown to be uniformly convergent with respect to the perturbation parameter. A numerical example is solved using the presented method, and the computed result is compared with exact solution of the problem.

  13. Method of reduction of diagnostic parameters during observation on the example of a combustion engine

    Directory of Open Access Journals (Sweden)

    Orczyk Malgorzata

    2017-01-01

    Full Text Available The article presents a method of selecting diagnostic parameters which map the damage process of an object. The method consists in calculating, during observation, the correlation coefficient between the damage intensity and the individual diagnostic parameters, and in discarding those parameters whose correlation coefficient values fall outside the narrowest confidence interval of the correlation coefficient. The characteristic feature of this method is that the parameters are reduced during the diagnostic experiment. The essence of the proposed method is illustrated by the vibration diagnosis of an internal combustion engine.

  14. Validation of Cardiovascular Parameters during NASA's Functional Task Test

    Science.gov (United States)

    Arzeno, N. M.; Stenger, M. B.; Bloomberg, J. J.; Platts, S. H.

    2009-01-01

    Microgravity exposure causes physiological deconditioning and impairs crewmember task performance. The Functional Task Test (FTT) is designed to correlate these physiological changes to performance in a series of operationally-relevant tasks. One of these, the Recovery from Fall/Stand Test (RFST), tests both the ability to recover from a prone position and cardiovascular responses to orthostasis. PURPOSE: Three minutes were chosen for the duration of this test, yet it is unknown if this is long enough to induce cardiovascular responses similar to the operational 5 min stand test. The purpose of this study was to determine the validity and reliability of heart rate variability (HRV) analysis of a 3 min stand and to examine the effect of spaceflight on these measures. METHODS: To determine the validity of using 3 vs. 5 min of standing to assess HRV, ECG was collected from 7 healthy subjects who participated in a 6 min RFST. Mean R-R interval (RR) and spectral HRV were measured in minutes 0-3 and 0-5 following the heart rate transient due to standing. Significant differences between the segments were determined by a paired t-test. To determine the reliability of the 3-min stand test, 13 healthy subjects completed 3 trials of the FTT on separate days, including the RFST with a 3 min stand. Analysis of variance (ANOVA) was performed on the HRV measures. One crewmember completed the FTT before a 14-day mission, on landing day (R+0) and one (R+1) day after returning to Earth. RESULTS VALIDITY: HRV measures reflecting autonomic activity were not significantly different during the 0-3 and 0-5 min segments. RELIABILITY: The average coefficient of variation for RR, systolic (SBP) and diastolic blood pressures during the RFST were less than 8% for the 3 sessions. ANOVA results yielded a greater inter-subject variability (p < 0.05) for HRV in the RFST. SPACEFLIGHT: Lower RR and higher SBP were observed on R+0 in rest and stand. On R+1, both RR and SBP trended towards preflight
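
    A rough sketch of the spectral HRV computation described (even resampling of the R-R tachogram followed by Welch PSD integration; the LF and HF band limits are the conventional ones, assumed rather than taken from the record):

        import numpy as np
        from scipy.signal import welch

        def hrv_spectral(rr_intervals, fs_resample=4.0):
            """Spectral HRV from R-R intervals (in seconds): resample the
            tachogram evenly, then integrate the Welch PSD over the
            standard LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands."""
            rr = np.asarray(rr_intervals, float)
            t = np.cumsum(rr)
            t_even = np.arange(t[0], t[-1], 1.0 / fs_resample)
            rr_even = np.interp(t_even, t, rr)
            f, pxx = welch(rr_even - rr_even.mean(), fs=fs_resample,
                           nperseg=min(256, rr_even.size))
            df = f[1] - f[0]
            lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
            hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
            return lf, hf, lf / hf

        # Toy tachogram: ~75 bpm with mild respiratory modulation.
        beats = 300
        rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * np.arange(beats) * 0.8)
        lf, hf, ratio = hrv_spectral(rr)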

  15. Optimization and validation of Folin-Ciocalteu method for the determination of total polyphenol content of Pu-erh tea.

    Science.gov (United States)

    Musci, Marilena; Yao, Shicong

    2017-12-01

    Pu-erh tea is a post-fermented tea that has recently gained popularity worldwide, due to potential health benefits related to the antioxidant activity resulting from its high polyphenolic content. The Folin-Ciocalteu method is a simple, rapid, and inexpensive assay widely applied for the determination of total polyphenol content. Over the past years, it has been subjected to many modifications, often without any systematic optimization or validation. In our study, we sought to optimize the Folin-Ciocalteu method, evaluate quality parameters including linearity, precision and stability, and then apply the optimized model to determine the total polyphenol content of 57 Chinese teas, including green tea and aged and ripened Pu-erh tea. Our optimized Folin-Ciocalteu method reduced the analysis time and allowed us to analyze a large number of samples, to discriminate among the different teas, and to assess the effect of the post-fermentation process on polyphenol content.

  16. The dynamical core of the Aeolus 1.0 statistical-dynamical atmosphere model: validation and parameter optimization

    Science.gov (United States)

    Totz, Sonja; Eliseev, Alexey V.; Petri, Stefan; Flechsig, Michael; Caesar, Levke; Petoukhov, Vladimir; Coumou, Dim

    2018-02-01

    We present and validate a set of equations for representing the atmosphere's large-scale general circulation in an Earth system model of intermediate complexity (EMIC). These dynamical equations have been implemented in Aeolus 1.0, which is a statistical-dynamical atmosphere model (SDAM) and includes radiative transfer and cloud modules (Coumou et al., 2011; Eliseev et al., 2013). The statistical dynamical approach is computationally efficient and thus enables us to perform climate simulations at multimillennia timescales, which is a prime aim of our model development. Further, this computational efficiency enables us to scan large and high-dimensional parameter space to tune the model parameters, e.g., for sensitivity studies.Here, we present novel equations for the large-scale zonal-mean wind as well as those for planetary waves. Together with synoptic parameterization (as presented by Coumou et al., 2011), these form the mathematical description of the dynamical core of Aeolus 1.0.We optimize the dynamical core parameter values by tuning all relevant dynamical fields to ERA-Interim reanalysis data (1983-2009) forcing the dynamical core with prescribed surface temperature, surface humidity and cumulus cloud fraction. We test the model's performance in reproducing the seasonal cycle and the influence of the El Niño-Southern Oscillation (ENSO). We use a simulated annealing optimization algorithm, which approximates the global minimum of a high-dimensional function.With non-tuned parameter values, the model performs reasonably in terms of its representation of zonal-mean circulation, planetary waves and storm tracks. The simulated annealing optimization improves in particular the model's representation of the Northern Hemisphere jet stream and storm tracks as well as the Hadley circulation.The regions of high azonal wind velocities (planetary waves) are accurately captured for all validation experiments. The zonal-mean zonal wind and the integrated lower

  17. Validation of ultraviolet method to determine serum phosphorus level

    International Nuclear Information System (INIS)

    Garcia Borges, Lisandra; Perez Prieto, Teresa Maria; Valdes Diez, Lilliam

    2009-01-01

    Validation of a spectrophotometric method applicable in clinical labs was proposed for the analytical assessment of serum phosphates using a UV-phosphorus kit of domestic production from the 'Carlos J. Finlay' Biologics Production Enterprise (Havana, Cuba). The analysis method was based on the reaction of phosphorus with ammonium molybdate at acid pH, creating a complex measurable at 340 nm. Specificity and precision were measured, considering also the method robustness, linearity, accuracy and sensitivity. The analytical method was linear up to 4.8 mmol/L (r > 0.999) and precise (CV < 3%) over the concentration interval of clinical interest, where there were no interferences by the matrix. The detection limit was 0.037 mmol/L and the quantification limit 0.13 mmol/L; both were satisfactory for the use of the product.

  18. An accurate method for the determination of unlike potential parameters from thermal diffusion data

    International Nuclear Information System (INIS)

    El-Geubeily, S.

    1997-01-01

    A new method is introduced by means of which the unlike intermolecular potential parameters can be determined from experimental measurements of the thermal diffusion factor as a function of temperature. The method proved to be easy, accurate, and applicable to two-, three-, and four-parameter potential functions whose collision integrals are available. The potential parameters computed by this method are found to provide a faithful representation of the thermal diffusion data under consideration. 3 figs., 4 tabs

  19. Parameter Estimation of Damped Compound Pendulum Using Bat Algorithm

    Directory of Open Access Journals (Sweden)

    Saad Mohd Sazli

    2016-01-01

    Full Text Available In this study, parameter identification of the damped compound pendulum system is proposed using one of the most promising nature-inspired algorithms, the Bat Algorithm (BA). The procedure used to achieve the parameter identification of the experimental system consists of input-output data collection, ARX model order selection and parameter estimation using the BA method. A PRBS signal is used as the input signal to regulate the motor speed, whereas the output signal is taken from a position sensor. Both input and output data are used to estimate the parameters of the autoregressive with exogenous input (ARX) model. The performance of the model is validated using the mean squared error (MSE) between the actual and predicted output responses of the models. Finally, a comparative study is conducted between BA and a conventional estimation method, least squares (LS). Based on the results obtained, the MSE produced by the Bat Algorithm outperformed that of the Least Squares method.
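
    The ARX estimation step itself (with ordinary least squares in place of BA, for brevity) can be sketched as follows; the model orders and toy data are illustrative:

        import numpy as np

        def fit_arx(u, y, na=2, nb=2):
            """Least-squares fit of an ARX(na, nb) model
            y[k] = -a1*y[k-1] - ... + b1*u[k-1] + ... + e[k].
            Returns the parameter vector [a1..ana, b1..bnb] and the MSE."""
            n0 = max(na, nb)
            rows = []
            for k in range(n0, len(y)):
                rows.append(np.concatenate([-y[k - na:k][::-1],
                                            u[k - nb:k][::-1]]))
            phi = np.array(rows)
            target = y[n0:]
            theta, *_ = np.linalg.lstsq(phi, target, rcond=None)
            mse = np.mean((target - phi @ theta) ** 2)
            return theta, mse

        # Toy data from y[k] = 0.6 y[k-1] + 0.5 u[k-1] + noise.
        rng = np.random.default_rng(1)
        u = rng.standard_normal(500)
        y = np.zeros(500)
        for k in range(1, 500):
            y[k] = 0.6 * y[k - 1] + 0.5 * u[k - 1] + 0.01 * rng.standard_normal()
        theta, mse = fit_arx(u, y, na=1, nb=1)  # expect roughly [-0.6, 0.5]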

  20. Validation of a high-performance size-exclusion chromatography method to determine and characterize β-glucans in beer wort using a triple-detector array.

    Science.gov (United States)

    Tomasi, Ivan; Marconi, Ombretta; Sileoni, Valeria; Perretti, Giuseppe

    2017-01-01

    Beer wort β-glucans are high-molecular-weight non-starch polysaccharides that are of great interest to the brewing industry. Because glucans can increase the viscosity of solutions and form gels, hazes, and precipitates, they are often related to poor lautering performance and beer filtration problems. In this work, a simple and suitable method was developed to determine and characterize β-glucans in beer wort using size-exclusion chromatography coupled with a triple-detector array, which is composed of a light scatterer, a viscometer, and a refractive-index detector. The method's performance is comparable to that of the commercial reference method, as the statistical validation shows, and it enables one to obtain interesting parameters of β-glucan in beer wort, such as the molecular weight averages, fraction description, hydrodynamic radius, intrinsic viscosity, polydispersity and Mark-Houwink parameters. This characterization can be useful in brewing science to understand filtration problems, which are not always explained through conventional analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Edge Modeling by Two Blur Parameters in Varying Contrasts.

    Science.gov (United States)

    Seo, Suyoung

    2018-06-01

    This paper presents a method of modeling edge profiles with two blur parameters, and estimating and predicting those edge parameters with varying brightness combinations and camera-to-object distances (COD). First, the validity of the edge model is proven mathematically. Then, it is proven experimentally with edges from a set of images captured for specifically designed target sheets and with edges from natural images. Estimation of the two blur parameters for each observed edge profile is performed with a brute-force method to find parameters that produce global minimum errors. Then, using the estimated blur parameters, actual blur parameters of edges with arbitrary brightness combinations are predicted using a surface interpolation method (i.e., kriging). The predicted surfaces show that the two blur parameters of the proposed edge model depend on both dark-side edge brightness and light-side edge brightness following a certain global trend. This is similar across varying CODs. The proposed edge model is compared with a one-blur parameter edge model using experiments of the root mean squared error for fitting the edge models to each observed edge profile. The comparison results suggest that the proposed edge model has superiority over the one-blur parameter edge model in most cases where edges have varying brightness combinations.
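
    A stand-in for such a two-blur-parameter edge model (one Gaussian blur per side of the edge; this is an assumed functional form, not necessarily the paper's exact model) can be fitted with scipy:

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erf

        def two_blur_edge(x, dark, light, x0, s1, s2):
            """Edge profile with separate blur on the dark (s1) and
            light (s2) sides of the edge located at x0."""
            s = np.where(x < x0, s1, s2)
            return dark + (light - dark) * 0.5 * (
                1.0 + erf((x - x0) / (np.sqrt(2.0) * s)))

        # Fit to a toy observed profile (brightness values are invented).
        x = np.arange(-10, 11, dtype=float)
        obs = two_blur_edge(x, 30, 200, 0.3, 1.2, 2.0) + np.random.randn(x.size)
        p0 = [obs.min(), obs.max(), 0.0, 1.0, 1.0]
        popt, _ = curve_fit(two_blur_edge, x, obs, p0=p0)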

  2. Calculation of parameter failure probability of thermodynamic system by response surface and importance sampling method

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Chen Lisheng; Zhang Yangwei

    2012-01-01

    In this paper, a combined response surface and importance sampling method was applied to the calculation of the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, from which the combined response surface and importance sampling algorithm was established; the performance degradation model of the components and the simulation process of parameter failure in the physical process of the thermodynamic system were also presented. The parameter failure probability of the purification water system in a nuclear reactor was obtained by the combined method. The results show that the combined method is an effective approach for calculating the parameter failure probability of a thermodynamic system with high dimensionality and non-linear characteristics, because it achieves satisfactory precision with less computing time than the direct sampling method while avoiding the drawbacks of the response surface method. (authors)
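
    The importance-sampling half of such a scheme can be illustrated generically (standard-normal input space, with a normal proposal centred near an assumed design point; the limit state below is a toy, not the purification water system model):

        import numpy as np

        def importance_sampling_pf(limit_state, mu_is, sigma_is,
                                   n=100000, seed=2):
            """Failure probability P[g(X) < 0] by importance sampling:
            draw from a normal proposal centred at mu_is and reweight by
            the ratio of standard-normal to proposal densities."""
            rng = np.random.default_rng(seed)
            d = len(mu_is)
            x = rng.normal(mu_is, sigma_is, size=(n, d))
            log_w = (-0.5 * np.sum(x ** 2, axis=1)) - \
                    (-0.5 * np.sum(((x - mu_is) / sigma_is) ** 2, axis=1)
                     - np.sum(np.log(sigma_is)))
            w = np.exp(log_w)
            fail = limit_state(x) < 0.0
            return np.mean(w * fail)

        # Toy limit state g(x) = 3 - x1; exact pf = 1 - Phi(3) ~ 1.35e-3.
        pf = importance_sampling_pf(lambda x: 3.0 - x[:, 0],
                                    mu_is=np.array([3.0]),
                                    sigma_is=np.array([1.0]))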

  3. Mechanomyographic Parameter Extraction Methods: An Appraisal for Clinical Applications

    Directory of Open Access Journals (Sweden)

    Morufu Olusola Ibitoye

    2014-12-01

    Full Text Available The research conducted in the last three decades has collectively demonstrated that skeletal muscle performance can be alternatively assessed by mechanomyographic (MMG) signal parameters. Indices of muscle performance, not limited to force, power, work and endurance, and the related physiological processes underlying muscle activities during contraction have been evaluated in the light of the signal features. Since MMG is a non-stationary signal that reflects several distinctive patterns of muscle actions, the illustrations obtained from the literature support its reliability in the analysis of muscles under voluntary and stimulus-evoked contractions. An appraisal of standard practice, including the measurement theories of the methods used to extract parameters of the signal, is vital to the application of the signal in experimental and clinical practice, especially in areas where electromyograms are contraindicated or have limited application. As we highlight the underpinning technical guidelines and the domains where each method is well suited, the limitations of the methods are also presented to position the state of the art in MMG parameter extraction, thus providing a theoretical framework for improving current practices and widening the opportunity for new insights and discoveries. Since the signal modality has not been widely deployed, due partly to the limited information extractable from the signals when compared with other classical techniques used to assess muscle performance, this survey is particularly relevant to the projected future of MMG applications in the realm of musculoskeletal assessments and real-time detection of muscle activity.

  4. Novel Method for 5G Systems NLOS Channels Parameter Estimation

    Directory of Open Access Journals (Sweden)

    Vladeta Milenkovic

    2017-01-01

    Full Text Available For the development of new 5G systems to operate in mm-wave bands, there is a need for accurate radio propagation modelling in these bands. In this paper a novel approach to NLOS channel parameter estimation is presented. Estimation is performed based on the level crossing rate (LCR) performance measure, which enables propagation parameters to be estimated in real time and avoids the weaknesses of ML and moment-method estimation approaches.
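
    For reference, the empirical level crossing rate of a sampled fading envelope, the LCR measure on which such estimation is based, can be computed as below (toy uncorrelated envelope, not a Doppler-shaped fading process, and not the paper's estimator itself):

        import numpy as np

        def level_crossing_rate(envelope, threshold, fs):
            """Empirical LCR: number of upward crossings of `threshold`
            per second in a sampled fading envelope."""
            env = np.asarray(envelope, float)
            up = (env[:-1] < threshold) & (env[1:] >= threshold)
            return np.count_nonzero(up) * fs / env.size

        # Toy Rayleigh envelope from i.i.d. Gaussian I/Q, sampled at 1 kHz.
        rng = np.random.default_rng(3)
        i = rng.standard_normal(10000)
        q = rng.standard_normal(10000)
        lcr = level_crossing_rate(np.hypot(i, q), threshold=1.0, fs=1000.0)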

  5. A new method to estimate heat source parameters in gas metal arc welding simulation process

    International Nuclear Information System (INIS)

    Jia, Xiaolei; Xu, Jie; Liu, Zhaoheng; Huang, Shaojie; Fan, Yu; Sun, Zhi

    2014-01-01

    Highlights: •A new method for accurate simulation of heat source parameters was presented. •The partial least-squares regression analysis was recommended in the method. •The welding experiment results verified accuracy of the proposed method. -- Abstract: Heat source parameters were usually recommended by experience in the welding simulation process, which induced error in simulation results (e.g. temperature distribution and residual stress). In this paper, a new method was developed to accurately estimate heat source parameters in welding simulation. In order to reduce the simulation complexity, a sensitivity analysis of heat source parameters was carried out. The relationships between heat source parameters and welding pool characteristics (fusion width (W), penetration depth (D) and peak temperature (Tp)) were obtained with both the multiple regression analysis (MRA) and the partial least-squares regression analysis (PLSRA). Different regression models were employed in each regression method. Comparisons of both methods were performed. A welding experiment was carried out to verify the method. The results showed that both the MRA and the PLSRA were feasible and accurate for prediction of heat source parameters in welding simulation. However, the PLSRA was recommended for its advantages of requiring less simulation data

  6. Development and validation of a method for the determination of regulated fragrance allergens by High-Performance Liquid Chromatography and Parallel Factor Analysis 2.

    Science.gov (United States)

    Pérez-Outeiral, Jessica; Elcoroaristizabal, Saioa; Amigo, Jose Manuel; Vidal, Maider

    2017-12-01

    This work presents the development and validation of a multivariate method for quantitation of 6 potentially allergenic substances (PAS) related to fragrances by ultrasound-assisted emulsification microextraction coupled with HPLC-DAD and PARAFAC2, in the presence of 18 other PAS. The objective is the extension of a previously proposed univariate method to be able to determine the 24 PAS currently considered as allergens. The suitability of the multivariate approach for the qualitative and quantitative analysis of the analytes is discussed through datasets of increasing complexity, comprising the assessment and validation of the method performance. PARAFAC2 was shown to adequately model the data, facing up to different instrumental and chemical issues, such as co-eluting profiles, overlapping spectra, unknown interfering compounds, retention time shifts and baseline drifts. Satisfactory quality parameters of the model performance were obtained (R² ≥ 0.94), as well as meaningful chromatographic and spectral profiles (r ≥ 0.97). Moreover, low errors of prediction in external validation standards (below 15% in most cases) as well as acceptable quantification errors in real spiked samples (recoveries from 82 to 119%) confirmed the suitability of PARAFAC2 for resolution and quantification of the PAS. The combination of the previously proposed univariate approach, for the well-resolved peaks, with the developed multivariate method allows the determination of the 24 regulated PAS. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. A Parameter Selection Method for Wind Turbine Health Management through SCADA Data

    Directory of Open Access Journals (Sweden)

    Mian Du

    2017-02-01

    Full Text Available Wind turbine anomaly or failure detection using machine learning techniques through the supervisory control and data acquisition (SCADA) system is drawing wide attention from academia and industry. While parameter selection is important for modelling a wind turbine's condition, only a few papers have been published focusing on this issue, and in those papers the interconnections among sub-components in a wind turbine are used to address the problem. However, relying merely on the interconnections for decision making is sometimes too general to provide a parameter list that accounts for the differences between SCADA datasets. In this paper, a method is proposed to provide more detailed suggestions on parameter selection based on mutual information. First, the copula is proven to be capable of simplifying the estimation of mutual information. Then an empirical copula-based mutual information estimation method (ECMI) is introduced for application. After that, a real SCADA dataset is adopted to test the method, and the results show the effectiveness of the ECMI in providing parameter selection suggestions when physical knowledge is not accurate enough.

  8. Parameter and state estimation in nonlinear dynamical systems

    Science.gov (United States)

    Creveling, Daniel R.

    This thesis is concerned with the problem of state and parameter estimation in nonlinear systems. The need to evaluate unknown parameters in models of nonlinear physical, biophysical and engineering systems occurs throughout the development of phenomenological or reduced models of dynamics. When verifying and validating these models, it is important to incorporate information from observations in an efficient manner. Using the idea of synchronization of nonlinear dynamical systems, this thesis develops a framework for presenting data to a candidate model of a physical process in a way that makes efficient use of the measured data while allowing for estimation of the unknown parameters in the model. The approach presented here builds on existing work that uses synchronization as a tool for parameter estimation. Some critical issues of stability in that work are addressed and a practical framework is developed for overcoming these difficulties. The central issue is the choice of coupling strength between the model and data. If the coupling is too strong, the model will reproduce the measured data regardless of the adequacy of the model or correctness of the parameters. If the coupling is too weak, nonlinearities in the dynamics could lead to complex dynamics rendering any cost function comparing the model to the data inadequate for the determination of model parameters. Two methods are introduced which seek to balance the need for coupling with the desire to allow the model to evolve in its natural manner without coupling. One method, 'balanced' synchronization, adds to the synchronization cost function a requirement that the conditional Lyapunov exponents of the model system, conditioned on being driven by the data, remain negative but small in magnitude. Another method allows the coupling between the data and the model to vary in time according to a specific form of differential equation. The coupling dynamics is damped to allow for a tendency toward zero coupling

  9. Working towards accreditation by the International Standards Organization 15189 Standard: how to validate an in-house developed method, with an example of lead determination in whole blood by electrothermal atomic absorption spectrometry.

    Science.gov (United States)

    Garcia Hejl, Carine; Ramirez, Jose Manuel; Vest, Philippe; Chianea, Denis; Renard, Christophe

    2014-09-01

    Laboratories working towards accreditation by the International Standards Organization (ISO) 15189 standard are required to demonstrate the validity of their analytical methods. The different guidelines set by various accreditation organizations make it difficult to provide objective evidence that an in-house method is fit for the intended purpose. Moreover, the required performance characteristics tests and acceptance criteria are not always detailed. The laboratory must choose the most suitable validation protocol and set the acceptance criteria. Therefore, we propose a validation protocol to evaluate the performance of an in-house method. As an example, we validated the process for the detection and quantification of lead in whole blood by electrothermal atomic absorption spectrometry. The fundamental parameters tested were selectivity, calibration model, precision, accuracy (and uncertainty of measurement), contamination, stability of the sample, reference interval, and analytical interference. We have developed a protocol that has been applied successfully to quantify lead in whole blood by electrothermal atomic absorption spectrometry (ETAAS). In particular, our method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics.

  10. Validation of an HPLC method for determination of chemical purity of [¹⁸F]fluoromisonidazole ([¹⁸F]FMISO)

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Natalia C.E.S.; Oliveira, Mércia L.; Lima, Fernando R.A., E-mail: nataliafleming@hotmail.com, E-mail: mercial@cnen.gov.br, E-mail: falima@cnen.gov.br [Centro Regional de Ciências Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Silveira, Marina B.; Ferreira, Soraya Z.; Silva, Juliana B., E-mail: mbs@cdtn.br, E-mail: zandims@cdtn.br, E-mail: silvajb@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    [¹⁸F]Fluoromisonidazole ([¹⁸F]FMISO) is a nitroimidazole derivative labelled with fluorine-18 that selectively binds to hypoxic cells. It has been shown to be a suitable PET tracer for imaging hypoxia in tumors as well as in noncancerous tissues. [¹⁸F]FMISO was prepared using a TRACERlab MX FDG® module (GE) with cassettes, software sequence and reagent kits from ABX. In this work, we aimed to develop and validate a new high performance liquid chromatography (HPLC) method for determination of the chemical purity of [¹⁸F]FMISO. Analyses were performed with an Agilent chromatograph equipped with radioactivity and UV detectors. [¹⁸F]FMISO and impurities were separated on a C18 column by gradient elution with water and acetonitrile. Selectivity, linearity, detection limit (DL), quantification limit (QL), precision, accuracy and robustness were assessed to demonstrate that the HPLC method is adequate for its intended purpose. The HPLC method showed good precision, as all RSD values were lower than 5%. Robustness was evaluated considering variations in parameters such as mobile phase gradient and flow rate. The results evidenced that the HPLC method is valid and suitable for chemical purity evaluation of [¹⁸F]FMISO, considering the operational conditions of our laboratory. As an extension of this work, other analytical methods used for [¹⁸F]FMISO quality control should be evaluated, in compliance with good manufacturing practice. (author)

  11. Development and validation of ultra-performance liquid chromatographic method with tandem mass spectrometry for determination of lenalidomide in rabbit and human plasma

    Directory of Open Access Journals (Sweden)

    Iqbal Muzaffar

    2013-01-01

    Full Text Available Abstract Background Lenalidomide (LND) is a potent novel thalidomide analog which has demonstrated remarkable clinical activity in the treatment of multiple myeloma via a multiple-pathway mechanism. A validated, sensitive, high-throughput method is required for the determination of lenalidomide in pharmacokinetic and toxicokinetic studies. Ultra performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) is a preeminent analytical tool for rapid biomedical analysis. Results A simple, highly sensitive UPLC-MS/MS method was developed and validated for the determination of LND in rabbit and human plasma. After a simple protein precipitation using methanol, LND and carbamazepine (IS) were separated on an Acquity UPLC BEH™ C18 column (50 × 2.1 mm i.d., 1.7 μm; Waters, USA) using a mobile phase consisting of acetonitrile:water:formic acid (65:35:0.1%, v/v/v) pumped at a flow rate of 0.2 mL/min. LND and IS were eluted at 0.71 and 1.92 min, respectively. The mass spectrometric determination was carried out using an electrospray interface operated in the positive mode with multiple reaction monitoring (MRM). The precursor to product ion transitions of m/z 260.1 > 149.0 and m/z 237.0 > 179.0 were used to quantify LND and IS, respectively. The method was linear in the concentration range of 0.23–1000 ng/mL with a limit of quantitation of 0.23 ng/mL. All the validation parameters were within the ranges acceptable by the guidelines on analytical method validation. Conclusion The proposed UPLC-MS/MS method is simple, rapid and highly sensitive, and hence could be reliable for pharmacokinetic and toxicokinetic studies in both animals and humans.

  12. Prediction of broadband ground-motion time histories: Hybrid low/high-frequency method with correlated random source parameters

    Science.gov (United States)

    Liu, P.; Archuleta, R.J.; Hartzell, S.H.

    2006-01-01

    We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low-frequency synthetics (below 1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and a generic velocity structure appropriate for the site's National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set. The bias and error found here for response spectral acceleration are similar to the best results that have been published by
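    The crossover combination step can be illustrated with complementary zero-phase filters; a minimal sketch, assuming fourth-order Butterworth responses and stand-in arrays for the 3D low-frequency and 1D high-frequency synthetics:

```python
import numpy as np
from scipy.signal import butter, filtfilt

dt = 0.01                                  # 100 samples/s (assumed)
t = np.arange(0.0, 40.0, dt)
lf_3d = np.sin(2 * np.pi * 0.4 * t)        # stand-in for the 3D finite-difference synthetic
hf_1d = 0.3 * np.random.default_rng(3).normal(size=t.size)  # stand-in for the 1D f-k synthetic

fc, fs = 1.0, 1.0 / dt                     # 1 Hz crossover, sampling rate
b_lo, a_lo = butter(4, fc, btype="low", fs=fs)
b_hi, a_hi = butter(4, fc, btype="high", fs=fs)

# Zero-phase filtering keeps the two bands time-aligned before summing
broadband = filtfilt(b_lo, a_lo, lf_3d) + filtfilt(b_hi, a_hi, hf_1d)
print(broadband.shape)
```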

  13. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    Science.gov (United States)

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
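    A minimal sketch of the exact-binomial recalculation described above: for a panel of N experts, the critical number of "essential" ratings is the smallest count that random voting (p = 0.5) would reach with one-tailed probability at most alpha, converted to a CVR via CVR = (n_e - N/2)/(N/2). Panel sizes below are illustrative.

```python
from scipy.stats import binom

def cvr_critical(n_panelists, alpha=0.05):
    # Smallest n_e with one-tailed P(X >= n_e | p = 0.5) <= alpha, returned as a CVR
    for n_e in range(n_panelists + 1):
        if binom.sf(n_e - 1, n_panelists, 0.5) <= alpha:
            return (n_e - n_panelists / 2) / (n_panelists / 2)
    return 1.0

for n in (5, 8, 10, 15, 20, 40):
    print(n, round(cvr_critical(n), 3))
```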

  14. Measurement of drill grinding parameters using laser sensor

    Science.gov (United States)

    Yanping, Peng; Kumehara, Hiroyuki; Wei, Zhang; Nomura, Takashi

    2005-12-01

    Accurate measurement of the grinding and geometry parameters of a drill point is essential to its design and reconditioning. In recent years, a number of non-contact coordinate measuring apparatuses, using CCD cameras or laser sensors, have been developed, but much work remains to be done for further improvement. This paper reports another kind of laser coordinate meter. As an example of its application, the method for geometry inspection of the drill flank surface is detailed. Measured data from laser scanning of the flank surface along several 2-dimensional curves around selected points are analyzed with a mathematical procedure. If one of these curves turns out to be a straight line, it must be a generatrix of the grinding cone. Thus, the grinding parameters are determined by a set of three generatrices. Then, the measurement method and data processing procedure are proposed. Its validity is assessed by measuring a sample with given parameters. The point geometry measured agrees well with the known values. In comparison with other methods in the published literature, it is simpler in computation and more accurate in results.

  15. Element diameter free stability parameters for stabilized methods applied to fluids

    International Nuclear Information System (INIS)

    Franca, L.P.; Madureira, A.L.

    1992-08-01

    Stability parameters for stabilized methods in fluids are suggested. The computation of the largest eigenvalue of a generalized eigenvalue problem replaces controversial definitions of element diameters and inverse estimate constants, used heretofore to compute these stability parameters. The design is employed in the advective-diffusive model, incompressible Navier-Stokes equations and the Stokes problem. (author)

  16. Preliminary validation of assays to measure parameters of calcium metabolism in captive Asian and African elephants in western Europe.

    Science.gov (United States)

    van Sonsbeek, Gerda R; van der Kolk, Johannes H; van Leeuwen, Johannes P T M; Schaftenaar, Willem

    2011-05-01

    Hypocalcemia is a well known cause of dystocia in animals, including elephants in captivity. In order to study calcium metabolism in elephants, it is of utmost importance to use properly validated assays, as these might be prone to specific matrix effects in elephant blood. The aim of the current study was to conduct preliminary work for validation of assays for various parameters involved in calcium metabolism in both blood and urine of captive elephants. Basal values of these parameters were compared between Asian elephants (Elephas maximus) and African elephants (Loxodonta africana). On preliminary testing, assays for total calcium, inorganic phosphorus, and creatinine appeared valid for use in plasma, and for creatinine in urine, in both species. Furthermore, measurements of bone alkaline phosphatase and N-terminal telopeptide of type I collagen appeared valid for use in Asian elephants. Mean heparinized plasma ionized calcium concentration and pH were not significantly affected by 3 cycles of freezing and thawing. Storage at 4 °C, room temperature, and 37 °C for 6, 12, and 24 hr did not alter the heparinized plasma ionized calcium concentration in Asian elephants. The following linear regression equation using pH (range: 6.858-7.887) and ionized calcium concentration in heparinized plasma was utilized: iCa(7.4) (mmol/l) = -2.1075 + 0.3130·pH(actual) + 0.8296·iCa(actual) (mmol/l). Mean basal pH values in Asian elephant whole blood and plasma were 7.40 ± 0.048 and 7.49 ± 0.077, respectively. The urinary specific gravity and creatinine concentrations in both Asian and African elephants were significantly correlated, and both were significantly lower in Asian elephants. © 2011 The Author(s)
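    Expressed as code, the reported standardization reduces to a direct transcription of the regression above; the sample values in the call are hypothetical.

```python
def ica_at_ph74(ph_actual: float, ica_actual: float) -> float:
    """Ionized calcium (mmol/L) standardized to pH 7.4 (valid for pH 6.858-7.887)."""
    return -2.1075 + 0.3130 * ph_actual + 0.8296 * ica_actual

print(ica_at_ph74(7.20, 1.30))   # hypothetical elephant plasma sample
```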

  17. Parameter estimation of Monod model by the Least-Squares method for microalgae Botryococcus Braunii sp

    Science.gov (United States)

    See, J. J.; Jamaian, S. S.; Salleh, R. M.; Nor, M. E.; Aman, F.

    2018-04-01

    This research aims to estimate the parameters of the Monod model of microalgae Botryococcus Braunii sp growth by the Least-Squares method. The Monod equation is a non-linear equation which can be transformed into a linear form and solved by the linear Least-Squares regression method. Meanwhile, the Gauss-Newton method is an alternative method to solve the non-linear Least-Squares problem, with the aim of obtaining the parameter values of the Monod model by minimizing the sum of squared errors (SSE). As a result, the parameters of the Monod model for microalgae Botryococcus Braunii sp can be estimated by the Least-Squares method. However, the parameter values obtained by the non-linear Least-Squares method are more accurate than those from the linear Least-Squares method, since the SSE of the non-linear Least-Squares method is less than that of the linear Least-Squares method.
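    Both estimation routes the study compares can be sketched in a few lines, here with synthetic growth data standing in for the Botryococcus braunii measurements; scipy's curve_fit stands in for the Gauss-Newton step (it uses the closely related Levenberg-Marquardt scheme).

```python
import numpy as np
from scipy.optimize import curve_fit

def monod(S, mu_max, Ks):
    # Specific growth rate as a function of substrate concentration S
    return mu_max * S / (Ks + S)

rng = np.random.default_rng(4)
S = np.linspace(0.2, 10.0, 15)                      # substrate levels (illustrative)
mu = monod(S, 1.2, 2.5) + rng.normal(0, 0.02, S.size)

# Route 1: linearized least squares on 1/mu = (Ks/mu_max)(1/S) + 1/mu_max
slope, intercept = np.polyfit(1 / S, 1 / mu, 1)
mu_max_lin, Ks_lin = 1 / intercept, slope / intercept

# Route 2: non-linear least squares, minimizing the SSE directly
(mu_max_nl, Ks_nl), _ = curve_fit(monod, S, mu, p0=[1.0, 1.0])

print(mu_max_lin, Ks_lin)   # linearization distorts the error structure
print(mu_max_nl, Ks_nl)     # typically closer to the true (1.2, 2.5)
```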

  18. Estimating physiological skin parameters from hyperspectral signatures

    Science.gov (United States)

    Vyas, Saurabh; Banerjee, Amit; Burlina, Philippe

    2013-05-01

    We describe an approach for estimating human skin parameters, such as melanosome concentration, collagen concentration, oxygen saturation, and blood volume, using hyperspectral radiometric measurements (signatures) obtained from in vivo skin. We use a computational model based on Kubelka-Munk theory and the Fresnel equations. This model forward maps the skin parameters to a corresponding multiband reflectance spectra. Machine-learning-based regression is used to generate the inverse map, and hence estimate skin parameters from hyperspectral signatures. We test our methods using synthetic and in vivo skin signatures obtained in the visible through the short wave infrared domains from 24 patients of both genders and Caucasian, Asian, and African American ethnicities. Performance validation shows promising results: good agreement with the ground truth and well-established physiological precepts. These methods have potential use in the characterization of skin abnormalities and in minimally-invasive prescreening of malignant skin cancers.

  19. Intracranial aneurysm segmentation in 3D CT angiography: Method and quantitative validation with and without prior noise filtering

    International Nuclear Information System (INIS)

    Firouzian, Azadeh; Manniesing, Rashindra; Flach, Zwenneke H.; Risselada, Roelof; Kooten, Fop van; Sturkenboom, Miriam C.J.M.; Lugt, Aad van der; Niessen, Wiro J.

    2011-01-01

    Intracranial aneurysm volume and shape are important factors for predicting rupture risk, for pre-surgical planning and for follow-up studies. To obtain these parameters, manual segmentation can be employed; however, this is a tedious procedure, which is prone to inter- and intra-observer variability. Therefore there is a need for an automated method, which is accurate, reproducible and reliable. This study aims to develop and validate an automated method for segmenting intracranial aneurysms in Computed Tomography Angiography (CTA) data. Also, it is investigated whether prior smoothing improves segmentation robustness and accuracy. The proposed segmentation method is implemented in the level set framework, more specifically Geodesic Active Surfaces, in which a surface is evolved to capture the aneurysmal wall via an energy minimization approach. The energy term is composed of three different image features, namely; intensity, gradient magnitude and intensity variance. The method requires minimal user interaction, i.e. a single seed point inside the aneurysm needs to be placed, based on which image intensity statistics of the aneurysm are derived and used in defining the energy term. The method has been evaluated on 15 aneurysms in 11 CTA data sets by comparing the results to manual segmentations performed by two expert radiologists. Evaluation measures were Similarity Index, Average Surface Distance and Volume Difference. The results show that the automated aneurysm segmentation method is reproducible, and performs in the range of inter-observer variability in terms of accuracy. Smoothing by nonlinear diffusion with appropriate parameter settings prior to segmentation, slightly improves segmentation accuracy.

  20. Development and validation of RP-HPLC method for determination of famotidine and its application in quality control of different pharmaceutical dosage forms

    International Nuclear Information System (INIS)

    Hassan, S.S.; Ayub, M.; Ishtiaq, S.; Ahmad; I; Khalid, N.

    2013-01-01

    A precise and fast novel high-performance liquid chromatography method was developed and validated for the quantitative determination of Famotidine (FMT) in commercially available pharmaceutical dosage forms. An Agilent 1200 Series High Performance Liquid Chromatography (HPLC) system with a C18 column (5 μm particle size, 150 × 4.6 mm) was used in this study, and detection (diode array detector) was made at 280 nm. The mobile phase was acetonitrile, distilled water, triethylamine and phosphoric acid (49.9:49.9:0.1:0.1, v/v), with isocratic elution at ambient temperature, a flow rate of 1.5 mL min⁻¹ and an injection volume of 5 μL. In this method, the retention times for FMT pure, tablets and suspension were 0.787 min, 0.789 min and 0.839 min, respectively. The new method was validated against the standard validation parameters. The procedure provided a linear response over the concentration range of 0.1-1.0 mg mL⁻¹ (r² = 0.998) and the regression equation was y = 3902.6x + 18.651. The mean % recovery for inter-day (96.56%) and intra-day (97.36%) assays was within 96-98%, assuring good precision and accuracy. The method was found to be very rapid, with an overall assay time of less than 2 minutes, and the results obtained were accurate, precise and selective enough to allow the determination of FMT in the presence of certain excipients. (author)

  1. Analysis of random response of structure with uncertain parameters. Combination of substructure synthesis method and hierarchy method

    International Nuclear Information System (INIS)

    Iwatsubo, Takuzo; Kawamura, Shozo; Mori, Hiroyuki.

    1995-01-01

    In this paper, a method to obtain the random response of a structure with uncertain parameters is proposed. The proposed method is a combination of the substructure synthesis method and the hierarchy method. The concept of the proposed method is that the hierarchy equation of each substructure is obtained using the hierarchy method, and the hierarchy equation of the overall structure is obtained using the substructure synthesis method. Using the proposed method, the reduced order hierarchy equation can be obtained without analyzing the original whole structure. After the calculation of the mean square value of the response, the reliability analysis can be carried out based on the first passage problem and Poisson's excursion rate. As a numerical example, a simple piping system is considered, with the damping constant of the support treated as the uncertain parameter. The random response is then calculated using the proposed method. As a result, the proposed method proves useful for analyzing the random response in terms of accuracy, computer storage and calculation time. (author)

  2. Worst-case study for cleaning validation of equipment in the radiopharmaceutical production of lyophilized reagents: Methodology validation of total organic carbon

    International Nuclear Information System (INIS)

    Porto, Luciana Valeria Ferrari Machado

    2015-01-01

    Radiopharmaceuticals are defined as pharmaceutical preparations containing a radionuclide in their composition, mostly intravenously administered, and therefore compliance with the principles of Good Manufacturing Practices (GMP) is essential and indispensable. Cleaning validation is a requirement of current GMP, and consists of documented evidence which demonstrates that the cleaning procedures are able to remove residues to pre-determined acceptance levels, ensuring that no cross contamination occurs. A simplification of the validation of cleaning processes is accepted, and consists in choosing a product, called the 'worst case', to represent the cleaning processes of all equipment of the same production area. One of the steps of cleaning validation is the establishment and validation of the analytical method to quantify the residue. The aim of this study was to establish the worst case for cleaning validation of equipment in the radiopharmaceutical production of lyophilized reagents (LR) for labeling with 99mTc, evaluate the use of Total Organic Carbon (TOC) content as an indicator of the cleanliness of equipment used in LR manufacture, validate the method of Non-Purgeable Organic Carbon (NPOC), and perform recovery tests with the product chosen as the worst case. The choice of the worst case product was based on the calculation of an index called the 'Worst Case Index' (WCI), using information about drug solubility, difficulty of cleaning the equipment and occupancy rate of the products in the production line. The product indicated as 'worst case' was the LR MIBI-TEC. The method validation assays were performed using a carbon analyser model TOC-Vwp coupled to an autosampler model ASI-V, both from Shimadzu®, controlled by TOC Control-V software. The direct method was used for NPOC quantification. The parameters evaluated in the method validation were: system suitability, robustness, linearity, detection limit (DL) and quantification limit (QL), precision

  3. Role of calibration, validation, and relevance in multi-level uncertainty integration

    International Nuclear Information System (INIS)

    Li, Chenzhao; Mahadevan, Sankaran

    2016-01-01

    Calibration of model parameters is an essential step in predicting the response of a complicated system, but the lack of data at the system level makes it impossible to conduct this quantification directly. In such a situation, system model parameters are estimated using tests at lower levels of complexity which share the same model parameters with the system. For such a multi-level problem, this paper proposes a methodology to quantify the uncertainty in the system level prediction by integrating calibration, validation and sensitivity analysis at different levels. The proposed approach considers the validity of the models used for parameter estimation at lower levels, as well as the relevance at the lower level to the prediction at the system level. The model validity is evaluated using a model reliability metric, and models with multivariate output are considered. The relevance is quantified by comparing Sobol indices at the lower level and system level, thus measuring the extent to which a lower level test represents the characteristics of the system so that the calibration results can be reliably used in the system level. Finally the results of calibration, validation and relevance analysis are integrated in a roll-up method to predict the system output. - Highlights: • Relevance analysis to quantify the closeness of two models. • Stochastic model reliability metric to integrate multiple validation experiments. • Extend the model reliability metric to deal with multivariate output. • Roll-up formula to integrate calibration, validation, and relevance.

  4. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    Science.gov (United States)

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
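    The disagreement the study highlights can be reproduced with the two most common calculation routes, LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is taken either from the calibration residuals or from blank replicates; a minimal sketch with synthetic data:

```python
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])          # standard concentrations (illustrative)
signal = np.array([5.2, 10.1, 19.8, 40.5, 79.9])    # luminescence readings (illustrative)
blanks = np.array([0.21, 0.18, 0.25, 0.22, 0.19, 0.24])

slope, intercept = np.polyfit(conc, signal, 1)
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

# Route 1: sigma from the residual standard deviation of the calibration line
lod_cal, loq_cal = 3.3 * residual_sd / slope, 10 * residual_sd / slope
# Route 2: sigma from the standard deviation of blank replicates
sd_blank = np.std(blanks, ddof=1)
lod_blank, loq_blank = 3.3 * sd_blank / slope, 10 * sd_blank / slope

print(lod_cal, loq_cal)      # the two routes can disagree markedly,
print(lod_blank, loq_blank)  # which is the point the comparison makes
```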

  5. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  6. Standard Test Method for Determining the Linearity of a Photovoltaic Device Parameter with Respect To a Test Parameter

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2005-01-01

    1.1 This test method determines the degree of linearity of a photovoltaic device parameter with respect to a test parameter, for example, short-circuit current with respect to irradiance. 1.2 The linearity determined by this test method applies only at the time of testing, and implies no past or future performance level. 1.3 This test method applies only to non-concentrator terrestrial photovoltaic devices. 1.4 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  7. Determination of resonance parameters in QCD by functional analysis methods

    International Nuclear Information System (INIS)

    Ciulli, S.; Geniet, F.; Papadopoulos, N.A.; Schilcher, K.

    1988-01-01

    A mathematically rigorous method based on functional analysis is used to determine resonance parameters of an amplitude from its given asymptotic expression in the space-like region. This method is checked on a model amplitude where both the asymptotic expression and the exact function are known. This method is then applied to the determination of the mass and the width of the ρ-meson from the corresponding space-like asymptotic QCD expression. (orig.)

  8. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

    We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO 2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9)

  9. Design of Experiment as a powerful tool when applying Finite Element Method: a case study on prediction of hot rolling process parameters

    Directory of Open Access Journals (Sweden)

    Giancarlo G. Bordonaro

    2018-04-01

    Full Text Available The ultimate goal in hot roll pass design is to manufacture a rolled product with the required dimensional accuracy, defect-free surface, and mechanical properties. The proper selection of process parameters is crucial to meet increasing requirements for the desired quality and geometrical properties of rolled products. Due to the complex behavior of the metal flow at high temperatures and the severe plastic deformations in shape rolling, most efforts that have been made so far rely only upon the practical experience gained by operators. The large number of variables involved and the difficulty in investigating the process characteristics make the use of finite element (FE) tools an effective and attractive opportunity towards a thorough understanding of the rolling process. In this work, Design of Experiment (DOE) is proposed as a powerful and viable method for the prediction of rolling process parameters while reducing the computational effort. Nonlinear 3D FE models of the hot rolling process are developed for a large set of complex cross-section shapes and validated against experimental evidence provided by real plant products at each stage of the deformation sequence. Based on the accuracy of the validated FE models, DOE is applied to investigate the flat rolling process under a series of parameters and scenarios. The effects of the main roll forming variables on material flow behavior and the geometrical features of the rolled product are analyzed. The selected DOE factors are the workpiece temperature, diameter size, diameter reduction (draught), and roll angular velocity. The selected DOE responses are workpiece spread, effective stresses, contact stresses, and roll reaction loads. Eventually, the application of Pareto optimality (a multi-criteria decision-making method) allows the detection of an optimal combination of design factors which respects the desired target requirements for the responses.
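    A minimal sketch of the DOE layer, assuming a two-level full factorial over the four factors named in the study; run_fe_model is a hypothetical placeholder for a call to the validated FE rolling model, and all levels are illustrative.

```python
from itertools import product

factors = {
    "temperature_C":  (950.0, 1150.0),
    "diameter_mm":    (60.0, 90.0),
    "draught_mm":     (5.0, 15.0),
    "roll_speed_rpm": (40.0, 80.0),
}

def run_fe_model(temperature_C, diameter_mm, draught_mm, roll_speed_rpm):
    # Placeholder response (workpiece spread) so the sketch runs end to end;
    # in practice this would launch the validated nonlinear 3D FE simulation.
    return 0.02 * draught_mm + 0.001 * (temperature_C - 950.0) + 0.005 * diameter_mm

design = list(product(*factors.values()))        # 2^4 = 16 runs
for levels in design:
    spread = run_fe_model(**dict(zip(factors, levels)))
    print(levels, round(spread, 3))
```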

  10. A new hybrid bee pollinator flower pollination algorithm for solar PV parameter estimation

    International Nuclear Information System (INIS)

    Ram, J. Prasanth; Babu, T. Sudhakar; Dragicevic, Tomislav; Rajasekar, N.

    2017-01-01

    Highlights: • A new Bee Pollinator Flower Pollination Algorithm (BPFPA) is proposed for Solar PV Parameter extraction. • Standard RTC France data is used for the experimentation of BPFPA algorithm. • Four different PV modules are successfully tested via double diode model. • The BPFPA method is highly convincing in accuracy to convergence at faster rate. • The proposed BPFPA provides the best performance among the other recent techniques. - Abstract: The inaccurate I-V curve generation in solar PV modeling introduces less efficiency and on the other hand, accurate simulation of PV characteristics becomes a mandatory obligation before experimental validation. Although many optimization methods in literature have attempted to extract accurate PV parameters, all of these methods do not guarantee their convergence to the global optimum. Hence, the authors of this paper have proposed a new hybrid Bee pollinator Flower Pollination Algorithm (BPFPA) for the PV parameter extraction problem. The PV parameters for both single diode and double diode are extracted and tested under different environmental conditions. For brevity, the I_0_1, I_0_2, I_p_v for double diode and I_0_,I_p_v for single diode models are calculated analytically where the remaining parameters ‘R_s, R_p, a_1, a_2’ are optimized using BPFPA method. It is found that, the proposed Bee Pollinator method has all the scope to create exploration and exploitation in the control variable to yield a less RMSE value even under lower irradiated conditions. Further for performance validation, the parameters arrived via BPFPA method is compared with Genetic Algorithm (GA), Pattern Search (PS), Harmony Search (HS), Flower Pollination Algorithm (FPA) and Artificial Bee Swarm Optimization (ABSO). In addition, various outcomes of PV modeling and different parameters influencing the accurate PV modeling are critically analyzed.

  11. Experimental validation of calculation methods for structures having shock non-linearity

    International Nuclear Information System (INIS)

    Brochard, D.; Buland, P.

    1987-01-01

    For the seismic analysis of non-linear structures, numerical methods have been developed which need to be validated against experimental results. The aim of this paper is to present the design method of a test program whose results will be used for this purpose. Some applications to nuclear components illustrate this presentation [fr

  12. Parameter estimation methods for gene circuit modeling from time-series mRNA data: a comparative study.

    Science.gov (United States)

    Fan, Ming; Kuwahara, Hiroyuki; Wang, Xiaolei; Wang, Suojin; Gao, Xin

    2015-11-01

    Parameter estimation is a challenging computational problem in the reverse engineering of biological systems. Because advances in biotechnology have facilitated wide availability of time-series gene expression data, systematic parameter estimation of gene circuit models from such time-series mRNA data has become an important method for quantitatively dissecting the regulation of gene expression. By focusing on the modeling of gene circuits, we examine here the performance of three types of state-of-the-art parameter estimation methods: population-based methods, online methods and model-decomposition-based methods. Our results show that certain population-based methods are able to generate high-quality parameter solutions. The performance of these methods, however, is heavily dependent on the size of the parameter search space, and their computational requirements substantially increase as the size of the search space increases. In comparison, online methods and model decomposition-based methods are computationally faster alternatives and are less dependent on the size of the search space. Among other things, our results show that a hybrid approach that augments computationally fast methods with local search as a subsequent refinement procedure can substantially increase the quality of their parameter estimates to a level on par with the best solution obtained from the population-based methods while maintaining high computational speed. These results suggest that such hybrid methods can be a promising alternative to the more commonly used population-based methods for parameter estimation of gene circuit models when limited prior knowledge about the underlying regulatory mechanisms makes the parameter search space vast. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  13. Optimization of control parameters of a hot/cold controller by means of Simplex-type methods

    Science.gov (United States)

    Porte, C.; Caron-Poussin, M.; Carot, S.; Couriol, C.; Moreno, M. Martin; Delacroix, A.

    1997-01-01

    This paper describes a hot/cold controller for regulating crystallization operations. The system was identified with a common method (the Broida method) and the parameters were obtained by the Ziegler-Nichols method. The paper shows that this empirical method will only allow a qualitative approach to regulation and that, in some instances, the parameters obtained are unreliable and therefore cannot be used to cancel variations between the set point and the actual values. Optimization methods were used to determine the regulation parameters and solve this identification problem. It was found that the weighted centroid method was the best one. PMID:18924791
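    The empirical starting point the paper questions can be stated in a few lines: the classical Ziegler-Nichols open-loop PID settings computed from a first-order-plus-dead-time model (gain K, time constant T, dead time L) of the kind Broida identification yields. The numeric values are illustrative; the paper's point is that such settings are only a qualitative first guess, which the Simplex-type optimization then refines against an explicit cost.

```python
def zn_open_loop_pid(K, T, L):
    # Classical Ziegler-Nichols reaction-curve tuning rules
    Kp = 1.2 * T / (K * L)   # proportional gain
    Ti = 2.0 * L             # integral time
    Td = 0.5 * L             # derivative time
    return Kp, Ti, Td

print(zn_open_loop_pid(K=2.0, T=120.0, L=15.0))   # (Kp, Ti, Td), times in seconds
```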

  14. Standardless quantification approach of TXRF analysis using fundamental parameter method

    International Nuclear Information System (INIS)

    Szaloki, I.; Taniguchi, K.

    2000-01-01

    A new standardless evaluation procedure based on the fundamental parameter method (FPM) has been developed for TXRF analysis. The theoretical calculation describes the relationship between the characteristic intensities and the geometrical parameters of the excitation and detection system and the specimen parameters: size, thickness, angle of the excitation beam to the surface, and the optical properties of the specimen holder. Most TXRF methods apply empirical calibration, which requires the application of a special preparation technique. However, the characteristic lines of the specimen holder (Si Kα,β) carry information on the local excitation and geometrical conditions at the substrate surface. On the basis of the theoretical calculation of the substrate characteristic intensity, the excitation beam flux can be approximated. Taking into consideration the elements present in the specimen material, a system of non-linear equations can be written involving the unknown concentration values and the geometrical and detection parameters. In order to solve this mathematical problem, PASCAL software was written which calculates the sample composition and the average sample thickness by a gradient algorithm. Therefore, this quantitative estimation of the specimen composition requires neither an external nor an internal standard sample. For verification of the theoretical calculation and the numerical procedure, several experiments were carried out using a mixed standard solution containing the elements K, Sc, V, Mn, Co and Cu in the 0.1 - 10 ppm concentration range. (author)

  15. Validation of innovative technologies and strategies for regulatory safety assessment methods: challenges and opportunities.

    Science.gov (United States)

    Stokes, William S; Wind, Marilyn

    2010-01-01

    Advances in science and innovative technologies are providing new opportunities to develop test methods and strategies that may improve safety assessments and reduce animal use for safety testing. These include high throughput screening and other approaches that can rapidly measure or predict various molecular, genetic, and cellular perturbations caused by test substances. Integrated testing and decision strategies that consider multiple types of information and data are also being developed. Prior to their use for regulatory decision-making, new methods and strategies must undergo appropriate validation studies to determine the extent to which their use can provide equivalent or improved protection compared to existing methods and the extent to which reproducible results can be obtained in different laboratories. Comprehensive and optimal validation study designs are expected to expedite the validation and regulatory acceptance of new test methods and strategies that will support improved safety assessments and reduced animal use for regulatory testing.

  16. Exploration of method determining hydrogeologic parameters of low permeability sandstone uranium deposits

    International Nuclear Information System (INIS)

    Ji Hongbin; Wu Liwu; Cao Zhen

    2012-01-01

    A hypothesis regarding the injection test as an 'anti-pumping' test is presented, and the pumping test 'match line method' is used to process injection test data. Accurate hydrogeologic parameters can be obtained from injection tests in sandstone uranium deposits with low permeability and small pumping volumes. Taking an injection test in a uranium deposit in Xinjiang as an example, the hydrogeologic parameters of the main ore-bearing aquifer were calculated using the 'anti-pumping' hypothesis. Results calculated with the 'anti-pumping' hypothesis were compared with results calculated by the water level recovery method. The results show that it is feasible to use the 'anti-pumping' hypothesis to calculate the hydrogeologic parameters of the main ore-bearing aquifer. (authors)

  17. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  18. Development and validation of an UHPLC-MS/MS method for β2-agonists quantification in human urine and application to clinical samples.

    Science.gov (United States)

    Bozzolino, Cristina; Leporati, Marta; Gani, Federica; Ferrero, Cinzia; Vincenti, Marco

    2018-02-20

    A fast analytical method for the simultaneous detection of 24 β2-agonists in human urine was developed and validated. The method covers the therapeutic drugs most commonly administered, but also potentially abused β2-agonists. The procedure is based on enzymatic deconjugation with β-glucuronidase followed by SPE clean up using mixed-phase cartridges with both ion-exchange and lipophilic properties. Instrumental analysis conducted by UHPLC-MS/MS allowed high peak resolution and rapid chromatographic separation, with reduced time and costs. The method was fully validated according to ISO 17025:2005 principles. The following parameters were determined for each analyte: specificity, selectivity, linearity, limit of detection, limit of quantification, precision, accuracy, matrix effect, recovery and carry-over. The method was tested on real samples obtained from patients subjected to clinical treatment under chronic or acute therapy with either formoterol, indacaterol, salbutamol, or salmeterol. The drugs were administered using pressurized metered dose inhalers. All β2-agonists administered to the patients were detected in the real samples. The method proved adequate to accurately measure the concentration of these analytes in the real samples. The observed analytical data are discussed with reference to the administered dose and the duration of the therapy. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    Science.gov (United States)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce the aliasing and interpolation errors produced by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuities. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map has been two-dimensional interpolation. The main problem of this method is that the same pixel can take different values when the method of interpolation is changed among the different methods such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab. The conventional, FFT-based spatial filtering method used to
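    A minimal sketch of the core idea, assuming a least-squares fit of a few low-order Zernike terms on the unit disk (the real tool also carries a PSD model for the unfitted residual); the analytic fit can then be evaluated on any output grid, so no interpolation kernel, and none of its aliasing, enters the result.

```python
import numpy as np

def zernike_basis(x, y):
    r2 = x**2 + y**2
    return np.stack([np.ones_like(x), x, y, 2 * r2 - 1,     # piston, tilts, focus
                     x**2 - y**2, 2 * x * y], axis=-1)      # astigmatisms

def resample(surface, n_out):
    n = surface.shape[0]
    g = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(g, g)
    mask = X**2 + Y**2 <= 1.0                               # unit-disk aperture
    B = zernike_basis(X[mask], Y[mask])
    coeffs, *_ = np.linalg.lstsq(B, surface[mask], rcond=None)
    go = np.linspace(-1, 1, n_out)
    Xo, Yo = np.meshgrid(go, go)
    out = zernike_basis(Xo, Yo) @ coeffs                    # analytic evaluation
    out[Xo**2 + Yo**2 > 1.0] = np.nan
    return out

noisy = np.random.default_rng(7).normal(0, 1e-9, (64, 64))  # stand-in measured map
print(resample(noisy, 256).shape)                           # up-sampled, smooth fit
```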

  20. Regionalisation of a distributed method for flood quantiles estimation: Re-evaluation of the local calibration hypothesis to enhance the spatial structure of the optimised parameter

    Science.gov (United States)

    Odry, Jean; Arnaud, Patrick

    2016-04-01

    here is to develop a SHYREG evaluation scheme focusing on both local and regional performances. Indeed, it is necessary to maintain the accuracy of at-site flood quantile estimation while identifying a configuration leading to a satisfactory spatial pattern of the calibrated parameter. This ability to be regionalised can be appraised by the association of common regionalisation techniques and split sample validation tests on a set of around 1,500 catchments representing the whole diversity of French physiography. Also, the presence of many nested catchments and a size-based split sample validation make it possible to assess the relevance of the calibrated parameter's spatial structure inside the largest catchments. The application of this multi-objective evaluation leads to the selection of a version of SHYREG more suitable for regionalisation. References: Arnaud, P., Cantet, P., Aubert, Y., 2015. Relevance of an at-site flood frequency analysis method for extreme events based on stochastic simulation of hourly rainfall. Hydrological Sciences Journal: in press. DOI:10.1080/02626667.2014.965174 Aubert, Y., Arnaud, P., Ribstein, P., Fine, J.A., 2014. The SHYREG flow method-application to 1605 basins in metropolitan France. Hydrological Sciences Journal, 59(5): 993-1005. DOI:10.1080/02626667.2014.902061

  1. Should the mass of a nanoferrite sample prepared by autocombustion method be considered as a realistic preparation parameter?

    Energy Technology Data Exchange (ETDEWEB)

    Wahba, Adel Maher, E-mail: adel.mousa@f-eng.tanta.edu.eg [Department of Engineering Physics and Mathematics, Faculty of Engineering, Tanta University (Egypt); Mohamed, Mohamed Bakr [Ain shams University, Faculty of Science, Physics Department, Cairo (Egypt)

    2017-02-15

    Detectable variations in structural, elastic and magnetic properties have been reported depending on the mass of the cobalt nanoferrite sample prepared by the citrate autocombustion method. The heat released during the autocombustion process and its duration are directly proportional to the mass to be prepared, and are thus expected to affect both the crystallite size and the cation distribution, giving rise to the reported variations in microstrain, magnetization, and coercivity. Formation of a pure spinel phase has been validated using X-ray diffraction patterns (XRD) and Fourier-transform infrared (FTIR) spectra. Crystallite sizes obtained from the Williamson-Hall (W-H) method range from 28 to 87 nm, further supported by images from a high-resolution transmission electron microscope (HRTEM). Saturation magnetization and coercivity deduced from M-H hysteresis loops show a clear correlation with the cation distribution, which was proposed on the basis of experimentally obtained XRD, VSM, and IR data. Elastic parameters have been estimated using the cation distribution and FTIR data, with a resulting trend quite opposite to that of the lattice parameter. - Highlights: • Samples with different masses of CoFe₂O₄ were prepared by the autocombustion method. • XRD and IR data confirmed a pure spinel cubic structure for all samples. • Structural and magnetic properties show detectable changes with the mass prepared. • Cation distribution was suggested from experimental data of XRD, IR, and M-H loops.
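    A minimal sketch of the Williamson-Hall analysis named above: plotting β·cosθ against 4·sinθ for a set of diffraction peaks gives the microstrain ε from the slope and the crystallite size D from the intercept Kλ/D. The peak positions and widths below are illustrative, not the paper's data.

```python
import numpy as np

wavelength = 0.15406   # Cu K-alpha wavelength, nm (assumed radiation)
K = 0.9                # Scherrer shape factor

two_theta_deg = np.array([30.1, 35.5, 43.1, 57.0, 62.6])            # spinel peaks
beta_rad = np.radians(np.array([0.25, 0.28, 0.30, 0.36, 0.40]))     # FWHM, instrument-corrected

theta = np.radians(two_theta_deg / 2)
x = 4 * np.sin(theta)            # strain term abscissa
y = beta_rad * np.cos(theta)     # size-and-strain broadening ordinate

strain, intercept = np.polyfit(x, y, 1)   # slope = microstrain, intercept = K*lambda/D
size_nm = K * wavelength / intercept
print(f"crystallite size ~ {size_nm:.1f} nm, microstrain ~ {strain:.2e}")
```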

  2. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  3. Validation of commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    DEFF Research Database (Denmark)

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold

    2014-01-01

    BACKGROUND: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. OBJECTIVE: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). METHODS: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. RESULTS: Total imprecision was

  4. Performance-based parameter tuning method of model-driven PID control systems.

    Science.gov (United States)

    Zhao, Y M; Xie, W F; Tu, X W

    2012-05-01

    In this paper, a performance-based parameter tuning method for the model-driven Two-Degree-of-Freedom PID (MD TDOF PID) control system is proposed to enhance the control performance of a process. Known for its ability to stabilize unstable processes, track changes of set points quickly and reject disturbances, the MD TDOF PID has gained research interest recently. The tuning methods reported for the MD TDOF PID are based on the internal model control (IMC) method instead of optimizing performance indices. In this paper, an Integral of Time Absolute Error (ITAE) zero-position-error optimal tuning and noise effect minimizing method is proposed for tuning the two parameters in the MD TDOF PID control system to achieve the desired regulating and disturbance rejection performance. The comparison with a Two-Degree-of-Freedom control scheme by modified Smith predictor (TDOF CS MSP) and the MD TDOF PID tuned by the IMC tuning method demonstrates the effectiveness of the proposed tuning method. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  5. An iterative stochastic ensemble method for parameter estimation of subsurface flow models

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2013-01-01

    Parameter estimation for subsurface flow models is an essential step for maximizing the value of numerical simulations for future prediction and the development of effective control strategies. We propose the iterative stochastic ensemble method (ISEM) as a general method for parameter estimation based on stochastic estimation of gradients using an ensemble of directional derivatives. ISEM eliminates the need for adjoint coding and deals with the numerical simulator as a blackbox. The proposed method employs directional derivatives within a Gauss–Newton iteration. The update equation in ISEM resembles the update step in ensemble Kalman filter, however the inverse of the output covariance matrix in ISEM is regularized using standard truncated singular value decomposition or Tikhonov regularization. We also investigate the performance of a set of shrinkage based covariance estimators within ISEM. The proposed method is successfully applied on several nonlinear parameter estimation problems for subsurface flow models. The efficiency of the proposed algorithm is demonstrated by the small size of utilized ensembles and in terms of error convergence rates
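    A minimal sketch of the regularized inversion step described above, on a synthetic ill-conditioned covariance; both the truncated-SVD and Tikhonov variants are shown. Matrix sizes and tolerances are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(30, 5))
A[:, 4] = A[:, 0] + 1e-9 * rng.normal(size=30)   # near-collinear columns
C = A.T @ A                                       # ill-conditioned output covariance

def tsvd_inverse(M, rel_tol=1e-6):
    # Truncated SVD: drop directions with negligible singular values
    U, s, Vt = np.linalg.svd(M)
    keep = s > rel_tol * s[0]
    return (Vt[keep].T / s[keep]) @ U[:, keep].T

def tikhonov_inverse(M, alpha=1e-4):
    # Tikhonov: shift the spectrum instead of truncating it
    n = M.shape[0]
    return np.linalg.solve(M + alpha * np.eye(n), np.eye(n))

print(np.linalg.cond(C))                              # huge: a plain inverse amplifies noise
print(tsvd_inverse(C).shape, tikhonov_inverse(C).shape)
```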


  7. French RSE-M and RCC-MR code appendices for flaw analysis: Presentation of the fracture parameters calculation-Part V: Elements of validation

    International Nuclear Information System (INIS)

    Marie, S.; Chapuliot, S.; Kayser, Y.; Lacire, M.H.; Drubay, B.; Barthelet, B.; Le Delliou, P.; Rougier, V.; Naudin, C.; Gilles, P.; Triay, M.

    2007-01-01

    French nuclear codes include flaw assessment procedures: the RSE-M Code 'Rules for In-service Inspection of Nuclear Power Plant Components' and the RCC-MR code 'Design and Construction Rules for Mechanical Components of FBR Nuclear Islands and High Temperature Applications'. Analytical methods have been developed over the last 10 years in the framework of a collaboration between CEA, EDF and AREVA-NP, and through R and D actions involving CEA and IRSN. These activities have led to a unification of the common methods of the two codes. The calculation of fracture mechanics parameters, in particular the stress intensity factor K I and the J integral, has been extensively developed for industrial configurations. All the developments have been integrated in the 2005 edition of RSE-M and in the 2007 edition of RCC-MR. This series of articles consists of five parts: the first part presents an overview of the methods proposed in the RCC-MR and RSE-M codes. Parts II-IV provide the compendia for specific components: plates (Part II), pipes (Part III) and elbows (Part IV). This part presents the validation of the methods, with details on the process followed for their development and on the accuracy of the proposed analytical methods.
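
    For orientation, the stress intensity factor that such compendia tabulate for specific geometries reduces, in the simplest textbook case, to K_I = Y·σ·√(πa). The sketch below evaluates it for a centre-cracked plate with the classical Feddersen finite-width correction; the numbers are illustrative and the correction factor is the textbook one, not an RSE-M/RCC-MR compendium coefficient.

    ```python
    # Sketch: K_I = Y * sigma * sqrt(pi * a) for a centre crack of half-length
    # a in a plate of width W under remote tension (textbook Feddersen
    # correction, not the RSE-M / RCC-MR compendium coefficients).
    import math

    def k_i(sigma_mpa, a_m, w_m):
        y = math.sqrt(1.0 / math.cos(math.pi * a_m / w_m))  # finite-width factor
        return y * sigma_mpa * math.sqrt(math.pi * a_m)     # MPa * sqrt(m)

    print(k_i(sigma_mpa=150.0, a_m=0.01, w_m=0.2))          # ~26.8 MPa*sqrt(m)
    ```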

  8. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and no systematic bias was observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.
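
    A compact sketch of the two statistics named in this abstract — an intraclass correlation (here the two-way random-effects ICC(2,1)) and the Bland-Altman bias with 95% limits of agreement — computed for two hypothetical raters; the percentage-viability values are invented for illustration.

    ```python
    # Sketch: two-rater agreement statistics - ICC(2,1) from a two-way ANOVA
    # decomposition, plus Bland-Altman bias and 95% limits of agreement.
    import numpy as np

    def icc_2_1(x):                      # x: (n_subjects, k_raters)
        n, k = x.shape
        grand = x.mean()
        msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # rows
        msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
        sse = ((x - grand) ** 2).sum() - msr * (n - 1) - msc * (k - 1)
        mse = sse / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    def bland_altman(a, b):
        diff = a - b
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)    # 95% limits of agreement
        return bias, bias - loa, bias + loa

    rater1 = np.array([62.0, 55.3, 70.1, 48.9, 66.4])   # % viable tissue (made up)
    rater2 = np.array([61.2, 56.0, 69.5, 50.1, 65.8])
    print("ICC(2,1):", icc_2_1(np.column_stack([rater1, rater2])))
    print("Bland-Altman (bias, lower, upper):", bland_altman(rater1, rater2))
    ```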

  9. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

    Full Text Available This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy, and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.

  10. Validation of methods for the detection and quantification of engineered nanoparticles in food

    DEFF Research Database (Denmark)

    Linsinger, T.P.J.; Chaudhry, Q.; Dehalu, V.

    2013-01-01

    ... the methods apply equally well to particles from different suppliers. In trueness testing, information on whether the particle size distribution has changed during analysis is required. Results are largely expected to follow normal distributions due to the expected high number of particles. An approach for the validation of methods for the detection and quantification of nanoparticles in food samples is proposed. It covers validation of identity, selectivity, precision, working range, limit of detection and robustness, bearing in mind that each "result" must include information about the chemical identity ...

  11. Trafficability Analysis at Traffic Crossing and Parameters Optimization Based on Particle Swarm Optimization Method

    Directory of Open Access Journals (Sweden)

    Bin He

    2014-01-01

    Full Text Available In city traffic, it is important to improve transportation efficiency, and the spacing within a platoon should be shortened when crossing an intersection. The best way to deal with this problem is automatic control of vehicles. In this paper, a mathematical model is established for the platoon's longitudinal movement, and a systematic analysis of the longitudinal control law is presented for the platoon of vehicles. However, parameter calibration for the platoon model is relatively difficult because the model is complex and the parameters are coupled with each other. In this paper, the particle swarm optimization method is introduced to effectively optimize the parameters of the platoon. The proposed method effectively finds the optimal parameters based on simulations and shortens the platoon spacing.
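
    The optimizer itself is generic, so a bare-bones particle swarm sketch conveys the idea: particles explore the parameter space, remember personal bests, and are pulled toward the global best. The cost function below is a made-up quadratic surrogate for the platoon-spacing objective, and all gains, bounds and swarm constants are illustrative.

    ```python
    # Sketch: a bare-bones particle swarm optimizer, here minimizing a made-up
    # cost standing in for the platoon-spacing objective of the paper.
    import numpy as np

    def cost(p):
        kp, kv = p                        # hypothetical controller gains
        # Toy surrogate: quadratic bowl with a known optimum at (1.5, 0.8)
        return (kp - 1.5) ** 2 + 2.0 * (kv - 0.8) ** 2

    def pso(cost, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
        lo, hi = np.array(bounds).T
        x = lo + (hi - lo) * np.random.rand(n_particles, lo.size)
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.apply_along_axis(cost, 1, x)
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = np.random.rand(2, n_particles, lo.size)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            val = np.apply_along_axis(cost, 1, x)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], val[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest

    print("optimized gains:", pso(cost, bounds=[(0.0, 5.0), (0.0, 5.0)]))
    ```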

  12. Study on validation method for femur finite element model under multiple loading conditions

    Science.gov (United States)

    Guan, Fengjiao; Zhang, Guanjun; Liu, Jie; Wang, Shujing; Luo, Xu

    2018-03-01

    Acquisition of accurate and reliable constitutive parameters for bio-tissue materials is beneficial for improving the biological fidelity of a Finite Element (FE) model and predicting impact damage more effectively. In this paper, a femur FE model was established under multiple loading conditions with diverse impact positions. Then, based on the sequential response surface method and genetic algorithms, the material parameter identification was transformed into a multi-response optimization problem. Finally, the simulation results successfully coincided with the force-displacement curves obtained from numerous experiments. Thus, the computational accuracy and efficiency of the entire inverse calculation process were enhanced. This method effectively reduces the computation time in the inverse identification of material parameters, and the parameters obtained by the proposed method achieve higher accuracy.

  13. On the Methodology to Calculate the Covariance of Estimated Resonance Parameters

    International Nuclear Information System (INIS)

    Becker, B.; Kopecky, S.; Schillebeeckx, P.

    2015-01-01

    Principles to determine resonance parameters and their covariance from experimental data are discussed. Different methods to propagate the covariance of experimental parameters are compared. A full Bayesian statistical analysis reveals that the level to which the initial uncertainty of the experimental parameters propagates strongly depends on the experimental conditions. For high-precision data the initial uncertainties of experimental parameters, such as a normalization factor, have almost no impact on the covariance of the parameters in the case of thick-sample measurements, for both conventional uncertainty propagation and full Bayesian analysis. The covariances derived from a full Bayesian analysis and a least-squares fit are obtained under the condition that the model describing the experimental observables is perfect. When the quality of the model cannot be verified, a more conservative method based on a renormalization of the covariance matrix is recommended to fully propagate the uncertainty of experimental systematic effects. Finally, neutron resonance transmission analysis is proposed as an accurate method to validate evaluated data libraries in the resolved resonance region.

  14. Single- and multi-layered all-dielectric ENG, MNG, and DNG material parameter extraction by use of the S-parameter method

    DEFF Research Database (Denmark)

    Wu, Yunqiu; Arslanagic, Samel

    2016-01-01

    ... modes inside the structure. This enables the ENG, MNG, and DNG behaviors. The material parameters are obtained from the simulated S-parameters by use of the Nicolson-Ross-Weir method. For the 2-layer structure in particular, the results show a possibility of DNG realization with a negative refractive index ...
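
    Since the record is truncated, a brief reminder of the extraction step it names may help: the Nicolson-Ross-Weir procedure recovers the reflection coefficient and transmission term from S11/S21 and converts them to material parameters. The sketch below assumes a single homogeneous slab under normal-incidence TEM illumination and takes the principal branch of the logarithm (safe only for electrically thin slabs); the S-parameters are synthesized from a known slab so the extraction can be checked, and are not the simulated data of the paper.

    ```python
    # Sketch: Nicolson-Ross-Weir extraction of (eps_r, mu_r) from S11/S21 for
    # a slab of thickness L under normal-incidence TEM illumination. The
    # principal log branch is used, which is only safe for thin slabs.
    import numpy as np

    def nrw(s11, s21, freq_hz, L):
        k0 = 2 * np.pi * freq_hz / 299792458.0
        K = (s11**2 - s21**2 + 1) / (2 * s11)
        gamma = K + np.sqrt(K**2 - 1 + 0j)
        if abs(gamma) > 1:                    # pick the physical root |Gamma|<=1
            gamma = K - np.sqrt(K**2 - 1 + 0j)
        T = (s11 + s21 - gamma) / (1 - (s11 + s21) * gamma)
        n_eff = np.log(1 / T) / (1j * k0 * L)  # sqrt(eps_r*mu_r), principal branch
        z = (1 + gamma) / (1 - gamma)          # normalized impedance sqrt(mu/eps)
        return n_eff / z, n_eff * z            # (eps_r, mu_r)

    # Synthetic check: slab with eps_r = 4, mu_r = 1, L = 5 mm at 3 GHz
    eps_r, mu_r, L, f = 4.0, 1.0, 5e-3, 3e9
    k0 = 2 * np.pi * f / 299792458.0
    n, z = np.sqrt(eps_r * mu_r), np.sqrt(mu_r / eps_r)
    r, t = (z - 1) / (z + 1), np.exp(-1j * k0 * n * L)
    s11 = r * (1 - t**2) / (1 - r**2 * t**2)
    s21 = t * (1 - r**2) / (1 - r**2 * t**2)
    print(nrw(s11, s21, f, L))                 # ~ (4+0j, 1+0j)
    ```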

  15. Parameter extraction and estimation based on the PV panel outdoor ...

    African Journals Online (AJOL)

    The experimental data obtained are validated and compared with the estimated results obtained through simulation based on the manufacturer's data sheet. The simulation is based on the Newton-Raphson iterative method in the MATLAB environment. This approach aids the computation of the PV module's parameters at any ...
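
    The Newton-Raphson step mentioned here is typically applied to the implicit single-diode equation for the module current. A minimal sketch follows, with all parameter values invented rather than taken from any manufacturer's data sheet.

    ```python
    # Sketch: Newton-Raphson solution of the single-diode PV model
    #   I = Iph - I0*(exp((V + I*Rs)/(n*Vt*cells)) - 1) - (V + I*Rs)/Rsh
    # for the current I at a given voltage V. Values are illustrative only.
    import math

    def pv_current(V, Iph=8.2, I0=1e-9, Rs=0.3, Rsh=300.0,
                   n=1.3, Vt=0.0257, cells=60):
        nVt = n * Vt * cells
        I = Iph                                  # initial guess: photocurrent
        for _ in range(50):
            e = math.exp((V + I * Rs) / nVt)
            f = Iph - I0 * (e - 1) - (V + I * Rs) / Rsh - I
            df = -I0 * e * Rs / nVt - Rs / Rsh - 1
            step = f / df
            I -= step
            if abs(step) < 1e-12:                # converged
                break
        return I

    for V in (0.0, 15.0, 30.0):
        print(f"V = {V:5.1f} V  ->  I = {pv_current(V):6.3f} A")
    ```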

  16. Parameter Estimation of Damped Compound Pendulum Using Differential Evolution Algorithm

    Directory of Open Access Journals (Sweden)

    Saad Mohd Sazli

    2016-01-01

    Full Text Available This paper presents the parameter identification of a damped compound pendulum using the differential evolution algorithm. The procedure used to achieve the parameter identification of the experimental system consisted of input-output data collection, ARX model order selection and parameter estimation using the conventional least squares (LS) method and the differential evolution (DE) algorithm. A PRBS signal is used as the input signal to regulate the motor speed, while the output signal is taken from a position sensor. Both input and output data are used to estimate the parameters of the ARX model. The residual error between the actual and predicted output responses of the models is validated using the mean squared error (MSE). Analysis showed that the MSE value for LS is 0.0026 and the MSE value for DE is 3.6601×10⁻⁵. Based on the results obtained, it was found that DE has a lower MSE than the LS method.
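
    The estimation pipeline in this abstract (PRBS input, ARX model, least squares) can be sketched end to end. The second-order coefficients, noise level and data length below are invented so the recovered parameters can be checked against known values; the DE variant is not reproduced here.

    ```python
    # Sketch: least-squares estimation of a second-order ARX model
    #   y[k] = -a1*y[k-1] - a2*y[k-2] + b1*u[k-1] + b2*u[k-2] + e[k]
    # from input/output data simulated with known coefficients.
    import numpy as np

    rng = np.random.default_rng(0)
    a1, a2, b1, b2 = -1.5, 0.7, 0.5, 0.25           # "true" plant, illustrative
    u = rng.choice([-1.0, 1.0], size=500)           # PRBS-like excitation
    y = np.zeros_like(u)
    for k in range(2, len(u)):
        y[k] = (-a1 * y[k-1] - a2 * y[k-2] + b1 * u[k-1] + b2 * u[k-2]
                + 0.01 * rng.standard_normal())

    # Build the regression matrix and solve the normal equations
    Phi = np.column_stack([-y[1:-1], -y[:-2], u[1:-1], u[:-2]])
    theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
    mse = np.mean((y[2:] - Phi @ theta) ** 2)
    print("estimated [a1, a2, b1, b2]:", theta.round(4), " MSE:", mse)
    ```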

  17. Rapid determination of anti-estrogens by gas chromatography/mass spectrometry in urine: Method validation and application to real samples

    Directory of Open Access Journals (Sweden)

    E. Gerace

    2012-02-01

    Full Text Available A fast screening protocol was developed for the simultaneous determination of nine anti-estrogenic agents (aminoglutethimide, anastrozole, clomiphene, drostanolone, formestane, letrozole, mesterolone, tamoxifen, testolactone) plus five of their metabolites in human urine. After an enzymatic hydrolysis, these compounds can be extracted simultaneously from urine with a simple liquid-liquid extraction under alkaline conditions. The analytes were subsequently analyzed by fast gas chromatography/mass spectrometry (fast-GC/MS) after derivatization. The use of a short column, high-flow carrier gas velocity and fast temperature ramping produced an efficient separation of all analytes in about 4 min, allowing a processing rate of 10 samples/h. The present analytical method was validated according to UNI EN ISO/IEC 17025 guidelines for qualitative methods. The range of investigated parameters included the limit of detection, selectivity, linearity, repeatability, robustness and extraction efficiency. The high MS sampling rate, using a benchtop quadrupole mass analyzer, resulted in accurate peak shape definition in both scan and selected ion monitoring modes, and in high sensitivity in the latter mode. The performance of the method is therefore comparable to that obtainable from traditional GC/MS analysis. The method was successfully tested on real samples arising from clinical treatments of hospitalized patients and could profitably be used for clinical studies on anti-estrogenic drug administration. Keywords: Anti-estrogens, Fast-GC/MS, Urine screening, Validation, Breast cancer

  18. Development and Validation of HPLC-PDA Assay method of Frangula emodin

    Directory of Open Access Journals (Sweden)

    Deborah Duca

    2016-03-01

    Full Text Available Frangula emodin (1,3,8-trihydroxy-6-methyl-anthraquinone) is one of the anthraquinone derivatives found abundantly in the roots and bark of a number of plant families traditionally used to treat constipation and haemorrhoids. The present study describes the development and subsequent validation of a specific HPLC assay method for emodin. The separation was achieved on a Waters Symmetry C18 column (4.6 × 250 mm, 5 μm particle size) at a temperature of 35 °C, with UV detection at 287 and 436 nm. An isocratic elution mode was used, with an aqueous mobile phase consisting of 0.1% formic acid and 0.01% trifluoroacetic acid, and methanol. The method was successfully and statistically validated for linearity, range, precision, accuracy, specificity and solution stability.

  19. Method development and validation for the simultaneous determination of organochlorine and organophosphorus pesticides in a complex sediment matrix.

    Science.gov (United States)

    Alcántara-Concepción, Victor; Cram, Silke; Gibson, Richard; Ponce de León, Claudia; Mazari-Hiriart, Marisa

    2013-01-01

    The Xochimilco area in the southeastern part of Mexico City has a variety of socioeconomic activities, such as periurban agriculture, which is of great importance in the Mexico City metropolitan area. Pesticides are used extensively, some being legal, mostly chlorpyrifos and malathion, and some illegal, mostly DDT. Sediments are a common sink for pesticides in aquatic systems near agricultural areas, and Xochimilco sediments have a complex composition with high contents of organic matter and clay that are ideal adsorption sites for organochlorine (OC) and organophosphorus (OP) pesticides. Therefore, it is important to have a quick, affordable, and reliable method to determine these pesticides. Conventional methods for the determination of OC and OP pesticides are long, laborious, and costly owing to the high volumes of solvents and adsorbents required. The present study developed and validated a method for determining 18 OC and five OP pesticides in sediments with high organic and clay contents. In contrast with other methods described in the literature, this method allows isolation of the 23 pesticides with a 12 min microwave-assisted extraction (MAE) and a one-step cleanup. The method developed is a simpler, time-saving procedure that uses only 3.5 g of dry sediment. The use of MAE eliminates excessive handling and the possible loss of analytes. It was shown that the use of LC-Si cartridges with hexane-ethyl acetate (75+25, v/v) in the cleanup procedure recovered all pesticides with rates between 70 and 120%. The validation parameters demonstrated good performance of the method, with intermediate precision ranging from 7.3 to 17.0%, HorRat indexes all below 0.5, and tests of accuracy with the 23 pesticides at three concentration levels demonstrating recoveries ranging from 74 to 114% and RSDs from 3.3 to 12.7%.
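
    The HorRat indexes quoted above compare the observed relative standard deviation with the Horwitz-predicted one at the analyte's mass fraction. A minimal sketch of that calculation, with illustrative numbers:

    ```python
    # Sketch: HorRat index = observed RSD / Horwitz-predicted RSD, where the
    # Horwitz curve gives PRSD(%) = 2^(1 - 0.5*log10(C)) for mass fraction C
    # in g/g. The concentration and RSD below are illustrative only.
    import math

    def horrat(rsd_observed_pct, conc_g_per_g):
        prsd = 2.0 ** (1.0 - 0.5 * math.log10(conc_g_per_g))
        return rsd_observed_pct / prsd

    # e.g. a pesticide at 50 ug/kg (5e-8 g/g) measured with 12% RSD
    print(round(horrat(12.0, 5e-8), 2))   # PRSD ~ 25.1%, HorRat ~ 0.48
    ```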

  20. Pre-study and in-study validation of a size-exclusion chromatography method with different detection modes for the analysis of monoclonal antibody aggregates.

    Science.gov (United States)

    Oliva, Alexis; Fariña, Jose B; Llabrés, Matías

    2016-06-01

    Size exclusion chromatography (SEC) with different detection modes was assessed as a means to characterize the type of bevacizumab aggregate that forms under thermal stress and to quantitatively monitor the aggregation kinetics. The combination of SEC with light-scattering (SEC/LS) detection was validated using an in-study validation process. This was performed by applying a strategy based on a control chart to monitor the process parameters and by inserting quality control samples in routine runs. SEC coupled with a differential refractive-index detector (SEC/RI) was validated using a pre-study validation process in accordance with the ICH-Q2 (R1) guidelines and in-study monitoring in accordance with the Analytical Target Profile (ATP) criteria. The total error and β-expectation tolerance interval rules were used to assess method suitability and control the risk of incorrectly accepting unsuitable analytical methods. The aggregation kinetics data were interpreted using a modified Lumry-Eyring model. The true order of the reaction was determined using the initial-rate approach. All the kinetic data show a linear Arrhenius dependence within the studied temperature range. The Arrhenius approach over-predicted the aggregation rate at 5 °C, but provides an idea of the aggregation process and the amount of aggregate formed. In any case, real-time stability data are necessary to establish the product shelf-life. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Validation of an HPLC method for quantification of total quercetin in Calendula officinalis extracts

    International Nuclear Information System (INIS)

    Muñoz Muñoz, John Alexander; Morgan Machado, Jorge Enrique; Trujillo González, Mary

    2015-01-01

    Introduction: Calendula officinalis extracts are used as natural raw material in a wide range of pharmaceutical and cosmetic preparations; however, there are no official methods for the quality control of these extracts. Objective: to validate an HPLC-based analytical method for the quantification of total quercetin in glycolic and hydroalcoholic extracts of Calendula officinalis. Methods: to quantify the total quercetin content in the matrices, it was necessary to hydrolyze the flavonoid glycosides under optimal conditions. The chromatographic separation was performed on a C-18 SiliaChrom 4.6 x 150 mm, 5 µm column, fitted with a SiliaChrom 5 µm C-18 4.6 x 10 mm precolumn, with UV detection at 370 nm. The gradient elution was performed with a mobile phase consisting of methanol (MeOH) and phosphoric acid (H3PO4) (0.08 % w/v). The quantification was performed through the external standard method and comparison with a quercetin reference standard. Results: the study of the method's selectivity against extract components and degradation products under acid/basic hydrolysis, oxidation and light exposure conditions showed no signals that interfere with the quercetin quantification. It was statistically proved that the method is linear from 1.0 to 5.0 mg/mL. Intermediate precision, expressed as a coefficient of variation, was 1.8 and 1.74 %, and the recovery percentage was 102.15 and 101.32 %, for glycolic and hydroalcoholic extracts, respectively. Conclusions: the suggested methodology meets the quality parameters required for quantifying total quercetin, which makes it a useful tool for the quality control of C. officinalis extracts. (author)

  2. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in a distributed control system. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to solve the state-space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains, given the real-time nature of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as case studies to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters needed to avoid collisions between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by the packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving-block control system, the transient probabilities of the wireless communication process are obtained by stochastic timed Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  3. Design of a Two-Step Calibration Method of Kinematic Parameters for Serial Robots

    Science.gov (United States)

    WANG, Wei; WANG, Lei; YUN, Chao

    2017-03-01

    Serial robots are used to handle workpieces with large dimensions, and calibrating their kinematic parameters is one of the most efficient ways to upgrade their accuracy. Many models have been set up to investigate how many kinematic parameters can be identified to meet the minimality principle, but the base frame and the kinematic parameters are usually calibrated indistinctly, in a single step. A two-step method of calibrating the kinematic parameters is proposed to improve the accuracy of the robot's base frame and kinematic parameters. The forward kinematics, described with respect to the measuring coordinate frame, are established based on the product-of-exponentials (POE) formula. In the first step the robot's base coordinate frame is calibrated using the unit quaternion form. The errors of both the robot's reference configuration and the base coordinate frame's pose are equivalently transformed to zero-position errors of the robot's joints. The simplified model of the robot's positioning error is established in second-order explicit expressions. The identification model is then completed by the least squares method, requiring only measured position coordinates. The complete subtasks of calibrating the robot's 39 kinematic parameters are finished in the second step. A group of calibration experiments proved that, with the proposed two-step calibration method, the average absolute accuracy of the industrial robot was improved to 0.23 mm. This paper shows that the robot's base frame should be calibrated before its kinematic parameters in order to upgrade its absolute positioning accuracy.

  4. Validation of Neutron Calculation Codes and Models by means of benchmark cases in the frame of the Binational Commission of Nuclear Energy. Kinetic Parameters, Temperature Coefficients and Power Distribution

    International Nuclear Information System (INIS)

    Dos Santos, Adimir; Siqueira, Paulo de Tarso D.; Andrade e Silva, Graciete Simões; Grant, Carlos; Tarazaga, Ariel E.; Barberis, Claudia

    2013-01-01

    In 2008 the Atomic Energy National Commission (CNEA) of Argentina and the Brazilian Institute of Energetic and Nuclear Research (IPEN), in the frame of the Nuclear Energy Argentine-Brazilian Agreement (COBEN), among many other activities, included the project 'Validation and Verification of Calculation Methods used for Research and Experimental Reactors'. At that time, it was established that the validation was to be performed with models implemented in the deterministic codes HUEMUL and PUMA (cell and reactor codes) developed by CNEA, and with those implemented in MCNP by CNEA and IPEN. The necessary data for these validations correspond to theoretical-experimental reference cases in the research reactor IPEN/MB-01, located in São Paulo, Brazil. On the Argentine side, the staff of the Reactor and Nuclear Power Studies group (SERC) of CNEA performed calculations with deterministic models (HUEMUL-PUMA) and probabilistic methods (MCNP), modeling a great number of physical situations of the reactor which had previously been studied and modeled by members of the Center of Nuclear Engineering of IPEN, whose results were extensively provided to CNEA. In this paper, results of the comparison of calculated and experimental values for temperature coefficients, kinetic parameters and spatial distributions of fission rates are shown. (author)

  5. Reliability and validity of pressure and temporal parameters recorded using a pressure-sensitive insole during running.

    Science.gov (United States)

    Mann, Robert; Malisoux, Laurent; Brunner, Roman; Gette, Paul; Urhausen, Axel; Statham, Andrew; Meijer, Kenneth; Theisen, Daniel

    2014-01-01

    Running biomechanics has received increasing interest in the recent literature on running-related injuries, calling for new, portable methods for large-scale measurements. Our aims were to define the running strike pattern based on the output of a new pressure-sensitive measurement device, the Runalyser, and to test its validity regarding temporal parameters describing running gait. Furthermore, the reliability of the Runalyser measurements was evaluated, as well as its ability to discriminate different running styles. Thirty-one healthy participants (30.3 ± 7.4 years, 1.78 ± 0.10 m and 74.1 ± 12.1 kg) were involved in the different study parts. Eleven participants were instructed to use a rearfoot (RFS), midfoot (MFS) and forefoot (FFS) strike pattern while running on a treadmill. Strike pattern was subsequently defined using a linear regression (R² = 0.89) between the foot strike angle, as determined by motion analysis (1000 Hz), and the strike index (SI, point of contact on the foot sole as a percentage of foot sole length), as measured by the Runalyser. MFS was defined by the 95% confidence interval of the intercept (SI = 43.9-49.1%). High agreement (overall mean difference 1.2%) was found between stance time, flight time, stride time and duty factor as determined by the Runalyser and a force-measuring treadmill (n = 16 participants). Measurements of the two devices were highly correlated (R ≥ 0.80) and not significantly different. Test-retest intraclass correlation coefficients for all parameters were ≥ 0.94 (n = 14 participants). Significant differences (p < 0.05) between FFS, RFS and habitual running were detected regarding SI, stance time and stride time (n = 24 participants). The Runalyser is suitable for, and easily applicable in, large-scale studies on running biomechanics. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Nonlinear adaptive synchronization rule for identification of a large amount of parameters in dynamical models

    International Nuclear Information System (INIS)

    Ma Huanfei; Lin Wei

    2009-01-01

    The existing adaptive synchronization technique based on the stability theory and invariance principle of dynamical systems, though theoretically proven valid for parameter identification in specific models, often shows a slow convergence rate and may even fail in practice when the number of parameters becomes large. Here, a novel nonlinear adaptive rule for parameter updates is proposed to accelerate the convergence rate. Its feasibility is validated by analytical arguments as well as by parameter identification in the Lotka-Volterra model with multiple species. Two adjustable factors in this rule influence the identification accuracy, which means that a proper choice of these factors leads to an optimal performance of the rule. In addition, a feasible method for avoiding the occurrence of approximate linear dependence among terms with parameters on the synchronized manifold is also proposed.
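
    For context, the classical (linear) adaptive-synchronization rule that this paper accelerates can be sketched on a two-species Lotka-Volterra system: an observer synchronizes with the measured prey population while gradient-type laws adapt the unknown prey parameters, the oscillatory orbit supplying the excitation needed for convergence. The gains, parameter values and Euler discretization are illustrative assumptions; the paper's nonlinear update rule is not reproduced here.

    ```python
    # Sketch: adaptive-synchronization identification of the prey parameters
    # (a, b) in a Lotka-Volterra system xdot = x(a - b*y), ydot = y(-c + d*x).
    # An observer xh is driven toward the measured x while gradient-type
    # (Lyapunov-based) laws adapt the estimates ah, bh.
    a, b, c, d = 1.0, 0.5, 0.5, 0.25     # "true" parameters (illustrative)
    k, gamma = 5.0, 2.0                  # coupling and adaptation gains
    dt, steps = 1e-3, 300000

    x, y = 1.0, 1.0
    xh, ah, bh = 0.5, 0.0, 0.0
    for _ in range(steps):
        e = x - xh                                   # synchronization error
        dx = x * (a - b * y)
        dy = y * (-c + d * x)
        dxh = xh * (ah - bh * y) + k * e
        ah += dt * gamma * e * xh                    # update ~ error * regressor
        bh += dt * gamma * e * (-xh * y)
        x, y, xh = x + dt * dx, y + dt * dy, xh + dt * dxh
    print("estimated (a, b):", round(ah, 2), round(bh, 2))  # approaches (1.0, 0.5)
    ```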

  7. Validation of Experimental whole-body SAR Assessment Method in a Complex Indoor Environment

    DEFF Research Database (Denmark)

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter

    2012-01-01

    Assessing the whole-body specific absorption rate (SARwb) experimentally in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the line-of-sight as the specular path) to assess the whole-body SAR is validated by numerical simulations. An advantage of the proposed method is that it avoids the computational burden, because it does not use any discretization. Results show good agreement between measurement and computation at 2.8 GHz, as long as the plane-wave assumption is valid, i.e., for large distances from the transmitter. Relative deviations 0...

  8. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg

    We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers, as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicate in a flat-bottom 96-well microtiter plate ... In conclusion, we have set up a simple solution for the continuous validation of automated liquid handlers used for accredited work. The method is cheap, simple and easy to use for aqueous solutions, but requires a spectrophotometer that can read microtiter plates.

  9. Statistical analysis of dynamic parameters of the core

    International Nuclear Information System (INIS)

    Ionov, V.S.

    2007-01-01

    Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and NPPs. The dynamic parameters of the neutron transients were explored with statistical analysis tools. The transients have sufficient duration, a few channels for chamber currents and reactivity, and some channels for technological parameters. From these values the inverse period, reactivity, neutron lifetime, reactivity coefficients and some reactivity effects are determined, and the values of the measured dynamic parameters were reconstructed as a result of the analysis. The mathematical means of statistical analysis used were: approximation (A), filtration (F), rejection (R), estimation of descriptive statistics parameters (DSP), correlation characteristics (KK), regression analysis (KP), prognosis (P), and statistical criteria (SC). The calculation procedures were implemented in MATLAB. The sources of methodical and statistical errors are presented: inadequacy of the model, precision of the neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the technique for processing the registered data, etc. Examples of the results of the statistical analysis are given. Problems of the validity of the methods used for the definition and certification of the values of statistical parameters and dynamic characteristics are considered (Authors)

  10. USFDA-GUIDELINE BASED VALIDATION OF TESTING METHOD FOR RIFAMPICIN IN INDONESIAN SERUM SPECIMEN

    Directory of Open Access Journals (Sweden)

    Tri Joko Raharjo

    2010-06-01

    Full Text Available According to a new regulation from the Indonesian FDA (Badan POM-RI), all new non-patent drugs must show bioequivalence with the originator drug prior to registration. Bioequivalence (BE) testing has to be performed on people representative of the population to which the drug will be administered. BE testing needs a valid bio-analytical method for a given drug target and population group. This research reports the specific validation of the bio-analysis of rifampicin in Indonesian serum specimens for use in BE testing. The extraction was performed using acetonitrile, while the chromatographic separation was accomplished on an RP-18 column (250 × 4.6 mm i.d., 5 µm) with a mobile phase composed of KH2PO4 10 mM-acetonitrile (40:60, v/v) and UV detection set at 333 nm. The method showed specificity compared with blank serum specimens, with a rifampicin retention time of 2.1 min. The lower limit of quantification (LLOQ) was 0.06 µg/mL, with a dynamic range up to 20 µg/mL (R > 0.990). The precision of the method was very good, with coefficients of variation (CV) of 0.58, 7.40 and 5.56% for concentrations of 0.06, 5 and 15 µg/mL, respectively. The accuracies of the method were 3.22, 1.94 and 1.90% for concentrations of 0.06, 5 and 15 µg/mL, respectively. The average recoveries were 97.82, 95.50 and 97.31% for rifampicin concentrations of 1, 5 and 5 µg/mL, respectively. The method also showed reliable results in freeze-thaw, short-term and long-term stability tests, as well as in post-preparation stability tests. The validation results show that the method is ready to be used for rifampicin BE testing with Indonesian subjects. Keywords: Rifampicin, Validation, USFDA-Guideline

  11. Application of Powell's optimization method to surge arrester circuit models' parameters

    Energy Technology Data Exchange (ETDEWEB)

    Christodoulou, C.A.; Stathopulos, I.A. [National Technical University of Athens, School of Electrical and Computer Engineering, 9 Iroon Politechniou St., Zografou Campus, 157 80 Athens (Greece); Vita, V.; Ekonomou, L.; Chatzarakis, G.E. [A.S.PE.T.E. - School of Pedagogical and Technological Education, Department of Electrical Engineering Educators, N. Heraklion, 141 21 Athens (Greece)

    2010-08-15

    Powell's optimization method has been used for the evaluation of surge arrester model parameters. The proper modelling of metal-oxide surge arresters and the right selection of equivalent circuit parameters are very significant issues, since the quality and reliability of lightning performance studies can be improved with a more efficient representation of the arresters' dynamic behavior. The proposed approach selects optimum arrester model equivalent circuit parameter values by minimizing the error between the simulated peak residual voltage value and that given by the manufacturer. Application of the method is performed on a 120 kV metal-oxide arrester. The use of the obtained optimum parameter values significantly reduces the relative error between the simulated and the manufacturer's peak residual voltage value, demonstrating the effectiveness of the method. (author)
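
    A minimal sketch of the fitting loop, using SciPy's implementation of Powell's derivative-free method: the objective is the squared relative error between a simulated peak residual voltage and the manufacturer's datasheet value. The simulate_peak function is a placeholder for a real transient simulation of the arrester equivalent circuit, and all numbers are illustrative.

    ```python
    # Sketch: Powell's derivative-free method tuning two circuit parameters so
    # a simulated peak residual voltage matches a datasheet value.
    # simulate_peak() is a stand-in for a real arrester circuit simulation.
    import numpy as np
    from scipy.optimize import minimize

    U_MANUFACTURER = 332.0                 # kV, illustrative datasheet value

    def simulate_peak(params):
        l1, r1 = params                    # hypothetical model parameters
        # Placeholder for an EMTP-style transient simulation of the model
        return 300.0 + 8.0 * l1 + 0.5 * r1 ** 2

    def error(params):
        u_sim = simulate_peak(params)
        return ((u_sim - U_MANUFACTURER) / U_MANUFACTURER) ** 2

    res = minimize(error, x0=[1.0, 5.0], method="Powell")
    print("fitted parameters:", res.x, "relative error:", np.sqrt(res.fun))
    ```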

  12. Development of nuclear methods for determining fluid-dynamic parameters in fluid catalyst cracking reactors

    International Nuclear Information System (INIS)

    Santos, V.A. dos; Dantas, C.C.

    1986-01-01

    Flow parameters of a circulating fluidized bed in a simulated Fluid Catalyst Cracking reactor were determined by means of nuclear methods. The parameters were residence time, density, inventory, circulation rate and radial distribution for the catalyst, and residence time for the gaseous phase. The nuclear methods were gamma attenuation and radiotracers. Two tracer techniques were developed, one for tagging the catalyst with 59Fe as an intrinsic tracer and another for tagging the gaseous phase with CH3 82Br as tracer. A detailed description of each measuring technique for all the investigated parameters is included. To carry out the determination of some of the parameters, a combination of the two methods was also applied. The results and the nuclear data are given in a table. (Author) [pt

  13. Development and validation of an LC-MS/MS method for the quantification of tiamulin, trimethoprim, tylosin, sulfadiazine and sulfamethazine in medicated feed.

    Science.gov (United States)

    Patyra, Ewelina; Nebot, Carolina; Gavilán, Rosa Elvira; Cepeda, Alberto; Kwiatek, Krzysztof

    2018-01-22

    A new multi-compound method for the analysis of veterinary drugs, namely tiamulin, trimethoprim, tylosin, sulfadiazine and sulfamethazine, was developed and validated in medicated feeds. After extraction, the samples were centrifuged, diluted in Milli-Q water, filtered and analysed by high performance liquid chromatography coupled to tandem mass spectrometry. The separation of the analytes was performed on a biphenyl column with a gradient of 0.1% formic acid in acetonitrile and 0.1% formic acid in Milli-Q water. Quantitative validation was done in accordance with the guidelines laid down in European Commission Decision 2002/657/EC. Method performance was evaluated through the following parameters: linearity (R²) ... tiamulin, tylosin and sulfamethazine were detected at the concentration levels declared by the manufacturers. The developed method can therefore be successfully used to routinely control the content and homogeneity of these antibacterial substances in medicated feed. Abbreviations: AAFCO - Association of American Feed Control Officials; TYL - tylosin; TIAM - tiamulin fumarate; TRIM - trimethoprim; SDZ - sulfadiazine; SMZ - sulfamethazine; UV - ultraviolet detector; FLD - fluorescence detector; HPLC - high performance liquid chromatography; MS/MS - tandem mass spectrometry; LOD - limit of detection; LOQ - limit of quantification; CV - coefficient of variation; SD - standard deviation; U - uncertainty.

  14. Optimization of PID Parameters Utilizing Variable Weight Grey-Taguchi Method and Particle Swarm Optimization

    Science.gov (United States)

    Azmi, Nur Iffah Mohamed; Arifin Mat Piah, Kamal; Yusoff, Wan Azhar Wan; Romlay, Fadhlur Rahman Mohd

    2018-03-01

    A controller that uses PID parameters requires a good tuning method in order to improve the control system performance. PID tuning methods are divided into two groups, namely classical methods and artificial intelligence methods. The particle swarm optimization (PSO) algorithm is one of the artificial intelligence methods. Previously, researchers had integrated PSO algorithms in the PID parameter tuning process. This research aims to improve PSO-PID tuning algorithms by integrating the tuning process with the Variable Weight Grey-Taguchi Design of Experiment (DOE) method. This is done by conducting the DOE on the two PSO optimizing parameters: the particle velocity limit and the weight distribution factor. Computer simulations and physical experiments were conducted using the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE and the classical Ziegler-Nichols method, implemented on a hydraulic positioning system. Simulation results show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE reduced the rise time by 48.13% and the settling time by 48.57% compared to the Ziegler-Nichols method. Furthermore, the physical experiment results also show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE tuning method responds better than Ziegler-Nichols tuning. In conclusion, this research has improved PSO-PID parameter tuning by applying the PSO-PID algorithm together with the Variable Weight Grey-Taguchi DOE method as a tuning method in the hydraulic positioning system.

  15. Parameters Estimation For A Patellofemoral Joint Of A Human Knee Using A Vector Method

    Science.gov (United States)

    Ciszkiewicz, A.; Knapczyk, J.

    2015-08-01

    Position and displacement analysis of a spherical model of a human knee joint using the vector method is presented. Sensitivity analysis and parameter estimation were performed using an evolutionary algorithm. Computer simulations for the mechanism with the estimated parameters proved the effectiveness of the prepared software. The method itself can be useful when solving problems concerning displacement and load analysis in the knee joint.

  16. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  17. Blackness coefficients, effective diffusion parameters, and control rod worths for thermal reactors - Methods

    Energy Technology Data Exchange (ETDEWEB)

    Bretscher, M M [Argonne National Laboratory, Argonne, IL 60439 (United States)

    1985-07-01

    Simple diffusion theory cannot be used to evaluate control rod worths in thermal neutron reactors because of the strongly absorbing character of the control material. However, reliable control rod worths can be obtained within the framework of diffusion theory if the control material is characterized by a set of mesh-dependent effective diffusion parameters. For thin slab absorbers the effective diffusion parameters can be expressed as functions of a suitably-defined pair of 'blackness coefficients'. Methods for calculating these blackness coefficients in the P1, P3, and P5 approximations, with and without scattering, are presented. For control elements whose geometry does not permit a thin slab treatment, other methods are needed for determining the effective diffusion parameters. One such method, based on reaction rate ratios, is discussed. (author)

  18. Analytical method development and validation for quantification of uranium in compounds of the nuclear fuel cycle by Fourier Transform Infrared (FTIR) Spectroscopy

    International Nuclear Information System (INIS)

    Pereira, Elaine

    2016-01-01

    This work presents a low-cost, simple and new methodology for the direct quantification of uranium in compounds of the nuclear fuel cycle, based on Fourier Transform Infrared (FTIR) spectroscopy using the KBr pressed-disc technique. Uranium in different matrices was used for development and validation: the UO2(NO3)2·2TBP complex (TBP-uranyl nitrate complex) in organic phase and uranyl nitrate (UO2(NO3)2) in aqueous phase. The parameters used in the validation process were: linearity, selectivity, accuracy, limits of detection (LD) and quantitation (LQ), precision (repeatability and intermediate precision) and robustness. The method for uranium in organic phase (UO2(NO3)2·2TBP complex in hexane, embedded in KBr) was linear (r = 0.9980) over the range 0.20-2.85% U/KBr disc, with LD 0.02% and LQ 0.03%, accurate (recoveries were over 101.0%), robust and precise (RSD < 1.6%). The method for uranium in aqueous phase (UO2(NO3)2 embedded in KBr) was linear (r = 0.9900) over the range 0.14-1.29% U/KBr disc, with LD 0.01% and LQ 0.02%, accurate (recoveries were over 99.4%), robust and precise (RSD < 1.6%). Some process samples were analyzed by FTIR and compared with gravimetric and X-ray fluorescence (XRF) analyses, showing similar results for all three methods. Statistical tests (Student's t and Fisher's) showed that the techniques are equivalent. The validated method can be successfully employed for routine quality control analysis of nuclear compounds. (author)
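
    Figures of merit like the linearity, LD and LQ reported above are commonly derived from the calibration line; a minimal sketch using the ICH-style estimates LOD = 3.3·σ/S and LOQ = 10·σ/S (S the slope, σ the residual standard deviation), with invented calibration points:

    ```python
    # Sketch: linear calibration with ICH-style LOD/LOQ estimates
    # (LOD = 3.3*sigma/S, LOQ = 10*sigma/S). Data points are illustrative.
    import numpy as np

    conc = np.array([0.20, 0.75, 1.30, 1.85, 2.40, 2.85])        # % U / KBr disc
    resp = np.array([0.051, 0.188, 0.322, 0.465, 0.601, 0.718])  # absorbance

    slope, intercept = np.polyfit(conc, resp, 1)
    residuals = resp - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # residual SD, 2 fitted parameters
    r = np.corrcoef(conc, resp)[0, 1]

    print(f"slope={slope:.4f}, r={r:.4f}")
    print(f"LOD={3.3 * sigma / slope:.3f} %, LOQ={10 * sigma / slope:.3f} %")
    ```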

  19. Validation of Likelihood Ratio Methods Used for Forensic Evidence Evaluation: Application in Forensic Fingerprints

    NARCIS (Netherlands)

    Haraksim, Rudolf

    2014-01-01

    In this chapter the Likelihood Ratio (LR) inference model is introduced, the theoretical aspects of probabilities are discussed and the validation framework for LR methods used for forensic evidence evaluation is presented. Prior to introducing the validation framework, the following ...

  20. A Copula-Based Method for Estimating Shear Strength Parameters of Rock Mass

    Directory of Open Access Journals (Sweden)

    Da Huang

    2014-01-01

    Full Text Available The shear strength parameters (i.e., the internal friction coefficient f and the cohesion c) are very important in rock engineering, especially for the stability analysis and reinforcement design of slopes and underground caverns. In this paper, a probabilistic, copula-based method is proposed for estimating the shear strength parameters of rock mass. The optimal copula functions between rock mass quality Q and f, and between Q and c, for marbles are established based on correlation analyses of the results of 12 sets of in situ tests in the exploration adits of the Jinping I-Stage Hydropower Station. Although the copula functions are derived from the in situ tests for the marbles, they can be extended to other types of rock mass with similar geological and mechanical properties. In an extended application to another 9 sets of in situ tests, the values of f and c estimated by the copula-based method achieve better accuracy than the results from the Hoek-Brown criterion. Therefore, the proposed copula-based method is an effective tool for estimating rock strength parameters.
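
    The mechanics of a copula-based estimate can be sketched with a Gaussian copula: transform an observed rock-quality value to the copula space, sample the friction coefficient conditionally, and map back through its marginal. The marginals, correlation and observed Q below are illustrative assumptions; the paper instead selects optimal copula families from in situ test data.

    ```python
    # Sketch: sampling the friction coefficient f conditionally on rock quality
    # Q through a Gaussian copula. Marginals and correlation are illustrative.
    import numpy as np
    from scipy import stats

    rho = 0.8                                         # assumed Q-f correlation
    q_marginal = stats.lognorm(s=0.6, scale=4.0)      # marginal of Q (assumed)
    f_marginal = stats.norm(loc=1.0, scale=0.15)      # marginal of f (assumed)

    def sample_f_given_q(q_obs, n=5, rng=np.random.default_rng(1)):
        u_q = q_marginal.cdf(q_obs)                   # probability transform
        z_q = stats.norm.ppf(u_q)                     # Gaussian copula space
        # Conditional law in copula space: N(rho*z_q, 1 - rho^2)
        z_f = rho * z_q + np.sqrt(1 - rho**2) * rng.standard_normal(n)
        return f_marginal.ppf(stats.norm.cdf(z_f))    # back to the f marginal

    print(sample_f_given_q(q_obs=6.0))                # plausible f values, Q = 6
    ```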

  1. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    Full Text Available This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector of the validation system, corresponding to the identified principal directions, is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.

  2. Validation of the Analytical Method for the Determination of Flavonoids in Broccoli

    Directory of Open Access Journals (Sweden)

    Tuszyńska Magdalena

    2014-09-01

    Full Text Available A simple, accurate and selective HPLC method was developed and validated for the determination of quercetin and kaempferol, which are the main flavonols in broccoli. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of methanol/water (60/40) with 0.2% phosphoric acid at a flow rate of 1.0 ml min-1. The detection was carried out with a DAD detector at 370 nm. The method was validated according to the requirements for new methods, which include selectivity, linearity, precision, accuracy, limit of detection and limit of quantitation. The current method demonstrates good linearity, with R² > 0.99. The recovery is within 98.07-102.15% and 97.92-101.83% for quercetin and kaempferol, respectively. The method is selective, in that quercetin and kaempferol are well separated from other compounds of broccoli with good resolution. The low limits of detection and quantitation of quercetin and kaempferol enable the detection and quantitation of these flavonoids in broccoli at low concentrations.

  3. Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data

    International Nuclear Information System (INIS)

    Palmiotti, G.; Salvatores, M.; Aliberti, G.

    2007-01-01

    The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case for reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives and approach. A validation effort is in particular necessary in the frame of the assessment and design of advanced (e.g. Generation-IV or GNEP) reactors and associated fuel cycles. Validation of simulation codes is complementary to the 'verification' process. In fact, 'verification' addresses the question 'are we solving the equations correctly?' while validation addresses the question 'are we solving the correct equations with the correct parameters?'. Verification implies comparisons with 'reference' equation solutions or with analytical solutions, when they exist. Most of what is called 'numerical validation' falls in this category. Validation strategies differ according to the relative weight of the methods and of the parameters that enter into the simulation tools. Most validation is based on experiments, and the field of neutronics, where a 'robust' physics description model exists that is a function of 'input' parameters not fully known, will be the focus of this paper. In fact, in the case of reactor core, shielding and fuel cycle physics, the model (theory) is well established (the Boltzmann and Bateman equations) and the parameters are the nuclear cross-sections, decay data, etc. Two types of validation approach can be and have been used: (a) mock-up experiments ('global' validation), which require a very close experimental simulation of a reference configuration; bias factors cannot be extrapolated beyond the reference configuration; (b) use of 'clean', 'representative' integral experiments (the 'bias factor and adjustment' method), which allows bias factors and uncertainties to be defined and can be used for a wide range of applications. It ...

  4. NordVal: A Nordic system for validation of alternative microbiological methods

    DEFF Research Database (Denmark)

    Qvist, Sven

    2007-01-01

    NordVal was created in 1999 by the Nordic Committee of Senior Officials for Food Issues under the Nordic Council of Ministers. The Committee adopted the following objective for NordVal: NordVal evaluates the performance and field of application of alternative microbiological methods. This includes analyses of food, water, feed, animal faeces and food environmental samples in the Nordic countries. NordVal is managed by a steering group, which is appointed by the National Food Administrations in Denmark, Finland, Iceland, Norway and Sweden. The background for the creation of NordVal was a Danish validation system (DanVal) established in 1995 to cope with the need to validate alternative methods to be used in the Danish Salmonella Action Program. The program attracted considerable attention in the other Nordic countries. NordVal has elaborated a number of documents which describe the requirements ...

  5. Drilling methods to keep the hydrogeological parameters of natural aquifer

    International Nuclear Information System (INIS)

    Chen Xiaoqin

    2004-01-01

    In hydrogeological drilling, keeping the hydrogeological parameters of the natural aquifer unchanged is a problem of deep concern to technicians. This paper introduces the methods adopted by the state-owned 'Red Hill' geological company of Uzbekistan. Through research on and comparison of different kinds of flushing liquid, the company has found methods to reduce the negative effects of drilling on the permeability of the adjacent aquifer. (author)

  6. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated.

  7. The method validation step of biological dosimetry accreditation process

    International Nuclear Information System (INIS)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph.

    2006-01-01

    One of the missions of the Laboratory of Biological Dosimetry (LDB) of the Institute for Radiation and Nuclear Safety (IRSN) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in a patient's lymphocytes, and this yield is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patients. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, following the recommendations of both ISO 17025 (General Requirements for the Competence of Testing and Calibration Laboratories) and ISO 19238 (Performance criteria for service laboratories performing biological dosimetry by cytogenetics). Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchasing, the personnel department and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardized; therefore, apart from quality management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdR (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties in the way they were measured, i.e. with pipettes, thermometers and test tubes. None of the factors had a significant impact on the yield of dicentrics; therefore the uncertainty linked to their use was considered as ...
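
    The Plackett-Burman screening design mentioned above can be written down directly: for seven two-level factors, eight runs built from cyclic shifts of the standard generator plus an all-low row, with main effects estimated as contrast averages. The factor labels echo the abstract, but the response values are invented.

    ```python
    # Sketch: 8-run Plackett-Burman screening design for 7 two-level factors.
    # Rows 1-7 are cyclic shifts of the standard generator; row 8 is all low.
    # Main effects are contrast averages; the responses are made up.
    import numpy as np

    gen = np.array([+1, +1, +1, -1, +1, -1, -1])
    design = np.array([np.roll(gen, i) for i in range(7)] + [-np.ones(7, int)])

    factors = ["BUdR", "PHA", "colcemid", "culture time",
               "temperature", "blood vol", "medium vol"]
    y = np.array([41, 38, 44, 40, 39, 42, 43, 40])   # e.g. dicentric yields

    effects = design.T @ y / 4.0                     # contrast / (N/2)
    for name, eff in zip(factors, effects):
        print(f"{name:>12s}: effect = {eff:+.2f}")
    ```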

  9. Use of reference materials for validating analytical methods. Applied to the determination of As, Co, Na, Hg, Se and Fe using neutron activation analysis

    International Nuclear Information System (INIS)

    Munoz, L; Andonie, O; Kohnenkamp, I

    2000-01-01

    The main purpose of an analytical laboratory is to provide reliable information on the nature and composition of the materials submitted for analysis. This purpose can only be attained if analytical methodologies that have the attributes of accuracy, precision, specificity and sensitivity, among others, are used. The process by which these attributes are evaluated is called validation of the analytical method. The Chilean Nuclear Energy Commission's Neutron Activation Analysis Laboratory is applying a quality assurance program to ensure the quality of its analytical results, which also aims to attain accreditation for some of its measurements. Validation of the analytical methodologies used is an essential part of applying this program. There are many forms of validation, from comparison with reference techniques to participation in inter-comparison rounds. Certified reference materials were used in this work in order to validate the application of neutron activation analysis to the determination of As, Co, Na, Hg, Se and Fe in shellfish samples. The use of reference materials was chosen because it is a simple option that readily detects sources of systematic error. Neutron activation analysis is an instrumental analytical method that needs no chemical treatment and that is based on processes which take place in the nuclei of atoms, making matrix effects unimportant, so that different biological reference materials can be used. The following certified reference materials were used for validating the method: BCR human hair 397, NRCC dogfish muscle DORM-2, NRCC dogfish liver DOLT-2, NIST oyster tissue 1566, NIES mussel 6 and BCR tuna fish 464. The reference materials were analyzed using the procedure developed for the shellfish samples and the above-mentioned elements were determined. With the results obtained, the parameters of accuracy, precision, detection limit, quantification limit and uncertainty associated with the method were determined for each
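
    A minimal sketch of the CRM comparison underlying such a validation: percent recovery and a z-score against the certified value for each element. All values are illustrative, not the laboratory's results.

```python
# Compare measured results against certified reference values (illustrative data).
import numpy as np

certified = {"As": 18.0, "Co": 0.182, "Fe": 142.0}   # certified values (mg/kg)
u_cert    = {"As": 1.1,  "Co": 0.031, "Fe": 10.0}    # certified uncertainties
measured  = {"As": 17.4, "Co": 0.190, "Fe": 150.0}   # lab results
u_meas    = {"As": 0.9,  "Co": 0.020, "Fe": 8.0}     # lab uncertainties

for el in certified:
    recovery = 100.0 * measured[el] / certified[el]
    z = (measured[el] - certified[el]) / np.hypot(u_cert[el], u_meas[el])
    print(f"{el}: recovery = {recovery:5.1f}%  z = {z:+.2f}")
# |z| <= 2 is a common acceptance criterion for agreement with the CRM.
```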

  10. A practical method to assess model sensitivity and parameter uncertainty in C cycle models

    Science.gov (United States)

    Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy

    2015-04-01

    The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells to several hundred of years for the exchange between the atmosphere and the deep ocean and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improving models and predictions using inverse modelling techniques. A generic inverse problem consists in finding a n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the three following conditions hold: 1) there exists a solution, 2) the solution is unique and 3) the solution depends continuously on the input data. If at least one of these conditions is violated the problem is said ill-posed. The inverse problem is often ill-posed, a regularization method is required to replace the original problem with a well posed problem and then a solution strategy amounts to 1) constructing a solution x, 2) assessing the validity of the solution, 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, ENKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed on the fact that parameters and initial stocks directly related to fast processes were best estimated with narrow confidence intervals, whereas those related to slow processes were poorly estimated with very large uncertainties. While other studies have tried to overcome this difficulty by adding complementary
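
    A minimal sketch of condition 3 failing and being repaired by Tikhonov regularization, with a synthetic near-collinear linear operator standing in for the model h:

```python
# Tikhonov-regularized least squares: minimize ||Ax - y||^2 + lam * ||x||^2.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 20))
A[:, 1] = A[:, 0] + 1e-6 * rng.normal(size=50)   # near-collinear columns -> ill-posed
x_true = rng.normal(size=20)
y = A @ x_true + 0.01 * rng.normal(size=50)

lam = 1e-2
# Solve the augmented least-squares system [A; sqrt(lam) I] x = [y; 0]
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(20)])
y_aug = np.concatenate([y, np.zeros(20)])
x_reg, *_ = np.linalg.lstsq(A_aug, y_aug, rcond=None)
print("relative error:", np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true))
```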

  11. Development and validation of an alternative titration method for the determination of sulfate ion in indinavir sulfate

    Directory of Open Access Journals (Sweden)

    Breno de Carvalho e Silva

    2005-02-01

    A simple and rapid precipitation titration method was developed and validated to determine the sulfate ion content in indinavir sulfate raw material. A 0.1 mol L-1 lead nitrate volumetric solution was used as titrant, with potentiometric endpoint determination using a lead-specific electrode. The United States Pharmacopoeia Forum indicates a potentiometric method for sulfate ion quantitation using 0.1 mol L-1 lead perchlorate as titrant. Both methods were validated with respect to linearity, precision and accuracy, yielding good results. The sulfate ion content found by the two validated methods was compared using Student's t-test, indicating that there was no statistically significant difference between the methods.

  12. Weakly intrusive low-rank approximation method for nonlinear parameter-dependent equations

    KAUST Repository

    Giraldi, Loic; Nouy, Anthony

    2017-01-01

    This paper presents a weakly intrusive strategy for computing a low-rank approximation of the solution of a system of nonlinear parameter-dependent equations. The proposed strategy relies on a Newton-like iterative solver which only requires evaluations of the residual of the parameter-dependent equation and of a preconditioner (such as the differential of the residual) for instances of the parameters independently. The algorithm provides an approximation of the set of solutions associated with a possibly large number of instances of the parameters, with a computational complexity which can be orders of magnitude lower than when using the same Newton-like solver for all instances of the parameters. The reduction of complexity requires efficient strategies for obtaining low-rank approximations of the residual, of the preconditioner, and of the increment at each iteration of the algorithm. For the approximation of the residual and the preconditioner, weakly intrusive variants of the empirical interpolation method are introduced, which require evaluations of entries of the residual and the preconditioner. Then, an approximation of the increment is obtained by using a greedy algorithm for low-rank approximation, and a low-rank approximation of the iterate is finally obtained by using a truncated singular value decomposition. When the preconditioner is the differential of the residual, the proposed algorithm is interpreted as an inexact Newton solver for which a detailed convergence analysis is provided. Numerical examples illustrate the efficiency of the method.
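
    A minimal sketch of the low-rank structure being exploited: a nonlinear system is solved with Newton's method for many parameter instances and the snapshot matrix is compressed by a truncated SVD. This shows only why low-rank approximation pays off, not the weakly intrusive algorithm itself; the model problem is an illustrative assumption.

```python
# Solve A x + mu * x^3 = b for many mu, then compress the solution set.
import numpy as np

n, n_mu = 100, 200
A = np.diag(2.0 * np.ones(n)) \
    + np.diag(-1.0 * np.ones(n - 1), 1) \
    + np.diag(-1.0 * np.ones(n - 1), -1)          # 1D Laplacian-like operator
b = np.ones(n)

def newton(mu, x0, tol=1e-10, maxit=50):
    """Solve A x + mu * x**3 = b by Newton's method."""
    x = x0.copy()
    for _ in range(maxit):
        r = A @ x + mu * x**3 - b                  # residual
        J = A + np.diag(3.0 * mu * x**2)           # Jacobian
        dx = np.linalg.solve(J, -r)
        x += dx
        if np.linalg.norm(dx) < tol:
            break
    return x

mus = np.linspace(0.1, 2.0, n_mu)
X = np.column_stack([newton(mu, np.zeros(n)) for mu in mus])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-8 * s[0]))                   # numerical rank of the snapshots
print(f"numerical rank of {n_mu} solutions: {r} (much smaller than {n})")
```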

  14. The effect of loading methods and parameters on defect detection in digital shearography

    Science.gov (United States)

    Yang, Fu; Ye, Xingchen; Qiu, Zisheng; Zhang, Borui; Zhong, Ping; Liang, ZhiYong; Sun, Zeyu; Zhu, Shu

    Digital Shearography Speckle Pattern Interferometry (DSSPI) is a non-destructive testing technique with a wide range of applications in industry, owing to the merits of non-contact, fast-response, full-field measurement and high sensitivity. In real applications, however, the loading methods and parameters usually depend on the experience of the operator, which affects the effectiveness and accuracy of the test. Against this background, and based on the principle of DSSPI, a model using finite element analysis software and Matlab is established to simulate defect detection in aluminum plates and composite laminates under different loading conditions. The simulation covers loading methods, shearing direction, shearing amount, loading intensity, defect size, defect depth and defect position. In order to quantify the testing effect, a parameter named the deviation D is first defined; through this parameter, the simulation system can evaluate the detection ability of the system. The work in this paper can provide systematic guidance for the choice of loading methods and parameters in real DSSPI experimental systems.

  15. Validation of Cs-137 measurement in food samples using gamma spectrometry system

    International Nuclear Information System (INIS)

    Yii Mei Wo; Kamarozaman Ishak

    2005-01-01

    Cs-137 was found to be one of the major radionuclide contaminants present in foods consumed by humans. In some countries, regulations require foods moving in international trade to be screened for caesium (Cs-134 and Cs-137) to ensure they do not exceed the maximum permissible level, so that the intake of such foods will not accumulate the radionuclide to significant levels in the human body. A gamma spectrometry system was used to perform the measurement of caesium isotopes, because it is one of the easiest methods to perform. This measurement method must be validated for several parameters, including specificity, precision (repeatability), bias (accuracy), linearity, range, detection limit, robustness and ruggedness, in order to ensure that it is fit for purpose. This paper summarises how these parameters were fulfilled for this analytical method using several types of certified reference materials. The same validated method is considered workable for Cs-134 as well. (Author)

  16. Calibration of DEM parameters on shear test experiments using Kriging method

    Science.gov (United States)

    Xavier, Bednarek; Sylvain, Martin; Abibatou, Ndiaye; Véronique, Peres; Olivier, Bonnefoy

    2017-06-01

    Calibration of powder mixing simulations using the Discrete Element Method (DEM) is still an issue. Achieving good agreement with experimental results is difficult because time-efficient use of DEM involves strong assumptions. This work presents a methodology to calibrate DEM parameters using the Efficient Global Optimization (EGO) algorithm, based on the Kriging interpolation method. Classical shear test experiments are used as calibration experiments. The calibration is made on two parameters: the Young modulus and the friction coefficient. Determining the minimal number of grains that has to be used is a critical step: simulating too small an amount of grains would not represent the realistic behavior of the powder, while using a huge amount of grains is strongly time-consuming. The optimization goal is the minimization of the objective function, which is the distance between simulated and measured behaviors. The EGO algorithm maximizes the Expected Improvement criterion to find the next point to be simulated. This stochastic criterion exploits the two quantities provided by the Kriging method, the prediction of the objective function and the estimate of its error, and is thus able to quantify the improvement in the minimization that new simulations at specified DEM parameters would bring.
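
    A minimal sketch of the Expected Improvement criterion at the heart of EGO, taking the Kriging mean and standard error as given; the candidate scores are illustrative.

```python
# Expected Improvement for minimization, given GP mean mu(x) and std sigma(x).
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), z = (f_min - mu) / sigma."""
    mu, sigma = np.asarray(mu), np.asarray(sigma)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (f_min - mu) / sigma
        ei = (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, 0.0)   # no improvement where prediction is exact

# Candidate DEM parameter sets scored by the GP surrogate (illustrative numbers)
mu    = np.array([0.8, 0.5, 0.45, 0.9])
sigma = np.array([0.05, 0.2, 0.01, 0.3])
print(expected_improvement(mu, sigma, f_min=0.5))   # next simulation = argmax EI
```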

  17. A method for generating subgroup parameters from resonance tables and the SPART code

    International Nuclear Information System (INIS)

    Devan, K.; Mohanakrishnan, P.

    1995-01-01

    A method for generating subgroup or band parameters from resonance tables is described. A computer code, SPART, was written using this method. This code generates the subgroup parameters for any number of bands within the specified broad groups at different temperatures, by reading the required input data from the binary cross section library in the Cadarache format. The results obtained with the SPART code for two bands were compared with those obtained from the GROUPIE code, and good agreement was found. Results of the generation of subgroup parameters in four bands for a sample case of 239Pu from the resonance tables of the Cadarache Ver.2 library are also presented. 6 refs, 2 tabs

  18. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Krause, Michael; Josefsen, Mathilde Hartmann

    2009-01-01

    …of the published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation, including comparative and collaborative trials, of a same-day, non-… method, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal). Partly based on results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and its performance was compared with the BAX Salmonella test on 39 pork samples artificially contaminated with Salmonella. There was no significant difference in the results obtained by the two methods. Conclusion: the real-time PCR method for detection of Salmonella in meat and carcass swabs was validated in comparative and collaborative trials according to NordVal recommendations. The PCR method…

  19. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means for testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for representing the fault occurrence process with the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool, MESSALINE, is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE to testing two fault-tolerant systems with very dissimilar features, and the utilization of the experimental results obtained both as design feedback and for the evaluation of dependability measures, are used to illustrate the relevance of the method. (author) [fr

  20. Cross Validation Through Two-Dimensional Solution Surface for Cost-Sensitive SVM.

    Science.gov (United States)

    Gu, Bin; Sheng, Victor S; Tay, Keng Yeow; Romano, Walter; Li, Shuo

    2017-06-01

    Model selection plays an important role in cost-sensitive SVM (CS-SVM). It has been proven that the global minimum cross validation (CV) error can be efficiently computed based on the solution path for one-parameter learning problems. However, it is a challenge to obtain the global minimum CV error for CS-SVM based on a one-dimensional solution path and traditional grid search, because CS-SVM has two regularization parameters. In this paper, we propose a solution- and error-surfaces-based CV approach (CV-SES). More specifically, we first compute a two-dimensional solution surface for CS-SVM based on a bi-parameter space partition algorithm, which can fit solutions of CS-SVM for all values of both regularization parameters. Then, we compute a two-dimensional validation error surface for each CV fold, which can fit validation errors of CS-SVM for all values of both regularization parameters. Finally, we obtain the CV error surface by superposing K validation error surfaces, which can find the global minimum CV error of CS-SVM. Experiments are conducted on seven datasets for cost-sensitive learning and on four datasets for imbalanced learning. Experimental results not only show that our proposed CV-SES has a better generalization ability than CS-SVM with various hybrids between grid search and solution path methods, and than the recently proposed cost-sensitive hinge loss SVM with three-dimensional grid search, but also show that CV-SES uses less running time.
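
    For contrast with CV-SES, the sketch below shows the traditional two-parameter grid search baseline it is compared against, using scikit-learn's SVC with class-dependent weights; the dataset, grid, and scoring choice are illustrative assumptions.

```python
# Baseline: K-fold CV over a 2-D grid of class-dependent costs (not CV-SES).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, weights=[0.9, 0.1], random_state=0)

best = (None, -np.inf)
for c_pos in np.logspace(-2, 2, 9):        # cost for the minority class
    for c_neg in np.logspace(-2, 2, 9):    # cost for the majority class
        clf = SVC(C=1.0, class_weight={0: c_neg, 1: c_pos})
        score = cross_val_score(clf, X, y, cv=5, scoring="f1").mean()
        if score > best[1]:
            best = ((c_pos, c_neg), score)
print("best (c+, c-):", best[0], " CV F1:", round(best[1], 3))
```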

  2. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  3. The measurement, control, and validation of critical parameters in an electron beam sterilization facility

    International Nuclear Information System (INIS)

    Burns, P.; Drewell, N.H.; McKeown, J.

    1996-01-01

    The delivery and validation of a specified dose to a medical device are key concerns of operators of electron beam irradiation facilities. In an IMPELA-based irradiator, four of the parameters that directly influence the absorbed dose distribution in the product are controllable in real time - the electron energy, average beam current, scanned area, and the product exposure time. The 10 MeV accelerator operates at 50 kW with a stream of 200 μs wide, 100 mA pulses at a repetition rate of 250 Hz. The combination of short-term intra-pulse regulation with long-term pulse-to-pulse stability makes the IMPELA output attractive for the sterilization of medical products. The measurement and closed-loop control techniques used in the IMPELA design will be described with reference to facilitating compliance with medical sterilization standards. (orig.)

  4. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    Science.gov (United States)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Because the atorvastatin patent has expired, the pharmaceutical industry produces copies of the drug, and methods for tablet quality testing involving the determination of atorvastatin content in tablets need to be developed. The purpose of this research was to develop and validate a simple analytical method for atorvastatin tablets by HPLC. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the reversed-phase stationary phase, a mixture of methanol-water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and UV detection at a wavelength of 245 nm. The validation covered selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method performed well with respect to all of these parameters for the analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, respectively, and the linearity range was 20 - 120 ng/mL.

  5. Automatic control system generation for robot design validation

    Science.gov (United States)

    Bacon, James A. (Inventor); English, James D. (Inventor)

    2012-01-01

    The specification and drawings present a new method, system, software product and apparatus for generating a robotic validation system for a robot design. The robotic validation system for the robot design of a robotic system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool using both the generic robot description and the control system.

  6. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  7. Method development and validation for simultaneous determination of IEA-R1 reactor’s pool water uranium and silicon content by ICP OES

    Science.gov (United States)

    Ulrich, J. C.; Guilhen, S. N.; Cotrim, M. E. B.; Pires, M. A. F.

    2018-03-01

    IPEN’s research reactor, IEA-R1, is an open pool type research reactor moderated and cooled by light water. High quality water is a key factor in preventing corrosion of the spent fuel stored in the pool. Leaching of radionuclides from corroded fuel cladding may be prevented by an efficient water treatment and purification system. However, as a safety management policy, IPEN has adopted a water chemistry control programme which periodically monitors the levels of uranium (U) and silicon (Si) in the reactor pool, since IEA-R1 employs U3Si2-Al dispersion fuel. An analytical method was developed and validated for the determination of uranium and silicon by ICP OES. This work describes the validation process, in a context of quality assurance, including the parameters selectivity, linearity, quantification limit, precision and recovery.

  8. Validation of an HPLC method for the simultaneous determination of eletriptan and UK 120.413

    Directory of Open Access Journals (Sweden)

    LJILJANA ZIVANOVIC

    2006-11-01

    A rapid and sensitive RP-HPLC method was developed for the routine control analysis of eletriptan hydrobromide and its organic impurity UK 120.413 in Relpax® tablets. The chromatography was performed at 20 °C using a C18 XTerraTM column (5 μm, 150 × 4.6 mm) at a flow rate of 1.0 ml/min. The drug and its impurity were detected at 225 nm. The mobile phase consisted of TEA (1 %) - methanol (67.2:32.8 v/v), the pH of which was adjusted to 6.8 with 85 % orthophosphoric acid. Quantification was accomplished by the internal standard method. The developed RP-HPLC method was validated by testing: accuracy, precision, repeatability, specificity, detection limit, quantification limit, linearity, robustness and sensitivity. High linearity of the analytical procedure was confirmed over the concentration range of 0.05 - 1.00 mg/ml for eletriptan hydrobromide and from 0.10 - 1.50 µg/ml for UK 120.413, with correlation coefficients greater than r = 0.995. The low value of the RSD expressed the good repeatability and precision of the method. Experimental design and a response surface method were used to test the robustness of the analytical procedure and to evaluate the effect of variation of the method parameters, namely the mobile phase composition, pH and temperature; these showed small deviations from the method settings. The good recovery and low RSD confirm the suitability of the proposed RP-HPLC method for the routine determination of eletriptan hydrobromide and its impurity UK 120.413 in Relpax® tablets.

  9. Imaging disturbance zones ahead of a tunnel by elastic full-waveform inversion: Adjoint gradient based inversion vs. parameter space reduction using a level-set method

    Directory of Open Access Journals (Sweden)

    Andre Lamert

    2018-03-01

    We present and compare two flexible and effective methodologies to predict disturbance zones ahead of underground tunnels by using elastic full-waveform inversion. One methodology uses a linearized, iterative approach based on misfit gradients computed with the adjoint method, while the other uses iterative, gradient-free unscented Kalman filtering in conjunction with a level-set representation. Whereas the former does not involve a priori assumptions on the distribution of elastic properties ahead of the tunnel, the latter introduces a massive reduction in the number of explicit model parameters to be inverted for by focusing on the geometric form of potential disturbances and their average elastic properties. Both imaging methodologies are validated through successful reconstructions of simple disturbances. As an application, we consider an elastic multiple disturbance scenario. By using identical synthetic time-domain seismograms as test data, we obtain satisfactory, albeit different, reconstruction results from the two inversion methodologies. The computational costs of both approaches are of the same order of magnitude, with the gradient-based approach showing a slight advantage. The model parameter space reduction approach compensates for this by additionally providing a posteriori estimates of model parameter uncertainty. Keywords: Tunnel seismics, Full waveform inversion, Seismic waves, Level-set method, Adjoint method, Kalman filter

  10. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  11. Investigation and validation of optimal cutting parameters for least ...

    African Journals Online (AJOL)

    The cutting parameters were analyzed and optimized using Box Behnken procedure in the DESIGN EXPERT environment. The effect of process parameters with the output variable were predicted which indicates that the highest cutting speed has significant role in producing least surface roughness followed by feed and ...

  12. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  13. A method for the statistical interpretation of friction ridge skin impression evidence: Method development and validation.

    Science.gov (United States)

    Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J

    2018-04-03

    The forensic fingerprint community has faced increasing amounts of criticism by scientific and legal commentators, challenging the validity and reliability of fingerprint evidence due to the lack of an empirically demonstrable basis to evaluate and report the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method provides fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and report the findings in a more transparent and standardized fashion with clearly defined criteria for conclusions and known error rate information thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.

  14. A Semismooth Newton Method for Nonlinear Parameter Identification Problems with Impulsive Noise

    KAUST Repository

    Clason, Christian

    2012-01-01

    This work is concerned with nonlinear parameter identification in partial differential equations subject to impulsive noise. To cope with the non-Gaussian nature of the noise, we consider a model with L1 fitting. However, the nonsmoothness of the problem makes its efficient numerical solution challenging. By approximating this problem using a family of smoothed functionals, a semismooth Newton method becomes applicable. In particular, its superlinear convergence is proved under a second-order condition. The convergence of the solution to the approximating problem as the smoothing parameter goes to zero is shown. A strategy for adaptively selecting the regularization parameter based on a balancing principle is suggested. The efficiency of the method is illustrated on several benchmark inverse problems of recovering coefficients in elliptic differential equations, for which one- and two-dimensional numerical examples are presented. © by SIAM.
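
    A minimal numpy sketch of the smoothing idea described above, with a linear map standing in for the PDE parameter-to-observation operator and sqrt(r^2 + eps^2) replacing |r|; it omits the paper's regularization term and adaptive parameter choice, and the data are synthetic.

```python
# Newton's method on a smoothed L1 data fit: minimize sum(sqrt(r^2 + eps^2)).
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(200, 10))
x_true = rng.normal(size=10)
y = A @ x_true
y[rng.choice(200, 10, replace=False)] += 20.0      # impulsive noise on 5% of data

def smoothed_l1_newton(A, y, eps=1e-3, maxit=50, tol=1e-10):
    x = np.zeros(A.shape[1])
    for _ in range(maxit):
        r = A @ x - y
        w = np.sqrt(r**2 + eps**2)
        grad = A.T @ (r / w)                       # gradient of the smoothed fit
        H = A.T @ (A * (eps**2 / w**3)[:, None])   # Hessian: A^T diag(eps^2/w^3) A
        dx = np.linalg.solve(H, -grad)
        x += dx
        if np.linalg.norm(dx) < tol:
            break
    return x

x_hat = smoothed_l1_newton(A, y)
print("error vs truth:", np.linalg.norm(x_hat - x_true))   # robust to the outliers
```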

  15. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), the International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  16. Development and validation of an LC-MS/MS method for the determination of adapalene in pharmaceutical forms for skin application

    Directory of Open Access Journals (Sweden)

    Dobričić Vladimir

    2016-01-01

    The development and validation of a liquid chromatography - tandem mass spectrometry (LC-MS/MS) method for the determination of adapalene in pharmaceutical forms for skin application are presented. The MS/MS analysis of adapalene was performed using three mobile phases consisting of acetonitrile and (a) 0.1 % formic acid, (b) 0.1 % trifluoroacetic acid and (c) 20 mM ammonium acetate. The strongest signals of the parent ion and the dominant product ion were obtained in negative mode with mobile phase (c). Validation of the method was performed according to the ICH guidelines. Small variations of selected chromatographic parameters (concentration of ammonium acetate, mobile phase composition, column temperature and flow rate) did not significantly affect the qualitative and quantitative system responses, which proved the method's robustness. The method is specific for the determination of adapalene. Linearity was proved in the concentration range 6.7 - 700.0 ng mL-1 (r = 0.9990), with limits of detection and quantification of 2.0 ng mL-1 and 6.7 ng mL-1, respectively. Accuracy was confirmed by the calculated recoveries (98.4 % - 101.5 %). Precision was tested at three levels: injection repeatability, analysis repeatability and intermediate precision; the calculated relative standard deviations were less than 1, 2 and 3 %, respectively. [Project of the Ministry of Science of the Republic of Serbia, No. OI172041 and No. TR34031]

  17. A Stability-Indicating HPLC-DAD Method for Determination of Ferulic Acid into Microparticles: Development, Validation, Forced Degradation, and Encapsulation Efficiency

    Directory of Open Access Journals (Sweden)

    Jessica Mendes Nadal

    2015-01-01

    A simple stability-indicating HPLC-DAD method was validated for the determination of ferulic acid (FA) in polymeric microparticles. Chromatographic conditions consisted of an RP C18 column (250 mm × 4.60 mm, 5 μm, 110 Å) and a mixture of methanol and water at pH 3.0 (48:52 v/v) as the mobile phase, at a flow rate of 1.0 mL/min with UV detection at 320 nm. The developed method was validated as per ICH guidelines with respect to specificity, linearity, limit of quantification, limit of detection, accuracy, precision, and robustness, and provided suitable results for all parameters investigated. The calibration curve was linear in the concentration range of 10.0–70.0 μg/mL with a correlation coefficient >0.999. Precision (intraday and interday) was demonstrated by a relative standard deviation lower than 2.0%. Accuracy was assessed by the recovery test of FA from polymeric microparticles (99.02% to 100.73%). Specificity showed no interference from the components of the polymeric microparticles or from the degradation products derived from acidic, basic, and photolytic conditions. In conclusion, the method is suitable for assaying FA as a bulk drug and in polymeric microparticles, and can be used for studying its stability and degradation kinetics.

  18. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Development and Validation of a Stability-Indicating LC-UV Method for Simultaneous Determination of Ketotifen and Cetirizine in Pharmaceutical Dosage Forms. ... 5 μm) using an isocratic mobile phase that consisted of acetonitrile and 10 mM disodium hydrogen phosphate buffer (pH 6.5) in a ratio of 45:55 % v/v at a flow ...

  19. Comparison of the performances and validation of three methods for ...

    African Journals Online (AJOL)

    2014-02-28

  20. A feasible, aesthetic quality evaluation of implant-supported single crowns: an analysis of validity and reliability

    DEFF Research Database (Denmark)

    Hosseini, Mandana; Gotfredsen, Klaus

    2012-01-01

    OBJECTIVES: To test the reliability and validity of six aesthetic parameters and to compare professional- and patient-reported aesthetic outcomes. MATERIAL AND METHODS: Thirty-four patients with 66 implant-supported premolar crowns were included. Two prosthodontists and 11 dental students… and the internal consistency were analysed by Cohen's κ and Cronbach's α, respectively. The validity of the CIS parameters was tested against the corresponding Visual Analogue Scale (VAS) scores; Spearman correlation coefficients were used. Six aesthetic Oral Health Impact Profile (OHIP) questions were correlated… (…,24) were found between patient and professional evaluations. CONCLUSIONS: The feasibility, reliability and validity of the CIS make the parameters useful for quality control of implant-supported restorations. The professional- and patient-reported aesthetic outcomes had no significant correlation.

  1. Validation of Cyanoacrylate Method for Collection of Stratum Corneum in Human Skin for Lipid Analysis

    DEFF Research Database (Denmark)

    Jungersted, JM; Hellgren, Lars; Drachmann, Tue

    2010-01-01

    Background and Objective: Lipids in the stratum corneum (SC) are of major importance for the skin barrier function. Many different methods have been used for the collection of SC for the analysis of SC lipids. The objective of the present study was to validate the cyanoacrylate method for the collection…

  2. The validation of the analytical method (HPLC, use for identification and assay of the pharmaceutical active ingredient, colistine sulphate and the finished product Colidem 50 – hydrosoluble powder, in SC DELOS impex ‘96 SRL

    Directory of Open Access Journals (Sweden)

    Maria Neagu,

    2011-06-01

    At SC DELOS IMPEX ’96 SRL, the quality of the active pharmaceutical ingredient (API) for the finished product Colidem 50 - hydrosoluble powder is controlled according to the European Pharmacopoeia, current edition. The method of analysis used for this purpose is the compendial method 'Colistine sulphate' of the E.P., current edition, and represents an optimized variant, developed and validated 'in house'. The parameters included in the validation of the chromatographic method are the following: selectivity/specificity, linearity, range of linearity, limit of detection and limit of quantification, precision (intra-day repeatability, inter-day reproducibility), accuracy, robustness, stability of solutions and system suitability.

  3. Development and validation of stability indicating method for the quantitative determination of venlafaxine hydrochloride in extended release formulation using high performance liquid chromatography

    Directory of Open Access Journals (Sweden)

    Jaspreet Kaur

    2010-01-01

    Objective: Venlafaxine hydrochloride is a structurally novel phenethyl bicyclic antidepressant, usually categorized as a serotonin-norepinephrine reuptake inhibitor (SNRI), but it has also been referred to as a serotonin-norepinephrine-dopamine reuptake inhibitor, since it inhibits the reuptake of dopamine. Venlafaxine HCl is widely prescribed in the form of sustained release formulations. In the current article we report the development and validation of a fast and simple stability-indicating, isocratic high performance liquid chromatographic (HPLC) method for the determination of venlafaxine hydrochloride in sustained release formulations. Materials and Methods: The quantitative determination of venlafaxine hydrochloride was performed on a Kromasil C18 analytical column (250 x 4.6 mm i.d., 5 μm particle size) with 0.01 M phosphate buffer (pH 4.5): methanol (40:60) as the mobile phase, at a flow rate of 1.0 ml/min. UV detection was made at 225 nm. Results: During method validation, parameters such as precision, linearity, accuracy, stability, limit of quantification and detection, and specificity were evaluated, and remained within acceptable limits. Conclusions: The method has been successfully applied for the quantification and dissolution profiling of venlafaxine HCl in sustained release formulations. The method presents a simple and reliable solution for the routine quantitative analysis of venlafaxine HCl.

  4. Analytic Method on Characteristic Parameters of Bacteria in Water by Multiwavelength Transmission Spectroscopy

    Directory of Open Access Journals (Sweden)

    Yuxia Hu

    2017-01-01

    An analytic method combining the Mie scattering theory and the Beer-Lambert law is proposed for the determination of characteristic parameters of bacterial cells (Escherichia coli 10389) from multiwavelength transmission spectroscopy measurements. We calculate the structural parameters of E. coli cells; compared with microscopy, the relative error of the cell volume is 7.90%. The cell number is compared with that obtained by plate counting, with a relative error of 1.02%, and the nucleic acid and protein contents of single E. coli cells are consistent with data reported elsewhere. The proposed method can obtain characteristic parameters of bacteria and is an excellent candidate for the rapid detection and identification of bacteria in water.
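
    The Beer-Lambert half of the analysis reduces to a one-line relation; a minimal sketch with an illustrative extinction coefficient (the Mie part, which supplies the scattering contribution, is not reproduced here):

```python
# Beer-Lambert law: A = eps * c * l, with T = 10**(-A). Illustrative values only.
import numpy as np

eps = 5.2e3        # molar extinction coefficient (L mol^-1 cm^-1), illustrative
l = 1.0            # optical path length (cm)
T = 0.62           # measured transmittance at one wavelength

A_abs = -np.log10(T)          # absorbance from transmittance
c = A_abs / (eps * l)         # concentration from Beer-Lambert
print(f"absorbance = {A_abs:.3f}, concentration = {c:.2e} mol/L")
```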

  5. A simple method for identifying parameter correlations in partially observed linear dynamic models.

    Science.gov (United States)

    Li, Pu; Vu, Quoc Dong

    2015-12-14

    Parameter estimation represents one of the most significant challenges in systems biology, because biological models commonly contain a large number of parameters among which there may be functional interrelationships, leading to the problem of non-identifiability. Although identifiability analysis has been extensively studied by analytical as well as numerical approaches, systematic methods for remedying practically non-identifiable models have rarely been investigated. We propose a simple method for identifying pairwise correlations and higher-order interrelationships of parameters in partially observed linear dynamic models. This is done by deriving the output sensitivity matrix and analysing the linear dependencies of its columns. Consequently, analytical relations between the identifiability of the model parameters and the initial conditions as well as the input functions can be achieved. In the case of structural non-identifiability, identifiable combinations can be obtained by solving the resulting homogeneous linear equations. In the case of practical non-identifiability, experiment conditions (i.e. initial conditions and constant control signals) can be provided which are necessary for remedying the non-identifiability and achieving unique parameter estimation. It is noted that the approach does not consider noisy data. In this way, the practical non-identifiability issue, which is common for linear biological models, can be remedied. Several linear compartment models, including an insulin receptor dynamics model, are taken to illustrate the application of the proposed approach. Both structural and practical identifiability of partially observed linear dynamic models can be clarified by the proposed method. The result of this method provides important information for experimental design to remedy the practical non-identifiability if applicable. The derivation of the method is straightforward and thus the algorithm can be easily implemented into a
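
    A minimal sketch of the column-dependency idea on a toy model where two parameters enter only through their product; the sensitivity matrix is built by finite differences rather than analytically, and the model is an illustrative assumption.

```python
# Detect correlated parameters from near-zero singular values of dy/dp.
import numpy as np

t = np.linspace(0.0, 5.0, 50)

def model(p):
    a, b, c = p
    return (a * b) * np.exp(-c * t)    # a and b enter only through their product

p0 = np.array([2.0, 0.5, 1.0])
h = 1e-6
S = np.column_stack([
    (model(p0 + h * np.eye(3)[i]) - model(p0 - h * np.eye(3)[i])) / (2.0 * h)
    for i in range(3)
])                                     # output sensitivity matrix, columns dy/dp_i

U, s, Vt = np.linalg.svd(S, full_matrices=False)
print("singular values:", s)           # one is ~0: columns dy/da, dy/db are dependent
print("null direction :", Vt[-1])      # a parameter change keeping a*b constant
```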

  6. Validation of insulin resistance indexes in a stable renal transplant population

    NARCIS (Netherlands)

    Oterdoom, LH; De Vries, APJ; Van Son, WJ; Van Der Heide, JJH; Ploeg, RJ; Gansevoort, RT; De Jong, PE; Gans, ROB; Bakker, SJL

    2005-01-01

    OBJECTIVE - The purpose of this study was to investigate the validity of established insulin resistance indexes, based on fasting blood parameters, in a stable renal transplant population. RESEARCH DESIGN AND METHODS - Fasting insulin, homeostasis model assessment (HOMA), the quantitative insulin
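
    The fasting-blood indexes named above have standard closed forms; a minimal sketch follows, assuming the truncated third index is the quantitative insulin sensitivity check index (QUICKI). Units: glucose in mmol/L for HOMA and mg/dL for QUICKI, insulin in µU/mL for both; the example values are illustrative.

```python
# Standard fasting insulin resistance indexes (illustrative inputs).
import math

def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """HOMA insulin resistance index."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

def quicki(glucose_mg_dl, insulin_uU_ml):
    """Quantitative insulin sensitivity check index."""
    return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mg_dl))

print(homa_ir(5.0, 10.0))   # ~2.2 for these illustrative fasting values
print(quicki(90.0, 10.0))   # ~0.34
```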

  7. Development and validation of spectrophotometric methods for simultaneous estimation of citicoline and piracetam in tablet dosage form

    Directory of Open Access Journals (Sweden)

    Akhila Sivadas

    2013-01-01

    Context: The citicoline (CN) and piracetam (PM) combination in tablet formulation has been newly introduced to the market, and it is necessary to develop suitable quality control methods for the rapid and accurate determination of these drugs. Aim: The study aimed to develop methods for the simultaneous determination of CN and PM in a combined dosage form. Materials and Methods: The first method was based on forming and solving simultaneous equations using 280.3 and 264.1 nm as the two analytical wavelengths. The second method was the absorbance ratio method, for which the wavelengths selected were 256.6 nm as the isoabsorptive point and 280.3 nm as the λmax of CN. According to International Conference on Harmonization (ICH) norms, the parameters linearity, precision, and accuracy were studied. The methods were validated statistically and by recovery studies. Results: Both drugs obeyed the Beer-Lambert law at the selected wavelengths in the concentration ranges of 5-13 μg/ml for CN and 10-22 μg/ml for PM. The percentage of CN and PM in the marketed tablet formulation was found to be 99.006 ± 0.173 and 99.257 ± 0.613, respectively, by the simultaneous equation method; by the Q-absorbance ratio method, the percentages of CN and PM were found to be 99.078 ± 0.158 and 99.708 ± 0.838, respectively. Conclusions: The proposed methods were simple, reproducible, precise and robust, and can be successfully applied for the routine analysis of tablets.
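
    A minimal sketch of the simultaneous-equation step: the absorbances of the mixture at the two analytical wavelengths form a linear system in the two concentrations. The absorptivity coefficients below are illustrative placeholders, not measured values.

```python
# Vierordt-style simultaneous equations: A(lambda) = E @ c, solve for c.
import numpy as np

# Rows: wavelengths (280.3 nm, 264.1 nm); columns: absorptivities of CN and PM
E = np.array([[0.045, 0.012],
              [0.020, 0.038]])       # illustrative absorptivity coefficients
A_mix = np.array([0.520, 0.410])     # measured absorbances of the mixture

c = np.linalg.solve(E, A_mix)        # concentrations of CN and PM
print(f"CN = {c[0]:.2f}, PM = {c[1]:.2f} (in the units implied by E)")
```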

  8. An optimization method for parameters in reactor nuclear physics

    International Nuclear Information System (INIS)

    Jachic, J.

    1982-01-01

    An optimization method for two basic problems of reactor physics was developed. The first is the optimization of the plutonium critical mass and the breeding ratio for fast reactors, as functions of the radial enrichment distribution of the fuel used as the control parameter. The second is the maximization of plutonium generation and burnup through optimization of the temporal power distribution. (E.G.) [pt

  9. Validation of verbal autopsy methods using hospital medical records: a case study in Vietnam.

    Science.gov (United States)

    Tran, Hong Thi; Nguyen, Hoa Phuong; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-05-18

    Information on causes of death (COD) is crucial for measuring the health outcomes of populations and progress towards the Sustainable Development Goals. In many countries such as Vietnam where the civil registration and vital statistics (CRVS) system is dysfunctional, information on vital events will continue to rely on verbal autopsy (VA) methods. This study assesses the validity of VA methods used in Vietnam, and provides recommendations on methods for implementing VA validation studies in Vietnam. This validation study was conducted on a sample of 670 deaths from a recent VA study in Quang Ninh province. The study covered 116 cases from this sample, which met three inclusion criteria: a) the death occurred within 30 days of discharge after last hospitalisation, and b) medical records (MRs) for the deceased were available from respective hospitals, and c) the medical record mentioned that the patient was terminally ill at discharge. For each death, the underlying cause of death (UCOD) identified from MRs was compared to the UCOD from VA. The validity of VA diagnoses for major causes of death was measured using sensitivity, specificity and positive predictive value (PPV). The sensitivity of VA was at least 75% in identifying some leading CODs such as stroke, road traffic accidents and several site-specific cancers. However, sensitivity was less than 50% for other important causes including ischemic heart disease, chronic obstructive pulmonary diseases, and diabetes. Overall, there was 57% agreement between UCOD from VA and MR, which increased to 76% when multiple causes from VA were compared to UCOD from MR. Our findings suggest that VA is a valid method to ascertain UCOD in contexts such as Vietnam. Furthermore, within cultural contexts in which patients prefer to die at home instead of a healthcare facility, using the available MRs as the gold standard may be meaningful to the extent that recall bias from the interval between last hospital discharge and death
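
    A minimal sketch of how the reported validity measures are computed from a per-cause two-by-two table of VA diagnoses against the medical-record gold standard; the counts are invented for illustration.

```python
# Validity metrics for one cause of death (illustrative counts).
def validity_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)     # VA finds the cause when MR says it is present
    specificity = tn / (tn + fp)     # VA rejects the cause when MR says it is absent
    ppv = tp / (tp + fp)             # VA-positive diagnoses that MR confirms
    return sensitivity, specificity, ppv

sens, spec, ppv = validity_metrics(tp=30, fp=10, fn=10, tn=66)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f}")
```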

  10. Quantification of endocrine disruptors and pesticides in water by gas chromatography-tandem mass spectrometry. Method validation using weighted linear regression schemes.

    Science.gov (United States)

    Mansilha, C; Melo, A; Rebelo, H; Ferreira, I M P L V O; Pinho, O; Domingues, V; Pinho, C; Gameiro, P

    2010-10-22

    A multi-residue methodology based on solid phase extraction followed by gas chromatography-tandem mass spectrometry was developed for the trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample, to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was done mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC-MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the greater concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limit of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness. Copyright © 2010 Elsevier B.V. All rights reserved.
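
    A minimal sketch of the weighting idea follows, with invented calibration data. Statistical weights of 1/x² (one common empirical choice; the paper evaluated several weighting schemes) counteract the leverage of the high standards under heteroscedastic noise. Note that np.polyfit applies its weights to the unsquared residuals, so the square roots of the statistical weights are passed.

```python
import numpy as np

# Heteroscedastic calibration data (illustrative values, not from the paper)
conc = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])    # ng/mL
resp = np.array([0.9, 2.1, 10.4, 19.8, 101.5, 198.0])  # peak-area ratio

# Statistical WLS weights 1/x^2 (variance assumed proportional to x^2).
# np.polyfit weights the *unsquared* residuals, so pass sqrt(1/x^2) = 1/x.
slope, intercept = np.polyfit(conc, resp, deg=1, w=1.0 / conc)

# Back-calculate the standards to check accuracy at the low end of the curve
back = (resp - intercept) / slope
print("slope=%.4f intercept=%.4f" % (slope, intercept))
print("recovery (%):", np.round(100 * back / conc, 1))
```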

  11. MR-based water content estimation in cartilage: design and validation of a method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sophie; Ringgaard, Steffen

    Purpose: To design and validate an MR-based method that allows calculation of the water content of cartilage tissue. Methods and Materials: Cartilage tissue T1-map-based water content MR sequences were used on a system held stable at 37 °C. The T1-map intensity signal was analysed on 6 cartilage samples from living animals (pig) and on 8 gelatin samples whose water content was already known. For the data analysis, a T1 intensity signal map software analyser was used. Finally, the method was validated by measuring and comparing 3 further cartilage samples in a living animal (pig). The obtained T1-map-based water content sequences can provide information that, after being analysed using T1-map analysis software, can be interpreted as the water contained in cartilage tissue. The amount of water estimated using this method was similar to that obtained by the freeze-dry procedure…
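
    The abstract does not give the functional form relating T1 to water content, so the sketch below simply assumes a straight-line calibration fitted on gelatin phantoms of known composition and applied to cartilage T1 values; all numbers are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: fit a linear T1 -> water-content relation on gelatin
# phantoms of known composition, then apply it to measured cartilage T1
# values. The linear form is an assumption made here, not the paper's model.

t1_gel = np.array([650.0, 780.0, 900.0, 1010.0, 1150.0])  # ms (invented)
water_gel = np.array([60.0, 67.0, 74.0, 80.0, 87.0])      # % w/w (invented)

coeff = np.polyfit(t1_gel, water_gel, 1)                  # slope, intercept

t1_cartilage = np.array([820.0, 870.0, 905.0])            # ROI T1 values (ms)
print(np.polyval(coeff, t1_cartilage))                    # estimated % water
```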

  12. Identification of Constitutive Parameters Using Inverse Strategy Coupled to an ANN Model

    International Nuclear Information System (INIS)

    Aguir, H.; Chamekh, A.; BelHadjSalah, H.; Hambli, R.

    2007-01-01

    This paper deals with the identification of material parameters using an inverse strategy. In classical methods, the inverse technique is generally coupled with a finite element code, which leads to long computing times. In this work, an inverse strategy coupled with an ANN procedure is proposed, which has the advantage of being faster than the classical approach. To validate this approach, experimental plane tensile and bulge tests were used to identify the material behaviour. The ANN model is trained on finite element simulations of the two tests. To reduce the gap between the experimental responses and the numerical ones, the proposed method is coupled with an optimization procedure to identify the material parameters of AISI 304 steel. The identified material parameters are the hardening curve and the anisotropy coefficients.
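
    A compact sketch of the ANN-accelerated inverse loop is given below. Since a finite element solver is not available here, a cheap analytic hardening law (Hollomon, σ = K·εⁿ) stands in for the FE simulations that generate the training data; the parameter names, bounds and values are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
eps = np.linspace(0.01, 0.2, 20)                      # strain grid

def fe_stand_in(K, n):                                # stand-in for an FE run
    return K * eps**n                                 # Hollomon hardening law

# 1) Build the training set: material parameters -> simulated response
params = rng.uniform([300.0, 0.1], [900.0, 0.5], size=(400, 2))
curves = np.array([fe_stand_in(K, n) for K, n in params])

# 2) Train the ANN surrogate mapping parameters to the stress-strain curve
ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
ann.fit(params, curves)

# 3) Inverse step: find parameters whose ANN response matches the "experiment"
exp_curve = fe_stand_in(620.0, 0.28)                  # synthetic experiment

def misfit(p):
    return np.sum((ann.predict(p.reshape(1, -1))[0] - exp_curve) ** 2)

res = minimize(misfit, x0=[500.0, 0.3], method="Nelder-Mead")
print("identified K, n:", res.x)                      # close to (620, 0.28)
```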

  13. Calibration and validation of a general infiltration model

    Science.gov (United States)

    Mishra, Surendra Kumar; Ranjan Kumar, Shashi; Singh, Vijay P.

    1999-08-01

    A general infiltration model proposed by Singh and Yu (1990) was calibrated and validated using a split sampling approach for 191 sets of infiltration data observed in the states of Minnesota and Georgia in the USA. Of the five model parameters, fc (the final infiltration rate), So (the available storage space) and exponent n were found to be more predictable than the other two parameters: m (exponent) and a (proportionality factor). A critical examination of the general model revealed that it is related to the Soil Conservation Service (1956) curve number (SCS-CN) method; its parameter So is equivalent to the potential maximum retention of the SCS-CN method and is, in turn, found to be a function of soil sorptivity and hydraulic conductivity. The general model was found to describe the infiltration rate with a time-varying curve number.
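
    The split-sampling workflow can be sketched as follows. Because the abstract does not reproduce the functional form of the Singh-Yu (1990) model, Horton's equation is used here as a stand-in infiltration model; the calibrate-on-one-part, validate-on-the-other procedure is the point being illustrated, and all data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def horton(t, fc, f0, k):
    """Horton infiltration rate: stand-in for the Singh-Yu general model."""
    return fc + (f0 - fc) * np.exp(-k * t)

t = np.linspace(0.1, 4.0, 24)                       # hours
obs = horton(t, 1.2, 8.0, 1.5) + np.random.default_rng(1).normal(0, 0.15, t.size)

# Calibrate on the first half of the record, validate on the second half
half = t.size // 2
popt, _ = curve_fit(horton, t[:half], obs[:half], p0=[1.0, 5.0, 1.0])

pred = horton(t[half:], *popt)
rmse = np.sqrt(np.mean((pred - obs[half:]) ** 2))
print("calibrated (fc, f0, k):", np.round(popt, 3), "validation RMSE:", round(rmse, 3))
```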

  14. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The target of this work is to validate the component functions of model output between physical observations and a computational model with the area metric. In the theory of high dimensional model representation (HDMR) of independent input variables, conditional expectations are component functions of the model output, and these conditional expectations reflect partial information of the model output. Therefore, validation of the conditional expectations reveals the discrepancy between the partial information of the computational model output and that of the observations. A calibration of the conditional expectations is then carried out to reduce the value of the validation metric. After that, the validation metric of the model output is recalculated with the calibrated model parameters, and the result shows that reducing the discrepancy in the conditional expectations helps decrease the difference in model output. Finally, several examples are employed to demonstrate the rationality and necessity of the methodology in cases of both a single validation site and multiple validation sites. - Highlights: • A validation metric of conditional expectations of model output is proposed. • HDMR explains the relationship of conditional expectations and model output. • An improved approach to parameter calibration updates the computational models. • The validation and calibration process is applied at single and multiple sites. • The validation and calibration process shows superiority over existing methods
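
    The area metric compares the full distributions of model output and observations. A minimal sketch, assuming the common definition as the area between the two empirical CDFs (the exact variant used by the authors may differ):

```python
import numpy as np

def area_metric(model_samples, obs_samples):
    """Area between the empirical CDFs of model output and observations."""
    xs = np.sort(np.concatenate([model_samples, obs_samples]))
    # Empirical CDF values just to the right of each breakpoint
    Fm = np.searchsorted(np.sort(model_samples), xs, side="right") / len(model_samples)
    Fo = np.searchsorted(np.sort(obs_samples), xs, side="right") / len(obs_samples)
    # Integrate |Fm - Fo| over each piecewise-constant interval
    return np.sum(np.abs(Fm - Fo)[:-1] * np.diff(xs))

rng = np.random.default_rng(2)
model = rng.normal(10.0, 1.0, 2000)      # computational model output
obs = rng.normal(10.5, 1.2, 50)          # physical observations
print(area_metric(model, obs))           # smaller area = better agreement
```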

  15. Determination of Microstructural Parameters of Nanocrystalline Hydroxyapatite Prepared by Mechanical Alloying Method

    Science.gov (United States)

    Joughehdoust, Sedigheh; Manafi, Sahebali

    2011-12-01

    Hydroxyapatite [HA, Ca10(PO4)6(OH)2] is chemically similar to the mineral component of bones and hard tissues. HA can support bone ingrowth and osseointegration when used in orthopaedic, dental and maxillofacial applications. In this research, nanostructured HA was synthesized by the mechanical alloying method. The phase development, particle size and morphology of the HA were investigated by X-ray diffraction (XRD), a zetasizer instrument and scanning electron microscopy (SEM), respectively. The XRD pattern was used to determine the microstructural parameters (crystallite size, lattice parameters and percent crystallinity) by the Williamson-Hall equation, the Nelson-Riley method and calculation of the areas under the peaks, respectively. The crystallite size and particle size of the HA powders were on the nanometre scale. SEM images showed that some of the HA particles were agglomerated. The lattice parameter ratio of the synthetic hydroxyapatite determined in this study (c/a = 0.73) is the same as that of the natural hydroxyapatite structure.
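
    A short sketch of the Williamson-Hall analysis mentioned above: plotting β·cosθ against 4·sinθ for each reflection gives the microstrain as the slope and the crystallite size from the intercept (Kλ/D). The peak positions and breadths below are illustrative, not the study's data.

```python
import numpy as np

# Williamson-Hall: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta),
# with beta the peak breadth (rad), D the crystallite size, eps the strain.

K, lam = 0.9, 0.15406                                     # Cu K-alpha (nm)
two_theta = np.radians([25.9, 31.8, 32.9, 34.0, 39.8])    # peak positions
beta = np.radians([0.42, 0.45, 0.46, 0.47, 0.51])         # FWHM, deg -> rad

theta = two_theta / 2.0
x = 4.0 * np.sin(theta)
y = beta * np.cos(theta)

slope, intercept = np.polyfit(x, y, 1)                    # linear W-H plot
D = K * lam / intercept                                   # crystallite size (nm)
print(f"crystallite size D = {D:.1f} nm, microstrain = {slope:.4f}")
```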

  16. Clashing Validities in the Comparative Method? Balancing In-Depth Understanding and Generalizability in Small-N Policy Studies

    NARCIS (Netherlands)

    van der Heijden, J.

    2013-01-01

    The comparative method receives considerable attention in political science. To some, a main advantage of the method is that it allows for both in-depth insights (internal validity) and generalizability beyond the cases studied (external validity). However, others consider internal and external validity to be clashing demands…

  17. A Novel Non-Iterative Method for Real-Time Parameter Estimation of the Fricke-Morse Model

    Directory of Open Access Journals (Sweden)

    SIMIC, M.

    2016-11-01

    Full Text Available Parameter estimation for the Fricke-Morse model of biological tissue is widely used in bioimpedance data processing and analysis. Complex nonlinear least squares (CNLS) data fitting is often used for parameter estimation of the model, but limitations such as long processing times, convergence to local minima, the need for a good initial guess of the model parameters and non-convergence have been reported. Thus, there is strong motivation to develop methods that overcome these flaws. In this paper, a novel real-time method for parameter estimation of the Fricke-Morse model of biological cells is presented. The proposed method uses the value of the characteristic frequency estimated from the measured imaginary part of the bioimpedance, whereupon the Fricke-Morse model parameters are calculated using the provided analytical expressions. The proposed method is compared with CNLS in the frequency ranges of 1 kHz to 10 MHz (beta dispersion) and 10 kHz to 100 kHz, the latter being more suitable for low-cost microcontroller-based bioimpedance measurement systems. The obtained results are promising: in both frequency ranges, CNLS and the proposed method have accuracies suitable for most electrical bioimpedance (EBI) applications. However, the proposed algorithm has significantly lower computational complexity, making it 20-80 times faster than CNLS.
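
    The sketch below illustrates the general non-iterative idea on the standard Fricke-Morse (three-element) circuit, using generic textbook relations rather than the authors' exact expressions: R0 and R∞ are read from the frequency limits, the characteristic frequency from the peak of −Im(Z), and the circuit elements follow analytically. The element values are synthetic.

```python
import numpy as np

# Fricke-Morse circuit: extracellular resistance Re in parallel with the
# series branch (intracellular resistance Ri + membrane capacitance Cm).
Re_t, Ri_t, Cm_t = 1000.0, 600.0, 2e-9            # "true" values for the demo
f = np.logspace(3, 7, 400)                        # 1 kHz .. 10 MHz
w = 2 * np.pi * f
Z = Re_t * (1 + 1j * w * Ri_t * Cm_t) / (1 + 1j * w * (Re_t + Ri_t) * Cm_t)

# 1) R0 and Rinf from the low/high-frequency limits of the real part
R0 = Z.real[0]
Rinf = Z.real[-1]

# 2) Characteristic frequency = peak of -Im(Z); relaxation time tau
fc = f[np.argmax(-Z.imag)]
tau = 1.0 / (2 * np.pi * fc)

# 3) Analytical back-calculation of the circuit elements (no iteration)
Re_e = R0
Ri_e = R0 * Rinf / (R0 - Rinf)
Cm_e = tau / (Re_e + Ri_e)
print(Re_e, Ri_e, Cm_e)                           # ~ (1000, 600, 2e-9)
```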

  18. Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

    Science.gov (United States)

    Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin

    2015-10-01

    The performance of kernel-based techniques depends on the selection of kernel parameters, so suitable parameter selection is an important problem for many of these techniques. This article presents a novel technique to learn the kernel parameters of a kernel Fukunaga-Koontz Transform (KFKT) based classifier. The proposed approach determines appropriate values of the kernel parameters by optimizing an objective function constructed from the discrimination ability of the KFKT. For this purpose, we utilized the differential evolution algorithm (DEA). The new technique overcomes some disadvantages of the traditional cross-validation method, such as its high time consumption, and it can be applied to any type of data. Experiments on target detection in hyperspectral images verify the effectiveness of the proposed method.
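
    As an illustrative stand-in (the KFKT discrimination objective is not specified in the abstract), the sketch below tunes RBF kernel parameters with SciPy's differential evolution, using a cross-validated SVM score as the objective; the search mechanics are the same regardless of the objective plugged in.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def objective(params):
    # Search in log-space so the bounds cover several orders of magnitude
    log_gamma, log_C = params
    clf = SVC(kernel="rbf", gamma=10.0**log_gamma, C=10.0**log_C)
    return -cross_val_score(clf, X, y, cv=5).mean()   # DE minimizes

result = differential_evolution(objective, bounds=[(-4, 1), (-2, 3)],
                                seed=0, maxiter=20, popsize=10)
print("best gamma=%.4g, C=%.4g" % (10**result.x[0], 10**result.x[1]))
```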

  19. Development and validation of an HPLC method for the determination of epicatechin in Maytenus ilicifolia (Schrad.) Planch., Celastraceae

    Directory of Open Access Journals (Sweden)

    Gisely Cristiny Lopes

    2010-09-01

    Full Text Available A simple, reproducible and efficient high-performance liquid chromatography (HPLC) method was developed. Water (0.05% TFA):acetonitrile (0.05% TFA) was used as the mobile phase in a gradient system for the determination of epicatechin (EP) in leaves of Maytenus ilicifolia (Schrad.) Planch. The analysis was performed using an RP C-18 column (5 µm) as the stationary phase, with a flow rate of 0.8 mL/min and detection at a wavelength of 210 nm. The main validation parameters of the method were also determined. The calibration curve was found to be linear over the range of 10-120 µg/mL (EP). The correlation coefficient of the linear regression analysis was 0.9988, and the detection and quantification limits were 28.61 and 86.77 µg/mL, respectively. The content of EP was successfully determined, with satisfactory reproducibility and recovery; recovery of EP was 99.32%. The method was successfully applied to the determination of epicatechin in leaves of M. ilicifolia. The interlaboratory evaluation demonstrated the reproducibility of the method, with a relative standard deviation of 14.62%.
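
    Where residual-based limits are preferred, ICH Q2(R1) allows LOD and LOQ to be estimated from a calibration line as 3.3σ/S and 10σ/S, with σ the residual standard deviation and S the slope. A small sketch with invented data:

```python
import numpy as np

# Illustrative calibration points (not the study's data)
conc = np.array([10.0, 30.0, 50.0, 80.0, 100.0, 120.0])     # ug/mL
area = np.array([152.0, 455.0, 748.0, 1208.0, 1500.0, 1818.0])

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = np.sqrt(np.sum(resid**2) / (len(conc) - 2))         # residual SD

print("LOD = %.2f ug/mL, LOQ = %.2f ug/mL" % (3.3 * sigma / slope,
                                              10.0 * sigma / slope))
```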

  20. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches to pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single-laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. The theoretical background for performing pesticide formulation analysis as set out in ISO, CIPAC/AOAC and IUPAC guidelines is also described, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III give practical, worked examples of how to use the Horwitz approach and formulae for estimating the target standard deviation for acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked-out examples. (author)
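
    The Horwitz formula referred to above predicts the reproducibility RSD from the analyte mass fraction alone. A one-function sketch follows; the ~2/3 factor for converting to a repeatability target is a common convention, applied here as an assumption.

```python
import math

def horwitz_rsd(mass_fraction):
    """Predicted reproducibility RSD (%) from the Horwitz equation,
    RSD_R = 2^(1 - 0.5*log10(C)), with C a dimensionless mass fraction."""
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

# Example: a formulation declared at 25% w/w active ingredient
C = 0.25
rsd_R = horwitz_rsd(C)                 # ~2.46%
rsd_r = 2.0 / 3.0 * rsd_R              # repeatability often taken as ~2/3 RSD_R
print(f"RSD_R = {rsd_R:.2f}%  -> target repeatability RSD ~ {rsd_r:.2f}%")
```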