WorldWideScience

Sample records for method validation step

  1. Validation of DRAGON side-step method for Bruce-A restart Phase-B physics tests

    International Nuclear Information System (INIS)

    Shen, W.; Ngo-Trong, C.; Davis, R.S.

    2004-01-01

The DRAGON side-step method, developed at AECL, has a number of advantages over the all-DRAGON method that was used before. It is now the qualified method for reactivity-device calculations. Although the side-step-method-generated incremental cross sections have been validated against those previously calculated with the all-DRAGON method, it is highly desirable to validate the side-step method against device-worth measurements in power reactors directly. In this paper, the DRAGON side-step method was validated by comparison with the device-calibration measurements made in Bruce-A NGS Unit 4 restart Phase-B commissioning in 2003. The validation exercise showed excellent results, with the DRAGON code overestimating the measured ZCR worth by ∼5%. A sensitivity study was also performed to assess the effect of various DRAGON modelling techniques on the incremental cross sections. The assessment shows that the refinement of meshes in 3-D and the use of the side-step method are the two major reasons contributing to the improved agreement between the calculated ZCR worths and the measurements. Use of different DRAGON versions, DRAGON libraries, local-parameter core conditions, and weighting techniques for the homogenization of tube clusters inside the ZCR has a very small effect on the ZCR incremental thermal absorption cross section and ZCR reactivity worth. (author)

  2. Free Modal Algebras Revisited: The Step-by-Step Method

    NARCIS (Netherlands)

    Bezhanishvili, N.; Ghilardi, Silvio; Jibladze, Mamuka

    2012-01-01

    We review the step-by-step method of constructing finitely generated free modal algebras. First we discuss the global step-by-step method, which works well for rank one modal logics. Next we refine the global step-by-step method to obtain the local step-by-step method, which is applicable beyond

  3. Practical procedure for method validation in INAA- A tutorial

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2015-01-01

This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, and what the procedures, adopted strategies and acceptance criteria for the results are; that is, how to carry out a method validation in INAA. To exemplify the methodology, results are presented for the validation of the determination of Co, Cr, Fe, Rb, Se and Zn mass fractions in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)
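The performance characteristics listed above can be summarized programmatically. The sketch below uses hypothetical replicate values and an illustrative zeta-score check for trueness; the laboratory's actual acceptance criteria and formulas may differ:

```python
import statistics

def performance_summary(replicates, reference_value, reference_uncertainty):
    """Summarize trueness and repeatability for one analyte
    (illustrative only; acceptance criteria are laboratory-defined)."""
    mean = statistics.fmean(replicates)
    sd = statistics.stdev(replicates)
    rsd_percent = 100 * sd / mean                  # repeatability, as RSD%
    relative_bias = 100 * (mean - reference_value) / reference_value
    # zeta score: bias weighted by combined lab and reference uncertainties
    u_lab = sd / len(replicates) ** 0.5
    zeta = (mean - reference_value) / (u_lab**2 + reference_uncertainty**2) ** 0.5
    return {"mean": mean, "rsd_percent": rsd_percent,
            "relative_bias_percent": relative_bias, "zeta_score": zeta}

# Hypothetical Zn mass fractions (mg/kg) from repeated INAA runs
result = performance_summary([105.2, 103.8, 106.1, 104.5, 105.9],
                             reference_value=104.0, reference_uncertainty=1.5)
print(result)
```

A |zeta| below 2 would typically be read as satisfactory trueness against the reference material.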

  4. Practical procedure for method validation in INAA- A tutorial

    Energy Technology Data Exchange (ETDEWEB)

    Petroni, Robson; Moreira, Edson G., E-mail: robsonpetroni@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

This paper describes the procedure employed by the Neutron Activation Laboratory at the Nuclear and Energy Research Institute (LAN, IPEN - CNEN/SP) for the validation of Instrumental Neutron Activation Analysis (INAA) methods. Following the recommendations of ISO/IEC 17025, the method performance characteristics (limit of detection, limit of quantification, trueness, repeatability, intermediate precision, reproducibility, selectivity, linearity and uncertainty budget) were outlined in an easy, fast and convenient way. The paper presents, step by step, how to calculate the required method performance characteristics in a method validation process, and what the procedures, adopted strategies and acceptance criteria for the results are; that is, how to carry out a method validation in INAA. To exemplify the methodology, results are presented for the validation of the determination of Co, Cr, Fe, Rb, Se and Zn mass fractions in biological matrix samples, using an internal reference material of mussel tissue. It was concluded that the methodology applied for the validation of INAA methods is suitable, meeting all the requirements of ISO/IEC 17025 and thereby generating satisfactory results for the studies carried out at LAN, IPEN - CNEN/SP. (author)

  5. Validation of a One-Step Method for Extracting Fatty Acids from Salmon, Chicken and Beef Samples.

    Science.gov (United States)

    Zhang, Zhichao; Richardson, Christine E; Hennebelle, Marie; Taha, Ameer Y

    2017-10-01

    Fatty acid extraction methods are time-consuming and expensive because they involve multiple steps and copious amounts of extraction solvents. In an effort to streamline the fatty acid extraction process, this study compared the standard Folch lipid extraction method to a one-step method involving a column that selectively elutes the lipid phase. The methods were tested on raw beef, salmon, and chicken. Compared to the standard Folch method, the one-step extraction process generally yielded statistically insignificant differences in chicken and salmon fatty acid concentrations, percent composition and weight percent. Initial testing showed that beef stearic, oleic and total fatty acid concentrations were significantly lower by 9-11% with the one-step method as compared to the Folch method, but retesting on a different batch of samples showed a significant 4-8% increase in several omega-3 and omega-6 fatty acid concentrations with the one-step method relative to the Folch. Overall, the findings reflect the utility of a one-step extraction method for routine and rapid monitoring of fatty acids in chicken and salmon. Inconsistencies in beef concentrations, although minor (within 11%), may be due to matrix effects. A one-step fatty acid extraction method has broad applications for rapidly and routinely monitoring fatty acids in the food supply and formulating controlled dietary interventions. © 2017 Institute of Food Technologists®.

  6. A Normalized Transfer Matrix Method for the Free Vibration of Stepped Beams: Comparison with Experimental and FE(3D) Methods

    Directory of Open Access Journals (Sweden)

    Tamer Ahmed El-Sayed

    2017-01-01

The exact solution for a multistepped Timoshenko beam is derived using a set of fundamental solutions. This set of solutions is derived to normalize the solution at the origin of the coordinates. The start, end, and intermediate boundary conditions involve concentrated masses and linear and rotational elastic supports. The beam start, end, and intermediate equations are assembled using the present normalized transfer matrix (NTM) method. The advantage of this method is that it is quicker than the standard method because the size of the complete system coefficient matrix is 4 × 4. In addition, during the assembly of this matrix, there are no inverse matrix steps required. The validity of this method is tested by comparing the results of the current method with the literature. Then the validity of the exact stepped analysis is checked using experimental and FE(3D) methods. The experimental results for stepped beams with a single step and two steps, for sixteen different test samples, are in excellent agreement with those of the three-dimensional finite element method FE(3D). The comparison between the NTM method and the finite element method results shows that the modal percentage deviation increases when a beam step location coincides with a peak point in the mode shape. Meanwhile, the deviation decreases when a beam step location coincides with a straight portion in the mode shape.
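As a rough illustration of the transfer matrix idea (not the paper's 4 × 4 Timoshenko formulation), the sketch below assembles 2 × 2 field matrices for axial vibration of a stepped bar and bisects the characteristic value for a natural frequency; the material values and the fixed-free boundary conditions are assumptions for the example:

```python
import math

def segment_matrix(beta, length, EA):
    """2x2 field transfer matrix for axial vibration of a bar segment,
    relating the state [u, N] (displacement, axial force) across it."""
    c, s = math.cos(beta * length), math.sin(beta * length)
    return [[c, s / (EA * beta)],
            [-EA * beta * s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def char_value(omega, segments, E=210e9, rho=7800.0):
    """Characteristic value whose zero gives a natural frequency of a
    fixed-free stepped bar: with u(0) = 0, the free end requires T[1][1] = 0."""
    T = [[1.0, 0.0], [0.0, 1.0]]
    beta = omega * math.sqrt(rho / E)        # wavenumber, c = sqrt(E/rho)
    for length, area in segments:
        T = matmul(segment_matrix(beta, length, E * area), T)
    return T[1][1]

def find_frequency(segments, lo, hi, tol=1e-8):
    """Bisection on the characteristic value (sign change assumed in [lo, hi])."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if char_value(lo, segments) * char_value(mid, segments) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Uniform bar as a sanity check: the exact fixed-free frequency is (pi/2)*c/L
c = math.sqrt(210e9 / 7800.0)
L = 1.0
omega = find_frequency([(L, 1e-4)],
                       lo=0.6 * math.pi / 2 * c, hi=1.4 * math.pi / 2 * c)
print(omega / (math.pi / 2 * c / L))   # ≈ 1.0
```

The full beam problem replaces the 2 × 2 matrices with 4 × 4 ones carrying deflection, slope, moment, and shear, but the assembly-and-root-search pattern is the same.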

  7. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis. Type: Commentary

  8. Development and Validation of an Automated Step Ergometer

    Directory of Open Access Journals (Sweden)

    C. de Sousa Maria do Socorro

    2014-12-01

Laboratory ergometers have high costs, making them inaccessible for most of the population; hence, it is imperative to develop affordable devices that make evaluations such as cardiorespiratory fitness feasible and easier. The objective of this study was to develop and validate an Automated Step Ergometer (ASE), adjusted according to the height of the subject, for predicting VO2max through a progressive test. The development process comprised three steps: the theoretical part, the prototype assembly, and validation. The ASE consists of an elevating platform that raises or lowers the step as required for testing. The ASE validation was obtained by comparing the values of predicted VO2max (equation) and direct gas analysis on the prototype and on a treadmill. For the validation process, 167 subjects with an average age of 31.24 ± 14.38 years, of both genders and different degrees of cardiorespiratory fitness, were randomized and divided by gender and training condition into untrained (n=106), active (n=24) and trained (n=37) subjects. Each participant performed a progressive test in which the ASE started at the same height (20 cm) for all. Then, according to the subject's height, it varied to a maximum of 45 cm. Time in each stage and rhythm were chosen in accordance with training condition, from lowest to highest (60-180 s; 116-160 bpm, respectively). Data were compared with Student's t test and ANOVA; correlations were tested with Pearson's r. The value of α was set at 0.05. No differences were found between the predicted VO2max and the direct gas analysis VO2max, nor between the ASE and treadmill VO2max (p = 0.365), with high correlation between ergometers (r = 0.974). The values for repeatability, reproducibility, and reliability of male and female group measures were, respectively, 4.08 and 5.02; 0.50 and 1.11; 4.11 and 5.15. The values of internal consistency (Cronbach's alpha) among measures were all >0.90. It was verified

  9. Uncertainty propagation applied to multi-scale thermal-hydraulics coupled codes. A step towards validation

    Energy Technology Data Exchange (ETDEWEB)

    Geffray, Clotaire Clement

    2017-03-20

The work presented here constitutes an important step towards validating the use of coupled system thermal-hydraulics and computational fluid dynamics codes for the simulation of complex flows in liquid metal cooled pool-type facilities. First, a set of methods suited for uncertainty and sensitivity analysis and validation activities, with regard to the specific constraints of working with coupled and expensive-to-run codes, is proposed. Then, these methods are applied to the ATHLET - ANSYS CFX model of the TALL-3D facility. Several transients performed at the latter facility are investigated. The results are presented, discussed and compared to the experimental data. Finally, assessments of the validity of the selected methods and of the quality of the model are offered.

  10. Increased efficacy for in-house validation of real-time PCR GMO detection methods.

    Science.gov (United States)

    Scholtens, I M J; Kok, E J; Hougs, L; Molenaar, B; Thissen, J T N M; van der Voet, H

    2010-03-01

To improve the efficacy of the in-house validation of GMO detection methods (DNA isolation and real-time PCR, polymerase chain reaction), a study was performed to gain insight into the contribution of the different steps of the GMO detection method to the repeatability and in-house reproducibility. In the present study, 19 methods for (GM) soy, maize, canola and potato were validated in-house, of which 14 on the basis of an 8-day validation scheme using eight different samples and five on the basis of a more concise validation protocol. In this way, data were obtained with respect to the detection limit, accuracy and precision. Also, decision limits were calculated for declaring non-conformance (>0.9%) with 95% reliability. In order to estimate the contribution of the different steps in the GMO analysis to the total variation, variance components were estimated using REML (residual maximum likelihood method). From these components, relative standard deviations for repeatability and reproducibility (RSD(r) and RSD(R)) were calculated. The results showed that not only the PCR reaction but also the factors 'DNA isolation' and 'PCR day' are important contributors to the total variance and should therefore be included in the in-house validation. It is proposed to use a statistical model to estimate these factors from a large dataset of initial validations so that for similar GMO methods in the future, only the PCR step needs to be validated. The resulting data are discussed in the light of agreed European criteria for qualified GMO detection methods.
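The variance-component calculation can be sketched with a balanced one-way ANOVA decomposition standing in for full REML; the GM-content values below are hypothetical:

```python
import statistics

def rsd_r_and_R(groups):
    """Estimate repeatability (RSD_r) and within-lab reproducibility (RSD_R)
    from replicate measurements grouped by PCR day, using one-way ANOVA
    variance components (a simple stand-in for REML on a balanced design)."""
    n = len(groups[0])                       # replicates per day
    grand = statistics.fmean(v for g in groups for v in g)
    ms_within = statistics.fmean(statistics.variance(g) for g in groups)
    ms_between = n * statistics.variance([statistics.fmean(g) for g in groups])
    var_day = max((ms_between - ms_within) / n, 0.0)   # day-to-day component
    rsd_r = 100 * ms_within ** 0.5 / grand
    rsd_R = 100 * (ms_within + var_day) ** 0.5 / grand
    return rsd_r, rsd_R

# Hypothetical GM content (%) measured in triplicate on three PCR days
days = [[0.92, 0.95, 0.90], [1.01, 0.98, 1.03], [0.88, 0.91, 0.86]]
rsd_r, rsd_R = rsd_r_and_R(days)
print(f"RSD_r = {rsd_r:.1f}%, RSD_R = {rsd_R:.1f}%")
```

By construction RSD_R ≥ RSD_r; a large gap between them signals that between-day factors such as 'DNA isolation' or 'PCR day' dominate the variance, as the abstract reports.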

  11. Valve cam design using numerical step-by-step method

    OpenAIRE

    Vasilyev, Aleksandr; Bakhracheva, Yuliya; Kabore, Ousman; Zelenskiy, Yuriy

    2014-01-01

This article studies the numerical step-by-step method of cam profile design. The results of the study are used for designing the internal combustion engine valve gear. This method allows cam profiles to be designed for peak efficiency in view of the many restrictions connected with valve gear serviceability and reliability.

  12. Content Validity of National Post Marriage Educational Program Using Mixed Methods

    Science.gov (United States)

    MOHAJER RAHBARI, Masoumeh; SHARIATI, Mohammad; KERAMAT, Afsaneh; YUNESIAN, Masoud; ESLAMI, Mohammad; MOUSAVI, Seyed Abbas; MONTAZERI, Ali

    2015-01-01

Background: Although the content validity of programs is mostly assessed with qualitative methods, this study used both qualitative and quantitative methods to validate the content of the post marriage training program provided for newly married couples. Content validity is a preliminary step in obtaining the authorization required to install the program in the country's health care system. Methods: This mixed-methods content validation study was carried out in four steps, forming three expert panels. Altogether, 24 expert panelists were involved in the three qualitative and quantitative panels: 6 in the first, item-development panel; 12 in the item-reduction panel, 4 of whom were shared with the first panel; and 10 executive experts in the last panel, organized to evaluate the psychometric properties (CVR, CVI and face validity) of 57 educational objectives. Results: The raw data of the post marriage program had been written by professional experts of the Ministry of Health. Using the qualitative expert panel, the content was further developed by generating 3 topics and refining one topic and its respective content. In the second panel, six other objectives were deleted in total, three for falling below the agreement cut-off point and three on experts' consensus. The validity of all items was above 0.8, and their content validity indices (0.8–1) were completely appropriate in the quantitative assessment. Conclusion: This study provided good evidence for the validation and accreditation of the national post marriage program planned for newly married couples in health centers of the country in the near future. PMID:26056672
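The quantitative indices mentioned (CVR and CVI) follow standard formulas; a minimal sketch with hypothetical panel ratings:

```python
def content_validity_ratio(n_essential, n_panelists):
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of
    panelists rating the item 'essential'."""
    half = n_panelists / 2
    return (n_essential - half) / half

def item_cvi(ratings, relevant=(3, 4)):
    """Item-level CVI: proportion of experts rating the item 3 or 4
    on a 4-point relevance scale."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Hypothetical ratings from a 10-member executive panel
print(content_validity_ratio(9, 10))              # CVR = 0.8
print(item_cvi([4, 4, 3, 4, 3, 4, 4, 3, 4, 2]))   # I-CVI = 0.9
```

Values of 0.8 and above, as in the abstract, would be judged acceptable against the usual cut-off points.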

  13. Development of interface between MCNP-FISPACT-MCNP (IPR-MFM) based on rigorous two step method

    International Nuclear Information System (INIS)

    Shaw, A.K.; Swami, H.L.; Danani, C.

    2015-01-01

In this work we present the development of an interface tool between MCNP-FISPACT-MCNP (MFM) based on the Rigorous Two Step method for shutdown dose rate (SDDR) calculation. The MFM links the MCNP radiation transport code and the FISPACT inventory code through a suitable coupling scheme. The MFM coupling scheme has three steps. In the first step, it picks the neutron spectrum and total flux from the MCNP output file to use as input parameters for FISPACT. In the second step, it prepares the FISPACT input files using the irradiation history, neutron flux and neutron spectrum, and then executes them. The third step of the MFM coupling scheme extracts the decay gammas from the FISPACT output file and prepares the MCNP input file for decay gamma transport, followed by execution of the MCNP input file and estimation of the SDDR. Here the MFM methodology and flow scheme are described in detail. The programming language Python has been chosen for the development of the coupling scheme. A complete MCNP-FISPACT-MCNP loop has been developed to handle simplified geometrical problems. For validation of the MFM interface, a manual cross-check has been performed, which shows good agreement. The MFM interface has also been validated against the existing MCNP-D1S method for a simple geometry with a 14 MeV cylindrical neutron source. (author)
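The three-step coupling loop could be sketched in Python along these lines; the file formats and keywords below are invented stand-ins for illustration, not the real MCNP or FISPACT formats:

```python
# Hypothetical file formats; the actual MCNP tally and FISPACT decks differ.
def read_flux(mcnp_tally_text):
    """Step 1: pick the group spectrum and total flux from a (mock) MCNP tally,
    one 'energy flux' pair per line."""
    lines = [l.split() for l in mcnp_tally_text.strip().splitlines()]
    spectrum = {float(e): float(f) for e, f in lines}
    return sum(spectrum.values()), spectrum

def make_fispact_input(total_flux, spectrum, irradiation_seconds):
    """Step 2: assemble a (mock) FISPACT activation input from the
    irradiation history and the extracted flux/spectrum."""
    groups = " ".join(f"{f:.3e}" for f in spectrum.values())
    return (f"FLUX {total_flux:.3e}\n"
            f"SPECTRUM {groups}\n"
            f"TIME {irradiation_seconds} SECS\n")

def extract_decay_gammas(fispact_output_text):
    """Step 3: pull (energy, intensity) decay-gamma lines for the
    follow-up MCNP photon-transport deck."""
    return [tuple(map(float, l.split()[1:]))
            for l in fispact_output_text.splitlines() if l.startswith("GAMMA")]

tally = "1.0e6 2.5e9\n1.0e7 1.5e9"
total, spec = read_flux(tally)
deck = make_fispact_input(total, spec, 3600)
gammas = extract_decay_gammas("GAMMA 0.662 1.2e7\nGAMMA 1.17 3.4e6")
```

The real interface would additionally launch the codes and map the gamma sources back onto the MCNP geometry cells, but the parse-write-parse skeleton is the same.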

  14. One step linear reconstruction method for continuous wave diffuse optical tomography

    Science.gov (United States)

    Ukhrowiyah, N.; Yasin, M.

    2017-09-01

A one-step linear reconstruction method for continuous wave diffuse optical tomography is proposed and demonstrated for polyvinyl chloride based material and a breast phantom. The approximation used in this method consists of selecting a regularization coefficient and evaluating the difference between two states corresponding to the data acquired without and with a change in optical properties. This method is used to recover optical parameters from measured boundary data of light propagation in the object. The research is demonstrated using simulation and experimental data. A numerical object is used to produce the simulation data; polyvinyl chloride based material and a breast phantom sample are used to produce the experimental data. Comparisons between experimental and simulation results are conducted to validate the proposed method. The results show that the image produced by the one-step linear reconstruction method is almost the same as the original object. This approach provides a means of imaging that is sensitive to changes in optical properties, which may be particularly useful for functional imaging with continuous wave diffuse optical tomography in the early diagnosis of breast cancer.
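A one-step linear (difference) reconstruction amounts to a single regularized least-squares solve. The sketch below uses a random stand-in Jacobian; in practice the sensitivity matrix comes from a photon-diffusion forward model, and the dimensions here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in sensitivity (Jacobian) matrix mapping absorption changes in
# 25 image pixels to 40 boundary measurements.
J = rng.standard_normal((40, 25))
true_change = np.zeros(25)
true_change[12] = 0.05                       # a single perturbed pixel

delta_y = J @ true_change                    # difference between the two states
lam = 1e-2                                   # regularization coefficient
# One-step linear reconstruction: x = (J^T J + lam*I)^-1 J^T dy  (Tikhonov)
x = np.linalg.solve(J.T @ J + lam * np.eye(25), J.T @ delta_y)
print(int(np.argmax(np.abs(x))))             # → 12 (perturbation located)
```

The single regularized solve is what makes the method "one step": no iterative forward-model updates are needed for a difference image.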

  15. A two-step method for rapid characterization of electroosmotic flows in capillary electrophoresis.

    Science.gov (United States)

    Zhang, Wenjing; He, Muyi; Yuan, Tao; Xu, Wei

    2017-12-01

The measurement of electroosmotic flow (EOF) is important in a capillary electrophoresis (CE) experiment in terms of performance optimization and stability improvement. Although several methods exist, there are demanding needs to accurately characterize ultra-low electroosmotic flow rates (EOF rates), such as in coated capillaries used in protein separations. In this work, a new method, called the two-step method, was developed to accurately and rapidly measure EOF rates in a capillary, especially the ultra-low EOF rates in coated capillaries. In this two-step method, the EOF rates were calculated by measuring the migration time difference of a neutral marker in two consecutive experiments, in which a pressure-driven flow was introduced to accelerate the migration and the DC voltage was reversed to switch the EOF direction. Uncoated capillaries were first characterized by both this two-step method and a conventional method to confirm the validity of the new method. Then the new method was applied in the study of coated capillaries. Results show that this new method is not only faster but also more accurate. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
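A sketch of the two-run arithmetic, assuming the same pressure-driven push in both runs and a neutral marker; the sign conventions and matched conditions are assumptions for illustration, not the paper's exact protocol:

```python
def eof_mobility(capillary_len_m, effective_len_m, voltage_v, t1_s, t2_s):
    """Two-run EOF estimate: with an identical pressure-driven velocity v_p in
    both runs and the DC polarity reversed in the second, a neutral marker
    moves at v_p + v_eof and then v_p - v_eof, so
        v_eof = (L_eff/t1 - L_eff/t2) / 2.
    Returns (velocity in m/s, mobility in m^2/V/s)."""
    v1 = effective_len_m / t1_s
    v2 = effective_len_m / t2_s
    v_eof = (v1 - v2) / 2.0
    field = voltage_v / capillary_len_m
    return v_eof, v_eof / field

# Hypothetical numbers: 60 cm capillary (50 cm effective), 20 kV,
# marker migration times of 100 s and 150 s in the two runs
v, mu = eof_mobility(0.60, 0.50, 2.0e4, t1_s=100.0, t2_s=150.0)
print(f"v_eof = {v:.2e} m/s, mobility = {mu:.2e} m^2/V/s")
```

Subtracting the two apparent velocities cancels the pressure-driven component, which is what lets the method resolve ultra-low EOF in coated capillaries.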

  16. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg

We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers, as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicates in a flat bottom 96-well microtiter plate. In conclusion, we have set up a simple solution for the continuous validation of automated liquid handlers used for accredited work. The method is cheap, simple and easy to use for aqueous solutions, but requires a spectrophotometer that can read microtiter plates.

  17. Validity of the Stages of Change in Steps instrument (SoC-Step) for achieving the physical activity goal of 10,000 steps per day.

    Science.gov (United States)

    Rosenkranz, Richard R; Duncan, Mitch J; Caperchione, Cristina M; Kolt, Gregory S; Vandelanotte, Corneel; Maeder, Anthony J; Savage, Trevor N; Mummery, W Kerry

    2015-11-30

Physical activity (PA) offers numerous benefits to health and well-being, but most adults are not sufficiently physically active to afford such benefits. The 10,000 steps campaign has been a popular and effective approach to promote PA. The Transtheoretical Model posits that individuals have varying levels of readiness for health behavior change, known as Stages of Change (Precontemplation, Contemplation, Preparation, Action, and Maintenance). Few validated assessment instruments are available for determining Stages of Change in relation to the PA goal of 10,000 steps per day. The purpose of this study was to assess the criterion-related validity of the SoC-Step, a brief 10,000 steps per day Stages of Change instrument. Participants were 504 Australian adults (176 males, 328 females, mean age = 50.8 ± 13.0 years) from the baseline sample of the Walk 2.0 randomized controlled trial. Measures included 7-day accelerometry (Actigraph GT3X), height, weight, and self-reported intention, self-efficacy, and SoC-Step: Stages of Change relative to achieving 10,000 steps per day. Kruskal-Wallis H tests with pairwise comparisons were used to determine whether participants differed by stage, according to steps per day, general health, body mass index, intention, and self-efficacy to achieve 10,000 steps per day. Binary logistic regression was used to test the hypothesis that participants in Maintenance or Action stages would have greater likelihood of meeting the 10,000 steps goal, in comparison to participants in the other three stages. Consistent with study hypotheses, participants in Precontemplation had significantly lower intention scores than those in Contemplation (p = 0.003) or Preparation, and participants in Maintenance or Action were more likely to meet the 10,000 steps per day goal (OR = 3.11; 95% CI = 1.66, 5.83) compared to those in Precontemplation, Contemplation, or Preparation. Australian New Zealand Clinical Trials Registry reference: ACTRN12611000157976 World Health Organization Universal Trial

  18. Solving delay differential equations in S-ADAPT by method of steps.

    Science.gov (United States)

    Bauer, Robert J; Mo, Gary; Krzyzanski, Wojciech

    2013-09-01

S-ADAPT is a version of the ADAPT program that contains additional simulation and optimization abilities such as parametric population analysis. S-ADAPT utilizes LSODA to solve ordinary differential equations (ODEs), an algorithm designed for large-dimension non-stiff and stiff problems. However, S-ADAPT does not have a solver for delay differential equations (DDEs). Our objective was to implement in S-ADAPT a DDE solver using the method of steps. The method of steps allows one to solve virtually any DDE system by transforming it to an ODE system. The solver was validated for scalar linear DDEs with one delay and bolus and infusion inputs, for which explicit analytic solutions were derived. Solutions of nonlinear DDE problems coded in S-ADAPT were validated by comparing them with ones obtained by the MATLAB DDE solver dde23. The estimation of parameters was tested on MATLAB-simulated population pharmacodynamics data. The S-ADAPT solutions for DDE problems agreed with both the explicit solutions and the MATLAB-produced solutions to at least 7 significant digits. The population parameter estimates obtained using importance sampling expectation-maximization in S-ADAPT agreed with the ones used to generate the data. Published by Elsevier Ireland Ltd.
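The method of steps itself can be sketched generically: on each delay-length interval the delayed term is already known from the stored history, so the DDE reduces to an ODE. The sketch below uses plain RK4 with linear interpolation for the delayed value (not S-ADAPT's LSODA):

```python
import bisect

def solve_dde_steps(f, history, tau, t_end, h=1e-3):
    """Method of steps for y'(t) = f(y(t), y(t - tau)) with constant
    history y(t) = history for t <= 0. The delayed value is looked up by
    linear interpolation in the solution already computed, so each step
    only integrates an ordinary differential equation."""
    ts, ys = [0.0], [history]

    def y_delayed(t):
        td = t - tau
        if td <= 0:
            return history
        i = bisect.bisect_left(ts, td)
        t0, t1 = ts[i - 1], ts[i]
        w = (td - t0) / (t1 - t0)
        return (1 - w) * ys[i - 1] + w * ys[i]

    t, y = 0.0, history
    while t < t_end - 1e-12:
        # classic RK4; the delayed argument is evaluated at the stage times
        k1 = f(y, y_delayed(t))
        k2 = f(y + h / 2 * k1, y_delayed(t + h / 2))
        k3 = f(y + h / 2 * k2, y_delayed(t + h / 2))
        k4 = f(y + h * k3, y_delayed(t + h))
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        ts.append(t)
        ys.append(y)
    return ts, ys

# y'(t) = -y(t - 1), y = 1 for t <= 0: the exact solution is 1 - t on [0, 1]
ts, ys = solve_dde_steps(lambda y, yd: -yd, history=1.0, tau=1.0, t_end=2.0)
```

On [0, 1] the delayed term is the constant history, so the first "step" is trivially an ODE; each later interval uses the piece just computed, which is the transformation the abstract describes.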

  19. Validity of Garmin Vívofit and Polar Loop for measuring daily step counts in free-living conditions in adults

    Directory of Open Access Journals (Sweden)

    Adam Šimůnek

    2016-09-01

Background: Wrist activity trackers (WATs) are becoming popular and widely used for the monitoring of physical activity. However, the validity of many WATs in measuring steps remains unknown. Objective: To determine the validity of the following WATs: Garmin Vívofit (Vívofit) and Polar Loop (Loop), by comparing them with well-validated devices, the Yamax Digiwalker SW-701 pedometer (Yamax) and the hip-mounted ActiGraph GT3X+ accelerometer (ActiGraph), in healthy adults. Methods: In free-living conditions, adult volunteers (N = 20) aged 25 to 52 years wore the two WATs (Vívofit and Loop) together with the Yamax and ActiGraph simultaneously over a 7-day period. The validity of the Vívofit and Loop was assessed by comparing each device with the Yamax and ActiGraph, using a paired samples t-test, mean absolute percentage errors, intraclass correlation coefficients (ICC) and Bland-Altman plots. Results: The differences between average steps per day were significant for all devices, except the difference between Vívofit and Yamax (p = .06; d = 0.2). Compared with Yamax and ActiGraph, the mean absolute percentage errors of the Vívofit were -4.0% and 12.5%, respectively. For the Loop, the mean absolute percentage error was 8.9% compared with Yamax and 28.0% compared with ActiGraph. The Vívofit showed a very strong correlation with both Yamax and ActiGraph (ICC = .89). The Loop showed a very strong correlation with Yamax (ICC = .89) and a strong correlation with ActiGraph (ICC = .70). Conclusions: The Vívofit showed higher validity than the Loop in measuring daily step counts in free-living conditions. The Loop appears to overestimate the daily number of steps in individuals who take more steps during a day.
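The agreement statistics used above (mean absolute percentage error and Bland-Altman bias with limits of agreement) can be computed directly; the step counts below are hypothetical:

```python
import statistics

def agreement_stats(device_steps, reference_steps):
    """Mean absolute percentage error (MAPE) and Bland-Altman bias with
    95% limits of agreement for daily step counts from two devices."""
    diffs = [d - r for d, r in zip(device_steps, reference_steps)]
    mape = 100 * statistics.fmean(abs(d - r) / r
                                  for d, r in zip(device_steps, reference_steps))
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return mape, bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical 7-day step counts: wrist tracker vs. waist-worn pedometer
tracker = [10350, 8240, 12100, 9400, 7650, 11020, 9980]
pedometer = [10000, 8000, 11500, 9200, 7400, 10800, 9600]
mape, bias, loa = agreement_stats(tracker, pedometer)
print(f"MAPE = {mape:.1f}%, bias = {bias:.0f} steps, LoA = {loa}")
```

A positive bias with widening limits of agreement at higher counts would show the kind of overestimation the abstract reports for the Loop.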

  20. Detecting free-living steps and walking bouts: validating an algorithm for macro gait analysis.

    Science.gov (United States)

    Hickey, Aodhán; Del Din, Silvia; Rochester, Lynn; Godfrey, Alan

    2017-01-01

    Research suggests wearables and not instrumented walkways are better suited to quantify gait outcomes in clinic and free-living environments, providing a more comprehensive overview of walking due to continuous monitoring. Numerous validation studies in controlled settings exist, but few have examined the validity of wearables and associated algorithms for identifying and quantifying step counts and walking bouts in uncontrolled (free-living) environments. Studies which have examined free-living step and bout count validity found limited agreement due to variations in walking speed, changing terrain or task. Here we present a gait segmentation algorithm to define free-living step count and walking bouts from an open-source, high-resolution, accelerometer-based wearable (AX3, Axivity). Ten healthy participants (20-33 years) wore two portable gait measurement systems; a wearable accelerometer on the lower-back and a wearable body-mounted camera (GoPro HERO) on the chest, for 1 h on two separate occasions (24 h apart) during free-living activities. Step count and walking bouts were derived for both measurement systems and compared. For all participants during a total of almost 20 h of uncontrolled and unscripted free-living activity data, excellent relative (rho  ⩾  0.941) and absolute (ICC (2,1)   ⩾  0.975) agreement with no presence of bias were identified for step count compared to the camera (gold standard reference). Walking bout identification showed excellent relative (rho  ⩾  0.909) and absolute agreement (ICC (2,1)   ⩾  0.941) but demonstrated significant bias. The algorithm employed for identifying and quantifying steps and bouts from a single wearable accelerometer worn on the lower-back has been demonstrated to be valid and could be used for pragmatic gait analysis in prolonged uncontrolled free-living environments.
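As a naive illustration of accelerometer-based step counting (threshold-and-refractory peak picking on a synthetic signal, not the validated algorithm from the paper):

```python
import math

def count_steps(acc, threshold=1.2, min_gap=20):
    """Count local maxima of vertical acceleration above a threshold,
    enforcing a refractory gap (in samples) between successive steps."""
    steps, last = 0, -min_gap
    for i in range(1, len(acc) - 1):
        if (acc[i] > threshold and acc[i] >= acc[i - 1]
                and acc[i] >= acc[i + 1] and i - last >= min_gap):
            steps += 1
            last = i
    return steps

# Synthetic 100 Hz signal: 2 Hz stepping (0.5 g peaks over 1 g gravity) for 10 s
fs, step_hz, dur = 100, 2.0, 10.0
signal = [1.0 + 0.5 * math.sin(2 * math.pi * step_hz * i / fs)
          for i in range(int(fs * dur))]
print(count_steps(signal))  # → 20
```

Real free-living data require the additional bout segmentation and adaptive thresholds the paper validates, since walking speed, terrain, and task change the signal shape.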

  1. Proposal for a Five-Step Method to Elicit Expert Judgment

    Directory of Open Access Journals (Sweden)

    Duco Veen

    2017-12-01

Elicitation is a commonly used tool to extract viable information from experts. The information that is held by the expert is extracted and a probabilistic representation of this knowledge is constructed. A promising avenue in psychological research is to incorporate experts' prior knowledge in the statistical analysis. Systematic reviews of the elicitation literature, however, suggest that it might be inappropriate to directly obtain distributional representations from experts. The literature qualifies experts' performance on estimating elements of a distribution as unsatisfactory, so reliably specifying the essential elements of the parameters of interest in one elicitation step seems implausible. Providing feedback within the elicitation process can enhance the quality of the elicitation, and interactive software can be used to facilitate the feedback. Therefore, we propose to decompose the elicitation procedure into smaller steps with adjustable outcomes. We represent the tacit knowledge of experts as a location parameter and their uncertainty concerning this knowledge by a scale and a shape parameter. Using a feedback procedure, experts can accept the representation of their beliefs or adjust their input. We propose a Five-Step Method which consists of (1) eliciting the location parameter using the trial roulette method; (2) providing feedback on the location parameter and asking for confirmation or adjustment; (3) eliciting the scale and shape parameter; (4) providing feedback on the scale and shape parameter and asking for confirmation or adjustment; (5) using the elicited and calibrated probability distribution in a statistical analysis and updating it with data, or computing a prior-data conflict within a Bayesian framework. User feasibility and internal validity of the Five-Step Method are investigated using three elicitation studies.
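Step (1) can be illustrated by turning a trial-roulette chip allocation into location and scale values for feedback to the expert; moment matching is used here as a simple stand-in for the paper's parametrization:

```python
import math

def roulette_to_moments(bin_centers, chips):
    """Turn a trial-roulette chip allocation into a location (mean) and
    scale (sd) that can be fed back to the expert for confirmation or
    adjustment (moment matching; a shape parameter would need a richer fit)."""
    total = sum(chips)
    mean = sum(c * x for x, c in zip(bin_centers, chips)) / total
    var = sum(c * (x - mean) ** 2 for x, c in zip(bin_centers, chips)) / total
    return mean, math.sqrt(var)

# Hypothetical: an expert places 20 chips over effect-size bins
centers = [0.1, 0.2, 0.3, 0.4, 0.5]
chips = [1, 4, 10, 4, 1]
mu, sigma = roulette_to_moments(centers, chips)
print(f"location ≈ {mu:.2f}, scale ≈ {sigma:.3f}")
```

Showing the expert the implied distribution and letting them move chips until they accept it is the feedback loop of steps (2) and (4).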

  2. An Improved Split-Step Wavelet Transform Method for Anomalous Radio Wave Propagation Modelling

    Directory of Open Access Journals (Sweden)

    A. Iqbal

    2014-12-01

    Full Text Available Anomalous tropospheric propagation caused by the ducting phenomenon is a major problem in wireless communication. Thus, it is important to study the behavior of radio wave propagation in tropospheric ducts. The Parabolic Wave Equation (PWE) method is considered the most reliable way to model anomalous radio wave propagation. In this work, an improved Split-Step Wavelet transform Method (SSWM) is presented to solve the PWE for the modeling of tropospheric propagation over finite and infinite conductive surfaces. A large number of numerical experiments are carried out to validate the performance of the proposed algorithm. The developed algorithm is compared with previously published techniques: the Wavelet Galerkin Method (WGM) and the Split-Step Fourier transform Method (SSFM). Very good agreement is found between SSWM and the published techniques. It is also observed that the proposed algorithm is about 18 times faster than WGM and provides more detail of propagation effects compared to SSFM.
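For context, the baseline split-step Fourier method that SSWM is compared against alternates between a Fourier-domain diffraction step and, in ducting conditions, a spatial-domain refraction phase screen. The sketch below propagates a Gaussian field in free space with illustrative grid and wavelength choices; it is not the paper's wavelet algorithm.

```python
import numpy as np

k0 = 2 * np.pi / 0.1               # wavenumber for an assumed 0.1 m wavelength
z = np.linspace(0, 500, 1024)      # height grid (m)
dz = z[1] - z[0]
u = np.exp(-((z - 250) / 20) ** 2) # initial Gaussian field

p = 2 * np.pi * np.fft.fftfreq(z.size, d=dz)   # transverse wavenumbers
step = 50.0                                    # range step (m)
diffraction = np.exp(-1j * p ** 2 * step / (2 * k0))

for _ in range(20):                # march 1 km in range
    u = np.fft.ifft(diffraction * np.fft.fft(u))
    # in a duct, a refraction phase screen exp(1j*k0*m(z)*step) built
    # from the modified refractivity m(z) would be applied here;
    # omitted for free space

energy = np.sum(np.abs(u) ** 2) * dz
print(energy)                      # conserved by the unitary FFT steps
```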

  3. Step by step parallel programming method for molecular dynamics code

    International Nuclear Information System (INIS)

    Orii, Shigeo; Ohta, Toshio

    1996-07-01

    Parallel programming of a molecular dynamics simulation code was carried out with a step-by-step programming technique using the two-phase method. As a result, within a certain range of computing parameters, parallel performance is obtained by decomposing the calculation according to do-loop indices across processors on the vector-parallel computer VPP500 and the scalar-parallel computer Paragon. It is also found that VPP500 shows parallel performance over a wider range of computing parameters. The reason is that the time cost of the program parts that cannot be reduced by do-loop-level parallelization can be reduced to a negligible level by vectorization; the time-consuming parts of the program are then concentrated in fewer sections that can be accelerated by do-loop-level parallelization. This report presents the step-by-step parallel programming method and the parallel performance of the molecular dynamics code on VPP500 and Paragon. (author)

  4. Criterion-Validity of Commercially Available Physical Activity Tracker to Estimate Step Count, Covered Distance and Energy Expenditure during Sports Conditions

    Directory of Open Access Journals (Sweden)

    Yvonne Wahl

    2017-09-01

    Full Text Available Background: In the past years, there has been increasing development of physical activity trackers (wearables). For recreational users, testing of these devices under walking or light jogging conditions might be sufficient. For (elite) athletes, however, scientific trustworthiness needs to be established across a broad spectrum of velocities, or even fast changes in velocity, reflecting the demands of the sport. Therefore, the aim was to evaluate the validity of eleven wearables for monitoring step count, covered distance and energy expenditure (EE) under laboratory conditions at different constant and varying velocities. Methods: Twenty healthy sport students (10 men, 10 women) performed a running protocol consisting of four 5 min stages at different constant velocities (4.3, 7.2, 10.1, and 13.0 km·h−1), a 5 min period of intermittent velocity, and a 2.4 km outdoor run (10.1 km·h−1) while wearing eleven different wearables (Bodymedia Sensewear, Beurer AS 80, Polar Loop, Garmin Vivofit, Garmin Vivosmart, Garmin Vivoactive, Garmin Forerunner 920XT, Fitbit Charge, Fitbit Charge HR, Xiaomi Mi Band, Withings Pulse Ox). Step count, covered distance, and EE were evaluated by comparing each wearable with a criterion method (the Optogait system and manual counting for step count, the treadmill for covered distance, and indirect calorimetry for EE). Results: All wearables, except Bodymedia Sensewear, Polar Loop, and Beurer AS80, revealed good validity (small MAPE, good ICC) for monitoring step count at all constant and varying velocities. For covered distance, all wearables showed a very low ICC (<0.1) and high MAPE (up to 50%), revealing no good validity. The measurement of EE was acceptable for the Garmin, Fitbit and Withings wearables (small to moderate MAPE), while Bodymedia Sensewear, Polar Loop, and Beurer AS80 showed a high MAPE of up to 56% for all test conditions. Conclusion: In our study, most wearables provide an acceptable level of validity for step counts at different
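The agreement statistic used above can be illustrated with the mean absolute percentage error (MAPE) of device step counts against a criterion count; the counts below are invented for illustration, not the study's data.

```python
import numpy as np

# Criterion counts (e.g. Optogait/manual counting) vs. wearable counts;
# all values are made-up illustration data.
criterion = np.array([500, 480, 510, 495], dtype=float)
device = np.array([490, 470, 530, 500], dtype=float)

# MAPE: mean of per-trial absolute errors expressed as a percentage
# of the criterion value.
mape = 100 * np.mean(np.abs(device - criterion) / criterion)
print(mape)
```

A small MAPE together with a high intraclass correlation coefficient is the pattern the study reads as good validity.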

  5. Ehrenfest's theorem and the validity of the two-step model for strong-field ionization

    DEFF Research Database (Denmark)

    Shvetsov-Shilovskiy, Nikolay; Dimitrovski, Darko; Madsen, Lars Bojer

    By comparison with the solution of the time-dependent Schrödinger equation we explore the validity of the two-step semiclassical model for strong-field ionization in elliptically polarized laser pulses. We find that the discrepancy between the two-step model and the quantum theory correlates...

  6. Validation of the ADAMO Care Watch for step counting in older adults.

    Science.gov (United States)

    Magistro, Daniele; Brustio, Paolo Riccardo; Ivaldi, Marco; Esliger, Dale Winfield; Zecca, Massimiliano; Rainoldi, Alberto; Boccia, Gennaro

    2018-01-01

    Accurate measurement devices are required to objectively quantify physical activity. Wearable activity monitors, such as pedometers, may serve as affordable and feasible instruments for measuring physical activity levels in older adults during their normal activities of daily living. Few currently available accelerometer-based step-counting devices have been shown to be accurate at slow walking speeds, so appropriate devices tailored to the slow ambulation typical of older adults are still lacking. This study aimed to assess the validity of step counting using the pedometer function of the ADAMO Care Watch, which contains an embedded algorithm for measuring physical activity in older adults. Twenty older adults aged ≥65 years (mean ± SD, 75 ± 7 years; range, 68-91) and 20 young adults (25 ± 5 years; range, 20-40) wore a Care Watch on each wrist and performed a number of randomly ordered tasks: walking at slow, normal and fast self-paced speeds; a Timed Up and Go test (TUG); a step test; and ascending/descending stairs. The criterion measure was the actual number of steps observed, counted with a manual tally counter. Absolute percentage error scores, intraclass correlation coefficients (ICC), and Bland-Altman plots were used to assess validity. The ADAMO Care Watch demonstrated high validity at slow and normal speeds (range 0.5-1.5 m/s), showing an absolute error from 1.3% to 1.9% in the older adult group and from 0.7% to 2.7% in the young adult group. The percentage error for the 30-metre walking tasks increased with faster pace in both the young adult (17%) and older adult (6%) groups. In the TUG test, there was less error in the steps recorded for older adults (1.3% to 2.2%) than for young adults (6.6% to 7.2%). For the total sample, the ICCs for the ADAMO Care Watch for the 30-metre walking tasks at each speed and for the TUG test ranged from 0.931 to 0.985. These findings provide evidence that the ADAMO Care Watch demonstrated highly accurate

  7. Pre-validation methods for developing a patient reported outcome instrument

    Directory of Open Access Journals (Sweden)

    Castillo Mayret M

    2011-08-01

    Full Text Available Abstract Background Measures that reflect patients' assessment of their health are of increasing importance as outcome measures in randomised controlled trials. The methodological approach used in the pre-validation development of new instruments (item generation, item reduction and question formatting) should be robust and transparent. The totality of the content of existing PRO instruments for a specific condition provides a valuable resource (a pool of items) that can be utilised to develop new instruments. Such 'top down' approaches are common, but the explicit pre-validation methods are often poorly reported. This paper presents a systematic and generalisable 5-step pre-validation PRO instrument methodology. Methods The method is illustrated using the example of the Aberdeen Glaucoma Questionnaire (AGQ). The five steps are: (1) generation of a pool of items; (2) item de-duplication (three phases); (3) item reduction (two phases); (4) assessment of the remaining items' content coverage against a pre-existing theoretical framework appropriate to the objectives of the instrument and the target population (e.g. the ICF); and (5) qualitative exploration of the target population's views of the new instrument and the items it contains. Results The AGQ item pool contained 725 items. The three de-duplication phases removed 91, 225 and 48 items respectively. The two item reduction phases discarded 70 and 208 items respectively. The draft AGQ contained 83 items with good content coverage. The qualitative exploration ('think aloud' study) resulted in removal of a further 15 items and refinement of the wording of others. The resultant draft AGQ contained 68 items. Conclusions This study presents a novel methodology for developing a PRO instrument, based on three sources: literature reporting what is important to patients; a theoretically coherent framework; and patients' experience of completing the instrument. By systematically accounting for all items dropped

  8. Video-Recorded Validation of Wearable Step Counters under Free-living Conditions.

    Science.gov (United States)

    Toth, Lindsay P; Park, Susan; Springer, Cary M; Feyerabend, McKenzie D; Steeves, Jeremy A; Bassett, David R

    2018-06-01

    The purpose of this study was to determine the accuracy of 14 step-counting methods under free-living conditions. Twelve adults (mean ± SD age, 35 ± 13 yr) wore a chest harness that held a GoPro camera pointed down at the feet during all waking hours for 1 d. The GoPro continuously recorded video of all steps taken throughout the day. Simultaneously, participants wore two StepWatch (SW) devices on each ankle (all programmed with different settings), one activPAL on each thigh, four devices at the waist (Fitbit Zip, Yamax Digi-Walker SW-200, New Lifestyles NL-2000, and ActiGraph GT9X (AG)), and two devices on the dominant and nondominant wrists (Fitbit Charge and AG). The GoPro videos were downloaded to a computer and researchers counted steps using a hand tally device, which served as the criterion method. The SW devices recorded between 95.3% and 102.8% of actual steps taken throughout the day (P > 0.05). Eleven step-counting methods estimated less than 100% of actual steps; the Fitbit Zip, Yamax Digi-Walker SW-200, and AG with the moving average vector magnitude algorithm on both wrists recorded 71% to 91% of steps (P > 0.05), whereas the activPAL, New Lifestyles NL-2000, and AG (without low-frequency extension (no-LFE), moving average vector magnitude) worn on the hip, and the Fitbit Charge recorded 69% to 84% of steps (P < 0.05). The AG (LFE) on both wrists and the hip, in contrast, recorded 128% to 220% of steps (P < 0.05). Across all waking hours of 1 d, step counts differ between devices. The SW, regardless of settings, was the most accurate method of counting steps.

  9. Two-step Raman spectroscopy method for tumor diagnosis

    Science.gov (United States)

    Zakharov, V. P.; Bratchenko, I. A.; Kozlov, S. V.; Moryatov, A. A.; Myakinin, O. O.; Artemyev, D. N.

    2014-05-01

    A two-step Raman spectroscopy phase method was proposed for the differential diagnosis of malignant tumors in skin and lung tissue. It comprises detection of a malignant tumor in healthy tissue in the first step, with identification of the specific cancer type in the second step. The proposed phase method analyzes spectral intensity alterations in the 1300-1340 and 1640-1680 cm-1 Raman bands relative to the intensity of the 1450 cm-1 band in the first step, and relative differences between Raman intensities for the tumor area and the healthy skin closely adjacent to the lesion in the second step. It was tested on more than 40 ex vivo samples of lung tissue and more than 50 in vivo skin tumors. Linear Discriminant Analysis, Quadratic Discriminant Analysis and Support Vector Machines were used for tumor-type classification on the phase planes. It is shown that the two-step phase method reaches 88.9% sensitivity and 87.8% specificity for malignant melanoma diagnosis (skin cancer); 100% sensitivity and 81.5% specificity for adenocarcinoma diagnosis (lung cancer); and 90.9% sensitivity and 77.8% specificity for squamous cell carcinoma diagnosis (lung cancer).
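The first-step band-ratio features described above can be sketched as follows, using a synthetic spectrum and illustrative band limits; the paper's fitted classifiers and thresholds are not reproduced.

```python
import numpy as np

# Synthetic Raman spectrum (arbitrary units) with three Gaussian bands
# near the wavenumbers named in the abstract; peak heights and widths
# are invented for illustration.
shift = np.arange(800, 1800)                        # Raman shift (cm^-1)
spectrum = (np.exp(-((shift - 1320) / 15) ** 2)
            + 0.8 * np.exp(-((shift - 1450) / 12) ** 2)
            + 1.1 * np.exp(-((shift - 1660) / 18) ** 2))

def band_mean(lo, hi):
    """Mean intensity within a wavenumber band."""
    mask = (shift >= lo) & (shift <= hi)
    return spectrum[mask].mean()

# The two phase coordinates: band intensities relative to 1450 cm^-1.
r1 = band_mean(1300, 1340) / band_mean(1440, 1460)
r2 = band_mean(1640, 1680) / band_mean(1440, 1460)
print(r1, r2)
```

A classifier (LDA, QDA, or SVM, as in the paper) would then separate tissue classes in the (r1, r2) phase plane.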

  10. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias G

    2011-01-01

    We have implemented a simple, inexpensive, and fast procedure for validation and verification of the performance of pipettes mounted on automated liquid handlers (ALHs), as necessary for laboratories accredited under ISO 17025. A six- or seven-step serial dilution of OrangeG was prepared in quadruplicate ... are freely available. In conclusion, we have set up a simple, inexpensive, and fast solution for the continuous validation of ALHs used for accredited work according to the ISO 17025 standard. The method is easy to use for aqueous solutions but requires a spectrophotometer that can read microtiter plates.
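The idea behind a dilution-series check can be sketched as follows: if the pipettes perform correctly, log-absorbance is linear in the dilution step, and the fitted slope recovers the intended per-step dilution factor. The absorbance values below are simulated with small pipetting noise; they are not the paper's data, and the 1:1 dilution factor is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
steps = np.arange(7)                     # seven-step serial dilution
true_factor = 0.5                        # intended 1:1 dilution per step
absorbance = 1.2 * true_factor ** steps * (1 + rng.normal(0, 0.01, 7))

# Fit log(absorbance) vs. step number; exp(slope) estimates the
# actually delivered dilution factor.
slope, intercept = np.polyfit(steps, np.log(absorbance), 1)
recovered = np.exp(slope)
print(recovered)                         # close to 0.5 if pipetting is accurate
```

A systematic deviation of the recovered factor from the intended one would flag a mis-calibrated pipette channel.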

  11. A Novel Motion Compensation Method for Random Stepped Frequency Radar with M-sequence

    Science.gov (United States)

    Liao, Zhikun; Hu, Jiemin; Lu, Dawei; Zhang, Jun

    2018-01-01

    The random stepped frequency radar is a new kind of synthetic wideband radar. Research has found that it possesses a thumbtack-like ambiguity function, which is considered ideal. This also means that only precise motion compensation can produce a correct high-resolution range profile. In this paper, we first introduce the random stepped frequency radar coded by M-sequence and briefly analyse the effect of relative motion between target and radar on range imaging, known as the defocusing problem. Then, a novel motion compensation method, named complementary code cancellation, is put forward to solve this problem. Finally, simulated experiments demonstrate its validity and computational analysis shows its efficiency.

  12. Validation of methods for the determination of radium in waters and soil

    International Nuclear Information System (INIS)

    Decaillon, J.-G.; Bickel, M.; Hill, C.; Altzitzoglou, T.

    2004-01-01

    This article describes the advantages and disadvantages of several analytical methods used to prepare the alpha-particle source. As a result of this study, a new method combining commercial extraction and ion chromatography prior to a final co-precipitation step is proposed. This method has been applied and validated on several matrices (soil, waters) in the framework of international intercomparisons. The integration of this method into a global procedure for analyzing actinoids and radium from a single solution (or digested soil) is also described

  13. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  14. The method validation step of biological dosimetry accreditation process

    International Nuclear Information System (INIS)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph.

    2006-01-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in the patient's lymphocytes, and this yield is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patient. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, following the recommendations of both ISO 17025 (General Requirements for the Competence of Testing and Calibration Laboratories) and ISO 19238 (Performance criteria for service laboratories performing biological dosimetry by cytogenetics). Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchases, the personnel department, and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardised; therefore, apart from quality-management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdr (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties in the way they were measured, i.e. pipettes, thermometers, test tubes. None of the factors has a significant impact on the yield of dicentrics.
    Therefore the uncertainty linked to their use was considered as
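An 8-run Plackett-Burman design for seven two-level factors, as described above, can be sketched as follows; the response values standing in for dicentric yields are invented, and the generator row is the standard 8-run construction, not necessarily the one used by the laboratory.

```python
import numpy as np

# Standard 8-run Plackett-Burman construction: cyclic shifts of a
# generator row for 7 factors, plus a final all-minus run.
gen = np.array([1, 1, 1, -1, 1, -1, -1])
X = np.array([np.roll(gen, i) for i in range(7)])
X = np.vstack([X, -np.ones(7, dtype=int)])   # 8 runs x 7 factors

# Invented responses (e.g. dicentric yields per run); each main effect
# is the contrast of the +1 runs against the -1 runs, i.e. X'y / 4.
y = np.array([48.0, 52.0, 50.0, 49.0, 51.0, 50.0, 47.0, 53.0])
effects = X.T @ y / 4
print(effects)
```

Because the design columns are mutually orthogonal, each of the seven effects is estimated independently from only eight runs.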

  15. The method validation step of biological dosimetry accreditation process

    Energy Technology Data Exchange (ETDEWEB)

    Roy, L.; Voisin, P.A.; Guillou, A.C.; Busset, A.; Gregoire, E.; Buard, V.; Delbos, M.; Voisin, Ph. [Institut de Radioprotection et de Surete Nucleaire, LDB, 92 - Fontenay aux Roses (France)

    2006-07-01

    One of the missions of the Laboratory of Biological Dosimetry (L.D.B.) of the Institute for Radiation and Nuclear Safety (I.R.S.N.) is to assess the radiological dose after a suspected accidental overexposure to ionising radiation, using radiation-induced changes in certain biological parameters. The 'gold standard' is the yield of dicentrics observed in the patient's lymphocytes, and this yield is converted into dose using dose-effect relationships. This method is complementary to clinical and physical dosimetry for the medical team in charge of the patient. To obtain formal recognition of its operational activity, the laboratory decided three years ago to seek accreditation, following the recommendations of both ISO 17025 (General Requirements for the Competence of Testing and Calibration Laboratories) and ISO 19238 (Performance criteria for service laboratories performing biological dosimetry by cytogenetics). Diagnostics and risk analyses were carried out to control the whole analysis process, leading to the writing of documents. Purchases, the personnel department, and vocational training were also included in the quality system. Audits were very helpful in improving the quality system. One specificity of this technique is that it is not standardised; therefore, apart from quality-management aspects, several technical points needed validation. An inventory of potentially influential factors was carried out. To estimate their real effect on the yield of dicentrics, a Plackett-Burman experimental design was conducted. The effect of seven parameters was tested: the BUdr (bromodeoxyuridine), PHA (phytohemagglutinin) and colcemid concentrations, the culture duration, the incubator temperature, the blood volume and the medium volume. The chosen values were calculated according to the uncertainties in the way they were measured, i.e. pipettes, thermometers, test tubes. None of the factors has a significant impact on the yield of dicentrics. Therefore the uncertainty linked to their use was

  16. Evaluation of lung and chest wall mechanics during anaesthesia using the PEEP-step method.

    Science.gov (United States)

    Persson, P; Stenqvist, O; Lundin, S

    2018-04-01

    Postoperative pulmonary complications are common, and lung and chest wall mechanics differ between patients. Individualised mechanical ventilation based on measurement of transpulmonary pressure would be a step forward. A previously described method evaluates lung and chest wall mechanics from a change in PEEP (ΔPEEP) and the calculated change in end-expiratory lung volume (ΔEELV). The aim of the present study was to validate this PEEP-step method (PSM) during general anaesthesia by comparing it with the conventional method using oesophageal pressure (PES) measurements. In 24 lung-healthy subjects (BMI 18.5-32), three different sizes of PEEP steps were performed during general anaesthesia and ΔEELVs were calculated. Transpulmonary driving pressure (ΔPL) for a tidal volume equal to each ΔEELV was measured using PES measurements and compared to ΔPEEP with limits of agreement and intraclass correlation coefficients (ICC). ΔPL calculated with both methods was compared with a Bland-Altman plot. Mean differences between ΔPEEP and ΔPL were small. The variation in mechanical properties among the lung-healthy patients stresses the need for individualised ventilator settings based on measurements of lung and chest wall mechanics. The agreement between ΔPLs measured by the two methods during general anaesthesia supports the use of the non-invasive PSM in this patient population. NCT 02830516. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
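The core relation of the PEEP-step method can be sketched with illustrative numbers: the PEEP change divided by the resulting end-expiratory lung volume change estimates lung elastance, from which a transpulmonary driving pressure can be computed for any tidal volume. All values below are invented, not patient data.

```python
# Illustrative PEEP-step calculation (invented values).
delta_peep = 4.0     # cmH2O, size of the PEEP step
delta_eelv = 0.250   # L, measured change in end-expiratory lung volume

# PSM premise: the PEEP change is taken up by the lung, so
# lung elastance is approximately dPEEP / dEELV.
e_lung = delta_peep / delta_eelv      # cmH2O/L

vt = 0.450                             # L, a tidal volume of interest
delta_pl = e_lung * vt                 # transpulmonary driving pressure, cmH2O
print(e_lung, delta_pl)
```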

  17. Development, reliability, and validity testing of Toddler NutriSTEP: a nutrition risk screening questionnaire for children 18-35 months of age.

    Science.gov (United States)

    Randall Simpson, Janis; Gumbley, Jillian; Whyte, Kylie; Lac, Jane; Morra, Crystal; Rysdale, Lee; Turfryer, Mary; McGibbon, Kim; Beyers, Joanne; Keller, Heather

    2015-09-01

    Nutrition is vital for optimal growth and development of young children. Nutrition risk screening can facilitate early intervention when followed by nutritional assessment and treatment. NutriSTEP (Nutrition Screening Tool for Every Preschooler) is a valid and reliable nutrition risk screening questionnaire for preschoolers (aged 3-5 years). A need was identified for a similar questionnaire for toddlers (aged 18-35 months). The purpose was to develop a reliable and valid Toddler NutriSTEP. Toddler NutriSTEP was developed in 4 phases. Content and face validity were determined with a literature review, parent focus groups (n = 6; 48 participants), and experts (n = 13) (phase A). A draft questionnaire was refined with key intercept interviews of 107 parents/caregivers (phase B). Test-retest reliability (phase C), based on intra-class correlations (ICC), Kappa (κ) statistics, and Wilcoxon tests, was assessed with 133 parents/caregivers. Criterion validity (phase D) was assessed using Receiver Operating Characteristic (ROC) curves by comparing scores on the Toddler NutriSTEP to a comprehensive nutritional assessment of 200 toddlers by a registered dietitian (RD). The Toddler NutriSTEP was reliable between 2 administrations (ICC = 0.951, F = 20.53, p < 0.001), and scores on the Toddler NutriSTEP and the RD assessment were correlated (r = 0.67, p < 0.001). The Toddler NutriSTEP questionnaire is both reliable and valid for screening for nutritional risk in toddlers.

  18. Strong Stability Preserving Explicit Linear Multistep Methods with Variable Step Size

    KAUST Repository

    Hadjimichael, Yiannis

    2016-09-08

    Strong stability preserving (SSP) methods are designed primarily for time integration of nonlinear hyperbolic PDEs, for which the permissible SSP step size varies from one step to the next. We develop the first SSP linear multistep methods (of order two and three) with variable step size, and prove their optimality, stability, and convergence. The choice of step size for multistep SSP methods is an interesting problem because the allowable step size depends on the SSP coefficient, which in turn depends on the chosen step sizes. The description of the methods includes an optimal step-size strategy. We prove sharp upper bounds on the allowable step size for explicit SSP linear multistep methods and show the existence of methods with arbitrarily high order of accuracy. The effectiveness of the methods is demonstrated through numerical examples.

  19. Numerical Validation of the Delaunay Normalization and the Krylov-Bogoliubov-Mitropolsky Method

    Directory of Open Access Journals (Sweden)

    David Ortigosa

    2014-01-01

    Full Text Available A scalable second-order analytical orbit propagator programme based on modern and classical perturbation methods is being developed. As a first step in the validation and verification of part of our orbit propagator programme, we only consider the perturbation produced by zonal harmonic coefficients in the Earth’s gravity potential, so that it is possible to analyze the behaviour of the mathematical expressions involved in Delaunay normalization and the Krylov-Bogoliubov-Mitropolsky method in depth and determine their limits.

  20. Improved perovskite phototransistor prepared using multi-step annealing method

    Science.gov (United States)

    Cao, Mingxuan; Zhang, Yating; Yu, Yu; Yao, Jianquan

    2018-02-01

    Organic-inorganic hybrid perovskites with good intrinsic physical properties have received substantial interest for solar cell and optoelectronic applications. However, perovskite films often suffer from low carrier mobility due to structural imperfections, including sharp grain boundaries and pinholes, restricting their device performance and application potential. Here we demonstrate a straightforward strategy based on a multi-step annealing process to improve the performance of a perovskite photodetector. Annealing temperature and duration greatly affect the surface morphology and optoelectrical properties of perovskites, which determine the device properties of the phototransistor. Perovskite films treated with the multi-step annealing method tend to be highly uniform, well crystallized and of high surface coverage, and exhibit stronger ultraviolet-visible absorption and photoluminescence spectra compared to perovskites prepared by the conventional one-step annealing process. The field-effect mobility of the perovskite photodetector treated by the one-step direct annealing method is 0.121 (0.062) cm2V-1s-1 for holes (electrons), which increases to 1.01 (0.54) cm2V-1s-1 with the multi-step slow annealing method. Moreover, the perovskite phototransistors exhibit a fast photoresponse speed of 78 μs. In general, this work focuses on the influence of annealing methods on the perovskite phototransistor rather than on obtaining its best parameters. These findings prove that the multi-step annealing method is feasible for preparing high-performance perovskite-based photodetectors.

  1. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component of quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a robustness study conducted within an SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in the face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Taking the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, each with its own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters of the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  2. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component of quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a robustness study conducted within an SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in the face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Taking the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, each with its own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters of the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  3. Solving point reactor kinetic equations by time step-size adaptable numerical methods

    International Nuclear Information System (INIS)

    Liao Chaqing

    2007-01-01

    Based on an analysis of the effects of time step-size on numerical solutions, this paper shows the necessity of step-size adaptation. Based on the relationship between error and step-size, two step-size adaptation methods for solving initial value problems (IVPs) are introduced: the Two-Step Method and the Embedded Runge-Kutta Method. PRKEs were solved by the implicit Euler method with step-sizes optimized using the Two-Step Method. It was observed that the control error has an important influence on the step-size and the accuracy of the solutions. With suitable control errors, the solutions of PRKEs computed by the above-mentioned method are reasonably accurate. The accuracy and usage of the MATLAB built-in ODE solvers ode23 and ode45, both of which adopt the Runge-Kutta-Fehlberg method, were also studied and discussed. (authors)
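The step-size adaptation idea can be sketched with step doubling: advance once with step h and twice with h/2, use the difference as an error estimate, and adjust h against a control error. The sketch below applies implicit Euler to a scalar decay equation as a stand-in for the stiff point-kinetics system; the rate, tolerance, and controller constants are invented, and this is not the paper's exact algorithm.

```python
import math

lam, tol = 50.0, 1e-5          # decay rate and control error (invented)
t, y, h, t_end = 0.0, 1.0, 0.01, 1.0

def ie_step(y, h):
    # implicit Euler for y' = -lam*y solved in closed form
    return y / (1 + lam * h)

while t < t_end:
    h = min(h, t_end - t)
    y1 = ie_step(y, h)                        # one full step
    y2 = ie_step(ie_step(y, h / 2), h / 2)    # two half steps
    err = abs(y2 - y1)                        # error estimate
    if err <= tol:                            # accept, then grow h
        t, y = t + h, y2
        h *= min(2.0, 0.9 * (tol / max(err, 1e-16)) ** 0.5)
    else:                                     # reject and shrink h
        h *= 0.5

print(y, math.exp(-lam))       # numerical vs. exact solution at t = 1
```

The same accept/reject loop carries over to systems of equations, with a norm of the component-wise differences as the error estimate.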

  4. Pressurised liquid extraction of flavonoids in onions. Method development and validation

    DEFF Research Database (Denmark)

    Søltoft, Malene; Christensen, J.H.; Nielsen, J.

    2009-01-01

    A rapid and reliable analytical method for quantification of flavonoids in onions was developed and validated. Five extraction methods were tested on freeze-dried onions and subsequently high performance liquid chromatography (HPLC) with UV detection was used for quantification of seven flavonoids...... extraction methods. However. PLE was the preferred extraction method because the method can be highly automated, use only small amounts of solvents, provide the cleanest extracts, and allow the extraction of light and oxygen-sensitive flavonoids to be carried out in an inert atmosphere protected from light......-step PLE method showed good selectivity, precision (RSDs = 3.1-11%) and recovery of the extractable flavonoids (98-99%). The method also appeared to be a multi-method, i.e. generally applicable to, e.g. phenolic acids in potatoes and carrots....

  5. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account animal models. The study data is extracted from...... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally bivariate blending method was......, on the other hand, lighter than the single-step method....

  6. Strong Stability Preserving Two-step Runge–Kutta Methods

    KAUST Repository

    Ketcheson, David I.; Gottlieb, Sigal; Macdonald, Colin B.

    2011-01-01

    We investigate the strong stability preserving (SSP) property of two-step Runge–Kutta (TSRK) methods. We prove that all SSP TSRK methods belong to a particularly simple subclass of TSRK methods, in which stages from the previous step are not used. We derive simple order conditions for this subclass. Whereas explicit SSP Runge–Kutta methods have order at most four, we prove that explicit SSP TSRK methods have order at most eight. We present explicit TSRK methods of up to eighth order that were found by numerical search. These methods have larger SSP coefficients than any known methods of the same order of accuracy and may be implemented in a form with relatively modest storage requirements. The usefulness of the TSRK methods is demonstrated through numerical examples, including integration of very high order weighted essentially non-oscillatory discretizations.

  7. Strong Stability Preserving Two-step Runge–Kutta Methods

    KAUST Repository

    Ketcheson, David I.

    2011-12-22

    We investigate the strong stability preserving (SSP) property of two-step Runge–Kutta (TSRK) methods. We prove that all SSP TSRK methods belong to a particularly simple subclass of TSRK methods, in which stages from the previous step are not used. We derive simple order conditions for this subclass. Whereas explicit SSP Runge–Kutta methods have order at most four, we prove that explicit SSP TSRK methods have order at most eight. We present explicit TSRK methods of up to eighth order that were found by numerical search. These methods have larger SSP coefficients than any known methods of the same order of accuracy and may be implemented in a form with relatively modest storage requirements. The usefulness of the TSRK methods is demonstrated through numerical examples, including integration of very high order weighted essentially non-oscillatory discretizations.

  8. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only I part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  9. Simulation of the two-fluid model on incompressible flow with Fractional Step method for both resolved and unresolved scale interfaces

    International Nuclear Information System (INIS)

    Hou, Xiaofei; Rigola, Joaquim; Lehmkuhl, Oriol; Oliva, Assensi

    2015-01-01

    Highlights: • Two phase flow with free surface is solved by means of two-fluid model (TFM). • Fractional Step method and finite volume technique is used to solve TFM. • Conservative Level Set method reduces interface sharpening diffusion problem. • Cases including high density ratios and high viscosities validate the models. - Abstract: In the present paper, the Fractional Step method usually used in single fluid flow is here extended and applied for the two-fluid model resolution using the finite volume discretization. The use of a projection method resolution instead of the usual pressure-correction method for multi-fluid flow, successfully avoids iteration processes. On the other hand, the main weakness of the two fluid model used for simulations of free surface flows, which is the numerical diffusion of the interface, is also solved by means of the conservative Level Set method (interface sharpening) (Strubelj et al., 2009). Moreover, the use of the algorithm proposed has allowed presenting different free-surface cases with or without Level Set implementation even under coarse meshes under a wide range of density ratios. Thus, the numerical results presented, numerically verified, experimentally validated and converged under high density ratios, shows the capability and reliability of this resolution method for both mixed and unmixed flows

  10. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.

  11. An improved 4-step commutation method application for matrix converter

    DEFF Research Database (Denmark)

    Guo, Yu; Guo, Yougui; Deng, Wenlang

    2014-01-01

    A novel four-step commutation method is proposed for matrix converter cell, 3 phase inputs to 1 phase output in this paper, which is obtained on the analysis of published commutation methods for matrix converter. The first and fourth step can be shorter than the second or third one. The discussed...... method here is implemented by programming in VHDL language. Finally, the novel method in this paper is verified by experiments....

  12. The Screening Test for Emotional Problems-Parent Report (STEP-P): Studies of Reliability and Validity

    Science.gov (United States)

    Erford, Bradley T.; Alsamadi, Silvana C.

    2012-01-01

    Score reliability and validity of parent responses concerning their 10- to 17-year-old students were analyzed using the Screening Test for Emotional Problems-Parent Report (STEP-P), which assesses a variety of emotional problems classified under the Individuals with Disabilities Education Improvement Act. Score reliability, convergent, and…

  13. Single Laboratory Validated Method for Determination of Cylindrospermopsin and Anatoxin-a in Ambient Water by Liquid Chromatography/ Tandem Mass Spectrometry (LC/MS/MS)

    Science.gov (United States)

    This product is an LC/MS/MS single laboratory validated method for the determination of cylindrospermopsin and anatoxin-a in ambient waters. The product contains step-by-step instructions for sample preparation, analyses, preservation, sample holding time and QC protocols to ensu...

  14. A multi-time-step noise reduction method for measuring velocity statistics from particle tracking velocimetry

    Science.gov (United States)

    Machicoane, Nathanaël; López-Caballero, Miguel; Bourgoin, Mickael; Aliseda, Alberto; Volk, Romain

    2017-10-01

    We present a method to improve the accuracy of velocity measurements for fluid flow or particles immersed in it, based on a multi-time-step approach that allows for cancellation of noise in the velocity measurements. Improved velocity statistics, a critical element in turbulent flow measurements, can be computed from the combination of the velocity moments computed using standard particle tracking velocimetry (PTV) or particle image velocimetry (PIV) techniques for data sets that have been collected over different values of time intervals between images. This method produces Eulerian velocity fields and Lagrangian velocity statistics with much lower noise levels compared to standard PIV or PTV measurements, without the need of filtering and/or windowing. Particle displacement between two frames is computed for multiple different time-step values between frames in a canonical experiment of homogeneous isotropic turbulence. The second order velocity structure function of the flow is computed with the new method and compared to results from traditional measurement techniques in the literature. Increased accuracy is also demonstrated by comparing the dissipation rate of turbulent kinetic energy measured from this function against previously validated measurements.

  15. Three-Step Predictor-Corrector of Exponential Fitting Method for Nonlinear Schroedinger Equations

    International Nuclear Information System (INIS)

    Tang Chen; Zhang Fang; Yan Haiqing; Luo Tao; Chen Zhanqing

    2005-01-01

    We develop the three-step explicit and implicit schemes of exponential fitting methods. We use the three-step explicit exponential fitting scheme to predict an approximation, then use the three-step implicit exponential fitting scheme to correct this prediction. This combination is called the three-step predictor-corrector of exponential fitting method. The three-step predictor-corrector of exponential fitting method is applied to numerically compute the coupled nonlinear Schroedinger equation and the nonlinear Schroedinger equation with varying coefficients. The numerical results show that the scheme is highly accurate.

  16. Comparison of Stepped Care Delivery Against a Single, Empirically Validated Cognitive-Behavioral Therapy Program for Youth With Anxiety: A Randomized Clinical Trial.

    Science.gov (United States)

    Rapee, Ronald M; Lyneham, Heidi J; Wuthrich, Viviana; Chatterton, Mary Lou; Hudson, Jennifer L; Kangas, Maria; Mihalopoulos, Cathrine

    2017-10-01

    Stepped care is embraced as an ideal model of service delivery but is minimally evaluated. The aim of this study was to evaluate the efficacy of cognitive-behavioral therapy (CBT) for child anxiety delivered via a stepped-care framework compared against a single, empirically validated program. A total of 281 youth with anxiety disorders (6-17 years of age) were randomly allocated to receive either empirically validated treatment or stepped care involving the following: (1) low intensity; (2) standard CBT; and (3) individually tailored treatment. Therapist qualifications increased at each step. Interventions did not differ significantly on any outcome measures. Total therapist time per child was significantly shorter to deliver stepped care (774 minutes) compared with best practice (897 minutes). Within stepped care, the first 2 steps returned the strongest treatment gains. Stepped care and a single empirically validated program for youth with anxiety produced similar efficacy, but stepped care required slightly less therapist time. Restricting stepped care to only steps 1 and 2 would have led to considerable time saving with modest loss in efficacy. Clinical trial registration information-A Randomised Controlled Trial of Standard Care Versus Stepped Care for Children and Adolescents With Anxiety Disorders; http://anzctr.org.au/; ACTRN12612000351819. Copyright © 2017 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. Validation of patient determined disease steps (PDDS) scale scores in persons with multiple sclerosis.

    Science.gov (United States)

    Learmonth, Yvonne C; Motl, Robert W; Sandroff, Brian M; Pula, John H; Cadavid, Diego

    2013-04-25

    The Patient Determined Disease Steps (PDDS) is a promising patient-reported outcome (PRO) of disability in multiple sclerosis (MS). To date, there is limited evidence regarding the validity of PDDS scores, despite its sound conceptual development and broad inclusion in MS research. This study examined the validity of the PDDS based on (1) the association with Expanded Disability Status Scale (EDSS) scores and (2) the pattern of associations between PDDS and EDSS scores with Functional System (FS) scores as well as ambulatory and other outcomes. 96 persons with MS provided demographic/clinical information, completed the PDDS and other PROs including the Multiple Sclerosis Walking Scale-12 (MSWS-12), and underwent a neurological examination for generating FS and EDSS scores. Participants completed assessments of cognition, ambulation including the 6-minute walk (6 MW), and wore an accelerometer during waking hours over seven days. There was a strong correlation between EDSS and PDDS scores (ρ = .783). PDDS and EDSS scores were strongly correlated with Pyramidal (ρ = .578 &ρ = .647, respectively) and Cerebellar (ρ = .501 &ρ = .528, respectively) FS scores as well as 6 MW distance (ρ = .704 &ρ = .805, respectively), MSWS-12 scores (ρ = .801 &ρ = .729, respectively), and accelerometer steps/day (ρ = -.740 &ρ = -.717, respectively). This study provides novel evidence supporting the PDDS as valid PRO of disability in MS.

  18. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV and the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step by activated neutral carbon. Validation parameters; linearity, matrix effect, LOQ, specificity, trueness and repeatability precision were attained. The spiking levels for the trueness and the precision experiments were (0.1, 0.5, 3 mg/kg). For HPLC-UV analysis, mean recoveries ranged between 83.69% to 91.58% and 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Validation of a residue method to determine pesticide residues in cucumber by using nuclear techniques

    International Nuclear Information System (INIS)

    Baysoyu, D.; Tiryaki, O.; Secer, E.; Aydin, G.

    2009-01-01

    In this study, a multi-residue method using ethyl acetate for extraction and gel permeation chromatography for clean-up was validated to determine chlorpyrifos, malathion and dichlorvos in cucumber by gas chromatography. For this purpose, homogenized cucumber samples were fortified with pesticides at 0.02 0.2, 0.8 and 1 mg/kg levels. The efficiency and repeatability of the method in extraction and cleanup steps were performed using 1 4C-carbaryl by radioisotope tracer technique. 1 4C-carbaryl recoveries after the extraction and cleanup steps were between 92.63-111.73 % with a repeatability of 4.85% (CV) and 74.83-102.22 % with a repeatability of 7.19% (CV), respectively. The homogeneity of analytical samples and the stability of pesticides during homogenization were determined using radio tracer technique and chromatographic methods, respectively.

  20. Strong Stability Preserving Explicit Linear Multistep Methods with Variable Step Size

    KAUST Repository

    Hadjimichael, Yiannis; Ketcheson, David I.; Loczi, Lajos; Né meth, Adriá n

    2016-01-01

    Strong stability preserving (SSP) methods are designed primarily for time integration of nonlinear hyperbolic PDEs, for which the permissible SSP step size varies from one step to the next. We develop the first SSP linear multistep methods (of order

  1. Perturbed Strong Stability Preserving Time-Stepping Methods For Hyperbolic PDEs

    KAUST Repository

    Hadjimichael, Yiannis

    2017-09-30

    A plethora of physical phenomena are modelled by hyperbolic partial differential equations, for which the exact solution is usually not known. Numerical methods are employed to approximate the solution to hyperbolic problems; however, in many cases it is difficult to satisfy certain physical properties while maintaining high order of accuracy. In this thesis, we develop high-order time-stepping methods that are capable of maintaining stability constraints of the solution, when coupled with suitable spatial discretizations. Such methods are called strong stability preserving (SSP) time integrators, and we mainly focus on perturbed methods that use both upwind- and downwind-biased spatial discretizations. Firstly, we introduce a new family of third-order implicit Runge–Kuttas methods with arbitrarily large SSP coefficient. We investigate the stability and accuracy of these methods and we show that they perform well on hyperbolic problems with large CFL numbers. Moreover, we extend the analysis of SSP linear multistep methods to semi-discretized problems for which different terms on the right-hand side of the initial value problem satisfy different forward Euler (or circle) conditions. Optimal perturbed and additive monotonicity-preserving linear multistep methods are studied in the context of such problems. Optimal perturbed methods attain augmented monotonicity-preserving step sizes when the different forward Euler conditions are taken into account. On the other hand, we show that optimal SSP additive methods achieve a monotonicity-preserving step-size restriction no better than that of the corresponding non-additive SSP linear multistep methods. Furthermore, we develop the first SSP linear multistep methods of order two and three with variable step size, and study their optimality. We describe an optimal step-size strategy and demonstrate the effectiveness of these methods on various one- and multi-dimensional problems. Finally, we establish necessary conditions

  2. MIDPOINT TWO- STEPS RULE FOR THE SQUARE ROOT METHOD

    African Journals Online (AJOL)

    DR S.E UWAMUSI

    Aberth third order method for finding zeros of a polynomial in interval ... KEY WORDS: Square root iteration, midpoint two steps Method, ...... A New set of Methods for the simultaneous determination of zeros of polynomial equation and iterative ...

  3. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we have to face the problem to compare the results obtained using simulations with what the experiments actually observed. One way to solve the problem is to perform a consistency test. Within the Geant group, we developed a C++ compact library which will be added to the automated validation process on the Geant Validation Portal

  4. High-performance liquid chromatography method validation for determination of tetracycline residues in poultry meat

    Directory of Open Access Journals (Sweden)

    Vikas Gupta

    2014-01-01

    Full Text Available Background: In this study, a method for determination of tetracycline (TC residues in poultry with the help of high-performance liquid chromatography technique was validated. Materials and Methods: The principle step involved in ultrasonic-assisted extraction of TCs from poultry samples by 2 ml of 20% trichloroacetic acid and phosphate buffer (pH 4, which gave a clearer supernatant and high recovery, followed by centrifugation and purification by using 0.22 μm filter paper. Results: Validity study of the method revealed that all obtained calibration curves showed good linearity (r2 > 0.999 over the range of 40-4500 ng. Sensitivity was found to be 1.54 and 1.80 ng for oxytetracycline (OTC and TC. Accuracy was in the range of 87.94-96.20% and 72.40-79.84% for meat. Precision was lower than 10% in all cases indicating that the method can be used as a validated method. Limit of detection was found to be 4.8 and 5.10 ng for OTC and TC, respectively. The corresponding values of limit of quantitation were 11 and 12 ng. Conclusion: The method reliably identifies and quantifies the selected TC and OTC in the reconstituted poultry meat in the low and sub-nanogram range and can be applied in any laboratory.

  5. Validated method for the analysis of goji berry, a rich source of zeaxanthin dipalmitate.

    Science.gov (United States)

    Karioti, Anastasia; Bergonzi, Maria Camilla; Vincieri, Franco F; Bilia, Anna Rita

    2014-12-31

    In the present study an HPLC-DAD method was developed for the determination of the main carotenoid, zeaxanthin dipalmitate, in the fruits of Lycium barbarum. The aim was to develop and optimize an extraction protocol to allow fast, exhaustive, and repeatable extraction, suitable for labile carotenoid content. Use of liquid N2 allowed the grinding of the fruit. A step of ultrasonication with water removed efficiently the polysaccharides and enabled the exhaustive extraction of carotenoids by hexane/acetone 50:50. The assay was fast and simple and permitted the quality control of a large number of commercial samples including fruits, juices, and a jam. The HPLC method was validated according to ICH guidelines and satisfied the requirements. Finally, the overall method was validated for precision (% RSD ranging between 3.81 and 4.13) and accuracy at three concentration levels. The recovery was between 94 and 107% with RSD values <2%, within the acceptable limits, especially if the difficulty of the matrix is taken into consideration.

  6. Fuzzy decision-making: a new method in model selection via various validity criteria

    International Nuclear Information System (INIS)

    Shakouri Ganjavi, H.; Nikravesh, K.

    2001-01-01

    Modeling is considered as the first step in scientific investigations. Several alternative models may be candida ted to express a phenomenon. Scientists use various criteria to select one model between the competing models. Based on the solution of a Fuzzy Decision-Making problem, this paper proposes a new method in model selection. The method enables the scientist to apply all desired validity criteria, systematically by defining a proper Possibility Distribution Function due to each criterion. Finally, minimization of a utility function composed of the Possibility Distribution Functions will determine the best selection. The method is illustrated through a modeling example for the A verage Daily Time Duration of Electrical Energy Consumption in Iran

  7. Validity of using tri-axial accelerometers to measure human movement - Part II: Step counts at a wide range of gait velocities.

    Science.gov (United States)

    Fortune, Emma; Lugade, Vipul; Morrow, Melissa; Kaufman, Kenton

    2014-06-01

    A subject-specific step counting method with a high accuracy level at all walking speeds is needed to assess the functional level of impaired patients. The study aim was to validate step counts and cadence calculations from acceleration data by comparison to video data during dynamic activity. Custom-built activity monitors, each containing one tri-axial accelerometer, were placed on the ankles, thigh, and waist of 11 healthy adults. ICC values were greater than 0.98 for video inter-rater reliability of all step counts. The activity monitoring system (AMS) algorithm demonstrated a median (interquartile range; IQR) agreement of 92% (8%) with visual observations during walking/jogging trials at gait velocities ranging from 0.1 to 4.8m/s, while FitBits (ankle and waist), and a Nike Fuelband (wrist) demonstrated agreements of 92% (36%), 93% (22%), and 33% (35%), respectively. The algorithm results demonstrated high median (IQR) step detection sensitivity (95% (2%)), positive predictive value (PPV) (99% (1%)), and agreement (97% (3%)) during a laboratory-based simulated free-living protocol. The algorithm also showed high median (IQR) sensitivity, PPV, and agreement identifying walking steps (91% (5%), 98% (4%), and 96% (5%)), jogging steps (97% (6%), 100% (1%), and 95% (6%)), and less than 3% mean error in cadence calculations. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. Considerations for the independent reaction times and step-by-step methods for radiation chemistry simulations

    Science.gov (United States)

    Plante, Ianik; Devroye, Luc

    2017-10-01

    Ionizing radiation interacts with the water molecules of the tissues mostly by ionizations and excitations, which result in the formation of the radiation track structure and the creation of radiolytic species such as H.,.OH, H2, H2O2, and e-aq. After their creation, these species diffuse and may chemically react with the neighboring species and with the molecules of the medium. Therefore radiation chemistry is of great importance in radiation biology. As the chemical species are not distributed homogeneously, the use of conventional models of homogeneous reactions cannot completely describe the reaction kinetics of the particles. Actually, many simulations of radiation chemistry are done using the Independent Reaction Time (IRT) method, which is a very fast technique to calculate radiochemical yields but which do not calculate the positions of the radiolytic species as a function of time. Step-by-step (SBS) methods, which are able to provide such information, have been used only sparsely because these are time-consuming in terms of calculation. Recent improvements in computer performance now allow the regular use of the SBS method in radiation chemistry. The SBS and IRT methods are both based on the Green's functions of the diffusion equation (GFDE). In this paper, several sampling algorithms of the GFDE and for the IRT method are presented. We show that the IRT and SBS methods are exactly equivalent for 2-particles systems for diffusion and partially diffusion-controlled reactions between non-interacting particles. We also show that the results obtained with the SBS simulation method with periodic boundary conditions are in agreement with the predictions by classical reaction kinetics theory, which is an important step towards using this method for modelling of biochemical networks and metabolic pathways involved in oxidative stress. Finally, the first simulation results obtained with the code RITRACKS (Relativistic Ion Tracks) are presented.

  9. High-throughput screening assay used in pharmacognosy: Selection, optimization and validation of methods of enzymatic inhibition by UV-visible spectrophotometry

    Directory of Open Access Journals (Sweden)

    Graciela Granados-Guzmán

    2014-02-01

    Full Text Available In research laboratories of both organic synthesis and extraction of natural products, every day a lot of products that can potentially introduce some biological activity are obtained. Therefore it is necessary to have in vitro assays, which provide reliable information for further evaluation in in vivo systems. From this point of view, in recent years has intensified the use of high-throughput screening assays. Such trials should be optimized and validated for accurate and precise results, i.e. reliable. The present review addresses the steps needed to develop and validate bioanalytical methods, emphasizing UV-Visible spectrophotometry as detection system. Particularly focuses on the selection of the method, the optimization to determine the best experimental conditions, validation, implementation of optimized and validated method to real samples, and finally maintenance and possible transfer it to a new laboratory.

  10. The Fractional Step Method Applied to Simulations of Natural Convective Flows

    Science.gov (United States)

    Westra, Douglas G.; Heinrich, Juan C.; Saxon, Jeff (Technical Monitor)

    2002-01-01

    This paper describes research done to apply the Fractional Step Method to finite-element simulations of natural convective flows in pure liquids, permeable media, and in a directionally solidified metal alloy casting. The Fractional Step Method has been applied commonly to high Reynolds number flow simulations, but is less common for low Reynolds number flows, such as natural convection in liquids and in permeable media. The Fractional Step Method offers increased speed and reduced memory requirements by allowing non-coupled solution of the pressure and the velocity components. The Fractional Step Method has particular benefits for predicting flows in a directionally solidified alloy, since other methods presently employed are not very efficient. Previously, the most suitable method for predicting flows in a directionally solidified binary alloy was the penalty method. The penalty method requires direct matrix solvers, due to the penalty term. The Fractional Step Method allows iterative solution of the finite element stiffness matrices, thereby allowing more efficient solution of the matrices. The Fractional Step Method also lends itself to parallel processing, since the velocity component stiffness matrices can be built and solved independently of each other. The finite-element simulations of a directionally solidified casting are used to predict macrosegregation in directionally solidified castings. In particular, the finite-element simulations predict the existence of 'channels' within the processing mushy zone and subsequently 'freckles' within the fully processed solid, which are known to result from macrosegregation, or what is often referred to as thermo-solutal convection. These freckles cause material property non-uniformities in directionally solidified castings; therefore many of these castings are scrapped. The phenomenon of natural convection in an alloy undergoing directional solidification, or thermo-solutal convection, will be explained. The
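    The non-coupled pressure/velocity solution described above can be summarized by the classical fractional step (projection) splitting; the finite-element details of the paper's actual scheme may differ, but the structure is:

```latex
% Step 1: provisional velocity from momentum, pressure omitted
\mathbf{u}^{*} = \mathbf{u}^{n} + \Delta t\left[-(\mathbf{u}^{n}\cdot\nabla)\mathbf{u}^{n}
                 + \nu\nabla^{2}\mathbf{u}^{n} + \mathbf{f}^{n}\right]
% Step 2: pressure Poisson equation enforcing incompressibility
\nabla^{2} p^{\,n+1} = \frac{1}{\Delta t}\,\nabla\cdot\mathbf{u}^{*}
% Step 3: projection (velocity correction), so that
% div(u^{n+1}) = div(u*) - dt * lap(p^{n+1}) = 0
\mathbf{u}^{n+1} = \mathbf{u}^{*} - \Delta t\,\nabla p^{\,n+1}
```

    Each velocity component in Step 1 can be assembled and solved independently, which is what makes the method attractive for iterative solvers and parallel processing, as the abstract notes.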

  11. A Spiral Step-by-Step Educational Method for Cultivating Competent Embedded System Engineers to Meet Industry Demands

    Science.gov (United States)

    Jing,Lei; Cheng, Zixue; Wang, Junbo; Zhou, Yinghui

    2011-01-01

    Embedded system technologies are undergoing dramatic change. Competent embedded system engineers are becoming a scarce resource in the industry. Given this, universities should revise their specialist education to meet industry demands. In this paper, a spirally tight-coupled step-by-step educational method, based on an analysis of industry…

  12. The Screening Test for Emotional Problems--Teacher-Report Version (Step-T): Studies of Reliability and Validity

    Science.gov (United States)

    Erford, Bradley T.; Butler, Caitlin; Peacock, Elizabeth

    2015-01-01

    The Screening Test for Emotional Problems-Teacher Version (STEP-T) was designed to identify students aged 7-17 years with wide-ranging emotional disturbances. Coefficients alpha and test-retest reliability were adequate for all subscales except Anxiety. The hypothesized five-factor model fit the data very well and external aspects of validity were…

  13. Using a Two-Step Method to Measure Transgender Identity in Latin America/the Caribbean, Portugal, and Spain

    Science.gov (United States)

    Reisner, Sari L.; Biello, Katie; Rosenberger, Joshua G.; Austin, S. Bryn; Haneuse, Sebastien; Perez-Brumer, Amaya; Novak, David S.; Mimiaga, Matthew J.

    2014-01-01

    Few comparative data are available internationally to examine health differences by transgender identity. A barrier to monitoring the health and well-being of transgender people is the lack of inclusion of measures to assess natal sex/gender identity status in surveys. Data were from a cross-sectional anonymous online survey of members (n > 36,000) of a sexual networking website targeting men who have sex with men in Spanish- and Portuguese-speaking countries/territories in Latin America/the Caribbean, Portugal, and Spain. Natal sex/gender identity status was assessed using a two-step method (Step 1: assigned birth sex, Step 2: current gender identity). Male-to-female (MTF) and female-to-male (FTM) participants were compared to non-transgender males in age-adjusted regression models on socioeconomic status (SES) (education, income, sex work), masculine gender conformity, psychological health and well-being (lifetime suicidality, past-week depressive distress, positive self-worth, general self-rated health, gender related stressors), and sexual health (HIV-infection, past-year STIs, past-3 month unprotected anal or vaginal sex). The two-step method identified 190 transgender participants (0.54%; 158 MTF, 32 FTM). Of the 12 health-related variables, six showed significant differences between the three groups: SES, masculine gender conformity, lifetime suicidality, depressive distress, positive self-worth, and past-year genital herpes. A two-step approach is recommended for health surveillance efforts to assess natal sex/gender identity status. Cognitive testing to formally validate assigned birth sex and current gender identity survey items in Spanish and Portuguese is encouraged. PMID:25030120
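    The two-step classification logic itself is simple to express. The sketch below is a hypothetical rendering: the survey's exact item wording, response options, and any additional categories are not reproduced here, only the MTF/FTM shorthand used in the abstract:

```python
def two_step_classify(birth_sex, gender_identity):
    """Two-step natal sex/gender identity status (illustrative only).

    Step 1: sex assigned at birth.  Step 2: current gender identity.
    Response options here are hypothetical, not the survey's.
    """
    if birth_sex == "male":
        return "MTF" if gender_identity in ("female", "trans woman") else "non-transgender male"
    if birth_sex == "female":
        return "FTM" if gender_identity in ("male", "trans man") else "non-transgender female"
    return "unclassified"

print(two_step_classify("male", "trans woman"))   # -> MTF
print(two_step_classify("female", "female"))      # -> non-transgender female
```

    The point of the two-step design is visible in the code: transgender status is inferred from the combination of the two answers rather than asked directly.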

  14. The Technique of Changing the Drive Method of Micro Step Drive and Sensorless Drive for Hybrid Stepping Motor

    Science.gov (United States)

    Yoneda, Makoto; Dohmeki, Hideo

    A position control system with the advantages of large torque, low vibration, and high resolution can be obtained by applying constant-current micro step drive to a hybrid stepping motor. However, losses are large, because the current is controlled uniformly regardless of load torque. Sensorless control, as used for permanent magnet motors, is one effective technique for realizing a highly efficient position control system, but the control methods proposed until now are aimed at speed control. This paper therefore proposes switching the drive method between micro step drive and sensorless drive. The switching was verified by simulation and experiment. At no load, it was confirmed that no large speed change occurred at the time of switching when the electrical angle was taken over and the integrator was reset to zero. Under load, a large speed change was observed. The proposed system can switch the drive method without producing a speed change by setting the initial value of the integrator from the estimated result. With this technique, a low-loss position control system that exploits the advantages of the hybrid stepping motor has been built.
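    For background, constant-current micro step drive feeds the two phases with sine/cosine current references of constant magnitude, which is why the loss does not depend on load torque. A minimal illustrative sketch, not the authors' controller (PWM current regulation and the sensorless observer are omitted):

```python
import math

def microstep_currents(step_index, steps_per_quadrant=16, i_max=1.0):
    """Phase current references for constant-current micro step drive of a
    two-phase hybrid stepping motor.

    The two windings receive cosine/sine references, so the stator field
    rotates in increments of (90 deg / steps_per_quadrant) electrical while
    the current magnitude stays constant.  Illustrative sketch only.
    """
    theta = (math.pi / 2) * step_index / steps_per_quadrant  # electrical angle
    return i_max * math.cos(theta), i_max * math.sin(theta)

# The current magnitude is the same at every microstep, independent of load:
ia, ib = microstep_currents(8)
print(math.hypot(ia, ib))  # constant magnitude i_max
```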

  15. Improved Full-Newton Step O(nL) Infeasible Interior-Point Method for Linear Optimization

    NARCIS (Netherlands)

    Gu, G.; Mansouri, H.; Zangiabadi, M.; Bai, Y.Q.; Roos, C.

    2009-01-01

    We present several improvements of the full-Newton step infeasible interior-point method for linear optimization introduced by Roos (SIAM J. Optim. 16(4):1110–1136, 2006). Each main step of the method consists of a feasibility step and several centering steps. We use a more natural feasibility step,

  16. Validation of Analysis Method of pesticides in fresh tomatoes by Gas Chromatography associated to a liquid scintillation counting

    International Nuclear Information System (INIS)

    Dhib, Ahlem

    2011-01-01

    Pesticides are nowadays considered toxic to human health. The maximum residue levels (MRL) in foodstuffs are more and more strict. Therefore, selective analytical techniques are necessary for their identification and quantification. The aim of this study is to set up a multi-residue method for the determination of pesticides in tomatoes by gas chromatography with a μECD detector (GC/μECD) associated with liquid scintillation counting. A global analytical protocol was set up, consisting of a QuEChERS version of the extraction step followed by a purification step of the resulting extract on a polymeric sorbent. The 14C-chlorpyrifos used as an internal standard proved excellent for controlling the different steps of sample preparation. The optimized method is specific and selective, with an average recovery of more than 70 per cent, and is repeatable and reproducible. Although some other criteria need to be checked regarding validation before its use in routine analysis, the potential of the method has been demonstrated.

  17. Comparing Multi-Step IMAC and Multi-Step TiO2 Methods for Phosphopeptide Enrichment

    Science.gov (United States)

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B.

    2016-01-01

    Phosphopeptide enrichment from complicated peptide mixtures is an essential step for mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity and can interact with phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multi-step enrichment strategy from whole cell lysate, to evaluate their abilities to enrich for different types of phosphopeptides. The peptide-to-beads ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides that are unique to IMAC enrichment, unique to TiO2 enrichment, and identified with both IMAC and TiO2 enrichment were analyzed for their characteristics. Both IMAC and TiO2 enriched similar amounts of phosphopeptides with comparable enrichment efficiency. However, phosphopeptides that are unique to IMAC enrichment showed a higher percentage of multi-phosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. Also, the IMAC and TiO2 procedures clearly enriched phosphopeptides with different motifs. Finally, further enriching with two rounds of TiO2 from the supernatant after IMAC enrichment, or further enriching with two rounds of IMAC from the supernatant TiO2 enrichment does not fully recover the phosphopeptides that are not identified with the corresponding multi-step enrichment. PMID:26237447

  18. A two-step Hilbert transform method for 2D image reconstruction

    International Nuclear Information System (INIS)

    Noo, Frederic; Clackdoyle, Rolf; Pack, Jed D

    2004-01-01

    The paper describes a new accurate two-dimensional (2D) image reconstruction method consisting of two steps. In the first step, the backprojected image is formed after taking the derivative of the parallel projection data. In the second step, a Hilbert filtering is applied along certain lines in the differentiated backprojection (DBP) image. Formulae for performing the DBP step in fan-beam geometry are also presented. The advantage of this two-step Hilbert transform approach is that in certain situations, regions of interest (ROIs) can be reconstructed from truncated projection data. Simulation results are presented that illustrate very similar reconstructed image quality using the new method compared to standard filtered backprojection, and that show the capability to correctly handle truncated projections. In particular, a simulation is presented of a wide patient whose projections are truncated laterally yet for which highly accurate ROI reconstruction is obtained
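    The second step hinges on 1D Hilbert filtering along lines of the DBP image. A minimal sketch of a discrete Hilbert filter via the FFT (the published method applies finite-support inversion along specific lines of the backprojected image, which this periodic implementation does not reproduce):

```python
import numpy as np

def hilbert_filter(f):
    """Periodic discrete Hilbert transform along the last axis via FFT.

    H{f}(x) = (1/pi) p.v. integral f(t)/(x - t) dt corresponds in the
    frequency domain to multiplication by -i*sign(k).  Sketch of the
    filtering step only, not the full ROI reconstruction algorithm.
    """
    n = f.shape[-1]
    k = np.fft.fftfreq(n)
    return np.real(np.fft.ifft(-1j * np.sign(k) * np.fft.fft(f)))

# Sanity check: the Hilbert transform of cos is sin.
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
err = np.max(np.abs(hilbert_filter(np.cos(4 * x)) - np.sin(4 * x)))
print(err)  # small (discretization/rounding error only)
```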

  19. Stability of one-step methods in transient nonlinear heat conduction

    International Nuclear Information System (INIS)

    Hughes, J.R.

    1977-01-01

    The purpose of the present work is to ascertain practical stability conditions for one-step methods commonly used in transient nonlinear heat conduction analyses. The class of problems considered is governed by a temporally continuous, spatially discrete system involving the capacity matrix C, conductivity matrix K, heat supply vector, temperature vector and time differentiation. In the linear case, in which K and C are constant, the stability behavior of one-step methods is well known. But in this paper the concepts of stability appropriate to the nonlinear problem are thoroughly discussed. They of course reduce to the usual stability criterion for the linear, constant coefficient case. However, for nonlinear problems there are differences, and these ideas are of key importance in obtaining practical stability conditions. Of particular importance is a recent result which indicates that, in a sense, the trapezoidal and midpoint families are equivalent. Thus, stability results for one family may be translated into results for the other. The main results obtained are summarized as follows. The stability behavior of the explicit Euler method in the nonlinear regime is analogous to that for linear problems. In particular, an a priori step size restriction may be determined for each time step. The precise time step restriction on implicit conditionally stable members of the trapezoidal and midpoint families is shown not to be determinable a priori. Of considerable practical significance, unconditionally stable members of the trapezoidal and midpoint families are identified.
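    For the linear, constant-coefficient case referred to above, stability can be checked on a single decaying mode u' = -λu of the semi-discrete system. A quick sketch contrasting the explicit Euler restriction Δt ≤ 2/λ with the unconditional stability of the trapezoidal rule (λ and Δt values are illustrative):

```python
# Amplification factors for the scalar test equation u' = -lam * u,
# representing one mode of the semi-discrete heat conduction system.
def amp_explicit_euler(lam, dt):
    # u_{n+1} = (1 - lam*dt) * u_n : stable only if dt <= 2/lam
    return 1.0 - lam * dt

def amp_trapezoidal(lam, dt):
    # u_{n+1} = ((1 - lam*dt/2)/(1 + lam*dt/2)) * u_n : |factor| < 1 for all dt
    return (1.0 - 0.5 * lam * dt) / (1.0 + 0.5 * lam * dt)

lam = 100.0
print(abs(amp_explicit_euler(lam, 0.05)))  # > 1: unstable, dt exceeds 2/lam
print(abs(amp_trapezoidal(lam, 0.05)))     # < 1: stable for any dt
```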

  20. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  1. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  2. Validity of activity trackers, smartphones, and phone applications to measure steps in various walking conditions.

    Science.gov (United States)

    Höchsmann, C; Knaier, R; Eymann, J; Hintermann, J; Infanger, D; Schmidt-Trucksäss, A

    2018-02-20

    To examine the validity of popular smartphone accelerometer applications and a consumer activity wristband compared to a widely used research accelerometer, while assessing the impact of the phone's position on the accuracy of step detection. Twenty volunteers from 2 different age groups (Group A: 18-25 years, n = 10; Group B: 45-70 years, n = 10) were equipped with 3 iPhone SE smartphones (placed in pants pocket, shoulder bag, and backpack), 1 Samsung Galaxy S6 Edge (pants pocket), 1 Garmin Vivofit 2 wristband, and 2 ActiGraph wGTX+ devices (worn at wrist and hip) while walking on a treadmill (1.6, 3.2, 4.8, and 6.0 km/h) and completing a walking course. All smartphones included 6 accelerometer applications. Video observation was used as the gold standard. Validity was evaluated by comparing each device with the gold standard using mean absolute percentage errors (MAPE). The MAPE of the iPhone SE (all positions) and the Garmin Vivofit was small. The Samsung Galaxy and hip-worn ActiGraph showed small MAPE only for treadmill walking at 4.8 and 6.0 km/h and for free walking. The wrist-worn ActiGraph showed high MAPE (17-47) for all walking conditions. The iPhone SE and the Garmin Vivofit 2 are accurate tools for step counting in different age groups and during various walking conditions, even during slow walking. The phone's position does not impact the accuracy of step detection, which substantially improves the versatility of physical activity assessment in clinical and research settings. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
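    The MAPE criterion against the video-observed gold standard can be sketched as follows; the study's exact aggregation across subjects and conditions is not specified here, and the step counts are hypothetical:

```python
def mape(measured, reference):
    """Mean absolute percentage error of device step counts against a
    gold-standard count, averaged over trials (illustrative aggregation)."""
    pairs = list(zip(measured, reference))
    return 100.0 * sum(abs(m - r) / r for m, r in pairs) / len(pairs)

# Hypothetical device counts vs. video-observed steps for three trials:
print(mape([980, 1015, 1002], [1000, 1000, 1000]))  # about 1.23 (%)
```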

  3. Two-step versus one-step FT4 assays in heparin treated patients and non-thyroidal illness

    International Nuclear Information System (INIS)

    Reiners, C.; Bieler, G.; Ertl, G.; Gloss, H.; Boerner, W.

    1985-01-01

    The primary intention of this study is to inform the clinician about the direction and the order of magnitude of possible disturbances of different FT4 parameters under the condition mentioned last, which is not uncommon in daily routine. No single FT4-RIA proved to be valid in severe NTI. With one-step assays there is a risk of misinterpreting hyperthyroidism as euthyroidism, and with two-step assays a possibility of falsely classifying euthyroid patients as hyperthyroid. In relation to this problem, the sometimes lowered FT4 values obtained by one-step methods are clinically less important. It remains to be established whether TSH assays of high sensitivity are able to overcome some of the difficulties with determinations of peripheral thyroid hormones in NTI. (orig./MG)

  4. Comparison of the Screening Tests for Gestational Diabetes Mellitus between "One-Step" and "Two-Step" Methods among Thai Pregnant Women.

    Science.gov (United States)

    Luewan, Suchaya; Bootchaingam, Phenphan; Tongsong, Theera

    2018-01-01

    To compare the prevalence and pregnancy outcomes of GDM between those screened by the "one-step" (75 gm GTT) and "two-step" (100 gm GTT) methods. A prospective study was conducted on singleton pregnancies at low or average risk of GDM. All were screened between 24 and 28 weeks, using the one-step or two-step method based on patients' preference. The primary outcome was the prevalence of GDM, and secondary outcomes included birthweight, gestational age, and rates of preterm birth, small/large-for-gestational age, low Apgar scores, cesarean section, and pregnancy-induced hypertension. A total of 648 women were screened: 278 in the one-step group and 370 in the two-step group. The prevalence of GDM was significantly higher in the one-step group: 32.0% versus 10.3%. Baseline characteristics and pregnancy outcomes in both groups were comparable. However, mean birthweight was significantly higher among pregnancies with GDM diagnosed by the two-step approach (3204 ± 555 versus 3009 ± 666 g; p = 0.022). Likewise, the rate of large-for-date infants tended to be higher in the two-step group, but the difference was not significant. The one-step approach is associated with a very high prevalence of GDM among the Thai population, without clear evidence of better outcomes. Thus, this approach may not be appropriate for screening in a busy antenatal care clinic like our setting or other centers in developing countries.

  5. Improved Full-Newton Step O(nL) Infeasible Interior-Point Method for Linear Optimization

    OpenAIRE

    Gu, G.; Mansouri, H.; Zangiabadi, M.; Bai, Y.Q.; Roos, C.

    2009-01-01

    We present several improvements of the full-Newton step infeasible interior-point method for linear optimization introduced by Roos (SIAM J. Optim. 16(4):1110–1136, 2006). Each main step of the method consists of a feasibility step and several centering steps. We use a more natural feasibility step, which targets the μ+-center of the next pair of perturbed problems. As for the centering steps, we apply a sharper quadratic convergence result, which leads to a slightly wider neighborhood for th...

  6. Development of the Modified Four Square Step Test and its reliability and validity in people with stroke.

    Science.gov (United States)

    Roos, Margaret A; Reisman, Darcy S; Hicks, Gregory; Rose, William; Rudolph, Katherine S

    2016-01-01

    Adults with stroke have difficulty avoiding obstacles when walking, especially when a time constraint is imposed. The Four Square Step Test (FSST) evaluates dynamic balance by requiring individuals to step over canes in multiple directions while being timed, but many people with stroke are unable to complete it. The purposes of this study were to (1) modify the FSST by replacing the canes with tape so that more persons with stroke could successfully complete the test and (2) examine the reliability and validity of the modified version. Fifty-five subjects completed the Modified FSST (mFSST) by stepping over tape in all four directions while being timed. The mFSST resulted in significantly greater numbers of subjects completing the test than the FSST (39/55 [71%] and 33/55 [60%], respectively) (p < 0.04). The test-retest, intrarater, and interrater reliability of the mFSST were excellent (intraclass correlation coefficient ranges: 0.81-0.99). Construct and concurrent validity of the mFSST were also established. The minimal detectable change was 6.73 s. The mFSST, an ideal measure of dynamic balance, can identify progress in people with stroke in varied settings and can be completed by a wide range of people with stroke in approximately 5 min with the use of minimal equipment (tape, stop watch).
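    For context, a minimal detectable change such as the 6.73 s reported above is conventionally derived from test-retest reliability as MDC95 = 1.96 · √2 · SEM, with SEM = SD · √(1 − ICC). A sketch of that standard formula with hypothetical inputs (the study's standard deviation is not given in the abstract):

```python
import math

def mdc95(sd, icc):
    """Minimal detectable change at 95% confidence from test-retest data.

    Standard formula: SEM = SD * sqrt(1 - ICC); MDC95 = 1.96 * sqrt(2) * SEM.
    The inputs used below are hypothetical, not the study's values.
    """
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

print(round(mdc95(sd=5.6, icc=0.81), 2))  # seconds
```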

  7. SPAR-H Step-by-Step Guidance

    Energy Technology Data Exchange (ETDEWEB)

    W. J. Galyean; A. M. Whaley; D. L. Kelly; R. L. Boring

    2011-05-01

    This guide provides step-by-step guidance on the use of the SPAR-H method for quantifying Human Failure Events (HFEs). This guide is intended to be used with the worksheets provided in: 'The SPAR-H Human Reliability Analysis Method,' NUREG/CR-6883, dated August 2005. Each step in the process of producing a Human Error Probability (HEP) is discussed. These steps are: Step-1, Categorizing the HFE as Diagnosis and/or Action; Step-2, Rate the Performance Shaping Factors; Step-3, Calculate PSF-Modified HEP; Step-4, Accounting for Dependence, and; Step-5, Minimum Value Cutoff. The discussions on dependence are extensive and include an appendix that describes insights obtained from the psychology literature.
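    Step-3 of the worksheet flow, calculating the PSF-modified HEP, can be sketched as below, based on the NUREG/CR-6883 formulas (nominal HEPs of 0.01 for diagnosis and 0.001 for action, and the adjustment factor applied when three or more PSFs are negative); the Step-4 dependence handling is omitted:

```python
def spar_h_hep(nhep, psfs):
    """SPAR-H Step-3 sketch: nominal HEP times the PSF multipliers.

    When 3 or more PSFs are negative (multiplier > 1), the NUREG/CR-6883
    adjustment factor HEP = NHEP*PSFc / (NHEP*(PSFc - 1) + 1) is applied
    so the result cannot exceed 1.0.  Illustrative only; dependence
    (Step-4) and the minimum-value cutoff (Step-5) are not shown.
    """
    composite = 1.0
    for m in psfs:
        composite *= m
    negative = sum(1 for m in psfs if m > 1)
    if negative >= 3:
        return nhep * composite / (nhep * (composite - 1.0) + 1.0)
    return nhep * composite

# Action HFE (nominal HEP 0.001) with two negative PSFs: simple product.
print(spar_h_hep(0.001, [2, 5, 1, 1]))
# Diagnosis HFE with three negative PSFs: adjustment factor keeps HEP < 1.
print(spar_h_hep(0.01, [10, 10, 5]))
```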

  8. SPAR-H Step-by-Step Guidance

    International Nuclear Information System (INIS)

    Galyean, W.J.; Whaley, A.M.; Kelly, D.L.; Boring, R.L.

    2011-01-01

    This guide provides step-by-step guidance on the use of the SPAR-H method for quantifying Human Failure Events (HFEs). This guide is intended to be used with the worksheets provided in: 'The SPAR-H Human Reliability Analysis Method,' NUREG/CR-6883, dated August 2005. Each step in the process of producing a Human Error Probability (HEP) is discussed. These steps are: Step-1, Categorizing the HFE as Diagnosis and/or Action; Step-2, Rate the Performance Shaping Factors; Step-3, Calculate PSF-Modified HEP; Step-4, Accounting for Dependence, and; Step-5, Minimum Value Cutoff. The discussions on dependence are extensive and include an appendix that describes insights obtained from the psychology literature.

  9. Q-Step methods for Newton-Jacobi operator equation | Uwasmusi ...

    African Journals Online (AJOL)

    The paper considers the Newton-Jacobi operator equation for the solution of nonlinear systems of equations. Special attention is paid to the computational part of this method with particular reference to the q-step methods. Journal of the Nigerian Association of Mathematical Physics Vol. 8 2004: pp. 237-241 ...

  10. A new method for assessing content validity in model-based creation and iteration of eHealth interventions.

    Science.gov (United States)

    Kassam-Adams, Nancy; Marsac, Meghan L; Kohser, Kristen L; Kenardy, Justin A; March, Sonja; Winston, Flaura K

    2015-04-15

    The advent of eHealth interventions to address psychological concerns and health behaviors has created new opportunities, including the ability to optimize the effectiveness of intervention activities and then deliver these activities consistently to a large number of individuals in need. Given that eHealth interventions grounded in a well-delineated theoretical model for change are more likely to be effective and that eHealth interventions can be costly to develop, assuring the match of final intervention content and activities to the underlying model is a key step. We propose to apply the concept of "content validity" as a crucial checkpoint to evaluate the extent to which proposed intervention activities in an eHealth intervention program are valid (eg, relevant and likely to be effective) for the specific mechanism of change that each is intended to target and the intended target population for the intervention. The aims of this paper are to define content validity as it applies to model-based eHealth intervention development, to present a feasible method for assessing content validity in this context, and to describe the implementation of this new method during the development of a Web-based intervention for children. We designed a practical 5-step method for assessing content validity in eHealth interventions that includes defining key intervention targets, delineating intervention activity-target pairings, identifying experts and using a survey tool to gather expert ratings of the relevance of each activity to its intended target, its likely effectiveness in achieving the intended target, and its appropriateness with a specific intended audience, and then using quantitative and qualitative results to identify intervention activities that may need modification. We applied this method during our development of the Coping Coach Web-based intervention for school-age children. In the evaluation of Coping Coach content validity, 15 experts from five countries
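    The survey-aggregation step of such a method might be sketched as follows; the rating scale, threshold, and activity names here are hypothetical illustrations, not the instrument used in the Coping Coach evaluation:

```python
def flag_for_revision(ratings, threshold=3.0):
    """Aggregate expert ratings for each intervention activity-target
    pairing and flag those whose mean falls below a cutoff for
    modification.  The 1-4 scale and the threshold are hypothetical.
    """
    flagged = {}
    for activity, scores in ratings.items():
        mean = sum(scores) / len(scores)
        if mean < threshold:
            flagged[activity] = round(mean, 2)
    return flagged

# Hypothetical relevance ratings from four experts:
expert_ratings = {
    "psychoeducation_game": [4, 4, 3, 4],
    "breathing_exercise":   [3, 2, 3, 2],
}
print(flag_for_revision(expert_ratings))  # only the low-rated activity is flagged
```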

  11. Validity of a Newly-Designed Rectilinear Stepping Ergometer Submaximal Exercise Test to Assess Cardiorespiratory Fitness

    OpenAIRE

    Rubin Zhang, Likui Zhan, Shaoming Sun, Wei Peng, Yining Sun

    2017-01-01

    The maximum oxygen uptake (V̇O2 max), determined from graded maximal or submaximal exercise tests, is used to classify the cardiorespiratory fitness level of individuals. The purpose of this study was to examine the validity and reliability of the YMCA submaximal exercise test protocol performed on a newly-designed rectilinear stepping ergometer (RSE) that used up and down reciprocating vertical motion in place of conventional circular motion and giving precise measurement of workload, to det...

  12. some generalized two-step block hybrid numerov method for solving ...

    African Journals Online (AJOL)

    Nwokem et al.

    ABSTRACT. This paper proposes a class of generalized two-step Numerov methods, a block hybrid type for the direct solution of general second order ordinary differential equations. Both the main method and additional methods were derived via interpolation and collocation procedures. The basic properties of zero ...

  13. Qualification test of few group constants generated from an MC method by the two-step neutronics analysis system McCARD/MASTER

    International Nuclear Information System (INIS)

    Park, Ho Jin; Shim, Hyung Jin; Joo, Han Gyu; Kim, Chang Hyo

    2011-01-01

    The purpose of this paper is to examine the qualification of few group constants estimated by the Seoul National University Monte Carlo particle transport analysis code McCARD in terms of core neutronics analyses and thus to validate the McCARD method as a few group constant generator. The two-step core neutronics analyses are conducted for a mini and a realistic PWR by the McCARD/MASTER code system, in which McCARD is used as an MC group constant generation code and MASTER as a diffusion core analysis code. The two-step calculations for the effective multiplication factors and assembly power distributions of the two PWR cores by McCARD/MASTER are compared with the reference McCARD calculations. By showing excellent agreements between McCARD/MASTER and the reference MC core neutronics analyses for the two PWRs, it is concluded that the MC method implemented in McCARD can generate few group constants which are well qualified for high-accuracy two-step core neutronics calculations. (author)
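    The group constants at issue are flux-weighted condensations of finer-group data. A textbook sketch of the collapsing step (McCARD's actual tallies, homogenization, and any critical-spectrum corrections are beyond this illustration):

```python
import numpy as np

def condense(sigma_fine, flux_fine, group_starts):
    """Collapse fine-group cross sections to few groups by flux weighting:
    Sigma_G = sum_{g in G}(sigma_g * phi_g) / sum_{g in G}(phi_g).

    `group_starts` lists the fine-group index where each coarse group
    begins.  Textbook definition, illustrative only.
    """
    sigma_fine = np.asarray(sigma_fine, dtype=float)
    flux_fine = np.asarray(flux_fine, dtype=float)
    edges = list(group_starts) + [len(sigma_fine)]
    out = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        out.append(np.sum(sigma_fine[lo:hi] * flux_fine[lo:hi])
                   / np.sum(flux_fine[lo:hi]))
    return np.array(out)

# Four fine groups collapsed to two coarse groups (hypothetical data):
sig = condense([1.0, 2.0, 10.0, 20.0], [3.0, 1.0, 1.0, 1.0], [0, 2])
print(sig)  # flux-weighted average per coarse group
```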

  14. One step geometrical calibration method for optical coherence tomography

    International Nuclear Information System (INIS)

    Díaz, Jesús Díaz; Ortmaier, Tobias; Stritzel, Jenny; Rahlves, Maik; Reithmeier, Eduard; Roth, Bernhard; Majdani, Omid

    2016-01-01

    We present a novel one-step calibration methodology for geometrical distortion correction for optical coherence tomography (OCT). A calibration standard especially designed for OCT is introduced, which consists of an array of inverse pyramidal structures. The use of multiple landmarks situated on four different height levels on the pyramids allows performing a 3D geometrical calibration. The calibration procedure itself is based on a parametric model of the OCT beam propagation. It is validated by experimental results and enables the reduction of systematic errors by more than one order of magnitude. In the future, our results can improve OCT image reconstruction and interpretation for medical applications such as real-time monitoring of surgery. (paper)

  15. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    Science.gov (United States)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike-polynomials and a power spectral density model, such re-sampling does not introduce any aliasing and interpolation errors as is done by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering method. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulted surface height map is continuous or smoothly-varying. So far, the preferred method used for re-sampling a surface map is two-dimensional interpolation. The main problem of this method is that the same pixel can take different values when the method of interpolation is changed among the different methods such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab. 
The conventional, FFT-based spatial filtering method used to
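
    The interpolation ambiguity described above is easy to reproduce. A minimal sketch in Python (using SciPy's `interp1d` in place of Matlab's interpolation functions, on made-up sample data) shows the same resampled location taking different values under different interpolation kinds:

```python
# Demonstrate that a resampled value depends on the interpolation method,
# which is the ambiguity the Zernike/PSD-based approach avoids.
import numpy as np
from scipy.interpolate import interp1d

# Coarse "measured" samples of a surface profile (made-up data)
x = np.linspace(0.0, 1.0, 6)
y = np.sin(2 * np.pi * x) + 0.05 * np.random.default_rng(0).standard_normal(6)

x_new = 0.37  # a pixel location on the finer grid
for kind in ("nearest", "linear", "cubic"):
    f = interp1d(x, y, kind=kind)
    print(f"{kind:8s} -> {float(f(x_new)):+.4f}")
```

    The three printed values differ, so the "true" resampled surface is method-dependent; analytic re-sampling via fitted Zernike coefficients sidesteps this entirely.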

  16. Validation of an analytical method for determining halothane in urine as an instrument for evaluating occupational exposure

    International Nuclear Information System (INIS)

    Gonzalez Chamorro, Rita Maria; Jaime Novas, Arelis; Diaz Padron, Heliodora

    2010-01-01

    Occupational exposure to harmful substances may cause significant changes in the normal physiology of the organism when adequate safety measures are not taken in time at a workplace where the risk is present. Among the chemical risks that may affect workers' health are the inhalational anesthetic agents. With the objective of taking the first steps towards the introduction of an epidemiological surveillance system for this personnel, an analytical method for determining this anesthetic in urine was validated under the instrumental conditions available in our laboratory. The following parameters were taken into account in the validation: specificity, linearity, precision, accuracy, detection limit and quantification limit; the uncertainty of the method was also calculated. The validation showed that the technique is specific and precise; the detection limit was 0.118 μg/L and the quantification limit 0.354 μg/L. The global uncertainty was 0.243 and the expanded uncertainty 0.486. The validated method, together with the subsequent introduction of biological exposure limits, will serve as an auxiliary means of diagnosis allowing periodic monitoring of personnel exposure.

  17. A Validated Method for the Quality Control of Andrographis paniculata Preparations.

    Science.gov (United States)

    Karioti, Anastasia; Timoteo, Patricia; Bergonzi, Maria Camilla; Bilia, Anna Rita

    2017-10-01

    Andrographis paniculata is a herbal drug of Asian traditional medicine largely employed for the treatment of several diseases. Recently, it has been introduced in Europe for the prophylactic and symptomatic treatment of common cold and as an ingredient of dietary supplements. The active principles are diterpenes, with andrographolide as the main representative. In the present study, an analytical protocol was developed for the determination of the main constituents in the herb and preparations of A. paniculata. Three different extraction protocols (methanol extraction using a modified Soxhlet procedure, maceration under ultrasonication, and decoction) were tested. Ultrasonication achieved the highest content of analytes. HPLC conditions were optimized in terms of solvent mixtures, time course, and temperature. A reversed-phase C18 column eluted with a gradient system consisting of acetonitrile and acidified water, including an isocratic step, at 30 °C was used. The HPLC method was validated for linearity, limits of quantitation and detection, repeatability, precision, and accuracy. The overall method was validated for precision and accuracy over at least three different concentration levels. Relative standard deviation was less than 1.13%, whereas recovery was between 95.50% and 97.19%. The method also proved to be suitable for the determination of a large number of commercial samples and was proposed to the European Pharmacopoeia for the quality control of Andrographidis herba. Georg Thieme Verlag KG Stuttgart · New York.

  18. Multiple-time-stepping generalized hybrid Monte Carlo methods

    Energy Technology Data Exchange (ETDEWEB)

    Escribano, Bruno, E-mail: bescribano@bcamath.org [BCAM—Basque Center for Applied Mathematics, E-48009 Bilbao (Spain); Akhmatskaya, Elena [BCAM—Basque Center for Applied Mathematics, E-48009 Bilbao (Spain); IKERBASQUE, Basque Foundation for Science, E-48013 Bilbao (Spain); Reich, Sebastian [Universität Potsdam, Institut für Mathematik, D-14469 Potsdam (Germany); Azpiroz, Jon M. [Kimika Fakultatea, Euskal Herriko Unibertsitatea (UPV/EHU) and Donostia International Physics Center (DIPC), P.K. 1072, Donostia (Spain)

    2015-01-01

    Performance of the generalized shadow hybrid Monte Carlo (GSHMC) method [1], which proved to be superior in sampling efficiency over its predecessors [2–4], molecular dynamics and hybrid Monte Carlo, can be further improved by combining it with multiple time stepping (MTS) and mollification of slow forces. We demonstrate that these comparatively simple modifications not only lead to better performance of GSHMC itself but also allow it to beat the best-performing methods that use similar force-splitting schemes. In addition, we show that the same ideas can be successfully applied to the conventional generalized hybrid Monte Carlo method (GHMC). The resulting methods, MTS-GHMC and MTS-GSHMC, provide accurate reproduction of thermodynamic and dynamical properties, exact temperature control during simulation, and computational robustness and efficiency. MTS-GHMC uses a generalized momentum update to achieve weak stochastic stabilization of the molecular dynamics (MD) integrator. MTS-GSHMC adds the use of a shadow (modified) Hamiltonian to filter the MD trajectories in the HMC scheme. We introduce a new shadow Hamiltonian formulation adapted to force-splitting methods. The use of such Hamiltonians improves the acceptance rate of trajectories and has a strong impact on the sampling efficiency of the method. Both methods were implemented in the open-source MD package ProtoMol and were tested on a water system and a protein system. Results were compared to those obtained using the Langevin Molly (LM) method [5] on the same systems. The test results demonstrate the superiority of the new methods over LM in terms of stability, accuracy, and sampling efficiency. This suggests that putting the MTS approach in the framework of hybrid Monte Carlo, and using the natural stochasticity offered by generalized hybrid Monte Carlo, improves the stability of MTS and allows larger step sizes in the simulation of complex systems.
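
    The force-splitting idea behind MTS integrators can be illustrated with a minimal r-RESPA-style velocity-Verlet sketch. This is not the GSHMC algorithm itself — just the inner/outer step structure it builds on, demonstrated on a toy oscillator with hypothetical fast and slow spring constants:

```python
# Toy multiple-time-stepping (r-RESPA) integrator: the slow force is
# evaluated once per outer step, the fast force on an inner sub-step loop.
def respa_step(x, v, dt, n_inner, f_slow, f_fast, m=1.0):
    v += 0.5 * dt * f_slow(x) / m          # half kick with the slow force
    h = dt / n_inner
    for _ in range(n_inner):               # inner velocity Verlet, fast force only
        v += 0.5 * h * f_fast(x) / m
        x += h * v
        v += 0.5 * h * f_fast(x) / m
    v += 0.5 * dt * f_slow(x) / m          # closing half kick, slow force
    return x, v

# Harmonic test: stiff "bond" spring (fast) plus soft background spring (slow)
k_fast, k_slow = 100.0, 1.0
f_fast = lambda x: -k_fast * x
f_slow = lambda x: -k_slow * x

x, v = 1.0, 0.0
energy = lambda x, v: 0.5 * v * v + 0.5 * (k_fast + k_slow) * x * x
e0 = energy(x, v)
for _ in range(1000):
    x, v = respa_step(x, v, dt=0.05, n_inner=10, f_slow=f_slow, f_fast=f_fast)
print(f"relative energy drift: {abs(energy(x, v) - e0) / e0:.2e}")
```

    Because the splitting is a composition of symplectic maps, the energy error stays bounded even though the slow force is evaluated ten times less often than the fast one.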

  19. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life cycle) are provided, showing how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  20. Interlaboratory validation of an improved U.S. Food and Drug Administration method for detection of Cyclospora cayetanensis in produce using TaqMan real-time PCR

    Science.gov (United States)

    A collaborative validation study was performed to evaluate the performance of a new U.S. Food and Drug Administration method developed for detection of the protozoan parasite, Cyclospora cayetanensis, on cilantro and raspberries. The method includes a sample preparation step in which oocysts are re...

  1. A Rapid, Simple, and Validated RP-HPLC Method for Quantitative Analysis of Levofloxacin in Human Plasma

    Directory of Open Access Journals (Sweden)

    Dion Notario

    2017-04-01

    To conduct a bioequivalence study for a copy product of levofloxacin (LEV), a simple and validated analytical method was needed, but previously developed methods were too complicated. For this reason, a simple and rapid high-performance liquid chromatography method was developed and validated for LEV quantification in human plasma. Chromatographic separation was performed under isocratic elution on a Luna Phenomenex® C18 (150 × 4.6 mm, 5 µm) column. The mobile phase comprised acetonitrile, methanol, and 25 mM phosphate buffer adjusted to pH 3.0 (13:7:80 v/v/v), pumped at a flow rate of 1.5 mL/min. Detection was performed with a UV detector at a wavelength of 280 nm. Samples were prepared by adding acetonitrile followed by centrifugation to precipitate plasma protein, then successively by evaporation and reconstitution steps. The optimized method meets the requirements of the validation parameters, which included linearity (r = 0.995), sensitivity (LLOQ and LOD were 1.77 and 0.57 µg/mL, respectively), accuracy (%error ≤ 12% above the LLOQ and ≤ 20% at the LLOQ), precision (RSD ≤ 9%), and robustness over the range of 1.77-28.83 µg/mL. Therefore, the method can be used for routine analysis of LEV in human plasma as well as in bioequivalence studies of LEV.
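
    The linearity and precision figures quoted in validation reports like this one come from standard calculations that can be sketched in a few lines. The calibration and replicate data below are made up for illustration, not taken from the study:

```python
# Sketch of two common validation calculations: calibration linearity
# (Pearson r) and precision as %RSD, on hypothetical data.
import statistics

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def percent_rsd(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical calibration: concentration (ug/mL) vs. detector response
conc = [1.77, 3.5, 7.0, 14.0, 28.83]
area = [0.051, 0.102, 0.205, 0.410, 0.842]
print(f"r = {pearson_r(conc, area):.4f}")                   # linearity
print(f"RSD = {percent_rsd([0.205, 0.198, 0.211]):.1f}%")   # precision at one level
```

    Acceptance is then a simple comparison of these statistics against the predefined criteria (e.g. r ≥ 0.995, RSD ≤ 9%).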

  2. Using a Three-Step Method in a Calculus Class: Extending the Worked Example

    Science.gov (United States)

    Miller, David

    2010-01-01

    This article discusses a three-step method that was used in a college calculus course. The three-step method was developed to help students understand the course material and transition to be more independent learners. In addition, the method helped students to transfer concepts from short-term to long-term memory while lowering cognitive load.…

  3. Method validation in plasma source optical emission spectroscopy (ICP-OES) - From samples to results

    International Nuclear Information System (INIS)

    Pilon, Fabien; Vielle, Karine; Birolleau, Jean-Claude; Vigneau, Olivier; Labet, Alexandre; Arnal, Nadege; Adam, Christelle; Camilleri, Virginie; Amiel, Jeanine; Granier, Guy; Faure, Joel; Arnaud, Regine; Beres, Andre; Blanchard, Jean-Marc; Boyer-Deslys, Valerie; Broudic, Veronique; Marques, Caroline; Augeray, Celine; Bellefleur, Alexandre; Bienvenu, Philippe; Delteil, Nicole; Boulet, Beatrice; Bourgarit, David; Brennetot, Rene; Fichet, Pascal; Celier, Magali; Chevillotte, Rene; Klelifa, Aline; Fuchs, Gilbert; Le Coq, Gilles; Mermet, Jean-Michel

    2017-01-01

    Even though ICP-OES (Inductively Coupled Plasma - Optical Emission Spectroscopy) is now a routine analysis technique, requirements for measuring processes impose complete control and mastery of the operating process and of the associated quality management system. The aim of this collective book is to guide the analyst through the whole method validation procedure and to help guarantee mastery of its different steps: administrative and physical management of samples in the laboratory, preparation and treatment of the samples before measurement, qualification and monitoring of the apparatus, instrument setting and calibration strategy, and exploitation of results in terms of accuracy, reliability, and data covariance (with the practical determination of the accuracy profile). The most recent terminology is used in the book, and numerous examples and illustrations are given to aid understanding and to help in drafting method validation documents.

  4. Development and Validation of a Dissolution Test Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a dissolution test method for the release of artemether and lumefantrine from tablets. Methods: A single dissolution method for evaluating the in vitro release of artemether and lumefantrine from tablets was developed and validated. The method comprised a dissolution medium of ...

  5. Validation of the Rotation Ratios Method

    International Nuclear Information System (INIS)

    Foss, O.A.; Klaksvik, J.; Benum, P.; Anda, S.

    2007-01-01

    Background: The rotation ratios method describes rotations between pairs of sequential pelvic radiographs. The method seems promising but has not been validated. Purpose: To validate the accuracy of the rotation ratios method. Material and Methods: Known pelvic rotations between 165 radiographs obtained from five skeletal pelvises in an experimental material were compared with the corresponding calculated rotations to describe the accuracy of the method. The results from a clinical material of 262 pelvic radiographs from 46 patients defined the ranges of rotational differences compared. Repeated analyses, both on the experimental and the clinical material, were performed using the selected reference points to describe the robustness and the repeatability of the method. Results: The reference points were easy to identify and barely influenced by pelvic rotations. The mean differences between calculated and real pelvic rotations were 0.0 deg (SD 0.6) for vertical rotations and 0.1 deg (SD 0.7) for transversal rotations in the experimental material. The intra- and interobserver repeatability of the method was good. Conclusion: The accuracy of the method was reasonably high, and the method may prove to be clinically useful

  6. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  7. Newmark local time stepping on high-performance computing architectures

    KAUST Repository

    Rietmann, Max

    2016-11-25

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.
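
    The benefit of LTS over a single global CFL-limited time step can be sketched with a simple work count. The element counts and size ratio below are hypothetical; the actual scheme is the multilevel LTS-Newmark method described above:

```python
# Compare force-evaluation counts: global time stepping (every element at the
# smallest stable dt) vs. two-level local time stepping (small elements sub-step).
def work_per_coarse_step(n_small, n_large, ratio):
    global_work = (n_small + n_large) * ratio   # all elements take 'ratio' small steps
    lts_work = n_small * ratio + n_large        # only small elements sub-step
    return global_work, lts_work

n_small, n_large = 100, 10_000   # locally refined region vs. bulk mesh
ratio = 100                      # h_large / h_small (CFL => dt ratio)
g, l = work_per_coarse_step(n_small, n_large, ratio)
print(f"global: {g:,} evaluations, LTS: {l:,} evaluations, speedup ~{g / l:.1f}x")
```

    With a 100× element-size contrast confined to 1% of the mesh, the coarse bulk no longer pays the fine-region CFL penalty, which is where the large speedups reported above come from.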

  8. Newmark local time stepping on high-performance computing architectures

    KAUST Repository

    Rietmann, Max; Grote, Marcus; Peter, Daniel; Schenk, Olaf

    2016-01-01

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100×). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  9. Newmark local time stepping on high-performance computing architectures

    Energy Technology Data Exchange (ETDEWEB)

    Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland); Institute of Geophysics, ETH Zurich (Switzerland); Grote, Marcus, E-mail: marcus.grote@unibas.ch [Department of Mathematics and Computer Science, University of Basel (Switzerland); Peter, Daniel, E-mail: daniel.peter@kaust.edu.sa [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland); Institute of Geophysics, ETH Zurich (Switzerland); Schenk, Olaf, E-mail: olaf.schenk@usi.ch [Institute for Computational Science, Università della Svizzera italiana, Lugano (Switzerland)

    2017-04-01

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  10. Stability of one-step methods in transient nonlinear heat conduction

    International Nuclear Information System (INIS)

    Hughes, J.R.

    1977-01-01

    The purpose of the present work is to ascertain practical stability conditions for one-step methods commonly used in transient nonlinear heat conduction analyses. In this paper the concepts of stability appropriate to the nonlinear problem are thoroughly discussed. They of course reduce to the usual stability criterion for the linear, constant-coefficient case. However, for nonlinear problems there are differences, and these ideas are of key importance in obtaining practical stability conditions. Of particular importance is a recent result which indicates that, in a sense, the trapezoidal and midpoint families are equivalent. Thus, stability results for one family may be translated into results for the other. The main results obtained are: The stability behaviour of the explicit Euler method in the nonlinear regime is analogous to that for linear problems. In particular, an a priori step size restriction may be determined for each time step. The precise time step restriction on implicit conditionally stable members of the trapezoidal and midpoint families is shown not to be determinable a priori. Of considerable practical significance, unconditionally stable members of the trapezoidal and midpoint families are identified. All notions of stability employed are motivated and defined, and their interpretations in practical computing are indicated. (Auth.)
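
    For the linear constant-coefficient case mentioned above, the a priori step-size restriction for explicit Euler on the 1D heat equation is the classic r = αΔt/h² ≤ 1/2. A small sketch (made-up grid, alternating initial data to excite the worst mode) shows the solution decaying just below the limit and blowing up just above it:

```python
# Explicit Euler for u_t = alpha * u_xx on a uniform 1D grid; stability
# requires r = alpha*dt/h**2 <= 0.5. The alternating initial condition
# excites the highest spatial mode, amplified by roughly (1 - 4r) per step.
def run(r, n=21, steps=50):
    u = [(-1.0) ** i for i in range(n)]      # highest-frequency mode
    u[0] = u[-1] = 0.0                       # fixed (zero) boundaries
    for _ in range(steps):
        u = [0.0] + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                     for i in range(1, n - 1)] + [0.0]
    return max(abs(v) for v in u)

print(f"r=0.4 (stable):   max|u| = {run(0.4):.3e}")
print(f"r=0.6 (unstable): max|u| = {run(0.6):.3e}")
```

    At r = 0.4 the amplification factor has magnitude below one and the solution decays; at r = 0.6 it exceeds one and the solution grows by orders of magnitude within 50 steps.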

  11. An extended validation of the last generation of particle finite element method for free surface flows

    Science.gov (United States)

    Gimenez, Juan M.; González, Leo M.

    2015-03-01

    In this paper, a new generation of the particle method known as the Particle Finite Element Method (PFEM), which combines convective particle movement and a fixed mesh resolution, is applied to free surface flows. This interesting variant, previously described in the literature as PFEM-2, is able to use larger time steps than other similar numerical tools, which implies shorter computational times while maintaining the accuracy of the computation. PFEM-2 has already been extended to free surface problems; the main topic of this paper is a thorough validation of this methodology for a wider range of flows. To accomplish this task, different improved versions of discontinuous and continuous enriched basis functions for the pressure field have been developed to capture the free surface dynamics without artificial diffusion or undesired numerical effects when different density ratios are involved. A collection of problems has been carefully selected such that a wide variety of Froude numbers, density ratios and dominant dissipative cases are reported, with the intention of presenting a general methodology, not restricted to a particular range of parameters, and capable of using large time steps. The results of the different free-surface problems solved, which include the Rayleigh-Taylor instability, sloshing problems, viscous standing waves and the dam break problem, are compared to well-validated numerical alternatives or experimental measurements, obtaining accurate approximations for such complex flows.

  12. Method development and validation of liquid chromatography-tandem/mass spectrometry for aldosterone in human plasma: Application to drug interaction study of atorvastatin and olmesartan combination

    Directory of Open Access Journals (Sweden)

    Rakesh Das

    2014-01-01

    In the present investigation, a simple and sensitive liquid chromatography-tandem mass spectrometry (LC/MS/MS) method was developed for the quantification of aldosterone (ALD), a hormone responsible for blood pressure, in human plasma. The developed method was validated and extended for application on human subjects to study the drug interaction of atorvastatin (ATSV) and olmesartan (OLM) on levels of ALD. ALD in plasma was extracted by liquid-liquid extraction with 5 mL dichloromethane/ethyl ether (60/40% v/v). The chromatographic separation of ALD was carried out on an Xterra RP C18 column (150 mm × 4.6 mm × 3.5 μm) at 30°C, followed by a four-step gradient program composed of methanol and water. Step 1 started with 35% methanol for the first 1 min, changing linearly to 90% over the next 1.5 min in Step 2. Step 3 lasted for the next 2 min with 90% methanol. The method finally concluded with Step 4, returning to the initial methanol concentration of 35%, giving a total method run time of 17.5 min. The flow rate was 0.25 mL/min throughout the process. The developed method was validated for specificity, accuracy, precision, stability, linearity, sensitivity, and recovery. The method was linear and found to be acceptable over the range of 50-800 ng/mL. The method was successfully applied to the drug interaction study of ATSV + OLM in combination versus OLM alone on blood pressure, by quantifying changes in levels of ALD in hypertensive patients. The study revealed that levels of ALD were significantly higher under the ATSV + OLM treatment condition than under OLM alone. This may explain the lower effectiveness of the ATSV + OLM combination, rather than a synergistic activity.

  13. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. NAA (neutron activation analysis), one of these nuclear analytical techniques, has low detection limits, high specificity, and high precision and accuracy for the large majority of naturally occurring elements, offers non-destructive and simultaneous multi-elemental determination, and can handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be performed. A standard reference material, SRM NIST 1648 Urban Particulate Matter, was used to validate the NAA method, with accuracy and precision tests used as validation parameters. The particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviation of the measured elemental concentrations ranged from 2 to 14.8% for most of the elements analyzed, whereas the HorRat values were in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for characterizing particulate matter and other similar matrix samples to support air quality monitoring. (author)
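
    The HorRat values quoted above compare the measured RSD with the RSD predicted by the Horwitz function; the calculation is simple (the example concentration below is hypothetical, not from the study):

```python
# HorRat = measured %RSD / Horwitz-predicted %RSD, where the Horwitz
# function gives PRSD(%) = 2^(1 - 0.5*log10(C)) for mass fraction C.
import math

def horwitz_prsd(mass_fraction):
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

def horrat(measured_rsd_percent, mass_fraction):
    return measured_rsd_percent / horwitz_prsd(mass_fraction)

c = 1e-6                      # e.g. an element present at 1 ppm (mass fraction)
print(f"predicted PRSD = {horwitz_prsd(c):.1f}%")   # 16.0% at 1 ppm
print(f"HorRat = {horrat(8.0, c):.2f}")             # measured 8% RSD -> 0.50
```

    HorRat values between roughly 0.5 and 2 are conventionally taken as acceptable between-laboratory precision, which is why the reported range of 0.3-1.3 supports the method's reliability.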

  14. Detection of protein concentrations using a pH-step titration method

    NARCIS (Netherlands)

    Kruise, J.; Kruise, J.; Eijkel, Jan C.T.; Bergveld, Piet

    1997-01-01

    A stimulus-response method based on the application of a pH step is proposed for the detection of protein immobilized in a membrane on top of an ion-sensitive field-effect transistor (ISFET). The ISFET response to a step-wise change in pH, applied at the interface between the membrane and the

  15. Validation of a pretreatment delivery quality assurance method for the CyberKnife Synchrony system

    Energy Technology Data Exchange (ETDEWEB)

    Mastella, E., E-mail: edoardo.mastella@cnao.it [Medical Physics Unit, CNAO Foundation—National Centre for Oncological Hadron Therapy, Pavia I-27100, Italy and Medical Physics Unit, IEO—European Institute of Oncology, Milan I-20141 (Italy); Vigorito, S.; Rondi, E.; Cattani, F. [Medical Physics Unit, IEO—European Institute of Oncology, Milan I-20141 (Italy); Piperno, G.; Ferrari, A.; Strata, E.; Rozza, D. [Department of Radiation Oncology, IEO—European Institute of Oncology, Milan I-20141 (Italy); Jereczek-Fossa, B. A. [Department of Radiation Oncology, IEO—European Institute of Oncology, Milan I-20141, Italy and Department of Oncology and Hematology Oncology, University of Milan, Milan I-20122 (Italy)

    2016-08-15

    Purpose: To evaluate the geometric and dosimetric accuracies of the CyberKnife Synchrony respiratory tracking system (RTS) and to validate a method for pretreatment patient-specific delivery quality assurance (DQA). Methods: An EasyCube phantom was mounted on the ExacTrac gating phantom, which can move along the superior–inferior (SI) axis of a patient to simulate a moving target. The authors compared dynamic and static measurements. For each case, a Gafchromic EBT3 film was positioned between two slabs of the EasyCube, while a PinPoint ionization chamber was placed in the appropriate space. There were three steps to their evaluation: (1) the field size, the penumbra, and the symmetry of six secondary collimators were measured along the two main orthogonal axes. Dynamic measurements with deliberately simulated errors were also taken. (2) The delivered dose distributions (from step 1) were compared with the planned ones, using the gamma analysis method. The local gamma passing rates were evaluated using three acceptance criteria: 3% local dose difference (LDD)/3 mm, 2%LDD/2 mm, and 3%LDD/1 mm. (3) The DQA plans for six clinical patients were irradiated in different dynamic conditions, to give a total of 19 cases. The measured and planned dose distributions were evaluated with the same gamma-index criteria used in step 2 and the measured chamber doses were compared with the planned mean doses in the sensitive volume of the chamber. Results: (1) A very slight enlargement of the field size and of the penumbra was observed in the SI direction (on average <1 mm), in line with the overall average CyberKnife system error for tracking treatments. (2) Comparison between the planned and the correctly delivered dose distributions confirmed the dosimetric accuracy of the RTS for simple plans. The multicriteria gamma analysis was able to detect the simulated errors, proving the robustness of their method of analysis. (3) All of the DQA clinical plans passed the tests, both in
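
    The gamma passing rates used in the DQA analysis above combine a dose-difference tolerance with a distance-to-agreement (DTA) tolerance. A brute-force 1D sketch (simplified to global dose normalization and made-up profiles; the study evaluated local dose difference) looks like:

```python
# Brute-force 1D gamma index: for each measured point, search the reference
# profile for the minimum combined dose-difference / distance penalty.
def gamma_1d(ref, meas, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    d_max = max(ref)                                 # global dose normalization
    gammas = []
    for i, dm in enumerate(meas):
        best = float("inf")
        for j, dr in enumerate(ref):
            dist = (i - j) * spacing_mm
            ddose = (dm - dr) / (dose_tol * d_max)
            best = min(best, (dist / dta_mm) ** 2 + ddose ** 2)
        gammas.append(best ** 0.5)
    return gammas

ref = [0.0, 0.2, 0.8, 1.0, 1.0, 0.8, 0.2, 0.0]       # planned profile (a.u.)
meas = list(ref)                                     # perfectly delivered plan
g = gamma_1d(ref, meas, spacing_mm=1.0)
pass_rate = 100.0 * sum(v <= 1.0 for v in g) / len(g)
print(f"max gamma = {max(g):.3f}, pass rate = {pass_rate:.0f}%")
```

    A point passes when gamma ≤ 1; tightening the criteria (e.g. from 3%/3 mm to 3%/1 mm, as in the study) shrinks the search ellipse and makes the test more sensitive to delivery errors.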

  16. Validation of method in instrumental NAA for food products sample

    International Nuclear Information System (INIS)

    Alfian; Siti Suprapti; Setyo Purwanto

    2010-01-01

    NAA is a testing method that has not been standardized. To affirm and confirm that this method is valid, it must be validated with various standard reference materials. In this work, validation was carried out for food product samples using NIST SRM 1567a (wheat flour) and NIST SRM 1568a (rice flour). The results show that the method passes the tests of accuracy and precision for nine elements (Al, K, Mg, Mn, Na, Ca, Fe, Se and Zn) in SRM 1567a and eight elements (Al, K, Mg, Mn, Na, Ca, Se and Zn) in SRM 1568a. It can be concluded that this method gives valid results in the elemental determination of food product samples. (author)

  17. Human Factors methods concerning integrated validation of nuclear power plant control rooms; Metodutveckling foer integrerad validering [Method development for integrated validation]

    Energy Technology Data Exchange (ETDEWEB)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia (Swedish Defence Research Agency, Information Systems, Linkoeping (Sweden))

    2010-02-15

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed by the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things, for whom an integrated validation (IV) is performed and its purpose, what should be included in an IV, the comparison with baseline measures, the design process, the role of SSM, which methods of measurement should be used, and how the methods are affected by changes in the control room. The report raises various questions concerning the validation process. Supplementary methods of measurement for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for the identification of problems, and supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations on the responsibility of external participants in the validation process. The authors propose 12 measures for addressing the identified problems.

  18. Standardization of a two-step real-time polymerase chain reaction based method for species-specific detection of medically important Aspergillus species.

    Science.gov (United States)

    Das, P; Pandey, P; Harishankar, A; Chandy, M; Bhattacharya, S; Chakrabarti, A

    2017-01-01

    Standardization of Aspergillus polymerase chain reaction (PCR) poses two technical challenges: (a) standardization of DNA extraction and (b) optimization of the PCR against various medically important Aspergillus species. Many cases of aspergillosis go undiagnosed because of the relative insensitivity of conventional diagnostic methods such as microscopy, culture or antigen detection. The present study is an attempt to standardize a real-time PCR assay for rapid, sensitive and specific detection of Aspergillus DNA in EDTA whole blood. Three nucleic acid extraction protocols were compared, and a two-step real-time PCR assay was developed and validated following the recommendations of the European Aspergillus PCR Initiative in our setup. In the first PCR step (pan-Aspergillus PCR), the target was the 28S rDNA gene, whereas in the second, species-specific PCR step the targets were the beta-tubulin gene (for Aspergillus fumigatus, Aspergillus flavus and Aspergillus terreus) and the calmodulin gene (for Aspergillus niger). Species-specific identification of four medically important Aspergillus species, namely A. fumigatus, A. flavus, A. niger and A. terreus, was achieved by this PCR. Specificity of the PCR was tested against 34 different DNA sources including bacteria, viruses, yeasts, other Aspergillus species, other fungal species and human DNA, and showed no false-positive reactions. The analytical sensitivity of the PCR was found to be 10² CFU/mL. The present protocol of two-step real-time PCR assays for genus- and species-specific identification of commonly isolated species in whole blood for the diagnosis of invasive Aspergillus infections offers a rapid, sensitive and specific assay option, and requires clinical validation at multiple centers.

  19. ASTM Validates Air Pollution Test Methods

    Science.gov (United States)

    Chemical and Engineering News, 1973

    1973-01-01

The American Society for Testing and Materials (ASTM) has validated six basic methods for measuring pollutants in ambient air as the first part of its Project Threshold. The aim of the project is to establish nationwide consistency in measuring pollutants by determining the precision, accuracy and reproducibility of 35 standard measuring methods. (BL)

  20. A two-step method for developing a control rod program for boiling water reactors

    International Nuclear Information System (INIS)

    Taner, M.S.; Levine, S.H.; Hsiao, M.Y.

    1992-01-01

This paper reports a two-step method for the generation of a long-term control rod program for boiling water reactors (BWRs). The new method assumes a time-variant target power distribution in core depletion. In the new method, BWR control rod programming is divided into two steps. In step 1, a sequence of optimal, exposure-dependent Haling power distribution profiles is generated, utilizing the spectral shift concept. In step 2, a set of exposure-dependent control rod patterns is developed by using the Haling profiles generated in step 1 as a target. The new method is implemented in a computer program named OCTOPUS. The optimization procedure of OCTOPUS is based on the method of approximation programming, in which the SIMULATE-E code is used to determine the nucleonics characteristics of the reactor core state. In a test, the new method achieved a gain in cycle length over a time-invariant target Haling power distribution case because of a moderate application of spectral shift. No thermal limits of the core were violated. The gain in cycle length could be increased further by broadening the extent of the spectral shift

  1. Tracking Steps on Apple Watch at Different Walking Speeds.

    Science.gov (United States)

    Veerabhadrappa, Praveen; Moran, Matthew Duffy; Renninger, Mitchell D; Rhudy, Matthew B; Dreisbach, Scott B; Gift, Kristin M

    2018-04-09

QUESTION: How accurate are the step counts obtained from Apple Watch? In this validation study, video steps vs. Apple Watch steps (mean ± SD) were 2965 ± 144 vs. 2964 ± 145 steps, indicating close agreement between Apple Watch steps and the manual counts obtained from video recordings. Our study is one of the first to objectively validate the accuracy of step counts obtained from Apple Watch at different walking speeds. The Apple Watch proved to be a highly accurate device for measuring daily step counts in adults.

  2. Method validation to determine total alpha beta emitters in water samples using LSC

    International Nuclear Information System (INIS)

    Al-Masri, M. S.; Nashawati, A.; Al-akel, B.; Saaid, S.

    2006-06-01

In this work, a method was validated for determining gross alpha and beta emitters in water samples using a liquid scintillation counter. From each sample, 200 ml of water were evaporated to 20 ml, and 8 ml of the concentrate were mixed with 12 ml of a suitable cocktail and measured on a Wallac Winspectral 1414 liquid scintillation counter. The lower detection limit (LDL) of the method was 0.33 DPM for total alpha emitters and 1.3 DPM for total beta emitters; the reproducibility limits were ±2.32 DPM and ±1.41 DPM, and the repeatability limits ±2.19 DPM and ±1.11 DPM, for total alpha and beta emitters respectively. The method is easy and fast because of the simple preparation steps and the large number of samples that can be measured at the same time. In addition, many real samples and standard samples were analyzed by the method and gave accurate results, so it was concluded that the method can be used with various water samples. (author)

  3. Development and validation of a novel RP-HPLC method for simultaneous determination of paracetamol, phenylephrine hydrochloride, caffeine, cetirizine and nimesulide in tablet formulation

    Directory of Open Access Journals (Sweden)

    A.P. Dewani

    2015-07-01

The present work describes the development and validation of a high-performance liquid chromatography–diode array detection (HPLC–DAD) procedure for the analysis of phenylephrine hydrochloride (PHE), paracetamol (PAR), caffeine anhydrous (CAF), cetirizine dihydrochloride (CET) and nimesulide (NIM) in a pharmaceutical mixture. Effective chromatographic separation of PHE, PAR, CAF, CET and NIM was achieved on a Kinetex-C18 (150 × 4.6 mm, 5 µm) column with gradient elution of a mobile phase composed of 10 mM phosphate buffer (pH 3.3) and acetonitrile. The elution was a three-step gradient program: step 1 started with 2% (by volume) acetonitrile and 98% phosphate buffer (pH 3.3) for the first 2 min; in step 2 the acetonitrile concentration changed linearly to 20% up to 12 min; and the analysis was concluded by step 3, changing acetonitrile back to 2% up to 20 min. The proposed HPLC method was statistically validated with respect to linearity, ranges, precision, accuracy, selectivity and robustness. Calibration curves were linear in the ranges of 5–100, 100–1000 and 10–200 mg/mL for PHE, PAR, CAF, CET and NIM respectively, with correlation coefficients >0.9996. The HPLC method was applied to a tablet dosage form in which the analytes were successfully quantified, with good recovery values and no interfering peaks from the excipients.
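The three-step gradient described in this abstract can be sketched as a piecewise-linear program. The function below is our own illustration (the function name and the assumption of linear ramps between the stated time points are ours, not from the paper):

```python
def acetonitrile_fraction(t_min):
    """Percent acetonitrile in the mobile phase at time t (minutes),
    following the three-step gradient described in the abstract:
    step 1: hold at 2% from 0 to 2 min;
    step 2: linear ramp from 2% to 20% between 2 and 12 min;
    step 3: linear return from 20% to 2% between 12 and 20 min."""
    if t_min < 0 or t_min > 20:
        raise ValueError("gradient program runs from 0 to 20 min")
    if t_min <= 2:
        return 2.0
    if t_min <= 12:
        return 2.0 + (20.0 - 2.0) * (t_min - 2) / 10
    return 20.0 - (20.0 - 2.0) * (t_min - 12) / 8

# The phosphate buffer (pH 3.3) makes up the remainder of the mobile phase.
for t in (0, 2, 7, 12, 20):
    print(t, round(acetonitrile_fraction(t), 1))
```

Encoding a gradient this way makes it easy to check that the program is continuous at each breakpoint before transferring it to instrument software.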

  4. Contribution to the asymptotic estimation of the global error of single step numerical integration methods. Application to the simulation of electric power networks; Contribution a l'estimation asymptotique de l'erreur globale des methodes d'integration numerique a un pas. Application a la simulation des reseaux electriques

    Energy Technology Data Exchange (ETDEWEB)

    Aid, R.

    1998-01-07

This work arises from an industrial problem of validating numerical solutions of ordinary differential equations modeling power systems. The problem is solved using asymptotic estimators of the global error. Four techniques are studied: the Richardson estimator (RS), Zadunaisky's technique (ZD), integration of the variational equation (EV), and solving for the correction (SC). We give some precisions on the relative order of SC with respect to the order of the numerical method. A new variant of ZD that uses the modified equation is proposed. In the case of variable step size, it is shown that, under suitable restrictions on the step-size selection, ZD and SC remain valid. Moreover, some Runge-Kutta methods are shown to need weaker hypotheses on the step sizes to exhibit a valid order of convergence for ZD and SC. Numerical tests conclude this analysis, and industrial cases are given. Finally, an algorithm is proposed that avoids the a priori specification of the integration path for complex-time differential equations. (author)
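Of the four estimators, the Richardson (RS) technique is the simplest to sketch: integrate once with step h and once with h/2, then scale the difference by the method's order. The code below is our own illustration on a toy problem with explicit Euler (order 1), not the thesis's implementation:

```python
import math

def euler(f, y0, t_end, n):
    # Explicit Euler with n uniform steps (order p = 1).
    h = t_end / n
    y = y0
    for i in range(n):
        y += h * f(i * h, y)
    return y

def richardson_error(f, y0, t_end, n, p=1):
    """Richardson (RS) estimate of the global error of the n-step run:
    e_h ~ (y_h - y_{h/2}) / (1 - 2**-p) for a method of order p."""
    y_h = euler(f, y0, t_end, n)
    y_h2 = euler(f, y0, t_end, 2 * n)
    return (y_h - y_h2) / (1.0 - 2.0 ** -p)

f = lambda t, y: -y            # test problem y' = -y, exact y(1) = e^-1
est = richardson_error(f, 1.0, 1.0, 100)
true = euler(f, 1.0, 1.0, 100) - math.exp(-1.0)
print(est, true)               # the estimate tracks the true global error
```

For this problem the RS estimate agrees with the true global error to several digits, which is the asymptotic behavior the thesis analyzes in far more general settings.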

  5. Stability analysis and time-step limits for a Monte Carlo Compton-scattering method

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Warsa, James S.; Lowrie, Robert B.

    2010-01-01

    A Monte Carlo method for simulating Compton scattering in high energy density applications has been presented that models the photon-electron collision kinematics exactly [E. Canfield, W.M. Howard, E.P. Liang, Inverse Comptonization by one-dimensional relativistic electrons, Astrophys. J. 323 (1987) 565]. However, implementing this technique typically requires an explicit evaluation of the material temperature, which can lead to unstable and oscillatory solutions. In this paper, we perform a stability analysis of this Monte Carlo method and develop two time-step limits that avoid undesirable behavior. The first time-step limit prevents instabilities, while the second, more restrictive time-step limit avoids both instabilities and nonphysical oscillations. With a set of numerical examples, we demonstrate the efficacy of these time-step limits.

  6. Method of making stepped photographic density standards of radiographic photographs

    International Nuclear Information System (INIS)

    Borovin, I.V.; Kondina, M.A.

    1987-01-01

    In industrial radiography practice the need often arises for a prompt evaluation of the photographic density of an x-ray film. A method of making stepped photographic density standards for industrial radiography by contact printing from a negative is described. The method is intended for industrial radiation flaw detection laboratories not having specialized sensitometric equipment

  7. Synthesis and characterization of copper nanofluid by a novel one-step method

    International Nuclear Information System (INIS)

    Kumar, S. Ananda; Meenakshi, K. Shree; Narashimhan, B.R.V.; Srikanth, S.; Arthanareeswaran, G.

    2009-01-01

    This paper presents a novel one-step method for the preparation of stable, non-agglomerated copper nanofluids by reducing copper sulphate pentahydrate with sodium hypophosphite as reducing agent in ethylene glycol as base fluid by means of conventional heating. This is an in situ, one-step method which gives high yield of product with less time consumption. The characterization of the nanofluid is done by particle size analyzer, X-ray diffraction topography, UV-vis analysis and Fourier transform infrared spectroscopy (FT-IR) followed by the study of thermal conductivity of nanofluid by the transient hot wire method

  8. The Validation of NAA Method Used as Test Method in Serpong NAA Laboratory

    International Nuclear Information System (INIS)

    Rina-Mulyaningsih, Th.

    2004-01-01

NAA is a non-standard testing method, so the testing laboratory must validate the method it uses to ensure and confirm that it is suitable for the application. The NAA method has been validated for the parameters of accuracy, precision, repeatability and selectivity. NIST 1573a Tomato Leaves, NIES 10C unpolished rice flour and standard elements were used in this testing program. Testing with NIST 1573a showed that Na, Zn, Al and Mn met the acceptance criteria for accuracy and precision, whereas Co was rejected. Testing with NIES 10C showed that Na and Zn met the acceptance criteria for accuracy and precision, but Mn was rejected. The selectivity test showed that the quantifiable amount is between 0.1 and 2.5 μg, depending on the element. (author)
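The per-element accept/reject decision described above can be sketched as a simple check against a certified reference value. The thresholds and the 2-sigma combined-uncertainty criterion below are illustrative assumptions, not the laboratory's actual acceptance criteria:

```python
def accepted(measured, certified, u_meas, u_cert, rel_bias_limit=0.10):
    """Toy acceptance check in the spirit of the validation above:
    an element passes if its relative bias against the certified value
    is within rel_bias_limit AND the bias is covered by the combined
    uncertainty at 2 sigma. Thresholds here are illustrative only."""
    bias = abs(measured - certified)
    rel_ok = bias / certified <= rel_bias_limit
    unc_ok = bias <= 2.0 * (u_meas ** 2 + u_cert ** 2) ** 0.5
    return rel_ok and unc_ok

# Hypothetical Zn result vs. a certified reference value (mg/kg):
print(accepted(30.9, 30.0, u_meas=0.6, u_cert=0.5))   # passes
print(accepted(36.0, 30.0, u_meas=0.6, u_cert=0.5))   # rejected on bias
```

Proficiency-test scoring schemes (e.g., in ALMERA exercises) combine bias and uncertainty checks in a broadly similar way, though the exact criteria differ.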

  9. Lagrangian fractional step method for the incompressible Navier--Stokes equations on a periodic domain

    International Nuclear Information System (INIS)

    Boergers, C.; Peskin, C.S.

    1987-01-01

In the Lagrangian fractional step method introduced in this paper, the fluid velocity and pressure are defined on a collection of N fluid markers. At each time step, these markers are used to generate a Voronoi diagram, and this diagram is used to construct finite-difference operators corresponding to the divergence, gradient, and Laplacian. The splitting of the Navier–Stokes equations leads to discrete Helmholtz and Poisson problems, which we solve using a two-grid method. The nonlinear convection terms are modeled simply by the displacement of the fluid markers. We have implemented this method on a periodic domain in the plane. We describe an efficient algorithm for the numerical construction of periodic Voronoi diagrams, and we report numerical results which indicate that the fractional step method is convergent of first order. The overall work per time step is proportional to N log N. copyright 1987 Academic Press, Inc

  10. Validation of a Step Detection Algorithm during Straight Walking and Turning in Patients with Parkinson’s Disease and Older Adults Using an Inertial Measurement Unit at the Lower Back

    Directory of Open Access Journals (Sweden)

    Minh H. Pham

    2017-09-01

Introduction: Inertial measurement units (IMUs) positioned on various body locations allow detailed gait analysis even under unconstrained conditions. From a medical perspective, the assessment of vulnerable populations is of particular relevance, especially in the daily-life environment. Gait analysis algorithms need thorough validation, as many chronic diseases show specific and even unique gait patterns. The aim of this study was therefore to validate an acceleration-based step detection algorithm for patients with Parkinson's disease (PD) and older adults in both a lab-based and a home-like environment. Methods: In this prospective observational study, data were captured from a single 6-degrees-of-freedom IMU (APDM; 3DOF accelerometer and 3DOF gyroscope) worn on the lower back. Detection of heel strike (HS) and toe off (TO) on a treadmill was validated against an optoelectronic system (Vicon) (11 PD patients and 12 older adults). A second independent validation study in the home-like environment was performed against video observation (20 PD patients and 12 older adults) and included step counting during turning and non-turning, defined with a previously published algorithm. Results: A continuous wavelet transform (cwt)-based algorithm was developed for step detection with very high agreement with the optoelectronic system. HS detection in PD patients/older adults, respectively, reached 99/99% accuracy. Similar results were obtained for TO (99/100%). In HS detection, Bland–Altman plots showed a mean difference of 0.002 s [95% confidence interval (CI) −0.09 to 0.10] between the algorithm and the optoelectronic system. The Bland–Altman plot for TO detection showed a mean difference of 0.00 s (95% CI −0.12 to 0.12). In the home-like assessment, the algorithm for detection of the occurrence of steps during turning reached 90% (PD patients)/90% (older adults) sensitivity, 83/88% specificity, and 88/89% accuracy. The detection of steps during non-turning phases
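The Bland–Altman statistics quoted in this abstract (a mean difference with a 95% interval) can be computed as below. This is a generic sketch with invented timing data, not the study's dataset:

```python
import statistics

def bland_altman(a, b):
    """Mean difference and 95% limits of agreement (mean +/- 1.96 SD)
    between two paired measurement series, as used to compare an
    algorithm's event times against an optoelectronic reference."""
    diffs = [x - y for x, y in zip(a, b)]
    mean = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return mean, (mean - 1.96 * sd, mean + 1.96 * sd)

# Hypothetical heel-strike times (seconds): algorithm vs. reference.
algo = [1.02, 1.51, 2.03, 2.49, 3.01]
ref  = [1.00, 1.50, 2.00, 2.50, 3.00]
mean_diff, (lo, hi) = bland_altman(algo, ref)
print(round(mean_diff, 3), round(lo, 3), round(hi, 3))
```

A mean difference near zero with narrow limits of agreement, as reported in the study, indicates that neither system systematically leads or lags the other.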

  11. Toward a Unified Validation Framework in Mixed Methods Research

    Science.gov (United States)

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  12. Accurate step-FMCW ultrasound ranging and comparison with pulse-echo signaling methods

    Science.gov (United States)

    Natarajan, Shyam; Singh, Rahul S.; Lee, Michael; Cox, Brian P.; Culjat, Martin O.; Grundfest, Warren S.; Lee, Hua

    2010-03-01

This paper presents a method for high-frequency ultrasound ranging based on stepped frequency-modulated continuous waves (FMCW), potentially capable of producing a higher signal-to-noise ratio (SNR) than traditional pulse-echo signaling. In current ultrasound systems, the use of higher frequencies (10-20 MHz) to enhance resolution lowers signal quality due to frequency-dependent attenuation. The proposed ultrasound signaling format, step-FMCW, is well known in the radar community and features lower peak power, wider dynamic range, a lower noise figure and simpler electronics in comparison to pulse-echo systems. In pulse-echo ultrasound ranging, distances are calculated from the transit times between a pulse and its subsequent echoes. In step-FMCW ultrasonic ranging, the phase and magnitude differences at stepped frequencies are used to sample the frequency domain. Thus, by taking the inverse Fourier transform, a comprehensive range profile is recovered that has increased immunity to noise over conventional ranging methods. Step-FMCW and pulse-echo waveforms were created using custom-built hardware consisting of an arbitrary waveform generator and a dual-channel superheterodyne receiver, providing high SNR and, in turn, accuracy in detection.
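The step-FMCW principle described here (sample the frequency domain at stepped frequencies, then inverse-Fourier-transform to obtain a range profile) can be sketched numerically. Everything below is an idealized illustration: a single perfect reflector, no noise, an assumed sound speed of 1540 m/s, and hypothetical values for the number of steps and the step size:

```python
import cmath

C = 1540.0          # assumed speed of sound in tissue, m/s
N = 64              # number of frequency steps (hypothetical)
DF = 10e3           # frequency step, Hz; unambiguous range = C / (2 * DF)

def range_profile(target_range_m):
    """Simulate step-FMCW samples for one ideal reflector and recover
    its range from the peak of the inverse-DFT range profile."""
    tau = 2.0 * target_range_m / C                      # round-trip delay
    samples = [cmath.exp(-2j * cmath.pi * (k * DF) * tau) for k in range(N)]
    # Inverse DFT: bin n corresponds to delay n / (N * DF).
    profile = [abs(sum(s * cmath.exp(2j * cmath.pi * k * n / N)
                       for k, s in enumerate(samples))) for n in range(N)]
    peak = max(range(N), key=profile.__getitem__)
    return peak * C / (2.0 * N * DF)                    # bin -> metres

print(range_profile(0.048))   # recovers ~48 mm, within one ~1.2 mm range bin
```

The range-bin width C/(2·N·ΔF) shows the trade-off the paper exploits: more frequency steps or a wider swept bandwidth sharpens the recovered range profile.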

  13. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples taken from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern: estimating whether an approximation over- or under-fits the original model; invalidating an approximation; and ranking possible approximations by their quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Validation of an HPLC–UV method for the determination of digoxin residues on the surface of manufacturing equipment

    Directory of Open Access Journals (Sweden)

    ZORAN B. TODOROVIĆ

    2009-09-01

In the pharmaceutical industry, an important step consists in the removal of possible drug residues from the equipment and areas involved. Cleaning procedures must be validated, and methods to determine trace amounts of drugs therefore have to be considered with special attention. An HPLC–UV method for the determination of digoxin residues on stainless steel surfaces was developed and validated in order to control a cleaning procedure. Cotton swabs moistened with methanol were used to remove any drug residues from the stainless steel surfaces, and gave recoveries of 85.9, 85.2 and 78.7% at three concentration levels. The precision of the results, reported as the relative standard deviation (RSD), was below 6.3%. The method was validated over a concentration range of 0.05–12.5 µg mL-1. Low quantities of drug residues were determined by HPLC–UV using a Symmetry C18 column (150 × 4.6 mm, 5 µm) at 20 °C with an acetonitrile–water (28:72, v/v) mobile phase at a flow rate of 1.1 mL min-1 and an injection volume of 100 µL, with detection at 220 nm. A simple, selective and sensitive HPLC–UV assay for the determination of digoxin residues on stainless steel was developed, validated and applied.

  15. A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia

    Science.gov (United States)

    Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

    2012-12-01

We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posterior probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and the mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posterior PDF with them and keeps most of their merits, while overcoming their convergence difficulty when large numbers of low-quality data are used and greatly improving the convergence rate using optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are inverted for using the least squares method, initially without a positivity constraint and then damped to a physically reasonable range. This first-step MAP inversion brings the inversion close to the 'true' solution quickly and jumps over local-maximum regions in the high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on the slip parameters using the Monte Carlo Inversion (MCI) technique and all parameters obtained from step one as the initial solution. The slip artifacts are then eliminated from the slip models in the third-step MAP inversion, with the fault geometry parameters fixed. We first used a designed model with a 45-degree dipping angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of

  16. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor-fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Such methods can only predict the most probable faulty sensors, subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. It builds on the concept of analytical redundancy relations (ARRs).
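A toy illustration of the analytical-redundancy idea (not the paper's algorithm): with three sensors that should agree on one quantity, each pairwise consistency check is a redundant relation, and the pattern of violated relations logically isolates a single faulty sensor without any failure probabilities. All names and the tolerance are ours:

```python
def faulty_sensors(a, b, c, tol=0.5):
    """Toy analytical-redundancy check for three sensors measuring the
    same quantity. Each residual is one redundant relation; the pattern
    of violated relations logically isolates the bad sensor."""
    r_ab = abs(a - b) > tol
    r_bc = abs(b - c) > tol
    r_ac = abs(a - c) > tol
    signature = {
        (True, False, True): ["a"],    # both relations involving a fail
        (True, True, False): ["b"],
        (False, True, True): ["c"],
        (False, False, False): [],
    }
    return signature.get((r_ab, r_bc, r_ac), ["multiple/undetermined"])

print(faulty_sensors(10.0, 10.1, 9.9))   # consistent readings: no fault
print(faulty_sensors(12.0, 10.1, 9.9))   # sensor a is logically isolated
```

Real ARR-based diagnosis derives the relations from a system model (not simple triplication), but the isolation logic, matching observed residual signatures against fault signatures, is the same.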

  17. Methods for growth of relatively large step-free SiC crystal surfaces

    Science.gov (United States)

    Neudeck, Philip G. (Inventor); Powell, J. Anthony (Inventor)

    2002-01-01

A method for growing arrays of large-area, device-size films of step-free (i.e., atomically flat) SiC surfaces for semiconductor electronic device applications is disclosed. This method utilizes a lateral growth process that better overcomes the effect of extended defects in the seed crystal substrate, which limited the step-free area achievable by prior-art processes. The step-free SiC surface is particularly suited to the heteroepitaxial growth of 3C (cubic) SiC, AlN, and GaN films used for the fabrication of both surface-sensitive devices (i.e., surface-channel field-effect transistors such as HEMTs and MOSFETs) and high-electric-field devices (pn diodes and other solid-state power-switching devices) that are sensitive to extended crystal defects.

  18. Validated modified Lycopodium spore method development for ...

    African Journals Online (AJOL)

A validated modified Lycopodium spore method has been developed for simple and rapid quantification of powdered herbal drugs. The Lycopodium spore method was performed on the ingredients of Shatavaryadi churna, an ayurvedic formulation used as an immunomodulator, galactagogue, aphrodisiac and rejuvenator. Estimation of ...

  19. A step-by-step approach to improve data quality when using commercial business lists to characterize retail food environments.

    Science.gov (United States)

    Jones, Kelly K; Zenk, Shannon N; Tarlov, Elizabeth; Powell, Lisa M; Matthews, Stephen A; Horoi, Irina

    2017-01-07

Food environment characterization in health studies often requires data on the location of food stores and restaurants. While commercial business lists are commonly used as data sources for such studies, the current literature provides little guidance on how to use validation study results to decide which commercial business list to use and how to maximize the accuracy of those lists. Using data from a retrospective cohort study [Weight And Veterans' Environments Study (WAVES)], we (a) explain how validity and bias information from existing validation studies (count accuracy, classification accuracy, locational accuracy, as well as potential bias by neighborhood racial/ethnic composition, economic characteristics, and urbanicity) was used to determine which commercial business listing to purchase for retail food outlet data and (b) describe the methods used to maximize the quality of the data and the results of this approach. We developed data improvement methods based on existing validation studies. These methods included purchasing records from commercial business lists (InfoUSA and Dun and Bradstreet) based on store/restaurant names as well as standard industrial classification (SIC) codes, reclassifying records by store type, improving the geographic accuracy of records, and deduplicating records. We examined the impact of these procedures on food outlet counts in US census tracts. After cleaning and deduplicating, our strategy resulted in a 17.5% reduction in the count of valid food stores relative to those purchased from InfoUSA and a 5.6% reduction in valid restaurant counts relative to those purchased from Dun and Bradstreet. Locational accuracy was improved for 7.5% of records by applying street addresses from subsequent years to records with post-office (PO) box addresses. In total, up to 83% of US census tracts annually experienced a change (either positive or negative) in the count of retail food outlets between the initial purchase and the final dataset. Our study
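The deduplication step described above can be sketched as keying records on a normalized name-plus-address string. Real record linkage is fuzzier (edit distance, geocoding), and all field names below are our own illustration:

```python
import re

def normalize(record):
    """Crude normalization key for deduplicating food-outlet records:
    lowercase, strip punctuation and extra whitespace from name + address.
    A sketch of the deduplication step only; real matching is fuzzier."""
    text = f"{record['name']} {record['address']}".lower()
    text = re.sub(r"[^a-z0-9 ]", "", text)
    return re.sub(r"\s+", " ", text).strip()

def deduplicate(records):
    # Keep the first record seen for each normalized key.
    seen, kept = set(), []
    for rec in records:
        key = normalize(rec)
        if key not in seen:
            seen.add(key)
            kept.append(rec)
    return kept

stores = [
    {"name": "Joe's Grocery", "address": "12 Main St."},
    {"name": "JOES GROCERY",  "address": "12 Main St"},
    {"name": "Corner Deli",   "address": "4 Oak Ave"},
]
print(len(deduplicate(stores)))  # the two "Joe's" listings collapse to one
```

Counting outlets per census tract before and after such a pass is exactly the kind of sensitivity check the study reports (the 17.5% and 5.6% reductions).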

  20. Development and validation of a rapid resolution liquid chromatography method for the screening of dietary plant isoprenoids: carotenoids, tocopherols and chlorophylls.

    Science.gov (United States)

    Stinco, Carla M; Benítez-González, Ana M; Hernanz, Dolores; Vicario, Isabel M; Meléndez-Martínez, Antonio J

    2014-11-28

A rapid resolution liquid chromatography (RRLC) method was developed and validated for the simultaneous determination of nine carotenoid compounds (violaxanthin, lutein, zeaxanthin, β-cryptoxanthin, α-carotene, β-carotene, lycopene, phytoene, phytofluene), four tocopherols and four chlorophylls and derivatives (chlorophylls and pheophytins). The methodology consisted of a micro-extraction procedure, with or without saponification, and subsequent analysis by RRLC. Limits of detection were established both when the saponification step was performed and when it was omitted. The recovery of the method without the saponification step ranged from 92% to 107%, whilst that with saponification ranged from 60% for α-tocopherol to 82% for β-carotene. Finally, the applicability of the method was demonstrated by the identification and quantification of isoprenoids in different samples. The methodology is appropriate for the high-throughput screening of dietary isoprenoids in fruits and vegetables. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method, called "Patient Recursive Survival Peeling", is a rule-induction method that makes use of specific peeling criteria such as the hazard ratio or log-rank statistic. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e., decision-box) and survival estimation. This alternative technique, called "combined" cross-validation, combines test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
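The "combined" idea, pooling held-out predictions across folds and then computing a single statistic, can be sketched for a generic model. The survival-specific peeling criteria are not reproduced here; the toy mean-predictor, fold assignment by index, and all names are our own illustration:

```python
import statistics

def combined_cv_error(xs, ys, k, fit, predict):
    """'Combined' cross-validation sketch: pool the held-out predictions
    from every fold, then compute ONE error statistic on the pooled set
    (rather than averaging k per-fold statistics)."""
    pooled = []
    for fold in range(k):
        test_idx = [i for i in range(len(xs)) if i % k == fold]
        train_idx = [i for i in range(len(xs)) if i % k != fold]
        model = fit([xs[i] for i in train_idx], [ys[i] for i in train_idx])
        pooled += [(predict(model, xs[i]), ys[i]) for i in test_idx]
    return statistics.fmean((p - y) ** 2 for p, y in pooled)

# Toy model: predict the training-set mean of y for every x.
fit = lambda X, Y: statistics.fmean(Y)
predict = lambda m, x: m
xs = list(range(10)); ys = [2.0 * x for x in xs]
print(combined_cv_error(xs, ys, k=5, fit=fit, predict=predict))
```

Pooling before scoring matters most when the per-fold statistic is unstable on small test sets, which is precisely the situation with censored survival data and narrow "boxes".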

  2. Method Validation Procedure in Gamma Spectroscopy Laboratory

    International Nuclear Information System (INIS)

    El Samad, O.; Baydoun, R.

    2008-01-01

The present work describes the methodology followed for the application of the ISO 17025 standard in the gamma spectroscopy laboratory at the Lebanese Atomic Energy Commission, including the management and technical requirements. A set of documents, written procedures and records was prepared to fulfil the management requirements. For the technical requirements, internal method validation was performed through the estimation of trueness, repeatability, minimum detectable activity and combined uncertainty; participation in IAEA proficiency tests assures external method validation, especially as the gamma spectroscopy laboratory is a member of the ALMERA network (Analytical Laboratories for the Measurements of Environmental Radioactivity). Some of these results are presented in this paper. (author)

  3. Aerial robot intelligent control method based on back-stepping

    Science.gov (United States)

    Zhou, Jian; Xue, Qian

    2018-05-01

The aerial robot is characterized by strong nonlinearity, high coupling and parameter uncertainty; a self-adaptive back-stepping control method based on a neural network is proposed in this paper. The uncertain part of the aerial robot model is compensated online by a Cerebellar Model Articulation Controller neural network, and robust control terms are designed to overcome the uncertainty error of the system during online learning. At the same time, a particle swarm algorithm is used to optimize and fix parameters so as to improve the dynamic performance, and the control law is obtained by back-stepping recursion. Simulation results show that the designed control law has the desired attitude-tracking performance and good robustness in the case of uncertainties and large errors in the model parameters.
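The paper's controller (CMAC compensation plus particle-swarm tuning) is not reproduced here. As a minimal illustration of the back-stepping recursion itself, consider the textbook double-integrator chain x1' = x2, x2' = u; the gains and initial state below are arbitrary choices of ours:

```python
def backstepping_u(x1, x2, k1=2.0, k2=2.0):
    """Backstepping law for the toy chain x1' = x2, x2' = u:
    the virtual control alpha = -k1*x1 stabilizes x1; the error
    z2 = x2 - alpha is then driven to zero by
    u = -x1 - k2*z2 + d(alpha)/dt = -x1 - k2*(x2 + k1*x1) - k1*x2,
    which makes V = 0.5*x1**2 + 0.5*z2**2 a Lyapunov function."""
    z2 = x2 + k1 * x1
    return -x1 - k2 * z2 - k1 * x2

# Simulate with explicit Euler from a disturbed initial state.
x1, x2, dt = 1.0, -0.5, 0.001
for _ in range(20000):          # 20 simulated seconds
    u = backstepping_u(x1, x2)
    x1, x2 = x1 + dt * x2, x2 + dt * u
print(abs(x1) < 1e-3 and abs(x2) < 1e-3)  # state has converged to the origin
```

In the paper, the same recursion runs over the attitude dynamics, with the CMAC network replacing the exactly known model terms used in this sketch.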

  4. Method validation for strobilurin fungicides in cereals and fruit

    DEFF Research Database (Denmark)

    Christensen, Hanne Bjerre; Granby, Kit

    2001-01-01

Strobilurins are a new class of fungicides that are active against a broad spectrum of fungi. In the present work a GC method for the analysis of strobilurin fungicides was validated. The method was based on extraction with ethyl acetate/cyclohexane, clean-up by gel permeation chromatography (GPC) and determination of the content by gas chromatography (GC) with electron-capture (EC), nitrogen/phosphorus (NP) and mass spectrometric (MS) detection. Three strobilurins, azoxystrobin, kresoxim-methyl and trifloxystrobin, were validated on three matrices: wheat, apple and grapes. The validation was based

  5. Development and validation of a simple method for the extraction of human skin melanocytes.

    Science.gov (United States)

    Wang, Yinjuan; Tissot, Marion; Rolin, Gwenaël; Muret, Patrice; Robin, Sophie; Berthon, Jean-Yves; He, Li; Humbert, Philippe; Viennet, Céline

    2018-03-21

Primary melanocytes in culture are useful models for studying epidermal pigmentation and the efficacy of melanogenic compounds, or for developing advanced therapy medicinal products. Cell extraction is an inevitable and critical step in the establishment of cell cultures. Many enzymatic methods for extracting and growing cells derived from human skin, such as melanocytes, are described in the literature. They are usually based on two enzymatic steps, Trypsin in combination with Dispase, to separate the dermis from the epidermis and subsequently provide a suspension of epidermal cells. The objective of this work was to develop and validate an extraction method for human skin melanocytes that is simple, effective and applicable to smaller skin samples, while avoiding animal-derived reagents. The TrypLE™ product was tested on very limited sizes of human skin, equivalent to multiple 3-mm punch biopsies, and was compared to the Trypsin/Dispase enzymes. The functionality of the extracted cells was evaluated by analysis of viability, morphology and melanin production. In comparison with the Trypsin/Dispase incubation method, the main advantages of the TrypLE™ incubation method were the easier separation of dermis from epidermis and the larger population of melanocytes after extraction. Both protocols preserved the morphological and biological characteristics of the melanocytes. The minimum size of skin sample that allowed the extraction of functional cells was 6 × 3-mm punch biopsies (i.e., 42 mm²), whatever the method used. In conclusion, this new procedure based on TrypLE™ incubation is suitable for the establishment of optimal primary melanocyte cultures for clinical applications and research.

  6. 76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media

    Science.gov (United States)

    2011-05-18

    ... . d m = The mean of the paired sample differences. n = Total number of paired samples. 7.4.2 t Test... being compared to a validated test method as part of the Method 301 validation and an audit sample for... tighten the acceptance criteria for the precision of candidate alternative test methods. One commenter...

  7. The Value of Qualitative Methods in Social Validity Research

    Science.gov (United States)

    Leko, Melinda M.

    2014-01-01

    One quality indicator of intervention research is the extent to which the intervention has a high degree of social validity, or practicality. In this study, I drew on Wolf's framework for social validity and used qualitative methods to ascertain five middle schoolteachers' perceptions of the social validity of System 44®--a phonics-based reading…

  8. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  9. A Multiscale Finite Element Model Validation Method of Composite Cable-Stayed Bridge Based on Structural Health Monitoring System

    Directory of Open Access Journals (Sweden)

    Rumian Zhong

    2015-01-01

    A two-step response surface method for multiscale finite element model (FEM) updating and validation is presented with respect to Guanhe Bridge, a composite cable-stayed bridge on National Highway G15 in China. Firstly, the state equations of both multiscale and single-scale FEM are established based on the basic equation in structural dynamic mechanics to update the multiscale coupling parameters and structural parameters. Secondly, based on the measured data from the structural health monitoring (SHM) system, a Monte Carlo simulation is employed to analyze the uncertainty quantification and transmission, where the uncertainties of the multiscale FEM and measured data were considered. The results indicate that the relative errors between the calculated and measured frequencies are less than 2%, and the overlap ratio indexes of each modal frequency are larger than 80% without the average absolute value of relative errors. These demonstrate that the proposed method can be applied to validate the multiscale FEM, and the validated FEM can reflect the current conditions of the real bridge; thus it can be used as the basis for bridge health monitoring, damage prognosis (DP), and safety prognosis (SP).

  10. Development and validation of a new fallout transport method using variable spectral winds

    International Nuclear Information System (INIS)

    Hopkins, A.T.

    1984-01-01

    A new method was developed to incorporate variable winds into fallout transport calculations. The method uses spectral coefficients derived by the National Meteorological Center. Wind vector components are computed with the coefficients along the trajectories of falling particles. Spectral winds are used in the two-step method to compute dose rate on the ground, downwind of a nuclear cloud. First, the hotline is located by computing trajectories of particles from an initial, stabilized cloud, through spectral winds to the ground. The connection of particle landing points is the hotline. Second, dose rate on and around the hotline is computed by analytically smearing the falling cloud's activity along the ground. The feasibility of using spectral winds for fallout particle transport was validated by computing Mount St. Helens ashfall locations and comparing calculations to fallout data. In addition, an ashfall equation was derived for computing volcanic ash mass/area on the ground. Ashfall data and the ashfall equation were used to back-calculate an aggregated particle size distribution for the Mount St. Helens eruption cloud

  11. SPAR-H Step-by-Step Guidance

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley; Dana L. Kelly; Ronald L. Boring; William J. Galyean

    2012-06-01

    Step-by-step guidance was developed recently at Idaho National Laboratory for the US Nuclear Regulatory Commission on the use of the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method for quantifying Human Failure Events (HFEs). This work was done to address SPAR-H user needs, specifically requests for additional guidance on the proper application of various aspects of the methodology. This paper overviews the steps of the SPAR-H analysis process and highlights some of the most important insights gained during the development of the step-by-step directions. This supplemental guidance for analysts is applicable when plant-specific information is available, and goes beyond the general guidance provided in existing SPAR-H documentation. The steps highlighted in this paper are: Step-1, Categorizing the HFE as Diagnosis and/or Action; Step-2, Rate the Performance Shaping Factors; Step-3, Calculate PSF-Modified HEP; Step-4, Accounting for Dependence, and; Step-5, Minimum Value Cutoff.
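    Step-3 above, calculating the PSF-modified HEP, can be sketched in a few lines. This is an illustrative reading of the published SPAR-H quantification rules rather than the NRC worksheets themselves; the nominal HEP and PSF multipliers passed in below are assumed example values.

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """Step-3 of SPAR-H: PSF-modified human error probability.

    Multiplies the nominal HEP by the product of the performance-shaping-
    factor multipliers.  When three or more PSFs are negative (multiplier
    greater than 1), SPAR-H applies an adjustment factor so the result
    stays below 1.0.
    """
    composite = 1.0
    negative = 0
    for m in psf_multipliers:
        composite *= m
        if m > 1:
            negative += 1
    if negative >= 3:
        # SPAR-H adjustment: HEP = NHEP*C / (NHEP*(C - 1) + 1)
        return nominal_hep * composite / (nominal_hep * (composite - 1) + 1)
    return min(nominal_hep * composite, 1.0)

# Diagnosis HFE (nominal HEP 0.01) with two negative PSFs:
hep = spar_h_hep(0.01, [2, 5, 1, 1, 1, 1, 1, 1])
```

The adjustment branch is what keeps heavily degraded contexts (many negative PSFs) from producing probabilities above 1.0.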

  12. Two-step method for creating a gastric tube during laparoscopic-thoracoscopic Ivor-Lewis esophagectomy.

    Science.gov (United States)

    Liu, Yu; Li, Ji-Jia; Zu, Peng; Liu, Hong-Xu; Yu, Zhan-Wu; Ren, Yi

    2017-12-07

    To introduce a two-step method for creating a gastric tube during laparoscopic-thoracoscopic Ivor-Lewis esophagectomy and assess its clinical application. One hundred and twenty-two patients with middle or lower esophageal cancer who underwent laparoscopic-thoracoscopic Ivor-Lewis esophagectomy at Liaoning Cancer Hospital and Institute from March 2014 to March 2016 were included in this study, and divided into two groups based on the procedure used for creating a gastric tube. One group used a two-step method for creating a gastric tube, and the other group used the conventional method. The two groups were compared regarding the operating time, surgical complications, and number of stapler cartridges used. The mean operating time was significantly shorter in the two-step method group than in the conventional method group [238 (179-293) min vs 272 (189-347) min, P ...]. The two-step method for creating a gastric tube during laparoscopic-thoracoscopic Ivor-Lewis esophagectomy has the advantages of simple operation, minimal damage to the tubular stomach, and reduced use of stapler cartridges.

  13. A Three Step Explicit Method for Direct Solution of Third Order ...

    African Journals Online (AJOL)

    This study produces a three step discrete Linear Multistep Method for Direct solution of third order initial value problems of ordinary differential equations of the form y'''= f(x,y,y',y''). Taylor series expansion technique was adopted in the development of the method. The differential system from the basis polynomial function to ...

  14. Two-step calibration method for multi-algorithm score-based face recognition systems by minimizing discrimination loss

    NARCIS (Netherlands)

    Susyanto, N.; Veldhuis, R.N.J.; Spreeuwers, L.J.; Klaassen, C.A.J.; Fierrez, J.; Li, S.Z.; Ross, A.; Veldhuis, R.; Alonso-Fernandez, F.; Bigun, J.

    2016-01-01

    We propose a new method for combining multi-algorithm score-based face recognition systems, which we call the two-step calibration method. Typically, algorithms for face recognition systems produce dependent scores. The two-step method is based on parametric copulas to handle this dependence. Its

  15. Design of a Two-Step Calibration Method of Kinematic Parameters for Serial Robots

    Science.gov (United States)

    WANG, Wei; WANG, Lei; YUN, Chao

    2017-03-01

    Serial robots are used to handle workpieces with large dimensions, and calibrating kinematic parameters is one of the most efficient ways to improve their accuracy. Many models have been set up to investigate how many kinematic parameters can be identified to meet the minimality principle, but the base frame and the kinematic parameters are usually calibrated together in a single step. A two-step method of calibrating kinematic parameters is proposed to improve the accuracy of the robot's base frame and kinematic parameters. The forward kinematics, described with respect to the measuring coordinate frame, are established based on the product-of-exponential (POE) formula. In the first step, the robot's base coordinate frame is calibrated in unit-quaternion form. The errors of both the robot's reference configuration and the base coordinate frame's pose are equivalently transformed into zero-position errors of the robot's joints. A simplified model of the robot's positioning error is established in second-order explicit expressions. The identification model is then solved by the least-squares method, requiring only measured position coordinates. The complete subtask of calibrating the robot's 39 kinematic parameters is finished in the second step. A group of calibration experiments showed that the proposed two-step calibration method improves the average absolute positioning accuracy of industrial robots to 0.23 mm. This paper shows that the robot's base frame should be calibrated before its kinematic parameters in order to improve absolute positioning accuracy.
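    The least-squares identification in the second step can be illustrated with a toy linear model: position residuals stacked over many measured poses are regressed on an identification Jacobian to recover the parameter errors. The Jacobian, parameter vector, and noise level below are invented for illustration; the paper's actual POE-based error model is more elaborate.

```python
import numpy as np

# Sketch of least-squares parameter identification: residuals = J @ dp,
# where J stacks the position sensitivity to each kinematic parameter
# over several measured configurations.  All numbers are illustrative.
rng = np.random.default_rng(0)
n_meas, n_par = 30, 6                 # 30 measured poses, 6 parameters
J = rng.normal(size=(3 * n_meas, n_par))   # 3 position residuals per pose
dp_true = np.array([0.5, -0.2, 0.1, 0.0, 0.3, -0.1])   # assumed errors
residuals = J @ dp_true + rng.normal(scale=1e-3, size=3 * n_meas)

# Solve the overdetermined system in the least-squares sense
dp_est, *_ = np.linalg.lstsq(J, residuals, rcond=None)
```

With many more equations than parameters, the estimate converges on the injected errors despite the measurement noise, which is the essence of the calibration step.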

  16. The large discretization step method for time-dependent partial differential equations

    Science.gov (United States)

    Haras, Zigo; Taasan, Shlomo

    1995-01-01

    A new method for the acceleration of linear and nonlinear time dependent calculations is presented. It is based on the Large Discretization Step (LDS) approximation, defined in this work, which employs an extended system of low accuracy schemes to approximate a high accuracy discrete approximation to a time dependent differential operator. Error bounds on such approximations are derived. These approximations are efficiently implemented in the LDS methods for linear and nonlinear hyperbolic equations, presented here. In these algorithms the high and low accuracy schemes are interpreted as the same discretization of a time dependent operator on fine and coarse grids, respectively. Thus, a system of correction terms and corresponding equations are derived and solved on the coarse grid to yield the fine grid accuracy. These terms are initialized by visiting the fine grid once in many coarse grid time steps. The resulting methods are very general, simple to implement and may be used to accelerate many existing time marching schemes.

  17. Ion-step method for surface potential sensing of silicon nanowires

    NARCIS (Netherlands)

    Chen, S.; van Nieuwkasteele, Jan William; van den Berg, Albert; Eijkel, Jan C.T.

    2016-01-01

    This paper presents a novel stimulus-response method for surface potential sensing of silicon nanowire (Si NW) field-effect transistors. When an "ion-step" from low to high ionic strength is given as a stimulus to the gate oxide surface, an increase of double layer capacitance is therefore expected.

  18. Validation of a liquid chromatography ultraviolet method for determination of herbicide diuron and its metabolites in soil samples

    Directory of Open Access Journals (Sweden)

    ANA LUCIA S.M. FELICIO

    2016-01-01

    Diuron is one of the most widely used herbicides worldwide; it can undergo degradation producing three primary metabolites: 3,4-dichlorophenylurea, 3-(3,4-dichlorophenyl)-1-methylurea, and 3,4-dichloroaniline. Since the persistence of diuron and its by-products in ecosystems involves a risk of toxicity to the environment and human health, a reliable quantitative method for simultaneous monitoring of these compounds is required. Hence, a simple method without a preconcentration step was validated for the quantitation of diuron and its main metabolites by high performance liquid chromatography with ultraviolet detection. Separation was achieved in less than 11 minutes using a C18 column, a mobile phase composed of acetonitrile and water (45:55, v/v) at 0.86 mL min⁻¹, and detection at 254 nm. The validated method, using solid-liquid extraction followed by an isocratic chromatographic elution, proved to be specific, precise and linear (R² > 0.99), with more than 90% recovery. The method was successfully applied to quantify diuron and its by-products in soil samples collected in a sugarcane cultivation area, focusing on environmental control.
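    The two headline validation figures above, linearity (R²) and recovery, are simple computations; a minimal sketch with hypothetical calibration data (the concentrations, peak areas, and spike level below are invented, not the paper's):

```python
def linearity_r2(x, y):
    """Coefficient of determination for a least-squares calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def recovery_pct(measured, spiked):
    """Extraction recovery (%) from a spiked sample."""
    return 100.0 * measured / spiked

# Hypothetical calibration points (concentration, peak area):
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
area = [10.2, 20.1, 39.8, 80.5, 160.9]
r2 = linearity_r2(conc, area)       # acceptance criterion: R2 > 0.99
rec = recovery_pct(1.85, 2.0)       # acceptance criterion: > 90%
```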

  19. Pesticide applicators questionnaire content validation: A fuzzy delphi method.

    Science.gov (United States)

    Manakandan, S K; Rosnah, I; Mohd Ridhuan, J; Priya, R

    2017-08-01

    The most crucial step in forming a survey questionnaire is deciding on the appropriate items in a construct. Retaining irrelevant items and removing important items will certainly mislead the direction of a particular study. This article demonstrates the Fuzzy Delphi method as a scientific analysis technique for consolidating consensus agreement within a panel of experts on each item's appropriateness. This method reduces the ambiguity, diversity, and discrepancy of the opinions among the experts and hence enhances the quality of the selected items. The main purpose of this study was to obtain experts' consensus on the suitability of the preselected items on the questionnaire. The panel consisted of sixteen experts from the Occupational and Environmental Health Unit of the Ministry of Health, the Vector-borne Disease Control Unit of the Ministry of Health, and the Occupational and Safety Health Units of both public and private universities. A set of questionnaires related to noise and chemical exposure was compiled based on a literature search. There was a total of six constructs with 60 items: three constructs for knowledge, attitude, and practice of noise exposure and three for knowledge, attitude, and practice of chemical exposure. The validation process replicated the recent Fuzzy Delphi method using the concept of Triangular Fuzzy Numbers and a Defuzzification process. A 100% response rate was obtained from all sixteen experts, with an average Likert scoring of four to five. Post-FDM analysis, the first prerequisite was fulfilled with a threshold value (d) ≤ 0.2; hence all six constructs were accepted. For the second prerequisite, three items (21%) from the noise-attitude construct and four items (40%) from the chemical-practice construct had expert consensus lower than 75%, amounting to about 12% of the total items in the questionnaire. The third prerequisite was used to rank the items within the constructs by calculating the average
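    The Fuzzy Delphi screening described above can be sketched per item. The triangular-fuzzy-number (TFN) mapping for the 5-point Likert scale and the centroid defuzzification are common conventions assumed here, not taken from the article; the expert ratings are invented.

```python
import math

# Assumed 5-point Likert -> triangular fuzzy number mapping (illustrative)
TFN = {1: (0.0, 0.0, 0.2), 2: (0.0, 0.2, 0.4), 3: (0.2, 0.4, 0.6),
       4: (0.4, 0.6, 0.8), 5: (0.6, 0.8, 1.0)}

def fuzzy_delphi(ratings, d_threshold=0.2):
    """Return (mean d, fraction of experts within threshold, defuzzified score)."""
    fuzz = [TFN[r] for r in ratings]
    avg = tuple(sum(v[i] for v in fuzz) / len(fuzz) for i in range(3))
    # Vertex distance between each expert's TFN and the group-average TFN
    d = [math.sqrt(sum((v[i] - avg[i]) ** 2 for i in range(3)) / 3) for v in fuzz]
    pct_agree = sum(1 for x in d if x <= d_threshold) / len(d)
    score = sum(avg) / 3          # simple centroid defuzzification
    return sum(d) / len(d), pct_agree, score

# One item rated by eight hypothetical experts
d_mean, agree, score = fuzzy_delphi([5, 4, 5, 5, 4, 4, 5, 4])
# Item accepted if d_mean <= 0.2 (first prerequisite) and agree >= 0.75 (second)
```

The defuzzified score then ranks accepted items within a construct, mirroring the article's third prerequisite.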

  20. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, Luis; Vassileva, Emilia, E-mail: e.vasileva-veleva@iaea.org

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and

  1. Determination of methylmercury in marine sediment samples: Method validation and occurrence data

    International Nuclear Information System (INIS)

    Carrasco, Luis; Vassileva, Emilia

    2015-01-01

    Highlights: • A method for MeHg determination at trace level in marine sediments is completely validated. • Validation is performed according to ISO-17025 and Eurachem guidelines. • The extraction efficiency of four sample preparation procedures is evaluated. • The uncertainty budget is used as a tool for evaluation of main uncertainty contributors. • Comparison with independent methods yields good agreement within stated uncertainty. - Abstract: The determination of methylmercury (MeHg) in sediment samples is a difficult task due to the extremely low MeHg/THg (total mercury) ratio and species interconversion. Here, we present the method validation of a cost-effective fit-for-purpose analytical procedure for the measurement of MeHg in sediments, which is based on aqueous phase ethylation, followed by purge and trap and hyphenated gas chromatography–pyrolysis–atomic fluorescence spectrometry (GC–Py–AFS) separation and detection. Four different extraction techniques, namely acid and alkaline leaching followed by solvent extraction and evaporation, microwave-assisted extraction with 2-mercaptoethanol, and acid leaching, solvent extraction and back extraction into sodium thiosulfate, were examined regarding their potential to selectively extract MeHg from estuarine sediment IAEA-405 certified reference material (CRM). The procedure based on acid leaching with HNO₃/CuSO₄, solvent extraction and back extraction into Na₂S₂O₃ yielded the highest extraction recovery, i.e., 94 ± 3% and offered the possibility to perform the extraction of a large number of samples in a short time, by eliminating the evaporation step. The artifact formation of MeHg was evaluated by high performance liquid chromatography coupled to inductively coupled plasma mass spectrometry (HPLC–ICP–MS), using isotopically enriched Me²⁰¹Hg and ²⁰²Hg and it was found to be nonexistent. A full validation approach in line with ISO 17025 and Eurachem guidelines was followed

  2. A structured four-step curriculum in basic laparoscopy

    DEFF Research Database (Denmark)

    Strandbygaard, Jeanett; Bjerrum, Flemming; Maagaard, Mathilde

    2014-01-01

    The objective of this study was to develop a 4-step curriculum in basic laparoscopy consisting of validated modules integrating a cognitive component, a practical component and a procedural component.

  3. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis

    Directory of Open Access Journals (Sweden)

    Huanhuan Li

    2017-08-01

    The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety. Thus, the capacities of navigation safety and maritime traffic monitoring could be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex compared with traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, the Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, as a widely-used dimensional reduction method, Principal Component Analysis (PCA) is exploited to decompose the obtained distance matrix. In particular, the top k principal components with above 95% accumulative contribution rate are extracted by PCA, and the number of the centers k is chosen. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is implemented on the distance matrix to achieve the final AIS trajectory clustering results. In order to improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our...

  4. A Dimensionality Reduction-Based Multi-Step Clustering Method for Robust Vessel Trajectory Analysis.

    Science.gov (United States)

    Li, Huanhuan; Liu, Jingxian; Liu, Ryan Wen; Xiong, Naixue; Wu, Kefeng; Kim, Tai-Hoon

    2017-08-04

    The Shipboard Automatic Identification System (AIS) is crucial for navigation safety and maritime surveillance; data mining and pattern analysis of AIS information have attracted considerable attention in terms of both basic research and practical applications. Clustering of spatio-temporal AIS trajectories can be used to identify abnormal patterns and mine customary route data for transportation safety. Thus, the capacities of navigation safety and maritime traffic monitoring could be enhanced correspondingly. However, trajectory clustering is often sensitive to undesirable outliers and is essentially more complex compared with traditional point clustering. To overcome this limitation, a multi-step trajectory clustering method is proposed in this paper for robust AIS trajectory clustering. In particular, the Dynamic Time Warping (DTW), a similarity measurement method, is introduced in the first step to measure the distances between different trajectories. The calculated distances, inversely proportional to the similarities, constitute a distance matrix in the second step. Furthermore, as a widely-used dimensional reduction method, Principal Component Analysis (PCA) is exploited to decompose the obtained distance matrix. In particular, the top k principal components with above 95% accumulative contribution rate are extracted by PCA, and the number of the centers k is chosen. The k centers are found by the improved automatic center selection algorithm. In the last step, the improved center clustering algorithm with k clusters is implemented on the distance matrix to achieve the final AIS trajectory clustering results. In order to improve the accuracy of the proposed multi-step clustering algorithm, an automatic algorithm for choosing the k clusters is developed according to the similarity distance. Numerous experiments on realistic AIS trajectory datasets in the bridge area waterway and Mississippi River have been implemented to compare our proposed method with...
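    The first three steps described above, pairwise DTW distances forming a distance matrix, then PCA with a 95% cumulative-contribution cut-off, can be sketched on toy data. The trajectories below are invented stand-ins for AIS tracks, and the paper's final improved-center clustering step is omitted.

```python
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two trajectories given as
    sequences of (x, y) points, with Euclidean point-to-point cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.hypot(a[i - 1][0] - b[j - 1][0], a[i - 1][1] - b[j - 1][1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy trajectories: two nearly parallel tracks and one outlier course
trajs = [[(0, 0), (1, 0), (2, 0)],
         [(0, 0.1), (1, 0.1), (2, 0.1)],
         [(0, 5), (1, 6), (2, 7)]]

# Steps 1-2: pairwise DTW distances constitute the distance matrix
M = np.array([[dtw(p, q) for q in trajs] for p in trajs])

# Step 3: PCA of the distance matrix; keep the top components reaching
# a 95% cumulative variance contribution
centered = M - M.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(centered, rowvar=False))
order = np.argsort(eigval)[::-1]
ratio = np.cumsum(eigval[order]) / eigval[order].sum()
k = int(np.searchsorted(ratio, 0.95) + 1)
features = centered @ eigvec[:, order[:k]]  # input to the final clustering step
```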

  5. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is however imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer-review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also allows achieving greater efficiency and effectiveness by avoiding duplication of effort and leveraging limited resources. With a view to achieving these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and major efforts carried out for achieving consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  6. STEPS: efficient simulation of stochastic reaction–diffusion models in realistic morphologies

    Directory of Open Access Journals (Sweden)

    Hepburn Iain

    2012-05-01

    Background: Models of cellular molecular systems are built from components such as biochemical reactions (including interactions between ligands and membrane-bound proteins), conformational changes and active and passive transport. A discrete, stochastic description of the kinetics is often essential to capture the behavior of the system accurately. Where spatial effects play a prominent role the complex morphology of cells may have to be represented, along with aspects such as chemical localization and diffusion. This high level of detail makes efficiency a particularly important consideration for software that is designed to simulate such systems. Results: We describe STEPS, a stochastic reaction–diffusion simulator developed with an emphasis on simulating biochemical signaling pathways accurately and efficiently. STEPS supports all the above-mentioned features, and well-validated support for SBML allows many existing biochemical models to be imported reliably. Complex boundaries can be represented accurately in externally generated 3D tetrahedral meshes imported by STEPS. The powerful Python interface facilitates model construction and simulation control. STEPS implements the composition and rejection method, a variation of the Gillespie SSA, supporting diffusion between tetrahedral elements within an efficient search and update engine. Additional support for well-mixed conditions and for deterministic model solution is implemented. Solver accuracy is confirmed with an original and extensive validation set consisting of isolated reaction, diffusion and reaction–diffusion systems. Accuracy imposes upper and lower limits on tetrahedron sizes, which are described in detail. By comparing to Smoldyn, we show how the voxel-based approach in STEPS is often faster than particle-based methods, with increasing advantage in larger systems, and by comparing to MesoRD we show the efficiency of the STEPS implementation. Conclusion: STEPS simulates...
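    STEPS's kinetics are grounded in the Gillespie SSA; a minimal direct-method sketch for the single reaction A → B illustrates the underlying algorithm. This is not STEPS's composition-rejection implementation (which differs in how the next reaction is selected at scale), and the rate constant and molecule counts are arbitrary.

```python
import math
import random

def ssa_decay(n_a, k, t_end, seed=1):
    """Gillespie direct-method SSA for the single reaction A -> B with
    stochastic rate constant k.  With one channel, each step draws an
    exponential waiting time from the total propensity and fires."""
    rng = random.Random(seed)
    t, n_b = 0.0, 0
    while n_a > 0:
        a0 = k * n_a                              # total propensity
        t += -math.log(1.0 - rng.random()) / a0   # exponential waiting time
        if t > t_end:
            break
        n_a -= 1                                  # fire the only channel
        n_b += 1
    return n_a, n_b

# With k * t_end = 1, roughly exp(-1) of the A molecules should remain
remaining, produced = ssa_decay(n_a=1000, k=0.5, t_end=2.0)
```

Averaged over many seeds, the remaining count matches the deterministic solution n_a·exp(-k·t), which is the kind of isolated-reaction check STEPS's validation set performs.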

  7. Reliability and convergent validity of the five-step test in people with chronic stroke.

    Science.gov (United States)

    Ng, Shamay S M; Tse, Mimi M Y; Tam, Eric W C; Lai, Cynthia Y Y

    2018-01-10

    (i) To estimate the intra-rater, inter-rater and test-retest reliabilities of the Five-Step Test (FST), as well as the minimum detectable change in FST completion times in people with stroke. (ii) To estimate the convergent validity of the FST with other measures of stroke-specific impairments. (iii) To identify the best cut-off times for distinguishing FST performance in people with stroke from that of healthy older adults. A cross-sectional study. University-based rehabilitation centre. Forty-eight people with stroke and 39 healthy controls. None. The FST, along with (for the stroke survivors only) scores on the Fugl-Meyer Lower Extremity Assessment (FMA-LE), the Berg Balance Scale (BBS), Limits of Stability (LOS) tests, and Activities-specific Balance Confidence (ABC) scale were tested. The FST showed excellent intra-rater (intra-class correlation coefficient; ICC = 0.866-0.905), inter-rater (ICC = 0.998), and test-retest (ICC = 0.838-0.842) reliabilities. A minimum detectable change of 9.16 s was found for the FST in people with stroke. The FST correlated significantly with the FMA-LE, BBS, and LOS results in the forward and sideways directions (r = -0.411 to -0.716, p ...) and distinguished between people with stroke and healthy older adults. The FST is a reliable, easy-to-administer clinical test for assessing stroke survivors' ability to negotiate steps and stairs.
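    The minimum detectable change reported above follows from the ICC via the standard error of measurement (SEM = SD·√(1 − ICC), MDC95 = 1.96·√2·SEM). A short sketch with a hypothetical sample SD, chosen only to show that a test-retest ICC near 0.84 reproduces an MDC near the reported 9.16 s:

```python
import math

def sem(sd, icc):
    """Standard error of measurement from sample SD and reliability (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    """Minimum detectable change at 95% confidence: 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# Hypothetical numbers (the abstract does not report the sample SD):
# with test-retest ICC = 0.84 and SD = 8.26 s, MDC95 is about 9.16 s.
example = mdc95(8.26, 0.84)
```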

  8. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.
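    The manufactured-universe idea can be illustrated in a few lines: a known "reality" generates data, a deliberately imperfect model with a simple UQ scheme is fit, and the claimed coverage is checked against fresh manufactured data. Everything below (the true law, noise level, and model form) is invented for illustration, not taken from the paper's particle-transport universe.

```python
import random

# Manufactured reality: y = x**2, known to us but hidden from the model.
random.seed(3)

def truth(x):
    return x * x

xs = [0.1 * i for i in range(1, 21)]
data = [(x, truth(x) + random.gauss(0, 0.05)) for x in xs]

# Imperfect model: straight line through the origin, least-squares slope,
# with a naive Gaussian prediction band from the residual spread
slope = sum(x * y for x, y in data) / sum(x * x for x, _ in data)
resid_sd = (sum((y - slope * x) ** 2 for x, y in data) / (len(data) - 1)) ** 0.5

# Fresh "experiments" from the same manufactured reality
new_pts = [(x, truth(x) + random.gauss(0, 0.05)) for x in xs]
covered = sum(1 for x, y in new_pts if abs(y - slope * x) <= 1.96 * resid_sd)
coverage = covered / len(new_pts)   # far from 0.95 exposes bad UQ assumptions
```

Because we control the universe, the gap between the nominal 95% band and the observed coverage directly measures how the model-form error corrupts the UQ, which is the point of the framework.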

  9. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  10. OWL-based reasoning methods for validating archetypes.

    Science.gov (United States)

    Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás

    2013-04-01

    Some modern Electronic Healthcare Record (EHR) architectures and standards are based on the dual model-based architecture, which defines two conceptual levels: reference model and archetype model. Such architectures represent EHR domain knowledge by means of archetypes, which are considered by many researchers to play a fundamental role for the achievement of semantic interoperability in healthcare. Consequently, formal methods for validating archetypes are necessary. In recent years, there has been an increasing interest in exploring how semantic web technologies in general, and ontologies in particular, can facilitate the representation and management of archetypes, including binding to terminologies, but no solution based on such technologies has been provided to date to validate archetypes. Our approach represents archetypes by means of OWL ontologies. This permits combining the two levels of the dual model-based architecture in one modeling framework, which can also integrate terminologies available in OWL format. The validation method consists of reasoning on those ontologies to find modeling errors in archetypes: incorrect restrictions over the reference model, non-conformant archetype specializations and inconsistent terminological bindings. The archetypes available in the repositories supported by the openEHR Foundation and the NHS Connecting for Health Program, which are the two largest publicly available ones, have been analyzed with our validation method. For this purpose, we have implemented a software tool called Archeck. Our results show that around 1/5 of archetype specializations contain modeling errors, the most common mistakes being related to coded terms and terminological bindings. Analysis of each repository reveals that the two repositories exhibit different patterns of errors. This result reinforces the need for making serious efforts to improve archetype design processes. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. A single-step method for rapid extraction of total lipids from green microalgae.

    Directory of Open Access Journals (Sweden)

    Martin Axelsson

    Full Text Available Microalgae produce a wide range of lipid compounds of potential commercial interest. Total lipid extraction performed by conventional extraction methods, relying on the chloroform-methanol solvent system, is too laborious and time consuming for screening large numbers of samples. In this study, three previous extraction methods devised by Folch et al. (1957), Bligh and Dyer (1959) and Selstam and Öquist (1985) were compared and a faster single-step procedure was developed for extraction of total lipids from green microalgae. In the single-step procedure, 8 ml of a 2:1 chloroform-methanol (v/v) mixture was added to fresh or frozen microalgal paste or pulverized dry algal biomass contained in a glass centrifuge tube. The biomass was manually suspended by vigorously shaking the tube for a few seconds and 2 ml of a 0.73% NaCl water solution was added. Phase separation was facilitated by 2 min of centrifugation at 350 g and the lower phase was recovered for analysis. An uncharacterized microalgal polyculture and the green microalgae Scenedesmus dimorphus, Selenastrum minutum, and Chlorella protothecoides were subjected to the different extraction methods and various techniques of biomass homogenization. The less labour-intensive single-step procedure presented here allowed simultaneous recovery of total lipid extracts from multiple samples of green microalgae, with quantitative yields and fatty acid profiles comparable to those of the previous methods. While the single-step procedure is highly correlated in lipid extractability (r² = 0.985) with the previous method of Folch et al. (1957), it allowed at least five times higher sample throughput.

  12. A stabilized Runge–Kutta–Legendre method for explicit super-time-stepping of parabolic and mixed equations

    International Nuclear Information System (INIS)

    Meyer, Chad D.; Balsara, Dinshaw S.; Aslam, Tariq D.

    2014-01-01

    Parabolic partial differential equations appear in several physical problems, including problems that have a dominant hyperbolic part coupled to a sub-dominant parabolic component. Explicit methods for their solution are easy to implement but have very restrictive time step constraints. Implicit solution methods can be unconditionally stable but have the disadvantage of being computationally costly or difficult to implement. Super-time-stepping methods for treating parabolic terms in mixed type partial differential equations occupy an intermediate position. In such methods each superstep takes “s” explicit Runge–Kutta-like time-steps to advance the parabolic terms by a time-step that is s² times larger than a single explicit time-step. The expanded stability is usually obtained by mapping the short recursion relation of the explicit Runge–Kutta scheme to the recursion relation of some well-known, stable polynomial. Prior work has built temporally first- and second-order accurate super-time-stepping methods around the recursion relation associated with Chebyshev polynomials. Since their stability is based on the boundedness of the Chebyshev polynomials, these methods have been called RKC1 and RKC2. In this work we build temporally first- and second-order accurate super-time-stepping methods around the recursion relation associated with Legendre polynomials. We call these methods RKL1 and RKL2. The RKL1 method is first-order accurate in time; the RKL2 method is second-order accurate in time. We verify that the newly-designed RKL1 and RKL2 schemes have a very desirable monotonicity preserving property for one-dimensional problems – a solution that is monotone at the beginning of a time step retains that property at the end of that time step. It is shown that RKL1 and RKL2 methods are stable for all values of the diffusion coefficient up to the maximum value. We call this a convex monotonicity preserving property and show by examples that it is very useful
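    The RKL1 scheme in this record can be sketched from its published recurrence coefficients (μ_j = (2j-1)/j, ν_j = (1-j)/j, with time-step weights scaled by 2/(s²+s)); the following minimal illustration for the 1D periodic heat equation is written from those formulas, not from the authors' code.

```python
import numpy as np

def rkl1_superstep(u, lap, tau, s):
    """Advance u by one RKL1 super-time-step of size tau using s stages.

    Legendre-based recurrence: mu_j = (2j-1)/j, nu_j = (1-j)/j, with each
    time-step weight scaled by 2/(s^2+s), so tau may be up to
    (s^2+s)/2 times the explicit stability limit.
    """
    w = 2.0 / (s * s + s)
    y_prev2 = u
    y_prev1 = u + w * tau * lap(u)          # j = 1 (forward-Euler-like stage)
    for j in range(2, s + 1):
        mu = (2.0 * j - 1.0) / j
        nu = (1.0 - j) / j
        y = mu * y_prev1 + nu * y_prev2 + mu * w * tau * lap(y_prev1)
        y_prev2, y_prev1 = y_prev1, y
    return y_prev1

# Heat equation u_t = u_xx on a periodic grid.
N = 64
dx = 2.0 * np.pi / N
x = np.arange(N) * dx
lap = lambda u: (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

dt_expl = dx**2 / 2.0                        # explicit stability limit
s = 5
tau = 0.9 * dt_expl * (s * s + s) / 2.0      # ~13.5x the explicit step
u = np.sin(x)
for _ in range(10):
    u = rkl1_superstep(u, lap, tau, s)

# Exact decay of the semi-discrete sine mode for comparison.
lam = -2.0 * (1.0 - np.cos(dx)) / dx**2
exact = np.exp(lam * 10 * tau) * np.sin(x)
print(float(np.max(np.abs(u - exact))))      # small first-order error
```

    With s internal stages the superstep costs only s right-hand-side evaluations while covering roughly (s²+s)/2 explicit steps, which is the efficiency gain the abstract describes.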

  13. Development and Validation of Improved Method for Fingerprint ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an improved method by capillary zone electrophoresis with photodiode array detection for the fingerprint analysis of Ligusticum chuanxiong Hort. (Rhizoma Chuanxiong). Methods: The optimum high performance capillary electrophoresis (HPCE) conditions were 30 mM borax containing 5 ...

  14. Workshop on acceleration of the validation and regulatory acceptance of alternative methods and implementation of testing strategies

    DEFF Research Database (Denmark)

    Piersma, A. H.; Burgdorf, T.; Louekari, K.

    2018-01-01

    concerning the regulatory acceptance and implementation of alternative test methods and testing strategies, with the aim to develop feasible solutions. Classical validation of alternative methods usually involves one to one comparison with the gold standard animal study. This approach suffers from...... the reductionist nature of an alternative test as compared to the animal study as well as from the animal study being considered as the gold standard. Modern approaches combine individual alternatives into testing strategies, for which integrated and defined approaches are emerging at OECD. Furthermore, progress......-focused hazard and risk assessment of chemicals requires an open mind towards stepping away from the animal study as the gold standard and defining human biologically based regulatory requirements for human hazard and risk assessment....

  15. Validation of an Instrument to Measure High School Students' Attitudes toward Fitness Testing

    Science.gov (United States)

    Mercier, Kevin; Silverman, Stephen

    2014-01-01

    Purpose: The purpose of this investigation was to develop an instrument that has scores that are valid and reliable for measuring students' attitudes toward fitness testing. Method: The method involved the following steps: (a) an elicitation study, (b) item development, (c) a pilot study, and (d) a validation study. The pilot study included 427…

  16. Method validation for chemical composition determination by electron microprobe with wavelength dispersive spectrometer

    Science.gov (United States)

    Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.

    2016-07-01

    The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One of the advantages of analytical method validation is translated into a level of confidence about the measurement results reported to satisfy a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used over extremely wide areas, mainly in the field of materials science, impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a binary alloy Cu-Au of different compositions, was used during the validation protocol following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters more frequently requested to get the accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed including systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
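    The uncertainty model mentioned above combines systematic and random contributions; a minimal GUM-style sketch of that combination is shown below. The root-sum-of-squares rule and the coverage factor k = 2 are standard practice, but the component names and numbers here are illustrative, not taken from the paper.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares combination of independent standard
    uncertainties (GUM-style), e.g. random repeatability plus
    systematic contributions such as reference-material uncertainty."""
    return math.sqrt(sum(u * u for u in components))

# Illustrative numbers only (mass-fraction %, not from the paper):
u_random = 0.30      # repeatability of replicate WDS measurements
u_reference = 0.40   # certified reference material uncertainty
u_c = combined_standard_uncertainty([u_random, u_reference])
U = 2.0 * u_c        # expanded uncertainty, coverage factor k = 2
print(round(u_c, 6), round(U, 6))   # 0.5 1.0
```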

  17. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step in detecting the wrong, anomaly data is called the data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper will present the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, which is the scores interpretation, needs to be further investigated on the developed system.
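    The first two stages described above (data preparation and validation-score generation) can be sketched with two generic checks, a physical-range check and a rate-of-change check. The score values and thresholds below are hypothetical, not those of the Belgrade system.

```python
def validation_scores(series, lo, hi, max_jump):
    """Assign each sample a score in [0, 1]: 1.0 passes both checks,
    0.5 fails the rate-of-change (spike) check, 0.0 is out of
    physical range. Invalid samples are not used as the reference
    for the next rate-of-change comparison."""
    scores = []
    prev = None
    for v in series:
        if v < lo or v > hi:
            scores.append(0.0)              # range (physical-limit) check
            continue
        if prev is not None and abs(v - prev) > max_jump:
            scores.append(0.5)              # rate-of-change check
        else:
            scores.append(1.0)
        prev = v
    return scores

# Synthetic water-level series with an in-range spike and a dropout:
levels = [1.2, 1.3, 1.25, 3.9, 1.28, -4.0, 1.30]
print(validation_scores(levels, lo=0.0, hi=5.0, max_jump=1.0))
# -> [1.0, 1.0, 1.0, 0.5, 0.5, 0.0, 1.0]
```

    The final stage, score interpretation (e.g. replacing flagged samples with interpolated values), is deliberately left out, mirroring the paper's remark that it needs further investigation.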

  18. A three-step maximum a posteriori probability method for InSAR data inversion of coseismic rupture with application to the 14 April 2010 Mw 6.9 Yushu, China, earthquake

    Science.gov (United States)

    Sun, Jianbao; Shen, Zheng-Kang; Bürgmann, Roland; Wang, Min; Chen, Lichun; Xu, Xiwei

    2013-08-01

    We develop a three-step maximum a posteriori probability method for coseismic rupture inversion, which aims at maximizing the a posteriori probability density function (PDF) of elastic deformation solutions of earthquake rupture. The method originates from the fully Bayesian inversion and mixed linear-nonlinear Bayesian inversion methods and shares the same posterior PDF with them, while overcoming difficulties with convergence when large numbers of low-quality data are used and greatly improving the convergence rate using optimization procedures. A highly efficient global optimization algorithm, adaptive simulated annealing, is used to search for the maximum of the posterior PDF ("mode" in statistics) in the first step. The second-step inversion approaches the "true" solution further using the Monte Carlo inversion technique with positivity constraints, with all parameters obtained from the first step as the initial solution. Slip artifacts are then eliminated from the slip models in the third step using the same procedure as the second step, with fixed fault geometry parameters. We first design a fault model with a 45° dip angle and oblique slip, and produce corresponding synthetic interferometric synthetic aperture radar (InSAR) data sets to validate the reliability and efficiency of the new method. We then apply this method to InSAR data inversion for the coseismic slip distribution of the 14 April 2010 Mw 6.9 Yushu, China earthquake. Our preferred slip model is composed of three segments, with most of the slip occurring within 15 km depth; the maximum slip reaches 1.38 m at the surface. The seismic moment released is estimated to be 2.32e+19 Nm, consistent with the seismic estimate of 2.50e+19 Nm.
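    The first step above searches for the posterior mode with adaptive simulated annealing. A plain (non-adaptive) simulated-annealing sketch on a toy one-dimensional negative log-posterior conveys the idea; the cooling schedule, step size and objective are all illustrative and unrelated to the actual InSAR inversion.

```python
import math
import random

def simulated_annealing(neg_log_post, x0, step, t0, cooling, n_iter, seed=1):
    """Minimize a negative log-posterior (i.e. locate the MAP point)
    with plain simulated annealing; the paper uses an adaptive variant
    over many fault parameters, sketched here for a single scalar."""
    rng = random.Random(seed)
    x, e = x0, neg_log_post(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(n_iter):
        cand = x + rng.gauss(0.0, step)
        e_cand = neg_log_post(cand)
        # Always accept improvements; accept worsenings with
        # probability exp(-delta/t), which shrinks as t cools.
        if e_cand < e or rng.random() < math.exp(-(e_cand - e) / t):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling
    return best_x

# Toy posterior: Gaussian centered at 2.0, so the MAP point is x = 2.0.
nlp = lambda x: 0.5 * ((x - 2.0) / 0.5) ** 2
x_map = simulated_annealing(nlp, x0=-5.0, step=0.5, t0=5.0,
                            cooling=0.995, n_iter=4000)
print(round(x_map, 2))
```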

  19. Convergence and Stability of the Split-Step θ-Milstein Method for Stochastic Delay Hopfield Neural Networks

    Directory of Open Access Journals (Sweden)

    Qian Guo

    2013-01-01

    Full Text Available A new splitting method designed for the numerical solutions of stochastic delay Hopfield neural networks is introduced and analysed. Under Lipschitz and linear growth conditions, this split-step θ-Milstein method is proved to have strong convergence of order 1 in the mean-square sense, which is higher than that of the existing split-step θ-method. Further, the mean-square stability of the proposed method is investigated. Numerical experiments and comparisons with existing methods illustrate the computational efficiency of our method.

  20. Development and validation of a spectroscopic method for the ...

    African Journals Online (AJOL)

    Development and validation of a spectroscopic method for the simultaneous analysis of ... advanced analytical methods such as high-performance liquid chromatography ...

  1. One-step leapfrog ADI-FDTD method for simulating electromagnetic wave propagation in general dispersive media.

    Science.gov (United States)

    Wang, Xiang-Hua; Yin, Wen-Yan; Chen, Zhi Zhang David

    2013-09-09

    The one-step leapfrog alternating-direction-implicit finite-difference time-domain (ADI-FDTD) method is reformulated for simulating general electrically dispersive media. It models material dispersive properties with equivalent polarization currents. These currents are then solved with the auxiliary differential equation (ADE) and then incorporated into the one-step leapfrog ADI-FDTD method. The final equations are presented in the form similar to that of the conventional FDTD method but with second-order perturbation. The adapted method is then applied to characterize (a) electromagnetic wave propagation in a rectangular waveguide loaded with a magnetized plasma slab, (b) transmission coefficient of a plane wave normally incident on a monolayer graphene sheet biased by a magnetostatic field, and (c) surface plasmon polaritons (SPPs) propagation along a monolayer graphene sheet biased by an electrostatic field. The numerical results verify the stability, accuracy and computational efficiency of the proposed one-step leapfrog ADI-FDTD algorithm in comparison with analytical results and the results obtained with the other methods.

  2. Development and validation of a local time stepping-based PaSR solver for combustion and radiation modeling

    DEFF Research Database (Denmark)

    Pang, Kar Mun; Ivarsson, Anders; Haider, Sajjad

    2013-01-01

    In the current work, a local time stepping (LTS) solver for the modeling of combustion, radiative heat transfer and soot formation is developed and validated. This is achieved using an open source computational fluid dynamics code, OpenFOAM. Akin to the solver provided in default assembly i...... library in the edcSimpleFoam solver which was introduced during the 6th OpenFOAM workshop is modified and coupled with the current solver. One of the main amendments made is the integration of soot radiation submodel since this is significant in rich flames where soot particles are formed. The new solver...

  3. Development and Single-Laboratory Validation of a Liquid Chromatography Tandem Mass Spectrometry Method for Quantitation of Tetrodotoxin in Mussels and Oysters.

    Science.gov (United States)

    Turner, Andrew D; Boundy, Michael J; Rapkova, Monika Dhanji

    2017-09-01

    In recent years, evidence has grown for the presence of tetrodotoxin (TTX) in bivalve mollusks, leading to the potential for consumers of contaminated products to be affected by Tetrodotoxin Shellfish Poisoning (TSP). A single-laboratory validation was conducted for the hydrophilic interaction LC (HILIC) tandem MS (MS/MS) analysis of TTX in common mussels and Pacific oysters-the bivalve species that have been found to contain TTXs in the United Kingdom in recent years. The method consists of a single-step dispersive extraction in 1% acetic acid, followed by a carbon SPE cleanup step before dilution and instrumental analysis. The full method was developed as a rapid tool for the quantitation of TTX, as well as for the associated analogs 4-epi-TTX; 5,6,11-trideoxy TTX; 11-nor TTX-6-ol; 5-deoxy TTX; and 4,9-anhydro TTX. The method can also be run as the acquisition of TTX together with paralytic shellfish toxins. Results demonstrated acceptable method performance characteristics for specificity, linearity, recovery, ruggedness, repeatability, matrix variability, and within-laboratory reproducibility for the analysis of TTX. The LOD and LOQ were fit-for-purpose in comparison to the current action limit for TTX enforced in The Netherlands. In addition, aspects of method performance (LOD, LOQ, and within-laboratory reproducibility) were found to be satisfactory for three other TTX analogs (11-nor TTX-6-ol, 5-deoxy TTX, and 4,9-anhydro TTX). The method was found to be practical and suitable for use in regulatory testing, providing rapid turnaround of sample analysis. Plans currently underway on a full collaborative study to validate a HILIC-MS/MS method for paralytic shellfish poisoning toxins will be extended to include TTX in order to generate international acceptance, ultimately for use as an alternative official control testing method should regulatory controls be adopted.

  4. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and the cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized by correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment.
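    The likelihood-ratio validation metric above can be sketched in a few lines: a ratio of the data likelihood under "model valid" to that under an alternative hypothesis, compared against a decision threshold. The Gaussian error model, the alternative-hypothesis mean and all numbers below are hypothetical placeholders, not the paper's reliability models.

```python
import math

def gaussian_pdf(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def likelihood_ratio(observations, prediction, sd, alt_mean, alt_sd):
    """LR = p(data | model valid) / p(data | model invalid).
    H0: observations scatter around the model prediction; H1: around an
    alternative (biased) mean. Independent Gaussian errors are assumed."""
    lr = 1.0
    for y in observations:
        lr *= gaussian_pdf(y, prediction, sd) / gaussian_pdf(y, alt_mean, alt_sd)
    return lr

# Hypothetical measurements close to a model prediction of 10.0:
data = [9.8, 10.1, 10.3, 9.9]
lr = likelihood_ratio(data, prediction=10.0, sd=0.3, alt_mean=11.0, alt_sd=0.3)
threshold = 1.0   # equal decision costs and equal priors
print(lr > threshold)   # True: data fall in the "accept model" region
```

    In the paper the threshold is not fixed at 1 but derived from the decision costs via the risk-minimization criterion; unequal costs simply shift it.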

  5. Recursive regularization step for high-order lattice Boltzmann methods

    Science.gov (United States)

    Coreixas, Christophe; Wissocq, Gauthier; Puigt, Guillaume; Boussuge, Jean-François; Sagaut, Pierre

    2017-09-01

    A lattice Boltzmann method (LBM) with enhanced stability and accuracy is presented for various Hermite tensor-based lattice structures. The collision operator relies on a regularization step, which is here improved through a recursive computation of nonequilibrium Hermite polynomial coefficients. In addition to the reduced computational cost of this procedure with respect to the standard one, the recursive step makes it possible to considerably enhance the stability and accuracy of the numerical scheme by properly filtering out second- (and higher-) order nonhydrodynamic contributions in under-resolved conditions. This is first shown in the isothermal case, where the simulation of the doubly periodic shear layer is performed with a Reynolds number ranging from 10⁴ to 10⁶, and where a thorough analysis of the case at Re = 3×10⁴ is conducted. In the latter, results obtained using both regularization steps are compared against the Bhatnagar-Gross-Krook LBM for standard (D2Q9) and high-order (D2V17 and D2V37) lattice structures, confirming the tremendous increase in the stability range of the proposed approach. Further comparisons on thermal and fully compressible flows, using the general extension of this procedure, are then conducted through the numerical simulation of Sod shock tubes with the D2V37 lattice. They confirm the stability increase induced by the recursive approach as compared with the standard one.

  6. Spacecraft early design validation using formal methods

    International Nuclear Information System (INIS)

    Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco

    2014-01-01

    The size and complexity of software in spacecraft is increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, with two of them having focused on the system-level of a satellite platform in development. Our efforts resulted in a significant advancement of validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested)

  7. A step by step selection method for the location and the size of a waste-to-energy facility targeting the maximum output energy and minimization of gate fee.

    Science.gov (United States)

    Kyriakis, Efstathios; Psomopoulos, Constantinos; Kokkotis, Panagiotis; Bourtsalas, Athanasios; Themelis, Nikolaos

    2017-06-23

    This study attempts the development of an algorithm presenting a step-by-step selection method for the location and the size of a waste-to-energy facility targeting the maximum output energy, while also considering what is in many cases the basic obstacle, the gate fee. Various parameters were identified and evaluated in order to formulate the proposed decision-making method in the form of an algorithm. The principal simulation input is the amount of municipal solid waste (MSW) available for incineration, which, together with its net calorific value, is the most important factor for the feasibility of the plant. Moreover, the research focuses both on the parameters that could increase the energy production and on those that affect the R1 energy efficiency factor. Estimation of the final gate fee is achieved through an economic analysis of the entire project, investigating both the expenses and the revenues expected from the selected site and the outputs of the facility. A number of common revenue methods were included in the algorithm. The developed algorithm has been validated using three case studies in Greece: Athens, Thessaloniki, and Central Greece, where the cities of Larisa and Volos were selected for the application of the proposed decision-making tool. These case studies were selected based on a previous publication by two of the authors, in which these areas were examined. Results reveal that the development of a "solid" methodological approach to selecting the site and the size of a waste-to-energy (WtE) facility is feasible. However, maximization of the energy efficiency factor R1 requires high utilization factors, while minimization of the final gate fee requires a high R1 and high metals recovery from the bottom ash, as well as economic exploitation of recovered raw materials, if any.
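    The R1 factor referred to above is defined in Annex II of the EU Waste Framework Directive as R1 = (Ep − (Ef + Ei)) / (0.97 × (Ew + Ef)). A small calculator makes the sensitivity to the energy terms concrete; the energy figures below are illustrative, not from the Greek case studies.

```python
def r1_factor(ep, ef, ei, ew):
    """R1 = (Ep - (Ef + Ei)) / (0.97 * (Ew + Ef)), per the EU Waste
    Framework Directive (Annex II), with all inputs in GJ/year.
    Ep: annual energy produced (exported electricity weighted by 2.6,
        exported heat by 1.1),
    Ef: energy input from fuels contributing to steam production,
    Ei: annual energy imported (excluding Ew and Ef),
    Ew: energy contained in the treated waste (tonnage x NCV)."""
    return (ep - (ef + ei)) / (0.97 * (ew + ef))

# Illustrative figures only (GJ/year):
ep = 2.6 * 100_000 + 1.1 * 200_000   # exported electricity + heat
ef, ei = 20_000, 10_000
ew = 1_000_000                        # waste throughput x net calorific value
r1 = r1_factor(ep, ef, ei, ew)
print(round(r1, 3))   # -> 0.455
```

    For plants permitted after the end of 2008, the directive's recovery threshold is R1 ≥ 0.65, so a plant with these figures (0.455) would need higher utilization, e.g. more exported heat, to qualify, which is exactly the trade-off the abstract highlights.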

  8. Method validation for uranium content analysis using a potentiometer T-90

    International Nuclear Information System (INIS)

    Torowati; Ngatijo; Rahmiati

    2016-01-01

    An experimental method validation has been conducted for uranium content analysis using a Potentiometer T-90. The method validation experiment was performed in the quality control laboratory of the Experimental Fuel Element Installation, PTBBN - BATAN. The objective is to determine the precision and accuracy of analytical results for uranium analysis with reference to the latest American Standard Test Method (ASTM), ASTM C1267-11, which modifies the reference method by reducing reagent consumption by 10% relative to the amount used by the original method. The ASTM C1267-11 reference is a new ASTM standard substituting for the older ASTM C799, Vol. 12.01, 2003. It is, therefore, necessary to validate the renewed method. The tool used for the analysis of uranium was a Potentiometer T-90 and the material used was standard uranium oxide powder CRM (Certified Reference Material). Validation of the method was done by weighing and analyzing the standard uranium powder seven times. The analysis results were used to determine the accuracy, precision, Relative Standard Deviation (RSD), Horwitz coefficient of variation, and the limits of detection and quantitation. The average uranium content obtained in this method validation is 84.36% with a Standard Deviation (SD) of 0.12%, a Relative Standard Deviation (RSD) of 0.14% and a 2/3 Horwitz coefficient of variation (CV Horwitz) of 2.05%. The results show that the RSD value is smaller than the (2/3) CV Horwitz value, which means that this method has high precision. The accuracy value obtained is 0.48%, and since the acceptance limit for a high level of accuracy is an accuracy value <2.00%, this method is regarded as having a high degree of accuracy [1]. The limit of detection (LOD) and the limit of quantitation (LOQ) are 0.0145 g/L and 0.0446 g/L, respectively. It is concluded that the ASTM C1267-11 reference method is valid for use. (author)
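    The precision acceptance check in this record (RSD compared against a fraction of the Horwitz coefficient of variation) can be reproduced from the reported mean (84.36%) and standard deviation (0.12%), using the standard Horwitz function CV(%) = 2^(1 − 0.5·log₁₀C) with C the dimensionless mass fraction. This is a sketch of the check, not the laboratory's actual computation.

```python
import math

def horwitz_cv_percent(mass_fraction):
    """Horwitz predicted coefficient of variation, in percent.
    CV = 2^(1 - 0.5*log10(C)), with C a dimensionless mass fraction."""
    return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

mean_u = 84.36          # reported mean uranium content, %
sd = 0.12               # reported standard deviation, %
rsd = 100.0 * sd / mean_u
cv_horwitz = horwitz_cv_percent(mean_u / 100.0)

print(round(rsd, 2), round(cv_horwitz, 2))   # -> 0.14 2.05
# Precision is acceptable when the RSD stays below 2/3 of CV_Horwitz:
print(rsd < (2.0 / 3.0) * cv_horwitz)        # -> True
```

    Note that the Horwitz CV predicted for an 84.36% mass fraction works out to about 2.05%, the same figure the abstract quotes, so the reported RSD of 0.14% clears the 2/3 criterion by a wide margin.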

  9. Likelihood ratio data to report the validation of a forensic fingerprint evaluation method

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2017-02-01

    Full Text Available The data to which the authors refer throughout this article are likelihood ratios (LRs) computed from the comparison of 5–12 minutiae fingermarks with fingerprints. These LR data are used for the validation of a likelihood ratio (LR) method in forensic evidence evaluation. They represent a necessary asset for conducting validation experiments when validating LR methods used in forensic evidence evaluation and for setting up validation reports. These data can also be used as a baseline for comparing fingermark evidence in the same minutiae configuration as presented in D. Meuwly, D. Ramos, R. Haraksim [1], although the reader should keep in mind that different feature extraction algorithms and different AFIS systems may produce different LR values. Moreover, these data may serve as a reproducibility exercise, in order to train the generation of validation reports of forensic methods, according to [1]. Alongside the data, a justification and motivation for the use of the methods is given. These methods calculate LRs from the fingerprint/mark data and are subject to a validation procedure. The choice of using real forensic fingerprints in the validation and simulated data in the development is described and justified. Validation criteria are set for the purpose of validating the LR methods, which are used to calculate the LR values from the data, and for the validation report. For privacy and data protection reasons, the original fingerprint/mark images cannot be shared. But these images do not constitute the core data for the validation, unlike the LRs, which are shared.

  10. Testing and Validation of Computational Methods for Mass Spectrometry.

    Science.gov (United States)

    Gatto, Laurent; Hansen, Kasper D; Hoopmann, Michael R; Hermjakob, Henning; Kohlbacher, Oliver; Beyer, Andreas

    2016-03-04

    High-throughput methods based on mass spectrometry (proteomics, metabolomics, lipidomics, etc.) produce a wealth of data that cannot be analyzed without computational methods. The impact of the choice of method on the overall result of a biological study is often underappreciated, but different methods can result in very different biological findings. It is thus essential to evaluate and compare the correctness and relative performance of computational methods. The volume of the data as well as the complexity of the algorithms render unbiased comparisons challenging. This paper discusses some problems and challenges in testing and validation of computational methods. We discuss the different types of data (simulated and experimental validation data) as well as different metrics to compare methods. We also introduce a new public repository for mass spectrometric reference data sets ( http://compms.org/RefData ) that contains a collection of publicly available data sets for performance evaluation for a wide range of different methods.

  11. An analytical method for the calculation of static characteristics of linear step motors for control rod drives in nuclear reactors

    International Nuclear Information System (INIS)

    Khan, S.H.; Ivanov, A.A.

    1995-01-01

    An analytical method for calculating the static characteristics of linear dc step motors (LSMs) is described. These multiphase passive-armature motors are now being developed for control rod drives (CRDs) in large nuclear reactors. The static characteristic of such an LSM is defined by the variation of electromagnetic force with armature displacement, and it determines motor performance in the standing and dynamic modes of operation. The proposed analytical technique for calculating this characteristic is based on the permeance analysis method applied to the phase magnetic circuits of the LSM. The reluctances of the various parts of the phase magnetic circuit are calculated analytically by assuming probable flux paths and by taking into account the complex nature of the magnetic field distribution. For given armature positions, stator and armature iron saturation is taken into account by an efficient iterative algorithm with fast convergence. The method is validated by comparing theoretical results with experimental ones, which show satisfactory agreement for small stator currents and weak iron saturation.

  12. A systematic methodology for creep master curve construction using the stepped isostress method (SSM): a numerical assessment

    Science.gov (United States)

    Miranda Guedes, Rui

    2018-02-01

    Long-term creep of viscoelastic materials is experimentally inferred through accelerating techniques based on the time-temperature superposition principle (TTSP) or on the time-stress superposition principle (TSSP). According to these principles, a given property measured for short times at a higher temperature or higher stress level remains the same as that obtained for longer times at a lower temperature or lower stress level, except that the curves are shifted parallel to the horizontal axis, matching a master curve. These procedures enable the construction of creep master curves from short-term experimental tests. The Stepped Isostress Method (SSM) is an evolution of the classical TSSP method. The SSM further reduces the number of test specimens required to obtain the master curve, since only one specimen is necessary. The classical approach, using creep tests, demands at least one specimen per stress level to produce a set of creep curves to which the TSSP is applied to obtain the master curve. This work proposes an analytical method to process the SSM raw data. The method is validated using numerical simulations that reproduce the SSM tests based on two different viscoelastic models. One model represents the viscoelastic behavior of a graphite/epoxy laminate and the other represents an adhesive based on epoxy resin.
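
    The horizontal-shifting idea behind TSSP can be sketched with a synthetic example; the power-law material model and the form of the stress-dependent shift factor below are illustrative assumptions, not the viscoelastic models used in the paper.

    ```python
    # Sketch of time-stress superposition: a creep curve measured at higher
    # stress is shifted along the log-time axis by a stress-dependent factor
    # a(sigma) so that it overlaps the reference-stress master curve.

    def creep_compliance(t, stress, D0=1.0, n=0.2, stress_ref=10.0, c=0.15):
        """Synthetic power-law creep compliance; higher stress accelerates creep."""
        log_a = -c * (stress - stress_ref)       # log10 of the shift factor
        return D0 * (t / 10 ** log_a) ** n

    def shift_to_master(times, stress, stress_ref=10.0, c=0.15):
        """Map times measured at `stress` onto the reference-stress time axis."""
        log_a = -c * (stress - stress_ref)
        return [t / 10 ** log_a for t in times]

    times = [1.0, 10.0, 100.0]
    high_stress_curve = [creep_compliance(t, stress=20.0) for t in times]
    shifted_times = shift_to_master(times, stress=20.0)
    master_curve = [creep_compliance(t, stress=10.0) for t in shifted_times]
    ```

    After shifting, the short-time high-stress data reproduce the long-time behavior at the reference stress, which is why a single stepped-stress specimen can replace a whole family of creep tests.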

  13. Factors affecting GEBV accuracy with single-step Bayesian models.

    Science.gov (United States)

    Zhou, Lei; Mrode, Raphael; Zhang, Shengli; Zhang, Qin; Li, Bugao; Liu, Jian-Feng

    2018-01-01

    A single-step approach to obtain genomic prediction was first proposed in 2009. Many studies have investigated the components of GEBV accuracy in genomic selection. However, it is still unclear how the population structure and the relationships between training and validation populations influence GEBV accuracy in single-step analysis. Here, we explored the components of GEBV accuracy in single-step Bayesian analysis with a simulation study. Three scenarios with various numbers of QTL (5, 50, and 500) were simulated. Three models were implemented to analyze the simulated data: single-step genomic best linear unbiased prediction (SSGBLUP), single-step BayesA (SS-BayesA), and single-step BayesB (SS-BayesB). According to our results, GEBV accuracy was influenced by the relationships between the training and validation populations more strongly for ungenotyped animals than for genotyped animals. SS-BayesA/BayesB showed an obvious advantage over SSGBLUP in the 5- and 50-QTL scenarios. The SS-BayesB model obtained the lowest accuracy in the 500-QTL scenario. The SS-BayesA model was the most efficient and robust across all QTL scenarios. Generally, both the relationships between training and validation populations and the LD between markers and QTL contributed to GEBV accuracy in the single-step analysis, and the advantages of single-step Bayesian models were more apparent when the trait was controlled by fewer QTL.
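
    In simulation studies like this one, "GEBV accuracy" is conventionally reported as the correlation between predicted and true breeding values in the validation population. A minimal sketch, with toy numbers rather than the study's simulation output:

    ```python
    # Sketch: GEBV accuracy as the Pearson correlation between predicted
    # genomic breeding values and true (simulated) breeding values.

    def pearson(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / (vx * vy) ** 0.5

    true_bv = [1.2, -0.4, 0.3, 2.1, -1.0]   # true breeding values (toy)
    gebv    = [1.0, -0.2, 0.5, 1.8, -0.9]   # predicted GEBVs (toy)

    accuracy = pearson(gebv, true_bv)
    ```

    A simulation framework makes this metric available directly, since the true breeding values are known by construction.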

  14. Single-Camera-Based Method for Step Length Symmetry Measurement in Unconstrained Elderly Home Monitoring.

    Science.gov (United States)

    Cai, Xi; Han, Guang; Song, Xin; Wang, Jinkuan

    2017-11-01

    Single-camera-based gait monitoring is unobtrusive, inexpensive, and easy to use for monitoring the daily gait of seniors in their homes. However, most studies require subjects to walk perpendicularly to the camera's optical axis or along specified routes, which limits its application in elderly home monitoring. To build unconstrained monitoring environments, we propose a method to measure the step length symmetry ratio (a useful gait parameter representing gait symmetry, with no significant relationship with age) from unconstrained straight walking using a single camera, without strict restrictions on walking directions or routes. Based on projective geometry theory, we first develop a formula for calculating the step length ratio in the case of unconstrained straight-line walking. Then, to adapt to general cases, we propose to modify noncollinear footprints and accordingly provide a general procedure for step length ratio extraction from unconstrained straight walking. Our method achieves a mean absolute percentage error (MAPE) of 1.9547% for 15 subjects' normal and abnormal side-view gaits, and also obtains satisfactory MAPEs for non-side-view gaits (2.4026% for 45°-view gaits and 3.9721% for 30°-view gaits). This performance is much better than that of a well-established monocular gait measurement system, suitable only for side-view gaits, which has a MAPE of 3.5538%. Independently of walking direction, our method can accurately estimate step length ratios from unconstrained straight walking. This demonstrates that our method is applicable to elders' daily gait monitoring, providing valuable information for elderly health care, such as abnormal gait recognition and fall risk assessment.
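
    The MAPE figures quoted in this record can be computed as sketched below; the ratio values are made-up examples, not the paper's measurements.

    ```python
    # Sketch of the mean absolute percentage error (MAPE) used to compare
    # estimated step length symmetry ratios against ground truth.

    def mape(estimated, truth):
        """Mean absolute percentage error, in percent."""
        errors = [abs(e - t) / abs(t) for e, t in zip(estimated, truth)]
        return 100.0 * sum(errors) / len(errors)

    true_ratios = [1.00, 0.95, 1.05, 0.90]   # ground-truth symmetry ratios (toy)
    est_ratios  = [1.02, 0.93, 1.06, 0.91]   # single-camera estimates (toy)

    error_pct = mape(est_ratios, true_ratios)
    ```

    Because the metric is relative, it allows direct comparison across subjects whose absolute step lengths differ.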

  15. Human Factors methods concerning integrated validation of nuclear power plant control rooms

    International Nuclear Information System (INIS)

    Oskarsson, Per-Anders; Johansson, Bjoern J.E.; Gonzalez, Natalia

    2010-02-01

    The frame of reference for this work was existing recommendations and instructions from the NPP area, experiences from the review of the Turbic Validation, and experiences from system validations performed at the Swedish Armed Forces, e.g. concerning military control rooms and fighter pilots. These enterprises are characterized by complex systems in extreme environments, often with high risks, where human error can lead to serious consequences. A focus group was conducted with the representatives responsible for Human Factors issues from all Swedish NPPs. The questions discussed included, among other things: for whom an integrated validation (IV) is performed and for what purpose; what should be included in an IV; the comparison with baseline measures; the design process; the role of SSM; which measurement methods should be used; and how the methods are affected by changes in the control room. The report raises several questions for discussion concerning the validation process. Supplementary measurement methods for integrated validation are discussed, e.g. dynamic, psychophysiological, and qualitative methods for the identification of problems. Supplementary methods for statistical analysis are presented. The study points out a number of deficiencies in the validation process, e.g. the need for common guidelines for validation and design, criteria for different types of measurements, clarification of the role of SSM, and recommendations concerning the responsibility of external participants in the validation process. The authors propose 12 measures to address the identified problems.

  16. A simple two-step method to fabricate highly transparent ITO/polymer nanocomposite films

    International Nuclear Information System (INIS)

    Liu, Haitao; Zeng, Xiaofei; Kong, Xiangrong; Bian, Shuguang; Chen, Jianfeng

    2012-01-01

    Highlights: ► A simple two-step method without a further surface modification step was employed. ► ITO nanoparticles were easily and uniformly dispersed in the polymer matrix. ► The ITO/polymer nanocomposite film had high transparency and UV/IR blocking properties. - Abstract: Transparent functional indium tin oxide (ITO)/polymer nanocomposite films were fabricated via a simple two-step approach. First, functional monodisperse ITO nanoparticles were synthesized via a facile nonaqueous solvothermal method using a bifunctional chemical agent (N-methyl-pyrrolidone, NMP) as both the reaction solvent and the surface modifier. Second, the ITO/acrylic polyurethane (PUA) nanocomposite films were fabricated by a simple sol-solution mixing method without any further surface modification step as is often employed traditionally. Flower-like ITO nanoclusters about 45 nm in diameter were monodispersed in ethyl acetate, and each nanocluster was assembled from nearly spherical nanoparticles with a primary size of 7–9 nm in diameter. The ITO nanoclusters exhibited excellent dispersibility in the PUA polymer matrix, retaining their original size without any further agglomeration. When the loading of ITO nanoclusters reached 5 wt%, the transparent functional nanocomposite film featured a transparency of more than 85% in the visible light region (at 550 nm), while cutting off about 50% of near-infrared radiation at 1500 nm and blocking about 45% of UV rays at 350 nm. It is a potential candidate for transparent functional coating applications.

  17. Steps to standardization and validation of hippocampal volumetry as a biomarker in clinical trials and diagnostic criteria for Alzheimer’s disease

    Science.gov (United States)

    Jack, Clifford R; Barkhof, Frederik; Bernstein, Matt A; Cantillon, Marc; Cole, Patricia E; DeCarli, Charles; Dubois, Bruno; Duchesne, Simon; Fox, Nick C; Frisoni, Giovanni B; Hampel, Harald; Hill, Derek LG; Johnson, Keith; Mangin, Jean-François; Scheltens, Philip; Schwarz, Adam J; Sperling, Reisa; Suhy, Joyce; Thompson, Paul M; Weiner, Michael; Foster, Norman L

    2012-01-01

    Background The promise of Alzheimer’s disease (AD) biomarkers has led to their incorporation in new diagnostic criteria and in therapeutic trials; however, significant barriers exist to widespread use. Chief among these is the lack of internationally accepted standards for quantitative metrics. Hippocampal volumetry is the most widely studied quantitative magnetic resonance imaging (MRI) measure in AD and thus represents the most rational target for an initial effort at standardization. Methods and Results The authors of this position paper propose a path toward this goal. The steps include: 1) Establish and empower an oversight board to manage and assess the effort, 2) Adopt the standardized definition of anatomic hippocampal boundaries on MRI arising from the EADC-ADNI hippocampal harmonization effort as a Reference Standard, 3) Establish a scientifically appropriate, publicly available Reference Standard Dataset based on manual delineation of the hippocampus in an appropriate sample of subjects (ADNI), and 4) Define minimum technical and prognostic performance metrics for validation of new measurement techniques using the Reference Standard Dataset as a benchmark. Conclusions Although manual delineation of the hippocampus is the best available reference standard, practical application of hippocampal volumetry will require automated methods. Our intent is to establish a mechanism for credentialing automated software applications to achieve internationally recognized accuracy and prognostic performance standards that lead to the systematic evaluation and then widespread acceptance and use of hippocampal volumetry. The standardization and assay validation process outlined for hippocampal volumetry is envisioned as a template that could be applied to other imaging biomarkers. PMID:21784356

  18. UHPLC/MS-MS Analysis of Six Neonicotinoids in Honey by Modified QuEChERS: Method Development, Validation, and Uncertainty Measurement

    Directory of Open Access Journals (Sweden)

    Michele Proietto Galeano

    2013-01-01

    Full Text Available Rapid and reliable multiresidue analytical methods were developed and validated for the determination of 6 neonicotinoid pesticides (acetamiprid, clothianidin, imidacloprid, nitenpyram, thiacloprid, and thiamethoxam) in honey. A modified QuEChERS method allowed a very rapid and efficient single-step extraction, while detection was performed by UHPLC/MS-MS. The recovery studies were carried out by spiking the samples at two concentration levels (10 and 40 μg/kg). The methods were subjected to a thorough validation procedure. The mean recovery was in the range of 75 to 114% with repeatability below 20%. The limits of detection were below 2.5 μg/kg, while the limits of quantification did not exceed 4.0 μg/kg. The total uncertainty was evaluated taking the main independent uncertainty sources under consideration. The expanded uncertainty did not exceed 49% for the 10 μg/kg concentration level and was in the range of 16–19% for the 40 μg/kg fortification level.
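
    The expanded-uncertainty evaluation mentioned above typically combines independent relative uncertainty components in quadrature and applies a coverage factor. The component values below are illustrative assumptions, not the paper's actual uncertainty budget.

    ```python
    # Sketch: combined standard uncertainty from independent relative
    # components, expanded with coverage factor k = 2 (~95 % confidence).

    def expanded_uncertainty(components, k=2.0):
        """Combine independent relative uncertainties (in %) in quadrature, expand by k."""
        combined = sum(u ** 2 for u in components) ** 0.5
        return k * combined

    # hypothetical components: precision, recovery/bias, calibration (all in %)
    u_components = [6.0, 5.0, 4.0]
    U_rel = expanded_uncertainty(u_components)
    ```

    The quadrature sum is valid only if the components are genuinely independent, which is why the abstract stresses "main independent uncertainty sources".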

  19. Validation of calculational methods for nuclear criticality safety - approved 1975

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The American National Standard for Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors, N16.1-1975, states in 4.2.5: In the absence of directly applicable experimental measurements, the limits may be derived from calculations made by a method shown to be valid by comparison with experimental data, provided sufficient allowances are made for uncertainties in the data and in the calculations. There are many methods of calculation, which vary widely in basis and form. Each has its place in the broad spectrum of problems encountered in the nuclear criticality safety field; however, the general procedure to be followed in establishing validity is common to all. The standard states the requirements for establishing the validity and area(s) of applicability of any calculational method used in assessing nuclear criticality safety.

  20. Experimental validation of a method characterizing bow tie filters in CT scanners using a real-time dose probe

    International Nuclear Information System (INIS)

    McKenney, Sarah E.; Nosratieh, Anita; Gelskey, Dale; Yang Kai; Huang Shinying; Chen Lin; Boone, John M.

    2011-01-01

    Purpose: Beam-shaping or "bow tie" (BT) filters are used to spatially modulate the x-ray beam in a CT scanner, but the conventional method of step-and-shoot measurement to characterize a beam's profile is tedious and time-consuming. The characterization of bow tie relative attenuation (COBRA) method, which relies on a real-time dosimeter to address the issues of conventional measurement techniques, was previously demonstrated using computer simulations. In this study, the feasibility of the COBRA theory is further validated experimentally using a prototype real-time radiation meter and a known BT filter. Methods: The COBRA method consisted of four basic steps: (1) the probe was placed at the edge of the scanner's field of view; (2) a real-time signal train was collected as the scanner's gantry rotated with the x-ray beam on; (3) the signal train without a BT filter was modeled using the peak values measured in the signal train of step 2; and (4) the relative attenuation of the BT filter was estimated from the filtered and unfiltered data sets. The prototype probe was first verified to have an isotropic and linear response to incident x-rays. The COBRA method was then tested on a dedicated breast CT scanner with a custom-designed BT filter and compared to the conventional step-and-shoot characterization of the BT filter. Using basis decomposition of dual-energy signal data, the thickness of the filter was estimated and compared to the BT filter's manufacturing specifications. The COBRA method was also demonstrated on a clinical whole-body CT scanner using the body BT filter. The relative attenuation was calculated at four discrete x-ray tube potentials and used to estimate the thickness of the BT filter. Results: The prototype probe was found to have a linear and isotropic response to x-rays. The relative attenuation produced by the COBRA method fell within the error of the relative attenuation measured with the step-and-shoot method.
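
    The core of step (4) is a pointwise ratio of the filtered signal train to the unfiltered model. A minimal sketch, with invented signal values and the unfiltered model reduced to a flat peak-value train:

    ```python
    # Sketch of the COBRA idea: relative attenuation of the bow tie filter as
    # the ratio of the filtered signal train to a model of the unfiltered
    # signal (here simply its peak value at every sample).

    def relative_attenuation(filtered_signal, unfiltered_model):
        """Pointwise ratio of filtered to unfiltered signal (1.0 = no attenuation)."""
        return [f / u for f, u in zip(filtered_signal, unfiltered_model)]

    # one simulated gantry rotation sampled at a few angles
    unfiltered = [100.0, 100.0, 100.0, 100.0, 100.0]   # flat, from peak model
    filtered   = [40.0, 75.0, 100.0, 75.0, 40.0]       # bow tie: thin at center

    atten = relative_attenuation(filtered, unfiltered)
    ```

    The symmetric, center-peaked attenuation profile is the signature of a bow tie filter: thin in the middle of the field of view and thick toward the edges.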

  1. Reliability and validity of ten consumer activity trackers

    NARCIS (Netherlands)

    Kooiman, Thea; Dontje, Manon L.; Sprenger, Siska; Krijnen, Wim; van der Schans, Cees; de Groot, Martijn

    2015-01-01

    Background: Activity trackers can potentially stimulate users to increase their physical activity behavior. The aim of this study was to examine the reliability and validity of ten consumer activity trackers for measuring step count in both laboratory and free-living conditions. Method: Healthy

  2. Nonlinear Stability and Convergence of Two-Step Runge-Kutta Methods for Volterra Delay Integro-Differential Equations

    Directory of Open Access Journals (Sweden)

    Haiyan Yuan

    2013-01-01

    Full Text Available This paper introduces the stability and convergence of two-step Runge-Kutta methods with compound quadrature formula for solving nonlinear Volterra delay integro-differential equations. First, the definitions of (k,l)-algebraic stability and asymptotic stability are introduced, and the asymptotic stability of a (k,l)-algebraically stable two-step Runge-Kutta method is proved. It is then proved that if a two-step Runge-Kutta method is algebraically stable and diagonally stable and its generalized stage order is p, then the method with compound quadrature formula is D-convergent of order at least min{p,ν}, where ν depends on the compound quadrature formula.

  3. A method of validating climate models in climate research with a view to extreme events; Eine Methode zur Validierung von Klimamodellen fuer die Klimawirkungsforschung hinsichtlich der Wiedergabe extremer Ereignisse

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, U

    2000-08-01

    A method is presented to validate climate models with respect to extreme events that are relevant for risk assessment in impact modeling. The algorithm is intended to complement conventional techniques. These procedures mainly compare simulation results with reference data based on single or only a few climatic variables at a time, under the aspect of how well a model reproduces the known physical processes of the atmosphere. Such investigations are often based on seasonal or annual mean values. For impact research, however, extreme climatic conditions with shorter typical time scales are generally more interesting. Furthermore, such extreme events are frequently characterized by combinations of individual extremes, which require a multivariate approach. The validation method presented here basically consists of a combination of several well-known statistical techniques, completed by a newly developed diagnosis module to quantify model deficiencies. First of all, critical threshold values of key climatic variables for impact research have to be derived, serving as criteria to define extreme conditions for a specific activity. Unlike in other techniques, the simulation results to be validated are interpolated to the reference data sampling points in the initial step of this new technique. Besides the fact that both data sets are thereby given the same spatial representation for the next diagnostic steps, this procedure also makes it possible to leave the reference basis unchanged for any type of model output and to perform the validation on a real orography. To simultaneously identify the spatial characteristics of a given situation regarding all considered extreme value criteria, a multivariate cluster analysis method for pattern recognition is applied separately to the simulation results and the reference data. Afterwards, various distribution-free statistical tests are applied, depending on the specific situation, to detect statistically significant
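
    One ingredient of the scheme described above can be sketched simply: flag "extreme" cases at each reference sampling point when several variables jointly exceed impact-relevant thresholds, then compare the model's flags with the reference's. The variables, thresholds, and data below are invented for illustration.

    ```python
    # Sketch: multivariate extreme-event flagging at reference sampling points
    # and a simple agreement score between model output and reference data.

    def extreme_mask(records, thresholds):
        """True where every variable exceeds its critical threshold."""
        return [all(rec[k] > thresholds[k] for k in thresholds) for rec in records]

    thresholds = {"tmax_C": 30.0, "dry_days": 10}   # illustrative criteria

    reference = [{"tmax_C": 32.0, "dry_days": 12},
                 {"tmax_C": 28.0, "dry_days": 15},
                 {"tmax_C": 33.0, "dry_days": 11}]
    model     = [{"tmax_C": 31.0, "dry_days": 11},
                 {"tmax_C": 29.0, "dry_days": 9},
                 {"tmax_C": 27.0, "dry_days": 14}]

    ref_mask = extreme_mask(reference, thresholds)
    mod_mask = extreme_mask(model, thresholds)
    agreement = sum(r == m for r, m in zip(ref_mask, mod_mask)) / len(ref_mask)
    ```

    Requiring joint exceedance of all thresholds captures the multivariate character of impact-relevant extremes that single-variable validation misses.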

  5. Study on evaluation method for image quality of radiograph by step plate, (2)

    International Nuclear Information System (INIS)

    Terada, Yukihiro; Hirayama, Kazuo; Katoh, Mitsuaki.

    1992-01-01

    Recently, penetrameter sensitivity has been used not only for the evaluation of radiographic image quality but also as a control method for examination conditions. However, it is necessary to take parametric data for the radiation quality in order to use it for the second purpose. The quantitative factor of radiation quality is determined by the absorption coefficient and the ratio of scattered radiation to transmitted radiation reaching the X-ray film. When the X-ray equipment used for the radiographic examination changes, these data must be measured in each case. This is a drawback of controlling examination conditions based on parametric data. As shown theoretically in the first report, the image quality value of a step plate, which is defined as the density difference divided by the film contrast and the step plate thickness, is useful for obtaining the value of the radiation quality factor. This report deals with an experimental investigation to measure it with the step plate. The results show that the value of the radiation quality factor calculated from the parametric data corresponds well with the image quality value measured by the step plate. Therefore, a convenient method to measure the value of the radiation quality factor has been established for controlling examination conditions in radiographic examination. (author)
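
    The image quality value defined in the abstract is a simple quotient; a minimal sketch with invented numbers (the paper does not give example values):

    ```python
    # Sketch: image quality value of a step plate, defined as the density
    # difference divided by the film contrast and the step plate thickness.

    def image_quality_value(density_diff, film_contrast, thickness_mm):
        """Image quality value = dD / (film contrast * step thickness)."""
        return density_diff / (film_contrast * thickness_mm)

    iq = image_quality_value(density_diff=0.15, film_contrast=4.0, thickness_mm=2.0)
    ```

    Normalizing by film contrast and thickness is what makes the value comparable across films and step plates, so it can serve as a proxy for the radiation quality factor.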

  6. Moving beyond Traditional Methods of Survey Validation

    Science.gov (United States)

    Maul, Andrew

    2017-01-01

    In his focus article, "Rethinking Traditional Methods of Survey Validation," published in this issue of "Measurement: Interdisciplinary Research and Perspectives," Andrew Maul wrote that it is commonly believed that self-report, survey-based instruments can be used to measure a wide range of psychological attributes, such as…

  7. A simple three step method for selective placement of organic groups in mesoporous silica thin films

    Energy Technology Data Exchange (ETDEWEB)

    Franceschini, Esteban A. [Gerencia Química, Centro Atómico Constituyentes, Comisión Nacional de Energía Atómica, Av. Gral Paz 1499 (B1650KNA) San Martín, Buenos Aires (Argentina); Llave, Ezequiel de la; Williams, Federico J. [Departamento de Química Inorgánica, Analítica y Química Física and INQUIMAE-CONICET, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria, Pabellón II, C1428EHA Buenos Aires (Argentina); Soler-Illia, Galo J.A.A., E-mail: galo.soler.illia@gmail.com [Departamento de Química Inorgánica, Analítica y Química Física and INQUIMAE-CONICET, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria, Pabellón II, C1428EHA Buenos Aires (Argentina); Instituto de Nanosistemas, Universidad Nacional de General San Martín, 25 de Mayo y Francia (1650) San Martín, Buenos Aires (Argentina)

    2016-02-01

    Selective functionalization of mesoporous silica thin films was achieved using a three-step method. The first step consists of outer-surface functionalization, followed by washing off the structuring agent (second step), leaving the inner surface of the pores free to be functionalized in the third step. This reproducible method permits anchoring of a volatile silane group on the outer film surface and of a second type of silane group on the inner surface of the pores. As a proof of concept, we modified the outer surface of a mesoporous silica film with trimethylsilane (–Si–(CH{sub 3}){sub 3}) groups and the inner pore surface with propylamino (–Si–(CH{sub 2}){sub 3}–NH{sub 2}) groups. The obtained silica films were characterized by Environmental Ellipsometric Porosimetry (EEP), EDS, XPS, contact angle, and electron microscopy. The selectively functionalized (SF) silica shows 4.3 times fewer surface amino functions than the one-step functionalized (OSF) silica samples. The method presented here can be extended to combinations of silane chlorides and alkoxides as functional groups, opening up a new route toward the synthesis of multifunctional mesoporous thin films with precisely localized organic functions. - Highlights: • Selective functionalization of mesoporous silica thin films was achieved using a three-step method. • A volatile silane group is anchored by evaporation onto the outer film surface. • A second silane is deposited on the inner surface of the pores by post-grafting. • Contact angle, EDS, and XPS measurements show different proportions of amino groups on the two surfaces. • The method can be extended to combinations of silane chlorides and alkoxides as functional groups.

  8. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  9. Numerical characterisation of one-step and three-step solar air heating collectors used for cocoa bean solar drying.

    Science.gov (United States)

    Orbegoso, Elder Mendoza; Saavedra, Rafael; Marcelo, Daniel; La Madrid, Raúl

    2017-12-01

    In the northern coastal and jungle areas of Peru, cocoa beans are dried using artisan methods, such as direct exposure to sunlight. This traditional process is time intensive, leading to a reduction in productivity and, therefore, delays in delivery times. The present study was intended to numerically characterise the thermal behaviour of three configurations of solar air heating collectors in order to determine which demonstrated the best thermal performance under several controlled operating conditions. For this purpose, a computational fluid dynamics model was developed to describe the simultaneous convective and radiative heat transfer phenomena under several operating conditions. The constructed computational fluid dynamics model was first validated through comparison with the measured data of a one-step solar air heating collector. We then simulated two further three-step solar air heating collectors in order to identify which demonstrated the best thermal performance in terms of outlet air temperature and thermal efficiency. The numerical results show that, under the same solar irradiation exposure area and operating conditions, the three-step solar air heating collector with the collector plate mounted between the second and third channels was 67% more thermally efficient than the one-step solar air heating collector. This is because the air's exposure to the collector plate surface in the three-step device was twice that in the one-step collector. Copyright © 2017 Elsevier Ltd. All rights reserved.
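
    The thermal efficiency compared in this record is conventionally the useful air-side heat gain over the incident solar power. A back-of-envelope sketch; the operating point below is an illustrative assumption, not the study's CFD results.

    ```python
    # Sketch: collector thermal efficiency from an air-side energy balance,
    # eta = m_dot * cp * (T_out - T_in) / (G * A).

    def thermal_efficiency(m_dot, t_in, t_out, irradiance, area, cp=1005.0):
        """Efficiency of a solar air heater (cp of air in J/(kg K))."""
        useful_heat = m_dot * cp * (t_out - t_in)   # W
        incident = irradiance * area                # W
        return useful_heat / incident

    # hypothetical operating point: 0.02 kg/s of air heated from 25 to 55 degC
    # under 900 W/m2 on a 1.5 m2 collector
    eta = thermal_efficiency(m_dot=0.02, t_in=25.0, t_out=55.0,
                             irradiance=900.0, area=1.5)
    ```

    Doubling the air's contact with the absorber plate raises T_out for the same irradiated area, which is how the three-step layout gains its efficiency advantage.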

  10. SYSTEMATIZATION OF THE BASIC STEPS OF THE STEP-AEROBICS

    Directory of Open Access Journals (Sweden)

    Darinka Korovljev

    2011-03-01

    Full Text Available The development of the powerful sports industry has opened many new opportunities for creating new programmes of exercising with specific equipment. One such programme is step aerobics. Step aerobics can be defined as a type of aerobics consisting of the basic aerobic steps (basic steps) performed on a stepper (step bench) whose height can be adjusted. Step aerobics itself can be divided into several groups, depending on the type of music, the working methods, and the prior knowledge of the attendants. In this work, the basic steps of step aerobics were systematized on the basis of the following criteria: the origin of the step, the number of leg motions in stepping, and the body support at the end of the step. Systematization of the basic steps of step aerobics is significant for providing a concrete review of the existing basic steps, thus making the creation of a step aerobics lesson easier.

  11. LiLEDDA: A Six-Step Forum-Based Netnographic Research Method for Nursing Science

    Directory of Open Access Journals (Sweden)

    MARTIN SALZMANN-ERIKSON

    2012-01-01

    Full Text Available Internet research methods in nursing science are less developed than in other sciences. We chose to present an approach to conducting nursing research on an internet-based forum. This paper presents LiLEDDA, a six-step forum-based netnographic research method for nursing science. The steps consist of: 1. Literature review and identification of the research question(s); 2. Locating the field(s) online; 3. Ethical considerations; 4. Data gathering; 5. Data analysis and interpretation; and 6. Abstractions and trustworthiness. Traditional research approaches are limiting when studying non-normative and non-mainstream life-worlds and their cultures. We argue that it is timely to develop more up-to-date research methods and study designs applicable to nursing science that reflect social developments and human living conditions, which tend to be increasingly online-based.

  12. A validated RP-HPLC method for the determination of Irinotecan hydrochloride residues for cleaning validation in production area

    Directory of Open Access Journals (Sweden)

    Sunil Reddy

    2013-03-01

Introduction: Cleaning validation is an integral part of current good manufacturing practices in the pharmaceutical industry. Its main purpose is to prove the effectiveness and consistency of cleaning of a given piece of pharmaceutical production equipment, preventing cross-contamination and adulteration of drug products with other active ingredients. Objective: A rapid, sensitive and specific reverse-phase HPLC method was developed and validated for the quantitative determination of irinotecan hydrochloride in cleaning validation swab samples. Method: The method used a Waters Symmetry Shield RP-18 (250 mm × 4.6 mm, 5 µm) column with an isocratic mobile phase containing a mixture of 0.02 M potassium dihydrogen orthophosphate (pH adjusted to 3.5 with orthophosphoric acid), methanol and acetonitrile (60:20:20, v/v/v). The flow rate was 1.0 mL/min, the column temperature 25°C and the detection wavelength 220 nm. The sample injection volume was 100 µL. Results: The calibration curve was linear over a concentration range of 0.024 to 0.143 µg/mL with a correlation coefficient of 0.997. Intra-day and inter-day precision, expressed as relative standard deviation, were below 3.2%. Recoveries from stainless steel, PCGI, epoxy, glass and Dacron cloth surfaces were above 85%, with no interference from the cotton swab. The detection limit (DL) and quantitation limit (QL) were 0.008 and 0.023 µg/mL, respectively. Conclusion: The developed method was validated with respect to specificity, linearity, limits of detection and quantification, accuracy, precision and solution stability. The overall procedure can be used as part of a cleaning validation program in the pharmaceutical manufacture of irinotecan hydrochloride.
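The linearity and detection/quantitation limits reported in the abstract follow standard ICH-style calculations (DL = 3.3·σ/S, QL = 10·σ/S from the residual standard deviation σ and slope S of the calibration line). A minimal sketch, using hypothetical calibration points rather than the paper's raw data:

```python
# Minimal sketch: least-squares calibration line plus ICH-style DL/QL
# estimates. Concentrations and peak areas below are hypothetical.

def linear_fit(x, y):
    """Return slope, intercept and correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5

conc = [0.024, 0.048, 0.072, 0.096, 0.120, 0.143]   # ug/mL (hypothetical)
area = [1010, 2080, 2990, 4100, 5060, 6020]          # detector response

slope, intercept, r = linear_fit(conc, area)

# residual standard deviation of the regression
resid = [a - (slope * c + intercept) for c, a in zip(conc, area)]
s_res = (sum(e * e for e in resid) / (len(conc) - 2)) ** 0.5

dl = 3.3 * s_res / slope    # detection limit, ug/mL
ql = 10.0 * s_res / slope   # quantitation limit, ug/mL
```
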

  13. Combined Effects of Numerical Method Type and Time Step on Water Stressed Actual Crop ET

    Directory of Open Access Journals (Sweden)

    B. Ghahraman

    2016-02-01

Introduction: Actual crop evapotranspiration (ETa) is important in hydrologic modeling and irrigation water management. Actual ET depends on an estimate of a water stress index and of the average soil water in the crop root zone, and therefore on the chosen numerical method and the adopted time step. During periods with no rainfall and/or irrigation, actual ET can be computed analytically or by different numerical methods. Overall, many factors influence actual evapotranspiration: crop potential evapotranspiration, available root-zone water content, time step, crop sensitivity, and soil. In this paper different numerical methods are compared for different soil textures and different crop sensitivities. Materials and Methods: During a specific time step with no rainfall or irrigation, the change in soil water content equals the evapotranspiration, ET. In this approach deep percolation is ignored, owing to a deep water table and negligible unsaturated hydraulic conductivity below the rooting depth. The resulting differential equation may be solved analytically or numerically using different algorithms. We adopted the following numerical methods to approximate the differential equation: explicit, implicit and modified Euler; the midpoint method; and the third-order Heun method. Three general soil types (sand, silt, and clay) and three crop types (sensitive, moderate, and resistant) under the Nishaboor plain were used. The standard soil fraction depletion (corresponding to ETc = 5 mm d⁻¹), pstd, below which the crop faces water stress, was adopted for crop sensitivity. Three values of pstd were considered to cover the common crops in the area, including winter wheat, barley, cotton, alfalfa, sugar beet and saffron, among others. Based on this parameter, three classes of crop sensitivity were considered: sensitive crops with pstd=0.2, moderate crops with pstd=0.5, and resistant crops with pstd=0
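The time-stepping comparison described above can be sketched for the simplest case: a rain-free drydown of root-zone water W with dW/dt = -ET(W), where ET equals the potential rate above the stress threshold set by pstd and declines linearly below it. All parameter values here are illustrative, not the paper's:

```python
# Illustrative comparison of explicit Euler vs third-order Heun for the
# rain-free root-zone water balance dW/dt = -ET(W).

TAW = 100.0     # total available water in the root zone, mm (hypothetical)
ETC = 5.0       # potential crop ET, mm/day
PSTD = 0.5      # standard depletion fraction (moderate crop)

def et(w):
    """Actual ET as a function of remaining root-zone water w (mm)."""
    threshold = (1.0 - PSTD) * TAW   # stress begins once w drops below this
    return ETC if w >= threshold else ETC * w / threshold

def euler(w0, dt, days):
    w = w0
    for _ in range(round(days / dt)):
        w += dt * (-et(w))
    return w

def heun3(w0, dt, days):
    """Third-order Heun: internal stages at t + dt/3 and t + 2dt/3."""
    w = w0
    for _ in range(round(days / dt)):
        k1 = -et(w)
        k2 = -et(w + dt / 3.0 * k1)
        k3 = -et(w + 2.0 * dt / 3.0 * k2)
        w += dt * (k1 + 3.0 * k3) / 4.0
    return w

coarse = euler(TAW, 1.0, 20)    # 1-day explicit Euler
heun = heun3(TAW, 1.0, 20)      # 1-day third-order Heun
fine = euler(TAW, 0.01, 20)     # near-exact reference
```

With a 1-day step the higher-order method stays close to the fine-step reference while plain Euler under-predicts the remaining water, which is the kind of method/time-step interaction the paper examines.
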

  14. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as advances in method-engineering procedures that have improved the efficiency of the process. The initiative has also allowed researchers to clear many of the barriers that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling.

  15. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

Workplace injuries are costing the Australian coal mining industry and its communities $410 million a year. This ACARP study aims to address that problem by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands, and an overall PEFA score between 1 and 4 is given, with 1 being the best score. The reliability and validity studies were conducted concurrently. The reliability study examined test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, sufficient for comparison with injury data in determining the validity of the assessment. The overall assessment score and material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open-cut coal mine with their injury records. A predictive relationship was found between PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between a PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury compared to other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment method.

  16. Development and Validation of a Multiresidue Method for the Determination of Pesticides in Dry Samples (Rice and Wheat Flour) Using Liquid Chromatography/Triple Quadrupole Tandem Mass Spectrometry.

    Science.gov (United States)

    Grande-Martínez, Ángel; Arrebola, Francisco Javier; Moreno, Laura Díaz; Vidal, José Luis Martínez; Frenich, Antonia Garrido

    2015-01-01

A rapid and sensitive multiresidue method was developed and validated for the determination of around 100 pesticides in dry samples (rice and wheat flour) by ultra-performance LC coupled to a triple quadrupole mass analyzer working in tandem mode (UPLC/QqQ-MS/MS). The sample preparation step was optimized for both matrices. Pesticides were extracted from rice samples using aqueous ethyl acetate, while aqueous acetonitrile extraction [modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) method] was used for wheat flour matrices. In both cases the extracts were then cleaned up by dispersive solid phase extraction with MgSO4 and primary secondary amine + C18 sorbents. A further cleanup step with Florisil was necessary to remove fat in wheat flour. The method was validated at two concentration levels (3.6 and 40 μg/kg for most compounds), obtaining recoveries ranging from 70 to 120%, intraday and interday precision values ≤20% expressed as RSDs, and expanded uncertainty values ≤50%. The LOQ values ranged between 3.6 and 20 μg/kg, although it was set at 3.6 μg/kg for the majority of the pesticides. The method was applied to the analysis of 20 real samples, and no pesticides were detected.
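The acceptance criteria quoted above (mean recovery 70-120%, RSD ≤ 20%) can be checked mechanically per validation level; a small sketch with hypothetical replicate recoveries, not the study's data:

```python
# Sketch of the per-level recovery / precision acceptance check.

def mean(xs):
    return sum(xs) / len(xs)

def rsd(xs):
    """Relative standard deviation, %, using the n-1 sample formula."""
    m = mean(xs)
    sd = (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
    return 100.0 * sd / m

def level_passes(recoveries, lo=70.0, hi=120.0, max_rsd=20.0):
    """Accept a level if mean recovery lies in [lo, hi] and RSD <= max_rsd."""
    return lo <= mean(recoveries) <= hi and rsd(recoveries) <= max_rsd

level_3_6 = [92.1, 88.4, 95.0, 90.2, 85.7]   # % recovery at 3.6 ug/kg, n=5
```
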

  17. Validation of an HPLC-UV method for the determination of ceftriaxone sodium residues on stainless steel surface of pharmaceutical manufacturing equipments.

    Science.gov (United States)

    Akl, Magda A; Ahmed, Mona A; Ramadan, Ahmed

    2011-05-15

In the pharmaceutical industry, an important step consists in the removal of possible drug residues from the equipment and areas involved. Cleaning procedures must be validated, and methods to determine trace amounts of drugs therefore deserve special attention. An HPLC-UV method for the determination of ceftriaxone sodium residues on stainless steel surfaces was developed and validated in order to control a cleaning procedure. Cotton swabs, moistened with extraction solution (50% water and 50% mobile phase), were used to remove any drug residues from stainless steel surfaces, giving recoveries of 91.12, 93.8 and 98.7% at three concentration levels. The precision of the results, reported as the relative standard deviation (RSD), was below 1.5%. The method was validated over a concentration range of 1.15-6.92 μg mL⁻¹. Low quantities of drug residues were determined by HPLC-UV using a Hypersil ODS 5 μm (250 × 4.6 mm) column at 50°C with an acetonitrile:water:pH 7:pH 5 (39:55:5.5:0.5) mobile phase at a flow rate of 1.5 mL min⁻¹ and an injection volume of 20 μL, with detection at 254 nm. A simple, selective and sensitive HPLC-UV assay for the determination of ceftriaxone sodium residues on stainless steel surfaces was developed, validated and applied. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. A novel enterovirus and parechovirus multiplex one-step real-time PCR-validation and clinical experience

    DEFF Research Database (Denmark)

    Nielsen, A. C. Y.; Bottiger, B.; Midgley, S. E.

    2013-01-01

As the number of new enteroviruses and human parechoviruses seems ever growing, updated diagnostics are necessary. We have updated an enterovirus assay and combined it with a previously published assay for human parechovirus, resulting in a multiplex one-step RT-PCR assay. The multiplex assay was validated by analysing its sensitivity and specificity against the respective monoplex assays, and good concordance was found. Furthermore, the enterovirus assay was able to detect 42 reference strains from all 4 species, and an additional 9 genotypes during panel testing and routine usage. During 15 months of routine use, from October 2008 to December 2009, 2187 samples (stool samples, cerebrospinal fluids, blood samples, respiratory samples and autopsy samples) from 1546 patients were received and analysed, and enteroviruses and parechoviruses were detected...
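Validating a multiplex assay against its monoplex counterparts typically reduces to sensitivity and specificity on a shared sample panel; a minimal sketch with hypothetical 2×2 counts, not the study's data:

```python
# Sketch: concordance of a multiplex assay with the monoplex reference,
# expressed as sensitivity and specificity. Counts are hypothetical.

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical panel: 50 monoplex-positive, 61 monoplex-negative samples
sens, spec = sens_spec(tp=48, fn=2, tn=60, fp=1)
```
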

  19. Validation and application of an high-order spectral difference method for flow induced noise simulation

    KAUST Repository

    Parsani, Matteo

    2011-09-01

The main goal of this paper is to develop an efficient numerical algorithm to compute the far-field noise radiated by an unsteady flow field from bodies in arbitrary motion. The method computes the turbulent flow field in the near field using a high-order spectral difference method coupled with a large-eddy simulation approach. The unsteady equations are advanced in time using a second-order backward difference formula (BDF2) scheme. The nonlinear algebraic system arising from the time discretization is solved with the nonlinear lower-upper symmetric Gauss-Seidel algorithm. In the second step, the method calculates the far-field sound pressure based on the acoustic source information provided by the first-step simulation. This step is based on the Ffowcs Williams-Hawkings approach, which provides noise contributions from monopole, dipole and quadrupole acoustic sources. The paper focuses on the validation and assessment of this hybrid approach using different test cases: a laminar flow over a two-dimensional (2D) open cavity at Re = 1.5 × 10³ and M = 0.15, and a laminar flow past a 2D square cylinder at Re = 200 and M = 0.5. In order to show the application of the numerical method to industrial cases and to assess its capability for sound field simulation, a three-dimensional turbulent flow in a muffler at Re = 4.665 × 10⁴ and M = 0.05 was chosen as a third test case. The flow results show good agreement with numerical and experimental reference solutions. Comparison of the computed noise results with reference solutions also shows that the numerical approach predicts noise accurately. © 2011 IMACS.
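The second-order backward difference formula mentioned in the abstract can be illustrated on the scalar linear test problem du/dt = λu, where the implicit stage has a closed-form solve. This is a generic sketch of the BDF2 update, not the paper's solver:

```python
# BDF2 sketch on du/dt = lam*u:
#   (3 u^{n+1} - 4 u^n + u^{n-1}) / (2 dt) = lam * u^{n+1}
# For this linear problem the implicit equation is solved in closed form.
import math

lam, dt = -1.0, 0.1

# bootstrap the two-step formula with a single backward-Euler step
u_prev = 1.0                       # u at t = 0
u = u_prev / (1.0 - dt * lam)      # u at t = dt

for _ in range(99):                # march to t = 10
    u_next = (4.0 * u - u_prev) / (3.0 - 2.0 * dt * lam)
    u_prev, u = u, u_next

exact = math.exp(lam * 10.0)       # reference solution at t = 10
```
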

  20. One-step method for the production of nanofluids

    Science.gov (United States)

    Kostic, Milivoje [Chicago, IL; Golubovic, Mihajlo [Chicago, IL; Hull, John R [Downers Grove, IL; Choi, Stephen U. S. [Napersville, IL

    2010-05-18

A one-step method and system for producing nanofluids by particle-source evaporation and deposition of the evaporant into a base fluid. The base fluid (e.g., ethylene glycol) is placed in a rotating cylindrical drum with an adjustable heater-boat-evaporator and heat exchanger-cooler apparatus. As the drum rotates, a thin liquid layer forms on the inside surface of the drum. The heater-boat-evaporator, with an evaporant material (the particle source) placed in its boat, is adjustably positioned near a portion of the rotating thin liquid layer; the evaporant material is heated, evaporating a portion of it, and the evaporated material is absorbed by the liquid film to form the nanofluid.

  1. Rapid expansion method (REM) for time‐stepping in reverse time migration (RTM)

    KAUST Repository

    Pestana, Reynam C.

    2009-01-01

We show that the wave equation solution using a conventional finite-difference scheme, commonly derived by the Taylor series approach, can be derived directly from the rapid expansion method (REM). After some mathematical manipulation we consider an analytical approximation for the Bessel function, assuming that the time step is sufficiently small. From this derivation we find that if only the first two Chebyshev polynomial terms in the rapid expansion method are kept, we obtain the second-order time finite-difference scheme that is frequently used in conventional finite-difference implementations. We then show that using more terms from the REM yields a more accurate time integration of the wave field. Consequently, we have demonstrated that the REM is more accurate than the usual finite-difference schemes and provides a wave equation solution that allows marching in large time steps without numerical dispersion while remaining numerically stable. We illustrate the method with post- and pre-stack migration results.
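The second-order scheme that the abstract identifies as the two-term Chebyshev truncation of the REM is the familiar update p^{n+1} = 2p^n - p^{n-1} + (c·Δt)²·∂²p/∂x². A toy 1-D sketch of that limit case (grid size, velocity and time step are illustrative, not the paper's setup):

```python
# Toy 1-D second-order time stepping for the acoustic wave equation.
import math

nx, dx = 201, 10.0          # grid points, spacing (m)
dt, c = 0.001, 2000.0       # time step (s), velocity (m/s); CFL = c*dt/dx = 0.2

# Gaussian pressure pulse centred on the grid
p = [math.exp(-0.001 * ((i - 100) * dx) ** 2) for i in range(nx)]
p_old = p[:]                # start from rest (zero initial time derivative)

for _ in range(50):
    lap = [0.0] * nx        # second spatial derivative, zero at boundaries
    for i in range(1, nx - 1):
        lap[i] = (p[i - 1] - 2.0 * p[i] + p[i + 1]) / dx ** 2
    p_new = [2.0 * p[i] - p_old[i] + (c * dt) ** 2 * lap[i] for i in range(nx)]
    p_old, p = p, p_new
# the pulse splits into left- and right-going waves; with CFL < 1 the
# amplitude stays bounded
```

With more Chebyshev terms (the full REM), the abstract's point is that the same marching can be done stably with much larger Δt than this truncated scheme allows.
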

  2. Novel two-step method to form silk fibroin fibrous hydrogel

    International Nuclear Information System (INIS)

    Ming, Jinfa; Li, Mengmeng; Han, Yuhui; Chen, Ying; Li, Han; Zuo, Baoqi; Pan, Fukui

    2016-01-01

Hydrogels prepared from silk fibroin solution have been studied. However, mimicking the nanofibrous structures of the extracellular matrix when fabricating biomaterials remains a challenge. Here, a novel two-step method was applied to prepare fibrous hydrogels using regenerated silk fibroin solution containing nanofibrils in a range of tens to hundreds of nanometers. When gelation of the silk solution occurred, it showed a top-down type gel within 30 min. After gelation, the silk fibroin fibrous hydrogels exhibited a nanofiber network morphology with β-sheet structure. Moreover, the compressive stress and modulus of fibrous hydrogels formed from 2.0 wt.% solutions were 31.9 ± 2.6 and 2.8 ± 0.8 kPa, respectively. In addition, the fibrous hydrogels supported BMSC attachment and proliferation over 12 days. This study provides important insight into the in vitro processing of silk fibroin into useful new materials. - Highlights: • SF fibrous hydrogel was prepared by a novel two-step method. • SF solution containing nanofibrils in a range of tens to hundreds of nanometers was prepared. • The gelation process was a top-down type gel forming within several minutes. • SF fibrous hydrogels exhibited nanofiber network morphology with β-sheet structure. • Fibrous hydrogels had compressive stresses superior to porous hydrogels.

  3. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, and the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems and software, up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools, and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods and non-testing approaches such as predictive computer models, up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help address legislative requirements (e.g. the EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised at the international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  4. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rosas-Castor, J.M. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Portugal, L.; Ferrer, L. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Guzmán-Mar, J.L.; Hernández-Ramírez, A. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Cerdà, V. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinojosa-Reyes, L., E-mail: laura.hinojosary@uanl.edu.mx [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico)

    2015-05-18

Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA-HG-AFS system included a UV photo-oxidation step for degradation of organic species. • The accuracy and precision of the proposed method were satisfactory. • The analysis time can be reduced up to eightfold by using the proposed flow-based BCR method. • Labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for fractionation of the arsenic (As) content of agricultural soil, based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters affecting the performance of the automated system were optimized with a multivariate approach using a Doehlert design. The flow-based modified-BCR method was validated by comparison with the conventional BCR method. The total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or exchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L⁻¹ for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L⁻¹ for F1, F2, and F3, respectively. The precision of the automated MSFIA-HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L⁻¹ As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural
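Once the three BCR fractions are measured, the labile share quoted in the highlights is simple arithmetic; a sketch with hypothetical fraction concentrations, not the paper's measurements:

```python
# Arithmetic behind the "labile As (F1 + F2) < 50% of total" highlight.
# Fraction concentrations (mg/kg) below are hypothetical.
f1 = 0.8        # acid-soluble / exchangeable
f2 = 2.1        # reducible
f3 = 4.5        # oxidizable
residual = 3.6  # non-extractable remainder

total = f1 + f2 + f3 + residual
labile_pct = 100.0 * (f1 + f2) / total   # mobile, environmentally available share
```
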

  5. The Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE): Construct and Content Validation Using a Modified Delphi Method.

    Science.gov (United States)

    Paquette-Warren, Jann; Tyler, Marie; Fournie, Meghan; Harris, Stewart B

    2017-06-01

In order to scale up successful innovations, more evidence is needed to evaluate programs that attempt to address the rising prevalence of diabetes and the associated burdens on patients and the healthcare system. This study aimed to assess the construct and content validity of the Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE), a tool developed to guide evaluation, design and implementation with built-in knowledge translation principles. A modified Delphi method, including 3 individual rounds (questionnaire with 7-point agreement/importance Likert scales and/or open-ended questions) and 1 group round (open discussion), was conducted. Twelve experts in diabetes, research, knowledge translation, evaluation and policy from Canada (Ontario, Quebec and British Columbia) and Australia participated. The quantitative consensus criterion was an interquartile range of ≤1. Qualitative data were analyzed thematically and confirmed by participants. An importance scale was used to determine a priority multi-level indicator set. Items rated very or extremely important by 80% or more of the experts were reviewed in the final group round to build the final set. Participants reached consensus on the content and construct validity of DEFINE, including its title, overall goal, 5-step evaluation approach, medical and nonmedical determinants of health schematics, full list of indicators and associated measurement tools, priority multi-level indicator set and next steps in DEFINE's development. Validated by experts, DEFINE has the right theoretical components to comprehensively evaluate diabetes prevention and management programs and to support acquisition of evidence that could influence the knowledge translation of innovations to reduce the burden of diabetes. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
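The quantitative consensus rule used here, an interquartile range of ≤1 on the 7-point Likert scale, is easy to compute per item; a sketch with hypothetical expert ratings:

```python
# Sketch of the Delphi consensus rule: IQR <= 1 on a 7-point Likert scale.
# The 12 expert ratings below are hypothetical.

def iqr(ratings):
    """Interquartile range with linear interpolation between closest ranks."""
    xs = sorted(ratings)
    n = len(xs)

    def quantile(q):
        pos = q * (n - 1)
        lo, frac = int(pos), pos - int(pos)
        hi = min(lo + 1, n - 1)
        return xs[lo] + frac * (xs[hi] - xs[lo])

    return quantile(0.75) - quantile(0.25)

panel = [6, 6, 7, 6, 5, 6, 7, 6, 6, 5, 7, 6]   # one item, 12 experts
consensus = iqr(panel) <= 1                     # item retained if True
```
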

  6. Validation of Helicopter Gear Condition Indicators Using Seeded Fault Tests

    Science.gov (United States)

    Dempsey, Paula; Brandon, E. Bruce

    2013-01-01

    A "seeded fault test" in support of a rotorcraft condition based maintenance program (CBM), is an experiment in which a component is tested with a known fault while health monitoring data is collected. These tests are performed at operating conditions comparable to operating conditions the component would be exposed to while installed on the aircraft. Performance of seeded fault tests is one method used to provide evidence that a Health Usage Monitoring System (HUMS) can replace current maintenance practices required for aircraft airworthiness. Actual in-service experience of the HUMS detecting a component fault is another validation method. This paper will discuss a hybrid validation approach that combines in service-data with seeded fault tests. For this approach, existing in-service HUMS flight data from a naturally occurring component fault will be used to define a component seeded fault test. An example, using spiral bevel gears as the targeted component, will be presented. Since the U.S. Army has begun to develop standards for using seeded fault tests for HUMS validation, the hybrid approach will be mapped to the steps defined within their Aeronautical Design Standard Handbook for CBM. This paper will step through their defined processes, and identify additional steps that may be required when using component test rig fault tests to demonstrate helicopter CI performance. The discussion within this paper will provide the reader with a better appreciation for the challenges faced when defining a seeded fault test for HUMS validation.

  7. Determination of Modafinil in Tablet Formulation Using Three New Validated Spectrophotometric Methods

    International Nuclear Information System (INIS)

    Basniwal, P.K.; Jain, D.; Basniwal, P.K.

    2014-01-01

In this study, three new UV spectrophotometric methods, viz. the linear regression equation (LRE), standard absorptivity (SA) and first-order derivative (FOD) methods, were developed and validated for the determination of modafinil in tablet form. The Beer-Lambert law was obeyed (linear response) in the range of 10-50 μg/mL, and all the methods were validated for linearity, accuracy, precision and robustness. The methods were successfully applied for assay of modafinil drug content in tablets, giving 100.20-100.42%, 100.11-100.58% and 100.25-100.34%, respectively, with acceptable standard deviation (less than two) for all methods. The validated spectrophotometric methods may be successfully applied for assay, dissolution studies and bio-equivalence studies, as well as routine analysis in the pharmaceutical industry. (author)
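Two of the three approaches (LRE and SA) differ only in how the Beer-Lambert proportionality between absorbance and concentration is estimated, from a fitted line versus a single absorptivity value. A sketch on synthetic absorbance data (all values illustrative, not the paper's):

```python
# Sketch: LRE vs SA quantitation on synthetic Beer-Lambert data (A = a*c,
# 1 cm cell). Standards and the sample reading are hypothetical.

std_conc = [10.0, 20.0, 30.0, 40.0, 50.0]        # ug/mL standards
std_abs = [0.152, 0.305, 0.451, 0.602, 0.749]    # measured absorbances

# (1) linear regression equation: c = (A - intercept) / slope
n = len(std_conc)
mx, my = sum(std_conc) / n, sum(std_abs) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(std_conc, std_abs))
         / sum((x - mx) ** 2 for x in std_conc))
intercept = my - slope * mx

# (2) standard absorptivity: a = A_std / c_std from a single standard
a = std_abs[2] / std_conc[2]

sample_abs = 0.455                    # hypothetical tablet-extract reading
c_lre = (sample_abs - intercept) / slope
c_sa = sample_abs / a
```

For well-behaved linear data the two estimates agree closely, which is why either can serve as an assay method once validated.
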

  8. Rapid expansion method (REM) for time‐stepping in reverse time migration (RTM)

    KAUST Repository

    Pestana, Reynam C.; Stoffa, Paul L.

    2009-01-01

    an analytical approximation for the Bessel function where we assume that the time step is sufficiently small. From this derivation we find that if we consider only the first two Chebyshev polynomials terms in the rapid expansion method we can obtain the second

  9. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect requirements with design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and validation method for the requirements are introduced in detail, in the hope of providing experience for other civil jet product designs.

  10. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    Science.gov (United States)

    2014-01-01

Background: Cross-cultural adaptation is a necessary process for effectively using existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice for achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods: The Iowa Infant Feeding Attitudes Scale, the Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted using a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was used, followed by committee review to address any discrepancies that emerged from translation. Results: Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions: Undertaking rigorous steps to ensure cross-cultural adaptation increases our confidence that the conclusions we draw from our self-report instrument(s) will be stronger. In this way, our aim of achieving strong cross-cultural adaptation of our consolidated instruments was met, while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population

  11. Coupling of Spinosad Fermentation and Separation Process via Two-Step Macroporous Resin Adsorption Method.

    Science.gov (United States)

    Zhao, Fanglong; Zhang, Chuanbo; Yin, Jing; Shen, Yueqi; Lu, Wenyu

    2015-08-01

In this paper, a two-step resin adsorption technology was investigated for spinosad production and separation, as follows: in the first step, resin is added to the fermentor early in the cultivation period to lower the product concentration in the broth; in the second step, resin is added after fermentation to adsorb and extract the spinosad. On this basis, a two-step macroporous resin adsorption-membrane separation process for spinosad fermentation, separation, and purification was established. Spinosad concentration in a 5-L fermentor increased by 14.45% after adding 50 g/L macroporous resin at the beginning of fermentation. The established two-step macroporous resin adsorption-membrane separation process achieved 95.43% purity and 87% yield for spinosad, both higher than those of conventional crystallization of spinosad from the aqueous phase (93.23% and 79.15%, respectively). The two-step macroporous resin adsorption method has thus not only coupled spinosad fermentation and separation but also increased spinosad productivity, and the overall adsorption-membrane separation process performs better in spinosad yield and purity.

  12. Investigating the Effectiveness of Teaching Methods Based on a Four-Step Constructivist Strategy

    Science.gov (United States)

    Çalik, Muammer; Ayas, Alipaşa; Coll, Richard K.

    2010-02-01

    This paper reports on an investigation of the effectiveness of an intervention using several different methods for teaching solution chemistry. The teaching strategy comprised a four-step approach derived from a constructivist view of learning. A sample consisting of 44 students (18 boys and 26 girls) was selected purposively from two different Grade 9 classes in the city of Trabzon, Turkey. Data collection employed a purpose-designed `solution chemistry concept test', consisting of 17 items, with the quantitative data from the survey supported by qualitative interview data. The findings suggest that using different methods embedded within the four-step constructivist-based teaching strategy enables students to refute some alternative conceptions, but does not completely eliminate student alternative conceptions for solution chemistry.

  13. A comparison of accuracy validation methods for genomic and pedigree-based predictions of swine litter size traits using Large White and simulated data.

    Science.gov (United States)

    Putz, A M; Tiezzi, F; Maltecca, C; Gray, K A; Knauer, M T

    2018-02-01

    The objective of this study was to compare and determine the optimal validation method when comparing accuracy from single-step GBLUP (ssGBLUP) to traditional pedigree-based BLUP. Field data included six litter size traits. Simulated data included ten replicates designed to mimic the field data in order to determine the method that was closest to the true accuracy. Data were split into training and validation sets. The methods used were as follows: (i) theoretical accuracy derived from the prediction error variance (PEV) of the direct inverse (iLHS), (ii) approximated accuracies from the accf90(GS) program in the BLUPF90 family of programs (Approx), (iii) correlation between predictions and the single-step GEBVs from the full data set (GEBV_Full), (iv) correlation between predictions and the corrected phenotypes of females from the full data set (Y_c), (v) correlation from method iv divided by the square root of the heritability (Y_ch) and (vi) correlation between sire predictions and the average of their daughters' corrected phenotypes (Y_cs). Accuracies from iLHS increased from 0.27 to 0.37 (37%) in the Large White. Approximation accuracies were very consistent and close in absolute value (0.41 to 0.43). Both iLHS and Approx were much less variable than the corrected phenotype methods (ranging from 0.04 to 0.27). On average, simulated data showed an increase in accuracy from 0.34 to 0.44 (29%) using ssGBLUP. Both iLHS and Y_ch approximated the increase well, 0.30 to 0.46 and 0.36 to 0.45, respectively. GEBV_Full performed poorly in both data sets and is not recommended. Results suggest that for within-breed selection, theoretical accuracy using PEV was consistent and accurate. When direct inversion is infeasible to get the PEV, correlating predictions to the corrected phenotypes divided by the square root of heritability is adequate given a large enough validation data set. © 2017 Blackwell Verlag GmbH.
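    The corrected-phenotype validation (methods iv and v above) can be sketched with simulated data; the heritability, true accuracy, and sample size below are illustrative assumptions, not values from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n, h2 = 2000, 0.10            # validation females; litter-size-like heritability

    tbv = rng.normal(0.0, 1.0, n)                        # true breeding values
    # corrected phenotypes: TBV plus residual scaled so heritability equals h2
    y_c = tbv + rng.normal(0.0, np.sqrt((1 - h2) / h2), n)
    # predictions (EBVs) constructed with a known true accuracy of 0.6
    true_acc = 0.6
    ebv = true_acc * tbv + np.sqrt(1 - true_acc**2) * rng.normal(0.0, 1.0, n)

    raw = np.corrcoef(ebv, y_c)[0, 1]   # method (iv): cor(predictions, y_c)
    adj = raw / np.sqrt(h2)             # method (v): rescale by sqrt(h2)
    ```

    Dividing by the square root of heritability recovers an estimate of the accuracy with respect to true breeding values, which is why method (v) tracks the true accuracy while the raw correlation understates it.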

  14. Analysis of the chronic lower limb injuries occurrence in step aerobic instructors in relation to their working step class profile: a three year longitudinal prospective study.

    Science.gov (United States)

    Malliou, P; Rokka, S; Beneka, A; Gioftsidou, A; Mavromoustakos, S; Godolias, G

    2014-01-01

    There is limited information on injury patterns in Step Aerobic Instructors (SAI) who exclusively teach "step" aerobic classes. The aims were to record the type and anatomical location, in relation to diagnosis, of musculoskeletal injuries in step aerobic instructors, and to analyse the days of absence due to chronic injury in relation to weekly working hours, height of the step platform, working experience, and working surface and footwear during the step class. The Step Aerobic Instructors Injuries Questionnaire was developed, and validity and reliability indices were calculated. 63 SAI completed the questionnaire. Statistical analysis comprised analysis of frequencies, the non-parametric χ² (chi-square) test, correlation, and linear and logistic regression analysis using the SPSS statistical package. The 63 SAI reported 115 injuries that required more than 2 days of absence from step aerobic classes. Chronic lower extremity injuries accounted for 73.5%, with leg pain, anterior knee pain, plantar tendinopathy and Achilles tendinopathy being the most common overuse syndromes. Working hours, platform height and years of aerobic dance experience appear to affect the days of absence due to chronic lower limb injury in SAI.

  15. Validation of pesticide multi-residue analysis method on cucumber

    International Nuclear Information System (INIS)

    2011-01-01

    In this study we aimed to validate a multi-pesticide residue analysis method on cucumber. Before real sample injection, a system suitability test was performed in gas chromatography (GC). For this purpose, a sensitive pesticide mixture was used for GC-NPD, and performance parameters such as the number of effective theoretical plates, resolution factor, asymmetry, tailing and selectivity were estimated. The system was found suitable for calibration and sample injection. Samples were fortified at levels of 0.02, 0.2, 0.8 and 1 mg/kg with a mixture of dichlorvos, malathion and chlorpyrifos pesticides. In the fortification step, 14C-carbaryl was also added to the homogenized analytical portions, making use of 14C-labelled pesticides for determining extraction efficiency. The basic analytical steps, such as ethyl acetate extraction, filtration, evaporation and cleanup, were then performed. GPC calibration using 14C-carbaryl and the fortification mixture (dichlorvos, malathion and chlorpyrifos) showed that the pesticide fraction eluted from the column between the 8-23 ml fractions. The recoveries of 14C-carbaryl after the extraction and cleanup steps were 92.63-111.73 % and 74.83-102.22 %, respectively. The stability of pesticides during analysis is an important factor. In this study, a stability test was performed including matrix effects. Our calculations and t-test results showed that the above-mentioned pesticides were not stable during sample processing under our laboratory conditions, and it was found that sample comminution with dry ice may improve stability. In the other part of the study, 14C-chlorpyrifos was used to determine the homogeneity of analytical portions taken from laboratory samples. Use of 14C-labelled pesticides allows quick quantification of the analyte, even without clean-up. The analytical results show that after sample processing with a Waring blender, analytical portions were homogeneous.
Sample processing uncertainty depending on quantity of

  16. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Keywords: Ketotifen, Cetirizine, Stability indicating method, Stressed conditions, Validation. Tropical ... in biological fluids [13] are also reported. Stability indicating HPLC method is reported for ketotifen where drug is ..... paracetamol, cetirizine.

  17. Comparison on genomic predictions using GBLUP models and two single-step blending methods with different relationship matrices in the Nordic Holstein population

    DEFF Research Database (Denmark)

    Gao, Hongding; Christensen, Ole Fredslund; Madsen, Per

    2012-01-01

    Background A single-step blending approach allows genomic prediction using information of genotyped and non-genotyped animals simultaneously. However, the combined relationship matrix in a single-step method may need to be adjusted because marker-based and pedigree-based relationship matrices may...... not be on the same scale. The same may apply when a GBLUP model includes both genomic breeding values and residual polygenic effects. The objective of this study was to compare single-step blending methods and GBLUP methods with and without adjustment of the genomic relationship matrix for genomic prediction of 16......) a simple GBLUP method, 2) a GBLUP method with a polygenic effect, 3) an adjusted GBLUP method with a polygenic effect, 4) a single-step blending method, and 5) an adjusted single-step blending method. In the adjusted GBLUP and single-step methods, the genomic relationship matrix was adjusted...

  18. s-Step Krylov Subspace Methods as Bottom Solvers for Geometric Multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lijewski, Mike [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Almgren, Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carson, Erin [Univ. of California, Berkeley, CA (United States); Knight, Nicholas [Univ. of California, Berkeley, CA (United States); Demmel, James [Univ. of California, Berkeley, CA (United States)

    2014-08-14

    Geometric multigrid solvers within adaptive mesh refinement (AMR) applications often reach a point where further coarsening of the grid becomes impractical as individual subdomain sizes approach unity. At this point the most common solution is to use a bottom solver, such as BiCGStab, to reduce the residual by a fixed factor at the coarsest level. Each iteration of BiCGStab requires multiple global reductions (MPI collectives). As the number of BiCGStab iterations required for convergence grows with problem size, and the time for each collective operation increases with machine scale, bottom solves in large-scale applications can constitute a significant fraction of the overall multigrid solve time. In this paper, we implement, evaluate, and optimize a communication-avoiding s-step formulation of BiCGStab (CABiCGStab for short) as a high-performance, distributed-memory bottom solver for geometric multigrid solvers. This is the first time s-step Krylov subspace methods have been leveraged to improve multigrid bottom solver performance. We use a synthetic benchmark for detailed analysis and integrate the best implementation into BoxLib in order to evaluate the benefit of an s-step Krylov subspace method on the multigrid solves found in the applications LMC and Nyx on up to 32,768 cores on the Cray XE6 at NERSC. Overall, we see bottom solver improvements of up to 4.2x on synthetic problems and up to 2.7x in real applications. This results in as much as a 1.5x improvement in solver performance in real applications.
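    As background, a textbook BiCGStab iteration (sketched below in NumPy on a small stand-in coarse-grid system, not the BoxLib solver itself) makes the per-iteration dot products explicit; at scale each one becomes a global MPI reduction, and these are the operations that s-step (communication-avoiding) reformulations batch together:

    ```python
    import numpy as np

    def bicgstab(A, b, tol=1e-10, maxiter=200):
        """Textbook unpreconditioned BiCGStab. Each dot product below would be
        a global reduction (MPI collective) in a distributed bottom solver --
        the latency that s-step variants such as CABiCGStab amortize."""
        x = np.zeros_like(b)
        r = b - A @ x
        r0 = r.copy()
        rho_old, alpha, omega = 1.0, 1.0, 1.0
        v = np.zeros_like(b)
        p = np.zeros_like(b)
        for k in range(maxiter):
            rho = r0 @ r                      # global reduction 1
            beta = (rho / rho_old) * (alpha / omega)
            p = r + beta * (p - omega * v)
            v = A @ p
            alpha = rho / (r0 @ v)            # global reduction 2
            s = r - alpha * v
            t = A @ s
            omega = (t @ s) / (t @ t)         # global reductions 3 and 4
            x = x + alpha * p + omega * s
            r = s - omega * t
            rho_old = rho
            if np.linalg.norm(r) < tol * np.linalg.norm(b):  # reduction 5
                break
        return x, k + 1

    # small diagonally dominant system standing in for a coarse-grid operator
    n = 50
    A = (np.diag(4.0 * np.ones(n))
         + np.diag(-1.0 * np.ones(n - 1), 1)
         + np.diag(-1.0 * np.ones(n - 1), -1))
    b = np.ones(n)
    x, iters = bicgstab(A, b)
    ```

    Counting the reductions per iteration makes the paper's motivation concrete: an s-step formulation restructures the recurrences so that several iterations' worth of these collectives are replaced by one batched communication phase.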

  19. Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.

    Science.gov (United States)

    Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy

    2010-02-01

    This paper illustrates the advantages that a fuzzy-based aggregation method can bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for the validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and decision-making. Fuzzy logic allows the information obtained from independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper and applied to validation data for different genetically modified elements, and conclusions are drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable for improving the interpretation of results and facilitating overall evaluation of the multiplex method.
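    A minimal sketch of the fuzzy aggregation idea, with hypothetical metric names, thresholds, and weights (not those of the DualChip validation): each validation statistic is mapped to a degree of favorability in [0, 1], and the degrees are combined into one synthetic indicator of overall method performance:

    ```python
    def ramp(x, bad, good):
        """Degree in [0, 1] to which metric value x is favorable.
        Direction is encoded by the ordering of (bad, good): it may be
        increasing (larger is better) or decreasing (smaller is better)."""
        t = (x - bad) / (good - bad)
        return min(1.0, max(0.0, t))

    # hypothetical per-element validation statistics for a multiplex assay
    metrics = {"sensitivity": 0.97, "specificity": 0.97, "false_neg_rate": 0.03}
    thresholds = {
        "sensitivity": (0.90, 0.99),     # larger is better
        "specificity": (0.90, 0.99),
        "false_neg_rate": (0.10, 0.01),  # smaller is better
    }
    weights = {"sensitivity": 0.4, "specificity": 0.4, "false_neg_rate": 0.2}

    degrees = {k: ramp(v, *thresholds[k]) for k, v in metrics.items()}
    # one synthetic indicator of overall performance (0 = poor, 1 = excellent)
    indicator = sum(weights[k] * degrees[k] for k in metrics)
    ```

    Real fuzzy aggregation schemes use richer membership functions and expert-weighted rule bases; the linear ramp and weighted sum here only illustrate the compression of several statistics into a single score.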

  20. The Step Method - Battling Identity Theft Using E-Retailers' Websites

    Science.gov (United States)

    Schulze, Marion; Shah, Mahmood H.

    Identity theft is the fastest growing crime in the 21st century. This paper first investigates what well-known e-commerce organizations are communicating on their websites to address this issue; for this purpose we analyze secondary data (literature and the websites of ten organizations). Secondly, we investigate good practice in this area and recommend practical steps. The key findings are that some organizations publish only the minimum security information needed to comply with legal requirements, while others inform consumers on how they actively try to prevent identity theft, how consumers can protect themselves, and what supporting actions are taken when identity-theft-related fraud actually happens. From these findings we developed the Support-Trust-Empowerment-Prevention (STEP) method. It is aimed at helping to prevent identity theft and at dealing with the consequences when it occurs. It can help organizations gain and keep consumers’ trust, which is essential for e-retailers in a climate of rising fraud.

  1. Method development and validation of potent pyrimidine derivative by UV-VIS spectrophotometer.

    Science.gov (United States)

    Chaudhary, Anshu; Singh, Anoop; Verma, Prabhakar Kumar

    2014-12-01

    A rapid and sensitive ultraviolet-visible (UV-VIS) spectroscopic method was developed for the estimation of the pyrimidine derivative 6-Bromo-3-(6-(2,6-dichlorophenyl)-2-(morpholinomethylamino)pyrimidin-4-yl)-2H-chromen-2-one (BT10M) in bulk form. The pyrimidine derivative was monitored at 275 nm with UV detection, with no interference from diluents at 275 nm. The method was found to be linear in the range of 50 to 150 μg/ml. The accuracy and precision were determined and validated statistically, and the method was validated according to guidelines. The results showed that the proposed method is suitable for the accurate, precise, and rapid determination of the pyrimidine derivative. Graphical Abstract: Method development and validation of a potent pyrimidine derivative by UV spectroscopy.
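    The linearity assessment in such a UV-VIS validation amounts to a least-squares calibration over the reported 50-150 μg/ml range; the absorbance values below are invented for illustration, not the study's data:

    ```python
    import numpy as np

    # hypothetical calibration points for BT10M at 275 nm (absorbances invented)
    conc = np.array([50.0, 75.0, 100.0, 125.0, 150.0])      # ug/ml
    absorb = np.array([0.201, 0.298, 0.402, 0.499, 0.603])  # absorbance units

    slope, intercept = np.polyfit(conc, absorb, 1)
    pred = slope * conc + intercept
    # coefficient of determination: the usual linearity criterion
    r2 = 1 - np.sum((absorb - pred) ** 2) / np.sum((absorb - absorb.mean()) ** 2)

    # back-calculate an unknown sample concentration from its absorbance
    unknown = (0.450 - intercept) / slope
    ```

    A validated method typically requires r² very close to 1 across the stated range before accuracy and precision are assessed against spiked samples.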

  2. A long-term validation of the modernised DC-ARC-OES solid-sample method.

    Science.gov (United States)

    Flórián, K; Hassler, J; Förster, O

    2001-12-01

    The validation procedure based on the ISO 17025 standard was used to study and illustrate both the long-term stability of the calibration process of the DC-ARC solid-sample spectrometric method and the main validation criteria of the method. In calculating the validation characteristics that depend on linearity (calibration), the fulfilment of prerequisite criteria such as normality and homoscedasticity was also checked. To decide whether there are any trends in the time-variation of the analytical signal, the Neumann trend test was also applied and evaluated. Finally, a comparison with similar validation data of the ETV-ICP-OES method was carried out.
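    The Neumann trend test mentioned above compares the mean square successive difference of a series to its variance; a ratio near 2 is consistent with no trend, while values well below 2 indicate drift. A sketch on synthetic calibration signals (the data are illustrative):

    ```python
    import numpy as np

    def von_neumann_ratio(x):
        """Neumann (von Neumann) ratio: mean square successive difference
        divided by the variance. Near 2 for a trend-free random series;
        markedly below 2 when the signal drifts over time."""
        x = np.asarray(x, dtype=float)
        msd = np.sum(np.diff(x) ** 2) / (len(x) - 1)
        var = np.sum((x - x.mean()) ** 2) / (len(x) - 1)
        return msd / var

    rng = np.random.default_rng(0)
    stable = rng.normal(100.0, 1.0, 60)               # stable calibration signal
    drifting = stable + np.linspace(0.0, 10.0, 60)    # same signal plus slow drift
    ```

    On the drifting series the variance inflates while successive differences barely change, so the ratio collapses well below 2, flagging the trend.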

  3. Comparing the Validity of Non-Invasive Methods in Measuring Thoracic Kyphosis and Lumbar Lordosis

    Directory of Open Access Journals (Sweden)

    Mohammad Yousefi

    2012-04-01

    Full Text Available Background: the purpose of this article is to study the validity of each of the non-invasive methods (flexible ruler, spinal mouse, and image processing) against X-ray radiography (the reference method) and to compare them with each other. Materials and Methods: to evaluate the validity of each of these non-invasive methods, the thoracic kyphosis and lumbar lordosis angles of 20 students of Birjand University (age: 26±2 years, weight: 72±2.5 kg, height: 169±5.5 cm) were measured through four methods: flexible ruler, spinal mouse, image processing and X-ray. Results: the results indicated that the validity of the flexible ruler, spinal mouse, and image processing methods in measuring the thoracic kyphosis and lumbar lordosis angles showed correlations of 0.81, 0.87, 0.73, 0.76, 0.83 and 0.89, respectively (p>0.05). As a result, given the validity obtained against the gold-standard X-ray method, it can be stated that the three non-invasive methods have adequate validity. In addition, one-way analysis of variance indicated a meaningful relationship between the three methods of measuring thoracic kyphosis and lumbar lordosis, and with respect to the Tukey test result, the image processing method is the most precise. Conclusion: this method could therefore be used along with other non-invasive methods as a valid measuring method.

  4. Design and Implementation Content Validity Study: Development of an instrument for measuring Patient-Centered Communication

    Directory of Open Access Journals (Sweden)

    Vahid Zamanzadeh

    2015-06-01

    Full Text Available ABSTRACT Introduction: The importance of content validity in instrument psychometrics and its relevance to reliability have made it an essential step in instrument development. This article attempts to give an overview of the content validity process and to explain the complexity of this process by introducing an example. Methods: We carried out a methodological study to examine the content validity of a patient-centered communication instrument through a two-step process (development and judgment). In the first step, domain determination, sampling (item generation) and instrument formation were performed; in the second step, the content validity ratio, content validity index and modified kappa statistic were computed. Suggestions of the expert panel and item impact scores were used to examine the instrument's face validity. Results: From a set of 188 items, the content validity process identified seven dimensions: trust building (eight items), informational support (seven items), emotional support (five items), problem solving (seven items), patient activation (10 items), intimacy/friendship (six items) and spirituality strengthening (14 items). The content validity study revealed that this instrument enjoys an appropriate level of content validity. The overall content validity index of the instrument using the universal agreement approach was low; however, the instrument can be advocated with respect to the high number of content experts, which makes consensus difficult, and the high value of the S-CVI with the average approach, which was equal to 0.93. Conclusion: This article illustrates acceptable quantitative indices for the content validity of a new instrument and outlines them during the design and psychometric evaluation of a patient-centered communication measuring instrument.
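    The item-level content validity index and the modified kappa correction for chance agreement (following Polit and Beck's widely used formulation, which may differ in detail from this study's computation) can be sketched as follows; the expert counts are hypothetical:

    ```python
    from math import comb

    def item_cvi(n_relevant, n_experts):
        """I-CVI: proportion of experts rating the item relevant
        (3 or 4 on a 4-point relevance scale)."""
        return n_relevant / n_experts

    def modified_kappa(n_relevant, n_experts):
        """Modified kappa: I-CVI adjusted for the probability of chance
        agreement, pc = C(N, A) * 0.5**N."""
        pc = comb(n_experts, n_relevant) * 0.5 ** n_experts
        i = item_cvi(n_relevant, n_experts)
        return (i - pc) / (1 - pc)

    # hypothetical panel of 10 experts; how many rated each of 4 items relevant
    relevant_counts = [10, 9, 8, 6]
    i_cvis = [item_cvi(r, 10) for r in relevant_counts]
    s_cvi_ave = sum(i_cvis) / len(i_cvis)                   # S-CVI/Ave
    s_cvi_ua = sum(c == 1.0 for c in i_cvis) / len(i_cvis)  # S-CVI/UA
    ```

    This makes the abstract's point concrete: with many experts, universal agreement (S-CVI/UA) is hard to reach even when the average approach (S-CVI/Ave) is high.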

  5. Multi-step polynomial regression method to model and forecast malaria incidence.

    Directory of Open Access Journals (Sweden)

    Chandrajit Chatterjee

    Full Text Available Malaria is one of the most severe problems faced by the world even today. Understanding the causative factors such as age, sex, social factors and environmental variability, as well as the underlying transmission dynamics of the disease, is important for epidemiological research on malaria and its eradication. Thus, development of a suitable modeling approach and methodology, based on the available data on the incidence of the disease and other related factors, is of utmost importance. In this study, we developed a simple non-linear regression methodology for modeling and forecasting malaria incidence in Chennai city, India, and predicted future disease incidence with a high confidence level. We considered three types of data to develop the regression methodology: a longer time series of Slide Positivity Rates (SPR) of malaria; a smaller time series (deaths due to Plasmodium vivax) of one year; and spatial data (zonal distribution of P. vivax deaths) for the city, along with climatic factors, population and previous incidence of the disease. We performed variable selection by a simple correlation study, identified initial relationships between variables through non-linear curve fitting, and used multi-step methods for the induction of variables in the non-linear regression analysis, along with Gauss-Markov models and ANOVA for testing the predictions, validity and construction of confidence intervals. The results demonstrate the applicability of our method to different types of data and the autoregressive nature of the forecasting, and show high prediction power for both SPR and P. vivax deaths, where the one-lag SPR values play an influential role and prove useful for better prediction. Different climatic factors are identified as playing a crucial role in shaping the disease curve. Further, disease incidence at the zonal level and the effect of causative factors on different zonal clusters indicate the pattern of malaria prevalence in the city.
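    A minimal sketch of the lagged polynomial regression idea, using synthetic seasonal SPR data rather than the Chennai series: the next month's SPR is regressed on a quadratic polynomial in the one-lag SPR value:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # synthetic monthly Slide Positivity Rates with an annual seasonal cycle
    months = np.arange(120)
    spr = 5 + 2 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.3, 120)

    # regress SPR_t on a quadratic polynomial in the one-lag value SPR_{t-1}
    X = np.column_stack([np.ones(119), spr[:-1], spr[:-1] ** 2])
    y = spr[1:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    pred = X @ coef
    r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    ```

    The strong lag-1 autocorrelation of a seasonal series is what makes the one-lag SPR term influential; the study's full model additionally inducts climatic and spatial covariates step by step.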

  6. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    International Nuclear Information System (INIS)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.; Grove, Robert E.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations because of the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.

  7. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the degree of lumbar lordosis was measured via the AutoCAD software and flexible ruler methods. The study was conducted in two parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method for measuring lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.
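    Intraclass correlation coefficients like those reported here can be computed from a two-way ANOVA decomposition; the sketch below uses the ICC(3,1) consistency form with simulated lordosis angles (the rater count, angle distribution, and measurement error are assumptions, not the study's data):

    ```python
    import numpy as np

    def icc_consistency(data):
        """ICC(3,1): two-way mixed effects, consistency, single measurement.
        data: n_subjects x k_raters matrix of repeated angle measurements."""
        n, k = data.shape
        grand = data.mean()
        subj_means = data.mean(axis=1)
        rater_means = data.mean(axis=0)
        ss_subj = k * np.sum((subj_means - grand) ** 2)
        ss_rater = n * np.sum((rater_means - grand) ** 2)
        ss_err = np.sum((data - grand) ** 2) - ss_subj - ss_rater
        ms_subj = ss_subj / (n - 1)
        ms_err = ss_err / ((n - 1) * (k - 1))
        return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

    rng = np.random.default_rng(3)
    true_angle = rng.normal(40, 8, 50)   # 50 subjects' underlying lordosis angles
    # two measurement sessions with ~1 degree of measurement noise each
    measurements = np.column_stack(
        [true_angle + rng.normal(0, 1, 50) for _ in range(2)]
    )
    icc = icc_consistency(measurements)
    ```

    With between-subject variability much larger than measurement noise, the ICC approaches 1, which is the regime the reported values of 0.984 and 0.962 reflect.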

  8. The validity and reliability of the four square step test in different adult populations: a systematic review.

    Science.gov (United States)

    Moore, Martha; Barker, Karen

    2017-09-11

    The four square step test (FSST) was first validated in healthy older adults to provide a measure of dynamic standing balance and mobility. The FSST has since been used in a variety of patient populations. The purpose of this systematic review is to determine the validity and reliability of the FSST in these different adult patient populations. The literature search was conducted to identify all studies that measured the validity and reliability of the FSST. Six electronic databases were searched, including AMED, CINAHL, MEDLINE, PEDro, Web of Science and Google Scholar. Grey literature was also searched for any documents relevant to the review. Two independent reviewers carried out study selection and quality assessment. Methodological quality was assessed using the QUADAS-2 tool, a validated tool for the quality assessment of diagnostic accuracy studies, and the COSMIN four-point checklist, which contains standards for evaluating reliability studies on the measurement properties of health instruments. Fifteen studies were reviewed, covering community-dwelling older adults, Parkinson's disease, Huntington's disease, multiple sclerosis, vestibular disorders, post stroke, post unilateral transtibial amputation, knee pain and hip osteoarthritis. Three of the studies were of moderate methodological quality, scoring low in risk of bias and applicability for all domains in the QUADAS-2 tool. Three studies scored "fair" on the COSMIN four-point checklist for the reliability components. The concurrent validity of the FSST was measured in nine of the studies, with moderate to strong correlations being found. Excellent intraclass correlation coefficients were found between physiotherapists carrying out the tests (ICC = .99), with good to excellent test-retest reliability shown in nine of the studies (ICC = .73-.98). The FSST may be an effective and valid tool for measuring dynamic balance and a participant's falls risk. It has been shown to have strong

  9. Application and validation of superior spectrophotometric methods for simultaneous determination of ternary mixture used for hypertension management

    Science.gov (United States)

    Mohamed, Heba M.; Lamie, Nesrine T.

    2016-02-01

    Telmisartan (TL), Hydrochlorothiazide (HZ) and Amlodipine besylate (AM) are co-formulated together for hypertension management. Three smart, specific and precise spectrophotometric methods were applied and validated for the simultaneous determination of the three cited drugs. Method A is ratio isoabsorptive point and ratio difference in subtracted spectra (RIDSS), which is based on dividing the ternary mixture spectrum of the studied drugs by the spectrum of AM to get the division spectrum, from which the concentration of AM can be obtained by measuring the amplitude values in the plateau region at 360 nm. The amplitude value of the plateau region is then subtracted from the division spectrum, and the HZ concentration is obtained by measuring the difference in amplitude values at 278.5 and 306 nm (corresponding to zero difference for TL), while the total concentration of HZ and TL in the mixture is measured at their isoabsorptive point in the division spectrum at 278.5 nm (Aiso); the TL concentration is then obtained by subtraction. Method B is double divisor ratio spectra derivative spectrophotometry (RS-DS), and method C is mean centering of ratio spectra (MCR). The proposed methods did not require any initial separation steps prior to the analysis of the three drugs. A comparative study was done between the three methods regarding their simplicity, sensitivity and limitations. Specificity was investigated by analyzing synthetic mixtures containing different ratios of the three studied drugs and their tablet dosage form. Statistical comparison of the obtained results with those found by the official methods showed non-significant differences in regard to accuracy and precision. The three methods were validated in accordance with ICH guidelines and can be used in quality control laboratories for TL, HZ and AM.

  10. Danish validation of sniffin' sticks olfactory test for threshold, discrimination, and identification

    DEFF Research Database (Denmark)

    Niklassen, Andreas Steenholt; Ovesen, Therese; Fernandes, Henrique

    2017-01-01

    to investigate external validity of international normative values to separate hyposmia from normosmia. METHODS: The study included 388 participants. The first step was a questionnaire study in which 238 adults rated their familiarity with 125 odor descriptors. In the second step, we evaluated the original...... in improvement of familiarity and rate of I, making the test valid for use in Denmark. Furthermore, the study found a large variation in T and D scores between different countries, which should be considered when using these scores to separate hyposmia and anosmia from normosmia. LEVEL OF EVIDENCE: 2b...

  11. Session-RPE Method for Training Load Monitoring: Validity, Ecological Usefulness, and Influencing Factors

    Directory of Open Access Journals (Sweden)

    Monoem Haddad

    2017-11-01

    Full Text Available Purpose: The aim of this review is to (1) retrieve all data validating the session rating of perceived exertion (session-RPE) method using various criteria, (2) highlight the rationale of this method and its ecological usefulness, and (3) describe factors that can alter RPE and that users of this method should take into consideration. Method: Search engines such as SPORTDiscus, PubMed, and Google Scholar were consulted for English-language studies on the validity and usefulness of the session-RPE method published between 2001 and 2016. Studies were considered for further analysis when they used the session-RPE method proposed by Foster et al. in 2001. Participants were athletes of any gender, age, or level of competition. Studies in languages other than English were excluded from the analysis of the validity and reliability of the session-RPE method. Other studies were examined to explain the rationale of the session-RPE method and the origin of RPE. Results: A total of 950 studies cited the Foster et al. study that proposed the session-RPE method; 36 of these examined the validity and reliability of the proposed method using the modified CR-10 scale. Conclusion: These studies confirmed the validity, good reliability and internal consistency of the session-RPE method in several sports and physical activities, with men and women of different age categories (children, adolescents, and adults) and various expertise levels. This method could be used as a stand-alone method for training load (TL) monitoring, though some recommend combining it with other physiological parameters, such as heart rate.
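    Foster's session-RPE load itself is simply the CR-10 rating multiplied by the session duration in minutes, from which weekly load, training monotony, and strain follow; the week of sessions below is invented for illustration:

    ```python
    from statistics import mean, pstdev

    def session_load(rpe_cr10, duration_min):
        """Foster's session-RPE training load in arbitrary units (AU)."""
        return rpe_cr10 * duration_min

    # a hypothetical training week: (CR-10 session RPE, duration in minutes);
    # (0, 0) marks rest days
    week = [(6, 90), (4, 60), (7, 75), (0, 0), (5, 60), (8, 100), (0, 0)]
    daily = [session_load(r, d) for r, d in week]

    weekly_load = sum(daily)                 # total weekly load (AU)
    monotony = mean(daily) / pstdev(daily)   # Foster's training monotony
    strain = weekly_load * monotony          # Foster's training strain
    ```

    The appeal noted in the review is exactly this simplicity: one perceived-exertion rating per session yields a load metric without any physiological instrumentation.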

  12. Comparison of the Danish step test and the watt-max test for estimation of maximal oxygen uptake

    DEFF Research Database (Denmark)

    Aadahl, Mette; Zacho, Morten; Linneberg, Allan René

    2013-01-01

    Altogether, 795 eligible participants (response rate 35.8%) performed the watt-max and the Danish step test. Correlation and agreement between the two VO2max test results were explored by Pearson's rho, Bland-Altman plots, weighted kappa, and gamma coefficients. Results: The correlation between VO2max (ml… Introduction: There is a need for simple and feasible methods for estimation of cardiorespiratory fitness (CRF) in large study populations, as existing methods for valid estimation of maximal oxygen consumption are generally time-consuming and relatively expensive to administer. The Danish step…

  13. Single-laboratory validation of a saponification method for the determination of four polycyclic aromatic hydrocarbons in edible oils by HPLC-fluorescence detection.

    Science.gov (United States)

    Akdoğan, Abdullah; Buttinger, Gerhard; Wenzl, Thomas

    2016-01-01

    An analytical method is reported for the determination of four polycyclic aromatic hydrocarbons (benzo[a]pyrene (BaP), benz[a]anthracene (BaA), benzo[b]fluoranthene (BbF) and chrysene (CHR)) in edible oils (sesame, maize, sunflower and olive oil) by high-performance liquid chromatography. Sample preparation is based on three steps: saponification, liquid-liquid partitioning and, finally, clean-up by solid phase extraction on 2 g of silica. Guidance on single-laboratory validation of the proposed analysis method was taken from the second edition of the Eurachem guide on method validation. The lower level of the working range of the method was determined by the limits of quantification of the individual analytes, and the upper level was equal to 5.0 µg/kg. The limits of detection and quantification of the four PAHs ranged from 0.06 to 0.12 µg/kg and from 0.13 to 0.24 µg/kg, respectively. Recoveries of more than 84.8% were achieved for all four PAHs at two concentration levels (2.5 and 5.0 µg/kg), and expanded relative measurement uncertainties were below 20%. The performance of the validated method was in all aspects compliant with the provisions set in European Union legislation for the performance of analytical methods employed in the official control of food. The applicability of the method to routine samples was evaluated on a limited number of commercial edible oil samples.
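    Limits of detection and quantification in such validations are often estimated from the residual standard deviation of a calibration line (the ICH-style 3.3σ/S and 10σ/S convention, which may differ from the Eurachem approach used in the study); the calibration data below are illustrative:

    ```python
    import numpy as np

    # hypothetical BaP calibration: spiked level (ug/kg) vs HPLC-FLD peak area
    level = np.array([0.25, 0.5, 1.0, 2.5, 5.0])
    area = np.array([12.4, 25.1, 49.8, 124.9, 250.3])

    slope, intercept = np.polyfit(level, area, 1)
    resid = area - (slope * level + intercept)
    # residual standard deviation of the regression (n - 2 degrees of freedom)
    sigma = np.sqrt(np.sum(resid ** 2) / (len(level) - 2))

    lod = 3.3 * sigma / slope   # limit of detection
    loq = 10 * sigma / slope    # limit of quantification
    ```

    Keeping the LOQ below the lowest calibration level, as here, is what allows that level to define the bottom of the working range.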

  14. Study of CdTe quantum dots grown using a two-step annealing method

    Science.gov (United States)

    Sharma, Kriti; Pandey, Praveen K.; Nagpal, Swati; Bhatnagar, P. K.; Mathur, P. C.

    2006-02-01

    High size dispersion, a large average quantum-dot radius, and a low volume ratio have been major hurdles in the development of quantum-dot-based devices. In the present paper, we have grown CdTe quantum dots in a borosilicate glass matrix using a two-step annealing method. Results of optical characterization and a theoretical model of the absorption spectra show that quantum dots grown using two-step annealing have a smaller average radius, lower size dispersion, a higher volume ratio and a larger decrease in bulk free energy than quantum dots grown conventionally.

  15. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  16. Validation of KENO-based criticality calculations at Rocky Flats

    International Nuclear Information System (INIS)

    Felsher, P.D.; McKamy, J.N.; Monahan, S.P.

    1992-01-01

    In the absence of experimental data, it is necessary to rely on computer-based computational methods in evaluating the criticality condition of a nuclear system. The validity of the computer codes is established in a two-part procedure as outlined in ANSI/ANS 8.1. The first step, usually the responsibility of the code developer, involves verification that the algorithmic structure of the code is performing the intended mathematical operations correctly. The second step involves an assessment of the code's ability to realistically portray the governing physical processes in question. This is accomplished by determining the code's bias, or systematic error, through a comparison of computational results to accepted values obtained experimentally. In this paper, the authors discuss the validation process for KENO and the Hansen-Roach cross sections in use at EG and G Rocky Flats. The validation process at Rocky Flats consists of both global and local techniques. The global validation resulted in a maximum k-eff limit of 0.95 for the limiting-accident scenarios of a criticality evaluation.

  17. Laboratory diagnostic methods, system of quality and validation

    Directory of Open Access Journals (Sweden)

    Ašanin Ružica

    2005-01-01

    Full Text Available It is known that laboratory investigations secure safe and reliable results that provide a final confirmation of the quality of work. Ideas, planning, knowledge, skills, experience, and environment, along with good laboratory practice, quality control and reliability of quality, make the area of biological investigations very complex. In recent years, quality control, including the control of work in the laboratory, is based on international standards and is used at that level. The implementation of widely recognized international standards, such as the International Standard ISO/IEC 17025 (1) and the implementation of the quality system series ISO/IEC 9000 (2), have become the imperative on the grounds of which laboratories have a formal, visible and corresponding system of quality. The diagnostic methods that are used must constantly yield results which identify the animal as positive or negative, and the precise status of the animal is determined with a predefined degree of statistical significance. Methods applied on a selected population reduce the risk of obtaining falsely positive or falsely negative results. A condition for this is well-conceived and documented methods, with the application of the corresponding reagents, and work with professional and skilled staff. This process also requires a consistent implementation of the most rigorous experimental plans, epidemiological and statistical data and estimations, with constant monitoring of the validity of the applied methods. Such an approach is necessary in order to cut down the number of misconceptions and accidental mistakes, for a referent population of animals on which the validity of a method is tested. Once a valid method is included in daily routine investigations, it is necessary to apply constant monitoring for the purpose of internal quality control, in order to adequately evaluate its reproducibility and reliability.
Consequently, it is necessary at least twice yearly to conduct

  18. Single-step reinitialization and extending algorithms for level-set based multi-phase flow simulations

    Science.gov (United States)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-12-01

    We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non-iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about a 10-times efficiency improvement over the standard method while maintaining the same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.

  19. Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids

    International Nuclear Information System (INIS)

    Chen, Bo; Chen, Chen; Wang, Jianhui; Butler-Purry, Karen L.

    2017-01-01

    Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.
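    The MILP formulation itself is not reproduced in the abstract. As a minimal illustration of the single-time-step core of such models, the toy brute-force search below chooses which load switches to close so that restored load is maximized without exceeding generation capacity; all numbers are hypothetical, and the real model adds network, ESS, and cold-load-pickup constraints across multiple time steps:

```python
# Toy single-step restoration: enumerate switch states, keep the feasible
# state that serves the most load. Hypothetical loads and DG capacity.
from itertools import product

loads = {"L1": 40, "L2": 25, "L3": 30, "L4": 15}   # kW demand per load
dg_capacity = 70                                    # kW of DG capacity available

best_served, best_plan = -1, None
for switches in product([0, 1], repeat=len(loads)):
    served = sum(kw for s, kw in zip(switches, loads.values()) if s)
    if served <= dg_capacity and served > best_served:
        best_served, best_plan = served, dict(zip(loads, switches))

print(best_served, best_plan)
```

    Exhaustive enumeration is only viable for a handful of switches; the MILP solver scales this decision to realistic feeder sizes and adds the inter-temporal coupling.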

  20. Validated RP-HPLC Method for Quantification of Phenolic ...

    African Journals Online (AJOL)

    Purpose: To evaluate the total phenolic content and antioxidant potential of the methanol extracts of aerial parts and roots of Thymus sipyleus Boiss and also to determine some phenolic compounds using a newly developed and validated reversed phase high performance liquid chromatography (RP-HPLC) method.

  1. Development of a quality-assessment tool for experimental bruxism studies: reliability and validity

    NARCIS (Netherlands)

    Dawson, A.; Raphael, K.G.; Glaros, A.; Axelsson, S.; Arima, T.; Ernberg, M.; Farella, M.; Lobbezoo, F.; Manfredini, D.; Michelotti, A.; Svensson, P.; List, T.

    2013-01-01

    AIMS: To combine empirical evidence and expert opinion in a formal consensus method in order to develop a quality-assessment tool for experimental bruxism studies in systematic reviews. METHODS: Tool development comprised five steps: (1) preliminary decisions, (2) item generation, (3) face-validity

  2. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked plasma method for the extraction of diclofenac potassium that reduces the number of treatments with plasma sample, in order to minimize human error. Method: Instead of solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  3. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k-eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) identify the range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope U-236). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
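    Numerically, a broad validation of this kind reduces to computing the bias of calculated k-eff over many critical benchmarks. The sketch below uses hypothetical benchmark results and a deliberately simplified limit construction; actual validations derive formal statistical tolerance limits over a defined area of applicability:

```python
import statistics

# Calculated k-eff for critical benchmark experiments (true value 1.0).
# These eight values are hypothetical.
k_calc = [0.9965, 1.0012, 0.9988, 0.9941, 1.0003, 0.9972, 0.9958, 0.9990]

bias = statistics.mean(k_calc) - 1.0       # systematic error of the code
sd = statistics.stdev(k_calc)              # spread over the validation set

# Simplified limit: 0.05 administrative margin (cf. the 0.95 limit cited),
# credited only with a negative (conservative) bias, minus a 2-sigma spread.
usl = 1.0 - 0.05 + min(bias, 0.0) - 2 * sd
print(f"bias {bias:+.4f}, SD {sd:.4f}, upper limit {usl:.4f}")
```
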

  4. Rationale and methods of the European Food Consumption Validation (EFCOVAL) Project

    NARCIS (Netherlands)

    Boer, de E.J.; Slimani, N.; Boeing, H.; Feinberg, M.; Leclerq, C.; Trolle, E.; Amiano, P.; Andersen, L.F.; Freisling, H.; Geelen, A.; Harttig, U.; Huybrechts, I.; Kaic-Rak, A.; Lafay, L.; Lillegaard, I.T.L.; Ruprich, J.; Vries, de J.H.M.; Ocke, M.C.

    2011-01-01

    Background/Objectives: The overall objective of the European Food Consumption Validation (EFCOVAL) Project was to further develop and validate a trans-European food consumption method to be used for the evaluation of the intake of foods, nutrients and potentially hazardous chemicals within the

  5. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
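    The β-content (0.9) method referred to above can be sketched as follows: the analytical method is accepted if a two-sided tolerance interval expected to contain 90% of results, at 90% confidence, lies entirely within the acceptance limits. The data, limits, and the tabulated tolerance factor for n = 10 are illustrative assumptions, not values from the paper:

```python
import statistics

results = [99.2, 100.8, 98.7, 101.1, 99.9, 100.4, 99.5, 100.2, 98.9, 100.6]  # % of nominal
accept_low, accept_high = 95.0, 105.0   # "fit for purpose" total-error limits

k = 2.535                               # tabulated two-sided factor: n=10, beta=0.90, 90% conf.
mean = statistics.mean(results)
sd = statistics.stdev(results)
lower, upper = mean - k * sd, mean + k * sd

valid = accept_low <= lower and upper <= accept_high
print(f"tolerance interval [{lower:.2f}, {upper:.2f}] -> {'accept' if valid else 'reject'}")
```

    Verify the tolerance factor against standard tables for your own n, β, and confidence level before use.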

  6. A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.

    Science.gov (United States)

    Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf

    2017-07-01

    This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, and "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary with several objectives. First, concepts typical for validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example of a validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.
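    One performance metric commonly used when validating LR methods at source level is the log-likelihood-ratio cost (Cllr); lower values mean more informative LRs, and Cllr = 1 corresponds to uninformative LRs of 1. The LR values below are hypothetical:

```python
import math

def cllr(lr_same_source, lr_diff_source):
    """Cllr = 0.5 * (mean log2(1 + 1/LR) over same-source pairs
                   + mean log2(1 + LR) over different-source pairs)."""
    pen_ss = sum(math.log2(1 + 1 / lr) for lr in lr_same_source) / len(lr_same_source)
    pen_ds = sum(math.log2(1 + lr) for lr in lr_diff_source) / len(lr_diff_source)
    return 0.5 * (pen_ss + pen_ds)

lr_ss = [120.0, 35.0, 500.0, 8.0, 60.0]   # LRs computed for same-source pairs
lr_ds = [0.02, 0.5, 0.01, 0.1, 0.002]     # LRs computed for different-source pairs
print(f"Cllr = {cllr(lr_ss, lr_ds):.3f}")
```
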

  7. Methods for Geometric Data Validation of 3d City Models

    Science.gov (United States)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application-dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon level checks to validate the correctness of each polygon, i.e. closeness of the bounding linear ring and planarity.
On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges
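    The polygon-level checks mentioned above (closure of the bounding ring, planarity within a tolerance) can be sketched with plain vector arithmetic; the tolerance values are illustrative choices, not CityDoctor defaults:

```python
def is_closed(ring, tol=1e-9):
    """A valid bounding ring repeats its first vertex as its last."""
    return all(abs(a - b) <= tol for a, b in zip(ring[0], ring[-1]))

def is_planar(ring, tol=1e-6):
    """Every vertex must lie on the plane spanned by the first three points."""
    (x0, y0, z0), p1, p2 = ring[0], ring[1], ring[2]
    u = (p1[0] - x0, p1[1] - y0, p1[2] - z0)
    v = (p2[0] - x0, p2[1] - y0, p2[2] - z0)
    n = (u[1] * v[2] - u[2] * v[1],          # cross product u x v = plane normal
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    return all(abs(n[0] * (p[0] - x0) + n[1] * (p[1] - y0) + n[2] * (p[2] - z0)) <= tol
               for p in ring)

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 0)]
print(is_closed(square), is_planar(square))
```

    Note the normal is not normalized here, so the planarity tolerance is scale-dependent; a production check would normalize it, as the paper's discussion of tolerance values suggests.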

  8. Content validity of methods to assess malnutrition in cancer patients: a systematic review

    NARCIS (Netherlands)

    Sealy, Martine; Nijholt, Willemke; Stuiver, M.M.; van der Berg, M.M.; Ottery, Faith D.; van der Schans, Cees; Roodenburg, Jan L N; Jager-Wittenaar, Harriët

    Content validity of methods to assess malnutrition in cancer patients: A systematic review Rationale: Inadequate operationalisation of the multidimensional concept of malnutrition may result in inadequate evaluation of nutritional status. In this review we aimed to assess content validity of methods

  9. Development and Validation of a Liquid Chromatographic Method ...

    African Journals Online (AJOL)

    A liquid chromatographic method for the simultaneous determination of six human immunodeficiency virus (HIV) protease inhibitors, indinavir, saquinavir, ritonavir, amprenavir, nelfinavir and lopinavir, was developed and validated. Optimal separation was achieved on a PLRP-S 100 Å, 250 x 4.6 mm I.D. column maintained ...

  10. A Method of MPPT Control Based on Power Variable Step-size in Photovoltaic Converter System

    Directory of Open Access Journals (Sweden)

    Xu Hui-xiang

    2016-01-01

    Full Text Available Traditional variable step-size MPPT algorithms have known drawbacks, so we propose power-based variable step-size tracking that combines the advantages of the constant-voltage and perturb-and-observe (P&O) methods [1-3]. The control strategy corrects the voltage fluctuation caused by the perturb-and-observe method, while introducing the advantage of the constant-voltage method and simplifying the circuit topology. Following the theoretical derivation, the output power of the photovoltaic modules is used to control the duty cycle of the main switch. This achieves stable maximum-power output, effectively reduces energy loss from fluctuation, and improves inversion efficiency [3,4]. Experimental tests on a prototype confirm the theoretical derivation and the resulting MPPT curve.
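    A minimal sketch of the power-based variable-step P&O idea, on an assumed toy power-versus-duty-cycle curve (the PV model, gains, and limits are illustrative, not taken from the paper): the perturbation step shrinks with |ΔP|, damping the oscillation around the maximum power point that a fixed step causes:

```python
def pv_power(d):
    """Toy unimodal power curve with its maximum at duty cycle d = 0.6."""
    return 100.0 - 400.0 * (d - 0.6) ** 2

def mppt_po_variable_step(d=0.30, gain=0.01, steps=300):
    direction = +1
    p_prev = pv_power(d)
    step = 0.01                                       # initial perturbation
    for _ in range(steps):
        d = min(max(d + direction * step, 0.0), 1.0)  # perturb duty cycle
        p = pv_power(d)
        dp = p - p_prev
        if dp < 0:
            direction = -direction                    # power fell: reverse
        step = min(max(gain * abs(dp), 1e-4), 0.05)   # step tracks |dP|
        p_prev = p
    return d

d_final = mppt_po_variable_step()
print(f"duty cycle settles near {d_final:.3f}")
```
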

  11. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    Science.gov (United States)

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
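    The instability described here can be illustrated even without SCAD or the Adaptive Lasso: a toy correlation-threshold selector, re-run over random resamplings of the same weak-signal data, selects a visibly varying number of variables. This is only an illustration of the phenomenon, not the authors' simulation design:

```python
import random

random.seed(7)
n, p, signal = 100, 50, 0.3   # samples, variables, one weak true effect
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [signal * row[0] + random.gauss(0, 1) for row in X]   # only variable 0 matters

def corr(xs, ys):
    """Pearson correlation, written out to stay dependency-free."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / (sxx * syy) ** 0.5

def selected_count(idx):
    """Number of variables whose |correlation with y| exceeds 0.25 on subsample idx."""
    return sum(
        1 for j in range(p)
        if abs(corr([X[i][j] for i in idx], [y[i] for i in idx])) > 0.25
    )

counts = [selected_count(random.sample(range(n), n // 2)) for _ in range(20)]
print(min(counts), max(counts))   # selected-variable count varies run to run
```
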

  12. New clinical validation method for automated sphygmomanometer: a proposal by Japan ISO-WG for sphygmomanometer standard.

    Science.gov (United States)

    Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio

    2007-12-01

    Owing to fast or stepwise cuff deflation, or measuring at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, that is, two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for the suspicious accuracy in lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing immunity to LD and BPV with those of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N=120, 45+/-15.3 years (mean+/-SD)]. An oscillometric automated monitor, Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting their higher reproducibility of validation. The SD1 by method 2 (P=0.004) significantly correlated with the participant's BP, supporting our hypothesis that the increased SD of device error by method 2 is at least partially caused by essential BPV. Method 3 showed a significantly (P=0.0044) smaller interparticipant SD of device error (SD2), suggesting its higher interparticipant consistency of validation. Among the methods of validation of the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest interparticipant consistency, can be proposed as being the most appropriate.
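    Whatever the comparison design, device accuracy is ultimately judged from the distribution of device-minus-reference errors. The sketch below applies a common acceptance rule of the ISO 81060-2 / AAMI type (mean error within ±5 mmHg, SD within 8 mmHg) to hypothetical paired readings; the full standard adds sample-size and per-subject requirements:

```python
import statistics

device =    [122, 118, 135, 141, 128, 119, 150, 133, 126, 138]   # device, mmHg
reference = [120, 121, 131, 144, 126, 122, 147, 135, 124, 141]   # auscultation, mmHg

errors = [d - r for d, r in zip(device, reference)]
mean_err = statistics.mean(errors)
sd_err = statistics.stdev(errors)
passed = abs(mean_err) <= 5 and sd_err <= 8
print(f"mean {mean_err:+.1f} mmHg, SD {sd_err:.1f} mmHg -> {'pass' if passed else 'fail'}")
```
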

  13. Determination of vitamin C in foods: current state of method validation.

    Science.gov (United States)

    Spínola, Vítor; Llorent-Martínez, Eulogio J; Castilho, Paula C

    2014-11-21

    Vitamin C is one of the most important vitamins, so reliable information about its content in foodstuffs is a concern to both consumers and quality control agencies. However, the heterogeneity of food matrixes and the potential degradation of this vitamin during its analysis create enormous challenges. This review addresses the development and validation of high-performance liquid chromatography methods for vitamin C analysis in food commodities, during the period 2000-2014. The main characteristics of vitamin C are mentioned, along with the strategies adopted by most authors during sample preparation (freezing and acidification) to avoid vitamin oxidation. After that, the advantages and handicaps of different analytical methods are discussed. Finally, the main aspects concerning method validation for vitamin C analysis are critically discussed. Parameters such as selectivity, linearity, limit of quantification, and accuracy were studied by most authors. Recovery experiments during accuracy evaluation were in general satisfactory, with usual values between 81 and 109%. However, few methods considered vitamin C stability during the analytical process, and the study of the precision was not always clear or complete. Potential future improvements regarding proper method validation are indicated to conclude this review. Copyright © 2014. Published by Elsevier B.V.

  14. Simultaneous determination and validation of emtricitabine, rilpivirine and tenofovir from biological samples using LC and CE methods.

    Science.gov (United States)

    Gumustas, Mehmet; Caglayan, Mehmet Gokhan; Onur, Feyyaz; Ozkan, Sibel A

    2018-04-01

    A combination of antiretroviral agents is frequently used in effective treatment of the human immunodeficiency virus infection. In this study, two different separation methods are presented for the simultaneous determination of emtricitabine, rilpivirine and tenofovir from raw materials and urine samples. Developed liquid chromatography and capillary electrophoresis methods were thoroughly optimized for high analytical performances. Optimization of multiple variables at the same time by performing a minimum number of experiments was achieved by the Box-Behnken design, which is an experimental design in response surface methodology, in capillary electrophoresis. The results of the experimental design ensure minimum analysis time with well-separated analytes. Separation conditions, such as different stationary phases, pH level, organic modifiers and temperatures in liquid chromatography method, were also optimized. In particular, among stationary phases, the core-shell column especially enhanced the effectiveness of separation in liquid chromatography. Both methods were fully validated and applied to real samples. The main advantage of the developed methods is the separation of the drug combination in a short time with high efficiency and without any time-consuming steps. Copyright © 2017 John Wiley & Sons, Ltd.
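    The Box-Behnken design used here has a simple construction: each pair of factors takes the four (±1, ±1) combinations while the remaining factors sit at the center level, plus replicated center points. A sketch for the general case (the number of center points is an arbitrary choice, not the paper's setting):

```python
from itertools import combinations, product

def box_behnken(k, center_points=3):
    """Coded (-1/0/+1) run list for a k-factor Box-Behnken design."""
    runs = []
    for i, j in combinations(range(k), 2):       # every factor pair
        for a, b in product((-1, 1), repeat=2):  # four corner settings
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * k for _ in range(center_points)])  # center replicates
    return runs

design = box_behnken(3)
print(len(design))   # 3 pairs x 4 + 3 centers = 15 runs
```
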

  15. Research on the range side lobe suppression method for modulated stepped frequency radar signals

    Science.gov (United States)

    Liu, Yinkai; Shan, Tao; Feng, Yuan

    2018-05-01

    The magnitude of the time-domain range sidelobes of modulated stepped frequency radar affects the imaging quality of inverse synthetic aperture radar (ISAR). In this paper, the cause of high sidelobes in modulated stepped frequency radar imaging is first analyzed in a real environment. Then, chaos particle swarm optimization (CPSO) is used to select the amplitude and phase compensation factors according to the minimum-sidelobe criterion. Finally, the compensated one-dimensional range images are obtained. Experimental results show that the amplitude-phase compensation method based on the CPSO algorithm can effectively reduce the sidelobe peak value of one-dimensional range images, outperforming common sidelobe suppression methods and avoiding the coverage of weak scattering points by strong scattering points due to high sidelobes.

  16. [Data validation methods and discussion on Chinese materia medica resource survey].

    Science.gov (United States)

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

    Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilot surveys. The survey teams have reported an immense amount of data, which places very high demands on the construction of the database system. In order to ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method for ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the fourth national survey of the Chinese materia medica resources database system, and further improves the design ideas and programs of data validation. The purpose of this study is to promote the smooth progress of the survey work.

  17. A two-step method for fast and reliable EUV mask metrology

    Science.gov (United States)

    Helfenstein, Patrick; Mochi, Iacopo; Rajendran, Rajeev; Yoshitake, Shusuke; Ekinci, Yasin

    2017-03-01

    One of the major obstacles towards the implementation of extreme ultraviolet lithography for upcoming technology nodes in the semiconductor industry remains the realization of fast and reliable detection methods for patterned mask defects. We are developing a reflective EUV mask-scanning lensless imaging tool (RESCAN), installed at the Swiss Light Source synchrotron at the Paul Scherrer Institut. Our system is based on a two-step defect inspection method. In the first step, a low-resolution defect map is generated by die-to-die comparison of the diffraction patterns from areas with programmed defects to those from areas that are known to be defect-free on our test sample. In a later stage, a die-to-database comparison will be implemented, in which the measured diffraction patterns will be compared to those calculated directly from the mask layout. This Scattering Scanning Contrast Microscopy technique operates purely in the Fourier domain without the need to obtain the aerial image and, given a sufficient signal-to-noise ratio, defects are found in a fast and reliable way, albeit with a location accuracy limited by the spot size of the incident illumination. Having thus identified rough locations for the defects, a fine scan is carried out in the vicinity of these locations. Since our source delivers coherent illumination, we can use an iterative phase-retrieval method to reconstruct the aerial image of the scanned area with, in principle, diffraction-limited resolution without the need of an objective lens. Here, we will focus on the aerial image reconstruction technique and give a few examples to illustrate the capability of the method.

  18. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  19. Validity of a Newly-Designed Rectilinear Stepping Ergometer Submaximal Exercise Test to Assess Cardiorespiratory Fitness.

    Science.gov (United States)

    Zhang, Rubin; Zhan, Likui; Sun, Shaoming; Peng, Wei; Sun, Yining

    2017-09-01

    The maximum oxygen uptake (V̇O2max), determined from graded maximal or submaximal exercise tests, is used to classify the cardiorespiratory fitness level of individuals. The purpose of this study was to examine the validity and reliability of the YMCA submaximal exercise test protocol performed on a newly-designed rectilinear stepping ergometer (RSE), which uses up-and-down reciprocating vertical motion in place of conventional circular motion and gives a precise measurement of workload, to determine V̇O2max in young healthy male adults. Thirty-two young healthy male adults (age range: 20-35 years; height: 1.75 ± 0.05 m; weight: 67.5 ± 8.6 kg) first participated in a maximal-effort graded exercise test using a cycle ergometer (CE) to directly obtain measured V̇O2max. Subjects then completed the progressive multistage test on the RSE, beginning at 50 W and including additional stages of 70, 90, 110, 130, and 150 W, and the RSE YMCA submaximal test, in which the workload increased every 3 minutes until the termination criterion was reached. A metabolic equation was derived from the RSE multistage exercise test to predict oxygen consumption (V̇O2) from power output (W) during the submaximal exercise test: V̇O2 (mL·min⁻¹) = 12.4 × W (watts) + 3.5 mL·kg⁻¹·min⁻¹ × M + 160 mL·min⁻¹ (R² = 0.91, standard error of the estimate (SEE) = 134.8 mL·min⁻¹). A high correlation was observed between the RSE YMCA estimated V̇O2max and the CE measured V̇O2max (r = 0.87). The mean difference between estimated and measured V̇O2max was 2.5 mL·kg⁻¹·min⁻¹, with an SEE of 3.55 mL·kg⁻¹·min⁻¹. The data suggest that the RSE YMCA submaximal exercise test is valid for predicting V̇O2max in young healthy male adults. The findings show that rectilinear stepping is an effective submaximal exercise for predicting V̇O2max. The newly-designed RSE may be potentially further developed as an alternative ergometer for assessing
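
    The reported metabolic equation can be applied directly; the sketch below is an illustrative Python rendering (assuming M is body mass in kg, as the units imply), not code from the study.

```python
def estimate_vo2(power_w: float, mass_kg: float) -> float:
    """Reported RSE metabolic equation:
    V̇O2 (mL/min) = 12.4*W + 3.5*M + 160, with M assumed to be body mass (kg)."""
    return 12.4 * power_w + 3.5 * mass_kg + 160.0

vo2 = estimate_vo2(110, 67.5)         # absolute uptake at 110 W, mL/min
vo2_rel = vo2 / 67.5                  # relative uptake, mL/kg/min
print(round(vo2), round(vo2_rel, 1))  # → 1760 26.1
```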

  20. The Value of Step-by-Step Risk Assessment for Unmanned Aircraft

    DEFF Research Database (Denmark)

    La Cour-Harbo, Anders

    2018-01-01

    The new European legislation expected in 2018 or 2019 will introduce a step-by-step process for conducting risk assessments for unmanned aircraft flight operations. This is a relatively simple approach to a very complex challenge. This work compares the step-by-step process to high-fidelity risk modeling, and shows that, at least for a series of example flight missions, there is reasonable agreement between the two very different methods.

  1. Pyrosequencing™ : A one-step method for high resolution HLA typing

    Directory of Open Access Journals (Sweden)

    Marincola Francesco M

    2003-11-01

    While the use of high-resolution molecular typing in routine matching of human leukocyte antigens (HLA) is expected to improve unrelated donor selection and transplant outcome, the genetic complexity of HLA still makes the current methodology limited and laborious. Pyrosequencing™ is a gel-free, sequencing-by-synthesis method. In a Pyrosequencing reaction, nucleotide incorporation proceeds sequentially along each DNA template according to a nucleotide dispensation order (NDO) that is programmed into a pyrosequencer. Here we describe the design of an NDO that generates a pyrogram unique for any given allele or combination of alleles. We present examples of unique pyrograms generated from each of two heterozygous HLA templates that would otherwise remain cis/trans ambiguous using the standard sequencing-based typing (SBT) method. In addition, we display representative data demonstrating long reads and linear signal generation. These features are prerequisites for high-resolution typing and automated data analysis. In conclusion, Pyrosequencing is a one-step method for high-resolution DNA typing.

  2. Studying Hospitalizations and Mortality in the Netherlands: Feasible and Valid Using Two-Step Medical Record Linkage with Nationwide Registers.

    Directory of Open Access Journals (Sweden)

    Elske Sieswerda

    In the Netherlands, the postal code is needed to study hospitalizations of individuals in the nationwide hospitalization register. Studying hospitalizations longitudinally becomes troublesome if individuals change address. We aimed to report on the feasibility and validity of a two-step medical record linkage approach to examine longitudinal trends in hospitalizations and mortality in a study cohort. First, we linked a study cohort of 1564 survivors of childhood cancer with the Municipal Personal Records Database (GBA), which has postal code history and mortality data available. Within GBA, we sampled a reference population matched on year of birth, gender and calendar year. Second, we extracted hospitalizations from the Hospital Discharge Register (LMR) with a date of discharge during unique follow-up (based on date of birth, gender and postal code in GBA). We calculated the agreement of death and of being hospitalized in survivors according to the registers and to available cohort data. We retrieved 1477 (94%) survivors from GBA. Median percentages of unique/potential follow-up were 87% (survivors) and 83% (reference persons). Characteristics of survivors and reference persons contributing to unique follow-up were comparable. Agreement of hospitalization during unique follow-up was 94% and agreement of death was 98%. In the absence of unique identifiers in the Dutch hospitalization register, it is feasible and valid to study hospitalizations and mortality of individuals longitudinally using a two-step medical record linkage approach. Cohort studies in the Netherlands thus have the opportunity to study mortality and hospitalization rates over time. These outcomes provide insight into the burden of clinical events and healthcare use in studies of patients at risk of long-term morbidities.
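
    As an illustration of the two-step linkage idea (link the cohort to a population register first, then match hospitalizations on date of birth, sex and postal code history), a minimal Python sketch; all field names and records are invented, not from the study.

```python
# Hypothetical two-step record linkage; identifiers are illustrative only.
cohort = [{"id": 1, "dob": "1990-05-01", "sex": "F", "postcode": "1234AB"}]
gba = {("1990-05-01", "F"): {"postcode_history": ["1234AB", "5678CD"],
                             "date_of_death": None}}
hospital = [{"dob": "1990-05-01", "sex": "F", "postcode": "5678CD",
             "discharge": "2010-03-02"}]

linked = []
for person in cohort:
    rec = gba.get((person["dob"], person["sex"]))      # step 1: link to GBA
    if rec is None:
        continue
    for adm in hospital:                               # step 2: match LMR records
        if (adm["dob"], adm["sex"]) == (person["dob"], person["sex"]) \
                and adm["postcode"] in rec["postcode_history"]:
            linked.append((person["id"], adm["discharge"]))
print(linked)  # [(1, '2010-03-02')]
```

Because the GBA record carries the full postcode history, a hospitalization after a change of address (here `5678CD`) is still attributed to the right person.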

  3. Impurities in biogas - validation of analytical methods for siloxanes; Foeroreningar i biogas - validering av analysmetodik foer siloxaner

    Energy Technology Data Exchange (ETDEWEB)

    Arrhenius, Karine; Magnusson, Bertil; Sahlin, Eskil [SP Technical Research Institute of Sweden, Boraas (Sweden)

    2011-11-15

    Biogas produced from digesters or landfills contains impurities that can be harmful to components in contact with the biogas during its use. Among these impurities, siloxanes are often mentioned. During combustion, siloxanes are converted to silicon dioxide, which accumulates on heated surfaces in combustion equipment. Silicon dioxide is a solid and will remain in the engine and cause damage. Consequently, it is necessary to develop methods for the accurate determination of these compounds in biogases. In the first part of this report, a method for the analysis of siloxanes in biogases was validated. Sampling was performed directly at the plant by drawing a small volume of biogas onto an adsorbent tube over a short period of time. These tubes were subsequently sent to the laboratory for analysis. The purpose of method validation is to demonstrate that the established method is fit for purpose. This means that the method, as used by the laboratory generating the data, will provide data that meet a set of criteria concerning precision and accuracy. Finally, the uncertainty of the method was calculated. In the second part of this report, the validated method was applied to real samples collected at wastewater treatment plants, co-digestion plants and plants digesting other wastes (agricultural waste). Results are presented at the end of this report. As expected, the biogases from wastewater treatment plants contained considerably higher concentrations of siloxanes than biogases from co-digestion plants and plants digesting agricultural wastes. The concentration of siloxanes in upgraded biogas was low, regardless of which feedstock was digested and which upgrading technique was used.

  4. An explicit multi-time-stepping algorithm for aerodynamic flows

    NARCIS (Netherlands)

    Niemann-Tuitman, B.E.; Veldman, A.E.P.

    1997-01-01

    An explicit multi-time-stepping algorithm with applications to aerodynamic flows is presented. In the algorithm, different time steps are taken in different parts of the computational domain, and the flow is synchronized at the so-called synchronization levels. The algorithm is validated for aerodynamic turbulent flows.

  5. A chromatographic method validation to quantify tablets Mephenesine of national production

    International Nuclear Information System (INIS)

    Suarez Perez, Yania; Izquierdo Castro, Adalberto; Milian Sanchez, Jana Daria

    2009-01-01

    The authors validated an analytical method based on high-performance liquid chromatography (HPLC) for the quantification of Mephenesine in recently reformulated 500 mg tablets. With regard to its application in quality control, validation included the following parameters: linearity, accuracy, precision, and selectivity. Results were satisfactory within the 50-150% range. For use in subsequent chemical stability studies, stability-indicating selectivity and sensitivity were also assessed. The estimated detection and quantification limits were appropriate, and the method was selective with respect to the possible degradation products. (Author)

  6. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability be considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing the reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data were extracted by the primary reviewer. The results were synthesized qualitatively using a level-of-evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. Reliability, validity, and both reliability and validity were investigated by sixteen, two and nine studies, respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity in support of the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Optimization of One-Step In Situ Transesterification Method for Accurate Quantification of EPA in Nannochloropsis gaditana

    Directory of Open Access Journals (Sweden)

    Yuting Tang

    2016-11-01

    Microalgae are a valuable source of lipid feedstocks for biodiesel and of valuable omega-3 fatty acids. Nannochloropsis gaditana has emerged as a promising producer of eicosapentaenoic acid (EPA) due to its fast growth rate and high EPA content. In the present study, the fatty acid profile of Nannochloropsis gaditana was found to be naturally high in EPA and devoid of docosahexaenoic acid (DHA), thereby providing an opportunity to maximize the efficacy of EPA production. Using an optimized one-step in situ transesterification method (methanol:biomass = 90 mL/g; HCl 5% by vol.; 70 °C; 1.5 h), the maximum fatty acid methyl ester (FAME) yield of Nannochloropsis gaditana cultivated under rich conditions was quantified as 10.04% ± 0.08% by weight, with EPA yields as high as 4.02% ± 0.17% based on dry biomass. The total FAME and EPA yields were 1.58- and 1.23-fold higher, respectively, than those obtained using the conventional two-step method (solvent system: methanol and chloroform). This one-step in situ method provides a fast and simple way to measure FAME yields and could serve as a promising method to generate eicosapentaenoic acid methyl ester from microalgae.

  8. An explicit multi-time-stepping algorithm for aerodynamic flows

    OpenAIRE

    Niemann-Tuitman, B.E.; Veldman, A.E.P.

    1997-01-01

    An explicit multi-time-stepping algorithm with applications to aerodynamic flows is presented. In the algorithm, in different parts of the computational domain different time steps are taken, and the flow is synchronized at the so-called synchronization levels. The algorithm is validated for aerodynamic turbulent flows. For two-dimensional flows speedups in the order of five with respect to single time stepping are obtained.
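
    A toy illustration of the multi-time-stepping idea (not the authors' algorithm): on the model problem du/dt = -u, one half of the domain advances with the full time step while the other half takes two substeps of half the size, and both halves meet at each synchronization level.

```python
import numpy as np

# Toy multi-time-stepping on du/dt = -u with explicit Euler:
# the "slow" half of the domain takes one step of dt while the "fast"
# half takes two substeps of dt/2; both meet at each synchronization level.
u = np.ones(10)
dt, n_sync = 0.1, 5
slow, fast = slice(0, 5), slice(5, 10)
for _ in range(n_sync):           # each pass ends at a synchronization level
    u[slow] *= (1 - dt)           # one Euler step of size dt
    for _ in range(2):
        u[fast] *= (1 - dt / 2)   # two Euler substeps of size dt/2
print(round(u[0], 4), round(u[-1], 4))  # both approximate exp(-0.5) ≈ 0.6065
```

The speedup in the paper comes from not forcing the globally smallest stable time step on the whole domain; here the decomposition is merely spatial halves for clarity.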

  9. A Validation Study of the Impression Replica Technique.

    Science.gov (United States)

    Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla

    2018-04-17

    To validate the well-known and often-used impression replica technique for measuring the fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated with the measuring instrument, and the exact values of three gaps between the male and female parts were calculated. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured under a light microscope. The values obtained by measuring the specimen were then compared with those obtained from the impression replicas, and the technique was thereby validated. The impression replica technique overestimated all measured gaps. Depending on the location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.

  10. Imaginary Time Step Method to Solve the Dirac Equation with Nonlocal Potential

    International Nuclear Information System (INIS)

    Zhang Ying; Liang Haozhao; Meng Jie

    2009-01-01

    The imaginary time step (ITS) method is applied to solve the Dirac equation with nonlocal potentials in coordinate space. Taking the nucleus ¹²C as an example, even with nonlocal potentials, direct ITS evolution of the Dirac equation still runs into the problem of the Dirac sea. However, following the recipe of our former investigation, this problem can be avoided by ITS evolution of the corresponding Schroedinger-like equation without localization, which gives convergent results exactly the same as those obtained iteratively by the shooting method with localized effective potentials.
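
    For readers unfamiliar with imaginary time evolution, a minimal non-relativistic sketch (a 1D harmonic oscillator, not the Dirac problem of the paper): repeatedly applying exp(-H·Δτ) to a trial state and renormalizing suppresses excited components and relaxes the state onto the ground state.

```python
import numpy as np

# Imaginary-time relaxation onto the ground state of a 1D harmonic
# oscillator (hbar = m = omega = 1, so the exact ground-state energy is 0.5).
n, L = 256, 20.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
V = 0.5 * x**2
psi = np.exp(-(x - 1.0) ** 2)                     # displaced trial state
dtau = 0.01
for _ in range(5000):                             # evolve to tau = 50
    psi = psi * np.exp(-0.5 * dtau * V)           # split-operator: half V step
    psi = np.fft.ifft(np.exp(-0.5 * dtau * k**2) * np.fft.fft(psi))  # full T step
    psi = psi * np.exp(-0.5 * dtau * V)           # half V step
    psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / n))  # renormalize each step

phat = np.fft.fft(psi)
T = float(np.sum(0.5 * k**2 * np.abs(phat) ** 2) / np.sum(np.abs(phat) ** 2))
E = T + float(np.sum(V * np.abs(psi) ** 2) * (L / n))
print(round(E, 3))                                # relaxes to ≈ 0.5
```

For the Dirac equation this naive recipe fails (the spectrum is unbounded from below, so the evolution dives into the Dirac sea), which is exactly the difficulty the abstract addresses.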

  11. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    This report describes the specification, conceptual framework, and preliminary validation of IAT concepts; planned work for FY85, including more extensive validation, is also described. The approach comprises four steps: 1) identify needs and requirements for IAT; 2) develop the IAT conceptual framework; 3) validate IAT methods; 4) develop applications materials.

  12. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    Science.gov (United States)

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  13. Procedural key steps in laparoscopic colorectal surgery, consensus through Delphi methodology.

    Science.gov (United States)

    Dijkstra, Frederieke A; Bosker, Robbert J I; Veeger, Nicolaas J G M; van Det, Marc J; Pierie, Jean Pierre E N

    2015-09-01

    While several procedural training curricula in laparoscopic colorectal surgery have been validated and published, none have focused on dividing surgical procedures into well-identified segments, which can be trained and assessed separately. This enables the surgeon and resident to focus on a specific segment, or combination of segments, of a procedure. Furthermore, it will provide a consistent and uniform method of training for residents rotating through different teaching hospitals. The goal of this study was to determine consensus on the key steps of laparoscopic right hemicolectomy and laparoscopic sigmoid colectomy among experts in our University Medical Center and affiliated hospitals. This will form the basis for the INVEST video-assisted side-by-side training curriculum. The Delphi method was used for determining consensus on key steps of both procedures. A list of 31 steps for laparoscopic right hemicolectomy and 37 steps for laparoscopic sigmoid colectomy was compiled from textbooks and national and international guidelines. In an online questionnaire, 22 experts in 12 hospitals within our teaching region were invited to rate all steps on a Likert scale on importance for the procedure. Consensus was reached in two rounds. Sixteen experts agreed to participate. Of these 16 experts, 14 (88%) completed the questionnaire for both procedures. Of the 14 who completed the first round, 13 (93%) completed the second round. Cronbach's alpha was 0.79 for the right hemicolectomy and 0.91 for the sigmoid colectomy, showing high internal consistency between the experts. For the right hemicolectomy, 25 key steps were established; for the sigmoid colectomy, 24 key steps were established. Expert consensus on the key steps for laparoscopic right hemicolectomy and laparoscopic sigmoid colectomy was reached. These key steps will form the basis for a video-assisted teaching curriculum.
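
    Cronbach's alpha, used above to gauge internal consistency among the experts, can be computed directly from the rating matrix; a hedged sketch with made-up Likert ratings (rows = procedure steps, columns = experts), not data from the study:

```python
import numpy as np

def cronbach_alpha(x):
    """Cronbach's alpha: rows = rated items (steps), columns = raters (experts)."""
    x = np.asarray(x, dtype=float)
    k = x.shape[1]                                # number of raters
    item_vars = x.var(axis=0, ddof=1).sum()       # sum of per-rater variances
    total_var = x.sum(axis=1).var(ddof=1)         # variance of row totals
    return k / (k - 1) * (1 - item_vars / total_var)

ratings = [[5, 4, 5], [4, 4, 4], [2, 3, 2], [5, 5, 4], [1, 2, 1]]
print(round(cronbach_alpha(ratings), 2))          # → 0.96
```

Values near 1, as reported for the sigmoid colectomy (0.91), indicate that the experts rank the importance of the steps very consistently.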

  14. Accelerated solvent extraction method with one-step clean-up for hydrocarbons in soil

    International Nuclear Information System (INIS)

    Nurul Huda Mamat Ghani; Norashikin Sain; Rozita Osman; Zuraidah Abdullah Munir

    2007-01-01

    The application of accelerated solvent extraction (ASE) using hexane combined with neutral silica gel and sulfuric acid/silica gel (SA/SG) to remove impurities prior to analysis by gas chromatography with flame ionization detection (GC-FID) was studied. The efficiency of extraction was evaluated based on three hydrocarbons (dodecane, tetradecane and pentadecane) spiked into a soil sample. The effect of the ASE operating conditions (extraction temperature, extraction pressure, static time) was evaluated, and the optimized conditions obtained from the study were an extraction temperature of 160 °C and an extraction pressure of 2000 psi with a 5-minute static extraction time. The developed ASE method with one-step clean-up was applied to the extraction of hydrocarbons from spiked soil, and the amount extracted was comparable to ASE extraction without a clean-up step, with the advantage of a cleaner extract with reduced interferences. With the developed method, extraction and clean-up of hydrocarbons in soil can therefore be achieved rapidly and efficiently with reduced solvent usage. (author)

  15. A three operator split-step method covering a larger set of non-linear partial differential equations

    Science.gov (United States)

    Zia, Haider

    2017-06-01

    This paper describes an updated exponential Fourier based split-step method that can be applied to a greater class of partial differential equations than previous methods would allow. These equations arise in physics and engineering, a notable example being the generalized derivative non-linear Schrödinger equation that arises in non-linear optics with self-steepening terms. These differential equations feature terms that were previously inaccessible to accurate modelling with low computational resources. The new method maintains 3rd-order error even with these additional terms and models the equation in all three spatial dimensions and time. The class of non-linear differential equations to which the method applies is shown. The method is fully derived, and its implementation in the split-step architecture is shown. This paper lays the mathematical groundwork for an upcoming paper employing this method in white-light generation simulations in bulk material.

  16. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are greatly increased in comparison with residuals analysis techniques. (author)

  17. A validated solid-liquid extraction method for the HPLC determination of polyphenols in apple tissues: comparison with pressurised liquid extraction.

    Science.gov (United States)

    Alonso-Salces, Rosa M; Barranco, Alejandro; Corta, Edurne; Berrueta, Luis A; Gallo, Blanca; Vicente, Francisca

    2005-02-15

    A solid-liquid extraction procedure followed by reversed-phase high-performance liquid chromatography (RP-HPLC) coupled with photodiode array detection (DAD) for the determination of polyphenols in freeze-dried apple peel and pulp is reported. The extraction step consists of sonicating 0.5 g of freeze-dried apple tissue with 30 mL of methanol-water-acetic acid (30:69:1, v/v/v) containing 2 g of ascorbic acid/L, for 10 min in an ultrasonic bath. The whole method was validated, concluding that it is a robust method with high extraction efficiencies (peel: >91%, pulp: >95%) and appropriate precisions (within day: R.S.D. (n = 5) <5%; between days: R.S.D. (n = 5) <7%) at the different concentration levels of polyphenols that can be found in apple samples. The method was compared with one previously published, consisting of pressurized liquid extraction (PLE) followed by RP-HPLC-DAD determination. The advantages and disadvantages of both methods are discussed.

  18. Validation of Theory: Exploring and Reframing Popper’s Worlds

    Directory of Open Access Journals (Sweden)

    Steven E. Wallis

    2008-12-01

    Popper’s well-known arguments describe the need for advancing social theory through a process of falsification. Despite Popper’s call, there has been little change in the academic process of theory development and testing. This paper builds on Popper’s lesser-known idea of “three worlds” (physical, emotional/conceptual, and theoretical) to investigate the relationship between knowledge, theory, and action. In this paper, I explore his three worlds to identify alternative routes to support the validation of theory. I suggest there are alternative methods for validation, both between and within the three worlds, and that a combination of validation and falsification methods may be superior to any one method. Integral thinking is also put forward to support the validation process. Rather than repeating the call for full Popperian falsification, this paper recognizes that the current level of social theorizing provides little opportunity for such falsification. Rather than sidestepping the goal of Popperian falsification, the paths suggested here may be seen as providing both validation and falsification as stepping-stones toward the goal of more effective social and organizational theory.

  19. Development and validation of a bioanalytical LC-MS method for the quantification of GHRP-6 in human plasma.

    Science.gov (United States)

    Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier

    2012-02-23

    Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs like small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled Alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with a high yield. Quantification was achieved by LC-MS in positive full scan mode in a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a range for the calibration curve from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to analyze samples at higher concentration of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its internal standard (IS) was demonstrated in all conditions the samples would experience in a real time analyses. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.
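
    Calibration-curve evaluation of the kind described above reduces to a linear fit and back-calculation; the sketch below uses invented response data over the reported 5-50 ng/mL range, not data from the study.

```python
import numpy as np

# Illustrative calibration sketch: fit a line to (concentration, response)
# pairs and back-calculate an unknown sample from its response.
conc = np.array([5, 10, 20, 30, 40, 50], dtype=float)   # ng/mL (invented)
resp = np.array([0.52, 1.01, 2.05, 2.96, 4.02, 4.99])   # analyte/IS ratio (invented)
slope, intercept = np.polyfit(conc, resp, 1)            # least-squares line
r2 = np.corrcoef(conc, resp)[0, 1] ** 2                 # goodness of fit
unknown = (2.5 - intercept) / slope                     # back-calculate, ng/mL
print(round(r2, 3), round(unknown, 1))                  # R² ≈ 1.0, unknown ≈ 24.9
```

Acceptance criteria like the reported R² > 0.988 are then checked on each of the validation curves.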

  20. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was documented twice at less than 50%. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was shown by kappa values (0.53-0.86). With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that many other growth indicators should be taken into consideration in evaluating adolescent skeletal maturation.
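
    The weighted kappa statistic used here for ordinal CVM stages can be computed directly; a minimal sketch with linear weights (the ratings are invented, and stage labels are coded 0..n-1):

```python
import numpy as np

def weighted_kappa(r1, r2, n_cat):
    """Linearly weighted kappa for two raters over n_cat ordinal categories."""
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(r1, r2):                    # observed agreement matrix
        obs[a, b] += 1
    obs /= obs.sum()
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))   # chance-expected matrix
    grid = np.arange(n_cat)
    w = np.abs(np.subtract.outer(grid, grid)) / (n_cat - 1)  # linear weights
    return 1 - (w * obs).sum() / (w * exp).sum()

print(weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # perfect agreement → 1.0
```

Unlike simple percent agreement (reported at under 50% between observers), weighted kappa corrects for chance and penalizes disagreements by their ordinal distance, which is why both figures are reported side by side.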

  1. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important for quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCh) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as the international guidelines remain nonbinding protocols that depend on the applied analytical technique and need to be adapted to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
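
    Typical accuracy and precision checks from such guidelines reduce to simple statistics; a hedged sketch (the QC values and the ±15% criterion are illustrative, not prescribed by any single guideline):

```python
import statistics

def bias_percent(measured, nominal):
    """Accuracy: relative deviation of the mean from the nominal value."""
    return (statistics.mean(measured) - nominal) / nominal * 100

def rsd_percent(measured):
    """Precision: relative standard deviation of replicate results."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100

qc = [9.6, 10.2, 9.9, 10.4, 9.8]       # replicate QC results, nominal = 10
ok = abs(bias_percent(qc, 10)) <= 15 and rsd_percent(qc) <= 15
print(round(bias_percent(qc, 10), 1), round(rsd_percent(qc), 1), ok)  # → -0.2 3.2 True
```

The same two quantities are evaluated at each QC level (e.g. low, mid, high), and the acceptance limits are typically tightened or relaxed depending on the concentration level and the guideline followed.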

  2. Quantitation of pregabalin in dried blood spots and dried plasma spots by validated LC-MS/MS methods.

    Science.gov (United States)

    Kostić, Nađa; Dotsikas, Yannis; Jović, Nebojša; Stevanović, Galina; Malenović, Anđelija; Medenica, Mirjana

    2015-05-10

    In this paper, novel LC-MS/MS methods for the determination of the antiepileptic drug pregabalin in dried matrix spots (DMS) are presented. This attractive technique for collecting micro amounts of sample was utilized in the form of dried blood spots (DBS) and dried plasma spots (DPS). Following a pre-column derivatization procedure using n-propyl chloroformate in the presence of n-propanol, and consecutive liquid-liquid extraction, derivatized pregabalin and its internal standard, 4-aminocyclohexanecarboxylic acid, were detected in positive ion mode by applying two SRM transitions per analyte. A YMC-Pack Octyl column (50 mm × 4.0 mm, 3 μm particle size), maintained at 30°C, was utilized with a mobile phase composed of acetonitrile-0.15% formic acid (85:15, v/v). The flow rate was 550 μL/min and the total run time 2 min. The established methods were fully validated over the concentration ranges of 0.200-20.0 μg/mL for DBS and 0.400-40.0 μg/mL for DPS, while specificity, accuracy, precision, recovery, matrix effect, stability, dilution integrity and spot homogeneity were found within acceptance criteria. The validated methods were applied for the determination of pregabalin levels in dried blood and plasma samples obtained from patients with epilepsy after per os administration of commercial capsules. A comparison of drug levels in blood and plasma is given, along with the correction steps undertaken to overcome the hematocrit issue when analyzing DBS.

  3. Method development and validation for the simultaneous determination of organochlorine and organophosphorus pesticides in a complex sediment matrix.

    Science.gov (United States)

    Alcántara-Concepción, Victor; Cram, Silke; Gibson, Richard; Ponce de León, Claudia; Mazari-Hiriart, Marisa

    2013-01-01

    The Xochimilco area in the southeastern part of Mexico City has a variety of socioeconomic activities, such as periurban agriculture, which is of great importance in the Mexico City metropolitan area. Pesticides are used extensively, some being legal, mostly chlorpyrifos and malathion, and some illegal, mostly DDT. Sediments are a common sink for pesticides in aquatic systems near agricultural areas, and Xochimilco sediments have a complex composition with high contents of organic matter and clay that are ideal adsorption sites for organochlorine (OC) and organophosphorus (OP) pesticides. Therefore, it is important to have a quick, affordable, and reliable method to determine these pesticides. Conventional methods for the determination of OC and OP pesticides are long, laborious, and costly owing to the high volume of solvents and adsorbents. The present study developed and validated a method for determining 18 OC and five OP pesticides in sediments with high organic and clay contents. In contrast with other methods described in the literature, this method allows isolation of the 23 pesticides with a 12 min microwave-assisted extraction (MAE) and one-step cleanup of pesticides. The method developed is a simpler, time-saving procedure that uses only 3.5 g of dry sediment. The use of MAE eliminates excessive handling and the possible loss of analytes. It was shown that the use of LC-Si cartridges with hexane-ethyl acetate (75+25, v/v) in the cleanup procedure recovered all pesticides with rates between 70 and 120%. The validation parameters demonstrated good performance of the method, with intermediate precision ranging from 7.3 to 17.0%, HorRat indexes all below 0.5, and tests of accuracy with the 23 pesticides at three concentration levels demonstrating recoveries ranging from 74 to 114% and RSDs from 3.3 to 12.7%.

  4. Development of a three dimensional circulation model based on fractional step method

    Directory of Open Access Journals (Sweden)

    Mazen Abualtayef

    2010-03-01

    A numerical model was developed for simulating three-dimensional multilayer hydrodynamics and thermodynamics in domains with irregular bottom topography. The model was designed for examining the interactions between flow and topography. It was based on the three-dimensional Navier-Stokes equations and was solved using the fractional step method, which combines the finite difference method in the horizontal plane with the finite element method in the vertical plane. The numerical techniques are described, and the model test and application are presented. In the model application to the northern part of the Ariake Sea, the hydrodynamic and thermodynamic results were predicted. The numerically predicted amplitudes and phase angles agreed well with the field observations.
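
    The fractional-step idea, advancing the solution through a sequence of simpler sub-steps, can be illustrated on a much smaller problem than the paper's. A hedged sketch (not the authors' model) splitting 1D advection-diffusion into an advection sub-step and a diffusion sub-step on a periodic grid:

```python
import numpy as np

def fractional_step(u, c, nu, dx, dt, nsteps):
    """Advance u by operator splitting: advection first, then diffusion."""
    for _ in range(nsteps):
        # Sub-step 1: advection with first-order upwind (assumes c > 0).
        u = u - c * dt / dx * (u - np.roll(u, 1))
        # Sub-step 2: diffusion with explicit central differences.
        u = u + nu * dt / dx ** 2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1))
    return u

n = 64
x = np.linspace(0.0, 1.0, n, endpoint=False)
u0 = np.sin(2 * np.pi * x)                # initial sine wave
u = fractional_step(u0.copy(), c=1.0, nu=0.01, dx=1.0 / n, dt=0.005, nsteps=100)
print(round(float(np.abs(u).max()), 3))   # amplitude decays below 1 (diffusion)
```

The time-step values satisfy the CFL and diffusive stability limits for this grid; a production solver would treat the diffusion term implicitly, as the paper's model does.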

  5. Triangulation, Respondent Validation, and Democratic Participation in Mixed Methods Research

    Science.gov (United States)

    Torrance, Harry

    2012-01-01

    Over the past 10 years or so the "Field" of "Mixed Methods Research" (MMR) has increasingly been exerting itself as something separate, novel, and significant, with some advocates claiming paradigmatic status. Triangulation is an important component of mixed methods designs. Triangulation has its origins in attempts to validate research findings…

  6. A Comparative Study Comparing Low-dose Step-up Versus Step-down in Polycystic Ovary Syndrome Resistant to Clomiphene

    Directory of Open Access Journals (Sweden)

    S Peivandi

    2010-03-01

    Introduction: Polycystic ovary syndrome (PCOS) is one of the most common causes of infertility in women. Clomiphene is the first line of treatment; however, 20% of patients are resistant to it. Because of follicular hypersensitivity to gonadotropins in PCOS, multiple follicular growth and development occur, which can cause ovarian hyperstimulation syndrome (OHSS) and multiple pregnancy. The aim of this randomized clinical study was to compare the step-down and low-dose step-up methods for ovulation induction in clomiphene-resistant patients. Methods: 60 cases were included: 30 women in the low-dose step-up group and 30 women in the step-down group. HMG was started on day 3 of the cycle at 75 U/day in the low-dose step-up group and at 225 U/day in the step-down group; monitoring with vaginal sonography was done on day 8 of the cycle. When a follicle >14 mm in diameter was seen, the HMG dose was continued in the low-dose step-up group and decreased in the step-down group. When a follicle reached 18 mm in diameter, 10,000 units of HCG were injected and IUI was performed 36 hours later. Results: The number of HMG ampoules, the number of follicles >14 mm on the day of HCG injection and the serum estradiol level were greater in the low-dose step-up protocol than in the step-down protocol (p<0.0001). Ovulation and pregnancy rates were also greater in the low-dose step-up group, with a significant difference (p<0.0001). Conclusion: Our study showed that the low-dose step-up regimen with HMG is effective for stimulating ovulation and achieving clinical pregnancy, but in view of monofollicular growth, the step-down method was more effective and safe. In our study, multifollicular growth in the step-up method was higher than in the step-down method, so a higher possibility of ovarian hyperstimulation syndrome can be expected in highly sensitive PCOS patients.

  7. Influence of the drying method in chitosans purification step

    International Nuclear Information System (INIS)

    Fonseca, Ana C.M.; Batista, Jorge G.S.; Bettega, Antonio; Lima, Nelson B. de

    2015-01-01

    Currently, the study of the properties of extracellular biopolymers has gained prominence because of their ease of extraction and purification. Chitosan has been an attractive proposition for applications in various fields such as engineering, biotechnology, medicine and pharmacology. For such applications, purification of chitosan is necessary to obtain a more concentrated product free of undesirable impurities. However, at this stage of the process of obtaining the biopolymer, morphological and physicochemical changes may occur. This study evaluated the influence of the drying process after purification of a commercial chitosan sample, and the importance of this step and its cost/benefit in applications requiring a high degree of purity. The drying method influenced the organoleptic properties and the main characteristics of the material. Analysis of the crystal structure by X-ray diffraction showed that the degree of crystallinity, X (%), in the purified chitosan samples was lower than in the unpurified sample. The degree of acetylation, DA (%), analyzed by infrared spectroscopy, showed no significant changes across the three drying methods assessed, unlike the viscosimetric molecular weight, M_v, determined by capillary viscometry. (author)

  8. Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy

    OpenAIRE

    Kolokitha, Olga-Elpis

    2007-01-01

    Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manu...

  9. Interacting steps with finite-range interactions: Analytical approximation and numerical results

    Science.gov (United States)

    Jaramillo, Diego Felipe; Téllez, Gabriel; González, Diego Luis; Einstein, T. L.

    2013-05-01

    We calculate an analytical expression for the terrace-width distribution P(s) for an interacting step system with nearest- and next-nearest-neighbor interactions. Our model is derived by mapping the step system onto a statistically equivalent one-dimensional system of classical particles. The validity of the model is tested with several numerical simulations and experimental results. We explore the effect of the range of interactions q on the functional form of the terrace-width distribution and pair correlation functions. For physically plausible interactions, we find modest changes when next-nearest neighbor interactions are included and generally negligible changes when more distant interactions are allowed. We discuss methods for extracting from simulated experimental data the characteristic scale-setting terms in assumed potential forms.

  10. The Validity of Dimensional Regularization Method on Fractal Spacetime

    Directory of Open Access Journals (Sweden)

    Yong Tao

    2013-01-01

    Svozil developed a regularization method for quantum field theory on fractal spacetime (1987). Such a method can be applied to the low-order perturbative renormalization of quantum electrodynamics but will depend on a conjectural integral formula on non-integer-dimensional topological spaces. The main purpose of this paper is to construct a fractal measure so as to guarantee the validity of the conjectural integral formula.

  11. Influence of application methods of one-step self-etching adhesives on microtensile bond strength

    OpenAIRE

    Chul-Kyu Choi,; Sung-Ae Son; Jin-Hee Ha; Bock Hur; Hyeon-Cheol Kim; Yong-Hun Kwon; Jeong-Kil Park

    2011-01-01

    Objectives The purpose of this study was to evaluate the effect of various application methods of one-step self-etch adhesives to microtensile resin-dentin bond strength. Materials and Methods Thirty-six extracted human molars were used. The teeth were assigned randomly to twelve groups (n = 15), according to the three different adhesive systems (Clearfil Tri-S Bond, Adper Prompt L-Pop, G-Bond) and application methods. The adhesive systems were applied on the dentin as follows: 1) T...

  12. CosmoQuest: Using Data Validation for More Than Just Data Validation

    Science.gov (United States)

    Lehan, C.; Gay, P.

    2016-12-01

    It is often taken for granted that different scientists completing the same task (e.g. mapping geologic features) will get the same results, and data validation is often skipped or under-utilized due to time and funding constraints. Robbins et al. (2014), however, demonstrated that this is a needed step, as large variation can exist even among collaborating team members completing straightforward tasks like marking craters. Data validation should be much more than a simple post-project verification of results. The CosmoQuest virtual research facility employs regular data validation for a variety of benefits, including real-time user feedback, real-time tracking to observe user activity while it is happening, and using pre-solved data to analyze users' progress and help them retain skills. Some creativity in this area can drastically improve project results. We discuss methods of validating data in citizen science projects and outline the variety of uses for validation, which, when used properly, improves the scientific output of the project and the user experience for the citizens doing the work. More than just a tool for scientists, validation can assist users in both learning and retaining important information and skills, improving the quality and quantity of data gathered. Real-time analysis of user data can give key information on the effectiveness of the project that a broad glance would miss, and properly presenting that analysis is vital. Training users to validate their own data, or the data of others, can significantly improve the accuracy of misinformed or novice users.

  13. A GPU-accelerated semi-implicit fractional-step method for numerical solutions of incompressible Navier-Stokes equations

    Science.gov (United States)

    Ha, Sanghyun; Park, Junshin; You, Donghyun

    2018-01-01

    Utility of the computational power of Graphics Processing Units (GPUs) is elaborated for solutions of incompressible Navier-Stokes equations which are integrated using a semi-implicit fractional-step method. The Alternating Direction Implicit (ADI) and the Fourier-transform-based direct solution methods used in the semi-implicit fractional-step method take advantage of multiple tridiagonal matrices whose inversion is known as the major bottleneck for acceleration on a typical multi-core machine. A novel implementation of the semi-implicit fractional-step method designed for GPU acceleration of the incompressible Navier-Stokes equations is presented. Aspects of the programming model of Compute Unified Device Architecture (CUDA), which are critical to the bandwidth-bound nature of the present method, are discussed in detail. A data layout for efficient use of CUDA libraries is proposed for acceleration of tridiagonal matrix inversion and fast Fourier transform. OpenMP is employed for concurrent collection of turbulence statistics on a CPU while the Navier-Stokes equations are computed on a GPU. Performance of the present method using CUDA is assessed by comparing the speed of solving three tridiagonal matrices using ADI with the speed of solving one heptadiagonal matrix using a conjugate gradient method. An overall speedup of 20 times is achieved using a Tesla K40 GPU in comparison with a single-core Xeon E5-2660 v3 CPU in simulations of turbulent boundary-layer flow over a flat plate conducted on over 134 million grids. Enhanced performance of 48 times speedup is reached for the same problem using a Tesla P100 GPU.
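
    The tridiagonal systems that ADI schemes solve along each grid line are handled serially on a CPU by the O(n) Thomas algorithm, the per-line baseline against which GPU batched solvers are measured. A hedged sketch (not the paper's CUDA implementation):

```python
import numpy as np

def thomas(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal system in O(n).
    a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal
    (c[-1] unused), d: right-hand side."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):            # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Solve a small diffusion-like system and verify against the dense matrix.
n = 8
a = np.full(n, -1.0); a[0] = 0.0
c = np.full(n, -1.0); c[-1] = 0.0
b = np.full(n, 4.0)
d = np.arange(1.0, n + 1)
x = thomas(a, b, c, d)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(A @ x, d))  # True
```

The data-dependent forward/backward sweeps are exactly why a naive GPU port is bandwidth-bound, motivating the batched data layout the paper proposes.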

  14. Development and validation of reversed-phase high performance liquid chromatographic method for analysis of cephradine in human plasma samples

    International Nuclear Information System (INIS)

    Ahmad, M.; Usman, M.; Madni, A.; Akhtar, N.; Khalid, N.; Asghar, W.

    2010-01-01

    An HPLC method with high precision, accuracy and selectivity was developed and validated for the assessment of cephradine in human plasma samples. The extraction procedure was simple and accurate, with a single step followed by direct injection of the sample into the HPLC system. The extracted cephradine in spiked human plasma was separated and quantitated using a reversed-phase C18 column and a UV detection wavelength of 254 nm. The optimized mobile phase, a new composition of 0.05 M potassium dihydrogen phosphate (pH 3.4)-acetonitrile (88:12), was pumped at an optimum flow rate of 1 mL/min. The method was linear over the concentration range 0.15-20 μg/mL. The limit of detection (LOD) and limit of quantification (LOQ) were 0.05 and 0.150 μg/mL, respectively. The accuracy of the method was 98.68%. This method can be applied for bioequivalence studies and therapeutic drug monitoring as well as for the routine analysis of cephradine. (author)
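
    LOD and LOQ figures of the kind reported above are commonly estimated from the calibration line as 3.3σ/S and 10σ/S, where σ is the residual standard deviation and S the slope. A sketch with hypothetical calibration data (not the paper's measurements):

```python
import numpy as np

# Hypothetical calibration data (peak area vs. concentration in ug/mL);
# the numbers are illustrative, not taken from the paper.
conc = np.array([0.15, 0.5, 1.0, 2.5, 5.0, 10.0, 20.0])
resp = np.array([0.9, 3.1, 6.2, 15.4, 30.8, 61.5, 123.2])

# Least-squares calibration line; sigma is the residual standard deviation.
S, intercept = np.polyfit(conc, resp, 1)
sigma = (resp - (S * conc + intercept)).std(ddof=2)

lod = 3.3 * sigma / S    # limit of detection
loq = 10.0 * sigma / S   # limit of quantification
print(f"slope={S:.2f}, LOD={lod:.3f} ug/mL, LOQ={loq:.3f} ug/mL")
```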

  15. Radiation grafting of pH-sensitive acrylic acid and 4-vinyl pyridine onto nylon-6 using one- and two-step methods

    International Nuclear Information System (INIS)

    Ortega, Alejandra; Alarcón, Darío; Muñoz-Muñoz, Franklin; Garzón-Fontecha, Angélica; Burillo, Guillermina

    2015-01-01

    Acrylic acid (AAc) and 4-vinyl pyridine (4VP) were γ-ray grafted onto nylon-6 (Ny6) films via the pre-irradiation oxidative method. These monomers were grafted using a one-step method to render Ny6-g-(AAc/4VP). A two-step or sequential method was used to render (Ny6-g-AAc)-g-4VP. Random copolymer branches were obtained when the grafting was carried out via the one-step method using the two monomers together. The two-step method was applied to graft chains of 4VP onto both the Ny6 substrate and previously grafted AAc chains (Ny6-g-AAc). The two types of binary copolymers synthesized were characterized to determine the amount of grafted polymer, the thermal behavior (DSC and TGA), the surface composition (XPS), and the pH responsiveness. With the two-step process, it is possible to achieve a higher graft yield, better control of the amount of each monomer, good reversibility in the swelling/deswelling process and a shorter time to reach equilibrium swelling. - Highlights: • A new binary graft of 4VP and AAc onto Ny6 films was synthesized by γ-radiation. • The binary grafted material has potential application for heavy-ion retention. • The two-step method shows better swelling and reversibility properties. • The surface distribution of monomers was evaluated by XPS characterization

  16. Canine distemper virus detection by different methods of One-Step RT-qPCR

    Directory of Open Access Journals (Sweden)

    Claudia de Camargo Tozato

    2016-01-01

    Three commercial One-Step RT-qPCR kits were evaluated for the molecular diagnosis of Canine Distemper Virus. Using the kit that showed the best performance, two real-time RT-PCR (RT-qPCR) systems were tested and compared for analytical sensitivity in Canine Distemper Virus RNA detection: a One-Step RT-qPCR (System A) and a One-Step RT-qPCR combined with NESTED-qPCR (System B). Limits of detection for both systems were determined using serial dilutions of Canine Distemper Virus synthetic RNA or a positive urine sample. In addition, the same urine sample was tested with prior centrifugation or ultracentrifugation. The commercial One-Step RT-qPCR kits detected canine distemper virus RNA in 10 (100%) of the urine samples from symptomatic animals tested. The kit that showed the best results was used to evaluate the analytical sensitivity of Systems A and B. The limit of detection using synthetic RNA was 11 RNA copies µL-1 for System A and 110 RNA copies µL-1 for the first round of System B. The second round of the NESTED-qPCR for System B had a limit of detection of 11 copies µL-1. The relationship between Ct values and RNA concentration was linear. The RNA extracted from the urine dilutions was detected at dilutions of 10-3 and 10-2 by Systems A and B, respectively. Urine centrifugation increased the analytical sensitivity of the test and proved useful for routine diagnostics. One-Step RT-qPCR is a fast, sensitive and specific method for routine canine distemper diagnosis and for research projects that require a sensitive and quantitative methodology.

  17. Three-step interferometric method with blind phase shifts by use of interframe correlation between interferograms

    Science.gov (United States)

    Muravsky, Leonid I.; Kmet', Arkady B.; Stasyshyn, Ihor V.; Voronyak, Taras I.; Bobitski, Yaroslav V.

    2018-06-01

    A new three-step interferometric method with blind phase shifts to retrieve phase maps (PMs) of smooth and low-roughness engineering surfaces is proposed. The two unknown phase shifts are evaluated using the interframe correlation between interferograms. The method consists of two stages. The first stage records three interferograms of a test object and processes them, including calculation of the unknown phase shifts, and retrieves a coarse PM. The second stage first separates the high-frequency and low-frequency PMs and then produces a fine PM consisting of areal surface roughness and waviness PMs. Extraction of the areal surface roughness and waviness PMs is performed with a linear low-pass filter. Computer simulation and experiments retrieving a gauge block surface area and its areal surface roughness and waviness confirmed the reliability of the proposed three-step method.
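
    The phase-retrieval step underlying any three-step scheme is easiest to see with known shifts. A sketch of the classic three-step formula for shifts of 0, 2π/3 and 4π/3 (the paper's contribution is estimating unknown "blind" shifts via interframe correlation, which is not shown here):

```python
import numpy as np

# Synthetic fringe data: I_k = A + B*cos(phi + delta_k).
x = np.linspace(0.0, 1.0, 256)
phi = 2 * np.pi * x                 # ground-truth phase ramp
A, B = 1.0, 0.5                     # background and modulation
shifts = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
I1, I2, I3 = (A + B * np.cos(phi + d) for d in shifts)

# Three-step formula: tan(phi) = sqrt(3)*(I3 - I2) / (2*I1 - I2 - I3).
phase = np.arctan2(np.sqrt(3) * (I3 - I2), 2 * I1 - I2 - I3)

# Residual after unwrapping the 2*pi ambiguity is numerically zero.
err = np.angle(np.exp(1j * (phase - phi)))
print(float(np.abs(err).max()) < 1e-9)  # True
```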

  18. Effect of One-Step and Multi-Steps Polishing System on Enamel Roughness

    Directory of Open Access Journals (Sweden)

    Cynthia Sumali

    2013-07-01

    The final procedures of orthodontic treatment are bracket debonding and cleaning of the remaining adhesive. The multi-step polishing system is the most commonly used method. The disadvantage of that system is the long working time, because of the stages that must be completed. Therefore, dental material manufacturers have improved the system, reducing several stages to a single one. This new system is known as the one-step polishing system. Objective: To compare the effect of one-step and multi-step polishing systems on enamel roughness after orthodontic bracket debonding. Methods: A randomized controlled trial was conducted on twenty-eight maxillary premolars randomized into two polishing systems: one-step OptraPol (Ivoclar, Vivadent) and multi-step AstroPol (Ivoclar, Vivadent). After bracket debonding, the remaining adhesive in each group was cleaned with the respective polishing system for ninety seconds using a low-speed handpiece. The enamel roughness was measured with a profilometer, registering two roughness parameters (Ra, Rz). An independent t-test was used to analyze the mean enamel roughness in each group. Results: There was no significant difference in enamel roughness between the one-step and multi-step polishing systems (p>0.005). Conclusion: The one-step polishing system can produce enamel roughness similar to the multi-step polishing system after bracket debonding and adhesive cleaning. DOI: 10.14693/jdi.v19i3.136

  19. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. Such model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  20. Validering av vattenkraftmodeller i ARISTO [Validation of hydropower models in ARISTO]

    OpenAIRE

    Lundbäck, Maja

    2013-01-01

    This master thesis was made to validate hydropower models of a turbine governor, a Kaplan turbine and a Francis turbine in the power system simulator ARISTO at Svenska Kraftnät. The validation was made in three steps. The first step was to make sure the models were implemented correctly in the simulator. The second was to compare the simulation results from the Kaplan turbine model to data from a real hydropower plant. The comparison was made to see how the models could generate simulation result ...

  1. Development and Validation of a RP-HPLC Method for the ...

    African Journals Online (AJOL)

    Development and Validation of a RP-HPLC Method for the Simultaneous Determination of Rifampicin and a Flavonoid Glycoside - A Novel ... range, accuracy, precision, limit of detection, limit of quantification, robustness and specificity.

  2. Validation parameters of instrumental method for determination of total bacterial count in milk

    Directory of Open Access Journals (Sweden)

    Nataša Mikulec

    2004-10-01

    The method of flow cytometry, as a rapid, instrumental and routine microbiological method, is used for determination of the total bacterial count in milk. The results of flow cytometry are expressed as an individual bacterial cell count. Problems regarding the interpretation of total bacterial count results can be avoided by transforming the results of the flow cytometry method onto the scale of the reference method (HRN ISO 6610:2001). The method of flow cytometry, like any analytical method, requires validation and verification according to the HRN EN ISO/IEC 17025:2000 standard. This paper describes the validation parameters accuracy, precision, specificity, range, robustness and measurement uncertainty for the flow cytometry method.
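
    Transforming rapid-method results onto the reference-method scale is typically done by linear regression in log10 space. A sketch with hypothetical paired counts (the coefficients below are illustrative, not from the study):

```python
import numpy as np

# Hypothetical paired counts (not the study's data): flow cytometry
# individual bacterial counts (IBC/mL) and reference plate counts (CFU/mL).
ibc = np.array([2.0e4, 8.0e4, 3.0e5, 1.2e6, 5.0e6])
cfu = np.array([1.1e4, 4.5e4, 1.6e5, 7.0e5, 2.8e6])

# Fit the conversion line in log10 space: log10(CFU) = a + b * log10(IBC).
b, a = np.polyfit(np.log10(ibc), np.log10(cfu), 1)

def ibc_to_cfu(x):
    """Convert an instrument IBC result onto the reference CFU scale."""
    return 10.0 ** (a + b * np.log10(x))

print(f"log10(CFU) = {a:.2f} + {b:.2f} * log10(IBC)")
```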

  3. [Steps to transform a necessity into a validated and useful screening tool for early detection of developmental problems in Mexican children].

    Science.gov (United States)

    Rizzoli-Córdoba, Antonio; Delgado-Ginebra, Ismael

    A screening test is an instrument whose primary function is to identify individuals with a probable disease among an apparently healthy population, establishing risk or suspicion of a disease. Caution must be taken when using a screening tool in order to avoid unrealistic measurements that delay an intervention for those who may benefit from it. Before introducing a screening test into clinical practice, it is necessary to certify that it has the characteristics that make it useful. This "certification" process is called validation. The main objective of this paper is to describe the different steps that must be taken, from the identification of a need for early detection through the generation of a validated and reliable screening tool, using as an example the process for the modified version of the Child Development Evaluation Test (CDE or Prueba EDI) in Mexico.

  4. Sharp Penalty Term and Time Step Bounds for the Interior Penalty Discontinuous Galerkin Method for Linear Hyperbolic Problems

    NARCIS (Netherlands)

    Geevers, Sjoerd; van der Vegt, J.J.W.

    2017-01-01

    We present sharp and sufficient bounds for the interior penalty term and time step size to ensure stability of the symmetric interior penalty discontinuous Galerkin (SIPDG) method combined with an explicit time-stepping scheme. These conditions hold for generic meshes, including unstructured

  5. Large-area gold nanohole arrays fabricated by one-step method for surface plasmon resonance biochemical sensing.

    Science.gov (United States)

    Qi, Huijie; Niu, Lihong; Zhang, Jie; Chen, Jian; Wang, Shujie; Yang, Jingjing; Guo, Siyi; Lawson, Tom; Shi, Bingyang; Song, Chunpeng

    2018-04-01

    Surface plasmon resonance (SPR) nanosensors based on metallic nanohole arrays have been widely reported to detect binding interactions in biological specimens. A simple and effective method for constructing nanoscale arrays is essential for the development of SPR nanosensors. In this work, we report a one-step method to fabricate nanohole arrays by thermal nanoimprinting in the matrix of IPS (Intermediate Polymer Stamp). No additional etching process or supporting substrate is required. The preparation process is simple, time-saving and compatible with roll-to-roll processing, potentially allowing mass production. Moreover, the nanohole arrays were integrated into a detection platform as SPR sensors to investigate different types of biological binding interactions. The results demonstrate that our one-step method can be used to efficiently fabricate large-area, uniform nanohole arrays for biochemical sensing.

  6. Reliability and Validity of Ten Consumer Activity Trackers Depend on Walking Speed

    NARCIS (Netherlands)

    Fokkema, Tryntsje; Kooiman, Thea J. M.; Krijnen, Wim P.; Van der Schans, Cees P.; De Groot, Martijn

    Purpose: To examine the test-retest reliability and validity of ten activity trackers for step counting at three different walking speeds. Methods: Thirty-one healthy participants walked twice on a treadmill for 30 min while wearing 10 activity trackers (Polar Loop, Garmin Vivosmart, Fitbit Charge

  7. Reliability and validity of ten consumer activity trackers depend on walking speed

    NARCIS (Netherlands)

    Fokkema, Tryntsje; Kooiman, Thea; Krijnen, Wim; van der Schans, Cees; de Groot, Martijn

    Purpose: To examine the test–retest reliability and validity of ten activity trackers for step counting at three different walking speeds. Methods: Thirty-one healthy participants walked twice on a treadmill for 30 min while wearing 10 activity trackers (Polar Loop, Garmin Vivosmart, Fitbit Charge

  8. On the limitations of fixed-step-size adaptive methods with response confidence.

    Science.gov (United States)

    Hsu, Yung-Fong; Chin, Ching-Lan

    2014-05-01

    The family of (non-parametric, fixed-step-size) adaptive methods, also known as 'up-down' or 'staircase' methods, has been used extensively in psychophysical studies for threshold estimation. Extensions of adaptive methods to non-binary responses have also been proposed. An example is the three-category weighted up-down (WUD) method (Kaernbach, 2001) and its four-category extension (Klein, 2001). Such an extension, however, is somewhat restricted, and in this paper we discuss its limitations. To facilitate the discussion, we characterize the extension of WUD by an algorithm that incorporates response confidence into a family of adaptive methods. This algorithm can also be applied to two other adaptive methods, namely Derman's up-down method and the biased-coin design, which are suitable for estimating any threshold quantile. We then discuss the limitations of the algorithm via simulations of the above three methods. To illustrate, we conduct a small-scale experiment using the extended WUD under different response-confidence formats to evaluate the consistency of threshold estimation.
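
    The basic fixed-step-size staircase logic discussed here can be sketched as a 1-up/2-down simulation, which converges to the ~70.7%-correct point of the psychometric function (a generic illustration, not the WUD extension or its response-confidence variant):

```python
import numpy as np

rng = np.random.default_rng(0)

def pcorrect(level, threshold=0.0, slope=1.0):
    """Logistic psychometric function giving P(correct) at a stimulus level."""
    return 1.0 / (1.0 + np.exp(-slope * (level - threshold)))

def staircase(ntrials=2000, step=0.1, start=3.0):
    """1-up/2-down staircase: two correct -> step down, one wrong -> step up."""
    level, correct_run, track = start, 0, []
    for _ in range(ntrials):
        track.append(level)
        if rng.random() < pcorrect(level):
            correct_run += 1
            if correct_run == 2:
                level -= step
                correct_run = 0
        else:
            level += step
            correct_run = 0
    return float(np.mean(track[ntrials // 2:]))  # average of late trials

est = staircase()
# 1-up/2-down targets P(correct) = sqrt(0.5) ~ 0.707; for this logistic
# that corresponds to a level of log(0.707/0.293) ~ 0.88.
print(round(est, 2))
```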

  9. Validity and Reliability of Turkish Male Breast Self-Examination Instrument.

    Science.gov (United States)

    Erkin, Özüm; Göl, İlknur

    2018-04-01

    This study aims to measure the validity and reliability of the Turkish male breast self-examination (MBSE) instrument. The methodological study was performed in 2016 at Ege University, Faculty of Nursing, İzmir, Turkey. The MBSE includes ten steps. For the validity studies, face validity, content validity, and construct validity (exploratory factor analysis) were assessed. For the reliability study, the Kuder-Richardson coefficient was calculated. The content validity index was found to be 0.94. Kendall's W coefficient was 0.80 (p=0.551). The total variance explained by the two factors was 63.24%. Kuder-Richardson 21 was used for the reliability study and found to be 0.97 for the instrument. The final instrument included 10 steps and two stages. The Turkish version of the MBSE is a valid and reliable instrument for early diagnosis. The MBSE can be used in Turkish-speaking countries and cultures with two stages and 10 steps.

  10. A study on development of the step complexity measure for emergency operating procedures using entropy concepts

    International Nuclear Information System (INIS)

    Park, J. K.; Jung, W. D.; Kim, J. W.; Ha, J. J.

    2001-04-01

    In complex systems, such as nuclear power plants (NPPs) or airplane control systems, human errors play a major role in many accidents. For example, it was reported that about 70% of aviation accidents are due to human errors, and that approximately 28% of accidents in process industries are caused by human errors. According to related studies, written manuals and operating procedures are among the most important contributing factors in the aviation and manufacturing industries. In the case of NPPs, the importance of procedures is even more salient, because not only were over 50% of human errors due to procedures, but about 18% of accidents were also caused by failure to follow procedures. Thus, the provision of emergency operating procedures (EOPs) designed so that the possibility of human error is reduced is very important. To accomplish this goal, a quantitative and objective measure that can evaluate EOPs is indispensable. The purpose of this study is the development of a method that can quantify the complexity of a step included in EOPs. In this regard, the step complexity (SC) measure is developed based on three sub-measures: the SIC (step information complexity), the SLC (step logic complexity) and the SSC (step size complexity). To verify the SC measure, not only quantitative validations (such as comparing SC scores with subjective evaluation results and with averaged step performance times) but also qualitative validations to clarify the physical meaning of the SC measure are performed.
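The record combines three entropy-based sub-measures (SIC, SLC, SSC) into a single SC score. The sketch below is only a loose illustration of that idea, not the authors' actual formulas: it uses first-order Shannon entropy over hypothetical count distributions, a log2 size term, and arbitrary weights.

```python
import math

def shannon_entropy(counts):
    """First-order Shannon entropy (bits) of a frequency distribution."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

def step_complexity(info_counts, logic_counts, n_actions, w=(0.4, 0.4, 0.2)):
    """Illustrative SC score for one EOP step (weights and inputs hypothetical):
    SIC from the mix of information types the step references,
    SLC from the mix of logic constructs (AND/OR/sequence),
    SSC from the number of distinct actions."""
    sic = shannon_entropy(info_counts)
    slc = shannon_entropy(logic_counts)
    ssc = math.log2(n_actions) if n_actions > 1 else 0.0
    return w[0] * sic + w[1] * slc + w[2] * ssc
```

A step referencing one information type, one logic construct and one action scores 0; more heterogeneous steps score higher, which is the qualitative behaviour an entropy-based complexity measure is meant to capture.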

  12. Mixed convection flow and heat transfer over different geometries of backward-facing step

    Directory of Open Access Journals (Sweden)

    BADER SHABEEB ALSHURAIAAN

    2013-12-01

    Full Text Available Mixed convective flow and heat transfer characteristics for two-dimensional laminar flow in a channel with different geometries of a backward-facing step are presented for various Grashof numbers. The wall downstream of the step was maintained at a constant temperature, TH, while the upper wall was considered isothermal at TC. The wall upstream of the step and the backward-facing step itself were considered adiabatic surfaces. The Navier-Stokes equations were employed to represent the transport phenomena in the channel, and the governing equations were solved using a finite element formulation based on the Galerkin method of weighted residuals. The numerical results for the reattachment lengths of the recirculation region in a vertical channel with a backward-facing step (Re = 100) were validated by comparison against documented studies in the literature. The results of this investigation show that the local skin friction coefficient increases with an increase in Grashof number. Configuration II of the backward-facing step (inclined) exhibited an absence of vortices for all values of Grashof number and consequently the minimum skin friction coefficient, whereas configuration I was found to have the largest local skin friction coefficient.

  13. A clinical decision support system algorithm for intravenous to oral antibiotic switch therapy: validity, clinical relevance and usefulness in a three-step evaluation study.

    Science.gov (United States)

    Akhloufi, H; Hulscher, M; van der Hoeven, C P; Prins, J M; van der Sijs, H; Melles, D C; Verbon, A

    2018-04-26

    To evaluate a clinical decision support system (CDSS) based on consensus-based intravenous to oral switch criteria, which identifies intravenous to oral switch candidates. A three-step evaluation study of a stand-alone CDSS with electronic health record interoperability was performed at the Erasmus University Medical Centre in the Netherlands. During the first step, we performed a technical validation. During the second step, we determined the sensitivity, specificity, negative predictive value and positive predictive value in a retrospective cohort of all hospitalized adult patients starting at least one therapeutic antibacterial drug between 1 and 16 May 2013. ICU, paediatric and psychiatric wards were excluded. During the last step, the clinical relevance and usefulness were prospectively assessed by reports to infectious disease specialists. An alert was considered clinically relevant if antibiotics could be discontinued or switched to oral therapy at the time of the alert. During the first step, one technical error was found. The second step yielded a positive predictive value of 76.6% and a negative predictive value of 99.1%. The third step showed that alerts were clinically relevant in 53.5% of patients. For 43.4%, the treating physician had already decided to discontinue or switch the intravenous antibiotics. In 10.1%, the alert resulted in advice to change antibiotic policy and was considered useful. This prospective cohort study shows that the alerts were clinically relevant in >50% of patients (n = 449) and useful in 10% (n = 85). The CDSS needs to be evaluated in hospitals with varying activity of infectious disease consultancy services as this probably influences usefulness.
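The four diagnostic metrics named in the record come straight from a 2x2 confusion matrix. The helper below shows the standard definitions; the counts in the usage example are hypothetical, chosen only so that they reproduce the reported PPV (76.6%) and NPV (99.1%), and are not taken from the study.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix
    (tp/fp/tn/fn = true/false positives/negatives)."""
    return {
        "sensitivity": tp / (tp + fn),   # alerts fired among true candidates
        "specificity": tn / (tn + fp),   # silence among true non-candidates
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts matching the reported PPV/NPV:
m = diagnostic_metrics(tp=72, fp=22, tn=880, fn=8)
```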

  14. Validity and Reliability of Accelerometers in Patients With COPD: A SYSTEMATIC REVIEW.

    Science.gov (United States)

    Gore, Shweta; Blackwood, Jennifer; Guyette, Mary; Alsalaheen, Bara

    2018-05-01

    Reduced physical activity is associated with poor prognosis in chronic obstructive pulmonary disease (COPD). Accelerometers have greatly improved quantification of physical activity by providing information on step counts, body positions, energy expenditure, and magnitude of force. The purpose of this systematic review was to compare the validity and reliability of accelerometers used in patients with COPD. An electronic database search of MEDLINE and CINAHL was performed. Study quality was assessed with the Strengthening the Reporting of Observational Studies in Epidemiology checklist while methodological quality was assessed using the modified Quality Appraisal Tool for Reliability Studies. The search yielded 5392 studies; 25 met inclusion criteria. The SenseWear Pro armband reported high criterion validity under controlled conditions (r = 0.75-0.93) and high reliability (ICC = 0.84-0.86) for step counts. The DynaPort MiniMod demonstrated highest concurrent validity for step count using both video and manual methods. Validity of the SenseWear Pro armband varied between studies especially in free-living conditions, slower walking speeds, and with addition of weights during gait. A high degree of variability was found in the outcomes used and statistical analyses performed between studies, indicating a need for further studies to measure reliability and validity of accelerometers in COPD. The SenseWear Pro armband is the most commonly used accelerometer in COPD, but measurement properties are limited by gait speed variability and assistive device use. DynaPort MiniMod and Stepwatch accelerometers demonstrated high validity in patients with COPD but lack reliability data.
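The reliability figures quoted in the review are intraclass correlation coefficients. As a reference point, the sketch below computes ICC(3,1) (two-way mixed, consistency) from ANOVA mean squares; note this is only one of several ICC forms, and the individual studies in the review may have used others.

```python
import numpy as np

def icc_consistency(X):
    """ICC(3,1), two-way mixed, consistency definition.
    X is a (subjects x trials/devices) matrix of measurements."""
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)                 # per-subject means
    col_means = X.mean(axis=0)                 # per-trial means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((X - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

Under the consistency definition, a constant offset between two trials (e.g. one device always counting one step more) still yields ICC = 1, which is why absolute-agreement forms are sometimes preferred for device validation.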

  15. Development and validation of a confirmatory method for the determination of 12 non steroidal anti-inflammatory drugs in milk using liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Dubreil-Chéneau, Estelle; Pirotais, Yvette; Bessiral, Mélaine; Roudaut, Brigitte; Verdon, Eric

    2011-09-16

    A rapid and reliable LC-MS/MS method for the simultaneous confirmation of twelve non-steroidal anti-inflammatory drugs (NSAIDs) in bovine milk was developed and fully validated in accordance with European Commission Decision 2002/657/EC. The validation scheme was built in accordance with the MRLs or target analytical levels (EU-CRL recommended concentrations and detection capabilities) of the analytes, except for diclofenac, for which the lowest level of validation achieved was 0.5 μg kg(-1) whereas its MRL is 0.1 μg kg(-1). The NSAIDs investigated were as follows: phenylbutazone (PBZ), oxyphenylbutazone (OPB), naproxen (NP), mefenamic acid (MF), vedaprofen (VDP), flunixin (FLU), 5-hydroxyflunixin (FLU-OH), tolfenamic acid (TLF), meloxicam (MLX), diclofenac (DC), carprofen (CPF) and ketoprofen (KTP). Several extraction procedures were investigated during the development phase. Finally, the best results were obtained with a procedure using only methanol as the extraction solvent, with an evaporation step included and no further purification. Chromatographic separation was achieved on a C18 analytical column and the run was split into 2 segments. Matrix effects were also investigated. Data acquisition for the confirmatory purpose was performed by monitoring 2 MRM transitions per analyte under the negative electrospray mode. Mean relative recoveries ranged from 94.7% to 110.0%, with their coefficients of variation lying between 2.9% and 14.7%. Analytical limits expressed in terms of decision limits (CCα) were evaluated between 0.69 μg kg(-1) (FLU) and 27.54 μg kg(-1) (VDP) for non-MRL compounds, and at 0.10 (DC), 15.37 (MLX), 45.08 (FLU-OH), and 62.96 μg kg(-1) (TLF) for MRL compounds. The validation results proved that the method is suitable for the screening and confirmatory steps as implemented for the French monitoring plan for NSAID residue control in bovine milk. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. A Simple Three-Step Method for Design and Affinity Testing of New Antisense Peptides: An Example of Erythropoietin

    Directory of Open Access Journals (Sweden)

    Nikola Štambuk

    2014-05-01

    Full Text Available Antisense peptide technology is a valuable tool for deriving new biologically active molecules and performing peptide–receptor modulation. It is based on the fact that peptides specified by complementary (antisense) nucleotide sequences often bind to each other with higher specificity and efficacy. We tested the validity of this concept on the example of human erythropoietin, a well-characterized and pharmacologically relevant hematopoietic growth factor. The purpose of the work was to present and test a simple and efficient three-step procedure for the design of an antisense peptide targeting the receptor-binding site of human erythropoietin. First, we selected the carboxyl-terminal receptor-binding region of the molecule (the epitope) as a template for antisense peptide modeling. Second, we designed an antisense peptide using mRNA transcription of the epitope sequence in the 3'→5' direction and computational screening of potential paratope structures with BLAST. Third, we evaluated sense–antisense (epitope–paratope) peptide binding and affinity by means of fluorescence spectroscopy and microscale thermophoresis. Both methods showed similar Kd values of 850 and 816 µM, respectively. The advantages of the methods were fast screening with a small quantity of sample, and measurements within a range of physicochemical parameters resembling physiological conditions. Antisense peptides targeting specific erythropoietin region(s) could be used for the development of new immunochemical methods. Selected antisense peptides with optimal affinity are potential lead compounds for the development of novel diagnostic substances, biopharmaceuticals and vaccines.

  17. The Role of Generalizability in Validity.

    Science.gov (United States)

    Kane, Michael

    The relationship between generalizability and validity is explained, making four important points. The first is that generalizability coefficients provide upper bounds on validity. The second point is that generalization is one step in most interpretive arguments, and therefore, generalizability is a necessary condition for the validity of these…

  18. One-Step Method for Preparation of Magnetic Nanoparticles Coated with Chitosan

    Directory of Open Access Journals (Sweden)

    Karla M. Gregorio-Jauregui

    2012-01-01

    Full Text Available The preparation of magnetic nanoparticles coated with chitosan in one step by the coprecipitation method, in the presence of different chitosan concentrations, is reported here. The formation of superparamagnetic nanoparticles was confirmed by X-ray diffraction and magnetic measurements. Scanning transmission electron microscopy identified spheroidal nanoparticles with an average diameter of around 10-11 nm. Characterization of the products by Fourier transform infrared spectroscopy demonstrated that composite chitosan-magnetic nanoparticles were obtained, and the chitosan content of the nanocomposites was estimated by thermogravimetric analysis. The nanocomposites were tested for Pb2+ removal from a PbCl2 aqueous solution, showing a removal efficacy of up to 53.6%. This work provides a simple method for obtaining chitosan-coated nanoparticles, which could be useful for the removal of heavy metal ions from water.

  19. Development and content validation of the information assessment method for patients and consumers.

    Science.gov (United States)

    Pluye, Pierre; Granikov, Vera; Bartlett, Gillian; Grad, Roland M; Tang, David L; Johnson-Lafleur, Janique; Shulha, Michael; Barbosa Galvão, Maria Cristiane; Ricarte, Ivan Lm; Stephenson, Randolph; Shohet, Linda; Hutsul, Jo-Anne; Repchinsky, Carol A; Rosenberg, Ellen; Burnand, Bernard; Légaré, France; Dunikowski, Lynn; Murray, Susan; Boruff, Jill; Frati, Francesca; Kloda, Lorie; Macaulay, Ann; Lagarde, François; Doray, Geneviève

    2014-02-18

    Online consumer health information addresses health problems, self-care, disease prevention, and health care services and is intended for the general public. Using this information, people can improve their knowledge, participation in health decision-making, and health. However, there are no comprehensive instruments to evaluate the value of health information from a consumer perspective. We collaborated with information providers to develop and validate the Information Assessment Method for all (IAM4all) that can be used to collect feedback from information consumers (including patients), and to enable a two-way knowledge translation between information providers and consumers. Content validation steps were followed to develop the IAM4all questionnaire. The first version was based on a theoretical framework from information science, a critical literature review and prior work. Then, 16 laypersons were interviewed on their experience with online health information and specifically their impression of the IAM4all questionnaire. Based on the summaries and interpretations of interviews, questionnaire items were revised, added, and excluded, thus creating the second version of the questionnaire. Subsequently, a panel of 12 information specialists and 8 health researchers participated in an online survey to rate each questionnaire item for relevance, clarity, representativeness, and specificity. The result of this expert panel contributed to the third, current, version of the questionnaire. The current version of the IAM4all questionnaire is structured by four levels of outcomes of information seeking/receiving: situational relevance, cognitive impact, information use, and health benefits. Following the interviews and the expert panel survey, 9 questionnaire items were confirmed as relevant, clear, representative, and specific. To improve readability and accessibility for users with a lower level of literacy, 19 items were reworded and all inconsistencies in using a

  20. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725‐4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of an analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
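The tolerance intervals at the heart of an accuracy profile can be illustrated with a deliberately simplified computation. The sketch below assumes a single series of replicates and a normal model, so it uses only a Student t quantile on the relative errors; the full ISO 5725-4 treatment described in the record pools within-series and between-series variance components and is more involved.

```python
import math
from statistics import mean, stdev
from scipy import stats

def accuracy_profile_point(measured, true_value, beta=0.80):
    """Relative bias (%) and a simplified beta-expectation tolerance
    interval at one concentration level (single-series normal model)."""
    n = len(measured)
    rel = [100.0 * (m - true_value) / true_value for m in measured]
    bias = mean(rel)                       # trueness component
    s = stdev(rel)                         # precision component
    t = stats.t.ppf((1 + beta) / 2, n - 1)
    half = t * s * math.sqrt(1 + 1 / n)    # expectation interval half-width
    return bias, (bias - half, bias + half)
```

Repeating this at each level of the validated range and checking that every interval lies inside the acceptability limits (e.g. ±5%) reproduces, in miniature, the validity-domain decision the record describes.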

  1. Accuracy of Single-Step versus 2-Step Double-Mix Impression Technique

    DEFF Research Database (Denmark)

    Franco, Eduardo Batista; da Cunha, Leonardo Fernandes; Herrera, Francyle Simões

    2011-01-01

    Objective. To investigate the accuracy of dies obtained from single-step and 2-step double-mix impressions. Material and Methods. Impressions (n = 10) of a stainless steel die simulating a complete crown preparation were performed using a polyether (Impregum Soft Heavy and Light body) and a vinyl...

  2. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years), were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by the manual method and by the computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was assessed by the intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record podometric indices using Photoshop CS5 software have been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  3. Method Development and Validation for the Determination of Caffeine: An Alkaloid from Coffea arabica by High-performance Liquid Chromatography Method.

    Science.gov (United States)

    Naveen, P; Lingaraju, H B; Deepak, M; Medhini, B; Prasad, K Shyam

    2018-01-01

    The present study was undertaken to develop and validate a reversed-phase high-performance liquid chromatography method for the determination of caffeine from bean material of Coffea arabica. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of water:methanol (50:50) at a flow rate of 1.0 mL min-1. Detection was carried out with a UV detector at 272 nm. The developed method was validated according to the requirements of the International Conference on Harmonisation (ICH) guidelines, which include specificity, linearity, precision, accuracy, limit of detection and limit of quantitation. The developed method showed good linearity with an excellent correlation coefficient (R2 > 0.999). In repeatability and intermediate precision, the percentage relative standard deviation (% RSD) of peak area was less than 1%, showing the high precision of the method. The recovery rate for caffeine was within 98.78%-101.28%, indicating the high accuracy of the method. The low limit of detection and limit of quantitation enable the detection and quantitation of caffeine from C. arabica at low concentrations. The developed HPLC method is simple, rapid, precise, accurate and widely applicable, and it is recommended for efficient assays in routine work. A simple, accurate, and sensitive high-performance liquid chromatography (HPLC) method for caffeine from Coffea arabica has thus been developed and validated for linearity, specificity, precision, recovery, limit of detection, and limit of quantification according to the International Conference on Harmonisation guidelines; the results revealed that the proposed method is highly reliable and could be successfully applied in routine quality work analysis. Abbreviations used: C. arabica: Coffea arabica; ICH: International Conference on Harmonisation; % RSD: Percentage Relative Standard Deviation; R2: Correlation Coefficient; ppm: Parts per million; LOD: Limits of Detection
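Two of the ICH acceptance checks quoted in the record, linearity (R2 > 0.999) and precision (%RSD of peak area < 1%), are simple computations. The sketch below shows both on made-up calibration and replicate-injection data; only the acceptance thresholds come from the abstract.

```python
import numpy as np

def validate_linearity(conc, area):
    """Least-squares fit of peak area vs. concentration;
    returns slope, intercept and coefficient of determination R^2."""
    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * np.asarray(conc) + intercept
    ss_res = ((np.asarray(area) - pred) ** 2).sum()
    ss_tot = ((np.asarray(area) - np.mean(area)) ** 2).sum()
    return slope, intercept, 1 - ss_res / ss_tot

def percent_rsd(values):
    """%RSD of replicate injections (acceptance here: < 1%)."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()
```

For example, five calibration standards falling exactly on a line give R2 = 1, and five replicate areas of 99-101 give a %RSD of about 0.7%, which would pass the stated criterion.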

  4. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
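The POD idea, detection probability as a continuous function of concentration, can be sketched with a simple parametric curve. The log-logistic form below is my assumption for illustration; the harmonized POD model in the validation literature has its own parameterization and interlaboratory variance terms.

```python
def pod_curve(c, c50, slope):
    """Illustrative log-logistic probability-of-detection model:
    POD rises from 0 to 1 with concentration c; c50 gives 50% detection."""
    if c <= 0:
        return 0.0
    return 1.0 / (1.0 + (c50 / c) ** slope)

def pod_estimate(detected, total):
    """Point estimate of POD at one level from collaborative-study counts."""
    return detected / total
```

Plotting `pod_estimate` values from a collaborative study against concentration, with a fitted curve like `pod_curve`, gives the graphical response curve the record describes, and comparing curves for candidate and reference methods gives the method-comparison tool.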

  5. A validated bioanalytical HPLC method for pharmacokinetic evaluation of 2-deoxyglucose in human plasma.

    Science.gov (United States)

    Gounder, Murugesan K; Lin, Hongxia; Stein, Mark; Goodin, Susan; Bertino, Joseph R; Kong, Ah-Ng Tony; DiPaola, Robert S

    2012-05-01

    2-Deoxyglucose (2-DG), an analog of glucose, is widely used to interfere with glycolysis in tumor cells and studied as a therapeutic approach in clinical trials. To evaluate the pharmacokinetics of 2-DG, we describe the development and validation of a sensitive HPLC fluorescence method for the quantitation of 2-DG in plasma. Plasma samples were deproteinized with methanol and the supernatant was dried at 45°C. The residues were dissolved in methanolic sodium acetate-boric acid solution. 2-DG and other monosaccharides were derivatized to 2-aminobenzoic acid derivatives in a single step in the presence of sodium cyanoborohydride at 80°C for 45 min. The analytes were separated on a YMC ODS C₁₈ reversed-phase column using gradient elution. The excitation and emission wavelengths were set at 360 and 425 nm. The 2-DG calibration curves were linear over the range of 0.63-300 µg/mL with a limit of detection of 0.5 µg/mL. The assay provided satisfactory intra-day and inter-day precision with RSD less than 9.8%, and the accuracy ranged from 86.8 to 110.0%. The HPLC method is reproducible and suitable for the quantitation of 2-DG in plasma, and it was successfully applied to characterize the pharmacokinetic profile of 2-DG in patients with advanced solid tumors. Copyright © 2011 John Wiley & Sons, Ltd.
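The calibration-curve figures in the record (linear range, limit of detection) can be estimated from a straight-line fit. The sketch below uses the ICH-style residual-standard-deviation approach (LOD = 3.3σ/slope, LOQ = 10σ/slope) on invented data; the paper's LOD may well have been determined differently, e.g. from signal-to-noise.

```python
import numpy as np

def calibration_fit(conc, signal):
    """Linear calibration with LOD/LOQ from the residual standard deviation
    (ICH Q2 style: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope)."""
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = np.asarray(signal) - (slope * np.asarray(conc) + intercept)
    s_y = resid.std(ddof=2)            # residual SD about the regression
    lod = 3.3 * s_y / slope
    loq = 10.0 * s_y / slope
    return slope, intercept, lod, loq
```

Back-calculating each standard through the fitted line and comparing with its nominal value then yields the accuracy figures (here 86.8-110.0%) that validation reports quote.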

  6. Step responses of a torsional system with multiple clearances: Study of vibro-impact phenomenon using experimental and computational methods

    Science.gov (United States)

    Oruganti, Pradeep Sharma; Krak, Michael D.; Singh, Rajendra

    2018-01-01

    Recently, Krak and Singh (2017) proposed a scientific experiment that examined vibro-impacts in a torsional system under a step-down excitation and provided preliminary measurements and limited non-linear model studies. A major goal of this article is to extend the prior work with a focus on the examination of vibro-impact phenomena observed under step responses in a torsional system with one, two or three controlled clearances. First, new measurements are made at several locations with a higher sampling frequency. Measured angular accelerations are examined in both the time and time-frequency domains. Minimal-order non-linear models of the experiment are successfully constructed using piecewise-linear stiffness and Coulomb friction elements; eight cases of the generic system are examined, though only three are experimentally studied. Measured and predicted responses for single and dual clearance configurations exhibit double-sided impacts, and time-varying periods suggest softening trends under the step-down torque. The non-linear models are experimentally validated by comparing results with the new measurements and with those previously reported. Several metrics are utilized to quantify and compare the measured and predicted responses (including peak-to-peak accelerations). Eigensolutions and step responses of the corresponding linearized models are utilized to better understand the nature of the non-linear dynamic system. Finally, the effect of step amplitude on the non-linear responses is examined for several configurations, and hardening trends are observed in the torsional system with three clearances.
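The piecewise-linear clearance element the record mentions is easy to demonstrate on a single-degree-of-freedom torsional oscillator. The model below is a minimal sketch with arbitrary parameters, one clearance and viscous damping only; the paper's models additionally include Coulomb friction and up to three clearances.

```python
import numpy as np
from scipy.integrate import solve_ivp

def clearance_torque(theta, k, delta):
    """Piecewise-linear restoring torque: a dead zone of half-width delta
    (the clearance), linear stiffness k once contact is made."""
    if theta > delta:
        return k * (theta - delta)
    if theta < -delta:
        return k * (theta + delta)
    return 0.0

def step_response(J=1.0, c=1.0, k=50.0, delta=0.1, T0=2.0, t_end=5.0):
    """Single-DOF torsional oscillator with one clearance under a step torque T0."""
    def rhs(t, y):
        theta, omega = y
        return [omega, (T0 - c * omega - clearance_torque(theta, k, delta)) / J]
    sol = solve_ivp(rhs, (0, t_end), [0.0, 0.0], max_step=1e-3)
    return sol.t, sol.y[0]
```

During the transient the response repeatedly re-enters and leaves the dead zone, so the effective stiffness, and hence the oscillation period, varies with amplitude; this is the softening behaviour the measurements suggest.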

  7. Validating the CORE-10 as a mental health screening tool for prisoners

    OpenAIRE

    Lewis, Gwen

    2016-01-01

    Background: Few mental health screening tools have been validated with prisoners, and existing tools do not assess severity of need in line with contemporary stepped-care service models. Aims: The current research aims to assess the CORE-10's psychometric reliability, validity and predictive accuracy as a screening tool for common (primary care) and severe (secondary care) mental health problems in prisoners. Method: Cross-sectional study of 150 prisoners. All participants co...

  8. Rapid one-step selection method for generating nucleic acid aptamers: development of a DNA aptamer against α-bungarotoxin.

    Directory of Open Access Journals (Sweden)

    Lasse H Lauridsen

    Full Text Available BACKGROUND: Nucleic acid based therapeutic approaches have gained significant interest in recent years towards the development of therapeutics against many diseases. Recently, research on aptamers led to the marketing of Macugen®, an inhibitor of vascular endothelial growth factor (VEGF) for the treatment of age-related macular degeneration (AMD). Aptamer technology may prove useful as a therapeutic alternative against an array of human maladies. Considering the increased global interest in aptamer technology, which rivals antibody-mediated therapeutic approaches, a simplified selection technique, possibly one-step, is required for developing aptamers in a limited time period. PRINCIPAL FINDINGS: Herein, we present a simple one-step selection of DNA aptamers against α-bungarotoxin. A toxin-immobilized glass coverslip was subjected to nucleic acid pool binding and extensive washing, followed by PCR enrichment of the selected aptamers. One round of selection successfully identified a DNA aptamer sequence with a binding affinity of 7.58 µM. CONCLUSION: We have demonstrated a one-step method for rapid production of nucleic acid aptamers. Although the reported binding affinity is in the low micromolar range, we believe that this could be further improved by using larger targets, increasing the stringency of selection, and combining a capillary electrophoresis separation prior to the one-step selection. Furthermore, the method presented here is a user-friendly, cheap and easy way of deriving an aptamer, unlike the time-consuming conventional SELEX-based approach. The most important application of this method is that chemically modified nucleic acid libraries can also be used for aptamer selection, as it requires only one enzymatic step. The method could be equally suitable for developing RNA aptamers.

  9. Two-step reconstruction method using global optimization and conjugate gradient for ultrasound-guided diffuse optical tomography.

    Science.gov (United States)

    Tavakoli, Behnoosh; Zhu, Quing

    2013-01-01

    Ultrasound-guided diffuse optical tomography (DOT) is a promising method for characterizing malignant and benign lesions in the female breast. We introduce a new two-step algorithm for DOT inversion in which the optical parameters are estimated with a global optimization method, the genetic algorithm. The estimation result is applied as an initial guess to the conjugate gradient (CG) optimization method to obtain the absorption and scattering distributions simultaneously. Simulations and phantom experiments have shown that the maximum absorption and reduced scattering coefficients are reconstructed with less than 10% and 25% errors, respectively. This is in contrast with the CG method alone, which generates about 20% error for the absorption coefficient and does not accurately recover the scattering distribution. A new measure of scattering contrast has been introduced to characterize benign and malignant breast lesions. The results of 16 clinical cases reconstructed with the two-step method demonstrate that, on average, the absorption coefficient and scattering contrast of malignant lesions are about 1.8 and 3.32 times higher than in the benign cases, respectively.
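The two-step structure, a global search to escape local minima followed by gradient-based refinement, generalizes beyond DOT. The sketch below illustrates it on a toy objective with local minima: random sampling stands in for the genetic algorithm, and SciPy's conjugate-gradient minimizer plays the role of the CG step; none of this is the authors' actual forward model or data-fit term.

```python
import numpy as np
from scipy.optimize import minimize

def objective(p):
    """Toy misfit with local minima, standing in for the DOT data-fit term."""
    x, y = p
    return (x - 1.5) ** 2 + (y + 0.5) ** 2 + 0.3 * np.sin(5 * x) ** 2

def two_step_inversion(n_candidates=200, seed=1):
    rng = np.random.default_rng(seed)
    # Step 1: global search (random sampling stands in for the genetic algorithm)
    candidates = rng.uniform(-3, 3, size=(n_candidates, 2))
    best = min(candidates, key=objective)
    # Step 2: local refinement by conjugate gradient from the global guess
    res = minimize(objective, best, method="CG")
    return res.x, res.fun
```

Started from a poor initial guess, CG alone can stall in one of the sine-induced local minima; seeding it from the global step avoids that, which mirrors the paper's motivation for the two-step scheme.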

  10. Validation of ultraviolet method to determine serum phosphorus level

    International Nuclear Information System (INIS)

    Garcia Borges, Lisandra; Perez Prieto, Teresa Maria; Valdes Diez, Lilliam

    2009-01-01

    Validation of a spectrophotometric method applicable in clinical laboratories was carried out for the analytical assessment of serum phosphate, using a UV-phosphorus kit of domestic production from the 'Carlos J. Finlay' Biologics Production Enterprise (Havana, Cuba). The analytical method is based on the reaction of phosphorus with ammonium molybdate at acidic pH, forming a complex measurable at 340 nm. Specificity and precision were assessed, along with the method's robustness, linearity, accuracy and sensitivity. The analytical method was linear up to 4.8 mmol/L and precise (CV 30 .999) over the concentration interval of clinical interest, where there were no interferences from the matrix. The detection limit was 0.037 mmol/L and the quantification limit 0.13 mmol/L; both were satisfactory for the product's use
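
    Linearity, detection limit (LOD) and quantification limit (LOQ) of a spectrophotometric assay are commonly derived from a calibration line using ICH-style formulas, LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The calibration data below are hypothetical, not the kit's actual figures.

```python
import numpy as np

# Hypothetical calibration: absorbance at 340 nm vs phosphate (mmol/L)
conc = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 4.8])
absorbance = np.array([0.052, 0.101, 0.205, 0.298, 0.404, 0.483])

# Least-squares line and residual standard deviation
slope, intercept = np.polyfit(conc, absorbance, 1)
resid = absorbance - (slope * conc + intercept)
sigma = resid.std(ddof=2)  # 2 fitted parameters

# ICH-style detection and quantification limits
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r = np.corrcoef(conc, absorbance)[0, 1]
print(f"r = {r:.4f}, LOD = {lod:.3f} mmol/L, LOQ = {loq:.3f} mmol/L")
```

    Other routes to σ exist (e.g. the standard deviation of blank responses); the residual-based estimate shown here is one common choice.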

  11. Validation of battery-alternator model against experimental data - a first step towards developing a future power supply system

    Energy Technology Data Exchange (ETDEWEB)

    Boulos, A.M.; Burnham, K.J.; Mahtani, J.L. [Coventry University (United Kingdom). Control Theory and Applications Centre; Pacaud, C. [Jaguar Cars Ltd., Coventry (United Kingdom). Engineering Centre

    2004-01-01

    The electric power system of a modern vehicle has to supply enough electrical energy to drive numerous electrical and electronic systems and components. The electric power system of a vehicle consists of two major components: an alternator and a battery. A detailed understanding of the characteristics of the electric power system, electrical load demands and the operating environment, such as road conditions and vehicle laden weight, is required when the capacities of the generator and the battery are to be determined for a vehicle. In this study, a battery-alternator system has been developed and simulated in MATLAB/Simulink, and data obtained from vehicle tests have been used as a basis for validating the models. This is considered to be a necessary first step in the design and development of a new 42 V power supply system. (author)

  12. Validation of New Cancer Biomarkers

    DEFF Research Database (Denmark)

    Duffy, Michael J; Sturgeon, Catherine M; Söletormos, Georg

    2015-01-01

    BACKGROUND: Biomarkers are playing increasingly important roles in the detection and management of patients with cancer. Despite an enormous number of publications on cancer biomarkers, few of these biomarkers are in widespread clinical use. CONTENT: In this review, we discuss the key steps in advancing a newly discovered cancer candidate biomarker from pilot studies to clinical application. Four main steps are necessary for a biomarker to reach the clinic: analytical validation of the biomarker assay, clinical validation of the biomarker test, demonstration of clinical value from performance of the biomarker test, and regulatory approval. In addition to these 4 steps, all biomarker studies should be reported in a detailed and transparent manner, using previously published checklists and guidelines. Finally, all biomarker studies relating to demonstration of clinical value should be registered before...

  13. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always related to an experimental case. Empirical validation has a residual sense, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model, and can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated. Sensitivity analysis: it can be performed with DSA, differential sensitivity analysis, or with MCSA, Monte-Carlo sensitivity analysis. Search for the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis is performed in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT, Spain. (Author) 17 refs
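
    The Monte-Carlo sensitivity analysis (MCSA) step can be sketched as: sample the uncertain inputs over their ranges, run the model for each sample, and rank the inputs by their correlation with the output. The "building model" below is a deliberately trivial stand-in for a detailed thermal simulation; the input names, ranges and coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in "simulation model": indoor temperature as a toy function of
# wall U-value and solar gain (the real case is a detailed building model).
def model(u_value, solar_gain):
    return 20.0 - 8.0 * u_value + 3.0 * solar_gain

# Monte-Carlo sensitivity analysis: sample inputs over their uncertainty
# ranges and rank them by (absolute) correlation with the output.
n = 2000
u = rng.uniform(0.2, 1.2, n)
g = rng.uniform(0.0, 1.0, n)
out = model(u, g)

sens = {name: abs(np.corrcoef(x, out)[0, 1])
        for name, x in [("U-value", u), ("solar gain", g)]}
print(sens)  # with these coefficients the U-value dominates
```

    For nonlinear models, rank correlations or variance-based indices are often preferred over the simple linear correlation used here.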

  14. The STEP database through the end-users eyes--USABILITY STUDY.

    Science.gov (United States)

    Salunke, Smita; Tuleu, Catherine

    2015-08-15

    The user-designed database of Safety and Toxicity of Excipients for Paediatrics ("STEP") was created to address the shared need of the drug development community to access relevant information on excipients effortlessly. Usability testing was performed to determine whether the database satisfies the needs of the end-users. An evaluation framework was developed to assess usability. The participants performed scenario-based tasks and provided feedback and post-session usability ratings. Failure Mode Effect Analysis (FMEA) was performed to prioritize the problems and improvements to the STEP database design and functionalities. The study revealed several design vulnerabilities. Tasks such as limiting the results, running complex queries, locating data and registering to access the database were challenging. The three critical attributes identified as having an impact on the usability of the STEP database were (1) content and presentation, (2) navigation and search features, and (3) potential end-users. The evaluation framework proved to be an effective method for evaluating database effectiveness and user satisfaction. This study provides strong initial support for the usability of the STEP database. Recommendations will be incorporated into the refinement of the database to improve its usability and increase user participation towards its advancement. Copyright © 2015 Elsevier B.V. All rights reserved.
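
    FMEA prioritization ranks each problem by its Risk Priority Number, RPN = severity × occurrence × detection. A minimal sketch using the problem areas named in the abstract, with illustrative 1-10 ratings that are not the study's actual scores:

```python
# Minimal FMEA prioritization: rank usability problems by
# Risk Priority Number (RPN = severity x occurrence x detection).
problems = [
    {"issue": "limiting search results", "sev": 7, "occ": 6, "det": 4},
    {"issue": "running complex queries", "sev": 8, "occ": 5, "det": 5},
    {"issue": "locating data", "sev": 6, "occ": 7, "det": 3},
    {"issue": "registering for access", "sev": 5, "occ": 4, "det": 2},
]

for p in problems:
    p["rpn"] = p["sev"] * p["occ"] * p["det"]

# Highest RPN first: these problems get fixed first
ranked = sorted(problems, key=lambda p: p["rpn"], reverse=True)
for p in ranked:
    print(f'{p["rpn"]:4d}  {p["issue"]}')
```

    The three rating scales are ordinal judgments, so RPN rankings are best treated as a triage aid rather than a precise risk measure.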

  15. Validation of geotechnical software for repository performance assessment

    International Nuclear Information System (INIS)

    LeGore, T.; Hoover, J.D.; Khaleel, R.; Thornton, E.C.; Anantatmula, R.P.; Lanigan, D.C.

    1989-01-01

    An important step in the characterization of a high-level nuclear waste repository is to demonstrate that the geotechnical software used in performance assessment correctly models the physical processes involved; this is one type of validation. There is another type of validation, called software validation. It is based on meeting the requirements of specification documents (e.g., IEEE specifications) and does not directly address the correctness of the specifications. The process of comparing physical experimental results with predicted results should incorporate an objective measure of the level of confidence regarding correctness. This paper reports on a methodology developed that allows the experimental uncertainties to be explicitly included in the comparison process. The methodology also allows objective confidence levels to be associated with the software. In the event of a poor comparison, the method also lays the foundation for improving the software

  16. High-resolution wave-theory-based ultrasound reflection imaging using the split-step fourier and globally optimized fourier finite-difference methods

    Science.gov (United States)

    Huang, Lianjie

    2013-10-29

    Methods for enhancing ultrasonic reflection imaging are taught utilizing a split-step Fourier propagator in which the reconstruction is based on recursive inward continuation of ultrasonic wavefields in the frequency-space and frequency-wave number domains. The inward continuation within each extrapolation interval consists of two steps. In the first step, a phase-shift term is applied to the data in the frequency-wave number domain for propagation in a reference medium. The second step consists of applying another phase-shift term to data in the frequency-space domain to approximately compensate for ultrasonic scattering effects of heterogeneities within the tissue being imaged (e.g., breast tissue). Results from various data input to the method indicate significant improvements are provided in both image quality and resolution.
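
    The two-step extrapolation described above can be sketched for a scalar 1-D field: a free-space phase shift applied in the wavenumber domain, followed by a spatial phase screen accounting for heterogeneities (taken as the identity here, i.e. a homogeneous medium). The grid, wavelength and beam parameters below are arbitrary illustrative choices, not values from the patent.

```python
import numpy as np

# Minimal 1-D split-step Fourier propagator (paraxial scalar field).
nx, dx = 1024, 0.5e-6
wavelength = 0.5e-6
k0 = 2 * np.pi / wavelength
x = (np.arange(nx) - nx // 2) * dx
kx = 2 * np.pi * np.fft.fftfreq(nx, dx)

w0 = 10e-6                           # 1/e field radius of the input Gaussian
field = np.exp(-(x / w0) ** 2)
intensity = np.abs(field) ** 2
width0 = np.sqrt(np.sum(x**2 * intensity) / np.sum(intensity))

dz, nsteps = 5e-6, 40
phase_screen = np.zeros(nx)          # heterogeneity phase; identity here

for _ in range(nsteps):
    # Step 1: phase shift in the frequency-wavenumber domain (reference medium)
    field = np.fft.ifft(np.fft.fft(field) * np.exp(-1j * kx**2 / (2 * k0) * dz))
    # Step 2: phase shift in the frequency-space domain (scattering compensation)
    field *= np.exp(1j * phase_screen)

intensity = np.abs(field) ** 2
width = np.sqrt(np.sum(x**2 * intensity) / np.sum(intensity))
print(f"rms width: {width0*1e6:.2f} -> {width*1e6:.2f} um (diffractive spreading)")
```

    With a homogeneous medium the scheme reduces to free-space diffraction, so the beam's rms width should grow with distance, which is an easy sanity check on the propagator.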

  17. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    Science.gov (United States)

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
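
    Lawshe's content validity ratio and the exact-binomial recalculation of its critical values can be written directly. For a panel of N experts of whom n_e rate an item "essential", CVR = (n_e − N/2)/(N/2); the critical n_e is the smallest count whose one-tailed tail probability under chance agreement (p = 0.5) falls below α = .05. The sketch follows this recalculation approach; it reproduces values such as CVR = .75 for N = 8 and CVR = .60 for N = 15.

```python
from math import comb

# Lawshe's content validity ratio for a panel of n_panel experts,
# n_essential of whom rate the item "essential".
def cvr(n_essential, n_panel):
    return (n_essential - n_panel / 2) / (n_panel / 2)

# Exact one-tailed binomial critical value (alpha = .05, p = .5):
# smallest n_e such that P(X >= n_e) < alpha under chance agreement.
def critical_n(n_panel, alpha=0.05):
    for n_e in range(n_panel + 1):
        tail = sum(comb(n_panel, k) for k in range(n_e, n_panel + 1)) / 2**n_panel
        if tail < alpha:
            return n_e
    return None

for n in (8, 10, 15, 20):
    ne = critical_n(n)
    print(f"N = {n:2d}: critical n_e = {ne}, critical CVR = {cvr(ne, n):.3f}")
```

    Because the binomial is discrete, the achieved significance level varies with N, which is why the critical CVR does not decrease smoothly.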

  18. Single-step electrochemical method for producing very sharp Au scanning tunneling microscopy tips

    International Nuclear Information System (INIS)

    Gingery, David; Buehlmann, Philippe

    2007-01-01

    A single-step electrochemical method for making sharp gold scanning tunneling microscopy tips is described. 3.0M NaCl in 1% perchloric acid is compared to several previously reported etchants. The addition of perchloric acid to sodium chloride solutions drastically shortens etching times and is shown by transmission electron microscopy to produce very sharp tips with a mean radius of curvature of 15 nm

  19. In situ synthesis carbonated hydroxyapatite layers on enamel slices with acidic amino acids by a novel two-step method

    International Nuclear Information System (INIS)

    Wu, Xiaoguang; Zhao, Xu; Li, Yi; Yang, Tao; Yan, Xiujuan; Wang, Ke

    2015-01-01

    In situ fabrication of carbonated hydroxyapatite (CHA) remineralization layer on an enamel slice was completed in a novel, biomimetic two-step method. First, a CaCO3 layer was synthesized on the surface of demineralized enamel using an acidic amino acid (aspartic acid or glutamate acid) as a soft template. Second, at the same concentration of the acidic amino acid, rod-like carbonated hydroxyapatite was produced with the CaCO3 layer as a sacrificial template and a reactant. The morphology, crystallinity and other physicochemical properties of the crystals were characterized using field emission scanning electron microscopy (FESEM), Fourier transform infrared spectrometry (FTIR), X-ray diffraction (XRD) and energy-dispersive X-ray analysis (EDAX), respectively. Acidic amino acid could promote the uniform deposition of hydroxyapatite with rod-like crystals via absorption of phosphate and carbonate ions from the reaction solution. Moreover, compared with hydroxyapatite crystals coated on the enamel when synthesized by a one-step method, the CaCO3 coating that was synthesized in the first step acted as an active bridge layer and sacrificial template. It played a vital role in orienting the artificial coating layer through the template effect. The results show that the rod-like carbonated hydroxyapatite crystals grow into bundles, which are similar in size and appearance to prisms in human enamel, when using the two-step method with either aspartic acid or acidic glutamate (20.00 mmol/L). - Graphical abstract: FESEM images of enamel slices etched for 60 s and repaired by the two-step method with Glu concentration of 20.00 mmol/L. (A) The boundary (dotted line) of the repaired areas (b) and unrepaired areas (a). (Some selected areas of etched enamel slices were coated with a nail polish before the reaction, which was removed by acetone after the reaction); (B) high magnification image of Ga, (C) high magnification image of Gb. In situ fabrication of carbonated

  20. In situ synthesis carbonated hydroxyapatite layers on enamel slices with acidic amino acids by a novel two-step method

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Xiaoguang [Department of Pediatric Dentistry, The Hospital of Stomatology, Jilin University, Changchun 130021 (China); Zhao, Xu [College of Chemistry, Jilin University, Changchun 130021 (China); Li, Yi, E-mail: lyi99@jlu.edu.cn [Department of Pediatric Dentistry, The Hospital of Stomatology, Jilin University, Changchun 130021 (China); Yang, Tao [Department of Stomatology, Children' s Hospital of Changchun, 130051 (China); Yan, Xiujuan; Wang, Ke [Department of Pediatric Dentistry, The Hospital of Stomatology, Jilin University, Changchun 130021 (China)

    2015-09-01

    In situ fabrication of carbonated hydroxyapatite (CHA) remineralization layer on an enamel slice was completed in a novel, biomimetic two-step method. First, a CaCO{sub 3} layer was synthesized on the surface of demineralized enamel using an acidic amino acid (aspartic acid or glutamate acid) as a soft template. Second, at the same concentration of the acidic amino acid, rod-like carbonated hydroxyapatite was produced with the CaCO{sub 3} layer as a sacrificial template and a reactant. The morphology, crystallinity and other physicochemical properties of the crystals were characterized using field emission scanning electron microscopy (FESEM), Fourier transform infrared spectrometry (FTIR), X-ray diffraction (XRD) and energy-dispersive X-ray analysis (EDAX), respectively. Acidic amino acid could promote the uniform deposition of hydroxyapatite with rod-like crystals via absorption of phosphate and carbonate ions from the reaction solution. Moreover, compared with hydroxyapatite crystals coated on the enamel when synthesized by a one-step method, the CaCO{sub 3} coating that was synthesized in the first step acted as an active bridge layer and sacrificial template. It played a vital role in orienting the artificial coating layer through the template effect. The results show that the rod-like carbonated hydroxyapatite crystals grow into bundles, which are similar in size and appearance to prisms in human enamel, when using the two-step method with either aspartic acid or acidic glutamate (20.00 mmol/L). - Graphical abstract: FESEM images of enamel slices etched for 60 s and repaired by the two-step method with Glu concentration of 20.00 mmol/L. (A) The boundary (dotted line) of the repaired areas (b) and unrepaired areas (a). (Some selected areas of etched enamel slices were coated with a nail polish before the reaction, which was removed by acetone after the reaction); (B) high magnification image of Ga, (C) high magnification image of Gb. In situ fabrication of

  1. Microscale validation of 4-aminoantipyrine test method for quantifying phenolic compounds in microbial culture

    International Nuclear Information System (INIS)

    Justiz Mendoza, Ibrahin; Aguilera Rodriguez, Isabel; Perez Portuondo, Irasema

    2014-01-01

    Validation of test methods at microscale is currently of great importance due to their economic and environmental advantages, and constitutes a prerequisite for the performance of services and the quality assurance of the results provided to customers. This paper addresses the microscale validation of the 4-aminoantipyrine spectrophotometric method for the quantification of phenolic compounds in culture medium. The parameters linearity, precision, regression, accuracy, detection limit, quantification limit and robustness were evaluated, in addition to a comparison test with a non-standardized method for determining polyphenols (Folin-Ciocalteu). The results showed that both methods are feasible for determining phenols

  2. Two-step extraction method for lead isotope fractionation to reveal anthropogenic lead pollution.

    Science.gov (United States)

    Katahira, Kenshi; Moriwaki, Hiroshi; Kamura, Kazuo; Yamazaki, Hideo

    2018-05-28

    This study developed a two-step extraction method that eluted the Pb adsorbed on the surface of sediments into a first solution using aqua regia, and extracted the Pb absorbed inside particles into a second solution using a mixed acid of nitric acid, hydrofluoric acid and hydrogen peroxide. We applied the method to sediments in an enclosed water area and found that the isotope ratios of Pb in the second solution represented those of natural origin. This advantage makes it possible to distinguish Pb of natural origin from anthropogenic sources on the basis of the isotope ratios. The results showed that the method was useful for discussing Pb sources, and that the anthropogenic Pb in the analysed sediment samples was mainly derived from China via transboundary air pollution.
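
    Once the natural-origin isotope ratio is known from the second solution, the anthropogenic contribution to a measured sample can be estimated with a simple two-endmember mixing calculation. The ratios below are illustrative placeholders, not the paper's measurements, and linear mixing of isotope ratios is itself an approximation (strictly, mixing should be weighted by Pb abundance).

```python
# Two-endmember mixing sketch: estimate the anthropogenic fraction of Pb
# from a measured 206Pb/207Pb ratio, given endmember ratios for the
# natural background and an anthropogenic source.
r_natural = 1.20   # e.g., ratio of Pb locked inside mineral grains (second solution)
r_anthro = 1.16    # e.g., ratio of a pollution source
r_sample = 1.17    # measured surface-adsorbed Pb (first solution)

# Linear mixing: r_sample = f * r_anthro + (1 - f) * r_natural
f_anthro = (r_natural - r_sample) / (r_natural - r_anthro)
print(f"anthropogenic fraction: {f_anthro:.2f}")
```

    The calculation only discriminates sources whose endmember ratios are well separated relative to measurement uncertainty.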

  3. Development and validation of a multi-analyte method for the regulatory control of carotenoids used as feed additives in fish and poultry feed.

    Science.gov (United States)

    Vincent, Ursula; Serano, Federica; von Holst, Christoph

    2017-08-01

    Carotenoids are used in animal nutrition mainly as sensory additives that favourably affect the colour of fish, birds and food of animal origin. Various analytical methods exist for their quantification in compound feed, reflecting the different physico-chemical characteristics of the carotenoids and the corresponding feed additives, which may be natural products or specific formulations containing the target carotenoids produced by chemical synthesis. In this study a multi-analyte method was developed that can be applied to the determination of all 10 carotenoids currently authorised within the European Union for compound feedingstuffs. The method functions regardless of whether the carotenoids have been added to the compound feed via natural products or specific formulations. It comprises three steps: (1) digestion of the feed sample with an enzyme; (2) pressurised liquid extraction; and (3) quantification of the analytes by reversed-phase HPLC coupled to a photodiode array detector in the visible range. The method was single-laboratory validated for poultry and fish feed covering a mass fraction range of the target analytes from 2.5 to 300 mg kg⁻¹. The following method performance characteristics were obtained: the recovery rate varied from 82% to 129%, and precision, expressed as the relative standard deviation of intermediate precision, varied from 1.6% to 15%. Based on the acceptable performance obtained in the validation study, the multi-analyte method is considered fit for the intended purpose.

  4. Chemometric approach for development, optimization, and validation of different chromatographic methods for separation of opium alkaloids.

    Science.gov (United States)

    Acevska, J; Stefkov, G; Petkovska, R; Kulevanova, S; Dimitrovska, A

    2012-05-01

    The excessive and continuously growing interest in the simultaneous determination of poppy alkaloids calls for the development and optimization of convenient high-throughput methods for assessing the qualitative and quantitative profile of alkaloids in poppy straw. Systematic optimization of two chromatographic methods (gas chromatography (GC)/flame ionization detector (FID)/mass spectrometry (MS) and reversed-phase (RP)-high-performance liquid chromatography (HPLC)/diode array detector (DAD)) for the separation of alkaloids from Papaver somniferum L. (Papaveraceae) was carried out. The effects of various conditions on the predefined chromatographic descriptors were investigated using chemometrics. A full factorial linear design of experiments was used to determine the relationship between chromatographic conditions and the retention behavior of the analytes. A central composite circumscribed design was utilized for the final method optimization. By conducting the optimization of the methods in a very rational manner, a great deal of excessive and unproductive laboratory work was avoided. The developed chromatographic methods were validated and compared with respect to resolving power, sensitivity, accuracy, speed, cost, ecological aspects, and compatibility with the poppy straw extraction procedure. The separation of the opium alkaloids using the GC/FID/MS method was achieved within 10 min, avoiding any derivatization step. This method has stronger resolving power, shorter analysis time and a better cost/effectiveness factor than the RP-HPLC/DAD method, and is in line with the "green trend" of analysis. The RP-HPLC/DAD method, on the other hand, displayed better sensitivity for all tested alkaloids. The proposed methods provide both fast screening and an accurate content assessment of the six alkaloids in poppy samples obtained from the selection program of Papaver strains.
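
    The two experimental designs mentioned — a two-level full factorial for screening and a central composite circumscribed (CCC) design for final optimization — are straightforward to generate in code. The factor names below are illustrative, not the study's actual factors.

```python
from itertools import product

# Two-level full factorial design (coded levels -1/+1) for screening
# three hypothetical chromatographic factors.
factors = ["organic modifier %", "buffer pH", "column temperature"]
levels = [-1, 1]
full_factorial = list(product(levels, repeat=len(factors)))
print(len(full_factorial), "screening runs")   # 2^3 = 8

# Central composite circumscribed design for two significant factors:
# factorial core + axial (star) points at +/-alpha + a center point.
alpha = 2 ** 0.5   # rotatable alpha for k = 2 factors
ccc = (
    [(a, b) for a, b in product(levels, repeat=2)]   # factorial core
    + [(alpha, 0), (-alpha, 0), (0, alpha), (0, -alpha)]  # axial points
    + [(0, 0)]                                       # center point
)
print(len(ccc), "optimization runs")           # 4 + 4 + 1 = 9
```

    In practice the center point is replicated several times to estimate pure error; a single center run is shown here for brevity.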

  5. Validation of the method for investigation of radiopharmaceuticals for in vitro use

    International Nuclear Information System (INIS)

    Vranjes, S.; Jovanovic, M.; Orlic, M.; Lazic, E. (E-mail address of corresponding author: sanjav@vin.bg.ac.yu)

    2005-01-01

    The aim of this study was to validate an analytical method for the determination of the total radioactivity and radioactive concentration of 125I-triiodothyronine, a radiopharmaceutical for in vitro use. The analytical parameters selectivity, accuracy, linearity and range of this method were determined. The values obtained for all parameters are acceptable for analytical methods; therefore this method can be used for further investigation. (author)

  6. Ensuring the validity of calculated subcritical limits

    International Nuclear Information System (INIS)

    Clark, H.K.

    1977-01-01

    The care taken at the Savannah River Laboratory and Plant to ensure the validity of calculated subcritical limits is described. Close attention is given to ANSI N16.1-1975, ''Validation of Calculational Methods for Nuclear Criticality Safety.'' The computer codes used for criticality safety computations, which are listed and are briefly described, have been placed in the SRL JOSHUA system to facilitate calculation and to reduce input errors. A driver module, KOKO, simplifies and standardizes input and links the codes together in various ways. For any criticality safety evaluation, correlations of the calculational methods are made with experiment to establish bias. Occasionally subcritical experiments are performed expressly to provide benchmarks. Calculated subcritical limits contain an adequate but not excessive margin to allow for uncertainty in the bias. The final step in any criticality safety evaluation is the writing of a report describing the calculations and justifying the margin

  7. Two-Step Injection Method for Collecting Digital Evidence in Digital Forensics

    Directory of Open Access Journals (Sweden)

    Nana Rachmana Syambas

    2015-01-01

    In digital forensic investigations, investigators take digital evidence from computers, laptops or other electronic goods. There are many complications when a suspect or related person does not want to cooperate or has removed digital evidence. A lot of research has been done with the goal of retrieving data from flash memory or other digital storage media from which the content has been deleted. Unfortunately, such methods cannot guarantee that all data will be recovered. Most data can only be recovered partially and sometimes not perfectly, so that some or all files cannot be opened. This paper proposes the development of a new method for the retrieval of digital evidence called the Two-Step Injection method (TSI). It focuses on the prevention of the loss of digital evidence through the deletion of data by suspects or other parties. The advantage of this method is that the system works in secret and can be combined with other digital evidence applications that already exist, so that the accuracy and completeness of the resulting digital evidence can be improved. An experiment to test the effectiveness of the method was set up. The developed TSI system worked properly and had a 100% success rate.

  8. Validation method training: nurses' experiences and ratings of work climate.

    Science.gov (United States)

    Söderlund, Mona; Norberg, Astrid; Hansebo, Görel

    2014-03-01

    Training nursing staff in communication skills can impact on the quality of care for residents with dementia and contributes to nurses' job satisfaction. Changing attitudes and practices takes time and energy and can affect the entire nursing staff, not just the nurses directly involved in a training programme. Therefore, it seems important to study nurses' experiences of a training programme and any influence of the programme on work climate among the entire nursing staff. To explore nurses' experiences of a 1-year validation method training programme conducted in a nursing home for residents with dementia and to describe ratings of work climate before and after the programme. A mixed-methods approach. Twelve nurses participated in the training and were interviewed afterwards. These individual interviews were tape-recorded and transcribed, then analysed using qualitative content analysis. The Creative Climate Questionnaire was administered before (n = 53) and after (n = 56) the programme to the entire nursing staff in the participating nursing home wards and analysed with descriptive statistics. Analysis of the interviews resulted in four categories: being under extra strain, sharing experiences, improving confidence in care situations and feeling uncertain about continuing the validation method. The results of the questionnaire on work climate showed higher mean values in the assessment after the programme had ended. The training strengthened the participating nurses in caring for residents with dementia, but posed an extra strain on them. These nurses also described an extra strain on the entire nursing staff that was not reflected in the results from the questionnaire. The work climate at the nursing home wards might have made it easier to conduct this extensive training programme. Training in the validation method could develop nurses' communication skills and improve their handling of complex care situations. © 2013 Blackwell Publishing Ltd.

  9. Construct validity of the reporter-interpreter-manager-educator structure for assessing students' patient encounter skills

    DEFF Research Database (Denmark)

    Tolsgaard, Martin G.; Arendrup, Henrick; Lindhardt, Bjarne O.

    2012-01-01

    PURPOSE: The aim of this study, done in Denmark, was to explore the construct validity of a Reporter-Interpreter-Manager-Educator (RIME)-structured scoring format for assessing patient encounter skills. METHOD: The authors developed a RIME-structured scoring form and explored its construct validity in a two-step procedure. The first step (implemented in 2009) was a randomized, controlled, experimental study in which the performance of three groups (16 fourth-year medical students, 16 sixth-year medical students, and 16 interns) was assessed in two simulated patient encounters. The second step (carried out during 2009-2010) was an observational study of patient encounter skills where clinician examiners used the scoring form in end-of-clerkship oral examinations of three consecutive cohorts of a total of 547 fourth-year medical students. RESULTS: In the experimental study, RIME scores showed...

  10. Adaptive step-size algorithm for Fourier beam-propagation method with absorbing boundary layer of auto-determined width.

    Science.gov (United States)

    Learn, R; Feigenbaum, E

    2016-06-01

    Two algorithms that enhance the utility of the absorbing boundary layer are presented, mainly in the framework of the Fourier beam-propagation method. One is an automated boundary layer width selector that chooses a near-optimal boundary size based on the initial beam shape. The second algorithm adjusts the propagation step sizes based on the beam shape at the beginning of each step in order to reduce aliasing artifacts.
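
    A toy version of the first algorithm — choosing the absorbing boundary layer width from the initial beam shape — might look like the following heuristic: find where the field has decayed below a small fraction of its peak and devote the remaining margin of the grid to the absorber. The threshold and the cosine ramp are assumptions for illustration, not the paper's actual selection rule.

```python
import numpy as np

# Grid and a Gaussian input beam (illustrative parameters)
nx, dx = 512, 1.0
x = (np.arange(nx) - nx // 2) * dx
beam = np.exp(-(x / 40.0) ** 2)

# Heuristic: the "significant" field region is where the amplitude
# exceeds a small fraction of its peak; the rest becomes the absorber.
threshold = 1e-3
inside = np.abs(beam) > threshold * np.abs(beam).max()
edge = np.max(np.abs(x[inside]))    # half-width of significant field
layer = max(0.0, x.max() - edge)    # leftover margin per side
print(f"boundary layer width: {layer:.1f} grid units per side")

# Absorbing mask: unity in the interior, cosine-squared ramp to ~0
# across the boundary layer.
mask = np.ones(nx)
ramp = np.abs(x) > edge
mask[ramp] = np.cos(0.5 * np.pi * (np.abs(x[ramp]) - edge) / layer) ** 2
```

    In a propagation loop the field would be multiplied by `mask` after each step, so energy reaching the grid edge is attenuated instead of wrapping around in the FFT.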

  11. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

    We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9)

  12. The a3 problem solving report: a 10-step scientific method to execute performance improvements in an academic research vivarium.

    Science.gov (United States)

    Bassuk, James A; Washington, Ida M

    2013-01-01

    The purpose of this study was to illustrate the application of A3 Problem Solving Reports of the Toyota Production System to our research vivarium through the methodology of Continuous Performance Improvement, a lean approach to healthcare management at Seattle Children's (Hospital, Research Institute, Foundation). The Report format is described within the perspective of a 10-step scientific method designed to realize measurable improvements of Issues identified by the Report's Author, Sponsor and Coach. The 10-step method (Issue, Background, Current Condition, Goal, Root Cause, Target Condition, Countermeasures, Implementation Plan, Test, and Follow-up) was shown to align with Shewhart's Plan-Do-Check-Act process improvement cycle in a manner that allowed for quantitative analysis of the Countermeasures' outcomes and of Testing results. During fiscal year 2012, 9 A3 Problem Solving Reports were completed in the vivarium under the teaching and coaching system implemented by the Research Institute. Two of the 9 Reports are described herein. Report #1 addressed the issue of the vivarium's veterinarian not being able to provide input into sick-animal cases during the work day, while Report #7 tackled the lack of a standard for keeping track of weekend/holiday animal health inspections. In each Report, a measurable Goal that established the basis for improvement recognition was present. A Five Whys analysis identified the Root Cause for Report #1 as historical work patterns that existed before the veterinarian was hired, together with the fact that modern electronic communication tools had not been implemented. The same analysis identified the Root Cause for Report #7 as the fact that the vivarium had never standardized the process for weekend/holiday checks. Successful outcomes for both Reports were obtained and validated by robust audit plans. The collective data indicate that vivarium staff acquired a disciplined way of reporting on, as well as solving, problems in a manner consistent with high

  13. The method of quick satellite aiming with 3-Steps on the mobile satellite station

    Directory of Open Access Journals (Sweden)

    Sheng Liang

    2017-02-01

    This study analyses and summarizes the technology of satellite aiming during the real-time broadcast of mobile video. Based on practical exercises and users' requirements, we propose a quick three-step satellite-aiming method that suits practical conditions and standardized operation, and can improve efficiency and quality of service.

  14. Experimental validation of calculation methods for structures with shock non-linearity

    International Nuclear Information System (INIS)

    Brochard, D.; Buland, P.

    1987-01-01

    For the seismic analysis of non-linear structures, numerical methods have been developed which need to be validated against experimental results. The aim of this paper is to present the design method of a test program whose results will be used for this purpose. Some applications to nuclear components illustrate this presentation [fr

  15. Validation of innovative technologies and strategies for regulatory safety assessment methods: challenges and opportunities.

    Science.gov (United States)

    Stokes, William S; Wind, Marilyn

    2010-01-01

    Advances in science and innovative technologies are providing new opportunities to develop test methods and strategies that may improve safety assessments and reduce animal use for safety testing. These include high-throughput screening and other approaches that can rapidly measure or predict various molecular, genetic, and cellular perturbations caused by test substances. Integrated testing and decision strategies that consider multiple types of information and data are also being developed. Prior to their use for regulatory decision-making, new methods and strategies must undergo appropriate validation studies to determine the extent to which their use can provide equivalent or improved protection compared to existing methods, and the extent to which reproducible results can be obtained in different laboratories. Comprehensive and optimal validation study designs are expected to expedite the validation and regulatory acceptance of new test methods and strategies that will support improved safety assessments and reduced animal use in regulatory testing.
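    The between-laboratory reproducibility mentioned above is commonly quantified in collaborative trials by separating within-lab from between-lab variance. The sketch below is a minimal one-way ANOVA decomposition in the style of ISO 5725; the data, the balanced design, and the estimator choice are illustrative assumptions, not part of this article.

```python
import numpy as np

def repeatability_reproducibility(results):
    """Estimate repeatability (within-lab) and reproducibility
    (within- plus between-lab) standard deviations from a balanced
    inter-laboratory study via one-way ANOVA mean squares."""
    results = np.asarray(results, dtype=float)   # shape (labs, replicates)
    p, n = results.shape
    lab_means = results.mean(axis=1)
    grand_mean = results.mean()
    # Repeatability variance: pooled within-lab variance of replicates
    s_r2 = results.var(axis=1, ddof=1).mean()
    # Between-lab variance component from the ANOVA between mean square
    ms_between = n * ((lab_means - grand_mean) ** 2).sum() / (p - 1)
    s_L2 = max((ms_between - s_r2) / n, 0.0)
    s_R2 = s_r2 + s_L2                           # reproducibility variance
    return np.sqrt(s_r2), np.sqrt(s_R2)

# Hypothetical example: 4 labs x 3 replicates of the same test substance
data = [[10.1, 10.3, 10.2],
        [10.8, 10.9, 10.7],
        [ 9.9, 10.0, 10.1],
        [10.4, 10.6, 10.5]]
s_r, s_R = repeatability_reproducibility(data)
print(f"repeatability SD = {s_r:.3f}, reproducibility SD = {s_R:.3f}")
```

By construction the reproducibility SD can never fall below the repeatability SD; a large gap between the two flags laboratory-to-laboratory bias.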

  16. Used-habitat calibration plots: A new procedure for validating species distribution, resource selection, and step-selection models

    Science.gov (United States)

    Fieberg, John R.; Forester, James D.; Street, Garrett M.; Johnson, Douglas H.; ArchMiller, Althea A.; Matthiopoulos, Jason

    2018-01-01

    “Species distribution modeling” was recently ranked as one of the top five “research fronts” in ecology and the environmental sciences by ISI's Essential Science Indicators (Renner and Warton 2013), reflecting the importance of predicting how species distributions will respond to anthropogenic change. Unfortunately, species distribution models (SDMs) often perform poorly when applied to novel environments. Compounding this problem is the shortage of methods for evaluating SDMs (hence, we may be getting our predictions wrong and not even know it). Traditional methods for validating SDMs quantify a model's ability to classify locations as used or unused. Instead, we propose to focus on how well SDMs can predict the characteristics of used locations. This subtle shift in viewpoint leads to a more natural and informative evaluation and validation of models across the entire spectrum of SDMs. Through a series of examples, we show how simple graphical methods can help with three fundamental challenges of habitat modeling: identifying missing covariates, non-linearity, and multicollinearity. Identifying habitat characteristics that are not well-predicted by the model can provide insights into variables affecting the distribution of species, suggest appropriate model modifications, and ultimately improve the reliability and generality of conservation and management recommendations.
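    The calibration idea described here can be sketched numerically: simulate used locations from a fitted habitat-selection model and check whether a summary of the observed used habitat falls inside the simulation envelope. Everything below (the exponential selection weight, the single covariate, the coefficient and sample sizes) is a hypothetical illustration, not the authors' procedure or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one habitat covariate x over available locations,
# and a fitted resource-selection weight w(x) = exp(beta * x).
beta = 1.2
avail = rng.normal(0.0, 1.0, 5000)          # available locations
w = np.exp(beta * avail)
p = w / w.sum()
used = rng.choice(avail, size=300, p=p)     # "observed" used locations

# Used-habitat calibration idea: draw used samples from the fitted model
# and see whether the observed mean covariate sits in the 95% envelope.
sim_means = np.array([rng.choice(avail, size=300, p=p).mean()
                      for _ in range(500)])
lo, hi = np.quantile(sim_means, [0.025, 0.975])
ok = lo <= used.mean() <= hi
print(f"observed mean {used.mean():.2f}, envelope [{lo:.2f}, {hi:.2f}], ok={ok}")
```

An observed summary that falls outside the envelope would point to a missing covariate or a mis-specified functional form, which is exactly the diagnostic role the calibration plots play.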

  17. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. 
In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  18. Rotor cascade shape optimization with unsteady passing wakes using implicit dual time stepping method

    Science.gov (United States)

    Lee, Eun Seok

    2000-10-01

    Improved aerodynamic performance of a turbine cascade can be achieved through an understanding of the flow-field associated with the stator-rotor interaction. In this research, an axial gas turbine airfoil cascade shape is optimized for improved aerodynamic performance by using an unsteady Navier-Stokes solver and a parallel genetic algorithm. The objective of the research is twofold: (1) to develop a computational fluid dynamics code having a faster convergence rate and unsteady flow simulation capabilities, and (2) to optimize a turbine airfoil cascade shape with unsteady passing wakes for improved aerodynamic performance. The computer code solves the Reynolds-averaged Navier-Stokes equations. It is based on the explicit, finite difference, Runge-Kutta time marching scheme and the Diagonalized Alternating Direction Implicit (DADI) scheme, with Baldwin-Lomax algebraic and k-epsilon turbulence modeling. Improvements in the code focused on cascade shape design capability, convergence acceleration and unsteady formulation. First, the inverse shape design method was implemented in the code to provide the design capability, where a surface transpiration concept was employed as an inverse technique to modify the geometry to satisfy the user-specified pressure distribution on the airfoil surface. Second, an approximation-storage multigrid method was implemented as an acceleration technique. Third, the preconditioning method was adopted to speed up the convergence rate in solving low Mach number flows. Finally, the implicit dual time stepping method was incorporated in order to simulate unsteady flow-fields. For validation of the unsteady code, Stokes' second problem and Poiseuille flow were chosen, and the computed results were compared with analytic solutions.
To test the code's ability to capture the natural unsteady flow phenomena, vortex shedding past a cylinder and the shock oscillation over a bicircular airfoil were simulated and compared with

  19. AFM tip characterization by using FFT filtered images of step structures

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Yongda, E-mail: yanyongda@hit.edu.cn [Key Laboratory of Micro-systems and Micro-structures Manufacturing of Ministry of Education, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Center For Precision Engineering, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Xue, Bo [Key Laboratory of Micro-systems and Micro-structures Manufacturing of Ministry of Education, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Center For Precision Engineering, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Hu, Zhenjiang; Zhao, Xuesen [Center For Precision Engineering, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China)

    2016-01-15

    The measurement resolution of an atomic force microscope (AFM) is largely dependent on the radius of the tip. Meanwhile, when using AFM to study nanoscale surface properties, the value of the tip radius is needed in calculations. As such, estimation of the tip radius is important for analyzing results taken using an AFM. In this study, a geometrical model created by scanning a step structure with an AFM tip was developed. The tip was assumed to have a hemispherical cone shape. Profiles simulated by tips with different scanning radii were calculated by fast Fourier transform (FFT). By analyzing the influence of tip radius variation on the spectra of simulated profiles, it was found that low-frequency harmonics were more susceptible, and that the relationship between the tip radius and the low-frequency harmonic amplitude of the step structure varied monotonically. Based on this regularity, we developed a new method to characterize the radius of the hemispherical tip. The tip radii estimated with this approach were comparable to the results obtained using scanning electron microscope imaging and blind reconstruction methods. - Highlights: • The AFM tips with different radii were simulated to scan a nano-step structure. • The spectra of the simulation scans under different radii were analyzed. • The functions of tip radius and harmonic amplitude were used for evaluating tip. • The proposed method has been validated by SEM imaging and blind reconstruction.
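    The simulation idea in this record, scanning a step with a hemispherical tip and inspecting low-frequency FFT harmonics, can be sketched as a morphological dilation of a periodic step profile. The step geometry, tip model and harmonic index below are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

def scan_step(radius, height=50.0, n=1024, pitch=1.0):
    """Simulate an AFM scan of a periodic step with a hemispherical tip
    of the given radius, via grayscale morphological dilation."""
    x = np.arange(n) * pitch
    surface = np.where((x % 512) < 256, height, 0.0)   # two step periods
    r = int(np.ceil(radius))
    offs = np.arange(-r, r + 1)
    tip = np.sqrt(np.maximum(radius**2 - (offs * pitch)**2, 0.0)) - radius
    prof = np.full(n, -np.inf)
    for o, t in zip(offs, tip):                        # dilation by the tip
        prof = np.maximum(prof, np.roll(surface, -o) + t)
    return prof

# Amplitude of the low-frequency FFT harmonic at the step period,
# for increasing tip radii: the relationship should vary monotonically.
amps = []
for radius in (5, 10, 20, 40):
    spec = np.abs(np.fft.rfft(scan_step(radius))) / 1024
    amps.append(spec[2])   # bin 2 = fundamental of the 512-sample period
print(amps)
```

Inverting that monotone amplitude-radius relationship is what lets a measured spectrum be mapped back to an estimated tip radius.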

  20. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  1. Validation of commercially available automated canine-specific immunoturbidimetric method for measuring canine C-reactive protein

    DEFF Research Database (Denmark)

    Hillström, Anna; Hagman, Ragnvi; Tvedten, Harold

    2014-01-01

    BACKGROUND: Measurement of C-reactive protein (CRP) is used for diagnosing and monitoring systemic inflammatory disease in canine patients. An automated human immunoturbidimetric assay has been validated for measuring canine CRP, but cross-reactivity with canine CRP is unpredictable. OBJECTIVE: The purpose of the study was to validate a new automated canine-specific immunoturbidimetric CRP method (Gentian cCRP). METHODS: Studies of imprecision, accuracy, prozone effect, interference, limit of quantification, and stability under different storage conditions were performed. The new method was compared with a human CRP assay previously validated for canine CRP determination. Samples from 40 healthy dogs were analyzed to establish a reference interval. RESULTS: Total imprecision was

  2. Characteristic analysis of laser isotope separation process by two-step photodissociation method

    International Nuclear Information System (INIS)

    Okamoto, Tsuyoshi; Suzuki, Atsuyuki; Kiyose, Ryohei

    1981-01-01

    A large number of laser isotope separation experiments have been performed in many countries. In this paper, the selective two-step photodissociation method is chosen, and the simultaneous nonlinear differential equations that express the separation process are solved directly by computer. Predicted separation factors are investigated in relation to the incident pulse energy and the concentration of the desired molecules. Furthermore, the concept of separative work is used to evaluate the results of separation for this method. A numerical example shows that a very large separation factor can be obtained if the concentration of the desired molecules is lowered, and that closely synchronized laser pulses are not always required in operation for the photodissociation of molecules. (author)
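    The rate equations behind selective two-step photodissociation can be sketched as a small ODE model. The three-level scheme (ground, excited, dissociated), the rate constants, and the off-resonance excitation factor below are illustrative assumptions, not the paper's actual equations or parameters.

```python
import numpy as np

def rhs(y, k_exc, k_dis, k_rel):
    """Rate equations for a three-level scheme: ground state n0 is excited
    at rate k_exc (laser 1); the excited state n1 either dissociates at
    k_dis (laser 2) or relaxes back to the ground state at k_rel."""
    n0, n1, n2 = y
    return np.array([-k_exc * n0 + k_rel * n1,
                     k_exc * n0 - (k_dis + k_rel) * n1,
                     k_dis * n1])

def integrate(k_exc, k_dis, k_rel=0.1, t_end=5.0, dt=1e-3):
    y = np.array([1.0, 0.0, 0.0])       # all molecules start in the ground state
    for _ in range(int(t_end / dt)):    # classic fixed-step RK4
        k1 = rhs(y, k_exc, k_dis, k_rel)
        k2 = rhs(y + 0.5 * dt * k1, k_exc, k_dis, k_rel)
        k3 = rhs(y + 0.5 * dt * k2, k_exc, k_dis, k_rel)
        k4 = rhs(y + dt * k3, k_exc, k_dis, k_rel)
        y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

# Desired isotope sits on resonance (fast excitation); the other does not.
state_d = integrate(k_exc=2.0, k_dis=3.0)
state_u = integrate(k_exc=0.02, k_dis=3.0)
frac_d, frac_u = state_d[2], state_u[2]
alpha = (frac_d / (1 - frac_d)) / (frac_u / (1 - frac_u))
print(f"dissociated fractions {frac_d:.3f} vs {frac_u:.3f}; separation factor {alpha:.0f}")
```

The separation factor here is the ratio of dissociated-to-undissociated odds for the two species, mirroring how selectivity is quantified in such models.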

  3. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. Interrater reliability was analyzed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the validity evaluations. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the methods, and no systematic bias was observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.
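    The two statistics named here, the intraclass correlation coefficient and Bland-Altman limits of agreement, take only a few lines to compute. The sketch below uses the two-way random, absolute-agreement, single-measure form ICC(2,1) and invented rater scores; the study's data and its exact ICC variant are not reproduced.

```python
import numpy as np

def icc_2_1(x):
    """Two-way random, absolute-agreement, single-measure ICC(2,1)."""
    x = np.asarray(x, dtype=float)       # shape (subjects, raters)
    n, k = x.shape
    grand = x.mean()
    msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    sse = ((x - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented viability scores (% viable tissue) from two raters
rater1 = np.array([80.0, 65.0, 90.0, 70.0, 85.0, 60.0])
rater2 = np.array([81.0, 64.0, 90.0, 71.0, 84.0, 61.0])

icc = icc_2_1(np.column_stack([rater1, rater2]))

# Bland-Altman: mean difference (bias) and 95% limits of agreement
diffs = rater1 - rater2
bias = diffs.mean()
spread = 1.96 * diffs.std(ddof=1)
loa_low, loa_high = bias - spread, bias + spread
print(f"ICC(2,1) = {icc:.3f}, bias = {bias:.2f}, LoA = [{loa_low:.2f}, {loa_high:.2f}]")
```

A high ICC with a near-zero bias and narrow limits of agreement is the numerical pattern behind the "excellent reliability, no systematic bias" conclusion.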

  4. ReSOLV: Applying Cryptocurrency Blockchain Methods to Enable Global Cross-Platform Software License Validation

    Directory of Open Access Journals (Sweden)

    Alan Litchfield

    2018-05-01

    Full Text Available This paper presents a method for a decentralised peer-to-peer software license validation system using cryptocurrency blockchain technology to ameliorate software piracy, and to provide a mechanism for software developers to protect copyrighted works. Protecting software copyright has been an issue since the late 1970s and software license validation has been a primary method employed in an attempt to minimise software piracy and protect software copyright. The method described creates an ecosystem in which the rights and privileges of participants are observed.
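    The core idea, recording license keys in a hash-linked ledger and validating against it, can be sketched as a centralised toy. This is only an illustration of hash chaining with invented class and key names; the actual ReSOLV design is decentralised and peer-to-peer with consensus, which is not modelled here.

```python
import hashlib
import json

class LicenseChain:
    """Toy hash-linked ledger of software license records. Tampering
    with any recorded block breaks the chain of digests and is detected
    the next time a license is validated."""
    def __init__(self):
        self.blocks = [{"prev": "0" * 64, "license": "genesis"}]

    @staticmethod
    def _digest(block):
        raw = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(raw).hexdigest()

    def issue(self, license_key):
        block = {"prev": self._digest(self.blocks[-1]), "license": license_key}
        self.blocks.append(block)

    def is_valid(self, license_key):
        # Valid only if the whole chain is intact AND the key is recorded.
        intact = all(b["prev"] == self._digest(a)
                     for a, b in zip(self.blocks, self.blocks[1:]))
        return intact and any(b["license"] == license_key
                              for b in self.blocks[1:])

chain = LicenseChain()
chain.issue("ABCD-1234")
chain.issue("WXYZ-5678")
print(chain.is_valid("ABCD-1234"), chain.is_valid("EVIL-0000"))
```

A pirated key simply is not found on the ledger, and rewriting an existing block invalidates every subsequent digest, which is the property a blockchain-based validation scheme relies on.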

  5. STEPS: A NARRATIVE ACCOUNT OF A GABAPENTIN SEEDING TRIAL

    Science.gov (United States)

    Krumholz, Samuel D.; Egilman, David S.; Ross, Joseph S.

    2012-01-01

    Background Seeding trials, clinical studies conducted by pharmaceutical companies for marketing purposes, have rarely been described in detail. Methods We examined all documents relating to the clinical trial Study of Neurontin: Titrate to Effect, Profile of Safety (STEPS) produced during the Neurontin marketing, sales practices and product liability litigation, including company internal and external correspondence, reports, and presentations, as well as depositions elicited in legal proceedings of Harden Manufacturing v. Pfizer and Franklin v. Warner-Lambert, the majority of which were created between 1990 and 2009. Using a systematic search strategy, we identified and reviewed all documents related to the STEPS trial, in order to identify key themes related to the trial’s conduct and determine the extent of marketing involvement in its planning and implementation. Results Documents demonstrated that STEPS was a seeding trial posing as a legitimate scientific study. Documents consistently described the trial itself, not trial results, to be a marketing tactic in the company’s marketing plans. Documents demonstrated that several external sources questioned the validity of the study before execution, and that data quality during the study was often compromised. Furthermore, documents described company analyses examining the impact of participating as a STEPS investigator on rates and dosages of gabapentin prescribing, finding a positive association. None of these findings were reported in two published papers. Conclusions The STEPS trial was a seeding trial, used to promote gabapentin and increase prescribing among investigators, and marketing was extensively involved in its planning and implementation. PMID:21709111

  6. LACO-Wiki: A land cover validation tool and a new, innovative teaching resource for remote sensing and the geosciences

    Science.gov (United States)

    See, Linda; Perger, Christoph; Dresel, Christopher; Hofer, Martin; Weichselbaum, Juergen; Mondel, Thomas; Steffen, Fritz

    2016-04-01

    The validation of land cover products is an important step in the workflow of generating a land cover map from remotely-sensed imagery. Many students of remote sensing will be given exercises on classifying a land cover map, followed by the validation process. Many algorithms exist for classification, embedded within proprietary image-processing software or, increasingly, available as open source tools. However, there is little standardization for land cover validation, and no set of open tools is available for implementing this process. The LACO-Wiki tool was developed as a way of filling this gap, bringing together standardized land cover validation methods and workflows into a single portal. This includes the storage and management of land cover maps and validation data; step-by-step instructions to guide users through the validation process; sound sampling designs; an easy-to-use environment for validation sample interpretation; and the generation of accuracy reports based on the validation process. The tool was developed for a range of users, including producers of land cover maps, researchers, teachers and students. The use of such a tool could be embedded within the curriculum of remote sensing courses at university level, but it is simple enough for use by students aged 13-18. A beta version of the tool is available for testing at: http://www.laco-wiki.net.
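    The accuracy-report step of such a validation workflow typically reduces to a confusion matrix and the metrics derived from it. A sketch with invented validation samples follows (the function name and class labels are hypothetical; LACO-Wiki's own report format is not reproduced):

```python
import numpy as np

def accuracy_report(reference, predicted, classes):
    """Confusion matrix (rows = reference, cols = predicted) plus the
    standard land-cover accuracy metrics derived from it."""
    idx = {c: i for i, c in enumerate(classes)}
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for r, p in zip(reference, predicted):
        m[idx[r], idx[p]] += 1
    total = m.sum()
    overall = np.trace(m) / total
    producers = np.diag(m) / m.sum(axis=1)   # omission-error view
    users = np.diag(m) / m.sum(axis=0)       # commission-error view
    expected = (m.sum(axis=1) * m.sum(axis=0)).sum() / total**2
    kappa = (overall - expected) / (1 - expected)
    return m, overall, producers, users, kappa

ref  = ["forest", "forest", "water", "urban", "water", "forest", "urban", "water"]
pred = ["forest", "urban",  "water", "urban", "water", "forest", "urban", "forest"]
m, overall, prod, user, kappa = accuracy_report(ref, pred,
                                                ["forest", "water", "urban"])
print(m)
print(f"overall accuracy = {overall:.2f}, kappa = {kappa:.2f}")
```

Producer's accuracy reads along the reference rows (how much of the true class was found), user's accuracy along the predicted columns (how much of the mapped class is correct); kappa discounts chance agreement.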

  7. Preisach hysteresis implementation in reluctance network method, comparison with finite element method

    OpenAIRE

    Allag , Hicham; Kedous-Lebouc , Afef; Latreche , Mohamed E. H.

    2008-01-01

    International audience; In this work, an implementation of static magnetic hysteresis in the reluctance network method is presented and its effectiveness is demonstrated. This implementation is achieved by a succession of iterative steps in the form of algorithm explained and developed for simple examples. However it remains valid for any magnetic circuit. The results obtained are compared to those given by finite element method simulation and essentially the effect of relaxation is discussed...

  8. A new multi-step technique with differential transform method for analytical solution of some nonlinear variable delay differential equations.

    Science.gov (United States)

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2016-01-01

    This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. This method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it possesses a simple procedure based on a few straightforward steps and can be combined with any analytical method other than the DTM, such as the homotopy perturbation method.
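    The method of steps turns a constant-lag DDE into a sequence of ODE problems by feeding the already-computed solution in as the delayed term. A numerical sketch for the classic fixed-lag test equation x'(t) = -x(t-1) follows; it illustrates the reduction only, not the paper's variable delays or its DTM machinery.

```python
import numpy as np

def solve_dde(f, history, t_end, lag=1.0, dt=1e-3):
    """Method of steps for x'(t) = f(x(t), x(t - lag)) with a constant
    lag: past values are stored on a grid and looked up as the delayed
    argument while a simple RK2 scheme advances the solution."""
    n_hist = int(round(lag / dt))
    n = int(round(t_end / dt))
    ts = np.arange(-n_hist, n + 1) * dt
    xs = np.empty_like(ts)
    xs[:n_hist + 1] = history(ts[:n_hist + 1])   # prescribed history
    for i in range(n_hist, n_hist + n):
        x = xs[i]
        k1 = f(x, xs[i - n_hist])
        # midpoint estimate of the delayed state for the second stage
        x_lag_mid = 0.5 * (xs[i - n_hist] + xs[i - n_hist + 1])
        k2 = f(x + 0.5 * dt * k1, x_lag_mid)
        xs[i + 1] = x + dt * k2
    return ts, xs

# Test case: x'(t) = -x(t-1) with history x(t) = 1 on [-1, 0].
ts, xs = solve_dde(lambda x, xl: -xl, lambda t: np.ones_like(t), t_end=2.0)
x1 = xs[np.searchsorted(ts, 1.0)]
x2 = xs[np.searchsorted(ts, 2.0)]
print(x1, x2)   # analytic solution gives x(1) = 0 and x(2) = -0.5
```

On [0, 1] the delayed term is the constant history, so x(t) = 1 - t; on [1, 2] the just-computed segment is fed back in, giving x(t) = t²/2 - 2t + 3/2. Each unit interval is one "step" of the method.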

  9. Validation of methods for the detection and quantification of engineered nanoparticles in food

    DEFF Research Database (Denmark)

    Linsinger, T.P.J.; Chaudhry, Q.; Dehalu, V.

    2013-01-01

    …the methods apply equally well to particles from different suppliers. In trueness testing, information on whether the particle size distribution has changed during analysis is required. Results are largely expected to follow normal distributions due to the expected high number of particles. An approach is proposed for the validation of methods for detection and quantification of nanoparticles in food samples. It covers validation of identity, selectivity, precision, working range, limit of detection and robustness, bearing in mind that each “result” must include information about the chemical identity…

  10. A simple test of choice stepping reaction time for assessing fall risk in people with multiple sclerosis.

    Science.gov (United States)

    Tijsma, Mylou; Vister, Eva; Hoang, Phu; Lord, Stephen R

    2017-03-01

    Purpose To determine (a) the discriminant validity for established fall risk factors and (b) the predictive validity for falls of a simple test of choice stepping reaction time (CSRT) in people with multiple sclerosis (MS). Method People with MS (n = 210, aged 21-74 years) performed the CSRT, sensorimotor, balance and neuropsychological tests in a single session. They were then followed up for falls using monthly fall diaries for 6 months. Results The CSRT test had excellent discriminant validity with respect to established fall risk factors. Frequent fallers (≥3 falls) performed significantly worse in the CSRT test than non-frequent fallers (0-2 falls). The odds of suffering frequent falls increased 69% with each SD increase in CSRT (OR = 1.69, 95% CI: 1.27-2.26), indicating good predictive validity for falls in people with MS. This test may prove useful in documenting longitudinal changes in fall risk in relation to MS disease progression and effects of interventions. Implications for rehabilitation Good choice stepping reaction time (CSRT) is required for maintaining balance. A simple low-tech CSRT test has excellent discriminative and predictive validity in relation to falls in people with MS. This test may prove useful for documenting longitudinal changes in fall risk in relation to MS disease progression and effects of interventions.
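    The reported effect size can be unpacked with a few lines of arithmetic: an odds ratio per SD is the exponential of a logistic-regression coefficient, and the Wald confidence interval can be reconstructed from the published limits. Backing the standard error out of the reported CI, as below, is purely an illustrative consistency check, not the authors' analysis.

```python
import math

# Reported effect (from the abstract): odds of frequent falls rise
# 1.69-fold per SD increase in choice stepping reaction time.
or_point, ci_low, ci_high = 1.69, 1.27, 2.26

beta = math.log(or_point)                      # logistic coefficient per SD
se = math.log(ci_high / ci_low) / (2 * 1.96)   # SE implied by the Wald CI

# Rebuild the 95% CI from beta and se to confirm internal consistency.
lo = math.exp(beta - 1.96 * se)
hi = math.exp(beta + 1.96 * se)
print(f"beta per SD = {beta:.3f}, reconstructed 95% CI = [{lo:.2f}, {hi:.2f}]")
```

The reconstructed limits land on the published 1.27-2.26 interval up to rounding, which is what one expects when an OR and its CI come from the same Wald fit.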

  11. Assessment of critical steps of a GC/MS based indirect analytical method for the determination of fatty acid esters of monochloropropanediols (MCPDEs) and of glycidol (GEs).

    Science.gov (United States)

    Zelinkova, Zuzana; Giri, Anupam; Wenzl, Thomas

    2017-07-01

    Fatty acid esters of 2- and 3-chloropropanediol (MCPDEs) and fatty acid esters of glycidol (GEs) are commonly monitored in edible fats and oils. A recommendation issued by the European Commission emphasizes the need for generating data on the occurrence of these substances in a broad range of different foods. So far, analytical methods for the determination of MCPDEs and GEs are fully validated only for oils, fats and margarine. This manuscript presents the assessment of critical steps in the AOCS Cd 29a-13 method for the simultaneous determination of MCPDEs and GEs in the fat phase obtained from bakery and potato products, smoked and fried fish and meat, and other cereal products. The trueness of the method is affected by the additional formation of 3-MBPD esters from monoacylglycerols (MAGs), which are frequently present in food. The overestimation of GE contents for some samples was confirmed by comparing results with those obtained by an independent analytical method (direct analysis of GE by HPLC-MS/MS). An additional sample pre-treatment by SPE was introduced to remove MAGs from fat prior to GE conversion, while the overall method sensitivity was not significantly affected. Trueness of the determination of GEs by the modified analytical procedure was confirmed by comparison with a direct analysis of GEs. The potential impact of the final sample preparation step of the analytical procedure, the derivatization of free MCPD and MBPD with PBA, on the accuracy of results was evaluated as well. Different commercial batches of PBA showed differences in solubility in a non-polar organic solvent. The PBA derivatization in organic solvent did not affect precision and trueness of the method, due to the isotopic standard dilution. However, method sensitivity might be significantly compromised.

  12. Development of a real time activity monitoring Android application utilizing SmartStep.

    Science.gov (United States)

    Hegde, Nagaraj; Melanson, Edward; Sazonov, Edward

    2016-08-01

    Footwear based activity monitoring systems are becoming popular in academic research as well as consumer industry segments. In our previous work, we had presented developmental aspects of an insole based activity and gait monitoring system, SmartStep, which is a socially acceptable, fully wireless and versatile insole. The present work describes the development of an Android application that captures the SmartStep data wirelessly over Bluetooth Low Energy (BLE), computes features on the received data, runs activity classification algorithms and provides real time feedback. The development of activity classification methods was based on data from a human study involving 4 participants. Participants were asked to perform activities of sitting, standing, walking, and cycling while they wore the SmartStep insole system. Multinomial Logistic Discrimination (MLD) was utilized in the development of the machine learning model for activity prediction. The resulting classification model was implemented in an Android smartphone. The Android application was benchmarked for power consumption and CPU loading. Leave-one-out cross validation resulted in an average accuracy of 96.9% during the model training phase. The Android application for real time activity classification was tested on a human subject wearing SmartStep, resulting in a testing accuracy of 95.4%.
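    Multinomial logistic discrimination with leave-one-out cross-validation can be sketched compactly. The features, class clusters, and training hyperparameters below are invented stand-ins for the SmartStep insole data, and the plain gradient-descent softmax fit is an illustrative substitute for whatever MLD implementation the authors used.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_softmax(X, y, classes=4, lr=0.5, iters=300):
    """Multinomial logistic (softmax) regression by gradient descent."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    W = np.zeros((Xb.shape[1], classes))
    Y = np.eye(classes)[y]                      # one-hot targets
    for _ in range(iters):
        Z = Xb @ W
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        W -= lr * Xb.T @ (P - Y) / len(X)
    return W

def predict(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.argmax(Xb @ W, axis=1)

# Synthetic "insole features" for sitting/standing/walking/cycling:
# four well-separated clusters in a 2-D feature space.
centers = np.array([[0, 0], [3, 0], [0, 3], [3, 3]])
y = np.repeat(np.arange(4), 20)
X = centers[y] + 0.3 * rng.standard_normal((80, 2))

# Leave-one-out cross-validation, as used to evaluate the activity model.
hits = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    W = fit_softmax(X[mask], y[mask])
    hits += predict(W, X[i:i + 1])[0] == y[i]
print(f"LOO accuracy: {hits / len(X):.2%}")
```

Leave-one-out refits the model once per sample and scores the held-out point, which is practical for the small participant counts typical of wearable-sensor pilot studies.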

  13. Development and validity of a method for the evaluation of printed education material.

    Directory of Open Access Journals (Sweden)

    Castro MS

    2007-06-01

    Full Text Available Objectives: To develop and study the validity of an instrument for the evaluation of printed education materials (PEM); to evaluate the use of acceptability indices; and to identify possible influences of professional background. Methods: An instrument for PEM evaluation was developed in three steps: domain identification, item generation and instrument design. An easy-to-read PEM was developed for the education of patients with systemic hypertension and its treatment with hydrochlorothiazide. Construct validity was measured based on previously established errors purposively introduced into the PEM, which served as extreme groups. An acceptability index was applied, taking into account the rate of professionals who should approve each item. Participants were 10 physicians (9 men) and 5 nurses (all women). Results: Many professionals identified intentional errors of a crude character. Few participants identified errors that needed more careful evaluation, and no one detected the intentional error that required literature analysis. Physicians considered 95.8% of the items of the PEM acceptable, and nurses 29.2%. The differences between the scores were statistically significant for 27% of the items. In the overall evaluation, 66.6% of the items were considered acceptable. The analysis of each item revealed a behavioral pattern for each professional group. Conclusions: The use of instruments for the evaluation of printed education materials is required and may improve the quality of the PEMs available for patients. Acceptability indices are not always totally correct, nor do they always represent high quality of information. The professional experience, practice pattern, and perhaps the gender of the reviewers may influence their evaluation. An analysis of the PEM by professionals in communication and drug information, and by patients, should be carried out to improve the quality of the proposed material.
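    An acceptability index of this kind is essentially the proportion of reviewers approving each item, compared against a threshold. A sketch with invented ratings follows; the 75% threshold and the rating matrix are assumptions for illustration only, not the study's actual criterion or data.

```python
import numpy as np

# Hypothetical ratings: rows = reviewers, columns = PEM items,
# 1 = approved, 0 = not approved.
ratings = np.array([
    [1, 1, 0, 1, 1],
    [1, 1, 1, 1, 0],
    [1, 0, 1, 1, 1],
    [1, 1, 0, 1, 1],
])

approval = ratings.mean(axis=0)      # acceptability index per item
acceptable = approval >= 0.75        # assumed approval threshold
print(approval, acceptable)
```

Splitting the rating matrix by professional group (e.g. physicians vs. nurses) and recomputing the index per group is how the kind of divergence reported above (95.8% vs. 29.2%) would surface.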

  14. A novel two-step method for screening shade tolerant mutant plants via dwarfism

    Directory of Open Access Journals (Sweden)

    Wei Li

    2016-10-01

    Full Text Available When subjected to shade, plants undergo rapid shoot elongation, which often makes them more prone to disease and mechanical damage. Shade-tolerant plants can be difficult to breed; however, they offer a substantial benefit over other varieties in low-light areas. Although perennial ryegrass (Lolium perenne L.) is a popular species of turf grass because of its good appearance and fast establishment, the plant normally does not perform well under shade conditions. It has been reported that, in turfgrass, induced dwarfism can enhance shade tolerance. Here we describe a two-step procedure for isolating shade tolerant mutants of perennial ryegrass by first screening for dominant dwarf mutants, and then screening dwarf plants for shade tolerance. The two-step screening process can be carried out with limited space at early seedling stages, which enables quick and efficient isolation of shade tolerant mutants and thus facilitates development of new shade tolerant cultivars of turfgrasses. Using the method, we isolated 136 dwarf mutants from 300,000 mutagenized seeds, with 65 being shade tolerant (0.022%). When screened directly for shade tolerance, we recovered only four mutants from a population of 150,000 (0.003%) mutagenized seeds. One shade tolerant mutant, shadow-1, was characterized in detail. In addition to dwarfism, shadow-1 and its sexual progeny displayed high degrees of tolerance to both natural and artificial shade. We showed that endogenous gibberellin (GA) content in shadow-1 was higher than in wild-type controls, and shadow-1 was also partially GA insensitive. Our novel, simple and effective two-step screening method should be applicable to breeding shade tolerant cultivars of turfgrasses, ground covers, and other economically important crop plants that can be used under canopies of existing vegetation to increase productivity per unit area of land.

  15. Development and Validation of HPLC-PDA Assay method of Frangula emodin

    Directory of Open Access Journals (Sweden)

    Deborah Duca

    2016-03-01

    Full Text Available Frangula emodin (1,3,8-trihydroxy-6-methyl-anthraquinone) is one of the anthraquinone derivatives found abundantly in the roots and bark of a number of plant families traditionally used to treat constipation and haemorrhoids. The present study describes the development and subsequent validation of a specific HPLC assay method for emodin. The separation was achieved on a Waters Symmetry C18 column (4.6 × 250 mm, 5 μm particle size) at a temperature of 35 °C, with UV detection at 287 and 436 nm. An isocratic elution mode was used, with 0.1% formic acid and 0.01% trifluoroacetic acid as the aqueous mobile phase and methanol as the organic solvent. The method was successfully and statistically validated for linearity, range, precision, accuracy, specificity and solution stability.

  16. Current lipid extraction methods are significantly enhanced adding a water treatment step in Chlorella protothecoides.

    Science.gov (United States)

    Ren, Xiaojie; Zhao, Xinhe; Turcotte, François; Deschênes, Jean-Sébastien; Tremblay, Réjean; Jolicoeur, Mario

    2017-02-11

    Microalgae have the potential to rapidly accumulate lipids of high interest for the food, cosmetics, pharmaceutical and energy (e.g. biodiesel) industries. However, current lipid extraction methods show limited efficiency and, until now, extraction protocols have not been fully optimized for specific lipid compounds. The present study thus presents a novel lipid extraction method, consisting of the addition of a water treatment of the biomass between the two solvent extraction steps of current methods. The resulting modified method not only enhances lipid extraction efficiency, but also yields a higher triacylglycerol (TAG) ratio, which is highly desirable for biodiesel production. Modification of four existing methods using acetone, chloroform/methanol (Chl/Met), chloroform/methanol/H2O (Chl/Met/H2O) and dichloromethane/methanol (Dic/Met) showed respective lipid extraction yield enhancements of 72.3, 35.8, 60.3 and 60.9%. The modified acetone method resulted in the highest extraction yield, with 68.9 ± 0.2% DW total lipids. Extraction of TAG was particularly improved by the water treatment, especially for the Chl/Met/H2O and Dic/Met methods. The acetone method with the water treatment led to the highest extraction level of TAG, with 73.7 ± 7.3 µg/mg DW, which is 130.8 ± 10.6% higher than the maximum value obtained with the four classical methods (31.9 ± 4.6 µg/mg DW). Interestingly, the water treatment preferentially improved the extraction of intracellular fractions, i.e. TAG, sterols, and free fatty acids, compared with the lipid fractions of the cell membranes, which are constituted of phospholipids (PL), acetone-mobile polar lipids and hydrocarbons. Finally, from the 32 fatty acids analyzed in both the neutral lipid (NL) and polar lipid (PL) fractions, it is clear that the water treatment greatly improves the NL-to-PL ratio for the four standard methods assessed. Water treatment of biomass after the first solvent extraction step

  17. Influence of application methods of one-step self-etching adhesives on microtensile bond strength

    Directory of Open Access Journals (Sweden)

    Chul-Kyu Choi,

    2011-05-01

    Objectives: The purpose of this study was to evaluate the effect of various application methods of one-step self-etch adhesives on microtensile resin-dentin bond strength. Materials and Methods: Thirty-six extracted human molars were used. The teeth were assigned randomly to twelve groups, according to three different adhesive systems (Clearfil Tri-S Bond, Adper Prompt L-Pop, G-Bond) and four application methods. The adhesive systems were applied on the dentin as follows: (1) single coating, (2) double coating, (3) manual agitation, (4) ultrasonic agitation. Following the adhesive application, light-cured composite resin was built up. The restored teeth were stored in distilled water at room temperature for 24 hours, and 15 specimens were prepared per group (n = 15). Microtensile bond strength was then measured and the failure mode was examined. Results: Manual agitation and ultrasonic agitation of the adhesive produced significantly higher microtensile bond strength than single coating and double coating did. Double coating produced significantly higher bond strength than single coating, and there was no significant difference between the manual and ultrasonic agitation groups. There were significant differences in microtensile bond strength among the adhesives, with Clearfil Tri-S Bond showing the highest bond strength. Conclusions: For one-step self-etching adhesives, bond strength differed significantly according to the application method and type of adhesive. Regardless of the material, manual or ultrasonic agitation of the adhesive gave significantly higher microtensile bond strength.

  18. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    Science.gov (United States)

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is not contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date, there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have as good DCV as Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? 
The paper proposes discriminant content validity (DCV), a systematic and transparent method

  19. Acceleration of step and linear discontinuous schemes for the method of characteristics in DRAGON5

    Directory of Open Access Journals (Sweden)

    Alain Hébert

    2017-09-01

    The applicability of the algebraic collapsing acceleration (ACA) technique to the method of characteristics (MOC) in cases with scattering anisotropy and/or linear sources was investigated. Previously, the ACA was proven successful in cases with isotropic scattering and uniform (step) sources. A presentation is first made of the MOC implementation available in the DRAGON5 code. Two categories of schemes are available for integrating the propagation equations: (1) the first category is based on exact integration and leads to the classical step characteristics (SC) and linear discontinuous characteristics (LDC) schemes, and (2) the second category leads to diamond differencing schemes of various orders in space. The acceleration of these MOC schemes using a combination of the generalized minimal residual [GMRES(m)] method preconditioned with the ACA technique was focused on. Numerical results are provided for a two-dimensional (2D) eight-symmetry pressurized water reactor (PWR) assembly mockup in the context of the DRAGON5 code.

  20. A multi-step dealloying method to produce nanoporous gold with no volume change and minimal cracking

    Energy Technology Data Exchange (ETDEWEB)

    Sun Ye [Department of Chemical and Materials Engineering, University of Kentucky, 177 F. Paul Anderson Tower, Lexington, KY 40506 (United States); Balk, T. John [Department of Chemical and Materials Engineering, University of Kentucky, 177 F. Paul Anderson Tower, Lexington, KY 40506 (United States)], E-mail: balk@engr.uky.edu

    2008-05-15

    We report a simple two-step dealloying method for producing bulk nanoporous gold with no volume change and no significant cracking. The galvanostatic dealloying method used here appears superior to potentiostatic methods for fabricating millimeter-scale samples. Care must be taken when imaging the nanoscale, interconnected sponge-like structure with a focused ion beam, as even brief exposure caused immediate and extensive cracking of nanoporous gold, as well as ligament coarsening at the surface.

  1. Symplectic integrators with adaptive time steps

    Science.gov (United States)

    Richardson, A. S.; Finn, J. M.

    2012-01-01

    In recent decades, there have been many attempts to construct symplectic integrators with variable time steps, with rather disappointing results. In this paper, we identify the causes for this lack of performance, and find that they fall into two categories. In the first, the time step is considered a function of time alone, Δ = Δ(t). In this case, backward error analysis shows that while the algorithms remain symplectic, parametric instabilities may arise because of resonance between oscillations of Δ(t) and the orbital motion. In the second category the time step is a function of phase space variables Δ = Δ(q, p). In this case, the system of equations to be solved is analyzed by introducing a new time variable τ with dt = Δ(q, p) dτ. The transformed equations are no longer in Hamiltonian form, and thus do not benefit from integration methods which would be symplectic for Hamiltonian systems. We analyze two methods for integrating the transformed equations which do, however, preserve the structure of the original equations. The first is an extended phase space method, which has been successfully used in previous studies of adaptive time step symplectic integrators. The second, novel, method is based on a non-canonical mixed-variable generating function. Numerical trials for both of these methods show good results, without parametric instabilities or spurious growth or damping. It is then shown how to adapt the time step to an error estimate found by backward error analysis, in order to optimize the time-stepping scheme. Numerical results are obtained using this formulation and compared with other time-stepping schemes for the extended phase space symplectic method.
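    The transformed system dt = Δ(q, p) dτ can be made concrete with a minimal leapfrog sketch (illustrative only; the constant-step run below is the symplectic baseline, while naively re-evaluating a phase-space-dependent step each iteration, with no extended phase space or generating function, is exactly the approach whose pitfalls the paper analyzes):

```python
# Kick-drift-kick leapfrog for the harmonic oscillator H = (p^2 + q^2)/2.
# The step size is supplied by a function of phase space: with a constant
# step the scheme is symplectic and the energy error stays bounded over
# many periods; a q,p-dependent step used this naively loses that guarantee.
def leapfrog(q, p, step, n):
    for _ in range(n):
        dt = step(q, p)          # step-size function Delta(q, p)
        p -= 0.5 * dt * q        # half kick: dp/dt = -dH/dq = -q
        q += dt * p              # drift:     dq/dt =  dH/dp =  p
        p -= 0.5 * dt * q        # half kick
    return q, p

# Constant step: energy stays near the exact value 0.5 over ~32 periods.
q, p = leapfrog(1.0, 0.0, lambda q, p: 0.05, 4000)
energy = 0.5 * (p * p + q * q)
```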

  2. Cross-cultural adaptation, validation and reliability of the brazilian version of the Richmond Compulsive Buying Scale

    Directory of Open Access Journals (Sweden)

    Priscilla Leite

    2013-03-01

    OBJECTIVE: To present the process of transcultural adaptation of the Richmond Compulsive Buying Scale to Brazilian Portuguese. METHODS: For the semantic adaptation step, the scale was translated to Portuguese and then back-translated to English by two professional translators and one psychologist, without any communication between them. The scale was then applied to 20 participants from the general population for language adjustments. For the construct validation step, an exploratory factor analysis was performed, using the scree plot test, principal component analysis for factor extraction, and Varimax rotation. For convergent validity, the correlation matrix was analyzed through Pearson's coefficient. RESULTS: The scale showed easy applicability, satisfactory internal consistency (Cronbach's alpha = .87), and a high correlation with other rating scales for compulsive buying disorder, indicating that it is suitable for use in the assessment and diagnosis of compulsive buying disorder, as it presents psychometric validity. CONCLUSION: The Brazilian Portuguese version of the Richmond Compulsive Buying Scale has good validity and reliability.
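    The internal-consistency figure cited above (Cronbach's alpha = .87) comes from a standard formula that is easy to reproduce; a minimal sketch with made-up item scores, not the study's data:

```python
# Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
# where k is the number of items scored over the same respondents.
def sample_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    # items: one list of scores per item, all over the same respondents
    k = len(items)
    n_resp = len(items[0])
    totals = [sum(item[r] for item in items) for r in range(n_resp)]
    return k / (k - 1) * (1 - sum(sample_var(i) for i in items) / sample_var(totals))

# Three illustrative items answered by four respondents.
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 3, 3, 5], [1, 3, 4, 4]])
```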

  3. Validation of Experimental whole-body SAR Assessment Method in a Complex Indoor Environment

    DEFF Research Database (Denmark)

    Bamba, Aliou; Joseph, Wout; Vermeeren, Gunter

    2012-01-01

    Assessing experimentally the whole-body specific absorption rate (SARwb) in a complex indoor environment is very challenging. An experimental method based on room electromagnetics theory (accounting for only the line-of-sight as the specular path) to assess the whole-body SAR is validated by numerical...... of the proposed method is that it allows discarding the computational burden because it does not use any discretizations. Results show good agreement between measurement and computation at 2.8 GHz, as long as the plane wave assumption is valid, i.e., for large distances from the transmitter. Relative deviations 0...

  4. Validação em métodos cromatográficos para análises de pequenas moléculas em matrizes biológicas Chromatographic methods validation for analysis of small molecules in biological matrices

    OpenAIRE

    Neila Maria Cassiano; Juliana Cristina Barreiro; Lúcia Regina Rocha Martins; Regina Vincenzi Oliveira; Quezia Bezerra Cass

    2009-01-01

    Chromatographic methods are commonly used for the analysis of small molecules in different biological matrices. An important step to consider during development of a bioanalytical method is its capacity to yield reliable and reproducible results. This review discusses validation procedures adopted by different governmental agencies, such as the Food and Drug Administration (USA), European Union (EU) and Agência Nacional de Vigilância Sanitária (BR) for quantification of small molecules by bioanalyt...

  5. Estimating population cause-specific mortality fractions from in-hospital mortality: validation of a new method.

    Directory of Open Access Journals (Sweden)

    Christopher J L Murray

    2007-11-01

    Cause-of-death data for many developing countries are not available. Information on deaths in hospital by cause is available in many low- and middle-income countries but is not a representative sample of deaths in the population. We propose a method to estimate population cause-specific mortality fractions (CSMFs) using data already collected in many middle-income and some low-income developing nations, yet rarely used: in-hospital death records. For a given cause of death, a community's hospital deaths are equal to total community deaths multiplied by the proportion of deaths occurring in hospital. If we can estimate the proportion dying in hospital, we can estimate the proportion dying in the population using deaths in hospital. We propose to estimate the proportion of deaths for an age, sex, and cause group that die in hospital from the subset of the population where vital registration systems function, or from another population. We evaluated our method using nearly complete vital registration (VR) data from Mexico 1998-2005, which record whether a death occurred in a hospital. In this validation test, we used 45 disease categories. We validated our method in two ways: nationally and between communities. First, we investigated how the method's accuracy changes as we decrease the amount of Mexican VR data used to estimate the proportion of each age, sex, and cause group dying in hospital. Decreasing the VR data used for this first step from 100% to 9% produces only a 12% maximum relative error between estimated and true CSMFs. Even if Mexico collected full VR information only in its capital city with 9% of its population, our estimation method would produce an average relative error in CSMFs across the 45 causes of just over 10%. Second, we used VR data for the capital zone (Distrito Federal and Estado de Mexico) and estimated CSMFs for the three lowest-development states. Our estimation method gave an average relative error of 20%, 23%, and 31% for
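    The core identity of the method above (community deaths for a cause equal hospital deaths divided by the proportion of that cause's deaths occurring in hospital, with CSMFs obtained by normalizing) can be sketched with hypothetical counts:

```python
# Hypothetical in-hospital death counts by cause, and the proportion of each
# cause's deaths occurring in hospital (estimated elsewhere, e.g. from a VR
# subset). These numbers are illustrative, not from the study.
hospital_deaths = {"cardio": 300, "injury": 80, "infection": 120}
prop_in_hospital = {"cardio": 0.6, "injury": 0.2, "infection": 0.4}

# Estimated community deaths per cause, then cause-specific mortality fractions.
community = {c: hospital_deaths[c] / prop_in_hospital[c] for c in hospital_deaths}
total = sum(community.values())
csmf = {c: d / total for c, d in community.items()}
# cardio: 500/1200, injury: 400/1200, infection: 300/1200
```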

  6. Validation of ascorbic acid tablets of national production by igh-performance liquid chromatography method

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Izquierdo Castro, Idalberto

    2009-01-01

    An analytical method based on high-performance liquid chromatography was validated to determine the ascorbic acid content in vitamin C tablets, designed as an alternative method for quality control and for following the chemical stability of the active principle, since the official techniques for quality control of ascorbic acid in tablets are not selective with respect to degradation products. The method was modified from that reported in USP 28 (2005) for analysis of the injectable product. We used an RP-18 column of 250 x 4.6 mm, 5 μm, with a UV detector at 245 nm. Validation was necessary for both objectives, considering the parameters required for category I and II methods. The method was sufficiently linear, accurate, and precise in the range of 100-300 μg/mL. It was also selective with respect to the remaining components of the matrix and the possible degradation products generated under stress conditions. Detection and quantification limits were estimated. Once validated, the method was applied to ascorbic acid quantification in two batches of expired tablets, and a marked influence of the container on degradation of the active principle was detected after 12 months at room temperature. (Author)

  7. USFDA-GUIDELINE BASED VALIDATION OF TESTING METHOD FOR RIFAMPICIN IN INDONESIAN SERUM SPECIMEN

    Directory of Open Access Journals (Sweden)

    Tri Joko Raharjo

    2010-06-01

    Regarding a new regulation from the Indonesian FDA (Badan POM-RI), all new non-patent drugs should show bioequivalence with the originator drug prior to registration. Bioequivalence (BE) testing has to be performed on people representative of the population to which the drug is to be administered. BE testing needs a valid bioanalytical method for a given drug target and group of population. This research reports the specific validation of the bioanalysis of rifampicin in Indonesian serum specimens for use in BE testing. The extraction was performed using acetonitrile, while the chromatographic separation was accomplished on an RP-18 column (250 × 4.6 mm i.d., 5 µm) with a mobile phase composed of KH2PO4 10 mM-acetonitrile (40:60, v/v) and UV detection set at 333 nm. The method showed specificity compared to the blank serum specimen, with a retention time for rifampicin of 2.1 min. The lower limit of quantification (LLOQ) was 0.06 µg/mL, with a dynamic range up to 20 µg/mL (R > 0.990). Precision of the method was very good, with coefficients of variance (CV) of 0.58, 7.40 and 5.56% at concentrations of 0.06, 5 and 15 µg/mL, respectively. Accuracies of the method were 3.22, 1.94 and 1.90% for concentrations of 0.06, 5 and 15 µg/mL, respectively. The average recoveries were 97.82, 95.50 and 97.31% for rifampicin concentrations of 1, 5 and 5 µg/mL, respectively. The method also gave reliable results in stability tests on freezing-thawing, short-term and long-term stability, as well as post-preparation stability. The validation results showed that the method is ready to be used for rifampicin BE testing with Indonesian subjects. Keywords: Rifampicin, Validation, USFDA-Guideline

  8. Mercury speciation analysis in seafood by species-specific isotope dilution: method validation and occurrence data

    Energy Technology Data Exchange (ETDEWEB)

    Clemens, Stephanie; Guerin, Thierry [Agence Nationale de Securite Sanitaire de l' Alimentation, Laboratoire de Securite des Aliments de Maisons-Alfort, Unite des Contaminants Inorganiques et Mineraux de l' Environnement, ANSES, Maisons-Alfort (France); Monperrus, Mathilde; Donard, Olivier F.X.; Amouroux, David [IPREM UMR 5254 CNRS - Universite de Pau et des Pays de l' Adour, Laboratoire de Chimie Analytique Bio-Inorganique et Environnement, Institut des Sciences Analytiques et de Physico-chimie pour l' Environnement et les Materiaux, Pau Cedex (France)

    2011-11-15

    Methylmercury (MeHg) and total mercury (THg) in seafood were determined using species-specific isotope dilution analysis and gas chromatography combined with inductively coupled plasma mass spectrometry. Sample preparation methods (extraction and derivatization steps) were evaluated on certified reference materials using isotopically enriched Hg species. Solid-liquid extraction, derivatization by propylation and automated agitation gave excellent accuracy and precision results. Satisfactory figures of merit for the selected method were obtained in terms of limit of quantification (1.2 µg Hg kg⁻¹ for MeHg and 1.4 µg Hg kg⁻¹ for THg), repeatability (1.3-1.7%), intermediate precision (1.5% for MeHg and 2.2% for THg) and trueness (bias error less than 7%). By means of a recent strategy based on accuracy profiles (β-expectation tolerance intervals), the selected method was successfully validated in the range of approximately 0.15-5.1 mg kg⁻¹ for MeHg and 0.27-5.2 mg kg⁻¹ for THg. The probability β was set to 95% and the acceptability limits to ±15%. The method was then applied to 62 seafood samples representative of consumption in the French population. The MeHg concentrations were generally low (1.9-588 µg kg⁻¹), and the percentage of MeHg varied from 28% to 98% in shellfish and from 84% to 97% in fish. For all real samples tested, methylation and demethylation reactions were not significant, except in one oyster sample. The method presented here could be used for monitoring food contamination by MeHg and inorganic Hg in the future to more accurately assess human exposure. (orig.)
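    The accuracy-profile decision rule mentioned above (a β-expectation tolerance interval on relative bias, compared against the ±15% acceptability limits) can be sketched as follows; the recovery data and the hard-coded Student t quantile are illustrative assumptions, not the study's figures:

```python
import math

# Relative bias (%) of replicate measurements at one concentration level
# (made-up numbers for illustration).
bias = [-1.5, 1.2, -1.0, 0.4, -1.1, 0.1]
n = len(bias)
mean = sum(bias) / n
sd = math.sqrt(sum((b - mean) ** 2 for b in bias) / (n - 1))

# Simple beta-expectation tolerance interval: mean +/- t * sd * sqrt(1 + 1/n),
# with t the 97.5% Student quantile for n-1 = 5 df (2.571, hard-coded here).
t975 = 2.571
half = t975 * sd * math.sqrt(1 + 1 / n)
interval = (mean - half, mean + half)

# The method is accepted at this level if the interval lies within +/-15%.
accepted = -15.0 <= interval[0] and interval[1] <= 15.0
```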

  9. A New Statistical Method to Determine the Degree of Validity of Health Economic Model Outcomes against Empirical Data.

    Science.gov (United States)

    Corro Ramos, Isaac; van Voorn, George A K; Vemer, Pepijn; Feenstra, Talitha L; Al, Maiwenn J

    2017-09-01

    The validation of health economic (HE) model outcomes against empirical data is of key importance. Although statistical testing seems applicable, guidelines for the validation of HE models lack guidance on statistical validation, and actual validation efforts often present subjective judgment of graphs and point estimates. To discuss the applicability of existing validation techniques and to present a new method for quantifying the degrees of validity statistically, which is useful for decision makers. A new Bayesian method is proposed to determine how well HE model outcomes compare with empirical data. Validity is based on a pre-established accuracy interval in which the model outcomes should fall. The method uses the outcomes of a probabilistic sensitivity analysis and results in a posterior distribution around the probability that HE model outcomes can be regarded as valid. We use a published diabetes model (Modelling Integrated Care for Diabetes based on Observational data) to validate the outcome "number of patients who are on dialysis or with end-stage renal disease." Results indicate that a high probability of a valid outcome is associated with relatively wide accuracy intervals. In particular, 25% deviation from the observed outcome implied approximately 60% expected validity. Current practice in HE model validation can be improved by using an alternative method based on assessing whether the model outcomes fit to empirical data at a predefined level of accuracy. This method has the advantage of assessing both model bias and parameter uncertainty and resulting in a quantitative measure of the degree of validity that penalizes models predicting the mean of an outcome correctly but with overly wide credible intervals. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
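    The idea of scoring validity as the probability that model outcomes fall inside a pre-established accuracy interval can be sketched from probabilistic sensitivity analysis (PSA) samples; the simulated outcomes, the 25% interval, and the Beta(1, 1) prior below are illustrative assumptions, not the authors' exact computation:

```python
import random

random.seed(42)
observed = 100.0                              # empirical outcome (made-up)
lo, hi = 0.75 * observed, 1.25 * observed     # pre-established 25% accuracy interval

# Stand-in for PSA output: one model outcome per parameter draw.
psa_outcomes = [random.gauss(105.0, 15.0) for _ in range(5000)]

inside = sum(lo <= y <= hi for y in psa_outcomes)
p_valid = inside / len(psa_outcomes)                       # sample proportion
posterior_mean = (inside + 1) / (len(psa_outcomes) + 2)    # Beta(1,1) posterior mean
```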

  10. Evaluating teaching methods: validation of an evaluation tool for hydrodissection and phacoemulsification portions of cataract surgery.

    Science.gov (United States)

    Smith, Ronald J; McCannel, Colin A; Gordon, Lynn K; Hollander, David A; Giaconi, JoAnn A; Stelzner, Sadiqa K; Devgan, Uday; Bartlett, John; Mondino, Bartly J

    2014-09-01

    To develop and assess the validity of an evaluation tool to quantitatively assess the hydrodissection and phacoemulsification portions of cataract surgery performed by residents. Case series. Jules Stein Eye Institute, Olive View-UCLA Medical Center, and Veterans Administration Medical Center, Los Angeles, California, USA. The UCLA ophthalmology faculty members were surveyed and the literature was reviewed to develop a grading tool consisting of 15 questions to evaluate surgical technique, including questions from the Global Rating Assessment of Skills in Intraocular Surgery and from the International Council of Ophthalmology's Ophthalmology Surgical Competency Assessment Rubric. Video clips of the hydrodissection and phacoemulsification portions of cataract surgery performed by 1 postgraduate year 2 (PGY2) resident, 1 PGY3 resident, 2 PGY4 residents, and an advanced surgeon were independently graded in a masked fashion by an 8-member faculty panel. Eleven of the 15 questions had a significant association with surgical experience level. Questions with low interobserver variability included instrument handling, flow of operation, and nucleus rotation. Nucleus cracking also had low variability. Less directly visible tasks, especially 3-dimensional tasks, had wider interobserver variability. Surgical performance can be validly measured using an evaluation tool. Improved videography and studies to identify the best questions for evaluating each step of cataract surgery may help ophthalmic educators more precisely measure training outcomes for improving teaching interventions. No author has a financial or proprietary interest in any material or method mentioned. Published by Elsevier Inc.

  11. The Finite-Surface Method for incompressible flow: a step beyond staggered grid

    Science.gov (United States)

    Hokpunna, Arpiruk; Misaka, Takashi; Obayashi, Shigeru

    2017-11-01

    We present a newly developed higher-order finite surface method for the incompressible Navier-Stokes equations (NSE). This method defines the velocities as surface-averaged values on the surfaces of the pressure cells. Consequently, mass conservation on the pressure cells becomes an exact equation. The only things left to approximate are the momentum equation and the pressure at the new time step. Under certain conditions, the exact mass conservation enables an explicit n-th-order accurate NSE solver to be used with a pressure treatment that is two or four orders less accurate, without losing the apparent convergence rate. This feature is not possible with finite volume or finite difference methods. We use Fourier analysis with a model spectrum to determine the condition, and found that the range covers standard boundary layer flows. The formal convergence and the performance of the proposed scheme are compared with a sixth-order finite volume method. Finally, the accuracy and performance of the method are evaluated in turbulent channel flows. This work is partially funded by a research collaboration from IFS, Tohoku University and the ASEAN+3 funding scheme from CMUIC, Chiang Mai University.

  12. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A.

    2008-12-17

    This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Organization for Standardization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  13. GC Method Validation for the Analysis of Menthol in Suppository Pharmaceutical Dosage Form

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

    Menthol is widely used as a fragrance and flavor in the food and cosmetic industries. It is also used in the medical and pharmaceutical fields for its various biological effects. Gas chromatography (GC) is considered a sensitive method for the analysis of menthol. A GC separation was developed using a capillary column (VF-624) and a flame ionization detector (FID). The method was validated as per ICH guidelines for various parameters such as precision, linearity, accuracy, solution stability, robustness, limit of detection, and limit of quantification. The tested validation parameters were found to be within acceptable limits. The method was successfully applied to the quantification of menthol in suppository formulations. Quality control departments and official pharmacopeias can use the developed method in the analysis of menthol in pharmaceutical dosage formulations and raw materials.

  14. The Theory and Practice of the Six-Step Method in EFL and Its Transferability to Engineering Programmes

    Science.gov (United States)

    Ntombela, Berrington X. S.

    2013-01-01

    This paper outlines the theory of the six-step method developed by personnel in the Language and Learning department at Caledonian College of Engineering, Oman. The paper further illustrates the application of this method in teaching Project, Listening, Reading, Writing, and Speaking & Debate at Foundation level. The assumption in applying the…

  15. Validation of Likelihood Ratio Methods Used for Forensic Evidence Evaluation: Application in Forensic Fingerprints

    NARCIS (Netherlands)

    Haraksim, Rudolf

    2014-01-01

    In this chapter the Likelihood Ratio (LR) inference model will be introduced, the theoretical aspects of probabilities will be discussed and the validation framework for LR methods used for forensic evidence evaluation will be presented. Prior to introducing the validation framework, following

  16. Validation of the Analytical Method for the Determination of Flavonoids in Broccoli

    Directory of Open Access Journals (Sweden)

    Tuszyńska Magdalena

    2014-09-01

    A simple, accurate and selective HPLC method was developed and validated for the determination of quercetin and kaempferol, which are the main flavonols in broccoli. The separation was achieved on a reversed-phase C18 column using a mobile phase composed of methanol/water (60/40) with 0.2% phosphoric acid, at a flow rate of 1.0 ml min-1. Detection was carried out on a DAD detector at 370 nm. The method was validated according to the requirements for new methods, which include selectivity, linearity, precision, accuracy, limit of detection and limit of quantitation. The current method demonstrates good linearity, with R2 > 0.99. The recovery is within 98.07-102.15% and 97.92-101.83% for quercetin and kaempferol, respectively. The method is selective, in that quercetin and kaempferol are well separated from other compounds of broccoli with good resolution. The low limits of detection and quantitation enable the detection and quantitation of these flavonoids in broccoli at low concentrations.
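    Linearity figures like the R2 > 0.99 reported above come from an ordinary least-squares fit of the calibration curve; a self-contained sketch with hypothetical standards (not the study's data):

```python
# Hypothetical calibration standards: concentration vs. peak area.
conc = [1.0, 2.0, 5.0, 10.0, 20.0]
area = [10.1, 19.8, 50.3, 99.5, 200.2]

# Ordinary least-squares slope and intercept.
n = len(conc)
mx, my = sum(conc) / n, sum(area) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, area))
slope = sxy / sxx
intercept = my - slope * mx

# Coefficient of determination R^2 for the calibration line.
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, area))
ss_tot = sum((y - my) ** 2 for y in area)
r2 = 1 - ss_res / ss_tot
```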

  17. A high-order positivity-preserving single-stage single-step method for the ideal magnetohydrodynamic equations

    Science.gov (United States)

    Christlieb, Andrew J.; Feng, Xiao; Seal, David C.; Tang, Qi

    2016-07-01

    We propose a high-order finite difference weighted ENO (WENO) method for the ideal magnetohydrodynamics (MHD) equations. The proposed method is single-stage (i.e., it has no internal stages to store), single-step (i.e., it has no time history that needs to be stored), maintains a discrete divergence-free condition on the magnetic field, and has the capacity to preserve the positivity of the density and pressure. To accomplish this, we use a Taylor discretization of the Picard integral formulation (PIF) of the finite difference WENO method proposed in Christlieb et al. (2015) [23], where the focus is on a high-order discretization of the fluxes (as opposed to the conserved variables). We use the version where fluxes are expanded to third-order accuracy in time, and for the fluid variables space is discretized using the classical fifth-order finite difference WENO discretization. We use constrained transport in order to obtain divergence-free magnetic fields, which means that we simultaneously evolve the magnetohydrodynamic (that has an evolution equation for the magnetic field) and magnetic potential equations alongside each other, and set the magnetic field to be the (discrete) curl of the magnetic potential after each time step. In this work, we compute these derivatives to fourth-order accuracy. In order to retain a single-stage, single-step method, we develop a novel Lax-Wendroff discretization for the evolution of the magnetic potential, where we start with technology used for Hamilton-Jacobi equations in order to construct a non-oscillatory magnetic field. The end result is an algorithm that is similar to our previous work Christlieb et al. (2014) [8], but this time the time stepping is replaced through a Taylor method with the addition of a positivity-preserving limiter. Finally, positivity preservation is realized by introducing a parameterized flux limiter that considers a linear combination of high and low-order numerical fluxes. The choice of the free

  18. Development and validation of a dissolution method using HPLC for diclofenac potassium in oral suspension

    Directory of Open Access Journals (Sweden)

    Alexandre Machado Rubim

    2014-04-01

    The present study describes the development and validation of an in vitro dissolution method for evaluating the release of diclofenac potassium in oral suspension. The dissolution test was developed and validated according to international guidelines. Parameters such as linearity, specificity, precision and accuracy were evaluated, as well as the influence of rotation speed and surfactant concentration in the medium. After selecting the best conditions, the method was validated using apparatus 2 (paddle), a 50-rpm rotation speed, and 900 mL of water with 0.3% sodium lauryl sulfate (SLS) as dissolution medium at 37.0 ± 0.5°C. Samples were analyzed using an HPLC-UV (PDA) method. The results obtained were satisfactory for the parameters evaluated. The method developed may be useful in routine quality control for pharmaceutical industries that produce oral suspensions containing diclofenac potassium.

  19. NordVal: A Nordic system for validation of alternative microbiological methods

    DEFF Research Database (Denmark)

    Qvist, Sven

    2007-01-01

    NordVal was created in 1999 by the Nordic Committee of Senior Officials for Food Issues under the Nordic Council of Ministers. The Committee adopted the following objective for NordVal: NordVal evaluates the performance and field of application of alternative microbiological methods. This includes...... analyses of food, water, feed, animal faeces and food environmental samples in the Nordic countries. NordVal is managed by a steering group, which is appointed by the National Food Administrations in Denmark, Finland, Iceland, Norway and Sweden. The background for creation of NordVal was a Danish...... validation system (DanVal) established in 1995 to cope with a need to validate alternative methods to be used in the Danish Salmonella Action Program. The program attracted considerable attention in the other Nordic countries. NordVal has elaborated a number of documents, which describe the requirements...

  20. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated.

  1. Volumetric adsorptive microsampling-liquid chromatography tandem mass spectrometry assay for the simultaneous quantification of four antibiotics in human blood: Method development, validation and comparison with dried blood spot.

    Science.gov (United States)

    Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana

    2017-10-25

    In this paper we show the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics: piperacillin-tazobactam, meropenem, linezolid and ceftazidime in 10 μL of human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37°C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows an accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Initial Steps in Creating a Developmentally Valid Tool for Observing/Assessing Rope Jumping

    Science.gov (United States)

    Roberton, Mary Ann; Thompson, Gregory; Langendorfer, Stephen J.

    2017-01-01

    Background: Valid motor development sequences show the various behaviors that children display as they progress toward competence in specific motor skills. Teachers can use these sequences to observe informally or formally assess their students. While longitudinal study is ultimately required to validate developmental sequences, there are earlier,…

  3. Development and validation of an alternative titration method for the determination of sulfate ion in indinavir sulfate

    Directory of Open Access Journals (Sweden)

    Breno de Carvalho e Silva

    2005-02-01

    A simple and rapid precipitation titration method was developed and validated to determine the sulfate ion content in indinavir sulfate raw material. A 0.1 mol L-1 lead nitrate volumetric solution was used as titrant, with potentiometric endpoint determination using a lead-specific electrode. The United States Pharmacopoeia Forum indicates a potentiometric method for sulfate ion quantitation using 0.1 mol L-1 lead perchlorate as titrant. Both methods were validated concerning linearity, precision and accuracy, yielding good results. The sulfate ion contents found by the two validated methods were compared by Student's t-test, indicating that there was no statistically significant difference between the methods.
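
    The method comparison described above rests on a two-sample Student's t-test. A minimal sketch of that comparison, using hypothetical replicate results (the numbers are illustrative, not the paper's data):

```python
import math

def pooled_t_statistic(a, b):
    """Two-sample Student's t-statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t-statistic and degrees of freedom

# Hypothetical sulfate contents (%) found by each titrant
lead_nitrate = [20.1, 20.3, 19.9, 20.2, 20.0]
lead_perchlorate = [20.2, 20.0, 20.1, 20.3, 19.9]
t, df = pooled_t_statistic(lead_nitrate, lead_perchlorate)
# |t| below the two-sided 5% critical value (2.306 for df = 8)
# would indicate no statistically significant difference
```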

  4. General methods for analysis of sequential "n-step" kinetic mechanisms: application to single turnover kinetics of helicase-catalyzed DNA unwinding.

    Science.gov (United States)

    Lucius, Aaron L; Maluf, Nasib K; Fischer, Christopher J; Lohman, Timothy M

    2003-10-01

    Helicase-catalyzed DNA unwinding is often studied using "all or none" assays that detect only the final product of fully unwound DNA. Even using these assays, quantitative analysis of DNA unwinding time courses for DNA duplexes of different lengths, L, using "n-step" sequential mechanisms, can reveal information about the number of intermediates in the unwinding reaction and the "kinetic step size", m, defined as the average number of basepairs unwound between two successive rate limiting steps in the unwinding cycle. Simultaneous nonlinear least-squares analysis using "n-step" sequential mechanisms has previously been limited by an inability to float the number of "unwinding steps", n, and m, in the fitting algorithm. Here we discuss the behavior of single turnover DNA unwinding time courses and describe novel methods for nonlinear least-squares analysis that overcome these problems. Analytic expressions for the time courses, f(ss)(t), when obtainable, can be written using gamma and incomplete gamma functions. When analytic expressions are not obtainable, the numerical solution of the inverse Laplace transform can be used to obtain f(ss)(t). Both methods allow n and m to be continuous fitting parameters. These approaches are generally applicable to enzymes that translocate along a lattice or require repetition of a series of steps before product formation.
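
    For the idealized case of n identical irreversible steps with rate constant k (and no dissociation, an assumption of this sketch), the "all or none" time course is the CDF of an Erlang/gamma waiting time, i.e. the incomplete gamma function the abstract refers to. The function name and parameter values below are illustrative:

```python
import math

def fraction_unwound(t, n, k):
    """Fraction of duplexes fully unwound at time t for a sequential
    mechanism of n identical irreversible steps, each with rate k.
    For integer n this equals the regularized incomplete gamma P(n, k*t),
    the CDF of an Erlang(n, k) waiting time."""
    x = k * t
    return 1.0 - math.exp(-x) * sum(x**i / math.factorial(i) for i in range(n))

# Longer duplexes require more steps (n ~ L / m, with m the kinetic step
# size), producing the characteristic lag before product appears.
curve = [fraction_unwound(t, n=5, k=2.0) for t in (0.5, 1.0, 2.0, 5.0)]
```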

  5. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    Science.gov (United States)

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and efficiency improvement. One potential reason may be the lack of clear validation guidelines that can be applied for the assessment of suitability of fiber optics. This article describes a comprehensive validation scheme and development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied for characterizing the dissolution behavior of a 40-mg immediate-release tablet dosage form that is under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association

  6. Brief International Cognitive Assessment for MS (BICAMS): international standards for validation.

    Science.gov (United States)

    Benedict, Ralph H B; Amato, Maria Pia; Boringa, Jan; Brochet, Bruno; Foley, Fred; Fredrikson, Stan; Hamalainen, Paivi; Hartung, Hans; Krupp, Lauren; Penner, Iris; Reder, Anthony T; Langdon, Dawn

    2012-07-16

    An international expert consensus committee recently recommended a brief battery of tests for cognitive evaluation in multiple sclerosis. The Brief International Cognitive Assessment for MS (BICAMS) battery includes tests of mental processing speed and memory. Recognizing that resources for validation will vary internationally, the committee identified validation priorities, to facilitate international acceptance of BICAMS. Practical matters pertaining to implementation across different languages and countries were discussed. Five steps to achieve optimal psychometric validation were proposed. In Step 1, test stimuli should be standardized for the target culture or language under consideration. In Step 2, examiner instructions must be standardized and translated, including all information from manuals necessary for administration and interpretation. In Step 3, samples of at least 65 healthy persons should be studied for normalization, matched to patients on demographics such as age, gender and education. The objective of Step 4 is test-retest reliability, which can be investigated in a small sample of MS and/or healthy volunteers over 1-3 weeks. Finally, in Step 5, criterion validity should be established by comparing MS and healthy controls. At this time, preliminary studies are underway in a number of countries as we move forward with this international assessment tool for cognition in MS.
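
    The test-retest reliability check of Step 4 can be as simple as correlating the two administrations. A minimal sketch with hypothetical scores (the data, sample size, and the use of Pearson r rather than an intraclass correlation are assumptions of this example):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between test and retest scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical processing-speed scores for 8 volunteers, 2 weeks apart
test1 = [55, 48, 62, 50, 58, 45, 60, 52]
test2 = [57, 47, 60, 52, 59, 46, 61, 50]
r = pearson_r(test1, test2)  # values near 1 indicate a stable measurement
```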

  7. Sensing cocaine in saliva with attenuated total reflection infrared (ATR-IR) spectroscopy combined with a one-step extraction method

    Science.gov (United States)

    Hans, Kerstin M.-C.; Gianella, Michele; Sigrist, Markus W.

    2012-03-01

    On-site drug tests have gained importance, e.g., for protecting society from impaired drivers. Since today's drug tests mostly give only qualitative (positive/negative) results, there is a great need for a reliable, portable and preferably quantitative drug test. In the IrSens project we aim to bridge this gap with the development of an optical sensor platform based on infrared spectroscopy, focusing on cocaine detection in saliva. We combine a one-step extraction method, a sample drying technique and infrared attenuated total reflection (ATR) spectroscopy. As a first step we have developed an extraction technique that allows us to extract cocaine from saliva into an almost infrared-transparent solvent and to record ATR spectra with a commercially available Fourier transform infrared spectrometer. To the best of our knowledge this is the first time that such a simple and easy-to-use one-step extraction method has been used to transfer cocaine from saliva into an organic solvent and detect it quantitatively. With this new method we are able to reach a current limit of detection around 10 μg/ml. This new extraction method could also be applied to waste water monitoring and to controlling the caffeine content of beverages.

  8. Necessary steps in factor analysis : Enhancing validation studies of educational instruments. The PHEEM applied to clerks as an example

    NARCIS (Netherlands)

    Schonrock-Adema, Johanna; Heijne-Penninga, Marjolein; van Hell, Elisabeth A.; Cohen-Schotanus, Janke

    2009-01-01

    Background: The validation of educational instruments, in particular the employment of factor analysis, can be improved in many instances. Aims: To demonstrate the superiority of a sophisticated method of factor analysis, implying an integration of recommendations described in the factor analysis

  9. Validation of a same-day real-time PCR method for screening of meat and carcass swabs for Salmonella

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Krause, Michael; Josefsen, Mathilde Hartmann

    2009-01-01

    of the published PCR methods for Salmonella have been validated in collaborative studies. This study describes a validation including comparative and collaborative trials, based on the recommendations from the Nordic organization for validation of alternative microbiological methods (NordVal) of a same-day, non....... Partly based on results obtained in this study, the method has obtained NordVal approval for analysis of Salmonella in meat and carcass swabs. The PCR method was transferred to a production laboratory and the performance was compared with the BAX Salmonella test on 39 pork samples artificially...... contaminated with Salmonella. There was no significant difference in the results obtained by the two methods. Conclusion: The real-time PCR method for detection of Salmonella in meat and carcass swabs was validated in comparative and collaborative trials according to NordVal recommendations. The PCR method...

  10. Oxcarbazepine: validation and application of an analytical method

    Directory of Open Access Journals (Sweden)

    Paula Cristina Rezende Enéas

    2010-06-01

    Oxcarbazepine (OXC) is an important anticonvulsant and mood-stabilizing drug. A pharmacopoeial monograph for OXC is not yet available and therefore the development and validation of a new analytical method for quantification of this drug is essential. In the present study, a UV spectrophotometric method for the determination of OXC was developed. The various parameters, such as linearity, precision, accuracy and specificity, were studied according to International Conference on Harmonization guidelines. Batches of 150 mg OXC capsules were prepared and analyzed using the validated UV method. The formulations were also evaluated for parameters including drug-excipient compatibility, flowability, uniformity of weight, disintegration time, assay, uniformity of content and the amount of drug dissolved during the first hour.

  11. Dependability validation by means of fault injection: method, implementation, application

    International Nuclear Information System (INIS)

    Arlat, Jean

    1990-01-01

    This dissertation presents theoretical and practical results concerning the use of fault injection as a means of testing fault tolerance in the framework of the experimental dependability validation of computer systems. The dissertation first presents the state of the art of published work on fault injection, encompassing both hardware (fault simulation, physical fault injection) and software (mutation testing) issues. Next, the major attributes of fault injection (faults and their activation, experimental readouts and measures) are characterized taking into account: i) the abstraction levels used to represent the system during the various phases of its development (analytical, empirical and physical models), and ii) the validation objectives (verification and evaluation). An evaluation method is subsequently proposed that combines the analytical modeling approaches (Monte Carlo simulations, closed-form expressions, Markov chains) used for the representation of the fault occurrence process and the experimental fault injection approaches (fault simulation and physical injection) characterizing the error processing and fault treatment provided by the fault tolerance mechanisms. An experimental tool - MESSALINE - is then defined and presented. This tool enables physical faults to be injected in a hardware and software prototype of the system to be validated. Finally, the application of MESSALINE to testing two fault-tolerant systems possessing very dissimilar features, and the utilization of the experimental results obtained - both as design feedback and for dependability measure evaluation - are used to illustrate the relevance of the method. (author) [fr

  12. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Directory of Open Access Journals (Sweden)

    Shankarjee Krishnamoorthi

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  13. Simulation Methods and Validation Criteria for Modeling Cardiac Ventricular Electrophysiology.

    Science.gov (United States)

    Krishnamoorthi, Shankarjee; Perotti, Luigi E; Borgstrom, Nils P; Ajijola, Olujimi A; Frid, Anna; Ponnaluri, Aditya V; Weiss, James N; Qu, Zhilin; Klug, William S; Ennis, Daniel B; Garfinkel, Alan

    2014-01-01

    We describe a sequence of methods to produce a partial differential equation model of the electrical activation of the ventricles. In our framework, we incorporate the anatomy and cardiac microstructure obtained from magnetic resonance imaging and diffusion tensor imaging of a New Zealand White rabbit, the Purkinje structure and the Purkinje-muscle junctions, and an electrophysiologically accurate model of the ventricular myocytes and tissue, which includes transmural and apex-to-base gradients of action potential characteristics. We solve the electrophysiology governing equations using the finite element method and compute both a 6-lead precordial electrocardiogram (ECG) and the activation wavefronts over time. We are particularly concerned with the validation of the various methods used in our model and, in this regard, propose a series of validation criteria that we consider essential. These include producing a physiologically accurate ECG, a correct ventricular activation sequence, and the inducibility of ventricular fibrillation. Among other components, we conclude that a Purkinje geometry with a high density of Purkinje muscle junctions covering the right and left ventricular endocardial surfaces as well as transmural and apex-to-base gradients in action potential characteristics are necessary to produce ECGs and time activation plots that agree with physiological observations.

  14. Development and validation of an ultra-high performance liquid chromatography-tandem mass spectrometry method to measure creatinine in human urine.

    Science.gov (United States)

    Fraselle, S; De Cremer, K; Coucke, W; Glorieux, G; Vanmassenhove, J; Schepers, E; Neirynck, N; Van Overmeire, I; Van Loco, J; Van Biesen, W; Vanholder, R

    2015-04-15

    Despite decades of creatinine measurement in biological fluids using a large variety of analytical methods, an accurate determination of this compound remains challenging. Especially with the novel trend to assess biomarkers on large sample sets preserved in biobanks, a simple and fast method that could cope with both a high sample throughput and a low volume of sample is still of interest. In answer to these challenges, a fast and accurate ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed to measure creatinine in small volumes of human urine. In this method, urine samples are simply diluted with a basic mobile phase and injected directly under positive electrospray ionization (ESI) conditions, without further purification steps. The combination of an important dilution factor (10⁴ times), due to the use of a very sensitive triple quadrupole mass spectrometer (XEVO TQ), and the addition of creatinine-d3 as internal standard completely eliminates matrix effects coming from the urine. The method was validated in-house in 2012 according to the EMA guideline on bioanalytical method validation using Certified Reference samples from the German External Quality Assessment Scheme (G-Equas) proficiency test. All obtained results for accuracy and recovery are within the authorized tolerance ranges defined by G-Equas. The method is linear between 0 and 5 g/L, with LOD and LOQ of 5 × 10⁻³ g/L and 10⁻² g/L, respectively. The repeatability (CVr = 1.03-2.07%) and intra-laboratory reproducibility (CVRW = 1.97-2.40%) satisfy the EMA 2012 guideline. The validated method was first applied to perform the German G-Equas proficiency test rounds 51 and 53, in 2013 and 2014, respectively. The obtained results were again all within the accepted tolerance ranges and very close to the reference values defined by the organizers of the proficiency test scheme, demonstrating an excellent accuracy of the developed method.
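
    The repeatability and reproducibility figures quoted above are coefficients of variation. A minimal sketch of that computation on hypothetical replicate data (the values below are illustrative, not the study's measurements):

```python
import math

def coefficient_of_variation(values):
    """CV% = 100 * sample standard deviation / mean, the precision
    measure reported as CVr (repeatability) or CVRW (reproducibility)."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical replicate creatinine results (g/L) for one urine sample
replicates = [1.52, 1.49, 1.51, 1.53, 1.50, 1.48]
cv = coefficient_of_variation(replicates)
```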

  15. Development and Validation of High Performance Liquid Chromatography Method for Determination Atorvastatin in Tablet

    Science.gov (United States)

    Yugatama, A.; Rohmani, S.; Dewangga, A.

    2018-03-01

    Atorvastatin is the primary choice for dyslipidemia treatment. Due to the patent expiration of atorvastatin, the pharmaceutical industry makes copies of the drug. Therefore, the development of methods for tablet quality tests involving the atorvastatin concentration in tablets needs to be performed. The purpose of this research was to develop and validate a simple analytical method for atorvastatin tablets by HPLC. The HPLC system used in this experiment consisted of a Cosmosil C18 column (150 x 4.6 mm, 5 µm) as the stationary phase (reversed-phase chromatography), a mixture of methanol and water at pH 3 (80:20 v/v) as the mobile phase, a flow rate of 1 mL/min, and a UV detector at a wavelength of 245 nm. The validation parameters included selectivity, linearity, accuracy, precision, limit of detection (LOD), and limit of quantitation (LOQ). The results of this study indicate that the developed method showed good selectivity, linearity, accuracy, precision, LOD, and LOQ for analysis of atorvastatin tablet content. The LOD and LOQ were 0.2 and 0.7 ng/mL, and the linearity range was 20-120 ng/mL.
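
    The abstract reports LOD and LOQ without stating how they were derived. One common route, assumed here, is the ICH convention LOD = 3.3σ/S and LOQ = 10σ/S, with S the calibration slope and σ the residual standard deviation of the regression; the calibration data below are hypothetical:

```python
import math

def lod_loq(conc, resp):
    """Estimate LOD and LOQ from a calibration line using the ICH-style
    formulas LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where S is the slope
    and sigma the residual standard deviation of the regression."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, resp)) / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(conc, resp)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration: concentration (ng/mL) vs. peak area
conc = [20, 40, 60, 80, 100, 120]
area = [410, 805, 1190, 1620, 2010, 2395]
lod, loq = lod_loq(conc, area)  # loq = (10/3.3) * lod by construction
```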

  16. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  17. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database to be used in reviewing validation of core analysis methods of LWRs on the coming occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO 2 and MOX fuel rods, (3) analysis of isotopic composition data for UO 2 and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  18. Validation study of core analysis methods for full MOX BWR

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    JNES has been developing a technical database to be used in reviewing validation of core analysis methods of LWRs on the coming occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO{sub 2} and MOX fuel rods, (3) analysis of isotopic composition data for UO{sub 2} and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  19. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  20. The Three-Step Test-Interview (TSTI): An observation-based method for pretesting self-completion questionnaires

    Directory of Open Access Journals (Sweden)

    Tony Hak

    2008-12-01

    The Three-Step Test-Interview (TSTI) is a method for pretesting a self-completion questionnaire by first observing actual instances of interaction between the instrument and respondents (the response process) before exploring the reasons for this behavior. The TSTI consists of the following three steps: 1. (Respondent-driven) observation of response behavior. 2. (Interviewer-driven) follow-up probing aimed at remedying gaps in observational data. 3. (Interviewer-driven) debriefing aimed at eliciting experiences and opinions. We describe the aims and the techniques of these three steps, and then discuss pilot studies in which we tested the feasibility and the productivity of the TSTI by applying it in testing three rather different types of questionnaires. In the first study, the quality of a set of questions about alcohol consumption was assessed. The TSTI proved to be productive in identifying problems that resulted from a mismatch between the ‘theory’ underlying the questions on the one hand, and features of a respondent’s actual behavior and biography on the other hand. In the second pilot study, Dutch and Norwegian versions of an attitude scale, the 20-item Illegal Aliens Scale, were tested. The TSTI appeared to be productive in identifying problems that resulted from different ‘response strategies’. In the third pilot, a two-year longitudinal study, the TSTI appeared to be an effective method for documenting processes of ‘response shift’ in repeated measurements of health-related Quality of Life (QoL).

  1. Statistical methods for mechanistic model validation: Salt Repository Project

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1988-07-01

    As part of the Department of Energy's Salt Repository Program, Pacific Northwest Laboratory (PNL) is studying the emplacement of nuclear waste containers in a salt repository. One objective of the SRP program is to develop an overall waste package component model which adequately describes such phenomena as container corrosion, waste form leaching, spent fuel degradation, etc., which are possible in the salt repository environment. The form of this model will be proposed, based on scientific principles and relevant salt repository conditions with supporting data. The model will be used to predict the future characteristics of the near field environment. This involves several different submodels such as the amount of time it takes a brine solution to contact a canister in the repository, how long it takes a canister to corrode and expose its contents to the brine, the leach rate of the contents of the canister, etc. These submodels are often tested in a laboratory and should be statistically validated (in this context, validate means to demonstrate that the model adequately describes the data) before they can be incorporated into the waste package component model. This report describes statistical methods for validating these models. 13 refs., 1 fig., 3 tabs

  2. Avoid the tsunami of the Dirac sea in the imaginary time step method

    International Nuclear Information System (INIS)

    Zhang, Ying; Liang, Haozhao; Meng, Jie

    2010-01-01

    The discrete single-particle spectra in both the Fermi and Dirac sea have been calculated by the imaginary time step (ITS) method for the Schroedinger-like equation after avoiding the "tsunami" of the Dirac sea, i.e. the diving behavior of the single-particle level into the Dirac sea in the direct application of the ITS method for the Dirac equation. It is found that by the transform from the Dirac equation to the Schroedinger-like equation, the single-particle spectra, which extend from the positive to the negative infinity, can be separately obtained by the ITS evolution in either the Fermi sea or the Dirac sea. Identical results with those in the conventional shooting method have been obtained via the ITS evolution for the equivalent Schroedinger-like equation, which demonstrates the feasibility, practicality and reliability of the present algorithm and dispels the doubts on the ITS method in the relativistic system. (author)
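
The core idea the abstract describes can be sketched with a toy non-relativistic example (illustrative only; the grid, time step and harmonic potential below are assumptions, not the authors' relativistic setup): repeatedly applying psi <- psi - dt*H*psi with renormalisation drives an initial guess toward the lowest eigenstate, which is exactly why a spectrum unbounded from below (the Dirac sea) makes naive ITS evolution "dive".

```python
import math

# Toy imaginary-time-step (ITS) evolution for a 1-D non-relativistic
# oscillator (hbar = m = omega = 1); grid and step sizes are assumptions.
n, dx, dt = 101, 0.1, 0.001
x = [(i - n // 2) * dx for i in range(n)]
v = [0.5 * xi * xi for xi in x]               # harmonic potential
psi = [math.exp(-xi * xi) for xi in x]        # rough initial guess

def h_apply(psi):
    """Apply H = -0.5 d2/dx2 + V on the grid (wavefunction ~0 at edges)."""
    out = [0.0] * n
    for i in range(1, n - 1):
        lap = (psi[i - 1] - 2.0 * psi[i] + psi[i + 1]) / dx ** 2
        out[i] = -0.5 * lap + v[i] * psi[i]
    return out

for _ in range(5000):                         # psi <- psi - dt*H*psi, renormalised
    hpsi = h_apply(psi)
    psi = [p - dt * hp for p, hp in zip(psi, hpsi)]
    norm = (sum(p * p for p in psi) * dx) ** 0.5
    psi = [p / norm for p in psi]

hpsi = h_apply(psi)
energy = sum(p * hp for p, hp in zip(psi, hpsi)) * dx   # converges toward 0.5
```

Because the factor (1 - dt*E) is largest for the smallest eigenvalue of a bounded-below spectrum, the iteration filters out excited states; applied directly to a Dirac Hamiltonian the same iteration would sink into the negative continuum, which is the "tsunami" the Schroedinger-like transform avoids.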

  3. A method for the statistical interpretation of friction ridge skin impression evidence: Method development and validation.

    Science.gov (United States)

    Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J

    2018-04-03

    The forensic fingerprint community has faced increasing criticism from scientific and legal commentators, who challenge the validity and reliability of fingerprint evidence because there has been no empirically demonstrable basis for evaluating and reporting the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method gives fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and to report the findings in a more transparent and standardized fashion, with clearly defined criteria for conclusions and known error-rate information, thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.
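
The mated/non-mated evaluation described above can be illustrated with a minimal sketch (hypothetical scores and threshold; this is not FRStat's model): specificity is the fraction of non-mated comparisons correctly rejected at a decision threshold, sensitivity the fraction of mated comparisons correctly accepted.

```python
# Illustrative sketch, not FRStat: classify a comparison as a "match" when
# its similarity score reaches a threshold, then count error rates.

def rates(mated_scores, nonmated_scores, threshold):
    """Return (sensitivity, specificity) for the rule score >= threshold."""
    tp = sum(1 for s in mated_scores if s >= threshold)      # true matches kept
    tn = sum(1 for s in nonmated_scores if s < threshold)    # non-matches rejected
    return tp / len(mated_scores), tn / len(nonmated_scores)

# Hypothetical score sets for mated and non-mated comparisons:
mated = [0.91, 0.87, 0.95, 0.78, 0.88]
nonmated = [0.12, 0.33, 0.25, 0.41, 0.05]
sens, spec = rates(mated, nonmated, threshold=0.5)
```

In practice the threshold trades sensitivity against specificity; a reported specificity above 99% means fewer than 1% of non-mated comparisons score past the chosen criterion.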

  4. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-05-27

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.
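
Two of the listed performance characteristics lend themselves to a short worked sketch (an illustration of common ICH-style formulas with made-up data, not a prescription from the reviewed guides): linearity via a least-squares calibration line, and detection/quantitation limits estimated as 3.3*s/slope and 10*s/slope from the residual standard deviation s of the fit.

```python
# Hedged sketch of two common validation calculations: calibration-line
# linearity and ICH-style LOD/LOQ estimates. Data values are hypothetical.

def fit_line(x, y):
    """Least-squares fit y = slope*x + intercept; also return residual s."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    ss = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    s = (ss / (n - 2)) ** 0.5       # residual std dev, n - 2 degrees of freedom
    return slope, intercept, s

def lod_loq(slope, s):
    """ICH-style detection and quantitation limits from slope and residual s."""
    return 3.3 * s / slope, 10 * s / slope

conc = [1.0, 2.0, 4.0, 8.0, 16.0]      # standard concentrations (hypothetical)
resp = [2.1, 4.0, 8.2, 15.9, 32.1]     # instrument responses (hypothetical)
slope, intercept, s = fit_line(conc, resp)
lod, loq = lod_loq(slope, s)
```

The same slope/residual quantities feed the linearity acceptance criteria; range is then the interval over which the line and precision both hold.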

  5. Synthesis and photocatalytic application of α-Fe2O3/ZnO fine particles prepared by two-step chemical method

    Directory of Open Access Journals (Sweden)

    Patij Shah

    2013-06-01

    Composite iron oxide-zinc oxide (α-Fe2O3/ZnO) was synthesized by a two-step method: in the first step, uniform α-Fe2O3 particles were prepared through a hydrolysis process of ferric chloride at 80 °C. In the second step, ZnO particles were incorporated into the α-Fe2O3 particles by a zinc acetate [Zn(Ac)2·2H2O] assisted hydrothermal method at low temperature (90 °C). The α-Fe2O3 and ZnO phases were identified by XRD and energy dispersive X-ray analysis (EDX). The photoreactivities of α-Fe2O3/ZnO nanoparticles under UV irradiation were quantified by the degradation of formaldehyde.

  6. Development and Validation of a Stability-Indicating LC-UV Method ...

    African Journals Online (AJOL)

    Development and Validation of a Stability-Indicating LC-UV Method for Simultaneous Determination of Ketotifen and Cetirizine in Pharmaceutical Dosage Forms. ... 5 μm) using an isocratic mobile phase that consisted of acetonitrile and 10 mM disodium hydrogen phosphate buffer (pH 6.5) in a ratio of 45:55 % v/v at a flow ...

  7. Comparison of the performances and validation of three methods for ...

    African Journals Online (AJOL)

    SARAH

    2014-02-28

    Feb 28, 2014 ... bacteria in Norwegian slaughter pigs. Int J. Food Microbiol 1, 301–309. [NCFA] Nordic Committee of Food Analysis (1996). Yersinia enterocolitica Detection in foods 117,. 3rd,edn,1-12. Nowak, B., Mueffling, T.V., Caspari, K. and Hartung, J. 2006 Validation of a method for the detection of virulent Yersinia ...

  8. Validation of Cyanoacrylate Method for Collection of Stratum Corneum in Human Skin for Lipid Analysis

    DEFF Research Database (Denmark)

    Jungersted, JM; Hellgren, Lars; Drachmann, Tue

    2010-01-01

    Background and Objective: Lipids in the stratum corneum (SC) are of major importance for the skin barrier function. Many different methods have been used for the collection of SC for the analysis of SC lipids. The objective of the present study was to validate the cyanoacrylate method for the collection of SC in human skin for lipid analysis.

  9. Developing Teaching Material Software Assisted for Numerical Methods

    Science.gov (United States)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision shows the importance of two things in school mathematics: knowing the mathematics of the 21st century and the need to continue improving mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing of the existence of various tools for mathematical activity. One of the significant challenges in mathematical learning is how to teach students abstract concepts. In this case, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity easier for students to accept. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without spending time calculating complex computing problems manually. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. The process of developing the teaching material starts with the defining step, in which the learning material is designed based on information obtained from the early analysis of learners, materials, and supporting tasks; this is followed by the design step and, finally, the development step. The resulting software-assisted teaching materials for numerical methods are valid in content, and the validators' assessment of the teaching material is good: it can be used with little revision.

  10. A Renormalisation Group Method. V. A Single Renormalisation Group Step

    Science.gov (United States)

    Brydges, David C.; Slade, Gordon

    2015-05-01

    This paper is the fifth in a series devoted to the development of a rigorous renormalisation group method applicable to lattice field theories containing boson and/or fermion fields, and comprises the core of the method. In the renormalisation group method, increasingly large scales are studied in a progressive manner, with an interaction parametrised by a field polynomial which evolves with the scale under the renormalisation group map. In our context, the progressive analysis is performed via a finite-range covariance decomposition. Perturbative calculations are used to track the flow of the coupling constants of the evolving polynomial, but on their own perturbative calculations are insufficient to control error terms and to obtain mathematically rigorous results. In this paper, we define an additional non-perturbative coordinate, which together with the flow of coupling constants defines the complete evolution of the renormalisation group map. We specify conditions under which the non-perturbative coordinate is contractive under a single renormalisation group step. Our framework is essentially combinatorial, but its implementation relies on analytic results developed earlier in the series of papers. The results of this paper are applied elsewhere to analyse the critical behaviour of the 4-dimensional continuous-time weakly self-avoiding walk and of the 4-dimensional n-component |φ|4 model. In particular, the existence of a logarithmic correction to mean-field scaling for the susceptibility can be proved for both models, together with other facts about critical exponents and critical behaviour.

  11. Two-step design method for highly compact three-dimensional freeform optical system for LED surface light source.

    Science.gov (United States)

    Mao, Xianglong; Li, Hongtao; Han, Yanjun; Luo, Yi

    2014-10-20

    Designing an illumination system for a surface light source with a strict compactness requirement is quite challenging, especially in the general three-dimensional (3D) case. In accordance with the two key features of an expected illumination distribution, i.e., a well-controlled boundary and a precise illumination pattern, a two-step design method is proposed in this paper for highly compact 3D freeform illumination systems. In the first step, a target shape scaling strategy is combined with an iterative feedback modification algorithm to generate an optimized freeform optical system with a well-controlled boundary of the target distribution. In the second step, a set of selected radii of the system obtained in the first step are optimized to further improve the illuminating quality within the target region. The method is quite flexible and effective for designing highly compact optical systems with almost no restriction on the shape of the desired target field. As examples, three highly compact freeform lenses with a ratio of the lens center height h to the maximum dimension D of the source of no more than 2.5:1 are designed for LED surface light sources to form a uniform illumination distribution on a rectangular, a cross-shaped and a complex cross pierced target plane, respectively. High light control efficiency of η > 0.7 as well as a low relative standard illumination deviation of RSD < 0.07 are obtained simultaneously for all three design examples.

  12. Stepwise Procedure for Development and Validation of a Multipesticide Method

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    The stepwise procedure for development and the validation of so called multi-pesticide methods are described. Principles, preliminary actions, criteria for the selection of chromatographic separation, detection and performance verification of multi-pesticide methods are outlined. Also the long term repeatability and reproducibility, as well as the necessity for the documentation of laboratory work are highlighted. Appendix I hereof describes in detail the calculation of calibration parameters, whereas Appendix II focuses on the calculation of the significance of differences of concentrations obtained on two different separation columns. (author)

  13. A GPU-accelerated semi-implicit fractional step method for numerical solutions of incompressible Navier-Stokes equations

    Science.gov (United States)

    Ha, Sanghyun; Park, Junshin; You, Donghyun

    2017-11-01

    Utility of the computational power of modern Graphics Processing Units (GPUs) is elaborated for solutions of incompressible Navier-Stokes equations which are integrated using a semi-implicit fractional-step method. Due to its serial and bandwidth-bound nature, the present choice of numerical methods is considered to be a good candidate for evaluating the potential of GPUs for solving Navier-Stokes equations using non-explicit time integration. An efficient algorithm is presented for GPU acceleration of the Alternating Direction Implicit (ADI) and the Fourier-transform-based direct solution method used in the semi-implicit fractional-step method. OpenMP is employed for concurrent collection of turbulence statistics on a CPU while the Navier-Stokes equations are computed on a GPU. Extension to multiple NVIDIA GPUs is implemented using NVLink, supported by the Pascal architecture. Performance of the present method is evaluated on multiple Tesla P100 GPUs and compared with a single-core Xeon E5-2650 v4 CPU in simulations of boundary-layer flow over a flat plate. Supported by the National Research Foundation of Korea (NRF) Grant funded by the Korea government (Ministry of Science, ICT and Future Planning NRF-2016R1E1A2A01939553, NRF-2014R1A2A1A11049599, and Ministry of Trade, Industry and Energy 201611101000230).
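
Each ADI sweep in such a semi-implicit fractional-step solver reduces a grid direction to many independent tridiagonal systems, which is what makes the work batchable on a GPU. A minimal serial sketch of the underlying tridiagonal (Thomas) solve follows; it is illustrative only, not the authors' CUDA implementation, and the example system is made up.

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system Ax = d, where a is the sub-diagonal,
    b the main diagonal and c the super-diagonal (a[0], c[-1] unused)."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n                              # back substitution
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: a diffusion-like tridiagonal system (3 on the diagonal, -1 off it).
x = thomas([0, -1, -1, -1], [3, 3, 3, 3], [-1, -1, -1, 0], [1, 1, 1, 1])
```

On a GPU, one such solve per grid line runs in parallel across lines (or is replaced by cyclic reduction), while the serial dependence within each line is what the abstract calls the method's serial nature.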

  14. Validation Testing of the Nitric Acid Dissolution Step Within the K Basin Sludge Pretreatment Process

    International Nuclear Information System (INIS)

    AJ Schmidt; CH Delegard; KL Silvers; PR Bredt; CD Carlson; EW Hoppe; JC Hayes; DE Rinehart; SR Gano; BM Thornton

    1999-01-01

    The work described in this report involved comprehensive bench-scale testing of nitric acid (HNO3) dissolution of actual sludge materials from the Hanford K East (KE) Basin to confirm the baseline chemical pretreatment process. In addition, process monitoring and material balance information was collected to support the development and refinement of process flow diagrams. The testing was performed by Pacific Northwest National Laboratory (PNNL) for the US Department of Energy's Office of Spent Fuel Stabilization (EM-67) and Numatec Hanford Corporation (NHC) to assist in the development of the K Basin Sludge Pretreatment Process. The baseline chemical pretreatment process for K Basin sludge is nitric acid dissolution of all particulate material passing a 1/4-in. screen. The acid-insoluble fraction (residual solids) will be stabilized (possibly by chemical leaching/rinsing and grouting), packaged, and transferred to the Hanford Environmental Restoration Disposal Facility (ERDF). The liquid fraction is to be diluted with depleted uranium for uranium criticality safety and iron nitrate for plutonium criticality safety, and neutralized with sodium hydroxide. The liquid fraction and associated precipitates are to be stored in the Hanford Tank Waste Remediation Systems (TWRS) pending vitrification. It is expected that most of the polychlorinated biphenyls (PCBs), associated with some K Basin sludges, will remain with the residual solids for ultimate disposal to ERDF. Filtration and precipitation during the neutralization step will further remove trace quantities of PCBs within the liquid fraction. The purpose of the work discussed in this report was to examine the dissolution behavior of actual KE Basin sludge materials at baseline flowsheet conditions and validate the dissolution process step through bench-scale testing. The progress of the dissolution was evaluated by measuring the solution electrical conductivity and concentrations of key species in the dissolver

  15. Validation Testing of the Nitric Acid Dissolution Step Within the K Basin Sludge Pretreatment Process

    Energy Technology Data Exchange (ETDEWEB)

    AJ Schmidt; CH Delegard; KL Silvers; PR Bredt; CD Carlson; EW Hoppe; JC Hayes; DE Rinehart; SR Gano; BM Thornton

    1999-03-24

    The work described in this report involved comprehensive bench-scale testing of nitric acid (HNO{sub 3}) dissolution of actual sludge materials from the Hanford K East (KE) Basin to confirm the baseline chemical pretreatment process. In addition, process monitoring and material balance information was collected to support the development and refinement of process flow diagrams. The testing was performed by Pacific Northwest National Laboratory (PNNL) for the US Department of Energy's Office of Spent Fuel Stabilization (EM-67) and Numatec Hanford Corporation (NHC) to assist in the development of the K Basin Sludge Pretreatment Process. The baseline chemical pretreatment process for K Basin sludge is nitric acid dissolution of all particulate material passing a 1/4-in. screen. The acid-insoluble fraction (residual solids) will be stabilized (possibly by chemical leaching/rinsing and grouting), packaged, and transferred to the Hanford Environmental Restoration Disposal Facility (ERDF). The liquid fraction is to be diluted with depleted uranium for uranium criticality safety and iron nitrate for plutonium criticality safety, and neutralized with sodium hydroxide. The liquid fraction and associated precipitates are to be stored in the Hanford Tank Waste Remediation Systems (TWRS) pending vitrification. It is expected that most of the polychlorinated biphenyls (PCBs), associated with some K Basin sludges, will remain with the residual solids for ultimate disposal to ERDF. Filtration and precipitation during the neutralization step will further remove trace quantities of PCBs within the liquid fraction. The purpose of the work discussed in this report was to examine the dissolution behavior of actual KE Basin sludge materials at baseline flowsheet conditions and validate the dissolution process step through bench-scale testing. The progress of the dissolution was evaluated by measuring the solution electrical conductivity and concentrations of key species in the

  16. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiment (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating

  17. Validation of verbal autopsy methods using hospital medical records: a case study in Vietnam.

    Science.gov (United States)

    Tran, Hong Thi; Nguyen, Hoa Phuong; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-05-18

    Information on causes of death (COD) is crucial for measuring the health outcomes of populations and progress towards the Sustainable Development Goals. In many countries such as Vietnam where the civil registration and vital statistics (CRVS) system is dysfunctional, information on vital events will continue to rely on verbal autopsy (VA) methods. This study assesses the validity of VA methods used in Vietnam, and provides recommendations on methods for implementing VA validation studies in Vietnam. This validation study was conducted on a sample of 670 deaths from a recent VA study in Quang Ninh province. The study covered 116 cases from this sample, which met three inclusion criteria: a) the death occurred within 30 days of discharge after last hospitalisation, and b) medical records (MRs) for the deceased were available from respective hospitals, and c) the medical record mentioned that the patient was terminally ill at discharge. For each death, the underlying cause of death (UCOD) identified from MRs was compared to the UCOD from VA. The validity of VA diagnoses for major causes of death was measured using sensitivity, specificity and positive predictive value (PPV). The sensitivity of VA was at least 75% in identifying some leading CODs such as stroke, road traffic accidents and several site-specific cancers. However, sensitivity was less than 50% for other important causes including ischemic heart disease, chronic obstructive pulmonary diseases, and diabetes. Overall, there was 57% agreement between UCOD from VA and MR, which increased to 76% when multiple causes from VA were compared to UCOD from MR. Our findings suggest that VA is a valid method to ascertain UCOD in contexts such as Vietnam. Furthermore, within cultural contexts in which patients prefer to die at home instead of a healthcare facility, using the available MRs as the gold standard may be meaningful to the extent that recall bias from the interval between last hospital discharge and death
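
The validity measures used in the study can be sketched as follows (the cause lists below are hypothetical, not the study's data): for each cause, sensitivity, specificity and positive predictive value are computed from the agreement between VA-assigned and MR-assigned underlying causes of death.

```python
# Hypothetical VA vs MR underlying-cause assignments for five deaths:
va_codes = ["stroke", "stroke", "ihd", "copd", "stroke"]
mr_codes = ["stroke", "ihd", "ihd", "copd", "copd"]

def per_cause_metrics(va, mr, cause):
    """Sensitivity, specificity and PPV of VA for one cause, MR as gold standard."""
    tp = sum(1 for a, m in zip(va, mr) if a == cause and m == cause)
    fp = sum(1 for a, m in zip(va, mr) if a == cause and m != cause)
    fn = sum(1 for a, m in zip(va, mr) if a != cause and m == cause)
    tn = len(va) - tp - fp - fn
    sens = tp / (tp + fn) if tp + fn else None   # true cases VA finds
    spec = tn / (tn + fp) if tn + fp else None   # non-cases VA rejects
    ppv = tp / (tp + fp) if tp + fp else None    # VA diagnoses that are right
    return sens, spec, ppv

stroke_metrics = per_cause_metrics(va_codes, mr_codes, "stroke")
```

Overall chance-corrected agreement (e.g. kappa) is often reported alongside these per-cause figures, since a cause can show high sensitivity yet poor PPV when VA over-assigns it.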

  18. HVS-based quantization steps for validation of digital cinema extended bitrates

    Science.gov (United States)

    Larabi, M.-C.; Pellegrin, P.; Anciaux, G.; Devaux, F.-O.; Tulet, O.; Macq, B.; Fernandez, C.

    2009-02-01

    In Digital Cinema, video compression must be as transparent as possible to provide the best image quality to the audience. The goal of compression is to simplify transport, storage, distribution and projection of films. For all those tasks, equipment needs to be developed. It is thus mandatory to reduce the complexity of the equipment by imposing limitations in the specifications. In this sense, the DCI has fixed the maximum bitrate for a compressed stream at 250 Mbps, independently of the input format (4K/24fps, 2K/48fps or 2K/24fps). This parameter is discussed in this paper because it is not consistent to double or quadruple the input rate without increasing the output rate. The work presented in this paper is intended to define quantization steps ensuring visually lossless compression. Two steps are followed: first, the effect of each subband is evaluated separately; then, the scaling ratio is found. The obtained results show that it is necessary to increase the bitrate limit for cinema material in order to achieve visually lossless quality.

  19. MR-based water content estimation in cartilage: design and validation of a method

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Kristiansen, Maja Sophie; Ringgaard, Steffen

    Purpose: Design and validation of an MR-based method that allows the calculation of the water content in cartilage tissue. Methods and Materials: Cartilage tissue T1 map based water content MR sequences were used on a system stabilized at 37 degrees Celsius. The T1 map intensity signal was analyzed on 6 cartilage samples from living animals (pig) and on 8 gelatin samples whose water content was already known. For the data analysis a T1 intensity signal map software analyzer was used. Finally, the method was validated by measuring and comparing 3 more cartilage samples in a living animal (pig). T1 map based water content sequences can provide information that, after being analyzed using a T1-map analysis software, can be interpreted as the water contained inside a cartilage tissue. The amount of water estimated using this method was similar to the one obtained by the dry-freeze procedure...

  20. Development and validation of stability indicating UPLC assay method for ziprasidone active pharma ingredient

    Directory of Open Access Journals (Sweden)

    Sonam Mittal

    2012-01-01

    Background: Ziprasidone, a novel antipsychotic, exhibits potent, highly selective antagonistic activity at D2 and 5HT2A receptors. A literature survey for ziprasidone revealed several analytical methods based on different techniques, but no UPLC method has been reported so far. Aim: The aim of this paper is to present a simple and rapid stability-indicating isocratic ultra performance liquid chromatographic (UPLC) method which was developed and validated for the determination of ziprasidone active pharmaceutical ingredient. Forced degradation of ziprasidone was studied under acid, base, oxidative hydrolysis, thermal stress and photo stress conditions. Materials and Methods: The quantitative determination of ziprasidone was performed on a Supelco analytical column (100×2.1 mm i.d., 2.7 μm) with 10 mM ammonium acetate buffer (pH 6.7) and acetonitrile (ACN) as mobile phase in the ratio 55:45 (buffer:ACN) at a flow rate of 0.35 ml/min. UV detection was made at 318 nm and the run time was 3 min. The developed UPLC method was validated as per ICH guidelines. Results and Conclusion: Mild degradation of the drug substance was observed during oxidative hydrolysis and considerable degradation during basic hydrolysis. During method validation, parameters such as precision, linearity, ruggedness, stability, robustness, and specificity were evaluated and remained within acceptable limits. The developed UPLC method was successfully applied to the assay of ziprasidone active pharmaceutical ingredient.

  1. Variable Step Integration Coupled with the Method of Characteristics Solution for Water-Hammer Analysis, A Case Study

    Science.gov (United States)

    Turpin, Jason B.

    2004-01-01

    One-dimensional water-hammer modeling involves the solution of two coupled non-linear hyperbolic partial differential equations (PDEs). These equations result from applying the principles of conservation of mass and momentum to flow through a pipe, usually with the assumption that the speed at which pressure waves propagate through the pipe is constant. In order to solve these equations for the quantities of interest (i.e., pressures and flow rates), they must first be converted to a system of ordinary differential equations (ODEs), either by approximating the spatial derivative terms with numerical techniques or by using the Method of Characteristics (MOC). The MOC approach is ideal in that no numerical approximation errors are introduced in converting the original system of PDEs into an equivalent system of ODEs. Unfortunately this resulting system of ODEs is bound by a time step constraint, so that when integrating the equations the solution can only be obtained at fixed time intervals. If the fluid system to be modeled also contains dynamic components (i.e. components that are best modeled by a system of ODEs), it may be necessary to take extremely small time steps during certain points of the model simulation in order to achieve stability and/or accuracy in the solution. Coupled together, the fixed time step constraint invoked by the MOC and the occasional need for extremely small time steps to obtain stability and/or accuracy can greatly increase simulation run times. As one solution to this problem, a method for combining variable step integration (VSI) algorithms with the MOC was developed for modeling water-hammer in systems with highly dynamic components. A case study is presented in which reverse flow through a dual-flapper check valve introduces a water-hammer event. The predicted pressure responses upstream of the check valve are compared with test data.
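
A minimal frictionless MOC sketch (illustrative parameters, not the case study's) shows the fixed time step dt = dx/a that the abstract refers to, and reproduces the classical Joukowsky surge B*Q0 at an instantaneously closed valve:

```python
# Frictionless Method of Characteristics (MOC) sketch for a single pipe:
# reservoir upstream, instantaneous valve closure downstream. B = a/(g*A)
# is the pipe impedance; all numbers below are illustrative assumptions.

def moc_step(H, Q, B):
    """Advance head H and flow Q one MOC time step dt = dx/a."""
    n = len(H)
    Hn, Qn = H[:], Q[:]
    for i in range(1, n - 1):                          # interior nodes: C+ and C-
        Hn[i] = 0.5 * (H[i-1] + H[i+1]) + 0.5 * B * (Q[i-1] - Q[i+1])
        Qn[i] = 0.5 * (Q[i-1] + Q[i+1]) + 0.5 / B * (H[i-1] - H[i+1])
    Hn[0] = H[0]                                       # reservoir: head fixed
    Qn[0] = Q[1] + (Hn[0] - H[1]) / B                  # C- characteristic
    Qn[-1] = 0.0                                       # closed valve: no flow
    Hn[-1] = H[-2] + B * (Q[-2] - Qn[-1])              # C+ characteristic
    return Hn, Qn

B, H0, Q0, nodes = 100.0, 50.0, 0.2, 5
H = [H0] * nodes            # steady initial head
Q = [Q0] * nodes            # steady initial flow
H, Q = moc_step(H, Q, B)    # one step after closure
surge = H[-1] - H0          # equals B*Q0 for frictionless flow (Joukowsky)
```

The update is only valid at multiples of dt = dx/a; coupling a stiff valve model into `moc_step` is exactly where the paper's variable-step integrator takes over between MOC grid times.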

  2. Clashing Validities in the Comparative Method? Balancing In-Depth Understanding and Generalizability in Small-N Policy Studies

    NARCIS (Netherlands)

    van der Heijden, J.

    2013-01-01

    The comparative method receives considerable attention in political science. To some a main advantage of the method is that it allows for both in-depth insights (internal validity), and generalizability beyond the cases studied (external validity). However, others consider internal and external

  3. Initial Development and Validation of the BullyHARM: The Bullying, Harassment, and Aggression Receipt Measure

    Science.gov (United States)

    Hall, William J.

    2016-01-01

    This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability…

  4. Principles of Single-Laboratory Validation of Analytical Methods for Testing the Chemical Composition of Pesticides

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Underlying theoretical and practical approaches to pesticide formulation analysis are discussed, i.e. general principles, performance characteristics, applicability of validation data, verification of method performance, and adaptation of validated methods by other laboratories. The principles of single-laboratory validation of analytical methods for testing the chemical composition of pesticides are outlined. The theoretical background for performing pesticide formulation analysis as outlined in ISO, CIPAC/AOAC and IUPAC guidelines is also described, including methodological characteristics such as specificity, selectivity, linearity, accuracy, trueness, precision and bias. Appendices I–III give practical, worked examples of how to use the Horwitz approach and formulae for estimating the target standard deviation for acceptable analytical repeatability. The estimation of trueness and the establishment of typical within-laboratory reproducibility are treated in greater detail by means of worked examples. (author)
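    The Horwitz relationship mentioned above predicts an acceptable among-laboratory relative standard deviation from concentration alone, commonly written RSD_R(%) = 2^(1 − 0.5·log10 C) with C a dimensionless mass fraction. A minimal sketch (the 2/3 repeatability factor is a common rule of thumb, not a value taken from this guideline):

```python
import math

def horwitz_rsd_percent(c):
    """Horwitz-predicted among-laboratory RSD (%), with c the analyte
    concentration as a dimensionless mass fraction (0.10 = 10 % w/w)."""
    return 2 ** (1 - 0.5 * math.log10(c))

def target_sd(c, repeatability_fraction=2/3):
    """Illustrative target standard deviation in the same units as c.
    Taking repeatability as ~2/3 of the Horwitz RSD_R is a common rule
    of thumb, not a value prescribed by this guideline."""
    return repeatability_fraction * horwitz_rsd_percent(c) / 100.0 * c

# Example: a 10 % w/w formulation
rsd = horwitz_rsd_percent(0.10)   # 2^(1 - 0.5*log10(0.1)) = 2^1.5, about 2.83 %
```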

  5. Methodologies for pre-validation of biofilters and wetlands for stormwater treatment.

    Directory of Open Access Journals (Sweden)

    Kefeng Zhang

    Full Text Available Water Sensitive Urban Design (WSUD) systems, e.g. biofilters (bio-retention systems and rain-gardens) and wetlands, are frequently used as part of stormwater harvesting treatment trains. However, validation frameworks for such systems do not exist, limiting their adoption for end-uses such as drinking water. The first stage in the validation framework is pre-validation, which prepares information for further validation monitoring. A pre-validation roadmap, consisting of five steps, is suggested in this paper. Detailed methods for investigating target micropollutants in stormwater, and for determining challenge conditions for biofilters and wetlands, are provided. A literature review was undertaken to identify and quantify micropollutants in stormwater. MUSIC V5.1 was utilized to simulate the behaviour of the systems based on 30-year rainfall data in three distinct climate zones; outputs were evaluated to identify the thresholds of operational variables, including length of dry periods (LDPs) and volume of water treated per event. The paper highlights that a number of micropollutants were found in stormwater at levels above various worldwide drinking water guidelines (eight pesticides, benzene, benzo(a)pyrene, pentachlorophenol, di-(2-ethylhexyl)-phthalate and total polychlorinated biphenyls). The 95th percentile LDP was exponentially related to system design area, while the 5th percentile LDP remained within short durations (i.e. 2-8 hours). The 95th percentile volume of water treated per event was exponentially related to system design area as a percentage of the impervious catchment area. The outcomes of this study show that pre-validation can be completed through a roadmap consisting of a series of steps; this will help in the validation of stormwater treatment systems.
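    The challenge conditions above are percentile thresholds taken over simulated events. A minimal sketch of that extraction step, with hypothetical dry-period data standing in for the 30-year MUSIC V5.1 output:

```python
def percentile(xs, p):
    """Linear-interpolation percentile (p in [0, 100]) of a sample."""
    s = sorted(xs)
    k = (len(s) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

# Hypothetical dry-period lengths (hours) between simulated runoff events,
# standing in for the 30-year MUSIC V5.1 simulation output:
ldp_hours = [2, 3, 4, 4, 5, 6, 8, 12, 24, 48, 72, 120, 240]
challenge_short = percentile(ldp_hours, 5)    # wet-weather challenge (~2.6 h)
challenge_long = percentile(ldp_hours, 95)    # drought challenge (~168 h)
```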

  6. Adaptation and validation of the Melbourne Decision Making Questionnaire to Brazilian Portuguese

    Directory of Open Access Journals (Sweden)

    Charles Cotrena

    2017-12-01

    Full Text Available Abstract Introduction: Decision making (DM is among the most important abilities for everyday functioning. However, the most widely used measures of DM come from behavioral paradigms, whose ecological validity and standalone use has been criticized in the literature. Though these issues could be addressed by the use of DM questionnaires as a complementary assessment method, no such instruments have been validated for use in Brazilian Portuguese. Therefore, the aim of this study was to conduct the translation and validation of the Melbourne Decision Making Questionnaire (MDMQ for use in a Brazilian population. Methods: The adaptation of the MDMQ involved the following steps: translation, back-translation, expert review and pilot study. These steps were followed by factor analysis and internal consistency measurements, which led to the exclusion of 4 items from the scale. The 18-item version of the MDMQ was then administered to a validation sample consisting of healthy adults, as well as patients with bipolar disorder (BD and major depressive disorder (MDD. Results: The instrument displayed good internal consistency, with the hypervigilance subscale showing the lowest, though still acceptable, Cronbach's alpha value. Its factor structure was comparable to that of the original MDMQ according to confirmatory factor analysis. Nevertheless, the MDMQ was sensitive to both depression severity and the presence of MDD and BD, both of which are known to have an impact on DM ability. Conclusion: The version of the MDMQ produced in the present study may be an important addition to neuropsychological assessment batteries with a focus on DM and related abilities
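    Internal consistency of the kind reported above is usually quantified with Cronbach's alpha. A minimal sketch (not the study's analysis; the item scores below are hypothetical):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of item-score lists, one per
    item, all over the same respondents."""
    k = len(items)
    n = len(items[0])
    def var(xs):                    # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var_sum = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Hypothetical scores for a 4-item subscale over five respondents:
subscale = [[1, 2, 3, 4, 2],
            [2, 2, 3, 4, 1],
            [1, 3, 3, 4, 2],
            [2, 2, 4, 4, 2]]
alpha = cronbach_alpha(subscale)    # high for these made-up, correlated scores
```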

  7. Dose Rate Experiment at JET for Benchmarking the Calculation Direct One Step Method

    International Nuclear Information System (INIS)

    Angelone, M.; Petrizzi, L.; Pillon, M.; Villari, R.; Popovichev, S.

    2006-01-01

    Neutrons produced by D-D and D-T plasmas induce activation of tokamak materials and components. The development of reliable methods to assess dose rates is a key issue for maintaining and operating nuclear machines, in normal and off-normal conditions. In the frame of the EFDA Fusion Technology work programme, a computational tool based upon the MCNP Monte Carlo code has been developed to predict the dose rate after shutdown: it is called the Direct One Step Method (D1S). The D1S is an innovative approach in which the decay gammas are coupled to the neutrons as in the prompt case and are transported in one single step in the same run. Benchmarking this new tool with experimental data taken in a complex geometry like that of a tokamak is a fundamental step in testing the reliability of the D1S method. A dedicated benchmark experiment was proposed for the 2005-2006 experimental campaign of JET. Two irradiation positions were selected for the benchmark: one inner position inside the vessel, not far from the plasma, called the 2 upper irradiation end (IE2), where the neutron fluence is relatively high, and a second position just outside a vertical port in an external position (EX), where the neutron flux is lower and the dose rate to be measured is not far above the residual background. Passive detectors are used for in-vessel measurements: high-sensitivity Thermo Luminescent Dosimeters (TLDs), GR-200A (natural LiF), which ensure measurements down to environmental dose levels. An active detector of Geiger-Muller (GM) type is used for out-of-vessel dose rate measurement. Before use, the detectors were calibrated in a secondary gamma-ray standard (Cs-137 and Co-60) facility in terms of air-kerma. The background measurement was carried out in the period July-September 2005 in the outside position EX using the GM tube, and in September 2005 inside the vacuum vessel using TLD detectors located in the 2 upper irradiation end IE2. In the present work

  8. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: compatibility with LC/MS (free of detergents, etc.); high protein integrity (minimal level of protein degradation and non-biological PTMs); compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, primarily for sample preparation method development and optimization, and pre-digested extracts (peptides), primarily for instrument validation and performance monitoring.

  9. A step in the right direction: new flow depth relationships for stepped spillway design

    Science.gov (United States)

    A common deficiency for embankment dams changing from a low hazard to a high hazard dam is inadequate spillway capacity. Roller compacted concrete (RCC) stepped spillways are a popular method to address this issue. Stepped spillway research has gained momentum in recent years due to the need for d...

  10. Low-cost extrapolation method for maximal LTE radio base station exposure estimation: test and validation.

    Science.gov (United States)

    Verloock, Leen; Joseph, Wout; Gati, Azeddine; Varsier, Nadège; Flach, Björn; Wiart, Joe; Martens, Luc

    2013-06-01

    An experimental validation of a low-cost method for extrapolation and estimation of the maximal electromagnetic-field exposure from long-term evolution (LTE) radio base station installations is presented. No knowledge of downlink band occupation or service characteristics is required for the low-cost method. The method is applicable in situ: it only requires a basic spectrum analyser with appropriate field probes, without the need for expensive dedicated LTE decoders. The method is validated both in the laboratory and in situ, for a single-input single-output antenna LTE system and a 2×2 multiple-input multiple-output system, with low deviations in comparison with signals measured using dedicated LTE decoders.

  11. Two step continuous method to synthesize colloidal spheroid gold nanorods.

    Science.gov (United States)

    Chandra, S; Doran, J; McCormack, S J

    2015-12-01

    This research investigated a two-step continuous process to synthesize a colloidal suspension of spheroid gold nanorods. In the first step, the gold precursor was reduced to seed-like particles in the presence of polyvinylpyrrolidone and ascorbic acid. In the continuous second step, silver nitrate and alkaline sodium hydroxide produced Au nanoparticles of various shapes and sizes. The shape was manipulated through the weight ratio of ascorbic acid to silver nitrate by varying the silver nitrate concentration. A weight ratio of 1.35-1.75 grew spheroid gold nanorods of aspect ratio ∼1.85 to ∼2.2, while a lower weight ratio of 0.5-1.1 formed spherical nanoparticles. The alkaline medium increased the yield of gold nanorods and reduced the reaction time at room temperature. The synthesized gold nanorods retained their shape and size in ethanol. The surface plasmon resonance was red-shifted by ∼5 nm due to the higher refractive index of ethanol compared to water. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Validation of three new methods for determination of metal emissions using a modified Environmental Protection Agency Method 301

    Energy Technology Data Exchange (ETDEWEB)

    Catherine A. Yanca; Douglas C. Barth; Krag A. Petterson; Michael P. Nakanishi; John A. Cooper; Bruce E. Johnsen; Richard H. Lambert; Daniel G. Bivins [Cooper Environmental Services, LLC, Portland, OR (United States)

    2006-12-15

    Three new methods applicable to the determination of hazardous metal concentrations in stationary source emissions were developed and evaluated for use in U.S. Environmental Protection Agency (EPA) compliance applications. Two of the three independent methods, a continuous emissions monitor-based method (Xact) and an X-ray-based filter method (XFM), are used to measure metal emissions. The third method involves a quantitative aerosol generator (QAG), which produces a reference aerosol used to evaluate the measurement methods. A modification of EPA Method 301 was used to validate the three methods for As, Cd, Cr, Pb, and Hg, representing three hazardous waste combustor Maximum Achievable Control Technology (MACT) metal categories (low volatile, semivolatile, and volatile). The measurement methods were evaluated at a hazardous waste combustor (HWC) by comparing measured with reference aerosol concentrations. The QAG, Xact, and XFM met the modified Method 301 validation criteria. All three of the methods demonstrated precisions and accuracies on the order of 5%. The measurement methods should be applicable to emissions from a wide range of sources, and the reference aerosol generator should be applicable to additional analytes. EPA recently approved an alternative monitoring petition for an HWC at Eli Lilly's Tippecanoe site in Lafayette, IN, in which the Xact is used for demonstrating compliance with the HWC MACT metal emissions (low volatile, semivolatile, and volatile). The QAG reference aerosol generator was approved as a method for providing a quantitative reference aerosol, which is required for certification and continuing quality assurance of the Xact. 30 refs., 5 figs., 11 tabs.
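    Method 301-style evaluations compare measured concentrations against a known reference aerosol; the ~5% figures above summarize bias and precision of that comparison. A minimal sketch of the two summary statistics (the replicate data are hypothetical, and the actual Method 301 acceptance tests involve additional statistics):

```python
def percent_bias(measured, reference):
    """Mean relative difference (%) between paired measured and
    reference-aerosol concentrations."""
    diffs = [(m - r) / r * 100.0 for m, r in zip(measured, reference)]
    return sum(diffs) / len(diffs)

def percent_rsd(xs):
    """Relative standard deviation (%) of replicate measurements."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
    return sd / m * 100.0

# Hypothetical paired data for one metal (e.g. Pb), ug/m^3:
reference = [10.0, 20.0, 40.0]      # generated by the reference aerosol (QAG)
measured = [10.4, 19.5, 41.2]       # hypothetical measurement-method readings
bias = percent_bias(measured, reference)    # 1.5 % for these numbers
```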

  13. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  14. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    Science.gov (United States)

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) against the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuffs, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity, exclusivity, stability, and ruggedness studies were also conducted. Assurance GDS showed 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To extend the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method to selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  15. The a3 problem solving report: a 10-step scientific method to execute performance improvements in an academic research vivarium.

    Directory of Open Access Journals (Sweden)

    James A Bassuk

    Full Text Available The purpose of this study was to illustrate the application of A3 Problem Solving Reports of the Toyota Production System to our research vivarium through the methodology of Continuous Performance Improvement, a lean approach to healthcare management at Seattle Children's (Hospital, Research Institute, Foundation). The Report format is described within the perspective of a 10-step scientific method designed to realize measurable improvements of Issues identified by the Report's Author, Sponsor and Coach. The 10-step method (Issue, Background, Current Condition, Goal, Root Cause, Target Condition, Countermeasures, Implementation Plan, Test, and Follow-up) was shown to align with Shewhart's Plan-Do-Check-Act process improvement cycle in a manner that allowed for quantitative analysis of the Countermeasures' outcomes and of Testing results. During fiscal year 2012, 9 A3 Problem Solving Reports were completed in the vivarium under the teaching and coaching system implemented by the Research Institute. Two of the 9 reports are described herein. Report #1 addressed the issue of the vivarium's veterinarian not being able to provide input into sick-animal cases during the work day, while report #7 tackled the lack of a standard for keeping track of weekend/holiday animal health inspections. In each Report, a measurable Goal that established the basis for improvement recognition was present. A Five Whys analysis identified the Root Cause for Report #1 as historical work patterns that existed before the veterinarian was hired and the fact that modern electronic communication tools had not been implemented. The same analysis identified the Root Cause for Report #7 as the vivarium never having standardized the process for weekend/holiday checks. Successful outcomes for both Reports were obtained and validated by robust audit plans. The collective data indicate that vivarium staff acquired a disciplined way of reporting on, as well as solving, problems in a manner

  16. Reliability and Concurrent Validity of the Narrow Path Walking Test in Persons With Multiple Sclerosis.

    Science.gov (United States)

    Rosenblum, Uri; Melzer, Itshak

    2017-01-01

    About 90% of people with multiple sclerosis (PwMS) have gait instability and 50% fall. Reliable and clinically feasible methods of gait instability assessment are needed. The study investigated the reliability and validity of the Narrow Path Walking Test (NPWT) under single-task (ST) and dual-task (DT) conditions for PwMS. Thirty PwMS performed the NPWT on 2 different occasions, a week apart. Number of Steps, Trial Time, Trial Velocity, Step Length, Number of Step Errors, Number of Cognitive Task Errors, and Number of Balance Losses were measured. Intraclass correlation coefficients (ICC(2,1)) were calculated from the average values of NPWT parameters. Absolute reliability was quantified from the standard error of measurement (SEM) and the smallest real difference (SRD). Concurrent validity of the NPWT with the Functional Reach Test, Four Square Step Test (FSST), 12-item Multiple Sclerosis Walking Scale (MSWS-12), and 2 Minute Walking Test (2MWT) was determined using partial correlations. ICCs for most NPWT parameters during ST and DT ranged from 0.46-0.94 and 0.55-0.95, respectively. The highest relative reliability was found for Number of Step Errors (ICC = 0.94 and 0.93 for ST and DT, respectively) and Trial Velocity (ICC = 0.83 and 0.86 for ST and DT, respectively). Absolute reliability was high for Number of Step Errors in ST (SEM% = 19.53%) and DT (SEM% = 18.14%) and low for Trial Velocity in ST (SEM% = 6.88%) and DT (SEM% = 7.29%). Significant correlations for Number of Step Errors and Trial Velocity were found with the FSST, MSWS-12, and 2MWT. In PwMS performing the NPWT, Number of Step Errors and Trial Velocity were highly reliable parameters. Based on correlations with other measures of gait instability, Number of Step Errors was the most valid parameter of dynamic balance under the conditions of our test. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, available at: http
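    The reliability statistics above (ICC(2,1), SEM, SRD) can be sketched directly from a subjects-by-sessions score table. A minimal illustration with hypothetical test-retest data, not the study's dataset:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure. data: one row per subject, one column per session."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)    # between subjects
    ssc = n * sum((m - grand) ** 2 for m in col_means)    # between sessions
    sst = sum((x - grand) ** 2 for row in data for x in row)
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = (sst - ssr - ssc) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def sem_srd(data, icc):
    """SEM = SD * sqrt(1 - ICC); SRD = 1.96 * sqrt(2) * SEM."""
    xs = [x for row in data for x in row]
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
    sem = sd * (1 - icc) ** 0.5
    return sem, 1.96 * 2 ** 0.5 * sem

# Hypothetical test-retest scores (e.g. Number of Step Errors, two occasions):
scores = [[4, 5], [10, 9], [7, 7], [3, 4], [12, 11], [6, 6]]
icc = icc_2_1(scores)
sem, srd = sem_srd(scores, icc)
```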

  17. A Component-Based Modeling and Validation Method for PLC Systems

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-05-01

    Full Text Available Programmable logic controllers (PLCs) are complex embedded systems that are widely used in industry. This paper presents a component-based modeling and validation method for PLC systems using the behavior-interaction-priority (BIP) framework. We designed a general system architecture and a component library for a type of device control system. The control software and the hardware of the environment were all modeled as BIP components. System requirements were formalized as monitors. Simulation was carried out to validate the system model. A realistic industrial example, a gates control system, was employed to illustrate our strategies. We found a couple of design errors during the simulation, which helped us to improve the dependability of the original system. The experimental results demonstrate the effectiveness of our approach.
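    The monitor idea above, a formalized requirement checked against simulated component behavior, can be sketched in a few lines. This is a toy illustration of the concept only; it does not use the BIP framework, and the gate components and the requirement are assumed:

```python
# Toy sketch of the component-and-monitor idea: two gate-controller
# components plus a safety monitor checking a formalized requirement
# ("both gates are never open at the same time") over a simulated trace.

class Gate:
    def __init__(self, name):
        self.name, self.state = name, "closed"
    def handle(self, cmd):
        self.state = "open" if cmd == "open" else "closed"

def monitor(gates):
    """Requirement as a monitor: at most one gate open at any instant."""
    return sum(g.state == "open" for g in gates) <= 1

upper, lower = Gate("upper"), Gate("lower")
violations = 0
for name, cmd in [("upper", "open"), ("upper", "close"),
                  ("lower", "open"), ("lower", "close")]:
    (upper if name == "upper" else lower).handle(cmd)
    if not monitor([upper, lower]):
        violations += 1            # a correct interleaving: stays 0

# A faulty trace that the monitor catches:
upper.handle("open")
lower.handle("open")
caught = not monitor([upper, lower])
```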

  18. Validação de métodos cromatográficos de análise: um experimento de fácil aplicação utilizando cromatografia líquida de alta eficiência (CLAE e os princípios da "Química Verde" na determinação de metilxantinas em bebidas Validation of chromatographic methods: an experiment using HPLC and Green Chemistry in methylxanthines determination

    Directory of Open Access Journals (Sweden)

    Nádia Machado de Aragão

    2009-01-01

    Full Text Available The validation of analytical methods is an important step in quality control. The main objective of this study is to propose an HPLC experiment to verify the parameters of validation of chromatographic methods, based on green chemistry principles, which can be used in experimental courses of chemistry and related areas.

  19. In situ synthesis carbonated hydroxyapatite layers on enamel slices with acidic amino acids by a novel two-step method.

    Science.gov (United States)

    Wu, Xiaoguang; Zhao, Xu; Li, Yi; Yang, Tao; Yan, Xiujuan; Wang, Ke

    2015-09-01

    In situ fabrication of a carbonated hydroxyapatite (CHA) remineralization layer on an enamel slice was completed via a novel, biomimetic two-step method. First, a CaCO3 layer was synthesized on the surface of demineralized enamel using an acidic amino acid (aspartic acid or glutamic acid) as a soft template. Second, at the same concentration of the acidic amino acid, rod-like carbonated hydroxyapatite was produced with the CaCO3 layer as a sacrificial template and a reactant. The morphology, crystallinity and other physicochemical properties of the crystals were characterized using field emission scanning electron microscopy (FESEM), Fourier transform infrared spectrometry (FTIR), X-ray diffraction (XRD) and energy-dispersive X-ray analysis (EDAX). Acidic amino acid could promote the uniform deposition of hydroxyapatite with rod-like crystals via absorption of phosphate and carbonate ions from the reaction solution. Moreover, compared with hydroxyapatite crystals coated on the enamel when synthesized by a one-step method, the CaCO3 coating synthesized in the first step acted as an active bridge layer and sacrificial template, playing a vital role in orienting the artificial coating layer through the template effect. The results show that the rod-like carbonated hydroxyapatite crystals grow into bundles, which are similar in size and appearance to prisms in human enamel, when using the two-step method with either aspartic acid or glutamic acid (20.00 mmol/L). Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Rapid detection of Salmonella in meat: Comparative and collaborative validation of a non-complex and cost effective pre-PCR protocol

    DEFF Research Database (Denmark)

    Löfström, Charlotta; Hansen, F.; Mansdal, S.

    2011-01-01

    samples using a real-time PCR method. The protocol included incubation in buffered peptone water, centrifugation of an aliquot and a boiling procedure. The validation study included comparative and collaborative trials recommended by the Nordic Organization for Validation of Alternative Methods (NordVal......). The comparative trial was performed against a culture based reference method (NMKL187, 2007) and a previously NordVal approved PCR method with a semi-automated magnetic bead-based DNA extraction step using 122 artificially contaminated samples. The limit of detection (LOD50) was found to be 3.0, 3.2 and 3.4 CFU...

  1. The truncated Wigner method for Bose-condensed gases: limits of validity and applications

    International Nuclear Information System (INIS)

    Sinatra, Alice; Lobo, Carlos; Castin, Yvan

    2002-01-01

    We study the truncated Wigner method applied to a weakly interacting spinless Bose-condensed gas which is perturbed away from thermal equilibrium by a time-dependent external potential. The principle of the method is to generate an ensemble of classical fields ψ(r) which samples the Wigner quasi-distribution function of the initial thermal equilibrium density operator of the gas, and then to evolve each classical field with the Gross-Pitaevskii equation. In the first part of the paper we improve the sampling technique over our previous work (Sinatra et al 2000 J. Mod. Opt. 47 2629-44) and we test its accuracy against the exactly solvable model of the ideal Bose gas. In the second part of the paper we investigate the conditions of validity of the truncated Wigner method. For short evolution times it is known that the time-dependent Bogoliubov approximation is valid for almost pure condensates. The requirement that the truncated Wigner method reproduces the Bogoliubov prediction leads to the constraint that the number of field modes in the Wigner simulation must be smaller than the number of particles in the gas. For longer evolution times the nonlinear dynamics of the noncondensed modes of the field plays an important role. To demonstrate this we analyse the case of a three-dimensional spatially homogeneous Bose-condensed gas and we test the ability of the truncated Wigner method to correctly reproduce the Beliaev-Landau damping of an excitation of the condensate. We have identified the mechanism which limits the validity of the truncated Wigner method: the initial ensemble of classical fields, driven by the time-dependent Gross-Pitaevskii equation, thermalizes to a classical field distribution at a temperature T_class which is larger than the initial temperature T of the quantum gas. When T_class significantly exceeds T, a spurious damping is observed in the Wigner simulation. This leads to the second validity condition for the truncated Wigner method, T_class - T

  2. Comparison of 10 single and stepped methods to identify frail older persons in primary care: diagnostic and prognostic accuracy.

    Science.gov (United States)

    Sutorius, Fleur L; Hoogendijk, Emiel O; Prins, Bernard A H; van Hout, Hein P J

    2016-08-03

    Many instruments have been developed to identify frail older adults in primary care. A direct comparison of the accuracy and prevalence of identification methods is rare, and most studies ignore the stepped selection typically employed in routine care practice. It is also unclear whether the various methods select persons with different characteristics. We aimed to estimate the accuracy of 10 single and stepped methods to identify frailty in older adults and to predict adverse health outcomes. In addition, the methods were compared on the prevalence and the characteristics of the persons they identified as frail. The Groningen Frailty Indicator (GFI), the PRISMA-7, polypharmacy, the clinical judgment of the general practitioner (GP), the self-rated health of the older adult, the Edmonton Frail Scale (EFS), the Identification Seniors At Risk Primary Care (ISAR-PC), the Frailty Index (FI), the InterRAI screener and gait speed were compared against three measures: two reference standards (the clinical judgment of a multidisciplinary expert panel and Fried's frailty criteria) and 6-year mortality or long-term care admission. Data were used from the Dutch Identification of Frail Elderly Study, consisting of 102 people aged 65 and over from a primary care practice in Amsterdam. Frail older adults were oversampled. The accuracy of each instrument and of several stepped strategies was estimated by calculating the area under the ROC curve. Prevalence rates of frailty ranged from 14.8 to 52.9%. The accuracy for recommended cut-off values ranged from poor (AUC = 0.556, ISAR-PC) to good (AUC = 0.865, gait speed). The PRISMA-7 performed best against the two reference standards; the GP's clinical judgment best predicted adverse outcomes. Stepped strategies resulted in lower prevalence rates and accuracy. Persons selected by the different instruments varied greatly in age, IADL dependency, receipt of home care, and mood. We found huge differences between methods to identify frail persons in prevalence
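    The AUC values quoted above measure how well an instrument's score separates frail from non-frail persons against a reference standard. A minimal sketch via the rank (Mann-Whitney) formulation, with hypothetical scores and labels:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a frail person (label 1) outranks a non-frail one
    (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical instrument scores against a reference-standard frailty label:
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]
example_auc = auc(scores, labels)     # 11/12, about 0.92
```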

  3. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

    Full Text Available Introduction. This article is the result of the authors' research in the field of development of approaches to the validation of quantitative determination methods for purposes of forensic and toxicological analysis, and is devoted to the formation of acceptability criteria for the validation parameter «linearity/calibration model». The aim of the research. The purpose of this paper is to analyse the present approaches to acceptability estimation of the calibration model chosen for method description according to the requirements of the international guidances, and to form our own approaches to acceptability estimation of the linear dependence when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. A UV-spectrophotometric method for doxylamine quantitative determination in blood. Results. The approaches to acceptability estimation of calibration models stated in international papers on validation of bioanalytical methods, namely «Guidance for Industry: Bioanalytical Method Validation» (U.S. FDA, 2001), «Standard Practices for Method Validation in Forensic Toxicology» (SWGTOX, 2012), «Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens» (UNODC, 2009) and «Guideline on validation of bioanalytical methods» (EMA, 2011), have been analysed. It has been suggested to be guided by domestic developments in the field of validation of analysis methods for medicines and, particularly, by the approaches to method validation in the variant of the calibration curve method, when forming the acceptability criteria for the obtained linear dependences in the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. The choice of the method of calibration curve is
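    Acceptability of a linear calibration model is typically judged from the least-squares fit, the coefficient of determination, and back-calculated recoveries at each level. A minimal sketch with hypothetical calibration data; the acceptance limits shown (R² ≥ 0.99, recovery 85-115%) are illustrative and vary between the cited guidelines:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b                     # intercept, slope

def r_squared(x, y, a, b):
    my = sum(y) / len(y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical doxylamine calibration (concentration, ug/mL vs absorbance):
conc = [2.0, 4.0, 6.0, 8.0, 10.0]
absb = [0.101, 0.205, 0.298, 0.404, 0.498]
a, b = fit_line(conc, absb)
r2 = r_squared(conc, absb, a, b)
back_calc = [(yi - a) / b for yi in absb]     # back-calculated concentrations
recoveries = [100 * c_hat / c for c_hat, c in zip(back_calc, conc)]
# Illustrative acceptance rule (limits differ between guidelines):
acceptable = r2 >= 0.99 and all(85 <= r <= 115 for r in recoveries)
```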

  4. ONE-STEP AND TWO-STEP CALIBRATION OF A PORTABLE PANORAMIC IMAGE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    P.-C. Wang

    2012-07-01

    Full Text Available A Portable Panoramic Image Mapping System (PPIMS is proposed for rapid acquisition of three-dimensional spatial information. Considering convenience of use, cost, equipment weight, precision, and power supply, the designed PPIMS is equipped with six circularly arranged cameras to capture panoramic images and a GPS receiver for positioning. The motivation for this design is to develop a hand-held Mobile Mapping System (MMS for areas that are difficult to access with vehicle-based MMS, such as rugged terrain, forest areas, heavily damaged disaster areas, and crowded places. The PPIMS is in fact a GPS-assisted close-range photogrammetric system. Compared with traditional close-range photogrammetry, PPIMS can significantly reduce the need for ground control points. Provided the relative geometric relationships of the equipped sensors are known, the exterior orientation elements of each captured image can be solved. However, an accurate system calibration must first determine the relative geometric relationships among the multiple cameras and the GPS antenna centre before the PPIMS can be applied to geo-referenced mapping. In this paper, both one-step and two-step calibration procedures for the PPIMS are performed to determine the lever-arm offsets and boresight angles among the cameras and the GPS. The performance of the one-step and two-step calibrations is evaluated through analysis of the experimental results, and the two procedures are compared. The two-step calibration method outperforms the one-step method in both calibration accuracy and operational convenience. We expect that the proposed two-step calibration procedure can also be applied to other platform-based MMSs.
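The lever-arm part of such a system calibration can be illustrated with a minimal sketch: once the offset from the GPS antenna centre to a camera is known in the platform frame, the camera position follows from the antenna position and the platform attitude. The heading angle and offsets below are made-up assumptions, not PPIMS calibration results, and a full treatment would also apply per-camera boresight angles.

```python
# Sketch: applying a calibrated lever-arm offset to obtain a camera position
# from a GPS antenna position and platform attitude. Numbers are illustrative.
import math

def rot_z(yaw_deg):
    """Rotation matrix for a heading (yaw) about the vertical axis."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

antenna = [100.0, 200.0, 50.0]   # GPS antenna centre in the mapping frame (m)
lever_arm = [0.20, 0.00, -0.30]  # camera offset in the platform frame (m)
R = rot_z(90.0)                  # platform heading of 90 degrees

camera = [a + d for a, d in zip(antenna, apply(R, lever_arm))]
print([round(c, 3) for c in camera])  # → [100.0, 200.2, 49.7]
```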

  5. Application of EU guidelines for the validation of screening methods for veterinary drugs

    NARCIS (Netherlands)

    Stolker, A.A.M.

    2012-01-01

    Commission Decision (CD) 2002/657/EC describes detailed rules for method validation within the framework of residue monitoring programmes. The approach described in this CD is criteria-based. For (qualitative) screening methods, the most important criterion is that the CCβ has to be below any

  6. 2-Step IMAT and 2-Step IMRT in three dimensions

    International Nuclear Information System (INIS)

    Bratengeier, Klaus

    2005-01-01

    In two dimensions, 2-Step Intensity Modulated Arc Therapy (2-Step IMAT) and 2-Step Intensity Modulated Radiation Therapy (IMRT) were shown to be powerful methods for the optimization of plans with organs at risk (OAR) (partially) surrounded by a target volume (PTV). In three dimensions, some additional boundary conditions have to be considered to establish 2-Step IMAT as an optimization method. A further aim was to create rules for ad hoc adaptation of an IMRT plan to a daily changing PTV-OAR constellation. As a test model, a cylindrically symmetric PTV-OAR combination was used. The centrally placed OAR can adopt arbitrary diameters with different gap widths toward the PTV. Along the rotation axis the OAR diameter can vary; the OAR can even vanish at some axis positions, leaving a circular PTV. The width and weight of the second segment were the free parameters to optimize. The objective function f to minimize was the root of the integral of the squared difference between the dose in the target volume and a reference dose. For this problem, two local minima exist; therefore, as a secondary criterion, the magnitudes of hot and cold spots were taken into account. As a result, the solution with the larger segment width was recommended. From plane to plane, for varying radii of PTV and OAR and for different gaps between them, different sets of weights and widths were optimal. Because only one weight per segment shall be used for all planes (respectively leaf pairs), a strategy for complex three-dimensional (3-D) cases was established to choose a global weight. In a second step, a suitable segment width was chosen, minimizing f for this global weight. The concept was demonstrated in a planning study for a cylindrically symmetric example with a large range of OAR radii along the patient axis. The method is discussed for some classes of tumor/organ-at-risk combinations, and non-cylindrically symmetric cases were treated exemplarily. The product of width and weight of
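The two-parameter optimization described above (width and weight of the second segment, minimizing a root-mean-square dose deviation f) can be sketched in one dimension. The linear-penumbra dose model below is a made-up stand-in for the paper's cylindrically symmetric geometry; only the structure of the search is illustrative.

```python
# Toy 1-D version of the 2-Step optimization: grid-search the width and weight
# of a second segment to minimize the RMS deviation from a reference dose of
# 1.0 in the target. The dose model is an assumed linear penumbra, not the
# paper's geometry.

def dose(x, width, weight):
    base = min(1.0, 0.5 + 2.5 * x)         # first, wide segment with edge falloff
    boost = weight if x <= width else 0.0  # second, narrow segment
    return base + boost

def objective(width, weight, n=200):
    """Root of the (discretized) integral of the squared dose deviation."""
    xs = [(i + 0.5) / n for i in range(n)]
    return (sum((dose(x, width, weight) - 1.0) ** 2 for x in xs) / n) ** 0.5

# coarse grid search over the two free parameters
candidates = [(0.05 * k, 0.05 * j) for k in range(1, 9) for j in range(0, 11)]
best = min(candidates, key=lambda p: objective(*p))
print(best, round(objective(*best), 4))
```

Even in this toy, a nonzero second-segment weight beats the single-segment plan, mirroring the paper's point that the two extra degrees of freedom compensate the underdosed region near the OAR.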

  7. A novel single-step, multipoint calibration method for instrumented Lab-on-Chip systems

    DEFF Research Database (Denmark)

    Pfreundt, Andrea; Patou, François; Zulfiqar, Azeem

    2014-01-01

    for instrument-based PoC blood biomarker analysis systems. Motivated by the complexity of associating high-accuracy biosensing using silicon nanowire field effect transistors with ease of use for the PoC system user, we propose a novel one-step, multipoint calibration method for LoC-based systems. Our approach...... specifically addresses the important interfaces between a novel microfluidic unit to integrate the sensor array and a mobile-device hardware accessory. A multi-point calibration curve is obtained by generating a defined set of reference concentrations from a single input. By consecutively splitting the flow...

  8. Validity of the CT to attenuation coefficient map conversion methods

    International Nuclear Information System (INIS)

    Faghihi, R.; Ahangari Shahdehi, R.; Fazilat Moadeli, M.

    2004-01-01

    The most important commercialized methods of attenuation correction in SPECT are based on an attenuation coefficient map obtained from a transmission imaging method. The transmission imaging system can be a linear radionuclide source or an X-ray CT system. The image from the transmission imaging system is not useful unless the CT number is replaced with the attenuation coefficient at the SPECT energy. In this paper we attempt to evaluate the validity and estimate the error of the most widely used methods for this transformation. The final results show that the methods which use a linear or multi-linear curve accept an error in their estimation. The mA value is not important, but the patient thickness is very important: it can introduce an error of more than 10 percent in the final result
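The linear/multi-linear transformations the paper evaluates are commonly implemented as a bilinear curve: one slope from air to water, and a shallower slope above water where bone dominates. A minimal sketch, with typical 140 keV (Tc-99m) coefficients assumed rather than taken from the paper:

```python
# Sketch: bilinear CT-number-to-attenuation-coefficient conversion for SPECT.
# The 140 keV coefficients are typical assumed values, not the paper's.

MU_WATER = 0.154       # linear attenuation of water at 140 keV, cm^-1 (assumed)
BONE_SLOPE = 0.000105  # extra attenuation per HU above water, cm^-1 (assumed)

def hu_to_mu(hu):
    """Convert a CT number (HU) to a linear attenuation coefficient at 140 keV."""
    if hu <= 0:
        # straight line from air (-1000 HU, mu = 0) to water (0 HU, MU_WATER)
        return MU_WATER * (1.0 + hu / 1000.0)
    # above water, bone raises mu more slowly per HU
    return MU_WATER + BONE_SLOPE * hu

for hu in (-1000, 0, 1000):
    print(hu, round(hu_to_mu(hu), 3))  # → -1000 0.0 / 0 0.154 / 1000 0.259
```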

  9. A validated multianalyte LC-MS/MS method for quantification of 25 mycotoxins in cassava flour, peanut cake and maize samples.

    Science.gov (United States)

    Ediage, Emmanuel Njumbe; Di Mavungu, José Diana; Monbaliu, Sofie; Van Peteghem, Carlos; De Saeger, Sarah

    2011-05-25

    This study was designed to develop a sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) method for the simultaneous detection and quantification of 25 mycotoxins in cassava flour, peanut cake and maize samples with particular focus on the optimization of the sample preparation protocol and method validation. All 25 mycotoxins were extracted in a single step with a mixture of methanol/ethyl acetate/water (70:20:10, v/v/v). The method limits of quantification (LOQ) varied from 0.3 μg/kg to 106 μg/kg. Good precision and linearity were observed for most of the mycotoxins. The method was applied for the analysis of naturally contaminated peanut cake, cassava flour and maize samples from the Republic of Benin. All samples analyzed (fifteen peanut cakes, four maize flour and four cassava flour samples) tested positive for one or more mycotoxins. Aflatoxins (total aflatoxins; 10-346 μg/kg) and ochratoxin A (cake samples while fumonisin B(1) (4-21 μg/kg), aflatoxin B(2) (flour samples. Fumonisin B(1) (13-836 μg/kg), fumonisin B(2) (5-221 μg/kg), fumonisin B(3) (

  10. Validation and comparison of dispersion models of RTARC DSS

    International Nuclear Information System (INIS)

    Duran, J.; Pospisil, M.

    2004-01-01

    RTARC DSS (Real Time Accident Release Consequences - Decision Support System) is a computer code developed at VUJE Trnava, Inc. (Stubna, M. et al., 1993). The code calculations include atmospheric transport and diffusion, dose assessment, evaluation and display of the affected zones, evaluation of the early health effects, and the time dependence of concentration and dose rate at selected sites. Simulation of the protective measures (sheltering, iodine administration) is included. The aim of this paper is to present the process of validation of the RTARC dispersion models. RTARC includes models for release calculations at very short (Monte Carlo method - MEMOC), short (Gaussian straight-line model) and long distances (Puff Trajectory Model - PTM). Validation of the code RTARC was performed using the results of the comparisons and experiments summarized in Table 1 (experiment or comparison - distance - model): wind tunnel experiments (Universitaet der Bundeswehr, Muenchen) - area of the NPP - Monte Carlo method; INEL (Idaho National Engineering Laboratory) multi-tracer atmospheric experiment - short/medium distances - Gaussian model and PTM; Model Validation Kit - short distances - Gaussian model; STEP II.b 'Realistic Case Studies' - long distances - PTM; ENSEMBLE comparison - long distances - PTM. (orig.)

  11. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    Science.gov (United States)

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are key to obtaining accurate information for planning CVD prevention programmes, a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library (PubMed, CINAHL, PsycINFO and ProQuest) were searched for studies conducted between 2008 and 2012, in English and among humans. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. Methods identified for the development of lifestyle CVD risk factor questionnaires were: review of the literature, either systematic or traditional; involvement of experts and/or the target population through focus group discussions/interviews; the authors' clinical experience; and the authors' deductive reasoning. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. Combining methods produced questionnaires with good content validity and other good psychometric properties.

  12. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical validation studies were performed in this paper, with a view to their use in stability studies of future formulations of naproxen suppositories for children and adults. The factors most influencing naproxen stability were determined: the major degradation occurred in acid medium, in oxidative medium and under the action of light. A high-performance liquid chromatography method was evaluated, which proved adequate to quantify naproxen in suppositories and was selective against degradation products. The quantification limit was 3,480 μg, so the method was valid for these studies. Additionally, specificity for stability and the detection and quantification limits were evaluated for the direct semi-aqueous acid-base method, which had previously been validated for quality control and showed satisfactory results. Nevertheless, volumetric methods are not regarded as stability-indicating; therefore, this method will be used along with the chromatographic methods of choice, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products

  13. Validation of GOES-Derived Surface Radiation Using NOAA's Physical Retrieval Method

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Wilcox, S.

    2013-01-01

    This report was part of a multiyear collaboration with the University of Wisconsin and the National Oceanic and Atmospheric Administration (NOAA) to produce high-quality, satellite-based solar resource datasets for the United States. High-quality solar resource assessment accelerates technology deployment by improving decision making and reducing uncertainty in investment decisions. Satellite-based solar resource datasets are used as a primary source in solar resource assessment, mainly because satellites provide larger areal coverage and longer periods of record than ground-based measurements. With the advent of newer satellites with increased information content, and of faster computers that can process increasingly large data volumes, methods once considered too computationally intensive are now feasible. One class of sophisticated methods for retrieving solar resource information from satellites is a two-step, physics-based method that computes cloud properties and uses that information in a radiative transfer model to compute solar radiation. This method has the advantage of accommodating additional information as satellites with newer channels come on board. This report evaluates the two-step method developed at NOAA and adapted for solar resource assessment for renewable energy, with the goal of identifying areas that can be improved in the future.

  14. Improvement of Simulation Method in Validation of Software of the Coordinate Measuring Systems

    Science.gov (United States)

    Nieciąg, Halina

    2015-10-01

    Software is used to accomplish various tasks at each stage of the functioning of modern measuring systems. Before metrological confirmation of measuring equipment, the system has to be validated. This paper discusses a method for conducting validation studies of a fragment of software that calculates the values of measurands. Owing to the number and nature of the variables affecting coordinate measurement results and the complex, multi-dimensional character of measurands, the study used the Monte Carlo method of numerical simulation. The article presents an attempt at improving the results obtained by classic Monte Carlo tools. The LHS (Latin Hypercube Sampling) algorithm was implemented as an alternative to the simple random sampling scheme of the classic algorithm.
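The contrast between the classic simple sampling scheme and LHS can be sketched briefly: LHS divides each input's range into n equal-probability strata and draws exactly one sample per stratum, which typically reduces the variance of the simulation for the same n.

```python
# Sketch: simple random sampling vs. Latin Hypercube Sampling (LHS) on [0, 1).
import random

def simple_sampling(n, rng):
    """Classic scheme: n independent uniform draws."""
    return [rng.random() for _ in range(n)]

def lhs_sampling(n, rng):
    """LHS: one draw inside each of the n equal-width strata, then shuffled."""
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)  # random pairing order across dimensions
    return samples

rng = random.Random(1)
n = 10
lhs = lhs_sampling(n, rng)
strata = sorted(int(x * n) for x in lhs)
print(strata)  # every stratum holds exactly one sample → [0, 1, ..., 9]
```

For multi-dimensional inputs, one such stratified column is generated per variable and the columns are shuffled independently, which is the usual LHS construction.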

  15. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  16. Comparison of validation methods for forming simulations

    Science.gov (United States)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre reinforced thermoplastics could reduce the development time and improve the forming results. But to take advantage of the full potential of the simulations it has to be ensured that the predictions for material behaviour are correct. For that reason, a thorough validation of the material model has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are for example the outer contour, the occurrence of defects and the fibre paths. To measure these features various methods are available. Most relevant and also most difficult to measure are the emerging fibre orientations. For that reason, the focus of this study was on measuring this feature. The aim was to give an overview of the properties of different measuring systems and select the most promising systems for a comparison survey. Selected were an optical, an eddy current and a computer-assisted tomography system with the focus on measuring the fibre orientations. Different formed 3D parts made of unidirectional glass fibre and carbon fibre reinforced thermoplastics were measured. Advantages and disadvantages of the tested systems were revealed. Optical measurement systems are easy to use, but are limited to the surface plies. With an eddy current system also lower plies can be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system all plies can be measured, but the system is limited to small parts and challenging to evaluate.

  17. Unexpected but Most Welcome: Mixed Methods for the Validation and Revision of the Participatory Evaluation Measurement Instrument

    Science.gov (United States)

    Daigneault, Pierre-Marc; Jacob, Steve

    2014-01-01

    Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…

  18. 48 CFR 14.503-1 - Step one.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 false Step one. 14.503-1 Section... AND CONTRACT TYPES SEALED BIDDING Two-Step Sealed Bidding 14.503-1 Step one. (a) Requests for... use the two-step method. (3) The requirements of the technical proposal. (4) The evaluation criteria...

  19. Validity of physical activity and cardiorespiratory fitness in the Danish cohort "Diet, Cancer and Health-Next Generations".

    Science.gov (United States)

    Lerche, L; Olsen, A; Petersen, K E N; Rostgaard-Hansen, A L; Dragsted, L O; Nordsborg, N B; Tjønneland, A; Halkjaer, J

    2017-12-01

    Valid assessments of physical activity (PA) and cardiorespiratory fitness (CRF) are essential in epidemiological studies to define dose-response relationships and formulate thorough recommendations on the pattern of PA appropriate for maintaining good health. The aim of this study was to validate the Danish step test, the physical activity questionnaire Active-Q, and self-rated fitness against directly measured maximal oxygen uptake (VO2max). A population-based subsample (n=125) was included from the "Diet, Cancer and Health-Next Generations" (DCH-NG) cohort, which is under establishment. Validity coefficients, which express the correlation between measured and "true" exposure, were calculated, and misclassification across categories was evaluated. The validity of the Danish step test was moderate (women: r=.66, men: r=.56); however, men were systematically underestimated (43% misclassification). When validating the questionnaire-derived measures of PA, leisure-time physical activity was not correlated with VO2max. Positive correlations were found for sports overall, but these were only significant for men: total hours per week of sports (r=.26), MET-hours per week of sports (r=.28) and vigorous sports (r=.28) were positively correlated with VO2max. Finally, the percentage of misclassification was low for self-rated fitness (women: 9%, men: 13%). Thus, self-rated fitness was found to be superior to the Danish step test, as well as less costly and more practical than the VO2max method. Finally, even though the correlations were low, they support the potential of questionnaire outcomes, particularly sports, vigorous sports, and self-rated fitness, for estimating CRF. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Validation of histamine determination Method in yoghurt using High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    M Jahedinia

    2014-02-01

    Full Text Available Biogenic amines are organic, basic nitrogenous compounds of low molecular weight that are mainly generated by the enzymatic decarboxylation of amino acids by microorganisms. Dairy products are among the foods with the highest amine content. A wide variety of methods and procedures for the determination of histamine and biogenic amines have been established; among them, HPLC is considered the reference method. The aim of this study was to validate a reversed-phase HPLC method for the determination of histamine in yoghurt. The mobile phase consisted of acetonitrile/water (18:88, v/v and the flow rate was set at 0.5 ml/min using isocratic HPLC. Detection was carried out at 254 nm using a UV detector. The calibration curve constructed from the peak areas of the standards was linear, with a correlation coefficient (r2) of 0.998. Good recoveries were observed for histamine at all spiking levels, with an average recovery of 84%. The RSD value from the repeatability test was 4.4%. The limits of detection and quantitation were 0.14 and 0.42 µg/ml, respectively. The results of the validation tests showed that the method is reliable and rapid for the quantification of histamine in yoghurt.
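The validation figures reported above (recovery, repeatability RSD, LOD/LOQ) follow standard formulas; a sketch using the common 3.3σ/S and 10σ/S rules, with made-up replicate data rather than the paper's raw measurements:

```python
# Sketch: the standard validation figures of merit for a chromatographic
# method. Input numbers are illustrative, not the paper's raw data.
import statistics

def recovery_percent(found, spiked):
    """Percent of a spiked amount recovered by the method."""
    return 100.0 * found / spiked

def rsd_percent(values):
    """Repeatability as relative standard deviation (RSD%)."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def lod_loq(sigma_blank, slope):
    """Detection and quantitation limits from blank noise and calibration slope."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

replicates = [4.1, 4.3, 4.0, 4.2, 4.4]  # repeat determinations (made up)
print(round(recovery_percent(8.4, 10.0), 1))  # → 84.0
print(round(rsd_percent(replicates), 2))      # → 3.76
lod, loq = lod_loq(sigma_blank=0.0021, slope=0.05)
print(round(lod, 2), round(loq, 2))           # → 0.14 0.42
```

The blank noise and slope here were chosen so the sketch reproduces limits of the same magnitude as those reported (0.14 and 0.42 µg/ml); they are not the study's actual calibration parameters.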