WorldWideScience

Sample records for set validation matrix

  1. Set Matrix Theory as a Physically Motivated Generalization of Zermelo-Fraenkel Set Theory

    OpenAIRE

    Cabbolet, Marcoen J. T. F.; de Swart, Harrie C. M.

    2012-01-01

    Recently, the Elementary Process Theory (EPT) has been developed as a set of fundamental principles that might underlie a gravitational repulsion of matter and antimatter. This paper presents set matrix theory (SMT) as the foundation of the mathematical-logical framework in which the EPT has been formalized: Zermelo-Fraenkel set theory (ZF), namely, cannot be used as such. SMT is a generalization of ZF: whereas ZF uses only sets as primitive objects, in the framework of SMT finite matrices wi...

  2. 45 CFR 162.1011 - Valid code sets.

    Science.gov (United States)

    2010-10-01

    45 CFR § 162.1011 (Title 45, Public Welfare; Administrative Requirements, Code Sets): Valid code sets. Each code set is valid within the dates specified by the organization responsible for maintaining that code set.

  3. A General Method of Empirical Q-matrix Validation.

    Science.gov (United States)

    de la Torre, Jimmy; Chiu, Chia-Yi

    2016-06-01

    In contrast to unidimensional item response models that postulate a single underlying proficiency, cognitive diagnosis models (CDMs) posit multiple, discrete skills or attributes, thus allowing CDMs to provide a finer-grained assessment of examinees' test performance. A common component of CDMs for specifying the attributes required by each item is the Q-matrix. Although construction of the Q-matrix is typically performed by domain experts, it nonetheless remains, to a large extent, a subjective process, and misspecifications in the Q-matrix, if left unchecked, can have important practical implications. To address this concern, this paper proposes a discrimination index that can be used with a wide class of CDMs subsumed by the generalized deterministic input, noisy "and" gate (G-DINA) model to empirically validate the Q-matrix specifications by identifying and replacing misspecified entries. The rationale for using the index as the basis of the proposed validation method is provided in the form of mathematical proofs of several relevant lemmas and a theorem. The feasibility of the proposed method was examined using simulated data generated under various conditions, and the method is illustrated using fraction subtraction data.
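
    The paper's full derivation rests on the proofs mentioned above; as a rough, hedged sketch of the general idea only (not the authors' exact index or algorithm), the Python snippet below scores candidate q-vectors for a single item with a variance-style discrimination index and keeps the best-scoring one. The data, names, and uniform attribute-profile weights are illustrative assumptions.

```python
import itertools
import numpy as np

def discrimination_index(p_correct, weights):
    """Weighted variance of item success probabilities across latent classes
    (a GDI-style index: larger means the item separates the classes better)."""
    mean_p = np.dot(weights, p_correct)
    return np.dot(weights, (p_correct - mean_p) ** 2)

# Toy setting with K = 2 attributes -> 4 latent profiles (00, 01, 10, 11).
profiles = np.array(list(itertools.product([0, 1], repeat=2)))
weights = np.full(len(profiles), 0.25)        # assumed uniform profile distribution

# Hypothetical item: success probability for each profile; attribute 1 drives success.
p_item = np.array([0.20, 0.25, 0.80, 0.90])

best_q, best_index = None, -np.inf
for q in profiles[1:]:                        # skip the all-zero q-vector
    # Profiles mastering every attribute required by q vs. those that do not.
    masters = np.all(profiles >= q, axis=1)
    p_group = np.array([np.average(p_item[~masters], weights=weights[~masters]),
                        np.average(p_item[masters], weights=weights[masters])])
    w_group = np.array([weights[~masters].sum(), weights[masters].sum()])
    index = discrimination_index(p_group, w_group)
    if index > best_index:
        best_q, best_index = q, index

print("q-vector with the largest discrimination index:", best_q)   # expected: [1 0]
```

    A faithful implementation would also favour the simplest q-vector whose index is close to the maximum, which is part of what the paper's method formalizes.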

  4. Analysis of gene set using shrinkage covariance matrix approach

    Science.gov (United States)

    Karjanto, Suryaefiza; Aripin, Rasimah

    2013-09-01

    Microarray methodology has been exploited for different applications such as gene discovery and disease diagnosis. This technology is also used for quantitative and highly parallel measurements of gene expression. Recently, microarrays have become one of the main interests of statisticians because they provide a perfect example of the paradigms of modern statistics. In this study, an alternative approach to estimating the covariance matrix is proposed to solve the high-dimensionality problem in microarrays. An extension of the traditional Hotelling's T2 statistic is constructed for determining significant gene sets across experimental conditions using a shrinkage approach. Real data sets were used as illustrations to compare the performance of the proposed method with other methods. The results across the methods are consistent, implying that this approach provides an alternative to existing techniques.
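
    The abstract does not spell out which shrinkage estimator is used, so the sketch below only illustrates the general recipe: plug a shrunk (and therefore invertible) covariance estimate into a two-sample Hotelling's T2 for one gene set. The Ledoit-Wolf estimator from scikit-learn, the toy data, and the set size are assumptions, not the authors' choices.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

def shrinkage_hotelling_t2(x, y):
    """Two-sample Hotelling's T2 using a shrunk pooled covariance estimate.
    x, y: (samples x genes) expression matrices for one gene set under two conditions."""
    n1, n2 = len(x), len(y)
    diff = x.mean(axis=0) - y.mean(axis=0)
    pooled = np.vstack([x - x.mean(axis=0), y - y.mean(axis=0)])
    cov = LedoitWolf().fit(pooled).covariance_      # invertible even if genes > samples
    return (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(cov, diff)

rng = np.random.default_rng(0)
genes_in_set = 30                                   # more genes than samples per group
control = rng.normal(size=(10, genes_in_set))
treated = rng.normal(loc=0.5, size=(10, genes_in_set))
print("T2 for this gene set:", shrinkage_hotelling_t2(control, treated))
```

    In practice the significance of T2 for each gene set would be assessed by permuting the sample labels, since the classical F-distribution no longer applies once the covariance is shrunk.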

  5. A Method of Q-Matrix Validation for the Linear Logistic Test Model.

    Science.gov (United States)

    Baghaei, Purya; Hohensinn, Christine

    2017-01-01

    The linear logistic test model (LLTM) is a well-recognized psychometric model for examining the components of difficulty in cognitive tests and validating construct theories. The plausibility of the construct model, summarized in a matrix of weights known as the Q-matrix or weight matrix, is tested by (1) comparing the fit of the LLTM with the fit of the Rasch model (RM) using the likelihood ratio (LR) test and (2) examining the correlation between the Rasch model item parameters and the LLTM-reconstructed item parameters. The problem with the LR test is that it is almost always significant and, consequently, the LLTM is rejected. The drawback of examining the correlation coefficient is that there is no cut-off value or lower bound for the magnitude of the correlation coefficient. In this article we suggest a simulation method to set a minimum benchmark for the correlation between item parameters from the Rasch model and those reconstructed by the LLTM. If the cognitive model is valid, then the correlation coefficient between the RM-based item parameters and the LLTM-reconstructed item parameters derived from the theoretical weight matrix should be greater than the correlations derived from the simulated matrices.
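
    As a hedged illustration of the simulation benchmark (not the authors' implementation, which estimates the LLTM properly rather than by least squares), the sketch below reconstructs item difficulties as a linear combination of basic parameters for the theoretical weight matrix and for many random weight matrices of the same size, and compares the resulting correlations. All data and names are made up.

```python
import numpy as np

rng = np.random.default_rng(42)
n_items, n_operations = 20, 5

# Theoretical weight (Q) matrix and toy Rasch difficulties that roughly follow it.
Q_theory = rng.integers(0, 2, (n_items, n_operations))
eta_true = rng.normal(size=n_operations)
rasch_beta = Q_theory @ eta_true + rng.normal(scale=0.3, size=n_items)

def reconstructed_correlation(Q, beta):
    """Least-squares stand-in for the LLTM reconstruction beta ~ Q @ eta:
    correlate the fitted (reconstructed) difficulties with the Rasch difficulties."""
    eta, *_ = np.linalg.lstsq(Q, beta, rcond=None)
    return np.corrcoef(Q @ eta, beta)[0, 1]

observed_r = reconstructed_correlation(Q_theory, rasch_beta)

# Benchmark distribution: correlations obtained from random weight matrices.
simulated_r = [reconstructed_correlation(rng.integers(0, 2, Q_theory.shape), rasch_beta)
               for _ in range(1000)]
benchmark = np.percentile(simulated_r, 95)

print(f"observed r = {observed_r:.2f}, 95th percentile of simulated r = {benchmark:.2f}")
print("construct model supported" if observed_r > benchmark else "not above the benchmark")
```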

  6. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  7. Validation of the PHEEM instrument in a Danish hospital setting

    DEFF Research Database (Denmark)

    Aspegren, Knut; Bastholt, Lars; Bested, K.M.

    2007-01-01

    The Postgraduate Hospital Educational Environment Measure (PHEEM) has been translated into Danish and then validated with good internal consistency by 342 Danish junior and senior hospital doctors. Four of the 40 items are culturally dependent in the Danish hospital setting. Factor analysis...

  8. Robust Validation Of Approximate 1-Matrix Functionals With Few-Electron Harmonium Atoms

    CERN Document Server

    Cioslowski, Jerzy; Matito, Eduard

    2015-01-01

    A simple comparison between the exact and approximate correlation components U of the electron-electron repulsion energy of several states of few-electron harmonium atoms with varying confinement strengths provides a superior validation tool for 1-matrix functionals. The robustness of this tool is clearly demonstrated in a survey of 14 known functionals, which reveals their substandard performance within different electron correlation regimes. Unlike spot-testing that employs dissociation curves of diatomic molecules or more extensive benchmarking against experimental atomization energies of molecules comprising one of the standard sets, the present approach not only uncovers the flaws and patent failures of the functionals but, even more importantly, allows for pinpointing their root causes. Since the approximate values of U are computed at exact 1-densities, the testing requires minimal programming, and thus is particularly useful in quick screening of new functionals.

  9. Development and validation of a job exposure matrix for physical risk factors in low back pain.

    Directory of Open Access Journals (Sweden)

    Svetlana Solovieva

    Full Text Available OBJECTIVES: The aim was to construct and validate a gender-specific job exposure matrix (JEM) for physical exposures to be used in epidemiological studies of low back pain (LBP). MATERIALS AND METHODS: We utilized two large Finnish population surveys, one to construct the JEM and another to test matrix validity. The exposure axis of the matrix included exposures relevant to LBP (heavy physical work, heavy lifting, awkward trunk posture and whole body vibration) and exposures that increase the biomechanical load on the low back (arm elevation) or those that in combination with other known risk factors could be related to LBP (kneeling or squatting). Job titles with similar work tasks and exposures were grouped. Exposure information was based on face-to-face interviews. Validity of the matrix was explored by comparing the JEM (group-based binary measures) with individual-based measures. The predictive validity of the matrix against LBP was evaluated by comparing the associations of the group-based (JEM) exposures with those of individual-based exposures. RESULTS: The matrix includes 348 job titles, representing 81% of all Finnish job titles in the early 2000s. The specificity of the constructed matrix was good, especially in women. The validity measured with the kappa statistic ranged from good to poor, being fair for most exposures. In men, all group-based (JEM) exposures were statistically significantly associated with one-month prevalence of LBP. In women, four out of six group-based exposures showed an association with LBP. CONCLUSIONS: The gender-specific JEM for physical exposures showed relatively high specificity without compromising sensitivity. The matrix can therefore be considered a valid instrument for exposure assessment in large-scale epidemiological studies, when more precise but more labour-intensive methods are not feasible. Although the matrix was based on Finnish data we foresee that it could be applicable, with some modifications, in
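
    For readers who want to reproduce the kind of agreement check described above, here is a minimal, purely illustrative sketch (hypothetical data, not the Finnish surveys): it compares JEM-assigned binary exposure with individually reported exposure using Cohen's kappa, sensitivity, and specificity.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical binary vectors for one exposure (e.g. heavy lifting):
# interview = exposure reported by the individual worker,
# jem       = exposure assigned from the worker's job-title group.
rng = np.random.default_rng(1)
interview = rng.integers(0, 2, 500)
jem = np.where(rng.random(500) < 0.8, interview, 1 - interview)  # ~80% raw agreement

kappa = cohen_kappa_score(interview, jem)
tn, fp, fn, tp = confusion_matrix(interview, jem).ravel()
sensitivity = tp / (tp + fn)      # exposed workers the JEM correctly flags
specificity = tn / (tn + fp)      # unexposed workers the JEM correctly leaves unflagged
print(f"kappa = {kappa:.2f}, sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```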

  10. [An optimal selection method of samples of calibration set and validation set for spectral multivariate analysis].

    Science.gov (United States)

    Liu, Wei; Zhao, Zhong; Yuan, Hong-Fu; Song, Chun-Feng; Li, Xiao-Yu

    2014-04-01

    The side effects in spectral multivariate modeling caused by the uneven distribution of sample numbers across the regions of the calibration set and validation set were analyzed, and the "average" phenomenon, in which samples with small property values are predicted with larger values and samples with large property values are predicted with smaller values in spectral multivariate calibration, is shown in this paper. Considering the distribution features of the spectral space and the property space simultaneously, a new method of optimal sample selection named Rank-KS is proposed. Rank-KS aims at improving the uniformity of the calibration set and validation set. The Y-space is divided into several regions uniformly, and samples for the calibration set and validation set are extracted by the Kennard-Stone (KS) and Random-Select (RS) algorithms respectively in every region, so that the calibration set is distributed evenly and is strongly representative. The proposed method was applied to the prediction of dimethyl carbonate (DMC) content in gasoline from infrared spectra and of dimethyl sulfoxide content in its aqueous solution from near-infrared spectra. The "average" phenomenon shown in the prediction of the multiple linear regression (MLR) model of dimethyl sulfoxide was weakened effectively by Rank-KS. For comparison, the MLR and PLS1 models of DMC and dimethyl sulfoxide were constructed using the RS, KS, Rank-Select, sample set partitioning based on joint X- and Y-blocks (SPXY) and proposed Rank-KS algorithms to select the calibration set, respectively. Application results verified that the best prediction was achieved using Rank-KS. Especially for sample sets with more samples in the middle and fewer on the boundaries, or none in some local regions, the prediction of a model constructed from a calibration set selected using Rank-KS can be improved markedly.
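
    The exact Rank-KS procedure is defined in the paper; the sketch below only illustrates its gist under stated assumptions: bin the property (Y) values, then apply a Kennard-Stone-style maximin selection in spectral (X) space within each bin so that both sets cover the property range. The bin count, split fraction, and data are invented.

```python
import numpy as np

def kennard_stone(X, n_select):
    """Greedy maximin selection: repeatedly pick the sample farthest
    (Euclidean distance) from the samples already selected."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    selected = [int(np.argmax(dist.sum(axis=1)))]      # start from the most remote sample
    while len(selected) < n_select:
        remaining = [i for i in range(len(X)) if i not in selected]
        selected.append(max(remaining, key=lambda i: dist[i, selected].min()))
    return selected

def rank_ks_like_split(X, y, n_bins=5, frac_cal=0.7):
    """Rank-KS-style split (illustrative): bin y, then pick calibration samples
    within each bin by Kennard-Stone and leave the remainder for validation."""
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(y, edges)
    cal_idx = []
    for b in np.unique(bins):
        members = np.where(bins == b)[0]
        n_cal = max(1, int(round(frac_cal * len(members))))
        cal_idx.extend(members[kennard_stone(X[members], n_cal)])
    val_idx = sorted(set(range(len(y))) - set(cal_idx))
    return sorted(cal_idx), val_idx

# Toy "spectra": 60 samples x 50 wavelengths with a property y tied to a few channels.
rng = np.random.default_rng(3)
X = rng.normal(size=(60, 50))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)
cal, val = rank_ks_like_split(X, y)
print(len(cal), "calibration samples,", len(val), "validation samples")
```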

  11. Empirical validation data sets for double skin facade models

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During recent years application of double skin facades (DSF) has greatly increased. However, successful application depends heavily on reliable and validated models for simulation of the DSF performance and this in turn requires access to high quality experimental data. Three sets of accurate...... empirical data for validation of DSF modeling with building simulation software were produced within the International Energy Agency (IEA) SHCTask 34 / ECBCS Annex 43. This paper describes the full-scale outdoor experimental test facility, the experimental set-up and the measurements procedure....... The empirical data is composed for key-operating modes, i.e. external air curtain mode (summer cooling), thermal insulation mode (all openings are closed) and air pre-heating mode (heating season) and consist of boundary conditions and DSF performance parameters. The DSF performance parameters discussed...

  12. TRAC-P validation test matrix. Revision 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, E.D.; Boyack, B.E.

    1997-09-05

    This document briefly describes the elements of the Nuclear Regulatory Commission's (NRC's) software quality assurance program leading to software (code) qualification and identifies a test matrix for qualifying Transient Reactor Analysis Code (TRAC)-Pressurized Water Reactor Version (-P), or TRAC-P, to the NRC's software quality assurance requirements. Code qualification is the outcome of several software life-cycle activities, specifically, (1) Requirements Definition, (2) Design, (3) Implementation, and (4) Qualification Testing. The major objective of this document is to define the TRAC-P Qualification Testing effort.

  13. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    During the recent years the attention to the double skin facade (DSF) concept has greatly increased. Nevertheless, the application of the concept depends on whether a reliable model for simulation of the DSF performance will be developed or pointed out. This is, however, not possible to do, until...... the model is empirically validated and its limitations for the DSF modeling are identified. Correspondingly, the existence and availability of the experimental data is very essential. Three sets of accurate empirical data for validation of DSF modeling with building simulation software were produced within...... of a double skin facade: 1. External air curtain mode, it is the naturally ventilated DSF cavity with the top and bottom openings open to the outdoor; 2. Thermal insulation mode, when all of the DSF openings closed; 3. Preheating mode, with the bottom DSF openings open to the outdoor and top openings open...

  14. The Visual Matrix Method: Imagery and Affect in a Group-Based Research Setting

    Directory of Open Access Journals (Sweden)

    Lynn Froggett

    2015-07-01

    Full Text Available The visual matrix is a method for researching shared experience, stimulated by sensory material relevant to a research question. It is led by imagery, visualization and affect, which in the matrix take precedence over discourse. The method enables the symbolization of imaginative and emotional material, which might not otherwise be articulated and allows "unthought" dimensions of experience to emerge into consciousness in a participatory setting. We describe the process of the matrix with reference to the study "Public Art and Civic Engagement" (FROGGETT, MANLEY, ROY, PRIOR & DOHERTY, 2014) in which it was developed and tested. Subsequently, examples of its use in other contexts are provided. Both the matrix and post-matrix discussions are described, as is the interpretive process that follows. Theoretical sources are highlighted: its origins in social dreaming; the atemporal, associative nature of the thinking during and after the matrix which we describe through the Deleuzian idea of the rhizome; and the hermeneutic analysis which draws from object relations theory and the Lorenzerian tradition of scenic understanding. The matrix has been conceptualized as a "scenic rhizome" to account for its distinctive quality and hybrid origins in research practice. The scenic rhizome operates as a "third" between participants and the "objects" of contemplation. We suggest that some of the drawbacks of other group-based methods are avoided in the visual matrix—namely the tendency for inter-personal dynamics to dominate the event. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs150369

  15. Discriminant Validity of the WISC-IV Culture-Language Interpretive Matrix

    Science.gov (United States)

    Styck, Kara M.; Watkins, Marley W.

    2014-01-01

    The Culture-Language Interpretive Matrix (C-LIM) was developed to help practitioners determine the validity of test scores obtained from students who are culturally and linguistically different from the normative group of a test. The present study used an idiographic approach to investigate the diagnostic utility of the C-LIM for the Wechsler…

  16. Test and measurement procedures to set up the Quality-/Energy-Matrix for UPS

    Energy Technology Data Exchange (ETDEWEB)

    Schnyder, G.; Mauchle, P.

    2005-03-15

    This comprehensive report for the Swiss Federal Office of Energy (SFOE) is one of a set of nine reports that provide an overall review of the energy-efficiency of UPS systems. This report takes a look at the test and energy-measurement procedures necessary for the setting up of a quality/energy matrix. General definitions applicable to all measurements are defined along with specific test definitions and measurement criteria for specific faults and failures caused, for example, by variations in supply frequency and transients. Also considered are the measurement of power-factors and distortions as well as losses and efficiency. The report is completed with an appendix concerning standards along with the quality/energy matrix form itself.

  17. A set of pathological tests to validate new finite elements

    Indian Academy of Sciences (India)

    …of fixed type, i.e., an element selected for a specific application and with a given dof configuration. Sze (1996): Admissible matrix formulation for efficient construction of multifield finite element models which employ the patch test to identify the con…

  18. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    Science.gov (United States)

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.
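
    The heuristic itself is beyond a short example, but the bookkeeping it relies on, keeping only the non-dominated (Pareto efficient) objective vectors among candidate permutations, is easy to show. The sketch below scores every permutation of a small random proximity matrix on two invented minimization criteria and filters the Pareto front; the criteria merely stand in for whichever objectives a study actually combines.

```python
import itertools
import numpy as np

def objectives(perm, D):
    """Two illustrative minimization criteria for a permuted proximity matrix:
    row-wise and column-wise anti-Robinson violation counts."""
    P = D[np.ix_(perm, perm)]
    n = len(perm)
    rows = sum(P[i, j] < P[i, j - 1] for i in range(n) for j in range(i + 2, n))
    cols = sum(P[i, j] < P[i + 1, j] for j in range(n) for i in range(j - 1))
    return int(rows), int(cols)

def pareto_front(points):
    """Keep points not dominated by any other (smaller is better on every objective)."""
    return [p for p in points
            if not any(all(q[k] <= p[k] for k in range(len(p))) and q != p for q in points)]

rng = np.random.default_rng(7)
D = rng.random((6, 6)); D = (D + D.T) / 2; np.fill_diagonal(D, 0)

scored = {perm: objectives(perm, D) for perm in itertools.permutations(range(6))}
front = set(pareto_front(list(scored.values())))
efficient = [perm for perm, obj in scored.items() if obj in front]
print(len(efficient), "Pareto efficient permutations out of", len(scored))
```

    An exact weighted-sum solver would recover only the supported points of such a front; the paper's point is that unsupported points matter too, and a dominance filter like the one above retains them whenever a heuristic visits them.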

  19. A set of pathological tests to validate new finite elements

    Indian Academy of Sciences (India)

    Hence it is essential to subject all new finite elements to an adequate set of pathological tests in order to assess their performance. Many such ... We present an adequate set of tests, which every new finite element should pass. ... Department of Mechanical Engineering, Indian Institute of Science, Bangalore 560 012, India ...

  20. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

    Full Text Available Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed gene set matrix analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets related by known biological function or as designated solely by the end-user against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
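
    The abstract does not state which test statistic GSMA uses, so the snippet below is only a generic stand-in for the idea: score each gene set in each dataset by asking whether its expression changes sit above the rest of the distribution (here with a one-sided Mann-Whitney test). The gene-set names, sizes, and data are fabricated for illustration.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical inputs: one column of log fold-changes per dataset (genes x datasets)
# and a dictionary mapping gene-set names to member row indices.
rng = np.random.default_rng(5)
fold_changes = rng.normal(size=(2000, 3))
fold_changes[:50, 0] += 1.0                      # plant an up-regulated set in dataset 0
gene_sets = {"setA": np.arange(50), "random": rng.choice(2000, 50, replace=False)}

# GSMA-style result matrix: enrichment p-values, gene sets (rows) x datasets (columns).
for name, members in gene_sets.items():
    rest = np.setdiff1d(np.arange(len(fold_changes)), members)
    pvals = [mannwhitneyu(fold_changes[members, d], fold_changes[rest, d],
                          alternative="greater").pvalue
             for d in range(fold_changes.shape[1])]
    print(name, ["%.3g" % p for p in pvals])
```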

  1. Data Set for Empirical Validation of Double Skin Facade Model

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Jensen, Rasmus Lund; Heiselberg, Per

    2008-01-01

    of a double skin facade: 1. External air curtain mode, it is the naturally ventilated DSF cavity with the top and bottom openings open to the outdoor; 2. Thermal insulation mode, when all of the DSF openings closed; 3. Preheating mode, with the bottom DSF openings open to the outdoor and top openings open...... the International Energy Agency (IEA) Task 34 Annex 43. This paper describes the full-scale outdoor experimental test facility ‘the Cube', where the experiments were conducted, the experimental set-up and the measurements procedure for the data sets. The empirical data is composed for the key-functioning modes...... etc. Parameters of the DSF performance discussed in the paper are: the temperature gradients in the DSF cavity, mass flow rate in the naturally ventilated cavity, surface temperatures, etc....

  2. Validation of the PHQ-15 for Somatoform Disorder in the Occupational Health Care Setting

    NARCIS (Netherlands)

    de Vroege, Lars; Hoedeman, Rob; Nuyen, Jasper; Sijtsma, Klaas; van der Feltz-Cornelis, Christina M.

    Introduction Within the occupational health setting, somatoform disorders are a frequent cause of sick leave. Few validated screening questionnaires for these disorders are available. The aim of this study is to validate the PHQ-15 in this setting. Methods In a cross-sectional study of 236

  3. Validation of neutron current formulations for the response matrix method based on the SP3 theory

    Energy Technology Data Exchange (ETDEWEB)

    Tada, Kenichi, E-mail: k-tada@fermi.nucl.nagoya-u.ac.j [Department of Materials, Physics and Energy Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Yamamoto, Akio; Yamane, Yoshihiro [Department of Materials, Physics and Energy Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603 (Japan); Kosaka, Shinya; Hirano, Gou [TEPCOSYSTEMS CORPORATION, 2-37-28, Eitai, Koto-ku, Tokyo 135-0034 (Japan)

    2010-01-15

    The pin-by-pin fine mesh BWR core analysis code SUBARU has been developed as a next-generation BWR core analysis code. SUBARU is based on the SP3 theory, and the response matrix method is used for flux calculations. The SP3 theory involves the 0th and 2nd order neutron fluxes. Therefore, the relations among the 0th and 2nd order partial neutron currents and the fluxes are required to apply the response matrix method. In SUBARU, the relations among the partial neutron currents and the fluxes are approximated to be similar to those of diffusion theory. Our previous study revealed that the prediction accuracy of SUBARU is much higher than that of conventional core analysis codes. However, the validity of the above approximation has not been directly investigated so far. Therefore, the relations among the partial neutron currents and the fluxes are theoretically derived, and calculation results with the rigorous and the conventional formulations are compared. The calculation results indicate that the approximation of the conventional formulation is appropriate for BWR core analysis.

  4. Assessing the validity of commercial and municipal food environment data sets in Vancouver, Canada.

    Science.gov (United States)

    Daepp, Madeleine Ig; Black, Jennifer

    2017-10-01

    The present study assessed systematic bias and the effects of data set error on the validity of food environment measures in two municipal and two commercial secondary data sets. Sensitivity, positive predictive value (PPV) and concordance were calculated by comparing two municipal and two commercial secondary data sets with ground-truthed data collected within 800 m buffers surrounding twenty-six schools. Logistic regression examined associations of sensitivity and PPV with commercial density and neighbourhood socio-economic deprivation. Kendall's τ estimated correlations between density and proximity of food outlets near schools constructed with secondary data sets v. ground-truthed data. SETTING: Vancouver, Canada. SUBJECTS: Food retailers located within 800 m of twenty-six schools. RESULTS: All data sets scored relatively poorly across validity measures, although, overall, municipal data sets had higher levels of validity than did commercial data sets. Food outlets were more likely to be missing from municipal health inspections lists and commercial data sets in neighbourhoods with higher commercial density. Still, both proximity and density measures constructed from all secondary data sets were highly correlated (Kendall's τ>0·70) with measures constructed from ground-truthed data. Despite relatively low levels of validity in all secondary data sets examined, food environment measures constructed from secondary data sets remained highly correlated with ground-truthed data. Findings suggest that secondary data sets can be used to measure the food environment, although estimates should be treated with caution in areas with high commercial density.
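
    As a small illustration of the validity measures named above (with made-up counts and densities, not the Vancouver data), the snippet below computes sensitivity and PPV from outlet-level matches, and Kendall's tau between school-level densities from a secondary source and ground truth.

```python
import numpy as np
from scipy.stats import kendalltau

# Hypothetical outlet-level match counts for one secondary data set vs ground truth:
# tp = present in both, fn = missed by the data set, fp = listed but not found on the ground.
tp, fp, fn = 180, 40, 60
sensitivity = tp / (tp + fn)          # share of real outlets the data set captures
ppv = tp / (tp + fp)                  # share of listed outlets that actually exist
print(f"sensitivity = {sensitivity:.2f}, PPV = {ppv:.2f}")

# Correlation between school-level outlet densities from the two sources.
rng = np.random.default_rng(11)
density_ground = rng.poisson(20, size=26)                     # 26 school buffers
density_secondary = density_ground + rng.integers(-3, 4, 26)  # imperfect but related counts
tau, p = kendalltau(density_ground, density_secondary)
print(f"Kendall's tau = {tau:.2f} (p = {p:.3f})")
```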

  5. Experimental validation of the partial coherence model in spectroscopic ellipsometry and Mueller matrix polarimetry

    Science.gov (United States)

    Miranda-Medina, M.; Garcia-Caurel, E.; Peinado, A.; Stchakovsky, M.; Hingerl, K.; Ossikovski, R.

    2017-11-01

    In this contribution, a recently advanced analytical approach for addressing partial coherence in spectroscopic polarimetric measurements is experimentally validated. The approach is based on the fundamental representation of the measurement process as the convolution of the polarimetric response of the sample and the instrumental function of the measurement system. Experimentally, the optical responses of two optically thick transparent layers were acquired by using spectroscopic Mueller matrix polarimetry at various angles of incidence over two spectral ranges (visible and infrared). The layers are considered isotropic and the loss of coherence is assumed to originate from the finite spectral resolution of the instrument. In parallel with the analytical approximation, the standard numerical approach implemented in commercial software was likewise used to reproduce the polarimetric responses. Excellent agreement between the analytical approximation, the commercial software one and the polarimetric measurements was found. The experimental validation of the analytical approximation represents a time-saving alternative to the numerical approaches used in commercial software and is of potential interest to real-time process monitoring by using spectroscopic ellipsometry or polarimetry.

  6. Criterion-Referenced Tests in Science: An Investigation of Reliability, Validity, and Standards-Setting.

    Science.gov (United States)

    Lang, Harry G.

    1982-01-01

    Reliability, validity, and standards-setting procedure for a criterion-referenced test (Test of Metric Skills) were examined for use in science curricula. Results indicate a number of factors influencing test reliability/validity and that science teachers need to be aware of these factors to enhance accuracy of their judgments. (Author/JN)

  7. Design and regularization of neural networks: the optimal use of a validation set

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai; Svarer, Claus

    1996-01-01

    We derive novel algorithms for estimation of regularization parameters and for optimization of neural net architectures based on a validation set. Regularisation parameters are estimated using an iterative gradient descent scheme. Architecture optimization is performed by approximative...... combinatorial search among the relevant subsets of an initial neural network architecture by employing a validation set based optimal brain damage/surgeon (OBD/OBS) or a mean field combinatorial optimization approach. Numerical results with linear models and feed-forward neural networks demonstrate...

  8. Development and validation of a physical and psychosocial job-exposure matrix in older and retired workers

    NARCIS (Netherlands)

    Rijs, K.J.; van der Pas, S.; Geuskens, G.A.; Cozijnsen, M.R.; Koppes, L.L.J.; van der Beek, A.J.; Deeg, D.

    2014-01-01

    Objectives:A general population job-exposure matrix (GPJEM) including physical and psychosocial demands as well as psychosocial resources applicable to older and retired workers was developed. Its validity was evaluated by examining associations of physical demands and iso-strain (combination of

  9. Nasolaryngoscopic validation of a set of clinical predictors of aspiration in a critical care setting.

    Science.gov (United States)

    Caviedes, Iván R; Lavados, Pablo M; Hoppe, Arnold J; López, María A

    2010-01-01

    Aspiration is frequent in patients with acute neurologic disorders and swallowing dysfunction. Its incidence in stroke, as high as 51%, increases mortality by up to 3 times. Pneumonia, its main complication, further increases morbidity, mortality, and patient care costs. The objective of this study was to evaluate a set of bedside predictors of aspiration ["wet voice," 3-oz water swallow test, and cervical auscultation in an intensive care unit (ICU)] and compare them with nasolaryngoscopy as the gold standard. We conducted a prospective, nonblinded study of bedside predictors of aspiration risks in 65 consecutive ICU patients with an acute neurologic disorder or a severe medical or surgical condition with decreased level of consciousness. Endoscopic aspiration was detected in 17 patients. Sensitivities for wet voice, 3-oz water swallow test, and cervical auscultation were 58.82%, 88.23%, and 82.35%; specificities were 78.26%, 62.50%, and 80.43%. Positive predictive values were 50%, 45.45%, and 60.86%, and negative predictive values were 83.72%, 93.75%, and 92.50%, respectively. Positive likelihood ratios were 2.70, 2.35, and 4.20, respectively. The association of 2 positive clinical predictors, wet voice and cervical auscultation or wet voice and 3-oz water swallow test, improved specificity to 92.85% and 84.61%, positive predictive values to 83.33% and 69.23%, and likelihood ratios to 10.76 and 5.85, respectively. Bedside clinical predictors for aspiration risks are a useful screening tool for ICU patients presenting with risk factors for this complication.

  10. Evaluation of the confusion matrix method in the validation of an automated system for measuring feeding behaviour of cattle.

    Science.gov (United States)

    Ruuska, Salla; Hämäläinen, Wilhelmiina; Kajava, Sari; Mughal, Mikaela; Matilainen, Pekka; Mononen, Jaakko

    2018-03-01

    The aim of the present study was to evaluate empirically confusion matrices in device validation. We compared the confusion matrix method to linear regression and error indices in the validation of a device measuring feeding behaviour of dairy cattle. In addition, we studied how to extract additional information on classification errors with confusion probabilities. The data consisted of 12 h behaviour measurements from five dairy cows; feeding and other behaviour were detected simultaneously with a device and from video recordings. The resulting 216 000 pairs of classifications were used to construct confusion matrices and calculate performance measures. In addition, hourly durations of each behaviour were calculated and the accuracy of measurements was evaluated with linear regression and error indices. All three validation methods agreed when the behaviour was detected very accurately or inaccurately. Otherwise, in the intermediate cases, the confusion matrix method and error indices produced relatively concordant results, but the linear regression method often disagreed with them. Our study supports the use of confusion matrix analysis in validation since it is robust to any data distribution and type of relationship, it makes a stringent evaluation of validity, and it offers extra information on the type and sources of errors. Copyright © 2018 Elsevier B.V. All rights reserved.
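
    A minimal sketch of the confusion-matrix side of such a comparison, using simulated second-by-second labels rather than the study's recordings: build the matrix from device versus video labels, derive sensitivity and specificity, and row-normalise it to obtain the confusion probabilities the authors use to characterise error types.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical second-by-second labels (1 = feeding, 0 = other) for one 12 h recording:
# video is the reference observation, device is the automated measurement.
rng = np.random.default_rng(8)
video = (rng.random(43_200) < 0.3).astype(int)
device = np.where(rng.random(43_200) < 0.9, video, 1 - video)   # ~90% raw agreement

cm = confusion_matrix(video, device)            # rows: video (truth), columns: device
tn, fp, fn, tp = cm.ravel()
print("confusion matrix:\n", cm)
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")

# Confusion probabilities: how each observed behaviour is classified by the device.
print("confusion probabilities:\n", cm / cm.sum(axis=1, keepdims=True))
```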

  11. Review and evaluation of performance measures for survival prediction models in external validation settings.

    Science.gov (United States)

    Rahman, M Shafiqur; Ambler, Gareth; Choodari-Oskooei, Babak; Omar, Rumana Z

    2017-04-18

    When developing a prediction model for survival data, it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell's concordance measure, which tended to increase as censoring increased. We recommend that Uno's concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller's measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston's D is routinely reported to assess discrimination since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings and is recommended for routine reporting. Our recommendation would be to use any of the predictive accuracy measures and provide the corresponding predictive accuracy curves. In addition, we recommend investigating the characteristics
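
    The snippet below sketches two of the recommended quantities, Harrell's concordance and the calibration slope, on simulated external-validation data using the lifelines package (an assumed tooling choice; Uno's concordance and Royston's D need other estimators and are not shown). The prognostic index, data, and package choice are illustrative, not taken from the paper.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

# Hypothetical external validation data: a prognostic index (linear predictor) from a
# previously developed model, plus observed survival times and event indicators.
rng = np.random.default_rng(2)
n = 500
prognostic_index = rng.normal(size=n)
times = rng.exponential(scale=np.exp(-prognostic_index))      # higher index -> shorter survival
events = rng.random(n) < 0.7                                   # ~30% censoring

# Harrell's concordance: pass -index because concordance_index expects
# larger values to mean longer predicted survival.
c_harrell = concordance_index(times, -prognostic_index, events)

# Calibration slope: Cox regression of the outcome on the prognostic index
# in the validation data; a slope near 1 indicates good calibration.
df = pd.DataFrame({"T": times, "E": events.astype(int), "pi": prognostic_index})
slope = CoxPHFitter().fit(df, duration_col="T", event_col="E").params_["pi"]

print(f"Harrell's C = {c_harrell:.2f}, calibration slope = {slope:.2f}")
```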

  12. Construct validity and reliability of structured assessment of endovascular expertise in a simulated setting

    DEFF Research Database (Denmark)

    Bech, B.; Lönn, L.; Falkenberg, M.

    2011-01-01

    Objectives To study the construct validity and reliability of a novel endovascular global rating scale, Structured Assessment of endoVascular Expertise (SAVE). Design A clinical, experimental study. Materials Twenty physicians with endovascular experience ranging from complete novices to highly.... Validity was analysed by correlating experience with performance results. Reliability was analysed according to generalisability theory. Results The mean score on the 29 items of the SAVE scale correlated well with clinical experience (R = 0.84, P ...); ... correlated negatively with clinical experience (R = -0.53, P ...). The validity and reliability of assessment with the SAVE scale was high when applied to performances in a simulation setting with advanced realism. No ceiling effect...

  13. Validation of the Diagnostic Value of Nuclear Matrix Protein 22 Depending on Tumoral Stage and Grade

    Directory of Open Access Journals (Sweden)

    Zoltán A. MIHÁLY

    2013-02-01

    Full Text Available Objectives: The aim of the present study was to validate the sensitivity and specificity of the NMP22® BladderChek® test in our group of patients according to the tumoral stage and grade and to identify the patient categories that might benefit from the non-invasive nature of the NMP22® BladderChek® test. Methods: Voided urine samples from 266 patients with imagistic suspicion of bladder cancer were collected to perform the NMP22® BladderChek® test. The nuclear matrix protein 22 (NMP22) levels were measured by a lateral flow immunochromatographic qualitative assay, using 10 U/ml as the cut-off value. After this, patients underwent transurethral resection of bladder tumors (TUR-BT) followed by histologic grading and tumor staging for proper and optimal patient management. Sensitivity, specificity, and positive predictive value of the NMP22® BladderChek® test were defined for different tumoral stages and grades. Results: Two hundred thirty-eight of the 265 patients had urothelial malignancies (76 pTa, 81 pT1, 37 pT2, 32 pT3, 12 pT4, 27 pT0; 118 G1, 54 G2, 64 G3). The sensitivity was 0.629 [0.612; 0.629] for the NMP22® BladderChek® test while the specificity was equal to 1 [0.851; 1]. The positive predictive value was 1 [0.973; 1], and the negative predictive value was 0.235 [0.200; 0.235]. Conclusions: The results demonstrate that, even though the NMP22® BladderChek® is an easily applied test that gives diagnostic findings within 30 min, it cannot be recommended for screening or surveillance in routine clinical use in non-muscle-invasive bladder cancer because of its poor sensitivity.

  14. Development and validation of an Argentine set of facial expressions of emotion.

    Science.gov (United States)

    Vaiman, Marcelo; Wagner, Mónica Anna; Caicedo, Estefanía; Pereno, Germán Leandro

    2017-02-01

    Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion research is receiving in this region. Here we present the development and validation of the Universidad Nacional de Cordoba, Expresiones de Emociones Faciales (UNCEEF), a Facial Action Coding System (FACS)-verified set of pictures of Argentineans expressing the six basic emotions, plus neutral expressions. FACS scores, recognition rates, Hu scores, and discrimination indices are reported. Evidence of convergent validity was obtained using the Pictures of Facial Affect in an Argentine sample. However, recognition accuracy was greater for UNCEEF. The importance of local sets of emotion pictures is discussed.

  15. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults.

    Science.gov (United States)

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development-The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions-angry, fearful, sad, happy, surprised, and disgusted-and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  16. The Child Affective Facial Expression (CAFE) Set: Validity and Reliability from Untrained Adults

    Directory of Open Access Journals (Sweden)

    Vanessa eLoBue

    2015-01-01

    Full Text Available Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for 6 emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  17. Risk Analysis and Setting Priorities in Air Traffic Control by Using a Matrix of Similarities

    Directory of Open Access Journals (Sweden)

    Lacane Monta

    2015-11-01

    Full Text Available This article considers how mathematical decision-making in Air Traffic Control could be done in order to minimize the risk of collisions. An example of how to prioritize airplanes which are in the vicinity of an airport according to their level of risk in respect to other airplanes is given by using a matrix of similarities and Euclidean metric. The analysis has shown that it is necessary to classify ATC specialists and ATC centers according to their ability to provide safe enough service using time methods and highly experienced team work.
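
    Purely as an illustration of the mechanics mentioned above (a matrix of similarities built from the Euclidean metric), the sketch below ranks four invented aircraft by their strongest similarity to any other aircraft; the state variables, the similarity transform, and the ranking rule are all assumptions, not the article's model.

```python
import numpy as np

# Illustrative only: state vectors for aircraft near an airport (arbitrary units);
# columns might be x, y, altitude. Real prioritisation would use far richer data.
states = np.array([
    [ 0.0,  0.0, 10.0],   # aircraft A
    [ 1.0,  0.5,  9.5],   # aircraft B (close to A)
    [12.0,  8.0, 11.0],   # aircraft C (far from everyone)
    [ 1.5,  0.2,  9.8],   # aircraft D (close to A and B)
])

# Matrix of similarities derived from pairwise Euclidean distances.
dist = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
similarity = 1.0 / (1.0 + dist)
np.fill_diagonal(similarity, 0.0)

# Prioritise by each aircraft's strongest similarity to (i.e. proximity with) any other.
priority = similarity.max(axis=1)
for rank, idx in enumerate(np.argsort(-priority), start=1):
    print(f"{rank}. aircraft {'ABCD'[idx]} (max similarity {priority[idx]:.2f})")
```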

  18. Validation of Virtual Learning Team Competencies for Individual Students in a Distance Education Setting

    Science.gov (United States)

    Topchyan, Ruzanna; Zhang, Jie

    2014-01-01

    The purpose of this study was twofold. First, the study aimed to validate the scale of the Virtual Team Competency Inventory in distance education, which had initially been designed for a corporate setting. Second, the methodological advantages of Exploratory Structural Equation Modeling (ESEM) framework over Confirmatory Factor Analysis (CFA)…

  19. Determining Gestational Age in a Low-resource Setting: Validity of Last Menstrual Period

    OpenAIRE

    Rosenberg, Rebecca E.; Ahmed, A.S.M. Nawshad U.; Ahmed, Saifuddin; Saha, Samir K.; Chowdhury, M.A.K. Azad; Robert E Black; Santosham, Mathuram; Darmstadt, Gary L

    2009-01-01

    The validity of three methods (last menstrual period [LMP], Ballard and Dubowitz scores) for assessment of gestational age for premature infants in a low-resource setting was assessed, using antenatal ultrasound as the gold standard. It was hypothesized that LMP and other methods would perform similarly in determining postnatal gestational age. Concordance analysis was applied to data on 355 neonates of

  20. Developing Treatment, Treatment Validation & Treatment Scope in the Setting of an Autism Clinical Trial

    Science.gov (United States)

    2011-10-01

    PRINCIPAL INVESTIGATOR: T. Peter Stein, Ph.D., School of Osteopathic Medicine, Piscataway, NJ 08854.

  1. Analysis and classification of data sets for calibration and validation of agro-ecosystem models

    DEFF Research Database (Denmark)

    Kersebaum, K C; Boote, K J; Jorgenson, J S

    2015-01-01

    Experimental field data are used at different levels of complexity to calibrate, validate and improve agro-ecosystem models to enhance their reliability for regional impact assessment. A methodological framework and software are presented to evaluate and classify data sets into four classes...

  2. Validating justice climate and peer justice in a real work setting

    Directory of Open Access Journals (Sweden)

    Agustin Molina

    2016-12-01

    Full Text Available In this study we tested the validity of justice climate and peer justice, measured as second-order constructs, in a real work setting. First, we investigated the appropriateness of aggregating first-order facets of justice climate and peer justice to work-unit level of analysis. Second, we examined the construct validity of justice climate and peer justice as two different factor structures. Third, we tested the hierarchical structure of justice climate and peer justice as second-order factors. Finally, we examined the predictive validity of second-order factors justice climate and peer justice within a nomological network composed of reciprocity with the supervisor and reciprocity with coworkers. We conducted these analyses in a sample of 532 employees nested in 79 organizations. Our results suggest the validity of justice climate and peer justice measured as second-order factors. We discuss these results and their implications for organizational justice research.

  3. Validation of a job-exposure matrix for assessment of utility worker exposure to magnetic fields.

    Science.gov (United States)

    Johansen, Christoffer; Raaschou-Nielsen, Ole; Skotte, Jørgen; Thomsen, Birthe L; Olsen, Jørgen H

    2002-04-01

    The aim of this study was to evaluate a 50-Hz electromagnetic field job-exposure matrix used in epidemiological studies of a nationwide cohort of utility workers in Denmark. We compared a job-exposure matrix that distinguished four categories of exposure to 50-Hz time-weighted average (TWA) magnetic fields (from low to very high, the latter above 1.0 microT) of utility company employees with 196 measurements of 8-h exposure for 129 workers in this industry. The 129 workers were selected from the following five main work environments: generation facilities, transmission lines, distribution lines, substations, and other electrically and non-electrically related jobs. This study shows that the job-exposure matrix can be expected to introduce misclassification mainly between adjacent categories of exposure. Thus, the distribution of measurements of exposure to 50-Hz magnetic fields was similar for workers in the medium and the high exposure matrix categories, but the two extreme categories satisfactorily separate low and very highly exposed workers. The study shows that epidemiological use of this job-exposure matrix might combine the two intermediate categories of exposure. If the sample size in extreme categories provides enough power, a study in which this job-exposure matrix is used should allow detection of a true association between exposure to 50-Hz magnetic fields and disease.

  4. A multiple criteria decision making for ranking alternatives using preference relation matrix based on intuitionistic fuzzy sets

    Directory of Open Access Journals (Sweden)

    Mehdi Bahramloo

    2013-10-01

    Full Text Available Ranking alternatives has long been under investigation, and there are many methods and techniques for making a decision based on multiple criteria. One of the primary concerns with ranking methodologies such as the analytic hierarchy process (AHP) is that decision makers cannot express their preferences in crisp form. Therefore, linguistic terms are needed to obtain the relative weights for comparing alternatives. In this paper, we discuss ranking alternatives based on a preference relation matrix built on intuitionistic fuzzy sets.

  5. Animal neuropsychology: validation of the Intra-Dimensional Extra-Dimensional set shifting task for mice.

    Science.gov (United States)

    Garner, Joseph P; Thogerson, Collette M; Würbel, Hanno; Murray, James D; Mench, Joy A

    2006-10-02

    Research in animal neuropsychology is providing an exciting new generation of behavioral tests for mice that promise to overcome many of the limitations of current high-throughput testing, and provide direct animal homologues of clinically important measures in human research. Set shifting tasks are some of the best understood and widely used human neuropsychological tasks, with clinical relevance to traumatic brain injury, schizophrenia, autism, obsessive compulsive disorder, trichotillomania, and many other disorders. Here we report the first successful modification of a human set shifting neuropsychological task, the Intra-Dimensional Extra-Dimensional (IDED) task, for use with mice. We presented mice with a series of compound discrimination and reversal tasks where one stimulus dimension consistently cued reward. Task performance improved with a new set of compound stimuli, as did reversal performance--indicating the formation of a cognitive-attentional set. We then overtrained a subset of the mice, and presented control and overtrained mice with a new compound discrimination where a novel stimulus dimension cued reward. As is the case in human control subjects, control mice persisted in responding to the now-incorrect stimulus dimension, performing poorly on this extra-dimensional shift compared with the previous intra-dimensional shift, thereby validating the task as a measure of set shifting. Furthermore, overtrained mice were impaired on this extra-dimensional shift compared with controls, further validating the task. The advantages and disadvantages of the IDED task compared to high-throughput approaches are discussed.

  6. Validation of the 4DSQ somatization subscale in the occupational health care setting as a screener.

    Science.gov (United States)

    de Vroege, Lars; Emons, Wilco H M; Sijtsma, Klaas; Hoedeman, Rob; van der Feltz-Cornelis, Christina M

    2015-03-01

    Somatoform disorders (physical symptoms without medical explanation that cause dysfunction) are prevalent in the occupational health (OH) care setting and are associated with functional impairment and absenteeism. Availability of psychometric instruments aimed at assessing somatoform disorders is limited. In the OH setting, so far only the Patient-Health-Questionnaire 15 has been validated as screener for somatoform disorder, and has been shown to have moderate validity. The Four-Dimensional Symptom Questionnaire (4DSQ) is frequently used in the OH setting but the Somatization subscale is not validated yet. The aim of this study is to validate the 4DSQ Somatization subscale as screener for DSM-IV somatoform disorder in the OH setting by using the MINI interview as gold standard. Employees absent from work due to physical symptoms, for a period longer than 6 weeks and shorter than 2 years, were asked to participate in this study. They filled out the 4DSQ and underwent a MINI interview by telephone for DSM-IV classification. Specificity and sensitivity scores were calculated for all possible cut-off scores and a receiver operator curve was computed for the Somatization subscale. 95 % confidence intervals (95 % CIs) were calculated for sensitivity and specificity. The Somatization subscale of the 4DSQ has an optimal cut point of 9, with specificity and sensitivity equal to 64.3 % [95 % CI (53.6; 73.7 %)] and 60.9 % [95 % CI (40.8; 77.8 %)], respectively. Receiver operator curves showed an area under the curve equal to 0.61 [SE = 0.07; 95 % CI (0.48; 0.75)] for the Somatization subscale of the 4DSQ. The 4DSQ Somatization subscale is a questionnaire of moderate sensitivity and specificity.

  7. Validation of a global scale to assess the quality of interprofessional teamwork in mental health settings.

    Science.gov (United States)

    Tomizawa, Ryoko; Yamano, Mayumi; Osako, Mitue; Hirabayashi, Naotugu; Oshima, Nobuo; Sigeta, Masahiro; Reeves, Scott

    2017-12-01

    Few scales currently exist to assess the quality of interprofessional teamwork through team members' perceptions of working together in mental health settings. The purpose of this study was to revise and validate an interprofessional scale to assess the quality of teamwork in inpatient psychiatric units and to use it multi-nationally. A literature review was undertaken to identify evaluative teamwork tools and develop an additional 12 items to ensure a broad global focus. Focus group discussions considered adaptation to different care systems using subjective judgements from 11 participants in a pre-test of items. Data quality, construct validity, reproducibility, and internal consistency were investigated in the survey using an international comparative design. Exploratory factor analysis yielded five factors with 21 items: 'patient/community centred care', 'collaborative communication', 'interprofessional conflict', 'role clarification', and 'environment'. High overall internal consistency, reproducibility, adequate face validity, and reasonable construct validity were shown in the USA and Japan. The revised Collaborative Practice Assessment Tool (CPAT) is a valid measure to assess the quality of interprofessional teamwork in psychiatry and identifies the best strategies to improve team performance. Furthermore, the revised scale will generate more rigorous evidence for collaborative practice in psychiatry internationally.

  8. Validation of the Essentials of Magnetism II in Chinese critical care settings.

    Science.gov (United States)

    Bai, Jinbing; Hsu, Lily; Zhang, Qing

    2015-05-01

    To translate and evaluate the psychometric properties of the Essentials of Magnetism II tool (EOM II) for Chinese nurses in critical care settings. The EOM II is a reliable and valid scale for measuring the healthy work environment (HWE) for nurses in Western countries, however, it has not been validated among Chinese nurses. The translation of the EOM II followed internationally recognized guidelines. The Chinese version of the Essentials of Magnetism II tool (C-EOM II) was reviewed by an expert panel for culturally semantic equivalence and content validity. Then, 706 nurses from 28 intensive care units (ICUs) affiliated with 14 tertiary hospitals participated in this study. The reliability of the C-EOM II was assessed using the Cronbach's alpha coefficient; the content validity of this scale was assessed using the content validity index (CVI); and the construct validity was assessed using the confirmatory factor analysis (CFA). The C-EOM II showed excellent content validity with a CVI of 0·92. All the subscales of the C-EOM II were significantly correlated with overall nurse job satisfaction and nurse-assessed quality of care. The CFA showed that the C-EOM II was composed of 45 items with nine factors, accounting for 46·51% of the total variance. Cronbach's alpha coefficients for these factors ranged from 0·56 to 0·89. The C-EOM II is a promising scale to assess the HWE for Chinese ICU nurses. Nursing administrators and health care policy-makers can use the C-EOM II to evaluate clinical work environment so that a healthier work environment can be created and sustained for staff nurses. © 2013 British Association of Critical Care Nurses.

  9. Independent validation of the modified prognosis palliative care study predictor models in three palliative care settings.

    Science.gov (United States)

    Baba, Mika; Maeda, Isseki; Morita, Tatsuya; Hisanaga, Takayuki; Ishihara, Tatsuhiko; Iwashita, Tomoyuki; Kaneishi, Keisuke; Kawagoe, Shohei; Kuriyama, Toshiyuki; Maeda, Takashi; Mori, Ichiro; Nakajima, Nobuhisa; Nishi, Tomohiro; Sakurai, Hiroki; Shimoyama, Satofumi; Shinjo, Takuya; Shirayama, Hiroto; Yamada, Takeshi; Ono, Shigeki; Ozawa, Taketoshi; Yamamoto, Ryo; Tsuneto, Satoru

    2015-05-01

    Accurate prognostic information in palliative care settings is needed for patients to make decisions and set goals and priorities. The Prognosis Palliative Care Study (PiPS) predictor models were presented in 2011, but have not yet been fully validated by other research teams. The primary aim of this study is to examine the accuracy and to validate the modified PiPS (using physician-proxy ratings of mental status instead of patient interviews) in three palliative care settings, namely palliative care units, hospital-based palliative care teams, and home-based palliative care services. This multicenter prospective cohort study was conducted in 58 palliative care services including 16 palliative care units, 19 hospital-based palliative care teams, and 23 home-based palliative care services in Japan from September 2012 through April 2014. A total of 2426 subjects were recruited. For reasons including lack of follow-up and missing variables (primarily blood examination data), we obtained analyzable data from 2212 and 1257 patients for the modified PiPS-A and PiPS-B, respectively. In all palliative care settings, both the modified PiPS-A and PiPS-B identified three risk groups with significantly different survival rates across palliative care units, hospital-based palliative care teams, and home-based palliative care services. Copyright © 2015 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  10. An Ethical Issue Scale for Community Pharmacy Setting (EISP): Development and Validation.

    Science.gov (United States)

    Crnjanski, Tatjana; Krajnovic, Dusanka; Tadic, Ivana; Stojkov, Svetlana; Savic, Mirko

    2016-04-01

    Many problems that arise when providing pharmacy services may contain some ethical components, and the aims of this study were to develop and validate a scale that could assess the difficulty of ethical issues, as well as the frequency of their occurrence, in the everyday practice of community pharmacists. Development and validation of the scale was conducted in three phases: (1) generating items for the initial survey instrument after qualitative analysis; (2) defining the design and format of the instrument; (3) validation of the instrument. The constructed Ethical Issue Scale for the community pharmacy setting has two parts containing the same 16 items, assessing difficulty and frequency, respectively. The results of the 171 completely filled out scales were analyzed (response rate 74.89%). The Cronbach's α value was 0.83 for the part of the instrument that examines the difficulty of ethical situations and 0.84 for the part that examines their frequency. Test-retest reliability for both parts of the instrument was satisfactory, with all intraclass correlation coefficient (ICC) values above 0.6 (for the part that examines severity, ICC = 0.809; for the part that examines frequency, ICC = 0.929). The 16-item scale, as a self-assessment tool, demonstrated a high degree of content, criterion, and construct validity and test-retest reliability. The results support its use as a research tool to assess the difficulty and frequency of ethical issues in the community pharmacy setting. The validated scale needs to be further employed on a larger sample of pharmacists.
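
    The two reliability statistics quoted above are straightforward to compute from raw scores. The sketch below (Python, with invented data and a simple two-way consistency form of the ICC, which may differ from the exact ICC variant used in the study) shows the arithmetic behind a Cronbach's α and a test-retest ICC.

      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, n_items) array of item scores."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

      def icc_consistency(scores):
          """ICC(3,1), two-way mixed, consistency form.
          scores: (n_subjects, 2) array holding test and retest totals."""
          x = np.asarray(scores, dtype=float)
          n, k = x.shape
          grand = x.mean()
          ss_total = ((x - grand) ** 2).sum()
          ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()
          ms_rows = ss_rows / (n - 1)
          ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
          return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

      # Simulated responses: 171 pharmacists x 16 items on a 1-5 scale.
      rng = np.random.default_rng(0)
      person = rng.normal(size=(171, 1))                  # latent trait
      responses = np.clip(np.round(3 + person + rng.normal(scale=0.8, size=(171, 16))), 1, 5)
      test_totals = responses.sum(axis=1)
      retest_totals = test_totals + rng.normal(scale=2.0, size=171)
      print(round(cronbach_alpha(responses), 2),
            round(icc_consistency(np.column_stack([test_totals, retest_totals])), 2))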

  11. Accuracy and range of validity of the Wigner surmise for mixed symmetry classes in random matrix theory

    Science.gov (United States)

    Nishigaki, Shinsuke M.

    2012-12-01

    Schierenberg [Phys. Rev. E 85, 061130 (2012)] recently applied the Wigner surmise, i.e., substitution of ∞×∞ matrices by their 2×2 counterparts for the computation of level spacing distributions, to random matrix ensembles in transition between two universality classes. I examine the accuracy and the range of validity of the surmise for the crossover between the Gaussian orthogonal and unitary ensembles by contrasting them with the large-N results that I evaluated using the Nyström-type method for the Fredholm determinant. The surmised expression at the best-fitting parameter provides a good approximation for 0≲s≲2, i.e., the validity range of the original surmise.
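
    For reference, the pure-ensemble limits of the surmise involved here are simple closed forms: P(s) = (π/2) s exp(−πs²/4) for the GOE and P(s) = (32/π²) s² exp(−4s²/π) for the GUE, at unit mean spacing. The short Python sketch below evaluates them and roughly checks the GOE form against the bulk spacings of one sampled random matrix; it does not reproduce the paper's crossover formula or the Fredholm-determinant calculation.

      import numpy as np

      def wigner_surmise(s, beta):
          """Wigner surmise for the level-spacing density P(s) at unit mean spacing.
          beta=1: GOE limit, beta=2: GUE limit of the crossover discussed above."""
          s = np.asarray(s, dtype=float)
          if beta == 1:
              return (np.pi / 2.0) * s * np.exp(-np.pi * s ** 2 / 4.0)
          if beta == 2:
              return (32.0 / np.pi ** 2) * s ** 2 * np.exp(-4.0 * s ** 2 / np.pi)
          raise ValueError("beta must be 1 (GOE) or 2 (GUE)")

      # Crude numerical check against bulk spacings of one sampled GOE matrix.
      rng = np.random.default_rng(0)
      N = 400
      A = rng.standard_normal((N, N))
      eigs = np.sort(np.linalg.eigvalsh((A + A.T) / 2.0))
      spacings = np.diff(eigs[N // 4:3 * N // 4])
      spacings /= spacings.mean()                        # rough local unfolding
      hist, edges = np.histogram(spacings, bins=15, range=(0, 3), density=True)
      centers = 0.5 * (edges[1:] + edges[:-1])
      print(np.round(np.abs(hist - wigner_surmise(centers, beta=1)).max(), 2))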

  12. Development and validation of a taxonomy of adverse handover events in hospital settings

    DEFF Research Database (Denmark)

    Andersen, Henning Boje; Siemsen, Inger Margrete D.; Petersen, Lene Funck

    2015-01-01

    To develop and validate a taxonomy to classify and support the analysis of adverse events related to patient handovers in hospital settings. A taxonomy was established using descriptions of handover events extracted from incident reports, interviews and root cause analysis reports. The inter-rater reliability and the distribution of types of handover failures and causal factors were assessed. The taxonomy contains five types of failures and seven types of main causal factors. The taxonomy was validated against 432 adverse handover event descriptions contained in incident reports (stratified random sample from the Danish [...]). The taxonomy provides a tool for analyzing adverse handover events to identify frequent causes among reported handover failures. In turn, this provides a basis for selecting safety measures including handover protocols and training programmes.

  13. Validation of the comprehensive ICF core set for osteoarthritis: the perspective of physical therapists.

    Science.gov (United States)

    Bossmann, Tanja; Kirchberger, Inge; Glaessel, Andrea; Stucki, Gerold; Cieza, Alarcos

    2011-03-01

    Osteoarthritis is a common chronic disease associated with functional impairments and activity limitations, as well as participation restrictions. The Comprehensive International Classification of Functioning, Disability and Health (ICF) Core Set for Osteoarthritis is an application of the ICF and represents the typical spectrum of problems in functioning of patients with osteoarthritis. To validate the Comprehensive ICF Core Set for Osteoarthritis from the perspective of physical therapists. Physical therapists experienced in the treatment of patients with osteoarthritis were asked about patients' problems, resources and aspects of the environmental factors treated by physical therapists in patients with osteoarthritis in a three-round, electronic-mail survey using the Delphi technique. Responses were linked to the ICF. Seventy-two experts from 22 countries named 744 meaningful concepts that covered all ICF components. One hundred and fifty-two ICF categories were linked to these answers, 32 concepts were linked to the not-yet-developed personal factors component, and 14 issues were not covered by a single ICF category. Twelve ICF categories were not represented in the Comprehensive ICF Core Set for Osteoarthritis, although at least 75% of the participants rated them as important. The content validity of the ICF was widely supported by the physical therapists. However, several issues were raised that were not covered and need to be investigated further. Copyright © 2010 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  14. Diagnosing migraine in research and clinical settings: The validation of the Structured Migraine Interview (SMI)

    Science.gov (United States)

    2010-01-01

    Background Migraine is a common disorder that is highly co-morbid with psychopathological conditions such as depression and anxiety. Despite the extensive research and availability of treatment, migraine remains under-recognised and undertreated. The aim of this study was to design a short and practical screening tool to identify migraine for clinical and research purposes. Methods The structured migraine interview (SMI), based on the International Classification of Headache Disorders (ICHD) criteria, was used in a clinical setting of headache sufferers and compared to clinical diagnosis by a headache specialist. In addition to the validating characteristics of the interview, different methods of administration were also tested. Results The SMI has high sensitivity (0.87) and modest specificity (0.58) when compared to the headache specialist's clinical diagnosis. Conclusions Our study demonstrated that a structured interview based on the ICHD criteria is a useful and valid tool to identify migraine in research settings and, to a limited extent, in clinical settings, and could be used in studies on large samples where clinical interviews are less practical. PMID:20074361

  15. Community Priority Index: utility, applicability and validation for priority setting in community-based participatory research

    Directory of Open Access Journals (Sweden)

    Hamisu M. Salihu

    2015-07-01

    Background. Providing practitioners with an intuitive measure for priority setting that can be combined with diverse data collection methods is a necessary step to foster accountability of the decision-making process in community settings. Yet, there is a lack of easy-to-use, but methodologically robust, measures that can be feasibly implemented for reliable decision-making in community settings. To address this important gap in community-based participatory research (CBPR), the purpose of this study was to demonstrate the utility, applicability, and validation of a community priority index in a community-based participatory research setting. Design and Methods. Mixed-method study that combined focus group findings, a nominal group technique with six key informants, and the generation of a Community Priority Index (CPI) that integrated community importance, changeability, and target populations. Bootstrapping and simulation were performed for validation. Results. For pregnant mothers, the top three highly important and highly changeable priorities were: stress (CPI=0.85; 95%CI: 0.70, 1.00), lack of affection (CPI=0.87; 95%CI: 0.69, 1.00), and nutritional issues (CPI=0.78; 95%CI: 0.48, 1.00). For non-pregnant women, top priorities were: low health literacy (CPI=0.87; 95%CI: 0.69, 1.00), low educational attainment (CPI=0.78; 95%CI: 0.48, 1.00), and lack of self-esteem (CPI=0.72; 95%CI: 0.44, 1.00). For children and adolescents, the top three priorities were: obesity (CPI=0.88; 95%CI: 0.69, 1.00), low self-esteem (CPI=0.81; 95%CI: 0.69, 0.94), and negative attitudes toward education (CPI=0.75; 95%CI: 0.50, 0.94). Conclusions. This study demonstrates the applicability of the CPI as a simple and intuitive measure for priority setting in CBPR.
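
    The abstract does not spell out the CPI scoring formula, so the sketch below is only an illustrative stand-in: it treats the index for one item as the mean of 0/1 informant votes that the item is both highly important and highly changeable, and attaches a percentile bootstrap confidence interval of the kind reported above. The vote data and the scoring rule are assumptions, not the authors' method.

      import numpy as np

      rng = np.random.default_rng(42)

      def priority_index_ci(votes, n_boot=10_000, alpha=0.05):
          """Point estimate and percentile bootstrap CI for a 0-1 priority index
          defined here (illustratively) as the mean of binary informant votes."""
          votes = np.asarray(votes, dtype=float)
          idx = rng.integers(0, len(votes), size=(n_boot, len(votes)))
          boot_means = votes[idx].mean(axis=1)
          lo, hi = np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])
          return votes.mean(), lo, hi

      # Hypothetical votes (1 = "highly important and highly changeable") for one item.
      votes_stress = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
      print(priority_index_ci(votes_stress))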

  16. Validation of an Australian sign language instrument of outcome measurement for adults in mental health settings.

    Science.gov (United States)

    Munro, Louise; Rodwell, John

    2009-04-01

    There are currently no adult mental health outcome measures that have been translated into Australian sign language (Auslan). Without a valid and reliable Auslan outcome measure, empirical research into the efficacy of mental health interventions for sign language users is unattainable. To address this research problem the Outcome Rating Scale (ORS), a measure of general functioning, was translated into Auslan and recorded on to digital video disk for use in clinical settings. The purpose of the present study was therefore to examine the reliability, validity and acceptability of an Auslan version of the ORS (ORS-Auslan). The ORS-Auslan was administered to 44 deaf people who use Auslan as their first language and who identify as members of a deaf community (termed 'Deaf' people) on their first presentation to a mental health or counselling facility and to 55 Deaf people in the general community. The community sample also completed an Auslan version of the Depression Anxiety Stress Scale-21 (DASS-21). t-Tests indicated significant differences between the mean scores for the clinical and community sample. Internal consistency was acceptable given the low number of items in the ORS-Auslan. Construct validity was established by significant correlations between total scores on the DASS-21-Auslan and ORS-Auslan. Acceptability of ORS-Auslan was evident in the completion rate of 93% compared with 63% for DASS-21-Auslan. This is the only Auslan outcome measure available that can be used across a wide variety of mental health and clinical settings. The ORS-Auslan provides mental health clinicians with a reliable and valid, brief measure of general functioning that can significantly distinguish between clinical and non-clinical presentations for members of the Deaf community.

  17. Open-Switch Fault Diagnosis and Fault Tolerant for Matrix Converter with Finite Control Set-Model Predictive Control

    DEFF Research Database (Denmark)

    Peng, Tao; Dan, Hanbing; Yang, Jian

    2016-01-01

    To improve the reliability of the matrix converter (MC), a fault diagnosis method to identify a single open-switch fault is proposed in this paper. The introduced fault diagnosis method is based on finite control set-model predictive control (FCS-MPC), which employs a time-discrete model of the MC topology and a cost function to select the best switching state for the next sampling period. The proposed fault diagnosis method is realized by monitoring the load currents and judging the switching state to locate the faulty switch. Compared to the conventional modulation strategies such as carrier-based modulation [...], the feasibility and effectiveness of the proposed fault diagnosis method and fault-tolerant strategy are demonstrated.
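
    As background for the FCS-MPC mechanism named above, the sketch below shows the generic selection step: every admissible switching state is pushed through a one-step time-discrete model and the state minimising a tracking cost is applied. The toy RL-load model, the three candidate "states", and all parameter values are invented for illustration; they are not the paper's converter model or fault-location logic.

      import numpy as np

      def fcs_mpc_step(x, states, predict, reference, weights):
          """Evaluate every admissible switching state with a one-step-ahead
          discrete model and return the state giving the lowest cost."""
          best_state, best_cost = None, np.inf
          for s in states:
              x_pred = predict(x, s)                      # predicted load current(s)
              cost = float(np.sum(weights * (reference - x_pred) ** 2))
              if cost < best_cost:
                  best_state, best_cost = s, cost
          return best_state, best_cost

      # Toy usage: first-order RL load, three hypothetical output-voltage "states".
      R, L, Ts = 2.0, 0.01, 1e-4
      def predict(i_load, v_out):
          return i_load + Ts / L * (v_out - R * i_load)   # forward-Euler current model

      states = [np.array([v]) for v in (-300.0, 0.0, 300.0)]
      best, cost = fcs_mpc_step(np.array([1.0]), states, predict,
                                reference=np.array([2.0]), weights=np.array([1.0]))
      print(best, round(cost, 3))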

  18. Impact of Sample Matrix on Accuracy of Peptide Quantification: Assessment of Calibrator and Internal Standard Selection and Method Validation.

    Science.gov (United States)

    Arnold, Samuel L; Stevison, Faith; Isoherranen, Nina

    2016-01-05

    Protein quantification based on peptides using LC-MS/MS has emerged as a promising method to measure biomarkers, protein drugs, and endogenous proteins. However, the best practices for selection, optimization, and validation of the quantification peptides are not well established, and the influence of different matrices on protein digestion, peptide stability, and MS detection has not been systematically addressed. The aim of this study was to determine how biological matrices affect digestion, detection, and stability of peptides. The microsomal retinol dehydrogenase (RDH11) and cytosolic soluble aldehyde dehydrogenases (ALDH1As) involved in the synthesis of retinoic acid (RA) were chosen as model proteins. Considerable differences in the digestion efficiency, sensitivity, and matrix effects between peptides were observed regardless of the target protein's subcellular localization. The precision and accuracy of the quantification of RDH11 and ALDH1A were affected by the choice of calibration and internal standards. The final method using recombinant protein calibrators and stable isotope labeled (SIL) peptide internal standards was validated for human liver. The results demonstrate that different sample matrices have peptide, time, and matrix specific effects on protein digestion and absolute quantification.

  19. Reliability, construct validity and measurement potential of the ICF comprehensive core set for osteoarthritis

    Directory of Open Access Journals (Sweden)

    Kurtaiş Yeşim

    2011-11-01

    Background This study aimed to investigate the reliability and construct validity of the International Classification of Functioning, Disability and Health (ICF) Comprehensive Core Set for osteoarthritis (OA) in order to test its possible use as a measuring tool for functioning. Methods 100 patients with OA (84 F, 16 M; mean age 63 yr) completed forms including demographic and clinical information besides the Short Form 36 Health Survey (SF-36®) and the Western Ontario and McMaster Universities Index of Osteoarthritis (WOMAC). The ICF Comprehensive Core Set for OA was filled in by health professionals. The internal construct validities of the "Body Functions-Body Structures" (BF-BS), "Activity" (A), "Participation" (P) and "Environmental Factors" (EF) domains were tested by Rasch analysis and reliability by internal consistency and the person separation index (PSI). External construct validity was evaluated by correlating the Rasch-transformed scores with SF-36 and WOMAC. Results In each scale, some items showing disordered thresholds were rescored, testlets were created to overcome the problem of local dependency, and items that did not fit the Rasch model were deleted. The internal construct validity of the four scales (BF-BS 16 items, A 8 items, P 7 items, EF 13 items) was good [mean item fit (SD) 0.138 (0.921), 0.216 (1.237), 0.759 (0.986) and -0.079 (2.200); person item fit (SD) -0.147 (0.652), -0.241 (0.894), -0.310 (1.187) and -0.491 (1.173), respectively], indicating a single underlying construct for each scale. The scales were free of differential item functioning (DIF) for age, gender, years of education and duration of disease. Reliabilities of the BF-BS, A, P, and EF scales were good, with Cronbach's alphas of 0.79, 0.86, 0.88, and 0.83 and PSIs of 0.76, 0.86, 0.87, and 0.71, respectively. Rasch scores of BF-BS, A, and P showed moderate correlations with SF-36 and WOMAC scores, whereas the EF had significant but weak correlations only with SF36-Social [...]

  20. Ground Truth Observations of the Interior of a Rockglacier as Validation for Geophysical Monitoring Data Sets

    Science.gov (United States)

    Hilbich, C.; Roer, I.; Hauck, C.

    2007-12-01

    Monitoring the permafrost evolution in mountain regions is currently one of the important tasks in cryospheric studies as little data on past and present changes of the ground thermal regime and its material properties are available. In addition to recently established borehole temperature monitoring networks, techniques to determine and monitor the ground ice content have to be developed. A reliable quantification of ground ice is especially important for modelling the thermal evolution of frozen ground and for assessing the hazard potential due to thawing permafrost induced slope instability. Near surface geophysical methods are increasingly applied to detect and monitor ground ice occurrences in permafrost areas. Commonly, characteristic values of electrical resistivity and seismic velocity are used as indicators for the presence of frozen material. However, validation of the correct interpretation of the geophysical parameters can only be obtained through boreholes, and only regarding vertical temperature profiles. Ground truth of the internal structure and the ice content is usually not available. In this contribution we will present a unique data set from a recently excavated rockglacier near Zermatt/Valais in the Swiss Alps, where an approximately 5 m deep trench was cut across the rockglacier body for the construction of a ski track. Longitudinal electrical resistivity tomography (ERT) and refraction seismic tomography profiles were conducted prior to the excavation, yielding data sets for cross validation of commonly applied geophysical interpretation approaches in the context of ground ice detection. A recently developed 4-phase model was applied to calculate ice-, air- and unfrozen water contents from the geophysical data sets, which were compared to the ground truth data from the excavated trench. The obtained data sets will be discussed in the context of currently established geophysical monitoring networks in permafrost areas. In addition to the [...]

  1. Development and construct validation of the Client-Centredness of Goal Setting (C-COGS) scale.

    Science.gov (United States)

    Doig, Emmah; Prescott, Sarah; Fleming, Jennifer; Cornwell, Petrea; Kuipers, Pim

    2015-07-01

    Client-centred philosophy is integral to occupational therapy practice and client-centred goal planning is considered fundamental to rehabilitation. Evaluation of whether goal-planning practices are client-centred requires an understanding of the client's perspective about goal-planning processes and practices. The Client-Centredness of Goal Setting (C-COGS) was developed for use by practitioners who seek to be more client-centred and who require a scale to guide and evaluate individually orientated practice, especially with adults with cognitive impairment related to acquired brain injury. To describe development of the C-COGS scale and examine its construct validity. The C-COGS was administered to 42 participants with acquired brain injury after multidisciplinary goal planning. C-COGS scores were correlated with the Canadian Occupational Performance Measure (COPM) importance scores, and measures of therapeutic alliance, motivation, and global functioning to establish construct validity. The C-COGS scale has three subscales evaluating goal alignment, goal planning participation, and client-centredness of goals. The C-COGS subscale items demonstrated moderately significant correlations with scales measuring similar constructs. Findings provide preliminary evidence to support the construct validity of the C-COGS scale, which is intended to be used to evaluate and reflect on client-centred goal planning in clinical practice, and to highlight factors contributing to best practice rehabilitation.

  2. Antibody Selection for Cancer Target Validation of FSH-Receptor in Immunohistochemical Settings

    Directory of Open Access Journals (Sweden)

    Nina Moeker

    2017-10-01

    Background: The follicle-stimulating hormone (FSH) receptor (FSHR) has been reported to be an attractive target for antibody therapy in human cancer. However, divergent immunohistochemical (IHC) findings have been reported for FSHR expression in tumor tissues, which could be due to the specificity of the antibodies used. Methods: Three frequently used antibodies (sc-7798, sc-13935, and FSHR323) were validated for their suitability in an immunohistochemical study for FSHR expression in different tissues. As quality control, two potential therapeutic anti-hFSHR Ylanthia® antibodies (Y010913, Y010916) were used. The specificity criteria for selection of antibodies were binding to native hFSHR of different sources, and no binding to non-related proteins. The ability of the antibodies to stain the paraffin-embedded Flp-In Chinese hamster ovary (CHO)/FSHR cells was tested after application of different epitope retrieval methods. Results: Of the five tested anti-hFSHR antibodies, only Y010913, Y010916, and FSHR323 showed specific binding to native, cell-presented hFSHR. Since Ylanthia® antibodies were selected to specifically recognize native FSHR, as required for a potential therapeutic antibody candidate, FSHR323 was the only antibody to detect the receptor in IHC/histochemical settings on transfected cells, and at markedly lower, physiological concentrations (e.g., in Sertoli cells of human testes). The pattern of FSHR323 staining noticed for ovarian, prostatic, and renal adenocarcinomas indicated that FSHR was expressed mainly in the peripheral tumor blood vessels. Conclusion: Of all published IHC antibodies tested, only antibody FSHR323 proved suitable for target validation of hFSHR in an IHC setting for cancer. Our studies could not confirm the previously reported FSHR overexpression in ovarian and prostate cancer cells. Instead, specific overexpression in peripheral tumor blood vessels could be confirmed after thorough validation of the antibodies used.

  3. Validation of international stroke scales for use by nurses in Greek settings.

    Science.gov (United States)

    Theofanidis, Dimitrios

    2017-04-01

    Improving stroke outcomes by educating nurses in state-of-the-art stroke nursing skills is essential, but unfortunately, to date, there are limited validated stroke assessment scales for routine clinical and research use in Greece. The aim of this paper is to validate and culturally adapt three internationally recognised stroke scales for use in Greece. A critical appraisal of the international literature was undertaken to identify suitable scales to assess stroke impact: neurological, functional status and level of dependence. We identified: Scandinavian Stroke Scale (SSS), Barthel Index (BI) and modified Rankin Scale (mRS). They were formally translated and culturally adapted from English to Greek. Their validity was tested using Cronbach's alpha and Median Discrimination Index, while construct validity was checked by Principal Component Analysis (PCA). These were used on 57 consecutively selected patients with stroke from a Greek hospital, mean age 67.7 (±6.7 SD) years, range 54-85 years, length of stay 8.5 (±2.7 SD) days. All three scales show high internal consistency. The Cronbach's α on admission/discharge for the SSS ranged from 0.86 to 0.88. The BI's reliability ranged from 0.95 to 0.93. The Median Discrimination Index was 0.70 (SSS) and 0.83 (BI). PCA showed that although a significant general factor (F1) explains most of the variance (57.0% on admission and 56.4% on discharge), a second factor (F2) of less significance was also highlighted. The convergent validity of the three scales was confirmed. The stroke tools selected showed high reliability and validity, thus making these suitable for use in Greek clinical/academic environments. All three scales used are almost routinely undertaken in stroke studies internationally and form a backdrop for bio-statistical, functional and social outcome post-stroke. The Greek version of the stroke tools show that both SSS and BI have high internal consistency and reliability and together with the mRS could be [...]

  4. Validation of the PHQ-15 for somatoform disorder in the occupational health care setting.

    Science.gov (United States)

    de Vroege, Lars; Hoedeman, Rob; Nuyen, Jasper; Sijtsma, Klaas; van der Feltz-Cornelis, Christina M

    2012-03-01

    Within the occupational health setting, somatoform disorders are a frequent cause of sick leave. Few validated screening questionnaires for these disorders are available. The aim of this study is to validate the PHQ-15 in this setting. In a cross-sectional study of 236 sicklisted employees, we studied the performance of the PHQ-15 in comparison with the Mini International Neuropsychiatric Interview (MINI) as the gold reference standard. We approached employees who were sick listed for a period longer than 6 weeks and shorter than 2 years for participation. This study was conducted at one location of a large occupational health service in the Netherlands, serving companies with more than 500 employees. All employees who returned the PHQ-15 were invited for the MINI interview. Specificity and sensitivity were calculated for the optimal cut point and a receiver operating characteristic (ROC) curve was constructed. A total of 107 participants consented to participate in the MINI interview. A non-response analysis showed no significant differences between groups. According to the MINI, the prevalence of somatoform disorders was 21.5%, and the most frequently found disorder was a pain disorder. The PHQ-15 had an optimal cut point of 9 (patients scoring 9 or higher (≥9) were most likely to suffer from a somatoform disorder), with specificity and sensitivity equal to 61.9 and 56.5%, respectively. The ROC curve showed an area under the curve (AUC) of 0.63. The PHQ-15 shows moderate sensitivity but limited efficiency with a cut point of 9 and can be a useful questionnaire in the occupational health setting.
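
    The screening metrics reported above follow directly from cross-tabulating questionnaire scores against the reference diagnosis. The Python sketch below shows the arithmetic for one cut point and a simple trapezoidal AUC; the scores and diagnoses in it are invented and do not reproduce the study data.

      import numpy as np

      def sens_spec(scores, has_disorder, cut):
          """Sensitivity/specificity when score >= cut flags a probable case."""
          scores = np.asarray(scores)
          truth = np.asarray(has_disorder, dtype=bool)
          flagged = scores >= cut
          sensitivity = (flagged & truth).sum() / truth.sum()
          specificity = (~flagged & ~truth).sum() / (~truth).sum()
          return sensitivity, specificity

      def auc(scores, has_disorder):
          """Trapezoidal area under the ROC curve over all observed cut points."""
          pts = [(0.0, 0.0), (1.0, 1.0)]
          for cut in np.unique(scores):
              sens, spec = sens_spec(scores, has_disorder, cut)
              pts.append((1.0 - spec, sens))
          pts.sort()
          return sum((x2 - x1) * (y1 + y2) / 2.0
                     for (x1, y1), (x2, y2) in zip(pts[:-1], pts[1:]))

      scores = np.array([3, 5, 9, 12, 7, 10, 4, 15, 8, 11])   # hypothetical PHQ-15 totals
      truth = np.array([0, 0, 1, 1, 0, 1, 0, 1, 0, 0])        # hypothetical MINI diagnosis
      print(sens_spec(scores, truth, cut=9), round(auc(scores, truth), 2))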

  5. Development and validation of an early childhood development scale for use in low-resourced settings.

    Science.gov (United States)

    McCoy, Dana Charles; Sudfeld, Christopher R; Bellinger, David C; Muhihi, Alfa; Ashery, Geofrey; Weary, Taylor E; Fawzi, Wafaie; Fink, Günther

    2017-02-09

    Low-cost, cross-culturally comparable measures of the motor, cognitive, and socioemotional skills of children under 3 years remain scarce. In the present paper, we aim to develop a new caregiver-reported early childhood development (ECD) scale designed to be implemented as part of household surveys in low-resourced settings. We evaluate the acceptability, test-retest reliability, internal consistency, and discriminant validity of the new ECD items, subscales, and full scale in a sample of 2481 18- to 36-month-old children from peri-urban and rural Tanzania. We also compare total and subscale scores with performance on the Bayley Scales of Infant Development (BSID-III) in a subsample of 1036 children. Qualitative interviews from 10 mothers and 10 field workers are used to inform quantitative data. Adequate levels of acceptability and internal consistency were found for the new scale and its motor, cognitive, and socioemotional subscales. Correlations between the new scale and the BSID-III were high (r > .50) for the motor and cognitive subscales, but low (r < .20) for the socioemotional subscale. The new scale discriminated between children's skills based on age, stunting status, caregiver-reported disability, and adult stimulation. Test-retest reliability scores were variable among a subset of items tested. Results of this study provide empirical support from a low-income country setting for the acceptability, reliability, and validity of a new caregiver-reported ECD scale. Additional research is needed to test these and other caregiver reported items in children in the full 0 to 3 year range across multiple cultural and linguistic settings.

  6. Pain communication through body posture: the development and validation of a stimulus set.

    Science.gov (United States)

    Walsh, Joseph; Eccleston, Christopher; Keogh, Edmund

    2014-11-01

    Pain can be communicated nonverbally through facial expressions, vocalisations, and bodily movements. Most studies have focussed on the facial display of pain, whereas there is little research on postural display. Stimulus sets for facial and vocal expressions of pain have been developed, but there is no equivalent for body-based expressions. Reported here is the development of a new stimulus set of dynamic body postures that communicate pain and basic emotions. This stimulus set is designed to facilitate research into the bodily communication of pain. We report a 3-phase development and validation study. First 16 actors performed affective body postures for pain, as well as happiness, sadness, fear, disgust, surprise, anger, and neutral expressions. Second, 20 observers independently selected the best image stimuli based on the accuracy of emotion identification and valence/arousal ratings. Third, to establish reliability, this accuracy and valence rating procedure was repeated with a second independent group of 40 participants. A final set of 144 images with good reliability was established and is made available. Results demonstrate that pain, along with basic emotions, can be communicated through body posture. Cluster analysis demonstrates that pain and emotion are recognised with a high degree of specificity. In addition, pain was rated as the most unpleasant (negative valence) of the expressions, and was associated with a high level of arousal. For the first time, specific postures communicating pain are described. The stimulus set is provided as a tool to facilitate the study of nonverbal pain communication, and its possible uses are discussed. Copyright © 2014 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  7. Determining gestational age in a low-resource setting: validity of last menstrual period.

    Science.gov (United States)

    Rosenberg, Rebecca E; Ahmed, A S M Nawshad U; Ahmed, Saifuddin; Saha, Samir K; Chowdhury, M A K Azad; Black, Robert E; Santosham, Mathuram; Darmstadt, Gary L

    2009-06-01

    The validity of three methods (last menstrual period [LMP], Ballard and Dubowitz scores) for assessment of gestational age for premature infants in a low-resource setting was assessed, using antenatal ultrasound as the gold standard. It was hypothesized that LMP and the other methods would perform similarly in determining postnatal gestational age. Concordance analysis was applied to data on 355 neonates; concordance with ultrasound for LMP, Ballard, and Dubowitz was 0.878, 0.914, and 0.886, respectively. LMP and Ballard underestimated gestational age by one day (+/-11) and 2.9 days (+/-7.8), respectively, while Dubowitz overestimated gestational age by 3.9 days (+/-7.1) compared to the ultrasound finding. LMP in a low-resource setting was a more reliable measure of gestational age than previously thought for estimation of postnatal gestational age of preterm infants. Ballard and Dubowitz scores are slightly more reliable but require more technical skills to perform. Additional prospective trials are warranted to examine LMP against antenatal ultrasound for primary assessment of neonatal gestational age in other low-resource settings.

  8. Measuring automatic associations: validation of algorithms for the Implicit Association Test (IAT) in a laboratory setting.

    Science.gov (United States)

    Glashouwer, Klaske A; Smulders, Fren T Y; de Jong, Peter J; Roefs, Anne; Wiers, Reinout W H J

    2013-03-01

    In their paper, "Understanding and using the Implicit Association Test: I. An improved scoring algorithm", Greenwald, Nosek, and Banaji (2003) investigated different ways to calculate the IAT-effect. However, up to now, it remained unclear whether these findings - based on internet data - also generalize to laboratory settings. Therefore, the main goal of the present study was to cross-validate scoring algorithms for the IAT in a laboratory setting, specifically in the domain of psychopathology. Four known IAT algorithms and seven alternative IAT algorithms were evaluated on several performance criteria in the large-scale laboratory sample of the Netherlands Study of Depression and Anxiety (N = 2981) in which two IATs were included to obtain measurements of automatic self-anxious and automatic self-depressed associations. Results clearly demonstrated that the D(2SD)-measure and the D(600)-measure as well as an alternative algorithm based on the correct trials only (D(noEP)-measure) are suitable to be used in a laboratory setting for IATs with a fixed order of category combinations. It remains important to further replicate these findings, especially in studies that include outcome measures of more spontaneous kinds of behaviors. Copyright © 2012 Elsevier Ltd. All rights reserved.
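
    For readers unfamiliar with the algorithms being compared, the sketch below computes a simplified D600-style score for a single pair of combined-task blocks: error-trial latencies are replaced by the block mean of correct latencies plus 600 ms, and the mean latency difference is divided by the pooled standard deviation. The full Greenwald et al. (2003) procedure adds trial and participant exclusions and averages over practice and test block pairs; the latencies below are simulated, not study data.

      import numpy as np

      def iat_d600(lat_compatible, err_compatible, lat_incompatible, err_incompatible):
          """Simplified D600-style score for one pair of combined-task blocks."""
          def penalised(latencies, errors):
              lat = np.asarray(latencies, dtype=float)
              err = np.asarray(errors, dtype=bool)
              out = lat.copy()
              out[err] = lat[~err].mean() + 600.0       # error penalty
              return out
          a = penalised(lat_compatible, err_compatible)       # e.g. self + anxious trials
          b = penalised(lat_incompatible, err_incompatible)   # e.g. self + calm trials
          pooled_sd = np.concatenate([a, b]).std(ddof=1)
          return (b.mean() - a.mean()) / pooled_sd

      rng = np.random.default_rng(1)
      d = iat_d600(rng.normal(800, 150, 40), rng.random(40) < 0.05,
                   rng.normal(950, 180, 40), rng.random(40) < 0.08)
      print(round(float(d), 2))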

  9. SET-bullying: presentation of a collaborative project and discussion of its internal and external validity.

    Science.gov (United States)

    Chalamandaris, Alexandros-Georgios; Wilmet-Dramaix, Michèle; Eslea, Mike; Ertesvåg, Sigrun Karin; Piette, Danielle

    2016-04-12

    Since the early 1980s, several school based anti-bullying interventions (SBABI) have been implemented and evaluated in different countries. Some meta-analyses have also drawn conclusions on the effectiveness of SBABIs. However, the relationship between time and effectiveness of SBABIs has not been fully studied. For this aim, a collaborative project, SET-Bullying, is established by researchers from Greece, Belgium, Norway and United Kingdom. Its primary objective is to further understand and statistically model the relationship between the time and the sustainability of the effectiveness of SBABI. The secondary objective of SET-Bullying is to assess the possibility of predicting the medium-term or long-term effectiveness using as key information the prior measurement and the short-term effectiveness of the intervention. Researchers and owners of potentially eligible databases were asked to participate in this effort. Two studies have contributed data for the purpose of SET-Bullying. This paper summarizes the main characteristics of the participating studies and provides a high level overview of the collaborative project. It also discusses on the extent to which both study and project characteristics may pose threats to the expected internal and external validity of the potential outcomes of the project. Despite these threats, this work represents the first effort to understand the impact of time on the observed effectiveness of SBABIs and assess its predictability, which would allow for better planning, implementation and evaluation of SBABIs.

  10. A strategy for developing representative germplasm sets for systematic QTL validation, demonstrated for apple, peach, and sweet cherry

    NARCIS (Netherlands)

    Peace, C.P.; Luby, J.; Weg, van de W.E.; Bink, M.C.A.M.; Iezzoni, A.F.

    2014-01-01

    Horticultural crop improvement would benefit from a standardized, systematic, and statistically robust procedure for validating quantitative trait loci (QTLs) in germplasm relevant to breeding programs. Here, we describe and demonstrate a strategy for developing reference germplasm sets of [...]

  11. A validated stability-indicating UPLC method for determination of diclofenac sodium in its pure form and matrix formulations

    Directory of Open Access Journals (Sweden)

    Ehab M. Elzayat

    2017-05-01

    The aim of this work is to develop a validated stability-indicating reverse-phase ultra performance liquid chromatography (UPLC) method for the rapid and accurate determination of diclofenac sodium in its pure form and in matrix formulations. The UPLC method uses an isocratic mobile phase of 0.05 M ammonium acetate buffer (pH = 2.5) and acetonitrile (50:50), a flow rate of 0.5 ml/min, and a BEH C18 column (2.1 × 50 mm, 1.7 μm). The method is rapid (1.2 min run), selective (a well-resolved diclofenac peak with a retention time of 0.94 min) and sensitive (LOD = 2 ppm and LLOQ = 6 ppm), with UV detection at 254 nm. The drug was subjected to acidic and alkaline media, boiling and an oxidizing agent to apply stress conditions. The developed method was able to separate the degradation product generated under forced degradation studies. The developed method was validated as per the FDA guidelines for specificity, linearity, accuracy, precision, LOD and LLOQ, and was found to be satisfactory. The study suggests that the developed UPLC method can be used for the assessment of drug purity and stability. It can also be used to monitor the drug content and release from different formulations without any interference of excipients and/or degradation products.

  12. Adaption and validation of the Safety Attitudes Questionnaire for the Danish hospital setting

    Directory of Open Access Journals (Sweden)

    Kristensen S

    2015-02-01

    Solvejg Kristensen,1–3 Svend Sabroe,4 Paul Bartels,1,5 Jan Mainz,3,5 Karl Bang Christensen6 1The Danish Clinical Registries, Aarhus, Denmark; 2Department of Health Science and Technology, Aalborg University, Aalborg, Denmark; 3Aalborg University Hospital, Psychiatry, Aalborg, Denmark; 4Department of Public Health, Aarhus University, Aarhus, Denmark; 5Department of Clinical Medicine, Aalborg University, Aalborg, Denmark; 6Department of Biostatistics, University of Copenhagen, Copenhagen, Denmark Purpose: Measuring and developing a safe culture in health care is a focus point in creating highly reliable organizations being successful in avoiding patient safety incidents where these could normally be expected. Questionnaires can be used to capture a snapshot of an employee's perceptions of patient safety culture. A commonly used instrument to measure safety climate is the Safety Attitudes Questionnaire (SAQ). The purpose of this study was to adapt the SAQ for use in Danish hospitals, assess its construct validity and reliability, and present benchmark data. Materials and methods: The SAQ was translated and adapted for the Danish setting (SAQ-DK). The SAQ-DK was distributed to 1,263 staff members from 31 in- and outpatient units (clinical areas) across five somatic and one psychiatric hospitals through meeting administration, hand delivery, and mailing. Construct validity and reliability were tested in a cross-sectional study. Goodness-of-fit indices from confirmatory factor analysis were reported along with inter-item correlations, Cronbach's alpha (α), and item and subscale scores. Results: Participation was 73.2% (N=925) of invited health care workers. Goodness-of-fit indices from the confirmatory factor analysis showed: χ2=1496.76, P<0.001, CFI 0.901, RMSEA (90% CI) 0.053 (0.050-0.056), probability RMSEA (p close)=0.057. Inter-scale correlations between the factors showed moderate-to-high correlations. The scale stress recognition had significant [...]

  13. Validation of the palliative performance scale in the acute tertiary care hospital setting.

    Science.gov (United States)

    Olajide, Oludamilola; Hanson, Laura; Usher, Barbara M; Qaqish, Bahjat F; Schwartz, Robert; Bernard, Stephen

    2007-02-01

    Physicians are often asked to prognosticate patient survival. However, prediction of survival is difficult, particularly with critically ill and dying patients within the hospitals. The Palliative Performance Scale (PPS) was designed to assess functional status and measure progressive decline in palliative care patients, yet it has not been validated within hospital health care settings. This study explores the application of the PPS for its predictive ability related to length of survival. Other variables examined were correlates of symptom distress in a tertiary academic setting. Patients were assigned a score on the PPS ranging from 0% to 100% at initial consultation. Standardized symptom assessments were carried out daily, and survival was determined by medical record review and search of the National Death Index. Of 261 patients seen since January 2002, 157 had cancer and 104 had other diagnoses. PPS scores ranged from 10% to 80% with 92% of the scores between 10% and 40%. Survival ranged from 0 to 30 months, with a median of 9 days. By 90 days, 83% of patients had died. Proportional hazards regression estimates showed that a 10% decrement in PPS score was associated with a hazard ratio of 1.65 (95% confidence interval [CI]: 1.42-1.92). Proportional odds regression models showed that a lower PPS was significantly associated with higher levels of dyspnea. The PPS correlated well with length of survival and with select symptom distress scores. We consider it to be a useful tool in predicting outcomes for palliative care patients.
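
    The reported hazard ratio of 1.65 per 10% PPS decrement comes from a proportional hazards model of survival time on the admission PPS score. The sketch below fits that kind of model with the lifelines package on made-up data; the column names, values, and even the choice of library are illustrative assumptions rather than the study's actual analysis.

      import pandas as pd
      from lifelines import CoxPHFitter

      # Invented records: PPS expressed as number of 10% decrements below 100%.
      df = pd.DataFrame({
          "pps_decrements": [2, 3, 5, 6, 7, 8, 9, 6, 4, 9],
          "survival_days":  [240, 120, 60, 20, 9, 6, 3, 12, 35, 2],
          "died":           [0, 1, 1, 1, 1, 1, 1, 1, 1, 1],
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="survival_days", event_col="died")
      cph.print_summary()   # exp(coef) of pps_decrements ~ hazard ratio per 10% drop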

  14. The use of questionnaires in colour research in real-life settings: In search of validity and methodological pitfalls

    NARCIS (Netherlands)

    Bakker, I.C.; van der Voordt, Theo; Vink, P.; de Boon, J

    2014-01-01

    This research discusses the validity of applying questionnaires in colour research in real-life settings.
    In the literature, the conclusions concerning the influences of colours on human performance and well-being are often conflicting. This can be caused by the artificial setting of the test [...]

  15. Validity of Chinese Version of the Composite International Diagnostic Interview-3.0 in Psychiatric Settings

    Directory of Open Access Journals (Sweden)

    Jin Lu

    2015-01-01

    Background: The Composite International Diagnostic Interview-3.0 (CIDI-3.0) is a fully structured, lay-administered diagnostic interview for the assessment of mental disorders according to ICD-10 and Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) criteria. The aim of the study was to investigate the concurrent validity of the Chinese CIDI in diagnosing mental disorders in psychiatric settings. Methods: We recruited 208 participants, of whom 148 were patients from two psychiatric hospitals and 60 were healthy people from communities. These participants were administered the CIDI by six trained lay interviewers and the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I, gold standard) by two psychiatrists. Agreement between the CIDI and SCID-I was assessed with sensitivity, specificity, positive predictive value and negative predictive value. Individual-level CIDI-SCID diagnostic concordance was evaluated using the area under the receiver operator characteristic curve and Cohen's kappa. Results: Substantial to excellent CIDI-to-SCID concordance was found for any substance use disorder (area under the receiver operator characteristic curve [AUC] = 0.926), any anxiety disorder (AUC = 0.807) and any mood disorder (AUC = 0.806). The concordance between the CIDI and the SCID for psychotic and eating disorders is moderate. However, for individual mental disorders, the CIDI-SCID concordance for bipolar disorders (AUC = 0.55) and anorexia nervosa (AUC = 0.50) was insufficient. Conclusions: Overall, the Chinese version of CIDI-3.0 has acceptable validity in diagnosing substance use, anxiety and mood disorders among the Chinese adult population. However, we should be cautious when using it for bipolar disorders and anorexia nervosa.

  16. Revising the retrieval technique of a long-term stratospheric HNO{sub 3} data set. From a constrained matrix inversion to the optimal estimation algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Fiorucci, I.; Muscari, G. [Istituto Nazionale di Geofisica e Vulcanologia, Rome (Italy); De Zafra, R.L. [State Univ. of New York, Stony Brook, NY (United States). Dept. of Physics and Astronomy

    2011-07-01

    The Ground-Based Millimeter-wave Spectrometer (GBMS) was designed and built at the State University of New York at Stony Brook in the early 1990s and since then has carried out many measurement campaigns of stratospheric O3, HNO3, CO and N2O at polar and mid-latitudes. Its HNO3 data set shed light on HNO3 annual cycles over the Antarctic continent and contributed to the validation of both generations of the satellite-based JPL Microwave Limb Sounder (MLS). Following the increasing need for long-term data sets of stratospheric constituents, we resolved to establish a long-term GBMS observation site at the Arctic station of Thule (76.5° N, 68.8° W), Greenland, beginning in January 2009, in order to track the long- and short-term interactions between the changing climate and the seasonal processes tied to the ozone depletion phenomenon. Furthermore, we updated the retrieval algorithm, adapting the Optimal Estimation (OE) method to GBMS spectral data in order to conform to the standard of the Network for the Detection of Atmospheric Composition Change (NDACC) microwave group, and to provide our retrievals with a set of averaging kernels that allow more straightforward comparisons with other data sets. The new OE algorithm was applied to GBMS HNO3 data sets from 1993 South Pole observations to date, in order to produce HNO3 version 2 (v2) profiles. A sample of results obtained at Antarctic latitudes in fall and winter and at mid-latitudes is shown here. In most conditions, v2 inversions show a sensitivity (i.e., sum of column elements of the averaging kernel matrix) of 100±20% from 20 to 45 km altitude, with somewhat worse (better) sensitivity in the Antarctic winter lower (upper) stratosphere. The 1σ uncertainty on HNO3 v2 mixing ratio vertical profiles depends on altitude and is estimated at ~15% or 0.3 ppbv, whichever is larger. Comparisons of v2 with former (v1) GBMS HNO3 vertical profiles [...]
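
    The Optimal Estimation step mentioned above combines the measured spectrum with an a-priori profile through a retrieval gain matrix, and the averaging kernel it produces is what the quoted sensitivity is summed from. The linear-algebra sketch below uses random stand-in matrices; the GBMS forward model, grids, units, and covariances are not reproduced, only the standard Rodgers-style algebra.

      import numpy as np

      rng = np.random.default_rng(0)
      n_alt, n_chan = 40, 120                          # retrieval levels / spectral channels
      K = 0.05 * rng.normal(size=(n_chan, n_alt))      # Jacobian of the forward model
      x_a = np.full(n_alt, 5.0)                        # a-priori profile (arbitrary units)
      S_a = np.eye(n_alt) * 1.0                        # a-priori covariance
      S_e = np.eye(n_chan) * 0.01                      # measurement-noise covariance

      x_true = x_a + rng.normal(scale=0.5, size=n_alt)
      y = K @ x_true + rng.normal(scale=0.1, size=n_chan)

      G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)   # gain matrix
      x_hat = x_a + G @ (y - K @ x_a)                      # retrieved profile
      A = G @ K                                            # averaging kernel matrix
      sensitivity = A.sum(axis=1)                          # per-level measurement response
      print(np.round(x_hat[:3], 2), np.round(sensitivity[:3], 2))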

  17. Matrix Extension Study: Validation of the Compact Dry TC Method for Enumeration of Total Aerobic Bacteria in Selected Foods.

    Science.gov (United States)

    Mizuochi, Shingo; Nelson, Maria; Baylis, Chris; Jewell, Keith; Green, Becky; Limbum, Rob; Fernandez, Maria Cristina; Salfinger, Yvonne; Chen, Yi

    2016-01-01

    A validation study was conducted to extend the matrix claim for the Nissui Compact Dry Total Count (TC), Performance Tested Method(s)(SM) (PTM) Certification No. 010404, to cooked chicken, lettuce, frozen fish, milk powder, and pasteurized whole milk. The method was originally certified by the AOAC Research Institute Performance Tested Method(s)(SM) Program for raw meat products. The Compact Dry TC is a ready-to-use dry media sheet that is rehydrated by adding 1 mL of diluted sample. A total aerobic colony count can be determined in the sample following 48 h of incubation. Matrix extension studies were conducted by Campden BRI (formerly Campden and Chorleywood Food Research Association Technology Limited), Chipping Campden, UK. Single-laboratory data were collected for cooked chicken, lettuce, frozen fish, and milk powder, whereas a multilaboratory study was conducted on pasteurized milk. Fourteen laboratories participated in the collaborative study. The Compact Dry TC was tested at two time points, 48 ± 3 h and 72 ± 3 h and compared with the current International Organization for Standardization (ISO) method at the time of the study, ISO 4833:2003 (this standard is withdrawn and has been replaced by: ISO 4833-1:2013 and ISO 4833-2:2013), Microbiology of food and animal feeding stuffs-Horizontal method for the enumeration of microorganisms-Colony-count technique at 30°C. The data were logarithmically transformed and evaluated for repeatability (plus reproducibility for pasteurized milk), RSD of repeatability (plus RSD of reproducibility for milk), r², and mean difference between methods with 95% confidence interval (CI). A CI outside of (-0.5 to 0.5) on the log10 mean difference was used as the criterion to establish significant statistical difference between methods. No significant differences were found between the Compact Dry TC 48 and 72 h time points, with the exception of one contamination level of cooked chicken and one contamination level of dry milk [...]
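
    The acceptance arithmetic described above reduces to a paired comparison on log10-transformed counts. The sketch below computes the mean difference and its 95% CI on invented CFU/g counts and applies one common reading of the (-0.5, 0.5) log10 criterion (equivalence when the CI lies entirely inside that interval); the values and the equivalence rule as coded are illustrative assumptions, not the study data.

      import numpy as np
      from scipy import stats

      compact_dry = np.array([1.2e4, 3.4e5, 8.9e3, 5.6e4, 2.1e5, 7.4e4])  # CFU/g, invented
      iso_4833 = np.array([1.5e4, 3.0e5, 9.5e3, 6.1e4, 1.8e5, 8.0e4])     # paired reference counts

      diff = np.log10(compact_dry) - np.log10(iso_4833)
      mean_diff = diff.mean()
      half_width = stats.t.ppf(0.975, len(diff) - 1) * diff.std(ddof=1) / np.sqrt(len(diff))
      ci = (mean_diff - half_width, mean_diff + half_width)
      equivalent = -0.5 < ci[0] and ci[1] < 0.5
      print(round(mean_diff, 3), tuple(round(c, 3) for c in ci), equivalent)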

  18. Population Validity and Cross-Validity: Applications of Distribution Theory for Testing Hypotheses, Setting Confidence Intervals, and Determining Sample Size

    Science.gov (United States)

    Algina, James; Keselman, H. J.

    2008-01-01

    Applications of distribution theory for the squared multiple correlation coefficient and the squared cross-validation coefficient are reviewed, and computer programs for these applications are made available. The applications include confidence intervals, hypothesis testing, and sample size selection. (Contains 2 tables.)
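
    The paper's exact distribution-theory intervals for the squared multiple correlation are not reproduced in this listing. As a rough stand-in, the sketch below computes the sample R², a Wherry-type shrinkage estimate of the population value, and a percentile bootstrap confidence interval on synthetic regression data; it illustrates the quantities involved rather than the authors' procedures.

      import numpy as np

      rng = np.random.default_rng(7)
      n, p = 120, 4
      X = rng.normal(size=(n, p))
      y = X @ np.array([0.5, 0.3, 0.0, -0.2]) + rng.normal(size=n)

      def r_squared(X, y):
          X1 = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          resid = y - X1 @ beta
          return 1.0 - resid.var() / y.var()

      r2 = r_squared(X, y)
      r2_adjusted = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)       # Wherry shrinkage
      boot = [r_squared(X[idx], y[idx]) for idx in rng.integers(0, n, size=(2000, n))]
      print(round(r2, 3), round(r2_adjusted, 3), np.round(np.quantile(boot, [0.025, 0.975]), 3))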

  19. Validation of the comprehensive ICF core sets for diabetes mellitus:a Malaysian perspective.

    Science.gov (United States)

    Abdullah, Mohd Faudzi; Nor, Norsiah Mohd; Mohd Ali, Siti Zubaidah; Ismail Bukhary, Norizzati Bukhary; Amat, Azlin; Latif, Lydia Abdul; Hasnan, Nazirah; Omar, Zaliha

    2011-04-01

    Diabetes mellitus (DM) is a chronic disease that is prevalent in many countries. The prevalence of DM is on the rise, and its complications pose a heavy burden on the healthcare systems and on the patients' quality of life worldwide. This is a multicentre, cross-sectional study involving 5 Health Clinics conducted by Family Medicine Specialists in Malaysia. A convenience sample of 100 respondents with DM was selected. The International Classification of Functioning, Disability and Health (ICF) based measures were collected using the Comprehensive Core Set for DM. The SF-36 and a self-administered comorbidity questionnaire (SCQ) were also used. Ninety-seven percent had Type 2 DM and 3% had Type 1 DM. The mean period of having DM was 6 years. Body functions related to physical health including exercise tolerance (b455), general physical endurance (b4550), aerobic capacity (b4551) and fatiguability (b4552) were the most affected. For body structures, the structure of pancreas (s550) was the most affected. In the ICF component of activities and participation, limitation in sports (d9201) was the most affected, followed by driving (d475), intimate relationships (d770), handling stress and other psychological demands (d240) and moving around (d455). Only 7% (e355 and e450) in the environmental category were documented as being a relevant factor by more than 90% of the patients. The content validity of the Comprehensive ICF Core Set for DM for the Malaysian population was identified, and the results show that physical and mental functioning were impaired in contrast to what the respondents perceived as leading healthy lifestyles.

  2. Validation of the Amsterdam Dynamic Facial Expression Set--Bath Intensity Variations (ADFES-BIV): A Set of Videos Expressing Low, Intermediate, and High Intensity Emotions.

    Science.gov (United States)

    Wingenbach, Tanja S H; Ashwin, Chris; Brosnan, Mark

    2016-01-01

    Most of the existing sets of facial expressions of emotion contain static photographs. While increasing demand for stimuli with enhanced ecological validity in facial emotion recognition research has led to the development of video stimuli, these typically involve full-blown (apex) expressions. However, variations of intensity in emotional facial expressions occur in real life social interactions, with low intensity expressions of emotions frequently occurring. The current study therefore developed and validated a set of video stimuli portraying three levels of intensity of emotional expressions, from low to high intensity. The videos were adapted from the Amsterdam Dynamic Facial Expression Set (ADFES) and termed the Bath Intensity Variations (ADFES-BIV). A healthy sample of 92 people recruited from the University of Bath community (41 male, 51 female) completed a facial emotion recognition task including expressions of 6 basic emotions (anger, happiness, disgust, fear, surprise, sadness) and 3 complex emotions (contempt, embarrassment, pride) that were expressed at three different intensities of expression and neutral. Accuracy scores (raw and unbiased (Hu) hit rates) were calculated, as well as response times. Accuracy rates above chance level of responding were found for all emotion categories, producing an overall raw hit rate of 69% for the ADFES-BIV. The three intensity levels were validated as distinct categories, with higher accuracies and faster responses to high intensity expressions than intermediate intensity expressions, which had higher accuracies and faster responses than low intensity expressions. To further validate the intensities, a second study with standardised display times was conducted replicating this pattern. The ADFES-BIV has greater ecological validity than many other emotion stimulus sets and allows for versatile applications in emotion research. It can be retrieved free of charge for research purposes from the corresponding author.
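
    The raw and unbiased (Hu) hit rates mentioned above can both be read off a stimulus-by-response confusion matrix; Wagner's (1993) Hu divides the squared count of correct responses by the product of the row and column totals, penalising indiscriminate use of a response category. The 3×3 matrix below is invented purely to show the arithmetic.

      import numpy as np

      # Rows = emotion presented, columns = emotion chosen (invented counts).
      confusion = np.array([
          [18,  1,  1],   # anger presented
          [ 2, 16,  2],   # happiness presented
          [ 3,  2, 15],   # sadness presented
      ])
      raw_hit_rate = confusion.diagonal() / confusion.sum(axis=1)
      unbiased_hu = confusion.diagonal() ** 2 / (confusion.sum(axis=1) * confusion.sum(axis=0))
      print(raw_hit_rate.round(2), unbiased_hu.round(2))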

  3. Validity of photographs for food portion estimation in a rural West African setting.

    Science.gov (United States)

    Huybregts, L; Roberfroid, D; Lachat, C; Van Camp, J; Kolsteren, P

    2008-06-01

    To validate food photographs for food portion size estimation of frequently consumed dishes, to be used in a 24-hour recall food consumption study of pregnant women in a rural environment in Burkina Faso. This food intake study is part of an intervention evaluating the efficacy of prenatal micronutrient supplementation on birth outcomes. Women of childbearing age (15-45 years). A food photograph album containing four photographs of food portions per food item was compiled for eight selected food items. Subjects were presented two food items each in the morning and two in the afternoon. These foods were weighed to the exact weight of a food depicted in one of the photographs and were in the same receptacles. The next day another fieldworker presented the food photographs to the subjects to test their ability to choose the correct photograph. The correct photograph out of the four proposed was chosen in 55% of 1028 estimations. For each food, proportions of underestimating and overestimating participants were balanced, except for rice and couscous. On a group level, mean differences between served and estimated portion sizes were between -8.4% and 6.3%. Subjects who attended school were almost twice as likely to choose the correct photograph. The portion size served (small vs. largest sizes) had a significant influence on the portion estimation ability. The results from this study indicate that in a West African rural setting, food photographs can be a valuable tool for the quantification of food portion size on group level.

  4. Validation of the National Early Warning Score in the prehospital setting.

    Science.gov (United States)

    Silcock, Daniel J; Corfield, Alasdair R; Gowens, Paul A; Rooney, Kevin D

    2015-04-01

    Early intervention and response to deranged physiological parameters in the critically ill patient improves outcomes. A National Early Warning Score (NEWS) based on physiological observations has been developed for use throughout the National Health Service (NHS) in the UK. Although a good predictor of mortality and deterioration in inpatients, its performance in the prehospital setting is largely untested. This study aimed to assess the validity of the NEWS in unselected prehospital patients. All clinical observations taken by emergency ambulance crews transporting patients to a single hospital were collated along with information relating to hospital outcome over a two month period. The performance of the NEWS in identifying the endpoints of 48h and 30 day mortality, intensive care unit (ICU) admission, and a combined endpoint of 48h mortality or ICU admission was analysed. 1684 patients were analysed. All three of the primary endpoints and the combined endpoint were associated with higher NEWS scores (p [...] early involvement of senior Emergency Department staff and appropriate critical care. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
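
    Validation of a score like the NEWS against a binary endpoint (e.g. 30-day mortality or ICU admission) is commonly summarised by how well the score separates patients with and without the outcome, for example with an area under the ROC curve. The sketch below is a generic, hand-rolled AUROC on simulated scores; it does not reproduce the study's data, analysis, or the official NEWS scoring chart.

    ```python
    import numpy as np

    def auroc(scores, outcomes):
        """Rank-based AUROC: probability that a randomly chosen positive case
        scores higher than a randomly chosen negative case (ties count 0.5)."""
        scores = np.asarray(scores, dtype=float)
        outcomes = np.asarray(outcomes, dtype=bool)
        pos, neg = scores[outcomes], scores[~outcomes]
        greater = (pos[:, None] > neg[None, :]).sum()
        ties = (pos[:, None] == neg[None, :]).sum()
        return (greater + 0.5 * ties) / (len(pos) * len(neg))

    rng = np.random.default_rng(0)
    # Simulated prehospital scores: patients reaching the endpoint tend to score higher.
    score = np.concatenate([rng.poisson(3, 1600), rng.poisson(7, 84)])
    endpoint = np.concatenate([np.zeros(1600, bool), np.ones(84, bool)])
    print(f"AUROC for the combined endpoint: {auroc(score, endpoint):.2f}")
    ```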

  5. ACE-FTS version 3.0 data set: validation and data processing update

    Directory of Open Access Journals (Sweden)

    Claire Waymark

    2014-01-01

    Full Text Available On 12 August 2003, the Canadian-led Atmospheric Chemistry Experiment (ACE) was launched into a 74° inclination orbit at 650 km with the mission objective to measure atmospheric composition using infrared and UV-visible spectroscopy (Bernath et al. 2005). The ACE mission consists of two main instruments, ACE-FTS and MAESTRO (McElroy et al. 2007), which are being used to investigate the chemistry and dynamics of the Earth's atmosphere. Here, we focus on the high resolution (0.02 cm-1) infrared Fourier Transform Spectrometer, ACE-FTS, that measures in the 750-4400 cm-1 (2.2 to 13.3 µm) spectral region. This instrument has been making regular solar occultation observations for more than nine years. The current ACE-FTS data version (version 3.0) provides profiles of temperature and volume mixing ratios (VMRs) of more than 30 atmospheric trace gas species, as well as 20 subsidiary isotopologues of the most abundant trace atmospheric constituents over a latitude range of ~85°N to ~85°S. This letter describes the current data version and recent validation comparisons and provides a description of our planned updates for the ACE-FTS data set. [...]

  6. Validity of the KABC-II Culture-Language Interpretive Matrix: A Comparison of Native English Speakers and Spanish-Speaking English Language Learners

    Science.gov (United States)

    Van Deth, Leah M.

    2013-01-01

    The purpose of the present study was to investigate the validity of the Culture-Language Interpretive Matrix (C-LIM; Flanagan, Ortiz, & Alfonso, 2013) when applied to scores from the Kaufman Assessment Battery for Children, 2nd Edition (KABC-II; Kaufman & Kaufman, 2004). Data were analyzed from the KABC-II standardization sample as well as…

  7. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner.

    Science.gov (United States)

    Poon, Jonathan K; Dahlbom, Magnus L; Casey, Michael E; Qi, Jinyi; Cherry, Simon R; Badawi, Ramsey D

    2015-02-07

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.

  8. Affordances in the home environment for motor development: Validity and reliability for the use in daycare setting.

    Science.gov (United States)

    Müller, Alessandra Bombarda; Valentini, Nadia Cristina; Bandeira, Paulo Felipe Ribeiro

    2017-05-01

    The range of stimuli provided by physical space, toys and care practices contributes to the motor, cognitive and social development of children. However, assessing the quality of child education environments is a challenge, and can be considered a health promotion initiative. This study investigated the criterion, content and construct validity, and the reliability, of the Affordances in the Home Environment for Motor Development - Infant Scale (AHEMD-IS), version 3-18 months, for use in daycare settings. Content validation was conducted with the participation of seven motor development and health care experts, and face validity by 20 specialists in health and education. The results indicate the suitability of the adapted AHEMD-IS, evidencing its validity for the daycare setting as a potential tool to assess the opportunities that the collective context offers to child development. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Atmospheric correction at AERONET locations: A new science and validation data set

    Science.gov (United States)

    Wang, Y.; Lyapustin, A.I.; Privette, J.L.; Morisette, J.T.; Holben, B.

    2009-01-01

    This paper describes an Aerosol Robotic Network (AERONET)-based Surface Reflectance Validation Network (ASRVN) and its data set of spectral surface bidirectional reflectance and albedo based on Moderate Resolution Imaging Spectroradiometer (MODIS) TERRA and AQUA data. The ASRVN is an operational data collection and processing system. It receives 50 × 50 km2 subsets of MODIS level 1B (L1B) data from MODIS adaptive processing system and AERONET aerosol and water-vapor information. Then, it performs an atmospheric correction (AC) for about 100 AERONET sites based on accurate radiative-transfer theory with complex quality control of the input data. The ASRVN processing software consists of an L1B data gridding algorithm, a new cloud-mask (CM) algorithm based on a time-series analysis, and an AC algorithm using ancillary AERONET aerosol and water-vapor data. The AC is achieved by fitting the MODIS top-of-atmosphere measurements, accumulated for a 16-day interval, with theoretical reflectance parameterized in terms of the coefficients of the Li Sparse-Ross Thick (LSRT) model of the bidirectional reflectance factor (BRF). The ASRVN takes several steps to ensure high quality of results: 1) the filtering of opaque clouds by a CM algorithm; 2) the development of an aerosol filter to filter residual semitransparent and subpixel clouds, as well as cases with high inhomogeneity of aerosols in the processing area; 3) imposing the requirement of the consistency of the new solution with previously retrieved BRF and albedo; 4) rapid adjustment of the 16-day retrieval to the surface changes using the last day of measurements; and 5) development of a seasonal backup spectral BRF database to increase data coverage. The ASRVN provides a gapless or near-gapless coverage for the processing area. The gaps, caused by clouds, are filled most naturally with the latest solution for a given pixel. The ASRVN products include three parameters of the LSRT model (kL, kG, and kV), surface albedo
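
    The LSRT model mentioned above is linear in its three coefficients, so once the geometric and volumetric kernel values are known for each observation the coefficients follow from ordinary least squares. The sketch below assumes the kernels have already been computed for each sun-view geometry (here they are random placeholders); it is not the ASRVN processing code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_obs = 32                                    # BRF samples accumulated over 16 days

    # Placeholder kernel values; in practice these come from evaluating the
    # geometric (Li Sparse) and volumetric (Ross Thick) kernels per geometry.
    k_geo = rng.uniform(-0.2, 0.1, n_obs)
    k_vol = rng.uniform(-0.1, 0.4, n_obs)

    true_kL, true_kG, true_kV = 0.05, 0.02, 0.30  # synthetic "surface" coefficients
    brf = true_kL + true_kG * k_geo + true_kV * k_vol + rng.normal(0, 0.005, n_obs)

    # Design matrix: isotropic, geometric and volumetric terms.
    A = np.column_stack([np.ones(n_obs), k_geo, k_vol])
    (kL, kG, kV), *_ = np.linalg.lstsq(A, brf, rcond=None)
    print(f"fitted kL={kL:.3f}, kG={kG:.3f}, kV={kV:.3f}")
    ```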

  10. Computational discovery and functional validation of novel fluoroquinolone resistance genes in public metagenomic data sets.

    Science.gov (United States)

    Boulund, Fredrik; Berglund, Fanny; Flach, Carl-Fredrik; Bengtsson-Palme, Johan; Marathe, Nachiket P; Larsson, D G Joakim; Kristiansson, Erik

    2017-09-02

    Fluoroquinolones are broad-spectrum antibiotics used to prevent and treat a wide range of bacterial infections. Plasmid-mediated qnr genes provide resistance to fluoroquinolones in many bacterial species and are increasingly encountered in clinical settings. Over the last decade, several families of qnr genes have been discovered and characterized, but their true prevalence and diversity still remain unclear. In particular, environmental and host-associated bacterial communities have been hypothesized to maintain a large and unknown collection of qnr genes that could be mobilized into pathogens. In this study we used computational methods to screen genomes and metagenomes for novel qnr genes. In contrast to previous studies, we analyzed an almost 20-fold larger dataset comprising almost 13 terabases of sequence data. In total, 362,843 potential qnr gene fragments were identified, from which 611 putative qnr genes were reconstructed. These gene sequences included all previously described plasmid-mediated qnr gene families. Fifty-two of the 611 identified qnr genes were reconstructed from metagenomes, and 20 of these were previously undescribed. All of the novel qnr genes were assembled from metagenomes associated with aquatic environments. Nine of the novel genes were selected for validation, and six of the tested genes conferred consistently decreased susceptibility to ciprofloxacin when expressed in Escherichia coli. The results presented in this study provide additional evidence for the ubiquitous presence of qnr genes in environmental microbial communities, expand the number of known qnr gene variants and further elucidate the diversity of this class of resistance genes. This study also strengthens the hypothesis that environmental bacterial communities act as sources of previously uncharacterized qnr genes.

  11. Setting Specific Criteria for Scoring Word Problems in Mathematics: Effects on Test Validity and Reliability.

    Science.gov (United States)

    Ibe, Milagros D.

    1983-01-01

    This study investigated the effects of specific criteria for marking a test on its reliability and validity. Eight algebra word problems were administered to grade 10 students. The objectivity of scoring criteria improved the reliability of the test, but did not affect its validity. (MNS)

  12. Development and Validation of a Set of Palliative Medicine Entrustable Professional Activities: Findings from a Mixed Methods Study.

    Science.gov (United States)

    Myers, Jeff; Krueger, Paul; Webster, Fiona; Downar, James; Herx, Leonie; Jeney, Christa; Oneschuk, Doreen; Schroder, Cori; Sirianni, Giovanna; Seccareccia, Dori; Tucker, Tara; Taniguchi, Alan

    2015-08-01

    Entrustable professional activities (EPAs) are routine tasks considered essential to a professional practice. An EPA can serve as a performance-based outcome that a clinical supervisor would progressively entrust a learner to perform. Our aim was to identify, develop, and validate a set of EPAs for the palliative medicine discipline. The design was a sequential qualitative and quantitative mixed methods study. A working group was convened to develop a set of EPAs. Focus groups and surveys were used for validation purposes. Palliative medicine educators and content experts from across Canada participated in both the working group as well as the focus groups. Attendees of the 2014 Canadian Society of Palliative Care Physicians (CSPCP) annual conference completed surveys. A questionnaire was used to collect survey participant sociodemographic, clinical, and academic information along with ratings of the importance of the EPAs individually and collectively. Cronbach's alpha examined internal consistency of the set of EPAs. Focus group participants strongly endorsed the 12 EPAs. Virtually all survey participants rated the individual EPAs as being "fairly/very important" (range 94% to 100%). Of the participants, 97% agreed that residents able to perform the set of EPAs would be practicing palliative medicine and 87% indicated strong agreement that this collective set of EPAs captures activities that all palliative medicine physicians must be able to perform. A Cronbach's alpha of 0.841 confirmed good internal consistency. Near uniform agreement from a national group of palliative medicine physicians provides strong validation for the set of 12 EPAs.
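
    The Cronbach's alpha of 0.841 quoted above summarises internal consistency across the 12 EPA importance ratings. A minimal sketch of the standard alpha computation, using simulated ratings rather than the survey data:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = items (e.g. 12 EPAs)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(2)
    # Simulated 1-5 importance ratings driven by a shared "importance" factor.
    common = rng.normal(4, 0.5, size=(120, 1))
    ratings = np.clip(np.round(common + rng.normal(0, 0.4, size=(120, 12))), 1, 5)
    print(f"Cronbach's alpha = {cronbach_alpha(ratings):.3f}")
    ```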

  13. Retest Effects in Matrix Test Performance: Differential Impact of Predictors at Different Hierarchy Levels in an Educational Setting

    Science.gov (United States)

    Freund, Philipp Alexander; Holling, Heinz

    2011-01-01

    If tests of cognitive ability are repeatedly taken, test scores rise. Such retest effects have been observed for a long time and for a variety of tasks. This study investigates retest effects on figural matrix items in an educational context. A short term effect is assumed for the direct retest administration in the same test session, and a long…

  14. Validation of the Amsterdam Beverage Picture Set: A Controlled Picture Set for Cognitive Bias Measurement and Modification Paradigms.

    Science.gov (United States)

    Pronk, Thomas; van Deursen, Denise S; Beraha, Esther M; Larsen, Helle; Wiers, Reinout W

    2015-10-01

    Alcohol research may benefit from controlled and validated picture sets. We have constructed the Amsterdam Beverage Picture Set (ABPS), which was designed for alcohol research in general and cognitive bias measurement and modification in particular. Here, we first formulate a position on alcohol stimulus validity that prescribes that alcohol-containing pictures, compared to nonalcohol-containing pictures, should induce a stronger urge to drink in heavy drinkers than in light drinkers. Because a perceptually simple picture might induce stronger cognitive biases but the presence of a drinking context might induce a stronger urge to drink, the ABPS contains pictures with and without drinking context. By limiting drinking contexts to simple consumption scenes instead of real-life scenes, complexity was minimized. A validation study was conducted to establish validity, to examine ABPS drinking contexts, and to explore the role of familiarity, valence, arousal, and control. Two hundred ninety-one psychology students completed the Alcohol Use Disorders Identification Test, as well as rating and recognition tasks for a subset of the ABPS pictures. The ABPS was well-recognized, familiar, and heavy drinkers reported a greater urge to drink in response to the alcohol-containing pictures only. Alcohol presented in drinking context did not elicit a stronger urge to drink but was recognized more slowly than alcohol presented without context. The ABPS was found to be valid, although pictures without context might be preferable for measuring cognitive biases than pictures with context. We discuss how an explicit approach to picture construction may aid in creating variations of the ABPS. Finally, we describe how ABPS adoption across studies may allow more reproducible and comparable results across paradigms, while allowing researchers to apply picture selection criteria that correspond to a wide range of theoretical positions. The latter is exemplified by ABPS derivatives and

  15. [Validation Study on a Multi-Residue Method for Determination of Pesticide Residues in Vegetables and Fruits by using General Matrix Standard Solutions].

    Science.gov (United States)

    Fukui, Naoki; Takatori, Satoshi; Yamaguchi, Satoko; Kitagawa, Yoko; Yoshimitsu, Masato; Osakada, Masakazu; Kajimura, Keiji; Obana, Hirotaka

    2015-01-01

    Quantitative methods using the matrix-matched standard solutions approach are widely used for multi-residue pesticide determination by GC-MS/MS to deal with the issue of matrix effects. However, preparing matrix-matched standard solutions in analyses of many kinds of samples is very time-consuming. In order to solve this problem, a method that employs general matrix standard solutions has been developed using polyethylene glycol (PEG), extract of vegetables-fruit juice (VFJm) and triphenyl phosphate (named the PEG-VFJm method). Here, a validation study for 168 pesticides was performed on three kinds of samples [potato, spinach and apple] at concentrations of 0.010 and 0.050 μg/g. In these three commodities, 144 to 158 pesticides satisfied the required criteria using the matrix-matched method and 129 to 149 pesticides satisfied the same criteria using the PEG-VFJm method. Our results suggest that application of general matrix standard solutions would enable rapid and effective analyses of pesticides.
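
    Whichever standards are used (matrix-matched or general matrix standards of the PEG-VFJm type), quantification rests on a calibration line fitted to the standard responses. A minimal sketch with invented GC-MS/MS peak areas; the slope comparison at the end is one simple way to gauge a matrix effect and is not the paper's validation protocol.

    ```python
    import numpy as np

    # Calibration standards prepared in a representative matrix (ng/mL vs. peak area).
    conc = np.array([0.0, 5.0, 10.0, 25.0, 50.0])             # hypothetical levels
    area_matrix_std = np.array([40, 540, 1060, 2620, 5180])   # hypothetical responses

    slope, intercept = np.polyfit(conc, area_matrix_std, 1)

    # Quantify an unknown extract from its peak area by inverting the line.
    sample_area = 1890.0
    sample_conc = (sample_area - intercept) / slope
    print(f"slope={slope:.1f}, intercept={intercept:.1f}, sample = {sample_conc:.1f} ng/mL")

    # Comparing against a solvent-only calibration slope indicates the matrix effect.
    area_solvent_std = np.array([10, 620, 1230, 3060, 6110])  # hypothetical
    solvent_slope = np.polyfit(conc, area_solvent_std, 1)[0]
    print(f"slope ratio (matrix/solvent) = {slope / solvent_slope:.2f}")
    ```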

  16. Establishing the Reliability and Validity of a Computerized Assessment of Children's Working Memory for Use in Group Settings

    Science.gov (United States)

    St Clair-Thompson, Helen

    2014-01-01

    The aim of the present study was to investigate the reliability and validity of a brief standardized assessment of children's working memory; "Lucid Recall." Although there are many established assessments of working memory, "Lucid Recall" is fully automated and can therefore be administered in a group setting. It is therefore…

  17. Validation of Correction Algorithms for Near-IR Analysis of Human Milk in an Independent Sample Set-Effect of Pasteurization.

    Science.gov (United States)

    Kotrri, Gynter; Fusch, Gerhard; Kwan, Celia; Choi, Dasol; Choi, Arum; Al Kafi, Nisreen; Rochow, Niels; Fusch, Christoph

    2016-02-26

    Commercial infrared (IR) milk analyzers are being increasingly used in research settings for the macronutrient measurement of breast milk (BM) prior to its target fortification. These devices, however, may not provide reliable measurement if not properly calibrated. In the current study, we tested a correction algorithm for a Near-IR milk analyzer (Unity SpectraStar, Brookfield, CT, USA) for fat and protein measurements, and examined the effect of pasteurization on the IR matrix and the stability of fat, protein, and lactose. Measurement values generated through Near-IR analysis were compared against those obtained through chemical reference methods to test the correction algorithm for the Near-IR milk analyzer. Macronutrient levels were compared between unpasteurized and pasteurized milk samples to determine the effect of pasteurization on macronutrient stability. The correction algorithm generated for our device was found to be valid for unpasteurized and pasteurized BM. Pasteurization had no effect on the macronutrient levels and the IR matrix of BM. These results show that fat and protein content can be accurately measured and monitored for unpasteurized and pasteurized BM. Of additional importance is the implication that donated human milk, generally low in protein content, has the potential to be target fortified.
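
    A correction algorithm of the kind tested above can be as simple as a linear mapping from the analyzer's readings to the chemical reference values, derived on one sample set and then applied to new milk samples. The sketch below illustrates that idea with fabricated fat values; it is not the published algorithm for the Unity SpectraStar device.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Paired calibration measurements: Near-IR reading vs. chemical reference (g/dL fat).
    reference = rng.uniform(2.0, 6.0, 40)
    near_ir = 0.85 * reference + 0.6 + rng.normal(0, 0.1, 40)   # biased device readings

    # Fit the correction (reference as a function of the device reading).
    slope, intercept = np.polyfit(near_ir, reference, 1)

    def correct(reading):
        """Apply the linear correction to a raw Near-IR reading."""
        return slope * reading + intercept

    new_readings = np.array([3.1, 4.4, 5.2])
    print("corrected fat estimates (g/dL):", np.round(correct(new_readings), 2))
    ```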

  18. Validation and evaluation of common large-area display set (CLADS) performance specification

    Science.gov (United States)

    Hermann, David J.; Gorenflo, Ronald L.

    1998-09-01

    Battelle is under contract with Warner Robins Air Logistics Center to design a Common Large Area Display Set (CLADS) for use in multiple Command, Control, Communications, Computers, and Intelligence (C4I) applications that currently use 19- inch Cathode Ray Tubes (CRTs). Battelle engineers have built and fully tested pre-production prototypes of the CLADS design for AWACS, and are completing pre-production prototype displays for three other platforms simultaneously. With the CLADS design, any display technology that can be packaged to meet the form, fit, and function requirements defined by the Common Large Area Display Head Assembly (CLADHA) performance specification is a candidate for CLADS applications. This technology independent feature reduced the risk of CLADS development, permits life long technology insertion upgrades without unnecessary redesign, and addresses many of the obsolescence problems associated with COTS technology-based acquisition. Performance and environmental testing were performed on the AWACS CLADS and continues on other platforms as a part of the performance specification validation process. A simulator assessment and flight assessment were successfully completed for the AWACS CLADS, and lessons learned from these assessments are being incorporated into the performance specifications. Draft CLADS specifications were released to potential display integrators and manufacturers for review in 1997, and the final version of the performance specifications are scheduled to be released to display integrators and manufacturers in May, 1998. Initial USAF applications include replacements for the E-3 AWACS color monitor assembly, E-8 Joint STARS graphics display unit, and ABCCC airborne color display. Initial U.S. Navy applications include the E-2C ACIS display. For these applications, reliability and maintainability are key objectives. The common design will reduce the cost of operation and maintenance by an estimated 3.3M per year on E-3 AWACS

  19. Set-up and validation of a Delft-FEWS based coastal hazard forecasting system

    Science.gov (United States)

    Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya

    2017-04-01

    European coasts are increasingly threatened by hazards related to low-probability and high-impact hydro-meteorological events. Uncertainties in hazard prediction and capabilities to cope with their impact lie in both future storm patterns and increasing coastal development. Therefore, adaptation to future conditions requires a re-evaluation of coastal disaster risk reduction (DRR) strategies and introduction of a more efficient mix of prevention, mitigation and preparedness measures. The latter presumes that development of tools which can manage the complex process of merging data and models and generate products on the current and expected hydro- and morpho-dynamic states of the coasts, such as a forecasting system for flooding and erosion hazards at vulnerable coastal locations (hotspots), is of vital importance. The output of such a system can be of utmost value for coastal stakeholders and the entire coastal community. In response to these challenges, Delft-FEWS provides a state-of-the-art framework for the implementation of such a system, with vast capabilities to trigger the early warning process. In addition, this framework is highly customizable to the specific requirements of any individual coastal hotspot. Since its release, many Delft-FEWS based forecasting systems related to inland flooding have been developed. However, only a limited number of coastal applications have been implemented. In this paper, a set-up of a Delft-FEWS based forecasting system for Varna Bay (Bulgaria) and a coastal hotspot, which includes a sandy beach and port infrastructure, is presented. It is implemented in the frame of the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The system output generated in hindcast mode is validated with available observations of surge levels, wave and morphodynamic parameters for a sequence of three short-duration and relatively weak storm events that occurred during February 4-12, 2015. Generally, the models' performance is considered as very good and

  20. Validated QSAR prediction of OH tropospheric degradation of VOCs: splitting into training-test sets and consensus modeling.

    Science.gov (United States)

    Gramatica, Paola; Pilutti, Pamela; Papa, Ester

    2004-01-01

    The rate constant for hydroxyl radical tropospheric degradation of 460 heterogeneous organic compounds is predicted by QSAR modeling. The applied Multiple Linear Regression is based on a variety of theoretical molecular descriptors, selected by the Genetic Algorithms-Variable Subset Selection (GA-VSS) procedure. The models were validated for predictivity by both internal and external validation. For the external validation two splitting approaches, D-optimal Experimental Design and Kohonen Artificial Neural Networks (K-ANN), were applied to the original data set to compare the two methodologies. We emphasize that external validation is the only way to establish a reliable QSAR model for predictive purposes. Predicted data by consensus modeling from different models are also proposed. Copyright 2004 American Chemical Society
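
    The core of the external-validation argument above is that model quality must be judged on compounds never used for fitting. Below is a minimal sketch with a random training/test split and an external Q2 computed as 1 - PRESS/SS; the descriptors are random stand-ins rather than the GA-selected theoretical descriptors, and D-optimal design or Kohonen-map splitting would replace the random split in practice.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n, p = 460, 5
    X = rng.normal(size=(n, p))                       # stand-in molecular descriptors
    y = X @ np.array([1.2, -0.8, 0.5, 0.0, 0.3]) + rng.normal(0, 0.5, n)  # e.g. log k(OH)

    # Random split into training and external test sets (80/20 here).
    idx = rng.permutation(n)
    train, test = idx[:368], idx[368:]

    # Ordinary multiple linear regression fitted on the training set only.
    A_train = np.column_stack([np.ones(len(train)), X[train]])
    coef, *_ = np.linalg.lstsq(A_train, y[train], rcond=None)

    # External predictivity: Q2_ext = 1 - PRESS / SS (SS relative to the training mean).
    y_pred = np.column_stack([np.ones(len(test)), X[test]]) @ coef
    press = np.sum((y[test] - y_pred) ** 2)
    ss = np.sum((y[test] - y[train].mean()) ** 2)
    print(f"Q2_ext = {1 - press / ss:.3f}")
    ```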

  1. Measuring technology self efficacy: reliability and construct validity of a modified computer self efficacy scale in a clinical rehabilitation setting.

    Science.gov (United States)

    Laver, Kate; George, Stacey; Ratcliffe, Julie; Crotty, Maria

    2012-01-01

    To describe a modification of the computer self efficacy scale for use in clinical settings and to report on the modified scale's reliability and construct validity. The computer self efficacy scale was modified to make it applicable for clinical settings (for use with older people or people with disabilities using everyday technologies). The modified scale was piloted, then tested with patients in an Australian inpatient rehabilitation setting (n = 88) to determine the internal consistency using Cronbach's alpha coefficient. Construct validity was assessed by correlation of the scale with age and technology use. Factor analysis using principal components analysis was undertaken to identify important constructs within the scale. The modified computer self efficacy scale demonstrated high internal consistency with a standardised alpha coefficient of 0.94. Two constructs within the scale were apparent; using the technology alone, and using the technology with the support of others. Scores on the scale were correlated with age and frequency of use of some technologies thereby supporting construct validity. The modified computer self efficacy scale has demonstrated reliability and construct validity for measuring the self efficacy of older people or people with disabilities when using everyday technologies. This tool has the potential to assist clinicians in identifying older patients who may be more open to using new technologies to maintain independence.

  2. Matrix Extension Study: Validation of the Compact Dry CF Method for Enumeration of Total Coliform Bacteria in Selected Foods.

    Science.gov (United States)

    Mizuochi, Shingo; Nelson, Maria; Baylis, Chris; Green, Becky; Jewell, Keith; Monadjemi, Farinaz; Chen, Yi; Salfinger, Yvonne; Fernandez, Maria Cristina

    2016-01-01

    The Compact Dry "Nissui" CF method, Performance Tested Method(SM) 110401, was originally certified for enumeration of coliform bacteria by the AOAC Research Institute Performance Tested Methods(SM) program for raw meat products. Compact Dry CF is a ready-to-use dry media sheet, containing a cold-soluble gelling agent, a chromogenic medium, and selective agents, which are rehydrated by adding 1 mL of diluted sample. Coliform bacteria produce blue/blue-green colonies on the Compact Dry CF, allowing a coliform colony count to be determined in the sample after 24 ± 2 h incubation. A validation study was organized by Campden BRI (formerly Campden and Chorleywood Food Research Association Technology, Ltd), Chipping Campden, United Kingdom, to extend the method's claim to include cooked chicken, fresh bagged prewashed shredded iceberg lettuce, frozen fish, milk powder, and pasteurized 2% milk. Campden BRI collected single-laboratory data for cooked chicken, lettuce, frozen fish, and milk powder, whereas a multilaboratory study was conducted on pasteurized milk. Thirteen laboratories participated in the interlaboratory study. The Compact Dry CF method was compared to ISO 4832:2006 "Microbiology of food and animal feeding stuffs-Horizontal method for the enumeration of coliforms-Colony-count technique," the current version at the time this study was conducted. Each matrix was evaluated at either four or five contamination levels of coliform bacteria (including an uncontaminated level). After logarithmic transformation of counts at each level, the data for pasteurized whole milk were analyzed for sr, sR, RSDr, and RSDR. Regression analysis was also performed and r(2) was reported. Mean difference between methods with 95% confidence interval (CI) was calculated. A log10 range of -0.5 to 0.5 for the CI was used as the acceptance criterion to establish significant statistical difference between methods. In the single-laboratory evaluation (for cooked chicken, lettuce, frozen
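
    The acceptance criterion described above (a 95% confidence interval for the mean log10 difference between methods lying within -0.5 to 0.5) takes only a few lines of arithmetic to check. A sketch with invented paired colony counts, not the study's data:

    ```python
    import numpy as np

    # Paired coliform counts (CFU/g) for the same samples: candidate vs. reference method.
    compact_dry = np.array([120, 450, 2300, 9800, 40, 310, 1500, 5200], float)
    iso_4832 = np.array([100, 500, 2100, 11000, 50, 280, 1700, 4800], float)

    diff = np.log10(compact_dry) - np.log10(iso_4832)
    mean_diff = diff.mean()
    # 95% CI half-width using the t critical value for n-1 df (t = 2.365 for n = 8).
    half_width = 2.365 * diff.std(ddof=1) / np.sqrt(len(diff))
    lower, upper = mean_diff - half_width, mean_diff + half_width

    print(f"mean log10 difference = {mean_diff:+.3f}, 95% CI ({lower:+.3f}, {upper:+.3f})")
    print("within acceptance band" if lower > -0.5 and upper < 0.5 else "outside acceptance band")
    ```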

  3. Incremental validity of selected MMPI-A content scales in an inpatient setting.

    Science.gov (United States)

    McGrath, Robert E; Pogge, David L; Stokes, John M

    2002-12-01

    To date, relatively few studies have been published evaluating the validity or incremental validity of the content scales from the adolescent version of the Minnesota Multiphasic Personality Inventory (MMPI-A; J. N. Butcher et al., 1992). A sample of 629 psychiatric inpatient adolescents who had completed the MMPI-A was used to evaluate the ability of selected clinical and content scales to predict conceptually related clinical variables. Criteria were based on clinician ratings, admission and discharge diagnoses, and chart reviews. Results from hierarchical multiple and logistic regression analyses indicated the content scales offered incremental validity over the clinical scales and supported the use of the content scales as an adjunct to the traditional clinical scales.
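
    Incremental validity in this design is the gain in explained variance when the content scales enter the regression after the clinical scales. A hedged sketch of that hierarchical step with simulated scores (not MMPI-A data):

    ```python
    import numpy as np

    def r_squared(X, y):
        """R^2 from an ordinary least-squares fit with an intercept."""
        A = np.column_stack([np.ones(len(y)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        return 1 - resid.var() / y.var()

    rng = np.random.default_rng(5)
    n = 629
    clinical = rng.normal(size=(n, 3))                           # block 1: clinical scales
    content = 0.4 * clinical[:, [0]] + rng.normal(size=(n, 2))   # block 2: content scales
    criterion = (clinical @ np.array([0.3, 0.2, 0.1])
                 + content @ np.array([0.4, 0.2])
                 + rng.normal(0, 1, n))                          # e.g. clinician rating

    r2_block1 = r_squared(clinical, criterion)
    r2_block2 = r_squared(np.column_stack([clinical, content]), criterion)
    print(f"R2 clinical = {r2_block1:.3f}, + content = {r2_block2:.3f}, "
          f"delta R2 = {r2_block2 - r2_block1:.3f}")
    ```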

  4. Examining the validity of the Athlete Engagement Questionnaire (AEQ) in a Portuguese sport setting

    Directory of Open Access Journals (Sweden)

    Paulo Martins

    2014-03-01

    Full Text Available Sport psychology literature suggests that understanding engagement levels is pivotal to promote positive sporting experiences among athletes. The purpose of this study was to examine the psychometric properties of the Athlete Engagement Questionnaire among Portuguese sport athletes. Two distinct samples of Portuguese athletes from different competitive levels were collected, and the results of a confirmatory factor analysis demonstrated a good fit of the model to the data. A review of the psychometric properties indicated that all factors showed good composite reliability, convergent validity, and discriminant validity. In addition, a multi-groups analysis showed the invariance of the model in two independent samples providing evidence of cross validity. Implications of these results for scholars and coaches are discussed and guidelines for future studies are suggested.

  5. Phenotypic identification of Porphyromonas gingivalis validated with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    Science.gov (United States)

    Rams, Thomas E; Sautter, Jacqueline D; Getreu, Adam; van Winkelhoff, Arie J

    2016-05-01

    Porphyromonas gingivalis is a major bacterial pathogen in human periodontitis. This study used matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry to assess the accuracy of a rapid phenotypic identification scheme for detection of cultivable P. gingivalis in human subgingival plaque biofilms. A total of 314 fresh cultivable subgingival isolates from 38 adults with chronic periodontitis were presumptively identified on anaerobically-incubated enriched Brucella blood agar primary isolation plates as P. gingivalis based on dark-pigmented colony morphology, lack of a brick-red autofluorescence reaction under long-wave ultraviolet light, and a positive CAAM fluorescence test for trypsin-like enzyme activity. Each presumptive P. gingivalis isolate, and a panel of other human subgingival bacterial species, were subjected to MALDI-TOF mass spectrometry analysis using a benchtop mass spectrometer equipped with software containing mass spectra for P. gingivalis in its reference library of bacterial protein profiles. A MALDI-TOF mass spectrometry log score of ≥1.7 was required for species identification of the subgingival isolates. All 314 (100%) presumptive P. gingivalis subgingival isolates were confirmed as P. gingivalis with MALDI-TOF mass spectrometry analysis (Cohen's kappa coefficient = 1.0). MALDI-TOF mass spectrometry log scores between 1.7 and 1.9, and ≥2.0, were found for 92 (29.3%) and 222 (70.7%), respectively, of the presumptive P. gingivalis clinical isolates. No other tested bacterial species was identified as P. gingivalis by MALDI-TOF mass spectrometry. Rapid phenotypic identification of cultivable P. gingivalis in human subgingival biofilm specimens was found to be 100% accurate with MALDI-TOF mass spectrometry. These findings provide validation for the continued use of P. gingivalis research data based on this species identification methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
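
    The Cohen's kappa of 1.0 reported above reflects perfect agreement between phenotypic identification and MALDI-TOF. For reference, a minimal general-purpose kappa calculation from two lists of labels (toy data, not the study's isolates):

    ```python
    import numpy as np

    def cohens_kappa(labels_a, labels_b):
        """Agreement between two raters corrected for chance agreement."""
        labels_a, labels_b = np.asarray(labels_a), np.asarray(labels_b)
        categories = np.union1d(labels_a, labels_b)
        observed = np.mean(labels_a == labels_b)
        # Chance agreement from the two raters' marginal proportions.
        expected = sum(np.mean(labels_a == c) * np.mean(labels_b == c) for c in categories)
        return (observed - expected) / (1 - expected)

    phenotypic = ["Pg", "Pg", "Pg", "other", "Pg", "other", "Pg", "Pg"]
    maldi_tof = ["Pg", "Pg", "Pg", "other", "Pg", "other", "Pg", "other"]
    print(f"Cohen's kappa = {cohens_kappa(phenotypic, maldi_tof):.2f}")
    ```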

  6. PatentMatrix: an automated tool to survey patents related to large sets of genes or proteins

    Directory of Open Access Journals (Sweden)

    de Rinaldis Emanuele

    2007-09-01

    Full Text Available Abstract Background The number of patents associated with genes and proteins and the amount of information contained in each patent often present a real obstacle to the rapid evaluation of the novelty of findings associated to genes from an intellectual property (IP) perspective. This assessment, normally carried out by expert patent professionals, can therefore become cumbersome and time consuming. Here we present PatentMatrix, a novel software tool for the automated analysis of patent sequence text entries. Methods and Results PatentMatrix is written in the Awk language and requires installation of the Derwent GENESEQ™ patent sequence database under the sequence retrieval system SRS. The software works by taking as input two files: (i) a list of genes or proteins with the associated GENESEQ™ patent sequence accession numbers; (ii) a list of keywords describing the research context of interest (e.g. 'lung', 'cancer', 'therapeutics', 'diagnostics'). The GENESEQ™ database is interrogated through the SRS system and each patent entry of interest is screened for the occurrence of user-defined keywords. Moreover, the software extracts the basic information useful for a preliminary assessment of the IP coverage of each patent from the GENESEQ™ database. As output, two tab-delimited files are generated which provide the user with a detailed and an aggregated view of the results. An example is given where the IP position of five genes is evaluated in the context of 'development of antibodies for cancer treatment'. Conclusion PatentMatrix allows a rapid survey of patents associated with genes or proteins in a particular area of interest as defined by keywords. It can be efficiently used to evaluate the IP-related novelty of scientific findings and to rank genes or proteins according to their IP position.
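
    The core screening step described above (checking each patent entry's text for user-supplied keywords and reporting hits per gene) is simple to sketch. The snippet below is a generic Python illustration of that idea, not the actual Awk/SRS implementation, and the in-memory data structure stands in for the GENESEQ™ input files.

    ```python
    # Hypothetical input: gene -> list of (patent accession, free-text patent description).
    patents_by_gene = {
        "GENE_A": [("Q111", "Antibody compositions for lung cancer therapeutics"),
                   ("Q112", "Primers for plant genotyping")],
        "GENE_B": [("Q201", "Diagnostic kit for prostate cancer")],
    }
    keywords = ["lung", "cancer", "therapeutics", "diagnostics"]

    # Screen each patent entry for the user-defined keywords.
    report = {}
    for gene, entries in patents_by_gene.items():
        hits = []
        for accession, text in entries:
            matched = [kw for kw in keywords if kw.lower() in text.lower()]
            if matched:
                hits.append((accession, matched))
        report[gene] = hits

    for gene, hits in report.items():
        print(gene, "->", hits if hits else "no keyword matches")
    ```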

  7. Development and validation of an Argentine set of facial expressions of emotion

    NARCIS (Netherlands)

    Vaiman, M.; Wagner, M.A.; Caicedo, E.; Pereno, G.L.

    2017-01-01

    Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion

  8. Setting Standards for English Foreign Language Assessment: Methodology, Validation, and a Degree of Arbitrariness

    Science.gov (United States)

    Tiffin-Richards, Simon P.; Pant, Hans Anand; Koller, Olaf

    2013-01-01

    Cut-scores were set by expert judges on assessments of reading and listening comprehension of English as a foreign language (EFL), using the bookmark standard-setting method to differentiate proficiency levels defined by the Common European Framework of Reference (CEFR). Assessments contained stratified item samples drawn from extensive item…

  9. Validating an Indirect Measure of Clarity of Feelings: Evidence from Laboratory and Naturalistic Settings

    Science.gov (United States)

    Lischetzke, Tanja; Angelova, Rozalina; Eid, Michael

    2011-01-01

    This study analyzed the reliability and validity of an indirect measure of clarity of feelings that is based on response latencies (RTs) of mood ratings. Fifty-two participants completed a laboratory session and an experience-sampling week with 6 measurement occasions per day. Shorter RT of mood ratings measured in the laboratory (but not…

  10. Validation of the patient health questionnaire-9 for major depressive disorder in the occupational health setting

    NARCIS (Netherlands)

    Volker, D.; Zijlstra-Vlasveld, M.C.; Brouwers, E.P M.; Homans, W.A.; Emons, W.H.M.; van der Feltz-Cornelis, C.M.

    2016-01-01

    Purpose Because of the increased risk of long-term sickness leave for employees with a major depressive disorder (MDD), it is important for occupational health professionals to recognize depression in a timely manner. The Patient Health Questionnaire-9 (PHQ-9) has proven to be a reliable and valid

  11. Bioanalytical chromatographic method validation according to current regulations, with a special focus on the non-well defined parameters limit of quantification, robustness and matrix effect.

    Science.gov (United States)

    González, Oskar; Blanco, María Encarnación; Iriarte, Gorka; Bartolomé, Luis; Maguregui, Miren Itxaso; Alonso, Rosa M

    2014-08-01

    Method validation is a mandatory step in bioanalysis, to evaluate the ability of developed methods in providing reliable results for their routine application. Even if some organisations have developed guidelines to define the different parameters to be included in method validation (FDA, EMA); there are still some ambiguous concepts in validation criteria and methodology that need to be clarified. The methodology to calculate fundamental parameters such as the limit of quantification has been defined in several ways without reaching a harmonised definition, which can lead to very different values depending on the applied criterion. Other parameters such as robustness or ruggedness are usually omitted and when defined there is not an established approach to evaluate them. Especially significant is the case of the matrix effect evaluation which is one of the most critical points to be studied in LC-MS methods but has been traditionally overlooked. Due to the increasing importance of bioanalysis this scenario is no longer acceptable and harmonised criteria involving all the concerned parties should be arisen. The objective of this review is thus to discuss and highlight several essential aspects of method validation, focused in bioanalysis. The overall validation process including common validation parameters (selectivity, linearity range, precision, accuracy, stability…) will be reviewed. Furthermore, the most controversial parameters (limit of quantification, robustness and matrix effect) will be carefully studied and the definitions and methodology proposed by the different regulatory bodies will be compared. This review aims to clarify the methodology to be followed in bioanalytical method validation, facilitating this time consuming step. Copyright © 2014 Elsevier B.V. All rights reserved.
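
    Of the parameters discussed, the matrix effect has the most widely quoted working formula: the analyte response in post-extraction spiked matrix relative to the response in neat solvent, expressed as a percentage. A minimal sketch with hypothetical peak areas, following that commonly used definition rather than any single guideline:

    ```python
    import numpy as np

    # Peak areas at the same nominal concentration (hypothetical replicate values).
    area_neat_solvent = np.array([10250, 10100, 10400])    # standard in neat solvent
    area_spiked_matrix = np.array([8300, 8450, 8150])      # post-extraction spiked matrix

    matrix_effect_pct = 100 * area_spiked_matrix.mean() / area_neat_solvent.mean()
    label = "ion suppression" if matrix_effect_pct < 100 else "ion enhancement"
    print(f"matrix effect = {matrix_effect_pct:.1f}% ({label})")
    ```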

  12. Validation of the 4DSQ Somatization Subscale in the Occupational Health Care Setting as a Screener

    NARCIS (Netherlands)

    de Vroege, Lars; Emons, Wilco H. M.; Sijtsma, Klaas; Hoedeman, Rob; van der Feltz-Cornelis, Christina M.

    Purpose Somatoform disorders (physical symptoms without medical explanation that cause dysfunction) are prevalent in the occupational health (OH) care setting and are associated with functional impairment and absenteeism. Availability of psychometric instruments aimed at assessing somatoform

  13. Generation and validation of a Shewanella oneidensis MR-1 clone set for protein expression and phage display.

    Directory of Open Access Journals (Sweden)

    Haichun Gao

    Full Text Available A comprehensive gene collection for S. oneidensis was constructed using the lambda recombinase (Gateway) cloning system. A total of 3584 individual ORFs (85%) have been successfully cloned into the entry plasmids. To validate the use of the clone set, three sets of ORFs were examined within three different destination vectors constructed in this study. Success rates for heterologous protein expression of S. oneidensis His- or His/GST-tagged proteins in E. coli were approximately 70%. The ArcA and NarP transcription factor proteins were tested in an in vitro binding assay to demonstrate that functional proteins can be successfully produced using the clone set. Further functional validation of the clone set was obtained from phage display experiments in which a phage encoding thioredoxin was successfully isolated from a pool of 80 different clones after three rounds of biopanning using immobilized anti-thioredoxin antibody as a target. This clone set complements existing genomic (e.g., whole-genome microarray) and other proteomic tools (e.g., mass spectrometry-based proteomic analysis), and facilitates a wide variety of integrated studies, including protein expression, purification, and functional analyses of proteins both in vivo and in vitro.

  14. Validation of Social Cognition Rating Tools in Indian Setting (SOCRATIS): A new test-battery to assess social cognition.

    Science.gov (United States)

    Mehta, Urvakhsh M; Thirthalli, Jagadisha; Naveen Kumar, C; Mahadevaiah, Mahesh; Rao, Kiran; Subbakrishna, Doddaballapura K; Gangadhar, Bangalore N; Keshavan, Matcheri S

    2011-09-01

    Social cognition is a cognitive domain that is under substantial cultural influence. There are no culturally appropriate standardized tools in India to comprehensively test social cognition. This study describes validation of tools for three social cognition constructs: theory of mind, social perception and attributional bias. Theory of mind tests included adaptations of (a) two first order tasks [Sally-Anne and Smarties task], (b) two second order tasks [Ice cream van and Missing cookies story], (c) two metaphor-irony tasks and (d) the faux pas recognition test. Internal, Personal, and Situational Attributions Questionnaire (IPSAQ) and Social Cue Recognition Test were adapted to assess attributional bias and social perception, respectively. These tests were first modified to suit the Indian cultural context without changing the constructs to be tested. A panel of experts then rated the tests on Likert scales as to (1) whether the modified tasks tested the same construct as in the original and (2) whether they were culturally appropriate. The modified tests were then administered to groups of actively symptomatic and remitted schizophrenia patients as well as healthy comparison subjects. All tests of the Social Cognition Rating Tools in Indian Setting had good content validity and known groups validity. In addition, the social cue recognition test in Indian setting had good internal consistency and concurrent validity. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. The impact of crowd noise on officiating in Muay Thai: achieving external validity in an experimental setting

    Directory of Open Access Journals (Sweden)

    Tony D Myers

    2012-09-01

    Full Text Available Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the ‘crowd noise’ intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring ‘home’ and ‘away’ boxers. In each bout, judges were randomised into a ‘noise’ (live sound) or ‘no crowd noise’ (noise cancelling headphones and white noise) condition, resulting in 59 judgements in the ‘no crowd noise’ and 61 in the ‘crowd noise’ condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the ‘ten point must’ scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed.

  16. The impact of crowd noise on officiating in muay thai: achieving external validity in an experimental setting.

    Science.gov (United States)

    Myers, Tony; Balmer, Nigel

    2012-01-01

    Numerous factors have been proposed to explain the home advantage in sport. Several authors have suggested that a partisan home crowd enhances home advantage and that this is at least in part a consequence of their influence on officiating. However, while experimental studies examining this phenomenon have high levels of internal validity (since only the "crowd noise" intervention is allowed to vary), they suffer from a lack of external validity, with decision-making in a laboratory setting typically bearing little resemblance to decision-making in live sports settings. Conversely, observational and quasi-experimental studies with high levels of external validity suffer from low levels of internal validity as countless factors besides crowd noise vary. The present study provides a unique opportunity to address these criticisms, by conducting a controlled experiment on the impact of crowd noise on officiating in a live tournament setting. Seventeen qualified judges officiated on thirty Thai boxing bouts in a live international tournament setting featuring "home" and "away" boxers. In each bout, judges were randomized into a "noise" (live sound) or "no crowd noise" (noise-canceling headphones and white noise) condition, resulting in 59 judgments in the "no crowd noise" and 61 in the "crowd noise" condition. The results provide the first experimental evidence of the impact of live crowd noise on officials in sport. A cross-classified statistical model indicated that crowd noise had a statistically significant impact, equating to just over half a point per bout (in the context of five round bouts with the "10-point must" scoring system shared with professional boxing). The practical significance of the findings, their implications for officiating and for the future conduct of crowd noise studies are discussed.

  17. Eigenvalue-eigenvector decomposition (EED) analysis of dissimilarity and covariance matrix obtained from total synchronous fluorescence spectral (TSFS) data sets of herbal preparations: Optimizing the classification approach

    Science.gov (United States)

    Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar

    2017-09-01

    The present work compares the dissimilarity and covariance based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.
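
    The contrast drawn above, eigendecomposition of the covariance matrix (classical PCA) versus eigendecomposition applied to a pairwise dissimilarity matrix, can be sketched generically. Below, the dissimilarity route is implemented as classical multidimensional scaling on double-centred squared Euclidean distances; the spectra are random surrogates, not TSFS data.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    X = rng.normal(size=(20, 300))                 # 20 samples x 300 spectral channels
    Xc = X - X.mean(axis=0)

    # (i) Covariance route: eigendecomposition of the sample covariance (PCA scores).
    cov = np.cov(Xc, rowvar=False)
    eigval_c, eigvec_c = np.linalg.eigh(cov)       # ascending order
    pca_scores = Xc @ eigvec_c[:, ::-1][:, :2]     # scores on the top-2 components
    explained = eigval_c[::-1][:2] / eigval_c.sum()

    # (ii) Dissimilarity route: classical MDS on pairwise Euclidean distances.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    J = np.eye(len(X)) - np.ones((len(X), len(X))) / len(X)
    B = -0.5 * J @ (D ** 2) @ J                    # double-centred Gram matrix
    eigval_d, eigvec_d = np.linalg.eigh(B)
    mds_scores = eigvec_d[:, ::-1][:, :2] * np.sqrt(eigval_d[::-1][:2])

    print("PCA top-2 variance share:", np.round(explained, 3))
    print("PCA scores shape:", pca_scores.shape, " MDS scores shape:", mds_scores.shape)
    ```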

  18. Adaption and validation of the Safety Attitudes Questionnaire for the Danish hospital setting

    DEFF Research Database (Denmark)

    Kristensen, Solvejg; Sabroe, Svend; Bartels, Paul

    2015-01-01

    Purpose: Measuring and developing a safe culture in health care is a focus point in creating highly reliable organizations being successful in avoiding patient safety incidents where these could normally be expected. Questionnaires can be used to capture a snapshot of an employee's perceptions [...] of patient safety culture. A commonly used instrument to measure safety climate is the Safety Attitudes Questionnaire (SAQ). The purpose of this study was to adapt the SAQ for use in Danish hospitals, assess its construct validity and reliability, and present benchmark data. Materials and methods: The SAQ [...] at the unit level in all six scale mean scores was found within the somatic and the psychiatric samples. Conclusion: SAQ-DK showed good construct validity and internal consistency reliability. SAQ-DK is potentially a useful tool for evaluating perceptions of patient safety culture in Danish hospitals.

  19. Geostatistics as a validation tool for setting ozone standards for durum wheat

    Energy Technology Data Exchange (ETDEWEB)

    De Marco, Alessandra; Screpanti, Augusto [Italian National Agency for New Technologies, Energy and the Environment (ENEA), C.R. Casaccia, Via Anguillarese 301, 00123 S. Maria di Galeria, Rome (Italy); Paoletti, Elena, E-mail: e.paoletti@ipp.cnr.i [Institute of Plant Protection, National Council of Research (IPP-CNR), Via Madonna del Piano 10, 50019 Sesto Fiorentino, Florence (Italy)

    2010-02-15

    Which is the best standard for protecting plants from ozone? To answer this question, we must validate the standards by testing biological responses vs. ambient data in the field. A validation is missing for European and USA standards, because the networks for ozone, meteorology and plant responses are spatially independent. We proposed geostatistics as validation tool, and used durum wheat in central Italy as a test. The standards summarized ozone impact on yield better than hourly averages. Although USA criteria explained ozone-induced yield losses better than European criteria, USA legal level (75 ppb) protected only 39% of sites. European exposure-based standards protected >=90%. Reducing the USA level to the Canadian 65 ppb or using W126 protected 91% and 97%, respectively. For a no-threshold accumulated stomatal flux, 22 mmol m-2 was suggested to protect 97% of sites. In a multiple regression, precipitation explained 22% and ozone explained <0.9% of yield variability. - Geostatistics can be used as a tool for testing the effects of ozone standards on plant responses in the field.
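
    One of the USA exposure metrics mentioned, W126, is an accumulated, sigmoidally weighted sum of hourly ozone concentrations. The sketch below uses the commonly cited weighting w(C) = 1/(1 + 4403*exp(-126*C)) with C in ppm, applied to simulated daytime hours; the constants and the 3-month daytime window are assumptions to verify against the regulatory definition.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Simulated hourly daytime ozone (ppm) over a 3-month growing season (92 days x 12 h).
    ozone_ppm = np.clip(rng.normal(0.045, 0.015, 92 * 12), 0, None)

    # Sigmoidal weight gives little credit to low concentrations, full credit to high ones.
    weights = 1.0 / (1.0 + 4403.0 * np.exp(-126.0 * ozone_ppm))
    w126 = np.sum(weights * ozone_ppm)          # ppm-hours accumulated over the season
    print(f"W126 = {w126:.1f} ppm-hours")
    ```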

  20. The impact of corrections for faking on the validity of noncognitive measures in selection settings.

    Science.gov (United States)

    Schmitt, Neal; Oswald, Frederick L

    2006-05-01

    In selection research and practice, there have been many attempts to correct scores on noncognitive measures for applicants who may have faked their responses somehow. A related approach with more impact would be identifying and removing faking applicants from consideration for employment entirely, replacing them with high-scoring alternatives. The current study demonstrates that under typical conditions found in selection, even this latter approach has minimal impact on mean performance levels. Results indicate about .1 SD change in mean performance across a range of typical correlations between a faking measure and the criterion. Where trait scores were corrected only for suspected faking, and applicants not removed or replaced, the minimal impact the authors found on mean performance was reduced even further. By comparison, the impact of selection ratio and test validity is much larger across a range of realistic levels of selection ratios and validities. If selection researchers are interested only in maximizing predicted performance or validity, the use of faking measures to correct scores or remove applicants from further employment consideration will produce minimal effects.
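
    The study's logic, remove applicants flagged by a faking measure, fill the vacancies with the next highest scorers, and compare mean criterion performance, can be mimicked in a small Monte Carlo sketch. The correlations and flagging rule below are arbitrary placeholders, not the paper's parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n_applicants, n_hired = 1000, 100             # selection ratio of 0.10

    # Jointly normal trait score, faking score and job performance (illustrative correlations).
    corr = np.array([[1.0, 0.3, 0.3],
                     [0.3, 1.0, 0.1],
                     [0.3, 0.1, 1.0]])
    trait, faking, perf = rng.multivariate_normal(np.zeros(3), corr, n_applicants).T

    def mean_perf_of_hired(exclude_fakers):
        """Select top-down on the trait score, optionally dropping flagged fakers first."""
        order = np.argsort(-trait)
        if exclude_fakers:
            flagged = faking > np.quantile(faking, 0.85)   # top 15% flagged as fakers
            order = order[~flagged[order]]                 # replaced by next-highest scorers
        return perf[order[:n_hired]].mean()

    print(f"mean performance, no screening  : {mean_perf_of_hired(False):.3f}")
    print(f"mean performance, fakers removed: {mean_perf_of_hired(True):.3f}")
    ```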

  1. Choice of a High-Level Fault Model for the Optimization of Validation Test Set Reused for Manufacturing Test

    Directory of Open Access Journals (Sweden)

    Yves Joannon

    2008-01-01

    Full Text Available With the growing complexity of wireless systems on chip integrating hundreds-of-millions of transistors, electronic design methods need to be upgraded to reduce time-to-market. In this paper, the test benches defined for design validation or characterization of AMS & RF SoCs are optimized and reused for production testing. Although the original validation test set allows the verification of both design functionalities and performances, this test set is not well adapted to manufacturing test due to its high execution time and high test equipment costs requirement. The optimization of this validation test set is based on the evaluation of each test vector. This evaluation relies on high-level fault modeling and fault simulation. Hence, a fault model based on the variations of the parameters of high abstraction level descriptions and its related qualification metric are presented. The choice of functional or behavioral abstraction levels is discussed by comparing their impact on structural fault coverage. Experiments are performed on the receiver part of a WCDMA transceiver. Results show that for this SoC, using behavioral abstraction level is justified for the generation of manufacturing test benches.

  2. Validation of the Brief ICF core set for low back pain from the Norwegian perspective.

    Science.gov (United States)

    Røe, C; Sveen, U; Cieza, A; Geyh, S; Bautz-Holter, E

    2009-09-01

    The aim of this study was to identify candidate categories from the International Classification of Functioning, Disability and Health (ICF) to be included in the Brief ICF Core Set for low back pain (LBP) by examining their relation to general health and functionality. This was part of an international multicentre study with 118 participating Norwegian patients with LBP. The Comprehensive ICF Core Set for LBP was filled in by health professionals. The patients reported their health-related quality of life in the Medical Outcome Study Short Form 36 (SF-36) and function in the Oswestry Disability Index. Two questions regarding the patient's general health and functioning were completed by the health professionals and the patients themselves. Regression models were developed in order to identify ICF categories explaining most of the variance of the criterion measures. Twelve ICF categories remained as significant explanatory factors according to the eight regression models, four of which were not included in a previously proposed Brief ICF Core Set for LBP. The present study complements the development of the Brief ICF Core Set for LBP, and indicates a minimum number of categories needed to explain LBP patients' functioning and health. Further elaboration of the Brief ICF Core Set for LBP with multinational data is needed.

  3. The ToMenovela - A Photograph-Based Stimulus Set for the Study of Social Cognition with High Ecological Validity.

    Science.gov (United States)

    Herbort, Maike C; Iseev, Jenny; Stolz, Christopher; Roeser, Benedict; Großkopf, Nora; Wüstenberg, Torsten; Hellweg, Rainer; Walter, Henrik; Dziobek, Isabel; Schott, Björn H

    2016-01-01

    We present the ToMenovela, a stimulus set that has been developed to provide a set of normatively rated socio-emotional stimuli showing varying numbers of characters in emotionally laden interactions for experimental investigations of (i) cognitive and (ii) affective Theory of Mind (ToM), (iii) emotional reactivity, and (iv) complex emotion judgment with respect to Ekman's basic emotions (happiness, anger, disgust, fear, sadness, surprise; Ekman and Friesen, 1975). Stimuli were generated with a focus on ecological validity and consist of 190 scenes depicting daily-life situations. Two or more of eight main characters with distinct biographies and personalities are depicted in each scene picture. To obtain an initial evaluation of the stimulus set and to pave the way for future studies in clinical populations, normative data on each stimulus of the set were obtained from a sample of 61 neurologically and psychiatrically healthy participants (31 female, 30 male; mean age 26.74 ± 5.84 years), including a visual analog scale rating of Ekman's basic emotions (happiness, anger, disgust, fear, sadness, surprise) and free-text descriptions of the content of each scene. The ToMenovela is being developed to provide standardized material of social scenes that is available to researchers in the study of social cognition. It should facilitate experimental control while keeping ecological validity high.

  4. Procedure to validate sexual stimuli: reliability and validity of a set of sexual stimuli in a sample of young colombian heterosexual males

    Directory of Open Access Journals (Sweden)

    Pablo Vallejo-Medina

    2017-01-01

    Full Text Available Penile plethysmography – or phallometric assessment – is a highly relevant evaluation for sexual health. The objective of this research is to propose a guideline for validating sexual stimuli and to validate a set of sexual stimuli for assessing “normative” sexual behavior in young Colombian heterosexual men. Six videos, each 3:15 minutes long, were used. A total of 24 men were assessed. Objective sexual arousal, the International Index of Erectile Function-5, the Self-Assessment Manikin, the Multidimensional Scale to Assess Subjective Sexual Arousal and socio-psycho-sexual questions were used. The results showed three sexual excerpts that were clearly superior to the others – a finding discordant with the researchers' subjective opinions. These three sexual excerpts generated internally consistent measurements; moreover, good indicators of external validity were observed, with statistically significant differences as expected. Furthermore, in a small sample of healthy young Colombian men, the three stimuli were shown to produce objective sexual arousal when used together.

  5. Studying primate cognition in a social setting to improve validity and welfare: a literature review highlighting successful approaches.

    Science.gov (United States)

    Cronin, Katherine A; Jacobson, Sarah L; Bonnie, Kristin E; Hopper, Lydia M

    2017-01-01

    Studying animal cognition in a social setting is associated with practical and statistical challenges. However, conducting cognitive research without disturbing species-typical social groups can increase ecological validity, minimize distress, and improve animal welfare. Here, we review the existing literature on cognitive research run with primates in a social setting in order to determine how widespread such testing is and highlight approaches that may guide future research planning. Using Google Scholar to search the terms "primate" "cognition" "experiment" and "social group," we conducted a systematic literature search covering 16 years (2000-2015 inclusive). We then conducted two supplemental searches within each journal that contained a publication meeting our criteria in the original search, using the terms "primate" and "playback" in one search and the terms "primate" "cognition" and "social group" in the second. The results were used to assess how frequently nonhuman primate cognition has been studied in a social setting (>3 individuals), to gain perspective on the species and topics that have been studied, and to extract successful approaches for social testing. Our search revealed 248 unique publications in 43 journals encompassing 71 species. The absolute number of publications has increased over years, suggesting viable strategies for studying cognition in social settings. While a wide range of species were studied they were not equally represented, with 19% of the publications reporting data for chimpanzees. Field sites were the most common environment for experiments run in social groups of primates, accounting for more than half of the results. Approaches to mitigating the practical and statistical challenges were identified. This analysis has revealed that the study of primate cognition in a social setting is increasing and taking place across a range of environments. This literature review calls attention to examples that may provide valuable

  6. Identification and Validation of a New Set of Five Genes for Prediction of Risk in Early Breast Cancer

    Directory of Open Access Journals (Sweden)

    Giorgio Mustacchi

    2013-05-01

    Full Text Available Molecular tests predicting the outcome of breast cancer patients based on gene expression levels can be used to assist in making treatment decisions after consideration of conventional markers. In this study we identified a subset of 20 mRNAs differentially regulated in breast cancer by analyzing several publicly available array gene expression datasets using the R/Bioconductor package. Using RT-qPCR we evaluated 261 consecutive invasive breast cancer cases, not selected for age, adjuvant treatment, nodal or estrogen receptor status, from paraffin-embedded sections. The dataset of biological samples was split into a training set (137 cases) and a validation set (124 cases). The gene signature was developed on the training set, and a multivariate stepwise Cox analysis selected five genes independently associated with DFS: FGF18 (HR = 1.13, p = 0.05), BCL2 (HR = 0.57, p = 0.001), PRC1 (HR = 1.51, p = 0.001), MMP9 (HR = 1.11, p = 0.08), and SERF1A (HR = 0.83, p = 0.007). These five genes were combined into a linear score (signature) weighted according to the coefficients of the Cox model, as: 0.125·FGF18 − 0.560·BCL2 + 0.409·PRC1 + 0.104·MMP9 − 0.188·SERF1A (HR = 2.7, 95% CI = 1.9–4.0, p < 0.001). The signature was then evaluated on the validation set, assessing the discrimination ability by Kaplan-Meier analysis using the same cutoffs defined on the training set to classify patients at low, intermediate or high risk of disease relapse (p < 0.001). Our signature, after further clinical validation, could be proposed as a prognostic signature for disease-free survival in breast cancer patients where the indication for adjuvant chemotherapy added to endocrine treatment is uncertain.
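
    The reported linear score is simple to apply once the five normalized expression values are available. The sketch below uses the Cox coefficients quoted in the abstract, but the patient expression values and the two risk cut-offs are placeholders, since the abstract does not report the cut-off values.

```python
# Sketch of applying the five-gene linear score from the abstract. The coefficients
# are those reported; the expression values and risk cut-offs are hypothetical.
COEFFICIENTS = {"FGF18": 0.125, "BCL2": -0.560, "PRC1": 0.409,
                "MMP9": 0.104, "SERF1A": -0.188}

def signature_score(expression):
    """Weighted sum of normalized expression values for the five genes."""
    return sum(COEFFICIENTS[g] * expression[g] for g in COEFFICIENTS)

def risk_group(score, low_cut=-0.2, high_cut=0.4):   # cut-offs are placeholders
    if score < low_cut:
        return "low"
    return "intermediate" if score < high_cut else "high"

patient = {"FGF18": 1.2, "BCL2": 0.8, "PRC1": 1.5, "MMP9": 0.9, "SERF1A": 1.1}  # example values
s = signature_score(patient)
print(f"signature score = {s:.3f}, risk group = {risk_group(s)}")
```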

  7. Phenotypic identification of Porphyromonas gingivalis validated with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry

    NARCIS (Netherlands)

    Rams, Thomas E; Sautter, Jacqueline D; Getreu, Adam; van Winkelhoff, Arie J

    OBJECTIVE: Porphyromonas gingivalis is a major bacterial pathogen in human periodontitis. This study used matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry to assess the accuracy of a rapid phenotypic identification scheme for detection of cultivable P. gingivalis.

  8. Development and validation of UHPLC-MS/MS method for determination of eight naturally occurring catechin derivatives in various tea samples and the role of matrix effects.

    Science.gov (United States)

    Svoboda, Pavel; Vlčková, Hana; Nováková, Lucie

    2015-10-10

    A complete analytical procedure combining optimized tea infusion preparation and a validated UHPLC-MS/MS method was developed for routine quantification of eight naturally occurring catechin derivatives in various tea samples. The preparation of tea infusions was optimized in terms of temperature, time and water-to-tea ratio in green, white and black teas. The catechins were analyzed using ultra-high performance liquid chromatography coupled with triple quadrupole mass spectrometry in a run of only 4 min including equilibration of the system. The UHPLC-MS/MS method was fully validated in terms of inter/intra-day precision, accuracy, linearity (r² > 0.9991), range (50-5000 ng/ml), LOD (1.5-7.5 ng/ml) and LOQ (5-25 ng/ml). Validation of the method also included determination of the matrix effects, which were evaluated in both flavored and unflavored green, white and black teas. Dilution of the resulting tea infusions appeared to be crucial for the matrix effects and also for subsequent catechin quantification in real tea samples in order to fit into the linear range of the UHPLC-MS/MS method. This complete procedure for catechin quantification was finally applied to real sample analysis represented by 70 commercial tea samples.
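
    As a rough illustration of how matrix effects of this kind are often quantified, the sketch below compares the calibration slope obtained in a diluted tea infusion with the slope obtained in pure solvent (ME% = 100 × slope_matrix / slope_solvent). This is a generic approach with made-up peak areas, not necessarily the exact evaluation used in the paper.

```python
# Generic matrix-effect estimate from two calibration curves (hypothetical data).
import numpy as np

conc = np.array([50, 250, 1000, 2500, 5000], dtype=float)            # ng/ml, within the reported range
area_solvent = np.array([1.00e4, 5.05e4, 2.01e5, 5.02e5, 1.00e6])    # hypothetical peak areas, pure solvent
area_matrix = np.array([0.82e4, 4.10e4, 1.65e5, 4.05e5, 0.81e6])     # hypothetical peak areas, diluted infusion

slope_solvent = np.polyfit(conc, area_solvent, 1)[0]
slope_matrix = np.polyfit(conc, area_matrix, 1)[0]
matrix_effect = 100 * slope_matrix / slope_solvent
print(f"matrix effect = {matrix_effect:.1f}% (values below 100% indicate ion suppression)")
```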

  9. Validation of the Male Osteoporosis Risk Estimation Score (MORES) in a primary care setting.

    Science.gov (United States)

    Cass, Alvah R; Shepherd, Angela J

    2013-01-01

    Primary care physicians are positioned to promote early recognition and treatment of men at risk for osteoporosis-related fractures; however, efficient screening strategies are needed. This study was designed to validate the Male Osteoporosis Risk Estimation Score (MORES) for identifying men at increased risk of osteoporosis. This was a blinded analysis of the MORES, administered prospectively in a cross-sectional sample of men aged 60 years or older. Participants completed a research questionnaire at an outpatient visit and had a dual-energy X-ray absorptiometry (DXA) scan to assess bone density. Sensitivity, specificity, and area under the curve (AUC) were estimated for the MORES. Effectiveness was assessed by the number needed to screen (NNS) to prevent one additional major osteoporotic fracture. A total of 346 men completed the study. The mean age was 70.2 ± 6.9 years; 76% were non-Hispanic white. Fifteen men (4.3%) had osteoporosis of the hip. The operating characteristics were sensitivity 0.80 (95% confidence interval [CI], 0.52-0.96), specificity 0.70 (95% CI, 0.64-0.74), and AUC 0.82 (95% CI, 0.71-0.92). Screening with the MORES yielded an NNS of 259 (95% CI, 192-449) to prevent one additional major osteoporotic fracture over 10 years, compared with 636 for universal screening with DXA. This study validated the MORES as an effective and efficient approach to identifying men at increased risk of osteoporosis who may benefit from a diagnostic DXA scan.

  10. The nuclear higher-order structure defined by the set of topological relationships between DNA and the nuclear matrix is species-specific in hepatocytes.

    Science.gov (United States)

    Silva-Santiago, Evangelina; Pardo, Juan Pablo; Hernández-Muñoz, Rolando; Aranda-Anzaldo, Armando

    2017-01-15

    During interphase, the nuclear DNA of metazoan cells is organized in supercoiled loops anchored to constituents of a nuclear substructure or compartment known as the nuclear matrix. The stable interactions between DNA and the nuclear matrix (NM) correspond to a set of topological relationships that define a nuclear higher-order structure (NHOS). Current evidence suggests that the NHOS is cell-type-specific. Biophysical evidence and theoretical models suggest that thermodynamic and structural constraints drive the actualization of DNA-NM interactions. However, if the topological relationships between DNA and the NM were subject to any biological constraint with functional significance, they would have to be adaptive, positively selected by natural selection, and therefore reasonably conserved, at least within closely related species. We carried out a coarse-grained, comparative evaluation of the DNA-NM topological relationships in primary hepatocytes from two closely related mammals, rat and mouse, by determining the position relative to the NM of a limited set of target sequences corresponding to highly conserved genomic regions that also represent a sample of distinct chromosome territories within the interphase nucleus. Our results indicate that the pattern of topological relationships between DNA and the NM is not conserved between the hepatocytes of the two closely related species, suggesting that the NHOS, like the karyotype, is species-specific.

  11. Development and validation of a set of German stimulus- and target words for an attachment related semantic priming paradigm.

    Directory of Open Access Journals (Sweden)

    Anke Maatz

    Full Text Available Experimental research in adult attachment theory faces the challenge of adequately activating the adult attachment system. In view of the multitude of methods employed for this purpose so far, this paper suggests making further use of the methodological advantages of semantic priming. To enable the use of such a paradigm in a German-speaking context, a set of German words belonging to the semantic categories 'interpersonal closeness', 'interpersonal distance' and 'neutral' was identified and their semantics were validated by combining production and rating methods. A total of 164 university students answered the corresponding online questionnaires. Ratings were analysed using analysis of variance (ANOVA) and cluster analysis, from which three clearly distinct groups emerged. Beyond providing validated stimulus and target words that can be used to activate the adult attachment system in a semantic priming paradigm, the results of this study point to important links between attachment and stress that call for further investigation.

  12. Development and validation of a set of German stimulus- and target words for an attachment related semantic priming paradigm.

    Science.gov (United States)

    Maatz, Anke; Strauss, Bernhard; Bär, Karl-Jürgen

    2013-01-01

    Experimental research in adult attachment theory faces the challenge of adequately activating the adult attachment system. In view of the multitude of methods employed for this purpose so far, this paper suggests making further use of the methodological advantages of semantic priming. To enable the use of such a paradigm in a German-speaking context, a set of German words belonging to the semantic categories 'interpersonal closeness', 'interpersonal distance' and 'neutral' was identified and their semantics were validated by combining production and rating methods. A total of 164 university students answered the corresponding online questionnaires. Ratings were analysed using analysis of variance (ANOVA) and cluster analysis, from which three clearly distinct groups emerged. Beyond providing validated stimulus and target words that can be used to activate the adult attachment system in a semantic priming paradigm, the results of this study point to important links between attachment and stress that call for further investigation.

  13. Development and validation of an instrument to measure collaborative goal setting in the care of patients with diabetes

    Science.gov (United States)

    Morris, Heather L; Dumenci, Levent; Lafata, Jennifer E

    2017-01-01

    Objective Despite known benefits of patient-perceived collaborative goal setting, we have a limited ability to monitor this process in practice. We developed the Patient Measure of Collaborative Goal Setting (PM-CGS) to evaluate the use of collaborative goal setting from the patient's perspective. Research design and methods A random sample of 400 patients aged 40 years or older, receiving diabetes care from the Virginia Commonwealth University Health System between 8/2012 and 8/2013, were mailed a survey containing potential PM-CGS items (n=44) as well as measures of patient demographics, perceived self-management competence, trust in their physician, and self-management behaviors. Confirmatory factor analysis was used to evaluate construct validity. External validity was evaluated via a structural equation model (SEM) that tested the association of the PM-CGS with self-management behaviors. The direct and two mediated (via trust and self-efficacy) pathways were tested. Results A total of 259 patients responded to the survey (64% response rate), of which 192 were eligible for inclusion. Results from the factor analysis supported a 37-item measure of patient-perceived CGS spanning five domains: listen and learn; share ideas; caring relationship; measurable objective; and goal achievement support (χ² = 4366.13, p < 0.001). The PM-CGS can be used to assess patients' experience of collaborative goal-setting discussions. PMID:28316793

  14. Validity of Different Activity Monitors to Count Steps in an Inpatient Rehabilitation Setting.

    Science.gov (United States)

    Treacy, Daniel; Hassett, Leanne; Schurr, Karl; Chagpar, Sakina; Paul, Serene S; Sherrington, Catherine

    2017-05-01

    Commonly used activity monitors have been shown to be accurate in counting steps in active people; however, further validation is needed in slower walking populations. To determine the validity of activity monitors for measuring step counts in rehabilitation inpatients compared with visually observed step counts. To explore the influence of gait parameters, activity monitor position, and use of walkers on activity monitor accuracy. One hundred and sixty-six inpatients admitted to a rehabilitation unit with an average walking speed of 0.4 m/s (SD 0.2) wore 16 activity monitors (7 different devices in different positions) simultaneously during 6-minute and 6-m walks. The number of steps taken during the tests was also counted by a physical therapist. Gait parameters were assessed using the GAITRite system. To analyze the influence of different gait parameters, the percentage accuracy for each monitor was graphed against various gait parameters for each activity monitor. The StepWatch, the Fitbit One worn on the ankle and the ActivPAL showed excellent agreement with the observed step count (ICC(2,1) = 0.98, 0.92, and 0.78, respectively). Other devices (Fitbit Charge, Fitbit One worn on hip, G-Sensor, Garmin Vivofit, Actigraph) showed poor agreement with the observed step count (ICC(2,1) = 0.12-0.40). Percentage agreement with the observed step count was highest for the StepWatch (mean 98%). The StepWatch and the Fitbit One worn on the ankle maintained accuracy in individuals who walked more slowly and with shorter strides, but other devices were less accurate in these individuals. There were small numbers of participants for some gait parameters. The StepWatch showed the highest accuracy and closest agreement with observed step count. This device can be confidently used by researchers for accurate measurement of step counts in inpatient rehabilitation in individuals who walk slowly. If immediate feedback is desired, the Fitbit One when worn on the ankle would be the best choice for this

  15. Validation, revision and extension of the Mantle Cell Lymphoma International Prognostic Index in a population-based setting.

    Science.gov (United States)

    van de Schans, Saskia A M; Janssen-Heijnen, Maryska L G; Nijziel, Marten R; Steyerberg, Ewout W; van Spronsen, Dick Johan

    2010-09-01

    The aim of this study was to validate the Mantle Cell Lymphoma International Prognostic Index in a population-based cohort and to study the relevance of its revisions. We analyzed data from 178 unselected patients with stage III or IV mantle cell lymphoma, registered between 1994 and 2006 in the Eindhoven Cancer Registry. Follow-up was completed up to January 1st, 2008. Multiple imputations for missing covariates were used. Validity was assessed by comparing observed survival in our cohort with predicted survival according to the original Mantle cell lymphoma International Prognostic Index. A revised model was constructed with Cox regression analysis. Discrimination was assessed by a concordance statistic ('c'). The original Mantle cell lymphoma International Prognostic Index could stratify our cohort into three distinct risk groups based on Eastern Cooperative Group performance status, white blood cell count, lactate dehydrogenase level, and age, with the discrimination being nearly as good as in the original cohort (c 0.65 versus 0.63). A modified model including performance status in five categories (0/1/2/3/4) instead of two (0-1/2-4), the presence of B-symptoms (yes/no) and sex (male/female) in addition to the original variables resulted in a better prognostic index (c 0.75). The Mantle cell lymphoma International Prognostic Index is a valid tool for risk stratification, comparison of prognosis, and treatment decisions in an unselected Dutch population-based setting. Although the index can be significantly improved, external validation on an independent data set is warranted before broad application of the modified instrument could be recommended.
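
    A concordance statistic of the kind reported here can be computed directly from survival times, censoring indicators and prognostic index values. The sketch below uses the lifelines library and simulated data; the cohort size matches the abstract, but the index values and survival times are invented.

```python
# Sketch of computing the concordance statistic ('c') for a prognostic index
# on simulated survival data (illustrative only, not the study data).
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(2)
n = 178                                    # cohort size from the abstract
risk_index = rng.normal(size=n)            # hypothetical MIPI-like prognostic scores
# Simulate survival so that a higher risk index means shorter survival time.
survival_months = rng.exponential(scale=36 * np.exp(-0.5 * risk_index))
observed_event = rng.random(n) < 0.7       # ~70% observed deaths, rest censored

# concordance_index expects scores that increase with survival time,
# so the risk index is negated.
c = concordance_index(survival_months, -risk_index, observed_event)
print(f"c-statistic = {c:.2f}")
```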

  16. Setting up a large set of protein-ligand PDB complexes for the development and validation of knowledge-based docking algorithms.

    Science.gov (United States)

    Diago, Luis A; Morell, Persy; Aguilera, Longendri; Moreno, Ernesto

    2007-08-25

    The number of algorithms available to predict ligand-protein interactions is large and ever-increasing. The number of test cases used to validate these methods is usually small and problem dependent. Recently, several databases have been released for further understanding of protein-ligand interactions, having the Protein Data Bank as backend support. Nevertheless, it appears to be difficult to test docking methods on a large variety of complexes. In this paper we report the development of a new database of protein-ligand complexes tailored for testing of docking algorithms. Using a new definition of molecular contact, small ligands contained in the 2005 PDB edition were identified and processed. The database was enriched in molecular properties. In particular, an automated typing of ligand atoms was performed. A filtering procedure was applied to select a non-redundant dataset of complexes. Data mining was performed to obtain information on the frequencies of different types of atomic contacts. Docking simulations were run with the program DOCK. We compiled a large database of small ligand-protein complexes, enriched with different calculated properties, that currently contains more than 6000 non-redundant structures. As an example to demonstrate the value of the new database, we derived a new set of chemical matching rules to be used in the context of the program DOCK, based on contact frequencies between ligand atoms and points representing the protein surface, and proved their enhanced efficiency with respect to the default set of rules included in that program. The new database constitutes a valuable resource for the development of knowledge-based docking algorithms and for testing docking programs on large sets of protein-ligand complexes. The new chemical matching rules proposed in this work significantly increase the success rate in DOCKing simulations. The database developed in this work is available at http://cimlcsext.cim.sld.cu:8080/screeningbrowser/.
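
    A minimal version of the contact-counting step can be written with a distance cutoff, as sketched below. The coordinates and the 4 Å cutoff are assumptions for illustration; the paper uses its own, more elaborate definition of molecular contact.

```python
# Illustrative ligand-protein contact counting under a simple distance-cutoff
# definition of "molecular contact". Coordinates are random stand-ins for atoms.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
ligand_xyz = rng.uniform(0, 20, size=(30, 3))     # hypothetical ligand atom coordinates (Å)
protein_xyz = rng.uniform(0, 20, size=(500, 3))   # hypothetical protein atom coordinates (Å)
cutoff = 4.0                                      # Å, assumed contact distance

# For each ligand atom, list the protein atoms within the cutoff.
pairs = cKDTree(protein_xyz).query_ball_point(ligand_xyz, r=cutoff)
n_contacts = sum(len(p) for p in pairs)
contacting_ligand_atoms = sum(1 for p in pairs if p)
print(f"{n_contacts} atom-atom contacts; "
      f"{contacting_ligand_atoms}/{len(ligand_xyz)} ligand atoms are in contact")
```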

  17. Setting up a large set of protein-ligand PDB complexes for the development and validation of knowledge-based docking algorithms

    Directory of Open Access Journals (Sweden)

    Aguilera Longendri

    2007-08-01

    Full Text Available Abstract Background The number of algorithms available to predict ligand-protein interactions is large and ever-increasing. The number of test cases used to validate these methods is usually small and problem dependent. Recently, several databases have been released for further understanding of protein-ligand interactions, having the Protein Data Bank as backend support. Nevertheless, it appears to be difficult to test docking methods on a large variety of complexes. In this paper we report the development of a new database of protein-ligand complexes tailored for testing of docking algorithms. Methods Using a new definition of molecular contact, small ligands contained in the 2005 PDB edition were identified and processed. The database was enriched in molecular properties. In particular, an automated typing of ligand atoms was performed. A filtering procedure was applied to select a non-redundant dataset of complexes. Data mining was performed to obtain information on the frequencies of different types of atomic contacts. Docking simulations were run with the program DOCK. Results We compiled a large database of small ligand-protein complexes, enriched with different calculated properties, that currently contains more than 6000 non-redundant structures. As an example to demonstrate the value of the new database, we derived a new set of chemical matching rules to be used in the context of the program DOCK, based on contact frequencies between ligand atoms and points representing the protein surface, and proved their enhanced efficiency with respect to the default set of rules included in that program. Conclusion The new database constitutes a valuable resource for the development of knowledge-based docking algorithms and for testing docking programs on large sets of protein-ligand complexes. The new chemical matching rules proposed in this work significantly increase the success rate in DOCKing simulations. The database developed in this work is

  18. Validity of verbal autopsy method to determine causes of death among adults in the urban setting of Ethiopia

    Directory of Open Access Journals (Sweden)

    Misganaw Awoke

    2012-08-01

    Full Text Available Abstract Background Verbal autopsy has been widely used to estimate causes of death in settings with inadequate vital registries, but little is known about its validity. This analysis was part of the Addis Ababa Mortality Surveillance Program and examined the validity of verbal autopsy for determining causes of death compared with hospital medical records among adults in the urban setting of Ethiopia. Methods This validation study consisted of a comparison of the verbal autopsy final diagnosis with the hospital diagnosis taken as a “gold standard”. In public and private hospitals of Addis Ababa, 20,152 adult deaths (15 years and above) were recorded between 2007 and 2010. Within the same period, a verbal autopsy was conducted for 4,776 adult deaths, of which 1,356 had occurred in one of the Addis Ababa hospitals. The verbal autopsy and hospital data sets were then merged using the variables full name of the deceased, sex, address, age, and place and date of death. We calculated sensitivity, specificity and positive predictive values with 95% confidence intervals. Results After merging, a total of 335 adult deaths were captured. For communicable diseases, the sensitivity, specificity and positive predictive value of the verbal autopsy diagnosis were 79%, 78% and 68%, respectively. For non-communicable diseases, sensitivity of the verbal autopsy diagnoses was 69%, specificity 78% and positive predictive value 79%. Regarding injury, sensitivity of the verbal autopsy diagnoses was 70%, specificity 98% and positive predictive value 83%. Higher sensitivity was achieved for HIV/AIDS and tuberculosis, but lower specificity with relatively more false positives. Conclusion These findings may indicate the potential of verbal autopsy to provide cost-effective information to guide policy on the double burden of communicable and non-communicable diseases among adults in Ethiopia. Thus, a well structured verbal autopsy method, followed by qualified physician reviews could be capable of
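
    The validation arithmetic behind these figures is a straightforward 2×2-table calculation. The sketch below computes sensitivity, specificity and positive predictive value with normal-approximation 95% confidence intervals from hypothetical counts, not from the study's data.

```python
# Sensitivity, specificity and PPV with normal-approximation 95% CIs
# from a hypothetical verbal-autopsy vs. hospital-diagnosis 2x2 table.
import math

def proportion_ci(k, n, z=1.96):
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

tp, fn, fp, tn = 52, 14, 24, 85   # hypothetical counts, not the study data

for name, k, n in [("sensitivity", tp, tp + fn),
                   ("specificity", tn, tn + fp),
                   ("PPV", tp, tp + fp)]:
    p, lo, hi = proportion_ci(k, n)
    print(f"{name}: {p:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```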

  19. Validity and reliability of a simple, low cost measure to quantify children’s dietary intake in afterschool settings

    Science.gov (United States)

    Davison, Kirsten K.; Austin, S. Bryn; Giles, Catherine; Cradock, Angie L.; Lee, Rebekka M.; Gortmaker, Steven L.

    2017-01-01

    Interest in evaluating and improving children’s diets in afterschool settings has grown, necessitating the development of feasible yet valid measures for capturing children’s intake in such settings. This study’s purpose was to test the criterion validity and cost of three unobtrusive visual estimation methods compared to a plate-weighing method: direct on-site observation using a 4-category rating scale and off-site rating of digital photographs taken on-site using 4- and 10-category scales. Participants were 111 children in grades 1–6 attending four afterschool programs in Boston, MA in December 2011. Researchers observed and photographed 174 total snack meals consumed across two days at each program. Visual estimates of consumption were compared to weighed estimates (the criterion measure) using intra-class correlations. All three methods were highly correlated with the criterion measure, ranging from 0.92–0.94 for total calories consumed, 0.86–0.94 for consumption of pre-packaged beverages, 0.90–0.93 for consumption of fruits/vegetables, and 0.92–0.96 for consumption of grains. For water, which was not pre-portioned, coefficients ranged from 0.47–0.52. The photographic methods also demonstrated excellent inter-rater reliability: 0.84–0.92 for the 4-point and 0.92–0.95 for the 10-point scale. The costs of the methods for estimating intake ranged from $0.62 per observation for the on-site direct visual method to $0.95 per observation for the criterion measure. This study demonstrates that feasible, inexpensive methods can validly and reliably measure children’s dietary intake in afterschool settings. Improving precision in measures of children’s dietary intake can reduce the likelihood of spurious or null findings in future studies. PMID:25596895

  20. Validation of Nurse Practitioner Primary Care Organizational Climate Questionnaire: A New Tool to Study Nurse Practitioner Practice Settings.

    Science.gov (United States)

    Poghosyan, Lusine; Chaplin, William F; Shaffer, Jonathan A

    2017-04-01

    Favorable organizational climate in primary care settings is necessary to expand the nurse practitioner (NP) workforce and promote their practice. Only one NP-specific tool, the Nurse Practitioner Primary Care Organizational Climate Questionnaire (NP-PCOCQ), measures NP organizational climate. We confirmed the NP-PCOCQ's factor structure and established its predictive validity. A cross-sectional survey design was used to collect data from 314 NPs in Massachusetts in 2012. Confirmatory factor analysis and regression models were used. A 4-factor model characterized the NP-PCOCQ, and the NP-PCOCQ score predicted job satisfaction (beta = .36), supporting use of the tool to measure NP organizational climate in primary care clinics. Further testing of NP-PCOCQ is needed.

  1. Predictive validity of a service-setting-based measure to identify infancy mental health problems

    DEFF Research Database (Denmark)

    Ammitzbøll, Janni; Thygesen, Lau Caspar; Holstein, Bjørn E

    2017-01-01

    Measures to identify infancy mental health problems are essential to guide interventions and reduce the risk of developmental psychopathology in early years. We investigated a new service-setting-based measure, the Copenhagen Infant Mental Health Screening (CIMHS), within the general child health... surveillance by community health nurses (CHN). The study population of 2973 infants was assessed by CIMHS at age 9-10 months. A subsample of 416 children was examined at age 1½ years, using parent interviews including the Child Behavior Checklist (CBCL 1½-5), the Checklist for Autism in Toddlers (CHAT), the Infant... logistic regression analyses adjusted and weighted to account for sampling and bias. CIMHS problems of sleep, feeding and eating, emotions, attention, communication, and language were associated with an up to fivefold increased risk of child mental disorders across the diagnostic spectrum of ICD-10...

  2. A high confidence, manually validated human blood plasma protein reference set

    DEFF Research Database (Denmark)

    Schenk, Susann; Schoenhals, Gary J; de Souza, Gustavo

    2008-01-01

    BACKGROUND: The immense diagnostic potential of human plasma has prompted great interest and effort in cataloging its contents, exemplified by the Human Proteome Organization (HUPO) Plasma Proteome Project (PPP) pilot project. Due to challenges in obtaining a reliable blood plasma protein list... the full diagnostic potential of blood plasma, we feel that there is still a need for an ultra-high confidence reference list (at least 99% confidence) of blood plasma proteins. METHODS: To address the complexity and dynamic protein concentration range of the plasma proteome, we employed a linear ion... consecutive stages of tandem mass spectrometry (MS3). The combination of MS3 with very high mass accuracy in the parent peptide allows peptide identification with orders of magnitude more confidence than that typically achieved. RESULTS: Herein we established a high confidence set of 697 blood plasma proteins...

  3. Development and Validation of a Photographic Method to Use for Dietary Assessment in School Settings.

    Science.gov (United States)

    Olafsdottir, Anna S; Hörnell, Agneta; Hedelin, Marlene; Waling, Maria; Gunnarsdottir, Ingibjörg; Olsson, Cecilia

    2016-01-01

    To develop and validate a photographic method aimed at making the assessment of dietary intake in school canteens unobtrusive, practical and feasible. The study was conducted in two elementary schools representing two different school canteen systems: the main dish served by canteen staff (Iceland), and complete self-serving (Sweden). Food items in servings and leftovers were weighed and photographed. Trained researchers estimated the weights of food items by viewing the photographs and comparing them with pictures of half and full reference portions with known weights. Plates of servings and leftovers from 48 children during five school days (n = 448 plates) and a total of 5967 food items were estimated. The researchers' estimates were then compared with the true weight of the foods and the energy content calculated. Weighed and estimated amounts correlated across meals both in grams and as total energy (0.853-0.977). The estimated energy content of the school meals was close to the true measurement from weighed records, on average 4-19 kcal below true values. Organisation of meal service affected the efficacy of the method, as seen in the difference between countries, with Iceland (served by canteen staff) having a higher rate of acceptable estimates than Sweden (self-serving): 95% vs 73% for total amount (g) in serving. Iceland more often had serving sizes between or above the half and full reference plates compared with Sweden. The photographic method provides acceptable estimates of food and energy intake in school canteens. However, greater accuracy can be expected when foods are served by canteen staff compared with self-serving.

  4. Comprehensive ICF core set for obstructive pulmonary diseases: validation of the activities and participation component through the patient's perspective.

    Science.gov (United States)

    Marques, Alda; Jácome, Cristina; Gabriel, Raquel; Figueiredo, Daniela

    2013-09-01

    This study aimed to validate the Activities and Participation component of the Comprehensive International Classification of Functioning, Disability and Health (ICF) Core Set for Obstructive Pulmonary Diseases (OPD) from the patient's perspective. A cross-sectional qualitative study was conducted with a convenience sample of outpatients with Chronic Obstructive Pulmonary Disease (COPD). Individual interviews were performed and analysed according to the meaning condensation procedure. Fifty-one participants (70.6% male) with a mean age of 69.5 ± 10.8 years were included. Twenty-one of the 24 categories contained in the Activities and Participation component of the Comprehensive ICF Core Set for OPD were identified by the participants. Additionally, seven second-level categories that are not covered by the Core Set were reported: complex interpersonal interactions, informal social relationships, family relationships, conversation, maintaining a body position, eating and preparing meals. The Activities and Participation component of the ICF Core Set for OPD was largely supported by the patient's perspective. The categories included in the ICF Core Set that were not confirmed by the participants and the additional categories that were raised need to be further investigated in order to develop an instrument according to the patient's perspective. This will promote more patient-centred assessments and rehabilitation interventions. Implications for Rehabilitation The Activities and Participation component of the Comprehensive ICF Core Set for OPD is largely supported by the perspective of patients with COPD and therefore could be used in the assessment of patients' individual and social life. The information collected through the Activities and Participation component of the Comprehensive ICF Core Set for OPD could be used to plan and assess rehabilitation interventions for patients with COPD.

  5. Convergence of Breit-Pauli spin-orbit matrix elements with basis set size and configuration interaction space: The halogen atoms F, Cl, and Br

    Energy Technology Data Exchange (ETDEWEB)

    Nicklass, Andreas [Department of Chemistry, Washington State University and the Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, Richland, Washington 99352 (United States)]; Peterson, Kirk A. [Department of Chemistry, Washington State University and the Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, Richland, Washington 99352 (United States)]; Berning, Andreas [Institut für Theoretische Chemie, Universität Stuttgart, 70550 Stuttgart (Germany)]; Werner, Hans-Joachim [Institut für Theoretische Chemie, Universität Stuttgart, 70550 Stuttgart (Germany)]; Knowles, Peter J. [School of Chemistry, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom)]

    2000-04-01

    Systematic sequences of basis sets are used to calculate the spin-orbit splittings of the halogen atoms F, Cl, and Br in the framework of first-order perturbation theory with the Breit-Pauli operator and internally contracted configuration interaction wave functions. The effects of both higher angular momentum functions and the presence of tight functions are studied. By systematically converging the one-particle basis set, an unambiguous evaluation of the effects of correlating different numbers of electrons in the Cl treatment is carried out. Correlation of the 2p electrons in chlorine increases the spin-orbit splitting by ≈80 cm⁻¹, while in bromine we observe incremental increases of 130, 145, and 93 cm⁻¹ when adding the 3d, 3p, and 2p electrons, respectively, to the set of explicitly correlated electrons. For fluorine and chlorine the final basis-set-limit, all-electron-correlated results match the experimentally observed spin-orbit splittings to within ≈5 cm⁻¹, while for bromine the Breit-Pauli operator underestimates the splitting by about 100 cm⁻¹. More extensive treatment of electron correlation results in only a slight lowering of the spin-orbit matrix elements. Thus, the discrepancy for bromine is proposed to arise from the nonrelativistic character of the underlying wave function.

  6. Using affective knowledge to generate and validate a set of emotion-related, action words

    Directory of Open Access Journals (Sweden)

    Emma Portch

    2015-07-01

    Full Text Available Emotion concepts are built through situated experience. Abstract word meaning is grounded in this affective knowledge, giving words the potential to evoke emotional feelings and reactions (e.g., Vigliocco et al., 2009). In the present work we explore whether words differ in the extent to which they evoke ‘specific’ emotional knowledge. Using a categorical approach, in which an affective ‘context’ is created, it is possible to assess whether words proportionally activate knowledge relevant to different emotional states (e.g., ‘sadness’, ‘anger’; Stevenson, Mikels & James, 2007a). We argue that this method may be particularly effective when assessing the emotional meaning of action words (e.g., Schacht & Sommer, 2009). In study 1 we use a constrained feature generation task to derive a set of action words that participants associated with six basic emotional states (see the full list in Appendix S1). Generation frequencies were taken to indicate the likelihood that the word would evoke emotional knowledge relevant to the state to which it had been paired. In study 2 a rating task was used to assess the strength of association between the six most frequently generated, or ‘typical’, action words and the corresponding emotion labels. Participants were presented with a series of sentences in which action words (typical and atypical) and labels were paired, e.g., “If you are feeling ‘sad’ how likely would you be to act in the following way?” … ‘cry.’ Findings suggest that typical associations were robust. Participants always gave higher ratings to typical vs. atypical action word and label pairings, even when (a) rating direction was manipulated (the label or verb appeared first in the sentence), and (b) the typical behaviours were to be performed by the raters themselves, or by others. Our findings suggest that emotion-related action words vary in the extent to which they evoke knowledge relevant for different emotional states. When

  7. Predicting death from kala-azar: construction, development, and validation of a score set and accompanying software.

    Science.gov (United States)

    Costa, Dorcas Lamounier; Rocha, Regina Lunardi; Chaves, Eldo de Brito Ferreira; Batista, Vivianny Gonçalves de Vasconcelos; Costa, Henrique Lamounier; Costa, Carlos Henrique Nery

    2016-01-01

    Early identification of patients at higher risk of progressing to severe disease and death is crucial for implementing therapeutic and preventive measures; this could reduce the morbidity and mortality from kala-azar. We describe a score set composed of four scales in addition to software for quick assessment of the probability of death from kala-azar at the point of care. Data from 883 patients diagnosed between September 2005 and August 2008 were used to derive the score set, and data from 1,031 patients diagnosed between September 2008 and November 2013 were used to validate the models. Stepwise logistic regression analyses were used to derive the optimal multivariate prediction models. Model performance was assessed by its discriminatory accuracy. A computational specialist system (Kala-Cal®) was developed to speed up the calculation of the probability of death based on clinical scores. The clinical prediction score showed high discrimination (area under the curve [AUC] 0.90) for distinguishing death from survival for children ≤2 years old. Performance improved after adding laboratory variables (AUC 0.93). The clinical score showed equivalent discrimination (AUC 0.89) for older children and adults, which also improved after including laboratory data (AUC 0.92). The score set also showed a high, although lower, discrimination when applied to the validation cohort. This score set and the Kala-Cal® software may help identify individuals with the greatest probability of death. The associated software may speed up the calculation of the probability of death based on clinical scores and assist physicians in decision-making.
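
    The core of such a score-based tool is a logistic transform of a weighted clinical score. The sketch below shows that transform with a hypothetical intercept, hypothetical predictor weights and a hypothetical patient; it is not the published kala-azar model.

```python
# Generic sketch of converting a logistic-regression clinical score into a
# probability of death. Intercept, weights and findings are placeholders.
import math

INTERCEPT = -4.0                                     # hypothetical
WEIGHTS = {"jaundice": 1.2, "bleeding": 1.5, "dyspnea": 1.0,
           "bacterial_infection": 0.9, "severe_anemia": 0.8}   # hypothetical points

def death_probability(findings):
    """Logistic transform: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    score = INTERCEPT + sum(WEIGHTS[k] * int(v) for k, v in findings.items())
    return 1.0 / (1.0 + math.exp(-score))

patient = {"jaundice": True, "bleeding": True, "dyspnea": False,
           "bacterial_infection": False, "severe_anemia": True}   # hypothetical case
print(f"estimated probability of death: {death_probability(patient):.1%}")
```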

  8. Revisiting depression in palliative care settings: the need to focus on clinical utility over validity.

    Science.gov (United States)

    Reeve, J L; Lloyd-Williams, M; Dowrick, C

    2008-06-01

    To review the literature on depression in palliative care patients to identify implications for the development of clinical practice and individual patient care. A qualitative review of depression prevalence studies in palliative care settings. We explore the utility of existing prevalence studies for clinical practice through testing two hypotheses: that high prevalence rates are associated with increased risk factors in study samples, and that poor methodological quality of the studies artefactually inflates prevalence estimates. Eighteen studies were identified in the search and included in this review. Risk factors may contribute to depression prevalence, but through a complex interaction of factors that makes individual risk levels hard to determine. Measurement artefact cannot, alone, account for elevated levels of depression in this population but may contribute to imprecision. The importance of organic decline as a potential confounding variable is highlighted. Future research into the causes and prevalence of depression should adopt longitudinal approaches using large samples, and consider the impact of organic disorder as an important confounding factor. Clinical practice and the care of individual patients may be better supported by the development of a prognostic index considering the predictive power of depressive symptoms and risk factors on well-being.

  9. The Musical Emotional Bursts: A validated set of musical affect bursts to investigate auditory affective processing.

    Directory of Open Access Journals (Sweden)

    Sébastien ePaquette

    2013-08-01

    Full Text Available The Musical Emotional Bursts (MEB) consist of 80 brief musical executions expressing basic emotional states (happiness, sadness and fear) and neutrality. These musical bursts were designed to be the musical analogue of the Montreal Affective Voices (MAV) – a set of brief non-verbal affective vocalizations portraying different basic emotions. The MEB consist of short (mean duration: 1.6 sec) improvisations on a given emotion or of imitations of a given MAV stimulus, played on a violin (n = 40) or a clarinet (n = 40). The MEB arguably represent a primitive form of musical emotional expression, just like the MAV represent a primitive form of vocal, nonlinguistic emotional expression. To create the MEB, stimuli were recorded from 10 violinists and 10 clarinetists, and then evaluated by 60 participants. Participants evaluated 240 stimuli (30 stimuli × 4 [3 emotions + neutral] × 2 instruments) by performing either a forced-choice emotion categorization task, a valence rating task or an arousal rating task (20 subjects per task); 40 MAVs were also used in the same session with similar task instructions. Recognition accuracy of emotional categories expressed by the MEB (n = 80) was lower than for the MAVs but still very high, with an average percent correct recognition score of 80.4%. The highest recognition accuracies were obtained for the happy clarinet (92.0%) and the fearful or sad violin (88.0% each) MEB stimuli. The MEB can be used to compare the cerebral processing of emotional expressions in music and vocal communication, or for testing affective perception in patients with communication problems.

  10. Copernicus stratospheric ozone service, 2009–2012: validation, system intercomparison and roles of input data sets

    Directory of Open Access Journals (Sweden)

    K. Lefever

    2015-03-01

    Full Text Available This paper evaluates and discusses the quality of the stratospheric ozone analyses delivered in near real time by the MACC (Monitoring Atmospheric Composition and Climate) project during the 3-year period between September 2009 and September 2012. Ozone analyses produced by four different chemical data assimilation (CDA) systems are examined and compared: the Integrated Forecast System coupled to the Model for OZone And Related chemical Tracers (IFS-MOZART); the Belgian Assimilation System for Chemical ObsErvations (BASCOE); the Synoptic Analysis of Chemical Constituents by Advanced Data Assimilation (SACADA); and the Data Assimilation Model based on Transport Model version 3 (TM3DAM). The assimilated satellite ozone retrievals differed for each system; SACADA and TM3DAM assimilated only total ozone observations, BASCOE assimilated profiles for ozone and some related species, while IFS-MOZART assimilated both types of ozone observations. All analyses deliver total column values that agree well with ground-based observations. The northern spring 2011 period is studied in more detail to evaluate the ability of the analyses to represent the exceptional ozone depletion event which happened above the Arctic in March 2011. Offline sensitivity tests are performed during this month and indicate that the differences between the forward models or the assimilation algorithms are much less important than the characteristics of the assimilated data sets. They also show that IFS-MOZART is able to deliver realistic analyses of ozone both in the troposphere and in the stratosphere, but this requires the assimilation of observations from nadir-looking instruments as well as the assimilation of profiles, which are well resolved vertically and extend into the lowermost stratosphere.

  11. Prediction potential of candidate biomarker sets identified and validated on gene expression data from multiple datasets

    Directory of Open Access Journals (Sweden)

    Karacali Bilge

    2007-10-01

    Full Text Available Abstract Background Independently derived expression profiles of the same biological condition often have few genes in common. In this study, we created populations of expression profiles from publicly available microarray datasets of cancer (breast, lymphoma and renal) samples linked to clinical information with an iterative machine learning algorithm. ROC curves were used to assess the prediction error of each profile for classification. We compared the prediction error of profiles correlated with molecular phenotype against profiles correlated with relapse-free status. The prediction error of profiles identified with supervised univariate feature selection algorithms was compared to that of profiles selected randomly from (a) all genes on the microarray platform and (b) a list of known disease-related genes (a priori selection). We also determined the relevance of expression profiles on test arrays from independent datasets, measured on either the same or different microarray platforms. Results Highly discriminative expression profiles were produced on both simulated gene expression data and expression data from breast cancer and lymphoma datasets on the basis of ER and BCL-6 expression, respectively. Use of relapse-free status to identify profiles for prognosis prediction resulted in poorly discriminative decision rules. Supervised feature selection resulted in more accurate classifications than random or a priori selection; however, the difference in prediction error decreased as the number of features increased. These results held when decision rules were applied across datasets to samples profiled on the same microarray platform. Conclusion Our results show that many gene sets predict molecular phenotypes accurately. Given this, expression profiles identified using different training datasets should be expected to show little agreement. In addition, we demonstrate the difficulty in predicting relapse directly from microarray data using supervised machine
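
    The general workflow described here (supervised univariate feature selection on a training set, ROC-based assessment on held-out data, comparison with a randomly selected gene set) can be sketched as follows. The data are simulated and the classifier choice is arbitrary; this is not the authors' pipeline.

```python
# Sketch: supervised vs. random gene-set selection evaluated by ROC AUC on held-out data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=2000, n_informative=30,
                           random_state=0)                       # stand-in expression matrix
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)

selector = SelectKBest(f_classif, k=50).fit(X_tr, y_tr)           # supervised univariate selection
clf = LogisticRegression(max_iter=1000).fit(selector.transform(X_tr), y_tr)
auc_supervised = roc_auc_score(y_te, clf.predict_proba(selector.transform(X_te))[:, 1])

rng = np.random.default_rng(0)
random_idx = rng.choice(X.shape[1], size=50, replace=False)       # random 50-gene "profile"
clf_rand = LogisticRegression(max_iter=1000).fit(X_tr[:, random_idx], y_tr)
auc_random = roc_auc_score(y_te, clf_rand.predict_proba(X_te[:, random_idx])[:, 1])

print(f"AUC, supervised selection: {auc_supervised:.2f}")
print(f"AUC, random gene set     : {auc_random:.2f}")
```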

  12. Matrix-Specific Method Validation of an Automated Most-Probable-Number System for Use in Measuring Bacteriological Quality of Grade "A" Milk Products.

    Science.gov (United States)

    Lindemann, Samantha; Kmet, Matthew; Reddy, Ravinder; Uhlig, Steffen

    2016-11-01

    The U.S. Food and Drug Administration (FDA) oversees a long-standing cooperative federal and state milk sanitation program that uses the grade "A" Pasteurized Milk Ordinance standards to maintain the safety of grade "A" milk sold in the United States. The Pasteurized Milk Ordinance requires that grade "A" milk samples be tested using validated total aerobic bacterial and coliform count methods. The objective of this project was to conduct an interlaboratory method validation study to compare performance of a film plate method with an automated most-probable-number method for total aerobic bacterial and coliform counts, using statistical approaches from international data standards. The matrix-specific validation study was administered concurrently with the FDA's annual milk proficiency test to compare method performance in five milk types. Eighteen analysts from nine laboratories analyzed test portions from 12 samples in triplicate. Statistics, including mean bias and matrix standard deviation, were calculated. Sample-specific bias of the alternative method for total aerobic count suggests that there are no large deviations within the population of samples considered. Based on analysis of 648 data points, mean bias of the alternative method across milk samples for total aerobic count was 0.013 log CFU/ml and the confidence interval for mean deviation was -0.066 to 0.009 log CFU/ml. These results indicate that the mean difference between the selected methods is small and not statistically significant. Matrix standard deviation was 0.077 log CFU/ml, showing that there is a low risk for large sample-specific bias based on milk matrix. Mean bias of the alternative method was -0.160 log CFU/ml for coliform count data. The 95% confidence interval was -0.210 to -0.100 log CFU/ml, indicating that mean deviation is significantly different from zero. The standard deviation of the sample-specific bias for coliform data was 0.033 log CFU/ml, indicating no significant effect of
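
    The headline statistics of such a matrix-specific comparison (mean bias, its 95% confidence interval, and the standard deviation of sample-specific bias) reduce to simple arithmetic on paired log-transformed counts. The sketch below uses made-up paired results, not the study data.

```python
# Mean bias and 95% CI between an alternative and a reference method
# on paired log10 CFU/ml counts (hypothetical data).
import numpy as np

rng = np.random.default_rng(4)
log_reference = rng.uniform(2, 5, size=60)                        # hypothetical film-plate results
log_alternative = log_reference + rng.normal(0.013, 0.08, 60)     # hypothetical MPN results

bias = log_alternative - log_reference
mean_bias = bias.mean()
sem = bias.std(ddof=1) / np.sqrt(bias.size)
ci = (mean_bias - 1.96 * sem, mean_bias + 1.96 * sem)
print(f"mean bias = {mean_bias:.3f} log CFU/ml, 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"sample-specific bias SD = {bias.std(ddof=1):.3f} log CFU/ml")
```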

  13. Validation of an HIV-related stigma scale among health care providers in a resource-poor Ethiopian setting

    Directory of Open Access Journals (Sweden)

    Feyissa GT

    2012-03-01

    Full Text Available Garumma Tolu Feyissa,¹ Lakew Abebe,¹ Eshetu Girma,¹ Mirkuzie Woldie² (¹Department of Health Education and Behavioral Sciences, ²Department of Health Services Management, Jimma University, Jimma, Ethiopia). Background: Stigma and discrimination (SAD) against people living with human immunodeficiency virus (HIV) are barriers affecting effective responses to HIV. Understanding the causes and extent of SAD requires the use of a psychometrically reliable and valid scale. The objective of this study was to validate an HIV-related stigma scale among health care providers in a resource-poor setting. Methods: A cross-sectional validation study was conducted in 18 health care institutions in southwest Ethiopia, from March 14, 2011 to April 14, 2011. A total of 255 health care providers responded to questionnaires asking about sociodemographic characteristics, HIV knowledge, perceived institutional support (PIS) and HIV-related SAD. Exploratory factor analysis (EFA) with principal component extraction and varimax rotation with Kaiser normalization was employed to develop scales for SAD. Eigenvalues greater than 1 were used as the criterion of extraction. Items with item-factor loadings less than 0.4 and items loading onto more than one factor were dropped. The convergent validity of the scales was tested by assessing the association with HIV knowledge, PIS, training on topics related to SAD, educational status, HIV case load, presence of an antiretroviral therapy (ART) service in the health care facility, and perceived religiosity. Results: Seven factors emerged from the four dimensions of SAD during the EFA. The factor loadings of the items ranged from 0.58 to 0.93. Cronbach's alphas of the scales ranged from 0.80 to 0.95. An in-depth knowledge of HIV, perceptions of institutional support, attendance of training on topics related to SAD, degree or higher education levels, high HIV case loads, the availability of ART in the health care facility and claiming oneself as
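
    Cronbach's alpha, the internal-consistency statistic reported for the resulting subscales, can be computed directly from an item-response matrix. The sketch below simulates Likert-type responses (the data are not from the study) and applies the standard formula.

```python
# Cronbach's alpha from a respondents x items matrix of simulated Likert responses.
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(255, 1))                     # 255 respondents, as in the study
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.8, size=(255, 6))), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```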

  14. Reliability and validity of the Conditional Goal Setting in Eating Disorders Scale (CGS-EDS) among adults with eating disorders.

    Science.gov (United States)

    Watson, Hunna J; Street, Helen; Raykos, Bronwyn C; Byrne, Susan M; Fursland, Anthea; Nathan, Paula R

    2010-04-01

    The aim of this study was to develop and validate a self-report measure of Conditional Goal Setting (CGS) for use among individuals with eating disorders, the Conditional Goal Setting in Eating Disorders Scale (CGS-EDS). The CGS-EDS assesses the degree to which an individual believes that the achievement of happiness is contingent upon the attainment of body shape and weight goals. Women with a DSM-IV diagnosed eating disorder consecutively referred to a specialist outpatient clinic (N=238) completed the CGS-EDS and self-report measures of theoretically related constructs. Exploratory factor analysis indicated a one-factor solution, which accounted for 65% of the variance. The CGS-EDS correlated positively with theoretically related measures of overvaluation of shape and weight, concern with shape and weight, dichotomous thinking, and depression. The alpha reliability of the scale was .92. The CGS-EDS is a valid and reliable measure of CGS in eating disorders and is relevant to cognitive and behavioral models of maintenance and intervention.

  15. The reliability and validity of an authentic motor skill assessment tool for early adolescent girls in an Australian school setting.

    Science.gov (United States)

    Lander, Natalie; Morgan, Philip J; Salmon, Jo; Logan, Samuel W; Barnett, Lisa M

    2017-06-01

    Proficiency in fundamental movement skills (FMS) is positively correlated with cardiorespiratory fitness, healthy weight status, and physical activity. Many instruments have been developed to assess FMS in children. It is important to accurately measure FMS competency in adolescent populations, particularly in girls, who are less proficient than boys. Yet these tests have not been validated or tested for reliability among girls in this age group. The current study tested the concurrent validity and reliability of two FMS assessment instruments, the newly developed Canadian Agility and Movement Skill Assessment (CAMSA) and the Victorian FMS Assessment from Australia, among a sample of early adolescent girls. In total, 34 Year 7 females (mean age 12.6 years) from Australia were tested and retested on each instrument in a school setting. Test-retest reliability was excellent for the overall CAMSA score (ICC=0.91) and for the isolated time and skill score components (time: ICC=0.80; skill: ICC=0.85). Test-retest reliability of the Victorian FMS Assessment was also good (ICC=0.79). There was no evidence of proportional bias in either assessment. There was evidence of strong concurrent validity (rs=0.68), and both instruments were found to be valid. However, compared to the Victorian FMS instrument, the CAMSA has the advantage of both process and product assessment, less time needed to administer and higher authenticity, and so may be an attractive alternative to the more traditional forms of FMS assessment, for use with early adolescent girls, in school settings. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  16. Studying primate cognition in a social setting to improve validity and welfare: a literature review highlighting successful approaches

    Directory of Open Access Journals (Sweden)

    Katherine A. Cronin

    2017-08-01

    Full Text Available Background Studying animal cognition in a social setting is associated with practical and statistical challenges. However, conducting cognitive research without disturbing species-typical social groups can increase ecological validity, minimize distress, and improve animal welfare. Here, we review the existing literature on cognitive research run with primates in a social setting in order to determine how widespread such testing is and highlight approaches that may guide future research planning. Survey Methodology Using Google Scholar to search the terms “primate,” “cognition,” “experiment,” and “social group,” we conducted a systematic literature search covering 16 years (2000–2015, inclusive). We then conducted two supplemental searches within each journal that contained a publication meeting our criteria in the original search, using the terms “primate” and “playback” in one search and the terms “primate,” “cognition,” and “social group” in the second. The results were used to assess how frequently nonhuman primate cognition has been studied in a social setting (>3 individuals), to gain perspective on the species and topics that have been studied, and to extract successful approaches for social testing. Results Our search revealed 248 unique publications in 43 journals encompassing 71 species. The absolute number of publications has increased over years, suggesting viable strategies for studying cognition in social settings. While a wide range of species were studied they were not equally represented, with 19% of the publications reporting data for chimpanzees. Field sites were the most common environment for experiments run in social groups of primates, accounting for more than half of the results. Approaches to mitigating the practical and statistical challenges were identified. Discussion This analysis has revealed that the study of primate cognition in a social setting is increasing and taking place across

  17. Implementing the Science Assessment Standards: Developing and validating a set of laboratory assessment tasks in high school biology

    Science.gov (United States)

    Saha, Gouranga Chandra

    Very often a number of factors, especially time, space and money, deter many science educators from using inquiry-based, hands-on, laboratory practical tasks as alternative assessment instruments in science. A shortage of valid inquiry-based laboratory tasks for high school biology has been cited. Driven by this need, this study addressed the following three research questions: (1) How can laboratory-based performance tasks be designed and developed that are doable by students for whom they are designed/written? (2) Do student responses to the laboratory-based performance tasks validly represent at least some of the intended process skills that new biology learning goals want students to acquire? (3) Are the laboratory-based performance tasks psychometrically consistent as individual tasks and as a set? To answer these questions, three tasks were used from the six biology tasks initially designed and developed by an iterative process of trial testing. Analyses of data from 224 students showed that performance-based laboratory tasks that are doable by all students require a careful and iterative process of development. Although the students demonstrated more skill in performing than in planning and reasoning, their performances at the item level were very poor for some items. Possible reasons for the poor performances have been discussed and suggestions on how to remediate the deficiencies have been made. Empirical evidence for the validity and reliability of the instrument has been presented from both the classical and the modern validity criteria points of view. Limitations of the study have been identified. Finally, implications of the study and directions for further research have been discussed.

  18. The OARSI core set of performance-based measures for knee osteoarthritis is reliable but not valid and responsive.

    Science.gov (United States)

    Tolk, J J; Janssen, R P A; Prinsen, C A C; Latijnhouwers, D A J M; van der Steen, M C; Bierma-Zeinstra, S M A; Reijman, M

    2017-11-11

    The Osteoarthritis Research Society International has identified a core set of performance-based tests of physical function for use in people with knee osteoarthritis (OA). The core set consists of the 30-second chair stand test (30-s CST), 4 × 10 m fast-paced walk test (40 m FPWT) and a stair climb test. The aim of this study was to evaluate the reliability, validity and responsiveness of these performance-based measures of physical function in knee OA patients. A prospective cohort study of 85 knee OA patients indicated for total knee arthroplasty (TKA) was performed. Construct validity and responsiveness were assessed by testing of predefined hypotheses. A subgroup (n = 30) underwent test-retest measurements for reliability analysis. The Oxford Knee Score, Knee injury and Osteoarthritis Outcome Score-Physical Function Short Form, pain during activity score and knee extensor strength were used as comparator instruments. Measurements were obtained at baseline and 12 months after TKA. Appropriate test-retest reliability was found for all three tests. The intraclass correlation coefficient (ICC) for the 30-s CST was 0.90 (95% CI 0.68; 0.96), for the 40 m FPWT 0.93 (0.85; 0.96) and for the 10-step stair climb test (10-step SCT) 0.94 (0.89; 0.97). Adequate construct validity could not be confirmed for the three tests. For the 30-s CST, 42% of the predefined hypotheses were confirmed; for the 40 m FPWT, 27%; and for the 10-step SCT, 36%. The 40 m FPWT was found to be responsive, with 75% of predefined hypotheses confirmed, whereas the responsiveness of the other tests could not be confirmed. For the 30-s CST and 10-step SCT, only 50% of hypotheses were confirmed. The three performance-based tests had good reliability, but poor construct validity and responsiveness in the assessment of function for the domains sit-to-stand movement, walking short distances and stair negotiation. The findings of the present study do not

  19. An Exploratory Factor Analysis and Construct Validity of the Resident Choice Assessment Scale with Paid Carers of Adults with Intellectual Disabilities and Challenging Behavior in Community Settings

    Science.gov (United States)

    Ratti, Victoria; Vickerstaff, Victoria; Crabtree, Jason; Hassiotis, Angela

    2017-01-01

    Introduction: The Resident Choice Assessment Scale (RCAS) is used to assess choice availability for adults with intellectual disabilities (ID). The aim of the study was to explore the factor structure, construct validity, and internal consistency of the measure in community settings to further validate this tool. Method: 108 paid carers of adults…

  20. Validity and predictive ability of the juvenile arthritis disease activity score based on CRP versus ESR in a Nordic population-based setting

    DEFF Research Database (Denmark)

    Nordal, E B; Zak, M; Aalto, K

    2012-01-01

    To compare the juvenile arthritis disease activity score (JADAS) based on C reactive protein (CRP) (JADAS-CRP) with JADAS based on erythrocyte sedimentation rate (ESR) (JADAS-ESR) and to validate JADAS in a population-based setting.

  1. Discriminating real victims from feigners of psychological injury in gender violence: Validating a protocol for forensic setting

    Directory of Open Access Journals (Sweden)

    Ramon Arce

    2009-07-01

    Full Text Available Standard clinical assessment of psychological injury does not provide valid evidence in forensic settings, and screening of genuine from feigned complaints must be undertaken prior to the diagnosis of mental state (American Psychological Association, 2002). Whereas psychological injury is post-traumatic stress disorder (PTSD), a clinical diagnosis may encompass other nosologies (e.g., depression and anxiety). The assessment of psychological injury in forensic contexts requires a multimethod approach consisting of a psychometric measure and an interview. To assess the efficacy of the multimethod approach in discriminating real from false victims, 25 real victims of gender violence and 24 feigners were assessed using the Symptom Checklist-90-Revised (SCL-90-R), a recognition task, and a forensic clinical interview, a knowledge task. The results revealed that feigners reported more clinical symptoms on the SCL-90-R than real victims. Moreover, the feigning indicators on the SCL-90-R (GSI, PST, and PSDI) were higher in feigners, but not sufficient to provide a screening test for invalidating feigning protocols. In contrast, real victims reported more clinical symptoms related to PTSD in the forensic clinical interview than feigners. Notwithstanding, in the forensic clinical interview feigners were able to feign PTSD, which was not detected by the analysis of feigning strategies. The combination of both measures and their corresponding validity controls enabled the discrimination of real victims from feigners. Hence, a protocol for discriminating the psychological sequelae of real victims from those of feigners of gender violence is described.

  2. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment throughout the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in the previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment throughout the development process.

  3. Support Vector Data Description Model to Map Specific Land Cover with Optimal Parameters Determined from a Window-Based Validation Set.

    Science.gov (United States)

    Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang

    2017-04-26

    This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint by the outlier pixels which were located neighboring to the target class in the spectral feature space. The overall accuracies for wheat and bare land achieved were as high as 89.25% and 83.65%, respectively. However, target class was underestimated because the validation set covers only a small fraction of the heterogeneous spectra of the target class. The different window sizes were then tested to acquire more wheat pixels for validation set. The results showed that classification accuracy increased with the increasing window size and the overall accuracies were higher than 88% at all window size scales. Moreover, WVS-SVDD showed much less sensitivity to the untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits using the optimal parameters, tradeoff coefficient (C) and kernel width (s), in mapping homogeneous specific land cover.
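
    The core of the WVS-SVDD approach described above is a search for the parameter pair, tradeoff coefficient (C) and kernel width (s), that best separates target pixels from neighbouring outlier pixels in the window-based validation set. The sketch below illustrates that selection loop only; scikit-learn has no SVDD class, so OneClassSVM with an RBF kernel is used as a closely related stand-in (nu playing the role of C, gamma the role of s), and all data arrays are hypothetical stand-ins for pixel spectra.

```python
# Minimal parameter-selection sketch, not the authors' code: OneClassSVM is a
# stand-in for SVDD (nu ~ tradeoff coefficient C, gamma ~ kernel width s), and
# all arrays are randomly generated placeholders for pixel spectra.
import numpy as np
from sklearn.svm import OneClassSVM

def select_parameters(X_train, X_val, y_val, nus, gammas):
    """Pick (nu, gamma) that best separates target (1) from outlier (0) pixels."""
    best, best_acc = None, -np.inf
    for nu in nus:
        for gamma in gammas:
            model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
            pred = (model.predict(X_val) == 1).astype(int)  # +1 = inside the hypersphere
            acc = np.mean(pred == y_val)
            if acc > best_acc:
                best, best_acc = (nu, gamma), acc
    return best, best_acc

rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(200, 6))            # target-class training pixels
X_val = np.vstack([rng.normal(0.0, 1.0, size=(50, 6)),   # validation: target pixels
                   rng.normal(2.5, 1.0, size=(50, 6))])  # plus neighbouring outliers
y_val = np.array([1] * 50 + [0] * 50)

params, acc = select_parameters(X_train, X_val, y_val,
                                nus=[0.05, 0.1, 0.2], gammas=[0.05, 0.1, 0.5])
print(params, round(acc, 3))
```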

  4. Support Vector Data Description Model to Map Specific Land Cover with Optimal Parameters Determined from a Window-Based Validation Set

    Directory of Open Access Journals (Sweden)

    Jinshui Zhang

    2017-04-01

    Full Text Available This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint by the outlier pixels which were located neighboring to the target class in the spectral feature space. The overall accuracies for wheat and bare land achieved were as high as 89.25% and 83.65%, respectively. However, target class was underestimated because the validation set covers only a small fraction of the heterogeneous spectra of the target class. The different window sizes were then tested to acquire more wheat pixels for validation set. The results showed that classification accuracy increased with the increasing window size and the overall accuracies were higher than 88% at all window size scales. Moreover, WVS-SVDD showed much less sensitivity to the untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits using the optimal parameters, tradeoff coefficient (C) and kernel width (s), in mapping homogeneous specific land cover.

  5. Safety and performance of cohesive polydensified matrix hyaluronic acid fillers with lidocaine in the clinical setting – an open-label, multicenter study

    Directory of Open Access Journals (Sweden)

    Kühne U

    2016-10-01

    Full Text Available Ulrich Kühne,1 Jørgen Esmann,2 Dennis von Heimburg,3 Matthias Imhof,1 Petra Weissenberger,4 Gerhard Sattler,5 On behalf of the BALIA Study Group 1Aesthetische Dermatologie im Medico Palais, Bad Soden, Germany; 2Jørgen Esmann Aps, Hellerup, Denmark; 3Praxisklinik Kaiserplatz, Frankfurt am Main, Germany; 4Corporate Clinical Research, Merz Pharmaceuticals GmbH, Frankfurt am Main, Germany; 5Rosenparkklinik GmbH, Darmstadt, Germany Abstract: Cohesive polydensified matrix (CPM®) hyaluronic acid fillers are now available with or without lidocaine. The aim of this study was to investigate the safety and performance of CPM® fillers with lidocaine in the clinical setting. In an open-label, prospective, postmarketing study, 108 patients from seven sites in Germany and Denmark were treated with one or more lidocaine-containing CPM® fillers. Performance was assessed using the Merz Aesthetics Scales® (MAS). Pain was rated on an 11-point visual analog scale. Patients’ and physicians’ satisfaction as well as adverse events were recorded. Improvements of ≥1 point on MAS immediately after and 17 days posttreatment were observed in ~90% of patients compared with baseline. All investigators assessed ejection force, product positioning, and performance as similar or superior to the respective nonlidocaine products. Overall, 94% of investigators were satisfied with the esthetic outcomes and were willing to continue using the products. All patients except one were satisfied with the results, and all were willing to repeat the treatment. Mean pain scores were low during (<3.0) and after injection (<0.6). Except for one case of bruising, all adverse events were mild to moderate. CPM® fillers with lidocaine are safe and effective for a wide range of esthetic facial indications. Keywords: cohesive polydensified matrix, dermal fillers, Belotero, Esthélis, Fortélis, Modélis

  6. Computation of leaky guided waves dispersion spectrum using vibroacoustic analyses and the Matrix Pencil Method: a validation study for immersed rectangular waveguides.

    Science.gov (United States)

    Mazzotti, M; Bartoli, I; Castellazzi, G; Marzani, A

    2014-09-01

    The paper aims at validating a recently proposed Semi Analytical Finite Element (SAFE) formulation coupled with a 2.5D Boundary Element Method (2.5D BEM) for the extraction of dispersion data in immersed waveguides of generic cross-section. To this end, three-dimensional vibroacoustic analyses are carried out on two waveguides of square and rectangular cross-section immersed in water using the commercial Finite Element software Abaqus/Explicit. Real wavenumber and attenuation dispersive data are extracted by means of a modified Matrix Pencil Method. It is demonstrated that the results obtained using the two techniques are in very good agreement. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Development of three multiplex PCR primer sets for ark shell ( Scapharca broughtonii) and their validation in parentage assignment

    Science.gov (United States)

    Li, Ning; Li, Qi; Kong, Lingfeng; Yu, Hong

    2016-04-01

    Scapharca broughtonii is a commercially important and over-exploited species. In order to investigate its genetic diversity and population structure, 43 novel polymorphic microsatellites were isolated and characterized. The number of alleles per locus ranged from 3 to 22 with an average of 6.93, and the observed and expected heterozygosities varied between 0.233 and 1.000, and 0.250 and 0.953, with averages of 0.614 and 0.707, respectively. Three highly informative multiplex PCRs were developed from nine of those microsatellites for S. broughtonii. We evaluated and validated these multiplex PCRs in 8 full-sib families. The average polymorphism information content (PIC) was 0.539. The frequency of null alleles was estimated at 3.13% of all segregating alleles, based on a within-family analysis of Mendelian segregation patterns. Parentage analysis of real offspring demonstrated that 100% of offspring were unambiguously allocated to a pair of parents based on the 3 multiplex sets. The 43 microsatellite loci with high variability will be helpful for the analysis of population genetics and conservation of the wild stock of S. broughtonii. The 3 sets of multiplex PCRs could be an important tool for pedigree reconstruction, population genetic analysis and brood stock management.

  8. Validating Satellite Radar Altimetry Estimates of Antarctic sea ice Thickness Using the ASPeCt Data set

    Science.gov (United States)

    Giles, K. A.; Laxon, S. W.; Worby, T.

    2006-12-01

    Measurements of sea ice freeboard from spaceborne radar altimeters have been used to calculate Arctic sea ice thickness on a basin-wide scale during the winter. The same technique has the potential to be used in the Antarctic. The technique used to convert freeboard to thickness assumes hydrostatic equilibrium and uses estimates of snow depth and density and water and ice density from climatology. The nature of the Arctic climate means that the sea ice has a positive freeboard and that it becomes entirely snow free during the summer months, which simplifies the analysis of the radar return from the sea ice. However, in the Antarctic the situation may be more complicated, with negative ice freeboards and flooded and refrozen snow resulting in inaccurate estimates of sea ice freeboard and therefore ice thickness. We present, for the first time, a comparison of estimates of Antarctic sea ice thickness calculated from satellite radar altimetry measurements of sea ice freeboard with ship observations of sea ice thickness from the ASPeCt data set. We describe both the satellite and ship-borne estimates of Antarctic sea ice thickness, the method used to compare the two data sets, and the outcome of the validation. We also assess the future potential of satellite radar altimetry to provide sea ice thickness in the Antarctic.

  9. Identification and validation of a new set of five genes for prediction of risk in early breast cancer.

    Science.gov (United States)

    Mustacchi, Giorgio; Sormani, Maria Pia; Bruzzi, Paolo; Gennari, Alessandra; Zanconati, Fabrizio; Bonifacio, Daniela; Monzoni, Adriana; Morandi, Luca

    2013-05-06

    Molecular tests predicting the outcome of breast cancer patients based on gene expression levels can be used to assist in making treatment decisions after consideration of conventional markers. In this study we identified a subset of 20 mRNAs differentially regulated in breast cancer by analyzing several publicly available gene expression array data sets using the R/Bioconductor package. Using RT-qPCR we evaluated 261 consecutive invasive breast cancer cases, not selected for age, adjuvant treatment, nodal and estrogen receptor status, from paraffin-embedded sections. The biological samples dataset was split into a training set (137 cases) and a validation set (124 cases). The gene signature was developed on the training set, and a multivariate stepwise Cox analysis selected five genes independently associated with DFS: FGF18 (HR = 1.13, p = 0.05), BCL2 (HR = 0.57, p = 0.001), PRC1 (HR = 1.51, p = 0.001), MMP9 (HR = 1.11, p = 0.08), SERF1a (HR = 0.83, p = 0.007). These five genes were combined into a linear score (signature) weighted according to the coefficients of the Cox model, as: 0.125 × FGF18 - 0.560 × BCL2 + 0.409 × PRC1 + 0.104 × MMP9 - 0.188 × SERF1A (HR = 2.7, 95% CI = 1.9-4.0) in breast cancer patients where the indication for adjuvant chemotherapy added to endocrine treatment is uncertain.
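
    The linear signature above is simply a weighted sum of the five normalized expression values, with the weights taken from the Cox coefficients quoted in the abstract. A minimal sketch of that calculation follows; the input expression values and any normalization details are purely illustrative.

```python
# Sketch of how the five-gene linear score in the abstract combines expression
# values; the coefficients are those quoted above, but the sample values below
# and any normalization are invented for illustration only.
COEFFICIENTS = {
    "FGF18": 0.125,
    "BCL2": -0.560,
    "PRC1": 0.409,
    "MMP9": 0.104,
    "SERF1A": -0.188,
}

def signature_score(expression):
    """Weighted sum of (normalized) expression values for the five genes."""
    return sum(COEFFICIENTS[gene] * expression[gene] for gene in COEFFICIENTS)

# Hypothetical normalized RT-qPCR expression values for one tumour sample
sample = {"FGF18": 1.2, "BCL2": 2.8, "PRC1": 0.9, "MMP9": 1.5, "SERF1A": 2.1}
print(round(signature_score(sample), 3))
```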

  10. Development and validation of a reverse phase liquid chromatography method for the simultaneous quantification of eserine and pralidoxime chloride in drugs-in-adhesive matrix type transdermal patches.

    Science.gov (United States)

    Banerjee, S; Chattopadhyay, P; Ghosh, A; Kaity, S; Veer, V

    2013-09-01

    In the present study, a simple, precise, specific, fast, accurate and reliable reverse phase high performance liquid chromatographic (RP-HPLC) method has been developed and validated for the simultaneous determination and quantification of eserine and pralidoxime chloride in drugs-in-adhesive matrix type transdermal patches. The chromatographic separation was achieved on a C18 column, using a mobile phase consisting of acetonitrile: 10 mM potassium dihydrogen phosphate, 10 mM heptane-1-sulfonic acid sodium salt monohydrate in water (30:70, v/v), adjusted to pH 3.0 with ortho-phosphoric acid. The flow rate was 1.0 mL/min with UV detection at 238 nm. The method was validated according to the International Conference on Harmonization (ICH) guidelines. The calibration curves were linear over the concentration ranges of 0.5-10 μg/mL for eserine and 5-25 μg/mL for 2-PAM. The relative standard deviation for precision was less than 2.0%. Limit of detection values for eserine and 2-PAM were 0.018 µg/mL and 0.008 µg/mL, respectively. The limits of quantification for eserine and 2-PAM were 0.055 µg/mL and 0.026 µg/mL, respectively. The developed method was applied to the routine analysis of these 2 drugs in drugs-in-adhesive matrix type transdermal patches in order to evaluate the drug content of different formulations. It could also be used with reliability for the determination of the drugs in other pharmaceutical dosage forms. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Oral/dental items in the resident assessment instrument – minimum Data Set 2.0 lack validity: results of a retrospective, longitudinal validation study

    Directory of Open Access Journals (Sweden)

    Matthias Hoben

    2016-10-01

    Full Text Available Abstract Background Oral health in nursing home residents is poor. Robust, mandated assessment tools such as the Resident Assessment Instrument – Minimum Data Set (RAI-MDS 2.0) are key to monitoring and improving quality of oral health care in nursing homes. However, psychometric properties of RAI-MDS 2.0 oral/dental items have been challenged and criterion validity of these items has never been assessed. Methods We used 73,829 RAI-MDS 2.0 records (13,118 residents), collected in a stratified random sample of 30 urban nursing homes in Western Canada (2007–2012). We derived a subsample of all residents (n = 2,711) with an admission and two or more subsequent annual assessments. Using Generalized Estimating Equations, adjusted for known covariates of nursing home residents’ oral health, we assessed the association of oral/dental problems with time, dentate status, dementia, debris, and daily cleaning. Results Prevalence of oral/dental problems fluctuated (4.8%–5.6%) with no significant differences across time. This range of prevalence is substantially smaller than the ones reported by studies using clinical assessments by dental professionals. Denture wearers were less likely than dentate residents to have oral/dental problems (adjusted odds ratio [OR] = 0.458, 95% confidence interval [CI]: 0.308, 0.680). Residents lacking teeth and not wearing dentures had higher odds than dentate residents of oral/dental problems (adjusted OR = 2.718, 95% CI: 1.845, 4.003). Oral/dental problems were more prevalent in persons with debris (OR = 2.187, 95% CI: 1.565, 3.057). Of the other variables assessed, only age at assessment was significantly associated with oral/dental problems. Conclusions Robust, reliable RAI-MDS 2.0 oral health indicators are vital to monitoring and improving oral health related quality and safety in nursing homes. However, severe underdetection of oral/dental problems and lack of association of well-known oral

  12. Analysis of Fundus Fluorescein Angiogram Based on the Hessian Matrix of Directional Curvelet Sub-bands and Distance Regularized Level Set Evolution.

    Science.gov (United States)

    Soltanipour, Asieh; Sadri, Saeed; Rabbani, Hossein; Akhlaghi, Mohammad Reza

    2015-01-01

    This paper presents a new procedure for automatic extraction of the blood vessels and optic disk (OD) in fundus fluorescein angiograms (FFA). In order to extract blood vessel centerlines, the vessel extraction algorithm starts with the analysis of directional images resulting from sub-bands of the fast discrete curvelet transform (FDCT) in similar directions and different scales. For this purpose, each directional image is processed using information from the first-order derivative and the eigenvalues obtained from the Hessian matrix. The final vessel segmentation is obtained using a simple region growing algorithm applied iteratively, which merges centerline images with the contents of images resulting from a modified top-hat transform followed by bit plane slicing. After extracting blood vessels from the FFA image, candidate regions for the OD are enhanced by removing blood vessels from the FFA image, using multi-structure-element morphology and modification of FDCT coefficients. Then, a Canny edge detector and Hough transform are applied to the reconstructed image to extract the boundary of candidate regions. At the next step, the information of the main arc of the retinal vessels surrounding the OD region is used to extract the actual location of the OD. Finally, the OD boundary is detected by applying distance regularized level set evolution. The proposed method was tested on FFA images from the angiography unit of Isfahan Feiz Hospital, containing 70 FFA images from different diabetic retinopathy stages. The experimental results show an accuracy of more than 93% for vessel segmentation and more than 87% for OD boundary extraction.
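
    The centerline step above rests on eigenvalue analysis of the Hessian matrix, the same idea behind standard vesselness filters. The sketch below is only a generic illustration of that idea, not the authors' curvelet-based pipeline: it applies scikit-image's Frangi filter (built from Hessian eigenvalues at several scales) to a hypothetical fundus image and thresholds the result as a crude stand-in for the region-growing stage.

```python
# Illustrative sketch only: the paper couples curvelet sub-bands with Hessian
# eigenvalue analysis, whereas this snippet shows the generic Hessian-based
# vesselness idea using scikit-image's Frangi filter on a hypothetical image.
import numpy as np
from skimage import io, filters

def enhance_vessels(path, sigmas=range(1, 6)):
    """Return a vesselness map where elongated (vessel-like) structures are bright."""
    image = io.imread(path, as_gray=True).astype(float)
    # Frangi vesselness: built from eigenvalues of the Hessian at several scales
    vesselness = filters.frangi(image, sigmas=sigmas, black_ridges=False)
    # Simple threshold as a stand-in for the region-growing step in the paper
    mask = vesselness > filters.threshold_otsu(vesselness)
    return vesselness, mask

# vesselness, mask = enhance_vessels("ffa_frame.png")  # hypothetical file name
```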

  13. Validation of a pediatric early warning system for hospitalized pediatric oncology patients in a resource-limited setting.

    Science.gov (United States)

    Agulnik, Asya; Méndez Aceituno, Alejandra; Mora Robles, Lupe Nataly; Forbes, Peter W; Soberanis Vasquez, Dora Judith; Mack, Ricardo; Antillon-Klussmann, Federico; Kleinman, Monica; Rodriguez-Galindo, Carlos

    2017-12-15

    Pediatric oncology patients are at high risk of clinical deterioration, particularly in hospitals with resource limitations. The performance of pediatric early warning systems (PEWS) to identify deterioration has not been assessed in these settings. This study evaluates the validity of PEWS to predict the need for unplanned transfer to the pediatric intensive care unit (PICU) among pediatric oncology patients in a resource-limited hospital. A retrospective case-control study comparing the highest documented and corrected PEWS score before unplanned PICU transfer in pediatric oncology patients (129 cases) with matched controls (those not requiring PICU care) was performed. Documented and corrected PEWS scores were found to be highly correlated with the need for PICU transfer (area under the receiver operating characteristic, 0.940 and 0.930, respectively). PEWS scores increased 24 hours prior to unplanned transfer (P = .0006). In cases, organ dysfunction at the time of PICU admission correlated with maximum PEWS score (correlation coefficient, 0.26; P = .003), patients with PEWS results ≥4 had a higher Pediatric Index of Mortality 2 (PIM2) (P = .028), and PEWS results were higher in patients with septic shock (P = .01). The PICU mortality rate was 17.1%; nonsurvivors had higher mean PEWS scores before PICU transfer (P = .0009). A single-point increase in the PEWS score increased the odds of mechanical ventilation or vasopressors within the first 24 hours and during PICU admission (odds ratio 1.3-1.4). PEWS accurately predicted the need for unplanned PICU transfer in pediatric oncology patients in this resource-limited setting, with abnormal results beginning 24 hours before PICU admission and higher scores predicting the severity of illness at the time of PICU admission, need for PICU interventions, and mortality. These results demonstrate that PEWS aid in the identification of clinical deterioration in this high-risk population, regardless of a hospital

  14. RelMon: A General Approach to QA, Validation and Physics Analysis through Comparison of large Sets of Histograms

    Science.gov (United States)

    Piparo, Danilo

    2012-12-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, therefore improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparisons rankings are available as well as all the plots of the single histogram overlays. The comparison procedure is fully automatic and scales smoothly towards ensembles of millions of histograms. Examples of RelMon utilisation within the regular workflows of the CMS collaboration and the advantages therewith obtained are described. Its interplay with the data quality monitoring infrastructure is illustrated as well as its role in the QA of the event reconstruction code, its integration in the CMS software release cycle process, CMS user data analysis and dataset validation.
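
    RelMon itself is built on ROOT; as a language-neutral illustration of the underlying idea, the sketch below scores a single pair of binned histograms with a chi-squared test and a KS-style statistic, the kind of per-pair compatibility score that an automated tool would then aggregate and rank over very large ensembles. The bin contents are randomly generated stand-ins.

```python
# Generic sketch of per-pair histogram compatibility scoring; not RelMon code.
# Bin contents below are Poisson-distributed placeholders.
import numpy as np
from scipy import stats

def compare_histograms(counts_a, counts_b):
    """Return a chi-squared p-value and a KS-style distance for two binned histograms."""
    counts_a = np.asarray(counts_a, dtype=float)
    counts_b = np.asarray(counts_b, dtype=float)
    # Chi-squared test: scale the expected histogram to the same total counts
    scale = counts_a.sum() / counts_b.sum()
    chi2_p = stats.chisquare(counts_a, counts_b * scale).pvalue
    # KS-style statistic on the empirical CDFs implied by the bin contents
    cdf_a = np.cumsum(counts_a) / counts_a.sum()
    cdf_b = np.cumsum(counts_b) / counts_b.sum()
    ks_stat = np.max(np.abs(cdf_a - cdf_b))
    return chi2_p, ks_stat

rng = np.random.default_rng(1)
reference = rng.poisson(100, size=50)   # reference release histogram
candidate = rng.poisson(102, size=50)   # candidate release histogram
print(compare_histograms(reference, candidate))
```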

  15. One-two-triage: validation and reliability of a novel triage system for low-resource settings.

    Science.gov (United States)

    Khan, Ayesha; Mahadevan, S V; Dreyfuss, Andrea; Quinn, James; Woods, Joan; Somontha, Koy; Strehlow, Matthew

    2016-10-01

    To validate and assess reliability of a novel triage system, one-two-triage (OTT), that can be applied by inexperienced providers in low-resource settings. This study was a two-phase prospective, comparative study conducted at three hospitals. Phase I assessed criterion validity of OTT on all patients arriving at an American university hospital by comparing agreement among three methods of triage: OTT, Emergency Severity Index (ESI) and physician-defined acuity (the gold standard). Agreement was reported in normalised and raw-weighted Cohen κ using two different scales for weighting, Expert-weighted and triage-weighted κ. Phase II tested reliability, reported in Fleiss κ, of OTT using standardised cases among three groups of providers at an urban and rural Cambodian hospital and the American university hospital. Normalised for prevalence of patients in each category, OTT and ESI performed similarly well for expert-weighted κ (OTT κ=0.58, 95% CI 0.52 to 0.65; ESI κ=0.47, 95% CI 0.40 to 0.53) and triage-weighted κ (κ=0.54, 95% CI 0.48 to 0.61; ESI κ=0.57, 95% CI 0.51 to 0.64). Without normalising, agreement with gold standard was less for both systems but performance of OTT and ESI remained similar, expert-weighted (OTT κ=0.57, 95% CI 0.52 to 0.62; ESI κ=0.6, 95% CI 0.58 to 0.66) and triage-weighted (OTT κ=0.31, 95% CI 0.25 to 0.38; ESI κ=0.41, 95% CI 0.35 to 0.4). In the reliability phase, all triagers showed fair inter-rater agreement, Fleiss κ (κ=0.308). OTT can be reliably applied and performs as well as ESI compared with gold standard, but requires fewer resources and less experience. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
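
    The agreement statistics quoted above are weighted Cohen's kappa (two ratings compared against each other or a gold standard) and Fleiss' kappa (several raters). The snippet below only illustrates how such coefficients are computed with standard libraries on invented triage ratings; it does not reproduce the study's expert-weighted or triage-weighted scales.

```python
# Hedged sketch of the agreement statistics mentioned above, on invented data:
# linear-weighted Cohen's kappa for two raters and Fleiss' kappa for three.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Two raters assigning acuity levels 1-3 to the same 10 patients (hypothetical)
triage_a = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
triage_b = [1, 2, 3, 3, 1, 2, 2, 3, 1, 1]
print("weighted Cohen kappa:", cohen_kappa_score(triage_a, triage_b, weights="linear"))

# Three raters for Fleiss' kappa: rows = subjects, columns = raters
ratings = np.array([[1, 1, 2], [2, 2, 2], [3, 3, 2], [1, 2, 1], [3, 3, 3]])
table, _ = aggregate_raters(ratings)   # convert to subject-by-category counts
print("Fleiss kappa:", fleiss_kappa(table))
```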

  16. Validity and reliability of the minimum basic data set in estimating nosocomial acute gastroenteritis caused by rotavirus

    Directory of Open Access Journals (Sweden)

    Olga Redondo-González

    2015-03-01

    Full Text Available Introduction: Rotavirus is the principal cause of nosocomial acute gastroenteritis (NAGE) under 5 years of age. The objective is to evaluate the validity and reliability of the minimum basic data set (MBDS) in estimating the NAGE caused by rotavirus (NAGER) and to analyze any changes during the three years that the Rotarix® and Rotateq® vaccines were used in Spain. Material and methods: A descriptive, retrospective study was carried out in the University Hospital of Guadalajara (UHG), Spain, between 2003-2009 using the MBDS, positive microbiological results for rotavirus (PMRs), and medical histories. Three methods of estimation were used: (1) an ICD-9-CM code 008.61 in the secondary diagnosis fields (DIAG2) of the MBDS; (2) method 1 and/or PMRs with a current or recent hospitalization; and (3) the reference method, or method 2 contrasted with patient medical histories. The validity of methods 1 and 2 was determined in terms of sensitivity, specificity, predictive values and likelihood ratios (LRs), along with their agreement with method 3 (Kappa coefficient). In addition, the incidence rate ratio of the NAGER rate in 2007-2009 (commercialization period of both vaccines) was calculated with respect to 2003-2005 (pre-commercialization period). Results: Method 1 identified 65 records with a DIAG2 of 008.61. Method 2 found 62 probable cases, and the reference method, 49 true cases. The sensitivity of the MBDS was 67%, the positive predictive value was 51%, and both the negative LR (LR-) and reliability were moderate (LR- 0.33, Kappa coefficient 0.58). During 2007-2009, the NAGER decreased by 5 cases per 10³ hospitalizations and by 9 per 10⁴ days of hospitalization. Method 2 overestimated both the decline in incidence, by 2 per 10³ hospitalizations, and the decreased risk per day of stay, by 10%. The MBDS found no differences between the two three-year periods, but, like method 2, showed an excellent level of diagnostic evidence (LR+ 67). Conclusion: The MBDS taken together with

  17. Studies Related to the Oregon State University High Temperature Test Facility: Scaling, the Validation Matrix, and Similarities to the Modular High Temperature Gas-Cooled Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Richard R. Schultz; Paul D. Bayless; Richard W. Johnson; William T. Taitano; James R. Wolf; Glenn E. McCreery

    2010-09-01

    The Oregon State University (OSU) High Temperature Test Facility (HTTF) is an integral experimental facility that will be constructed on the OSU campus in Corvallis, Oregon. The HTTF project was initiated by the U.S. Nuclear Regulatory Commission (NRC) on September 5, 2008 as Task 4 of the 5-year High Temperature Gas Reactor Cooperative Agreement via NRC Contract 04-08-138. Until August 2010, when a DOE contract was initiated to fund additional capabilities for the HTTF project, all of the funding support for the HTTF was provided by the NRC via their cooperative agreement. The U.S. Department of Energy (DOE) began their involvement with the HTTF project in late 2009 via the Next Generation Nuclear Plant (NGNP) project. Because the NRC interests in HTTF experiments were only centered on the depressurized conduction cooldown (DCC) scenario, NGNP involvement focused on expanding the experimental envelope of the HTTF to include steady-state operations and also the pressurized conduction cooldown (PCC). Since DOE has incorporated the HTTF as an ingredient in the NGNP thermal-fluids validation program, several important outcomes should be noted: 1. The reference prismatic reactor design, which serves as the basis for scaling the HTTF, became the modular high temperature gas-cooled reactor (MHTGR). The MHTGR has also been chosen as the reference design for all of the other NGNP thermal-fluid experiments. 2. The NGNP validation matrix is being planned using the same scaling strategy that has been implemented to design the HTTF, i.e., the hierarchical two-tiered scaling methodology developed by Zuber in 1991. Using this approach a preliminary validation matrix has been designed that integrates the HTTF experiments with the other experiments planned for the NGNP thermal-fluids verification and validation project. 3. Initial analyses showed that the inherent power capability of the OSU infrastructure, which only allowed a total operational facility power capability of 0.6 MW, is

  18. The development and validation of the Virtual Tissue Matrix, a software application that facilitates the review of tissue microarrays on line

    Directory of Open Access Journals (Sweden)

    Gallagher William M

    2006-05-01

    Full Text Available Abstract Background The Tissue Microarray (TMA) facilitates high-throughput analysis of hundreds of tissue specimens simultaneously. However, bottlenecks in the storage and manipulation of the data generated from TMA reviews have become apparent. A number of software applications have been developed to assist in image and data management; however no solution currently facilitates the easy online review, scoring and subsequent storage of images and data associated with TMA experimentation. Results This paper describes the design, development and validation of the Virtual Tissue Matrix (VTM). Through an intuitive HTML-driven user interface, the VTM provides digital/virtual slide-based images of each TMA core and a means to record observations on each TMA spot. Data generated from a TMA review is stored in an associated relational database, which facilitates the use of flexible scoring forms. The system allows multiple users to record their interpretation of each TMA spot for any parameters assessed. Images generated for the VTM were captured using a standard background lighting intensity and corrective algorithms were applied to each image to eliminate any background lighting hue inconsistencies or vignetting. Validation of the VTM involved examination of inter- and intra-observer variability between microscope and digital TMA reviews. Six bladder TMAs were immunohistochemically stained for E-Cadherin, β-Catenin and PhosphoMet and were assessed by two reviewers for the amount of core and tumour present, and the amount and intensity of membrane, cytoplasmic and nuclear staining. Conclusion Results show that digital VTM images are representative of the original tissue viewed with a microscope. There were equivalent levels of inter- and intra-observer agreement for five out of the eight parameters assessed. Results also suggest that digital reviews may correct potential problems experienced when reviewing TMAs using a microscope, for example, removal of

  19. Analysis of anti-neoplastic drug in bacterial ghost matrix, w/o/w double nanoemulsion and w/o nanoemulsion by a validated 'green' liquid chromatographic method.

    Science.gov (United States)

    Youssof, Abdullah M E; Salem-Bekhit, Mounir M; Shakeel, Faiyaz; Alanazi, Fars K; Haq, Nazrul

    2016-07-01

    The objective of the present investigation was to develop and validate a 'green' reversed phase high-performance liquid chromatography (RP-HPLC) method for rapid analysis of the cytotoxic drug 5-fluorouracil (5-FU) in bulk drug, marketed injection, water-in-oil (w/o) nanoemulsion, double water-in-oil-in-water (w/o/w) nanoemulsion and bacterial ghost (BG) matrix. The chromatography study was carried out at room temperature (25±1°C) using an HPLC system with an ultraviolet (UV)-visible detector. The chromatographic performance was achieved with a Nucleodur 150 mm × 4.6 mm RP C8 column packed with a 5 µm stationary phase. The mobile phase consisted of ethyl acetate: methanol (7:3% v/v), which was delivered at a flow rate of 1.0 mL min⁻¹, and the drug was detected in UV mode at 254 nm. The developed method was validated in terms of linearity (r² = 0.998), accuracy (98.19-102.09%), precision (% RSD = 0.58-1.17), robustness (% RSD = 0.12-0.53) and sensitivity, with satisfactory results. The efficiency of the method was demonstrated by the assay of the drug in marketed injection, w/o nanoemulsion, w/o/w nanoemulsion and BG with satisfactory results. The successful resolution of the drug along with its degradation products clearly established the stability-indicating nature of the proposed method. Overall, these results suggested that the proposed analytical method could be effectively applied to the routine analysis of 5-FU in bulk drug, various pharmaceutical dosage forms and BG. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Improved Diagnostic Accuracy of Alzheimer's Disease by Combining Regional Cortical Thickness and Default Mode Network Functional Connectivity: Validated in the Alzheimer's Disease Neuroimaging Initiative Set.

    Science.gov (United States)

    Park, Ji Eun; Park, Bumwoo; Kim, Sang Joon; Kim, Ho Sung; Choi, Choong Gon; Jung, Seung Chai; Oh, Joo Young; Lee, Jae-Hong; Roh, Jee Hoon; Shim, Woo Hyun

    2017-01-01

    To identify potential imaging biomarkers of Alzheimer's disease by combining brain cortical thickness (CThk) and functional connectivity and to validate this model's diagnostic accuracy in a validation set. Data from 98 subjects was retrospectively reviewed, including a study set (n = 63) and a validation set from the Alzheimer's Disease Neuroimaging Initiative (n = 35). From each subject, data for CThk and functional connectivity of the default mode network was extracted from structural T1-weighted and resting-state functional magnetic resonance imaging. Cortical regions with significant differences between patients and healthy controls in the correlation of CThk and functional connectivity were identified in the study set. The diagnostic accuracy of functional connectivity measures combined with CThk in the identified regions was evaluated against that in the medial temporal lobes using the validation set and application of a support vector machine. Group-wise differences in the correlation of CThk and default mode network functional connectivity were identified in the superior temporal (p functional connectivity combined with the CThk of those two regions were more accurate than that combined with the CThk of both medial temporal lobes (91.7% vs. 75%). Combining functional information with CThk of the superior temporal and supramarginal gyri in the left cerebral hemisphere improves diagnostic accuracy, making it a potential imaging biomarker for Alzheimer's disease.
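
    The classification step described above, a support vector machine fed with combined cortical thickness and default mode network connectivity features, can be sketched as follows. The feature matrices and labels are random placeholders rather than ADNI data, and plain cross-validation stands in for the study's separate validation cohort.

```python
# Sketch of SVM classification on combined cortical-thickness and DMN
# connectivity features; all data below are random placeholders, not ADNI data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_subjects = 63
thickness = rng.normal(2.5, 0.3, size=(n_subjects, 2))     # CThk of two regions (mm)
connectivity = rng.normal(0.4, 0.1, size=(n_subjects, 2))  # DMN connectivity values
X = np.hstack([thickness, connectivity])                    # combined feature set
y = rng.integers(0, 2, size=n_subjects)                     # 1 = patient, 0 = control

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy:", scores.mean().round(3))
```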

  1. Cross-validation of a mass spectrometric-based method for the therapeutic drug monitoring of irinotecan: implementation of matrix-assisted laser desorption/ionization mass spectrometry in pharmacokinetic measurements.

    Science.gov (United States)

    Calandra, Eleonora; Posocco, Bianca; Crotti, Sara; Marangon, Elena; Giodini, Luciana; Nitti, Donato; Toffoli, Giuseppe; Traldi, Pietro; Agostini, Marco

    2016-07-01

    Irinotecan is a widely used antineoplastic drug, mostly employed for the treatment of colorectal cancer. This drug is a feasible candidate for therapeutic drug monitoring due to the presence of a wide inter-individual variability in its pharmacokinetic and pharmacodynamic parameters. In order to determine the drug concentration during the administration protocol, we developed a quantitative MALDI-MS method using CHCA as the MALDI matrix. Here, we demonstrate that MALDI-TOF can be applied in a routine setting for therapeutic drug monitoring in humans, offering quick and accurate results. To reach this aim, we cross-validated, according to FDA and EMA guidelines, the MALDI-TOF method in comparison with a standard LC-MS/MS method, applying it to the quantification of 108 patients' plasma samples from a clinical trial. Standard curves for irinotecan were linear (R² ≥ 0.9842) over the concentration range between 300 and 10,000 ng/mL and showed good back-calculated accuracy and precision. Intra- and inter-day precision and accuracy, determined at three quality control levels, were always acceptable; comparing the two methods, the percentage differences were within 20% in more than 70% of the total amount of clinical samples analysed.
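
    The phrase "back-calculated accuracy" above refers to inverting the fitted calibration curve for each standard and expressing the recovered concentration as a percentage of its nominal value. The sketch below illustrates that calculation on invented calibration data; it is not the study's actual curve.

```python
# Illustration of back-calculated accuracy from a linear calibration curve.
# Peak responses are invented numbers, not data from the study.
import numpy as np

nominal = np.array([300, 1000, 2500, 5000, 10000], dtype=float)  # ng/mL standards
response = np.array([0.031, 0.102, 0.248, 0.497, 1.005])          # instrument response

slope, intercept = np.polyfit(nominal, response, 1)               # fit linear calibration
back_calculated = (response - intercept) / slope                  # invert the curve
accuracy_pct = 100 * back_calculated / nominal                    # % of nominal value

r2 = np.corrcoef(nominal, response)[0, 1] ** 2
print("R^2:", round(r2, 4))
print("back-calculated accuracy (%):", np.round(accuracy_pct, 1))
```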

  2. Concurrent and predictive validity of physical activity measurement items commonly used in clinical settings--data from SCAPIS pilot study.

    Science.gov (United States)

    Ekblom, Örjan; Ekblom-Bak, Elin; Bolam, Kate A; Ekblom, Björn; Schmidt, Caroline; Söderberg, Stefan; Bergström, Göran; Börjesson, Mats

    2015-09-28

    As the understanding of how different aspects of the physical activity (PA) pattern relate to health and disease grows, proper assessment is increasingly important. In clinical care, self-reports are the most commonly used assessment technique. However, systematic comparisons between questions regarding concurrent or criterion validity are rare, as are measures of predictive validity. The aim of the study was to examine the concurrent (using accelerometry as reference) and predictive validity (for metabolic syndrome) of five PA questions. A sample of 948 middle-aged Swedish men and women reported their PA patterns via five different questions and wore an accelerometer (Actigraph GT3X) for a minimum of 4 days. Concurrent validity was assessed as correlations and by ROC analyses. Predictive validity was assessed using logistic regression, controlling for potential confounders. Concurrent validity was low to moderate across aspects of the physical activity pattern. The PHAS and WALK items are proposed for assessment of adherence to PA recommendations. Assessing PA patterns using self-report measures results in methodological problems when trying to predict individual risk for the metabolic syndrome, as the concurrent validity generally was low. However, several of the investigated questions may be useful for assessing risk at a group level, showing better predictive validity.

  3. Statistical model selection between elastic and Newtonian viscous matrix models for the microboudin palaeopiezometer

    Science.gov (United States)

    Matsumura, Tarojiro; Kuwatani, Tatsu; Masuda, Toshiaki

    2017-06-01

    We carried out statistical evaluations of two probability density functions for the microboudin palaeopiezometer using the Akaike information criterion (AIC) and the cross-validation (CV) technique. In terms of the relevant stress-transfer model, these functions are defined as the elastic matrix model and the Newtonian viscous matrix model, respectively. The AIC and CV techniques enable us to evaluate the relative quality of both models when applied to nine data sets from metachert samples containing tourmaline grains in a quartz matrix, collected from the East Pilbara Terrane, Western Australia. The results show that the elastic matrix model is the more appropriate probability density function for the analysis of fracturing of tourmaline grains in a quartz matrix. This statistical evaluation shows the validity of the elastic matrix model for the microboudin palaeopiezometer when analysing such data sets.
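
    The model-selection machinery referred to above compares candidate probability density functions by their AIC, preferring the lower value. The sketch below shows that comparison in generic form, fitting two placeholder distributions to simulated data; it does not implement the elastic or Newtonian viscous matrix models themselves.

```python
# Generic AIC model comparison on simulated data; the two candidate
# distributions are placeholders, not the microboudin stress-transfer models.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
data = rng.gamma(shape=2.0, scale=1.5, size=200)   # stand-in observations

def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * n_params - 2 * log_likelihood

# Candidate model A: gamma distribution (shape and scale fitted, location fixed)
a_params = stats.gamma.fit(data, floc=0)
aic_a = aic(np.sum(stats.gamma.logpdf(data, *a_params)), n_params=2)

# Candidate model B: log-normal distribution (shape and scale fitted)
b_params = stats.lognorm.fit(data, floc=0)
aic_b = aic(np.sum(stats.lognorm.logpdf(data, *b_params)), n_params=2)

print("AIC gamma:", round(aic_a, 1), "AIC lognorm:", round(aic_b, 1))
print("preferred model:", "gamma" if aic_a < aic_b else "lognorm")
```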

  4. Predictive validity of a five-item symptom checklist to screen psychiatric morbidity and suicide ideation in general population and psychiatric settings

    Directory of Open Access Journals (Sweden)

    Chia-Yi Wu

    2016-06-01

    Conclusion: The BSRS-5R was validated as an efficient checklist to screen for psychiatric morbidity and suicide ideation in the general public. The result is valuable in translating into general medical and community settings for early detection of suicide ideation.

  5. Utility of the MMPI-2-RF (Restructured Form) Validity Scales in Detecting Malingering in a Criminal Forensic Setting: A Known-Groups Design

    Science.gov (United States)

    Sellbom, Martin; Toomey, Joseph A.; Wygant, Dustin B.; Kucharski, L. Thomas; Duncan, Scott

    2010-01-01

    The current study examined the utility of the recently released Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008) validity scales to detect feigned psychopathology in a criminal forensic setting. We used a known-groups design with the Structured Interview of Reported Symptoms (SIRS;…

  6. Extension of radiative transfer code MOMO, matrix-operator model to the thermal infrared - Clear air validation by comparison to RTTOV and application to CALIPSO-IIR

    Science.gov (United States)

    Doppler, Lionel; Carbajal-Henken, Cintia; Pelon, Jacques; Ravetta, François; Fischer, Jürgen

    2014-09-01

    The 1-D radiative transfer code Matrix-Operator Model (MOMO) has been extended from the [0.2-3.65 μm] band to the whole [0.2-100 μm] spectrum. MOMO can now be used for the computation of a full range of radiation budgets (shortwave and longwave). This extension to the longwave part of the electromagnetic spectrum required consideration of radiative transfer processes that are features of the thermal infrared: the spectroscopy of the water vapor self- and foreign-continuum of absorption at 12 μm and the emission of radiation by gases, aerosol, clouds and surface. MOMO's spectroscopy module, Coefficient of Gas Absorption (CGASA), has been developed for the computation of gas extinction coefficients, considering continua and spectral line absorption. The spectral dependences of gas emission/absorption coefficients and of Planck's function are treated using a k-distribution. The emission of radiation is implemented in the adding-doubling process of the matrix operator method using Schwarzschild's approach to the radiative transfer equation (a purely absorbing/emitting medium, namely without scattering). Within the layer, the Planck function is assumed to have an exponential dependence on the optical depth. In this paper, validation tests are presented for clear air case studies: comparisons to the analytical solution of a monochromatic Schwarzschild case without scattering show an error of less than 0.07% for a realistic atmosphere with an optical depth and a blackbody temperature that decrease linearly with altitude. Comparisons to the radiative transfer code RTTOV are presented for simulations of top of atmosphere brightness temperature for channels of the space-borne instrument MODIS. Results show an agreement varying from 0.1 K to less than 1 K depending on the channel. Finally, MOMO results are compared to CALIPSO Infrared Imager Radiometer (IIR) measurements for clear air cases. A good agreement was found between computed and observed radiance: biases are smaller than 0.5 K

  7. A note on matrix differentiation

    OpenAIRE

    Kowal, Pawel

    2006-01-01

    This paper presents a set of rules for matrix differentiation with respect to a vector of parameters, using the flattened representation of derivatives, i.e. in the form of a matrix. We also introduce a new set of Kronecker tensor products of matrices. Finally, we consider the problem of differentiating the matrix determinant, trace and inverse.
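
    A concrete instance of the flattened representation is the standard identity vec(AXB) = (B^T ⊗ A) vec(X), which turns the derivative of a matrix expression with respect to vec(X) into an ordinary Jacobian matrix built from a Kronecker product. The snippet below, which is not taken from the paper, only checks this identity numerically on random matrices.

```python
# Numerical check of the standard vectorization identity
#   vec(A X B) = (B^T kron A) vec(X),
# which underlies the flattened matrix-derivative representation discussed
# above (so that d vec(AXB) / d vec(X) = B^T kron A). Matrices are random.
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(3, 4))
X = rng.normal(size=(4, 5))
B = rng.normal(size=(5, 2))

def vec(M):
    """Column-major (Fortran-order) vectorization."""
    return M.reshape(-1, order="F")

lhs = vec(A @ X @ B)
jacobian = np.kron(B.T, A)          # d vec(AXB) / d vec(X)
rhs = jacobian @ vec(X)

print(np.allclose(lhs, rhs))        # True
```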

  8. Quality assessment of cardiovascular magnetic resonance in the setting of the European CMR registry: description and validation of standardized criteria.

    Science.gov (United States)

    Klinke, Vincenzo; Muzzarelli, Stefano; Lauriers, Nathalie; Locca, Didier; Vincenti, Gabriella; Monney, Pierre; Lu, Christian; Nothnagel, Detlev; Pilz, Guenter; Lombardi, Massimo; van Rossum, Albert C; Wagner, Anja; Bruder, Oliver; Mahrholdt, Heiko; Schwitter, Juerg

    2013-06-20

    Cardiovascular magnetic resonance (CMR) has become an important diagnostic imaging modality in cardiovascular medicine. However, insufficient image quality may compromise its diagnostic accuracy. We aimed to describe and validate standardized criteria to evaluate a) cine steady-state free precession (SSFP), b) late gadolinium enhancement (LGE), and c) stress first-pass perfusion images. These criteria will serve for quality assessment in the setting of the Euro-CMR registry. Thirty-five qualitative criteria were defined (scores 0-3) with lower scores indicating better image quality. In addition, quantitative parameters were measured yielding 2 additional quality criteria, i.e. signal-to-noise ratio (SNR) of non-infarcted myocardium (as a measure of correct signal nulling of healthy myocardium) for LGE and % signal increase during contrast medium first-pass for perfusion images. These qualitative and quantitative criteria were assessed in a total of 90 patients (60 patients scanned at our own institution at 1.5T (n=30) and 3T (n=30) and in 30 patients randomly chosen from the Euro-CMR registry examined at 1.5T). Analyses were performed by 2 SCMR level-3 experts, 1 trained study nurse, and 1 trained medical student. The global quality score was 6.7±4.6 (n=90, mean of 4 observers, maximum possible score 64), range 6.4-6.9 (p=0.76 between observers). It ranged from 4.0-4.3 for 1.5T (p=0.96 between observers), from 5.9-6.9 for 3T (p=0.33 between observers), and from 8.6-10.3 for the Euro-CMR cases (p=0.40 between observers). The inter- (n=4) and intra-observer (n=2) agreement for the global quality score, i.e. the percentage of assignments to the same quality tertile ranged from 80% to 88% and from 90% to 98%, respectively. The agreement for the quantitative assessment for LGE images (scores 0-2 for SNR 5, respectively) ranged from 78-84% for the entire population, and 70-93% at 1.5T, 64-88% at 3T, and 72-90% for the Euro-CMR cases. The agreement for perfusion images

  9. Validation of "Cancer Dyspnea Scale" in Patients With Advanced Cancer in a Palliative Care Setting in India.

    Science.gov (United States)

    Damani, Anuja; Ghoshal, Arunangshu; Salins, Naveen; Deodhar, Jayita; Muckaden, MaryAnn

    2017-11-01

    Assessment of dyspnea in patients with advanced cancer is challenging. The Cancer Dyspnea Scale (CDS) is a multidimensional scale developed for the measurement of dyspnea. It is available only in Japanese, English, and Swedish and has not been validated before in the Indian languages. The objective was to describe the process of validation and reliability testing of the CDS in Indian advanced cancer patients. This is a prospective observational study conducted in the palliative care clinic of a tertiary cancer center in Mumbai. The English version of the CDS was translated into Indian languages: Hindi (CDS-H) and Marathi (CDS-M). One hundred twenty newly registered eligible patients (60 for CDS-H and 60 for CDS-M) were enrolled into the study consecutively. They were asked to complete the CDS (translated version) and a Visual Analogue Scale for dyspnea (VAS-D). Only baseline measures were used. Validity was separately analyzed for CDS-H and CDS-M. The results showed good construct validity between CDS-H and CDS-M. Intersubscale correlation was assessed by calculating Pearson's correlation coefficient (mean r = 0.64 and 0.764 for CDS-H and CDS-M, respectively). Convergent validity was assessed by computing the correlation of each factor with VAS-D scores and was found to be statistically significant. Reliability of the scale was determined by its internal consistency (Cronbach's alpha coefficient ranging from 0.716 to 0.879). This study demonstrates that CDS-H and CDS-M are valid and reliable multidimensional scales, which can be used to assess dyspnea in patients with advanced cancer. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  10. Validation, revision and extension of the mantle cell lymphoma international prognostic index in a population-based setting

    NARCIS (Netherlands)

    S.A.M. van de Schans (Saskia); M.L.G. Janssen-Heijnen (Maryska); M.R. Nijzie (Marten); E.W. Steyerberg (Ewout); D.J. van Spronsen (Dick Johan)

    2010-01-01

    Background: The aim of this study was to validate the Mantle Cell Lymphoma International Prognostic Index in a population-based cohort and to study the relevance of its revisions. Design and Methods: We analyzed data from 178 unselected patients with stage III or IV mantle cell lymphoma,

  11. Further Validation of the MMPI-2 And MMPI-2-RF Response Bias Scale: Findings from Disability and Criminal Forensic Settings

    Science.gov (United States)

    Wygant, Dustin B.; Sellbom, Martin; Gervais, Roger O.; Ben-Porath, Yossef S.; Stafford, Kathleen P.; Freeman, David B.; Heilbronner, Robert L.

    2010-01-01

    The present study extends the validation of the Minnesota Multiphasic Personality Inventory-2 (MMPI-2) and the Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF) Response Bias Scale (RBS; R. O. Gervais, Y. S. Ben-Porath, D. B. Wygant, & P. Green, 2007) in separate forensic samples composed of disability claimants and…

  12. Selection of appropriate training and validation set chemicals for modelling dermal permeability by U-optimal design.

    Science.gov (United States)

    Xu, G; Hughes-Oliver, J M; Brooks, J D; Yeatts, J L; Baynes, R E

    2013-01-01

    Quantitative structure-activity relationship (QSAR) models are being used increasingly in skin permeation studies. The main idea of QSAR modelling is to quantify the relationship between biological activities and chemical properties, and thus to predict the activity of chemical solutes. As a key step, the selection of a representative and structurally diverse training set is critical to the prediction power of a QSAR model. Early QSAR models selected training sets in a subjective way and solutes in the training set were relatively homogenous. More recently, statistical methods such as D-optimal design or space-filling design have been applied but such methods are not always ideal. This paper describes a comprehensive procedure to select training sets from a large candidate set of 4534 solutes. A newly proposed 'Baynes' rule', which is a modification of Lipinski's 'rule of five', was used to screen out solutes that were not qualified for the study. U-optimality was used as the selection criterion. A principal component analysis showed that the selected training set was representative of the chemical space. Gas chromatograph amenability was verified. A model built using the training set was shown to have greater predictive power than a model built using a previous dataset [1].
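
    The selection step above uses U-optimal design, which is not reproduced here. As a rough stand-in that conveys the idea of picking a structurally diverse training set, the sketch below applies a simple maximin-distance (Kennard-Stone-like) heuristic to a hypothetical candidate descriptor matrix; descriptor values and the subset size are invented.

```python
import numpy as np

# Hedged sketch: a maximin-distance heuristic (not U-optimal design) that picks a
# structurally diverse training subset from a candidate descriptor matrix.

def select_diverse_subset(descriptors, n_select, seed=0):
    x = (descriptors - descriptors.mean(0)) / descriptors.std(0)   # standardize columns
    rng = np.random.default_rng(seed)
    chosen = [int(rng.integers(len(x)))]                           # arbitrary starting solute
    d_min = np.linalg.norm(x - x[chosen[0]], axis=1)
    while len(chosen) < n_select:
        nxt = int(np.argmax(d_min))                                # farthest from the chosen set
        chosen.append(nxt)
        d_min = np.minimum(d_min, np.linalg.norm(x - x[nxt], axis=1))
    return chosen

candidates = np.random.default_rng(1).normal(size=(4534, 6))       # simulated descriptors
print(select_diverse_subset(candidates, n_select=10))
```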

  13. Development of a tool to measure person-centered maternity care in developing settings: validation in a rural and urban Kenyan population.

    Science.gov (United States)

    Afulani, Patience A; Diamond-Smith, Nadia; Golub, Ginger; Sudhinaraset, May

    2017-09-22

    Person-centered reproductive health care is recognized as critical to improving reproductive health outcomes. Yet, little research exists on how to operationalize it. We extend the literature in this area by developing and validating a tool to measure person-centered maternity care. We describe the process of developing the tool and present the results of psychometric analyses to assess its validity and reliability in a rural and urban setting in Kenya. We followed standard procedures for scale development. First, we reviewed the literature to define our construct and identify domains, and developed items to measure each domain. Next, we conducted expert reviews to assess content validity; and cognitive interviews with potential respondents to assess clarity, appropriateness, and relevance of the questions. The questions were then refined and administered in surveys; and survey results used to assess construct and criterion validity and reliability. The exploratory factor analysis yielded one dominant factor in both the rural and urban settings. Three factors with eigenvalues greater than one were identified for the rural sample and four factors identified for the urban sample. Thirty of the 38 items administered in the survey were retained based on the factor loadings and correlation between the items. Twenty-five items load very well onto a single factor in both the rural and urban sample, with five items loading well in either the rural or urban sample, but not in both samples. These 30 items also load on three sub-scales that we created to measure dignified and respectful care, communication and autonomy, and supportive care. The Cronbach alpha for the main scale is greater than 0.8 in both samples, and those for the sub-scales are between 0.6 and 0.8. The main scale and sub-scales are correlated with global measures of satisfaction with maternity services, suggesting criterion validity. We present a 30-item scale with three sub-scales to measure person
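
    Internal-consistency figures such as the Cronbach alpha values reported above can be reproduced on any respondents-by-items score matrix. The snippet below shows the generic computation on simulated data; it is illustrative only and uses none of the study's data.

```python
import numpy as np

# Generic Cronbach's alpha for a (respondents x items) score matrix; simulated data.
def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scores = latent + 0.8 * rng.normal(size=(200, 30))   # 30 correlated items, 200 respondents
print(round(cronbach_alpha(scores), 3))
```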

  14. Psychometric properties and longitudinal validation of the self-reporting questionnaire (SRQ-20 in a Rwandan community setting: a validation study

    Directory of Open Access Journals (Sweden)

    van Lammeren Anouk

    2011-08-01

    Background: This study took place to enable the measurement of the effects on mental health of a psychosocial intervention in Rwanda. It aimed to establish the capacities of the Self-Reporting Questionnaire (SRQ-20) to screen for mental disorder and to assess symptom change over time in a Rwandan community setting. Methods: The SRQ-20 was translated into Kinyarwanda in a process of forward and back-translation. SRQ-20 data were collected in a Rwandan setting on 418 respondents; a random subsample of 230 respondents was assessed a second time with a three month time interval. Internal reliability was tested using Cronbach's alpha. The optimal cut-off point was determined by calculating Receiver Operating Curves, using semi-structured clinical interviews as standard in a random subsample of 99 respondents. Subsequently, predictive value, likelihood ratio, and interrater agreement were calculated. The factor structure of the SRQ-20 was determined through exploratory factor analysis. Factorial invariance over time was tested in a multigroup confirmatory factor analysis. Results: The reliability of the SRQ-20 in women (α = 0.85) and men (α = 0.81) could be considered good. The instrument performed moderately well in detecting common mental disorders, with an area under the curve (AUC) of 0.76 for women and 0.74 for men. Cut-off scores were different for women (10) and men (8). Factor analysis yielded five factors, explaining 38% of the total variance. The factor structure proved to be time invariant. Conclusions: The SRQ-20 can be used as a screener to detect mental disorder in a Rwandan community setting, but cut-off scores need to be adjusted for women and men separately. The instrument also shows longitudinal factorial invariance, which is an important prerequisite for assessing changes in symptom severity. This is a significant finding as in non-western post-conflict settings the relevance of diagnostic categories is questionable. The use of the
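
    The cut-off selection described above (ROC analysis against a clinical interview standard, with separate thresholds chosen from the curve) can be sketched in a few lines. The example below uses simulated scores and Youden's J with scikit-learn; it is illustrative only and does not reproduce the study's data or exact procedure.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Simulated screening scores versus a clinical gold standard; Youden's J picks
# the cut-off. Numbers are invented and do not reproduce the study.
rng = np.random.default_rng(0)
disorder = rng.integers(0, 2, size=400)                    # 1 = case on clinical interview
score = rng.normal(loc=6 + 5 * disorder, scale=3).round()  # questionnaire total score

fpr, tpr, thresholds = roc_curve(disorder, score)
best = np.argmax(tpr - fpr)                                # maximise sensitivity + specificity - 1
print(f"AUC = {roc_auc_score(disorder, score):.2f}, cut-off >= {thresholds[best]:.0f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```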

  15. Validity and Interrater Reliability of the Visual Quarter-Waste Method for Assessing Food Waste in Middle School and High School Cafeteria Settings.

    Science.gov (United States)

    Getts, Katherine M; Quinn, Emilee L; Johnson, Donna B; Otten, Jennifer J

    2017-11-01

    Measuring food waste (ie, plate waste) in school cafeterias is an important tool to evaluate the effectiveness of school nutrition policies and interventions aimed at increasing consumption of healthier meals. Visual assessment methods are frequently applied in plate waste studies because they are more convenient than weighing. The visual quarter-waste method has become a common tool in studies of school meal waste and consumption, but previous studies of its validity and reliability have used correlation coefficients, which measure association but not necessarily agreement. The aims of this study were to determine, using a statistic measuring interrater agreement, whether the visual quarter-waste method is valid and reliable for assessing food waste in a school cafeteria setting when compared with the gold standard of weighed plate waste. To evaluate validity, researchers used the visual quarter-waste method and weighed food waste from 748 trays at four middle schools and five high schools in one school district in Washington State during May 2014. To assess interrater reliability, researcher pairs independently assessed 59 of the same trays using the visual quarter-waste method. Both validity and reliability were assessed using a weighted κ coefficient. For validity, as compared with the measured weight, 45% of foods assessed using the visual quarter-waste method were in almost perfect agreement, 42% of foods were in substantial agreement, 10% were in moderate agreement, and 3% were in slight agreement. For interrater reliability between pairs of visual assessors, 46% of foods were in perfect agreement, 31% were in almost perfect agreement, 15% were in substantial agreement, and 8% were in moderate agreement. These results suggest that the visual quarter-waste method is a valid and reliable tool for measuring plate waste in school cafeteria settings. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
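
    A weighted kappa of the kind reported here can be computed directly from paired category assignments. The snippet below does this for invented quarter-waste ratings with scikit-learn; the quadratic weighting is an assumption for illustration, since the abstract does not state which weighting scheme was used.

```python
from sklearn.metrics import cohen_kappa_score

# Two raters assigning quarter-waste categories (0, 25, 50, 75, 100 % wasted);
# made-up ratings, quadratic weights assumed for the example.
rater_a = [0, 25, 25, 50, 75, 100, 0, 50, 75, 100, 25, 0]
rater_b = [0, 25, 50, 50, 75, 100, 0, 25, 75, 75, 25, 0]

print(cohen_kappa_score(rater_a, rater_b, weights="quadratic"))
```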

  16. Reliability and validity of the International Spinal Cord Injury Basic Pain Data Set items as self-report measures

    DEFF Research Database (Denmark)

    Jensen, M P; Widerström-Noga, E; Richards, J S

    2010-01-01

    To evaluate the psychometric properties of a subset of International Spinal Cord Injury Basic Pain Data Set (ISCIBPDS) items that could be used as self-report measures in surveys, longitudinal studies and clinical trials.

  17. Examination of the MMPI-2 restructured form (MMPI-2-RF) validity scales in civil forensic settings: findings from simulation and known group samples.

    Science.gov (United States)

    Wygant, Dustin B; Ben-Porath, Yossef S; Arbisi, Paul A; Berry, David T R; Freeman, David B; Heilbronner, Robert L

    2009-11-01

    The current study examined the effectiveness of the MMPI-2 Restructured Form (MMPI-2-RF; Ben-Porath and Tellegen, 2008) over-reporting indicators in civil forensic settings. The MMPI-2-RF includes three revised MMPI-2 over-reporting validity scales and a new scale to detect over-reported somatic complaints. Participants dissimulated medical and neuropsychological complaints in two simulation samples, and a known-groups sample used symptom validity tests as a response bias criterion. Results indicated large effect sizes for the MMPI-2-RF validity scales, including a Cohen's d of .90 for Fs in a head injury simulation sample, 2.31 for FBS-r, 2.01 for F-r, and 1.97 for Fs in a medical simulation sample, and 1.45 for FBS-r and 1.30 for F-r in identifying poor effort on SVTs. Classification results indicated good sensitivity and specificity for the scales across the samples. This study indicates that the MMPI-2-RF over-reporting validity scales are effective at detecting symptom over-reporting in civil forensic settings.

  18. Data set for the proteomic inventory and quantitative analysis of chicken eggshell matrix proteins during the primary events of eggshell mineralization and the active growth phase of calcification

    Science.gov (United States)

    Marie, Pauline; Labas, Valérie; Brionne, Aurélien; Harichaux, Grégoire; Hennequet-Antier, Christelle; Rodriguez-Navarro, Alejandro B.; Nys, Yves; Gautron, Joël

    2015-01-01

    Chicken eggshell is a biomineral composed of 95% calcite calcium carbonate mineral and of 3.5% organic matrix proteins. The assembly of mineral and its structural organization is controlled by its organic matrix. In a recent study [1], we have used quantitative proteomic, bioinformatic and functional analyses to explore the distribution of 216 eggshell matrix proteins at four key stages of shell mineralization defined as: (1) widespread deposition of amorphous calcium carbonate (ACC), (2) ACC transformation into crystalline calcite aggregates, (3) formation of larger calcite crystal units and (4) rapid growth of calcite as columnar structure with preferential crystal orientation. The current article detailed the quantitative analysis performed at the four stages of shell mineralization to determine the proteins which are the most abundant. Additionally, we reported the enriched GO terms and described the presence of 35 antimicrobial proteins equally distributed at all stages to keep the egg free of bacteria and of 81 proteins, the function of which could not be ascribed. PMID:26306314

  19. Data set for the proteomic inventory and quantitative analysis of chicken eggshell matrix proteins during the primary events of eggshell mineralization and the active growth phase of calcification

    Directory of Open Access Journals (Sweden)

    Pauline Marie

    2015-09-01

    Chicken eggshell is a biomineral composed of 95% calcite calcium carbonate mineral and of 3.5% organic matrix proteins. The assembly of mineral and its structural organization is controlled by its organic matrix. In a recent study [1], we have used quantitative proteomic, bioinformatic and functional analyses to explore the distribution of 216 eggshell matrix proteins at four key stages of shell mineralization defined as: (1) widespread deposition of amorphous calcium carbonate (ACC), (2) ACC transformation into crystalline calcite aggregates, (3) formation of larger calcite crystal units and (4) rapid growth of calcite as columnar structure with preferential crystal orientation. The current article detailed the quantitative analysis performed at the four stages of shell mineralization to determine the proteins which are the most abundant. Additionally, we reported the enriched GO terms and described the presence of 35 antimicrobial proteins equally distributed at all stages to keep the egg free of bacteria and of 81 proteins, the function of which could not be ascribed.

  20. Political Representation and Gender Inequalities Testing the Validity of Model Developed for Pakistan using a Data Set of Malaysia

    OpenAIRE

    Najeebullah Khan; Adnan Hussein; Zahid Awan; Bakhtiar Khan

    2012-01-01

    This study measured the impacts of six independent variables (political rights, election system type, political quota, literacy rate, labor force participation and GDP per capita at current price in US dollars) on the dependent variable (percentage of women representation in the national legislature) using multiple linear regression models. As a first step, we developed and tested the model with out-of-sample data for Pakistan. For model construction and validation, ten years of data from the year 1999 a...

  1. Validity and reliability of an Arabic version of the state-trait anxiety inventory in a Saudi dental setting

    Science.gov (United States)

    Bahammam, Maha A.

    2016-01-01

    Objectives: To test the psychometric properties of an adapted Arabic version of the state trait anxiety-form Y (STAI-Y) in Saudi adult dental patients. Methods: In this cross-sectional study, the published Arabic version of the STAI-Y was evaluated by 2 experienced bilingual professionals for its compatibility with Saudi culture and revised prior to testing. Three hundred and eighty-seven patients attending dental clinics for treatment at the Faculty of Dentistry Hospital, King Abdullah University, Jeddah, Kingdom of Saudi Arabia, participated in the study. The Arabic version of the modified dental anxiety scale (MDAS) and visual analogue scale (VAS) ratings of anxiety were used to assess the concurrent criterion validity. Results: The Arabic version of the STAI-Y had high internal consistency reliability (Cronbach’s alpha: 0.989) for state and trait subscales. Factor analysis indicated unidimensionality of the scale. Correlations between STAI-Y scores and both MDAS and VAS scores indicated strong concurrent criterion validity. Discriminant validity was supported by the findings that higher anxiety levels were present among females as opposed to males, younger individuals as compared to older individuals, and patients who do not visit the dentist unless they have a need as opposed to more frequent visitors to the dental office. Conclusion: The Arabic version of the STAI-Y has an adequate internal consistency reliability, generally similar to that reported in the international literature, suggesting it is appropriate for assessing dental anxiety in Arabic speaking populations. PMID:27279514

  2. Validity and reliability of an Arabic version of the state-trait anxiety inventory in a Saudi dental setting.

    Science.gov (United States)

    Bahammam, Maha A

    2016-06-01

    To test the psychometric properties of an adapted Arabic version of the state trait anxiety-form Y (STAI-Y) in Saudi adult dental patients.  In this cross-sectional study, the published Arabic version of the STAI-Y was evaluated by 2 experienced bilingual professionals for its compatibility with Saudi culture and revised prior to testing. Three hundred and eighty-seven patients attending dental clinics for treatment at the Faculty of Dentistry Hospital, King Abdullah University, Jeddah, Kingdom of Saudi Arabia, participated in the study. The Arabic version of the modified dental anxiety scale (MDAS) and visual analogue scale (VAS) ratings of anxiety were used to assess the concurrent criterion validity. The Arabic version of the STAI-Y had high internal consistency reliability (Cronbach's alpha: 0.989) for state and trait subscales. Factor analysis indicated unidimensionality of the scale. Correlations between STAI-Y scores and both MDAS and VAS scores indicated strong concurrent criterion validity. Discriminant validity was supported by the findings that higher anxiety levels were present among females as opposed to males, younger individuals as compared to older individuals, and patients who do not visit the dentist unless they have a need as opposed to more frequent visitors to the dental office. The Arabic version of the STAI-Y has an adequate internal consistency reliability, generally similar to that reported in the international literature, suggesting it is appropriate for assessing dental anxiety in Arabic speaking populations.

  3. Matrix comparison, Part 2

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg; Borlund, Pia

    2007-01-01

    The present two-part article introduces matrix comparison as a formal means for evaluation purposes in informetric studies such as cocitation analysis. In the first part, the motivation behind introducing matrix comparison to informetric studies, as well as two important issues influencing such comparisons ... and Procrustes analysis can be used as statistical validation tools in informetric studies and thus help in choosing suitable proximity measures.
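
    As a pointer to what such a comparison looks like in practice, the sketch below runs a Procrustes analysis on two randomly perturbed versions of the same matrix with SciPy and reports the disparity. It is a generic illustration on synthetic data, not the article's own procedure.

```python
import numpy as np
from scipy.spatial import procrustes

# Procrustes comparison of two proximity matrices (e.g., two cocitation matrices
# produced by different proximity measures); random data used for illustration.
rng = np.random.default_rng(0)
base = rng.normal(size=(20, 20))
m1 = base + 0.1 * rng.normal(size=(20, 20))   # two noisy versions of the same structure
m2 = base + 0.1 * rng.normal(size=(20, 20))

_, _, disparity = procrustes(m1, m2)
print(f"Procrustes disparity (0 = identical configurations): {disparity:.4f}")
```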

  4. Adolescent Personality: A Five-Factor Model Construct Validation

    Science.gov (United States)

    Baker, Spencer T.; Victor, James B.; Chambers, Anthony L.; Halverson, Jr., Charles F.

    2004-01-01

    The purpose of this study was to investigate convergent and discriminant validity of the five-factor model of adolescent personality in a school setting using three different raters (methods): self-ratings, peer ratings, and teacher ratings. The authors investigated validity through a multitrait-multimethod matrix and a confirmatory factor…

  5. Dynamic Matrix Rank

    DEFF Research Database (Denmark)

    Frandsen, Gudmund Skovbjerg; Frandsen, Peter Frands

    2009-01-01

    We consider maintaining information about the rank of a matrix under changes of the entries. For n×n matrices, we show an upper bound of O(n^1.575) arithmetic operations and a lower bound of Ω(n) arithmetic operations per element change. The upper bound is valid when changing up to O(n^0.575) entries in a single column of the matrix. We also give an algorithm that maintains the rank using O(n^2) arithmetic operations per rank one update. These bounds appear to be the first nontrivial bounds for the problem. The upper bounds are valid for arbitrary fields, whereas the lower bound is valid for algebraically closed fields. The upper bound for element updates uses fast rectangular matrix multiplication, and the lower bound involves further development of an earlier technique for proving lower bounds for dynamic computation of rational functions.
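
    For orientation, the naive alternative to such update algorithms is to recompute the rank from scratch after every change, at roughly O(n^3) cost via an SVD or Gaussian elimination. The sketch below shows that baseline in NumPy; it is not the paper's algorithm.

```python
import numpy as np

# Naive baseline, for context only: recompute the rank after each element change.
rng = np.random.default_rng(0)
n = 200
A = rng.normal(size=(n, n))
A[:, -1] = A[:, 0]                        # force a rank deficiency
print("initial rank:", np.linalg.matrix_rank(A))

A[5, -1] += 1.0                           # a single element update
print("rank after update:", np.linalg.matrix_rank(A))
```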

  6. The Virtual Care Climate Questionnaire: Development and Validation of a Questionnaire Measuring Perceived Support for Autonomy in a Virtual Care Setting.

    Science.gov (United States)

    Smit, Eline Suzanne; Dima, Alexandra Lelia; Immerzeel, Stephanie Annette Maria; van den Putte, Bas; Williams, Geoffrey Colin

    2017-05-08

    Web-based health behavior change interventions may be more effective if they offer autonomy-supportive communication facilitating the internalization of motivation for health behavior change. Yet, at this moment no validated tools exist to assess user-perceived autonomy-support of such interventions. The aim of this study was to develop and validate the virtual climate care questionnaire (VCCQ), a measure of perceived autonomy-support in a virtual care setting. Items were developed based on existing questionnaires and expert consultation and were pretested among experts and target populations. The virtual climate care questionnaire was administered in relation to Web-based interventions aimed at reducing consumption of alcohol (Study 1; N=230) or cannabis (Study 2; N=228). Item properties, structural validity, and reliability were examined with item-response and classical test theory methods, and convergent and divergent validity via correlations with relevant concepts. In Study 1, 20 of 23 items formed a one-dimensional scale (alpha=.97; omega=.97; H=.66; mean 4.9 [SD 1.0]; range 1-7) that met the assumptions of monotonicity and invariant item ordering. In Study 2, 16 items fitted these criteria (alpha=.92; H=.45; omega=.93; mean 4.2 [SD 1.1]; range 1-7). Only 15 items remained in the questionnaire in both studies, thus we proceeded to the analyses of the questionnaire's reliability and construct validity with a 15-item version of the virtual climate care questionnaire. Convergent validity of the resulting 15-item virtual climate care questionnaire was confirmed by positive associations with autonomous motivation (Study 1: r=.66). The virtual climate care questionnaire accurately assessed participants' perceived autonomy-support offered by two Web-based health behavior change interventions. Overall, the scale showed the expected properties and relationships with relevant concepts, and the studies presented suggest this first version of the virtual climate care questionnaire

  7. Statistical properties of random matrix product states

    Science.gov (United States)

    Garnerone, Silvano; de Oliveira, Thiago R.; Haas, Stephan; Zanardi, Paolo

    2010-11-01

    We study the set of random matrix product states (RMPS) introduced by Garnerone, de Oliveira, and Zanardi [S. Garnerone, T. R. de Oliveira, and P. Zanardi, Phys. Rev. A 81, 032336 (2010)] as a tool to explore foundational aspects of quantum statistical mechanics. In the present work, we provide an accurate numerical and analytical investigation of the properties of RMPS. We calculate the average state of the ensemble in the nonhomogeneous case, and numerically check the validity of this result. We also suggest using RMPS as a tool to approximate properties of general quantum random states. The numerical simulations presented here support the accuracy and efficiency of this approximation. These results suggest that any generalized canonical state can be approximated with high probability by the reduced density matrix of a RMPS, if the average matrix product states coincide with the associated microcanonical ensemble.
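
    To make the object concrete, the sketch below builds a small random matrix product state with NumPy, contracts it into a full state vector, and inspects a reduced density matrix. The conventions (open boundaries, complex Gaussian tensor entries, bond dimension 4) are chosen for illustration and are not necessarily those of the cited work.

```python
import numpy as np

# Toy random matrix product state (RMPS): n qubits, bond dimension D, contracted
# into the full 2**n state vector so a reduced density matrix can be inspected.

rng = np.random.default_rng(0)
n, d, D = 8, 2, 4

shapes = [(d, 1, D)] + [(d, D, D)] * (n - 2) + [(d, D, 1)]
tensors = [rng.normal(size=s) + 1j * rng.normal(size=s) for s in shapes]

psi = tensors[0].reshape(d, D)                      # axes: (physical_1, right bond)
for A in tensors[1:]:
    # contract the running right bond with the next tensor's left bond
    psi = np.tensordot(psi, A, axes=(psi.ndim - 1, 1))
psi = psi.reshape(-1)
psi /= np.linalg.norm(psi)

k = 4                                               # keep the first k qubits
M = psi.reshape(2 ** k, 2 ** (n - k))
rho = M @ M.conj().T                                # reduced density matrix
print("trace:", np.trace(rho).real.round(6), "purity:", np.trace(rho @ rho).real.round(4))
```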

  8. Matrix Extension Study: Validation of the Compact Dry EC Method for Enumeration of Escherichia coli and non-E. coli Coliform Bacteria in Selected Foods.

    Science.gov (United States)

    Mizuochi, Shingo; Nelson, Maria; Baylis, Chris; Green, Becky; Jewell, Keith; Monadjemi, Farinaz; Chen, Yi; Salfinger, Yvonne; Fernandez, Maria Cristina

    2016-01-01

    uncontaminated level). In the single-laboratory evaluation (cooked chicken, prewashed bagged shredded iceberg lettuce, frozen cod filets, and instant nonfat dry milk powder), colony counts were logarithmically transformed, and then the data were analyzed at each level for sr, RSDr, and mean difference between methods with 95% confidence intervals (CIs). A CI outside a range of -0.5 to 0.5 on the log10 mean difference between methods was used as the criterion to establish a significant statistical difference. In the multilaboratory study on pasteurized milk, after logarithmic transformation, the data were analyzed for sR and RSDR in addition to sr, RSDr, and mean difference with 95% CIs. Regression analysis was performed on all matrixes and reported as r(2). In the single-laboratory evaluation, statistical differences were indicated between the Compact Dry EC and ISO 16649-2 methods for the enumeration of E. coli in two of five contamination levels tested for lettuce, and in the low contamination level for cooked chicken. For the cooked chicken and lettuce at the low level, only a few colonies were recovered for each method, and thus not a true indication of the methods' performance. For the high contamination level of lettuce, counts varied within the sets of five replicates more than 10-fold for each method, which may have contributed to the significant difference. Statistical differences were also indicated between the Compact Dry EC and ISO 4832 methods for the enumeration of coliforms in two of five contamination levels tested for lettuce, two of five contamination levels of milk powder, and in the low contamination level for frozen fish. For the lowest levels of frozen fish and milk powder, only a few colonies were recovered for each method. For the lettuce and the other level of milk powder, counts varied within the sets of five replicates more than 10-fold for each method, which may have contributed to the significant differences indicated in the those contamination

  9. The ToMenovela – A photograph-based stimulus set for the study of social cognition with high ecological validity

    Directory of Open Access Journals (Sweden)

    Maike C. Herbort

    2016-12-01

    We present the ToMenovela, a stimulus set that has been developed to provide a set of normatively rated socio-emotional stimuli showing a varying number of characters in emotionally laden interactions for experimental investigations of (i) cognitive and (ii) affective ToM, (iii) emotional reactivity, and (iv) complex emotion judgment with respect to Ekman's basic emotions (happiness, sadness, anger, fear, surprise and disgust; Ekman & Friesen, 1975). Stimuli were generated with a focus on ecological validity and consist of 190 scenes depicting daily-life situations. Two or more of eight main characters with distinct biographies and personalities are depicted in each scene picture. To obtain an initial evaluation of the stimulus set and to pave the way for future studies in clinical populations, normative data on each stimulus of the set were obtained from a sample of 61 neurologically and psychiatrically healthy participants (31 female, 30 male; mean age 26.74 +/- 5.84), including visual analog scale ratings of Ekman's basic emotions (happiness, sadness, anger, fear, surprise and disgust) and free-text descriptions of the content. The ToMenovela is being developed to provide standardized material of social scenes that is available to researchers in the study of social cognition. It should facilitate experimental control while keeping ecological validity high.

  10. An ancestry informative marker set for determining continental origin: validation and extension using human genome diversity panels

    Directory of Open Access Journals (Sweden)

    Gregersen Peter K

    2009-07-01

    Background: Case-control genetic studies of complex human diseases can be confounded by population stratification. This issue can be addressed using panels of ancestry informative markers (AIMs) that can provide substantial population substructure information. Previously, we described a panel of 128 SNP AIMs that were designed as a tool for ascertaining the origins of subjects from Europe, Sub-Saharan Africa, the Americas, and East Asia. Results: In this study, genotypes from Human Genome Diversity Panel populations were used to further evaluate a 93 SNP AIM panel, a subset of the 128 AIM set, for distinguishing continental origins. Using both model-based and relatively model-independent methods, we here confirm the ability of this AIM set to distinguish diverse population groups that were not previously evaluated. This study included multiple population groups from Oceania, South Asia, East Asia, Sub-Saharan Africa, North and South America, and Europe. In addition, the 93 AIM set provides population substructure information that can, for example, distinguish Arab and Ashkenazi from Northern European population groups and Pygmy from other Sub-Saharan African population groups. Conclusion: These data provide additional support for using the 93 AIM set to efficiently identify continental subject groups for genetic studies, to identify study population outliers, and to control for admixture in association studies.

  11. Matrix theory

    CERN Document Server

    Franklin, Joel N

    2003-01-01

    Mathematically rigorous introduction covers vector and matrix norms, the condition-number of a matrix, positive and irreducible matrices, much more. Only elementary algebra and calculus required. Includes problem-solving exercises. 1968 edition.

  12. External validation and head-to-head comparison of Japanese and Western prostate biopsy nomograms using Japanese data sets.

    Science.gov (United States)

    Utsumi, Takanobu; Kawamura, Koji; Suzuki, Hiroyoshi; Kamiya, Naoto; Imamoto, Takashi; Miura, Junichiro; Ueda, Takeshi; Maruoka, Masayuki; Sekita, Nobuyuki; Mikami, Kazuo; Ichikawa, Tomohiko

    2009-04-01

    The objective of this study was to perform external validation of a previously developed prostate biopsy nomogram (the CHIBA nomogram) and to compare it with previously published nomograms developed in Japanese and overseas populations. Two different cohorts of patients were used: one from the Chiba Cancer Center (n = 392) in which transperineal 16-core biopsy was performed, and another from Chibaken Saiseikai Narashino Hospital (n = 269) in which transrectal 16-core biopsy was carried out. All patients were Japanese men with serum prostate-specific antigen levels less than 10 ng/mL. The predictive accuracy of our CHIBA nomogram and of four other published nomograms (Finne's sextant biopsy-based logistic regression model, Karakiewicz's sextant biopsy-based nomogram, Chun's 10-core biopsy-based nomogram and Kawakami's three-dimensional biopsy-based nomogram) was quantified based on area under the curve derived from receiver operating characteristic curves. Head-to-head comparison of area under the curve values demonstrated that our nomogram was significantly more accurate than all other models except Chun's (P = 0.012 vs Finne's, P = 0.000 vs Karakiewicz's, and P = 0.003 vs Kawakami's). Our nomogram appears to be more useful for the Japanese population than Western models. Moreover, external validation demonstrates that its predictive accuracy does not vary according to biopsy approach. This is the first report to demonstrate that the predictive accuracy of a nomogram is independent from the biopsy method.

  13. Reliability and Validity of Survey Instruments to Measure Work-Related Fatigue in the Emergency Medical Services Setting: A Systematic Review.

    Science.gov (United States)

    Patterson, P Daniel; Weaver, Matthew D; Fabio, Anthony; Teasley, Ellen M; Renn, Megan L; Curtis, Brett R; Matthews, Margaret E; Kroemer, Andrew J; Xun, Xiaoshuang; Bizhanova, Zhadyra; Weiss, Patricia M; Sequeira, Denisse J; Coppler, Patrick J; Lang, Eddy S; Higgins, J Stephen

    2018-01-11

    This study sought to systematically search the literature to identify reliable and valid survey instruments for fatigue measurement in the Emergency Medical Services (EMS) occupational setting. A systematic review study design was used and searched six databases, including one website. The research question guiding the search was developed a priori and registered with the PROSPERO database of systematic reviews: "Are there reliable and valid instruments for measuring fatigue among EMS personnel?" (2016:CRD42016040097). The primary outcome of interest was criterion-related validity. Important outcomes of interest included reliability (e.g., internal consistency), and indicators of sensitivity and specificity. Members of the research team independently screened records from the databases. Full-text articles were evaluated by adapting the Bolster and Rourke system for categorizing findings of systematic reviews, and the rated data abstracted from the body of literature as favorable, unfavorable, mixed/inconclusive, or no impact. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology was used to evaluate the quality of evidence. The search strategy yielded 1,257 unique records. Thirty-four unique experimental and non-experimental studies were determined relevant following full-text review. Nineteen studies reported on the reliability and/or validity of ten different fatigue survey instruments. Eighteen different studies evaluated the reliability and/or validity of four different sleepiness survey instruments. None of the retained studies reported sensitivity or specificity. Evidence quality was rated as very low across all outcomes. In this systematic review, limited evidence of the reliability and validity of 14 different survey instruments to assess the fatigue and/or sleepiness status of EMS personnel and related shift worker groups was identified.

  14. Validity of measures of pain and symptoms in HIV/AIDS infected households in resources poor settings: results from the Dominican Republic and Cambodia

    Directory of Open Access Journals (Sweden)

    Morineau Guy

    2006-03-01

    Background: HIV/AIDS treatment programs are currently being mounted in many developing nations that include palliative care services. While measures of palliative care have been developed and validated for resource-rich settings, very little work exists to support an understanding of measurement for Africa, Latin America or Asia. Methods: This study investigates the construct validity of measures of reported pain, pain control, symptoms and symptom control in areas with high HIV prevalence in the Dominican Republic and Cambodia. Measures were adapted from the POS (Palliative Outcome Scale). Households were selected through purposive sampling from networks of people living with HIV/AIDS. Consistencies in patterns in the data were tested using chi-square and Mantel-Haenszel tests. Results: The sample persons who reported chronic illness were much more likely to report pain and symptoms compared to those not chronically ill. When controlling for the degree of pain, pain control did not differ between the chronically ill and non-chronically ill using a Mantel-Haenszel test in both countries. Similar results were found for reported symptoms and symptom control for the Dominican Republic. These findings broadly support the construct validity of an adapted version of the POS in these two less developed countries. Conclusion: The results of the study suggest that the selected measures can usefully be incorporated into population-based surveys and evaluation tools needed to monitor palliative care and used in settings with high HIV/AIDS prevalence.

  15. De-MetaST-BLAST: a tool for the validation of degenerate primer sets and data mining of publicly available metagenomes.

    Directory of Open Access Journals (Sweden)

    Christopher A Gulvik

    Development and use of primer sets to amplify nucleic acid sequences of interest is fundamental to studies spanning many life science disciplines. As such, the validation of primer sets is essential. Several computer programs have been created to aid in the initial selection of primer sequences that may or may not require multiple nucleotide combinations (i.e., degeneracies). Conversely, validation of primer specificity has remained largely unchanged for several decades, and there are currently few available programs that allow for an evaluation of primers containing degenerate nucleotide bases. To alleviate this gap, we developed the program De-MetaST, which performs an in silico amplification using user-defined nucleotide sequence dataset(s) and primer sequences that may contain degenerate bases. The program returns an output file that contains the in silico amplicons. When De-MetaST is paired with NCBI's BLAST (De-MetaST-BLAST), the program also returns the top 10 nr NCBI database hits for each recovered in silico amplicon. While the original motivation for development of this search tool was degenerate primer validation using the wealth of nucleotide sequences available in environmental metagenome and metatranscriptome databases, this search tool has potential utility in many data mining applications.

  16. Development of the Human Factors Skills for Healthcare Instrument: a valid and reliable tool for assessing interprofessional learning across healthcare practice settings.

    Science.gov (United States)

    Reedy, Gabriel B; Lavelle, Mary; Simpson, Thomas; Anderson, Janet E

    2017-10-01

    A central feature of clinical simulation training is human factors skills, providing staff with the social and cognitive skills to cope with demanding clinical situations. Although these skills are critical to safe patient care, assessing their learning is challenging. This study aimed to develop, pilot and evaluate a valid and reliable structured instrument to assess human factors skills, which can be used pre- and post-simulation training, and is relevant across a range of healthcare professions. Through consultation with a multi-professional expert group, we developed and piloted a 39-item survey with 272 healthcare professionals attending training courses across two large simulation centres in London, one specialising in acute care and one in mental health, both serving healthcare professionals working across acute and community settings. Following psychometric evaluation, the final 12-item instrument was evaluated with a second sample of 711 trainees. Exploratory factor analysis revealed a 12-item, one-factor solution with good internal consistency (α=0.92). The instrument had discriminant validity, with newly qualified trainees scoring significantly lower than experienced trainees (t(98)=4.88). A confirmatory factor analysis revealed an adequate model fit (RMSEA=0.066). The Human Factors Skills for Healthcare Instrument provides a reliable and valid method of assessing trainees' human factors skills self-efficacy across acute and mental health settings. This instrument has the potential to improve the assessment and evaluation of human factors skills learning in both uniprofessional and interprofessional clinical simulation training.

  17. A new experimentally validated formula to calculate the QT interval in the presence of left bundle branch block holds true in the clinical setting.

    Science.gov (United States)

    Bogossian, Harilaos; Frommeyer, Gerrit; Ninios, Ilias; Pechlivanidou, Eleni; Hasan, Fuad; Nguyen, Quy Suu; Mijic, Dejan; Kloppe, Axel; Karosiene, Zana; Margkarian, Artak; Bandorski, Dirk; Schultes, Dominik; Erkapic, Damir; Seyfarth, Melchior; Lemke, Bernd; Eckardt, Lars; Zarse, Markus

    2017-03-01

    The evaluation of the QT interval in the presence of left bundle branch block (LBBB) is associated with the challenge to discriminate native QT interval from the prolongation due to the increase in QRS duration. The newest formula to evaluate QT interval in the presence of LBBB suggests: modified QT during LBBB = measured QT interval minus 50% of LBBB duration. The purpose of this study is therefore to validate the abovementioned formula in the clinical setting. Validation in two separate groups of patients: Patients who alternated between narrow QRS and intermittent LBBB and patients with narrow QRS who developed LBBB after transcatheter aortic valve implantation (TAVI). The acquired mean native QTc intervals and those calculated by the presented formula displayed no significant differences (p > .99 and p > .75). In this study we proved for the first time the validity and applicability of the experimentally acquired formula for the evaluation of the QT interval in the presence of LBBB in a clinical setting. © 2016 Wiley Periodicals, Inc.
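
    The quoted correction is a one-line computation. The snippet below applies it to an invented measurement; the Bazett rate correction that follows is a common add-on shown purely for illustration and is not prescribed by the study.

```python
# Direct transcription of the formula quoted above (durations in ms).
def modified_qt_lbbb(qt_measured_ms: float, qrs_lbbb_ms: float) -> float:
    """Modified QT during LBBB = measured QT minus 50% of the LBBB QRS duration."""
    return qt_measured_ms - 0.5 * qrs_lbbb_ms

def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Optional Bazett rate correction, QTc = QT / sqrt(RR); not part of the study's formula."""
    return qt_ms / rr_s ** 0.5

qt_mod = modified_qt_lbbb(qt_measured_ms=480.0, qrs_lbbb_ms=160.0)   # example values
print(qt_mod, round(qtc_bazett(qt_mod, rr_s=0.86), 1))
```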

  18. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    Science.gov (United States)

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology evaluation to binary rating items. Reliability was assessed comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. The Child Behaviour Assessment Instrument: development and validation of a measure to screen for externalising child behavioural problems in community setting

    Directory of Open Access Journals (Sweden)

    Perera Hemamali

    2010-06-01

    Background: In Sri Lanka, behavioural problems have grown to epidemic proportions, accounting for the second highest category of mental health problems among children. Early identification of behavioural problems in children is an important pre-requisite for the implementation of interventions to prevent long term psychiatric outcomes. The objectives of the study were to develop and validate a screening instrument for use in the community setting to identify behavioural problems in children aged 4-6 years. Methods: An initial 54-item questionnaire was developed following an extensive review of the literature. A three-round Delphi process involving a panel of experts from six relevant fields was then undertaken to refine the nature and number of items and created the 15-item community screening instrument, the Child Behaviour Assessment Instrument (CBAI). The validation study was conducted in the Medical Officer of Health area Kaduwela, Sri Lanka, and a community sample of 332 children aged 4-6 years was recruited by a two-stage randomization process. The behaviour status of the participants was assessed by an interviewer using the CBAI and by a clinical psychologist following clinical assessment concurrently. Criterion validity was appraised by assessing the sensitivity, specificity and predictive values at the optimum screen cut-off value. Construct validity of the instrument was quantified by testing whether the data of the validation study fit a hypothetical model. Face and content validity of the CBAI were qualitatively assessed by a panel of experts. The reliability of the instrument was assessed by internal consistency analysis and test-retest methods in a 15% subset of the community sample. Results: Using Receiver Operating Characteristic analysis, a CBAI score of >16 was identified as the cut-off point that optimally differentiated children having behavioural problems, with a sensitivity of 0.88 (95% CI = 0.80-0.96) and specificity of 0.81 (95% CI = 0

  20. Development and validation of an app-based cell counter for use in the clinical laboratory setting

    Directory of Open Access Journals (Sweden)

    Alexander C Thurman

    2015-01-01

    Introduction: For decades cellular differentials have been generated exclusively on analog tabletop cell counters. With the advent of tablet computers, digital cell counters - in the form of mobile applications ("apps") - now represent an alternative to analog devices. However, app-based counters have not been widely adopted by clinical laboratories, perhaps owing to a presumed decrease in count accuracy related to the lack of tactile feedback inherent in a touchscreen interface. We herein provide the first systematic evidence that digital cell counters function similarly to standard tabletop units. Methods: We developed an app-based cell counter optimized for use in the clinical laboratory setting. Paired counts of 188 peripheral blood smears and 62 bone marrow aspirate smears were performed using our app-based counter and a standard analog device. Differences between paired data sets were analyzed using the correlation coefficient, Student's t-test for paired samples and Bland-Altman plots. Results: All counts showed excellent agreement across all users and touch screen devices. With the exception of peripheral blood basophils (r = 0.684), differentials generated for the measured cell categories within the paired data sets were highly correlated (all r ≥ 0.899). Results of paired t-tests did not reach statistical significance for any cell type (all P > 0.05), and Bland-Altman plots showed a narrow spread of the difference about the mean without evidence of significant outliers. Conclusions: Our analysis suggests that no systematic differences exist between cellular differentials obtained via app-based or tabletop counters and that agreement between these two methods is excellent.
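
    The agreement statistics used above (a paired t-test plus Bland-Altman bias and limits of agreement) are easy to reproduce on any pair of measurement series. The sketch below does so on simulated differential percentages; it is not the study's analysis code.

```python
import numpy as np
from scipy import stats

# Bland-Altman summary (bias and 95% limits of agreement) plus a paired t-test
# for two counting methods; the percentages below are simulated.
rng = np.random.default_rng(0)
tabletop = rng.uniform(0, 60, size=100)            # % of a given cell type, analog counter
app = tabletop + rng.normal(0, 1.5, size=100)      # app counts with small random deviations

diff = app - tabletop
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
t, p = stats.ttest_rel(app, tabletop)
print(f"bias = {bias:.2f}%, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]%, "
      f"paired t-test P = {p:.2f}")
```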

  1. Measuring the Value of New Drugs: Validity and Reliability of 4 Value Assessment Frameworks in the Oncology Setting.

    Science.gov (United States)

    Bentley, Tanya G K; Cohen, Joshua T; Elkin, Elena B; Huynh, Julie; Mukherjea, Arnab; Neville, Thanh H; Mei, Matthew; Copher, Ronda; Knoth, Russell; Popescu, Ioana; Lee, Jackie; Zambrano, Jenelle M; Broder, Michael S

    2017-06-01

    Several organizations have developed frameworks to systematically assess the value of new drugs. To evaluate the convergent validity and interrater reliability of 4 value frameworks to understand the extent to which these tools can facilitate value-based treatment decisions in oncology. Eight panelists used the American Society of Clinical Oncology (ASCO), European Society for Medical Oncology (ESMO), Institute for Clinical and Economic Review (ICER), and National Comprehensive Cancer Network (NCCN) frameworks to conduct value assessments of 15 drugs for advanced lung and breast cancers and castration-refractory prostate cancer. Panelists received instructions and published clinical data required to complete the assessments, assigning each drug a numeric or letter score. Kendall's Coefficient of Concordance for Ranks (Kendall's W) was used to measure convergent validity by cancer type among the 4 frameworks. Intraclass correlation coefficients (ICCs) were used to measure interrater reliability for each framework across cancers. Panelists were surveyed on their experiences. Kendall's W across all 4 frameworks for breast, lung, and prostate cancer drugs was 0.560 (P = 0.010), 0.562 (P = 0.010), and 0.920, respectively. For framework subdomains, Kendall's W among breast cancer drugs was highest for certainty (ICER, NCCN: W = 0.908, P = 0.046) and lowest for clinical benefit (ASCO, ESMO, NCCN: W = 0.345, P = 0.436). Among lung cancer drugs, W was highest for toxicity (ASCO, ESMO, NCCN: W = 0.944). Across frameworks, panelists generally agreed that the frameworks were logically organized and reasonably easy to use, with NCCN rated somewhat easier. Convergent validity among the ASCO, ESMO, ICER, and NCCN frameworks was fair to excellent, increasing with clinical benefit subdomain concordance and simplicity of drug trial data. Interrater reliability, highest for ASCO and ESMO, improved with clarity of instructions and specificity of score definitions. Continued use, analyses, and refinements
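
    Kendall's W, the concordance statistic reported above, can be computed directly from a raters-by-items score matrix by ranking items within each rater. The snippet below shows the basic formula (without a tie correction) on an invented panel-score matrix.

```python
import numpy as np
from scipy.stats import rankdata

# Kendall's coefficient of concordance (W) for m raters scoring n drugs;
# the score matrix is invented purely to show the computation (no tie correction).
def kendalls_w(scores):
    ranks = np.apply_along_axis(rankdata, 1, scores)   # rank items within each rater
    m, n = ranks.shape
    r_sum = ranks.sum(axis=0)
    s = ((r_sum - r_sum.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))

panel_scores = np.array([[3, 1, 4, 2, 5],
                         [2, 1, 5, 3, 4],
                         [3, 2, 4, 1, 5]])
print(round(kendalls_w(panel_scores), 3))
```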

  2. Growth and setting of gas bubbles in a viscoelastic matrix imaged by X-ray microtomography: the evolution of cellular structures in fermenting wheat flour dough.

    Science.gov (United States)

    Turbin-Orger, A; Babin, P; Boller, E; Chaunier, L; Chiron, H; Della Valle, G; Dendievel, R; Réguerre, A L; Salvo, L

    2015-05-07

    X-ray tomography is a relevant technique for the dynamic follow-up of gas bubbles in an opaque viscoelastic matrix, especially using image analysis. It has been applied here to pieces of fermenting wheat flour dough of various compositions, at two different voxel sizes (15 and 5 μm). The resulting evolution of the main cellular features shows that the creation of cellular structures follows two regimes that are defined by a characteristic time of connectivity, tc [30-80 min]: first (t ≤ tc), bubbles grow freely and then (t ≥ tc) they become connected since the percolation of the gas phase is limited by liquid films. During the first regime, bubbles can be tracked and the local strain rate can be measured. Its values (10^-4 to 5 × 10^-4 s^-1) are in agreement with those computed from dough viscosity and internal gas pressure, both of which depend on the composition. For higher porosity, P = 0.64 in our case, and thus occurring in the second regime, different cellular structures are obtained and XRT images show deformed gas cells that display complex shapes. The comparison of these images with confocal laser scanning microscopy images suggests the presence of liquid films that separate these cells. The dough can therefore be seen as a three-phase medium: viscoelastic matrix/gas cell/liquid phase. The contributions of the different levels of matter organization can be integrated by defining a capillary number (C = 0.1-1) that makes it possible to predict the macroscopic dough behavior.
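
    The capillary number mentioned at the end compares viscous stresses on a bubble with capillary (surface tension) stresses. The back-of-the-envelope sketch below uses the common definition Ca = eta * strain_rate * R / sigma with assumed, order-of-magnitude dough values; the paper's exact definition and inputs may differ.

```python
# Order-of-magnitude capillary number, Ca = eta * strain_rate * R / sigma.
# All values below are assumptions chosen for illustration, not measurements.
eta = 1.0e5          # dough apparent viscosity, Pa.s (assumed)
strain_rate = 4e-4   # local strain rate, 1/s (within the 1e-4 to 5e-4 range above)
radius = 300e-6      # bubble radius, m (assumed)
sigma = 0.04         # surface tension of the liquid phase, N/m (assumed)

ca = eta * strain_rate * radius / sigma
print(f"Ca ~ {ca:.2f}")   # falls in the 0.1-1 range quoted in the abstract
```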

  3. The development and validation of Huaxi emotional-distress index (HEI): A Chinese questionnaire for screening depression and anxiety in non-psychiatric clinical settings.

    Science.gov (United States)

    Wang, Jian; Guo, Wan-Jun; Zhang, Lan; Deng, Wei; Wang, Hui-Yao; Yu, Jian-Ying; Luo, Shan-Xia; Huang, Ming-Jin; Dong, Zai-Quan; Li, Da-Jiang; Song, Jin-Ping; Jiang, Yu; Cheng, Nan-Sheng; Liu, Xie-He; Li, Tao

    2017-07-01

    Depression and anxiety among general hospital patients are common and under-recognized in China. This study aimed toward developing a short questionnaire for screening depression and anxiety in non-psychiatric clinical settings, and to test its reliability and validity. The item pool which included 35 questions about emotional distress was drafted through a comprehensive literature review. An expert panel review and the first clinical test with 288 general hospital patients were conducted for the primary item selection. The second clinical test was performed to select the final item in 637 non-psychiatric patients. The reliability and validity of the final questionnaire were tested in 763 non-psychiatric patients, in which 211 subjects were interviewed by psychiatrists using Mini International Neuropsychiatric Interview (MINI). Multiple data analysis methods including principal components analysis (PCA), item response theory (IRT), and receiver operating characteristic (ROC) curve were used to select items and validate the final questionnaire. The series selection of items resulted in a 9-item questionnaire, namely Huaxi Emotional-distress Index (HEI). The Cronbach's α coefficient of HEI was 0.90. The PCA results showed a unidimensional construct. The area under the ROC curve (AUC) was 0.88 when compared with MINI interview. Using the optimal cut-off score of HEI (11/12), the sensitivity and specificity were 0.880 and 0.766, respectively. The HEI is considered as a reliable and valid instrument for screening depression and anxiety, which may have substantial clinical value to detect patients' emotional disturbances especially in the busy non-psychiatric clinical settings in China. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Validation of five minimally obstructive methods to estimate physical activity energy expenditure in young adults in semi-standardized settings

    DEFF Research Database (Denmark)

    Schneller, Mikkel Bo; Pedersen, Mogens Theisen; Gupta, Nidhi

    2015-01-01

    We compared the accuracy of five objective methods, including two newly developed methods combining accelerometry and activity type recognition (Acti4), against indirect calorimetry, to estimate total energy expenditure (EE) of different activities in semi-standardized settings. Fourteen participants performed a standardized and semi-standardized protocol including seven daily life activity types, while having their EE measured by indirect calorimetry. Simultaneously, physical activity was quantified by an ActivPAL3, two ActiGraph GT3X+'s and an Actiheart. EE was estimated by the standard ... variations in measured physical activity EE by indirect calorimetry, respectively. This study concludes that combining accelerometer data from a thigh-worn ActiGraph GT3X+ with activity type recognition improved the accuracy of activity specific EE estimation against indirect calorimetry in semi-standardized settings.

  5. Validation of the Environmental Audit Tool in both purpose-built and non-purpose-built dementia care settings.

    Science.gov (United States)

    Smith, Ronald; Fleming, Richard; Chenoweth, Lynn; Jeon, Yun-Hee; Stein-Parbury, Jane; Brodaty, Henry

    2012-09-01

    To provide further validation of the Environmental Audit Tool (EAT) by describing data on scores from 56 facilities and comparing the scores of facilities with a purpose-built dementia environment with those with non-purpose-built designs. Fifty-six facilities were assessed with the EAT. EAT scores for 24 purpose-built environments were compared with 32 non-purpose-built environments using a Wilcoxon rank-sum test. Descriptive data on EAT scores are presented across all facilities. Facilities scored well on safety/security, familiarity, highlighting useful stimuli and privacy. Purpose-built unit scores were significantly higher than those for non-purpose-built environments for nine of 10 subscales of the EAT and the overall EAT score. The EAT can assess the quality of homelike environments in residential aged care facilities for people with dementia, differentiate between the quality of design in various types of facilities and provide an evidence basis for devising improvements. © 2011 The Authors. Australasian Journal on Ageing © 2011 ACOTA.

  6. Appearance and Concurrent Validity of an Instrument for Assessing Disability in People with Chronic Spinal Cord Injury, Based on the ICF Core Set

    Directory of Open Access Journals (Sweden)

    Claudia Patricia Henao Lema

    2013-09-01

    Full Text Available Objective. To determine the appearance and concurrent validity of an instrument for assessing disability in people with chronic spinal cord injury (SCI-DAS), based on the ICF Core Set. Methodology. The study was conducted with a group of 100 Colombians from four cities who had been living with spinal cord injury for longer than six months. Eight physical therapists, with an average professional experience of over 6.75 years, participated in this study. Appearance validity was assessed through a focus group and a survey of observers; the coefficient of variation and the relevance and appropriateness indices of the items were calculated. Concurrent validity was analyzed against the AIS (American Spinal Injury Association [ASIA] Impairment Scale) and the WHO-DAS II disability scale, using the Spearman correlation coefficient. Results. The overall relevance and adequacy of the instrument yielded averages of 4.83/5 and 4.48/5, with a coefficient of variation of 0.03. The agreement index among observers for ratings of good and excellent reached 0.96 for relevance and 0.86 for adequacy. Disability measured by the SCI-DAS showed a moderate, significant correlation with the neurological level and the AIS motor and sensory indices, and a high correlation with disability measured by the WHO-DAS II (p < 0.001). A marginal, statistically low-level correlation with the AIS impairment scale was found (p = 0.052). Conclusions. In general, the instrument (SCI-DAS) showed good appearance validity. Its concurrent validity against the AIS impairment scale and the WHO-DAS II disability scale was also evidenced.

  7. Influence of different process settings conditions on the accuracy of micro injection molding simulations: an experimental validation

    DEFF Research Database (Denmark)

    Tosello, Guido; Gava, Alberto; Hansen, Hans Nørgaard

    2009-01-01

    Currently available software packages exhibit poor results accuracy when performing micro injection molding (µIM) simulations. However, with an appropriate set-up of the processing conditions, the quality of results can be improved. The effects on the simulation results of different and alternative process conditions are investigated, namely the nominal injection speed, as well as the cavity filling time and the evolution of the cavity injection pressure as experimental data. In addition, the sensitivity of the results to the quality of the rheological data is analyzed. Simulated results are compared with experiments in terms of flow front position at part and micro-feature levels, as well as cavity injection filling time measurements.

  8. Creation and validation of a novel body condition scoring method for the magellanic penguin (Spheniscus magellanicus) in the zoo setting.

    Science.gov (United States)

    Clements, Julie; Sanchez, Jessica N

    2015-11-01

    This research aims to validate a novel, visual body scoring system created for the Magellanic penguin (Spheniscus magellanicus) and suitable for the zoo practitioner. Magellanics go through marked seasonal fluctuations in body mass gains and losses. A standardized multi-variable visual body condition guide may provide a more sensitive and objective assessment tool compared to the previously used single-variable method. Accurate body condition scores paired with seasonal weight variation measurements give veterinary and keeper staff a clearer understanding of an individual's nutritional status. San Francisco Zoo staff previously used a nine-point body condition scale based on the classic bird standard of a single point of keel palpation with the bird restrained in hand, with no standard measure of reference assigned to each scoring category. We created a novel, visual body condition scoring system that does not require restraint and assesses subcutaneous fat and muscle at seven body landmarks using illustrations and descriptive terms. The scores range from one (the least robust, or under-conditioned) to five (the most robust, or over-conditioned). The ratio of body weight to wing length was used as a "gold standard" index of body condition and compared to both the novel multi-variable and previously used single-variable body condition scores. The novel multi-variable scale showed improved agreement with the weight:wing ratio compared to the single-variable scale, demonstrating greater accuracy and reliability when a trained assessor uses the multi-variable body condition scoring system. Zoo staff may use this tool to manage both the colony and the individual to assist in seasonally appropriate Magellanic penguin nutrition assessment. © 2015 Wiley Periodicals, Inc.

  9. An internet-based hearing test for simple audiometry in nonclinical settings: preliminary validation and proof of principle.

    Science.gov (United States)

    Honeth, Louise; Bexelius, Christin; Eriksson, Mikael; Sandin, Sven; Litton, Jan-Eric; Rosenhall, Ulf; Nyrén, Olof; Bagger-Sjöbäck, Dan

    2010-07-01

    To investigate the validity and reproducibility of a newly developed internet-based self-administered hearing test using clinical pure-tone air-conducted audiometry as gold standard. Cross-sectional intrasubject comparative study. Karolinska University Hospital, Solna, Sweden. Seventy-two participants (79% women) with mean age of 45 years (range, 19-71 yr). Twenty participants had impaired hearing according to the gold standard test. Hearing tests. The Pearson correlation coefficient between the results of the studied Internet-based hearing test and the gold standard test, the greatest mean differences in decibel between the 2 tests over tested frequencies, sensitivity and specificity to diagnose hearing loss defined by Heibel-Lidén, and test-retest reproducibility with the Pearson correlation coefficient. The Pearson correlation coefficient was 0.94 (p < 0.0001) for the right ear and 0.93 for the left (p = 0.0001). The greatest mean differences were seen for the frequencies 2 and 4 kHz, with -5.6 dB (standard deviation, 8.29), and -5.1 dB (standard deviation, 6.9), respectively. The 75th percentiles of intraindividual test-gold standard differences did not exceed -10 dB for any of the frequencies. The sensitivity for hearing loss was 75% (95% confidence interval, 51%-90%), and the specificity was 96% (95% confidence interval, 86%-99%). The test-retest reproducibility was excellent, with a Pearson correlation coefficient of 0.99 (p < 0.0001) for both ears. It is possible to assess hearing with reasonable accuracy using an Internet-based hearing test on a personal computer with headphones. The practical viability of self-administration in participants' homes needs further evaluation.

  10. International Life Sciences Institute North America Listeria monocytogenes strain collection: development of standard Listeria monocytogenes strain sets for research and validation studies.

    Science.gov (United States)

    Fugett, Eric; Fortes, Esther; Nnoka, Catherine; Wiedmann, Martin

    2006-12-01

    Research and development efforts on bacterial foodborne pathogens, including the development of novel detection and subtyping methods, as well as validation studies for intervention strategies can greatly be enhanced through the availability and use of standardized strain collections. These types of strain collections are available for some foodborne pathogens, such as Salmonella and Escherichia coli. We have developed a standard Listeria monocytogenes strain collection that has not been previously available. The strain collection includes (i) a diversity set of 25 isolates chosen to represent a genetically diverse set of L. monocytogenes isolates as well as a single hemolytic Listeria innocua strain and (ii) an outbreak set, which includes 21 human and food isolates from nine major human listeriosis outbreaks that occurred between 1981 and 2002. The diversity set represents all three genetic L. monocytogenes lineages (I, n = 9; II, n = 9; and III, n = 6) as well as nine different serotypes. Molecular subtyping by EcoRI automated ribotyping and pulsed-field gel electrophoresis (PFGE) with AscI and ApaI separated the 25 isolates in the diversity set into 23 ribotypes and 25 PFGE types, confirming that this isolate set represents considerable genetic diversity. Molecular subtyping of isolates in the outbreak set confirmed that human and food isolates were identical by ribotype and PFGE, except for human and food isolates for two outbreaks, which displayed related but distinct PFGE patterns. Subtype and source data for all isolates in this strain collection are available on the Internet and are linked to the PathogenTracker database (www.pathogentracker.com), which allows the addition of new, relevant information on these isolates, including links to publications that have used isolates from this collection. We have thus developed a core L. monocytogenes strain collection, which will provide a resource for L. monocytogenes research and development efforts with

  11. The setting time of a clay-slag geopolymer matrix: the influence of blast-furnace-slag addition and the mixing method

    Czech Academy of Sciences Publication Activity Database

    Perná, Ivana; Hanzlíček, Tomáš

    112, Part 1, JAN 20 (2016), pp. 1150-1155, ISSN 0959-6526. Institutional support: RVO:67985891. Keywords: blast-furnace slag * geopolymer * setting time * mixing method * solidification * recycling. Subject RIV: DM - Solid Waste and Recycling. Impact factor: 5.715, year: 2016

  12. A Central European precipitation climatology – Part I: Generation and validation of a high-resolution gridded daily data set (HYRAS)

    Directory of Open Access Journals (Sweden)

    Monika Rauthe

    2013-07-01

    Full Text Available A new precipitation climatology (DWD/BfG-HYRAS-PRE) is presented which covers the river basins in Germany and neighbouring countries. In order to satisfy hydrological requirements, the gridded dataset has a high spatial resolution of 1 km² and a daily temporal resolution, and is based on up to 6200 precipitation stations within the spatial domain. The period of coverage extends from 1951 to 2006, for which gridded daily precipitation fields were calculated from the station data using the REGNIE method, a combination of multiple linear regression accounting for orographic conditions and inverse distance weighting. One of the main attributes of the REGNIE method is the preservation of the station values in their respective grid cells. A detailed validation of the data set using cross-validation and jackknifing showed both seasonally and spatially dependent interpolation errors. These errors provide, for further applications of the HYRAS data set within the KLIWAS project and other studies, an estimate of its certainty and quality. The mean absolute error was found to be less than 2 mm/day, but with both spatial and temporal variability. Additionally, the need for a high station network density was shown. Comparisons with other existing data sets show good agreement, with areas of orographic complexity displaying the largest differences within the domain. These differences are largely due to uncertainties caused by differences in the interpolation method, the available station network density, and the topographical information used. First climatological applications are presented and show the high potential of this new high-resolution data set. Generally, significant increases of up to 40% in winter precipitation and slight decreases in summer are shown, whereby the spatial variability of the strength and significance of the trends is clearly illustrated.
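
    The REGNIE scheme itself is only described in the abstract as "regression plus inverse distance weighting", so the sketch below illustrates just the inverse-distance-weighting step on made-up station data; the coordinates, values and the power parameter are assumptions, not part of the HYRAS method.

```python
import numpy as np

def idw(grid_xy, station_xy, station_values, power=2.0):
    """Inverse distance weighting: each grid cell gets a weighted mean of station
    values, with weights 1 / distance**power. Purely illustrative; REGNIE adds an
    orography-aware regression on top of a step like this."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)      # avoid division by zero when a cell sits on a station
    w = 1.0 / d ** power
    return (w @ station_values) / w.sum(axis=1)

# Hypothetical stations (x, y in km) and daily precipitation (mm)
stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])
precip = np.array([2.0, 10.0, 6.0])
grid = np.array([[2.0, 1.0], [8.0, 2.0], [5.0, 5.0]])
print(idw(grid, stations, precip))
```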

  13. A validated assay for measuring doxorubicin in biological fluids and tissues in an isolated lung perfusion model: matrix effect and heparin interference strongly influence doxorubicin measurements.

    Science.gov (United States)

    Kümmerle, A; Krueger, T; Dusmet, M; Vallet, C; Pan, Y; Ris, H B; Decosterd, Laurent A

    2003-10-15

    Doxorubicin is an antineoplastic agent active against sarcoma pulmonary metastasis, but its clinical use is hampered by its myelotoxicity and its cumulative cardiotoxicity, when administered systemically. This limitation may be circumvented using the isolated lung perfusion (ILP) approach, wherein a therapeutic agent is infused locoregionally after vascular isolation of the lung. The influence of the mode of infusion (anterograde (AG): through the pulmonary artery (PA); retrograde (RG): through the pulmonary vein (PV)) on doxorubicin pharmacokinetics and lung distribution was unknown. Therefore, a simple, rapid and sensitive high-performance liquid chromatography method has been developed to quantify doxorubicin in four different biological matrices (infusion effluent, serum, tissues with low or high levels of doxorubicin). The related compound daunorubicin was used as internal standard (I.S.). Following a single-step protein precipitation of 500 microl samples with 250 microl acetone and 50 microl zinc sulfate 70% aqueous solution, the obtained supernatant was evaporated to dryness at 60 degrees C for exactly 45 min under a stream of nitrogen and the solid residue was solubilized in 200 microl of purified water. A 100 microl-volume was subjected to HPLC analysis on a Nucleosil 100-5 microm C18 AB column equipped with a guard column (Nucleosil 100-5 microm C(6)H(5) (phenyl) end-capped) using a gradient elution of acetonitrile and 1-heptanesulfonic acid 0.2% pH 4: 15/85 at 0 min-->50/50 at 20 min-->100/0 at 22 min-->15/85 at 24 min-->15/85 at 26 min, delivered at 1 ml/min. The analytes were detected by fluorescence detection with excitation and emission wavelengths set at 480 and 550 nm, respectively. The calibration curves were linear over the range of 2-1000 ng/ml for effluent and plasma matrices, and 0.1 microg/g-750 microg/g for tissue matrices. The method is precise with inter-day and intra-day relative standard deviation between 0.5% and 6.7% and accurate with

  14. RelMon: A general approach to QA, validation and physics analysis through comparison of large sets of histograms

    CERN Document Server

    Piparo, Danilo

    2012-01-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, therefore improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparison rankings are available, as well as all the plots.
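
    RelMon itself is ROOT-based; purely as a hedged illustration of the underlying idea (scoring the compatibility of a histogram pair with a goodness-of-fit test and ranking the pairs by p-value), here is a SciPy sketch using a two-sample chi-squared statistic on binned counts. The bin contents are invented.

```python
import numpy as np
from scipy.stats import chi2

def hist_chi2_compat(counts_a, counts_b):
    """Two-sample chi-squared compatibility test for a pair of unweighted histograms
    with identical binning (a sketch of the kind of per-pair test RelMon runs).
    Returns the p-value; bins empty in both histograms are skipped."""
    a = np.asarray(counts_a, float)
    b = np.asarray(counts_b, float)
    total_a, total_b = a.sum(), b.sum()
    mask = (a + b) > 0
    stat = np.sum(
        (np.sqrt(total_b / total_a) * a[mask] - np.sqrt(total_a / total_b) * b[mask]) ** 2
        / (a[mask] + b[mask])
    )
    dof = mask.sum() - 1
    return chi2.sf(stat, dof)

# Two invented histograms of the same quantity from two software releases
release_1 = [120, 340, 510, 280, 90, 20]
release_2 = [118, 352, 498, 291, 84, 25]
p = hist_chi2_compat(release_1, release_2)
print(f"compatibility p-value: {p:.3f}")   # rank all pairs by p-value, flag the lowest
```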

  15. Validation and use of a QuEChERS-based gas chromatographic-tandem mass spectrometric method for multiresidue pesticide analysis in blackcurrants including studies of matrix effects and estimation of measurement uncertainty.

    Science.gov (United States)

    Walorczyk, Stanisław

    2014-03-01

    A triple quadrupole GC-QqQ-MS/MS method was optimized for multiresidue analysis of over 180 pesticides in blackcurrants. The samples were prepared by using a modified quick, easy, cheap, effective, rugged and safe (QuEChERS) analytical protocol. To reduce matrix co-extractives in the final extract, the supernatant was cleaned up by dispersive solid-phase extraction (dispersive-SPE) with a mixture of sorbents: primary secondary amine (PSA), octadecyl (C18) and graphitized carbon black (GCB). The validation results demonstrated fitness for purpose of the streamlined method. The overall recoveries at the three spiking levels of 0.01, 0.05 and 0.2 mg kg⁻¹ ranged from 70% to 116% (102% on average), with relative standard deviation (RSD) values between 3% and 19%, except for chlorothalonil (23%). Response linearity was studied in the range between 0.005 and 0.5 mg kg⁻¹. The matrix effect for each individual compound was evaluated through the ratio of the calibration slopes obtained in solvent and in blackcurrant matrix. The optimized method kept the matrix effect small for most compounds; for the remaining compounds the matrix effect was 10-20%, 20-30% or >30%. Following the application of the "top-down" approach, the expanded measurement uncertainty was estimated at 21% on average (coverage factor k=2, confidence level 95%). Compared with samples of other crops, the analyses of blackcurrants revealed a high percentage of exceedances of the legislative maximum residue levels (MRLs), as well as some instances of detection of pesticides not approved for this crop. © 2013 Published by Elsevier B.V.
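
    The slope-ratio evaluation of the matrix effect mentioned above can be sketched as follows; the calibration levels and peak areas are invented, and ordinary least squares via numpy.polyfit stands in for whatever regression the authors used.

```python
import numpy as np

# Hypothetical calibration data for one pesticide (mg/kg vs. peak area)
levels = np.array([0.005, 0.01, 0.05, 0.1, 0.2, 0.5])
area_solvent = np.array([510, 1020, 5100, 10150, 20400, 50900])  # standards in pure solvent
area_matrix  = np.array([470, 930, 4650, 9300, 18700, 46500])    # matrix-matched standards

slope_solvent = np.polyfit(levels, area_solvent, 1)[0]
slope_matrix = np.polyfit(levels, area_matrix, 1)[0]

# Matrix effect as the percent deviation of the matrix slope from the solvent slope:
# negative values indicate signal suppression, positive values enhancement.
matrix_effect = (slope_matrix / slope_solvent - 1.0) * 100.0
print(f"matrix effect: {matrix_effect:+.1f}%")
```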

  16. Validation of Landsat-7 ETM+ MEM Thermal Improvement in Thermal Vacuum Tests and in Flight Due to Lower Louver Set Points

    Science.gov (United States)

    Choi, Michael K.

    1999-01-01

    The Enhanced Thematic Mapper Plus (ETM+) Main Electronics Module (MEM) power supply heat sink temperature is critical to the Landsat-7 mission. It is strongly dependent on the thermal louver design. A lower power supply heat sink temperature increases the reliability of the MEM and reduces the risk of overheating and thermal shut-down. After the power supply failures in ETM+ instrument thermal vacuum tests #1 and #2, the author performed detailed thermal analyses of the MEM and proposed to reduce the louver set points by 7°C. At the 1998 Intersociety Energy Conversion Engineering Conference (IECEC), the author presented a paper that included results of the thermal analysis of the MEM. It showed that a 7°C reduction of the louver set points could reduce the maximum power supply heat sink temperature in thermal vacuum test and in flight to below 20°C in the cooler outgas mode and in the nominal imaging mode, and has no significant impact on the standby heater duty cycle. It also showed that the effect of Earth infrared and albedo on the power supply heat sink temperature is small. The louver set point reduction was implemented in June 1998, just prior to ETM+ thermal vacuum test #3. Results of the thermal vacuum tests, and temperature data in flight, validate the MEM thermal performance improvement due to the 7°C reduction of the louver set points.

  17. Selection and validation of a set of reliable reference genes for quantitative sod gene expression analysis in C. elegans

    Directory of Open Access Journals (Sweden)

    Vandesompele Jo

    2008-01-01

    Full Text Available. Background: In the nematode Caenorhabditis elegans the conserved Ins/IGF-1 signaling pathway regulates many biological processes including life span, stress response, dauer diapause and metabolism. Detection of differentially expressed genes may contribute to a better understanding of the mechanism by which the Ins/IGF-1 signaling pathway regulates these processes. Appropriate normalization is an essential prerequisite for obtaining accurate and reproducible quantification of gene expression levels. The aim of this study was to establish a reliable set of reference genes for gene expression analysis in C. elegans. Results: Real-time quantitative PCR was used to evaluate the expression stability of 12 candidate reference genes (act-1, ama-1, cdc-42, csq-1, eif-3.C, mdh-1, gpd-2, pmp-3, tba-1, Y45F10D.4, rgs-6 and unc-16) in wild-type, three Ins/IGF-1 pathway mutants, dauers and L3 stage larvae. After geNorm analysis, cdc-42, pmp-3 and Y45F10D.4 showed the most stable expression pattern and were used to normalize 5 sod expression levels. Significant differences in mRNA levels were observed for sod-1 and sod-3 in daf-2 relative to wild-type animals, whereas in dauers sod-1, sod-3, sod-4 and sod-5 are differentially expressed relative to third stage larvae. Conclusion: Our findings emphasize the importance of accurate normalization using stably expressed reference genes. The methodology used in this study is generally applicable to reliably quantify gene expression levels in the nematode C. elegans using quantitative PCR.
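
    As a hedged sketch of the geNorm idea used above (not the geNorm software itself), the following computes the gene-stability measure M, the mean standard deviation of pairwise log2 expression ratios, for a set of candidate reference genes; the expression values are invented. In geNorm the least stable gene is removed iteratively, and the final normalization factor is the geometric mean of the most stable genes.

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability measure M for candidate reference genes.

    expr: (samples x genes) array of relative expression quantities (linear scale).
    For each gene, M is the mean standard deviation of the pairwise log2 ratios with
    every other candidate; lower M means more stable expression. Input values are
    invented; in practice they come from qPCR efficiencies and Cq values."""
    log_expr = np.log2(expr)
    n_genes = log_expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)  # log2(a_j / a_k)
        m[j] = ratios.std(axis=0, ddof=1).mean()
    return m

# Toy data: 6 samples x 4 candidate reference genes with increasing variability
rng = np.random.default_rng(1)
expr = 2.0 ** rng.normal(0.0, [0.2, 0.25, 0.6, 0.9], size=(6, 4))
print(genorm_m(expr))   # the genes with the lowest M would be kept for normalization
```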

  18. Cross-validation of biomarkers for the early differential diagnosis and prognosis of dementia in a clinical setting

    Energy Technology Data Exchange (ETDEWEB)

    Perani, Daniela [Vita-Salute San Raffaele University, Milan (Italy); San Raffaele Scientific Institute, Division of Neuroscience, Milan (Italy); San Raffaele Hospital, Nuclear Medicine Unit, Milan (Italy); Cerami, Chiara [Vita-Salute San Raffaele University, Milan (Italy); San Raffaele Scientific Institute, Division of Neuroscience, Milan (Italy); San Raffaele Hospital, Clinical Neuroscience Department, Milan (Italy); Caminiti, Silvia Paola [Vita-Salute San Raffaele University, Milan (Italy); San Raffaele Scientific Institute, Division of Neuroscience, Milan (Italy); Santangelo, Roberto; Coppi, Elisabetta; Ferrari, Laura; Magnani, Giuseppe [San Raffaele Hospital, Department of Neurology, Milan (Italy); Pinto, Patrizia [Papa Giovanni XXIII Hospital, Department of Neurology, Bergamo (Italy); Passerini, Gabriella [Servizio di Medicina di Laboratorio OSR, Milan (Italy); Falini, Andrea [Vita-Salute San Raffaele University, Milan (Italy); San Raffaele Scientific Institute, Division of Neuroscience, Milan (Italy); San Raffaele Hospital, CERMAC - Department of Neuroradiology, Milan (Italy); Iannaccone, Sandro [San Raffaele Hospital, Clinical Neuroscience Department, Milan (Italy); Cappa, Stefano Francesco [San Raffaele Scientific Institute, Division of Neuroscience, Milan (Italy); IUSS Pavia, Pavia (Italy); Comi, Giancarlo [Vita-Salute San Raffaele University, Milan (Italy); San Raffaele Hospital, Department of Neurology, Milan (Italy); Gianolli, Luigi [San Raffaele Hospital, Nuclear Medicine Unit, Milan (Italy)

    2016-03-15

    The aim of this study was to evaluate the supportive role of molecular and structural biomarkers (CSF protein levels, FDG PET and MRI) in the early differential diagnosis of dementia in a large sample of patients with neurodegenerative dementia, and in determining the risk of disease progression in subjects with mild cognitive impairment (MCI). We evaluated the supportive role of CSF Aβ42, t-Tau, p-Tau levels, conventional brain MRI and visual assessment of FDG PET SPM t-maps in the early diagnosis of dementia and the evaluation of MCI progression. Diagnosis based on molecular biomarkers showed the best fit with the final diagnosis at a long follow-up. FDG PET SPM t-maps had the highest diagnostic accuracy in Alzheimer's disease and in the differential diagnosis of non-Alzheimer's disease dementias. The p-tau/Aβ42 ratio was the only CSF biomarker providing a significant classification rate for Alzheimer's disease. An Alzheimer's disease-positive metabolic pattern as shown by FDG PET SPM in MCI was the best predictor of conversion to Alzheimer's disease. In this clinical setting, FDG PET SPM t-maps and the p-tau/Aβ42 ratio improved clinical diagnostic accuracy, supporting the importance of these biomarkers in the emerging diagnostic criteria for Alzheimer's disease dementia. FDG PET using SPM t-maps had the highest predictive value by identifying hypometabolic patterns in different neurodegenerative dementias and normal brain metabolism in MCI, confirming its additional crucial exclusionary role.

  19. Self-organising maps and correlation analysis as a tool to explore patterns in excitation-emission matrix data sets and to discriminate dissolved organic matter fluorescence components.

    Science.gov (United States)

    Ejarque-Gonzalez, Elisabet; Butturini, Andrea

    2014-01-01

    Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
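
    For readers unfamiliar with SOM, a minimal NumPy training loop is sketched below (not the authors' implementation; the grid size, learning-rate and neighbourhood schedules, and the toy "EEM" vectors are all arbitrary). After training, the component planes examined in the paper correspond to slices of the prototype array along the spectral axis.

```python
import numpy as np

def train_som(data, grid=(8, 8), n_iter=3000, lr0=0.5, sigma0=3.0, seed=0):
    """Tiny self-organising map trained on flattened EEM spectra (rows = samples).

    Illustrative only: real EEM work would first unfold and scale the matrices, and
    the schedules used here are simple exponential decays, not tuned choices."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    # initialise prototypes from randomly drawn samples
    weights = data[rng.integers(len(data), size=rows * cols)].astype(float)
    # (row, col) coordinate of every map unit, used by the neighbourhood function
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))                  # Gaussian neighbourhood
        weights += lr * h[:, None] * (x - weights)
    return weights.reshape(rows, cols, -1)

# Toy "EEM" data: 50 samples, each flattened to a 30-value excitation-emission vector
rng = np.random.default_rng(2)
eems = rng.random((50, 30))
som = train_som(eems)
print(som.shape)          # (8, 8, 30): one prototype spectrum per map unit
```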

  20. Validity, reliability and utility of the Irish Nursing Minimum Data Set for General Nursing in investigating the effectiveness of nursing interventions in a general nursing setting: A repeated measures design.

    Science.gov (United States)

    Morris, Roisin; Matthews, Anne; Scott, Anne P

    2014-04-01

    Internationally, nursing professionals are coming under increasing pressure to highlight the contribution they make to health care and patient outcomes. Despite this, difficulties exist in the provision of quality information aimed at describing nursing work in sufficient detail. The Irish Minimum Data Set for General Nursing is a new nursing data collection system aimed at highlighting the contribution of nursing to patient care. The objectives of this study were to investigate the construct validity and internal reliability of the Irish Nursing Minimum Data Set for General Nursing and to assess its usefulness in measuring the mediating effects of nursing interventions on patient well-being for a group of short stay medical and surgical patients. This was a quantitative study using a repeated measures design. Participants sampled came from both general surgery and general medicine wards in 6 hospitals throughout the Republic of Ireland. Nurses took on the role of data collectors. Nurses participating in the study were qualified, registered nurses engaged in direct patient care. Because the unit of analysis for this study was the patient day, patient numbers were considered in estimations of sample size requirements. A total of 337 usable Nursing Minimum Data Set booklets were collected. The construct validity of the tool was established using exploratory factor analysis with a Promax rotation and Maximum Likelihood extraction. Internal reliability was established using the Cronbach's Alpha coefficient. Path analysis was used to assess the mediating effects of nursing interventions on patient well-being. The results of the exploratory factor analysis and path analysis met the criteria for an appropriate model fit. All Cronbach Alpha scores were above .7. The overall findings of the study inferred that the Irish Nursing Minimum Data for General Nursing possessed construct validity and internal reliability. The study results also inferred the potential of the tool in
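
    The internal-reliability criterion used above (Cronbach's alpha above .7) can be computed directly from an item-score matrix; a short sketch with invented ratings:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Invented ratings: 10 respondents x 5 items on a 0-4 scale
scores = np.array([
    [3, 4, 3, 4, 3], [1, 2, 1, 1, 2], [4, 4, 4, 3, 4], [2, 2, 3, 2, 2],
    [0, 1, 0, 1, 0], [3, 3, 4, 3, 3], [2, 3, 2, 2, 3], [4, 3, 4, 4, 4],
    [1, 1, 2, 1, 1], [3, 4, 3, 3, 4],
])
print(round(cronbach_alpha(scores), 3))
```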

  1. In Vivo Validation of Volume Flow Measurements of Pulsatile Flow Using a Clinical Ultrasound System and Matrix Array Transducer.

    Science.gov (United States)

    Hudson, John M; Williams, Ross; Milot, Laurent; Wei, Qifeng; Jago, James; Burns, Peter N

    2017-03-01

    The goal of this study was to evaluate the accuracy of a non-invasive C-plane Doppler estimation of pulsatile blood flow in the lower abdominal vessels of a porcine model. Doppler ultrasound measurements from a matrix array transducer system were compared with invasive volume flow measurements made on the same vessels with a surgically implanted ultrasonic transit-time flow probe. For volume flow rates ranging from 60 to 750 mL/min, agreement was very good, with a Pearson correlation coefficient of 0.97 (p < 0.0001) and a mean bias of -4.2%. The combination of 2-D matrix array technology and fast processing gives this Doppler method clinical potential, as many of the user- and system-dependent parameters of previous methods, including explicit vessel angle and diameter measurements, are eliminated. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  2. Implementation and Operational Research: Risk Charts to Guide Targeted HIV-1 Viral Load Monitoring of ART: Development and Validation in Patients From Resource-Limited Settings.

    Science.gov (United States)

    Koller, Manuel; Fatti, Geoffrey; Chi, Benjamin H; Keiser, Olivia; Hoffmann, Christopher J; Wood, Robin; Prozesky, Hans; Stinson, Kathryn; Giddy, Janet; Mutevedzi, Portia; Fox, Matthew P; Law, Matthew; Boulle, Andrew; Egger, Matthias

    2015-11-01

    HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. In total, 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in Asia-Pacific. CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
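
    A hedged sketch of how such a targeted-testing rule can be evaluated (PPV, sensitivity and AUC) on a cohort; the risk scores and failure outcomes below are simulated, not data from the cited cohorts, and sklearn's roc_auc_score stands in for the study's AUC calculation.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def evaluate_targeted_testing(risk, failed, fraction_tested):
    """Evaluate a rule that sends the top `fraction_tested` of patients (ranked by a
    predicted risk of virologic failure) for viral load testing.
    Returns PPV and sensitivity on the supplied (illustrative) arrays."""
    risk = np.asarray(risk, float)
    failed = np.asarray(failed, bool)
    n_test = int(round(fraction_tested * len(risk)))
    tested = np.zeros(len(risk), bool)
    tested[np.argsort(risk)[::-1][:n_test]] = True   # highest-risk patients get a VL test
    ppv = failed[tested].mean()
    sensitivity = failed[tested].sum() / failed.sum()
    return ppv, sensitivity

rng = np.random.default_rng(3)
risk = rng.random(5000)                              # hypothetical CD4-based risk scores
failed = rng.random(5000) < 0.05 + 0.25 * risk       # failure more likely at high risk
for frac in (0.10, 0.20, 0.40):
    ppv, sens = evaluate_targeted_testing(risk, failed, frac)
    print(f"test {int(frac * 100)}%: PPV={ppv:.2f}, sensitivity={sens:.2f}")
print("AUC of the risk score:", round(roc_auc_score(failed, risk), 2))
```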

  3. Sluggish cognitive tempo and attention-deficit/hyperactivity disorder (ADHD) inattention in the home and school contexts: Parent and teacher invariance and cross-setting validity.

    Science.gov (United States)

    Burns, G Leonard; Becker, Stephen P; Servera, Mateu; Bernad, Maria Del Mar; García-Banda, Gloria

    2017-02-01

    This study examined whether sluggish cognitive tempo (SCT) and attention-deficit/hyperactivity disorder (ADHD) inattention (IN) symptoms demonstrated cross-setting invariance and unique associations with symptom and impairment dimensions across settings (i.e., home SCT and ADHD-IN uniquely predicting school symptom and impairment dimensions, and vice versa). Mothers, fathers, primary teachers, and secondary teachers rated SCT, ADHD-IN, ADHD-hyperactivity/impulsivity (HI), oppositional defiant disorder (ODD), anxiety, depression, academic impairment, social impairment, and peer rejection dimensions for 585 Spanish 3rd-grade children (53% boys). Within-setting (i.e., mothers, fathers; primary, secondary teachers) and cross-settings (i.e., home, school) invariance was found for both SCT and ADHD-IN. From home to school, higher levels of home SCT predicted lower levels of school ADHD-HI and higher levels of school academic impairment after controlling for home ADHD-IN, whereas higher levels of home ADHD-IN predicted higher levels of school ADHD-HI, ODD, anxiety, depression, academic impairment, and peer rejection after controlling for home SCT. From school to home, higher levels of school SCT predicted lower levels of home ADHD-HI and ODD and higher levels of home anxiety, depression, academic impairment, and social impairment after controlling for school ADHD-IN, whereas higher levels of school ADHD-IN predicted higher levels of home ADHD-HI, ODD, and academic impairment after controlling for school SCT. Although SCT at home and school was able to uniquely predict symptom and impairment dimensions in the other setting, SCT at school was a better predictor than ADHD-IN at school of psychopathology and impairment at home. Findings provide additional support for SCT's validity relative to ADHD-IN. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Chemical composition analysis and product consistency tests to support enhanced Hanford waste glass models. Results for the third set of high alumina outer layer matrix glasses

    Energy Technology Data Exchange (ETDEWEB)

    Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States)

    2015-12-01

    In this report, the Savannah River National Laboratory provides chemical analyses and Product Consistency Test (PCT) results for 14 simulated high level waste glasses fabricated by the Pacific Northwest National Laboratory. The results of these analyses will be used as part of efforts to revise or extend the validation regions of the current Hanford Waste Treatment and Immobilization Plant glass property models to cover a broader span of waste compositions. The measured chemical composition data are reported and compared with the targeted values for each component for each glass. All of the measured sums of oxides for the study glasses fell within the interval of 96.9 to 100.8 wt %, indicating recovery of all components. Comparisons of the targeted and measured chemical compositions showed that the measured values for the glasses met the targeted concentrations within 10% for those components present at more than 5 wt %. The PCT results were normalized to both the targeted and measured compositions of the study glasses. Several of the glasses exhibited increases in normalized concentrations (NC_i) after the canister centerline cooled (CCC) heat treatment. Five of the glasses, after the CCC heat treatment, had NC_B (normalized boron) values that exceeded that of the Environmental Assessment (EA) benchmark glass. These results can be combined with additional characterization, including X-ray diffraction, to determine the cause of the higher release rates.

  5. Matrix effect in F₂-isoprostanes quantification by HPLC-MS/MS: a validated method for analysis of iPF₂α-III and iPF₂α-VI in human urine.

    Science.gov (United States)

    Petrosino, Teresa; Serafini, Mauro

    2014-08-15

    Liquid chromatography coupled with tandem mass spectrometry (HPLC-MS/MS) has become the method of choice for analysis in biological matrices because of its high specificity and sensitivity. However, it should be taken into account that the presence of matrix components coeluting with analytes might interfere with the ionization process and affect the accuracy and precision of the assay. For this reason, the presence of a "matrix effect" should always be evaluated during method development, especially in a complex matrix such as urine. In the present work, an HPLC-MS/MS method was developed for the quantification of urinary iPF2α-III and iPF2α-VI. A careful assessment of the matrix effect and an accurate validation were carried out in order to verify the reliability of the quantitative data obtained. Ion suppression, due to the matrix components, was reduced through optimization of both the chromatographic method and the sample extraction procedure. Urine samples were purified by solid phase extraction (SPE) and the extracts injected into the HPLC-MS/MS system, equipped with a TurboIonSpray ionization source operated in negative ion mode (ESI(-)). Stable isotope-labeled analogues (iPF2α-III-d4 and iPF2α-VI-d4) were used as internal standards, and quantification was performed in multiple reaction monitoring (MRM) mode by monitoring the following mass transitions: m/z 353.4→193.2 for iPF2α-III, m/z 357.2→197.0 for iPF2α-III-d4, m/z 353.4→115.1 for iPF2α-VI, and m/z 357.4→115.1 for iPF2α-VI-d4. The validated assay, applied to the analysis of urinary samples from healthy and overweight subjects, proved suitable for an accurate quantification of iPF2α-III and iPF2α-VI in human urine. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Systematic evaluation of matrix effect and cross-talk-free method for simultaneous determination of zolmitriptan and N-desmethyl zolmitriptan in human plasma: a sensitive LC-MS/MS method validation and its application to a clinical pharmacokinetic study.

    Science.gov (United States)

    Patel, Bhargav; Suhagia, B N; Jangid, Arvind G; Mistri, Hiren N; Desai, Nirmal

    2016-03-01

    The objective of the present work was to carry out systematic evaluation to eliminate matrix effect owing to plasma phospholipids as observed during sample preparation and to develop a cross-talk-free sensitive, selective and rapid bioanalytical method for the simultaneous determination of zolmitriptan (ZT) and N-desmethyl zolmitriptan (DZT) in human plasma by liquid chromatography-tandem mass spectrometry using naratriptan as internal standard (IS). The analytes and IS were quantitatively extracted from 200 μL human plasma by solid phase extraction. No cross-talk was found between ZT and DZT having identical product ions. Quantitation was performed on a triple quadrupole mass spectrometer employing electrospray ionization technique, operating in multiple reaction monitoring and positive ion mode. The total chromatographic run time was 2.5 min. The method was fully validated for sensitivity, selectivity, specificity, linearity, accuracy, precision, recovery, matrix effect, dilution integrity and stability studies. The method was validated over a dynamic concentration range of 0.1-15 ng/mL for ZT and DZT. The method was successfully applied to a bioequivalence study of 2.5 mg ZT tablet formulation in 18 healthy Indian male subjects under fasting conditions. Assay reproducibility was assessed by reanalysis of 62 incurred samples. Copyright © 2015 John Wiley & Sons, Ltd.

  7. The Malawi Developmental Assessment Tool (MDAT): the creation, validation, and reliability of a tool to assess child development in rural African settings.

    Directory of Open Access Journals (Sweden)

    Melissa Gladstone

    2010-05-01

    Full Text Available Although 80% of children with disabilities live in developing countries, there are few culturally appropriate developmental assessment tools available for these settings. Often tools from the West provide misleading findings in different cultural settings, where some items are unfamiliar and reference values differ from those of Western populations. Following preliminary and qualitative studies, we produced a draft developmental assessment tool with 162 items in four domains of development. After face and content validity testing and piloting, we expanded the draft tool to 185 items. We then assessed 1,426 normal children aged 0-6 y from rural Malawi and derived age-standardized norms for all items. We examined the performance of items using logistic regression and reliability using kappa statistics. We then considered all items at a consensus meeting and removed those performing badly and those that were unnecessary or difficult to administer, leaving 136 items in the final Malawi Developmental Assessment Tool (MDAT). We validated the tool by comparing age-matched normal children with children with malnutrition (120) and neurodisabilities (80). Reliability was good for the remaining items, with 94%-100% of items scoring kappas >0.4 for interobserver immediate, delayed, and intra-observer testing. We demonstrated significant differences in overall mean scores (and individual domain scores) between children with neurodisabilities and normal children (35 versus 99, p<0.001). Using a pass/fail technique similar to the Denver II, 3% of children with neurodisabilities passed in comparison to 82% of normal children, demonstrating good sensitivity (97%) and specificity (82%). Overall mean scores of children with malnutrition (weight for height <80%) were also significantly different from scores of normal controls (62.5 versus 77.4, p<0.001); scores in the separate domains, excluding social development, also differed between malnourished children and

  8. Validation of the Reveal(®) 2.0 Group D1 Salmonella Test for Detection of Salmonella Enteritidis in Raw Shell Eggs and Poultry-Associated Matrixes.

    Science.gov (United States)

    Mozola, Mark; Biswas, Preetha; Viator, Ryan; Feldpausch, Emily; Foti, Debra; Li, Lin; Le, Quynh-Nhi; Alles, Susan; Rice, Jennifer

    2016-07-01

    A study was conducted to assess the performance of the Reveal(®) 2.0 Group D1 Salmonella lateral flow immunoassay for use in detection of Salmonella Enteritidis (SE) in raw shell eggs and poultry-associated matrixes, including chicken carcass rinse and poultry feed. In inclusivity testing, the Reveal 2.0 test detected all 37 strains of SE tested. The test also detected all but one of 18 non-Enteritidis somatic group D1 Salmonella serovars examined. In exclusivity testing, none of 42 strains tested was detected. The exclusivity panel included Salmonella strains of somatic groups other than D1, as well as strains of other genera of Gram-negative bacteria. In matrix testing, performance of the Reveal 2.0 test was compared to that of the U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook reference culture procedure for chicken carcass rinse and to that of the U.S. Food and Drug Administration Bacteriological Analytical Manual for raw shell eggs and poultry feed. For all matrixes evaluated, there were no significant differences in the ability to detect SE when comparing the Reveal 2.0 method and the appropriate reference culture procedure as determined by probability of detection statistical analysis. The ability of the Reveal 2.0 test to withstand modest perturbations to normal operating parameters was examined in robustness experiments. Results showed that the test can withstand deviations in up to three operating parameters simultaneously without significantly affecting performance. Real-time stability testing of multiple lots of Reveal 2.0 devices established the shelf life of the test device at 16 months postmanufacture.

  9. Validation of the GROMOS force-field parameter set 45A3 against nuclear magnetic resonance data of hen egg lysozyme

    Energy Technology Data Exchange (ETDEWEB)

    Soares, T. A. [ETH Hoenggerberg Zuerich, Laboratory of Physical Chemistry (Switzerland); Daura, X. [Universitat Autonoma de Barcelona, InstitucioCatalana de Recerca i Estudis Avancats and Institut de Biotecnologia i Biomedicina (Spain); Oostenbrink, C. [ETH Hoenggerberg Zuerich, Laboratory of Physical Chemistry (Switzerland); Smith, L. J. [University of Oxford, Oxford Centre for Molecular Sciences, Central Chemistry Laboratory (United Kingdom); Gunsteren, W. F. van [ETH Hoenggerberg Zuerich, Laboratory of Physical Chemistry (Switzerland)], E-mail: wfvgn@igc.phys.chem.ethz.ch

    2004-12-15

    The quality of molecular dynamics (MD) simulations of proteins depends critically on the biomolecular force field that is used. Such force fields are defined by force-field parameter sets, which are generally determined and improved through calibration of properties of small molecules against experimental or theoretical data. By application to large molecules such as proteins, a new force-field parameter set can be validated. We report two 3.5 ns molecular dynamics simulations of hen egg white lysozyme in water applying the widely used GROMOS force-field parameter set 43A1 and a new set 45A3. The two MD ensembles are evaluated against NMR spectroscopic data: NOE atom-atom distance bounds, ³J(NHα) and ³J(αβ) coupling constants, and ¹⁵N relaxation data. It is shown that the two sets reproduce structural properties about equally well. The 45A3 ensemble fulfills the atom-atom distance bounds derived from NMR spectroscopy slightly less well than the 43A1 ensemble, with most of the NOE distance violations in both ensembles involving residues located in loops or flexible regions of the protein. Convergence patterns are very similar in both simulations: atom-positional root-mean-square differences (RMSD) with respect to the X-ray and NMR model structures and NOE inter-proton distances converge within 1.0-1.5 ns, while backbone ³J(HNα) coupling constants and ¹H-¹⁵N order parameters take slightly longer, 1.0-2.0 ns. As expected, side-chain ³J(αβ) coupling constants and ¹H-¹⁵N order parameters do not reach full convergence for all residues in the time period simulated. This is particularly noticeable for side chains which display rare structural transitions. When comparing each simulation trajectory with an older and a newer set of experimental NOE data on lysozyme, it is found that the newer, larger, set of experimental data agrees as well with each of the
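
    As a side note on one of the convergence measures used above, the atom-positional RMSD after optimal rigid-body superposition can be computed with the Kabsch algorithm; the sketch below uses invented coordinates and is not the analysis tooling used by the authors.

```python
import numpy as np

def rmsd_after_fit(P, Q):
    """Atom-positional RMSD between two (N x 3) coordinate sets after optimal
    rigid-body superposition (Kabsch algorithm)."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                                  # covariance of the centered coordinates
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                   # avoid an improper rotation (reflection)
    R = Vt.T @ D @ U.T
    return np.sqrt(((P @ R.T - Q) ** 2).sum(axis=1).mean())

rng = np.random.default_rng(7)
ref = rng.random((100, 3)) * 10                  # e.g. backbone atoms of a reference model
theta = 0.3                                      # a rotated, translated, noisy copy stands
rot = np.array([[np.cos(theta), -np.sin(theta), 0],   # in for a trajectory frame
                [np.sin(theta),  np.cos(theta), 0],
                [0, 0, 1]])
frame = ref @ rot.T + np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.1, ref.shape)
print(round(rmsd_after_fit(frame, ref), 3))      # small: on the order of the added noise
```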

  10. Matrix calculus

    CERN Document Server

    Bodewig, E

    1959-01-01

    Matrix Calculus, Second Revised and Enlarged Edition focuses on systematic calculation with the building blocks of a matrix and rows and columns, shunning the use of individual elements. The publication first offers information on vectors, matrices, further applications, measures of the magnitude of a matrix, and forms. The text then examines eigenvalues and exact solutions, including the characteristic equation, eigenrows, extremum properties of the eigenvalues, bounds for the eigenvalues, elementary divisors, and bounds for the determinant. The text ponders on approximate solutions, as well

  11. Supervised Mineral Classification with Semi-automatic Training and Validation Set Generation in Scanning Electron Microscope Energy Dispersive Spectroscopy Images of Thin Sections

    DEFF Research Database (Denmark)

    Flesche, Harald; Nielsen, Allan Aasbjerg; Larsen, Rasmus

    2000-01-01

    The data can be approximated by a Poisson distribution; accordingly, the square root of the data has constant variance and a linear classifier can be used. Near-orthogonal input data enable the use of a minimum distance classifier. Results from both linear and quadratic minimum distance classifications … are applied to perform the classification. First, training and validation sets are grown from one or a few seed points by a method that ensures spatial and spectral closeness of observations. Spectral closeness is obtained by excluding observations that have high Mahalanobis distances to the training class … the Jeffries–Matusita distance and the posterior probability of a class mean being classified as another class. Fourth, the actual classification is carried out based on four supervised classifiers, all assuming multinormal distributions: a simple quadratic, a contextual quadratic, and two hierarchical quadratic classifiers …
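
    A minimal sketch of the quadratic (Mahalanobis) minimum-distance classification described above, including the square-root variance-stabilizing transform for approximately Poisson-distributed counts; the class means, spectra and channel count are invented and this is not the authors' implementation.

```python
import numpy as np

class MahalanobisClassifier:
    """Quadratic minimum-distance classifier: assign each pixel spectrum to the class
    with the smallest Mahalanobis distance to the class mean. The square root is taken
    first as a variance-stabilizing transform for (approximately) Poisson counts."""

    def fit(self, X, y):
        X = np.sqrt(np.asarray(X, float))
        self.classes_ = np.unique(y)
        self.means_ = [X[y == c].mean(axis=0) for c in self.classes_]
        self.inv_covs_ = [np.linalg.inv(np.cov(X[y == c], rowvar=False))
                          for c in self.classes_]
        return self

    def predict(self, X):
        X = np.sqrt(np.asarray(X, float))
        d2 = np.stack([
            np.einsum('ij,jk,ik->i', X - m, S, X - m)     # squared Mahalanobis distance
            for m, S in zip(self.means_, self.inv_covs_)
        ], axis=1)
        return self.classes_[np.argmin(d2, axis=1)]

# Toy EDS-like count data: two "minerals" with different mean spectra (5 channels)
rng = np.random.default_rng(4)
X = np.vstack([rng.poisson([50, 10, 5, 80, 20], (200, 5)),
               rng.poisson([10, 60, 30, 15, 40], (200, 5))])
y = np.array([0] * 200 + [1] * 200)
clf = MahalanobisClassifier().fit(X, y)
print((clf.predict(X) == y).mean())      # training accuracy on the toy data
```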

  12. Adaptation and validation of a portable steam sterilizer for processing intrauterine device insertion instruments and supplies in low-resource settings.

    Science.gov (United States)

    Barone, M A; Faisel, A J; Andrews, L; Ahmed, J; Rashida, B; Kristensen, D

    1997-08-01

    Difficulties with adequately processing intrauterine device (IUD) insertion instruments and supplies have led to use of potentially contaminated items, compromising the quality and safety of IUD insertion services in Bangladesh. A sterilization process for IUD insertion instruments and supplies using a commercially available portable steam sterilizer was developed and validated. Racks provided with the sterilizer were used during sterilization of wrapped supplies (gloves and cotton balls). Metal compartments to hold insertion instruments were built to fit into the sterilizer. After sterilization, supplies were transported to rural service sites in plastic bags, whereas instruments remained in the sterilizer, which was transported in a carrying case. To validate the sterilizer, laboratory testing was conducted by using chemical and biologic indicators for steam sterilization and field testing in Bangladesh with chemical indicators. Results indicated that sterilization cycles were effective in achieving sterility of IUD insertion supplies and instruments at sterility assurance levels of 10⁻⁵ and 10⁻⁶, respectively. Use of this sterilizer for IUD insertion supplies and instruments will improve the quality of service delivery in the Bangladesh family-planning program and has application for use in many other low-resource settings.

  13. Diagnostic validation of a familial hypercholesterolaemia cohort provides a model for using targeted next generation DNA sequencing in the clinical setting.

    Science.gov (United States)

    Hinchcliffe, Marcus; Le, Huong; Fimmel, Anthony; Molloy, Laura; Freeman, Lucinda; Sullivan, David; Trent, Ronald J

    2014-01-01

    Our aim was to assess the sensitivity and specificity of a next generation DNA sequencing (NGS) platform using a capture based DNA library preparation method. Data and experience gained from this diagnostic validation can be used to progress the applications of NGS in the wider molecular diagnostic setting. A technical cross-validation comparing the current molecular diagnostic gold standard methods of Sanger DNA sequencing and multiplex ligation-dependent probe amplification (MLPA) versus a customised capture based targeted re-sequencing method on a SOLiD 5500 sequencing platform was carried out using a cohort of 96 familial hypercholesterolaemia (FH) samples. We compared a total of 595 DNA variations (488 common single nucleotide polymorphisms, 73 missense mutations, 9 nonsense mutations, 3 splice site point mutations, 13 small indels, 2 multi-exonic duplications and 7 multi-exonic deletions) found previously in the 96 FH samples. DNA variation detection sensitivity and specificity were both 100% for the SOLiD 5500 NGS platform compared with Sanger sequencing and MLPA, but only when both the LifeScope and Integrative Genomics Viewer software packages were utilised. The methods described here offer a high-quality strategy for the detection of a wide range of DNA mutations in diseases with a moderate number of well described causative genes. However, there are important issues related to the bioinformatic algorithms employed to detect small indels.

  14. New uses of the Migraine Screen Questionnaire (MS-Q): validation in the Primary Care setting and ability to detect hidden migraine. MS-Q in Primary Care

    Directory of Open Access Journals (Sweden)

    Palacios Gemma

    2010-06-01

    Full Text Available. Background: Primary care (PC) plays an important role in the early diagnosis of health disorders, particularly migraine, due to the financial impact of this disease on society and its impact on patients' quality of life. The aim of the study was to validate the self-administered MS-Q questionnaire for the detection of hidden migraine in PC, and to explore its use in this setting. Methods: Cross-sectional, observational, multicentre study in patients above 18 years of age attending PC centers (regardless of the reason for consultation). An MS-Q score ≥ 4 was considered possible migraine. The level of agreement with the IHS-criteria clinical diagnosis (kappa coefficient) and the instrument's validity properties (sensitivity, specificity, positive (PPV) and negative (NPV) predictive values) were determined. The ability of the instrument to identify possible new cases of migraine was calculated, as well as the ratio of hidden disease compared to the ratio obtained by IHS criteria. Results: A total of 9,670 patients were included [48.9 ± 17.2 years (mean ± SD); 61.9% women], from 410 PC centers representative of the whole national territory. The clinical prevalence of migraine was 24.7% according to the IHS criteria and 20.4% according to the MS-Q, with a kappa index of agreement of 0.82 (p … ; de novo and hidden migraine identified by MS-Q and by IHS criteria: 5.7% vs. 6.1% and 26.6% vs. 24.1%, respectively. Conclusions: The results of the present study confirm the usefulness of the MS-Q questionnaire for the early detection and assessment of migraine in PC settings, and its ability to detect hidden migraine.
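
    The kappa agreement statistic reported above can be computed as follows for two binary ratings; the example ratings below are invented, not study data.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary ratings (e.g. MS-Q positive/negative vs. a
    reference clinical diagnosis): kappa = (p_o - p_e) / (1 - p_e)."""
    a = np.asarray(a, int)
    b = np.asarray(b, int)
    p_o = np.mean(a == b)                                            # observed agreement
    p_e = np.mean(a) * np.mean(b) + np.mean(1 - a) * np.mean(1 - b)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

msq_positive = np.array([1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0])
ihs_positive = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 0])
print(round(cohens_kappa(msq_positive, ihs_positive), 2))
```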

  15. Sparse matrix beamforming and image reconstruction for 2-D HIFU monitoring using harmonic motion imaging for focused ultrasound (HMIFU) with in vitro validation.

    Science.gov (United States)

    Hou, Gary Y; Provost, Jean; Grondin, Julien; Wang, Shutao; Marquet, Fabrice; Bunting, Ethan; Konofagou, Elisa E

    2014-11-01

    Harmonic motion imaging for focused ultrasound (HMIFU) utilizes an amplitude-modulated HIFU beam to induce a localized oscillatory motion at the focus, which is simultaneously estimated. The objective of this study is to develop and show the feasibility of a novel fast beamforming algorithm for image reconstruction using a GPU-based sparse-matrix operation with real-time feedback. In this study, the algorithm was implemented on a fully integrated, clinically relevant HMIFU system. A single divergent transmit beam was used, while fast beamforming was implemented using a GPU-based delay-and-sum method and a sparse-matrix operation. Axial HMI displacements were then estimated from the RF signals using a 1-D normalized cross-correlation method and streamed to a graphical user interface at frame rates of up to 15 Hz, a 100-fold increase compared to conventional CPU-based processing. The real-time feedback rate does not require interrupting the HIFU treatment. Phantom experiments showed reproducible HMI images, and monitoring of 22 in vitro HIFU treatments using the new 2-D system demonstrated reproducible displacement imaging with a consistent average focal displacement decrease of 46.7 ± 14.6% during lesion formation. Complementary focal temperature monitoring also indicated average rates of displacement increase and decrease with focal temperature of 0.84 ± 1.15%/°C and 2.03 ± 0.93%/°C, respectively. These results reinforce the capability of HMIFU to estimate and monitor stiffness-related changes in real time. Current ongoing studies include clinical translation of the presented system for monitoring of HIFU treatment in breast and pancreatic tumor applications.
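
    A hedged sketch of the core idea: delay-and-sum beamforming expressed as a single sparse matrix-vector product over the concatenated channel data. The array geometry, sampling rate and apodization below are invented, and the paper's GPU implementation and transmit-delay handling are not reproduced here.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Invented imaging geometry and sampling parameters (illustrative only)
fs = 40e6                                   # RF sampling rate [Hz]
c = 1540.0                                  # speed of sound [m/s]
n_samples = 2048                            # RF samples recorded per channel
elements_x = np.linspace(-0.01, 0.01, 64)   # lateral positions of a 64-element array [m]
depths = np.linspace(0.01, 0.05, 100)       # image depths [m]
laterals = np.linspace(-0.01, 0.01, 64)     # image lateral positions [m]
pixels = [(x, z) for z in depths for x in laterals]

# Build the sparse delay-and-sum matrix once: one row per pixel, one column per
# (element, sample) pair; only the receive delay is modelled, the transmit delay is omitted.
rows, cols, vals = [], [], []
for p, (px, pz) in enumerate(pixels):
    for e, ex in enumerate(elements_x):
        delay = np.hypot(px - ex, pz) / c            # time of flight pixel -> element
        s = int(round(delay * fs))                   # nearest RF sample index
        if s < n_samples:
            rows.append(p)
            cols.append(e * n_samples + s)
            vals.append(1.0 / len(elements_x))       # uniform apodization
A = csr_matrix((vals, (rows, cols)),
               shape=(len(pixels), len(elements_x) * n_samples))

# Beamforming one frame is then a single sparse product, cheap enough for real-time use.
rf_frame = np.random.default_rng(5).standard_normal(len(elements_x) * n_samples)
image = (A @ rf_frame).reshape(len(depths), len(laterals))
print(image.shape)
```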

  16. System of belief inventory (SBI-15R): a validation study in Italian cancer patients on oncological, rehabilitation, psychological and supportive care settings.

    Science.gov (United States)

    Ripamonti, Carla; Borreani, Claudia; Maruelli, Alice; Proserpio, Tullio; Pessi, Maria Adelaide; Miccinesi, Guido

    2010-01-01

    Spiritual and religious needs are part of a patient's clinical history. The aim of the study was to validate the System of Belief Inventory (SBI-15R) in Italy. It is a feasible way to collect useful information on the spiritual needs and resources of patients at any stage of disease. After the translation procedure, the psychometric properties of the Italian version of the SBI-15R were evaluated in patients with non-advanced cancer cared for in four care settings. All patients were administered the Italian version of the SBI-15R together with an ad hoc item inquiring about spirituality ("I believe I am a spiritual person"), which was expected to correlate with the SBI-15R score. A total of 257 patients were enrolled (mean age, 53.6 years; 191 females; 50% breast cancers; 12% had metastases). As regards spirituality and religious beliefs, 47.9% were churchgoers, 42% believers but not churchgoers, and 7.8% non-believers; 86.7% of the patients were Catholic. Internal consistency was high both for the Belief scale (Cronbach alpha = 0.946) and for the Support scale (Cronbach alpha = 0.897). The mean (± SD) SBI-15R scores of the different groups of patients (known-groups validity) on the "Support" scale were 9.7 (± 3.4) for churchgoers, 4.9 (± 3.2) for believers non-churchgoers, and 0.8 (± 1.4) for non-believers; on the "Belief" scale, they were 25.4 (± 4.8), 18.1 (± 6.3), and 3.4 (± 3.5), respectively. The reliability coefficient for the "Support" scale was 0.890 (95% CI: 0.841-0.939) and for the "Belief" scale 0.969 (95% CI: 0.955-0.984). The correlation between the statement "I believe I am a spiritual person" and the SBI-15R scores was 0.475 for the "Support" scale and 0.473 for the "Belief" scale. The Italian version of the SBI-15R is a valid and reliable assessment tool to evaluate religiousness and spirituality in cancer patients.
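
    Cronbach's alpha, used above as the internal-consistency estimate for the Belief and Support scales, is straightforward to compute from a respondent-by-item score matrix. A minimal sketch with simulated Likert-type data (not the SBI-15R items):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 257 respondents x 10 Likert-type items (not the study data).
rng = np.random.default_rng(0)
latent = rng.normal(size=(257, 1))
scores = np.clip(np.round(4 + 2 * latent + rng.normal(scale=1.0, size=(257, 10))), 1, 7)
print(f"alpha = {cronbach_alpha(scores):.3f}")
```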

  17. An efficient matrix bi-factorization alternative optimization method for low-rank matrix recovery and completion.

    Science.gov (United States)

    Liu, Yuanyuan; Jiao, L C; Shang, Fanhua; Yin, Fei; Liu, F

    2013-12-01

    In recent years, matrix rank minimization has attracted considerable interest from the machine learning, data mining, and computer vision communities. These problems are commonly solved via convex relaxations that minimize the trace norm instead of the rank of the matrix; the resulting algorithms are iterative and require a singular value decomposition (SVD) at each iteration, and therefore suffer from the high computational cost of repeated SVDs. In this paper, we propose an efficient Matrix Bi-Factorization (MBF) method to approximate the original trace norm minimization problem and mitigate the cost of performing SVDs. The proposed MBF method can be used to address a wide range of low-rank matrix recovery and completion problems, such as low-rank and sparse matrix decomposition (LRSD), low-rank representation (LRR), and low-rank matrix completion (MC). We also present three small-scale matrix trace norm models for the LRSD, LRR, and MC problems, respectively. Moreover, we develop two concrete linearized proximal alternative optimization algorithms for solving the above three problems. Experimental results on a variety of synthetic and real-world data sets validate the efficiency, robustness, and effectiveness of our MBF method compared with state-of-the-art trace norm minimization algorithms. Copyright © 2013 Elsevier Ltd. All rights reserved.
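
    The bi-factorization idea above replaces the trace-norm problem with a factored form X ≈ U·Vᵀ of fixed rank, so no SVD of the full matrix is ever needed. The sketch below shows that general principle for matrix completion via simple alternating least squares on synthetic data; it is a generic illustration, not the paper's linearized proximal MBF algorithm.

```python
import numpy as np

def complete_lowrank(M, mask, d=5, n_iter=50, lam=1e-2):
    """Rank-d matrix completion by alternating least squares on X = U @ V.T.

    Minimizes ||mask*(M - U V^T)||_F^2 + lam*(||U||_F^2 + ||V||_F^2)
    without any SVD of the full matrix.
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.normal(scale=0.1, size=(m, d))
    V = rng.normal(scale=0.1, size=(n, d))
    I = lam * np.eye(d)
    for _ in range(n_iter):
        for i in range(m):                      # update rows of U
            cols = np.flatnonzero(mask[i])
            Vc = V[cols]
            U[i] = np.linalg.solve(Vc.T @ Vc + I, Vc.T @ M[i, cols])
        for j in range(n):                      # update rows of V
            rows = np.flatnonzero(mask[:, j])
            Uc = U[rows]
            V[j] = np.linalg.solve(Uc.T @ Uc + I, Uc.T @ M[rows, j])
    return U @ V.T

# Synthetic example: rank-5 matrix with 30% of entries observed.
rng = np.random.default_rng(1)
truth = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 80))
mask = rng.random(truth.shape) < 0.3
X = complete_lowrank(truth * mask, mask)
print("relative error:", np.linalg.norm(X - truth) / np.linalg.norm(truth))
```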

  18. Development and validation of a clinical and computerised decision support system for management of hypertension (DSS-HTN) at a primary health care (PHC) setting.

    Science.gov (United States)

    Anchala, Raghupathy; Di Angelantonio, Emanuele; Prabhakaran, Dorairaj; Franco, Oscar H

    2013-01-01

    Hypertension remains the top global cause of disease burden. Decision support systems (DSS) could provide an adequate and cost-effective means to improve the management of hypertension at a primary health care (PHC) level in a developing country; nevertheless, evidence in this regard is rather limited. Development of the DSS software was based on an algorithmic approach for (a) evaluation of a hypertensive patient, (b) risk stratification, (c) drug management, and (d) lifestyle interventions, based on the Indian guidelines for hypertension II (2007). The beta testing of the DSS software involved feedback from the end users of the system on the contents of the user interface. Software validation and piloting were done in the field, wherein the recommendations and advice given by the DSS were compared with those of two independent experts (government doctors from the non-participating PHC centers). The overall percent agreement between the DSS and the independent experts among 60 hypertensives on drug management was 85% (95% CI: 83.61-85.25). The kappa statistic for overall agreement on drug management was 0.659 (95% CI: 0.457-0.862), indicating a substantial degree of agreement beyond chance at an alpha fixed at 0.05 with 80% power. Receiver operating characteristic (ROC) curve analysis showed good accuracy for the DSS, with an area under the curve (AUC) of 0.848 (95% CI: 0.741-0.948). Sensitivity and specificity of the DSS were 83.33% and 85.71%, respectively, when compared with the independent experts. A point-of-care, pilot-tested and validated DSS for the management of hypertension has been developed in a resource-constrained low- and middle-income setting and could contribute to improved management of hypertension at the primary health care level.
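
    Discrimination of a decision support system against an expert reference, summarized above by the AUC, can be computed without plotting a curve via the rank-based (Mann-Whitney) formulation. A small sketch with hypothetical DSS scores and expert labels:

```python
import numpy as np

def auc_rank(scores: np.ndarray, labels: np.ndarray) -> float:
    """AUC via the Mann-Whitney U statistic (ties handled by average ranks)."""
    order = scores.argsort()
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    for s in np.unique(scores):          # average ranks for tied scores
        tie = scores == s
        ranks[tie] = ranks[tie].mean()
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical data: 1 = expert-recommended treatment change, scores = DSS output.
labels = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 1])
scores = np.array([0.9, 0.8, 0.35, 0.7, 0.4, 0.2, 0.6, 0.55, 0.1, 0.75])
print(f"AUC = {auc_rank(scores, labels):.3f}")
```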

  19. Criterion and longitudinal validity of a fixed-distance incremental running test for the determination of lactate thresholds in field settings.

    Science.gov (United States)

    La Torre, Antonio; Fiorella, Pierluigi; Santos, Tony M; Faina, Marcello; Mauri, Clara; Impellizzeri, Franco M

    2012-01-01

    The aim of this study was to examine the criterion validity of 2 lactate thresholds (LTs; the intensity corresponding to 1 mmol·L(-1) above baseline, and the onset of blood lactate accumulation, i.e., the intensity at 4 mmol·L(-1)) determined with a fixed-distance incremental field test, by assessing their correlation with those obtained using a traditional fixed-time laboratory protocol. A second aim was to verify the longitudinal validity by examining the relationships between the changes in LTs obtained with the 2 protocols. To determine the LTs, 12 well-trained male middle- and long-distance amateur and competitive runners training 4 to 7 d·wk(-1) (age 25 [5] years, body mass 66 [5] kg, estimated VO(2)max 58.6 [4.9] ml·min(-1)·kg(-1), SD in parentheses) performed, in 2 separate sessions, an incremental field running test starting at 12 km·h(-1) and increasing the speed by 1 km·h(-1) every 1,200 m (FixD test) and an incremental treadmill test in the laboratory starting at 12 km·h(-1) and increasing the speed by 1 km·h(-1) every 6 minutes. The 2 tests were repeated after 6-12 weeks. A nearly perfect relationship was found between the running speeds at the LTs determined with the 2 protocols (r = 0.95 [95% CI 0.83-0.99]), supporting the validity of lactate thresholds determined from fixed-distance intervals performed in a field setting.
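
    Criterion validity here rests on the correlation between field- and laboratory-derived threshold speeds. A minimal sketch of that check, with made-up speeds rather than the study's data, including a Fisher-z confidence interval like the one quoted above:

```python
import numpy as np
from scipy import stats

# Illustrative criterion-validity check: correlate LT speeds (km/h) from the
# field (FixD) and laboratory protocols. Values are invented, not the study data.
field_speed = np.array([14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 13.5, 16.3, 15.0, 14.4, 15.8, 14.0])
lab_speed   = np.array([14.0, 15.3, 13.9, 15.8, 14.7, 15.6, 13.7, 16.1, 15.2, 14.3, 15.6, 14.1])

r, p = stats.pearsonr(field_speed, lab_speed)

# 95% CI via the Fisher z-transformation.
z = np.arctanh(r)
se = 1 / np.sqrt(len(field_speed) - 3)
lo, hi = np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)
print(f"r = {r:.2f} (95% CI {lo:.2f}-{hi:.2f}), p = {p:.4f}")
```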

  20. Validation of an air–liquid interface toxicological set-up using Cu, Pd, and Ag well-characterized nanostructured aggregates and spheres

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, C. R., E-mail: christian.svensson@design.lth.se [Lund University, Department of Design Sciences, Ergonomics and Aerosol Technology (Sweden); Ameer, S. S. [Lund University, Division of Occupational and Environmental Medicine, Department of Laboratory Medicine (Sweden); Ludvigsson, L. [Lund University, Department of Physics, Solid State Physics (Sweden); Ali, N.; Alhamdow, A. [Lund University, Division of Occupational and Environmental Medicine, Department of Laboratory Medicine (Sweden); Messing, M. E. [Lund University, Department of Physics, Solid State Physics (Sweden); Pagels, J.; Gudmundsson, A.; Bohgard, M. [Lund University, Department of Design Sciences, Ergonomics and Aerosol Technology (Sweden); Sanfins, E. [Atomic Energy Commission (CEA), Institute of Emerging Diseases and Innovative Therapies (iMETI), Division of Prions and Related Diseases - SEPIA (France); Kåredal, M.; Broberg, K. [Lund University, Division of Occupational and Environmental Medicine, Department of Laboratory Medicine (Sweden); Rissler, J. [Lund University, Department of Design Sciences, Ergonomics and Aerosol Technology (Sweden)

    2016-04-15

    Systems for studying the toxicity of metal aggregates on the airways are normally not suited for evaluating the effects of individual particle characteristics. This study validates a set-up for toxicological studies of metal aggregates using an air–liquid interface approach. The set-up used a spark discharge generator capable of generating aerosols of metal aggregate particles and sintered near-spheres, together with an exposure chamber, the Nano Aerosol Chamber for In Vitro Toxicity (NACIVT). The system provides online characterization of mass-mobility, mass concentration, and number size distribution to determine the exposure, and the desired exposure level was controlled by dilution. Primary and cancerous airway cells were exposed to copper (Cu), palladium (Pd), and silver (Ag) aggregates, 50–150 nm in median diameter. The aggregates were composed of primary particles <10 nm in diameter. For Cu and Pd, an exposure to sintered aerosol particles was also produced. The doses of the particles were expressed as particle numbers, masses, and surface areas. For the Cu, Pd, and Ag aerosol particles, mass surface concentrations on the air–liquid interface of 0.4–10.7, 0.9–46.6, and 0.1–1.4 µg/cm², respectively, were achieved. Viability was measured by WST-1 assay, and cytokines (IL-6, IL-8, TNF-α, MCP) by Luminex technology. Statistically significant effects and a dose response in cytokine expression were observed for SAEC cells after exposure to Cu, Pd, or Ag particles. A positive dose response was also observed for SAEC viability after Cu exposure. For A549 cells, statistically significant effects on viability were observed after exposure to Cu and Pd particles. The set-up produced a stable flow of aerosol particles with an exposure and dose expressed in terms of number, mass, and surface area. Exposure-related effects on the airway cellular models could thus be asserted.
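
    Expressing the delivered dose in number, mass, and surface-area terms, as above, amounts to integrating the measured number size distribution under a spherical-particle assumption. The sketch below shows the arithmetic with invented size bins, density, flow, and deposition efficiency; none of these values are from the study.

```python
import numpy as np

# Convert a number size distribution into the three dose metrics (number,
# surface area, mass), assuming spherical particles of known effective density.
d_nm  = np.array([20, 50, 80, 110, 150], dtype=float)   # bin midpoints (nm)
n_cm3 = np.array([2e5, 5e5, 3e5, 1e5, 2e4])             # number conc. per bin (#/cm^3)
rho   = 8.9                                              # g/cm^3 (bulk Cu; aggregates are less dense)

d_cm = d_nm * 1e-7
surface_cm2_per_cm3 = (n_cm3 * np.pi * d_cm**2).sum()            # total surface-area conc.
mass_ug_per_cm3     = (n_cm3 * rho * np.pi / 6 * d_cm**3).sum() * 1e6

# A deposited dose per unit cell-culture area then follows from the delivered
# aerosol volume and a system-specific deposition efficiency (assumed here).
flow_cm3 = 100.0 * 60.0        # e.g. 100 cm^3/min for 60 min
dep_eff  = 0.3                 # assumed deposition fraction
area_cm2 = 1.0                 # exposed insert area
dose_ug_cm2 = mass_ug_per_cm3 * flow_cm3 * dep_eff / area_cm2
print(f"surface conc. = {surface_cm2_per_cm3:.3e} cm2/cm3, dose = {dose_ug_cm2:.2f} ug/cm2")
```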

  1. Kernelized Bayesian Matrix Factorization.

    Science.gov (United States)

    Gönen, Mehmet; Kaski, Samuel

    2014-10-01

    We extend kernelized matrix factorization with a full-Bayesian treatment and with an ability to work with multiple side information sources expressed as different kernels. Kernels have been introduced to integrate side information about the rows and columns, which is necessary for making out-of-matrix predictions. We specifically discuss binary output matrices, but extensions to real-valued matrices are straightforward. We extend the state of the art in two key aspects: (i) A full-conjugate probabilistic formulation of the kernelized matrix factorization enables an efficient variational approximation, whereas full-Bayesian treatments are not computationally feasible in the earlier approaches. (ii) Multiple side information sources are included, treated as different kernels in multiple kernel learning, which additionally reveals which side sources are informative. We then show that the framework can also be used for supervised and semi-supervised multilabel classification and multi-output regression, by considering samples and outputs as the domains where matrix factorization operates. Our method outperforms alternatives in predicting drug-protein interactions on two data sets. On multilabel classification, our algorithm obtains the lowest Hamming losses on 10 out of 14 data sets compared to five state-of-the-art multilabel classification algorithms. We finally show that the proposed approach outperforms alternatives in multi-output regression experiments on a yeast cell cycle data set.
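
    As a rough illustration of the kernelized-factorization idea above (side-information kernels over rows and columns defining the latent factors), the sketch below implements a simple gradient-based MAP variant on toy data. It is not the paper's full variational Bayesian algorithm or its multiple-kernel formulation.

```python
import numpy as np

def kernel_rbf(F, gamma=0.5):
    """RBF kernel from a (n_samples, n_features) side-information matrix."""
    sq = ((F[:, None, :] - F[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernelized_mf(Y, mask, Kr, Kc, d=4, lr=5e-3, lam=0.1, n_iter=2000, seed=0):
    """MAP-style kernelized matrix factorization: Y ~ (Kr A)(Kc B)^T.

    Plain gradient descent on the observed entries; a sketch of the
    kernelized-factorization principle only.
    """
    rng = np.random.default_rng(seed)
    A = rng.normal(scale=0.01, size=(Kr.shape[0], d))
    B = rng.normal(scale=0.01, size=(Kc.shape[0], d))
    for _ in range(n_iter):
        G, H = Kr @ A, Kc @ B            # latent row/column factors
        R = mask * (G @ H.T - Y)         # residual on observed entries
        A -= lr * (Kr.T @ (R @ H) + lam * A)
        B -= lr * (Kc.T @ (R.T @ G) + lam * B)
    return (Kr @ A) @ (Kc @ B).T

# Toy drug-protein style example with random side-information features.
rng = np.random.default_rng(1)
Fr, Fc = rng.normal(size=(30, 6)), rng.normal(size=(40, 6))
Y = (rng.random((30, 40)) < 0.2).astype(float)        # binary interaction matrix
mask = rng.random(Y.shape) < 0.7                       # 70% of entries observed
scores = kernelized_mf(Y * mask, mask, kernel_rbf(Fr), kernel_rbf(Fc))
```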

  2. Validity of a Job-Exposure Matrix for Psychosocial Job Stressors: Results from the Household Income and Labour Dynamics in Australia Survey

    Science.gov (United States)

    Milner, A.; Niedhammer, I.; Chastang, J.-F.; Spittal, M. J.; LaMontagne, A. D.

    2016-01-01

    Introduction A Job Exposure Matrix (JEM) for psychosocial job stressors allows assessment of these exposures at a population level. JEMs are particularly useful in situations when information on psychosocial job stressors was not collected individually and can help eliminate the biases that may be present in individual self-report accounts. This research paper describes the development of a JEM in the Australian context. Methods The Household Income Labour Dynamics in Australia (HILDA) survey was used to construct a JEM for job control, job demands and complexity, job insecurity, and fairness of pay. Population median values of these variables for all employed people (n = 20,428) were used to define individual exposures across the period 2001 to 2012. The JEM was calculated for the Australian and New Zealand Standard Classification of Occupations (ANZSCO) at the four-digit level, which represents 358 occupations. Both continuous and binary exposures to job stressors were calculated at the 4-digit level. We assessed concordance between the JEM-assigned and individually-reported exposures using the Kappa statistic, sensitivity and specificity assessments. We conducted regression analysis using mental health as an outcome measure. Results Kappa statistics indicate good agreement between individually-reported and JEM-assigned dichotomous measures for job demands and control, and moderate agreement for job insecurity and fairness of pay. Job control, job demands and security had the highest sensitivity, while specificity was relatively high for the four exposures. Regression analysis shows that most individually reported and JEM measures were significantly associated with mental health, and individually-reported exposures produced much stronger effects on mental health than the JEM-assigned exposures. Discussion These JEM-based estimates of stressor exposure provide a conservative proxy for individual-level data, and can be applied to a range of health and organisational outcomes.

  3. Validity of a Job-Exposure Matrix for Psychosocial Job Stressors: Results from the Household Income and Labour Dynamics in Australia Survey.

    Science.gov (United States)

    Milner, A; Niedhammer, I; Chastang, J-F; Spittal, M J; LaMontagne, A D

    2016-01-01

    A Job Exposure Matrix (JEM) for psychosocial job stressors allows assessment of these exposures at a population level. JEMs are particularly useful in situations when information on psychosocial job stressors was not collected individually and can help eliminate the biases that may be present in individual self-report accounts. This research paper describes the development of a JEM in the Australian context. The Household Income Labour Dynamics in Australia (HILDA) survey was used to construct a JEM for job control, job demands and complexity, job insecurity, and fairness of pay. Population median values of these variables for all employed people (n = 20,428) were used to define individual exposures across the period 2001 to 2012. The JEM was calculated for the Australian and New Zealand Standard Classification of Occupations (ANZSCO) at the four-digit level, which represents 358 occupations. Both continuous and binary exposures to job stressors were calculated at the 4-digit level. We assessed concordance between the JEM-assigned and individually-reported exposures using the Kappa statistic, sensitivity and specificity assessments. We conducted regression analysis using mental health as an outcome measure. Kappa statistics indicate good agreement between individually-reported and JEM-assigned dichotomous measures for job demands and control, and moderate agreement for job insecurity and fairness of pay. Job control, job demands and security had the highest sensitivity, while specificity was relatively high for the four exposures. Regression analysis shows that most individually reported and JEM measures were significantly associated with mental health, and individually-reported exposures produced much stronger effects on mental health than the JEM-assigned exposures. These JEM-based estimates of stressor exposure provide a conservative proxy for individual-level data, and can be applied to a range of health and organisational outcomes.
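
    Operationally, a JEM like the one described above assigns each detailed occupation code an occupation-level (e.g., median) value of a stressor and then dichotomizes it, after which concordance with self-report can be checked. A toy pandas sketch with hypothetical column names and values (not the HILDA variables):

```python
import pandas as pd

# Hypothetical data: 4-digit occupation codes and a self-reported job-control score.
df = pd.DataFrame({
    "anzsco4":     [2411, 2411, 2411, 3212, 3212, 1111, 1111, 1111],
    "job_control": [3.0,  2.5,  3.5,  1.5,  2.0,  4.5,  4.0,  4.2],
})

# Occupation-level (JEM) exposure: median job control within each occupation.
jem = (df.groupby("anzsco4")["job_control"].median()
         .rename("jem_control").reset_index())
df = df.merge(jem, on="anzsco4")

# Binary exposures: below the overall population median = "low control".
pop_median = df["job_control"].median()
df["low_control_self"] = (df["job_control"] < pop_median).astype(int)
df["low_control_jem"]  = (df["jem_control"] < pop_median).astype(int)

# Simple concordance check between JEM-assigned and self-reported exposure.
agreement = (df["low_control_self"] == df["low_control_jem"]).mean()
print(df, f"\npercent agreement = {agreement:.0%}")
```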

  4. GoM Diet Matrix

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set was taken from CRD 08-18 at the NEFSC. Specifically, the Gulf of Maine diet matrix was developed for the EMAX exercise described in that center...

  5. Student-Directed Video Validation of Psychomotor Skills Performance: A Strategy to Facilitate Deliberate Practice, Peer Review, and Team Skill Sets.

    Science.gov (United States)

    DeBourgh, Gregory A; Prion, Susan K

    2017-03-22

    Background Essential nursing skills for safe practice are not limited to technical skills, but include abilities for determining salience among clinical data within dynamic practice environments, demonstrating clinical judgment and reasoning, problem-solving abilities, and teamwork competence. Effective instructional methods are needed to prepare new nurses for entry-to-practice in contemporary healthcare settings. Method This mixed-methods descriptive study explored self-reported perceptions of a process to self-record videos for psychomotor skill performance evaluation in a convenience sample of 102 pre-licensure students. Results Students reported gains in confidence and skill acquisition using team skills to record individual videos of skill performance, and described the importance of teamwork, peer support, and deliberate practice. Conclusion Although time consuming, the production of student-directed video validations of psychomotor skill performance is an authentic task with meaningful accountabilities that is well-received by students as an effective, satisfying learner experience to increase confidence and competence in performing psychomotor skills.

  6. Perceived parental rearing style in childhood: internal structure and concurrent validity on the Egna Minnen Beträffande Uppfostran--Child Version in clinical settings.

    Science.gov (United States)

    Penelo, Eva; Viladrich, Carme; Domènech, Josep M

    2010-01-01

    We provide the first validation data of the Spanish version of the Egna Minnen Beträffande Uppfostran--Child Version (EMBU-C) in a clinical context. The EMBU-C is a 41-item self-report questionnaire that assesses perceived parental rearing style in children, comprising 4 subscales (rejection, emotional warmth, control attempts/overprotection, and favoring subjects). The test was administered to a clinical sample of 174 Spanish psychiatric outpatients aged 8 to 12. Confirmatory factor analyses were performed, analyzing the children's reports about their parents' rearing style. The results were almost equivalent for father's and mother's ratings. Confirmatory factor analysis yielded an acceptable fit of the 3-factor model to the data when the items of the favoring subjects scale were removed (root mean squared error of approximation .73), whereas the control attempts scale showed lower values, as in previous studies. The influence of sex (of children and parents) on scale scores was negligible, and children tended to perceive their parents as progressively less warm as they grew older. As predicted, the scores for rejection and emotional warmth were related to bad relationships with parents, absence of family support, harsh discipline, and lack of parental supervision. The Spanish version of the EMBU-C can be used with psychometric guarantees to identify rearing style in psychiatric outpatients, because evidence of quality in this setting matches that obtained in community samples. Copyright 2010 Elsevier Inc. All rights reserved.

  7. A global land-cover validation data set, II: augmenting a stratified sampling design to estimate accuracy by region and land-cover class

    NARCIS (Netherlands)

    Stehman, S.; Olofsson, P.; Woodcock, C.; Herold, M.; Friedl, M.A.

    2012-01-01

    A global validation database that can be used to assess the accuracy of multiple global and regional land-cover maps would yield significant cost savings and enhance comparisons of accuracy of different maps. Because the global validation database should expand over time as new validation data are

  8. Visualisation Enhancement of HoloCatT Matrix

    Science.gov (United States)

    Rosli, Nor Azlin; Mohamed, Azlinah; Khan, Rahmattullah

    Graphology and personality psychology are two different analysis approaches performed by two different groups of people, but both address the personality of the person being analyzed. It is of interest to build a system that would aid personality identification given information visualization of these two domains. Therefore, research identifying the relationship between these two domains has been carried out by producing the HoloCatT Matrix, a combination of graphology features and a selected personality traits approach. The objectives of this research are to identify new features of the existing HoloCatT Matrix and to validate the new version of the matrix with two (2) related groups of experts. A set of questionnaires was distributed to a group of personologists to identify the relationship, and an interview was conducted with a graphologist to validate the matrix. Based on the analysis, 87.5% of the relationships were confirmed by both groups of experts, and subsequently the third (3rd) version of the HoloCatT Matrix was obtained.

  9. Determination of selected water-soluble vitamins using hydrophilic chromatography: a comparison of photodiode array, fluorescence, and coulometric detection, and validation in a breakfast cereal matrix.

    Science.gov (United States)

    Langer, Swen; Lodge, John K

    2014-06-01

    Water-soluble vitamins are an important class of compounds that require quantification from food sources to monitor nutritional value. In this study we have analysed six water-soluble B vitamins [thiamine (B1), riboflavin (B2), nicotinic acid (B3, NAc), nicotinamide (B3, NAm), pyridoxal (B6), and folic acid (B9)] and ascorbic acid (vit C) with hydrophilic interaction liquid chromatography (HILIC), and compared UV, fluorescence (FLD) and coulometric detection to optimise a method to quantitate the vitamins from food sources. Employing UV/diode array (DAD) and fluorimetric detection, six B vitamins were detected in a single run using gradient elution from 100% to 60% solvent B [10 mM ammonium acetate, pH 5.0, in acetonitrile and water 95:5 (v:v)] over 18 min. UV detection was performed at 268 nm for B1, 260 nm for both B3 species and 284 nm for B9. FLD was employed for B2 at an excitation wavelength of 268 nm and emission of 513 nm, and at 284 nm/317 nm for B6. Coulometric detection can be used to detect B6, B9, and vit C, and was performed isocratically at 75% and 85% of solvent B, respectively. B6 was analysed at a potential of 720 mV, while B9 was analysed at 600 mV, and vit C at 30 mV. Retention times (0.96 to 11.81 min), intra-day repeatability (CV 1.6 to 3.6), inter-day variability (CV 1.8 to 11.1), and linearity (R 0.9877 to 0.9995) remained good under these conditions, with limits of detection varying from 6.6 to 164.6 ng mL(-1) and limits of quantification between 16.8 and 548.7 ng mL(-1). The method was successfully applied for quantification of six B vitamins from a fortified food product and is, to our knowledge, the first to simultaneously determine multiple water-soluble vitamins extracted from a food matrix using HILIC. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Selection and validation of a set of reliable reference genes for quantitative RT-PCR studies in the brain of the Cephalopod Mollusc Octopus vulgaris

    Directory of Open Access Journals (Sweden)

    Biffali Elio

    2009-07-01

    Full Text Available Abstract Background Quantitative real-time polymerase chain reaction (RT-qPCR) is valuable for studying the molecular events underlying physiological and behavioral phenomena. Normalization of real-time PCR data is critical for a reliable mRNA quantification. Here we identify reference genes to be utilized in RT-qPCR experiments to normalize and monitor the expression of target genes in the brain of the cephalopod mollusc Octopus vulgaris, an invertebrate. Such an approach is novel for this taxon and of advantage in future experiments given the complexity of the behavioral repertoire of this species when compared with its relatively simple neural organization. Results We chose 16S and 18S rRNA, actB, EEF1A, tubA and ubi as candidate reference genes (housekeeping genes, HKG). The expression of 16S and 18S was highly variable and did not meet the requirements of candidate HKG. The expression of the other genes was almost stable and uniform among samples. We analyzed the expression of the HKG in two different sets of animals using tissues taken from the central nervous system (brain parts) and mantle (here considered as control tissue) by BestKeeper, geNorm and NormFinder. We found that HKG expressions differed considerably with respect to brain area and octopus samples in an HKG-specific manner. However, when the mantle is treated as control tissue and the entire central nervous system is considered, NormFinder revealed tubA and ubi as the most suitable HKG pair. These two genes were utilized to evaluate the relative expression of the genes FoxP, creb, dat and TH in O. vulgaris. Conclusion We analyzed the expression profiles of some genes here identified for O. vulgaris by applying RT-qPCR analysis for the first time in cephalopods. We validated candidate reference genes and found the expression of ubi and tubA to be the most appropriate to evaluate the expression of target genes in the brain of different octopuses. Our results also underline the importance of choosing a proper normalization strategy.

  11. Selection and validation of a set of reliable reference genes for quantitative RT-PCR studies in the brain of the Cephalopod Mollusc Octopus vulgaris.

    Science.gov (United States)

    Sirakov, Maria; Zarrella, Ilaria; Borra, Marco; Rizzo, Francesca; Biffali, Elio; Arnone, Maria Ina; Fiorito, Graziano

    2009-07-14

    Quantitative real-time polymerase chain reaction (RT-qPCR) is valuable for studying the molecular events underlying physiological and behavioral phenomena. Normalization of real-time PCR data is critical for a reliable mRNA quantification. Here we identify reference genes to be utilized in RT-qPCR experiments to normalize and monitor the expression of target genes in the brain of the cephalopod mollusc Octopus vulgaris, an invertebrate. Such an approach is novel for this taxon and of advantage in future experiments given the complexity of the behavioral repertoire of this species when compared with its relatively simple neural organization. We chose 16S, and 18S rRNA, actB, EEF1A, tubA and ubi as candidate reference genes (housekeeping genes, HKG). The expression of 16S and 18S was highly variable and did not meet the requirements of candidate HKG. The expression of the other genes was almost stable and uniform among samples. We analyzed the expression of the HKG in two different sets of animals using tissues taken from the central nervous system (brain parts) and mantle (here considered as control tissue) by BestKeeper, geNorm and NormFinder. We found that HKG expressions differed considerably with respect to brain area and octopus samples in an HKG-specific manner. However, when the mantle is treated as control tissue and the entire central nervous system is considered, NormFinder revealed tubA and ubi as the most suitable HKG pair. These two genes were utilized to evaluate the relative expression of the genes FoxP, creb, dat and TH in O. vulgaris. We analyzed the expression profiles of some genes here identified for O. vulgaris by applying RT-qPCR analysis for the first time in cephalopods. We validated candidate reference genes and found the expression of ubi and tubA to be the most appropriate to evaluate the expression of target genes in the brain of different octopuses. Our results also underline the importance of choosing a proper normalization strategy.
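
    Once reference genes such as ubi and tubA are validated, target-gene expression is typically reported relative to them, for example via a delta-delta-Ct calculation against the mean of the reference Ct values. A small sketch with invented Ct values (not data from the study):

```python
import numpy as np

# Relative expression of a target gene normalized to two reference genes
# (ubi, tubA) via delta-delta-Ct. Ct values below are illustrative only.
ct = {
    "control": {"FoxP": 28.1, "ubi": 20.3, "tubA": 21.0},
    "brain":   {"FoxP": 25.6, "ubi": 20.5, "tubA": 21.2},
}

def rel_expression(sample, calibrator, target="FoxP", refs=("ubi", "tubA")):
    # delta-Ct against the mean reference Ct (averaging Ct values corresponds
    # to taking the geometric mean of the reference expression levels)
    d_sample = ct[sample][target] - np.mean([ct[sample][r] for r in refs])
    d_calib  = ct[calibrator][target] - np.mean([ct[calibrator][r] for r in refs])
    return 2.0 ** -(d_sample - d_calib)       # assumes ~100% PCR efficiency

print(f"FoxP in brain vs. control tissue: {rel_expression('brain', 'control'):.2f}-fold")
```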

  12. Use of the Environment and Policy Evaluation and Observation as a Self-Report Instrument (EPAO-SR) to measure nutrition and physical activity environments in child care settings: validity and reliability evidence.

    Science.gov (United States)

    Ward, Dianne S; Mazzucca, Stephanie; McWilliams, Christina; Hales, Derek

    2015-09-26

    Early care and education (ECE) centers are important settings influencing young children's diet and physical activity (PA) behaviors. To better understand their impact on diet and PA behaviors as well as to evaluate public health programs aimed at ECE settings, we developed and tested the Environment and Policy Assessment and Observation - Self-Report (EPAO-SR), a self-administered version of the previously validated, researcher-administered EPAO. Development of the EPAO-SR instrument included modification of items from the EPAO, community advisory group and expert review, and cognitive interviews with center directors and classroom teachers. Reliability and validity data were collected across 4 days in 3-5-year-old classrooms in 50 ECE centers in North Carolina. Center teachers and directors completed relevant portions of the EPAO-SR on multiple days according to a standardized protocol, and trained data collectors completed the EPAO for 4 days in the centers. Reliability and validity statistics calculated included percent agreement, kappa, correlation coefficients, coefficients of variation, deviations, mean differences, and intraclass correlation coefficients (ICC), depending on the response option of the item. Data demonstrated a range of reliability and validity evidence for the EPAO-SR instrument. Reporting from directors and classroom teachers was consistent and similar to the observational data. Items that produced the strongest reliability and validity estimates included beverages served, outside time, and physical activity equipment, while items such as whole grains served and amount of teacher-led PA had lower reliability (observation and self-report) and validity estimates. To overcome lower reliability and validity estimates, some items need administration on multiple days. This study demonstrated appropriate reliability and validity evidence for use of the EPAO-SR in the field. The self-administered EPAO-SR is an advancement in the measurement of ECE nutrition and physical activity environments.

  13. The alcohol use disorders identification test (AUDIT): validation of a Nepali version for the detection of alcohol use disorders and hazardous drinking in medical settings

    Directory of Open Access Journals (Sweden)

    Pradhan Bickram

    2012-10-01

    Full Text Available Abstract Background Alcohol problems are a major health issue in Nepal and remain under-diagnosed. Increases in consumption are due to many factors, including advertising, pricing and availability, but accurate information is lacking on the prevalence of current alcohol use disorders. The AUDIT (Alcohol Use Disorders Identification Test) questionnaire developed by the WHO identifies individuals along the full spectrum of alcohol misuse and hence provides an opportunity for early intervention in non-specialty settings. This study aims to validate a Nepali version of the AUDIT among patients attending a university hospital and to assess the prevalence of alcohol use disorders along the full spectrum of alcohol misuse. Methods This cross-sectional study was conducted in patients attending the medicine out-patient department of a university hospital. DSM-IV diagnostic categories (alcohol abuse and alcohol dependence) were used as the gold standard to calculate the diagnostic parameters of the AUDIT. Hazardous drinking was defined as self-reported consumption of ≥21 standard drink units per week for males and ≥14 standard drink units per week for females. Results A total of 1068 individuals successfully completed the study. According to DSM-IV, drinkers were classified as follows: no alcohol problem (n=562; 59.5%), alcohol abuse (n=78; 8.3%) and alcohol dependence (n=304; 32.2%). The prevalence of hazardous drinking was 67.1%. The Nepali version of the AUDIT is a reliable and valid screening tool to identify individuals with alcohol use disorders in the Nepalese population. The AUDIT showed a good capacity to discriminate dependent patients (AUDIT ≥11 for both genders) and hazardous drinkers (AUDIT ≥5 for males and ≥4 for females). For alcohol dependence/abuse the cut-off value was ≥9 for both males and females. Conclusion The AUDIT questionnaire is a good screening instrument for detecting alcohol use disorders in patients attending a university hospital.
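
    Cut-off scores such as the AUDIT ≥9 and ≥11 thresholds above are typically chosen by sweeping candidate cut-offs against the gold-standard diagnosis and inspecting sensitivity and specificity at each. A sketch of that sweep on simulated scores (not the Nepali sample):

```python
import numpy as np

# Simulated AUDIT scores for cases and non-cases, then a cut-off sweep.
rng = np.random.default_rng(0)
diagnosis = rng.random(500) < 0.3                        # True = DSM-IV alcohol use disorder
audit = np.where(diagnosis,
                 rng.normal(14, 4, 500),                 # cases score higher on average
                 rng.normal(5, 3, 500)).clip(0, 40).round()

for cutoff in range(4, 13):
    positive = audit >= cutoff
    se = (positive & diagnosis).sum() / diagnosis.sum()
    sp = (~positive & ~diagnosis).sum() / (~diagnosis).sum()
    print(f"cutoff >= {cutoff:2d}: sensitivity {se:.2f}, specificity {sp:.2f}")
```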

  14. Health Services OutPatient Experience questionnaire: factorial validity and reliability of a patient-centered outcome measure for outpatient settings in Italy

    Directory of Open Access Journals (Sweden)

    Coluccia A

    2014-09-01

    Full Text Available Anna Coluccia, Fabio Ferretti, Andrea Pozza; Department of Medical Sciences, Surgery and Neurosciences, Santa Maria alle Scotte University Hospital, University of Siena, Siena, Italy. Purpose: The patient-centered approach to health care does not seem to be sufficiently developed in the Italian context, which is still characterized by the biomedical model. In addition, there is a lack of validated outcome measures to assess outpatient experience as an aspect common to a variety of settings. The current study aimed to evaluate the factorial validity, reliability, and invariance across sex of the Health Services OutPatient Experience (HSOPE) questionnaire, a short ten-item measure of patient-centeredness for Italian adult outpatients. The rationale for unidimensionality of the measure was that it could cover global patient experience as a process common to patients with a variety of diseases and irrespective of the phase of the treatment course. Patients and methods: The HSOPE was completed by 1,532 adult outpatients (51% females; mean age 59.22 years, standard deviation 16.26) receiving care in ten facilities at the Santa Maria alle Scotte University Hospital of Siena, Italy. The sample represented all the age cohorts: 12% were young adults, 57% were adults, and 32% were older adults. Exploratory and confirmatory factor analyses were conducted to evaluate factor structure. Reliability was evaluated as internal consistency using Cronbach's α. Factor invariance was assessed through multigroup analyses. Results: Both exploratory and confirmatory analyses suggested a clearly defined unidimensional structure of the measure, with all ten items having salient loadings on a single factor. Internal consistency was excellent (α = 0.95). Indices of model fit supported a single-factor structure for both male and female outpatient groups. Young adult outpatients had significantly lower scores on perceived patient-centeredness relative to older adults.

  15. Validation of a new method for testing provider clinical quality in rural settings in low- and middle-income countries: the observed simulated patient.

    Directory of Open Access Journals (Sweden)

    Tin Aung

    Full Text Available BACKGROUND: Assessing the quality of care provided by individual health practitioners is critical to identifying possible risks to the health of the public. However, existing assessment methods can be inaccurate, expensive, or infeasible in many developing country settings, particularly in rural areas and especially for children. Following an assessment of the strengths and weaknesses of the existing methods for provider assessment, we developed a synthesis method combining components of direct observation, clinical vignettes, and medical mannequins which we have termed "Observed Simulated Patient" or OSP. An OSP assessment involves a trained actor playing the role of a 'mother', a life-size doll representing a 5-year-old boy, and a trained observer. The provider being assessed was informed in advance of the role-playing, and told to conduct the diagnosis and treatment as he normally would while verbally describing the examinations. METHODOLOGY/PRINCIPAL FINDINGS: We tested the validity of OSP by conducting parallel scoring of medical providers in Myanmar, assessing the quality of their diagnosis and treatment of pediatric malaria, first by direct observation of true patients and second by OSP. Data were collected from 20 private independent medical practitioners in Mon and Kayin States, Myanmar between December 26, 2010 and January 12, 2011. All areas of assessment showed agreement between OSP and direct observation above 90% except for history taking related to past experience with malaria medicines. In this area, providers did not ask questions of the OSP to the same degree that they questioned real patients (agreement 82.8%). CONCLUSIONS/SIGNIFICANCE: The OSP methodology may provide a valuable option for quality assessment of providers in places, or for health conditions, where other assessment tools are unworkable.

  16. Obtaining valid laboratory data in clinical trials conducted in resource diverse settings: lessons learned from a microbicide phase III clinical trial.

    Directory of Open Access Journals (Sweden)

    Tania Crucitti

    2010-10-01

    Full Text Available Over the last decade several phase III microbicide trials have been conducted in developing countries. However, laboratories in resource-constrained settings do not always have the experience, infrastructure, and capacity to deliver laboratory data meeting the high standards of clinical trials. This paper describes the design and outcomes of a laboratory quality assurance program which was implemented during a phase III clinical trial evaluating the efficacy of the candidate microbicide Cellulose Sulfate 6% (CS) [1]. In order to assess the effectiveness of CS for HIV and STI prevention, a phase III clinical trial was conducted in 5 sites: 3 in Africa and 2 in India. The trial sponsor identified an International Central Reference Laboratory (ICRL), responsible for the design and management of a quality assurance program which would guarantee the reliability of laboratory data. The ICRL provided advice on the tests, assessed local laboratories, organized trainings, conducted supervision visits, performed re-tests, and prepared control panels. Local laboratories were provided with control panels for HIV rapid tests and the Chlamydia trachomatis/Neisseria gonorrhoeae (CT/NG) amplification technique. Aliquots from the respective control panels were tested by local laboratories and compared with results obtained at the ICRL. Overall, good results were observed. However, discordances between the ICRL and site laboratories were identified for HIV and CT/NG results. One particular site experienced difficulties with HIV rapid testing shortly after study initiation. At all sites, DNA contamination was identified as a cause of invalid CT/NG results. Both problems were detected and solved in a timely manner. Through immediate feedback, guidance and repeated training of laboratory staff, additional inaccuracies were prevented. Quality control guidelines, when applied in field laboratories, ensured the reliability and validity of final study data. It is essential that sponsors

  17. Validation of a new method for testing provider clinical quality in rural settings in low- and middle-income countries: the observed simulated patient.

    Science.gov (United States)

    Aung, Tin; Montagu, Dominic; Schlein, Karen; Khine, Thin Myat; McFarland, Willi

    2012-01-01

    Assessing the quality of care provided by individual health practitioners is critical to identifying possible risks to the health of the public. However, existing assessment methods can be inaccurate, expensive, or infeasible in many developing country settings, particularly in rural areas and especially for children. Following an assessment of the strengths and weaknesses of the existing methods for provider assessment, we developed a synthesis method combining components of direct observation, clinical vignettes, and medical mannequins which we have termed "Observed Simulated Patient" or OSP. An OSP assessment involves a trained actor playing the role of a 'mother', a life-size doll representing a 5-year old boy, and a trained observer. The provider being assessed was informed in advance of the role-playing, and told to conduct the diagnosis and treatment as he normally would while verbally describing the examinations. We tested the validity of OSP by conducting parallel scoring of medical providers in Myanmar, assessing the quality of their diagnosis and treatment of pediatric malaria, first by direct observation of true patients and second by OSP. Data were collected from 20 private independent medical practitioners in Mon and Kayin States, Myanmar between December 26, 2010 and January 12, 2011. All areas of assessment showed agreement between OSP and direct observation above 90% except for history taking related to past experience with malaria medicines. In this area, providers did not ask questions of the OSP to the same degree that they questioned real patients (agreement 82.8%). The OSP methodology may provide a valuable option for quality assessment of providers in places, or for health conditions, where other assessment tools are unworkable.

  18. Validation of the Work Observation Method By Activity Timing (WOMBAT) method of conducting time-motion observations in critical care settings: an observational study

    Directory of Open Access Journals (Sweden)

    Gibney RT Noel

    2011-05-01

    Full Text Available Abstract Background Electronic documentation handling may facilitate information flows in health care settings to support better coordination of care among Health Care Providers (HCPs), but evidence is limited. Methods that accurately depict changes to the workflows of HCPs are needed to assess whether the introduction of a Critical Care clinical Information System (CCIS) to two Intensive Care Units (ICUs) represents a positive step for patient care. The aim was to evaluate a previously described method of quantifying the amounts of time spent and interruptions encountered by HCPs working in two ICUs. Methods Observers used PDAs running the Work Observation Method By Activity Timing (WOMBAT) software to record the tasks performed by HCPs in advance of the introduction of the CCIS, in order to quantify the amounts of time spent on tasks and the interruptions encountered by HCPs in the ICUs. Results We report the percentages of time spent on each task category and the rates of interruptions observed for physicians, nurses, respiratory therapists, and unit clerks. Compared with previously published data from Australian hospital wards, interdisciplinary information sharing and communication in ICUs explain the higher proportions of time spent on professional communication and documentation by nurses and physicians, as well as the more frequent interruptions, which are often followed by professional communication tasks. Conclusions Critical care workloads include requirements for timely information sharing and communication and explain the differences we observed between the two datasets. The data presented here further validate the WOMBAT method and support plans to compare workflows before and after the introduction of electronic documentation methods in ICUs.

  19. Challenges in Identifying Patients with Type 2 Diabetes for Quality-Improvement Interventions in Primary Care Settings and the Importance of Valid Disease Registries.

    Science.gov (United States)

    Wozniak, Lisa; Soprovich, Allison; Rees, Sandra; Johnson, Steven T; Majumdar, Sumit R; Johnson, Jeffrey A

    2015-10-01

    Patient registries are considered an important foundation of chronic disease management, and diabetes patient registries are associated with better processes and outcomes of care. The purpose of this article is to describe the development and use of registries in the Alberta's Caring for Diabetes (ABCD) project to identify and reach target populations for quality-improvement interventions in the primary care setting. We applied the reach, effectiveness, adoption, implementation and maintenance (RE-AIM) framework and expanded the definition of reach beyond the individual (i.e. patient) level to include the ability to identify target populations at an organizational level. To characterize reach and the implementation of registries, semistructured interviews were conducted with key informants, and a usual-care checklist was compiled for each participating Primary Care Network (PCN). Content analysis was used to analyze qualitative data. Using registries to identify and recruit participants for the ABCD interventions proved challenging. The quality of the registries depended on whether physicians granted PCN access to patient lists, the strategies used in development, the reliability of diagnostic information and the data elements collected. In addition, once a diabetes registry was developed, there was limited ability to update it. Proactive management of chronic diseases like diabetes requires the ability to reach targeted patients at the population level. We observed several challenges to the development and application of patient registries. Given the importance of valid registries, strong collaborations and novel strategies that involve policy-makers, PCNs and providers are needed to help find solutions to improve registry quality and resolve maintenance issues. Copyright © 2015 Canadian Diabetes Association. Published by Elsevier Inc. All rights reserved.

  20. Development and validation of an acute kidney injury risk index for patients undergoing general surgery: results from a national data set.

    Science.gov (United States)

    Kheterpal, Sachin; Tremper, Kevin K; Heung, Michael; Rosenberg, Andrew L; Englesbe, Michael; Shanks, Amy M; Campbell, Darrell A

    2009-03-01

    The authors sought to identify the incidence, risk factors, and mortality impact of acute kidney injury (AKI) after general surgery using a large and representative national clinical data set. The 2005-2006 American College of Surgeons-National Surgical Quality Improvement Program participant use data file is a compilation of outcome data from general surgery procedures performed in 121 US medical centers. The primary outcome was AKI within 30 days, defined as an increase in serum creatinine of at least 2 mg/dl or acute renal failure necessitating dialysis. A variety of patient comorbidities and operative characteristics were evaluated as possible predictors of AKI. A logistic regression full model fit was used to create an AKI model and risk index. Thirty-day mortality among patients with and without AKI was compared. Of 152,244 operations reviewed, 75,952 met the inclusion criteria, and 762 (1.0%) were complicated by AKI. The authors identified 11 independent preoperative predictors: age 56 yr or older, male sex, emergency surgery, intraperitoneal surgery, diabetes mellitus necessitating oral therapy, diabetes mellitus necessitating insulin therapy, active congestive heart failure, ascites, hypertension, mild preoperative renal insufficiency, and moderate preoperative renal insufficiency. The c statistic for a simplified risk index was 0.80 in the derivation and validation cohorts. Class V patients (six or more risk factors) had a 9% incidence of AKI. Overall, patients experiencing AKI had an eightfold increase in 30-day mortality. Approximately 1% of general surgery cases are complicated by AKI. The authors have developed a robust risk index based on easily identified preoperative comorbidities and patient characteristics.
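
    The risk-index approach described above (fit a logistic model on binary preoperative predictors, then simplify it into risk classes and check discrimination with the c-statistic) can be sketched on simulated data as follows; nothing below uses the NSQIP data or the paper's coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Simulated cohort: 11 binary preoperative risk factors and a rare outcome (AKI).
rng = np.random.default_rng(0)
n, k = 20000, 11
X = (rng.random((n, k)) < 0.2).astype(int)
logit = -5.5 + X @ np.full(k, 0.5)                 # assumed true model: each factor adds risk
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("c-statistic (full model):",
      round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))

# Simplified risk index: count of risk factors, grouped into classes.
risk_count = X_te.sum(axis=1)
print("c-statistic (risk-factor count):", round(roc_auc_score(y_te, risk_count), 3))
risk_class = np.clip(risk_count, 0, 5)             # top class groups 5+ factors here (illustrative)
for c in range(6):
    sel = risk_class == c
    if sel.any():
        print(f"class {c}: outcome incidence {y_te[sel].mean():.3%} (n={sel.sum()})")
```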

  1. Improved diagnostic accuracy of Alzheimer's disease by combining regional cortical thickness and default mode network functional connectivity: Validated in the Alzheimer's disease neuroimaging initiative set

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ji Eun; Park, Bum Woo; Kim, Sang Joon; Kim, Ho Sung; Choi, Choong Gon; Jung, Seung Jung; Oh, Joo Young; Shim, Woo Hyun [Dept. of Radiology and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of); Lee, Jae Hong; Roh, Jee Hoon [University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of)

    2017-11-15

    To identify potential imaging biomarkers of Alzheimer's disease by combining brain cortical thickness (CThk) and functional connectivity, and to validate this model's diagnostic accuracy in a validation set. Data from 98 subjects were retrospectively reviewed, including a study set (n = 63) and a validation set from the Alzheimer's Disease Neuroimaging Initiative (n = 35). From each subject, data for CThk and functional connectivity of the default mode network were extracted from structural T1-weighted and resting-state functional magnetic resonance imaging. Cortical regions with significant differences between patients and healthy controls in the correlation of CThk and functional connectivity were identified in the study set. The diagnostic accuracy of functional connectivity measures combined with CThk in the identified regions was evaluated against that in the medial temporal lobes using the validation set and application of a support vector machine. Group-wise differences in the correlation of CThk and default mode network functional connectivity were identified in the superior temporal (p < 0.001) and supramarginal gyrus (p = 0.007) of the left cerebral hemisphere. Default mode network functional connectivity combined with the CThk of those two regions was more accurate than that combined with the CThk of both medial temporal lobes (91.7% vs. 75%). Combining functional information with CThk of the superior temporal and supramarginal gyri in the left cerebral hemisphere improves diagnostic accuracy, making it a potential imaging biomarker for Alzheimer's disease.
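
    The classification step described above (combining regional cortical-thickness values with default-mode-network connectivity and feeding them to a support vector machine) can be sketched as follows with simulated feature values rather than ADNI data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Simulated features per subject:
# [CThk superior temporal, CThk supramarginal, DMN functional connectivity]
rng = np.random.default_rng(0)
n_per_group = 30
controls = rng.normal([2.8, 2.6, 0.45], 0.15, size=(n_per_group, 3))
patients = rng.normal([2.5, 2.4, 0.30], 0.15, size=(n_per_group, 3))

X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)   # 0 = control, 1 = patient

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {acc.mean():.1%} (+/- {acc.std():.1%})")
```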

  2. Role of work hardening characteristics of matrix alloys in the ...

    Indian Academy of Sciences (India)

    The strengthening of particulate reinforced metal–matrix composites is associated with a high dislocation density in the matrix due to the difference in coefficient of thermal expansion between the reinforcement and the matrix. While this is valid, the role of work hardening characteristics of the matrix alloys in strengthening of ...

  3. In-depth, high-accuracy proteomics of sea urchin tooth organic matrix

    Directory of Open Access Journals (Sweden)

    Mann Matthias

    2008-12-01

    Full Text Available Abstract Background The organic matrix contained in biominerals plays an important role in regulating mineralization and in determining biomineral properties. However, most components of biomineral matrices remain unknown at present. In sea urchin tooth, which is an important model for developmental biology and biomineralization, only a few matrix components have been identified. The recent publication of the Strongylocentrotus purpuratus genome sequence rendered possible not only the identification of genes potentially coding for matrix proteins, but also the direct identification of proteins contained in matrices of skeletal elements by in-depth, high-accuracy proteomic analysis. Results We identified 138 proteins in the matrix of tooth powder. Only 56 of these proteins were previously identified in the matrices of test (shell) and spine. Among the novel components was an interesting group of five proteins containing alanine- and proline-rich neutral or basic motifs separated by acidic glycine-rich motifs. In addition, four of the five proteins contained either one or two predicted Kazal protease inhibitor domains. The major components of tooth matrix were however largely identical to the set of spicule matrix proteins and MSP130-related proteins identified in test (shell) and spine matrix. Comparison of the matrices of crushed teeth to intact teeth revealed a marked dilution of known intracrystalline matrix proteins and a concomitant increase in some intracellular proteins. Conclusion This report presents the most comprehensive list of sea urchin tooth matrix proteins available at present. The complex mixture of proteins identified may reflect many different aspects of the mineralization process. A comparison between intact tooth matrix, presumably containing odontoblast remnants, and crushed tooth matrix served to differentiate between matrix components and possible contributions of cellular remnants. Because LC-MS/MS-based methods directly

  4. Matrix Depot: an extensible test matrix collection for Julia

    Directory of Open Access Journals (Sweden)

    Weijian Zhang

    2016-04-01

    Full Text Available Matrix Depot is a Julia software package that provides easy access to a large and diverse collection of test matrices. Its novelty is threefold. First, it is extensible by the user, and so can be adapted to include the user's own test problems. In doing so, it facilitates experimentation and makes it easier to carry out reproducible research. Second, it amalgamates in a single framework two different types of existing matrix collections, comprising parametrized test matrices (including Hansen's set of regularization test problems and Higham's Test Matrix Toolbox) and real-life sparse matrix data (giving access to the University of Florida sparse matrix collection). Third, it fully exploits the Julia language. It uses multiple dispatch to help provide a simple interface and, in particular, to allow matrices to be generated in any of the numeric data types supported by the language.

  5. Matrix analysis

    CERN Document Server

    Bhatia, Rajendra

    1997-01-01

    A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...

  6. Block Hadamard measurement matrix with arbitrary dimension in compressed sensing

    Science.gov (United States)

    Liu, Shaoqiang; Yan, Xiaoyan; Fan, Xiaoping; Li, Fei; Xu, Wen

    2017-01-01

    As the Hadamard measurement matrix cannot be used for compressing signals whose dimension is not an integral power of 2, this paper proposes a method for constructing block Hadamard measurement matrices of arbitrary dimension. According to the dimension N of the signals to be measured, first, a set of Hadamard sub-matrices with different dimensions is constructed such that the sum of these dimensions equals N. Then, the Hadamard sub-matrices are arranged in a certain order to form a block diagonal matrix. Finally, the first M rows of the block diagonal matrix are taken as the measurement matrix. The proposed measurement matrix, which retains the orthogonality of the Hadamard matrix and the sparsity of a block diagonal matrix, has a highly sparse structure, simple hardware implementation, and general applicability. Simulation results show that the performance of our measurement matrix is better than that of the Gaussian matrix, the Logistic chaotic matrix, and the Toeplitz matrix.
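
    The construction described above is easy to prototype: partition N into power-of-two block sizes, place the corresponding Hadamard matrices on a block diagonal, and keep the first M rows. The sketch below uses a greedy partition, which is one possible choice; the paper treats the partition and the ordering of the blocks as design parameters.

```python
import numpy as np
from scipy.linalg import hadamard, block_diag

def block_hadamard_measurement(N, M):
    """Build an M x N measurement matrix from Hadamard blocks whose sizes sum to N.

    Greedy power-of-two partition of N, block-diagonal assembly, then the
    first M rows are kept, following the construction described above.
    """
    sizes, rest = [], N
    while rest > 0:
        p = 2 ** int(np.floor(np.log2(rest)))   # largest power of two <= rest
        sizes.append(p)
        rest -= p
    H = block_diag(*[hadamard(s) for s in sizes]).astype(float)
    return H[:M] / np.sqrt(N)

Phi = block_hadamard_measurement(N=300, M=120)      # 300 = 256 + 32 + 8 + 4
print(Phi.shape, np.count_nonzero(Phi) / Phi.size)  # shape and fill ratio
```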

  7. Protein linear indices of the 'macromolecular pseudograph alpha-carbon atom adjacency matrix' in bioinformatics. Part 1: prediction of protein stability effects of a complete set of alanine substitutions in Arc repressor.

    Science.gov (United States)

    Marrero-Ponce, Yovani; Medina-Marrero, Ricardo; Castillo-Garit, Juan A; Romero-Zaldivar, Vicente; Torrens, Francisco; Castro, Eduardo A

    2005-04-15

    A novel approach to bio-macromolecular design from a linear algebra point of view is introduced. A protein's total (whole protein) and local (one or more amino acid) linear indices are a new set of bio-macromolecular descriptors of relevance to protein QSAR/QSPR studies. These amino-acid level biochemical descriptors are based on the calculation of linear maps on R^n [f_k(x_mi): R^n --> R^n] in the canonical basis. These bio-macromolecular indices are calculated from the kth power of the macromolecular pseudograph alpha-carbon atom adjacency matrix. Total linear indices are linear functionals on R^n. That is, the kth total linear indices are linear maps from R^n to the scalar R [f_k(x_m): R^n --> R]. Thus, the kth total linear indices are calculated by summing the amino-acid linear indices of all amino acids in the protein molecule. A study of the protein stability effects for a complete set of alanine substitutions in the Arc repressor illustrates this approach. A quantitative model that discriminates near wild-type stability alanine mutants from the reduced-stability ones in a training series was obtained. This model permitted the correct classification of 97.56% (40/41) and 91.67% (11/12) of proteins in the training and test set, respectively. It shows a high Matthews correlation coefficient (MCC=0.952) for the training set and an MCC=0.837 for the external prediction set. Additionally, canonical regression analysis corroborated the statistical quality of the classification model (Rcanc=0.824). This analysis was also used to compute biological stability canonical scores for each Arc alanine mutant. On the other hand, the linear piecewise regression model compared favorably with respect to the linear regression one on predicting the melting temperature (tm) of the Arc alanine mutants. The linear model explains almost 81% of the variance of the experimental tm (R=0.90 and s=4.29) and the LOO press statistics evidenced its predictive ability (q2=0.72 and scv=4.79). Moreover, the
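
    The core calculation, namely the kth power of the alpha-carbon adjacency matrix applied to a vector of amino-acid properties with the local indices summed into a total index, can be written in a few lines. The sketch below is only a toy illustration of that linear-algebra step, assuming NumPy; the 4-residue adjacency matrix and property values are invented and are not the paper's descriptors.

    ```python
    import numpy as np

    def linear_indices(A, x, k):
        """Local (per-residue) and total kth linear indices: apply the kth
        power of the adjacency matrix A to the property vector x and sum."""
        local = np.linalg.matrix_power(A, k) @ x   # local index for each amino acid
        total = local.sum()                        # kth total linear index
        return local, total

    # Toy 4-residue "pseudograph": adjacency of consecutive alpha-carbons.
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    x = np.array([1.8, -3.5, 2.5, -0.4])           # e.g. hydrophobicity values

    for k in range(3):
        loc, tot = linear_indices(A, x, k)
        print(k, loc, tot)
    ```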

  8. The validity of drug users' self-reports in a non-treatment setting: prevalence and predictors of incorrect reporting methadone treatment modalities

    NARCIS (Netherlands)

    Langendam, M. W.; van Haastrecht, H. J.; van Ameijden, E. J.

    1999-01-01

    Epidemiological studies among drug users are often based on retrospective self-reports. However, among others, memory failure, being under the influence of drugs, psychopathology, misunderstanding of questions and socially desirable answering may generate inaccurate reporting. This study validated

  9. Development of the Nurses' Care Coordination Competency Scale for mechanically ventilated patients in critical care settings in Japan: Part 2 Validation of the scale.

    Science.gov (United States)

    Takiguchi, Chie; Yatomi, Yumiko; Inoue, Tomoko

    2017-12-01

    To confirm the validity and reliability of the nurses' care coordination competency draft scale for mechanically ventilated patients in Japan. In this cross-sectional observational study, a draft scale measuring care coordination was distributed to 2189 nurses from 73 intensive care units in Japan from February-March 2016. Based on the 887 valid responses, we examined construct validity including structural validity (exploratory and confirmatory factor analysis), convergent and discriminant validity, and internal consistency reliability. The setting comprised 73 intensive care units. Exploratory factor analyses yielded four factors with 22 items: 1) promoting team cohesion, 2) understanding care coordination needs, 3) aggregating and disseminating information, 4) devising and clearly articulating the care vision. The four-factor model was confirmed using a confirmatory factor analysis (comparative fit index=0.942, root mean square error of approximation=0.062). Scale scores positively correlated with team leadership and clearly identified and discriminated nurses' attributes. Cronbach's alpha coefficient for each subscale was between 0.812 and 0.890, and 0.947 for the total scale. The Nurses' Care Coordination Competency Scale with four factors and 22 items had sufficient validity and reliability. The scale could make care coordination visible in nursing practice. Future research on the relationship between this scale and patient outcomes is needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
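
    Because the reliability figures above hinge on Cronbach's alpha, a short generic helper may clarify how that coefficient is obtained from an items-by-respondents score matrix. This is a sketch on synthetic data, assuming NumPy; it is not the study's data or analysis code.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha; items is a 2-D array, rows = respondents, columns = scale items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()    # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
        return (k / (k - 1)) * (1 - item_vars / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))                      # shared "competency" factor
    responses = latent + 0.5 * rng.normal(size=(200, 22))   # 22 correlated items
    print(round(cronbach_alpha(responses), 3))              # a high alpha is expected here
    ```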

  10. Design and validation of a job-exposure matrix to silica [Delineamento e validação de matriz de exposição ocupacional à sílica]

    Directory of Open Access Journals (Sweden)

    Fátima Sueli Neto Ribeiro

    2005-01-01

    Full Text Available OBJECTIVE: To develop a population-based job-exposure matrix for crystalline silica in Brazil and to estimate its validity. METHODS: The job-exposure matrix was developed by an epidemiologist and an occupational hygienist in four steps: a) coding of the occupation variable; b) coding of the economic sector variable; c) classification of exposure by consensus among the researchers; and d) estimation of the number of workers registered in 1995 for each exposure level. The 8,675 matrix cells, formed by the intersection of the economic sector (25 columns) and occupation (347 rows) variables, were classified according to the frequency of exposure to silica into four levels: unexposed, possibly exposed, probably exposed and definitely exposed. For validation of the job-exposure matrix, five economic sectors (mineral extraction, civil construction, metallurgy, administration of technical personnel services and textile industry) were re-coded for exposure by invited experts. Reliability was assessed by the proportion of agreement and the degree of agreement by the Kappa statistic. RESULTS: The job-exposure matrix showed high overall agreement, ranging from 64.0% in metallurgy to 94.4% in mineral extraction. The Kappa statistic revealed good agreement in the mineral extraction sector (0.9) and low to fair agreement in the others (range 0.1 to 0.5). Specificity was high for the metallurgy (86.5%) and mineral extraction (100.0%) sectors. Civil construction showed a specificity of 56.0%. CONCLUSIONS: The job-exposure matrix showed good accuracy, proving adequate for estimating silica exposure among the employed workforce in Brazil.

  11. The Maudsley Obsessive-Compulsive Stimuli Set: validation of a standardized paradigm for symptom-specific provocation in obsessive-compulsive disorder.

    NARCIS (Netherlands)

    Mataix-Cols, D.; Lawrence, N.S.; Wooderson, S.; Speckens, A.E.M.; Phillips, M.L.

    2009-01-01

    This article describes and further validates a standardized symptom-provocation procedure that combines symptom-specific audio instructions and pictures to reliably provoke different kinds of symptom-specific anxiety in obsessive-compulsive disorder, corresponding to its four major symptom

  12. Validity, reliability and utility of the Irish Nursing Minimum Data Set for General Nursing in investigating the effectiveness of nursing interventions in a general nursing setting: A repeated measures design.

    LENUS (Irish Health Repository)

    Morris, Roisin

    2013-08-06

    Internationally, nursing professionals are coming under increasing pressure to highlight the contribution they make to health care and patient outcomes. Despite this, difficulties exist in the provision of quality information aimed at describing nursing work in sufficient detail. The Irish Minimum Data Set for General Nursing is a new nursing data collection system aimed at highlighting the contribution of nursing to patient care.

  13. Cardiff cardiac ablation patient-reported outcome measure (C-CAP): validation of a new questionnaire set for patients undergoing catheter ablation for cardiac arrhythmias in the UK.

    Science.gov (United States)

    White, Judith; Withers, Kathleen L; Lencioni, Mauro; Carolan-Rees, Grace; Wilkes, Antony R; Wood, Kathryn A; Patrick, Hannah; Cunningham, David; Griffith, Michael

    2016-06-01

    To formally test and validate a patient-reported outcome measure (PROM) for patients with cardiac arrhythmias undergoing catheter ablation procedures in the UK [Cardiff Cardiac Ablation PROM (C-CAP)]. A multicentre, prospective, observational cohort study with consecutive patient enrolment from three UK sites was conducted. Patients were sent C-CAP questionnaires before and after an ablation procedure. Pre-ablation C-CAP1 (17 items) comprised four domains: patient expectations; condition and symptoms; restricted activity and healthcare visits; medication and general health. Post-ablation C-CAP2 (19 items) comprised five domains including change in symptoms and procedural complications. Both questionnaires also included the generic EQ-5D-5L tool (EuroQol). Reliability, validity, and responsiveness measures were calculated. A total of 517 valid pre-ablation and 434 post-ablation responses were received; questionnaires showed good feasibility and item acceptability. Internal consistency was good (Cronbach's alpha >0.7) and test-retest reliability was acceptable for all scales. C-CAP scales showed high responsiveness (effect size >0.8). Patients improved significantly (p cardiac arrhythmias. C-CAP questionnaires provide a tool with disease-specific and generic domains to explore how cardiac ablation procedures in the UK impact upon patients' lives.

  14. Translation and validation of the breast feeding self efficacy scale into the Kiswahili language in resource restricted setting in Thika – Kenya

    Directory of Open Access Journals (Sweden)

    D.M Mituki

    2017-01-01

    Full Text Available Background: Exclusive breastfeeding (EBF) is one of the most cost-effective, health-promoting, and disease-preventing interventions and has been referred to as the cornerstone of child survival. Many mothers, however, discontinue EBF before the end of the six months recommended by the World Health Organization (WHO), some due to psychosocial issues. The Breastfeeding Self-Efficacy Scale-Short Form (BSES-SF) has been used to establish mothers' self-efficacy towards breastfeeding by computing breastfeeding self-efficacy (BSE) scores. These scores have been used globally to predict EBF duration. Internationally accepted tools can be used to compare data across countries. Such tools, however, need to be translated into local languages for different countries and set-ups. Objectives: The aim of the study was to translate and validate the English BSES-SF into Kiswahili, the national language in Kenya. Methods: The study was a pilot study within the main cluster randomized longitudinal study. Pregnant women at 37 weeks gestation were randomly placed into intervention (n=21) and comparison (n=21) groups. The BSES-SF questionnaire was used to collect data on BSE at baseline, and another questionnaire was used to collect socio-economic data. Mothers in the intervention group were educated on the importance of exclusive breastfeeding (EBF) and the skills required, while those in the comparison group went through the usual care provided at the health facility. Nutrition education was tailored to promoting maternal BSE. Results: The translated BSES-SF was found to be easy to understand; it showed good consistency and semantic validity. Predictive validity was demonstrated through significant mean differences between the groups. The intervention group had higher EBF rates at 6 weeks post-partum (χ2=6.170, p=0.013). The Cronbach's alpha coefficient for the Kiswahili version of the BSES-SF was 0.91, with a mean score of 60.95 (SD ±10.36) and an item mean of 4.354. Conclusion

  15. On the validity of setting breakpoint minimum inhibition concentrations at one quarter of the plasma concentration achieved following oral administration of oxytetracycline

    DEFF Research Database (Denmark)

    Coyne, R.; Samuelsen, O.; Bergh, Ø.

    2004-01-01

    .03125–0.0625 mg/l. These breakpoint values would, therefore, predict that the therapy should have had no beneficial effect and that any strain of A. salmonicida with MIC>0.0625 mg/l must be considered as resistant. A consideration of the pattern of the mortalities before and during the period of therapy suggests...... administration of OTC medicated feed were applied to investigate the validity of the application of the 4:1 ratio. Breakpoints generated by the application of this ratio to these data would suggest that OTC could never have had any value in combating A. salmonicida infections. As this conclusion is contrary...

  16. Use of international data sets to evaluate and validate pathway assessment models applicable to exposure and dose reconstruction at DOE facilities. Monthly progress reports and final report, October--December 1994

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, F.O. [Senes Oak Ridge, Inc., TN (United States). Center for Risk Analysis

    1995-04-01

    The objective of Task 7.1D was to (1) establish a collaborative US-USSR effort to improve and validate our methods of forecasting doses and dose commitments from the direct contamination of food sources, and (2) perform experiments and validation studies to improve our ability to predict rapidly and accurately the long-term internal dose from the contamination of agricultural soil. At early times following an accident, the direct contamination of pasture and foodstuffs, particularly leafy vegetation and grain, can be of great importance. This situation has been modeled extensively. However, models employed then to predict the deposition, retention and transport of radionuclides in terrestrial environments employed concepts and data bases that were more than a decade old. The extent to which these models have been tested with independent data sets was limited. The data gathered in the former USSR (and elsewhere throughout the Northern Hemisphere) offered a unique opportunity to test model predictions of wet and dry deposition, agricultural foodchain bioaccumulation, and short- and long-term retention, redistribution, and resuspension of radionuclides from a variety of natural and artificial surfaces. The current objective of this project is to evaluate and validate pathway-assessment models applicable to exposure and dose reconstruction at DOE facilities through use of international data sets. This project incorporates the activity of Task 7.1D into a multinational effort to evaluate models and data used for the prediction of radionuclide transfer through agricultural and aquatic systems to humans. It also includes participation in two studies, BIOMOVS (BIOspheric MOdel Validation Study) with the Swedish National Institute for Radiation Protection and VAMP (VAlidation of Model Predictions) with the International Atomic Energy Agency, that address testing the performance of models of radionuclide transport through foodchains.

  17. Community-based cross-cultural adaptation of mental health measures in emergency settings: validating the IES-R and HSCL-37A in Eastern Democratic Republic of Congo.

    Science.gov (United States)

    Mels, Cindy; Derluyn, Ilse; Broekaert, Eric; Rosseel, Yves

    2010-09-01

    This study aims at providing qualitative and quantitative evidence on the relevance of two broadly used mental health self-report measures--Impact of Event Scale Revised (IES-R) and Hopkins Symptom Checklist 37 for Adolescents (HSCL-37A)--for use in the Eastern Democratic Republic of Congo, as no psychological assessment instruments were available for this region. We therefore describe an apt procedure to adapt and translate standard screening instruments in close collaboration with the local community, feasible under challenging conditions in emergency settings. Focus groups and interviews with community key figures in psychosocial care were employed to ensure local validity of the adaptation and translation process. Consequently, the questionnaires' internal consistency (Cronbach's alpha) and construct validity (principal component analysis, testing of theoretical assumptions) were assessed based on a clustered school-based community survey among 1,046 adolescents (13-21 years) involving 13 secondary schools in the Ituri district in the Eastern Democratic Republic of Congo. Key-informant qualitative data confirmed face and construct validity of all IES-R and all HSCL-37A anxiety items. Additional culture-specific symptoms of adolescent mental ill-health were added to enhance local relevance of the HSCL-37A depression and externalizing subscales. Quantitative analysis of the survey data revealed adequate internal consistency and construct validity of both adapted measures, yet weaker results for the externalizing scale. Furthermore, it confirmed the internalizing/externalizing factor structure of the HSCL-37A and the theoretically deviating intrusion/arousal versus active avoidance factor structure for the IES-R. Community-based adaptation can extend the validity and local relevance of mental health screening in emergency and low-income settings. The availability of adequate Swahili and Congolese French adaptations of the IES-R and HSCL-37A could stimulate the assessment of

  18. Delirium in the intensive care setting: A reevaluation of the validity of the CAM-ICU and ICDSC versus the DSM-IV-TR in determining a diagnosis of delirium as part of the daily clinical routine.

    Science.gov (United States)

    Boettger, Soenke; Nuñez, David Garcia; Meyer, Rafael; Richter, André; Fernandez, Susana Franco; Rudiger, Alain; Schubert, Maria; Jenewein, Josef

    2017-12-01

    In the intensive care setting, delirium is a common occurrence that comes with subsequent adversities. Therefore, several instruments have been developed to screen for and detect delirium. Their validity and psychometric properties, however, remain controversial. In this prospective cohort study, the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) and the Intensive Care Delirium Screening Checklist (ICDSC) were evaluated versus the DSM-IV-TR in the diagnosis of delirium with respect to their validity and psychometric properties. Out of some 289 patients, 210 with matching CAM-ICU, ICDSC, and DSM-IV-TR diagnoses were included. Between the scales, the prevalence of delirium ranged from 23.3% with the CAM-ICU, to 30.5% with the ICDSC, to 43.8% with the DSM-IV-TR criteria. The CAM-ICU showed only moderate concurrent validity (Cohen's κ = 0.44) and sensitivity (50%), but high specificity (95%). The ICDSC also reached moderate agreement (Cohen's κ = 0.60) and sensitivity (63%) while being very specific (95%). Between the CAM-ICU and the ICDSC, the concurrent validity was again only moderate (Cohen's κ = 0.56); however, the ICDSC yielded higher sensitivity and specificity (78 and 83%, respectively). In the daily clinical routine, neither the CAM-ICU nor the ICDSC, common tools used in screening and detecting delirium in the intensive care setting, reached sufficient concurrent validity; nor did they outperform the DSM-IV-TR diagnostic criteria with respect to sensitivity or positive prediction, but they were very specific. Thus, the non-prediction by the CAM-ICU or ICDSC did not refute the presence of delirium. Between the CAM-ICU and ICDSC, the ICDSC proved to be the more accurate instrument.
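
    The concurrent-validity statistics quoted above (sensitivity, specificity and Cohen's kappa against a reference diagnosis) all follow from a 2x2 cross-tabulation of screening result versus reference standard. A small illustrative helper, using made-up counts rather than the study's data:

    ```python
    def diagnostic_stats(tp, fp, fn, tn):
        """Sensitivity, specificity and Cohen's kappa from a 2x2 table
        (screening tool vs. reference diagnosis, e.g. DSM-IV-TR)."""
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        po = (tp + tn) / n                                             # observed agreement
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2    # chance agreement
        kappa = (po - pe) / (1 - pe)
        return sens, spec, kappa

    # Hypothetical counts for a screening tool against a reference standard.
    print(diagnostic_stats(tp=40, fp=6, fn=40, tn=114))
    ```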

  19. Multiple graph regularized nonnegative matrix factorization

    KAUST Repository

    Wang, Jim Jing-Yan

    2013-10-01

    Non-negative matrix factorization (NMF) has been widely used as a data representation method based on components. To overcome the disadvantage of NMF in failing to consider the manifold structure of a data set, graph regularized NMF (GrNMF) has been proposed by Cai et al. by constructing an affinity graph and searching for a matrix factorization that respects graph structure. Selecting a graph model and its corresponding parameters is critical for this strategy. This process is usually carried out by cross-validation or discrete grid search, which are time consuming and prone to overfitting. In this paper, we propose a GrNMF, called MultiGrNMF, in which the intrinsic manifold is approximated by a linear combination of several graphs with different models and parameters inspired by ensemble manifold regularization. Factorization matrices and linear combination coefficients of graphs are determined simultaneously within a unified objective function. They are alternately optimized in an iterative algorithm, thus resulting in a novel data representation algorithm. Extensive experiments on a protein subcellular localization task and an Alzheimer's disease diagnosis task demonstrate the effectiveness of the proposed algorithm. © 2013 Elsevier Ltd. All rights reserved.
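
    For readers who want to see the building block that MultiGrNMF combines across several graphs, the following is a compact NumPy sketch of single-graph regularized NMF in the style of Cai et al., with multiplicative updates for the objective ||X - U V^T||^2 + lambda * tr(V^T L V). The Gaussian-kernel affinity matrix and all sizes are illustrative assumptions, not the paper's experimental setup.

    ```python
    import numpy as np

    def gnmf(X, W, k, lam=1.0, iters=200, eps=1e-9):
        """Graph-regularized NMF: X (m x n) ~ U V^T with graph affinity W (n x n).
        Multiplicative updates with graph Laplacian L = D - W."""
        m, n = X.shape
        D = np.diag(W.sum(axis=1))                # degree matrix
        rng = np.random.default_rng(0)
        U = rng.random((m, k))
        V = rng.random((n, k))
        for _ in range(iters):
            U *= (X @ V) / (U @ V.T @ V + eps)
            V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
        return U, V

    # Toy data: 100 samples of 30 non-negative features, affinity from a Gaussian kernel.
    rng = np.random.default_rng(1)
    X = np.abs(rng.normal(size=(30, 100)))
    d2 = ((X.T[:, None, :] - X.T[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / d2.mean())
    U, V = gnmf(X, W, k=5)
    print(U.shape, V.shape)   # (30, 5) (100, 5)
    ```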

  20. Development of a validated liquid chromatographic method for quantification of sorafenib tosylate in the presence of stress-induced degradation products and in biological matrix employing analytical quality by design approach.

    Science.gov (United States)

    Sharma, Teenu; Khurana, Rajneet Kaur; Jain, Atul; Katare, O P; Singh, Bhupinder

    2017-12-15

    The current research work envisages an analytical quality by design-enabled development of a simple, rapid, sensitive, specific, robust and cost-effective stability-indicating reversed-phase high-performance liquid chromatographic method for determining stress-induced forced-degradation products of sorafenib tosylate (SFN). An Ishikawa fishbone diagram was constructed to embark upon analytical target profile and critical analytical attributes, i.e. peak area, theoretical plates, retention time and peak tailing. Factor screening using Taguchi orthogonal arrays and quality risk assessment studies carried out using failure mode effect analysis aided the selection of critical method parameters, i.e. mobile phase ratio and flow rate potentially affecting the chosen critical analytical attributes. Systematic optimization using response surface methodology of the chosen critical method parameters was carried out employing a two-factor, three-level, 13-run face-centered cubic design. A method operable design region was earmarked providing optimum method performance using numerical and graphical optimization. The optimum method employed a mobile phase composition consisting of acetonitrile and water (containing orthophosphoric acid, pH 4.1) at 65:35 v/v at a flow rate of 0.8 mL/min with UV detection at 265 nm using a C18 column. Response surface methodology validation studies confirmed good efficiency and sensitivity of the developed method for analysis of SFN in mobile phase as well as in human plasma matrix. The forced degradation studies were conducted under different recommended stress conditions as per ICH Q1A (R2). Mass spectroscopy studies showed that SFN degrades in strongly acidic, alkaline and oxidative hydrolytic conditions at elevated temperature, while the drug was per se found to be photostable. Oxidative hydrolysis using 30% H2O2 showed maximum degradation with products at retention times of 3.35, 3.65, 4.20 and 5.67 min. The absence of any
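
    The optimization step above rests on a two-factor, three-level, 13-run face-centred design (4 factorial, 4 axial and 5 centre runs in coded units). A sketch of how such a design grid can be generated and mapped onto factor ranges follows; the mobile-phase and flow-rate ranges shown are placeholders, not the authors' settings.

    ```python
    import numpy as np

    # Coded levels (-1, 0, +1) for a 2-factor face-centred central composite design.
    factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    axial     = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # points on the faces (alpha = 1)
    center    = [(0, 0)] * 5                           # replicated centre runs
    design = np.array(factorial + axial + center)      # 13 runs in total

    def decode(coded, low, high):
        """Map coded (-1..1) levels to real factor settings."""
        return low + (coded + 1) * (high - low) / 2

    # Placeholder ranges: % acetonitrile in the mobile phase and flow rate (mL/min).
    acn  = decode(design[:, 0], 55, 75)
    flow = decode(design[:, 1], 0.6, 1.0)
    for run, (a, f) in enumerate(zip(acn, flow), start=1):
        print(f"run {run:2d}: {a:4.1f}% ACN, {f:.2f} mL/min")
    ```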

  1. Performance validation in anatomic pathology: successful integration of a new classification system into the practice setting using the updated lung non-small cell carcinoma recommendations.

    Science.gov (United States)

    Murugan, Paari; Stevenson, Michael E; Hassell, Lewis A

    2014-01-01

    The new, international, multidisciplinary classification of lung adenocarcinoma, from the International Association for the Study of Lung Cancer/American Thoracic Society/European Respiratory Society, presents a paradigm shift for diagnostic pathologists. To validate our ability to apply the recommendations in reporting on non-small cell lung cancer cases. A test based on the new non-small cell lung cancer classification was administered to 16 pathology faculty members, senior residents, and fellows before and after major educational interventions, which included circulation of articles, electronic presentations, and live presentations by a well-known lung pathologist. Surgical and cytologic (including cell-block material) reports of lung malignancies for representative periods before and after the educational interventions were reviewed for compliance with the new guidelines. Cases were scored on a 3-point scale, with 1 indicating incorrect terminology and/or highly inappropriate stain use, 2 indicating correct diagnostic terminology with suboptimal stain use, and 3 indicating appropriate diagnosis and stain use. The actual error type was also evaluated. The average score on initial testing was 55%, increasing to 88% following the educational interventions (60% improvement). Of the 54 reports evaluated before intervention, participants scored 3 out of 3 points on 15 cases (28%), 2 of 3 on 31 cases (57%), and 1 of 3 on 8 cases (15%). Incorrect use of stains was noted in 23 of 54 cases (43%), incorrect terminology in 15 of 54 cases (28%), and inappropriate use of tissue, precluding possible molecular testing, in 4 out of 54 cases (7%). Of the 55 cases after intervention, participants scored 3 out of 3 points on 46 cases (84%), 2 of 3 on 8 cases (15%), and 1 of 3 on 1 case (2%). Incorrect use of stains was identified in 9 of 55 cases (16% of total reports), and inappropriate use of tissue, precluding possible molecular testing, was found in 1 of the 55 cases (2

  2. A systematic review and meta-analysis of the criterion validity of nutrition assessment tools for diagnosing protein-energy malnutrition in the older community setting (the MACRo study).

    Science.gov (United States)

    Marshall, Skye; Craven, Dana; Kelly, Jaimon; Isenring, Elizabeth

    2017-10-12

    Malnutrition is a significant barrier to healthy and independent ageing in older adults who live in their own homes, and accurate diagnosis is a key step in managing the condition. However, there has not been sufficient systematic review or pooling of existing data regarding malnutrition diagnosis in the geriatric community setting. The current paper was conducted as part of the MACRo (Malnutrition in the Ageing Community Review) Study and seeks to determine the criterion (concurrent and predictive) validity and reliability of nutrition assessment tools in making a diagnosis of protein-energy malnutrition in the general older adult community. A systematic literature review was undertaken using six electronic databases in September 2016. Studies in any language were included which measured malnutrition via a nutrition assessment tool in adults ≥65 years living in their own homes. Data relating to the predictive validity of tools were analysed via meta-analyses. GRADE was used to evaluate the body of evidence. There were 6412 records identified, of which 104 potentially eligible records were screened via full text. Eight papers were included; two which evaluated the concurrent validity of the Mini Nutritional Assessment (MNA) and Subjective Global Assessment (SGA) and six which evaluated the predictive validity of the MNA. The quality of the body of evidence for the concurrent validity of both the MNA and SGA was very low. The quality of the body of evidence for the predictive validity of the MNA in detecting risk of death was moderate (RR: 1.92 [95% CI: 1.55-2.39]; P assessment tool for diagnosing PEM in older adults in the community. High quality diagnostic accuracy studies are needed for all nutrition assessment tools used in older community samples, including measuring of health outcomes subsequent to nutrition assessment by the SGA and PG-SGA. Copyright © 2017 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  3. Concurrent validation of the Actigraph gt3x+, Polar Active accelerometer, Omron HJ-720 and Yamax Digiwalker SW-701 pedometer step counts in lab-based and free-living settings.

    Science.gov (United States)

    Lee, Joey A; Williams, Skip M; Brown, Dale D; Laurson, Kelly R

    2015-01-01

    Activity monitors are frequently used to assess activity in many settings. But as technology advances, so do the mechanisms used to estimate activity causing a continuous need to validate newly developed monitors. The purpose of this study was to examine the step count validity of the Yamax Digiwalker SW-701 pedometer (YX), Omron HJ-720 T pedometer (OP), Polar Active accelerometer (PAC) and Actigraph gt3x+ accelerometer (AG) under controlled and free-living conditions. Participants completed five stages of treadmill walking (n = 43) and a subset of these completed a 3-day free-living wear period (n = 37). Manually counted (MC) steps provided a criterion measure for treadmill walking, whereas the comparative measure during free-living was the YX. During treadmill walking, the OP was the most accurate monitor across all speeds (±1.1% of MC steps), while the PAC underestimated steps by 6.7-16.0% per stage. During free-living, the OP and AG counted 97.5% and 98.5% of YX steps, respectively. The PAC overestimated steps by 44.0%, or 5,265 steps per day. The Omron pedometer seems to provide the most reliable and valid estimate of steps taken, as it was the best performer under lab-based conditions and provided comparable results to the YX in free-living. Future studies should consider these monitors in additional populations and settings.

  4. Balloon-borne limb profiling of UV/vis skylight radiances, O3, NO2, and BrO: technical set-up and validation of the method

    Directory of Open Access Journals (Sweden)

    F. Weidner

    2005-01-01

    Full Text Available A novel light-weight, elevation scanning and absolutely calibrated UV/vis spectrometer and its application to balloon-borne limb radiance and trace gas profile measurements is described. Its performance and the novel method of balloon-borne UV/vis limb trace gas measurements have been tested against simultaneous observations of the same atmospheric parameters available from either (a) in-situ instrumentation (cf. an electrochemical cell (ECC) ozone sonde also deployed aboard the gondola) or (b) trace gas profiles inferred from UV/vis/near IR solar occultation measurements performed on the same payload. The novel technique is also cross validated with radiative transfer modeling. Reasonable agreement is found (a) between measured and simulated limb radiances and (b) between inferred limb O3, NO2, and BrO and correlative profile measurements when properly accounting for all relevant atmospheric parameters (temperature, pressure, aerosol extinction, and major absorbers).

  5. Development and validation of a specific and sensitive gas chromatography tandem mass spectrometry method for the determination of bisphenol A residues in a large set of food items.

    Science.gov (United States)

    Deceuninck, Y; Bichon, E; Durand, S; Bemrah, N; Zendong, Z; Morvan, M L; Marchand, P; Dervilly-Pinel, G; Antignac, J P; Leblanc, J C; Le Bizec, B

    2014-10-03

    BPA-containing products are widely used in foodstuffs packaging as authorized within the European Union (EU No. 10/2011). Therefore, foods and beverages are in contact with BPA, which can migrate from food contact material to foodstuffs. An accurate assessment of the exposure of the consumers to BPA is crucial for a non-ambiguous risk characterization. In this context, an efficient analytical method using gas chromatography coupled to tandem mass spectrometry (GC-MS/MS), in the selected reaction monitoring (SRM) mode, was developed for the quantification of BPA in foodstuffs at very low levels (<0.5 μg/kg). A standard operating procedure, based on the combination of two successive solid phase extractions (SPE), was developed for various liquid and solid foodstuffs. The use of 13C12-BPA as internal standard allowed accurate quantification of BPA by isotopic dilution. Control charts based on both blank and certified materials have been implemented to ensure analytical data quality. The developed analytical method has been validated according to in-house validation requirements. R2 was better than 0.9990 within the range [0-100 μg/kg], and the trueness was 4.2%. Repeatability and within-laboratory reproducibility ranged from 7.5% to 19.0% and 2.5% to 12.2%, respectively, at 0.5 and 5.0 μg/kg, depending on the matrices tested. The detection and quantification limits were 0.03 and 0.10 μg/kg, respectively. The reporting limit was 0.35 μg/kg, taking into account the mean of the laboratory background contamination. The global uncertainty was 22.2% at the 95% confidence interval. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Measuring Post-Partum Haemorrhage in Low-Resource Settings: The Diagnostic Validity of Weighed Blood Loss versus Quantitative Changes in Hemoglobin.

    Directory of Open Access Journals (Sweden)

    Esther Cathyln Atukunda

    Full Text Available Accurate estimation of blood loss is central to prompt diagnosis and management of post-partum hemorrhage (PPH), which remains a leading cause of maternal mortality in low-resource countries. In such settings, blood loss is often estimated visually and subjectively by attending health workers, due to inconsistent availability of laboratory infrastructure. We evaluated the diagnostic accuracy of weighed blood loss (WBL) versus changes in peri-partum hemoglobin to detect PPH. Data from this analysis were collected as part of a randomized controlled trial comparing oxytocin with misoprostol for PPH (NCT01866241). Blood samples for complete blood count were drawn on admission and again prior to hospital discharge or before blood transfusion. During delivery, women were placed on drapes and had pre-weighed sanitary towels placed around their perineum. Blood was then drained into a calibrated container and the sanitary towels were added to estimate WBL, where each gram of blood was estimated as a milliliter. Sensitivity, specificity, and negative and positive predictive values (PPVs) were calculated at various blood volume loss and time combinations, and we fit receiver-operator curves using blood loss at 1, 2, and 24 hours compared to a reference standard of haemoglobin decrease of >10%. A total of 1,140 women were enrolled in the study, of whom 258 (22.6%) developed PPH, defined as a haemoglobin drop >10%, and 262 (23.0%) had WBL ≥500 mL. WBL generally had a poor sensitivity for detection of PPH (85% in high-prevalence settings when WBL exceeds 750 mL). WBL has poor sensitivity but high specificity compared to laboratory-based methods of PPH diagnosis. These characteristics correspond to a high PPV in areas with high PPH prevalence. Although WBL is not useful for excluding PPH, this low-cost, simple and reproducible method is promising as a reasonable method to identify significant PPH in such settings where quantifiable red cell indices are unavailable.

  7. Development, validation and initial outcomes of a questionnaire to investigate the views of nurses working in a mental health setting regarding a cardiometabolic health nursing role.

    Science.gov (United States)

    Happell, Brenda; Stanton, Robert; Hoey, Wendy; Scott, David

    2014-04-01

    People with serious mental illness experience disparities in primary health care. One solution is a specialist nursing position responsible for the coordination of the primary care of people with serious mental illness. However the views of nurses regarding this proposed role are only beginning to emerge. This study reports the readability, factorability, internal consistency and responses from a questionnaire regarding the views of nurses working in a mental health setting regarding the proposed role. The questionnaire was determined to have adequate readability, and internal consistency. Nurses are positive towards the development of the role however the cost-effectiveness should be considered. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. The Matrix exponential, Dynamic Systems and Control

    DEFF Research Database (Denmark)

    Poulsen, Niels Kjølstad

    2004-01-01

    The matrix exponential can be found in various connections in analysis and control of dynamic systems. In this short note we are going to list a few examples. The matrix exponential usually pops up in connection with the sampling process, whether it is in a deterministic or a stochastic setting...
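
    As a concrete instance of the sampling connection mentioned in the note, the zero-order-hold discretization of a continuous-time state-space model x' = Ax + Bu can be read off from the exponential of a single augmented matrix. The sketch below uses SciPy's expm; the double-integrator system is an arbitrary placeholder.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def c2d_zoh(A, B, Ts):
        """Zero-order-hold discretization: Ad = exp(A*Ts) and Bd (the integral of
        exp(A*s)*B over one sampling period) are both read off from the exponential
        of one augmented matrix."""
        n, m = B.shape
        M = np.zeros((n + m, n + m))
        M[:n, :n] = A
        M[:n, n:] = B
        E = expm(M * Ts)
        return E[:n, :n], E[:n, n:]        # Ad, Bd

    # Placeholder double integrator with one input, sampled at Ts = 0.1 s.
    A = np.array([[0.0, 1.0], [0.0, 0.0]])
    B = np.array([[0.0], [1.0]])
    Ad, Bd = c2d_zoh(A, B, Ts=0.1)
    print(Ad)   # [[1.  0.1], [0.  1. ]]
    print(Bd)   # [[0.005], [0.1  ]]
    ```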

  9. The cellulose resource matrix.

    Science.gov (United States)

    Keijsers, Edwin R P; Yılmaz, Gülden; van Dam, Jan E G

    2013-03-01

    feedstock and the performance in the end-application. The cellulose resource matrix should become a practical tool for stakeholders to make choices regarding raw materials, process or market. Although there is a vast amount of scientific and economic information available on cellulose and lignocellulosic resources, the accessibility for the interested layman or entrepreneur is very difficult and the relevance of the numerous details in the larger context is limited. Translation of science to practical accessible information with modern data management and data integration tools is a challenge. Therefore, a detailed matrix structure was composed in which the different elements or entries of the matrix were identified and a tentative rough set up was made. The inventory includes current commodities and new cellulose containing and raw materials as well as exotic sources and specialties. Important chemical and physical properties of the different raw materials were identified for the use in processes and products. When available, the market data such as price and availability were recorded. Established and innovative cellulose extraction and refining processes were reviewed. The demands on the raw material for suitable processing were collected. Processing parameters known to affect the cellulose properties were listed. Current and expected emerging markets were surveyed as well as their different demands on cellulose raw materials and processes. The setting up of the cellulose matrix as a practical tool requires two steps. Firstly, the reduction of the needed data by clustering of the characteristics of raw materials, processes and markets and secondly, the building of a database that can provide the answers to the questions from stakeholders with an indicative character. This paper describes the steps taken to achieve the defined clusters of most relevant and characteristic properties. These data can be expanded where required. More detailed specification can be obtained

  10. [The validity of the SET 5-10 (language level test for children aged between five and ten years): first analyses].

    Science.gov (United States)

    Metz, D; Belhadj Kouider, E; Karpinski, N; Petermann, F

    2011-10-01

    No other child developmental domain is as frequently affected by disorders as language acquisition. The SET 5-10 (language level test for children aged between 5 and 10 years) was developed to assess specific language disorders. This study examines the ability of the SET 5-10 to differentiate between the developmental increases of speech competence, present deficits resulting from previous speech deficits, and the probability of assessing possible speech deficits of children with an immigrant background. Based on data from the norm sample (n=1,052; 51.8% female), different cohorts (age groups, children with speech impairment problems and children with an immigrant background) are compared by multivariate analysis of variance. In all subtests a steady increase of performance with increasing age could be found. The means of children with previous grammar deficits (n=46) and children with an immigrant background (n=143) are significantly lower than those of the reference group. The results presented offer first proof of a differentiated and economic diagnosis of language achievement of children 5-10 years of age. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Development and validation of an observation tool for the assessment of nursing pain management practices in intensive care unit in a standardized clinical simulation setting.

    Science.gov (United States)

    Gosselin, Emilie; Bourgault, Patricia; Lavoie, Stephan; Coleman, Robin-Marie; Méziat-Burdin, Anne

    2014-12-01

    Pain management in the intensive care unit is often inadequate. There is no tool available to assess nursing pain management practices. The aim of this study was to develop and validate a measuring tool to assess nursing pain management in the intensive care unit during standardized clinical simulation. A literature review was performed to identify relevant components demonstrating optimal pain management in adult intensive care units and to integrate them in an observation tool. This tool was submitted to an expert panel and pretested. It was then used to assess pain management practice during 26 discrete standardized clinical simulation sessions with intensive care nurses. The Nursing Observation Tool for Pain Management (NOTPaM) contains 28 statements grouped into 8 categories, which are grouped into 4 dimensions: subjective assessment, objective assessment, interventions, and reassessment. The tool's internal consistency was calculated at a Cronbach's alpha of 0.436 for the whole tool; the alpha varies from 0.328 to 0.518 for each dimension. To evaluate the inter-rater reliability, intra-class correlation coefficient was used, which was calculated at 0.751 (p intensive care nurses' pain management in a standardized clinical simulation. The NOTPaM is the first tool created for this purpose. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  12. Development and validation of a set of six adaptable prognosis prediction (SAP) models based on time-series real-world big data analysis for patients with cancer receiving chemotherapy: A multicenter case crossover study.

    Directory of Open Access Journals (Sweden)

    Yu Uneno

    Full Text Available We aimed to develop an adaptable prognosis prediction model that could be applied at any time point during the treatment course for patients with cancer receiving chemotherapy, by applying time-series real-world big data. Between April 2004 and September 2014, 4,997 patients with cancer who had received systemic chemotherapy were registered in a prospective cohort database at the Kyoto University Hospital. Of these, 2,693 patients with a death record were eligible for inclusion and divided into training (n = 1,341) and test (n = 1,352) cohorts. In total, 3,471,521 laboratory data at 115,738 time points, representing 40 laboratory items [e.g., white blood cell counts and albumin (Alb) levels] that were monitored for 1 year before the death event, were applied for constructing prognosis prediction models. All possible prediction models comprising three different items from the 40 laboratory items (40C3 = 9,880) were generated in the training cohort, and the model selection was performed in the test cohort. The fitness of the selected models was externally validated in the validation cohort from three independent settings. A prognosis prediction model utilizing Alb, lactate dehydrogenase, and neutrophils was selected based on a strong ability to predict death events within 1-6 months, and a set of six prediction models corresponding to 1, 2, 3, 4, 5, and 6 months was developed. The area under the curve (AUC) ranged from 0.852 for the 1-month model to 0.713 for the 6-month model. External validation supported the performance of these models. By applying time-series real-world big data, we successfully developed a set of six adaptable prognosis prediction models for patients with cancer receiving chemotherapy.
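
    The model-search step described above (fitting every three-item combination of laboratory variables and ranking the candidates on a held-out cohort by AUC) can be schematized as follows. This sketch uses synthetic data, scikit-learn and only 10 stand-in laboratory items; the variable names and data are illustrative, not the study's.

    ```python
    import numpy as np
    from itertools import combinations
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    labs = [f"lab{i}" for i in range(10)]          # stand-ins for the 40 laboratory items
    X = rng.normal(size=(600, len(labs)))
    y = (X[:, 0] - 0.8 * X[:, 3] + rng.normal(size=600) > 0).astype(int)  # event label
    train, test = slice(0, 300), slice(300, 600)   # stand-ins for the two cohorts

    results = []
    for trio in combinations(range(len(labs)), 3):            # every 3-item candidate model
        cols = list(trio)
        model = LogisticRegression(max_iter=1000).fit(X[train][:, cols], y[train])
        auc = roc_auc_score(y[test], model.predict_proba(X[test][:, cols])[:, 1])
        results.append((auc, [labs[i] for i in cols]))

    best_auc, best_items = max(results)                       # model selection by test AUC
    print(best_items, round(best_auc, 3))
    ```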

  13. Validation of a non-uniform meshing algorithm for the 3D-FDTD method by means of a two-wire crosstalk experimental set-up

    Directory of Open Access Journals (Sweden)

    Raúl Esteban Jiménez-Mejía

    2015-06-01

    Full Text Available This paper presents an algorithm used to automatically mesh a 3D computational domain in order to solve electromagnetic interaction scenarios by means of the Finite-Difference Time-Domain (FDTD) method. The proposed algorithm has been formulated in a general mathematical form, where convenient spacing functions can be defined for the problem space discretization, allowing the inclusion of small-sized objects in the FDTD method and the calculation of detailed variations of the electromagnetic field at specified regions of the computational domain. The results obtained by using the FDTD method with the proposed algorithm have been contrasted not only with a typical uniform mesh algorithm, but also with experimental measurements for a two-wire crosstalk set-up, leading to excellent agreement between theoretical and experimental waveforms. A discussion about the advantages of the non-uniform mesh over the uniform one is also presented.
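
    The spacing functions mentioned above are the heart of a non-uniform mesh: cells are smallest near a fine feature (such as a thin wire) and grow geometrically towards the domain edges. The one-dimensional sketch below only illustrates that grading idea and is not the authors' 3D meshing algorithm; all dimensions are arbitrary.

    ```python
    import numpy as np

    def graded_mesh(x_feature, x_max, dx_min, ratio=1.2, dx_max=5e-3):
        """1-D non-uniform mesh: smallest cell dx_min at x_feature, cell size
        multiplied by `ratio` each step (capped at dx_max) towards the edges."""
        def march(start, stop, direction):
            edges, x, dx = [start], start, dx_min
            while (stop - x) * direction > 0:
                x += direction * dx
                edges.append(min(x, stop) if direction > 0 else max(x, stop))
                dx = min(dx * ratio, dx_max)
            return edges
        left = march(x_feature, 0.0, -1)[::-1]       # from the feature down to 0
        right = march(x_feature, x_max, +1)[1:]      # from the feature up to x_max
        return np.array(left + right)

    edges = graded_mesh(x_feature=0.35, x_max=1.0, dx_min=1e-4)
    print(len(edges), np.diff(edges).min(), np.diff(edges).max())
    ```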

  14. Exploiting biospectroscopy as a novel screening tool for cervical cancer: towards a framework to validate its accuracy in a routine clinical setting.

    LENUS (Irish Health Repository)

    Purandare, Nikhil C

    2013-11-01

    Biospectroscopy is an emerging field that harnesses the platform of physical sciences with computational analysis in order to shed novel insights on biological questions. An area where this approach seems to have potential is in screening or diagnostic clinical settings, where there is an urgent need for new approaches to interrogate large numbers of samples in an objective fashion with acceptable levels of sensitivity and specificity. This review outlines the benefits of biospectroscopy in screening for precancer lesions of the cervix due to its ability to separate different grades of dysplasia. It evaluates the feasibility of introducing this technique into cervical screening programs on the basis of its ability to identify biomarkers of progression within derived spectra ('biochemical-cell fingerprints').

  15. [Validation of an observer-based rating set compared to a standardized written psychological test for the diagnosis of depression and anxiety in a university preadmission test center].

    Science.gov (United States)

    Schulz-Stübner, S; de Bruin, J; Neuser, J; Rossaint, R

    2001-06-01

    Depression and anxiety can be a major factor of perioperative stress and might contribute to patients' dissatisfaction with medical care if they remain unrecognized. There are several methods to diagnose depression and anxiety, such as standardized written psychological tests or self-report scales. Because these tests are not always suitable for routine use in a busy preadmission test center, we evaluated an observer-based rating set for the diagnosis of depression and anxiety. A total of 70 patients of a university hospital preadmission test center were tested with the HADS-D test and the observer-based rating set after approval of the institutional review board and written informed consent. Test data were compared using a logistic regression model, and demographic variables were analyzed using t-test, ANOVA and Pearson correlation. The prevalence of depression in our study population was 11.11% (14.75% in male, 9.76% in female) and the prevalence of anxiety was 7.14% (6.9% in male and 7.32% in female). The correlation between the observer-based rating items and the HADS-D diagnosis was statistically highly significant. The observer-based items "unsteady eye movements" and "general worrisome mood" proved to be especially sensitive for anxiety, and the items "sorrowful mood" and "impression of resignation" were sensitive for depression, without any influence of the experience of the anesthesiologist. A higher prevalence of depression and anxiety was found in patients with ASA class III compared to those with ASA classes I and II, while age and type of surgery had no significant influence. Based on our observations, depression and anxiety are a relevant factor of preoperative morbidity assessment. Observer-based items are a reliable tool to detect those patients who might need special assistance and therapy in the perioperative period to reduce stress associated with high preexisting levels of depression and anxiety.

  16. GB Diet matrix as informed by EMAX

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This data set was taken from CRD 08-18 at the NEFSC. Specifically, the Georges Bank diet matrix was developed for the EMAX exercise described in that center...

  17. Measurement of total antioxidant capacity of human plasma: setting and validation of the CUPRAC-BCS method on routine apparatus ADVIA 2400.

    Science.gov (United States)

    Gosmaro, Fabio; Bagnati, Marco; Berto, Silvia; Bellomo, Giorgio; Prenesti, Enrico

    2013-10-15

    Quantification of the Total Antioxidant Capacity (TAC) of human plasma is an important clinical target, since many diseases are suspected to be related to oxidative stress. The CUPRAC-BCS (BCS=Bathocuproinedisulfonic acid) method was chosen since it works on the photometric principle, with stable and inexpensive reagents and at physiological pH. The method is based on the complex equilibria between Cu(II)-BCS (reagent) and Cu(I)-BCS. The Cu(I)-BCS complex is formed through the reducing ability of the plasma redox-active substances. The photometric signal is acquired at 478 nm and calibration is performed using urate as a reference substance. Linearity, linear working range, sensitivity, precision, LoD, LoQ, selectivity and robustness were considered to validate the method. Absorbance at 478 nm was found to be linear from 0.0025 up to 2.0 mmol/L of urate reference solution. Precision was evaluated as within-day repeatability, Sr=4 µmol/L, and intermediate precision, SI(T)=15 µmol/L. LoD and LoQ were equal to 7.0 µmol/L and 21 µmol/L, respectively, while robustness was tested with attention to pH variation during PBS buffer preparation. Tests on plasma (80 samples) and on human cerebrospinal fluid (30 samples) were conducted and discussed. From the analytical point of view, the photometric method was found to be simple, rapid, widely linear and reliable for the routine analysis of a clinical laboratory. From the clinical point of view, the method response is suitable for the study of chemical plasma quantities related to redox reactivity. Copyright © 2013 Elsevier B.V. All rights reserved.
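
    The calibration and detection-limit figures quoted above follow from ordinary least-squares calibration arithmetic. The sketch below fits a line through synthetic urate standards and applies the common 3.3*s/slope and 10*s/slope conventions for LoD and LoQ; those conventions, and all numbers, are assumptions for illustration and may differ from the validation protocol actually used.

    ```python
    import numpy as np

    # Synthetic urate calibration standards (mmol/L) and absorbance at 478 nm.
    conc = np.array([0.0025, 0.01, 0.05, 0.1, 0.5, 1.0, 2.0])
    rng = np.random.default_rng(2)
    absorbance = 0.45 * conc + 0.01 + rng.normal(scale=0.004, size=conc.size)

    slope, intercept = np.polyfit(conc, absorbance, 1)     # linear calibration
    residuals = absorbance - (slope * conc + intercept)
    s_res = residuals.std(ddof=2)                          # residual standard deviation

    lod = 3.3 * s_res / slope     # assumed ICH-style convention
    loq = 10.0 * s_res / slope
    print(f"slope={slope:.3f}, LoD={lod*1000:.1f} umol/L, LoQ={loq*1000:.1f} umol/L")

    # Unknown sample: convert a measured absorbance back to concentration.
    print((0.230 - intercept) / slope, "mmol/L")
    ```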

  18. SU-D-202-04: Validation of Deformable Image Registration Algorithms for Head and Neck Adaptive Radiotherapy in Routine Clinical Setting

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, L; Pi, Y; Chen, Z; Xu, X [University of Science and Technology of China, Hefei, Anhui (China); Wang, Z [University of Science and Technology of China, Hefei, Anhui (China); The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui (China); Shi, C [Saint Vincent Medical Center, Bridgeport, CT (United States); Long, T; Luo, W; Wang, F [The First Affiliated Hospital of Anhui Medical University, Hefei, Anhui (China)

    2016-06-15

    Purpose: To evaluate the ROI contours and accumulated dose difference using different deformable image registration (DIR) algorithms for head and neck (H&N) adaptive radiotherapy. Methods: Eight H&N cancer patients were randomly selected from the affiliated hospital. During the treatment, patients were rescanned every week, with ROIs well delineated by a radiation oncologist on each weekly CT. New weekly treatment plans were also re-designed with consistent dose prescription on the rescanned CT and executed for one week on a Siemens CT-on-rails accelerator. At the end, we obtained six weekly CT scans from CT1 to CT6, including six weekly treatment plans for each patient. The primary CT1 was set as the reference CT for DIR with the remaining five weekly CTs, using the ANACONDA and MORFEUS algorithms separately in RayStation, with the external skin ROI set as the controlling ROI in both. The entire calculated weekly doses were deformed and accumulated on the corresponding reference CT1 according to the deformation vector fields (DVFs) generated by the two different DIR algorithms, respectively. Thus we obtained both the ANACONDA-based and MORFEUS-based accumulated total dose on CT1 for each patient. At the same time, we mapped the ROIs on CT1 to generate the corresponding ROIs on CT6 using the ANACONDA and MORFEUS DIR algorithms. DICE coefficients between the DIR-deformed and radiation-oncologist-delineated ROIs on CT6 were calculated. Results: For DIR accumulated dose, PTV D95 and Left-Eyeball Dmax show significant differences with 67.13 cGy and 109.29 cGy respectively (Table 1). For DIR mapped ROIs, PTV, Spinal cord and Left-Optic nerve show differences with −0.025, −0.127 and −0.124 (Table 2). Conclusion: Even two excellent DIR algorithms can give divergent results for ROI deformation and dose accumulation. As more and more TPS get DIR modules integrated, there is an urgent need to realize the potential risk of using DIR in the clinic.
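
    The DICE coefficients used above to compare DIR-deformed contours with the oncologist-delineated ones are computed from binary ROI masks as twice the overlap volume divided by the sum of the two volumes. A toy illustration on two overlapping spherical ROIs, not the clinical data:

    ```python
    import numpy as np

    def dice(mask_a, mask_b):
        """DICE coefficient between two binary ROI masks: 2*|A and B| / (|A| + |B|)."""
        a, b = mask_a.astype(bool), mask_b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    # Toy example: two overlapping "spherical" ROIs on a small voxel grid.
    z, y, x = np.mgrid[:40, :40, :40]
    roi_deformed   = (z - 20)**2 + (y - 20)**2 + (x - 20)**2 < 10**2
    roi_delineated = (z - 22)**2 + (y - 20)**2 + (x - 20)**2 < 10**2
    print(round(dice(roi_deformed, roi_delineated), 3))
    ```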

  19. Video self-modeling in children with autism: a pilot study validating prerequisite skills and extending the utilization of VSM across skill sets.

    Science.gov (United States)

    Williamson, Robert L; Casey, Laura B; Robertson, Janna Siegel; Buggey, Tom

    2013-01-01

    Given the recent interest in the use of video self-modeling (VSM) to provide instruction within iPod apps and other pieces of handheld mobile assistive technologies, investigating appropriate prerequisite skills for effective use of this intervention is particularly timely and relevant. To provide additional information regarding the efficacy of VSM for students with autism and to provide insights into any possible prerequisite skills students may require for such efficacy, the authors investigated the use of VSM in increasing the instances of effective initiations of interpersonal greetings for three students with autism that exhibited different pre-intervention abilities. Results showed that only one of the three participants showed an increase in self-initiated greetings following the viewing of videos edited to show each participant self-modeling a greeting when entering his or her classroom. Due to the differences in initial skill sets between the three children, this finding supports anecdotally observed student prerequisite abilities mentioned in previous studies that may be required to effectively utilize video based teaching methods.

  20. Matrix completion by deep matrix factorization.

    Science.gov (United States)

    Fan, Jicong; Cheng, Jieyu

    2017-11-03

    Conventional methods of matrix completion are linear methods that are not effective in handling data of nonlinear structures. Recently a few researchers attempted to incorporate nonlinear techniques into matrix completion, but there still exist considerable limitations. In this paper, a novel method called deep matrix factorization (DMF) is proposed for nonlinear matrix completion. Different from conventional matrix completion methods that are based on linear latent variable models, DMF is on the basis of a nonlinear latent variable model. DMF is formulated as a deep-structure neural network, in which the inputs are the low-dimensional unknown latent variables and the outputs are the partially observed variables. In DMF, the inputs and the parameters of the multilayer neural network are simultaneously optimized to minimize the reconstruction errors for the observed entries. Then the missing entries can be readily recovered by propagating the latent variables to the output layer. DMF is compared with state-of-the-art methods of linear and nonlinear matrix completion in the tasks of toy matrix completion, image inpainting and collaborative filtering. The experimental results verify that DMF is able to provide higher matrix completion accuracy than existing methods do and that DMF is applicable to large matrices. Copyright © 2017 Elsevier Ltd. All rights reserved.
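
    A minimal PyTorch sketch of the idea described above: free low-dimensional latent variables feed a small multilayer network, the loss is taken over the observed entries only, and the latents and network weights are optimized jointly, after which missing entries are read from the network output. The architecture sizes, optimizer and synthetic data are arbitrary choices, not the paper's.

    ```python
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    n_rows, n_cols, latent_dim = 200, 30, 5

    # Synthetic nonlinear low-dimensional matrix with 60% of entries observed.
    Z_true = torch.randn(n_rows, latent_dim)
    X_full = torch.tanh(Z_true @ torch.randn(latent_dim, n_cols))
    mask = torch.rand(n_rows, n_cols) < 0.6

    # DMF: per-row latent variables and the decoder network are optimized together.
    Z = nn.Parameter(torch.randn(n_rows, latent_dim) * 0.1)
    decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_cols))
    opt = torch.optim.Adam([Z] + list(decoder.parameters()), lr=1e-2)

    for step in range(2000):
        opt.zero_grad()
        X_hat = decoder(Z)
        loss = ((X_hat - X_full)[mask] ** 2).mean()   # reconstruct observed entries only
        loss.backward()
        opt.step()

    with torch.no_grad():
        X_hat = decoder(Z)
        rmse_missing = ((X_hat - X_full)[~mask] ** 2).mean().sqrt()
    print(float(rmse_missing))   # error on the held-out (missing) entries
    ```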

  1. TOTAL HARMONIC DISTORTION ANALYSIS OF MATRIX CONVERTER

    OpenAIRE

    K. Kandan; Senthilkumar, K.; Dhivya, K.

    2016-01-01

    This paper deals with the validation and design analysis of a matrix converter for variable frequency using mathematical equations. The analysis was done using the Venturini modulation algorithm. A PI controller is used for the matrix converter to reduce the Total Harmonic Distortion (THD) in the output current. A comparative study is done for open-loop and closed-loop PI compensation in MATLAB-Simulink. Furthermore, the output waveforms are produced with significant reduction in the Total Harmonic Dis...
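
    The quantity being compared in such a study is the standard total harmonic distortion, the RMS sum of the harmonics divided by the fundamental amplitude. A short sketch estimating THD from the FFT of a synthetic output-current waveform (the harmonic levels are invented):

    ```python
    import numpy as np

    def thd(signal, fs, f0):
        """THD = sqrt(sum of squared harmonic amplitudes) / fundamental amplitude,
        estimated from the FFT of a whole number of periods."""
        spectrum = np.abs(np.fft.rfft(signal)) * 2 / len(signal)
        freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
        bin0 = int(round(f0 / (freqs[1] - freqs[0])))
        fundamental = spectrum[bin0]
        harmonics = spectrum[2 * bin0::bin0]          # 2nd, 3rd, ... harmonics
        return np.sqrt((harmonics ** 2).sum()) / fundamental

    fs, f0 = 50_000, 50
    t = np.arange(0, 0.2, 1 / fs)                     # exactly 10 fundamental periods
    current = (np.sin(2 * np.pi * f0 * t)
               + 0.05 * np.sin(2 * np.pi * 5 * f0 * t)
               + 0.03 * np.sin(2 * np.pi * 7 * f0 * t))
    print(f"THD = {100 * thd(current, fs, f0):.2f} %")   # about 5.8 %
    ```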

  2. Diagnosis using nail matrix.

    Science.gov (United States)

    Richert, Bertrand; Caucanas, Marie; André, Josette

    2015-04-01

    Diagnosing nail matrix diseases requires knowledge of the nail matrix function and anatomy. This allows recognition of the clinical manifestations and assessment of potential surgical risk. Nail signs depend on the location within the matrix (proximal or distal) and the intensity, duration, and extent of the insult. Proximal matrix involvement includes nail surface irregularities (longitudinal lines, transverse lines, roughness of the nail surface, pitting, and superficial brittleness), whereas distal matrix insult induces longitudinal or transverse chromonychia. Clinical signs are described and their main causes are listed to enable readers to diagnose matrix disease from the nail's clinical features. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Novel image analysis methods for quantification of in situ 3-D tendon cell and matrix strain.

    Science.gov (United States)

    Fung, Ashley K; Paredes, J J; Andarawis-Puri, Nelly

    2018-01-23

    Macroscopic tendon loads modulate the cellular microenvironment leading to biological outcomes such as degeneration or repair. Previous studies have shown that damage accumulation and the phases of tendon healing are marked by significant changes in the extracellular matrix, but it remains unknown how mechanical forces of the extracellular matrix are translated to mechanotransduction pathways that ultimately drive the biological response. Our overarching hypothesis is that the unique relationship between extracellular matrix strain and cell deformation will dictate biological outcomes, prompting the need for quantitative methods to characterize the local strain environment. While 2-D methods have successfully calculated matrix strain and cell deformation, 3-D methods are necessary to capture the increased complexity that can arise due to high levels of anisotropy and out-of-plane motion, particularly in the disorganized, highly cellular, injured state. In this study, we validated the use of digital volume correlation methods to quantify 3-D matrix strain using images of naïve tendon cells, the collagen fiber matrix, and injured tendon cells. Additionally, naïve tendon cell images were used to develop novel methods for 3-D cell deformation and 3-D cell-matrix strain, which is defined as a quantitative measure of the relationship between matrix strain and cell deformation. The results support that these methods can be used to detect strains with high accuracy and can be further extended to an in vivo setting for observing temporal changes in cell and matrix mechanics during degeneration and healing. Copyright © 2017. Published by Elsevier Ltd.

  4. Validation of a modified algorithm for the identification of yeast isolates using matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI-TOF MS)

    NARCIS (Netherlands)

    van Herendael, B.H.; Bruynseels, P.; Bensaid, M.; Boekhout, T.; de Baere, T.; Surmont, I.; Mertens, A.H.

    2011-01-01

    Optimising antifungal treatment requires the fast and species-specific identification of yeast isolates. We evaluated a modified protocol for the rapid identification of clinical yeast isolates using matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) technology. First, we

  5. Checklists for external validity

    DEFF Research Database (Denmark)

    Dyrvig, Anne-Kirstine; Kidholm, Kristian; Gerke, Oke

    2014-01-01

    RATIONALE, AIMS AND OBJECTIVES: The quality of the current literature on external validity varies considerably. An improved checklist with validated items on external validity would aid decision-makers in judging similarities among circumstances when transferring evidence from a study setting to ...

  6. Setting Asset Performance Targets

    NARCIS (Netherlands)

    Green, D.; Hodkiewicz, M.; Masschelein, S.; Schoenmaker, R.; Muruvan, S.

    2015-01-01

    Setting targets is a common way for organisations to establish performance expectations. However, the validity of targets is challenged when performance is influenced by factors beyond the control of the manager. This project examines the issue of target setting for a single asset performance measure

  7. The Matrix Cookbook

    DEFF Research Database (Denmark)

    Petersen, Kaare Brandt; Pedersen, Michael Syskind

    Matrix identities, relations and approximations. A desktop reference for quick overview of mathematics of matrices.

  8. Matrix differentiation formulas

    Science.gov (United States)

    Usikov, D. A.; Tkhabisimov, D. K.

    1983-01-01

    A compact differentiation technique (without using indexes) is developed for scalar functions that depend on complex matrix arguments which are combined by operations of complex conjugation, transposition, addition, multiplication, matrix inversion and taking the direct product. The differentiation apparatus is developed in order to simplify the solution of extremum problems of scalar functions of matrix arguments.
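    Typical identities produced by such an index-free calculus include the following standard examples (quoted from common matrix calculus references, not reproduced from this paper):

    \[
    \frac{\partial}{\partial X}\operatorname{tr}(AX) = A^{\mathsf T},\qquad
    \frac{\partial}{\partial X}\log\det X = \left(X^{-1}\right)^{\mathsf T},\qquad
    \frac{\partial}{\partial X}\operatorname{tr}\!\left(X^{-1}A\right) = -\left(X^{-1}AX^{-1}\right)^{\mathsf T}.
    \]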

  9. Matrix with Prescribed Eigenvectors

    Science.gov (United States)

    Ahmad, Faiz

    2011-01-01

    It is a routine matter for undergraduates to find eigenvalues and eigenvectors of a given matrix. But the converse problem of finding a matrix with prescribed eigenvalues and eigenvectors is rarely discussed in elementary texts on linear algebra. This problem is related to the "spectral" decomposition of a matrix and has important technical…

  10. The Dirac equation in the algebraic approximation. IX. Matrix Dirac-Hartree-Fock calculations for the HeH and BeH ground states using distributed Gaussian basis sets

    NARCIS (Netherlands)

    Quiney, HM; Glushkov, VN; Wilson, S

    2004-01-01

    Using large component basis sets of distributed s-type Gaussian functions with positions and exponents optimized so as to support Hartree-Fock total energies with an accuracy approaching the sub-muhartree level, Dirac-Hartree-Fock-Coulomb calculations are reported for the ground states of the

  11. Health system context and implementation of evidence-based practices-development and validation of the Context Assessment for Community Health (COACH) tool for low- and middle-income settings.

    Science.gov (United States)

    Bergström, Anna; Skeen, Sarah; Duc, Duong M; Blandon, Elmer Zelaya; Estabrooks, Carole; Gustavsson, Petter; Hoa, Dinh Thi Phuong; Källestål, Carina; Målqvist, Mats; Nga, Nguyen Thu; Persson, Lars-Åke; Pervin, Jesmin; Peterson, Stefan; Rahman, Anisur; Selling, Katarina; Squires, Janet E; Tomlinson, Mark; Waiswa, Peter; Wallin, Lars

    2015-08-15

    The gap between what is known and what is practiced results in health service users not benefitting from advances in healthcare, and in unnecessary costs. A supportive context is considered a key element for successful implementation of evidence-based practices (EBP). There were no tools available for the systematic mapping of aspects of organizational context influencing the implementation of EBPs in low- and middle-income countries (LMICs). Thus, this project aimed to develop and psychometrically validate a tool for this purpose. The development of the Context Assessment for Community Health (COACH) tool was premised on the context dimension in the Promoting Action on Research Implementation in Health Services framework, and is a derivative product of the Alberta Context Tool. Its development was undertaken in Bangladesh, Vietnam, Uganda, South Africa and Nicaragua in six phases: (1) defining dimensions and draft tool development, (2) content validity amongst in-country expert panels, (3) content validity amongst international experts, (4) response process validity, (5) translation and (6) evaluation of psychometric properties amongst 690 health workers in the five countries. The tool was validated for use amongst physicians, nurse/midwives and community health workers. The six phases of development resulted in a good fit between the theoretical dimensions of the COACH tool and its psychometric properties. The tool has 49 items measuring eight aspects of context: Resources, Community engagement, Commitment to work, Informal payment, Leadership, Work culture, Monitoring services for action and Sources of knowledge. Aspects of organizational context that were identified as influencing the implementation of EBPs in high-income settings were also found to be relevant in LMICs. However, there were additional aspects of context of relevance in LMICs specifically Resources, Community engagement, Commitment to work and Informal payment. Use of the COACH tool will allow

  12. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    of proper validation objectives implies that there is one valid paradigm only: test set validation. (iii) Contrary to much contemporary chemometric practices (and validation myths), cross-validation is shown to be unjustified in the form of monolithic application of a one-for-all procedure (segmented cross...... to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction-, classification-, time series forecasting-, modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allow insight...

  13. Finding Nonoverlapping Substructures of a Sparse Matrix

    Energy Technology Data Exchange (ETDEWEB)

    Pinar, Ali; Vassilevska, Virginia

    2005-08-11

    Many applications of scientific computing rely on computations on sparse matrices. The design of efficient implementations of sparse matrix kernels is crucial for the overall efficiency of these applications. Due to the high compute-to-memory ratio and irregular memory access patterns, the performance of sparse matrix kernels is often far away from the peak performance on a modern processor. Alternative data structures have been proposed, which split the original matrix A into A_d and A_s, so that A_d contains all dense blocks of a specified size in the matrix, and A_s contains the remaining entries. This enables the use of dense matrix kernels on the entries of A_d, producing better memory performance. In this work, we study the problem of finding a maximum number of nonoverlapping dense blocks in a sparse matrix, which has not previously been studied in the sparse matrix community. We show that the maximum nonoverlapping dense blocks problem is NP-complete by using a reduction from the maximum independent set problem on cubic planar graphs. We also propose a 2/3-approximation algorithm that runs in linear time in the number of nonzeros in the matrix. This extended abstract focuses on our results for 2x2 dense blocks. However we show that our results can be generalized to arbitrary sized dense blocks, and many other oriented substructures, which can be exploited to improve the memory performance of sparse matrix operations.
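    The flavour of the task can be seen in a simple greedy scan for fully dense, nonoverlapping 2x2 blocks (an illustrative heuristic only, not the paper's 2/3-approximation algorithm):

    ```python
    # Greedy sketch: collect nonoverlapping fully dense 2x2 blocks (A_d) from a
    # sparse matrix and leave the remaining nonzeros in A_s.
    from scipy.sparse import random as sprand

    A = sprand(8, 8, density=0.4, format="csr", random_state=0)
    nz = set(zip(*A.nonzero()))          # coordinates of nonzero entries
    used = set()
    blocks = []
    for (i, j) in sorted(nz):
        cells = {(i, j), (i + 1, j), (i, j + 1), (i + 1, j + 1)}
        if cells <= nz and not (cells & used):   # block is dense and does not overlap a chosen one
            blocks.append((i, j))
            used |= cells
    A_s = nz - used                      # nonzeros left outside the dense blocks
    print(f"{len(blocks)} dense 2x2 blocks, {len(A_s)} remaining nonzeros")
    ```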

  14. Nanocrystal doped matrixes

    Science.gov (United States)

    Parce, J. Wallace; Bernatis, Paul; Dubrow, Robert; Freeman, William P.; Gamoras, Joel; Kan, Shihai; Meisel, Andreas; Qian, Baixin; Whiteford, Jeffery A.; Ziebarth, Jonathan

    2010-01-12

    Matrixes doped with semiconductor nanocrystals are provided. In certain embodiments, the semiconductor nanocrystals have a size and composition such that they absorb or emit light at particular wavelengths. The nanocrystals can comprise ligands that allow for mixing with various matrix materials, including polymers, such that a minimal portion of light is scattered by the matrixes. The matrixes of the present invention can also be utilized in refractive index matching applications. In other embodiments, semiconductor nanocrystals are embedded within matrixes to form a nanocrystal density gradient, thereby creating an effective refractive index gradient. The matrixes of the present invention can also be used as filters and antireflective coatings on optical devices and as down-converting layers. Processes for producing matrixes comprising semiconductor nanocrystals are also provided. Nanostructures having high quantum efficiency, small size, and/or a narrow size distribution are also described, as are methods of producing indium phosphide nanostructures and core-shell nanostructures with Group II-VI shells.

  15. The SMOS Validation Campaign 2010 in the Upper Danube Catchment: A Data Set for Studies of Soil Moisture, Brightness Temperature, and Their Spatial Variability Over a Heterogeneous Land Surface

    DEFF Research Database (Denmark)

    T. dall' Amico, Johanna; Schlenz, Florian; Loew, Alexander

    2013-01-01

    . This novel technique requires careful calibration, validation, and an in-depth understanding of the acquired data and the underlying processes. In this light, a measurement campaign was undertaken recently in the river catchment of the upper Danube in southern Germany. In May and June 2010, airborne thermal infrared and L-band passive microwave data were collected together with spatially distributed in situ measurements. Two airborne radiometers, EMIRAD and HUT-2D, were used during the campaigns providing two complementary sets of measurements at incidence angles from 0° to 40° and with ground...... as of meteorological parameters such as air temperature and humidity, precipitation, wind speed, and radiation. All data have undergone thorough postprocessing and quality checking. Their values and trends fit well among each other and with the theoretically expected behavior. The aim of this paper is to present...

  16. Defining the content of individual physiotherapy and occupational therapy sessions for stroke patients in an inpatient rehabilitation setting. Development, validation and inter-rater reliability of a scoring list.

    Science.gov (United States)

    De Wit, L; Kamsteegt, H; Yadav, B; Verheyden, G; Feys, H; De Weerdt, W

    2007-05-01

    To develop a valid and reliable scoring list to define the content of individual physiotherapy and occupational therapy sessions for stroke patients in inpatient rehabilitation. A list was developed based on previous lists, neurological textbooks and recorded therapy sessions. Content validity was verified and inter-rater reliability evaluated on videos of treatment sessions. In each of four rehabilitation centres, a researcher recorded and scored five physiotherapy and five occupational therapy sessions. These 40 treatment sessions were also scored by the first author. The scores of the researchers and first author were statistically compared. Settings and subjects: Forty stroke patients in four European rehabilitation centres. The scoring list consists of 49 subcategories, divided into 12 categories: mobilization; selective movements; lying (balance); sitting (balance); standing (balance); sensory and visual perceptual training and cognition; transfers; ambulatory activities; personal activities of daily living; domestic activities of daily living; leisure- and work-related activities; and miscellaneous. Comparing the frequency of occurrence of the categories resulted in intraclass correlation coefficients, indicating high reliability for eight categories, good for one, and fair for two. One category was not observed. Spearman rank correlation coefficients were high to very high for 24 subcategories and moderate for four. Twenty-one subcategories contained too few observations to enable calculation of Spearman rank correlation coefficients. Average point-to-point percentage of agreement in time of the treatment sessions equalled 76.6 ± 16.2%. The list is a valid and reliable tool for describing the content of physiotherapy and occupational therapy for stroke patients.

  17. Matrix method for acoustic levitation simulation.

    Science.gov (United States)

    Andrade, Marco A B; Perez, Nicolas; Buiochi, Flavio; Adamowski, Julio C

    2011-08-01

    A matrix method is presented for simulating acoustic levitators. A typical acoustic levitator consists of an ultrasonic transducer and a reflector. The matrix method is used to determine the potential for acoustic radiation force that acts on a small sphere in the standing wave field produced by the levitator. The method is based on the Rayleigh integral and it takes into account the multiple reflections that occur between the transducer and the reflector. The potential for acoustic radiation force obtained by the matrix method is validated by comparing the matrix method results with those obtained by the finite element method when using an axisymmetric model of a single-axis acoustic levitator. After validation, the method is applied in the simulation of a noncontact manipulation system consisting of two 37.9-kHz Langevin-type transducers and a plane reflector. The manipulation system allows control of the horizontal position of a small levitated sphere from -6 mm to 6 mm, which is done by changing the phase difference between the two transducers. The horizontal position of the sphere predicted by the matrix method agrees with the horizontal positions measured experimentally with a charge-coupled device camera. The main advantage of the matrix method is that it allows simulation of non-symmetric acoustic levitators without requiring much computational effort.

  18. Universal portfolios generated by Vandermonde generating matrix

    Science.gov (United States)

    Tan, Choon Peng; Yong, Say Loong

    2017-04-01

    A universal portfolio generated by the one-parameter symmetric positive definite Vandermonde matrix is studied. It is obtained by maximizing the scaled growth rate of the estimated daily wealth return and minimizing the Mahalanobis squared divergence of two portfolio vectors associated with the Vandermonde matrix. The parameter of the Vandermonde matrix is chosen so that the matrix is positive definite. The companion matrices of the three and five-dimensional generating matrices are evaluated to determine the portfolios. Three and five stock-data sets are selected from the local stock exchange in Malaysia and the empirical performance of the portfolios is presented. There is empirical evidence that the use of an appropriate generating Vandermonde matrix may increase the wealth of investors.

  19. Initial validation of the Argentinean Spanish version of the PedsQL™ 4.0 Generic Core Scales in children and adolescents with chronic diseases: acceptability and comprehensibility in low-income settings

    Directory of Open Access Journals (Sweden)

    Bauer Gabriela

    2008-08-01

    Background: To validate the Argentinean Spanish version of the PedsQL™ 4.0 Generic Core Scales in Argentinean children and adolescents with chronic conditions and to assess the impact of socio-demographic characteristics on the instrument's comprehensibility and acceptability. Reliability, known-groups validity, and convergent validity were tested. Methods: Consecutive sample of 287 children with chronic conditions and 105 healthy children, ages 2–18, and their parents. Chronically ill children were (1) attending outpatient clinics and (2) had one of the following diagnoses: stem cell transplant, chronic obstructive pulmonary disease, HIV/AIDS, cancer, end stage renal disease, or complex congenital cardiopathy. Patients and adult proxies completed the PedsQL™ 4.0 and an overall health status assessment. Physicians were asked to rate the degree of health status impairment. Results: The PedsQL™ 4.0 was feasible (only 9 children, all 5 to 7 year-olds, could not complete the instrument), easy to administer, completed without, or with minimal, help by most children and parents, and required a brief administration time (average 5–6 minutes). People living below the poverty line and/or with low literacy needed more help to complete the instrument. Cronbach's alpha internal consistency values for the total and subscale scores exceeded 0.70 for self-reports of children over 8 years old and parent-reports of children over 5 years of age. Reliability of proxy-reports of 2–4 year-olds was low but improved when school items were excluded. Internal consistency for 5–7 year-olds was low (α range = 0.28–0.76). Construct validity was good. Child self-report and parent proxy-report PedsQL™ 4.0 scores were moderately but significantly correlated (ρ = 0.39, p …). Conclusion: Results suggest that the Argentinean Spanish PedsQL™ 4.0 is suitable for research purposes in the public health setting for children over 8 years old and parents of children over 5 years old

  20. Development and evaluation of a mixed gender, multi-talker matrix sentence test in Australian English.

    Science.gov (United States)

    Kelly, Heather; Lin, Gaven; Sankaran, Narayan; Xia, Jing; Kalluri, Sridhar; Carlile, Simon

    2017-02-01

    To develop, in Australian English, the first mixed-gender, multi-talker matrix sentence test. Speech material consisted of a 50-word base matrix whose elements can be combined to form sentences of identical syntax but unpredictable content. Ten voices (five female and five male) were recorded for editing and preliminary level equalization. Elements were presented as single-talker sentences-in-noise during two perceptual tests: an optimization phase that provided the basis for further level correction, and an evaluation phase that perceptually validated those changes. Ten listeners participated in the optimization phase; these and an additional 32 naïve listeners completed the evaluation test. All were fluent in English and all but one had lived in Australia for >2 years. Optimization reduced the standard deviation (SD) and speech reception threshold (SRT) range across all speech material (grand mean SRT = -10.6 dB signal-to-noise ratio, median = -10.8, SD = 1.4, range = 13.7, slope = 19.3%/dB), yielding data consistent with cross-validated matrix tests in other languages. Intelligibility differences between experienced and naïve listeners were minimal. The Australian matrix corpus provides a robust set of test materials suitable for both clinical assessment and research into the dynamics of active listening in multi-talker environments.

  1. Cell-matrix adhesion.

    Science.gov (United States)

    Berrier, Allison L; Yamada, Kenneth M

    2007-12-01

    The complex interactions of cells with extracellular matrix (ECM) play crucial roles in mediating and regulating many processes, including cell adhesion, migration, and signaling during morphogenesis, tissue homeostasis, wound healing, and tumorigenesis. Many of these interactions involve transmembrane integrin receptors. Integrins cluster in specific cell-matrix adhesions to provide dynamic links between extracellular and intracellular environments by bi-directional signaling and by organizing the ECM and intracellular cytoskeletal and signaling molecules. This mini review discusses these interconnections, including the roles of matrix properties such as composition, three-dimensionality, and porosity, the bi-directional functions of cellular contractility and matrix rigidity, and cell signaling. The review concludes by speculating on the application of this knowledge of cell-matrix interactions in the formation of cell adhesions, assembly of matrix, migration, and tumorigenesis to potential future therapeutic approaches. 2007 Wiley-Liss, Inc.

  2. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  3. Index matrices towards an augmented matrix calculus

    CERN Document Server

    Atanassov, Krassimir T

    2014-01-01

    This book presents the concept of an index matrix and its related augmented matrix calculus in a comprehensive form. It mostly illustrates the exposition with examples related to generalized nets and intuitionistic fuzzy sets, which are examples of an extremely wide array of possible application areas. The book contains the author's basic results on index matrices and some of the field's open problems, with the aim of stimulating more researchers to start working in this area.

  4. Quasiclassical Random Matrix Theory

    OpenAIRE

    Prange, R. E.

    1996-01-01

    We directly combine ideas of the quasiclassical approximation with random matrix theory and apply them to the study of the spectrum, in particular to the two-level correlator. Bogomolny's transfer operator T, quasiclassically an NxN unitary matrix, is considered to be a random matrix. Rather than rejecting all knowledge of the system, except for its symmetry, [as with Dyson's circular unitary ensemble], we choose an ensemble which incorporates the knowledge of the shortest periodic orbits, th...

  5. Impacts of Sample Design for Validation Data on the Accuracy of Feedforward Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Giles M. Foody

    2017-08-01

    Validation data are often used to evaluate the performance of a trained neural network and used in the selection of a network deemed optimal for the task at hand. Optimality is commonly assessed with a measure, such as overall classification accuracy. The latter is often calculated directly from a confusion matrix showing the counts of cases in the validation set with particular labelling properties. The sample design used to form the validation set can, however, influence the estimated magnitude of the accuracy. Commonly, the validation set is formed with a stratified sample to give balanced classes, but also via random sampling, which reflects class abundance. It is suggested that if the ultimate aim is to accurately classify a dataset in which the classes do vary in abundance, a validation set formed via random, rather than stratified, sampling is preferred. This is illustrated with the classification of simulated and remotely-sensed datasets. With both datasets, statistically significant differences in the accuracy with which the data could be classified arose from the use of validation sets formed via random and stratified sampling (z = 2.7 and 1.9 for the simulated and real datasets, respectively; both p < 0.05). The accuracy of the classifications that used a stratified sample in validation was smaller, a result of cases of an abundant class being commissioned into a rarer class. Simple means to address the issue are suggested.
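    A toy numerical illustration of the effect (assumed class abundances and per-class accuracies, unrelated to the paper's datasets):

    ```python
    # Overall accuracy estimated from a validation sample depends on how the
    # sample is drawn when class abundances are unequal.
    import numpy as np

    rng = np.random.default_rng(0)
    y_true = rng.choice([0, 1], size=20000, p=[0.9, 0.1])   # class 0 abundant, class 1 rare
    acc_per_class = {0: 0.95, 1: 0.60}                       # classifier weak on the rare class
    y_pred = np.array([y if rng.random() < acc_per_class[y] else 1 - y for y in y_true])

    def accuracy(idx):
        return (y_true[idx] == y_pred[idx]).mean()

    # Random (proportional) sample reflects class abundance; stratified sample balances classes.
    random_idx = rng.choice(len(y_true), size=1000, replace=False)
    strat_idx = np.concatenate([rng.choice(np.where(y_true == c)[0], size=500, replace=False)
                                for c in (0, 1)])
    print("random sample accuracy:    ", round(accuracy(random_idx), 3))
    print("stratified sample accuracy:", round(accuracy(strat_idx), 3))
    ```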

  6. POLYMAT-C: a comprehensive SPSS program for computing the polychoric correlation matrix.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2015-09-01

    We provide a free noncommercial SPSS program that implements procedures for (a) obtaining the polychoric correlation matrix between a set of ordered categorical measures, so that it can be used as input for the SPSS factor analysis (FA) program; (b) testing the null hypothesis of zero population correlation for each element of the matrix by using appropriate simulation procedures; (c) obtaining valid and accurate confidence intervals via bootstrap resampling for those correlations found to be significant; and (d) performing, if necessary, a smoothing procedure that makes the matrix amenable to any FA estimation procedure. For the main purpose (a), the program uses a robust unified procedure that allows four different types of estimates to be obtained at the user's choice. Overall, we hope the program will be a very useful tool for the applied researcher, not only because it provides an appropriate input matrix for FA, but also because it allows the researcher to carefully check the appropriateness of the matrix for this purpose. The SPSS syntax, a short manual, and data files related to this article are available as Supplemental materials that are available for download with this article.
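    One common way such a smoothing step is implemented (an assumed approach for illustration; the program's actual procedure may differ) is to clip negative eigenvalues and rescale back to a unit diagonal:

    ```python
    # Smooth an indefinite correlation matrix so that it becomes positive
    # semidefinite and thus usable by factor-analysis routines.
    import numpy as np

    def smooth_correlation(R, eps=1e-6):
        vals, vecs = np.linalg.eigh((R + R.T) / 2)
        vals = np.clip(vals, eps, None)              # remove negative eigenvalues
        S = vecs @ np.diag(vals) @ vecs.T
        d = np.sqrt(np.diag(S))
        return S / np.outer(d, d)                    # restore ones on the diagonal

    R = np.array([[1.0, 0.9, -0.4],
                  [0.9, 1.0,  0.9],
                  [-0.4, 0.9, 1.0]])                 # indefinite "correlation" matrix
    print(np.linalg.eigvalsh(smooth_correlation(R)))  # all eigenvalues now >= 0
    ```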

  7. Non-negative Matrix Factorization for Binary Data

    DEFF Research Database (Denmark)

    Larsen, Jacob Søgaard; Clemmensen, Line Katrine Harder

    We propose the Logistic Non-negative Matrix Factorization for decomposition of binary data. Binary data are frequently generated in e.g. text analysis, sensory data, market basket data etc. A common method for analysing non-negative data is the Non-negative Matrix Factorization, though this is in theory not appropriate for binary data, and thus we propose a novel Non-negative Matrix Factorization based on the logistic link function. Furthermore, we generalize the method to handle missing data. The formulation of the method is compared to a previously proposed method (Tome et al., 2015). We compare the performance of the Logistic Non-negative Matrix Factorization to Least Squares Non-negative Matrix Factorization and Kullback-Leibler (KL) Non-negative Matrix Factorization on sets of binary data: a synthetic dataset, a set of student comments on their professors collected in a binary term-document matrix...
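    A rough sketch of the kind of model involved (assumed formulation for illustration, not the authors' exact algorithm): model P(X_ij = 1) = sigmoid((WH)_ij) with nonnegative factors, fitted by projected gradient ascent on the Bernoulli log-likelihood while skipping missing entries:

    ```python
    # Logistic NMF sketch for binary data with a missing-entry mask.
    import numpy as np

    rng = np.random.default_rng(0)
    X = (rng.random((40, 30)) < 0.3).astype(float)     # binary data matrix
    M = rng.random(X.shape) < 0.9                      # True where the entry is observed
    k, lr = 4, 0.05                                    # rank and step size (assumed)
    W, H = rng.random((40, k)), rng.random((k, 30))

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(500):
        E = M * (X - sigmoid(W @ H))                   # gradient factor of the Bernoulli log-likelihood
        W = np.maximum(0.0, W + lr * E @ H.T)          # projected gradient steps keep factors nonnegative
        H = np.maximum(0.0, H + lr * W.T @ E)
    P_hat = sigmoid(W @ H)                             # probability estimates for all entries, including missing ones
    ```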

  8. Patience of matrix games

    DEFF Research Database (Denmark)

    Hansen, Kristoffer Arnsfelt; Ibsen-Jensen, Rasmus; Podolskii, Vladimir V.

    2013-01-01

    For matrix games we study how small nonzero probability must be used in optimal strategies. We show that for image win–lose–draw games (i.e. image matrix games) nonzero probabilities smaller than image are never needed. We also construct an explicit image win–lose game such that the unique optimal...

  9. A Normalized Transfer Matrix Method for the Free Vibration of Stepped Beams: Comparison with Experimental and FE(3D) Methods

    Directory of Open Access Journals (Sweden)

    Tamer Ahmed El-Sayed

    2017-01-01

    The exact solution for a multistepped Timoshenko beam is derived using a set of fundamental solutions. This set of solutions is derived to normalize the solution at the origin of the coordinates. The start, end, and intermediate boundary conditions involve concentrated masses and linear and rotational elastic supports. The beam start, end, and intermediate equations are assembled using the present normalized transfer matrix (NTM). The advantage of this method is that it is quicker than the standard method because the size of the complete system coefficient matrix is 4 × 4. In addition, during the assembly of this matrix, there are no inverse matrix steps required. The validity of this method is tested by comparing the results of the current method with the literature. Then the validity of the exact stepped analysis is checked using experimental and FE(3D) methods. The experimental results for stepped beams with a single step and two steps, for sixteen different test samples, are in excellent agreement with those of the three-dimensional finite element method FE(3D). The comparison between the NTM method and the finite element method results shows that the modal percentage deviation is increased when a beam step location coincides with a peak point in the mode shape. Meanwhile, the deviation decreases when a beam step location coincides with a straight portion in the mode shape.

  10. Fuzzy risk matrix.

    Science.gov (United States)

    Markowski, Adam S; Mannan, M Sam

    2008-11-15

    A risk matrix is a mechanism to characterize and rank process risks that are typically identified through one or more multifunctional reviews (e.g., process hazard analysis, audits, or incident investigation). This paper describes a procedure for developing a fuzzy risk matrix that may be used for emerging fuzzy logic applications in different safety analyses (e.g., LOPA). The fuzzification of the frequency and severity of the consequences of the incident scenario, which are the basic inputs for the fuzzy risk matrix, is described. Subsequently, using different risk matrix designs, fuzzy rules are established, enabling the development of fuzzy risk matrices. Three types of fuzzy risk matrix have been developed (low-cost, standard, and high-cost), and using a distillation column case study, the effect of the design on the final defuzzified risk index is demonstrated.
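    A minimal sketch of the mechanics (assumed membership functions and rule base, not the paper's low-cost/standard/high-cost designs): fuzzify frequency and severity with triangular sets, fire min-operator rules from a risk-matrix table, and defuzzify to a risk index:

    ```python
    # Tiny fuzzy risk matrix: triangular fuzzification, max-min rule firing,
    # weighted-average defuzzification.
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return max(0.0, min((x - a) / (b - a) if b > a else 1.0,
                            (c - x) / (c - b) if c > b else 1.0))

    levels = {"low": (0, 0, 5), "med": (2, 5, 8), "high": (5, 10, 10)}   # assumed 0-10 scales
    rule_table = {("low", "low"): 1, ("low", "med"): 2, ("low", "high"): 3,
                  ("med", "low"): 2, ("med", "med"): 3, ("med", "high"): 4,
                  ("high", "low"): 3, ("high", "med"): 4, ("high", "high"): 5}

    def fuzzy_risk(frequency, severity):
        mu_f = {k: tri(frequency, *v) for k, v in levels.items()}
        mu_s = {k: tri(severity, *v) for k, v in levels.items()}
        num = den = 0.0
        for (f, s), risk in rule_table.items():
            w = min(mu_f[f], mu_s[s])          # rule firing strength (min operator)
            num, den = num + w * risk, den + w
        return num / den                       # defuzzified risk index

    print(round(fuzzy_risk(frequency=6.0, severity=7.5), 2))
    ```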

  11. Higher Spin Matrix Models

    Directory of Open Access Journals (Sweden)

    Mauricio Valenzuela

    2017-10-01

    We propose a hybrid class of theories for higher spin gravity and matrix models, i.e., which handle simultaneously higher spin gravity fields and matrix models. The construction is similar to Vasiliev's higher spin gravity, but part of the equations of motion are provided by the action principle of a matrix model. In particular, we construct a higher spin (gravity) matrix model related to type IIB matrix models/string theory that has a well-defined classical limit, and which is compatible with higher spin gravity in AdS space. As it has been suggested that higher spin gravity should be related to string theory in a high energy (tensionless) regime, and, therefore, to M-Theory, we expect that our construction will be useful to explore concrete connections.

  12. Conducted Emission Evaluation for Direct Matrix Converters

    Science.gov (United States)

    Nothofer, A.; Tarisciotti, L.; Greedy, S.; Empringham, L.; De Lillo, L.; Degano, M.

    2016-05-01

    Matrix converters have recently been proposed as an alternative to the standard back-to-back converter in aerospace applications. However, Electromagnetic Interference (EMI), and in particular conducted emissions, represents a critical aspect for this converter family. Direct Matrix Converters (DMCs) are usually modelled only at the normal operating frequency, but for the research presented in this paper, the model is modified to include a detailed high-frequency description, which is of interest for conducted emission studies. This paper analyzes the performance of the DMC when different control and modulation techniques are used. Experimental results are shown to validate the simulation models.

  13. Initial validation of the Argentinean Spanish version of the PedsQL 4.0 Generic Core Scales in children and adolescents with chronic diseases: acceptability and comprehensibility in low-income settings.

    Science.gov (United States)

    Roizen, Mariana; Rodríguez, Susana; Bauer, Gabriela; Medin, Gabriela; Bevilacqua, Silvina; Varni, James W; Dussel, Veronica

    2008-08-07

    , respectively, p = 0.01), between different chronic health conditions, and children from lower socioeconomic status. Results suggest that the Argentinean Spanish PedsQL 4.0 is suitable for research purposes in the public health setting for children over 8 years old and parents of children over 5 years old. People with low income and low literacy need help to complete the instrument. Steps to expand the use of the Argentinean Spanish PedsQL 4.0 include an alternative approach to scoring for the 2-4 year-olds, further understanding of how to increase reliability for the 5-7 year-olds self-report, and confirmation of other aspects of validity.

  14. Initial validation of the Argentinean Spanish version of the PedsQL™ 4.0 Generic Core Scales in children and adolescents with chronic diseases: acceptability and comprehensibility in low-income settings

    Science.gov (United States)

    Roizen, Mariana; Rodríguez, Susana; Bauer, Gabriela; Medin, Gabriela; Bevilacqua, Silvina; Varni, James W; Dussel, Veronica

    2008-01-01

    .72 and 66.87, for healthy and ill children, respectively, p = 0.01), between different chronic health conditions, and children from lower socioeconomic status. Conclusion Results suggest that the Argentinean Spanish PedsQL™ 4.0 is suitable for research purposes in the public health setting for children over 8 years old and parents of children over 5 years old. People with low income and low literacy need help to complete the instrument. Steps to expand the use of the Argentinean Spanish PedsQL™ 4.0 include an alternative approach to scoring for the 2–4 year-olds, further understanding of how to increase reliability for the 5–7 year-olds self-report, and confirmation of other aspects of validity. PMID:18687134

  15. Elementary matrix theory

    CERN Document Server

    Eves, Howard

    1980-01-01

    The usefulness of matrix theory as a tool in disciplines ranging from quantum mechanics to psychometrics is widely recognized, and courses in matrix theory are increasingly a standard part of the undergraduate curriculum.This outstanding text offers an unusual introduction to matrix theory at the undergraduate level. Unlike most texts dealing with the topic, which tend to remain on an abstract level, Dr. Eves' book employs a concrete elementary approach, avoiding abstraction until the final chapter. This practical method renders the text especially accessible to students of physics, engineeri

  16. On the eigenvalue and eigenvector derivatives of a non-defective matrix

    Science.gov (United States)

    Juang, Jer-Nan; Ghaemmaghami, Peiman; Lim, Kyong Been

    1988-01-01

    A novel approach is introduced to address the problem of existence of differentiable eigenvectors for a nondefective matrix which may have repeated eigenvalues. The existence of eigenvector derivatives for a unique set of continuous eigenvectors corresponding to a repeated eigenvalue is rigorously established for nondefective and analytic matrices. A numerically implementable method is then developed to compute the differentiable eigenvectors associated with repeated eigenvalues. The solutions of eigenvalue and eigenvector derivatives for repeated eigenvalues are then derived. An example is given to illustrate the validity of formulations developed in this paper.
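    For context, the classical starting point that this work generalizes is the derivative of a simple (non-repeated) eigenvalue of a parameter-dependent matrix \(A(p)\) (a standard result, not specific to this paper):

    \[
    \frac{\partial \lambda_i}{\partial p}
      = \frac{\mathbf{y}_i^{\mathsf T}\,\dfrac{\partial A(p)}{\partial p}\,\mathbf{x}_i}
             {\mathbf{y}_i^{\mathsf T}\mathbf{x}_i},
    \]

    where \(\mathbf{x}_i\) and \(\mathbf{y}_i\) are the right and left eigenvectors of \(\lambda_i\). For repeated eigenvalues a differentiable set of eigenvectors must first be selected, which is the problem addressed above.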

  17. Semisupervised kernel matrix learning by kernel propagation.

    Science.gov (United States)

    Hu, Enliang; Chen, Songcan; Zhang, Daoqiang; Yin, Xuesong

    2010-11-01

    The goal of semisupervised kernel matrix learning (SS-KML) is to learn a kernel matrix on all the given samples on which just a little supervised information, such as class labels or pairwise constraints, is provided. Despite extensive research, the performance of SS-KML still leaves some space for improvement in terms of effectiveness and efficiency. For example, a recent pairwise constraints propagation (PCP) algorithm has formulated SS-KML into a semidefinite programming (SDP) problem, but its computation is very expensive, which undoubtedly restricts PCP's scalability in practice. In this paper, a novel algorithm, called kernel propagation (KP), is proposed to improve the comprehensive performance in SS-KML. The main idea of KP is first to learn a small-sized sub-kernel matrix (named seed-kernel matrix) and then propagate it into a larger-sized full-kernel matrix. Specifically, the implementation of KP consists of three stages: 1) separate the supervised sample (sub)set X(l) from the full sample set X; 2) learn a seed-kernel matrix on X(l) through solving a small-scale SDP problem; and 3) propagate the learnt seed-kernel matrix into a full-kernel matrix on X. Furthermore, following the idea in KP, we naturally develop two conveniently realizable out-of-sample extensions for KML: one is batch-style extension, and the other is online-style extension. The experiments demonstrate that KP is encouraging in both effectiveness and efficiency compared with three state-of-the-art algorithms and its related out-of-sample extensions are promising too.

  18. Pesticide-Exposure Matrix

    Science.gov (United States)

    The "Pesticide-exposure Matrix" was developed to help epidemiologists and other researchers identify the active ingredients to which people were likely exposed when their homes and gardens were treated for pests in past years.

  19. Tendon functional extracellular matrix.

    Science.gov (United States)

    Screen, Hazel R C; Berk, David E; Kadler, Karl E; Ramirez, Francesco; Young, Marian F

    2015-06-01

    This article is one of a series, summarizing views expressed at the Orthopaedic Research Society New Frontiers in Tendon Research Conference. This particular article reviews the three workshops held under the "Functional Extracellular Matrix" stream. The workshops focused on the roles of the tendon extracellular matrix, such as performing the mechanical functions of tendon, creating the local cell environment, and providing cellular cues. Tendon is a complex network of matrix and cells, and its biological functions are influenced by widely varying extrinsic and intrinsic factors such as age, nutrition, exercise levels, and biomechanics. Consequently, tendon adapts dynamically during development, aging, and injury. The workshop discussions identified research directions associated with understanding cell-matrix interactions to be of prime importance for developing novel strategies to target tendon healing or repair. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  20. Matrix Big Brunch

    OpenAIRE

    Bedford, J; Papageorgakis, C.; Rodriguez-Gomez, D.; Ward, J.

    2007-01-01

    Following the holographic description of linear dilaton null Cosmologies with a Big Bang in terms of Matrix String Theory put forward by Craps, Sethi and Verlinde, we propose an extended background describing a Universe including both Big Bang and Big Crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using Matrix String Theory. We provide a simple theory capable of...

  1. The Matrix Organization Revisited

    DEFF Research Database (Denmark)

    Gattiker, Urs E.; Ulhøi, John Parm

    1999-01-01

    This paper gives a short overview of matrix structure and technology management. It outlines some of the characteristics and also points out that many organizations may actually be hybrids (i.e., mix several ways of organizing to allocate resources effectively).

  2. Hacking the Matrix.

    Science.gov (United States)

    Czerwinski, Michael; Spence, Jason R

    2017-01-05

    Recently in Nature, Gjorevski et al. (2016) describe a fully defined synthetic hydrogel that mimics the extracellular matrix to support in vitro growth of intestinal stem cells and organoids. The hydrogel allows exquisite control over the chemical and physical in vitro niche and enables identification of regulatory properties of the matrix. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Development, Validation, and Field-Testing of an Instrument for Clinical Assessment of HIV-Associated Neuropathy and Neuropathic Pain in Resource-Restricted and Large Population Study Settings.

    Directory of Open Access Journals (Sweden)

    Yohannes W Woldeamanuel

    HIV-associated sensory peripheral neuropathy (HIV-SN) afflicts approximately 50% of patients on antiretroviral therapy, and is associated with significant neuropathic pain. Simple, accurate diagnostic instruments are required for clinical research and daily practice in both high- and low-resource settings. A 4-item clinical tool (CHANT: Clinical HIV-associated Neuropathy Tool) assessing symptoms (pain and numbness) and signs (ankle reflexes and vibration sense) was developed by selecting and combining the most accurate measurands from a deep phenotyping study of HIV-positive people (Pain In Neuropathy Study-HIV-PINS). CHANT was alpha-tested in silico against the HIV-PINS dataset and then clinically validated and field-tested in HIV-positive cohorts in London, UK and Johannesburg, South Africa. The Utah Early Neuropathy Score (UENS) was used as the reference standard in both settings. In a second step, neuropathic pain in the presence of HIV-SN was assessed using the Douleur Neuropathique en 4 Questions (DN4) interview and a body map. CHANT achieved high accuracy on alpha-testing, with sensitivity and specificity of 82% and 90%, respectively. In 30 patients in London, CHANT diagnosed HIV-SN in 43.3% (13/30; 66.7% with neuropathic pain); sensitivity = 100%, specificity = 85%, and likelihood ratio = 6.7 versus UENS; internal consistency = 0.88 (Cronbach alpha), average item-total correlation = 0.73 (Spearman's Rho), and inter-tester concordance > 0.93 (Spearman's Rho). In 50 patients in Johannesburg, CHANT diagnosed HIV-SN in 66% (33/50; 78.8% with neuropathic pain); sensitivity = 74.4%, specificity = 85.7%, and likelihood ratio = 5.29 versus UENS. A positive CHANT score markedly increased the pre- to post-test clinical certainty of HIV-SN from 43% to 83% in London, and from 66% to 92% in Johannesburg. In conclusion, a combination of four easily and quickly assessed clinical items can be used to accurately diagnose HIV-SN. DN4-interview used in the context of bilateral feet

  4. Betatron coupling: Merging Hamiltonian and matrix approaches

    Directory of Open Access Journals (Sweden)

    R. Calaga

    2005-03-01

    Betatron coupling is usually analyzed using either matrix formalism or Hamiltonian perturbation theory. The latter is less exact but provides a better physical insight. In this paper direct relations are derived between the two formalisms. This makes it possible to interpret the matrix approach in terms of resonances, as well as use results of both formalisms indistinctly. An approach to measure the complete coupling matrix and its determinant from turn-by-turn data is presented. Simulations using methodical accelerator design MAD-X, an accelerator design and tracking program, were performed to validate the relations and understand the scope of their application to real accelerators such as the Relativistic Heavy Ion Collider.

  5. Some remarks on 't Hooft's S-matrix for black holes

    OpenAIRE

    Itzhaki, N.

    1996-01-01

    We discuss the limitations of 't Hooft's proposal for the black hole S-matrix. We find that the validity of the S-matrix implies violation of the semi-classical approximation at scales large compared to the Planck scale. We also show that the effect of the centrifugal barrier on the S-matrix is crucial even for large transverse distances.

  6. Algorithms over partially ordered sets

    DEFF Research Database (Denmark)

    Baer, Robert M.; Østerby, Ole

    1969-01-01

    We here study some problems concerned with the computational analysis of finite partially ordered sets. We begin (in § 1) by showing that the matrix representation of a binary relation R may always be taken in triangular form if R is a partial ordering. We consider (in § 2) the chain structure...
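    A small sketch of the § 1 observation (hypothetical encoding, not taken from the paper): listing the elements in a linear extension of the partial order brings its 0/1 relation matrix to triangular form:

    ```python
    # Order the elements of a finite poset topologically, then build the
    # relation matrix in that order; it comes out upper triangular.
    import numpy as np
    from graphlib import TopologicalSorter

    preds = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}   # a < b, c < d
    order = list(TopologicalSorter(preds).static_order())            # a linear extension, e.g. a, b, c, d

    leq = {("a", "a"), ("b", "b"), ("c", "c"), ("d", "d"),           # reflexive-transitive relation
           ("a", "b"), ("a", "c"), ("a", "d"), ("b", "d"), ("c", "d")}
    n = len(order)
    R = np.zeros((n, n), dtype=int)
    for i, x in enumerate(order):
        for j, y in enumerate(order):
            R[i, j] = int((x, y) in leq)
    print(R)   # upper triangular because predecessors are listed first
    ```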

  7. Matrix Information Geometry

    CERN Document Server

    Bhatia, Rajendra

    2013-01-01

    This book is an outcome of the Indo-French Workshop on Matrix Information Geometries (MIG): Applications in Sensor and Cognitive Systems Engineering, which was held at Ecole Polytechnique and the Thales Research and Technology Center, Palaiseau, France, on February 23-25, 2011. The workshop was generously funded by the Indo-French Centre for the Promotion of Advanced Research (IFCPAR). During the event, 22 renowned invited French or Indian speakers gave lectures on their areas of expertise within the field of matrix analysis or processing. From these talks, a total of 17 original contributions or state-of-the-art chapters have been assembled in this volume. All articles were thoroughly peer-reviewed and improved according to the suggestions of the international referees. The 17 contributions presented are organized in three parts: (1) State-of-the-art surveys & original matrix theory work, (2) Advanced matrix theory for radar processing, and (3) Matrix-based signal processing applications.

  8. Is the prefrontal cortex important for fluid intelligence? A neuropsychological study using Matrix Reasoning.

    Science.gov (United States)

    Tranel, Daniel; Manzel, Kenneth; Anderson, Steven W

    2008-03-01

    Patients with prefrontal damage and severe defects in decision making and emotional regulation often have a remarkable absence of intellectual impairment, as measured by conventional IQ tests such as the WAIS/WAIS-R. This enigma might be explained by shortcomings in the tests, which tend to emphasize measures of "crystallized" (e.g., vocabulary, fund of information) more than "fluid" (e.g., novel problem solving) intelligence. The WAIS-III added the Matrix Reasoning subtest to enhance measurement of fluid reasoning. In a set of four studies, we investigated Matrix Reasoning performances in 80 patients with damage to various sectors of the prefrontal cortex, and contrasted these with the performances of 80 demographically matched patients with damage outside the frontal lobes. The results failed to support the hypothesis that prefrontal damage would disproportionately impair fluid intelligence, and every prefrontal subgroup we studied (dorsolateral, ventromedial, dorsolateral + ventromedial) had Matrix Reasoning scores (as well as IQ scores more generally) that were indistinguishable from those of the brain-damaged comparison groups. Our findings do not support a connection between fluid intelligence and the frontal lobes, although a viable alternative interpretation is that the Matrix Reasoning subtest lacks construct validity as a measure of fluid intelligence.

  9. MATLAB matrix algebra

    CERN Document Server

    Pérez López, César

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Matrix Algebra introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. Starting with a look at symbolic and numeric variables, with an emphasis on vector and matrix variables, you will go on to examine functions and operations that support vectors and matrices as arguments, including those based on analytic parent functions. Computational methods for finding eigenvalues and eigenvectors of matrices are detailed, leading to various matrix decompositions. Applications such as change of bases, the classification of quadratic forms and ...

  10. Matrix interdiction problem

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Feng [Los Alamos National Laboratory; Kasiviswanathan, Shiva [Los Alamos National Laboratory

    2010-01-01

    In the matrix interdiction problem, a real-valued matrix and an integer k are given. The objective is to remove k columns such that the sum over all rows of the maximum entry in each row is minimized. This combinatorial problem is closely related to the bipartite network interdiction problem, which can be applied to prioritize border checkpoints in order to minimize the probability that an adversary can successfully cross the border. After introducing the matrix interdiction problem, we prove that the problem is NP-hard, and even NP-hard to approximate within an additive n^γ factor for a fixed constant γ. We also present an algorithm for this problem that achieves an (n-k) multiplicative approximation ratio.
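    A brute-force sketch for small instances (illustrative only; since the problem is NP-hard, this does not scale):

    ```python
    # Enumerate all ways of removing k columns and keep the one that minimizes
    # the sum over rows of each row's maximum remaining entry.
    import itertools
    import numpy as np

    def matrix_interdiction(A, k):
        n = A.shape[1]
        best_value, best_cols = float("inf"), None
        for removed in itertools.combinations(range(n), k):
            keep = [j for j in range(n) if j not in removed]
            value = A[:, keep].max(axis=1).sum()   # sum of row maxima after removal
            if value < best_value:
                best_value, best_cols = value, removed
        return best_value, best_cols

    A = np.array([[3, 9, 1],
                  [8, 2, 5],
                  [4, 7, 6]])
    print(matrix_interdiction(A, k=1))   # which single column is best to remove?
    ```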

  11. Matrixed business support comparison study.

    Energy Technology Data Exchange (ETDEWEB)

    Parsons, Josh D.

    2004-11-01

    The Matrixed Business Support Comparison Study reviewed the current matrixed Chief Financial Officer (CFO) division staff models at Sandia National Laboratories. There were two primary drivers of this analysis: (1) the increasing number of financial staff matrixed to mission customers and (2) the desire to further understand the matrix process and the opportunities and challenges it creates.

  12. Elementary matrix algebra

    CERN Document Server

    Hohn, Franz E

    2012-01-01

    This complete and coherent exposition, complemented by numerous illustrative examples, offers readers a text that can teach by itself. Fully rigorous in its treatment, it offers a mathematically sound sequencing of topics. The work starts with the most basic laws of matrix algebra and progresses to the sweep-out process for obtaining the complete solution of any given system of linear equations - homogeneous or nonhomogeneous - and the role of matrix algebra in the presentation of useful geometric ideas, techniques, and terminology.Other subjects include the complete treatment of the structur

  13. Complex matrix model duality

    Energy Technology Data Exchange (ETDEWEB)

    Brown, T.W.

    2010-11-15

    The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super-Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)

  14. The algebras of large N matrix mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Halpern, M.B.; Schwartz, C.

    1999-09-16

    Extending early work, we formulate the large N matrix mechanics of general bosonic, fermionic and supersymmetric matrix models, including Matrix theory: The Hamiltonian framework of large N matrix mechanics provides a natural setting in which to study the algebras of the large N limit, including (reduced) Lie algebras, (reduced) supersymmetry algebras and free algebras. We find in particular a broad array of new free algebras which we call symmetric Cuntz algebras, interacting symmetric Cuntz algebras, symmetric Bose/Fermi/Cuntz algebras and symmetric Cuntz superalgebras, and we discuss the role of these algebras in solving the large N theory. Most important, the interacting Cuntz algebras are associated to a set of new (hidden!) local quantities which are generically conserved only at large N. A number of other new large N phenomena are also observed, including the intrinsic nonlocality of the (reduced) trace class operators of the theory and a closely related large N field identification phenomenon which is associated to another set (this time nonlocal) of new conserved quantities at large N.

  15. Automatic Generation of Partitioned Matrix Expressions for Matrix Operations

    Science.gov (United States)

    Fabregat-Traver, Diego; Bientinesi, Paolo

    2010-09-01

    We target the automatic generation of formally correct algorithms and routines for linear algebra operations. Given the broad variety of architectures and configurations with which scientists deal, there does not exist one algorithmic variant that is suitable for all scenarios. Therefore, we aim to generate a family of algorithmic variants to attain high-performance for a broad set of scenarios. One of the authors has previously demonstrated that automatic derivation of a family of algorithms is possible when the Partitioned Matrix Expression (PME) of the target operation is available. The PME is a recursive definition that states the relations between submatrices in the input and the output operands. In this paper we describe all the steps involved in the automatic derivation of PMEs, thus making progress towards a fully automated system.
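    To illustrate what a PME looks like (a standard textbook example in this FLAME-style notation; the operations treated in the paper may differ), consider the Cholesky factorization \(A = LL^{\mathsf T}\) with a 2x2 partitioning of the operands:

    \[
    \begin{pmatrix} A_{TL} & \star \\ A_{BL} & A_{BR} \end{pmatrix}
    =
    \begin{pmatrix} L_{TL} & 0 \\ L_{BL} & L_{BR} \end{pmatrix}
    \begin{pmatrix} L_{TL} & 0 \\ L_{BL} & L_{BR} \end{pmatrix}^{\mathsf T}
    \quad\Longrightarrow\quad
    \begin{cases}
    A_{TL} = L_{TL}L_{TL}^{\mathsf T},\\
    A_{BL} = L_{BL}L_{TL}^{\mathsf T},\\
    A_{BR} = L_{BL}L_{BL}^{\mathsf T} + L_{BR}L_{BR}^{\mathsf T}.
    \end{cases}
    \]

    Each relation between submatrices then seeds a different loop-based algorithmic variant.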

  16. Shunt Active and Series Active Filters-Based Power Quality Conditioner for Matrix Converter

    OpenAIRE

    P. Jeno Paul

    2011-01-01

    This paper proposes a series active filter and a shunt active filter, instead of a passive filter, to minimize the power quality impact present in matrix converters. A matrix converter injects significant harmonics and nonstandard frequency components into the load. The proposed system compensates sag and swell problems efficiently in the matrix converter. The proposed system has been tested and validated on the matrix converter using MATLAB/Simulink software. Simulated results confirm that the active...

  17. Characterization of Metal Matrix Composites

    Science.gov (United States)

    Daniel, I. M.; Chun, H. J.; Karalekas, D.

    1994-01-01

    Experimental methods were developed, adapted, and applied to the characterization of a metal matrix composite system, namely, silicon carbide/aluminum (SCS-2/6061 Al), and its constituents. The silicon carbide fiber was characterized by determining its modulus, strength, and coefficient of thermal expansion. The aluminum matrix was characterized thermomechanically up to 399 C (750 F) at two strain rates. The unidirectional SiC/Al composite was characterized mechanically under longitudinal, transverse, and in-plane shear loading up to 399 C (750 F). Isothermal and non-isothermal creep behavior was also measured. The applicability of a proposed set of multifactor thermoviscoplastic nonlinear constitutive relations and a computer code was investigated. Agreement between predictions and experimental results was shown in a few cases. The elastoplastic thermomechanical behavior of the composite was also described by a number of new analytical models developed or adapted for the material system studied. These models include the rule of mixtures, a composite cylinder model with various thermoelastoplastic analyses, and a model based on average field theory. In most cases satisfactory agreement was demonstrated between analytical predictions and experimental results for the cases of stress-strain behavior and thermal deformation behavior at different temperatures. In addition, some models yielded detailed three-dimensional stress distributions in the constituents within the composite.

  18. Matrix relation algebras

    NARCIS (Netherlands)

    el Bachraoui, M.; van de Vel, M.L.J.

    2002-01-01

    Square matrices over a relation algebra are relation algebras in a natural way. We show that for fixed n, these algebras can be characterized as reducts of some richer kind of algebra. Hence for fixed n, the class of n × n matrix relation algebras has a first-order characterization. As a...

  19. A random matrix analysis

    Indian Academy of Sciences (India)

    ... chaos to galaxies. We demonstrate the applicability of random matrix theory for networks by providing a new dimension to complex systems research. We show that in spite of huge differences ... as mentioned earlier, different types of networks can be constructed based on the nature of connections. For example, ...

  20. Constructing the matrix

    Science.gov (United States)

    Elliott, John

    2012-09-01

    As part of our 'toolkit' for analysing an extraterrestrial signal, the facility for calculating structural affinity to known phenomena must be part of our core capabilities. Without such a resource, we risk compromising our potential for detection and decipherment or at least causing significant delay in the process. To create such a repository for assessing structural affinity, all known systems (language parameters) need to be structurally analysed to 'place' their 'system' within a relational communication matrix. This will need to include all known variants of language structure, whether 'living' (in current use) or ancient; this must also include endeavours to incorporate yet undeciphered scripts and non-human communication, to provide as complete a picture as possible. In creating such a relational matrix, post-detection decipherment will be assisted by a structural 'map' that will have the potential for 'placing' an alien communication with its nearest known 'neighbour', to assist subsequent categorisation of basic parameters as a precursor to decipherment. 'Universal' attributes and behavioural characteristics of known communication structure will form a range of templates (Elliott, 2001 [1] and Elliott et al., 2002 [2]), to support and optimise our attempt at categorising and deciphering the content of an extraterrestrial signal. Detection of the hierarchical layers, which comprise intelligent, complex communication, will then form a matrix of calculations that will ultimately score affinity through a relational matrix of structural comparison. In this paper we develop the rationales and demonstrate functionality with initial test results.

  1. 2matrix: A Utility for Indel Coding and Phylogenetic Matrix Concatenation

    Directory of Open Access Journals (Sweden)

    Nelson R. Salinas

    2014-01-01

    Full Text Available Premise of the study: Phylogenetic analysis of DNA and amino acid sequences requires the creation of files formatted specifically for each analysis package. Programs currently available cannot simultaneously code inferred insertion/deletion (indel) events in sequence alignments and concatenate data sets. Methods and Results: A novel Perl script, 2matrix, was created to concatenate matrices of non-molecular characters and/or aligned sequences and to code indels. 2matrix outputs a variety of formats compatible with popular phylogenetic programs. Conclusions: 2matrix efficiently codes indels and concatenates matrices of sequences and non-molecular data. It is available for free download under a GPL (General Public License) open source license (https://github.com/nrsalinas/2matrix/archive/master.zip).

  2. Analysis Matrix for Smart Cities

    Directory of Open Access Journals (Sweden)

    Pablo E. Branchi

    2014-01-01

    Full Text Available The current digital revolution has ignited the evolution of communications grids and the development of new schemes for productive systems. Traditional technological scenarios have been challenged, and Smart Cities have become the basis for urban competitiveness. Citizens are the ones with the power to set new scenarios, so a definition of the way people interact with their cities is needed, as discussed in the first part of the article. At the same time, a lack of clarity has been detected in how Smart Cities are described, and the second part tries to set the basis for that. Given the above, the information and communication technologies that manage and transform 21st-century cities must be reviewed, analyzing their impact on the new social behaviors that shape the spaces and means of communication, as posed in the experimental section, which sets the basis for an analysis matrix to score the different elements that affect a Smart City environment. Thus, to evaluate what a Smart City is, a tool is needed to score the different technologies on the basis of their usefulness and consequences, considering the impact of each application. To that end, the final section applies the main objective of this article to practical scenarios, considering how the technologies are used by citizens, who must be the main concern of all urban development.

  3. Channel matrix characterization in MIMO scenario through impedance modulation

    OpenAIRE

    Monsalve Carcelen, Beatriz; Romeu Robert, Jordi; Blanch Boris, Sebastián

    2010-01-01

    In this paper, a new measurement setup to obtain the characteristic channel matrix of a MIMO radio system is presented. It uses a small, non-intrusive device able to characterize two antennas, thereby obtaining the characteristic channel matrix of an Nx2 MIMO radio system without using a receiver for each port. The extension to a system with more than two output elements can easily be achieved by increasing the complexity of the switching device. To validate the new methodo...

  4. Matrix groups for undergraduates

    CERN Document Server

    Tapp, Kristopher

    2016-01-01

    Matrix groups touch an enormous spectrum of the mathematical arena. This textbook brings them into the undergraduate curriculum. It makes an excellent one-semester course for students familiar with linear and abstract algebra and prepares them for a graduate course on Lie groups. Matrix Groups for Undergraduates is concrete and example-driven, with geometric motivation and rigorous proofs. The story begins and ends with the rotations of a globe. In between, the author combines rigor and intuition to describe the basic objects of Lie theory: Lie algebras, matrix exponentiation, Lie brackets, maximal tori, homogeneous spaces, and roots. This second edition includes two new chapters that allow for an easier transition to the general theory of Lie groups. From reviews of the First Edition: This book could be used as an excellent textbook for a one semester course at university and it will prepare students for a graduate course on Lie groups, Lie algebras, etc. … The book combines an intuitive style of writing w...

  5. Extracellular matrix structure.

    Science.gov (United States)

    Theocharis, Achilleas D; Skandalis, Spyros S; Gialeli, Chrysostomi; Karamanos, Nikos K

    2016-02-01

    Extracellular matrix (ECM) is a non-cellular three-dimensional macromolecular network composed of collagens, proteoglycans/glycosaminoglycans, elastin, fibronectin, laminins, and several other glycoproteins. Matrix components bind each other as well as cell adhesion receptors, forming a complex network within which cells reside in all tissues and organs. Cell surface receptors transduce signals into cells from ECM, which regulate diverse cellular functions, such as survival, growth, migration, and differentiation, and are vital for maintaining normal homeostasis. ECM is a highly dynamic structural network that continuously undergoes remodeling mediated by several matrix-degrading enzymes during normal and pathological conditions. Deregulation of ECM composition and structure is associated with the development and progression of several pathologic conditions. This article emphasizes the complex ECM structure so as to provide a better understanding of its dynamic structural and functional multipotency. Where relevant, the implication of the various families of ECM macromolecules in health and disease is also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Distributed Generation using Indirect Matrix Converter in Boost Operating Mode

    DEFF Research Database (Denmark)

    Liu, Xiong; Loh, Poh Chiang; Wang, Peng

    2011-01-01

    The indirect matrix converter (IMC), using a two-stage configuration, is an interesting alternative for ac/ac conversion. In some cases, the ac/ac converter needs a boost function, which cannot be achieved by the traditional IMC due to its limited input-to-output voltage transfer gain of 0.866. Alternatively...... power factor and controllable grid-side power factor. In isolated mode, the matrix converter is controlled to supply a three-phase ac voltage and also to guarantee sinusoidal input/output waveforms as well as unity input power factor. Simulation and experimental results are provided to validate...... the effectiveness of the control schemes for the proposed matrix converter....

  7. Costs equations for cost modeling: application of ABC Matrix

    Directory of Open Access Journals (Sweden)

    Alex Fabiano Bertollo Santana

    2016-03-01

    Full Text Available This article aims to provide an application of the ABC Matrix model - a management tool that models processes and activities. The ABC Matrix is based on matrix multiplication, using a fast algorithm for the development of costing systems and the subsequent translation of the costs into cost equations and systems. The research methodology is classified as a case study, using simulation data to validate the model. The conclusion of the research is that the algorithm presented is an important development, because it is an effective approach to calculating product cost and because it provides simple and flexible algorithm design software for controlling the cost of products.
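
    A minimal sketch of the general costing-by-matrix-multiplication idea the abstract describes, assuming a simple two-stage allocation (resources to activities, then activities to products); all matrices and figures below are invented for illustration and are not the article's data.

```python
import numpy as np

# Hypothetical illustration of ABC-style costing via matrix multiplication:
# resource costs are first driven to activities, then activities to products.

resource_cost = np.array([10_000.0, 6_000.0])      # cost pools of 2 resources

# Fraction of each resource consumed by each of 3 activities (rows sum to 1).
resource_to_activity = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
])

# Share of each activity's cost attributed to each of 2 products (rows sum to 1).
activity_to_product = np.array([
    [0.6, 0.4],
    [0.5, 0.5],
    [0.3, 0.7],
])

activity_cost = resource_cost @ resource_to_activity  # cost pooled per activity
product_cost = activity_cost @ activity_to_product    # cost per product

print(activity_cost)   # [6200. 6000. 3800.]
print(product_cost)    # [7860. 8140.]  (totals preserved: 16000)
```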

  8. Trust, but Verify: Standard Setting That Honors and Validates Professional Teacher Judgment. Subtitle: A Tenuous Titanic Tale of Testy Testing and Titillating Touchstones (A Screen Play with an Unknown Number of Acts).

    Science.gov (United States)

    Matter, M. Kevin

    The Cherry Creek School district (Englewood, Colorado) is a growing district of 37,000 students in the Denver area. The 1988 Colorado State School Finance Act required district-set proficiencies (standards), and forced agreement on a set of values for student knowledge and skills. State-adopted standards added additional requirements for the…

  9. Modulation and control of matrix converter for aerospace application

    Science.gov (United States)

    Kobravi, Keyhan

    In the context of modern aircraft systems, a major challenge is power conversion to supply the aircraft's electrical instruments. These instruments are energized through a fixed-frequency internal power grid. In an aircraft, the available sources of energy are a set of variable-speed generators which provide variable-frequency ac voltages. Therefore, to energize the internal power grid of an aircraft, the variable-frequency ac voltages should be converted to a fixed-frequency ac voltage. As a result, an ac to ac power conversion is required within an aircraft's power system. This thesis develops a Matrix Converter to energize the aircraft's internal power grid. The Matrix Converter provides a direct ac to ac power conversion. A major challenge of designing Matrix Converters for aerospace applications is to minimize the volume and weight of the converter. These parameters are minimized by increasing the switching frequency of the converter. To design a Matrix Converter operating at a high switching frequency, this thesis (i) develops a scheme to integrate fast semiconductor switches within the currently available Matrix Converter topologies, i.e., a MOSFET-based Matrix Converter, and (ii) develops a new modulation strategy for the Matrix Converter. This Matrix Converter and the new modulation strategy enable operation of the converter at a switching frequency of 40 kHz. To provide a reliable source of energy, this thesis also develops a new methodology for robust control of the Matrix Converter. To verify the performance of the proposed MOSFET-based Matrix Converter, modulation strategy, and control design methodology, various simulation and experimental results are presented. The experimental results are obtained under operating conditions present in an aircraft. The experimental results verify that the proposed Matrix Converter provides reliable power conversion in an aircraft under extreme operating conditions. The results prove the superiority of the proposed Matrix...

  10. Cobalt magnetic nanoparticles embedded in carbon matrix: biofunctional validation

    Energy Technology Data Exchange (ETDEWEB)

    Krolow, Matheus Z., E-mail: matheuskrolow@ifsul.edu.br [Universidade Federal de Pelotas, Engenharia de Materiais, Centro de Desenvolvimento Tecnologico (Brazil); Monte, Leonardo G.; Remiao, Mariana H.; Hartleben, Claudia P.; Moreira, Angela N.; Dellagostin, Odir A. [Universidade Federal de Pelotas, Nucleo de Biotecnologia, Centro de Desenvolvimento Tecnologico (Brazil); Piva, Evandro [Universidade Federal de Pelotas, Faculdade de Odontologia (Brazil); Conceicao, Fabricio R. [Universidade Federal de Pelotas, Nucleo de Biotecnologia, Centro de Desenvolvimento Tecnologico (Brazil); Carreno, Neftali L. V. [Universidade Federal de Pelotas, Engenharia de Materiais, Centro de Desenvolvimento Tecnologico (Brazil)

    2012-09-15

    Carbon nanostructures and nanocomposites display versatile allotropic morphologies and physico-chemical properties, and have a wide range of applications in mechanics, electronics, biotechnology, structural materials, chemical processing, and energy management. In this study we report the synthesis, characterization, and biotechnological application of cobalt magnetic nanoparticles, with diameters of approximately 15-40 nm, embedded in a carbon structure (Co/C-MN). A single-step chemical process was used in the synthesis of the Co/C-MN. The Co/C-MN presented superparamagnetic behavior at room temperature, an essential property for the immunoseparation assays carried out here. To stimulate interactions between proteins and Co/C-MN, this nanocomposite was functionalized with acrylic acid (AA). We have shown the bonding of different proteins onto the Co/C-AA surface using an immunofluorescence assay. Co/C-AA coated with a monoclonal antibody against pathogenic Leptospira spp. was able to capture leptospires, suggesting that it could be useful in immunoseparation assays.

  11. Estimation and Q-Matrix Validation for Diagnostic Classification Models

    Science.gov (United States)

    Feng, Yuling

    2013-01-01

    Diagnostic classification models (DCMs) are structured latent class models widely discussed in the field of psychometrics. They model subjects' underlying attribute patterns and classify subjects into unobservable groups based on their mastery of attributes required to answer the items correctly. The effective implementation of DCMs depends…

  12. Absorption properties of waste matrix materials

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, J.B. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1997-06-01

    This paper very briefly discusses the need for studies of the limiting critical concentration of radioactive waste matrix materials. Calculated limiting critical concentration values for some common waste materials are listed. However, for systems containing large quantities of waste materials, differences up to 10% in calculated k_eff values are obtained by changing cross section data sets. Therefore, experimental results are needed to compare with calculation results for resolving these differences and establishing realistic biases.

  13. Pseudo-Hermitian random matrix theory

    Science.gov (United States)

    Srivastava, S. C. L.; Jain, S. R.

    2013-02-01

    Complex extension of quantum mechanics and the discovery of pseudo-unitarily invariant random matrix theory has set the stage for a number of applications of these concepts in physics. We briefly review the basic ideas and present applications to problems in statistical mechanics where new results have become possible. We have found it important to mention the precise directions where advances could be made if further results become available.

  14. Pseudo-Hermitian random matrix theory

    OpenAIRE

    Srivastava, Shashi C. L.; Jain, S. R.

    2013-01-01

    Complex extension of quantum mechanics and the discovery of pseudo-unitarily invariant random matrix theory has set the stage for a number of applications of these concepts in physics. We briefly review the basic ideas and present applications to problems in statistical mechanics where new results have become possible. We have found it important to mention the precise directions where advances could be made if further results become available.

  15. Does the International Classification of Functioning, Disability and Health (ICF) core set for low back pain cover the patients' problems? A cross-sectional content-validity study with a Norwegian population.

    Science.gov (United States)

    Bautz-Holter, E; Sveen, U; Cieza, A; Geyh, S; Røe, C

    2008-12-01

    The aim of this work was to evaluate the Norwegian form of the International Classification of Functioning, Disability and Health (ICF) Core Set for low back pain patients and investigate the feasibility of the Core Set in clinical practice. This was part of an international multicenter study with 118 participating Norwegian patients referred to Departments of Physical Medicine and Rehabilitation with low back pain (LBP). The ICF Core Set for LBP was filled in by the health professionals. The patients reported their problems using the Medical Outcome Study Short Form 36 (SF-36) and the Oswestry Low Back Pain Disability Questionnaire (ODI). The ICF Core Set categories capture the problems of the LBP patients, and few categories were reported to be missing. Many problems were reported within body function, and problems within work and employment were captured by the activity and participation component. The environmental factors in ICF were most frequently scored as facilitators, but the same factor could also represent a barrier in other individuals. Health professionals, family and friends were important factors within this domain. Few problems were scored as severe or complete, indicating the need to collapse the qualifier levels. Scoring of the ICF Core Set was feasible, but rather time-consuming. The ICF Core Set for LBP captures the problems of LBP, and adds important aspects to clinical practice in the field of LBP. However, the ICF Core Set for LBP needs further elaboration in order to improve its clinical feasibility.

  16. Random matrix theory

    CERN Document Server

    Deift, Percy

    2009-01-01

    This book features a unified derivation of the mathematical theory of the three classical types of invariant random matrix ensembles-orthogonal, unitary, and symplectic. The authors follow the approach of Tracy and Widom, but the exposition here contains a substantial amount of additional material, in particular, facts from functional analysis and the theory of Pfaffians. The main result in the book is a proof of universality for orthogonal and symplectic ensembles corresponding to generalized Gaussian type weights following the authors' prior work. New, quantitative error estimates are derive

  17. Matrix Encryption Scheme

    Directory of Open Access Journals (Sweden)

    Abdelhakim Chillali

    2017-05-01

    Full Text Available In classical cryptography, the Hill cipher is a polygraphic substitution cipher based on linear algebra. In this work, we propose a new problem applicable to public key cryptography, based on matrices, called the "matrix discrete logarithm problem"; it uses certain elements formed by matrices whose coefficients are elements of a finite field. We construct an abelian group and, for the cryptographic part in this unreliable group, we then perform the computation corresponding to the algebraic equations, returning the encrypted result to a receiver. Upon receipt of the result, the receiver can retrieve the sender's clear message by performing the inverse calculation.
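
    For background only, a minimal sketch of the classical Hill cipher mentioned in the abstract (block encryption as matrix multiplication modulo 26); this is not the matrix discrete logarithm scheme proposed in the paper, and the key below is the usual textbook example.

```python
import numpy as np

# Minimal Hill cipher over Z_26 (classical background for the record above, not its new scheme).
KEY = np.array([[3, 3],
                [2, 5]])                 # invertible mod 26 (det = 9, gcd(9, 26) = 1)

def encrypt(plaintext: str, key: np.ndarray) -> str:
    nums = [ord(c) - ord('A') for c in plaintext.upper()]
    if len(nums) % 2:                    # pad to an even length
        nums.append(ord('X') - ord('A'))
    blocks = np.array(nums).reshape(-1, 2).T
    cipher = (key @ blocks) % 26
    return ''.join(chr(int(n) + ord('A')) for n in cipher.T.flatten())

def decrypt(ciphertext: str, key: np.ndarray) -> str:
    det = int(round(np.linalg.det(key))) % 26
    det_inv = pow(det, -1, 26)           # modular inverse of the determinant
    adj = np.array([[key[1, 1], -key[0, 1]],
                    [-key[1, 0], key[0, 0]]])
    key_inv = (det_inv * adj) % 26       # inverse key matrix mod 26
    blocks = np.array([ord(c) - ord('A') for c in ciphertext.upper()]).reshape(-1, 2).T
    plain = (key_inv @ blocks) % 26
    return ''.join(chr(int(n) + ord('A')) for n in plain.T.flatten())

print(encrypt("HELP", KEY))              # -> HIAT
print(decrypt(encrypt("HELP", KEY), KEY))  # -> HELP
```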

  18. Matrix string partition function

    CERN Document Server

    Kostov, Ivan K; Kostov, Ivan K.; Vanhove, Pierre

    1998-01-01

    We evaluate quasiclassically the Ramond partition function of Euclidean D=10 U(N) super Yang-Mills theory reduced to a two-dimensional torus. The result can be interpreted in terms of free strings wrapping the space-time torus, as expected from the point of view of Matrix string theory. We demonstrate that, when extrapolated to the ultraviolet limit (small area of the torus), the quasiclassical expressions reproduce exactly the recently obtained expression for the partition of the completely reduced SYM theory, including the overall numerical factor. This is an evidence that our quasiclassical calculation might be exact.

  19. Matrix vector analysis

    CERN Document Server

    Eisenman, Richard L

    2005-01-01

    This outstanding text and reference applies matrix ideas to vector methods, using physical ideas to illustrate and motivate mathematical concepts but employing a mathematical continuity of development rather than a physical approach. The author, who taught at the U.S. Air Force Academy, dispenses with the artificial barrier between vectors and matrices--and more generally, between pure and applied mathematics. Motivated examples introduce each idea, with interpretations of physical, algebraic, and geometric contexts, in addition to generalizations to theorems that reflect the essential structur...

  20. Matrix algebra for linear models

    CERN Document Server

    Gruber, Marvin H J

    2013-01-01

    Matrix methods have evolved from a tool for expressing statistical problems to an indispensable part of the development, understanding, and use of various types of complex statistical analyses. This evolution has made matrix methods a vital part of statistical education. Traditionally, matrix methods are taught in courses on everything from regression analysis to stochastic processes, thus creating a fractured view of the topic. Matrix Algebra for Linear Models offers readers a unique, unified view of matrix analysis theory (where and when necessary), methods, and their applications. Written f

  1. KBLAS: An Optimized Library for Dense Matrix-Vector Multiplication on GPU Accelerators

    KAUST Repository

    Abdelfattah, Ahmad

    2016-05-11

    KBLAS is an open-source, high-performance library that provides optimized kernels for a subset of Level 2 BLAS functionalities on CUDA-enabled GPUs. Since the performance of dense matrix-vector multiplication is hindered by the overhead of memory accesses, a double-buffering optimization technique is employed to overlap data motion with computation. After identifying a proper set of tuning parameters, KBLAS efficiently runs on various GPU architectures while avoiding code rewriting and retaining compliance with the standard BLAS API. Another optimization technique ensures coalesced memory access when dealing with submatrices, especially for high-level dense linear algebra algorithms. All KBLAS kernels have been extended to a multi-GPU environment, which requires the introduction of new APIs. Considering general matrices, KBLAS is very competitive with existing state-of-the-art kernels and provides smoother performance across a wide range of matrix dimensions. Considering symmetric and Hermitian matrices, KBLAS outperforms existing state-of-the-art implementations for all matrix sizes and achieves asymptotically up to 50% and 60% speedup against the best competitor on single-GPU and multi-GPU systems, respectively. Performance results also validate our performance model. A subset of KBLAS high-performance kernels has been integrated into NVIDIA's standard BLAS implementation (cuBLAS) for wider dissemination, starting from version 6.0. © 2016 ACM.

  2. Modeling of wave propagation in drill strings using vibration transfer matrix methods.

    Science.gov (United States)

    Han, Je-Heon; Kim, Yong-Joe; Karkoub, Mansour

    2013-09-01

    In order to understand critical vibrations of a drill bit, such as stick-slip and bit-bounce, and their wave propagation characteristics through a drill string system, it is critical to model the torsional, longitudinal, and flexural waves generated by the drill bit vibration. Here, a modeling method based on a vibration transfer matrix between two sets of structural wave variables at the ends of a constant cross-section, hollow, circular pipe is proposed. For a drill string system with multiple pipe sections, the total vibration transfer matrix is calculated by multiplying all individual matrices, each obtained for an individual pipe section. Since drill string systems are typically extremely long, conventional numerical analysis methods such as the finite element method (FEM) require a large number of mesh elements, which makes it computationally inefficient to analyze these drill string systems numerically. The proposed "analytical" vibration transfer matrix method requires significantly lower computational resources. For the validation of the proposed method, experimental and numerical data are obtained from laboratory experiments and FEM analyses conducted using a commercial FEM package, ANSYS. It is shown that the modeling results obtained using the proposed method match well with the experimental and numerical results.
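
    A schematic sketch of the bookkeeping step described above, chaining per-section transfer matrices by matrix multiplication; the 2x2 state vector and the section matrix used here are generic placeholders for a 1-D wave variable pair, not the paper's actual torsional/longitudinal/flexural formulation.

```python
import numpy as np

# Schematic only: the total transfer matrix of a multi-section string is the ordered
# product of the per-section matrices (placeholder 2x2 matrices, not the paper's).

def section_transfer_matrix(length: float, wavenumber: float) -> np.ndarray:
    """Toy 2x2 transfer matrix of one uniform section for the pair (u, u') with u'' = -k^2 u."""
    kL = wavenumber * length
    return np.array([[np.cos(kL),               np.sin(kL) / wavenumber],
                     [-wavenumber * np.sin(kL), np.cos(kL)]])

def total_transfer_matrix(sections) -> np.ndarray:
    """Multiply the individual matrices in the order the wave traverses the sections."""
    total = np.eye(2)
    for length, wavenumber in sections:
        total = section_transfer_matrix(length, wavenumber) @ total
    return total

# Example: three pipe sections with different lengths and wavenumbers.
pipe_string = [(10.0, 0.8), (25.0, 1.1), (5.0, 0.9)]
state_out = total_transfer_matrix(pipe_string) @ np.array([1.0, 0.0])  # propagate an end state
print(state_out)
```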

  3. Shunt Active and Series Active Filters-Based Power Quality Conditioner for Matrix Converter

    Directory of Open Access Journals (Sweden)

    P. Jeno Paul

    2011-01-01

    Full Text Available This paper proposes a series active filter and a shunt active filter, instead of a passive filter, to minimize the power quality impact present in matrix converters. A matrix converter injects significant harmonics and nonstandard frequency components into the load. The proposed system compensates sag and swell problems efficiently in the matrix converter. The proposed system has been tested and validated on the matrix converter using MATLAB/Simulink software. Simulated results confirm that the active power filters can maintain high performance for the matrix converter.

  4. Not dead yet: the rise, fall and persistence of the BCG Matrix

    OpenAIRE

    Dag Øivind Madsen

    2017-01-01

    The BCG Matrix was introduced almost 50 years ago, and is today considered one of the most iconic strategic planning techniques. Using management fashion theory as a theoretical lens, this paper examines the historical rise, fall and persistence of the BCG Matrix. The analysis highlights the role played by fashion-setting actors (e.g., consultants, business schools and business media) in the rise of the BCG Matrix. However, over time, portfolio planning models such as the BCG Matrix were atta...

  5. Ceramic matrix and resin matrix composites: A comparison

    Science.gov (United States)

    Hurwitz, Frances I.

    1987-01-01

    The underlying theory of continuous fiber reinforcement of ceramic matrix and resin matrix composites, their fabrication, microstructure, physical and mechanical properties are contrasted. The growing use of organometallic polymers as precursors to ceramic matrices is discussed as a means of providing low temperature processing capability without the fiber degradation encountered with more conventional ceramic processing techniques. Examples of ceramic matrix composites derived from particulate-filled, high char yield polymers and silsesquioxane precursors are provided.

  6. Ceramic matrix and resin matrix composites - A comparison

    Science.gov (United States)

    Hurwitz, Frances I.

    1987-01-01

    The underlying theory of continuous fiber reinforcement of ceramic matrix and resin matrix composites, their fabrication, microstructure, physical and mechanical properties are contrasted. The growing use of organometallic polymers as precursors to ceramic matrices is discussed as a means of providing low temperature processing capability without the fiber degradation encountered with more conventional ceramic processing techniques. Examples of ceramic matrix composites derived from particulate-filled, high char yield polymers and silsesquioxane precursors are provided.

  7. Matrix exponential based discriminant locality preserving projections for feature extraction.

    Science.gov (United States)

    Lu, Gui-Fu; Wang, Yong; Zou, Jian; Wang, Zhongqun

    2018-01-01

    Discriminant locality preserving projections (DLPP), which has shown good performance in pattern recognition, is a feature extraction algorithm based on manifold learning. However, DLPP suffers from the well-known small sample size (SSS) problem, where the number of samples is less than the dimension of the samples. In this paper, we propose a novel matrix exponential based discriminant locality preserving projections (MEDLPP). The proposed MEDLPP method can address the SSS problem elegantly since the matrix exponential of a symmetric matrix is always positive definite. Nevertheless, the computational complexity of MEDLPP is high since it needs to solve a large matrix exponential eigenproblem. Therefore, we also present an efficient algorithm for solving MEDLPP. Moreover, the main idea for solving MEDLPP efficiently is also generalized to other matrix exponential based methods. The experimental results on several data sets demonstrate that the proposed algorithm outperforms many state-of-the-art discriminant analysis methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
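
    A minimal sketch of the property MEDLPP relies on, assuming a generic scatter-type matrix: the matrix exponential of a symmetric matrix is symmetric positive definite, hence invertible even when the scatter matrix itself is singular in the small-sample-size regime. This is a generic illustration, not the MEDLPP algorithm.

```python
import numpy as np
from scipy.linalg import expm

# Illustration only: for a symmetric matrix S, expm(S) is symmetric positive definite
# even when S is singular, which sidesteps the SSS problem of scatter matrices.

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 50))              # 5 samples in 50 dimensions (n << d)
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc                                 # 50 x 50 scatter matrix, rank <= 4 (singular)

print(np.linalg.matrix_rank(S))               # 4 -> S is not invertible
E = expm(S)                                   # matrix exponential of the scatter matrix
print(np.all(np.linalg.eigvalsh(E) > 0))      # True -> expm(S) is positive definite
```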

  8. Matrix product states for interacting particles without hardcore constraints

    Science.gov (United States)

    Chatterjee, Amit Kumar; Mohanty, P. K.

    2017-12-01

    We construct matrix product steady states for a class of interacting particle systems where particles do not obey hardcore exclusion, meaning each site can hold any number of particles, subject to the global conservation of the total number of particles in the system. To represent the arbitrary occupancy of the sites, the matrix product ansatz here requires an infinite set of matrices, which in turn leads to an algebra involving an infinite number of matrix equations. We show that these matrix equations can, in fact, be reduced to a single functional relation when the matrices are parametric functions of the representative occupation number. We demonstrate this matrix formulation for a class of stochastic particle hopping processes on a one-dimensional periodic lattice where hop rates depend on the occupation numbers of the departure site and its neighbors within a finite range; this includes some well-known stochastic processes such as the totally or partially asymmetric zero range process, the misanthrope process, and the finite range process.
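
    Schematically, the kind of ansatz described here assigns to every configuration (n_1, ..., n_L) of a periodic lattice a stationary weight written as a trace over site matrices indexed by the (unbounded) occupation numbers; the notation below is generic, not the paper's specific representation.

```latex
% Generic matrix product ansatz for a periodic chain with unbounded occupations n_i:
P(n_1, n_2, \ldots, n_L) \;\propto\; \operatorname{Tr}\!\left[ X_{n_1} X_{n_2} \cdots X_{n_L} \right],
\qquad n_i \in \{0, 1, 2, \ldots\}
```

    One matrix X_n is required for every occupation number n, which is the infinite set of matrices the abstract refers to.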

  9. Content validity and its estimation

    Directory of Open Access Journals (Sweden)

    Yaghmale F

    2003-04-01

    Full Text Available Background: Measuring the content validity of instruments is important. This type of validity can help to ensure construct validity and give confidence to readers and researchers about instruments. Content validity refers to the degree to which the instrument covers the content that it is supposed to measure. For content validity, two judgments are necessary: the measurable extent of each item for defining the traits, and the set of items that represents all aspects of the traits. Purpose: To develop a content-valid scale for assessing experience with computer usage. Methods: First, a review of 2 volumes of the International Journal of Nursing Studies was conducted; only 1 article out of 13 documented content validity, and it did so by a 4-point content validity index (CVI) and the judgment of 3 experts. Then a scale with 38 items was developed. The experts were asked to rate each item based on relevance, clarity, simplicity and ambiguity on the four-point scale. The Content Validity Index (CVI) for each item was determined. Result: Of the 38 items, those with CVI over 0.75 remained and the rest were discarded, resulting in a 25-item scale. Conclusion: Although documenting the content validity of an instrument may seem expensive in terms of time and human resources, its importance warrants greater attention when a valid assessment instrument is to be developed. Keywords: Content Validity, Measuring Content Validity
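
    A minimal sketch of an item-level CVI computed as the proportion of experts rating the item as relevant (3 or 4 on the 4-point scale), with the 0.75 retention cut-off used in the study; the ratings below are invented for illustration.

```python
# Item-level content validity index (CVI): proportion of experts rating the item
# 3 or 4 on the 4-point relevance scale; items with CVI <= 0.75 are discarded.
# The ratings are invented for illustration.

expert_ratings = {
    "item_01": [4, 4, 3],   # CVI = 1.00 -> keep
    "item_02": [4, 2, 3],   # CVI = 0.67 -> discard
    "item_03": [3, 4, 4],   # CVI = 1.00 -> keep
}

def cvi(ratings):
    return sum(r >= 3 for r in ratings) / len(ratings)

retained = [item for item, ratings in expert_ratings.items() if cvi(ratings) > 0.75]
print(retained)   # ['item_01', 'item_03']
```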

  10. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  11. Adapting the nominal group technique for priority setting of evidence-practice gaps in implementation science

    Directory of Open Access Journals (Sweden)

    Nicole M. Rankin

    2016-08-01

    Full Text Available Abstract Background There are a variety of methods for priority setting in health research but few studies have addressed how to prioritise the gaps that exist between research evidence and clinical practice. This study aimed to build a suite of robust, evidence-based techniques and tools for use in implementation science projects. We applied the priority setting methodology in lung cancer care as an example. Methods We reviewed existing techniques and tools for priority setting in health research and the criteria used to prioritise items. An expert interdisciplinary consensus group comprised of health service, cancer and nursing researchers iteratively reviewed and adapted the techniques and tools. We tested these on evidence-practice gaps identified for lung cancer. The tools were pilot tested and finalised. A brief process evaluation was conducted. Results We based our priority setting on the Nominal Group Technique (NGT). The adapted tools included a matrix for individuals to privately rate priority gaps; the same matrix was used for group discussion and reaching consensus. An investment exercise was used to validate allocation of priorities across the gaps. We describe the NGT process, criteria and tool adaptations and process evaluation results. Conclusions The modified NGT process, criteria and tools contribute to building a suite of methods that can be applied in prioritising evidence-practice gaps. These methods could be adapted for other health settings within the broader context of implementation science projects.

  12. Matrix matters: differences of grand skink metapopulation parameters in native tussock grasslands and exotic pasture grasslands.

    Directory of Open Access Journals (Sweden)

    Konstanze Gebauer

    Full Text Available Modelling metapopulation dynamics is a potentially very powerful tool for conservation biologists. In recent years, scientists have broadened the range of variables incorporated into metapopulation modelling from using almost exclusively habitat patch size and isolation, to the inclusion of attributes of the matrix and habitat patch quality. We investigated the influence of habitat patch and matrix characteristics on the metapopulation parameters of a highly endangered lizard species, the New Zealand endemic grand skink (Oligosoma grande), taking into account incomplete detectability. The predictive ability of the developed metapopulation model was assessed through cross-validation of the data and with an independent data set. Grand skinks occur on scattered rock outcrops surrounded by indigenous tussock (bunch) grasslands and pasture grasslands, therefore implying a metapopulation structure. We found that the type of matrix surrounding the habitat patch was equally as important as the size of the habitat patch for estimating occupancy, colonisation and extinction probabilities. Additionally, the type of matrix was more important than the physical distance between habitat patches for colonisation probabilities. Detection probability differed between habitat patches in the two matrix types and between habitat patches with different attributes such as habitat patch composition and abundance of vegetation on the outcrop. The developed metapopulation models can now be used for management decisions on area protection, monitoring, and the selection of translocation sites for the grand skink. Our study showed that it is important to incorporate not only habitat patch size and distance between habitat patches, but also those matrix type and habitat patch attributes which are vital in the ecology of the target species.

  13. Google matrix of Twitter

    Science.gov (United States)

    Frahm, K. M.; Shepelyansky, D. L.

    2012-10-01

    We construct the Google matrix of the entire Twitter network, dated July 2009, and analyze its spectrum and eigenstate properties including the PageRank and CheiRank vectors and 2DRanking of all nodes. Our studies show much stronger inter-connectivity between top PageRank nodes for the Twitter network compared to the networks of Wikipedia and British Universities studied previously. Our analysis allows us to locate the top Twitter users who control the information flow on the network. We argue that this small fraction of the whole number of users, which can be viewed as the social network elite, plays the dominant role in the process of opinion formation on the network.
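
    A minimal sketch of forming a Google matrix from a directed adjacency matrix and extracting its PageRank vector by power iteration, on a tiny invented graph rather than the Twitter data; the damping factor 0.85 is the conventional choice, not necessarily the one used in the paper.

```python
import numpy as np

# Build the Google matrix G = alpha * S + (1 - alpha)/N from a tiny directed graph
# and compute its PageRank vector by power iteration (toy example, not the Twitter data).

A = np.array([[0, 1, 1, 0],    # A[i, j] = 1 if node i links to node j
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

N = A.shape[0]
alpha = 0.85
out_deg = A.sum(axis=1)

# Column-stochastic matrix S: each column gives the link distribution of one node;
# dangling nodes (no out-links) are spread uniformly over all nodes.
S = np.where(out_deg[:, None] > 0, A / np.maximum(out_deg, 1)[:, None], 1.0 / N).T
G = alpha * S + (1 - alpha) / N * np.ones((N, N))

p = np.full(N, 1.0 / N)
for _ in range(100):           # power iteration toward the leading eigenvector
    p = G @ p
print(p / p.sum())             # PageRank vector (sums to 1)
```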

  14. Matrix membranes and integrability

    Energy Technology Data Exchange (ETDEWEB)

    Zachos, C. [Argonne National Lab., IL (United States); Fairlie, D. [University of Durham (United Kingdom). Dept. of Mathematical Sciences; Curtright, T. [University of Miami, Coral Gables, FL (United States). Dept. of Physics

    1997-06-01

    This is a pedagogical digest of results reported in Curtright, Fairlie, & Zachos 1997, and an explicit implementation of Euler's construction for the solution of the Poisson Bracket dual Nahm equation. But it does not cover 9 and 10-dimensional systems, and subsequent progress on them (Fairlie 1997). Cubic interactions are considered in 3 and 7 space dimensions, respectively, for bosonic membranes in Poisson Bracket form. Their symmetries and vacuum configurations are explored. Their associated first order equations are transformed to Nahm's equations, and are hence seen to be integrable, for the 3-dimensional case, by virtue of the explicit Lax pair provided. Most constructions introduced also apply to matrix commutator or Moyal Bracket analogs.

  15. Light cone matrix product

    Energy Technology Data Exchange (ETDEWEB)

    Hastings, Matthew B [Los Alamos National Laboratory

    2009-01-01

    We show how to combine the light-cone and matrix product algorithms to simulate quantum systems far from equilibrium for long times. For the case of the XXZ spin chain at Δ = 0.5, we simulate to a time of ≈ 22.5. While part of the long simulation time is due to the use of the light-cone method, we also describe a modification of the infinite time-evolving bond decimation algorithm with improved numerical stability, and we describe how to incorporate symmetry into this algorithm. While statistical sampling error means that we are not yet able to make a definite statement, the behavior of the simulation at long times indicates the appearance of either 'revivals' in the order parameter as predicted by Hastings and Levitov (e-print arXiv:0806.4283) or of a distinct shoulder in the decay of the order parameter.

  16. Typicality in random matrix product states

    Science.gov (United States)

    Garnerone, Silvano; de Oliveira, Thiago R.; Zanardi, Paolo

    2010-03-01

    Recent results suggest that the use of ensembles in statistical mechanics may not be necessary for isolated systems, since typically the states of the Hilbert space would have properties similar to those of the ensemble. Nevertheless, it is often argued that most of the states of the Hilbert space are nonphysical and not good descriptions of realistic systems. Therefore, to better understand the actual power of typicality it is important to ask if it is also a property of a set of physically relevant states. Here we address this issue, studying if and how typicality emerges in the set of matrix product states. We show analytically that typicality occurs for the expectation value of subsystems’ observables when the rank of the matrix product state scales polynomially with the size of the system with a power greater than 2. We illustrate this result numerically and present some indications that typicality may appear already for a linear scaling of the rank of the matrix product state.

  17. A Matrix Splitting Method for Composite Function Minimization

    KAUST Repository

    Yuan, Ganzhao

    2016-12-07

    Composite function minimization captures a wide spectrum of applications in both computer vision and machine learning. It includes bound constrained optimization and cardinality regularized optimization as special cases. This paper proposes and analyzes a new Matrix Splitting Method (MSM) for minimizing composite functions. It can be viewed as a generalization of the classical Gauss-Seidel method and the Successive Over-Relaxation method for solving linear systems in the literature. Incorporating a new Gaussian elimination procedure, the matrix splitting method achieves state-of-the-art performance. For convex problems, we establish the global convergence, convergence rate, and iteration complexity of MSM, while for non-convex problems, we prove its global convergence. Finally, we validate the performance of our matrix splitting method on two particular applications: nonnegative matrix factorization and cardinality regularized sparse coding. Extensive experiments show that our method outperforms existing composite function minimization techniques in terms of both efficiency and efficacy.
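
    For context, a minimal sketch of the classical Gauss-Seidel iteration written explicitly as a matrix splitting A = M - N, with M the lower-triangular part of A; this illustrates only the splitting viewpoint the abstract builds on, not the MSM algorithm for composite objectives.

```python
import numpy as np
from scipy.linalg import solve_triangular

# Gauss-Seidel as the matrix splitting A = M - N with M = D + L (lower triangle of A):
# iterate x_{k+1} = M^{-1} (b + N x_k). Splitting viewpoint only, not MSM itself.

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([15.0, 10.0, 10.0])

M = np.tril(A)          # D + L
N = M - A               # negated strict upper triangle

x = np.zeros_like(b)
for _ in range(50):
    x = solve_triangular(M, b + N @ x, lower=True)

print(x, np.allclose(A @ x, b))   # converges to the solution of A x = b
```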

  18. Homolumo Gap and Matrix Model

    CERN Document Server

    Andric, I; Jurman, D; Nielsen, H B

    2007-01-01

    We discuss a dynamical matrix model by which a probability distribution is associated with Gaussian ensembles from random matrix theory. We interpret the matrix M as a Hamiltonian representing the interaction of a bosonic system with a single fermion. We show that a system of second-quantized fermions influences the ground state of the whole system by producing a gap between the highest occupied eigenvalue and the lowest unoccupied eigenvalue.

  19. Nonnegative Matrix Factorizations Performing Object Detection and Localization

    Directory of Open Access Journals (Sweden)

    G. Casalino

    2012-01-01

    Full Text Available We study the problem of detecting and localizing objects in still, gray-scale images, making use of the part-based representation provided by nonnegative matrix factorizations. Nonnegative matrix factorization is an emerging example of subspace methods, able to extract interpretable parts from a set of template image objects and then use them additively to describe individual objects. In this paper, we present a prototype system based on several nonnegative factorization algorithms, which differ in the additional properties added to the nonnegative representation of the data, in order to investigate whether any additional constraint produces better results in general object detection via nonnegative matrix factorizations.
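
    A minimal sketch of the generic part-based factorization X ≈ WH using scikit-learn's NMF on random non-negative data standing in for flattened gray-scale templates; it shows the decomposition only, not the paper's detection and localization pipeline.

```python
import numpy as np
from sklearn.decomposition import NMF

# Generic part-based decomposition X ~ W H of non-negative data (stand-in for the
# gray-scale template images); not the paper's detection/localization system.

rng = np.random.default_rng(0)
X = rng.random((100, 16 * 16))        # 100 "images" of 16x16 pixels, flattened, non-negative

model = NMF(n_components=9, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)            # per-image activations of the 9 parts
H = model.components_                 # the 9 additive "parts" (basis images)

print(W.shape, H.shape)                                # (100, 9) (9, 256)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))   # relative reconstruction error
```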

  20. Matrix coordinate Bethe Ansatz: applications to XXZ and ASEP models

    Energy Technology Data Exchange (ETDEWEB)

    Crampe, N [Laboratoire Charles Coulomb, UMR 5221, Universite Montpellier 2, F-34095 Montpellier (France); Ragoucy, E [Laboratoire de Physique Theorique LAPTH, CNRS and Universite de Savoie, 9 chemin de Bellevue, BP 110, F-74941 Annecy-le-Vieux Cedex (France); Simon, D, E-mail: nicolas.crampe@univ-montp2.fr, E-mail: ragoucy@lapp.in2p3.fr, E-mail: damien.simon@upmc.fr [LPMA, Universite Pierre et Marie Curie, Case Courrier 188, 4 place Jussieu, 75252 Paris Cedex 05 (France)

    2011-10-07

    We present the construction of the full set of eigenvectors of the open asymmetric simple exclusion process (ASEP) and XXZ models with special constraints on the boundaries. The method combines both recent constructions of coordinate Bethe Ansatz and the old method of matrix Ansatz specific to the ASEP. This 'matrix coordinate Bethe Ansatz' can be viewed as a non-commutative coordinate Bethe Ansatz, the non-commutative part being related to the algebra appearing in the matrix Ansatz. (paper)

  1. An iterative method to invert the LTSn matrix

    Energy Technology Data Exchange (ETDEWEB)

    Cardona, A.V.; Vilhena, M.T. de [UFRGS, Porto Alegre (Brazil)

    1996-12-31

    Recently, Vilhena and Barichello proposed the LTSn method to solve, analytically, the discrete ordinates problem (Sn problem) in transport theory. The main feature of this method consists in applying the Laplace transform to the set of Sn equations and solving the resulting algebraic system for the transport flux. Barichello solved the linear system containing the parameter s by applying the definition of matrix inversion, exploiting the structure of the LTSn matrix. In this work, a new scheme to invert the LTSn matrix is proposed, decomposing it into blocks and recursively inverting these blocks.
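
    A generic sketch of inverting a matrix by a 2x2 block decomposition via the Schur complement, which conveys the flavour of "decompose into blocks and invert recursively"; it is not the specific LTSn scheme of the paper.

```python
import numpy as np

# Generic 2x2 block inversion via the Schur complement S = D - C A^{-1} B:
#   [[A, B],       [[A^{-1} + A^{-1} B S^{-1} C A^{-1},  -A^{-1} B S^{-1}],
#    [C, D]]^-1  =  [-S^{-1} C A^{-1},                     S^{-1}       ]]
# Illustrates recursive block inversion in general, not the LTSn scheme itself.

def block_inverse(M: np.ndarray, k: int) -> np.ndarray:
    A, B = M[:k, :k], M[:k, k:]
    C, D = M[k:, :k], M[k:, k:]
    Ainv = np.linalg.inv(A)
    Sinv = np.linalg.inv(D - C @ Ainv @ B)          # inverse of the Schur complement
    top_left = Ainv + Ainv @ B @ Sinv @ C @ Ainv
    top_right = -Ainv @ B @ Sinv
    bottom_left = -Sinv @ C @ Ainv
    return np.block([[top_left, top_right], [bottom_left, Sinv]])

M = np.random.default_rng(1).random((6, 6)) + 6 * np.eye(6)   # well-conditioned test matrix
print(np.allclose(block_inverse(M, 3), np.linalg.inv(M)))     # True
```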

  2. Deghosting based on the transmission matrix method

    Science.gov (United States)

    Wang, Benfeng; Wu, Ru-Shan; Chen, Xiaohong

    2017-12-01

    As seismic exploration and subsequent exploitation advance, marine acquisition systems with towed streamers have become an important seismic data acquisition method. But the air–water reflective interface can generate surface-related multiples, including ghosts, which can affect the accuracy and performance of subsequent seismic data processing algorithms. Thus, we derive a deghosting method from a new perspective, i.e. using the transmission matrix (T-matrix) method instead of inverse scattering series. The T-matrix-based deghosting algorithm includes all scattering effects and is absolutely convergent. Initially, the effectiveness of the proposed method is demonstrated using synthetic data obtained from a designed layered model, and its noise-resistant property is also illustrated using noisy synthetic data contaminated by random noise. Numerical examples on complicated data from the open SMAART Pluto model and field marine data further demonstrate the validity and flexibility of the proposed method. After deghosting, low-frequency components are recovered reasonably and the fake high-frequency components are attenuated; the recovered low-frequency components will be useful for the subsequent full waveform inversion. The proposed deghosting method is currently suitable for two-dimensional towed-streamer cases with accurate constant depth information, and its extension to variable-depth streamers in three-dimensional cases will be studied in the future.

  3. Validating a Monotonically-Integrated Large Eddy Simulation Code for Subsonic Jet Acoustics

    Science.gov (United States)

    Ingraham, Daniel; Bridges, James

    2017-01-01

    The results of subsonic jet validation cases for the Naval Research Lab's Jet Engine Noise REduction (JENRE) code are reported. Two set points from the Tanna matrix, set point 3 (Ma = 0.5, unheated) and set point 7 (Ma = 0.9, unheated), are attempted on three different meshes. After a brief discussion of the JENRE code and the meshes constructed for this work, the turbulent statistics for the axial velocity are presented and compared to experimental data, with favorable results. Preliminary simulations for set point 23 (Ma = 0.5, Tj=T1 = 1.764) on one of the meshes are also described. Finally, the proposed configuration for the far-field noise prediction with JENRE's Ffowcs Williams-Hawkings solver is detailed.

  4. System Matrix Analysis for Computed Tomography Imaging.

    Directory of Open Access Journals (Sweden)

    Liubov Flores

    Full Text Available In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data.

  5. Photoacoustic measurement of lutein in biological matrix

    Science.gov (United States)

    Bicanic, D.; Luterotti, S.; Becucci, M.; Fogliano, V.; Versloot, P.

    2005-06-01

    Photoacoustic (PA) spectroscopy was applied for the first time to quantify lutein in a complex biological matrix. Standard addition of lutein to a biological low-lutein matrix was used for the calibration. The PA signal was found to be linearly proportional (R > 0.98) to lutein concentration up to 0.3% (w/w). The dynamic range of concentrations extends to 1% (w/w) lutein. For the given experimental set-up, the responsivity of the PA detector within the range of linearity was estimated at 1.1 mV per 1% lutein. The precision of repeated analyses is good, with average RSD values of 4% and 5% for blanks and spiked samples, respectively. The analytical parameters indicate that the PA method is fast and sensitive enough for the quantification of lutein in supplementary drugs and in lutein-rich foods.

  6. Prediction of MHC class II binding affinity using SMM-align, a novel stabilization matrix alignment method

    DEFF Research Database (Denmark)

    Nielsen, Morten; Lundegaard, Claus; Lund, Ole

    2007-01-01

    ... the correct alignment of a peptide in the binding groove is a crucial part of identifying the core of an MHC class II binding motif. Here, we present a novel stabilization matrix alignment method, SMM-align, that allows for direct prediction of peptide:MHC binding affinities. The predictive performance...... of the method is validated on a large MHC class II benchmark data set covering 14 HLA-DR (human MHC) and three mouse H2-IA alleles. RESULTS: The predictive performance of the SMM-align method was demonstrated to be superior to that of the Gibbs sampler, TEPITOPE, SVRMHC, and MHCpred methods. Cross validation...... by favoring binding registers with a minimum PFR length of two amino acids. Visualizing the binding motif as obtained by the SMM-align and TEPITOPE methods highlights a series of fundamental discrepancies between the two predicted motifs. For the DRB1*1302 allele, for instance, the TEPITOPE method favors basic...

  7. Ceramic matrix composite article and process of fabricating a ceramic matrix composite article

    Science.gov (United States)

    Cairo, Ronald Robert; DiMascio, Paul Stephen; Parolini, Jason Robert

    2016-01-12

    A ceramic matrix composite article and a process of fabricating a ceramic matrix composite are disclosed. The ceramic matrix composite article includes a matrix distribution pattern formed by a manifold and ceramic matrix composite plies laid up on the matrix distribution pattern, includes the manifold, or a combination thereof. The manifold includes one or more matrix distribution channels operably connected to a delivery interface, the delivery interface configured for providing matrix material to one or more of the ceramic matrix composite plies. The process includes providing the manifold, forming the matrix distribution pattern by transporting the matrix material through the manifold, and contacting the ceramic matrix composite plies with the matrix material.

  8. Highly accurate first-principles benchmark data sets for the parametrization and validation of density functional and other approximate methods. Derivation of a robust, generally applicable, double-hybrid functional for thermochemistry and thermochemical kinetics.

    Science.gov (United States)

    Karton, Amir; Tarnopolsky, Alex; Lamère, Jean-François; Schatz, George C; Martin, Jan M L

    2008-12-18

    We present a number of near-exact, nonrelativistic, Born-Oppenheimer reference data sets for the parametrization of more approximate methods (such as DFT functionals). The data were obtained by means of the W4 ab initio computational thermochemistry protocol, which has a 95% confidence interval well below 1 kJ/mol. Our data sets include W4-08, which are total atomization energies of over 100 small molecules that cover varying degrees of nondynamical correlations, and DBH24-W4, which are W4 theory values for Truhlar's set of 24 representative barrier heights. The usual procedure of comparing calculated DFT values with experimental atomization energies is hampered by comparatively large experimental uncertainties in many experimental values and compounds errors due to deficiencies in the DFT functional with those resulting from neglect of relativity and finite nuclear mass. Comparison with accurate, explicitly nonrelativistic, ab initio data avoids these issues. We then proceed to explore the performance of B2x-PLYP-type double hybrid functionals for atomization energies and barrier heights. We find that the optimum hybrids for hydrogen-transfer reactions, heavy-atoms transfers, nucleophilic substitutions, and unimolecular and recombination reactions are quite different from one another: out of these subsets, the heavy-atom transfer reactions are by far the most sensitive to the percentages of Hartree-Fock-type exchange y and MP2-type correlation x in an (x, y) double hybrid. The (42,72) hybrid B2K-PLYP, as reported in a preliminary communication, represents the best compromise between thermochemistry and hydrogen-transfer barriers, while also yielding excellent performance for nucleophilic substitutions. By optimizing for best overall performance on both thermochemistry and the DBH24-W4 data set, however, we find a new (36,65) hybrid which we term B2GP-PLYP. At a slight expense in performance for hydrogen-transfer barrier heights and nucleophilic substitutions, we
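
    For orientation, the generic exchange-correlation form usually meant by an (x, y) double hybrid, with x the fraction of MP2-type correlation and y the fraction of Hartree-Fock-type exchange; this is written as a standard B2PLYP-style expression and may differ in detail from the exact definitions of the functionals named above.

```latex
% Generic (x, y) double-hybrid form: y = HF-exchange fraction, x = MP2-correlation fraction.
% For the (36,65) hybrid B2GP-PLYP quoted above, x = 0.36 and y = 0.65.
E_{xc} \;=\; y\,E_x^{\mathrm{HF}} \;+\; (1-y)\,E_x^{\mathrm{DFT}} \;+\; x\,E_c^{\mathrm{MP2}} \;+\; (1-x)\,E_c^{\mathrm{DFT}}
```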

  9. An Application of Matrix Multiplication

    Indian Academy of Sciences (India)

    IAS Admin

    linguistics, graph theory applications to biological networks, social networks, electrical engineering. We are well aware of the ever increasing importance of graphical and matrix representations in applications to several day-to-day real life problems. The interconnectedness of the notion of graph, matrix, probability, limits, ...

  10. Matrix Methods to Analytic Geometry.

    Science.gov (United States)

    Bandy, C.

    1982-01-01

    The use of basis matrix methods to rotate axes is detailed. It is felt that persons who have need to rotate axes often will find that the matrix method saves considerable work. One drawback is that most students first learning to rotate axes will not yet have studied linear algebra. (MP)

  11. How to Study a Matrix

    Science.gov (United States)

    Jairam, Dharmananda; Kiewra, Kenneth A.; Kauffman, Douglas F.; Zhao, Ruomeng

    2012-01-01

    This study investigated how best to study a matrix. Fifty-three participants studied a matrix topically (1 column at a time), categorically (1 row at a time), or in a unified way (all at once). Results revealed that categorical and unified study produced higher: (a) performance on relationship and fact tests, (b) study material satisfaction, and…

  12. Developments in Random Matrix Theory

    OpenAIRE

    Snaith, N. C.; Forrester, P. J.; Verbaarschot, J. J. M.

    2003-01-01

    In this preface to the Journal of Physics A, Special Edition on Random Matrix Theory, we give a review of the main historical developments of random matrix theory. A short summary of the papers that appear in this special edition is also given.

  13. QCD and random matrix theory

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, A.D. [Niels Bohr Inst., Copenhagen (Denmark)

    1998-08-10

    Chiral random matrix theory has recently been shown to provide a tool useful for both modeling chiral symmetry restoration in QCD and for providing analytic descriptions of the microscopic spectral content of lattice gauge simulations. The basic ideas of chiral random matrix theory and some recent results are discussed. (orig.) 24 refs.

  14. Refining Multivariate Value Set Bounds

    Science.gov (United States)

    Smith, Luke Alexander

    Over finite fields, if the image of a polynomial map is not the entire field, then its cardinality can be bounded above by a significantly smaller value. Earlier results bound the cardinality of the value set using the degree of the polynomial, but more recent results make use of the powers of all monomials. In this paper, we explore the geometric properties of the Newton polytope and show how they allow for tighter upper bounds on the cardinality of the multivariate value set. We then explore a method which allows for even stronger upper bounds, regardless of whether one uses the multivariate degree or the Newton polytope to bound the value set. Effectively, this provides an alternate proof of Kosters' degree bound, an improved Newton polytope-based bound, and an improvement of a degree matrix-based result given by Zan and Cao.

  15. Strategic approaches for assessment and minimization of matrix effect in ligand-binding assays.

    Science.gov (United States)

    Shih, Judy Y; Patel, Vimal; Ma, Mark

    2014-04-01

    Although substantial advances have been made in ligand-binding assays (LBA) for biotherapeutics in the past decade, there are still gaps that need to be addressed, especially in the context of understanding matrix effect and its root causes. Critical and in-depth characterization of matrix effect can provide valuable knowledge of the LBA limitations for proper results interpretation. This article illustrates several strategic approaches with regard to identifying the root cause of matrix effect and practical solutions, including recognizing the confounding factors associated with matrix effect, selection of proper reagents to avoid matrix effect, and a systematic approach in dealing with matrix effect in method development and validation. These strategic approaches have enhanced the management of matrix effect in LBA.

  16. Quantum mechanics in matrix form

    CERN Document Server

    Ludyk, Günter

    2018-01-01

    This book gives an introduction to quantum mechanics with the matrix method. Heisenberg's matrix mechanics is described in detail. The fundamental equations are derived by algebraic methods using matrix calculus. Only a brief description of Schrödinger's wave mechanics (treated exclusively in most books) is given, to show its equivalence to Heisenberg's matrix method. In the first part the historical development of quantum theory by Planck, Bohr and Sommerfeld is sketched, followed by the ideas and methods of Heisenberg, Born and Jordan. Then Pauli's spin and exclusion principles are treated. Pauli's exclusion principle leads to the structure of atoms. Finally, Dirac's relativistic quantum mechanics is briefly presented. Matrices and matrix equations are today easy to handle when implementing numerical algorithms using standard software such as MAPLE and Mathematica.

  17. Machining of Metal Matrix Composites

    CERN Document Server

    2012-01-01

    Machining of Metal Matrix Composites provides the fundamentals and recent advances in the study of machining of metal matrix composites (MMCs). Each chapter is written by an international expert in this important field of research. Machining of Metal Matrix Composites gives the reader information on machining of MMCs with a special emphasis on aluminium matrix composites. Chapter 1 provides the mechanics and modelling of chip formation for traditional machining processes. Chapter 2 is dedicated to surface integrity when machining MMCs. Chapter 3 describes the machinability aspects of MMCs. Chapter 4 contains information on traditional machining processes and Chapter 5 is dedicated to the grinding of MMCs. Chapter 6 describes the dry cutting of MMCs with SiC particulate reinforcement. Finally, Chapter 7 is dedicated to computational methods and optimization in the machining of MMCs. Machining of Metal Matrix Composites can serve as a useful reference for academics, manufacturing and materials researchers, manu...

  18. Assembly of Fibronectin Extracellular Matrix

    Science.gov (United States)

    Singh, Purva; Carraher, Cara; Schwarzbauer, Jean E.

    2013-01-01

    In the process of matrix assembly, multivalent extracellular matrix (ECM) proteins are induced to self-associate and to interact with other ECM proteins to form fibrillar networks. Matrix assembly is usually initiated by ECM glycoproteins binding to cell surface receptors, such as fibronectin (FN) dimers binding to α5β1 integrin. Receptor binding stimulates FN self-association mediated by the N-terminal assembly domain and organizes the actin cytoskeleton to promote cell contractility. FN conformational changes expose additional binding sites that participate in fibril formation and in conversion of fibrils into a stabilized, insoluble form. Once assembled, the FN matrix impacts tissue organization by contributing to the assembly of other ECM proteins. Here, we describe the major steps, molecular interactions, and cellular mechanisms involved in assembling FN dimers into fibrillar matrix while highlighting important issues and major questions that require further investigation. PMID:20690820

  19. Sparse matrix test collections

    Energy Technology Data Exchange (ETDEWEB)

    Duff, I.

    1996-12-31

    This workshop will discuss plans for coordinating and developing sets of test matrices for the comparison and testing of sparse linear algebra software. We will talk of plans for the next release (Release 2) of the Harwell-Boeing Collection and recent work on improving the accessibility of this Collection and others through the World Wide Web. There will only be three talks of about 15 to 20 minutes followed by a discussion from the floor.

  20. Structural and dynamic studies of two antigenic loops from haemagglutinin: a relaxation matrix approach.

    Science.gov (United States)

    Kieffer, B; Koehl, P; Plaue, S; Lefèvre, J F

    1993-01-01

    We have investigated the dynamics and structural behaviour of two antigenic peptides using 1H NMR. The two cyclic peptides mimic the antigenic site A of influenza haemagglutinin protein; they only differ in the way they were cyclized and in the size of their respective linkers. Homonuclear relaxation parameters extracted from a complete NOE matrix were interpreted in terms of local dynamics. A set of distance constraints was deduced from these parameters which allowed 3D models to be constructed using distance geometry. NOE back-calculation was used to check the validity of the final models. Strong variations of internal motion amplitude have been found in both peptides along their backbone. Motions with high amplitudes have been localized in the Gly-Pro-Gly sequence which forms a beta-turn in both structures.

  1. Independent set dominating sets in bipartite graphs

    Directory of Open Access Journals (Sweden)

    Bohdan Zelinka

    2005-01-01

    Full Text Available The paper continues the study of independent set dominating sets in graphs which was started by E. Sampathkumar. A subset \(D\) of the vertex set \(V(G)\) of a graph \(G\) is called a set dominating set (shortly sd-set) in \(G\), if for each set \(X \subseteq V(G)-D\) there exists a set \(Y \subseteq D\) such that the subgraph \(\langle X \cup Y\rangle\) of \(G\) induced by \(X \cup Y\) is connected. The minimum number of vertices of an sd-set in \(G\) is called the set domination number \(\gamma_s(G)\) of \(G\). An sd-set \(D\) in \(G\) such that \(|D|=\gamma_s(G)\) is called a \(\gamma_s\)-set in \(G\). In this paper we study sd-sets in bipartite graphs which are simultaneously independent. We apply the theory of hypergraphs.

  2. UpSetR: an R package for the visualization of intersecting sets and their properties.

    Science.gov (United States)

    Conway, Jake R; Lex, Alexander; Gehlenborg, Nils

    2017-09-15

    Venn and Euler diagrams are a popular yet inadequate solution for quantitative visualization of set intersections. A scalable alternative to Venn and Euler diagrams for visualizing intersecting sets and their properties is needed. We developed UpSetR, an open source R package that employs a scalable matrix-based visualization to show intersections of sets, their size, and other properties. UpSetR is available at https://github.com/hms-dbmi/UpSetR/ and released under the MIT License. A Shiny app is available at https://gehlenborglab.shinyapps.io/upsetr/ . nils@hms.harvard.edu. Supplementary data are available at Bioinformatics online.
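
    As a language-agnostic illustration of the quantity an UpSet-style matrix plot encodes (UpSetR itself is an R package), the Python sketch below tabulates exclusive intersection sizes for a few made-up sets; set names and elements are placeholders.

```python
from itertools import combinations

# Hypothetical example sets (names and elements are made up)
sets = {
    "A": {"g1", "g2", "g3", "g5"},
    "B": {"g2", "g3", "g6"},
    "C": {"g3", "g5", "g6", "g7"},
}

# For every combination of set names, count the elements belonging to exactly
# that combination -- the bar heights shown above an UpSet-style matrix.
names = list(sets)
universe = set().union(*sets.values())
membership = {e: frozenset(n for n in names if e in sets[n]) for e in universe}

for r in range(1, len(names) + 1):
    for combo in combinations(names, r):
        count = sum(1 for m in membership.values() if m == frozenset(combo))
        if count:
            print(" & ".join(combo), count)
```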

  3. Ab initio thermochemistry with high-level isodesmic corrections: validation of the ATOMIC protocol for a large set of compounds with first-row atoms (H, C, N, O, F).

    Science.gov (United States)

    Bakowies, Dirk

    2009-10-29

    The recently proposed ATOMIC protocol is a fully ab initio thermochemical approach designed to provide accurate atomization energies for molecules with well-defined valence structures. It makes consistent use of the concept of bond-separation reactions to supply high-level precomputed bond increments which correct for errors of lower-level models. The present work extends the approach to the calculation of standard heats of formation and validates it by comparison to experimental and benchmark level ab initio data reported in the literature. Standard heats of formation (298 K) have been compiled for a large sample of 173 neutral molecules containing hydrogen and first-row atoms (C, N, O, F), resorting to several previous compilations and to the original experimental literature. Statistical evaluation shows that the simplest implementation of the ATOMIC protocol (composite model C) achieves an accuracy comparable to the popular Gaussian-3 approach and that composite models A and B perform better. Chemical accuracy (1-2 kcal/mol) is normally achieved even for larger systems with about 10 non-hydrogen atoms and for systems with charge-separated valence structures, bearing testimony to the robustness of the bond-separation reaction model. Effects of conformational averaging have been examined in detail for the series of n-alkanes, and our most refined composite model A reproduces experimental heats of formation quantitatively, provided that conformational averaging is properly accounted for. Several cases of larger discrepancy with respect to experiment are discussed, and potential weaknesses of the approach are identified.

  4. A Random Matrix Approach to Credit Risk

    Science.gov (United States)

    Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided. PMID:24853864

  5. A random matrix approach to credit risk.

    Directory of Open Access Journals (Sweden)

    Michael C Münnix

    Full Text Available We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.
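
    A minimal numerical sketch of the idea described above: correlation matrices are drawn from a Wishart-type random matrix ensemble, correlated asset values are generated, and the tail of the portfolio loss distribution is compared with the uncorrelated case. Portfolio size, ensemble parameters, and the default threshold are arbitrary illustrative choices, not the calibration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 50            # obligors in the portfolio
N = 120           # Wishart parameter (columns of the random matrix W)
ensembles = 100   # number of random correlation matrices
paths = 1000      # asset-value scenarios per correlation matrix
threshold = -2.0  # default when the standardized asset value falls below this

losses = []
for _ in range(ensembles):
    # Random correlation matrix C = W W^T / N, rescaled to unit diagonal
    W = rng.standard_normal((K, N))
    C = W @ W.T / N
    d = np.sqrt(np.diag(C))
    C = C / np.outer(d, d)
    X = np.linalg.cholesky(C) @ rng.standard_normal((K, paths))
    losses.append((X < threshold).mean(axis=0))   # default fraction per scenario

losses = np.concatenate(losses)
print("P(loss fraction > 10%), fluctuating correlations:", (losses > 0.10).mean())

# Uncorrelated benchmark for comparison
X0 = rng.standard_normal((K, ensembles * paths))
print("P(loss fraction > 10%), uncorrelated:", ((X0 < threshold).mean(axis=0) > 0.10).mean())
```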

  6. Random matrix techniques in quantum information theory

    Science.gov (United States)

    Collins, Benoît; Nechita, Ion

    2016-01-01

    The purpose of this review is to present some of the latest developments using random techniques, and in particular, random matrix techniques in quantum information theory. Our review is a blend of a rather exhaustive review and of more detailed examples—coming mainly from research projects in which the authors were involved. We focus on two main topics, random quantum states and random quantum channels. We present results related to entropic quantities, entanglement of typical states, entanglement thresholds, the output set of quantum channels, and violations of the minimum output entropy of random channels.

  7. Pseudo-Hermitian random matrix theory

    Energy Technology Data Exchange (ETDEWEB)

    Srivastava, S.C.L. [RIBFG, Variable Energy Cyclotron Centre, 1/AF Bidhan nagar, Kolkata-700 064 (India); Jain, S.R. [NPD, Bhabha Atomic Research Centre, Mumbai-400 085 (India)

    2013-02-15

    The complex extension of quantum mechanics and the discovery of pseudo-unitarily invariant random matrix theory have set the stage for a number of applications of these concepts in physics. We briefly review the basic ideas and present applications to problems in statistical mechanics where new results have become possible. We have found it important to mention the precise directions where advances could be made if further results become available. (Copyright 2013 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  8. Random matrix techniques in quantum information theory

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Benoît, E-mail: collins@math.kyoto-u.ac.jp [Department of Mathematics, Kyoto University, Kyoto 606-8502 (Japan); Département de Mathématique et Statistique, Université d’Ottawa, 585 King Edward, Ottawa, Ontario K1N6N5 (Canada); CNRS, Lyon (France); Nechita, Ion, E-mail: nechita@irsamc.ups-tlse.fr [Zentrum Mathematik, M5, Technische Universität München, Boltzmannstrasse 3, 85748 Garching (Germany); Laboratoire de Physique Théorique, CNRS, IRSAMC, Université de Toulouse, UPS, F-31062 Toulouse (France)

    2016-01-15

    The purpose of this review is to present some of the latest developments using random techniques, and in particular, random matrix techniques in quantum information theory. Our review is a blend of a rather exhaustive review and of more detailed examples—coming mainly from research projects in which the authors were involved. We focus on two main topics, random quantum states and random quantum channels. We present results related to entropic quantities, entanglement of typical states, entanglement thresholds, the output set of quantum channels, and violations of the minimum output entropy of random channels.

  9. Matrix fluid chemistry experiment. Final report June 1998 - March 2003

    Energy Technology Data Exchange (ETDEWEB)

    Smellie, John A.T. [Conterra AB, Luleaa (Sweden); Waber, H. Niklaus [Univ. of Bern (Switzerland). Inst. of Geology; Frape, Shaun K. [Univ. of Waterloo (Canada). Dept. of Earth Sciences

    2003-06-01

    The Matrix Fluid Chemistry Experiment set out to determine the composition and evolution of matrix pore fluids/waters in low permeable rock located at repository depths in the Aespoe Hard Rock Laboratory (HRL). Matrix pore fluids/waters can be highly saline in composition and, if accessible, may influence the near-field groundwater chemistry of a repository system. Characterising pore fluids/waters involved in-situ borehole sampling and analysis integrated with laboratory studies and experiments on rock matrix drill core material. Relating the rate of in-situ pore water accumulation during sampling to the measured rock porosity indicated a hydraulic conductivity of 10^-14–10^-13 m/s for the rock matrix. This was in accordance with earlier estimated predictions. The sampled matrix pore water, brackish in type, mostly represents older palaeo-groundwater mixtures preserved in the rock matrix and dating back to at least the last glaciation. A component of matrix pore 'fluid' is also present. One borehole section suggests a younger groundwater component which has accessed the rock matrix during the experiment. There is little evidence that the salinity of the matrix pore waters has been influenced significantly by fluid inclusion populations hosted by quartz. Crush/leach, cation exchange, pore water diffusion and pore water displacement laboratory experiments were carried out to compare extracted/calculated matrix pore fluids/waters with in-situ sampling. Of these the pore water diffusion experiments appear to be the most promising approach and a recommended site characterisation protocol has been formulated. The main conclusions from the Matrix Fluid Chemistry Experiment are: Groundwater movement within the bedrock hosting the experimental site has been enhanced by increased hydraulic gradients generated by the presence of the tunnel, and to a much lesser extent by the borehole itself. Over experimental timescales (approx. 4 years) solute transport

  10. Validation of Serious Games

    Directory of Open Access Journals (Sweden)

    Katinka van der Kooij

    2015-09-01

    Full Text Available The application of games for behavioral change has seen a surge in popularity but evidence on the efficacy of these games is contradictory. Anecdotal findings seem to confirm their motivational value whereas most quantitative findings from randomized controlled trials (RCT are negative or difficult to interpret. One cause for the contradictory evidence could be that the standard RCT validation methods are not sensitive to serious games’ effects. To be able to adapt validation methods to the properties of serious games we need a framework that can connect properties of serious game design to the factors that influence the quality of quantitative research outcomes. The Persuasive Game Design model [1] is particularly suitable for this aim as it encompasses the full circle from game design to behavioral change effects on the user. We therefore use this model to connect game design features, such as the gamification method and the intended transfer effect, to factors that determine the conclusion validity of an RCT. In this paper we will apply this model to develop guidelines for setting up validation methods for serious games. This way, we offer game designers and researchers handles on how to develop tailor-made validation methods.

  11. ABCD Matrix Method a Case Study

    CERN Document Server

    Seidov, Zakir F; Yahalom, Asher

    2004-01-01

    In the Israeli Electrostatic Accelerator FEL, the distance between the accelerator's end and the wiggler's entrance is about 2.1 m, and a 1.4 MeV electron beam is transported through this space using four similar quadrupoles (FODO-channel). The transfer matrix method (ABCD matrix method) was used for simulating the beam transport; a set of programs was written in several programming languages (MATHEMATICA, MATLAB, MATCAD, MAPLE) and reasonable agreement is demonstrated between experimental results and simulations. Comparison of the ABCD matrix method with the direct "numerical experiments" using EGUN, ELOP, and GPT programs with and without taking into account the space-charge effects showed the agreement to be good enough as well. Also the inverse problem of finding the emittance of the electron beam at the S1 screen position (before the FODO-channel), by using the spot image at the S2 screen position (after the FODO-channel) as a function of quad currents, is considered. Spot and beam at both screens are described as tilted eel...

  12. A Cross-Validation of FDG- and Amyloid-PET Biomarkers in Mild Cognitive Impairment for the Risk Prediction to Dementia due to Alzheimer's Disease in a Clinical Setting.

    Science.gov (United States)

    Iaccarino, Leonardo; Chiotis, Konstantinos; Alongi, Pierpaolo; Almkvist, Ove; Wall, Anders; Cerami, Chiara; Bettinardi, Valentino; Gianolli, Luigi; Nordberg, Agneta; Perani, Daniela

    2017-01-01

    Assessments of brain glucose metabolism (18F-FDG-PET) and cerebral amyloid burden (11C-PiB-PET) in mild cognitive impairment (MCI) have shown highly variable performances when adopted to predict progression to dementia due to Alzheimer's disease (ADD). This study investigates, in a clinical setting, the separate and combined values of 18F-FDG-PET and 11C-PiB-PET in ADD conversion prediction with optimized data analysis procedures. Respectively, we investigate the accuracy of an optimized SPM analysis for 18F-FDG-PET and of standardized uptake value ratio semiquantification for 11C-PiB-PET in predicting ADD conversion in 30 MCI subjects (age 63.57±7.78 years). Fourteen subjects converted to ADD during the follow-up (median 26.5 months, inter-quartile range 30 months). Receiver operating characteristic analyses showed an area under the curve (AUC) of 0.89 and of 0.81 for, respectively, 18F-FDG-PET and 11C-PiB-PET. 18F-FDG-PET, compared to 11C-PiB-PET, showed higher specificity (1.00 versus 0.62, respectively), but lower sensitivity (0.79 versus 1.00). Combining the biomarkers improved classification accuracy (AUC = 0.96). During the follow-up time, all the MCI subjects positive for both PET biomarkers converted to ADD, whereas all the subjects negative for both remained stable. The difference in survival distributions was confirmed by a log-rank test (p = 0.002). These results indicate a very high accuracy in predicting MCI to ADD conversion of both 18F-FDG-PET and 11C-PiB-PET imaging, the former showing optimal performance based on the SPM optimized parametric assessment. Measures of brain glucose metabolism and amyloid load represent extremely powerful diagnostic and prognostic biomarkers with complementary roles in prodromal dementia phase, particularly when tailored to individual cases in clinical settings.
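
    A minimal sketch of how per-biomarker and combined AUC values like those reported above can be computed from per-subject scores. The arrays are random placeholders, and combining the two markers with a logistic-regression score is an assumption made here for illustration, not necessarily the rule used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 30
converted = rng.integers(0, 2, n)        # 1 = progressed to ADD (placeholder labels)
fdg = rng.normal(0, 1, n) - converted    # lower values ~ hypometabolism (placeholder)
pib = rng.normal(0, 1, n) + converted    # higher values ~ amyloid load (placeholder)

print("FDG AUC:", roc_auc_score(converted, -fdg))
print("PiB AUC:", roc_auc_score(converted, pib))

# One possible way to combine the two markers: a logistic-regression score
X = np.column_stack([fdg, pib])
clf = LogisticRegression().fit(X, converted)
print("combined AUC:", roc_auc_score(converted, clf.predict_proba(X)[:, 1]))
```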

  13. New pole placement algorithm - Polynomial matrix approach

    Science.gov (United States)

    Shafai, B.; Keel, L. H.

    1990-01-01

    A simple and direct pole-placement algorithm is introduced for dynamical systems having a block companion matrix A. The algorithm utilizes well-established properties of matrix polynomials. Pole placement is achieved by appropriately assigning coefficient matrices of the corresponding matrix polynomial. This involves only matrix additions and multiplications without requiring matrix inversion. A numerical example is given for the purpose of illustration.
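
    As a toy illustration of the "additions and multiplications only" point above, the sketch below places poles for a single-input system already in scalar companion form, a special case of the block companion structure considered in the paper. The polynomials are made up, and this is not the paper's general block algorithm.

```python
import numpy as np

# Open-loop characteristic polynomial s^3 + a2 s^2 + a1 s + a0 (made-up numbers)
a = np.array([6.0, 11.0, 6.0])                       # [a0, a1, a2], poles -1, -2, -3
A = np.block([[np.zeros((2, 1)), np.eye(2)],
              [-a.reshape(1, 3)]])                   # companion matrix
B = np.array([[0.0], [0.0], [1.0]])

# Desired poles -2, -3, -4  ->  s^3 + 9 s^2 + 26 s + 24
alpha = np.array([24.0, 26.0, 9.0])

# For the companion form the feedback gain is a coefficient difference:
# only additions/subtractions are needed, no matrix inversion.
K = (alpha - a).reshape(1, 3)
print(np.linalg.eigvals(A - B @ K))                  # approx. [-2, -3, -4]
```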

  14. The Clerkship Pediatric Rotation: Does Setting Matter?

    Directory of Open Access Journals (Sweden)

    Natasha Bollegala

    2010-03-01

    Discussion: These results help support the decision of curriculum committees to incorporate the use of community practice settings and inform students and faculty as to the validity of distributed medical education within the field of pediatric medical education.

  15. [Progress on matrix metalloproteinase inhibitors].

    Science.gov (United States)

    Lingling, Jia; Qianbing, Wan

    2017-04-01

    Continuing advances in dentin bonding technology and adhesives revolutionized bonding of resin-based composite restorations. However, hybrid layers created by contemporary dentin adhesives present imperfect durability, and degradation of collagen matrix by endogenous enzymes is a significant factor causing destruction of hybrid layers. Bond durability can be improved by using enzyme inhibitors to prevent collagen degradation and to preserve integrity of collagen matrix. This review summarizes progress on matrix metalloproteinase inhibitors (including chlorhexidine, ethylenediaminetetraacetic acid, quaternary ammonium salt, tetracycline and its derivatives, hydroxamic acid inhibitors, bisphosphonate derivative, and cross-linking agents) and suggests prospects for these compounds.

  16. Hadronic matrix elements for Kaons

    Energy Technology Data Exchange (ETDEWEB)

    Bijnens, Johan [Department of Theoretical Physics 2, Lund University, Soelvegatan 14A, S-22362 Lund (Sweden); Gamiz, Elvira [CAFPE and Departamento de Fisica Teorica y del Cosmos, Universidad de Granada Campus de Fuente Nueva, E-18002 Granada (Spain); Prades, Joaquim [CAFPE and Departamento de Fisica Teorica y del Cosmos, Universidad de Granada Campus de Fuente Nueva, E-18002 Granada (Spain)

    2004-07-01

    We review some work done by us calculating matrix elements for Kaons. Emphasis is put on the matrix elements which are relevant to predict non-leptonic Kaon CP violating observables. In particular, we recall our results for the B_K parameter which governs the K^0–K̄^0 mixing and update our results for ε′_K including estimated all-higher-order CHPT corrections and the new results from recent analytical calculations of the ΔI = 3/2 component. Some comments on future prospects on calculating matrix elements for Kaons are also added.

  17. Teaching the Extracellular Matrix and Introducing Online Databases within a Multidisciplinary Course with i-Cell-MATRIX: A Student-Centered Approach

    Science.gov (United States)

    Sousa, Joao Carlos; Costa, Manuel Joao; Palha, Joana Almeida

    2010-01-01

    The biochemistry and molecular biology of the extracellular matrix (ECM) is difficult to convey to students in a classroom setting in ways that capture their interest. The understanding of the matrix's roles in physiological and pathological conditions will presumably be hampered by insufficient knowledge of its molecular structure.…

  18. The Performance Analysis Based on SAR Sample Covariance Matrix

    Directory of Open Access Journals (Sweden)

    Esra Erten

    2012-03-01

    Full Text Available Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, the statistical description of the data is almost mandatory for its utilization. The complex images acquired over natural media present in general zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. For practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown in different areas to be of high relevance regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in different applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix in terms of multi-channel SAR images is simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well.
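
    A minimal Monte Carlo sketch of the quantity analyzed above: the largest eigenvalue of a sample covariance matrix estimated from zero-mean circular complex Gaussian multi-channel data (so that n·Ĉ is complex-Wishart distributed). The number of channels, the number of looks, and the true covariance are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 3, 25, 5000        # channels, looks (samples), Monte Carlo runs
Sigma = np.eye(p)                 # true covariance (identity as a toy choice)
L = np.linalg.cholesky(Sigma)

max_eig = np.empty(trials)
for t in range(trials):
    # zero-mean circular complex Gaussian samples, shape (p, n)
    z = (rng.standard_normal((p, n)) + 1j * rng.standard_normal((p, n))) / np.sqrt(2)
    x = L @ z
    C_hat = x @ x.conj().T / n    # sample covariance matrix
    max_eig[t] = np.linalg.eigvalsh(C_hat)[-1]

print("mean of the largest eigenvalue:", max_eig.mean())
print("95th percentile:", np.quantile(max_eig, 0.95))
```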

  19. Note on the S-matrix propagation algorithm.

    Science.gov (United States)

    Li, Lifeng

    2003-04-01

    A set of full-matrix recursion formulas for the W --> S variant of the S-matrix algorithm is derived, which includes the recent results of some other authors as a subset. In addition, a special type of symmetry that is often found in the structure of coefficient matrices (W matrices) that appear in boundary-matching conditions is identified and fully exploited for the purpose of increasing computation efficiency. Two tables of floating-point operation (flop) counts for both the new W --> S variant and the old W --> t --> S variant of the S-matrix algorithm are given. Comparisons of flop counts show that in performing S-matrix recursions in the absence of the symmetry, it is more efficient to go directly from W matrices to S matrices. In the presence of the symmetry, however, using t matrices is equally and sometimes more advantageous, provided that the symmetry is utilized.

  20. SANSMIC Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Paula D.; Rudeen, David Keith; Lord, David L.

    2014-08-01

    SANSMIC is solution mining software that was developed and utilized by SNL in its role as geotechnical advisor to the US DOE SPR for planning purposes. Three SANSMIC leach modes - withdrawal, direct, and reverse leach - have been revalidated with multiple test cases for each mode. The withdrawal mode was validated using high quality data from recent leach activity while the direct and reverse modes utilized data from historical cavern completion reports. Withdrawal results compared very well with observed data, including the location and size of shelves due to string breaks with relative leached volume differences ranging from 6 - 10% and relative radius differences from 1.5 - 3%. Profile comparisons for the direct mode were very good with relative leached volume differences ranging from 6 - 12% and relative radius differences from 5 - 7%. First, second, and third reverse configurations were simulated in order to validate SANSMIC over a range of relative hanging string and OBI locations. The first-reverse was simulated reasonably well with relative leached volume differences ranging from 1 - 9% and relative radius differences from 5 - 12%. The second-reverse mode showed the largest discrepancies in leach profile. Leached volume differences ranged from 8 - 12% and relative radius differences from 1 - 10%. In the third-reverse, relative leached volume differences ranged from 10 - 13% and relative radius differences were ~4%. Comparisons to historical reports were quite good, indicating that SANSMIC is essentially the same as documented and validated in the early 1980's.

  1. Effect of bitumen emulsion on setting, strength, soundness and ...

    Indian Academy of Sciences (India)

    Addition of bitumen emulsion to the matrix has been found to improve strength and soundness of the product while decreasing the initial setting periods. Thus, bitumen emulsion as an admixture in magnesia cement is a moisture proofing and strengthening material.

  2. Newtonian M(atrix) cosmology

    Science.gov (United States)

    Álvarez, Enrique; Meessen, Patrick

    1998-05-01

    A Newtonian matrix cosmology, corresponding to the Banks, Fischler, Shenker and Susskind model of eleven-dimensional M-theory in the infinite momentum frame as a supersymmetric (0+1) M(atrix) model is constructed. Interesting new results are obtained, such as the existence of (much sought for in the past) static solutions. The possible interpretation of the off-diagonal entries as a background geometry is also briefly discussed.

  3. Superstatistics in Random Matrix Theory

    OpenAIRE

    Abul-Magd, A. Y.

    2011-01-01

    Random matrix theory (RMT) provides a successful model for quantum systems, whose classical counterpart has a chaotic dynamics. It is based on two assumptions: (1) matrix-element independence, and (2) base invariance. Last decade witnessed several attempts to extend RMT to describe quantum systems with mixed regular-chaotic dynamics. Most of the proposed generalizations keep the first assumption and violate the second. Recently, several authors presented other versions of the theory that keep...

  4. Matrix analysis of electrical machinery

    CERN Document Server

    Hancock, N N

    2013-01-01

    Matrix Analysis of Electrical Machinery, Second Edition is a 14-chapter edition that covers the systematic analysis of electrical machinery performance. This edition discusses the principles of various mathematical operations and their application to electrical machinery performance calculations. The introductory chapters deal with the matrix representation of algebraic equations and their application to static electrical networks. The following chapters describe the fundamentals of different transformers and rotating machines and present torque analysis in terms of the currents based on the p

  5. Development of the Russian matrix sentence test.

    Science.gov (United States)

    Warzybok, Anna; Zokoll, Melanie; Wardenga, Nina; Ozimek, Edward; Boboshko, Maria; Kollmeier, Birger

    2015-01-01

    To develop the Russian matrix sentence test for speech intelligibility measurements in noise. Test development included recordings, optimization of speech material, and evaluation to investigate the equivalency of the test lists and training. For each of the 500 test items, the speech intelligibility function, speech reception threshold (SRT: signal-to-noise ratio, SNR, that provides 50% speech intelligibility), and slope was obtained. The speech material was homogenized by applying level corrections. In evaluation measurements, speech intelligibility was measured at two fixed SNRs to compare list-specific intelligibility functions. To investigate the training effect and establish reference data, speech intelligibility was measured adaptively. Overall, 77 normal-hearing native Russian listeners. The optimization procedure decreased the spread in SRTs across words from 2.8 to 0.6 dB. Evaluation measurements confirmed that the 16 test lists were equivalent, with a mean SRT of -9.5 ± 0.2 dB and a slope of 13.8 ± 1.6%/dB. The reference SRT, -8.8 ± 0.8 dB for the open-set and -9.4 ± 0.8 dB for the closed-set format, increased slightly for noise levels above 75 dB SPL. The Russian matrix sentence test is suitable for accurate and reliable speech intelligibility measurements in noise.

  6. Liquid-Phase Microextraction and Gas Chromatographic-Mass Spectrometric Analysis of Antidepressants in Vitreous Humor: Study of Matrix Effect of Human and Bovine Vitreous and Saline Solution.

    Science.gov (United States)

    dos Santos, Marcelo Filonzi; Yamada, Adrian; Seulin, Saskia Carolina; Leyton, Vilma; Pasqualucci, Carlos Augusto Gonçalves; Muñoz, Daniel Romero; Yonamine, Mauricio

    2016-04-01

    In forensic bioanalytical methods, there is a general agreement that calibrators should be prepared by fortifying analytes in matrix-based blank samples (matrix-based). However, in the case of vitreous humor (VH), the collection of blank samples for the validation and for routine analysis would require the availability of many cadavers. Besides the difficulty of obtaining enough blank VH, this procedure could also represent an ethical issue. Here, a study of matrix effect was performed taking into consideration human and bovine vitreous and saline solution (SS) (NaCl 0.9%). Tricyclic antidepressants [amitriptyline (AMI), nortriptyline (NTR), imipramine (IMI) and desipramine (DES)] were used as model analytes and were extracted from samples by means of liquid-phase microextraction and detected by gas chromatography-mass spectrometry. Samples of human and bovine VH and SS were prepared in six different concentrations of antidepressants (5, 40, 80, 120, 160 and 200 ng/mL) and were analyzed. Relative matrix effect was evaluated by applying a two-tailed homoscedastic Student's t-test, comparing the results obtained with the set of data obtained with human VH and bovine VH and SS. No significant matrix effect was found for AMI and NTR in the three evaluated matrices. However, a great variability was observed for IMI and DES for all matrices. Once compatibilities among the matrices were demonstrated, the method was fully validated for AMI and NTR in SS. The method was applied to six VH samples deriving from real cases whose femoral whole blood (FWB) was analyzed by a previously published method. An average ratio (VH/FWB) of ∼ 0.1 was found for both compounds. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
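
    The relative matrix-effect comparison described above reduces to a two-tailed homoscedastic Student's t-test on results obtained from different matrices at the same spiked concentration. The sketch below shows that computation with placeholder numbers, not data from the study.

```python
from scipy import stats

# Placeholder analyte responses at one spiked concentration (arbitrary units)
human_vh = [0.98, 1.02, 1.05, 0.97, 1.01, 1.00]
saline   = [1.00, 1.03, 0.99, 1.04, 1.02, 0.98]

# Two-tailed homoscedastic (equal-variance) Student's t-test
t, p = stats.ttest_ind(human_vh, saline, equal_var=True)
print(f"t = {t:.3f}, p = {p:.3f}")   # p > 0.05 suggests no relative matrix effect
```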

  7. Active set support vector regression.

    Science.gov (United States)

    Musicant, David R; Feinberg, Alexander

    2004-03-01

    This paper presents active set support vector regression (ASVR), a new active set strategy to solve a straightforward reformulation of the standard support vector regression problem. This new algorithm is based on the successful ASVM algorithm for classification problems, and consists of solving a finite number of linear equations with a typically large dimensionality equal to the number of points to be approximated. However, by making use of the Sherman-Morrison-Woodbury formula, a much smaller matrix of the order of the original input space is inverted at each step. The algorithm requires no specialized quadratic or linear programming code, but merely a linear equation solver which is publicly available. ASVR is extremely fast, produces comparable generalization error to other popular algorithms, and is available on the web for download.
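
    The linear-algebra step mentioned above can be illustrated in a few lines: the Sherman-Morrison-Woodbury identity lets a system whose matrix has the form I/ν + A Aᵀ (dimension equal to the number of points) be solved by factorizing only an n×n matrix, with n the input-space dimension. This is a sketch of the identity itself, not of the full ASVR algorithm; the sizes and ν are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, nu = 2000, 10, 1.0          # many data points, few input dimensions
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Direct solve with the large m x m matrix (what SMW lets us avoid)
M = np.eye(m) / nu + A @ A.T
x_direct = np.linalg.solve(M, b)

# Sherman-Morrison-Woodbury: only an n x n system is solved
# (I/nu + A A^T)^{-1} b = nu*b - nu^2 * A (I + nu A^T A)^{-1} A^T b
small = np.eye(n) + nu * (A.T @ A)
x_smw = nu * b - nu**2 * (A @ np.linalg.solve(small, A.T @ b))

print(np.allclose(x_direct, x_smw))   # True
```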

  8. A Dynamic Model for Direct and Indirect Matrix Converters

    Directory of Open Access Journals (Sweden)

    Mohamad Hosseini Abardeh

    2014-01-01

    Full Text Available The complicated modulation algorithm and the high switching frequency are two main hindrances in the analysis and simulation of matrix converters (MCs) based systems. To simplify the analysis and accelerate the simulation of MCs, a unique dynamic model is presented for the MC, which is independent of MC type (direct or indirect) and the modulation algorithm. All the input and output variables are transferred to the respective reference frames and their relations and limits are calculated. Based on the proposed equations, an equivalent circuit model is presented which can predict all the direct and indirect matrix converters' dynamic and steady-state behaviors without the need for small simulation time steps. Validity of the proposed model is evaluated using simulation of the precise model. Moreover, experimental results from a laboratory matrix converter setup are provided to verify the accuracy of the simulation results.

  9. Digital carrier modulation and sampling issues of matrix converters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Loh, P.C.; Rong, R.J.

    2008-01-01

    Although the modulation of ac-ac matrix converters using space vector theory has long been established, their carrier-based modulation principles have only recently attracted some attention. Reasons commonly stated for evaluating the carrier-based alternative include simpler converter control...... digital carrier modulation schemes for controlling conventional and sparse matrix converters with minimized semiconductor commutation count and smooth sextant transitions with no erroneous states produced. For guaranteeing the latter two features, correct digital sampling instants and state sequence...... reversal must be chosen appropriately, as demonstrated in the paper for the two different topological options, which to date, have not yet been discussed in the existing literature. To validate the concepts discussed, experimental testing on the implemented conventional and sparse matrix laboratory...

  10. Digital Carrier Modulation and Sampling Issues of Matrix Converters

    DEFF Research Database (Denmark)

    Loh, Poh Chiang; Rong, Runjie; Blaabjerg, Frede

    2009-01-01

    Although the modulation of ac-ac matrix converters using space vector theory has long been established, their carrierbased modulation principles have only recently attracted some attention. Reasons commonly stated for evaluating the carrier-based alternative include simpler converter control...... digital carrier modulation schemes for controlling conventional (direct) and indirect matrix converters with minimized semiconductor commutation count and smooth sextant transitions with no erroneous states produced. For guaranteeing the latter two features, correct digital sampling instants and state...... sequence reversal must be chosen appropriately, as demonstrated in the paper for the two different topological options,which, to date, have not yet been discussed in the existing literature. To validate the concepts discussed, experimental testing on the implemented conventional and indirect matrix...

  11. Quasinormal-Mode Expansion of the Scattering Matrix

    Science.gov (United States)

    Alpeggiani, Filippo; Parappurath, Nikhil; Verhagen, Ewold; Kuipers, L.

    2017-04-01

    It is well known that the quasinormal modes (or resonant states) of photonic structures can be associated with the poles of the scattering matrix of the system in the complex-frequency plane. In this work, the inverse problem, i.e., the reconstruction of the scattering matrix from the knowledge of the quasinormal modes, is addressed. We develop a general and scalable quasinormal-mode expansion of the scattering matrix, requiring only the complex eigenfrequencies and the far-field properties of the eigenmodes. The theory is validated by applying it to illustrative nanophotonic systems with multiple overlapping electromagnetic modes. The examples demonstrate that our theory provides an accurate first-principles prediction of the scattering properties, without the need for postulating ad hoc nonresonant channels.

  12. Carrier-based modulation schemes for various three-level matrix converters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Loh, P.C.; Rong, R.C.

    2008-01-01

    Matrix converters with three-level phase switching and five-level line switching characteristics have recently been proposed as improved “all semiconductor” ac-ac power processors. For their control, different modulation schemes have also been developed with different researchers claiming...... of investigation, this paper now presents a scheme for controlling a three-level indirect matrix converter, which then serves as the conceptual platform for extending the principles to other matrix topologies including the newly described 4×3 matrix converter and a third type of matrix converter that uses...... a limited set of switching vectors because of its lower semiconductor count. Through simulation and experimental testing, all the evaluated matrix converters are shown to produce satisfactory sinusoidal input and output quantities using the same set of generic modulation principles, which can conveniently...

  13. Structure of nuclear transition matrix elements for neutrinoless ...

    Indian Academy of Sciences (India)

    Abstract. The structure of nuclear transition matrix elements (NTMEs) required for the study of neutrinoless double-β decay within light Majorana neutrino mass mechanism is disassembled in the PHFB model. The NTMEs are calculated using a set of HFB intrinsic wave functions, the reliability of which has been previously ...

  14. Structure of nuclear transition matrix elements for neutrinoless ...

    Indian Academy of Sciences (India)

    Abstract. The structure of nuclear transition matrix elements (NTMEs) required for the study of neutrinoless double-β decay within light Majorana neutrino mass mechanism is disassembled in the PHFB model. The NTMEs are calculated using a set of HFB intrinsic wave functions, the reliability of which has been previously ...

  15. Genetic Background is a Key Determinant of Glomerular Extracellular Matrix Composition and Organization.

    Science.gov (United States)

    Randles, Michael J; Woolf, Adrian S; Huang, Jennifer L; Byron, Adam; Humphries, Jonathan D; Price, Karen L; Kolatsi-Joannou, Maria; Collinson, Sophie; Denny, Thomas; Knight, David; Mironov, Aleksandr; Starborg, Toby; Korstanje, Ron; Humphries, Martin J; Long, David A; Lennon, Rachel

    2015-12-01

    Glomerular disease often features altered histologic patterns of extracellular matrix (ECM). Despite this, the potential complexities of the glomerular ECM in both health and disease are poorly understood. To explore whether genetic background and sex determine glomerular ECM composition, we investigated two mouse strains, FVB and B6, using RNA microarrays of isolated glomeruli combined with proteomic glomerular ECM analyses. These studies, undertaken in healthy young adult animals, revealed unique strain- and sex-dependent glomerular ECM signatures, which correlated with variations in levels of albuminuria and known predisposition to progressive nephropathy. Among the variation, we observed changes in netrin 4, fibroblast growth factor 2, tenascin C, collagen 1, meprin 1-α, and meprin 1-β. Differences in protein abundance were validated by quantitative immunohistochemistry and Western blot analysis, and the collective differences were not explained by mutations in known ECM or glomerular disease genes. Within the distinct signatures, we discovered a core set of structural ECM proteins that form multiple protein-protein interactions and are conserved from mouse to man. Furthermore, we found striking ultrastructural changes in glomerular basement membranes in FVB mice. Pathway analysis of merged transcriptomic and proteomic datasets identified potential ECM regulatory pathways involving inhibition of matrix metalloproteases, liver X receptor/retinoid X receptor, nuclear factor erythroid 2-related factor 2, notch, and cyclin-dependent kinase 5. These pathways may therefore alter ECM and confer susceptibility to disease. Copyright © 2015 by the American Society of Nephrology.

  16. A review of the matrix-exponential formalism in radiative transfer

    Science.gov (United States)

    Efremenko, Dmitry S.; Molina García, Víctor; Gimeno García, Sebastián; Doicu, Adrian

    2017-07-01

    This paper outlines the matrix exponential description of radiative transfer. The eigendecomposition method which serves as a basis for computing the matrix exponential and for representing the solution in a discrete ordinate setting is considered. The mathematical equivalence of the discrete ordinate method, the matrix operator method, and the matrix Riccati equations method is proved rigorously by means of the matrix exponential formalism. For optically thin layers, approximate solution methods relying on the Padé and Taylor series approximations to the matrix exponential, as well as on the matrix Riccati equations, are presented. For optically thick layers, the asymptotic theory with higher-order corrections is derived, and parameterizations of the asymptotic functions and constants for a water-cloud model with a Gamma size distribution are obtained.
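
    As a small numerical illustration of the optically-thin-layer approximations mentioned above, the sketch below compares scipy's matrix exponential (a scaling-and-squaring Padé implementation) with a truncated Taylor series for a small-norm stand-in matrix; the matrix is random, not an actual radiative-transfer operator.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = 0.1 * rng.standard_normal((4, 4))    # stand-in for a thin-layer operator (small norm)

def expm_taylor(A, order=8):
    """Truncated Taylor series exp(A) ~ sum_{k<=order} A^k / k!"""
    E = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, order + 1):
        term = term @ A / k
        E = E + term
    return E

E_pade = expm(A)                          # scaling-and-squaring Pade approximation
E_taylor = expm_taylor(A, order=8)
print("max abs difference:", np.max(np.abs(E_pade - E_taylor)))
```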

  17. InstantLabs® Salmonella species detection method: matrix extension.

    Science.gov (United States)

    Sharma, Neil; Bambusch, Lauren; Le, Thu; Morey, Amit; Hayman, Melinda; Montez, Sergio J

    2014-01-01

    The performance of the InstantLabs® Salmonella Species Food Safety Kit to detect Salmonella in four food matrixes was validated against the International Organization for Standardization (ISO) reference method 6579:2002. The matrixes (raw ground beef, raw chicken breast, raw ground chicken, and lettuce) were inoculated with low levels of Salmonella. Samples were validated using 375 g (meat) or 25 g (lettuce and poultry) test portions enriched in FASTGRO™ SE at 42±1 °C for 12 h and 10 h, respectively. All samples were confirmed using the ISO reference method, regardless of initial-screen result. The InstantLabs test method was shown to perform as well as or better than the reference method for the detection of Salmonella species in ground beef, chicken breast, ground chicken, and lettuce. Inclusivity and exclusivity testing revealed no false negatives among the 100 Salmonella serovars and no false positives among the 30 non-Salmonella species examined, respectively.

  18. A graph-theoretic approach to sparse matrix inversion for implicit differential algebraic equations

    Directory of Open Access Journals (Sweden)

    H. Yoshimura

    2013-06-01

    Full Text Available In this paper, we propose an efficient numerical scheme to compute sparse matrix inversions for Implicit Differential Algebraic Equations of large-scale nonlinear mechanical systems. We first formulate mechanical systems with constraints by Dirac structures and associated Lagrangian systems. Second, we show how to allocate input-output relations to the variables in kinematical and dynamical relations appearing in DAEs by introducing an oriented bipartite graph. Then, we also show that the matrix inversion of the Jacobian matrix associated with the kinematical and dynamical relations can be carried out by using the input-output relations, and we explain the solvability of the sparse Jacobian matrix inversion by using the bipartite graph. Finally, we propose an efficient symbolic generation algorithm to compute the sparse matrix inversion of the Jacobian matrix, and we demonstrate its validity and numerical efficiency with an example of the Stanford manipulator.
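
    A minimal sketch of the bipartite-graph idea described above: equations and unknowns form the two vertex classes, edges mark nonzero Jacobian entries, and a maximum matching assigns to each equation the variable it is solved for. The incidence pattern is invented for illustration; the paper's symbolic generation algorithm is not reproduced here.

```python
import networkx as nx

# Nonzero pattern of a small, made-up sparse Jacobian: equation -> variables it contains
pattern = {
    "eq0": ["x0", "x2"],
    "eq1": ["x1"],
    "eq2": ["x0", "x1", "x3"],
    "eq3": ["x2", "x3"],
}

G = nx.Graph()
G.add_nodes_from(pattern, bipartite=0)
for eq, variables in pattern.items():
    for v in variables:
        G.add_edge(eq, v)

# Maximum matching: pairs each equation with the unknown it will be solved for
matching = nx.bipartite.maximum_matching(G, top_nodes=pattern.keys())
assignment = {eq: var for eq, var in matching.items() if eq in pattern}
print(assignment)
```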

  19. Matrix factorizations and elliptic fibrations

    Directory of Open Access Journals (Sweden)

    Harun Omer

    2016-09-01

    Full Text Available I use matrix factorizations to describe branes at simple singularities of elliptic fibrations. Each node of the corresponding Dynkin diagrams of the ADE-type singularities is associated with one indecomposable matrix factorization which can be deformed into one or more factorizations of lower rank. Branes with internal fluxes arise naturally as bound states of the indecomposable factorizations. Describing branes in such a way avoids the need to resolve singularities. This paper looks at gauge group breaking from E8 fibers down to SU(5) fibers due to the relevance of such fibrations for local F-theory GUT models. A purpose of this paper is to understand how the deformations of the singularity are understood in terms of its matrix factorizations. By systematically factorizing the elliptic fiber equation, this paper discusses geometries which are relevant for building semi-realistic local models. In the process it becomes evident that breaking patterns which are identical at the level of the Kodaira type of the fibers can be inequivalent at the level of matrix factorizations. Therefore the matrix factorization picture supplements information which the conventional less detailed descriptions lack.

  20. Magnesium Matrix Composite Foams—Density, Mechanical Properties, and Applications

    Directory of Open Access Journals (Sweden)

    Kyu Cho

    2012-07-01

    Full Text Available Potential of widespread industrial applications of magnesium has been realized in recent years. A variety of magnesium alloy matrix composites are now being studied for mechanical properties. Since magnesium is the lightest structural metal, it can replace aluminum in existing applications for further weight savings. This review presents an overview of hollow particle filled magnesium matrix syntactic composite foams. Fly ash cenospheres are the most commonly used hollow particles for such applications. Fly ash cenospheres primarily have alumino-silicate composition and contain a large number of trace elements, which makes it challenging to study the interfacial reactions and microstructure in these composites. Microstructures of commonly studied AZ and ZC series magnesium alloys and their syntactic foams are discussed. Although only a few studies are available on these materials because of the nascent stage of this field, a comparison with similar aluminum matrix syntactic foams has provided insight into the properties and weight saving potential of magnesium matrix composites. Analysis shows that the magnesium matrix syntactic foams have higher yield strength at the same level of density compared to most other metal matrix syntactic foams. The comparison can guide future work and set goals that need to be achieved through materials selection and processing method development.

  1. Lectures on matrix field theory

    CERN Document Server

    Ydri, Badis

    2017-01-01

    These lecture notes provide a systematic introduction to matrix models of quantum field theories with non-commutative and fuzzy geometries. The book initially focuses on the matrix formulation of non-commutative and fuzzy spaces, followed by a description of the non-perturbative treatment of the corresponding field theories. As an example, the phase structure of non-commutative phi-four theory is treated in great detail, with a separate chapter on the multitrace approach. The last chapter offers a general introduction to non-commutative gauge theories, while two appendices round out the text. Primarily written as a self-study guide for postgraduate students – with the aim of pedagogically introducing them to key analytical and numerical tools, as well as useful physical models in applications – these lecture notes will also benefit experienced researchers by providing a reference guide to the fundamentals of non-commutative field theory with an emphasis on matrix models and fuzzy geometries.

  2. Matrix formalism of synchrobetatron coupling

    Directory of Open Access Journals (Sweden)

    Xiaobiao Huang

    2007-01-01

    Full Text Available In this paper we present a complete linear synchrobetatron coupling formalism by studying the transfer matrix which describes linear horizontal and longitudinal motions. With the technique established in the linear horizontal-vertical coupling study [D. Sagan and D. Rubin, Phys. Rev. ST Accel. Beams 2, 074001 (1999), doi:10.1103/PhysRevSTAB.2.074001], we found a transformation to block diagonalize the transfer matrix and decouple the betatron motion and the synchrotron motion. By separating the usual dispersion term from the horizontal coordinate first, we were able to obtain analytic expressions of the transformation under reasonable approximations. We also obtained the perturbations to the betatron tune and the Courant-Snyder functions. The closed-orbit changes due to finite energy gains at rf cavities and radiation energy losses were studied by the 5×5 extended transfer matrix with the fifth column describing kicks in the 4-dimensional phase space.

  3. Matrix sketching for big data reduction (Conference Presentation)

    Science.gov (United States)

    Ezekiel, Soundararajan; Giansiracusa, Michael

    2017-05-01

    In recent years, the concept of Big Data has become a more prominent issue as the volume of data as well as the velocity at which it is produced exponentially increases. By 2020 the amount of data being stored is estimated to be 44 Zettabytes, and currently over 31 Terabytes of data is being generated every second. Algorithms and applications must be able to effectively scale to the volume of data being generated. One such application designed to effectively and efficiently work with Big Data is IBM's Skylark. Part of DARPA's XDATA program, an open-source catalog of tools to deal with Big Data, Skylark (Sketching-based Matrix Computations for Machine Learning) is a library of functions designed to reduce the complexity of large-scale matrix problems and also implements kernel-based machine learning tasks. Sketching reduces the dimensionality of matrices through randomization and compresses matrices while preserving key properties, speeding up computations. Matrix sketches can be used to find accurate solutions to computations in less time, or can summarize data by identifying important rows and columns. In this paper, we investigate the effectiveness of sketched matrix computations using IBM's Skylark versus non-sketched computations. We judge effectiveness based on several factors: computational complexity and validity of outputs. Initial results from testing with smaller matrices are promising, showing that Skylark has a considerable reduction ratio while still accurately performing matrix computations.
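
    A minimal sketch of the random-projection idea described above, written in plain NumPy rather than Skylark's API: a tall least-squares problem is compressed with a Gaussian sketching matrix and the sketched solution is compared with the exact one. Problem sizes and the noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 20000, 20, 400                  # tall data matrix; sketch size s << n
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Exact least-squares solution
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Gaussian sketch: the sketched problem is s x d instead of n x d
S = rng.standard_normal((s, n)) / np.sqrt(s)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

print("relative error of the sketched solution:",
      np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))
```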

  4. Optimization and validation of a CZE method for rufloxacin hydrochloride determination in coated tablets.

    Science.gov (United States)

    Furlanetto, S; Orlandini, S; Porta, E La; Coran, S; Pinzauti, S

    2002-06-15

    A simple and rapid capillary electrophoresis method with UV detection was developed and validated for the determination of rufloxacin hydrochloride in coated tablets. An experimental design strategy (Doehlert design and desirability function) allowed the analytical parameters to be simultaneously optimized in order to determine rufloxacin hydrochloride with high peak area/migration time ratio, good efficiency and short analysis time. Optimized analyses were run using boric acid 0.10 M adjusted to pH 8.8 as BGE and setting voltage and temperature at 18 kV and 27 degrees C, respectively. Pefloxacin mesylate was used as internal standard and run time was about three minutes. The method was validated for the drug substance and the drug product according to the ICH3 guidelines. Robustness was tested by experimental design using an eight-run Plackett-Burman matrix.
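
    The paper's actual eight-run design matrix is not reproduced; the sketch below builds a standard eight-run, seven-factor two-level screening design from a Sylvester Hadamard matrix, which is equivalent to a Plackett-Burman matrix up to relabeling of runs and factors, with hypothetical factor names standing in for the robustness factors.

      import numpy as np
      from scipy.linalg import hadamard

      # Order-8 Sylvester Hadamard matrix; dropping its all-ones first column leaves
      # an orthogonal 8-run design for up to 7 two-level factors (a Plackett-Burman
      # design up to relabeling).
      H = hadamard(8)
      design = H[:, 1:]                          # 8 runs x 7 factors, entries +1 / -1

      # Hypothetical factor labels (e.g. buffer concentration, pH, voltage, temperature, ...).
      factors = ["f%d" % i for i in range(1, 8)]
      for run in design:
          print("  ".join("%s=%+d" % (f, level) for f, level in zip(factors, run)))

      # Orthogonality check: every pair of factor columns is balanced.
      print(bool((design.T @ design == 8 * np.eye(7)).all()))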

  5. Restricted Closed Shell Hartree Fock Roothaan Matrix Method Applied to Helium Atom Using Mathematica

    Science.gov (United States)

    Acosta, César R.; Tapia, J. Alejandro; Cab, César

    2014-01-01

    Slater type orbitals were used to construct the overlap and the Hamiltonian core matrices; we also found the values of the bi-electron repulsion integrals. The Hartree Fock Roothaan approximation process starts with setting an initial guess value for the elements of the density matrix; with these matrices we constructed the initial Fock matrix.…
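
    The paper works in Mathematica with a Slater-type basis; as a rough Python analogue of the same Roothaan self-consistency loop, the sketch below uses a single 1s Slater orbital for helium (zeta = 27/16), for which the overlap, core-Hamiltonian, and two-electron integrals have simple closed forms, and iterates density guess → Fock build → generalized eigenproblem → new density.

      import numpy as np
      from scipy.linalg import eigh

      # Closed-shell Roothaan SCF for He with one 1s Slater orbital (zeta = 27/16).
      # Analytic STO integrals: S = 1, T = zeta^2/2, V = -Z*zeta, (11|11) = 5*zeta/8.
      zeta, Z = 1.6875, 2.0
      S = np.array([[1.0]])
      H = np.array([[0.5 * zeta**2 - Z * zeta]])       # core Hamiltonian (kinetic + nuclear)
      eri = np.zeros((1, 1, 1, 1))
      eri[0, 0, 0, 0] = 5.0 * zeta / 8.0               # (mu nu|sigma lambda), chemists' notation

      P = np.zeros_like(H)                             # initial guess for the density matrix
      for it in range(20):
          J = np.einsum('ls,mnsl->mn', P, eri)         # Coulomb contribution
          K = np.einsum('ls,mlsn->mn', P, eri)         # exchange contribution
          F = H + J - 0.5 * K                          # closed-shell Fock matrix
          E = 0.5 * np.einsum('mn,mn->', P, H + F)     # electronic energy
          eps, C = eigh(F, S)                          # Roothaan generalized eigenproblem
          C_occ = C[:, :1]                             # one doubly occupied orbital for He
          P_new = 2.0 * C_occ @ C_occ.T
          if np.max(np.abs(P_new - P)) < 1e-10:
              break
          P = P_new

      print("SCF electronic energy (hartree):", E)     # about -2.848 for this basis

    With only one basis function the loop converges in a couple of iterations to about -2.848 hartree, the standard single-zeta value for helium; a larger Slater basis would change only the integral arrays fed into the same loop.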

  6. Supersymmetry in random matrix theory

    Energy Technology Data Exchange (ETDEWEB)

    Kieburg, Mario

    2010-05-04

    I study the applications of supersymmetry in random matrix theory. I generalize the supersymmetry method and develop three new approaches to calculate eigenvalue correlation functions. These correlation functions are averages over ratios of characteristic polynomials. In the first part of this thesis, I derive a relation between integrals over anti-commuting variables (Grassmann variables) and differential operators with respect to commuting variables. With this relation I rederive Cauchy-like integral theorems. As a new application I trace the supermatrix Bessel function back to a product of two ordinary matrix Bessel functions. In the second part, I apply the generalized Hubbard-Stratonovich transformation to arbitrary rotation invariant ensembles of real symmetric and Hermitian self-dual matrices. This extends the approach for unitarily rotation invariant matrix ensembles. For the k-point correlation functions I derive supersymmetric integral expressions in a unifying way. I prove the equivalence between the generalized Hubbard-Stratonovich transformation and the superbosonization formula. Moreover, I develop an alternative mapping from ordinary space to superspace. After comparing the results of this approach with the other two supersymmetry methods, I obtain explicit functional expressions for the probability densities in superspace. If the probability density of the matrix ensemble factorizes, then the generating functions exhibit determinantal and Pfaffian structures. For some matrix ensembles this was already shown with the help of other approaches. I show that these structures appear by a purely algebraic manipulation. In this new approach I use structures naturally appearing in superspace. I derive determinantal and Pfaffian structures for three types of integrals without actually mapping onto superspace. These three types of integrals are quite general and, thus, they are applicable to a broad class of matrix ensembles. (orig.)

  7. Symmetries and Interactions in Matrix String Theory

    NARCIS (Netherlands)

    Hacquebord, F.H.

    1999-01-01

    This PhD-thesis reviews matrix string theory and recent developments therein. The emphasis is put on symmetries, interactions and scattering processes in the matrix model. We start with an introduction to matrix string theory and a review of the orbifold model that flows out of matrix string theory

  8. Properties of the matrix A-XY

    NARCIS (Netherlands)

    Steerneman, A.G.M.; van Perlo -ten Kleij, Frederieke

    2005-01-01

    The main topic of this paper is the matrix V = A - XY*, where A is a nonsingular complex k x k matrix and X and Y are k x p complex matrices of full column rank. Because properties of the matrix V can be derived from those of the matrix Q = I - XY*, we will consider in particular the case where A =

  9. HLT Validation of Athena

    CERN Document Server

    Bee, C P; González, S; Karr, K M; Wiedenmann, W

    2002-01-01

    In the present view, the ATLAS High Level Trigger will base its event selection software on the offline reconstruction framework, Athena. It is therefore imperative that the offline software -- and its relevant components -- are able to handle the large CPU and bandwidth loads required in a real-time environment. This note presents a first set of measurements aimed at validating Athena as the ATLAS online event selection framework. Although Athena is at an early development stage, detailed profiling can already yield clues as to which components can be optimized. In this note such areas are identified and a proposal is made on a road map to full performance.

  10. Generalized Reich-Moore R-matrix approximation

    Science.gov (United States)

    Arbanas, Goran; Sobes, Vladimir; Holcomb, Andrew; Ducru, Pablo; Pigni, Marco; Wiarda, Dorothea

    2017-09-01

    A conventional Reich-Moore approximation (RMA) of the R-matrix is generalized into a manifestly unitary form by introducing a set of resonant capture channels treated explicitly in a generalized, reduced R-matrix. The dramatic reduction of channel space achieved by conventional RMA, from the Nc × Nc full R-matrix to an Np × Np reduced R-matrix, where Nc = Np + Nγ, with Np and Nγ denoting the numbers of particle and γ-ray channels, respectively, rests on Nγ being much larger than Np. In GRMA the full Nc × Nc R-matrix is instead reduced to an N × N matrix, where N = Np + Nγ′ and Nγ′ is the number of resonant capture channels defined in GRMA; we show that Nγ′ = Nλ, where Nλ is the number of R-matrix levels. This reduction in channel space, although not as dramatic as in conventional RMA, could still be significant for medium and heavy nuclides, for which N is much smaller than Nc. This suggests that GRMA could yield improved nuclear data evaluations in the resolved resonance range at the cost of introducing Nλ(Nλ - 1)/2 resonant capture width parameters relative to conventional RMA. Manifest unitarity of GRMA justifies a method advocated by Fröhner and implemented in the SAMMY nuclear data evaluation code for enforcing unitarity of conventional RMA. Capture widths of GRMA are exactly convertible into alternative R-matrix parameters via the Brune transform. Application of idealized statistical methods to GRMA shows that the variance among conventional RMA capture widths in extant RMA evaluations could be used to estimate the variance among the off-diagonal elements neglected by conventional RMA. Significant departure of the capture widths from an idealized distribution may indicate the presence of underlying doorway states.

  11. Polychoric/Tetrachoric Matrix or Pearson Matrix? A methodological study

    Directory of Open Access Journals (Sweden)

    Dominguez Lara, Sergio Alexis

    2014-04-01

    Full Text Available The Pearson product-moment correlation is used in most factor-analytic studies in psychology, but this statistic is only appropriate when the variables are measured on an interval scale and are normally distributed; applied to ordinal data it can produce a distorted correlation matrix. Polychoric/tetrachoric matrices are therefore a suitable option for item-level factor analysis when the items are measured at the nominal or ordinal level. The aim of this study was to show the differences in the KMO measure, Bartlett's test, the determinant of the matrix, the percentage of variance explained, and the factor loadings for the trait-depression scale of the State-Trait Depression Inventory and the Neuroticism dimension of the short form of the Eysenck Personality Questionnaire-Revised, depending on whether polychoric/tetrachoric or Pearson matrices were used. The instruments were analyzed with different extraction methods (Maximum Likelihood, Minimum Rank Factor Analysis, Unweighted Least Squares, and Principal Components), keeping the Promin rotation method constant. Differences were observed in the sampling adequacy measures, as well as in the explained variance and the factor loadings, for the solutions based on polychoric/tetrachoric matrices. It can be concluded that polychoric/tetrachoric matrices give better results than Pearson matrices for item-level factor analysis, regardless of the extraction method.
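
    As a minimal sketch of where a single tetrachoric coefficient comes from (dedicated routines, such as those behind the software used in the study, would normally be used), the code below estimates the correlation for one pair of dichotomous items from a made-up 2×2 table under the bivariate-normal threshold model.

      import numpy as np
      from scipy.stats import norm, multivariate_normal
      from scipy.optimize import minimize_scalar

      def tetrachoric(table):
          # Maximum-likelihood tetrachoric correlation for a 2x2 table of two
          # dichotomous items, assuming an underlying bivariate normal variable.
          n = np.asarray(table, dtype=float)
          N = n.sum()
          tau_x = norm.ppf(n[0].sum() / N)             # threshold for item X: P(X = 0)
          tau_y = norm.ppf(n[:, 0].sum() / N)          # threshold for item Y: P(Y = 0)

          def negloglik(rho):
              cov = [[1.0, rho], [rho, 1.0]]
              p00 = multivariate_normal.cdf([tau_x, tau_y], mean=[0.0, 0.0], cov=cov)
              p01 = norm.cdf(tau_x) - p00
              p10 = norm.cdf(tau_y) - p00
              p11 = 1.0 - p00 - p01 - p10
              p = np.clip([p00, p01, p10, p11], 1e-12, 1.0)
              return -np.sum(n.ravel() * np.log(p))

          return minimize_scalar(negloglik, bounds=(-0.999, 0.999), method="bounded").x

      # Hypothetical 2x2 table: rows = item X (0/1), columns = item Y (0/1).
      print(tetrachoric([[40, 10], [15, 35]]))         # strong positive association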

  12. Towards Google matrix of brain

    Energy Technology Data Exchange (ETDEWEB)

    Shepelyansky, D.L., E-mail: dima@irsamc.ups-tlse.f [Laboratoire de Physique Theorique (IRSAMC), Universite de Toulouse, UPS, F-31062 Toulouse (France); LPT - IRSAMC, CNRS, F-31062 Toulouse (France); Zhirov, O.V. [Budker Institute of Nuclear Physics, 630090 Novosibirsk (Russian Federation)

    2010-07-12

    We apply the Google matrix approach, used in computer science and for the World Wide Web, to the description of the properties of neuronal networks. The Google matrix G is constructed on the basis of the neuronal network of a brain model discussed in PNAS 105 (2008) 3593. We show that the spectrum of eigenvalues of G has a gapless structure with long-lived relaxation modes. The PageRank of the network becomes delocalized for certain values of the Google damping factor α. The properties of other eigenstates are also analyzed. We discuss further parallels and similarities between the World Wide Web and neuronal networks.
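
    A minimal sketch of the construction (the brain-model network itself is not reproduced): build the column-stochastic Google matrix G = αS + (1 − α)/N from a made-up directed adjacency matrix, with dangling nodes replaced by uniform columns, and obtain the PageRank vector by power iteration.

      import numpy as np

      def google_matrix(adj, alpha=0.85):
          # Column-stochastic Google matrix G = alpha * S + (1 - alpha) / N.
          A = np.asarray(adj, dtype=float).T           # A[i, j] = weight of link j -> i
          N = A.shape[0]
          out = A.sum(axis=0)                          # out-degree of each node
          S = np.where(out > 0, A / np.where(out > 0, out, 1.0), 1.0 / N)
          return alpha * S + (1.0 - alpha) / N

      def pagerank(G, tol=1e-12, max_iter=1000):
          N = G.shape[0]
          p = np.full(N, 1.0 / N)
          for _ in range(max_iter):
              p_new = G @ p
              if np.abs(p_new - p).sum() < tol:
                  break
              p = p_new
          return p / p.sum()

      # Toy directed network (hypothetical): adj[i, j] = 1 means a link i -> j.
      adj = np.array([[0, 1, 1, 0],
                      [0, 0, 1, 0],
                      [1, 0, 0, 1],
                      [0, 0, 0, 0]])                   # node 3 is dangling
      G = google_matrix(adj, alpha=0.85)
      print("PageRank:", pagerank(G))

    Varying alpha and watching how concentrated the resulting PageRank vector is gives a crude picture of the delocalization effect discussed in the abstract.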

  13. Acousto-ultrasonic decay in metal matrix composite panels

    Science.gov (United States)

    Kautz, Harold E.

    1995-01-01

    Acousto-ultrasonic (A-U) decay rates (UD) were measured in metal matrix composite (MMC) panels. The MMC panels had fiber architectures and cross-sectional thicknesses corresponding to those designed for aerospace turbine engine structures. The wavelength-to-thickness ratio produced by the combination of experimental frequency settings and specimen geometry was found to be a key parameter for identifying optimum conditions for UD measurements. The ratio was shown to be a useful rule of thumb when applied to ceramic matrix composites (CMCs) and monolithic thermoplastics.

  14. S-AMP: Approximate Message Passing for General Matrix Ensembles

    DEFF Research Database (Denmark)

    Cakmak, Burak; Winther, Ole; Fleury, Bernard H.

    2014-01-01

    We propose a novel iterative estimation algorithm for linear observation models called S-AMP. The fixed points of S-AMP are the stationary points of the exact Gibbs free energy under a set of (first- and second-) moment consistency constraints in the large system limit. S-AMP extends the approximate message-passing (AMP) algorithm to general matrix ensembles with a well-defined large system size limit. The generalization is based on the S-transform (in free probability) of the spectrum of the measurement matrix. Furthermore, we show that the optimality of S-AMP follows directly from its...
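
    For orientation only, the sketch below implements the baseline AMP iteration for a sparse linear model with an i.i.d. Gaussian measurement matrix and a soft-threshold denoiser; it is not the S-AMP generalization described in the abstract, and the threshold rule, problem sizes, and sparsity level are arbitrary choices.

      import numpy as np

      def soft_threshold(r, theta):
          return np.sign(r) * np.maximum(np.abs(r) - theta, 0.0)

      def amp(A, y, n_iter=30, kappa=1.4):
          # Baseline AMP for y = A x + noise with sparse x and i.i.d. Gaussian A
          # (S-AMP replaces this with an S-transform-based update for general A).
          m, n = A.shape
          x = np.zeros(n)
          z = y.copy()
          for _ in range(n_iter):
              r = x + A.T @ z                                  # effective observation
              theta = kappa * np.linalg.norm(z) / np.sqrt(m)   # simple threshold rule
              x_new = soft_threshold(r, theta)
              onsager = (z / m) * np.count_nonzero(x_new)      # Onsager correction term
              z = y - A @ x_new + onsager
              x = x_new
          return x

      rng = np.random.default_rng(1)
      m, n, k = 250, 500, 25                                   # assumed problem sizes
      A = rng.standard_normal((m, n)) / np.sqrt(m)
      x0 = np.zeros(n)
      x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
      y = A @ x0 + 0.01 * rng.standard_normal(m)

      print("relative error:", np.linalg.norm(amp(A, y) - x0) / np.linalg.norm(x0))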

  15. Compressing Regular Expressions' DFA Table by Matrix Decomposition

    Science.gov (United States)

    Liu, Yanbing; Guo, Li; Liu, Ping; Tan, Jianlong

    Recently regular expression matching has become a research focus as a result of the urgent demand for Deep Packet Inspection (DPI) in many network security systems. Deterministic Finite Automaton (DFA), which recognizes a set of regular expressions, is usually adopted to cater to the need for real-time processing of network traffic. However, the huge memory usage of DFA prevents it from being applied even on a medium-sized pattern set. In this article, we propose a matrix decomposition method for DFA table compression. The basic idea of the method is to decompose a DFA table into the sum of a row vector, a column vector and a sparse matrix, all of which cost very little space. Experiments on typical rule sets show that the proposed method significantly reduces the memory usage and still runs at fast searching speed.
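
    The authors' exact decomposition algorithm is not reproduced; the toy sketch below illustrates the idea on a synthetic transition table, taking the row vector as per-symbol default targets, the column vector as per-state offsets, and storing whatever remains as a sparse residual.

      import numpy as np
      from scipy.sparse import csr_matrix

      def decompose(T):
          # Split a DFA transition table T (states x symbols) into a row vector r,
          # a column vector c, and a sparse residual D with T = r + c[:, None] + D.
          _, n_symbols = T.shape
          r = np.array([np.bincount(T[:, j]).argmax() for j in range(n_symbols)])
          rest = T - r                                 # broadcast over rows
          c = np.array([np.bincount(row - row.min()).argmax() + row.min() for row in rest])
          D = csr_matrix(rest - c[:, None])            # residual, stored sparsely
          return r, c, D

      # Synthetic table: most states share default transitions, a few deviate.
      rng = np.random.default_rng(0)
      T = np.tile(rng.integers(0, 50, size=(1, 20)), (50, 1))     # 50 states, 20 symbols
      rows, cols = rng.integers(0, 50, 30), rng.integers(0, 20, 30)
      T[rows, cols] = rng.integers(0, 50, 30)

      r, c, D = decompose(T)
      assert (r + c[:, None] + D.toarray() == T).all()
      print("residual non-zeros:", D.nnz, "out of", T.size, "table entries")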

  16. Random matrix theory and multivariate statistics

    OpenAIRE

    Diaz-Garcia, Jose A.; Jáimez, Ramon Gutiérrez

    2009-01-01

    Some tools and ideas are interchanged between random matrix theory and multivariate statistics. In the context of the random matrix theory, classes of spherical and generalised Wishart random matrix ensemble, containing as particular cases the classical random matrix ensembles, are proposed. Some properties of these classes of ensemble are analysed. In addition, the random matrix ensemble approach is extended and a unified theory proposed for the study of distributions for real normed divisio...

  17. Matrix theory selected topics and useful results

    CERN Document Server

    Mehta, Madan Lal

    1989-01-01

    Matrices and operations on matrices ; determinants ; elementary operations on matrices (continued) ; eigenvalues and eigenvectors, diagonalization of normal matrices ; functions of a matrix ; positive definiteness, various polar forms of a matrix ; special matrices ; matrices with quaternion elements ; inequalities ; generalised inverse of a matrix ; domain of values of a matrix, location and dispersion of eigenvalues ; symmetric functions ; integration over matrix variables ; permanents of doubly stochastic matrices ; infinite matrices ; Alexander matrices, knot polynomials, torsion numbers.

  18. Cross Validated Temperament Scale Validities Computed Using Profile Similarity Metrics

    Science.gov (United States)

    2017-04-27

    Presented 27 April 2017 at the 32nd Annual Conference of the Society for Industrial and Organizational Psychology, Orlando, FL. Personality and temperament scales are used in employment settings to predict performance because they are valid and have…

  19. CTF Void Drift Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Salko, Robert K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gosdin, Chris [Pennsylvania State Univ., University Park, PA (United States); Avramova, Maria N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gergar, Marcus [Pennsylvania State Univ., University Park, PA (United States)

    2015-10-26

    This milestone report is a summary of work performed in support of expansion of the validation and verification (V&V) matrix for the thermal-hydraulic subchannel code, CTF. The focus of this study is on validating the void drift modeling capabilities of CTF and verifying the supporting models that impact the void drift phenomenon. CTF uses a simple turbulent-diffusion approximation to model lateral cross-flow due to turbulent mixing and void drift. The void drift component of the model is based on the Lahey and Moody model. The models are a function of two-phase mass, momentum, and energy distribution in the system; therefore, it is necessary to correctly model the flow distribution in rod bundle geometry as a first step to correctly calculating the void distribution due to void drift.

  20. A Validation of the Fry Syllabication Generalizations.

    Science.gov (United States)

    Costigan, Patricia

    This master's thesis examines the utility of Edward Fry's 23 syllabication generalizations. To validate these rules, two sets of one thousand words each were selected, one set containing "easy" words taken from the Word Frequency Book, the second set containing "hard" words taken randomly from the Thorndike Barnhart Advanced…