WorldWideScience

Sample records for validation approach identifying

  1. A Novel Approach to Identifying Trajectories of Mobility Change in Older Adults.

    Directory of Open Access Journals (Sweden)

    Rachel E Ward

    Full Text Available To validate trajectories of late-life mobility change using a novel approach designed to overcome the constraints of modest sample size and few follow-up time points. Using clinical reasoning and distribution-based methodology, we identified trajectories of mobility change (Late Life Function and Disability Instrument) across 2 years in 391 participants aged ≥65 years from a prospective cohort study designed to identify modifiable impairments predictive of mobility in late life. We validated our approach using model fit indices and by comparing baseline mobility-related factors between trajectories. Model fit indices confirmed that the optimal number of trajectories was between 4 and 6. Mobility-related factors varied across trajectories, with the most unfavorable values in poor mobility trajectories and the most favorable in high mobility trajectories. These factors included leg strength, trunk extension endurance, knee flexion range of motion, limb velocity, physical performance measures, and the number and prevalence of medical conditions, including osteoarthritis and back pain. Our findings support the validity of this approach and may facilitate the investigation of a broader scope of research questions within aging populations of varied sizes and traits.
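
    As a hypothetical illustration of choosing the number of trajectory groups from model fit indices, the sketch below fits Gaussian mixture models with different numbers of groups to synthetic two-year mobility scores and compares their BIC values. The data, group sizes, and model family are invented for illustration and are not the authors' distribution-based methodology.

```python
# Sketch: choose a number of latent trajectory groups by comparing BIC
# across Gaussian mixture models. Illustrative only; synthetic data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic "mobility scores" at 5 visits over 2 years for 391 subjects, drawn from
# three hypothetical trajectory groups (stable, slow decline, steep decline).
n_per_group, visits = [200, 120, 71], np.linspace(0.0, 2.0, 5)
slopes, intercepts = [0.0, -3.0, -8.0], [60.0, 55.0, 50.0]
profiles = [
    b + m * visits + rng.normal(0.0, 2.0, size=(n, visits.size))
    for n, m, b in zip(n_per_group, slopes, intercepts)
]
X = np.vstack(profiles)  # shape (391, 5): one row of repeated measures per subject

# Fit mixtures with 2..7 groups and report BIC; lower BIC = better trade-off.
for k in range(2, 8):
    gmm = GaussianMixture(n_components=k, covariance_type="full", random_state=0)
    gmm.fit(X)
    print(f"groups={k}  BIC={gmm.bic(X):.1f}")
```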

  2. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k-eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be (1) repeatable, (2) demonstrated with defined confidence, and (3) accompanied by an identified range of neutronic conditions (area of applicability) for which the correlations are valid. The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
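
    As a simple illustration of correlating calculated k-eff with known benchmark values, the sketch below computes the mean bias and spread over a hypothetical benchmark set and forms an illustrative upper subcritical limit. The benchmark values and the tolerance multiplier are placeholders, not the WSRC validation procedure.

```python
# Sketch: bias and uncertainty of calculated k_eff against critical benchmarks.
# Hypothetical benchmark set; not the WSRC validation procedure itself.
import numpy as np

# Calculated k_eff for benchmark experiments known to be critical (true k_eff = 1.0).
k_calc = np.array([0.9963, 0.9987, 1.0012, 0.9941, 0.9978, 1.0004, 0.9969, 0.9992])
k_true = 1.0

bias = k_calc.mean() - k_true           # mean calculational bias
sigma = k_calc.std(ddof=1)              # spread of the correlation
n = k_calc.size

# Illustrative one-sided tolerance-style limit (placeholder multiplier,
# not a looked-up 95/95 tolerance factor).
tolerance_factor = 3.0
upper_safe_k = k_true + bias - tolerance_factor * sigma

print(f"bias = {bias:+.4f}, sigma = {sigma:.4f} over {n} benchmarks")
print(f"illustrative upper limit on calculated k_eff: {upper_safe_k:.4f}")
```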

  3. A model-based approach to identify binding sites in CLIP-Seq data.

    Directory of Open Access Journals (Sweden)

    Tao Wang

    Full Text Available Cross-linking immunoprecipitation coupled with high-throughput sequencing (CLIP-Seq) has made it possible to identify the targeting sites of RNA-binding proteins in various cell culture systems and tissue types on a genome-wide scale. Here we present a novel model-based approach (MiClip) to identify high-confidence protein-RNA binding sites from CLIP-seq datasets. This approach assigns a probability score to each potential binding site to help prioritize subsequent validation experiments. The MiClip algorithm has been tested in both HITS-CLIP and PAR-CLIP datasets. In the HITS-CLIP dataset, the signal/noise ratios of miRNA seed motif enrichment produced by the MiClip approach are between 17% and 301% higher than those produced by the ad hoc method for the top 10 most enriched miRNAs. In the PAR-CLIP dataset, the MiClip approach can identify ∼50% more validated binding targets than the original ad hoc method and two recently published methods. To facilitate the application of the algorithm, we have released an R package, MiClip (http://cran.r-project.org/web/packages/MiClip/index.html), and a public web-based graphical user interface software (http://galaxy.qbrc.org/tool_runner?tool_id=mi_clip) for customized analysis.

  4. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: absolute stable region, valid region, and invalid region. Secondly, while identifying the chatter stability lobes, these three regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to obtain exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method, and it takes only about 10 minutes to obtain exact chatter stability lobes. Because it is based on the discretization method, the proposed approach can be used for different immersion conditions, including low-immersion cutting, and can therefore be implemented directly in the workshop to improve the efficiency of machining parameter selection.
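
    The record above describes screening a coarse grid of cutting conditions and then refining only near the stability boundary. The sketch below illustrates that coarse-to-fine idea with an invented stand-in stability criterion; it is not the discretization-based chatter model used in the paper.

```python
# Sketch: coarse screening of a (spindle speed, depth of cut) grid followed by
# fine sampling only near the stability boundary. The stability test here is a
# stand-in function, not the time-domain discretization used in the paper.
import numpy as np

def is_stable(speed_rpm, depth_mm):
    # Placeholder "lobe-like" criterion for illustration only.
    limit = 2.0 + 1.5 * np.abs(np.sin(speed_rpm / 800.0))
    return depth_mm <= limit

speeds = np.linspace(2000, 12000, 21)     # coarse speed grid
depths = np.linspace(0.5, 5.0, 10)        # coarse depth grid

refine = []
for s in speeds:
    stable_flags = [is_stable(s, d) for d in depths]
    # Mark speed columns whose coarse column contains the stable/unstable transition.
    if any(stable_flags) and not all(stable_flags):
        refine.append(s)

# Fine search of the critical depth only where the boundary was detected.
boundary = {}
for s in refine:
    fine_depths = np.linspace(0.5, 5.0, 200)
    stable_fine = fine_depths[[is_stable(s, d) for d in fine_depths]]
    boundary[round(float(s))] = float(stable_fine.max())

print(f"refined {len(refine)} of {len(speeds)} speed columns")
print(boundary)
```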

  5. Identifying prognostic features by bottom-up approach and correlating to drug repositioning.

    Directory of Open Access Journals (Sweden)

    Wei Li

    Full Text Available Traditionally, a top-down method has been used to identify prognostic features in cancer research: differentially expressed genes, usually in cancer versus normal tissue, are identified and then tested for survival prediction power. The problem is that prognostic features identified from one set of patient samples can rarely be transferred to other datasets. We apply a bottom-up approach in this study: survival-correlated or clinical-stage-correlated genes were selected first and additionally prioritized by their network topology, so that a small set of features can be used as a prognostic signature. Gene expression profiles of a cohort of 221 hepatocellular carcinoma (HCC) patients were used as a training set, the 'bottom-up' approach was applied to discover gene-expression signatures associated with survival in both tumor and adjacent non-tumor tissues, and the results were compared with the 'top-down' approach. The results were validated in a second cohort of 82 patients, which was used as a testing set. Two sets of gene signatures, separately identified in tumor and adjacent non-tumor tissues by the bottom-up approach, were developed in the training cohort. These two signatures were associated with overall survival times of HCC patients, the robustness of each was validated in the testing set, and each predictive performance was better than gene expression signatures reported previously. Moreover, genes in these two prognostic signatures gave some indications for drug repositioning in HCC: some approved drugs targeting these markers have potential alternative indications for hepatocellular carcinoma. Using the bottom-up approach, we have developed two prognostic gene signatures with a limited number of genes that are associated with overall survival times of patients with HCC. Furthermore, prognostic markers in these two signatures have the potential to be therapeutic targets.

  6. Identifying bully victims: definitional versus behavioral approaches.

    Science.gov (United States)

    Green, Jennifer Greif; Felix, Erika D; Sharkey, Jill D; Furlong, Michael J; Kras, Jennifer E

    2013-06-01

    Schools frequently assess bullying, and the Olweus Bully/Victimization Questionnaire (BVQ; Olweus, 1996) is the most widely adopted tool for this purpose. The BVQ is a self-report survey that uses a definitional measurement method, describing "bullying" as involving repeated, intentional aggression in a relationship where there is an imbalance of power and then asking respondents to indicate how frequently they experienced this type of victimization. Few studies have examined BVQ validity and whether this definitional method truly identifies the repetition and power differential that distinguish bullying from other forms of peer victimization. This study examined the concurrent validity of the BVQ definitional question among 435 students reporting peer victimization. BVQ definitional responses were compared with responses to a behavioral measure that did not use the term "bullying" but, instead, included items that asked about its defining characteristics (repetition, intentionality, power imbalance). Concordance between the two approaches was moderate, with an area under the receiver operating characteristic curve of .72. BVQ responses were more strongly associated with students indicating repeated victimization and multiple forms of victimization than with power imbalance in their relationship with the bully. Findings indicate that the BVQ is a valid measure of repeated victimization and a broad range of victimization experiences but may not detect the more subtle and complex power imbalances that distinguish bullying from other forms of peer victimization. PsycINFO Database Record (c) 2013 APA, all rights reserved.
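
    As an illustration of quantifying concordance between a definitional item and a behavioral classification, the sketch below computes an ROC AUC and Cohen's kappa on synthetic responses; the data and agreement rate are invented and do not reproduce the BVQ study.

```python
# Sketch: concordance between a definitional "are you bullied?" item and a
# behavioral classification, summarized by ROC AUC and Cohen's kappa.
# Synthetic responses for illustration; not the BVQ study data.
import numpy as np
from sklearn.metrics import roc_auc_score, cohen_kappa_score

rng = np.random.default_rng(1)
n = 435

# Behavioral "gold standard": victimization that is repeated, intentional, with power imbalance.
behavioral = rng.binomial(1, 0.3, size=n)

# Definitional self-report agrees imperfectly with the behavioral classification.
agree = rng.random(n) < 0.75
definitional = np.where(agree, behavioral, 1 - behavioral)

print("AUC  :", round(roc_auc_score(behavioral, definitional), 2))
print("kappa:", round(cohen_kappa_score(behavioral, definitional), 2))
```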

  7. A discrete wavelet spectrum approach for identifying non-monotonic trends in hydroclimate data

    Science.gov (United States)

    Sang, Yan-Fang; Sun, Fubao; Singh, Vijay P.; Xie, Ping; Sun, Jian

    2018-01-01

    The hydroclimatic process is changing non-monotonically and identifying its trends is a great challenge. Building on the discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961-2013 and found that the DWS approach detected both the warming and the warming hiatus in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e. the widely defined climate timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and was not detected by the Mann-Kendall test. Comparatively, the DWS approach overcame the problem and detected those significant non-monotonic trends at 380 stations, which helped understand and interpret the spatiotemporal variability in the hydroclimatic process. Our results suggest that non-monotonic trends of hydroclimate time series and their significance should be carefully identified, and the DWS approach proposed has the potential for wide use in the hydrological and climate sciences.
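
    As a minimal sketch of the wavelet decomposition step (not the DWS significance test developed in the paper), the code below uses PyWavelets to keep only the coarse approximation of a synthetic annual series as an estimate of its non-monotonic trend.

```python
# Sketch: extract a slowly varying (possibly non-monotonic) trend from an annual
# series by keeping only the coarse approximation of a discrete wavelet
# decomposition (PyWavelets). Decomposition step only; no significance testing.
import numpy as np
import pywt

years = np.arange(1961, 2014).astype(float)
rng = np.random.default_rng(2)
# Synthetic "annual temperature anomaly": warming, then a hiatus-like flattening, plus noise.
signal = np.where(years < 1998, 0.02 * (years - 1961), 0.74 + 0.002 * (years - 1998))
series = signal + rng.normal(0.0, 0.15, size=years.size)

level = 2
coeffs = pywt.wavedec(series, "db4", level=level)
# Zero all detail coefficients; keep only the level-2 approximation.
trend_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(trend_coeffs, "db4")[: series.size]

print("first and last decade of the reconstructed trend:")
print(np.round(trend[:10], 2))
print(np.round(trend[-10:], 2))
```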

  8. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  9. A discrete wavelet spectrum approach for identifying non-monotonic trends in hydroclimate data

    Directory of Open Access Journals (Sweden)

    Y.-F. Sang

    2018-01-01

    Full Text Available The hydroclimatic process is changing non-monotonically and identifying its trends is a great challenge. Building on the discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961–2013 and found that the DWS approach detected both the warming and the warming hiatus in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e. the widely defined climate timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and was not detected by the Mann–Kendall test. Comparatively, the DWS approach overcame the problem and detected those significant non-monotonic trends at 380 stations, which helped understand and interpret the spatiotemporal variability in the hydroclimatic process. Our results suggest that non-monotonic trends of hydroclimate time series and their significance should be carefully identified, and the DWS approach proposed has the potential for wide use in the hydrological and climate sciences.

  10. A probabilistic approach for validating protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Wang Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary-structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.
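
    As a rough illustration of flagging suspicious assignments from residue-specific chemical shift distributions, the sketch below z-scores observed CA shifts against expected means and standard deviations. The expected values are approximate placeholders and the procedure is far simpler than PANAV's fragment-based statistics.

```python
# Sketch: flag potentially mis-assigned backbone CA shifts by z-scoring each
# observation against a residue-specific expected distribution. The expected
# means/SDs below are rough placeholders, not PANAV's fragment-based statistics.
import numpy as np

# Placeholder residue-specific CA shift statistics (ppm): {residue: (mean, sd)}
expected = {"ALA": (52.8, 2.0), "GLY": (45.4, 1.3), "LEU": (55.3, 2.1), "SER": (58.6, 2.1)}

# Hypothetical assigned shifts: (residue number, residue type, observed CA shift)
assigned = [(1, "ALA", 53.1), (2, "GLY", 45.0), (3, "LEU", 62.5), (4, "SER", 58.9)]

for resnum, restype, shift in assigned:
    mean, sd = expected[restype]
    z = (shift - mean) / sd
    flag = "SUSPECT" if abs(z) > 3.0 else "ok"
    print(f"{resnum:3d} {restype}  CA={shift:6.2f} ppm  z={z:+.1f}  {flag}")
```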

  11. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA (draft paper DETC2015-46982): Development of a Conservative Model Validation Approach for Reliable Analysis. ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... the proposed conservative model validation is briefly compared to the conventional model validation approach ... describes how to account ...

  12. The bottom-up approach to integrative validity: a new perspective for program evaluation.

    Science.gov (United States)

    Chen, Huey T

    2010-08-01

    The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model includes the concepts of internal and external validity and, within it, gives preeminence to internal validity, as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that in an evaluation the over-emphasis on internal validity reduces that evaluation's usefulness and contributes to the gulf between academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive alternative, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and therefore becomes a preferable alternative for evaluation of health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate advancing external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  13. Validation of search filters for identifying pediatric studies in PubMed

    NARCIS (Netherlands)

    Leclercq, Edith; Leeflang, Mariska M. G.; van Dalen, Elvira C.; Kremer, Leontien C. M.

    2013-01-01

    To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity,

  14. Identifying Careless Responding With the Psychopathic Personality Inventory-Revised Validity Scales.

    Science.gov (United States)

    Marcus, David K; Church, Abere Sawaqdeh; O'Connell, Debra; Lilienfeld, Scott O

    2018-01-01

    The Psychopathic Personality Inventory-Revised (PPI-R) includes validity scales that assess Deviant Responding (DR), Virtuous Responding, and Inconsistent Responding. We examined the utility of these scales for identifying careless responding using data from two online studies that examined correlates of psychopathy in college students (Sample 1: N = 583; Sample 2: N = 454). Compared with those below the cut scores, those above the cut on the DR scale yielded consistently lower validity coefficients when PPI-R scores were correlated with corresponding scales from the Triarchic Psychopathy Measure. The other three PPI-R validity scales yielded weaker and less consistent results. Participants who completed the studies in an inordinately brief amount of time scored significantly higher on the DR and Virtuous Responding scales than other participants. Based on the findings from the current studies, researchers collecting PPI-R data online should consider identifying and perhaps screening out respondents with elevated scores on the DR scale.

  15. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information-theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes measurement noise into account and is not restricted to any particular noise structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems, ranging from systems governed by ordinary differential equations to partial differential equations, and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
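
    As a rough proxy for the paper's information-gain computation, the sketch below uses a k-nearest-neighbour mutual information estimator (scikit-learn) between prior parameter samples and a noisy model output of a simple exponential-decay model; the model, priors, and noise level are assumptions for illustration.

```python
# Sketch: rank parameters of a simple dynamical model by how much information
# noisy observations carry about them, using a k-NN mutual information
# estimator. A rough proxy for practical identifiability, not the full
# Bayesian information-gain computation of the paper.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(3)
n_samples, t = 2000, np.linspace(0.0, 5.0, 20)

# Prior samples of two parameters of y(t) = a * exp(-b t).
a = rng.uniform(0.5, 2.0, n_samples)
b = rng.uniform(0.1, 1.5, n_samples)

# Noisy "measurement": the model output at one observation time.
y_obs = a * np.exp(-b * t[10]) + rng.normal(0.0, 0.05, n_samples)

for name, theta in [("a", a), ("b", b)]:
    mi = mutual_info_regression(y_obs.reshape(-1, 1), theta, n_neighbors=5, random_state=0)[0]
    print(f"MI(y_obs; {name}) ~ {mi:.3f} nats")
```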

  16. Research Resource: A Dual Proteomic Approach Identifies Regulated Islet Proteins During β-Cell Mass Expansion In Vivo

    DEFF Research Database (Denmark)

    Horn, Signe; Kirkegaard, Jeannette S.; Hoelper, Soraya

    2016-01-01

    to be up regulated as a response to pregnancy. These included several proteins, not previously associated with pregnancy-induced islet expansion, such as CLIC1, STMN1, MCM6, PPIB, NEDD4, and HLTF. Confirming the validity of our approach, we also identified proteins encoded by genes known to be associated...

  17. Validity of a family-centered approach for assessing infants' social-emotional wellbeing and their developmental context : a prospective cohort study

    NARCIS (Netherlands)

    Hielkema, Margriet; De Winter, Andrea F.; Reijneveld, Sijmen A.

    2017-01-01

    Background: Family-centered care seems promising in preventive pediatrics, but evidence is lacking as to whether this type of care is also valid as a means to identify risks to infants' social-emotional development. We aimed to examine the validity of such a family-centered approach. Methods: We

  18. Validation of assessment tools for identifying trauma symptomatology in young children exposed to trauma

    DEFF Research Database (Denmark)

    Schandorph Løkkegaard, Sille; Elmose, Mette; Elklit, Ask

    There is a lack of Danish validated, developmentally sensitive assessment tools for preschool and young school children exposed to psychological trauma. Consequently, young traumatised children are at risk of not being identified. The purpose of this project is to validate three assessment tools...... that identify trauma symptomatology in young children; a caregiver interview called the Diagnostic Infant and Preschool Assessment (DIPA), a structured play test called the Odense Child Trauma Screening (OCTS), and a child questionnaire called the Darryl Cartoon Test. Three validity studies were conducted...

  19. The validity of using ICD-9 codes and pharmacy records to identify patients with chronic obstructive pulmonary disease

    Directory of Open Access Journals (Sweden)

    Lee Todd A

    2011-02-01

    Full Text Available Abstract Background Administrative data are often used to identify patients with chronic obstructive pulmonary disease (COPD), yet the validity of this approach is unclear. We sought to develop a predictive model utilizing administrative data to accurately identify patients with COPD. Methods Sequential logistic regression models were constructed using 9573 patients with postbronchodilator spirometry at two Veterans Affairs medical centers (2003-2007). COPD was defined as an FEV1/FVC <0.70. Results 4564 of 9573 patients (47.7%) had an FEV1/FVC <0.70. Conclusion Commonly used definitions of COPD in observational studies misclassify the majority of patients as having COPD. Using multiple diagnostic codes in combination with pharmacy data improves the ability to accurately identify patients with COPD.
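
    As an illustration of the modelling strategy described above (not the study's actual model or data), the sketch below fits a logistic regression on synthetic counts of COPD diagnosis codes and bronchodilator fills and reports the positive predictive value at an arbitrary probability threshold.

```python
# Sketch: logistic regression combining diagnosis-code counts and pharmacy fills
# to predict spirometry-confirmed COPD, with positive predictive value at a
# chosen threshold. Entirely synthetic data; feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score

rng = np.random.default_rng(4)
n = 5000
copd = rng.binomial(1, 0.48, size=n)                        # spirometry-confirmed label
icd9_copd_codes = rng.poisson(1 + 3 * copd)                 # number of COPD diagnosis codes
bronchodilator_fills = rng.poisson(0.5 + 2.5 * copd)        # pharmacy fills in the past year
X = np.column_stack([icd9_copd_codes, bronchodilator_fills])

X_tr, X_te, y_tr, y_te = train_test_split(X, copd, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

pred = (model.predict_proba(X_te)[:, 1] >= 0.6).astype(int)  # threshold chosen for illustration
print("PPV at threshold 0.6:", round(precision_score(y_te, pred), 3))
```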

  20. Combining phenotypic and proteomic approaches to identify membrane targets in a ‘triple negative’ breast cancer cell type

    Directory of Open Access Journals (Sweden)

    Rust Steven

    2013-02-01

    Full Text Available Abstract Background The continued discovery of therapeutic antibodies, which address unmet medical needs, requires the continued discovery of tractable antibody targets. Multiple protein-level target discovery approaches are available and these can be used in combination to extensively survey relevant cell membranomes. In this study, the MDA-MB-231 cell line was selected for membranome survey as it is a ‘triple negative’ breast cancer cell line, which represents a cancer subtype that is aggressive and has few treatment options. Methods The MDA-MB-231 breast carcinoma cell line was used to explore three membranome target discovery approaches, which were used in parallel to cross-validate the significance of identified antigens. A proteomic approach, which used membrane protein enrichment followed by protein identification by mass spectrometry, was used alongside two phenotypic antibody screening approaches. The first phenotypic screening approach was based on hybridoma technology and the second was based on phage display technology. Antibodies isolated by the phenotypic approaches were tested for cell specificity as well as internalisation and the targets identified were compared to each other as well as those identified by the proteomic approach. An anti-CD73 antibody derived from the phage display-based phenotypic approach was tested for binding to other ‘triple negative’ breast cancer cell lines and tested for tumour growth inhibitory activity in a MDA-MB-231 xenograft model. Results All of the approaches identified multiple cell surface markers, including integrins, CD44, EGFR, CD71, galectin-3, CD73 and BCAM, some of which had been previously confirmed as being tractable to antibody therapy. In total, 40 cell surface markers were identified for further study. In addition to cell surface marker identification, the phenotypic antibody screening approaches provided reagent antibodies for target validation studies. This is illustrated

  1. The relative validity and reproducibility of an iron food frequency questionnaire for identifying iron-related dietary patterns in young women.

    Science.gov (United States)

    Beck, Kathryn L; Kruger, Rozanne; Conlon, Cathryn A; Heath, Anne-Louise M; Coad, Jane; Matthys, Christophe; Jones, Beatrix; Stonehouse, Welma

    2012-08-01

    Using food frequency data to identify dietary patterns is a newly emerging approach to assessing the relationship between dietary intake and iron status. Food frequency questionnaires should be assessed for validity and reproducibility before use. We aimed to investigate the relative validity and reproducibility of an iron food frequency questionnaire (FeFFQ) specifically designed to identify iron-related dietary patterns. Participants completed the FeFFQ at baseline (FeFFQ1) and 1 month later (FeFFQ2) to assess reproducibility. A 4-day weighed diet record (4DDR) was completed between these assessments to determine validity. Foods appearing in the 4DDR were classified into the same 144 food groupings as the FeFFQ. Factor analysis was used to determine dietary patterns from FeFFQ1, FeFFQ2, and the 4DDR. A convenience sample of women (n=115) aged 18 to 44 years living in Auckland, New Zealand, during 2009 participated. Agreement between diet pattern scores was compared using correlation coefficients, Bland-Altman analysis, cross-classification, and the weighted κ statistic. A "healthy" and a "sandwich and drinks" dietary pattern were identified from all three dietary assessments. Correlation coefficients between FeFFQ1 and the 4DDR diet pattern scores (validity) were 0.34 for the healthy and 0.62 for the sandwich and drinks pattern (both Ps<.05). Cross-classification placed more than 50% of participants into the correct tertile and fewer than 10% into the opposite tertile for both the healthy and the sandwich and drinks diet pattern scores when FeFFQ1 was compared with the 4DDR and FeFFQ2. The FeFFQ appears to be a reproducible and relatively valid method for identifying dietary patterns, and could be used to investigate the relationship between dietary patterns and iron status. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
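
    As an illustration of deriving a dietary-pattern score by factor analysis and assessing agreement between two administrations, the sketch below uses synthetic food-group frequencies and reports a correlation, a quadratic-weighted kappa, and tertile agreement; all data and dimensions are invented.

```python
# Sketch: derive a dietary-pattern score by factor analysis from food-group
# frequencies, then compare two administrations with correlation, tertile
# cross-classification and a weighted kappa. Synthetic data for illustration.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.metrics import cohen_kappa_score
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
n_women, n_food_groups = 115, 20

latent = rng.normal(size=n_women)                          # underlying "healthy pattern"
loadings = rng.normal(size=n_food_groups)
ffq1 = np.outer(latent, loadings) + rng.normal(0.0, 1.0, (n_women, n_food_groups))
ffq2 = np.outer(latent, loadings) + rng.normal(0.0, 1.0, (n_women, n_food_groups))

fa = FactorAnalysis(n_components=1, random_state=0).fit(ffq1)
score1 = fa.transform(ffq1).ravel()
score2 = fa.transform(ffq2).ravel()

r, _ = pearsonr(score1, score2)
tertile = lambda x: np.digitize(x, np.quantile(x, [1 / 3, 2 / 3]))
kappa = cohen_kappa_score(tertile(score1), tertile(score2), weights="quadratic")
same_tertile = np.mean(tertile(score1) == tertile(score2))

print(f"r = {r:.2f}, weighted kappa = {kappa:.2f}, same tertile = {same_tertile:.0%}")
```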

  2. Validation of new prognostic and predictive scores by sequential testing approach

    International Nuclear Information System (INIS)

    Nieder, Carsten; Haukland, Ellinor; Pawinski, Adam; Dalhaug, Astrid

    2010-01-01

    Background and Purpose: For practitioners, the question arises how their own patient population differs from that used in large-scale analyses resulting in new scores and nomograms and whether such tools actually are valid at a local level and thus can be implemented. A recent article proposed an easy-to-use method for the in-clinic validation of new prediction tools with a limited number of patients, a so-called sequential testing approach. The present study evaluates this approach in scores related to radiation oncology. Material and Methods: Three different scores were used, each predicting short overall survival after palliative radiotherapy (bone metastases, brain metastases, metastatic spinal cord compression). For each scenario, a limited number of consecutive patients entered the sequential testing approach. The positive predictive value (PPV) was used for validation of the respective score and it was required that the PPV exceeded 80%. Results: For two scores, validity in the own local patient population could be confirmed after entering 13 and 17 patients, respectively. For the third score, no decision could be reached even after increasing the sample size to 30. Conclusion: In-clinic validation of new predictive tools with sequential testing approach should be preferred over uncritical adoption of tools which provide no significant benefit to local patient populations. Often the necessary number of patients can be reached within reasonable time frames even in small oncology practices. In addition, validation is performed continuously as the data are collected. (orig.)
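
    A simplified sketch of sequential in-clinic validation is shown below: after each consecutive score-positive patient, an exact binomial lower confidence bound on the PPV is recomputed and testing stops once it clears 80% or a patient cap is reached. The outcomes, cap, and stopping rule are illustrative assumptions, not the exact scheme of the cited article.

```python
# Sketch: sequential in-clinic check of a score's positive predictive value.
# After each consecutive patient predicted to have short survival, compute an
# exact (Clopper-Pearson) lower confidence bound on the PPV and stop once it
# clears 80% or a patient cap is reached. Simplified illustration only.
from scipy.stats import beta

def lower_bound(successes, n, alpha=0.05):
    # One-sided Clopper-Pearson lower bound for a binomial proportion.
    if successes == 0:
        return 0.0
    return beta.ppf(alpha, successes, n - successes + 1)

# Hypothetical outcomes for consecutive score-positive patients (1 = prediction correct).
outcomes = [1] * 10 + [0] + [1] * 13
target, cap = 0.80, 30

hits = 0
for i, correct in enumerate(outcomes, start=1):
    hits += correct
    lb = lower_bound(hits, i)
    print(f"patient {i:2d}: PPV={hits / i:.2f}, lower bound={lb:.2f}")
    if lb >= target:
        print(f"score validated locally after {i} patients")
        break
    if i >= cap:
        print("no decision within the patient cap")
        break
```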

  3. Validation of new prognostic and predictive scores by sequential testing approach

    Energy Technology Data Exchange (ETDEWEB)

    Nieder, Carsten [Radiation Oncology Unit, Nordland Hospital, Bodo (Norway); Inst. of Clinical Medicine, Univ. of Tromso (Norway); Haukland, Ellinor; Pawinski, Adam; Dalhaug, Astrid [Radiation Oncology Unit, Nordland Hospital, Bodo (Norway)

    2010-03-15

    Background and Purpose: For practitioners, the question arises how their own patient population differs from that used in large-scale analyses resulting in new scores and nomograms and whether such tools actually are valid at a local level and thus can be implemented. A recent article proposed an easy-to-use method for the in-clinic validation of new prediction tools with a limited number of patients, a so-called sequential testing approach. The present study evaluates this approach in scores related to radiation oncology. Material and Methods: Three different scores were used, each predicting short overall survival after palliative radiotherapy (bone metastases, brain metastases, metastatic spinal cord compression). For each scenario, a limited number of consecutive patients entered the sequential testing approach. The positive predictive value (PPV) was used for validation of the respective score and it was required that the PPV exceeded 80%. Results: For two scores, validity in the own local patient population could be confirmed after entering 13 and 17 patients, respectively. For the third score, no decision could be reached even after increasing the sample size to 30. Conclusion: In-clinic validation of new predictive tools with sequential testing approach should be preferred over uncritical adoption of tools which provide no significant benefit to local patient populations. Often the necessary number of patients can be reached within reasonable time frames even in small oncology practices. In addition, validation is performed continuously as the data are collected. (orig.)

  4. A knowledge-driven approach to cluster validity assessment.

    Science.gov (United States)

    Bolshakova, Nadia; Azuaje, Francisco; Cunningham, Pádraig

    2005-05-15

    This paper presents an approach to assessing cluster validity based on similarity knowledge extracted from the Gene Ontology. The program is freely available for non-profit use on request from the authors.

  5. A robust approach to QMU, validation, and conservative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, Daniel Joseph; Paez, Thomas Lee; Bauman, Lara E

    2013-01-01

    A systematic approach to defining margin is developed that incorporates statistical information and accommodates data uncertainty, but does not require assumptions about the specific forms of the tails of distributions. This approach extends to calculations underlying validation assessment and quantitatively conservative predictions.

  6. Distributed design approach in persistent identifiers systems

    Science.gov (United States)

    Golodoniuc, Pavel; Car, Nicholas; Klump, Jens

    2017-04-01

    The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementations, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have catered for identifier uniqueness, integrity, persistence, and trustworthiness, regardless of the identifier's application domain, the scope of which has expanded significantly in the past two decades. Since many PID systems have been largely conceived and developed by small communities, or even a single organisation, they have faced challenges in gaining widespread adoption and, most importantly, the ability to survive change of technology. This has left a legacy of identifiers that still exist and are being used but which have lost their resolution service. We believe that one of the causes of once successful PID systems fading is their reliance on a centralised technical infrastructure or a governing authority. Golodoniuc et al. (2016) proposed an approach to the development of PID systems that combines the use of (a) the Handle system, as a distributed system for the registration and first-degree resolution of persistent identifiers, and (b) the PID Service (Golodoniuc et al., 2015), to enable fine-grained resolution to different information object representations. The proposed approach solved the problem of guaranteed first-degree resolution of identifiers, but left fine-grained resolution and information delivery under the control of a single authoritative source, posing risk to the long-term availability of information resources. Herein, we develop these approaches further and explore the potential of large-scale decentralisation at all levels: (i) persistent identifiers and information resources registration; (ii) identifier resolution; and (iii) data delivery. To achieve large-scale decentralisation

  7. An Argument Approach to Observation Protocol Validity

    Science.gov (United States)

    Bell, Courtney A.; Gitomer, Drew H.; McCaffrey, Daniel F.; Hamre, Bridget K.; Pianta, Robert C.; Qi, Yi

    2012-01-01

    This article develops a validity argument approach for use on observation protocols currently used to assess teacher quality for high-stakes personnel and professional development decisions. After defining the teaching quality domain, we articulate an interpretive argument for observation protocols. To illustrate the types of evidence that might…

  8. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    Science.gov (United States)

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The strategy of validation was applied to a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of screening qualitative methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method can detect and identify 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.

  9. MicroRNA expression profiling to identify and validate reference genes for relative quantification in colorectal cancer.

    LENUS (Irish Health Repository)

    Chang, Kah Hoong

    2010-01-01

    BACKGROUND: Advances in high-throughput technologies and bioinformatics have transformed gene expression profiling methodologies. The results of microarray experiments are often validated using reverse transcription quantitative PCR (RT-qPCR), which is the most sensitive and reproducible method to quantify gene expression. Appropriate normalisation of RT-qPCR data using stably expressed reference genes is critical to ensure accurate and reliable results. Mi(cro)RNA expression profiles have been shown to be more accurate in disease classification than mRNA expression profiles. However, few reports detailed a robust identification and validation strategy for suitable reference genes for normalisation in miRNA RT-qPCR studies. METHODS: We adopt and report a systematic approach to identify the most stable reference genes for miRNA expression studies by RT-qPCR in colorectal cancer (CRC). High-throughput miRNA profiling was performed on ten pairs of CRC and normal tissues. By using the mean expression value of all expressed miRNAs, we identified the most stable candidate reference genes for subsequent validation. As such the stability of a panel of miRNAs was examined on 35 tumour and 39 normal tissues. The effects of normalisers on the relative quantity of established oncogenic (miR-21 and miR-31) and tumour suppressor (miR-143 and miR-145) target miRNAs were assessed. RESULTS: In the array experiment, miR-26a, miR-345, miR-425 and miR-454 were identified as having expression profiles closest to the global mean. From a panel of six miRNAs (let-7a, miR-16, miR-26a, miR-345, miR-425 and miR-454) and two small nucleolar RNA genes (RNU48 and Z30), miR-16 and miR-345 were identified as the most stably expressed reference genes. The combined use of miR-16 and miR-345 to normalise expression data enabled detection of a significant dysregulation of all four target miRNAs between tumour and normal colorectal tissue. CONCLUSIONS: Our study demonstrates that the top six most
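
    As a minimal sketch of the normalisation strategy described above, the code below ranks candidate reference miRNAs by how closely they track the per-sample global mean of a synthetic Cq matrix and then normalises a target miRNA against the mean of two references (delta-Cq). The miRNA names and values are placeholders.

```python
# Sketch: pick candidate reference miRNAs whose expression tracks the global
# mean across samples, then normalize a target miRNA against the mean of two
# references (delta-Cq). Synthetic Cq values; miRNA names are placeholders.
import numpy as np

rng = np.random.default_rng(6)
mirnas = [f"miR-{i}" for i in range(1, 51)]
n_samples = 20

# Synthetic Cq matrix (rows = miRNAs, columns = samples); lower Cq = higher expression.
cq = rng.normal(28.0, 2.0, size=(len(mirnas), n_samples)) + rng.normal(0.0, 0.8, size=(1, n_samples))

global_mean = cq.mean(axis=0)                      # per-sample mean of all expressed miRNAs
# Rank candidates by how closely they follow the global mean across samples.
deviation = np.std(cq - global_mean, axis=1)
candidates = [mirnas[i] for i in np.argsort(deviation)[:4]]
print("candidate reference miRNAs:", candidates)

# Delta-Cq normalization of a target against the mean of the top two candidates.
ref_idx = np.argsort(deviation)[:2]
target_idx = 10
delta_cq = cq[target_idx] - cq[ref_idx].mean(axis=0)
relative_quantity = 2.0 ** (-delta_cq)
print("relative quantity of", mirnas[target_idx], "in first 5 samples:",
      np.round(relative_quantity[:5], 2))
```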

  10. MicroRNA expression profiling to identify and validate reference genes for relative quantification in colorectal cancer

    LENUS (Irish Health Repository)

    Chang, Kah Hoong

    2010-04-29

    Abstract Background Advances in high-throughput technologies and bioinformatics have transformed gene expression profiling methodologies. The results of microarray experiments are often validated using reverse transcription quantitative PCR (RT-qPCR), which is the most sensitive and reproducible method to quantify gene expression. Appropriate normalisation of RT-qPCR data using stably expressed reference genes is critical to ensure accurate and reliable results. Mi(cro)RNA expression profiles have been shown to be more accurate in disease classification than mRNA expression profiles. However, few reports detailed a robust identification and validation strategy for suitable reference genes for normalisation in miRNA RT-qPCR studies. Methods We adopt and report a systematic approach to identify the most stable reference genes for miRNA expression studies by RT-qPCR in colorectal cancer (CRC). High-throughput miRNA profiling was performed on ten pairs of CRC and normal tissues. By using the mean expression value of all expressed miRNAs, we identified the most stable candidate reference genes for subsequent validation. As such the stability of a panel of miRNAs was examined on 35 tumour and 39 normal tissues. The effects of normalisers on the relative quantity of established oncogenic (miR-21 and miR-31) and tumour suppressor (miR-143 and miR-145) target miRNAs were assessed. Results In the array experiment, miR-26a, miR-345, miR-425 and miR-454 were identified as having expression profiles closest to the global mean. From a panel of six miRNAs (let-7a, miR-16, miR-26a, miR-345, miR-425 and miR-454) and two small nucleolar RNA genes (RNU48 and Z30), miR-16 and miR-345 were identified as the most stably expressed reference genes. The combined use of miR-16 and miR-345 to normalise expression data enabled detection of a significant dysregulation of all four target miRNAs between tumour and normal colorectal tissue. Conclusions Our study demonstrates that the top six most

  11. An integrated approach to validation of safeguards and security program performance

    International Nuclear Information System (INIS)

    Altman, W.D.; Hunt, J.S.; Hockert, J.W.

    1988-01-01

    Department of Energy (DOE) requirements for safeguards and security programs are becoming increasingly performance oriented. Master Safeguards and Security Agreements specify performance levels for systems protecting DOE security interests. In order to measure and validate security system performance, Lawrence Livermore National Laboratory (LLNL) has developed cost-effective validation tools and a comprehensive validation approach that synthesizes information gained from different activities, such as force-on-force exercises, limited-scope performance tests, equipment testing, vulnerability analyses, and computer modeling, into an overall assessment of the performance of the protection system. The analytic approach employs logic diagrams adapted from the fault and event trees used in probabilistic risk assessment. The synthesis of the results from the various validation activities is accomplished using a method developed by LLNL, based upon Bayes' theorem.
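
    As a toy illustration of synthesising results from different validation activities (not the LLNL logic-diagram method), the sketch below pools hypothetical detection counts from several activities into a single beta-binomial posterior for detection probability.

```python
# Sketch: combine detection results from several validation activities into a
# posterior estimate of protection-element effectiveness using a beta-binomial
# update. A toy illustration of Bayesian synthesis; all counts are hypothetical.
from scipy.stats import beta

# Hypothetical results: (activity, detections, trials)
results = [
    ("force-on-force exercises", 4, 5),
    ("limited-scope performance tests", 9, 10),
    ("equipment tests", 18, 20),
]

a, b = 1.0, 1.0  # uniform Beta(1, 1) prior on detection probability
for activity, detections, trials in results:
    a += detections
    b += trials - detections
    print(f"after {activity}: posterior mean = {a / (a + b):.2f}")

lower_5pct = beta.ppf(0.05, a, b)
print(f"posterior 5th percentile of detection probability: {lower_5pct:.2f}")
```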

  12. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    Science.gov (United States)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises as to how we can identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses, the quantitative indication of mechanisms, an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. In this respect, working at
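
    As a sketch of the cluster-based pattern identification and robustness checking described above, the code below clusters synthetic dryland indicators with k-means, compares silhouette scores across cluster numbers, and checks partition stability on bootstrap resamples; indicators and values are invented.

```python
# Sketch: cluster-based identification of "vulnerability patterns" from a few
# quantitative indicators, choosing the number of clusters by silhouette score
# and checking robustness on bootstrap resamples. Synthetic indicator data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, adjusted_rand_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Synthetic indicators for 300 dryland regions: aridity, soil degradation, market access.
centers = np.array([[0.8, 0.7, 0.2], [0.4, 0.3, 0.8], [0.9, 0.2, 0.5]])
X = np.vstack([c + rng.normal(0, 0.08, size=(100, 3)) for c in centers])
X = StandardScaler().fit_transform(X)

for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}  silhouette={silhouette_score(X, labels):.2f}")

# Robustness: compare the k=3 partition with partitions fitted on bootstrap samples.
base = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
scores = []
for seed in range(10):
    idx = rng.integers(0, len(X), len(X))
    boot = KMeans(n_clusters=3, n_init=10, random_state=seed).fit(X[idx])
    scores.append(adjusted_rand_score(base.labels_, boot.predict(X)))
print("mean adjusted Rand index across resamples:", round(float(np.mean(scores)), 2))
```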

  13. A Systematic Review of Validated Methods for Identifying Cerebrovascular Accident or Transient Ischemic Attack Using Administrative Data

    Science.gov (United States)

    Andrade, Susan E.; Harrold, Leslie R.; Tjia, Jennifer; Cutrona, Sarah L.; Saczynski, Jane S.; Dodd, Katherine S.; Goldberg, Robert J.; Gurwitz, Jerry H.

    2012-01-01

    Purpose To perform a systematic review of the validity of algorithms for identifying cerebrovascular accidents (CVAs) or transient ischemic attacks (TIAs) using administrative and claims data. Methods PubMed and Iowa Drug Information Service (IDIS) searches of the English language literature were performed to identify studies published between 1990 and 2010 that evaluated the validity of algorithms for identifying CVAs (ischemic and hemorrhagic strokes, intracranial hemorrhage and subarachnoid hemorrhage) and/or TIAs in administrative data. Two study investigators independently reviewed the abstracts and articles to determine relevant studies according to pre-specified criteria. Results A total of 35 articles met the criteria for evaluation. Of these, 26 articles provided data to evaluate the validity of stroke, 7 reported the validity of TIA, 5 reported the validity of intracranial bleeds (intracerebral hemorrhage and subarachnoid hemorrhage), and 10 studies reported the validity of algorithms to identify the composite endpoints of stroke/TIA or cerebrovascular disease. Positive predictive values (PPVs) varied depending on the specific outcomes and algorithms evaluated. Specific algorithms to evaluate the presence of stroke and intracranial bleeds were found to have high PPVs (80% or greater). Algorithms to evaluate TIAs in adult populations were generally found to have PPVs of 70% or greater. Conclusions The algorithms and definitions to identify CVAs and TIAs using administrative and claims data differ greatly in the published literature. The choice of the algorithm employed should be determined by the stroke subtype of interest. PMID:22262598

  14. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in the future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided and show how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  15. Approaches to Validate and Manipulate RNA Targets with Small Molecules in Cells.

    Science.gov (United States)

    Childs-Disney, Jessica L; Disney, Matthew D

    2016-01-01

    RNA has become an increasingly important target for therapeutic interventions and for chemical probes that dissect and manipulate its cellular function. Emerging targets include human RNAs that have been shown to directly cause cancer, metabolic disorders, and genetic disease. In this review, we describe various routes to obtain bioactive compounds that target RNA, with a particular emphasis on the development of small molecules. We use these cases to describe approaches that are being developed for target validation, which include target-directed cleavage, classic pull-down experiments, and covalent cross-linking. Thus, tools are available to design small molecules to target RNA and to identify the cellular RNAs that are their targets.

  16. 32 species validation of a new Illumina paired-end approach for the development of microsatellites.

    Directory of Open Access Journals (Sweden)

    Stacey L Lance

    Full Text Available Development and optimization of novel species-specific microsatellites, or simple sequence repeats (SSRs), remains an important step for studies in ecology, evolution, and behavior. Numerous approaches exist for identifying new SSRs that vary widely in terms of both time and cost investments. A recent approach of using paired-end Illumina sequence data in conjunction with the bioinformatics pipeline PAL_FINDER has the potential to substantially reduce the cost and labor investment while also improving efficiency. However, it does not appear that the approach has been widely adopted, perhaps due to concerns over its broad applicability across taxa. Therefore, to validate the utility of the approach we developed SSRs for 32 species representing 30 families, 25 orders, 11 classes, and six phyla and optimized SSRs for 13 of the species. Overall the Illumina paired-end (IPE) method worked extremely well and we identified thousands of SSRs for all species (mean = 128,485), with 17% of loci being potentially amplifiable loci, and 25% of these met our most stringent criteria designed to avoid SSRs associated with repetitive elements. Approximately 61% of screened primers yielded strong amplification of a single locus.

  17. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    Full Text Available This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
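
    A minimal sketch of the idea, assuming synthetic feature vectors in place of the paper's identified beam models: the singular value decomposition extracts principal directions of parameter change, the validation system is projected onto those directions, and its distance outside the interval spanned by the identified models is reported.

```python
# Sketch: use the SVD of identified model variations to get principal directions
# of parameter change, project a validation system onto them, and measure how
# far its coordinates fall outside the interval spanned by the identified models.
# Synthetic feature vectors (e.g., modal parameters); not the paper's beam data.
import numpy as np

rng = np.random.default_rng(8)
n_models, n_features = 25, 6

# Feature vectors (e.g., natural frequencies and damping) of identified models.
nominal = np.array([10.0, 25.0, 47.0, 0.02, 0.015, 0.01])
variations = rng.normal(0.0, [0.5, 1.0, 2.0, 0.002, 0.002, 0.001], size=(n_models, n_features))
identified = nominal + variations

# Principal directions of parameter change from the centred identified set.
mean_vec = identified.mean(axis=0)
U, S, Vt = np.linalg.svd(identified - mean_vec, full_matrices=False)
coords_identified = (identified - mean_vec) @ Vt.T      # coordinates in principal directions

# Coordinate vector of a validation system and its distance to the interval model.
validation = nominal + rng.normal(0.0, [1.0, 2.0, 4.0, 0.004, 0.004, 0.002])
coords_val = (validation - mean_vec) @ Vt.T

lo, hi = coords_identified.min(axis=0), coords_identified.max(axis=0)
outside = np.maximum(coords_val - hi, 0.0) + np.maximum(lo - coords_val, 0.0)
print("coordinate distance outside the interval model:", round(float(np.linalg.norm(outside)), 3))
```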

  18. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    analyses and hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a "model supplement term" when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system, providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 °C, modeling the decomposition is critical to assessing a weapon's response. In the validation analysis it is indicated that the model tends to "exaggerate" the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model

  19. Application of Monte Carlo cross-validation to identify pathway cross-talk in neonatal sepsis.

    Science.gov (United States)

    Zhang, Yuxia; Liu, Cui; Wang, Jingna; Li, Xingxia

    2018-03-01

    To explore genetic pathway cross-talk in neonates with sepsis, an integrated approach was used in this paper. To explore the potential relationships between pathways and the genes differentially expressed between normal uninfected neonates and neonates with sepsis, gene expression profiling and biological signaling pathways were first integrated. For each pathway, a score was obtained based upon the gene expression by quantitatively analyzing the pathway cross-talk. The paired pathways with high cross-talk were identified by random forest classification. The purpose of the work was to find the best pairs of pathways able to discriminate sepsis samples from normal samples. The results identified 10 pairs of pathways that were probably able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways were identified according to analysis of the extensive literature. Impact statement To find the best pairs of pathways able to discriminate sepsis samples from normal samples, a random forest (RF) classifier, the discriminating scores (DS) obtained from the differentially expressed genes (DEGs) of significantly associated paired pathways, and Monte Carlo cross-validation were applied in this paper. Ten pairs of pathways were probably able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways ((7) IL-6 Signaling and Phospholipase C Signaling (PLC); (8) Glucocorticoid Receptor (GR) Signaling and Dendritic Cell Maturation) were identified according to analysis of the extensive literature.
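
    As an illustration of the classification and resampling scheme described above (with synthetic pathway activity scores rather than the study's DEG-derived scores), the sketch below evaluates hypothetical pathway pairs with a random forest under Monte Carlo cross-validation and ranks them by mean AUC.

```python
# Sketch: rank pathway pairs by how well their two activity scores separate
# sepsis from control samples, using a random forest evaluated with Monte Carlo
# cross-validation (repeated random train/test splits). Synthetic scores only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n_samples = 60
y = np.array([0] * 30 + [1] * 30)                 # 0 = control, 1 = sepsis

# Synthetic per-sample scores for three hypothetical pathway pairs.
pairs = {
    "IL-6 / PLC signaling": np.column_stack([rng.normal(y * 1.2, 1.0), rng.normal(y * 0.8, 1.0)]),
    "GR signaling / DC maturation": np.column_stack([rng.normal(y * 1.0, 1.0), rng.normal(y * 1.0, 1.0)]),
    "random pair": rng.normal(0.0, 1.0, size=(n_samples, 2)),
}

cv = StratifiedShuffleSplit(n_splits=50, test_size=0.3, random_state=0)   # Monte Carlo CV
for name, X in pairs.items():
    aucs = []
    for train, test in cv.split(X, y):
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[train], y[train])
        aucs.append(roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1]))
    print(f"{name}: mean AUC = {np.mean(aucs):.2f}")
```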

  20. Development of the methodology and approaches to validate safety and accident management

    International Nuclear Information System (INIS)

    Asmolov, V.G.

    1997-01-01

    The article compares the development of the methodology and approaches used to validate nuclear power plant safety and accident management in Russia and in advanced industrial countries. It demonstrates that the development of safety validation methods is dialectically related to the accumulation of the knowledge base on processes and events during NPP normal operation, transients and emergencies, including severe accidents. The article describes the Russian severe accident research program (1987-1996), the implementation of which allowed Russia to reach the world level of safety validation efforts, and presents future high-priority study areas. Problems related to possible approaches to the development of accident management methodology are discussed. (orig.)

  1. An Integrated Approach to Establish Validity and Reliability of Reading Tests

    Science.gov (United States)

    Razi, Salim

    2012-01-01

    This study presents the processes of developing and establishing the reliability and validity of a reading test by applying an integrative approach, since conventional reliability and validity measures only superficially reveal the difficulty of a reading test. In this respect, analysing the vocabulary frequency of the test is regarded as a more eligible way…

  2. Identifying barriers to and facilitators of tuberculosis contact investigation in Kampala, Uganda: a behavioral approach.

    Science.gov (United States)

    Ayakaka, Irene; Ackerman, Sara; Ggita, Joseph M; Kajubi, Phoebe; Dowdy, David; Haberer, Jessica E; Fair, Elizabeth; Hopewell, Philip; Handley, Margaret A; Cattamanchi, Adithya; Katamba, Achilles; Davis, J Lucian

    2017-03-09

    The World Health Organization recommends routine household tuberculosis contact investigation in high-burden countries but adoption has been limited. We sought to identify barriers to and facilitators of TB contact investigation during its introduction in Kampala, Uganda. We collected cross-sectional qualitative data through focus group discussions and interviews with stakeholders, addressing three core activities of contact investigation: arranging household screening visits through index TB patients, visiting households to screen contacts and refer them to clinics, and evaluating at-risk contacts coming to clinics. We analyzed the data using a validated theory of behavior change, the Capability, Opportunity, and Motivation determine Behavior (COM-B) model, and sought to identify targeted interventions using the related Behavior Change Wheel implementation framework. We led seven focus-group discussions with 61 health-care workers, two with 21 lay health workers (LHWs), and one with four household contacts of newly diagnosed TB patients. We, in addition, performed 32 interviews with household contacts from 14 households of newly diagnosed TB patients. Commonly noted barriers included stigma, limited knowledge about TB among contacts, insufficient time and space in clinics for counselling, mistrust of health-center staff among index patients and contacts, and high travel costs for LHWs and contacts. The most important facilitators identified were the personalized and enabling services provided by LHWs. We identified education, persuasion, enablement, modeling of health-positive behaviors, incentivization, and restructuring of the service environment as relevant intervention functions with potential to alleviate barriers to and enhance facilitators of TB contact investigation. The use of a behavioral theory and a validated implementation framework provided a comprehensive approach for systematically identifying barriers to and facilitators of TB contact

  3. Use of AUDIT-based measures to identify unhealthy alcohol use and alcohol dependence in primary care: a validation study.

    Science.gov (United States)

    Johnson, J Aaron; Lee, Anna; Vinson, Daniel; Seale, J Paul

    2013-01-01

    As programs for screening, brief intervention, and referral to treatment (SBIRT) for unhealthy alcohol use disseminate, evidence-based approaches for identifying patients with unhealthy alcohol use and alcohol dependence (AD) are needed. While the National Institute on Alcohol Abuse and Alcoholism Clinician Guide suggests use of a single alcohol screening question (SASQ) for screening and Diagnostic and Statistical Manual checklists for assessment, many SBIRT programs use alcohol use disorders identification test (AUDIT) "zones" for screening and assessment. Validation data for these zones are limited. This study used primary care data from a bi-ethnic southern U.S. population to examine the ability of the AUDIT zones and other AUDIT-based approaches to identify unhealthy alcohol use and dependence. Existing data were analyzed from interviews with 625 female and male adult drinkers presenting to 5 southeastern primary care practices. Timeline follow-back was used to identify at-risk drinking, and diagnostic interview schedule was used to identify alcohol abuse and dependence. Validity measures compared performance of AUDIT, AUDIT-C, and AUDIT dependence domains scores, with and without a 30-day binge drinking measure, for detecting unhealthy alcohol use and dependence. Optimal AUDIT scores for detecting unhealthy alcohol use were lower than current commonly used cutoffs (5 for men, 3 for women). Improved performance was obtained by combining AUDIT cutoffs of 6 for men and 4 for women with a 30-day binge drinking measure. AUDIT scores of 15 for men and 13 for women detected AD with 100% specificity but low sensitivity (20 and 18%, respectively). AUDIT dependence subscale scores of 2 or more showed similar specificity (99%) and slightly higher sensitivity (31% for men, 24% for women). Combining lower AUDIT cutoff scores and binge drinking measures may increase the detection of unhealthy alcohol use in primary care. Use of lower cutoff scores and dependence subscale

  4. Approaches to Demonstrating the Reliability and Validity of Core Diagnostic Criteria for Chronic Pain.

    Science.gov (United States)

    Bruehl, Stephen; Ohrbach, Richard; Sharma, Sonia; Widerstrom-Noga, Eva; Dworkin, Robert H; Fillingim, Roger B; Turk, Dennis C

    2016-09-01

    The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks-American Pain Society Pain Taxonomy (AAPT) is designed to be an evidence-based multidimensional chronic pain classification system that will facilitate more comprehensive and consistent chronic pain diagnoses, and thereby enhance research, clinical communication, and ultimately patient care. Core diagnostic criteria (dimension 1) for individual chronic pain conditions included in the initial version of AAPT will be the focus of subsequent empirical research to evaluate and provide evidence for their reliability and validity. Challenges to validating diagnostic criteria in the absence of clear and identifiable pathophysiological mechanisms are described. Based in part on previous experience regarding the development of evidence-based diagnostic criteria for psychiatric disorders, headache, and specific chronic pain conditions (fibromyalgia, complex regional pain syndrome, temporomandibular disorders, pain associated with spinal cord injuries), several potential approaches for documentation of the reliability and validity of the AAPT diagnostic criteria are summarized. The AAPT is designed to be an evidence-based multidimensional chronic pain classification system. Conceptual and methodological issues related to demonstrating the reliability and validity of the proposed AAPT chronic pain diagnostic criteria are discussed. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  5. The CARPEDIEM Algorithm: A Rule-Based System for Identifying Heart Failure Phenotype with a Precision Public Health Approach

    Directory of Open Access Journals (Sweden)

    Michela Franchini

    2018-01-01

    Full Text Available Modern medicine remains dependent on the accurate evaluation of a patient’s health state, recognizing that disease is a process that evolves over time and interacts with many factors unique to that patient. The CARPEDIEM project represents a concrete attempt to address these issues by developing reproducible algorithms to support accuracy in the detection of complex diseases. This study aims to establish and validate the CARPEDIEM approach and algorithm for identifying those patients presenting with or at risk of heart failure (HF) by studying 153,393 subjects in Italy; the algorithm is based on patient information flow databases and is not reliant on the electronic health record to accomplish its goals. The resulting algorithm has been validated in a two-stage process, comparing predicted results with (1) HF diagnosis as identified by general practitioners (GPs) among the reference cohort and (2) HF diagnosis as identified by cardiologists within a randomly sampled subpopulation of 389 patients. The sources of data used to detect HF cases are numerous and were standardized for this study. The accuracy and the predictive values of the algorithm with respect to the GP and clinical standards are highly consistent with those from previous studies. In particular, the algorithm is more efficient in detecting the more severe cases of HF according to the GPs’ validation (specificity increases according to the number of comorbidities) and external validation (NYHA: II–IV; HF severity index: 2, 3). Positive and negative predictive values reveal that the CARPEDIEM algorithm is most consistent with clinical evaluation performed in the specialist setting, while it presents a greater ability to rule out false-negative HF cases within the GP practice, probably as a consequence of the different HF prevalence in the two different care settings. Further development includes analyzing the clinical features of false-positive and -negative predictions, to explore the natural
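
    Since the CARPEDIEM algorithm is rule based and built from administrative data flows rather than the electronic health record, its core can be pictured as a small decision rule over coded events. The sketch below is purely illustrative: the diagnosis codes, drug names, and thresholds are hypothetical placeholders, not the published rule set.

        # Illustrative rule-based heart-failure phenotype flag from administrative data.
        # All codes, drug names, and thresholds below are hypothetical examples.
        HF_CODES = {"428.0", "428.1", "428.9"}          # hypothetical HF diagnosis codes
        HF_DRUGS = {"furosemide", "bisoprolol"}         # hypothetical HF-related drugs

        def hf_phenotype_flag(patient):
            """Return True if the patient record satisfies the illustrative HF rule."""
            hosp_hits = sum(code in HF_CODES for code in patient.get("hospital_codes", []))
            drug_hits = sum(drug in HF_DRUGS for drug in patient.get("dispensed_drugs", []))
            exemption = patient.get("hf_copayment_exemption", False)
            # Any HF hospitalisation code, or an exemption plus repeated HF drug dispensations.
            return hosp_hits >= 1 or (exemption and drug_hits >= 2)

        example = {"hospital_codes": ["428.0"], "dispensed_drugs": ["furosemide"]}
        print(hf_phenotype_flag(example))               # True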

  6. A goal-based approach for qualification of new technologies: Foundations, tool support, and industrial validation

    International Nuclear Information System (INIS)

    Sabetzadeh, Mehrdad; Falessi, Davide; Briand, Lionel; Di Alesio, Stefano

    2013-01-01

    New technologies typically involve innovative aspects that are not addressed by the existing normative standards and hence are not assessable through common certification procedures. To ensure that new technologies can be implemented in a safe and reliable manner, a specific kind of assessment is performed, which in many industries, e.g., the energy sector, is known as Technology Qualification (TQ). TQ aims at demonstrating with an acceptable level of confidence that a new technology will function within specified limits. Expert opinion plays an important role in TQ, both to identify the safety and reliability evidence that needs to be developed and to interpret the evidence provided. Since there are often multiple experts involved in TQ, it is crucial to apply a structured process for eliciting expert opinions, and to use this information systematically when analyzing the satisfaction of the technology's safety and reliability objectives. In this paper, we present a goal-based approach for TQ. Our approach enables analysts to quantitatively reason about the satisfaction of the technology's overall goals and further to identify the aspects that must be improved to increase goal satisfaction. The approach is founded on three main components: goal models, expert elicitation, and probabilistic simulation. We describe a tool, named Modus, that we have developed in support of our approach. We provide an extensive empirical validation of our approach through two industrial case studies and a survey.
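
    Of the three components named above, the probabilistic-simulation step is the easiest to sketch. Assuming expert-elicited satisfaction of two leaf goals encoded as Beta distributions and an AND-decomposition into a top-level goal (the distributions and aggregation rule are assumptions for illustration, not the Modus tool's actual model):

        # Monte Carlo propagation of expert-elicited goal satisfaction (illustrative only).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Expert estimates for two leaf goals, encoded as Beta distributions (assumed shapes).
        leaf_a = rng.beta(8, 2, n)    # e.g. "material strength evidence is adequate"
        leaf_b = rng.beta(5, 3, n)    # e.g. "qualification test coverage is adequate"

        # AND-decomposition: top-level satisfaction as the product of leaf satisfactions.
        top_goal = leaf_a * leaf_b

        print("mean satisfaction: %.2f" % top_goal.mean())
        print("P(satisfaction > 0.7): %.2f" % (top_goal > 0.7).mean())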

  7. Predictive Validity of an Empirical Approach for Selecting Promising Message Topics: A Randomized-Controlled Study

    Science.gov (United States)

    Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert

    2016-01-01

    Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218

  8. Automatic address validation and health record review to identify homeless Social Security disability applicants.

    Science.gov (United States)

    Erickson, Jennifer; Abbott, Kenneth; Susienka, Lucinda

    2018-06-01

    Homeless patients face a variety of obstacles in pursuit of basic social services. Acknowledging this, the Social Security Administration directs employees to prioritize homeless patients and handle their disability claims with special care. However, under existing manual processes for identification of homelessness, many homeless patients never receive the special service to which they are entitled. In this paper, we explore address validation and automatic annotation of electronic health records to improve identification of homeless patients. We developed a sample of claims containing medical records at the moment of arrival in a single office. Using address validation software, we reconciled patient addresses with public directories of homeless shelters, veterans' hospitals and clinics, and correctional facilities. Other tools annotated electronic health records. We trained random forests to identify homeless patients and validated each model with 10-fold cross validation. For our finished model, the area under the receiver operating characteristic curve was 0.942. The random forest improved sensitivity from 0.067 to 0.879 but decreased positive predictive value to 0.382. Presumed false positive classifications bore many characteristics of homelessness. Organizations could use these methods to prompt early collection of information necessary to avoid labor-intensive attempts to reestablish contact with homeless individuals. Annually, such methods could benefit tens of thousands of patients who are homeless, destitute, and in urgent need of assistance. We were able to identify many more homeless patients through a combination of automatic address validation and natural language processing of unstructured electronic health records. Copyright © 2018. Published by Elsevier Inc.
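
    One half of the pipeline described above is reconciling patient addresses against public directories of shelters and institutions. A toy version of that matching step, assuming addresses have already been standardized by validation software (the normalization below is deliberately crude and the directory entries are invented):

        # Crude address-to-directory matching (illustrative; real pipelines use full
        # address validation software before comparison).
        import re

        SHELTER_DIRECTORY = {
            "123 MAIN ST SPRINGFIELD IL 62701",      # hypothetical shelter address
            "45 W OAK AVE SPRINGFIELD IL 62702",
        }

        def normalize(addr: str) -> str:
            """Upper-case, strip punctuation, and collapse whitespace."""
            addr = re.sub(r"[^\w\s]", " ", addr.upper())
            return re.sub(r"\s+", " ", addr).strip()

        def flags_possible_homelessness(addr: str) -> bool:
            return normalize(addr) in {normalize(a) for a in SHELTER_DIRECTORY}

        print(flags_possible_homelessness("123 Main St., Springfield, IL 62701"))  # True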

  9. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers

  10. Shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity.

    Directory of Open Access Journals (Sweden)

    J R Managbanag

    Full Text Available BACKGROUND: Identification of genes that modulate longevity is a major focus of aging-related research and an area of intense public interest. In addition to facilitating an improved understanding of the basic mechanisms of aging, such genes represent potential targets for therapeutic intervention in multiple age-associated diseases, including cancer, heart disease, diabetes, and neurodegenerative disorders. To date, however, targeted efforts at identifying longevity-associated genes have been limited by a lack of predictive power, and useful algorithms for candidate gene-identification have also been lacking. METHODOLOGY/PRINCIPAL FINDINGS: We have utilized a shortest-path network analysis to identify novel genes that modulate longevity in Saccharomyces cerevisiae. Based on a set of previously reported genes associated with increased life span, we applied a shortest-path network algorithm to a pre-existing protein-protein interaction dataset in order to construct a shortest-path longevity network. To validate this network, the replicative aging potential of 88 single-gene deletion strains corresponding to predicted components of the shortest-path longevity network was determined. Here we report that the single-gene deletion strains identified by our shortest-path longevity analysis are significantly enriched for mutations conferring either increased or decreased replicative life span, relative to a randomly selected set of 564 single-gene deletion strains or to the current data set available for the entire haploid deletion collection. Further, we report the identification of previously unknown longevity genes, several of which function in a conserved longevity pathway believed to mediate life span extension in response to dietary restriction. CONCLUSIONS/SIGNIFICANCE: This work demonstrates that shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity and represents the first application of
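
    A compact sketch of the shortest-path construction described above, using the networkx library on a toy protein-protein interaction graph (the gene names, edges, and seed set below are placeholders, not the study's data):

        # Build a shortest-path network connecting seed longevity genes (illustrative data).
        import networkx as nx
        from itertools import combinations

        ppi = nx.Graph()
        ppi.add_edges_from([
            ("SIR2", "NPT1"), ("NPT1", "HST2"), ("SIR2", "FOB1"),
            ("FOB1", "RPD3"), ("HST2", "RPD3"), ("RPD3", "SCH9"),
        ])
        seeds = ["SIR2", "SCH9"]        # previously reported longevity genes (placeholder set)

        sp_network = nx.Graph()
        for a, b in combinations(seeds, 2):
            for path in nx.all_shortest_paths(ppi, a, b):
                nx.add_path(sp_network, path)

        # Non-seed genes lying on the shortest paths become candidates for life-span assays.
        candidates = set(sp_network.nodes) - set(seeds)
        print(sorted(candidates))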

  11. An integrative -omics approach to identify functional sub-networks in human colorectal cancer.

    Directory of Open Access Journals (Sweden)

    Rod K Nibbe

    2010-01-01

    Full Text Available Emerging evidence indicates that gene products implicated in human cancers often cluster together in "hot spots" in protein-protein interaction (PPI) networks. Additionally, small sub-networks within PPI networks that demonstrate synergistic differential expression with respect to tumorigenic phenotypes were recently shown to be more accurate classifiers of disease progression when compared to single targets identified by traditional approaches. However, many of these studies rely exclusively on mRNA expression data, a useful but limited measure of cellular activity. Proteomic profiling experiments provide information at the post-translational level, yet they generally screen only a limited fraction of the proteome. Here, we demonstrate that integration of these complementary data sources with a "proteomics-first" approach can enhance the discovery of candidate sub-networks in cancer that are well-suited for mechanistic validation in disease. We propose that small changes in the mRNA expression of multiple genes in the neighborhood of a protein-hub can be synergistically associated with significant changes in the activity of that protein and its network neighbors. Further, we hypothesize that proteomic targets with significant fold change between phenotype and control may be used to "seed" a search for small PPI sub-networks that are functionally associated with these targets. To test this hypothesis, we select proteomic targets having significant expression changes in human colorectal cancer (CRC) from two independent 2-D gel-based screens. Then, we use random walk based models of network crosstalk and develop novel reference models to identify sub-networks that are statistically significant in terms of their functional association with these proteomic targets. Subsequently, using an information-theoretic measure, we evaluate synergistic changes in the activity of identified sub-networks based on genome-wide screens of mRNA expression in CRC
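
    The random-walk model of network crosstalk seeded at proteomic targets can be sketched directly in numpy as a random walk with restart over an adjacency matrix (the five-node network and the seed choice below are illustrative, not the study's data):

        # Random walk with restart over a protein interaction network (illustrative).
        import numpy as np

        # Symmetric adjacency matrix for a 5-node toy network.
        A = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0],
                      [1, 1, 0, 0, 0],
                      [0, 1, 0, 0, 1],
                      [0, 0, 0, 1, 0]], dtype=float)
        W = A / A.sum(axis=0)             # column-normalized transition matrix

        restart = 0.3                     # restart probability
        p0 = np.array([1.0, 0, 0, 0, 0])  # seed on the proteomic target (node 0)
        p = p0.copy()
        for _ in range(1000):
            p_next = (1 - restart) * W @ p + restart * p0
            if np.abs(p_next - p).sum() < 1e-10:
                break
            p = p_next

        print(np.round(p, 3))             # steady-state visiting probabilities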

  12. Preliminary Validation of a New Clinical Tool for Identifying Problem Video Game Playing

    Science.gov (United States)

    King, Daniel Luke; Delfabbro, Paul H.; Zajac, Ian T.

    2011-01-01

    Research has estimated that between 6 and 13% of individuals who play video games do so excessively. However, the methods and definitions used to identify "problem" video game players often vary considerably. This research presents preliminary validation data for a new measure of problematic video game play called the Problem Video Game…

  13. Developing and Validating Personas in e-Commerce: A Heuristic Approach

    Science.gov (United States)

    Thoma, Volker; Williams, Bryn

    A multi-method persona development process in a large e-commerce business is described. Personas are fictional representations of customers that describe typical user attributes to facilitate a user-centered approach in interaction design. In the current project persona attributes were derived from various data sources, such as stakeholder interviews, user tests and interviews, data mining, customer surveys, and ethnographic (direct observation, diary studies) research. The heuristic approach of using these data sources conjointly allowed for an early validation of relevant persona dimensions.

  14. Real-time process signal validation based on neuro-fuzzy and possibilistic approach

    International Nuclear Information System (INIS)

    Figedy, S.; Fantoni, P.F.; Hoffmann, M.

    2001-01-01

    Real-time process signal validation is an application field where the use of fuzzy logic and Artificial Neural Networks can improve the diagnostics of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region in which the validation process is to be performed. The possibilistic approach allows a fast detection of unforeseen plant conditions. Specialized Artificial Neural Networks are used, one for each fuzzy cluster. This offers two main advantages: the accuracy and generalization capability is increased compared to the case of a single network working in the entire operating region, and the ability to identify abnormal conditions, where the system is not capable of operating with satisfactory accuracy, is improved. This system analyzes the signals, which are e.g. the readings of process monitoring sensors, computes their expected values, and alerts if the real values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. This model has been tested on simulated data from a PWR-type nuclear power plant to monitor safety-related reactor variables over the entire power-flow operating map, and it was installed under real conditions in a BWR nuclear reactor. (Authors)
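
    A bare-bones fuzzy c-means step of the kind used to classify the operating region before handing the signals to a cluster-specific network is sketched below (the data, cluster count, and fuzzifier are arbitrary choices for illustration); a genuinely possibilistic variant would additionally relax the constraint that memberships sum to one, so that unforeseen conditions receive low membership in every cluster.

        # Minimal fuzzy c-means clustering of plant operating points (illustrative data).
        import numpy as np

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])

        c, m, n_iter = 2, 2.0, 100                     # clusters, fuzzifier, iterations
        U = rng.random((c, len(X)))
        U /= U.sum(axis=0)                             # memberships sum to 1 per sample

        for _ in range(n_iter):
            Um = U ** m
            centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
            U = 1.0 / (d ** (2 / (m - 1)))
            U /= U.sum(axis=0)

        # A low maximum membership would flag samples outside the trained operating regions.
        print("max membership of first sample: %.2f" % U[:, 0].max())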

  15. Multipronged approach to identify and validate a novel upstream regulator of Sncg in mouse retinal ganglion cells.

    Science.gov (United States)

    Chintalapudi, Sumana R; Morales-Tirado, Vanessa M; Williams, Robert W; Jablonski, Monica M

    2016-02-01

    Loss of retinal ganglion cells (RGCs) is one of the hallmarks of retinal neurodegenerative diseases, glaucoma being one of the most common. Mechanistic studies on RGCs are hindered by the lack of sufficient primary cells and consensus regarding their signature markers. Recently, γ-synuclein (SNCG) has been shown to be highly expressed in the somas and axons of RGCs. In various mouse models of glaucoma, downregulation of Sncg gene expression correlates with RGC loss. To investigate the role of Sncg in RGCs, we used a novel systems genetics approach to identify a gene that modulates Sncg expression, followed by confirmatory studies in both healthy and diseased retinae. We found that chromosome 1 harbors an expression quantitative trait locus that modulates Sncg expression in the mouse retina, and identified the prefoldin-2 (PFDN2) gene as the candidate upstream modulator of Sncg expression. Our immunohistochemical analyses revealed similar expression patterns in both mouse and human healthy retinae, with PFDN2 colocalizing with SNCG in RGCs and their axons. In contrast, in retinae from glaucoma subjects, SNCG levels were significantly reduced, although PFDN2 levels were maintained. Using a novel flow cytometry-based RGC isolation method, we obtained viable populations of murine RGCs. Knocking down Pfdn2 expression in primary murine RGCs significantly reduced Sncg expression, confirming that Pfdn2 regulates Sncg expression in murine RGCs. Gene Ontology analysis indicated shared mitochondrial function associated with Sncg and Pfdn2. These data solidify the relationship between Sncg and Pfdn2 in RGCs, and provide a novel mechanism for maintaining RGC health. © 2015 FEBS.

  16. Using linked electronic data to validate algorithms for health outcomes in administrative databases.

    Science.gov (United States)

    Lee, Wan-Ju; Lee, Todd A; Pickard, Alan Simon; Shoaibi, Azadeh; Schumock, Glen T

    2015-08-01

    The validity of algorithms used to identify health outcomes in claims-based and administrative data is critical to the reliability of findings from observational studies. The traditional approach to algorithm validation, using medical charts, is expensive and time-consuming. An alternative method is to link the claims data to an external, electronic data source that contains information allowing confirmation of the event of interest. In this paper, we describe this external linkage validation method and delineate important considerations to assess the feasibility and appropriateness of validating health outcomes using this approach. This framework can help investigators decide whether to pursue an external linkage validation method for identifying health outcomes in administrative/claims data.

  17. Mining user-generated geographic content : an interactive, crowdsourced approach to validation and supervision

    NARCIS (Netherlands)

    Ostermann, F.O.; Garcia Chapeton, Gustavo Adolfo; Zurita-Milla, R.; Kraak, M.J.; Bergt, A.; Sarjakoski, T.; van Lammeren, R.; Rip, F.

    2017-01-01

    This paper describes a pilot study that implements a novel approach to validate data mining tasks by using the crowd to train a classifier. This hybrid approach to processing successfully addresses challenges faced during human curation or machine processing of user-generated geographic content

  18. Validation of de-identified record linkage to ascertain hospital admissions in a cohort study

    Directory of Open Access Journals (Sweden)

    English Dallas R

    2011-04-01

    Full Text Available Abstract Background Cohort studies can provide valuable evidence of cause and effect relationships but are subject to loss of participants over time, limiting the validity of findings. Computerised record linkage offers a passive and ongoing method of obtaining health outcomes from existing routinely collected data sources. However, the quality of record linkage is reliant upon the availability and accuracy of common identifying variables. We sought to develop and validate a method for linking a cohort study to a state-wide hospital admissions dataset with limited availability of unique identifying variables. Methods A sample of 2000 participants from a cohort study (n = 41 514) was linked to a state-wide hospitalisations dataset in Victoria, Australia using the national health insurance (Medicare) number and demographic data as identifying variables. Availability of the health insurance number was limited in both datasets; therefore linkage was undertaken both with and without use of this number and agreement tested between both algorithms. Sensitivity was calculated for a sub-sample of 101 participants with a hospital admission confirmed by medical record review. Results Of the 2000 study participants, 85% were found to have a record in the hospitalisations dataset when the national health insurance number and sex were used as linkage variables and 92% when demographic details only were used. When agreement between the two methods was tested the disagreement fraction was 9%, mainly due to "false positive" links when demographic details only were used. A final algorithm that used multiple combinations of identifying variables resulted in a match proportion of 87%. Sensitivity of this final linkage was 95%. Conclusions High quality record linkage of cohort data with a hospitalisations dataset that has limited identifiers can be achieved using combinations of a national health insurance number and demographic data as identifying variables.
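
    A much-simplified sketch of linking on multiple combinations of identifiers when a unique number is only partly available (the field names and records below are invented) is to cascade match keys from most to least specific:

        # Cascading deterministic record linkage (illustrative fields and records).
        cohort = [
            {"id": 1, "medicare": "1234", "dob": "1950-02-01", "sex": "F", "postcode": "3000"},
            {"id": 2, "medicare": None,   "dob": "1948-07-15", "sex": "M", "postcode": "3121"},
        ]
        hospital = [
            {"medicare": "1234", "dob": "1950-02-01", "sex": "F", "postcode": "3000", "admission": "2009-05-03"},
            {"medicare": None,   "dob": "1948-07-15", "sex": "M", "postcode": "3121", "admission": "2010-11-21"},
        ]

        def key(rec, fields):
            return tuple(rec[f] for f in fields)

        # Pass 1: insurance number plus sex; Pass 2: demographic details only.
        passes = [("medicare", "sex"), ("dob", "sex", "postcode")]
        links = {}
        for fields in passes:
            index = {key(h, fields): h for h in hospital if all(h[f] for f in fields)}
            for c in cohort:
                if c["id"] not in links and all(c[f] for f in fields):
                    match = index.get(key(c, fields))
                    if match:
                        links[c["id"]] = match["admission"]

        print(links)   # {1: '2009-05-03', 2: '2010-11-21'}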

  19. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at all position angles, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in the images. We propose a two-tier approach by first pre-selecting massive galaxies associated with multiple blue objects as possible lenses, then using the Hough transform to identify circular patterns. As a proof-of-concept, we apply our approach to SDSS, with a high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
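
    The circular-pattern step relies on the circle Hough transform, which OpenCV exposes as a single call. The sketch below shows how candidate rings might be detected in a cut-out around a pre-selected lens galaxy; the file name, blur kernel, and radius limits are placeholders, not the parameters used in the paper.

        # Detect circular or arc-like patterns in a galaxy cut-out with the circle Hough transform.
        import cv2
        import numpy as np

        img = cv2.imread("lens_candidate_cutout.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
        img = cv2.medianBlur(img, 5)                     # suppress noise before edge detection

        circles = cv2.HoughCircles(
            img, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
            param1=100,      # Canny high threshold used internally
            param2=30,       # accumulator threshold: lower values find more (weaker) circles
            minRadius=5, maxRadius=60,
        )

        if circles is not None:
            for x, y, r in np.round(circles[0]).astype(int):
                print(f"candidate ring at ({x}, {y}) with radius {r} px")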

  20. A unified approach to validation, reliability, and education study design for surgical technical skills training.

    Science.gov (United States)

    Sweet, Robert M; Hananel, David; Lawrenz, Frances

    2010-02-01

    To present modern educational psychology theory and apply these concepts to the validity and reliability of surgical skills training and assessment. In a series of cross-disciplinary meetings, we applied a unified approach of behavioral science principles and theory to medical technical skills education, given recent advances in theory in the fields of behavioral psychology and statistics. While validation of the individual simulation tools is important, it is only one piece of a multimodal curriculum that in and of itself deserves examination and study. We propose concurrent validation throughout the design of simulation-based curriculum rather than once it is complete. We embrace the concept that validity and curriculum development are interdependent, ongoing processes that are never truly complete. Individual predictive, construct, content, and face validity aspects should not be considered separately but as interdependent and complementary toward an end application. Such an approach could help guide our acceptance and appropriate application of these exciting new training and assessment tools for technical skills training in medicine.

  1. A new simplex chemometric approach to identify olive oil blends with potentially high traceability.

    Science.gov (United States)

    Semmar, N; Laroussi-Mezghani, S; Grati-Kamoun, N; Hammami, M; Artaud, J

    2016-10-01

    Olive oil blends (OOBs) are complex matrices combining different cultivars at variable proportions. Although qualitative determinations of OOBs have been the subject of several chemometric works, quantitative evaluations of their contents remain poorly developed because of traceability difficulties concerning co-occurring cultivars. Around this question, we recently published an original simplex approach helping to develop predictive models of the proportions of co-occurring cultivars from chemical profiles of resulting blends (Semmar & Artaud, 2015). Beyond predictive model construction and validation, this paper presents an extension based on prediction error analysis to statistically define the blends with the highest predictability among all the possible ones that can be made by mixing cultivars at different proportions. This provides an interesting way to identify a priori labeled commercial products with potentially high traceability, taking into account the natural chemical variability of different constitutive cultivars. Copyright © 2016 Elsevier Ltd. All rights reserved.
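
    The core quantitative idea is to estimate cultivar proportions in a blend from its chemical profile. A stripped-down version of that inverse step, assuming each pure cultivar has a known mean profile and that blend profiles mix approximately linearly (the profiles and proportions below are made up), is a non-negative least-squares fit followed by normalization onto the simplex:

        # Estimate cultivar proportions in an oil blend from fatty-acid profiles (illustrative).
        import numpy as np
        from scipy.optimize import nnls

        # Columns: mean profiles of three pure cultivars over four fatty-acid variables.
        pure_profiles = np.array([[0.72, 0.65, 0.70],
                                  [0.10, 0.14, 0.08],
                                  [0.12, 0.15, 0.16],
                                  [0.06, 0.06, 0.06]])

        true_props = np.array([0.5, 0.3, 0.2])
        blend = pure_profiles @ true_props          # synthetic blend profile

        coef, _ = nnls(pure_profiles, blend)        # non-negative least squares
        proportions = coef / coef.sum()             # project back onto the simplex
        print(np.round(proportions, 2))             # ~[0.5, 0.3, 0.2]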

  2. Validation of search filters for identifying pediatric studies in PubMed.

    Science.gov (United States)

    Leclercq, Edith; Leeflang, Mariska M G; van Dalen, Elvira C; Kremer, Leontien C M

    2013-03-01

    To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity, precision, specificity, accuracy, and number needed to read (NNR). An optimal search filter will have a high sensitivity and high precision with a low NNR. In addition to the PubMed Limits: All Child: 0-18 years filter (in May 2012 renamed to PubMed Filter Child: 0-18 years), 6 search filters for identifying studies including children were identified: 3 developed by Kastner et al., 1 developed by BestBets, 1 by the Child Health Field, and 1 by the Cochrane Childhood Cancer Group. Three search filters (Cochrane Childhood Cancer Group, Child Health Field, and BestBets) had the highest sensitivity (99.3%, 99.5%, and 99.3%, respectively) but a lower precision (64.5%, 68.4%, and 66.6%, respectively) compared with the other search filters. Two Kastner search filters had a high precision (93.0% and 93.7%, respectively) but a low sensitivity (58.5% and 44.8%, respectively). They failed to identify many pediatric studies in our datasets. The search terms responsible for false-positive results in the reference dataset were determined. With these data, we developed a new search filter for identifying studies with children in PubMed with an optimal sensitivity (99.5%) and precision (69.0%). Search filters to identify studies including children either have a low sensitivity or a low precision with a high NNR. A new pediatric search filter with a high sensitivity and a low NNR has been developed. Copyright © 2013 Mosby, Inc. All rights reserved.
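
    The performance measures used to compare the filters reduce to simple functions of the confusion counts. For a filter evaluated against a hand-classified reference set (the counts below are invented), they can be computed as:

        # Sensitivity, precision, specificity, accuracy, and number needed to read (NNR)
        # for a search filter, from confusion counts (counts are illustrative).
        tp, fp, fn, tn = 298, 160, 2, 540   # pediatric studies retrieved, non-pediatric retrieved, etc.

        sensitivity = tp / (tp + fn)        # fraction of pediatric studies retrieved
        precision   = tp / (tp + fp)        # fraction of retrieved records that are pediatric
        specificity = tn / (tn + fp)
        accuracy    = (tp + tn) / (tp + fp + fn + tn)
        nnr         = 1 / precision         # records read per relevant record found

        print(f"sensitivity={sensitivity:.3f} precision={precision:.3f} "
              f"specificity={specificity:.3f} accuracy={accuracy:.3f} NNR={nnr:.2f}")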

  3. Experimental Validation of a Differential Variational Inequality-Based Approach for Handling Friction and Contact in Vehicle

    Science.gov (United States)

    2015-11-20

    The approach models frictional contact via differential variational inequalities and handles vehicle mobility on terrain modeled using the discrete element method (DEM); experimental validation includes sinkage and single-wheel tests.

  4. Development and Initial Validation of the Need Satisfaction and Need Support at Work Scales: A Validity-Focused Approach

    Directory of Open Access Journals (Sweden)

    Susanne Tafvelin

    2018-01-01

    Full Text Available Although the relevance of employee need satisfaction and manager need support has been examined, the integration of self-determination theory (SDT) into work and organizational psychology has been hampered by the lack of validated measures. The purpose of the current study was to develop and validate measures of employees’ perception of need satisfaction (NSa-WS) and need support (NSu-WS) at work that were grounded in SDT. We used three Swedish samples (total N = 1,430) to develop and validate our scales. We used a confirmatory approach including expert panels to assess item content relevance, confirmatory factor analysis for factorial validity, and associations with theoretically warranted outcomes to assess criterion-related validity. Scale reliability was also assessed. We found evidence of content, factorial, and criterion-related validity of our two scales of need satisfaction and need support at work. Further, the scales demonstrated high internal consistency. Our newly developed scales may be used in research and practice to further our understanding regarding how satisfaction and support of employee basic needs influence employee motivation, performance, and well-being. Our study makes a contribution to the current literature by providing (1) scales that are specifically designed for the work context, (2) an example of how expert panels can be used to assess content validity, and (3) testing of theoretically derived hypotheses that, although SDT is built on them, have not been examined before.

  5. Validating the Modified Drug Adherence Work-Up (M-DRAW Tool to Identify and Address Barriers to Medication Adherence

    Directory of Open Access Journals (Sweden)

    Sun Lee

    2017-09-01

    Full Text Available Barriers to medication adherence stem from multiple factors. An effective and convenient tool is needed to identify these barriers so that clinicians can provide a tailored, patient-centered consultation with patients. The Modified Drug Adherence Work-up Tool (M-DRAW) was developed as a 13-item checklist questionnaire to identify barriers to medication adherence. The response scale was a 4-point Likert scale of frequency of occurrence (1 = never to 4 = often). The checklist was accompanied by a GUIDE that provided corresponding motivational interview-based intervention strategies for each identified barrier. The current pilot study examined the psychometric properties of the M-DRAW checklist (reliability, responsiveness and discriminant validity) in patients taking one or more prescription medication(s) for chronic conditions. A cross-sectional sample of 26 patients was recruited between December 2015 and March 2016 at an academic medical center pharmacy in Southern California. A priming question that assessed self-reported adherence was used to separate participants into the control group of 17 “adherers” (65.4%), and into the intervention group of nine “unintentional and intentional non-adherers” (34.6%). Comparable baseline characteristics were observed between the two groups. The M-DRAW checklist showed acceptable reliability (13 items; alpha = 0.74) for identifying factors and barriers leading to medication non-adherence. Discriminant validity of the tool and the priming question was established by the four-fold number of barriers to adherence identified within the self-selected intervention group compared to the control group (4.4 versus 1.2 barriers, p < 0.05). The current study did not investigate construct validity due to small sample size and challenges on follow-up with patients. Future testing of the tool will include construct validation.

  6. Identifying Human Phenotype Terms by Combining Machine Learning and Validation Rules

    Directory of Open Access Journals (Sweden)

    Manuel Lobo

    2017-01-01

    Full Text Available Named-Entity Recognition is commonly used to identify biological entities such as proteins, genes, and chemical compounds found in scientific articles. The Human Phenotype Ontology (HPO) is an ontology that provides a standardized vocabulary for phenotypic abnormalities found in human diseases. This article presents the Identifying Human Phenotypes (IHP) system, tuned to recognize HPO entities in unstructured text. IHP uses Stanford CoreNLP for text processing and applies Conditional Random Fields trained with a rich feature set, which includes linguistic, orthographic, morphologic, lexical, and context features created for the machine learning-based classifier. However, the main novelty of IHP is its validation step based on a set of carefully crafted manual rules, such as the negative connotation analysis, that combined with a dictionary can filter incorrectly identified entities, find missed entities, and combine adjacent entities. The performance of IHP was evaluated using the recently published HPO Gold Standardized Corpora (GSC), where the system Bio-LarK CR obtained the best F-measure of 0.56. IHP achieved an F-measure of 0.65 on the GSC. Due to inconsistencies found in the GSC, an extended version of the GSC was created, adding 881 entities and modifying 4 entities. IHP achieved an F-measure of 0.863 on the new GSC.
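
    A toy version of the validation step layered on top of the machine-learning tagger is shown below, with just two of the kinds of rules mentioned: dropping entities preceded by a negative connotation and merging adjacent spans. The sentence, spans, and cue list are illustrative, not IHP's actual rule set.

        # Post-process NER spans with simple validation rules (illustrative only).
        NEGATIVE_CUES = {"no", "not", "without", "absence", "denies"}

        def validate(tokens, spans):
            """spans: list of (start, end) token indices flagged as phenotype entities."""
            # Rule 1: drop an entity if the preceding token carries a negative connotation.
            kept = [s for s in spans
                    if not (s[0] > 0 and tokens[s[0] - 1].lower() in NEGATIVE_CUES)]
            # Rule 2: merge entities that are directly adjacent.
            merged = []
            for start, end in sorted(kept):
                if merged and start == merged[-1][1]:
                    merged[-1] = (merged[-1][0], end)
                else:
                    merged.append((start, end))
            return merged

        tokens = "Patient has short stature but no cleft palate".split()
        spans = [(2, 3), (3, 4), (6, 8)]    # tagger output: "short", "stature", "cleft palate"
        print(validate(tokens, spans))      # [(2, 4)] -> "short stature" kept and merged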

  7. The metabolomic approach identifies a biological signature of low-dose chronic exposure to Cesium 137

    International Nuclear Information System (INIS)

    Grison, S.; Grandcolas, L.; Martin, J.C.

    2012-01-01

    Reports have described apparent biological effects of 137 Cs (the most persistent dispersed radionuclide) irradiation in people living in Chernobyl-contaminated territory. The sensitive analytical technology described here should now help assess the relation of this contamination to the observed effects. A rat model chronically exposed to 137 Cs through drinking water was developed to identify biomarkers of radiation-induced metabolic disorders, and the biological impact was evaluated by a metabolomic approach that allowed us to detect several hundred metabolites in biofluids and assess their association with disease states. After collection of plasma and urine from contaminated and non-contaminated rats at the end of the 9-months contamination period, analysis with a liquid chromatography coupled to mass spectrometry (LC-MS) system detected 742 features in urine and 1309 in plasma. Biostatistical discriminant analysis extracted a subset of 26 metabolite signals (2 urinary, 4 plasma non-polar, and 19 plasma polar metabolites) that in combination were able to predict from 68 up to 94% of the contaminated rats, depending on the prediction method used, with a misclassification rate as low as 5.3%. The difference in this metabolic score between the contaminated and non-contaminated rats was highly significant (P=0.019 after ANOVA cross-validation). In conclusion, our proof-of-principle study demonstrated for the first time the usefulness of a metabolomic approach for addressing biological effects of chronic low-dose contamination. We can conclude that a metabolomic signature discriminated 137 Cs-contaminated from control animals in our model. Further validation is nevertheless required together with full annotation of the metabolic indicators. (author)

  8. New approach for validating the segmentation of 3D data applied to individual fibre extraction

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2017-01-01

    We present two approaches for validating the segmentation of 3D data. The first approach consists of comparing the amount of estimated material to a value provided by the manufacturer. The second approach consists of comparing the segmented results to those obtained from imaging modalities

  9. Electromagnetic scattering problems -Numerical issues and new experimental approaches of validation

    Energy Technology Data Exchange (ETDEWEB)

    Geise, Robert; Neubauer, Bjoern; Zimmer, Georg [University of Braunschweig, Institute for Electromagnetic Compatibility, Schleinitzstrasse 23, 38106 Braunschweig (Germany)

    2015-03-10

    Electromagnetic scattering problems, thus the question how radiated energy spreads when impinging on an object, are an essential part of wave propagation. Though Maxwell’s differential equations, as the starting point, are actually quite simple, the integral formulation of an object’s boundary conditions, respectively the solution for the unknown induced currents, can only be solved numerically in most cases. As a timely topic of practical importance, the scattering of rotating wind turbines is discussed, the numerical description of which is still based on rigorous approximations with yet unspecified accuracy. In this context the issue of validating numerical solutions is addressed, both with reference simulations and, in particular, with the experimental approach of scaled measurements. For the latter, the idea of an incremental validation is proposed, allowing a step-by-step validation of required new mathematical models in scattering theory.

  10. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    textabstractObjective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  11. Validation of Slosh Modeling Approach Using STAR-CCM+

    Science.gov (United States)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft or there may be an unexpected loss of science observation time due to higher slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right cylinder tank and a right cylinder with a single ring baffle.

  12. Identifying Primary Spontaneous Pneumothorax from Administrative Databases: A Validation Study

    Directory of Open Access Journals (Sweden)

    Eric Frechette

    2016-01-01

    Full Text Available Introduction. Primary spontaneous pneumothorax (PSP) is a disorder commonly encountered in healthy young individuals. There is no differentiation between PSP and secondary pneumothorax (SP) in the current version of the International Classification of Diseases (ICD-10). This complicates the conduct of epidemiological studies on the subject. Objective. To validate the accuracy of an algorithm that identifies cases of PSP from administrative databases. Methods. The charts of 150 patients who consulted the emergency room (ER) with a recorded main diagnosis of pneumothorax were reviewed to define the type of pneumothorax that occurred. The corresponding hospital administrative data collected during previous hospitalizations and ER visits were processed through the proposed algorithm. The results were compared over two different age groups. Results. There were 144 cases of pneumothorax correctly coded (96%). The results obtained from the PSP algorithm demonstrated a significantly higher sensitivity (97% versus 81%, p=0.038) and positive predictive value (87% versus 46%, p<0.001) in patients under 40 years of age than in older patients. Conclusions. The proposed algorithm is adequate to identify cases of PSP from administrative databases in the age group classically associated with the disease. This makes possible its utilization in large population-based studies.

  13. Gene-based Association Approach Identify Genes Across Stress Traits in Fruit Flies

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Edwards, Stefan McKinnon; Sarup, Pernille Merete

    Identification of genes explaining variation in quantitative traits or genetic risk factors of human diseases requires both good phenotypic and genotypic data, but also efficient statistical methods. Genome-wide association studies may reveal association between phenotypic variation and variation...... approach grouping variants according to gene position, thus lowering the number of statistical tests performed and increasing the probability of identifying genes with small to moderate effects. Using this approach we identify numerous genes associated with different types of stresses in Drosophila...... melanogaster, but also identify common genes that affect the stress traits....

  14. Quality data validation: Comprehensive approach to environmental data validation

    International Nuclear Information System (INIS)

    Matejka, L.A. Jr.

    1993-01-01

    Environmental data validation consists of an assessment of three major areas: analytical method validation; field procedures and documentation review; evaluation of the level of achievement of data quality objectives based in part on PARCC parameters analysis and expected applications of data. A program utilizing matrix association of required levels of validation effort and analytical levels versus applications of this environmental data was developed in conjunction with DOE-ID guidance documents to implement actions under the Federal Facilities Agreement and Consent Order in effect at the Idaho National Engineering Laboratory. This was an effort to bring consistent quality to the INEL-wide Environmental Restoration Program and database in an efficient and cost-effective manner. This program, documenting all phases of the review process, is described here

  15. Validity of Purchasing Power Parity in BRICS under a DFA Approach

    Directory of Open Access Journals (Sweden)

    Emmanuel Numapau Gyamfi

    2017-02-01

    Full Text Available This study tests the validity of the purchasing power parity (PPP theory in Brazil, Russia, India, Macao-China and South Africa. We examine real exchange rates of these countries for mean reversion. The Hurst exponent is our mean reversion measure which is evaluated by the Detrended Fluctuation Analysis (DFA in a rolling window to determine the validity of the PPP theory amongst these countries through time. Our results show persistence in real exchange rates; an indication not supporting the PPP theory in the five countries. The study contributes to the extant literature of the PPP theory in BRICS using the DFA approach in a rolling window through time.
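
    The mean-reversion measure in this record is the scaling exponent estimated by detrended fluctuation analysis. A compact DFA implementation for a single series is sketched below (the simulated series and window sizes are arbitrary; exponents near 0.5 indicate no persistence, values above 0.5 indicate persistence):

        # Detrended fluctuation analysis (DFA) estimate of the Hurst-like scaling exponent.
        import numpy as np

        def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
            y = np.cumsum(x - np.mean(x))                 # integrated profile
            flucts = []
            for s in scales:
                n_seg = len(y) // s
                f2 = []
                for i in range(n_seg):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    f2.append(np.mean((seg - trend) ** 2))
                flucts.append(np.sqrt(np.mean(f2)))
            # Slope of log F(s) versus log s is the DFA scaling exponent alpha.
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        rng = np.random.default_rng(0)
        returns = rng.normal(size=2048)                    # i.i.d. noise: alpha close to 0.5
        print("alpha = %.2f" % dfa_exponent(returns))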

  16. Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data

    International Nuclear Information System (INIS)

    Palmiotti, G.; Salvatores, M.; Aliberti, G.

    2007-01-01

    The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case for reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives and approach. A validation effort is in particular necessary in the frame of advanced (e.g. Generation-IV or GNEP) reactors and associated fuel cycle assessment and design. Validation of simulation codes is complementary to the 'verification' process. In fact, 'verification' addresses the question 'are we solving the equations correctly' while validation addresses the question 'are we solving the correct equations with the correct parameters'. Verification implies comparisons with 'reference' equation solutions or with analytical solutions, when they exist. Most of what is called 'numerical validation' falls in this category. Validation strategies differ according to the relative weight of the methods and of the parameters that enter into the simulation tools. Most validation is based on experiments, and the field of neutronics, where a 'robust' physics description model exists that is a function of 'input' parameters not fully known, will be the focus of this paper. In fact, in the case of reactor core, shielding and fuel cycle physics the model (theory) is well established (the Boltzmann and Bateman equations) and the parameters are the nuclear cross-sections, decay data etc. Two types of validation approaches can and have been used: (a) Mock-up experiments ('global' validation): need for a very close experimental simulation of a reference configuration. Bias factors cannot be extrapolated beyond the reference configuration; (b) Use of 'clean', 'representative' integral experiments ('bias factor and adjustment' method). This allows one to define bias factors and uncertainties and can be used for a wide range of applications. It

  17. Validating a conceptual model for an inter-professional approach to shared decision making: a mixed methods study

    Science.gov (United States)

    Légaré, France; Stacey, Dawn; Gagnon, Susie; Dunn, Sandy; Pluye, Pierre; Frosch, Dominick; Kryworuchko, Jennifer; Elwyn, Glyn; Gagnon, Marie-Pierre; Graham, Ian D

    2011-01-01

    Rationale, aims and objectives Following increased interest in having inter-professional (IP) health care teams engage patients in decision making, we developed a conceptual model for an IP approach to shared decision making (SDM) in primary care. We assessed the validity of the model with stakeholders in Canada. Methods In 15 individual interviews and 7 group interviews with 79 stakeholders, we asked them to: (1) propose changes to the IP-SDM model; (2) identify barriers and facilitators to the model's implementation in clinical practice; and (3) assess the model using a theory appraisal questionnaire. We performed a thematic analysis of the transcripts and a descriptive analysis of the questionnaires. Results Stakeholders suggested placing the patient at its centre; extending the concept of family to include significant others; clarifying outcomes; highlighting the concept of time; merging the micro, meso and macro levels in one figure; and recognizing the influence of the environment and emotions. The most common barriers identified were time constraints, insufficient resources and an imbalance of power among health professionals. The most common facilitators were education and training in inter-professionalism and SDM, motivation to achieve an IP approach to SDM, and mutual knowledge and understanding of disciplinary roles. Most stakeholders considered that the concepts and relationships between the concepts were clear and rated the model as logical, testable, having clear schematic representation, and being relevant to inter-professional collaboration, SDM and primary care. Conclusions Stakeholders validated the new IP-SDM model for primary care settings and proposed few modifications. Future research should assess if the model helps implement SDM in IP clinical practice. PMID:20695950

  18. Content Validity and Psychometric Properties of the Nomination Scale for Identifying Football Talent (NSIFT: Application to Coaches, Parents and Players

    Directory of Open Access Journals (Sweden)

    Alejandro Prieto-Ayuso

    2017-01-01

    Full Text Available The identification of football talent is a critical issue both for clubs and the families of players. However, despite its importance in a sporting, economic and social sense, there appears to be a lack of instruments that can reliably measure talent performance. The aim of this study was to design and validate the Nomination Scale for Identifying Football Talent (NSIFT), with the aim of optimising the processes for identifying said talent. The scale was first validated through expert judgment, and then statistically, by means of an exploratory factor analysis (EFA), confirmatory factor analysis (CFA), internal reliability and convergent validity. The results reveal the presence of three factors in the scale’s factor matrix, with these results being confirmed by the CFA. The scale revealed suitable internal reliability and homogeneity indices. Convergent validity showed that it is teammates who are best able to identify football talent, followed by coaches and parents. It can be concluded that the NSIFT is suitable for use in the football world. Future studies should seek to confirm these results in different contexts by means of further CFAs.

  19. A Systematic Approach to Determining the Identifiability of Multistage Carcinogenesis Models.

    Science.gov (United States)

    Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C

    2017-07-01

    Multistage clonal expansion (MSCE) models of carcinogenesis are continuous-time Markov process models often used to relate cancer incidence to biological mechanism. Identifiability analysis determines what model parameter combinations can, theoretically, be estimated from given data. We use a systematic approach, based on differential algebra methods traditionally used for deterministic ordinary differential equation (ODE) models, to determine identifiable combinations for a generalized subclass of MSCE models with any number of preinitiation stages and one clonal expansion. Additionally, we determine the identifiable combinations of the generalized MSCE model with up to four clonal expansion stages, and conjecture the results for any number of clonal expansion stages. The results improve upon previous work in a number of ways and provide a framework to find the identifiable combinations for further variations on the MSCE models. Finally, our approach, which takes advantage of the Kolmogorov backward equations for the probability generating functions of the Markov process, demonstrates that identifiability methods used in engineering and mathematics for systems of ODEs can be applied to continuous-time Markov processes. © 2016 Society for Risk Analysis.

  20. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  1. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  2. Validating the Modified Drug Adherence Work-Up (M-DRAW) Tool to Identify and Address Barriers to Medication Adherence.

    Science.gov (United States)

    Lee, Sun; Bae, Yuna H; Worley, Marcia; Law, Anandi

    2017-09-08

    Barriers to medication adherence stem from multiple factors. An effective and convenient tool is needed to identify these barriers so that clinicians can provide a tailored, patient-centered consultation with patients. The Modified Drug Adherence Work-up Tool (M-DRAW) was developed as a 13-item checklist questionnaire to identify barriers to medication adherence. The response scale was a 4-point Likert scale of frequency of occurrence (1 = never to 4 = often). The checklist was accompanied by a GUIDE that provided corresponding motivational interview-based intervention strategies for each identified barrier. The current pilot study examined the psychometric properties of the M-DRAW checklist (reliability, responsiveness and discriminant validity) in patients taking one or more prescription medication(s) for chronic conditions. A cross-sectional sample of 26 patients was recruited between December 2015 and March 2016 at an academic medical center pharmacy in Southern California. A priming question that assessed self-reported adherence was used to separate participants into the control group of 17 "adherers" (65.4%), and into the intervention group of nine "unintentional and intentional non-adherers" (34.6%). Comparable baseline characteristics were observed between the two groups. The M-DRAW checklist showed acceptable reliability (13 items; alpha = 0.74) for identifying factors and barriers leading to medication non-adherence. Discriminant validity of the tool and the priming question was established by the four-fold number of barriers to adherence identified within the self-selected intervention group compared to the control group (4.4 versus 1.2 barriers). Future work on the tool will include construct validation.
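
    The reliability figure quoted above (alpha = 0.74 over 13 items) is Cronbach's alpha. As a rough illustration of how such a coefficient is computed, the following sketch uses simulated Likert responses; the column names and data are invented and are not taken from the study.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert-type items (rows = respondents)."""
    items = items.dropna()
    k = items.shape[1]                          # number of items (13 for M-DRAW)
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 26 patients x 13 items on a 1-4 frequency scale
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 5, size=(26, 13)),
                         columns=[f"item_{i}" for i in range(1, 14)])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```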

  3. Relative validity of a food frequency questionnaire to identify dietary patterns in an adult Mexican population.

    Science.gov (United States)

    Denova-Gutiérrez, Edgar; Tucker, Katherine L; Salmerón, Jorge; Flores, Mario; Barquera, Simón

    2016-01-01

    To examine the validity of a semi-quantitative food frequency questionnaire (SFFQ) to identify dietary patterns in an adult Mexican population. A 140-item SFFQ and two 24-hour dietary recalls (24DRs) were administered. Foods were categorized into 29 food groups used to derive dietary patterns via factor analysis. Pearson and intraclass correlations coefficients between dietary pattern scores identified from the SFFQ and 24DRs were assessed. Pattern 1 was high in snacks, fast food, soft drinks, processed meats and refined grains; pattern 2 was high in fresh vegetables, fresh fruits, and dairy products; and pattern 3 was high in legumes, eggs, sweetened foods and sugars. Pearson correlation coefficients between the SFFQ and the 24DRs for these patterns were 0.66 (P<0.001), 0.41 (P<0.001) and 0.29 (P=0.193) respectively. Our data indicate reasonable validity of the SFFQ, using factor analysis, to derive major dietary patterns in comparison with two 24DR.
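
    The validation strategy described here (factor-analytic dietary patterns compared across instruments with Pearson correlations) can be illustrated with a short sketch. The intake matrices below are simulated stand-ins for the SFFQ and 24DR data, and factoring each instrument separately is a simplification; in practice the recall intakes are often scored against the SFFQ-derived loadings, and factor order and sign must be aligned before correlating.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical intake data: n participants x 29 food groups, once from the SFFQ
# and once from the mean of two 24-hour recalls (24DRs).
n, n_groups, n_patterns = 200, 29, 3
sffq = rng.gamma(shape=2.0, scale=1.0, size=(n, n_groups))
recalls = sffq + rng.normal(0, 0.8, size=(n, n_groups))   # noisy second measure

def pattern_scores(intake):
    """Derive dietary-pattern scores via factor analysis on standardized food groups."""
    z = StandardScaler().fit_transform(intake)
    return FactorAnalysis(n_components=n_patterns, random_state=0).fit_transform(z)

scores_sffq = pattern_scores(sffq)
scores_24dr = pattern_scores(recalls)

for p in range(n_patterns):
    r, pval = pearsonr(scores_sffq[:, p], scores_24dr[:, p])
    print(f"pattern {p + 1}: r = {r:.2f} (P = {pval:.3f})")
```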

  4. Non-Destructive Approaches for the Validation of Visually Observed Spatial Patterns of Decay

    Science.gov (United States)

    Johnston, Brian; McKinley, Jennifer; Warke, Patricia; Ruffell, Alastair

    2017-04-01

    Historical structures are regarded as a built legacy that is passed down through the generations and as such the conservation and restoration of these buildings is of great importance to governmental, religious and charitable organisations. As these groups play the role of custodians of this built heritage, they are therefore keen that the approaches employed in these studies of stone condition are non-destructive in nature. Determining sections of facades requiring repair work is often achieved through a visual conditional inspection of the stonework by a specialist. However, these reports focus upon the need to identify blocks requiring restorative action rather than the determination of spatial trends that lead to the identification of causes. This fixation on decay occurring at the block scale results in the spatial distribution of weathering present at the larger 'wall' scale appearing to have developed chaotically. Recent work has shown the importance of adopting a geomorphological focus when undertaking visual inspection of the facades of historical buildings to overcome this issue. Once trends have been ascertained, they can be used to bolster remedial strategies that target the sources of decay rather than just undertaking an aesthetic treatment of symptoms. Visual inspection of the study site, Fitzroy Presbyterian Church in Belfast, using the geomorphologically driven approach revealed three features suggestive of decay extending beyond the block scale. Firstly, the influence of architectural features on the susceptibility of blocks to decay. Secondly, the impact of the fluctuation in groundwater rise over the seasons and the influence of aspect upon this process. And finally, the interconnectivity of blocks, due to deteriorating mortar and poor repointing, providing conduits for the passage of moisture. Once these patterns were identified, it has proven necessary to validate the outcome of the visual inspection using other techniques. In this study

  5. 'Omics' approaches in tomato aimed at identifying candidate genes ...

    African Journals Online (AJOL)

    adriana

    2013-12-04

    Dec 4, 2013 ... approaches could be combined in order to identify candidate genes for the genetic control of ascorbic ..... applied to other traits under the complex control of many ... Engineering increased vitamin C levels in ... Chem. Biol. 13:532–538. Giovannucci E, Rimm EB, Liu Y, Stampfer MJ, Willett WC (2002). A.

  6. An Objective Approach to Identify Spectral Distinctiveness for Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Yeou-Jiunn Chen

    2013-01-01

    Full Text Available To facilitate the process of developing speech perception, speech-language pathologists have to teach a subject with hearing loss the differences between two syllables by manually enhancing acoustic cues of speech. However, this process is time consuming and difficult. Thus, this study proposes an objective approach to automatically identify the regions of spectral distinctiveness between two syllables, which is used for speech-perception training. To accurately represent the characteristics of speech, mel-frequency cepstrum coefficients are selected as analytical parameters. The mismatch between two syllables in time domain is handled by dynamic time warping. Further, a filter bank is adopted to estimate the components in different frequency bands, which are also represented as mel-frequency cepstrum coefficients. The spectral distinctiveness in different frequency bands is then easily estimated by using Euclidean metrics. Finally, a morphological gradient operator is applied to automatically identify the regions of spectral distinctiveness. To evaluate the proposed approach, the identified regions are manipulated and then the manipulated syllables are measured by a close-set based speech-perception test. The experimental results demonstrated that the identified regions of spectral distinctiveness are very useful in speech perception, which indeed can help speech-language pathologists in speech-perception training.
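
    A minimal sketch of the pipeline described above (MFCC extraction, DTW alignment, band-wise distance, morphological gradient) is shown below. It assumes two syllable recordings on disk with hypothetical file names, uses the MFCC coefficients directly as a stand-in for the paper's filter-bank bands, and is not the authors' implementation.

```python
import numpy as np
import librosa
from scipy.ndimage import grey_dilation, grey_erosion

def spectral_distinctiveness(wav_a, wav_b, n_mfcc=13, size=5):
    """MFCCs -> DTW alignment -> per-coefficient distance -> morphological gradient."""
    y_a, sr_a = librosa.load(wav_a, sr=None)
    y_b, sr_b = librosa.load(wav_b, sr=None)
    mfcc_a = librosa.feature.mfcc(y=y_a, sr=sr_a, n_mfcc=n_mfcc)
    mfcc_b = librosa.feature.mfcc(y=y_b, sr=sr_b, n_mfcc=n_mfcc)

    # Align the two syllables in time with dynamic time warping.
    _, path = librosa.sequence.dtw(X=mfcc_a, Y=mfcc_b, metric='euclidean')
    path = path[::-1]                                  # chronological order

    # Distance between aligned frames, kept per coefficient (frequency-band proxy).
    diff = np.abs(mfcc_a[:, path[:, 0]] - mfcc_b[:, path[:, 1]])

    # Morphological gradient highlights regions where the distance changes sharply.
    gradient = grey_dilation(diff, size=(1, size)) - grey_erosion(diff, size=(1, size))
    return diff, gradient

# Usage (hypothetical file names):
# distance, regions = spectral_distinctiveness("syllable_ba.wav", "syllable_da.wav")
```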

  7. Validation of two case definitions to identify pressure ulcers using hospital administrative data.

    Science.gov (United States)

    Ho, Chester; Jiang, Jason; Eastwood, Cathy A; Wong, Holly; Weaver, Brittany; Quan, Hude

    2017-08-28

    Pressure ulcer development is a quality of care indicator, as pressure ulcers are potentially preventable. Yet pressure ulcer is a leading cause of morbidity, discomfort and additional healthcare costs for inpatients. Methods are lacking for accurate surveillance of pressure ulcer in hospitals to track occurrences and evaluate care improvement strategies. The main study aim was to validate hospital discharge abstract database (DAD) in recording pressure ulcers against nursing consult reports, and to calculate prevalence of pressure ulcers in Alberta, Canada in DAD. We hypothesised that a more inclusive case definition for pressure ulcers would enhance validity of cases identified in administrative data for research and quality improvement purposes. A cohort of patients with pressure ulcers were identified from enterostomal (ET) nursing consult documents at a large university hospital in 2011. There were 1217 patients with pressure ulcers in ET nursing documentation that were linked to a corresponding record in DAD to validate DAD for correct and accurate identification of pressure ulcer occurrence, using two case definitions for pressure ulcer. Using pressure ulcer definition 1 (7 codes), prevalence was 1.4%, and using definition 2 (29 codes), prevalence was 4.2% after adjusting for misclassifications. The results were lower than expected. Definition 1 sensitivity was 27.7% and specificity was 98.8%, while definition 2 sensitivity was 32.8% and specificity was 95.9%. Pressure ulcer in both DAD and ET consultation increased with age, number of comorbidities and length of stay. DAD underestimate pressure ulcer prevalence. Since various codes are used to record pressure ulcers in DAD, the case definition with more codes captures more pressure ulcer cases, and may be useful for monitoring facility trends. However, low sensitivity suggests that this data source may not be accurate for determining overall prevalence, and should be cautiously compared with other
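
    The operating characteristics reported above reduce to simple ratios over a two-by-two table of case-definition flags versus the nursing-consult reference standard. The sketch below shows the arithmetic with invented counts, not the study's data.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Operating characteristics of a case definition against a reference standard."""
    sensitivity = tp / (tp + fn)   # reference cases that the codes find
    specificity = tn / (tn + fp)   # non-cases correctly left unflagged
    ppv = tp / (tp + fp)           # flagged records that are true cases
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for an administrative case definition vs. nursing consults
sens, spec, ppv, npv = diagnostic_accuracy(tp=340, fp=95, fn=877, tn=7600)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```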

  8. Designing and determining validity and reliability of a questionnaire to identify factors affecting nutritional behavior among patients with metabolic syndrome

    Directory of Open Access Journals (Sweden)

    Naseh Esmaeili

    2017-06-01

    Full Text Available Background: A number of studies have shown a clear relationship between diet and components of metabolic syndrome. Based on the Theory of Reasoned Action (TRA), attitude and subjective norm are factors affecting behavioral intention and subsequently behavior. The aim of the present study was to design a valid questionnaire identifying factors affecting nutritional behavior among patients with metabolic syndrome. Materials and Methods: Via a literature review, six focus group discussions and interviews with nutrition specialists were conducted to develop an instrument based on the Theory of Reasoned Action. To determine the validity of the instrument, content and face validity analyses were carried out with a panel of 15 experts, and to determine reliability, Cronbach’s alpha coefficient was calculated. Results: A draft 100-item questionnaire was developed; after evaluation of validity and reliability, the final questionnaire included 46 items: 17 items for attitude, 13 items for subjective norms and 16 items for behavioral intention. For the final questionnaire, the average content validity index was 0.92 and Cronbach’s alpha coefficient was 0.85. Conclusion: Based on the results of the current study, the developed questionnaire is a valid and reliable instrument and can be used to identify factors affecting nutritional behavior among people with metabolic syndrome based on the Theory of Reasoned Action.

  9. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    Directory of Open Access Journals (Sweden)

    Yamada Yoichi

    2012-12-01

    Full Text Available Background: We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings: We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified differentially expressed GO terms in this dataset. Conclusions: MIMGO is a reliable method to identify differentially expressed GO terms comprehensively.

  10. [Validation of the AUDIT test for identifying risk consumption and alcohol use disorders in women].

    Science.gov (United States)

    Pérula de Torres, L A; Fernández-García, J A; Arias-Vega, R; Muriel-Palomino, M; Márquez-Rebollo, E; Ruiz-Moral, R

    2005-11-30

    To validate the AUDIT test for identifying women with excess alcohol consumption and/or dependency syndrome (DS). Descriptive study to validate a test. Two primary care centres and a county drug-dependency centre. 414 women aged 18 to 75 recruited at the clinic. Interventions. Social and personal details were obtained through personal interview, their alcohol consumption was quantified and the AUDIT and MALT questionnaires were filled in. Then the semi-structured SCAN interview was conducted (gold standard; DSM-IV and CIE-10 criteria), and analyses were requested (GGT, GOT, GPT, VCM). 186 patients were given a follow-up appointment three-four weeks later (retest). Intra-observer reliability was evaluated with the Kappa index, internal consistency with Cronbach's alpha, and the validity of criteria with indexes of sensitivity and specificity, predictive values and probability quotients. To evaluate the diagnostic performance of the test and the most effective cut-off point, a ROC analysis was run. 11.4% (95% CI, 8.98-13.81) were diagnosed with alcohol abuse (0.5%) or DS (10.9%). The Kappa coefficients of the AUDIT items ranged between 0.685 and 0.795. The AUDIT is a questionnaire with good psycho-measurement properties. It is reliable and valid for the detection of risk consumption and DS in women.
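
    The cut-off selection step described above is typically done by computing a ROC curve and choosing the threshold that maximises Youden's J. A minimal sketch with simulated AUDIT scores and diagnoses (not the study data) follows.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)

# Hypothetical data: AUDIT total scores and gold-standard diagnoses (SCAN interview)
n = 414
diagnosis = rng.binomial(1, 0.114, size=n)                 # 1 = risk drinking / DS
audit_score = rng.normal(4 + 6 * diagnosis, 2.5, size=n)   # higher scores in cases

fpr, tpr, thresholds = roc_curve(diagnosis, audit_score)
auc = roc_auc_score(diagnosis, audit_score)

# Youden's J picks the cut-off that maximises sensitivity + specificity - 1
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.2f}; suggested cut-off = {thresholds[best]:.1f} "
      f"(sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%})")
```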

  11. Validation of Simulation Codes for Future Systems: Motivations, Approach and the Role of Nuclear Data

    International Nuclear Information System (INIS)

    G. Palmiotti; M. Salvatores; G. Aliberti

    2007-01-01

    The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case of reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives and approach. A validation effort is in particular necessary in the frame of advanced (e.g. Generation-IV or GNEP) reactors and associated fuel cycles assessment and design

  12. Hurricane Sandy Economic Impacts Assessment: A Computable General Equilibrium Approach and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Boero, Riccardo [Los Alamos National Laboratory; Edwards, Brian Keith [Los Alamos National Laboratory

    2017-08-07

    Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model degree of accuracy and the assessed total damage caused by Hurricane Sandy.

  13. A contemporary approach to validity arguments: a practical guide to Kane's framework.

    Science.gov (United States)

    Cook, David A; Brydges, Ryan; Ginsburg, Shiphra; Hatala, Rose

    2015-06-01

    Assessment is central to medical education and the validation of assessments is vital to their use. Earlier validity frameworks suffer from a multiplicity of types of validity or failure to prioritise among sources of validity evidence. Kane's framework addresses both concerns by emphasising key inferences as the assessment progresses from a single observation to a final decision. Evidence evaluating these inferences is planned and presented as a validity argument. We aim to offer a practical introduction to the key concepts of Kane's framework that educators will find accessible and applicable to a wide range of assessment tools and activities. All assessments are ultimately intended to facilitate a defensible decision about the person being assessed. Validation is the process of collecting and interpreting evidence to support that decision. Rigorous validation involves articulating the claims and assumptions associated with the proposed decision (the interpretation/use argument), empirically testing these assumptions, and organising evidence into a coherent validity argument. Kane identifies four inferences in the validity argument: Scoring (translating an observation into one or more scores); Generalisation (using the score[s] as a reflection of performance in a test setting); Extrapolation (using the score[s] as a reflection of real-world performance), and Implications (applying the score[s] to inform a decision or action). Evidence should be collected to support each of these inferences and should focus on the most questionable assumptions in the chain of inference. Key assumptions (and needed evidence) vary depending on the assessment's intended use or associated decision. Kane's framework applies to quantitative and qualitative assessments, and to individual tests and programmes of assessment. Validation focuses on evaluating the key claims, assumptions and inferences that link assessment scores with their intended interpretations and uses. The Implications

  14. Identifying Gifted Students in Puerto Rico: Validation of a Spanish Translation of the Gifted Rating Scales

    Science.gov (United States)

    Rosado, Javier I.; Pfeiffer, Steven; Petscher, Yaacov

    2015-01-01

    The challenge of correctly identifying gifted students is a critical issue. Gifted education in Puerto Rico is marked by insufficient support and a lack of appropriate identification methods. This study examined the reliability and validity of a Spanish translation of the "Gifted Rating Scales-School Form" (GRS) with a sample of 618…

  15. Validation of a fracture mechanics approach to nuclear transportation cask design through a drop test program

    International Nuclear Information System (INIS)

    Sorenson, K.B.

    1986-01-01

    Sandia National Laboratories (SNL), under contract to the Department of Energy, is conducting a research program to develop and validate a fracture mechanics approach to cask design. A series of drop tests of a transportation cask is planned for the summer of 1986 as the method for benchmarking and, thereby, validating the fracture mechanics approach. This paper presents the drop test plan and background leading to the development of the test plan including structural analyses, material characterization, and non-destructive evaluation (NDE) techniques necessary for defining the test plan properly

  16. The Laboratory-Based Intermountain Validated Exacerbation (LIVE) Score Identifies Chronic Obstructive Pulmonary Disease Patients at High Mortality Risk

    Directory of Open Access Journals (Sweden)

    Denitza P. Blagev

    2018-06-01

    Full Text Available Background: Identifying COPD patients at high risk for mortality or healthcare utilization remains a challenge. A robust system for identifying high-risk COPD patients using Electronic Health Record (EHR) data would empower targeting interventions aimed at ensuring guideline compliance and multimorbidity management. The purpose of this study was to empirically derive, validate, and characterize subgroups of COPD patients based on routinely collected clinical data widely available within the EHR. Methods: Cluster analysis was used in 5,006 patients with COPD at Intermountain to identify clusters based on a large collection of clinical variables. Recursive Partitioning (RP) was then used to determine a preferred tree that assigned patients to clusters based on a parsimonious variable subset. The mortality, COPD exacerbations, and comorbidity profile of the identified groups were examined. The findings were validated in an independent Intermountain cohort and in external cohorts from the United States Veterans Affairs (VA) and University of Chicago Medicine systems. Measurements and Main Results: The RP algorithm identified five LIVE Scores based on laboratory values: albumin, creatinine, chloride, potassium, and hemoglobin. The groups were characterized by increasing risk of mortality. The lowest risk, LIVE Score 5, had 8% 4-year mortality vs. 56% in the highest risk LIVE Score 1 (p < 0.001). These findings were validated in the VA cohort (n = 83,134), an expanded Intermountain cohort (n = 48,871) and in the University of Chicago system (n = 3,236). Higher mortality groups also had higher COPD exacerbation rates and comorbidity rates. Conclusions: In large clinical datasets across different organizations, the LIVE Score utilizes existing laboratory data for COPD patients, and may be used to stratify risk for mortality and COPD exacerbations.
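
    The two-step derivation described above (unsupervised clustering on laboratory values followed by recursive partitioning to obtain a parsimonious assignment rule) can be sketched as follows. The laboratory distributions, cluster count and tree depth are assumptions for illustration, not the published LIVE Score rules.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)

# Hypothetical routine laboratory values for 500 COPD patients
labs = pd.DataFrame({
    "albumin":    rng.normal(3.8, 0.5, 500),
    "creatinine": rng.normal(1.0, 0.3, 500),
    "chloride":   rng.normal(102, 4, 500),
    "potassium":  rng.normal(4.2, 0.4, 500),
    "hemoglobin": rng.normal(13.5, 1.8, 500),
})

# Step 1: empirically derive patient subgroups from the laboratory profile.
z = StandardScaler().fit_transform(labs)
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(z)

# Step 2: recursive partitioning yields a parsimonious rule that assigns new
# patients to a subgroup directly from the raw laboratory values.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(labs, clusters)
print(export_text(tree, feature_names=list(labs.columns)))
```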

  17. Market potential of nanoremediation in Europe - Market drivers and interventions identified in a deliberative scenario approach.

    Science.gov (United States)

    Bartke, Stephan; Hagemann, Nina; Harries, Nicola; Hauck, Jennifer; Bardos, Paul

    2018-04-01

    A deliberative expert-based scenario approach is applied to better understand the likely determinants of the evolution of the market for nanoparticle use in remediation in Europe until 2025. An initial set of factors was obtained from a literature review and complemented by a workshop and key-informant interviews. In further expert-engaging formats - focus groups, workshops, conferences, surveys - this initial set of factors was condensed, and the engaged experts scored the factors according to how likely they were to influence market development. An interaction matrix was obtained identifying the factors most active in shaping market development in Europe by 2025, namely "Science-Policy-Interface" and "Validated information on nanoparticle application potential". Based on these, potential scenarios were determined and the development of factors discussed. Conclusions are offered on achievable interventions to enhance nanoremediation deployment. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. A statistical approach to identify candidate cues for nestmate recognition

    DEFF Research Database (Denmark)

    van Zweden, Jelle; Pontieri, Luigi; Pedersen, Jes Søe

    2014-01-01

    normalization, centroid, and distance calculation is most diagnostic to discriminate between NMR cues and other compounds. We find that using a “global centroid” instead of a “colony centroid” significantly improves the analysis. One reason may be that this new approach, unlike previous ones, provides...... than for F. exsecta, possibly due to less than ideal datasets. Nonetheless, some compound sets performed better than others, showing that this approach can be used to identify candidate compounds to be tested in bio-assays, and eventually crack the sophisticated code that governs nestmate recognition....

  19. Relative validity of a food frequency questionnaire to identify dietary patterns in an adult Mexican population

    Directory of Open Access Journals (Sweden)

    Edgar Denova-Gutiérrez

    2016-12-01

    Full Text Available Objective. To examine the validity of a semi-quantitative food frequency questionnaire (SFFQ) to identify dietary patterns in an adult Mexican population. Materials and methods. A 140-item SFFQ and two 24-hour dietary recalls (24DRs) were administered. Foods were categorized into 29 food groups used to derive dietary patterns via factor analysis. Pearson and intraclass correlation coefficients between dietary pattern scores identified from the SFFQ and 24DRs were assessed. Results. Pattern 1 was high in snacks, fast food, soft drinks, processed meats and refined grains; pattern 2 was high in fresh vegetables, fresh fruits, and dairy products; and pattern 3 was high in legumes, eggs, sweetened foods and sugars. Pearson correlation coefficients between the SFFQ and the 24DRs for these patterns were 0.66 (P<0.001), 0.41 (P<0.001) and 0.29 (P=0.193) respectively. Conclusions. Our data indicate reasonable validity of the SFFQ, using factor analysis, to derive major dietary patterns in comparison with two 24DRs.

  20. A Methodology for Validating Safety Heuristics Using Clinical Simulations: Identifying and Preventing Possible Technology-Induced Errors Related to Using Health Information Systems

    Science.gov (United States)

    Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher

    2013-01-01

    Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902

  1. Developing and validating the Youth Conduct Problems Scale-Rwanda: a mixed methods approach.

    Directory of Open Access Journals (Sweden)

    Lauren C Ng

    Full Text Available This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitively tested by 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally-derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research.

  2. Validity of administrative database code algorithms to identify vascular access placement, surgical revisions, and secondary patency.

    Science.gov (United States)

    Al-Jaishi, Ahmed A; Moist, Louise M; Oliver, Matthew J; Nash, Danielle M; Fleet, Jamie L; Garg, Amit X; Lok, Charmaine E

    2018-03-01

    We assessed the validity of physician billing codes and hospital admission using International Classification of Diseases 10th revision codes to identify vascular access placement, secondary patency, and surgical revisions in administrative data. We included adults (≥18 years) with a vascular access placed between 1 April 2004 and 31 March 2013 at the University Health Network, Toronto. Our reference standard was a prospective vascular access database (VASPRO) that contains information on vascular access type and dates of placement, dates for failure, and any revisions. We used VASPRO to assess the validity of different administrative coding algorithms by calculating the sensitivity, specificity, and positive predictive values of vascular access events. The sensitivity (95% confidence interval) of the best performing algorithm to identify arteriovenous access placement was 86% (83%, 89%) and specificity was 92% (89%, 93%). The corresponding numbers to identify catheter insertion were 84% (82%, 86%) and 84% (80%, 87%), respectively. The sensitivity of the best performing coding algorithm to identify arteriovenous access surgical revisions was 81% (67%, 90%) and specificity was 89% (87%, 90%). The algorithm capturing arteriovenous access placement and catheter insertion had a positive predictive value greater than 90% and arteriovenous access surgical revisions had a positive predictive value of 20%. The duration of arteriovenous access secondary patency was on average 578 (553, 603) days in VASPRO and 555 (530, 580) days in administrative databases. Administrative data algorithms have fair to good operating characteristics to identify vascular access placement and arteriovenous access secondary patency. Low positive predictive values for surgical revisions algorithm suggest that administrative data should only be used to rule out the occurrence of an event.

  3. Validation of standard operating procedures in a multicenter retrospective study to identify -omics biomarkers for chronic low back pain.

    Directory of Open Access Journals (Sweden)

    Concetta Dagostino

    Full Text Available Chronic low back pain (CLBP) is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for huge societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine). Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for recruitment of large numbers of participants among different centres (clinical and laboratories) to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs) for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres) retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1) blood collection, (2) sample processing and storage, (3) shipping details and (4) cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicenter study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large-omics-based multicenter studies.

  4. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    Directory of Open Access Journals (Sweden)

    William S Beatty

    Full Text Available The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods, however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result

  5. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    Science.gov (United States)

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods, however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted
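
    The net-displacement method summarised above can be illustrated by fitting a simple logistic curve to displacement-versus-time data; the fitted midpoint parameter is then read as the migration midpoint. The track below is simulated and the logistic form is one plausible choice of nonlinear model, not necessarily the one the authors used.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_displacement(t, asymptote, midpoint, rate):
    """Net displacement from the origin range as a function of time."""
    return asymptote / (1.0 + np.exp(-rate * (t - midpoint)))

# Hypothetical track: day of year and net displacement (km) for one transmittered bird
days = np.arange(280, 340)                      # early October to early December
true_curve = logistic_displacement(days, 1200, 319, 0.25)
displacement = true_curve + np.random.default_rng(4).normal(0, 40, days.size)

params, _ = curve_fit(logistic_displacement, days, displacement, p0=[1000, 310, 0.1])
print(f"estimated autumn migration midpoint: day {params[1]:.0f}")
```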

  6. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  7. An approach to identify issues affecting ERP implementation in Indian SMEs

    Directory of Open Access Journals (Sweden)

    Rana Basu

    2012-06-01

    Full Text Available Purpose: The purpose of this paper is to present the findings of a study based on a comprehensive compilation of the literature and subsequent analysis of ERP implementation success issues in the context of Indian Small and Medium scale Enterprises (SMEs). The paper explores the existing literature to highlight issues in ERP implementation, and the researchers then applied TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) to prioritize the issues affecting successful implementation of ERP. Design/methodology/approach: Based on the literature review, issues leading to successful ERP implementation were identified, and Pareto analysis (the 80-20 rule) was applied to extract the key issues. A TOPSIS-based survey was then carried out in Indian small and medium scale enterprises. Findings: Twenty-five issues were identified from the literature; Pareto analysis extracted the key issues, which were then prioritized with the TOPSIS method. Research limitations/implications: Besides the issues identified, there may be other issues that need to be explored. There is scope to enhance this study by considering different types of industries and by extending the number of respondents. Practical implications: By identifying the key issues for SMEs, managers can better prioritize issues to make the implementation process run smoothly without disruption. ERP vendors can use the study's findings to adjust their implementation approach when targeting small-scale enterprises. Originality/value: No published literature has followed a similar approach to identifying the critical issues affecting ERP in small and mid-sized companies in India or in any other developing economy.
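
    For readers unfamiliar with TOPSIS, the sketch below implements the standard procedure: vector normalisation, weighting, distances to the ideal and anti-ideal solutions, and a closeness coefficient used for ranking. The ratings, weights and criteria are hypothetical and unrelated to the survey data of the study.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on criteria (columns) with TOPSIS.
    benefit[j] is True when larger values of criterion j are better."""
    x = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)

    norm = x / np.linalg.norm(x, axis=0)            # vector normalisation
    v = norm * w                                    # weighted normalised matrix

    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti_ideal = np.where(benefit, v.min(axis=0), v.max(axis=0))

    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti_ideal, axis=1)
    return d_worst / (d_best + d_worst)             # closeness: higher = better

# Hypothetical ratings of five implementation issues on three survey criteria
ratings = [[7, 6, 8], [9, 5, 7], [6, 8, 6], [8, 7, 9], [5, 9, 5]]
scores = topsis(ratings, weights=[0.5, 0.3, 0.2], benefit=[True, True, True])
print(np.argsort(scores)[::-1] + 1)   # issues ordered from highest to lowest priority
```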

  8. Validating emotional attention regulation as a component of emotional intelligence: A Stroop approach to individual differences in tuning in to and out of nonverbal cues.

    Science.gov (United States)

    Elfenbein, Hillary Anger; Jang, Daisung; Sharma, Sudeep; Sanchez-Burks, Jeffrey

    2017-03-01

    Emotional intelligence (EI) has captivated researchers and the public alike, but it has been challenging to establish its components as objective abilities. Self-report scales lack divergent validity from personality traits, and few ability tests have objectively correct answers. We adapt the Stroop task to introduce a new facet of EI called emotional attention regulation (EAR), which involves focusing emotion-related attention for the sake of information processing rather than for the sake of regulating one's own internal state. EAR includes 2 distinct components. First, tuning in to nonverbal cues involves identifying nonverbal cues while ignoring alternate content, that is, emotion recognition under conditions of distraction by competing stimuli. Second, tuning out of nonverbal cues involves ignoring nonverbal cues while identifying alternate content, that is, the ability to interrupt emotion recognition when needed to focus attention elsewhere. An auditory test of valence included positive and negative words spoken in positive and negative vocal tones. A visual test of approach-avoidance included green- and red-colored facial expressions depicting happiness and anger. The error rates for incongruent trials met the key criteria for establishing the validity of an EI test, in that the measure demonstrated test-retest reliability, convergent validity with other EI measures, divergent validity from factors such as general processing speed and mostly personality, and predictive validity in this case for well-being. By demonstrating that facets of EI can be validly theorized and empirically assessed, results also speak to the validity of EI more generally. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. The node-weighted Steiner tree approach to identify elements of cancer-related signaling pathways.

    Science.gov (United States)

    Sun, Yahui; Ma, Chenkai; Halgamuge, Saman

    2017-12-28

    Cancer constitutes a momentous health burden in our society. Critical information on cancer may be hidden in its signaling pathways. However, even though a large amount of money has been spent on cancer research, some critical information on cancer-related signaling pathways still remains elusive. Hence, new works towards a complete understanding of cancer-related signaling pathways will greatly benefit the prevention, diagnosis, and treatment of cancer. We propose the node-weighted Steiner tree approach to identify important elements of cancer-related signaling pathways at the level of proteins. This new approach has advantages over previous approaches since it is fast in processing large protein-protein interaction networks. We apply this new approach to identify important elements of two well-known cancer-related signaling pathways: PI3K/Akt and MAPK. First, we generate a node-weighted protein-protein interaction network using protein and signaling pathway data. Second, we modify and use two preprocessing techniques and a state-of-the-art Steiner tree algorithm to identify a subnetwork in the generated network. Third, we propose two new metrics to select important elements from this subnetwork. On a commonly used personal computer, this new approach takes less than 2 s to identify the important elements of PI3K/Akt and MAPK signaling pathways in a large node-weighted protein-protein interaction network with 16,843 vertices and 1,736,922 edges. We further analyze and demonstrate the significance of these identified elements to cancer signal transduction by exploring previously reported experimental evidences. Our node-weighted Steiner tree approach is shown to be both fast and effective to identify important elements of cancer-related signaling pathways. Furthermore, it may provide new perspectives into the identification of signaling pathways for other human diseases.
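
    A common way to prototype a node-weighted Steiner tree query is to fold node costs into edge weights and reuse an edge-weighted approximation, as sketched below with networkx. This reduction and the toy protein network are illustrative assumptions; the paper uses its own preprocessing techniques and a dedicated state-of-the-art algorithm.

```python
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

def node_weighted_steiner(graph, node_cost, terminals):
    """Approximate a node-weighted Steiner tree by folding node costs into
    edge weights, then calling networkx's edge-weighted approximation."""
    g = nx.Graph()
    for u, v in graph.edges():
        # Each edge pays half the cost of both endpoints it touches.
        g.add_edge(u, v, weight=0.5 * (node_cost[u] + node_cost[v]))
    return steiner_tree(g, terminals, weight="weight")

# Hypothetical protein-protein interaction toy network; costs could encode, e.g.,
# 1 - (confidence that the protein belongs to the pathway), so low = important.
ppi = nx.Graph([("EGFR", "PIK3CA"), ("PIK3CA", "AKT1"), ("AKT1", "MTOR"),
                ("EGFR", "GRB2"), ("GRB2", "SOS1"), ("SOS1", "AKT1")])
cost = {"EGFR": 0.1, "PIK3CA": 0.2, "AKT1": 0.1, "MTOR": 0.3,
        "GRB2": 0.8, "SOS1": 0.9}
tree = node_weighted_steiner(ppi, cost, terminals=["EGFR", "MTOR"])
print(sorted(tree.nodes()))
```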

  10. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    conceptual framework, and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. The approach: 1) Identify needs & requirements for IAT. 2) Develop IAT conceptual framework. 3) Validate IAT methods. 4) Develop applications materials.

  11. Nutrient profiling can help identify foods of good nutritional quality for their price: a validation study with linear programming.

    Science.gov (United States)

    Maillot, Matthieu; Ferguson, Elaine L; Drewnowski, Adam; Darmon, Nicole

    2008-06-01

    Nutrient profiling ranks foods based on their nutrient content and may help identify foods with a good nutritional quality for their price. This hypothesis was tested using diet modeling with linear programming. Analyses were undertaken using food intake data from the nationally representative French INCA (enquête Individuelle et Nationale sur les Consommations Alimentaires) survey and its associated food composition and price database. For each food, a nutrient profile score was defined as the ratio between the previously published nutrient density score (NDS) and the limited nutrient score (LIM); a nutritional quality for price indicator was developed and calculated from the relationship between its NDS:LIM and energy cost (in euro/100 kcal). We developed linear programming models to design diets that fulfilled increasing levels of nutritional constraints at a minimal cost. The median NDS:LIM values of foods selected in modeled diets increased as the levels of nutritional constraints increased (P = 0.005). In addition, the proportion of foods with a good nutritional quality for price indicator was higher among the foods selected in modeled diets. The agreement between the linear programming and the nutrient profiling approaches indicates that nutrient profiling can help identify foods of good nutritional quality for their price. Linear programming is a useful tool for testing nutrient profiling systems and validating the concept of nutrient profiling.
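
    The linear programming step, designing a least-cost diet subject to nutritional constraints, can be sketched with scipy. The foods, nutrient values, prices and constraint levels below are invented for illustration and bear no relation to the INCA database.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical per-100-kcal data for a handful of foods
foods = ["legumes", "soft drink", "fresh fruit", "processed meat", "whole grain"]
cost_per_100kcal = np.array([0.25, 0.10, 0.60, 0.45, 0.20])   # euro
protein = np.array([7.0, 0.0, 1.0, 9.0, 4.0])                 # g
fibre = np.array([6.0, 0.0, 3.0, 0.0, 5.0])                   # g
sodium = np.array([5.0, 10.0, 2.0, 450.0, 3.0])               # mg

# Decision variable: number of 100-kcal portions of each food in the daily diet.
# Objective: minimise diet cost subject to nutritional constraints.
c = cost_per_100kcal
A_ub = np.vstack([-protein, -fibre, sodium])    # ">=" constraints negated into "<="
b_ub = np.array([-50.0, -25.0, 2300.0])         # protein>=50g, fibre>=25g, Na<=2300mg
A_eq = np.ones((1, len(foods)))                 # total energy fixed at 2000 kcal
b_eq = np.array([20.0])                         # 20 portions x 100 kcal

result = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                 bounds=[(0, None)] * len(foods), method="highs")
print(dict(zip(foods, np.round(result.x, 2))), f"cost = {result.fun:.2f} euro")
```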

  12. EEG-based motor network biomarkers for identifying target patients with stroke for upper limb rehabilitation and its construct validity.

    Directory of Open Access Journals (Sweden)

    Chun-Chuan Chen

    Full Text Available Rehabilitation is the main therapeutic approach for reducing poststroke functional deficits in the affected upper limb; however, significant between-patient variability in rehabilitation efficacy indicates the need to target patients who are likely to have clinically significant improvement after treatment. Many studies have determined robust predictors of recovery and treatment gains and yielded strong results using linear approaches. Evidence has emerged that nonlinearity is a crucial aspect of inter-areal communication in the human brain and that abnormal oscillatory activity in the motor system is linked to pathological states. In this study, we hypothesized that combinations of linear and nonlinear (cross-frequency) network connectivity parameters are favourable biomarkers for stratifying patients for upper limb rehabilitation with increased accuracy. We identified the biomarkers by using 37 prerehabilitation electroencephalogram (EEG) datasets during a movement task through effective connectivity and logistic regression analyses. The predictive power of these biomarkers was then tested by using 16 independent datasets (i.e. construct validation). In addition, 14 right-handed healthy subjects were also enrolled for comparisons. The results show that the beta plus gamma or theta network features provided the best classification accuracy of 92%. The predictive value and the sensitivity of these biomarkers were 81.3% and 90.9%, respectively. Subcortical lesion, the time poststroke and initial Wolf Motor Function Test (WMFT) score were identified as the most significant clinical variables affecting the classification accuracy of this predictive model. Moreover, 12 of 14 normal controls were classified as having favourable recovery. In conclusion, EEG-based linear and nonlinear motor network biomarkers are robust and can help clinical decision making.
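
    The classification step (logistic regression on prerehabilitation connectivity features, evaluated for accuracy) can be sketched as follows. The feature matrix and labels are simulated; the actual study derives its features from effective-connectivity analysis of the EEG recordings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# Hypothetical prerehabilitation features: beta, gamma and theta connectivity
# strengths per patient, plus a label for clinically significant improvement.
n_patients = 37
X = rng.normal(size=(n_patients, 3))                 # columns: [beta, gamma, theta]
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.7, n_patients) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
accuracy = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {accuracy.mean():.0%}")
```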

  13. Fatigue Equivalent Stress State Approach Validation in Non-conservative Criteria: a Comparative Study

    Directory of Open Access Journals (Sweden)

    Kévin Martial Tsapi Tchoupou

    Full Text Available This paper is concerned with fatigue prediction models for estimating the multiaxial fatigue limit. An equivalent loading approach with zero out-of-phase angles, intended for fatigue limit evaluation under multiaxial loading, is used. Based on experimental data found in the literature, the equivalent stress is validated in the Crossland and Sines criteria and its predictions are compared to those of existing multiaxial fatigue criteria; results over 87 experimental items show that the equivalent stress approach is very efficient.
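
    As context for how such criteria are evaluated, the sketch below computes a Crossland-type fatigue index for simple in-phase bending-torsion from textbook formulas; the handling of out-of-phase loading via the equivalent stress transformation is the subject of the paper and is not reproduced here. The material data are hypothetical.

```python
import math

def crossland_index(sigma_a, sigma_m, tau_a, f_minus1, t_minus1):
    """Crossland fatigue index for in-phase bending-torsion.
    sigma_a / sigma_m: normal stress amplitude / mean, tau_a: shear amplitude (MPa).
    f_minus1, t_minus1: fully reversed bending and torsion fatigue limits (MPa).
    Index <= 1 means the loading is predicted to lie below the fatigue limit."""
    sqrt_j2a = math.sqrt(sigma_a ** 2 / 3.0 + tau_a ** 2)     # amplitude of sqrt(J2)
    sigma_h_max = (sigma_m + sigma_a) / 3.0                   # maximum hydrostatic stress
    alpha = 3.0 * t_minus1 / f_minus1 - math.sqrt(3.0)        # calibrated on the two limits
    beta = t_minus1
    return (sqrt_j2a + alpha * sigma_h_max) / beta

# Hypothetical smooth-specimen data (MPa)
print(f"Crossland index = {crossland_index(180, 50, 95, f_minus1=320, t_minus1=205):.2f}")
```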

  14. Validation of an employee satisfaction model: A structural equation model approach

    OpenAIRE

    Ophillia Ledimo; Nico Martins

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling approach (SEM). A cross-sectional quantitative survey design was used to collect data from a random sample of (n=759) permanent employees of a parastatal organisation. Data was collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of ...

  15. Three novel approaches to structural identifiability analysis in mixed-effects models.

    Science.gov (United States)

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the systems transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provides a way of handling structural identifiability in mixed-effects models previously not

  16. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Zha, Y.; Westerhuis, J.A.; Muilwijk, B.; Overkamp, K.M.; Nijmeijer, B.M.; Coulier, L.; Smilde, A.K.; Punt, P.J.

    2014-01-01

    Background: During the pretreatment of lignocellulosic biomass, inhibitors are formed that reduce the fermentation performance of the fermenting yeast. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates. Results: We studied the

  17. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Zha, Y.; Westerhuis, J.A.; Muilwijk, B.; Overkamp, K.M.; Nijmeijer, B.M.; Coulier, L.; Smilde, A.K.; Punt, P.J.

    2014-01-01

    BACKGROUND: During the pretreatment of lignocellulosic biomass, inhibitors are formed that reduce the fermentation performance of the fermenting yeast. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates. RESULTS: We studied the

  18. The validity of register data to identify children with atopic dermatitis, asthma or allergic rhinoconjunctivitis.

    Science.gov (United States)

    Stensballe, Lone Graff; Klansø, Lotte; Jensen, Andreas; Haerskjold, Ann; Thomsen, Simon Francis; Simonsen, Jacob

    2017-09-01

    The incidence of atopic dermatitis, wheezing, asthma and allergic rhinoconjunctivitis has been increasing. Register-based studies are essential for research in subpopulations with specific diseases and facilitate epidemiological studies to identify causes and evaluate interventions. Algorithms have been developed to identify children with atopic dermatitis, asthma or allergic rhinoconjunctivitis using register information on disease-specific dispensed prescribed medication and hospital contacts, but the validity of the algorithms has not been evaluated. This study validated the algorithms vs gold standard deep telephone interviews with the caretaker about physician-diagnosed atopic dermatitis, wheezing, asthma or allergic rhinoconjunctivitis in the child. The algorithms defined each of the three atopic diseases using register-based information on disease-specific hospital contacts and/or filled prescriptions of disease-specific medication. Confirmative answers to questions about physician-diagnosed atopic disease were used as the gold standard for the comparison with the algorithms, resulting in sensitivities and specificities and 95% confidence intervals. The interviews with the caretaker of the included 454 Danish children born 1997-2003 were carried out May-September 2015; the mean age of the children at the time of the interview being 15.2 years (standard deviation 1.3 years). For the algorithm capturing children with atopic dermatitis, the sensitivity was 74.1% (95% confidence interval: 66.9%-80.2%) and the specificity 73.0% (67.3%-78.0%). For the algorithm capturing children with asthma, both the sensitivity of 84.1% (78.0%-88.8%) and the specificity of 81.6% (76.5%-85.8%) were high compared with physician-diagnosed asthmatic bronchitis (recurrent wheezing). The sensitivity remained high when capturing physician-diagnosed asthma: 83.3% (74.3%-89.6%); however, the specificity declined to 66.0% (60.9%-70.8%). For allergic rhinoconjunctivitis, the sensitivity
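
    As an illustration of the comparison described above (not the authors' code), the sketch below computes sensitivity and specificity of a register-based algorithm against an interview-based gold standard, with Wilson 95% confidence intervals; variable names and inputs are hypothetical.

      import math

      def wilson_ci(successes, n, z=1.96):
          # Wilson score interval for a binomial proportion.
          p = successes / n
          denom = 1 + z**2 / n
          centre = (p + z**2 / (2 * n)) / denom
          half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
          return centre - half, centre + half

      def sensitivity_specificity(algorithm_positive, interview_positive):
          # Both inputs are parallel lists of booleans, one entry per child.
          pairs = list(zip(algorithm_positive, interview_positive))
          tp = sum(a and g for a, g in pairs)
          fn = sum(not a and g for a, g in pairs)
          tn = sum(not a and not g for a, g in pairs)
          fp = sum(a and not g for a, g in pairs)
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          return (sens, wilson_ci(tp, tp + fn)), (spec, wilson_ci(tn, tn + fp))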

  19. Mixed Integer Linear Programming based machine learning approach identifies regulators of telomerase in yeast.

    Science.gov (United States)

    Poos, Alexandra M; Maicher, André; Dieckmann, Anna K; Oswald, Marcus; Eils, Roland; Kupiec, Martin; Luke, Brian; König, Rainer

    2016-06-02

    Understanding telomere length maintenance mechanisms is central in cancer biology as their dysregulation is one of the hallmarks for immortalization of cancer cells. Important for this well-balanced control is the transcriptional regulation of the telomerase genes. We integrated Mixed Integer Linear Programming models into a comparative machine learning based approach to identify regulatory interactions that best explain the discrepancy of telomerase transcript levels in yeast mutants with deleted regulators showing aberrant telomere length, when compared to mutants with normal telomere length. We uncover novel regulators of telomerase expression, several of which affect histone levels or modifications. In particular, our results point to the transcription factors Sum1, Hst1 and Srb2 as being important for the regulation of EST1 transcription, and we validated the effect of Sum1 experimentally. We compiled our machine learning method into a user-friendly R package that can be applied straightforwardly to similar problems integrating gene-regulator binding information and expression profiles of samples from, e.g., different phenotypes, diseases or treatments. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Validation of a LES turbulence modeling approach on a steady engine head flow

    NARCIS (Netherlands)

    Huijnen, V.; Somers, L.M.T.; Baert, R.S.G.; Goey, de L.P.H.; Dias, V.

    2005-01-01

    The application of the LES turbulence modeling approach in the Kiva-environment is validated on a complex geometry. Results for the steady flow in a realistic geometry of a production type heavy-duty diesel engine head with 120 mm cylinder bore are presented. The bulk Reynolds number is Reb = 1 fl

  1. Innovative Approach to Validation of Ultraviolet (UV) Reactors for Disinfection in Drinking Water Systems

    Science.gov (United States)

    Slide presentation at Conference: ASCE 7th Civil Engineering Conference in the Asian Region. USEPA in partnership with the Cadmus Group, Carollo Engineers, and other State & Industry collaborators, are evaluating new approaches for validating UV reactors to meet groundwater & sur...

  2. Novel approaches to identify protective malaria vaccine candidates

    Directory of Open Access Journals (Sweden)

    Wan Ni Chia

    2014-11-01

    Full Text Available Efforts to develop vaccines against malaria have been the focus of substantial research activities for decades. Several categories of candidate vaccines are currently being developed for protection against malaria, based on antigens corresponding to the pre-erythrocytic, blood-stage or sexual stages of the parasite. Long-lasting sterile protection from Plasmodium falciparum sporozoite challenge has been observed in humans following vaccination with whole parasite formulations, clearly demonstrating that a protective immune response targeting predominantly the pre-erythrocytic stages can develop against malaria. However, most of the vaccine candidates currently being investigated, which are mostly subunit vaccines, have not been able to induce substantial (>50%) protection thus far. This is because the antigens responsible for protection against the different parasite stages are still unknown, and relevant correlates of protection have remained elusive. For a vaccine to be developed in a timely manner, novel approaches are required. In this article, we review the novel approaches that have been developed to identify the antigens for the development of an effective malaria vaccine.

  3. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    Science.gov (United States)

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  4. Alternative approaches for identifying acute systemic toxicity: Moving from research to regulatory testing.

    Science.gov (United States)

    Hamm, Jon; Sullivan, Kristie; Clippinger, Amy J; Strickland, Judy; Bell, Shannon; Bhhatarai, Barun; Blaauboer, Bas; Casey, Warren; Dorman, David; Forsby, Anna; Garcia-Reyero, Natàlia; Gehen, Sean; Graepel, Rabea; Hotchkiss, Jon; Lowit, Anna; Matheson, Joanna; Reaves, Elissa; Scarano, Louis; Sprankle, Catherine; Tunkel, Jay; Wilson, Dan; Xia, Menghang; Zhu, Hao; Allen, David

    2017-06-01

    Acute systemic toxicity testing provides the basis for hazard labeling and risk management of chemicals. A number of international efforts have been directed at identifying non-animal alternatives for in vivo acute systemic toxicity tests. A September 2015 workshop, Alternative Approaches for Identifying Acute Systemic Toxicity: Moving from Research to Regulatory Testing, reviewed the state-of-the-science of non-animal alternatives for this testing and explored ways to facilitate implementation of alternatives. Workshop attendees included representatives from international regulatory agencies, academia, nongovernmental organizations, and industry. Resources identified as necessary for meaningful progress in implementing alternatives included compiling and making available high-quality reference data, training on use and interpretation of in vitro and in silico approaches, and global harmonization of testing requirements. Attendees particularly noted the need to characterize variability in reference data to evaluate new approaches. They also noted the importance of understanding the mechanisms of acute toxicity, which could be facilitated by the development of adverse outcome pathways. Workshop breakout groups explored different approaches to reducing or replacing animal use for acute toxicity testing, with each group crafting a roadmap and strategy to accomplish near-term progress. The workshop steering committee has organized efforts to implement the recommendations of the workshop participants. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Validation needs of seismic probabilistic risk assessment (PRA) methods applied to nuclear power plants

    International Nuclear Information System (INIS)

    Kot, C.A.; Srinivasan, M.G.; Hsieh, B.J.

    1985-01-01

    An effort to validate seismic PRA methods is in progress. The work concentrates on the validation of plant response and fragility estimates through the use of test data and information from actual earthquake experience. Validation needs have been identified in the areas of soil-structure interaction, structural response and capacity, and equipment fragility. Of particular concern is the adequacy of linear methodology to predict nonlinear behavior. While many questions can be resolved through the judicious use of dynamic test data, other aspects can only be validated by means of input and response measurements during actual earthquakes. A number of past, ongoing, and planned testing programs which can provide useful validation data have been identified, and validation approaches for specific problems are being formulated

  6. Validating the Copenhagen Psychosocial Questionnaire (COPSOQ-II) Using Set-ESEM: Identifying Psychosocial Risk Factors in a Sample of School Principals.

    Science.gov (United States)

    Dicke, Theresa; Marsh, Herbert W; Riley, Philip; Parker, Philip D; Guo, Jiesi; Horwood, Marcus

    2018-01-01

    School principals world-wide report high levels of strain and attrition resulting in a shortage of qualified principals. It is thus crucial to identify psychosocial risk factors that reflect principals' occupational wellbeing. For this purpose, we used the Copenhagen Psychosocial Questionnaire (COPSOQ-II), a widely used self-report measure covering multiple psychosocial factors identified by leading occupational stress theories. We evaluated the COPSOQ-II regarding factor structure and longitudinal, discriminant, and convergent validity using latent structural equation modeling in a large sample of Australian school principals (N = 2,049). Results reveal that confirmatory factor analysis produced marginally acceptable model fit. A novel approach we call set exploratory structural equation modeling (set-ESEM), where cross-loadings were only allowed within a priori defined sets of factors, fit well, and was more parsimonious than a full ESEM. Further multitrait-multimethod models based on the set-ESEM confirm the importance of a principal's psychosocial risk factors: stressors and depression were related to demands and ill-being, while confidence and autonomy were related to wellbeing. We also show that working in the private sector was associated with lower psychosocial risk, while other demographic characteristics had little effect. Finally, we identify five latent risk profiles (high risk to no risk) of school principals based on all psychosocial factors. Overall, the research presented here closes the theory-application gap of a strong multi-dimensional measure of psychosocial risk factors.

  7. Transaortic TAVI Is a Valid Alternative to Transapical Approach.

    Science.gov (United States)

    O' Sullivan, Katie E; Hurley, Eoghan T; Segurado, Ricardo; Sugrue, Declan; Hurley, John P

    2015-05-01

    Transcatheter aortic valve implantation (TAVI) can be performed via a number of different anatomical approaches based on patient characteristics and operator choice. The aim of this study was to compare procedural outcomes between transaortic (TAo) and transapical (TA) approaches in an effort to establish whether any differences exist. A systematic review and meta-analysis of the current literature reporting outcomes for patients undergoing TAo and TA TAVI was performed to compare outcomes using each vascular approach to valve deployment. A total of 10 studies and 1736 patients were included. A total of 193 patients underwent TAo and 1543 TA TAVI. No significant difference in 30-day mortality was identified (TAo 9.4% vs. TA 10.4%, p = 0.7). There were no significant differences identified between TAo and TA TAVI in procedural success rate (96.3% vs. 93.7%, p = 0.3), stroke and transient ischemic attack (TIA) incidence (1.8% vs. 2.3%, p = 0.7), major bleed (5.8% vs. 5.5%, p = 0.97) or pacemaker insertion rates (6.1% vs. 7.4%, p = 0.56). In addition, the incidence of clinically significant paravalvular regurgitation (PVR) was the same between groups (6.7% vs. 11%, p = 0.43). Comparison of TAo and TA approaches revealed equivalent outcomes in 30-day mortality, procedural success, major bleeding, stroke/TIA incidence, pacemaker insertion rates and paravalvular leak. Heart teams should be familiar with the use of both TA and TAo access and tailor their selection on a case-to-case basis. © 2015 Wiley Periodicals, Inc.

  8. A comparison of approaches for finding minimum identifying codes on graphs

    Science.gov (United States)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

    In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard and the computational complexity makes this research approach difficult using a standard brute force approach on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods are explored and consist of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and lastly using satisfiability modulo theory (SMT) and corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
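
    As a concrete reference point for the problem described above (not taken from the paper), a minimum identifying code of a small graph can be found by brute force: a vertex subset C is an identifying code if every vertex has a non-empty, unique "signature" N[v] ∩ C. The sketch below is exactly the kind of exhaustive search that becomes infeasible at scale, which motivates the parallel, quantum-annealing and SMT formulations compared in the paper.

      from itertools import combinations

      def closed_neighborhood(adj, v):
          return frozenset(adj[v]) | {v}

      def is_identifying_code(adj, code):
          code = set(code)
          seen = set()
          for v in adj:
              sig = closed_neighborhood(adj, v) & code
              if not sig or sig in seen:     # signature must be non-empty and unique
                  return False
              seen.add(sig)
          return True

      def minimum_identifying_code(adj):
          vertices = list(adj)
          for k in range(1, len(vertices) + 1):
              for code in combinations(vertices, k):
                  if is_identifying_code(adj, code):
                      return set(code)
          return None   # e.g. graphs with "twin" vertices admit no identifying code

      # Example: the 5-cycle, with adjacency given as a dict of neighbour lists.
      c5 = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
      print(minimum_identifying_code(c5))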

  9. Identifying bioaccumulative halogenated organic compounds using a nontargeted analytical approach: seabirds as sentinels.

    Directory of Open Access Journals (Sweden)

    Christopher J Millow

    Full Text Available Persistent organic pollutants (POPs) are typically monitored via targeted mass spectrometry, which potentially identifies only a fraction of the contaminants actually present in environmental samples. With new anthropogenic compounds continuously introduced to the environment, novel and proactive approaches that provide a comprehensive alternative to targeted methods are needed in order to more completely characterize the diversity of known and unknown compounds likely to cause adverse effects. Nontargeted mass spectrometry attempts to extensively screen for compounds, providing a feasible approach for identifying contaminants that warrant future monitoring. We employed a nontargeted analytical method using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOF-MS) to characterize halogenated organic compounds (HOCs) in California Black skimmer (Rynchops niger) eggs. Our study identified 111 HOCs; 84 of these compounds were regularly detected via targeted approaches, while 27 were classified as typically unmonitored or unknown. Typically unmonitored compounds of note in bird eggs included tris(4-chlorophenyl)methane (TCPM), tris(4-chlorophenyl)methanol (TCPMOH), triclosan, permethrin, heptachloro-1'-methyl-1,2'-bipyrrole (MBP), as well as four halogenated unknown compounds that could not be identified through database searching or the literature. The presence of these compounds in Black skimmer eggs suggests they are persistent, bioaccumulative, potentially biomagnifying, and maternally transferring. Our results highlight the utility and importance of employing nontargeted analytical tools to assess true contaminant burdens in organisms, as well as to demonstrate the value in using environmental sentinels to proactively identify novel contaminants.

  10. Identifying and Validating a Model of Interpersonal Performance Dimensions

    National Research Council Canada - National Science Library

    Carpenter, Tara

    2004-01-01

    .... Two studies were then completed to validate the proposed taxonomy. In the first study empirical evidence for the taxonomy was gathered using a content analysis of critical incidents taken from a job analysis...

  11. Validity of a family-centered approach for assessing infants' social-emotional wellbeing and their developmental context: a prospective cohort study.

    Science.gov (United States)

    Hielkema, Margriet; De Winter, Andrea F; Reijneveld, Sijmen A

    2017-06-15

    Family-centered care seems promising in preventive pediatrics, but evidence is lacking as to whether this type of care is also valid as a means to identify risks to infants' social-emotional development. We aimed to examine the validity of such a family-centered approach. We conducted a prospective cohort study. During routine well-child visits (2-15 months), Preventive Child Healthcare (PCH) professionals used a family-centered approach, assessing domains such as parents' competence, role of the partner, social support, barriers within the care-giving context, and child's wellbeing for 2976 children as protective, indistinct or at risk. If, based on the overall assessment (the families were labeled as "cases", N = 87), an intervention was considered necessary, parents filled in validated questionnaires covering the aforementioned domains. These questionnaires served as gold standards. For each case, two controls, matched by child-age and gender, also filled in questionnaires (N = 172). We compared PCH professionals' assessments with the parent-reported gold standards. Moreover, we evaluated which domain contributed most to the overall assessment. Spearman's rank correlation coefficients between PCH professionals' assessments and gold standards were overall reasonable (Spearman's rho 0.17-0.39) except for the domain barriers within the care-giving context. Scores on gold standards were significantly higher when PCH assessments were rated as "at risk" (overall and per domain). We found reasonable to excellent agreement regarding the absence of risk factors (negative agreement rate: 0.40-0.98), but lower agreement regarding the presence of risk factors (positive agreement rate: 0.00-0.67). An "at risk" assessment for the domain Barriers or life events within the care-giving context contributed most to being overall at risk, i.e. a case, odds ratio 100.1, 95%-confidence interval: 22.6 - infinity. Findings partially support the convergent validity of a family
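
    A minimal sketch of the two kinds of comparison reported above, assuming hypothetical toy data: Spearman's rank correlation between a professional's domain rating and the matching parent-reported questionnaire score, and positive/negative (specific) agreement for the dichotomised "at risk" assessment. The exact agreement definitions used in the study may differ from the ones below.

      import numpy as np
      from scipy.stats import spearmanr

      # Hypothetical toy data: PCH rating per family (1 = protective, 2 = indistinct,
      # 3 = at risk) and the corresponding parent-reported questionnaire score.
      pch_rating = np.array([3, 2, 1, 3, 2, 1, 2, 3])
      questionnaire = np.array([30, 22, 10, 28, 18, 12, 25, 31])

      rho, p_value = spearmanr(pch_rating, questionnaire)

      def specific_agreement(assessment_at_risk, gold_at_risk):
          a = np.asarray(assessment_at_risk, dtype=bool)
          g = np.asarray(gold_at_risk, dtype=bool)
          positive = 2 * np.sum(a & g) / (a.sum() + g.sum())
          negative = 2 * np.sum(~a & ~g) / ((~a).sum() + (~g).sum())
          return positive, negative

      pos_agree, neg_agree = specific_agreement(pch_rating == 3, questionnaire >= 25)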

  12. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Science.gov (United States)

    Trevino, Victor; Cassese, Alberto; Nagy, Zsuzsanna; Zhuang, Xiaodong; Herbert, John; Antczak, Philipp; Clarke, Kim; Davies, Nicholas; Rahman, Ayesha; Campbell, Moray J; Guindani, Michele; Bicknell, Roy; Vannucci, Marina; Falciani, Francesco

    2016-04-01

    The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell communication networks

  13. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Directory of Open Access Journals (Sweden)

    Victor Trevino

    2016-04-01

    Full Text Available The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell

  14. Validation of Sustainable Development Practices Scale Using the Bayesian Approach to Item Response Theory

    Directory of Open Access Journals (Sweden)

    Martin Hernani Merino

    2014-12-01

    Full Text Available There has been growing recognition of the importance of creating performance measurement tools for the economic, social and environmental management of micro and small enterprises (MSEs). In this context, this study aims to validate an instrument to assess perceptions of sustainable development practices by MSEs by means of a Graded Response Model (GRM) with a Bayesian approach to Item Response Theory (IRT). The results, based on a sample of 506 university students in Peru, suggest that a valid measurement instrument was achieved. At the end of the paper, methodological and managerial contributions are presented.

  15. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    Science.gov (United States)

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

    Density functional theory (DFT) calculations are being routinely used to identify new material candidates that approach activity near fundamental limits imposed by thermodynamics or scaling relations. DFT calculations are associated with inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. Development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish activity of materials and can be used to identify the optimal descriptor(s) ΔG opt . We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.
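
    A minimal Monte Carlo sketch of the idea described above, assuming a hypothetical volcano-shaped activity model and a Gaussian DFT error on the descriptor; the paper's actual activity models and its prediction-efficiency metric are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(0)

      def activity(dG):
          # Hypothetical volcano relation: activity peaks at dG = 0 eV.
          return -np.abs(dG)            # arbitrary (e.g. limiting-potential) units

      def expected_activity(dG_dft, sigma_dft=0.2, n=100_000):
          # Propagate a Gaussian DFT uncertainty on the descriptor through the model.
          samples = rng.normal(dG_dft, sigma_dft, n)
          return activity(samples).mean()

      cand_a, cand_b = 0.05, 0.15       # hypothetical DFT descriptor values (eV)
      exp_a, exp_b = expected_activity(cand_a), expected_activity(cand_b)

      # Probability that candidate A is really more active than B under the same
      # uncertainty, i.e. a measure of how distinguishable the two materials are.
      p_a_better = np.mean(activity(rng.normal(cand_a, 0.2, 100_000)) >
                           activity(rng.normal(cand_b, 0.2, 100_000)))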

  16. Social validation of vocabulary selection: ensuring stakeholder relevance.

    Science.gov (United States)

    Bornman, Juan; Bryen, Diane Nelson

    2013-06-01

    The vocabulary needs of individuals who are unable to spell their messages continue to be of concern in the field of augmentative and alternative communication (AAC). Social validation of vocabulary selection has been suggested as one way to improve the effectiveness and relevance of service delivery in AAC. Despite increased emphasis on stakeholder accountability, social validation is not frequently used in AAC research. This paper describes an investigation of the social validity of a vocabulary set identified in earlier research. A previous study used stakeholder focus groups to identify vocabulary that could be used by South African adults who use AAC to disclose their experiences as victims of crime or abuse. Another study used this vocabulary to create communication boards for use by adults with complex communication needs. In this current project, 12 South African adults with complex communication needs who use AAC systems used a 5-point Likert scale to score the importance of each of the previously identified 57 vocabulary items. This two-step process of first using stakeholder focus groups to identify vocabulary, and then having literate persons who use AAC provide information on social validity of the vocabulary on behalf of their peers who are illiterate, appears to hold promise as a culturally relevant vocabulary selection approach for sensitive topics such as crime and abuse.

  17. Gamma ray self-attenuation correction: a simple numerical approach and its validation

    International Nuclear Information System (INIS)

    Agarwal, Chhavi; Poi, Sanhita; Mhatre, Amol; Goswami, A.

    2009-03-01

    A hybrid Monte Carlo method for gamma ray attenuation correction has been developed. The method has been applied to some common counting geometries like cylinder, box, sphere and disc. The method has been validated theoretically and experimentally over a wide range of transmittance and sample-to-detector distances. The advantage of the approach is that it is common to all sample geometries and can be used at all sample-to-detector distances. (author)
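
    An illustrative sketch of the underlying idea, under the simplifying assumptions of a cylindrical sample viewed face-on by a distant detector (parallel rays along the axis); the hybrid method of the report handles several geometries and finite sample-to-detector distances, which this toy version does not.

      import numpy as np

      rng = np.random.default_rng(1)

      def self_attenuation_cylinder(mu, height, n=200_000):
          # mu: linear attenuation coefficient (1/cm); height: sample height (cm).
          z = rng.uniform(0.0, height, n)      # emission depth, uniform in the sample
          path = height - z                    # path towards a far, on-axis detector
          return np.exp(-mu * path).mean()     # average photon survival probability

      mu, h = 0.15, 4.0
      mc = self_attenuation_cylinder(mu, h)
      analytic = (1 - np.exp(-mu * h)) / (mu * h)   # closed form for this geometry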

  18. Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction

    Directory of Open Access Journals (Sweden)

    Peter E Larsen

    2016-01-01

    Full Text Available In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established, however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict plant root–mycorrhizal fungi sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungi) interaction and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120-thousand experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensor systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor binding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and jasmonic acid. This multi-omic computational modeling approach for predicting the complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.

  19. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). The traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need to develop an approach which can reflect the relationship between the design parameters and the product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
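
    A minimal sketch of a one-at-a-time sensitivity screening in the spirit of the approach above; the impact model, parameters and weights are hypothetical and are not the paper's PCB inventory.

      def impact(params):
          # Hypothetical linearised impact model, e.g. kg CO2-eq per product unit.
          return (0.8 * params["panel_area"] +
                  0.3 * params["copper_mass"] +
                  0.1 * params["process_energy"])

      baseline = {"panel_area": 1.0, "copper_mass": 1.0, "process_energy": 1.0}

      def oat_sensitivity(model, baseline, delta=0.10):
          # Perturb each design parameter by +10% and report the normalised
          # (elasticity-like) change in the impact score, largest first.
          base = model(baseline)
          sens = {}
          for name, value in baseline.items():
              perturbed = dict(baseline, **{name: value * (1 + delta)})
              sens[name] = (model(perturbed) - base) / (base * delta)
          return sorted(sens.items(), key=lambda kv: -abs(kv[1]))

      print(oat_sensitivity(impact, baseline))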

  20. Validation of multisource electronic health record data: an application to blood transfusion data.

    Science.gov (United States)

    Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B

    2017-07-14

    Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. Therefore we developed a practical approach for validating routinely collected data from multiple sources and to apply it to a blood transfusion data warehouse to evaluate the usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) and internal consistency concepts which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to a transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and in some cases adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products initially was 2.2% but could be improved by adjusting data extraction criteria to 0.17%. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, this contributes to increased transparency and greater reliability of research based on routinely collected electronic health records.
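
    One of the internal-consistency checks mentioned above, the proportion of transfused products that cannot be linked to an issued product, could be computed along these lines (a sketch with hypothetical column names, not the data warehouse's actual schema):

      import pandas as pd

      def unlinked_proportion(transfused: pd.DataFrame, issued: pd.DataFrame) -> float:
          # Left-join transfused products to issued products on a shared identifier
          # and report the fraction that found no match.
          merged = transfused.merge(issued[["product_id"]].drop_duplicates(),
                                    on="product_id", how="left", indicator=True)
          return (merged["_merge"] == "left_only").mean()

      # A value of 0.022 (2.2%) before and 0.0017 (0.17%) after refining the data
      # extraction criteria would reproduce the improvement reported above.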

  1. Toward validation of a structural approach to conceptualizing psychopathology: A special section of the Journal of Abnormal Psychology.

    Science.gov (United States)

    Krueger, Robert F; Tackett, Jennifer L; MacDonald, Angus

    2016-11-01

    Traditionally, psychopathology has been conceptualized in terms of polythetic categories derived from committee deliberations and enshrined in authoritative psychiatric nosologies-most notably the Diagnostic and Statistical Manual of Mental Disorders (DSM; American Psychiatric Association [APA], 2013). As the limitations of this form of classification have become evident, empirical data have been increasingly relied upon to investigate the structure of psychopathology. These efforts have borne fruit in terms of an increasingly consistent set of psychopathological constructs closely connected with similar personality constructs. However, the work of validating these constructs using convergent sources of data is an ongoing enterprise. This special section collects several new efforts to use structural approaches to study the validity of this empirically based organizational scheme for psychopathology. Inasmuch as a structural approach reflects the natural organization of psychopathology, it has great potential to facilitate comprehensive organization of information on the correlates of psychopathology, providing evidence for the convergent and discriminant validity of an empirical approach to classification. Here, we highlight several themes that emerge from this burgeoning literature. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. The construct validity of the Spanish version of the ABQ using a multi-trait/multi-method approach

    Directory of Open Access Journals (Sweden)

    Thomas D. Raedeke

    2013-10-01

    Full Text Available This study was designed to evaluate construct validity evidence associated with the Spanish version of the Athlete Burnout Questionnaire (ABQ) using a multi-trait/multi-method (MTMM) approach. The ABQ was administered to a sample of 302 Spanish athletes, along with two other questionnaires: the Maslach Burnout Inventory-General Survey (MBI-GS) and the Depression, Anxiety, Stress Scale (DASS-21), which respectively measure burnout in organizational settings and indicators of ill-being including depression, anxiety and stress. A structural equation modeling approach to a MTMM analysis was used. Comparative analysis of four models revealed that the Spanish version of the ABQ has convergent and internal discriminant validity, evidenced by high correlations between matching burnout subscales across the two measures and lower correlations between non-matching dimensions. In addition, the burnout measures exhibited external discriminant validity, as the correlations between burnout dimensions were higher than those seen between conceptually related, but unique, constructs.

  3. Integrative microRNA and proteomic approaches identify novel osteoarthritis genes and their collaborative metabolic and inflammatory networks.

    Directory of Open Access Journals (Sweden)

    Dimitrios Iliopoulos

    Full Text Available BACKGROUND: Osteoarthritis is a multifactorial disease characterized by destruction of the articular cartilage due to genetic, mechanical and environmental components, affecting more than 100 million individuals all over the world. Despite the high prevalence of the disease, the absence of large-scale molecular studies limits our ability to understand the molecular pathobiology of osteoarthritis and identify targets for drug development. METHODOLOGY/PRINCIPAL FINDINGS: In this study we integrated genetic, bioinformatic and proteomic approaches in order to identify new genes and their collaborative networks involved in osteoarthritis pathogenesis. MicroRNA profiling of patient-derived osteoarthritic cartilage in comparison to normal cartilage revealed a 16-microRNA osteoarthritis gene signature. Using reverse-phase protein arrays in the same tissues, we detected 76 differentially expressed proteins between osteoarthritic and normal chondrocytes. Proteins such as SOX11, FGF23, KLF6, WWOX and GDF15, not implicated previously in the genesis of osteoarthritis, were identified. Integration of microRNA and proteomic data with microRNA gene-target prediction algorithms generated a potential "interactome" network consisting of 11 microRNAs and 58 proteins linked by 414 potential functional associations. Comparison of the molecular and clinical data revealed specific microRNAs (miR-22, miR-103) and proteins (PPARA, BMP7, IL1B) to be highly correlated with Body Mass Index (BMI). Experimental validation revealed that miR-22 regulated PPARA and BMP7 expression and its inhibition blocked inflammatory and catabolic changes in osteoarthritic chondrocytes. CONCLUSIONS/SIGNIFICANCE: Our findings indicate that obesity and inflammation are related to osteoarthritis, a metabolic disease affected by microRNA deregulation. Gene network approaches provide new insights for elucidating the complexity of diseases such as osteoarthritis. The integration of microRNA, proteomic

  4. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation

  5. Development and validation of a screening procedure to identify speech-language delay in toddlers with cleft palate

    DEFF Research Database (Denmark)

    Jørgensen, Line Dahl; Willadsen, Elisabeth

    2017-01-01

    The purpose of this study was to develop and validate a clinically useful speech-language screening procedure for young children with cleft palate +/- cleft lip (CP) to identify those in need of speech-language intervention. Twenty-two children with CP were assigned to a +/- need for intervention condition based on assessment of consonant inventory using a real-time listening procedure in combination with parent-reported expressive vocabulary. These measures allowed evaluation of early speech-language skills found to correlate significantly with later speech-language difficulties in longitudinal studies of children with CP. The external validity of this screening procedure was evaluated by comparing the +/- need for intervention assignment determined by the screening procedure to experienced speech-language pathologists' (SLPs') clinical judgment of whether or not a child needed early

  6. Identifying subgroups of patients using latent class analysis

    DEFF Research Database (Denmark)

    Nielsen, Anne Mølgaard; Kent, Peter; Hestbæk, Lise

    2017-01-01

    BACKGROUND: Heterogeneity in patients with low back pain (LBP) is well recognised and different approaches to subgrouping have been proposed. Latent Class Analysis (LCA) is a statistical technique that is increasingly being used to identify subgroups based on patient characteristics. However......, as LBP is a complex multi-domain condition, the optimal approach when using LCA is unknown. Therefore, this paper describes the exploration of two approaches to LCA that may help improve the identification of clinically relevant and interpretable LBP subgroups. METHODS: From 928 LBP patients consulting...... of statistical performance measures, qualitative evaluation of clinical interpretability (face validity) and a subgroup membership comparison. RESULTS: For the single-stage LCA, a model solution with seven patient subgroups was preferred, and for the two-stage LCA, a nine patient subgroup model. Both approaches...

  7. An automated approach for finding variable-constant pairing bugs

    DEFF Research Database (Denmark)

    Lawall, Julia; Lo, David

    2010-01-01

    program-analysis and data-mining based approach to identify the uses of named constants and to identify anomalies in these uses.  We have applied our approach to a recent version of the Linux kernel and have found a number of bugs affecting both correctness and software maintenance.  Many of these bugs...... have been validated by the Linux developers....

  8. Design of character-based DNA barcode motif for species identification: A computational approach and its validation in fishes.

    Science.gov (United States)

    Chakraborty, Mohua; Dhar, Bishal; Ghosh, Sankar Kumar

    2017-11-01

    The DNA barcodes are generally interpreted using distance-based and character-based methods. The former uses clustering of comparable groups, based on the relative genetic distance, while the latter is based on the presence or absence of discrete nucleotide substitutions. The distance-based approach has a limitation in defining a universal species boundary across the taxa as the rate of mtDNA evolution is not constant throughout the taxa. However, character-based approach more accurately defines this using a unique set of nucleotide characters. The character-based analysis of full-length barcode has some inherent limitations, like sequencing of the full-length barcode, use of a sparse-data matrix and lack of a uniform diagnostic position for each group. A short continuous stretch of a fragment can be used to resolve the limitations. Here, we observe that a 154-bp fragment, from the transversion-rich domain of 1367 COI barcode sequences can successfully delimit species in the three most diverse orders of freshwater fishes. This fragment is used to design species-specific barcode motifs for 109 species by the character-based method, which successfully identifies the correct species using a pattern-matching program. The motifs also correctly identify geographically isolated population of the Cypriniformes species. Further, this region is validated as a species-specific mini-barcode for freshwater fishes by successful PCR amplification and sequencing of the motif (154 bp) using the designed primers. We anticipate that use of such motifs will enhance the diagnostic power of DNA barcode, and the mini-barcode approach will greatly benefit the field-based system of rapid species identification. © 2017 John Wiley & Sons Ltd.
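
    A minimal sketch of character-based matching on a 154-bp mini-barcode fragment, with hypothetical diagnostic positions, species labels and query; the published motifs and the authors' pattern-matching program are not reproduced here.

      def matches(motif, fragment):
          # motif: {position (0-based within the 154-bp fragment): expected base}
          return all(pos < len(fragment) and fragment[pos] == base
                     for pos, base in motif.items())

      # Hypothetical species-specific motifs (positions and bases are invented).
      species_motifs = {
          "species A": {12: "A", 47: "T", 103: "C", 140: "G"},
          "species B": {12: "G", 47: "T", 103: "T", 140: "A"},
      }

      def identify(fragment_154bp):
          hits = [sp for sp, motif in species_motifs.items()
                  if matches(motif, fragment_154bp)]
          return hits[0] if len(hits) == 1 else None   # require an unambiguous match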

  9. Online-Based Approaches to Identify Real Journals and Publishers from Hijacked Ones.

    Science.gov (United States)

    Asadi, Amin; Rahbar, Nader; Asadi, Meisam; Asadi, Fahime; Khalili Paji, Kokab

    2017-02-01

    The aim of the present paper was to introduce some online-based approaches to evaluate scientific journals and publishers and to differentiate them from hijacked ones, regardless of their disciplines. With the advent of open-access journals, many hijacked journals and publishers have deceitfully assumed the mantle of authenticity in order to take advantage of researchers and students. Although these hijacked journals and publishers can be identified by checking their advertisement techniques and their websites, such checks do not always result in their identification. There are online-based approaches, such as using the Master Journal List provided by Thomson Reuters, the Scopus database, or the DOI of a paper, that can certify the authenticity of a journal or publisher. It is indispensable that inexperienced students and researchers know these methods so as to identify hijacked journals and publishers with a higher level of probability.
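
    As one concrete instance of the DOI check mentioned above (a sketch, not an official verification service): a registered DOI resolves at the doi.org proxy with an HTTP redirect, whereas a fabricated suffix does not. Network access is assumed.

      import requests

      def doi_resolves(doi: str) -> bool:
          # The doi.org resolver redirects registered DOIs to the publisher's page.
          resp = requests.head(f"https://doi.org/{doi}",
                               allow_redirects=False, timeout=10)
          return resp.status_code in (301, 302, 303, 307, 308)

      # Example: doi_resolves("10.1000/182") (the DOI Handbook's DOI) is expected to
      # return True, while an invented suffix is expected to return False.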

  10. Relative criterion for validity of a semiclassical approach to the dynamics near quantum critical points.

    Science.gov (United States)

    Wang, Qian; Qin, Pinquan; Wang, Wen-ge

    2015-10-01

    Based on an analysis of Feynman's path integral formulation of the propagator, a relative criterion is proposed for validity of a semiclassical approach to the dynamics near critical points in a class of systems undergoing quantum phase transitions. It is given by an effective Planck constant, in the relative sense that a smaller effective Planck constant implies better performance of the semiclassical approach. Numerical tests of this relative criterion are given in the XY model and in the Dicke model.

  11. Alternative socio-centric approach for model validation - a way forward for socio-hydrology

    Science.gov (United States)

    van Emmerik, Tim; Elshafei, Yasmina; Mahendran, Roobavannan; Kandasamy, Jaya; Pande, Saket; Sivapalan, Murugesu

    2017-04-01

    To better understand and mitigate the impacts of humans on the water cycle, the importance of studying the co-evolution of coupled human-water systems has been recognized. Because of its unique system dynamics, the Murrumbidgee river basin (part of the larger Murray-Darling Basin, Australia) is one of the main study areas in the emerging field of socio-hydrology. In recent years, various historical and modeling studies have contributed to gaining a better understanding of this system's behavior. Kandasamy et al. (2014) performed a historical study on the development of this human-water coupled system. They identified four eras, providing a historical context of the observed "pendulum" swing from an initial exclusive focus on agricultural development, followed by increasing environmental awareness, to subsequent efforts to mitigate, and finally to restore, environmental health. A modeling effort by Van Emmerik et al. (2014) focused on reconstructing hydrological, economic, and societal dynamics and their feedbacks. A measure of changing societal values was included by introducing environmental awareness as an endogenously modeled variable, which resulted in capturing the co-evolution between economic development and environmental health. Later work by Elshafei et al. (2015) modeled and analyzed the two-way feedbacks of land use management and land degradation in two other Australian coupled systems. A composite variable, community sensitivity, was used to measure changing community sentiment, such that the model was capable of isolating the two-way feedbacks in the coupled system. As socio-hydrology adopts a holistic approach, it is often required to introduce (hydrologically) unconventional variables, such as environmental awareness or community sensitivity. It is the subject of ongoing debate how such variables can be validated, as there is no standardized data set available from hydrological or statistical agencies. Recent research (Wei et al. 2017) has provided

  12. Site characterization and validation - Final report

    International Nuclear Information System (INIS)

    Olsson, O.

    1992-04-01

    The central aims of the Site Characterization and Validation (SCV) project were to develop and apply (1) an advanced site characterization methodology and (2) a methodology to validate the models used to describe groundwater flow and transport in fractured rock. The basic experiment within the SCV project was to predict the distribution of water flow and tracer transport through a volume of rock, before and after excavation of a sub-horizontal drift, and to compare these predictions with actual field measurements. A structured approach was developed to combine site characterization data into a geological and hydrogeological conceptual model of a site. The conceptual model was based on a binary description where the rock mass was divided into 'fracture zones' and 'averagely fractured rock'. This designation into categories was based on a Fracture Zone Index (FZI) derived from principal component analysis of single borehole data. The FZI was used to identify the location of fracture zones in the boreholes, and the extent of the zones between the boreholes was obtained from remote sensing data (radar and seismics). The consistency of the geometric model thus defined, and its significance to the flow system, was verified by cross-hole hydraulic testing. The conceptual model of the SCV site contained three major and four minor fracture zones which were the principal hydraulic conduits at the site. The location and extent of the fracture zones were included explicitly in the flow and transport models. Four different numerical modelling approaches were pursued within the project: one porous medium approach, two discrete fracture approaches, and an equivalent discontinuum approach. A series of tracer tests was also included in the prediction-validation exercise. (120 refs.) (au)
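
    A minimal sketch of how a Fracture Zone Index of this kind could be computed, assuming standardised single-borehole logs as input; the report's actual variable set, weighting and thresholding rules are not reproduced here.

      import numpy as np

      def fracture_zone_index(logs):
          # logs: (n_depth_intervals, n_variables) array of single-borehole
          # measurements, e.g. fracture frequency, hydraulic conductivity and
          # radar amplitude per depth interval (hypothetical variable choice).
          X = (logs - logs.mean(axis=0)) / logs.std(axis=0)
          _, _, vt = np.linalg.svd(X, full_matrices=False)   # principal components
          pc1 = vt[0]
          if pc1.sum() < 0:        # fix the arbitrary sign of the first component
              pc1 = -pc1
          return X @ pc1           # one FZI value per depth interval

      # Intervals whose FZI exceeds a chosen percentile would be flagged as
      # candidate fracture zones and traced between boreholes with radar/seismics.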

  13. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  14. Identifying technology innovations for marginalized smallholders-A conceptual approach.

    Science.gov (United States)

    Malek, Mohammad Abdul; Gatzweiler, Franz W; Von Braun, Joachim

    2017-05-01

    This paper adds to the existing literature by providing a theoretical and conceptual background for the identification of idle potentials of marginal rural areas and people by means of technological and institutional innovations. The approach follows an ex-ante assessment for identifying suitable technology and institutional innovations for marginalized smallholders in marginal areas, and is divided into three main parts (mapping, surveying and evaluating) and several steps. Finally, it contributes to the inclusion of marginalized smallholders by an improved way of understanding the interactions between technology needs, farming systems, ecological resources and poverty characteristics in the different segments of the poor, and by linking these insights with productivity-enhancing technologies.

  15. Initial Validation for the Estimation of Resting-State fMRI Effective Connectivity by a Generalization of the Correlation Approach

    Directory of Open Access Journals (Sweden)

    Nan Xu

    2017-05-01

    Full Text Available Resting-state functional MRI (rs-fMRI) is widely used to noninvasively study human brain networks. Network functional connectivity is often estimated by calculating the timeseries correlation between blood-oxygen-level dependent (BOLD) signal from different regions of interest (ROIs). However, standard correlation cannot characterize the direction of information flow between regions. In this paper, we introduce and test a new concept, prediction correlation, to estimate effective connectivity in functional brain networks from rs-fMRI. In this approach, the correlation between two BOLD signals is replaced by a correlation between one BOLD signal and a prediction of this signal via a causal system driven by another BOLD signal. Three validations are described: (1) Prediction correlation performed well on simulated data where the ground truth was known, and outperformed four other methods. (2) On simulated data designed to display the "common driver" problem, prediction correlation did not introduce false connections between non-interacting driven ROIs. (3) On experimental data, prediction correlation recovered the previously identified network organization of the human brain. Prediction correlation scales well to work with hundreds of ROIs, enabling it to assess whole brain interregional connectivity at the single subject level. These results provide an initial validation that prediction correlation can capture the direction of information flow and estimate the duration of extended temporal delays in information flow between ROIs based on BOLD signal. This approach not only maintains the high sensitivity to network connectivity provided by the correlation analysis, but also performs well in the estimation of causal information flow in the brain.
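
    A minimal sketch of the prediction-correlation idea, assuming a simple causal finite-impulse-response predictor fitted by least squares; the paper's actual causal-system estimation, model order and delay handling are not reproduced here.

      import numpy as np

      def prediction_correlation(x, y, order=5):
          # Predict ROI signal y from past samples of ROI signal x, then correlate
          # the prediction with the observed y.
          x, y = np.asarray(x, float), np.asarray(y, float)
          n = len(x)
          # Causal design matrix: columns are x delayed by 1..order samples.
          X = np.column_stack([np.concatenate([np.zeros(k), x[:n - k]])
                               for k in range(1, order + 1)])
          coef, *_ = np.linalg.lstsq(X, y, rcond=None)
          y_hat = X @ coef
          return np.corrcoef(y_hat, y)[0, 1]

      # prediction_correlation(bold_a, bold_b) and prediction_correlation(bold_b, bold_a)
      # can then be compared to suggest a dominant direction of information flow.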

  16. Validation of a search strategy to identify nutrition trials in PubMed using the relative recall method.

    Science.gov (United States)

    Durão, Solange; Kredo, Tamara; Volmink, Jimmy

    2015-06-01

    To develop, assess, and maximize the sensitivity of a search strategy to identify diet and nutrition trials in PubMed using relative recall. We developed a search strategy to identify diet and nutrition trials in PubMed. We then constructed a gold standard reference set to validate the identified trials using the relative recall method. Relative recall was calculated by dividing the number of references from the gold standard our search strategy identified by the total number of references in the gold standard. Our gold standard comprised 298 trials, derived from 16 included systematic reviews. The initial search strategy identified 242 of 298 references, with a relative recall of 81.2% [95% confidence interval (CI): 76.3%, 85.5%]. We analyzed titles and abstracts of the 56 missed references for possible additional terms. We then modified the search strategy accordingly. The relative recall of the final search strategy was 88.6% (95% CI: 84.4%, 91.9%). We developed a search strategy to identify diet and nutrition trials in PubMed with a high relative recall (sensitivity). This could be useful for establishing a nutrition trials register to support the conduct of future research, including systematic reviews. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
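
    As a small worked illustration of the relative recall arithmetic reported above (the counts come from the abstract; the normal-approximation confidence interval is an assumption, since the abstract does not state which interval method was used):

      import math

      def relative_recall(found, gold_total, z=1.96):
          """Relative recall = gold-standard references retrieved / gold-standard total,
          with a simple normal-approximation (Wald) confidence interval."""
          p = found / gold_total
          se = math.sqrt(p * (1 - p) / gold_total)
          return p, (p - z * se, p + z * se)

      print(relative_recall(242, 298))  # initial strategy, ~0.812
      # The final strategy reported 88.6%; 264 retrieved references is inferred from
      # that percentage (264/298), not stated explicitly in the abstract.
      print(relative_recall(264, 298))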

  17. Medical chart validation of an algorithm for identifying multiple sclerosis relapse in healthcare claims.

    Science.gov (United States)

    Chastek, Benjamin J; Oleen-Burkey, Merrikay; Lopez-Bresnahan, Maria V

    2010-01-01

    Relapse is a common measure of disease activity in relapsing-remitting multiple sclerosis (MS). The objective of this study was to test the content validity of an operational algorithm for detecting relapse in claims data. A claims-based relapse detection algorithm was tested by comparing its detection rate over a 1-year period with relapses identified based on medical chart review. According to the algorithm, MS patients in a US healthcare claims database who had either (1) a primary claim for MS during hospitalization or (2) a corticosteroid claim following an MS-related outpatient visit were designated as having a relapse. Patient charts were examined for explicit indication of relapse or care suggestive of relapse. Positive and negative predictive values were calculated. Medical charts were reviewed for 300 MS patients, half of whom had a relapse according to the algorithm. The claims-based criteria correctly classified 67.3% of patients with relapses (positive predictive value) and 70.0% of patients without relapses (negative predictive value; kappa 0.373), supporting the content validity of the operational algorithm. Limitations of the algorithm include lack of differentiation between relapsing-remitting MS and other types, and that it does not incorporate measures of function and disability. The claims-based algorithm appeared to successfully detect moderate-to-severe MS relapse. This validated definition can be applied to future claims-based MS studies.
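
    A minimal sketch of the claims-based detection rule described above (Python; the claim fields, helper names, and the 30-day look-back window are hypothetical, since the abstract does not specify how "following an MS-related outpatient visit" was operationalized):

      from dataclasses import dataclass
      from datetime import date

      @dataclass
      class Claim:
          patient_id: str
          service_date: date
          setting: str           # "inpatient" or "outpatient" (hypothetical field)
          primary_dx_ms: bool    # primary diagnosis code for MS
          corticosteroid: bool   # corticosteroid claim

      def has_relapse(claims, window_days=30):
          """Flag a relapse if (1) an inpatient claim lists MS as the primary diagnosis,
          or (2) a corticosteroid claim follows an MS-related outpatient visit within
          `window_days` (the window length is an assumption)."""
          claims = sorted(claims, key=lambda c: c.service_date)
          if any(c.setting == "inpatient" and c.primary_dx_ms for c in claims):
              return True
          ms_visits = [c.service_date for c in claims
                       if c.setting == "outpatient" and c.primary_dx_ms]
          return any(c.corticosteroid and
                     any(0 <= (c.service_date - v).days <= window_days for v in ms_visits)
                     for c in claims)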

  18. Systems Biology Genetic Approach Identifies Serotonin Pathway as a Possible Target for Obstructive Sleep Apnea: Results from a Literature Search Review

    Directory of Open Access Journals (Sweden)

    Ram Jagannathan

    2017-01-01

    Rationale. The overall validity of existing genetic biomarkers in the diagnosis of obstructive sleep apnea (OSA) remains unclear. The objective of this systematic genetic study is to identify “novel” biomarkers for OSA using a systems biology approach. Methods. Candidate genes for OSA were extracted from the PubMed, MEDLINE, and Embase search engines and the DisGeNET database. Gene ontology (GO) analyses and candidate gene prioritization were performed using the Enrichr tool. Genes pertaining to the top 10 pathways were extracted and used for Ingenuity Pathway Analysis. Results. In total, we identified 153 genes. The top 10 pathways associated with OSA include (i) serotonin receptor interaction, (ii) pathways in cancer, (iii) AGE-RAGE signaling in diabetes, (iv) infectious diseases, (v) serotonergic synapse, (vi) inflammatory bowel disease, (vii) HIF-1 signaling pathway, (viii) PI3-AKT signaling pathway, (ix) regulation of lipolysis in adipocytes, and (x) rheumatoid arthritis. After removing the overlapping genes, we identified 23 candidate genes, of which >30% were related to genes involved in the serotonin pathway. Among these, four serotonin-pathway genes (SLC6A4, HTR2C, HTR2A, and HTR1B) were strongly associated with OSA. Conclusions. This preliminary report identifies several potential candidate genes associated with OSA and also describes the possible regulatory mechanisms.

  19. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their PD (probability of default) models are sound.

  20. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their PD (probability of default) models are sound.

  1. External validation of fatty liver index for identifying ultrasonographic fatty liver in a large-scale cross-sectional study in Taiwan.

    Directory of Open Access Journals (Sweden)

    Bi-Ling Yang

    The fatty liver index (FLI) is an algorithm involving the waist circumference, body mass index, and serum levels of triglyceride and gamma-glutamyl transferase to identify fatty liver. Although some studies have attempted to validate the FLI, few studies have been conducted for external validation among Asians. We attempted to validate the FLI to predict ultrasonographic fatty liver in Taiwanese subjects. We enrolled consecutive subjects who received health check-up services at the Taipei Veterans General Hospital from 2002 to 2009. Ultrasonography was applied to diagnose fatty liver. The ability of the FLI to detect ultrasonographic fatty liver was assessed by analyzing the area under the receiver operating characteristic (AUROC) curve. Among the 29,797 subjects enrolled in this study, fatty liver was diagnosed in 44.5% of the population. Subjects with ultrasonographic fatty liver had a significantly higher FLI than those without fatty liver by multivariate analysis (odds ratio 1.045; 95% confidence interval, CI 1.044-1.047, p < 0.001). Moreover, FLI had the best discriminative ability to identify patients with ultrasonographic fatty liver (AUROC: 0.827, 95% confidence interval, 0.822-0.831). An FLI < 25 (negative likelihood ratio (LR-) 0.32) for males and < 10 (LR- 0.26) for females rules out ultrasonographic fatty liver, whereas an FLI ≥ 35 (positive likelihood ratio (LR+) 3.12) for males and ≥ 20 (LR+ 4.43) for females rules in ultrasonographic fatty liver. FLI could accurately identify ultrasonographic fatty liver in a large-scale population in Taiwan, but with a lower cut-off value than in the Western population. The cut-off value was also lower in females than in males.
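
    The abstract does not restate the FLI formula itself. The sketch below uses the commonly cited formulation of Bedogni et al. (2006) together with the sex-specific cut-offs reported above; the coefficients should be checked against the original FLI paper before any reuse.

      import math

      def fatty_liver_index(triglycerides_mg_dl, bmi, ggt_u_l, waist_cm):
          """Fatty liver index (0-100) as usually formulated (Bedogni et al., 2006)."""
          y = (0.953 * math.log(triglycerides_mg_dl) + 0.139 * bmi
               + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
          return 100 * math.exp(y) / (1 + math.exp(y))

      def classify(fli, male):
          """Apply the Taiwanese cut-offs reported in the abstract."""
          rule_out, rule_in = (25, 35) if male else (10, 20)
          if fli < rule_out:
              return "fatty liver ruled out"
          if fli >= rule_in:
              return "fatty liver ruled in"
          return "indeterminate"

      fli = fatty_liver_index(triglycerides_mg_dl=150, bmi=27, ggt_u_l=40, waist_cm=92)
      print(round(fli, 1), classify(fli, male=True))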

  2. Reimagining psychoses: an agnostic approach to diagnosis.

    Science.gov (United States)

    Keshavan, Matcheri S; Clementz, Brett A; Pearlson, Godfrey D; Sweeney, John A; Tamminga, Carol A

    2013-05-01

    Current approaches to defining and classifying psychotic disorders are compromised by substantive heterogeneity within, blurred boundaries between, as well as overlaps across the various disorders in outcome, treatment response, emerging evidence regarding pathophysiology and presumed etiology. We herein review the evolution, current status and the constraints posed by classic symptom-based diagnostic approaches. We compare the continuing constructs that underlie the current classification of psychoses, and contrast those to evolving new thinking in other areas of medicine. An important limitation in current psychiatric nosology may stem from the fact that symptom-based diagnoses do not "carve nature at its joints"; while symptom-based classifications have improved our reliability, they may lack validity. Next steps in developing a more valid scientific nosology for psychoses include a) agnostic deconstruction of disease dimensions, identifying disease markers and endophenotypes; b) mapping such markers across translational domains from behaviors to molecules, c) reclustering cross-cutting bio-behavioral data using modern phenotypic and biometric approaches, and finally d) validating such entities using etio-pathology, outcome and treatment-response measures. The proposed steps of deconstruction and "bottom-up" disease definition, as elsewhere in medicine, may well provide a better foundation for developing a nosology for psychotic disorders that may have better utility in predicting outcome, treatment response and etiology, and identifying novel treatment approaches. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Identifying Mother-Child Interaction Styles Using a Person-Centered Approach.

    Science.gov (United States)

    Nelson, Jackie A; O'Brien, Marion; Grimm, Kevin J; Leerkes, Esther M

    2014-05-01

    Parent-child conflict in the context of a supportive relationship has been discussed as a potentially constructive interaction pattern; the current study is the first to test this using a holistic analytic approach. Interaction styles, defined as mother-child conflict in the context of maternal sensitivity, were identified and described with demographic and stress-related characteristics of families. Longitudinal associations were tested between interaction styles and children's later social competence. Participants included 814 partnered mothers with a first-grade child. Latent profile analysis identified agreeable, dynamic, and disconnected interaction styles. Mothers' intimacy with a partner, depressive symptoms, and authoritarian childrearing beliefs, along with children's later conflict with a best friend and externalizing problems, were associated with group membership. Notably, the dynamic style, characterized by high sensitivity and high conflict, included families who experienced psychological and relational stressors. Findings are discussed with regard to how family stressors shape parent-child interaction patterns.

  4. Identifying Core Mobile Learning Faculty Competencies Based Integrated Approach: A Delphi Study

    Science.gov (United States)

    Elbarbary, Rafik Said

    2015-01-01

    This study is based on the integrated approach as a concept framework to identify, categorize, and rank a key component of mobile learning core competencies for Egyptian faculty members in higher education. The field investigation framework used four rounds Delphi technique to determine the importance rate of each component of core competencies…

  5. Understanding latent structures of clinical information logistics: A bottom-up approach for model building and validating the workflow composite score.

    Science.gov (United States)

    Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes

    2017-01-01

    Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision to drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature-based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme, which was based on factor loadings obtained in the analyses, was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values that had been previously published based on a top-down approach. Our findings give insight into the internal composition of clinical information logistics both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is yet another hint of its validity and therefore its usefulness. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
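
    As a rough sketch of how a data-driven weighting scheme can be read off factor loadings (the loadings and item values below are invented; the published score uses its own items and factor structure), loadings can be normalized into weights and the items aggregated into a composite:

      import numpy as np

      # Hypothetical factor loadings for a handful of IT-support items (invented values)
      loadings = np.array([0.81, 0.74, 0.69, 0.62, 0.55])
      weights = loadings / loadings.sum()          # normalize loadings into weights

      # Degree of IT support for each process in one hospital (0-1 scale, invented)
      items = np.array([0.9, 0.4, 0.7, 1.0, 0.3])

      workflow_composite = float(weights @ items)  # weighted aggregation
      print(round(workflow_composite, 3))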

  6. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them simultaneously due to the limited resources. A feasible way is to identify the central and influential indicators to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, through integrating the evidential reasoning approach and interval 2-tuple linguistic variables, various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors", and "number of operations/procedures" are significant influential indicators. Also, the indicators of "length of stay", "bed occupancy" and "financial measures" play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could be considered as a reference for healthcare administrators to enhance the performance of their healthcare institutions.
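
    The paper's hybrid method couples evidential reasoning and interval 2-tuple linguistic variables with DEMATEL; those extensions are not reproduced here, but a plain DEMATEL pass over a crisp direct-influence matrix (invented ratings) looks roughly like this:

      import numpy as np

      # Direct-influence matrix among four indicators (invented 0-4 expert ratings)
      A = np.array([[0, 3, 2, 1],
                    [1, 0, 3, 2],
                    [2, 1, 0, 3],
                    [1, 2, 1, 0]], dtype=float)

      # Normalize so that the largest row or column sum becomes 1
      s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
      D = A / s

      # Total-relation matrix T = D (I - D)^-1
      T = D @ np.linalg.inv(np.eye(len(A)) - D)

      R = T.sum(axis=1)    # influence exerted by each indicator
      C = T.sum(axis=0)    # influence received by each indicator
      prominence = R + C   # how central an indicator is
      relation = R - C     # net cause (> 0) or net effect (< 0)
      print(prominence.round(2), relation.round(2))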

  7. Emotion regulation in patients with rheumatic diseases: validity and responsiveness of the Emotional Approach Coping Scale (EAC

    Directory of Open Access Journals (Sweden)

    Mowinckel Petter

    2009-09-01

    Background Chronic rheumatic diseases are painful conditions which are not entirely controllable and can place high emotional demands on individuals. Increasing evidence has shown that emotion regulation, in terms of actively processing and expressing disease-related emotions, is likely to promote positive adjustment in patients with chronic diseases. The Emotional Approach Coping Scale (EAC) measures active attempts to acknowledge, understand, and express emotions. Although tested in other clinical samples, the EAC has not been validated for patients with rheumatic diseases. This study evaluated the data quality, internal consistency reliability, validity and responsiveness of the Norwegian version of the EAC for this group of patients. Methods 220 patients with different rheumatic diseases were included in a cross-sectional study in which data quality and internal consistency were assessed. Construct validity was assessed through comparisons with the Brief Approach/Avoidance Coping Questionnaire (BACQ) and the General Health Questionnaire (GHQ-20). Responsiveness was tested in a longitudinal pretest-posttest study of two different coping interventions, the Vitality Training Program (VTP) and a Self-Management Program (SMP). Results The EAC had low levels of missing data. Results from principal component analysis supported two subscales, Emotional Expression and Emotional Processing, which had high Cronbach's alphas of 0.90 and 0.92, respectively. The EAC had correlations with approach-oriented items in the BACQ in the range 0.17-0.50. The EAC Expression scale had a significant negative correlation with the GHQ-20 of -0.13. As hypothesized, participation in the VTP significantly improved EAC scores, indicating responsiveness to change. Conclusion The EAC is an acceptable and valid instrument for measuring emotional processing and expression in patients with rheumatic diseases, and its scales were responsive to change in an intervention.

  8. Validating a continental-scale groundwater diffuse pollution model using regional datasets.

    Science.gov (United States)

    Ouedraogo, Issoufou; Defourny, Pierre; Vanclooster, Marnik

    2017-12-11

    In this study, we assess the validity of an African-scale groundwater pollution model for nitrates. In a previous study, we identified a statistical continental-scale groundwater pollution model for nitrate. The model was identified using a pan-African meta-analysis of available nitrate groundwater pollution studies. The model was implemented in both Random Forest (RF) and multiple regression formats. For both approaches, we collected as predictors a comprehensive GIS database of 13 spatial attributes related to land use, soil type, hydrogeology, topography, climatology, region typology, nitrogen fertiliser application rate, and population density. In this paper, we validate the continental-scale model of groundwater contamination by using a nitrate measurement dataset from three African countries. We discuss data availability, data quality, and scale issues as challenges in validation. Notwithstanding that the modelling procedure exhibited very good success using a continental-scale dataset (e.g. R² = 0.97 in the RF format using a cross-validation approach), the continental-scale model could not be used without recalibration to predict nitrate pollution at the country scale using regional data. In addition, when recalibrating the model using country-scale datasets, the order of the model's explanatory factors changes. This suggests that the structure and the parameters of a statistical spatially distributed groundwater degradation model for the African continent are strongly scale dependent.

  9. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.
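
    A minimal sketch of the Pareto-frontier selection described above (Python/NumPy; it assumes each candidate input set has already been scored with one goodness-of-fit error per calibration target, lower being better):

      import numpy as np

      def pareto_frontier(errors):
          """Return indices of input sets that no other set dominates.

          errors : (n_input_sets, n_targets) array of calibration errors,
                   lower is better on every target.
          """
          n = errors.shape[0]
          on_frontier = np.ones(n, dtype=bool)
          for i in range(n):
              # j dominates i if j is at least as good everywhere and better somewhere
              dominated = np.any(np.all(errors <= errors[i], axis=1)
                                 & np.any(errors < errors[i], axis=1))
              if dominated:
                  on_frontier[i] = False
          return np.where(on_frontier)[0]

      rng = np.random.default_rng(1)
      err = rng.random((200, 3))           # 200 candidate input sets, 3 targets
      print(pareto_frontier(err))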

  10. Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach

    Science.gov (United States)

    Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.

    2017-01-01

    The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high-quality riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. Model accuracies of 75–80% were achieved in the project area in the following year, after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat.

  11. Learning from biomedical linked data to suggest valid pharmacogenes.

    Science.gov (United States)

    Dalleau, Kevin; Marzougui, Yassine; Da Silva, Sébastien; Ringot, Patrice; Ndiaye, Ndeye Coumba; Coulet, Adrien

    2017-04-20

    A standard task in pharmacogenomics research is identifying genes that may be involved in drug response variability, i.e., pharmacogenes. Because genomic experiments have tended to generate many false positives, computational approaches based on the use of background knowledge have been proposed. Until now, only molecular networks or the biomedical literature were used, whereas many other resources are available. We propose here to consume a diverse and larger set of resources using linked data related either to genes, drugs or diseases. One of the advantages of linked data is that they are built on a standard framework that facilitates the joint use of various sources, and thus facilitates considering features of various origins. We propose a selection and linkage of data sources relevant to pharmacogenomics, including for example DisGeNET and Clinvar. We use machine learning to identify and prioritize pharmacogenes that are most probably valid, considering the selected linked data. This identification relies on the classification of gene-drug pairs as either pharmacogenomically associated or not, and was experimented with two machine learning methods, random forest and graph kernel, whose results are compared in this article. We assembled a set of linked data relative to pharmacogenomics, of 2,610,793 triples, coming from six distinct resources. Learning from these data, random forest identifies valid pharmacogenes with an F-measure of 0.73 in 10-fold cross-validation, whereas the graph kernel achieves an F-measure of 0.81. A list of top candidates proposed by both approaches is provided, and how they were obtained is discussed.
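
    The construction of features from linked data is the paper's own contribution and is not reproduced here; as a hedged sketch under assumptions, the classification of gene-drug pairs and the 10-fold cross-validated F-measure could be computed with scikit-learn on synthetic stand-in features like this:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(42)
      # Synthetic stand-in: one feature row per gene-drug pair derived from linked data;
      # label 1 = pharmacogenomically associated, 0 = not associated
      X = rng.standard_normal((1000, 30))
      y = (X[:, :5].sum(axis=1) + 0.5 * rng.standard_normal(1000) > 0).astype(int)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      f1_scores = cross_val_score(clf, X, y, cv=10, scoring="f1")
      print(round(f1_scores.mean(), 3))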

  12. A tripartite approach identifies the major sunflower seed albumins.

    Science.gov (United States)

    Jayasena, Achala S; Franke, Bastian; Rosengren, Johan; Mylne, Joshua S

    2016-03-01

    We have used a combination of genomic, transcriptomic, and proteomic approaches to identify the napin-type albumin genes in sunflower and define their contributions to the seed albumin pool. Seed protein content is determined by the expression of what are typically large gene families. A major class of seed storage proteins is the napin-type, water soluble albumins. In this work we provide a comprehensive analysis of the napin-type albumin content of the common sunflower (Helianthus annuus) by analyzing a draft genome, a transcriptome and performing a proteomic analysis of the seed albumin fraction. We show that although sunflower contains at least 26 genes for napin-type albumins, only 15 of these are present at the mRNA level. We found protein evidence for 11 of these but the albumin content of mature seeds is dominated by the encoded products of just three genes. So despite high genetic redundancy for albumins, only a small sub-set of this gene family contributes to total seed albumin content. The three genes identified as producing the majority of sunflower seed albumin are potential future candidates for manipulation through genetics and breeding.

  13. A multi-indicator approach for identifying shoreline sewage pollution hotspots adjacent to coral reefs.

    Science.gov (United States)

    Abaya, Leilani M; Wiegner, Tracy N; Colbert, Steven L; Beets, James P; Carlson, Kaile'a M; Kramer, K Lindsey; Most, Rebecca; Couch, Courtney S

    2018-04-01

    Sewage pollution is contributing to the global decline of coral reefs. Identifying locations where it is entering waters near reefs is therefore a management priority. Our study documented shoreline sewage pollution hotspots in a coastal community with a fringing coral reef (Puakō, Hawai'i) using dye tracer studies, sewage indicator measurements, and a pollution scoring tool. Sewage reached shoreline waters within 9 h to 3 d. Fecal indicator bacteria concentrations were high and variable, and δ15N macroalgal values were indicative of sewage at many stations. Shoreline nutrient concentrations were two times higher than those in upland groundwater. Pollution hotspots were identified with a scoring tool using three sewage indicators. It confirmed known locations of sewage pollution from dye tracer studies. Our study highlights the need for a multi-indicator approach and scoring tool to identify sewage pollution hotspots. This approach will be useful for other coastal communities grappling with sewage pollution. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Screening and syndromic approaches to identify gonorrhea and chlamydial infection among women.

    Science.gov (United States)

    Sloan, N L; Winikoff, B; Haberland, N; Coggins, C; Elias, C

    2000-03-01

    The standard diagnostic tools to identify sexually transmitted infections are often expensive and have laboratory and infrastructure requirements that make them unavailable to family planning and primary health-care clinics in developing countries. Therefore, inexpensive, accessible tools that rely on symptoms, signs, and/or risk factors have been developed to identify and treat reproductive tract infections without the need for laboratory diagnostics. Studies were reviewed that used standard diagnostic tests to identify gonorrhea and cervical chlamydial infection among women and that provided adequate information about the usefulness of the tools for screening. Aggregation of the studies' results suggest that risk factors, algorithms, and risk scoring for syndromic management are poor indicators of gonorrhea and chlamydial infection in samples of both low and high prevalence and, consequently, are not effective mechanisms with which to identify or manage these conditions. The development and evaluation of other approaches to identify gonorrhea and chlamydial infections, including inexpensive and simple laboratory screening tools, periodic universal treatment, and other alternatives must be given priority.

  15. When Assessment Data Are Words: Validity Evidence for Qualitative Educational Assessments.

    Science.gov (United States)

    Cook, David A; Kuper, Ayelet; Hatala, Rose; Ginsburg, Shiphra

    2016-10-01

    Quantitative scores fail to capture all important features of learner performance. This awareness has led to increased use of qualitative data when assessing health professionals. Yet the use of qualitative assessments is hampered by incomplete understanding of their role in forming judgments, and lack of consensus in how to appraise the rigor of judgments therein derived. The authors articulate the role of qualitative assessment as part of a comprehensive program of assessment, and translate the concept of validity to apply to judgments arising from qualitative assessments. They first identify standards for rigor in qualitative research, and then use two contemporary assessment validity frameworks to reorganize these standards for application to qualitative assessment. Standards for rigor in qualitative research include responsiveness, reflexivity, purposive sampling, thick description, triangulation, transparency, and transferability. These standards can be reframed using Messick's five sources of validity evidence (content, response process, internal structure, relationships with other variables, and consequences) and Kane's four inferences in validation (scoring, generalization, extrapolation, and implications). Evidence can be collected and evaluated for each evidence source or inference. The authors illustrate this approach using published research on learning portfolios. The authors advocate a "methods-neutral" approach to assessment, in which a clearly stated purpose determines the nature of and approach to data collection and analysis. Increased use of qualitative assessments will necessitate more rigorous judgments of the defensibility (validity) of inferences and decisions. Evidence should be strategically sought to inform a coherent validity argument.

  16. Validity of the top-down approach of inverse dynamics analysis in fast and large rotational trunk movements.

    Science.gov (United States)

    Iino, Yoichi; Kojima, Takeji

    2012-08-01

    This study investigated the validity of the top-down approach of inverse dynamics analysis in fast and large rotational movements of the trunk about three orthogonal axes of the pelvis for nine male collegiate students. The maximum angles of the upper trunk relative to the pelvis were approximately 47°, 49°, 32°, and 55° for lateral bending, flexion, extension, and axial rotation, respectively, with maximum angular velocities of 209°/s, 201°/s, 145°/s, and 288°/s, respectively. The pelvic moments about the axes during the movements were determined using the top-down and bottom-up approaches of inverse dynamics and compared between the two approaches. Three body segment inertial parameter sets were estimated using anthropometric data sets (Ae et al., Biomechanism 11, 1992; De Leva, J Biomech, 1996; Dumas et al., J Biomech, 2007). The root-mean-square errors of the moments and the absolute errors of the peaks of the moments were generally smaller than 10 N·m. The results suggest that the pelvic moment in motions involving fast and large trunk movements can be determined with a certain level of validity using the top-down approach in which the trunk is modeled as two or three rigid-link segments.

  17. Validity and reliability of three definitions of hip osteoarthritis: cross sectional and longitudinal approach.

    Science.gov (United States)

    Reijman, M; Hazes, J M W; Pols, H A P; Bernsen, R M D; Koes, B W; Bierma-Zeinstra, S M A

    2004-11-01

    To compare the reliability and validity in a large open population of three frequently used radiological definitions of hip osteoarthritis (OA): Kellgren and Lawrence grade, minimal joint space (MJS), and Croft grade; and to investigate whether the validity of the three definitions of hip OA is sex dependent. Subjects from the Rotterdam study (aged ≥ 55 years, n = 3585) were evaluated. The inter-rater reliability was tested in a random set of 148 x-rays. The validity was expressed as the ability to identify patients who show clinical symptoms of hip OA (construct validity) and as the ability to predict total hip replacement (THR) at follow up (predictive validity). Inter-rater reliability was similar for the Kellgren and Lawrence grade and MJS (kappa statistics 0.68 and 0.62, respectively) but lower for Croft's grade (kappa statistic, 0.51). The Kellgren and Lawrence grade and MJS showed the strongest associations with clinical symptoms of hip OA. Sex appeared to be an effect modifier for the Kellgren and Lawrence and MJS definitions, with women showing a stronger association between grading and symptoms than men. However, the sex dependency was attributed to differences in height between women and men. The Kellgren and Lawrence grade showed the highest predictive value for THR at follow up. Based on these findings, Kellgren and Lawrence still appears to be a useful OA definition for epidemiological studies focusing on the presence of hip OA.

  18. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Background. Evidence rankings do not consider equally internal (IV), external (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on an equal footing, will better guide clinical decision making.

  19. I know what you want to know: the impact of interviewees' ability to identify criteria on interview performance and construct-related validity

    NARCIS (Netherlands)

    Melchers, K.G.; Klehe, U.-C.; Richter, G.M.; Kleinmann, M.; König, C.J.; Lievens, F.

    2009-01-01

    The current study tested whether candidates' ability to identify the targeted interview dimensions fosters their interview success as well as the interviews' convergent and discriminant validity. Ninety-two interviewees participated in a simulated structured interview developed to measure three dimensions.

  20. Validity of High School Physic Module With Character Values Using Process Skill Approach In STKIP PGRI West Sumatera

    Science.gov (United States)

    Anaperta, M.; Helendra, H.; Zulva, R.

    2018-04-01

    This study aims to describe the validity of a character-values-oriented high school physics module that uses a process skills approach for dynamic electricity material in SMA/MA and SMK physics. The study is development research. Module development follows the model proposed by Plomp, which consists of (1) a preliminary research phase, (2) a prototyping phase, and (3) an assessment phase; the work reported here covers the initial investigation and design stages. Validity data were collected through observation and questionnaires. In the preliminary investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design phase, the module was designed and realized for SMA/MA and SMK subjects on dynamic electricity material. Formative evaluation then followed, including self-evaluation and prototyping (expert reviews, one-to-one, and small-group evaluation), during which validity was assessed. The research data were obtained through module validation sheets, and the resulting module was judged valid.

  1. Predicting and analyzing DNA-binding domains using a systematic approach to identifying a set of informative physicochemical and biochemical properties

    Science.gov (United States)

    2011-01-01

    Background Existing methods of predicting DNA-binding proteins used valuable features of physicochemical properties to design support vector machine (SVM) based classifiers. Generally, selection of physicochemical properties and determination of their corresponding feature vectors rely mainly on known properties of the binding mechanism and the experience of designers. However, a troublesome problem for designers is that some different physicochemical properties have similar vectors representing the 20 amino acids, while some closely related physicochemical properties have dissimilar vectors. Results This study proposes a systematic approach (named Auto-IDPCPs) to automatically identify a set of physicochemical and biochemical properties in the AAindex database to design SVM-based classifiers for predicting and analyzing DNA-binding domains/proteins. Auto-IDPCPs consists of 1) clustering 531 amino acid indices in AAindex into 20 clusters using a fuzzy c-means algorithm, 2) utilizing an efficient genetic-algorithm-based optimization method, IBCGA, to select an informative feature set of size m to represent sequences, and 3) analyzing the selected features to identify related physicochemical properties which may affect the binding mechanism of DNA-binding domains/proteins. The proposed Auto-IDPCPs identified m=22 features of properties belonging to five clusters for predicting DNA-binding domains with a five-fold cross-validation accuracy of 87.12%, which is promising compared with the accuracy of 86.62% of the existing method PSSM-400. For predicting DNA-binding sequences, an accuracy of 75.50% was obtained using m=28 features, where PSSM-400 has an accuracy of 74.22%. Auto-IDPCPs and PSSM-400 have accuracies of 80.73% and 82.81%, respectively, applied to an independent test data set of DNA-binding domains. Some typical physicochemical properties discovered are hydrophobicity, secondary structure, charge, solvent accessibility, polarity, flexibility, and normalized van der Waals volume.
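
    A compact illustration of the first step only (grouping amino-acid property vectors with fuzzy c-means) is sketched below in plain NumPy; the data are random stand-ins for the 531 20-dimensional AAindex vectors, and the IBCGA feature selection and SVM stages are not reproduced.

      import numpy as np

      def fuzzy_c_means(X, n_clusters=20, m=2.0, n_iter=100, seed=0):
          """Basic fuzzy c-means; returns (memberships U, cluster centres C)."""
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), n_clusters))
          U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per point
          for _ in range(n_iter):
              Um = U ** m
              C = (Um.T @ X) / Um.sum(axis=0)[:, None]   # weighted cluster centres
              d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
              U = 1.0 / d ** (2 / (m - 1))
              U /= U.sum(axis=1, keepdims=True)          # renormalize memberships
          return U, C

      # Stand-in for 531 AAindex entries, each a 20-dimensional amino-acid property vector
      X = np.random.default_rng(1).standard_normal((531, 20))
      U, C = fuzzy_c_means(X)
      print(U.argmax(axis=1)[:10])   # hard cluster assignment of the first 10 indices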

  2. External Validation of Fatty Liver Index for Identifying Ultrasonographic Fatty Liver in a Large-Scale Cross-Sectional Study in Taiwan

    Science.gov (United States)

    Fang, Kuan-Chieh; Wang, Yuan-Chen; Huo, Teh-Ia; Huang, Yi-Hsiang; Yang, Hwai-I; Su, Chien-Wei; Lin, Han-Chieh; Lee, Fa-Yauh; Wu, Jaw-Ching; Lee, Shou-Dong

    2015-01-01

    Background and Aims The fatty liver index (FLI) is an algorithm involving the waist circumference, body mass index, and serum levels of triglyceride and gamma-glutamyl transferase to identify fatty liver. Although some studies have attempted to validate the FLI, few studies have been conducted for external validation among Asians. We attempted to validate FLI to predict ultrasonographic fatty liver in Taiwanese subjects. Methods We enrolled consecutive subjects who received health check-up services at the Taipei Veterans General Hospital from 2002 to 2009. Ultrasonography was applied to diagnose fatty liver. The ability of the FLI to detect ultrasonographic fatty liver was assessed by analyzing the area under the receiver operating characteristic (AUROC) curve. Results Among the 29,797 subjects enrolled in this study, fatty liver was diagnosed in 44.5% of the population. Subjects with ultrasonographic fatty liver had a significantly higher FLI than those without fatty liver by multivariate analysis (odds ratio 1.045; 95% confidence interval, CI 1.044–1.047, p < 0.001). Moreover, FLI had the best discriminative ability to identify patients with ultrasonographic fatty liver (AUROC: 0.827, 95% confidence interval, 0.822–0.831). An FLI < 25 (negative likelihood ratio (LR-) 0.32) for males and < 10 (LR- 0.26) for females rules out ultrasonographic fatty liver. Moreover, an FLI ≥ 35 (positive likelihood ratio (LR+) 3.12) for males and ≥ 20 (LR+ 4.43) for females rules in ultrasonographic fatty liver. Conclusions FLI could accurately identify ultrasonographic fatty liver in a large-scale population in Taiwan but with a lower cut-off value than in the Western population. Meanwhile, the cut-off value was lower in females than in males. PMID:25781622

  3. Validation of ATR FT-IR to identify polymers of plastic marine debris, including those ingested by marine organisms

    Science.gov (United States)

    Jung, Melissa R.; Horgen, F. David; Orski, Sara V.; Rodriguez, Viviana; Beers, Kathryn L.; Balazs, George H.; Jones, T. Todd; Work, Thierry M.; Brignac, Kayla C.; Royer, Sarah-Jeanne; Hyrenbach, David K.; Jensen, Brenda A.; Lynch, Jennifer M.

    2018-01-01

    Polymer identification of plastic marine debris can help identify its sources, degradation, and fate. We optimized and validated a fast, simple, and accessible technique, attenuated total reflectance Fourier transform infrared spectroscopy (ATR FT-IR), to identify polymers contained in plastic ingested by sea turtles. Spectra of consumer good items with known resin identification codes #1–6 and several #7 plastics were compared to standard and raw manufactured polymers. High temperature size exclusion chromatography measurements confirmed ATR FT-IR could differentiate these polymers. High-density (HDPE) and low-density polyethylene (LDPE) discrimination is challenging but a clear step-by-step guide is provided that identified 78% of ingested PE samples. The optimal cleaning methods consisted of wiping ingested pieces with water or cutting. Of 828 ingested plastics pieces from 50 Pacific sea turtles, 96% were identified by ATR FT-IR as HDPE, LDPE, unknown PE, polypropylene (PP), PE and PP mixtures, polystyrene, polyvinyl chloride, and nylon.

  4. An information architecture for validating courseware

    OpenAIRE

    Melia, Mark; Pahl, Claus

    2007-01-01

    Courseware validation should locate Learning Objects inconsistent with the courseware instructional design being used. In order for validation to take place it is necessary to identify the implicit and explicit information needed for validation. In this paper, we identify this information and formally define an information architecture to model courseware validation information explicitly. This promotes tool support for courseware validation and its interoperability with the courseware specification.

  5. An innovative and integrated approach based on DNA walking to identify unauthorised GMOs.

    Science.gov (United States)

    Fraiture, Marie-Alice; Herman, Philippe; Taverniers, Isabel; De Loose, Marc; Deforce, Dieter; Roosens, Nancy H

    2014-03-15

    In the coming years, the frequency of unauthorised genetically modified organisms (GMOs) being present in the European food and feed chain will increase significantly. Therefore, we have developed a strategy to identify unauthorised GMOs containing a pCAMBIA family vector, frequently present in transgenic plants. This integrated approach is performed in two successive steps on Bt rice grains. First, the potential presence of unauthorised GMOs is assessed by the qPCR SYBR®Green technology targeting the terminator 35S pCAMBIA element. Second, its presence is confirmed via the characterisation of the junction between the transgenic cassette and the rice genome. To this end, a DNA walking strategy is applied using a first reverse primer followed by two semi-nested PCR rounds using primers that are each time nested to the previous reverse primer. This approach allows the transgene flanking region to be rapidly identified and can easily be implemented by enforcement laboratories. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. The Clinical Validation of the Athlete Sleep Screening Questionnaire: an Instrument to Identify Athletes that Need Further Sleep Assessment.

    Science.gov (United States)

    Bender, Amy M; Lawson, Doug; Werthner, Penny; Samuels, Charles H

    2018-06-04

    Previous research has established that general sleep screening questionnaires are not valid and reliable in an athlete population. The Athlete Sleep Screening Questionnaire (ASSQ) was developed to address this need. While the initial validation of the ASSQ has been established, the clinical validity of the ASSQ has yet to be determined. The main objective of the current study was to evaluate the clinical validity of the ASSQ. Canadian National Team athletes (N = 199; mean age 24.0 ± 4.2 years, 62% females; from 23 sports) completed the ASSQ. A subset of athletes (N = 46) were randomized to the clinical validation sub-study, which required subjects to complete an ASSQ at times 2 and 3 and to have a clinical sleep interview by a sleep medicine physician (SMP), who rated each subject's category of clinical sleep problem and provided recommendations to improve sleep. To assess clinical validity, the SMP category of clinical sleep problem was compared to the ASSQ. The internal consistency (Cronbach's alpha = 0.74) and test-retest reliability (r = 0.86) of the ASSQ were acceptable. The ASSQ demonstrated good agreement with the SMP (Cohen's kappa = 0.84), yielding a diagnostic sensitivity of 81%, specificity of 93%, positive predictive value of 87%, and negative predictive value of 90%. In total, 25.1% of athletes were identified as having clinically relevant sleep disturbances that required further clinical sleep assessment. Sleep improved from baseline (time 1) to after the recommendations were provided (time 3). Screening athletes with the ASSQ provides a method of accurately determining which athletes would benefit from preventative measures and which athletes suffer from clinically significant sleep problems. The process of sleep screening athletes and providing recommendations improves sleep and offers a clinical intervention output that is simple and efficient for teams and athletes to implement.
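
    As a worked check of how the agreement statistics above are derived (the 2x2 cell counts below are invented for illustration; the abstract reports only the resulting figures), sensitivity, specificity, predictive values, and Cohen's kappa follow directly from a confusion matrix:

      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity, PPV, NPV and Cohen's kappa from 2x2 counts
          (rows: questionnaire positive/negative; columns: clinician positive/negative)."""
          n = tp + fp + fn + tn
          sens = tp / (tp + fn)
          spec = tn / (tn + fp)
          ppv = tp / (tp + fp)
          npv = tn / (tn + fn)
          po = (tp + tn) / n                                            # observed agreement
          pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
          kappa = (po - pe) / (1 - pe)
          return sens, spec, ppv, npv, kappa

      # Invented counts roughly consistent with the reported sensitivity and specificity
      print(diagnostic_metrics(tp=13, fp=2, fn=3, tn=28))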

  7. Identifying seizure onset zone from electrocorticographic recordings: A machine learning approach based on phase locking value.

    Science.gov (United States)

    Elahian, Bahareh; Yeasin, Mohammed; Mudigoudar, Basanagoud; Wheless, James W; Babajani-Feremi, Abbas

    2017-10-01

    Using a novel technique based on phase locking value (PLV), we investigated the potential for features extracted from electrocorticographic (ECoG) recordings to serve as biomarkers to identify the seizure onset zone (SOZ). We computed the PLV between the phase of the amplitude of high gamma activity (80-150 Hz) and the phase of lower frequency rhythms (4-30 Hz) from ECoG recordings obtained from 10 patients with epilepsy (21 seizures). We extracted five features from the PLV and used a machine learning approach based on logistic regression to build a model that classifies electrodes as SOZ or non-SOZ. More than 96% of electrodes identified as the SOZ by our algorithm were within the resected area in six seizure-free patients. In four non-seizure-free patients, more than 31% of the SOZ electrodes identified by our algorithm were outside the resected area. In addition, we observed that the seizure outcome in non-seizure-free patients correlated with the number of non-resected SOZ electrodes identified by our algorithm. This machine learning approach, based on features extracted from the PLV, effectively identified electrodes within the SOZ. The approach has the potential to assist clinicians in surgical decision-making when pre-surgical intracranial recordings are utilized. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
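
    A rough sketch of the coupling measure described above (Python with NumPy/SciPy; the filter design and the exact form of the phase-amplitude coupling are assumptions, since the abstract only gives the two frequency bands):

      import numpy as np
      from scipy.signal import butter, sosfiltfilt, hilbert

      def bandpass(x, lo, hi, fs, order=4):
          sos = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band", output="sos")
          return sosfiltfilt(sos, x)

      def pac_plv(x, fs, low_band=(4, 30), gamma_band=(80, 150)):
          """Phase-locking value between the phase of the high-gamma amplitude envelope
          and the phase of the lower-frequency rhythm, for a single channel."""
          low = bandpass(x, *low_band, fs)
          gamma = bandpass(x, *gamma_band, fs)
          gamma_env = np.abs(hilbert(gamma))                     # high-gamma amplitude
          phase_env = np.angle(hilbert(gamma_env - gamma_env.mean()))
          phase_low = np.angle(hilbert(low))
          return np.abs(np.mean(np.exp(1j * (phase_env - phase_low))))

      fs = 1000.0
      t = np.arange(0, 10, 1 / fs)
      # Toy signal: a 10 Hz rhythm whose phase modulates the amplitude of a 100 Hz oscillation
      slow = np.sin(2 * np.pi * 10 * t)
      x = slow + (1 + slow) * 0.3 * np.sin(2 * np.pi * 100 * t)
      print(round(pac_plv(x, fs), 3))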

  8. Derivation and validation of the automated search algorithms to identify cognitive impairment and dementia in electronic health records.

    Science.gov (United States)

    Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen

    2017-02-01

    Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from the electronic medical records (EMRs) that provide opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by The Mayo Clinic Study of Aging. Second, they were hospitalized in intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved respective sensitivity and specificity of 97% and 99%. EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Methods for identifying surgical wound infection after discharge from hospital: a systematic review

    Directory of Open Access Journals (Sweden)

    Moore Peter J

    2006-11-01

    Background Wound infections are a common complication of surgery that add significantly to the morbidity of patients and the costs of treatment. The global trend towards reducing length of hospital stay post-surgery and the increase in day case surgery mean that surgical site infections (SSI) will increasingly occur after hospital discharge. Surveillance of SSIs is important because rates of SSI are viewed as a measure of hospital performance; however, accurate detection of SSIs post-hospital discharge is not straightforward. Methods We conducted a systematic review of methods of post-discharge surveillance for surgical wound infection and undertook a national audit of methods of post-discharge surveillance for surgical site infection currently used within United Kingdom NHS Trusts. Results Seven reports of six comparative studies which examined the validity of post-discharge surveillance methods were located; these involved different comparisons and some had methodological limitations, making it difficult to identify an optimal method. Several studies evaluated automated screening of electronic records and found this to be a useful strategy for the identification of SSIs that occurred post discharge. The audit identified a wide range of relevant post-discharge surveillance programmes in England, Scotland, Wales and Northern Ireland; however, these programmes used varying approaches for which there is little supporting evidence of validity and/or reliability. Conclusion In order to establish robust methods of surveillance for those surgical site infections that occur post discharge, there is a need to develop a method of case ascertainment that is valid and reliable post discharge. Existing research has not identified a valid and reliable method. A standardised definition of wound infection (e.g. that of the Centres for Disease Control) should be used as a basis for developing a feasible, valid and reliable approach to defining post-discharge surgical site infection.

  10. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Development and validation of health service management competencies.

    Science.gov (United States)

    Liang, Zhanming; Howard, Peter F; Leggat, Sandra; Bartram, Timothy

    2018-04-09

    Purpose The importance of managerial competencies in monitoring and improving the performance of organisational leaders and managers is well accepted. Different processes have been used to identify and develop competency frameworks or models for healthcare managers around the world to meet different contextual needs. The purpose of the paper is to introduce a validated process in management competency identification and development applied in Australia - a process leading to a management competency framework with associated behavioural items that can be used to measure core management competencies of health service managers. Design/methodology/approach The management competency framework development study incorporated both qualitative and quantitative methods, implemented in four stages, including job description analysis, focus group discussions and online surveys. Findings The study confirmed that the four-stage process could identify management competencies and the framework developed is considered reliable and valid for developing a management competency assessment tool that can measure management competence amongst managers in health organisations. In addition, supervisors of health service managers could use the framework to distinguish perceived superior and average performers among managers in health organisations. Practical implications Developing the core competencies of health service managers is important for management performance improvement and talent management. The six core management competencies identified can be used to guide the design professional development activities for health service managers. Originality/value The validated management competency identification and development process can be applied in other countries and different industrial contexts to identify core management competency requirements.

  12. Outbreaks source: A new mathematical approach to identify their possible location

    Science.gov (United States)

    Buscema, Massimo; Grossi, Enzo; Breda, Marco; Jefferson, Tom

    2009-11-01

    Classical epidemiology has generally relied on the description and explanation of the occurrence of infectious diseases in relation to time occurrence of events rather than to place of occurrence. In recent times, computer-generated dot maps have facilitated the modeling of the spread of infectious epidemic diseases either with classical statistics approaches or with artificial “intelligent systems”. Few attempts, however, have been made so far to identify the origin of the epidemic spread rather than its evolution by mathematical topology methods. We report on the use of a new artificial intelligence method (the H-PST Algorithm) and we compare this new technique with other well known algorithms to identify the source of three examples of infectious disease outbreaks derived from literature. The H-PST algorithm is a new system able to project a distance matrix of points (events) into a bi-dimensional space, with the generation of a new point, named hidden unit. This new hidden unit deforms the original Euclidean space and transforms it into a new space (cognitive space). The cost function of this transformation is the minimization of the differences between the original distance matrix among the assigned points and the distance matrix of the same points projected into the bi-dimensional map (or any different set of constraints). For reasons we will discuss, the position of the hidden unit has been shown to target the outbreak source in many epidemics much better than the other classic algorithms specifically targeted for this task. Compared with the main algorithms known in location theory, the hidden unit was within yards of the outbreak source in the first example (the 2007 epidemic of Chikungunya fever in Italy). The hidden unit was located in the river between the two village epicentres of the spread exactly where the index case was living. Equally in the second (the 1967 foot and mouth disease epidemic in England), and the third (1854 London Cholera epidemic
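
    The core of the method described above is a stress-minimising projection: place an extra free point (the hidden unit) so that distances measured through it reproduce the observed distance matrix as closely as possible. The published H-PST equations are not given in the abstract, so the snippet below is only a toy illustration of that idea, with made-up event coordinates; it is not the authors' algorithm.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

# Hypothetical case locations (e.g. x/y in km of reported cases).
events = np.array([[0.0, 0.0], [1.0, 0.2], [2.1, 0.1], [1.2, 1.5], [0.3, 1.1]])
D = squareform(pdist(events))                    # observed pairwise distances
iu = np.triu_indices(len(events), k=1)           # indices of the unique pairs

def cost(h):
    # Distance of each event to the candidate hidden unit h.
    d_h = np.linalg.norm(events - h, axis=1)
    # "Deformed" pairwise distance: the two-leg path through the hidden unit.
    deformed = d_h[:, None] + d_h[None, :]
    return np.sum((deformed[iu] - D[iu]) ** 2)   # mismatch with the original matrix

res = minimize(cost, x0=events.mean(axis=0), method="Nelder-Mead")
print("hidden unit (toy estimate of the outbreak source):", res.x)
```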

  13. Identification of loci governing eight agronomic traits using a GBS-GWAS approach and validation by QTL mapping in soya bean.

    Science.gov (United States)

    Sonah, Humira; O'Donoughue, Louise; Cober, Elroy; Rajcan, Istvan; Belzile, François

    2015-02-01

    Soya bean is a major source of edible oil and protein for human consumption as well as animal feed. Understanding the genetic basis of different traits in soya bean will provide important insights for improving breeding strategies for this crop. A genome-wide association study (GWAS) was conducted to accelerate molecular breeding for the improvement of agronomic traits in soya bean. A genotyping-by-sequencing (GBS) approach was used to provide dense genome-wide marker coverage (>47,000 SNPs) for a panel of 304 short-season soya bean lines. A subset of 139 lines, representative of the diversity among these, was characterized phenotypically for eight traits under six environments (3 sites × 2 years). Marker coverage proved sufficient to ensure highly significant associations between the genes known to control simple traits (flower, hilum and pubescence colour) and flanking SNPs. Between one and eight genomic loci associated with more complex traits (maturity, plant height, seed weight, seed oil and protein) were also identified. Importantly, most of these GWAS loci were located within genomic regions identified by previously reported quantitative trait loci (QTL) for these traits. In some cases, the reported QTLs were also successfully validated by additional QTL mapping in a biparental population. This study demonstrates that integrating GBS and GWAS can be used as a powerful complementary approach to classical biparental mapping for dissecting complex traits in soya bean. © 2014 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.

  14. Development and validation of a multi-locus DNA metabarcoding method to identify endangered species in complex samples.

    Science.gov (United States)

    Arulandhu, Alfred J; Staats, Martijn; Hagelaar, Rico; Voorhuijzen, Marleen M; Prins, Theo W; Scholtens, Ingrid; Costessi, Adalberto; Duijsings, Danny; Rechenmann, François; Gaspar, Frédéric B; Barreto Crespo, Maria Teresa; Holst-Jensen, Arne; Birck, Matthew; Burns, Malcolm; Haynes, Edward; Hochegger, Rupert; Klingl, Alexander; Lundberg, Lisa; Natale, Chiara; Niekamp, Hauke; Perri, Elena; Barbante, Alessandra; Rosec, Jean-Philippe; Seyfarth, Ralf; Sovová, Tereza; Van Moorleghem, Christoff; van Ruth, Saskia; Peelen, Tamara; Kok, Esther

    2017-10-01

    DNA metabarcoding provides great potential for species identification in complex samples such as food supplements and traditional medicines. Such a method would aid Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) enforcement officers to combat wildlife crime by preventing illegal trade of endangered plant and animal species. The objective of this research was to develop a multi-locus DNA metabarcoding method for forensic wildlife species identification and to evaluate the applicability and reproducibility of this approach across different laboratories. A DNA metabarcoding method was developed that makes use of 12 DNA barcode markers that have demonstrated universal applicability across a wide range of plant and animal taxa and that facilitate the identification of species in samples containing degraded DNA. The DNA metabarcoding method was developed based on Illumina MiSeq amplicon sequencing of well-defined experimental mixtures, for which a bioinformatics pipeline with user-friendly web-interface was developed. The performance of the DNA metabarcoding method was assessed in an international validation trial by 16 laboratories, in which the method was found to be highly reproducible and sensitive enough to identify species present in a mixture at 1% dry weight content. The advanced multi-locus DNA metabarcoding method assessed in this study provides reliable and detailed data on the composition of complex food products, including information on the presence of CITES-listed species. The method can provide improved resolution for species identification, while verifying species with multiple DNA barcodes contributes to an enhanced quality assurance. © The Authors 2017. Published by Oxford University Press.

  15. Utility of the Conners' Adult ADHD Rating Scale validity scales in identifying simulated attention-deficit hyperactivity disorder and random responding.

    Science.gov (United States)

    Walls, Brittany D; Wallace, Elizabeth R; Brothers, Stacey L; Berry, David T R

    2017-12-01

    Recent concern about malingered self-report of symptoms of attention-deficit hyperactivity disorder (ADHD) in college students has resulted in an urgent need for scales that can detect feigning of this disorder. The present study provided further validation data for a recently developed validity scale for the Conners' Adult ADHD Rating Scale (CAARS), the CAARS Infrequency Index (CII), as well as for the Inconsistency Index (INC). The sample included 139 undergraduate students: 21 individuals with diagnoses of ADHD, 29 individuals responding honestly, 54 individuals responding randomly (full or half), and 35 individuals instructed to feign. Overall, the INC showed moderate sensitivity to random responding (.44-.63) and fairly high specificity to ADHD (.86-.91). The CII demonstrated modest sensitivity to feigning (.31-.46) and excellent specificity to ADHD (.91-.95). Sequential application of validity scales had correct classification rates of honest (93.1%), ADHD (81.0%), feigning (57.1%), half random (42.3%), and full random (92.9%). The present study suggests that the CII is modestly sensitive (true positive rate) to feigned ADHD symptoms, and highly specific (true negative rate) to ADHD. Additionally, this study highlights the utility of applying the CAARS validity scales in a sequential manner for identifying feigning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. The development and validation of a two-tiered multiple-choice instrument to identify alternative conceptions in earth science

    Science.gov (United States)

    Mangione, Katherine Anna

    This study aimed to determine the reliability and validity of a two-tiered, multiple-choice instrument designed to identify alternative conceptions in earth science. Additionally, this study sought to identify alternative conceptions in earth science held by preservice teachers, to investigate relationships between self-reported confidence scores and understanding of earth science concepts, and to describe relationships between content knowledge and alternative conceptions and planning instruction in the science classroom. Eighty-seven preservice teachers enrolled in the MAT program participated in this study. Sixty-eight participants were female, twelve were male, and seven chose not to answer. Forty-seven participants were in the elementary certification program, five were in the middle school certification program, and twenty-nine were pursuing secondary certification. Results indicate that the two-tiered, multiple-choice format can be a reliable and valid method for identifying alternative conceptions. Preservice teachers in all certification areas who participated in this study may possess common alternative conceptions previously identified in the literature. Alternative conceptions included: all rivers flow north to south, the shadow of the Earth covers the Moon causing lunar phases, the Sun is always directly overhead at noon, weather can be predicted by animal coverings, and seasons are caused by the Earth's proximity to the Sun. Statistical analyses indicated differences, though not all of them significant, among all subgroups according to gender and certification area. Generally, males outperformed females, and preservice teachers pursuing middle school certification had higher scores on the questionnaire, followed by those obtaining secondary certification. Elementary preservice teachers scored the lowest. Additionally, self-reported scores of confidence in one's answers and understanding of the earth science concept in question were analyzed. There was a

  17. Integrated systems approach identifies risk regulatory pathways and key regulators in coronary artery disease.

    Science.gov (United States)

    Zhang, Yan; Liu, Dianming; Wang, Lihong; Wang, Shuyuan; Yu, Xuexin; Dai, Enyu; Liu, Xinyi; Luo, Shanshun; Jiang, Wei

    2015-12-01

    Coronary artery disease (CAD) is the most common type of heart disease. However, the molecular mechanisms of CAD remain elusive. Regulatory pathways are known to play crucial roles in many pathogenic processes. Thus, inferring risk regulatory pathways is an important step toward elucidating the mechanisms underlying CAD. With advances in high-throughput data, we developed an integrated systems approach to identify CAD risk regulatory pathways and key regulators. Firstly, a CAD-related core subnetwork was identified from a curated transcription factor (TF) and microRNA (miRNA) regulatory network based on a random walk algorithm. Secondly, candidate risk regulatory pathways were extracted from the subnetwork by applying a breadth-first search (BFS) algorithm. Then, risk regulatory pathways were prioritized based on multiple CAD-associated data sources. Finally, we also proposed a new measure to prioritize upstream regulators. We inferred that phosphatase and tensin homolog (PTEN) may be a key regulator in the dysregulation of risk regulatory pathways. This study goes a step beyond the identification of disease subnetworks or modules. From the risk regulatory pathways, we could understand the flow of regulatory information in the initiation and progression of the disease. Our approach helps to uncover the potential etiology of CAD. We developed an integrated systems approach to identify risk regulatory pathways. We proposed a new measure to prioritize the key regulators in CAD. PTEN may be a key regulator in dysregulation of the risk regulatory pathways.
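
    As a rough sketch of the two network steps named in the abstract (a random-walk ranking to pull out a disease-related core subnetwork, then a breadth-first search to enumerate candidate regulatory paths), the snippet below runs both on a tiny made-up TF/miRNA-gene graph; the edge list, seed node and restart probability are hypothetical, not the study's data.

```python
import numpy as np
from collections import deque

# Hypothetical directed regulator -> target edges (TFs/miRNAs and genes).
edges = [("TF1", "G1"), ("TF1", "miR1"), ("miR1", "G2"),
         ("G1", "G2"), ("TF2", "G2"), ("G2", "G3")]
nodes = sorted({n for e in edges for n in e})
idx = {n: i for i, n in enumerate(nodes)}

# Column-normalised adjacency matrix for the random walk.
A = np.zeros((len(nodes), len(nodes)))
for src, dst in edges:
    A[idx[dst], idx[src]] = 1.0
col_sums = A.sum(axis=0)
A = np.divide(A, col_sums, out=np.zeros_like(A), where=col_sums > 0)

def random_walk_with_restart(seeds, restart=0.3, tol=1e-8):
    """Stationary visiting probabilities of a walk restarting at seed nodes."""
    p0 = np.zeros(len(nodes))
    p0[[idx[s] for s in seeds]] = 1.0 / len(seeds)
    p = p0.copy()
    while True:
        p_new = (1 - restart) * A @ p + restart * p0
        if np.abs(p_new - p).sum() < tol:
            return dict(zip(nodes, p_new))
        p = p_new

def bfs_paths(source, max_len=4):
    """Enumerate simple regulatory paths starting at `source` (breadth-first)."""
    out = {}
    for s, d in edges:
        out.setdefault(s, []).append(d)
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        paths.append(path)
        if len(path) < max_len:
            for nxt in out.get(path[-1], []):
                if nxt not in path:
                    queue.append(path + [nxt])
    return paths

scores = random_walk_with_restart(seeds=["G2"])        # e.g. a disease-associated gene
core = [n for n, s in sorted(scores.items(), key=lambda kv: -kv[1])[:4]]
print("core subnetwork nodes:", core)
print("candidate regulatory paths from TF1:", bfs_paths("TF1"))
```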

  18. Identifying biomarkers for asthma diagnosis using targeted metabolomics approaches.

    Science.gov (United States)

    Checkley, William; Deza, Maria P; Klawitter, Jost; Romero, Karina M; Klawitter, Jelena; Pollard, Suzanne L; Wise, Robert A; Christians, Uwe; Hansel, Nadia N

    2016-12-01

    The diagnosis of asthma in children is challenging and relies on a combination of clinical factors and biomarkers including methacholine challenge, lung function, bronchodilator responsiveness, and presence of airway inflammation. No single test is diagnostic. We sought to identify a pattern of inflammatory biomarkers that was unique to asthma using a targeted metabolomics approach combined with data science methods. We conducted a nested case-control study of 100 children living in a peri-urban community in Lima, Peru. We defined cases as children with current asthma, and controls as children with no prior history of asthma and normal lung function. We further categorized enrollment following a factorial design to enroll equal numbers of children as either overweight or not. We obtained a fasting venous blood sample to characterize a comprehensive panel of targeted markers using a metabolomics approach based on high performance liquid chromatography-mass spectrometry. A statistical comparison of targeted metabolites between children with asthma (n = 50) and healthy controls (n = 49) revealed distinct patterns in relative concentrations of several metabolites: children with asthma had approximately 40-50% lower relative concentrations of ascorbic acid, 2-isopropylmalic acid, shikimate-3-phosphate, and 6-phospho-d-gluconate when compared to children without asthma, and 70% lower relative concentrations of reduced glutathione (all p  13 077 normalized counts/second and betaine ≤ 16 47 121 normalized counts/second). By using a metabolomics approach applied to serum, we were able to discriminate between children with and without asthma by revealing different metabolic patterns. These results suggest that serum metabolomics may represent a diagnostic tool for asthma and may be helpful for distinguishing asthma phenotypes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Innovative approach for in-vivo ablation validation on multimodal images

    Science.gov (United States)

    Shahin, O.; Karagkounis, G.; Carnegie, D.; Schlaefer, A.; Boctor, E.

    2014-03-01

    Radiofrequency ablation (RFA) is an important therapeutic procedure for small hepatic tumors. To make sure that the target tumor is effectively treated, RFA monitoring is essential. While several imaging modalities can observe the ablation procedure, it is not clear how ablated lesions on the images correspond to actual necroses. This uncertainty contributes to the high local recurrence rates (up to 55%) after radiofrequency ablative therapy. This study investigates a novel approach to correlate images of ablated lesions with actual necroses. We mapped both intraoperative images of the lesion and a slice through the actual necrosis in a common reference frame. An electromagnetic tracking system was used to accurately match lesion slices from different imaging modalities. To minimize the liver deformation effect, the tracking reference frame was defined inside the tissue by anchoring an electromagnetic sensor adjacent to the lesion. A validation test was performed using a phantom and proved that the end-to-end accuracy of the approach was within 2 mm. In an in-vivo experiment, intraoperative magnetic resonance imaging (MRI) and ultrasound (US) ablation images were correlated to gross pathology and histopathology. The results indicate that the proposed method can accurately correlate in-vivo ablations on different modalities. Ultimately, this will improve the interpretation of ablation monitoring and reduce the recurrence rates associated with RFA.

  20. An Approach to Identify and Characterize a Subunit Candidate Shigella Vaccine Antigen.

    Science.gov (United States)

    Pore, Debasis; Chakrabarti, Manoj K

    2016-01-01

    Shigellosis remains a serious issue throughout the developing countries, particularly in children under the age of 5. Numerous strategies have been tested to develop vaccines targeting shigellosis; unfortunately, despite several years of extensive research, no safe, effective, and inexpensive vaccine against shigellosis is available so far. Here, we illustrate in detail an approach to identify and establish immunogenic outer membrane proteins from Shigella flexneri 2a as subunit vaccine candidates.

  1. On the improvement of IT process maturity: assessment, recommendation and validation

    Directory of Open Access Journals (Sweden)

    Dirgahayu Teduh

    2018-01-01

    Full Text Available The use of information technology (IT) in enterprises must be governed and managed appropriately using IT processes. The notion of IT process maturity is useful to measure the actual performance and to define the desired performance of IT processes. Improvements are necessary when there are gaps between the actual and desired performance. Most of the literature focuses on IT process maturity assessment and does not address how to improve IT process maturity. This paper proposes an approach to enterprise IT process maturity improvement for COBIT processes. The approach consists of three activities, i.e. IT process maturity assessment, recommendation, and validation. Assessment recognises the maturity of the process's control objectives. From the assessment results, recommendation identifies control objectives that must be improved and then suggests improvement actions. The prescriptive nature of the control objectives facilitates suggesting those actions. Recommendations for management are defined by abstracting similar actions. Validation checks whether the recommendations match the enterprise's needs and capability. It includes a scale for validation, in which the enterprise's capability is categorized as (i) not capable, (ii) capable with great effort, and (iii) fully capable. The paper illustrates the approach with a case study.
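
    A minimal sketch of the assessment-recommendation-validation loop described above, assuming hypothetical COBIT-style control objectives with assessed and desired maturity levels (0-5) and the three-level capability scale mentioned in the abstract; the identifiers and recommendation wording are illustrative only.

```python
# Hypothetical maturity assessment for control objectives of one COBIT process.
# Levels follow the usual 0-5 maturity scale; desired levels come from management.
assessed = {"DS5.1": 2, "DS5.2": 3, "DS5.3": 1, "DS5.4": 4}
desired  = {"DS5.1": 4, "DS5.2": 3, "DS5.3": 3, "DS5.4": 4}

# Capability scale used for validation (as described in the abstract).
def capability(label):
    return {"fully capable": 2, "capable with great effort": 1, "not capable": 0}[label]

def recommend(assessed, desired):
    """List control objectives whose actual maturity is below the desired one."""
    actions = []
    for co, level in assessed.items():
        gap = desired[co] - level
        if gap > 0:
            actions.append((co, gap, f"raise {co} from level {level} to {desired[co]}"))
    # Larger gaps first, so improvement effort is focused where it matters most.
    return sorted(actions, key=lambda a: -a[1])

def validate(actions, enterprise_capability):
    """Keep only recommendations the enterprise is able to take on."""
    if capability(enterprise_capability) == 0:
        return []
    return actions

for co, gap, action in validate(recommend(assessed, desired), "capable with great effort"):
    print(f"{co}: gap={gap} -> {action}")
```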

  2. Soil moisture mapping using Sentinel 1 images: the proposed approach and its preliminary validation carried out in view of an operational product

    Science.gov (United States)

    Paloscia, S.; Pettinato, S.; Santi, E.; Pierdicca, N.; Pulvirenti, L.; Notarnicola, C.; Pace, G.; Reppucci, A.

    2011-11-01

    The main objective of this research is to develop, test and validate a soil moisture (SMC) algorithm for the GMES Sentinel-1 characteristics, within the framework of an ESA project. The SMC product, to be generated from Sentinel-1 data, requires an algorithm able to process operationally in near-real-time and deliver the product to the GMES services within 3 hours from observations. Two complementary approaches have been proposed: an Artificial Neural Network (ANN), which represented the best compromise between retrieval accuracy and processing time, thus allowing compliance with the timeliness requirements, and a Bayesian multi-temporal approach, which increases the retrieval accuracy, especially in cases where little ancillary data are available, at the cost of computational efficiency, by taking advantage of the frequent revisit time achieved by Sentinel-1. The algorithm was validated in several test areas in Italy, the US and Australia, and finally in Spain with a 'blind' validation. The multi-temporal Bayesian algorithm was validated in Central Italy. The validation results are in all cases very much in line with the requirements. However, the blind validation results were penalized by the availability of only VV polarization SAR images and MODIS low-resolution NDVI, although the RMS error was only slightly above 4%.
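
    The ANN branch described above is, at heart, a regression from SAR backscatter plus ancillary data to soil moisture, checked by an RMS error against the roughly 4% figure quoted. The sketch below trains such a network on purely synthetic data to show the shape of that retrieval; the feature set, network size and error model are assumptions, not the project's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic training set: VV backscatter (dB), local incidence angle (deg) and an
# NDVI-like vegetation index as inputs; soil moisture content (% vol.) as target.
n = 2000
smc = rng.uniform(5, 45, n)
ndvi = rng.uniform(0.1, 0.8, n)
angle = rng.uniform(25, 45, n)
sigma0_vv = -18 + 0.25 * smc + 6 * ndvi - 0.05 * angle + rng.normal(0, 0.7, n)

X = np.column_stack([sigma0_vv, angle, ndvi])
X_train, X_test, y_train, y_test = train_test_split(X, smc, test_size=0.3, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0))
ann.fit(X_train, y_train)

rms = np.sqrt(np.mean((ann.predict(X_test) - y_test) ** 2))
print(f"RMS error of the retrieved soil moisture: {rms:.2f} % vol.")
```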

  3. Design of Quiet Rotorcraft Approach Trajectories

    Science.gov (United States)

    Padula, Sharon L.; Burley, Casey L.; Boyd, D. Douglas, Jr.; Marcolini, Michael A.

    2009-01-01

    An optimization procedure for identifying quiet rotorcraft approach trajectories is proposed and demonstrated. The procedure employs a multi-objective genetic algorithm in order to reduce noise and create approach paths that will be acceptable to pilots and passengers. The concept is demonstrated by application to two different helicopters. The optimized paths are compared with one another and to a standard 6-deg approach path. The two demonstration cases validate the optimization procedure but highlight the need for improved noise prediction techniques and for additional rotorcraft acoustic data sets.
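
    Because the abstract only names the method (a multi-objective genetic algorithm balancing noise against acceptability), the sketch below shows the bare mechanics on a made-up problem: a descent profile is parameterised by a few segment glide-slope angles, 'noise' and 'discomfort' are toy surrogate objectives, and selection keeps the Pareto-nondominated profiles. None of the objective models are the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
SEGMENTS = 4                                   # glide-slope angle per segment (deg)

def noise(path):       # toy surrogate: shallow segments radiate more noise
    return np.sum(np.exp(-(path - 3.0)))

def discomfort(path):  # toy surrogate: steep or uneven segments are unpleasant
    return np.sum((path - 6.0) ** 2) + np.sum(np.abs(np.diff(path)))

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(pop, scores):
    keep = [i for i, si in enumerate(scores)
            if not any(dominates(sj, si) for j, sj in enumerate(scores) if j != i)]
    return [pop[i] for i in keep]

pop = [rng.uniform(3.0, 9.0, SEGMENTS) for _ in range(40)]   # initial descent profiles
for _ in range(60):                                          # generations
    scores = [(noise(p), discomfort(p)) for p in pop]
    parents = pareto_front(pop, scores)
    children = []
    while len(parents) + len(children) < 40:
        p = parents[rng.integers(len(parents))]
        children.append(np.clip(p + rng.normal(0, 0.3, SEGMENTS), 3.0, 9.0))  # mutation
    pop = parents + children

best = pareto_front(pop, [(noise(p), discomfort(p)) for p in pop])
for p in best[:3]:
    print(np.round(p, 2), "noise=%.2f discomfort=%.2f" % (noise(p), discomfort(p)))
```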

  4. Identifying approaches for assessing methodological and reporting quality of systematic reviews

    DEFF Research Database (Denmark)

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle

    2017-01-01

    BACKGROUND: The methodological quality and completeness of reporting of systematic reviews (SRs) is fundamental to optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs, yet little is known about how they are used in SRs or where there are potential gaps in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time. ... or reporting guidelines used as proxy to assess RQ were used in 80% (61/76) of identified reports. These included two reporting guidelines (PRISMA and QUOROM) and five quality assessment tools (AMSTAR, R-AMSTAR, OQAQ, Mulrow, Sacks) and GRADE criteria. The remaining 24% (18/76) of reports developed their own...

  5. A new approach for the validation of skeletal muscle modelling using MRI data

    Science.gov (United States)

    Böl, Markus; Sturmat, Maike; Weichert, Christine; Kober, Cornelia

    2011-05-01

    Active and passive experiments on skeletal muscles are in general performed either on isolated muscles or on whole muscle packages, such as the arm or the leg. Both methods exhibit advantages and disadvantages. Experiments on isolated muscles take no account of the surrounding tissues, which leads to insufficient specification of the isolated muscle. In particular, the muscle shape and the fibre directions of an embedded muscle are completely different from those of the same isolated muscle. An explicit advantage, in contrast, is the possibility of studying the mechanical characteristics in a unique, isolated way. On the other hand, for experiments on muscle packages the aforementioned pros and cons reverse. In this situation, the whole surrounding tissue contributes to the mechanical characteristics of the muscle, which are much more difficult to identify. However, an embedded muscle reflects a much more realistic situation than the isolated condition. Thus, in the proposed work we suggest, to our knowledge for the first time, a technique that allows the characteristics of single skeletal muscles inside a muscle package to be studied without any computation of the tissue around the muscle of interest. In doing so, we use magnetic resonance imaging data of an upper arm during contraction. By applying a three-dimensional continuum constitutive muscle model we are able to study the biceps brachii inside the upper arm and validate the modelling approach by optical experiments.

  6. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to calculate the effect of these assumptions on the CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
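
    The key point above is that the raw CFD-PIV difference is not itself the model error; the other error sources have to be parsed out. A minimal sketch of that bookkeeping, in the spirit of common validation practice (an ASME V&V 20-style combination of uncertainties), with made-up velocity profiles and uncertainty estimates:

```python
import numpy as np

# Hypothetical velocity magnitudes along the validation line (m/s).
cfd = np.array([0.42, 0.55, 0.61, 0.58, 0.47])       # simulation result S
piv = np.array([0.40, 0.52, 0.64, 0.60, 0.45])       # experimental data D

# Hypothetical standard uncertainties from independent sources.
u_num   = 0.010   # numerical (discretisation/iteration) uncertainty of the CFD
u_input = 0.015   # uncertainty from input parameters (geometry, flow rate, ...)
u_exp   = 0.020   # PIV measurement uncertainty

comparison_error = cfd - piv                          # E = S - D, point by point
u_val = np.sqrt(u_num**2 + u_input**2 + u_exp**2)     # validation uncertainty

# The model error (due to modelling assumptions alone) is what remains of E once
# the other error sources are accounted for; here it is reported as E +/- u_val.
rel_error = 100 * np.abs(comparison_error) / np.abs(piv)
print(f"mean |E| = {np.mean(np.abs(comparison_error)):.3f} m/s "
      f"(about {rel_error.mean():.1f}% of the measured values)")
print(f"validation uncertainty u_val = {u_val:.3f} m/s; "
      "the model error lies within E +/- u_val at each point")
```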

  7. Integrative biology approach identifies cytokine targeting strategies for psoriasis.

    Science.gov (United States)

    Perera, Gayathri K; Ainali, Chrysanthi; Semenova, Ekaterina; Hundhausen, Christian; Barinaga, Guillermo; Kassen, Deepika; Williams, Andrew E; Mirza, Muddassar M; Balazs, Mercedesz; Wang, Xiaoting; Rodriguez, Robert Sanchez; Alendar, Andrej; Barker, Jonathan; Tsoka, Sophia; Ouyang, Wenjun; Nestle, Frank O

    2014-02-12

    Cytokines are critical checkpoints of inflammation. The treatment of human autoimmune disease has been revolutionized by targeting inflammatory cytokines as key drivers of disease pathogenesis. Despite this, there exist numerous pitfalls when translating preclinical data into the clinic. We developed an integrative biology approach combining human disease transcriptome data sets with clinically relevant in vivo models in an attempt to bridge this translational gap. We chose interleukin-22 (IL-22) as a model cytokine because of its potentially important proinflammatory role in epithelial tissues. Injection of IL-22 into normal human skin grafts produced marked inflammatory skin changes resembling human psoriasis. Injection of anti-IL-22 monoclonal antibody in a human xenotransplant model of psoriasis, developed specifically to test potential therapeutic candidates, efficiently blocked skin inflammation. Bioinformatic analysis integrating both the IL-22 and anti-IL-22 cytokine transcriptomes and mapping them onto a psoriasis disease gene coexpression network identified key cytokine-dependent hub genes. Using knockout mice and small-molecule blockade, we show that one of these hub genes, the so far unexplored serine/threonine kinase PIM1, is a critical checkpoint for human skin inflammation and potential future therapeutic target in psoriasis. Using in silico integration of human data sets and biological models, we were able to identify a new target in the treatment of psoriasis.

  8. Validation of transcutaneous bilirubin nomogram for identifying neonatal hyperbilirubinemia in healthy Chinese term and late-preterm infants: a multicenter study

    Directory of Open Access Journals (Sweden)

    Zhangbin Yu

    2014-06-01

    Full Text Available OBJECTIVE: to prospectively validate a previously constructed transcutaneous bilirubin (TcB) nomogram for identifying severe hyperbilirubinemia in healthy Chinese term and late-preterm infants. METHODS: this was a multicenter study that included 9,174 healthy term and late-preterm infants in eight hospitals of China. TcB measurements were performed using a JM-103 bilirubinometer. TcB values were plotted on a previously developed TcB nomogram, to identify the predictive ability for subsequent significant hyperbilirubinemia. RESULTS: in the present study, 972 neonates (10.6%) developed significant hyperbilirubinemia. The 40th percentile of the nomogram could identify all neonates who were at risk of significant hyperbilirubinemia, but with a low positive predictive value (PPV) (18.9%). Of the 453 neonates above the 95th percentile, 275 subsequently developed significant hyperbilirubinemia, with a high PPV (60.7%), but with low sensitivity (28.3%). The 75th percentile was highly specific (81.9%) and moderately sensitive (79.8%). The area under the curve (AUC) for the TcB nomogram was 0.875. CONCLUSIONS: this study validated the previously developed TcB nomogram, which could be used to predict subsequent significant hyperbilirubinemia in healthy Chinese term and late-preterm infants. However, combining TcB nomogram and clinical risk factors could improve the predictive accuracy for severe hyperbilirubinemia, which was not assessed in the study. Further studies are necessary to confirm this combination.

  9. Validation of transcutaneous bilirubin nomogram for identifying neonatal hyperbilirubinemia in healthy Chinese term and late-preterm infants: a multicenter study.

    Science.gov (United States)

    Yu, Zhangbin; Han, Shuping; Wu, Jinxia; Li, Mingxia; Wang, Huaiyan; Wang, Jimei; Liu, Jiebo; Pan, Xinnian; Yang, Jie; Chen, Chao

    2014-01-01

    to prospectively validate a previously constructed transcutaneous bilirubin (TcB) nomogram for identifying severe hyperbilirubinemia in healthy Chinese term and late-preterm infants. this was a multicenter study that included 9,174 healthy term and late-preterm infants in eight hospitals of China. TcB measurements were performed using a JM-103 bilirubinometer. TcB values were plotted on a previously developed TcB nomogram, to identify the predictive ability for subsequent significant hyperbilirubinemia. in the present study, 972 neonates (10.6%) developed significant hyperbilirubinemia. The 40th percentile of the nomogram could identify all neonates who were at risk of significant hyperbilirubinemia, but with a low positive predictive value (PPV) (18.9%). Of the 453 neonates above the 95th percentile, 275 subsequently developed significant hyperbilirubinemia, with a high PPV (60.7%), but with low sensitivity (28.3%). The 75th percentile was highly specific (81.9%) and moderately sensitive (79.8%). The area under the curve (AUC) for the TcB nomogram was 0.875. this study validated the previously developed TcB nomogram, which could be used to predict subsequent significant hyperbilirubinemia in healthy Chinese term and late-preterm infants. However, combining TcB nomogram and clinical risk factors could improve the predictive accuracy for severe hyperbilirubinemia, which was not assessed in the study. Further studies are necessary to confirm this combination. Copyright © 2014 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
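
    The predictive values quoted in both versions of this record follow directly from the stated counts; the short sketch below recomputes them (the specificity for the 95th-percentile cut-off is implied by the counts rather than reported in the abstract).

```python
# Counts taken from the abstract (95th-percentile cut-off of the TcB nomogram).
total_infants = 9174
cases = 972                 # infants who developed significant hyperbilirubinemia
flagged = 453               # infants above the 95th percentile
true_positive = 275         # flagged infants who became cases

false_positive = flagged - true_positive
true_negative = total_infants - cases - false_positive

ppv = true_positive / flagged
sensitivity = true_positive / cases
specificity = true_negative / (total_infants - cases)

print(f"PPV         = {ppv:.1%}")          # ~60.7%, as reported
print(f"sensitivity = {sensitivity:.1%}")  # ~28.3%, as reported
print(f"specificity = {specificity:.1%}")  # implied by the counts, not stated
```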

  10. Physical validation issue of the NEPTUNE two-phase modelling: validation plan to be adopted, experimental programs to be set up and associated instrumentation techniques developed

    International Nuclear Information System (INIS)

    Pierre Peturaud; Eric Hervieu

    2005-01-01

    Full text of publication follows: A long-term joint development program for the next generation of nuclear reactors simulation tools has been launched in 2001 by EDF (Electricite de France) and CEA (Commissariat a l'Energie Atomique). The NEPTUNE Project constitutes the Thermal-Hydraulics part of this comprehensive program. Along with the underway development of this new two-phase flow software platform, the physical validation of the involved modelling is a crucial issue, whatever the modelling scale is, and the present paper deals with this issue. After a brief recall about the NEPTUNE platform, the general validation strategy to be adopted is first of all clarified by means of three major features: (i) physical validation in close connection with the concerned industrial applications, (ii) involving (as far as possible) a two-step process successively focusing on dominant separate models and assessing the whole modelling capability, (iii) thanks to the use of relevant data with respect to the validation aims. Based on this general validation process, a four-step generic work approach has been defined; it includes: (i) a thorough analysis of the concerned industrial applications to identify the key physical phenomena involved and associated dominant basic models, (ii) an assessment of these models against the available validation pieces of information, to specify the additional validation needs and define dedicated validation plans, (iii) an inventory and assessment of existing validation data (with respect to the requirements specified in the previous task) to identify the actual needs for new validation data, (iv) the specification of the new experimental programs to be set up to provide the needed new data. This work approach has been applied to the NEPTUNE software, focusing on 8 high priority industrial applications, and it has resulted in the definition of (i) the validation plan and experimental programs to be set up for the open medium 3D modelling

  11. A multi-source satellite data approach for modelling Lake Turkana water level: calibration and validation using satellite altimetry data

    Directory of Open Access Journals (Sweden)

    N. M. Velpuri

    2012-01-01

    Full Text Available Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004–2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1–2 m. The lake level fluctuated in the range up to 4 m between the years 1998–2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated
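
    A compact sketch of the two ingredients described above: a lumped water-balance update of lake level from inflow, over-the-lake rainfall and evaporation, and the goodness-of-fit scores (Pearson correlation and Nash-Sutcliffe efficiency) computed against altimetry-style levels. All series, the lake area and the noise level are synthetic placeholders, not the Lake Turkana data.

```python
import numpy as np

rng = np.random.default_rng(2)
months = 60
lake_area_km2 = 6750.0                          # nominal surface area (placeholder)

# Hypothetical monthly forcing (km^3/month).
inflow = rng.gamma(2.0, 0.25, months)           # river inflow
rain   = rng.gamma(1.5, 0.02, months)           # over-the-lake rainfall
evap   = np.full(months, 0.55)                  # over-the-lake evaporation

def simulate_levels(level0=360.0):
    """Lumped water balance: level change = net volume change / lake area."""
    levels = [level0]
    for q_in, p, e in zip(inflow, rain, evap):
        d_level_m = (q_in + p - e) * 1e9 / (lake_area_km2 * 1e6)
        levels.append(levels[-1] + d_level_m)
    return np.array(levels[1:])

modelled = simulate_levels()
observed = modelled + rng.normal(0, 0.15, months)    # stand-in for altimetry levels

def nash_sutcliffe(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

r = np.corrcoef(observed, modelled)[0, 1]
print(f"Pearson r = {r:.2f}, NSCE = {nash_sutcliffe(observed, modelled):.2f}")
```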

  12. A multi-source satellite data approach for modelling Lake Turkana water level: Calibration and validation using satellite altimetry data

    Science.gov (United States)

    Velpuri, N.M.; Senay, G.B.; Asante, K.O.

    2012-01-01

    Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1-2 m. The lake level fluctuated in the range up to 4 m between the years 1998-2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance

  13. Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs

    Science.gov (United States)

    Purba, J. H.

    2018-02-01

    Initiating events of a nuclear power plant being evaluated need to be firstly identified prior to applying probabilistic safety assessment on that plant. Various types of master logic diagrams (MLDs) have been proposed for searching initiating events of the next generation of nuclear power plants, which have limited data and operating experiences. Those MLDs are different in the number of steps or levels and different in the basis for developing them. This study proposed another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps starting from the top event representing the final objective of the safety functions to the basic event representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirmed that the proposed MLD is feasible for finding HTGR initiating events.

  14. Interpretative approaches to identifying sources of hydrocarbons in complex contaminated environments

    International Nuclear Information System (INIS)

    Sauer, T.C.; Brown, J.S.; Boehm, P.D.

    1993-01-01

    Recent advances in analytical instrumental hardware and software have permitted the use of more sophisticated approaches in identifying or fingerprinting sources of hydrocarbons in complex matrix environments. In natural resource damage assessments and contaminated site investigations of both terrestrial and aquatic environments, chemical fingerprinting has become an important interpretative tool. The alkyl homologues of the major polycyclic and heterocyclic aromatic hydrocarbons (e.g., phenanthrenes/anthracenes, dibenzothiophenes, chrysenes) have been found to be the most valuable hydrocarbons in differentiating hydrocarbon sources, but there are other hydrocarbon analytes, such as the chemical biomarkers steranes and triterpanes, and alkyl homologues of benzene, and chemical methodologies, such as scanning UV fluorescence, that have been found to be useful in certain environments. This presentation will focus on recent data interpretative approaches for hydrocarbon source identification assessments. Selection of appropriate target analytes and data quality requirements will be discussed, and example cases, including the Arabian Gulf War oil spill results, will be presented.

  15. A systematic approach to obtain validated Partial Least Square models for predicting lipoprotein subclasses from serum NMR spectra

    NARCIS (Netherlands)

    Mihaleva, V.V.; van Schalkwijk, D.B.; de Graaf, A.A.; van Duynhoven, J.; van Dorsten, F.A.; Vervoort, J.; Smilde, A.; Westerhuis, J.A.; Jacobs, D.M.

    2014-01-01

    A systematic approach is described for building validated PLS models that predict cholesterol and triglyceride concentrations in lipoprotein subclasses in fasting serum from a normolipidemic, healthy population. The PLS models were built on diffusion-edited 1H NMR spectra and calibrated on
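
    As a rough illustration of the kind of model being validated (not the authors' calibration pipeline), the snippet below fits a PLS regression from simulated 'spectra' to a lipoprotein-subclass concentration and scores it by cross-validation; the data, the number of latent components and the target are synthetic placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic stand-in for diffusion-edited 1H NMR spectra (samples x spectral bins)
# and a lipoprotein subclass concentration driven by a few latent factors.
n_samples, n_bins = 120, 300
latent = rng.normal(size=(n_samples, 3))
X = latent @ rng.normal(size=(3, n_bins)) + rng.normal(0, 0.3, (n_samples, n_bins))
y = latent @ np.array([1.5, -0.8, 0.4]) + rng.normal(0, 0.2, n_samples)

pls = PLSRegression(n_components=3)
# Simple internal validation: cross-validated R^2 of the PLS prediction.
scores = cross_val_score(pls, X, y, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", np.round(scores, 2))
```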

  16. A systematic approach to obtain validated partial least square models for predicting lipoprotein subclasses from serum NMR spectra

    NARCIS (Netherlands)

    Mihaleva, V.V.; Schalkwijk, van D.B.; Graaf, de A.A.; Duynhoven, van J.P.M.; Dorsten, van F.A.; Vervoort, J.J.M.; Smilde, A.K.; Westerhuis, J.A.; Jacobs, D.M.

    2014-01-01

    A systematic approach is described for building validated PLS models that predict cholesterol and triglyceride concentrations in lipoprotein subclasses in fasting serum from a normolipidemic, healthy population. The PLS models were built on diffusion-edited 1H NMR spectra and calibrated on

  17. A systematic approach to obtain validated partial least square models for predicting lipoprotein subclasses from serum nmr spectra

    NARCIS (Netherlands)

    Mihaleva, V.V.; Schalkwijk, D.B. van; Graaf, A.A. de; Duynhoven, J. van; Dorsten, F.A. van; Vervoort, J.; Smilde, A.; Westerhuis, J.A.; Jacobs, D.M.

    2014-01-01

    A systematic approach is described for building validated PLS models that predict cholesterol and triglyceride concentrations in lipoprotein subclasses in fasting serum from a normolipidemic, healthy population. The PLS models were built on diffusion-edited 1H NMR spectra and calibrated on

  18. Experimental Validation of Various Temperature Modells for Semi-Physical Tyre Model Approaches

    Science.gov (United States)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With the increasing level of complexity and automation in the area of automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, attention is paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.

  19. Candidate gene linkage approach to identify DNA variants that predispose to preterm birth

    DEFF Research Database (Denmark)

    Bream, Elise N A; Leppellere, Cara R; Cooper, Margaret E

    2013-01-01

    Background: The aim of this study was to identify genetic variants contributing to preterm birth (PTB) using a linkage candidate gene approach. Methods: We studied 99 single-nucleotide polymorphisms (SNPs) for 33 genes in 257 families with PTBs segregating. Nonparametric and parametric analyses were ... through the infant and/or the mother in the etiology of PTB ...

  20. The validity of the family history method for identifying Alzheimer disease.

    Science.gov (United States)

    Li, G; Aryan, M; Silverman, J M; Haroutunian, V; Perl, D P; Birstein, S; Lantz, M; Marin, D B; Mohs, R C; Davis, K L

    1997-05-01

    To examine the validity of the family history method for identifying Alzheimer disease (AD) by comparing family history and neuropathological diagnoses. Seventy-seven former residents of the Jewish Home and Hospital for the Aged, New York, NY, with neuropathological evaluations on record were blindly assessed for the presence of dementia and, if present, the type of dementia through family informants by telephone interviews. The Alzheimer's Disease Risk Questionnaire was used to collect demographic information and screen for possible dementia. If dementia was suspected, the Dementia Questionnaire was administered to assess the course and type of dementia, i.e., primary progressive dementia (PPD, likely AD), multiple infarct dementia, mixed dementia (i.e., PPD and multiple infarct dementia), and other dementias based on the modified Diagnostic and Statistical Manual of Mental Disorders, Third Edition, criteria. Sixty (77.9%) of 77 elderly subjects were classified as having dementia and 17 (22.1%) were without dementia by family history evaluation. Of the 60 elderly subjects with dementia, 57 (95%) were found at autopsy to have had neuropathological changes related to dementia. The sensitivity of the family history diagnosis for dementia with related neuropathological change was 0.84 (57 of 68) and the specificity was 0.67 (6 of 9). Using family history information to differentiate the type of dementia, the sensitivity for definite or probable AD (with or without another condition) was 0.69 (36 of 51) and the specificity was 0.73 (19 of 26). The majority (9 of 15) of patients testing false negative for PPD had a history of stroke associated with onset of memory changes, excluding a diagnosis of PPD. Identifying dementia, in general, and AD, in particular, has an acceptable sensitivity and specificity. As is true for direct clinical diagnosis, the major issue associated with misclassifying AD in a family history assessment is the masking effects of a coexisting non
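
    The validity figures reported above are ordinary 2x2-table quantities; this sketch simply recomputes them from the counts given in the abstract.

```python
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Dementia with related neuropathological change (counts from the abstract).
print(f"dementia: sensitivity = {sensitivity(57, 68 - 57):.2f}, "
      f"specificity = {specificity(6, 9 - 6):.2f}")           # 0.84 and 0.67

# Definite or probable AD (with or without another condition).
print(f"AD:       sensitivity = {sensitivity(36, 51 - 36):.2f}, "
      f"specificity = {specificity(19, 26 - 19):.2f}")
# Computed: ~0.71 and 0.73 (the abstract reports 0.69 and 0.73).
```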

  1. Improving accuracy for identifying related PubMed queries by an integrated approach.

    Science.gov (United States)

    Lu, Zhiyong; Wilbur, W John

    2009-10-01

    PubMed is the most widely used tool for searching biomedical literature online. As with many other online search tools, a user often types a series of multiple related queries before retrieving satisfactory results to fulfill a single information need. Meanwhile, it is also a common phenomenon to see a user type queries on unrelated topics in a single session. In order to study PubMed users' search strategies, it is necessary to be able to automatically separate unrelated queries and group together related queries. Here, we report a novel approach combining both lexical and contextual analyses for segmenting PubMed query sessions and identifying related queries and compare its performance with the previous approach based solely on concept mapping. We experimented with our integrated approach on sample data consisting of 1539 pairs of consecutive user queries in 351 user sessions. The prediction results of 1396 pairs agreed with the gold-standard annotations, achieving an overall accuracy of 90.7%. This demonstrates that our approach is significantly better than the previously published method. By applying this approach to a one day query log of PubMed, we found that a significant proportion of information needs involved more than one PubMed query, and that most of the consecutive queries for the same information need are lexically related. Finally, the proposed PubMed distance is shown to be an accurate and meaningful measure for determining the contextual similarity between biological terms. The integrated approach can play a critical role in handling real-world PubMed query log data as is demonstrated in our experiments.
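
    The abstract does not spell out the features, so the sketch below only illustrates the flavour of combining a lexical score with a contextual (co-occurrence style) score to decide whether two consecutive PubMed queries serve the same information need; the term-overlap measure, the toy co-occurrence table and the weights and threshold are all hypothetical.

```python
# Toy contextual signal: how often two biomedical terms co-occur (hypothetical counts).
COOCCURRENCE = {
    frozenset({"asthma", "wheezing"}): 120,
    frozenset({"asthma", "inhaled corticosteroids"}): 85,
    frozenset({"zebrafish", "asthma"}): 1,
}

def lexical_score(q1, q2):
    """Jaccard overlap of the query terms (a simple lexical relatedness measure)."""
    t1, t2 = set(q1.lower().split()), set(q2.lower().split())
    return len(t1 & t2) / len(t1 | t2)

def contextual_score(q1, q2, scale=100.0):
    """Normalised co-occurrence of the whole queries as terms (toy stand-in)."""
    return min(1.0, COOCCURRENCE.get(frozenset({q1.lower(), q2.lower()}), 0) / scale)

def same_information_need(q1, q2, w_lex=0.6, w_ctx=0.4, threshold=0.2):
    score = w_lex * lexical_score(q1, q2) + w_ctx * contextual_score(q1, q2)
    return score >= threshold, round(score, 2)

print(same_information_need("asthma treatment children", "asthma inhaler dosage children"))
print(same_information_need("asthma", "zebrafish"))
```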

  2. Validation of Helicopter Gear Condition Indicators Using Seeded Fault Tests

    Science.gov (United States)

    Dempsey, Paula; Brandon, E. Bruce

    2013-01-01

    A "seeded fault test", in support of a rotorcraft condition-based maintenance (CBM) program, is an experiment in which a component is tested with a known fault while health monitoring data is collected. These tests are performed at operating conditions comparable to operating conditions the component would be exposed to while installed on the aircraft. Performance of seeded fault tests is one method used to provide evidence that a Health Usage Monitoring System (HUMS) can replace current maintenance practices required for aircraft airworthiness. Actual in-service experience of the HUMS detecting a component fault is another validation method. This paper will discuss a hybrid validation approach that combines in-service data with seeded fault tests. For this approach, existing in-service HUMS flight data from a naturally occurring component fault will be used to define a component seeded fault test. An example, using spiral bevel gears as the targeted component, will be presented. Since the U.S. Army has begun to develop standards for using seeded fault tests for HUMS validation, the hybrid approach will be mapped to the steps defined within their Aeronautical Design Standard Handbook for CBM. This paper will step through their defined processes, and identify additional steps that may be required when using component test rig fault tests to demonstrate helicopter condition indicator (CI) performance. The discussion within this paper will provide the reader with a better appreciation for the challenges faced when defining a seeded fault test for HUMS validation.

  3. A systems immunology approach identifies the collective impact of 5 miRs in Th2 inflammation.

    Science.gov (United States)

    Kılıç, Ayşe; Santolini, Marc; Nakano, Taiji; Schiller, Matthias; Teranishi, Mizue; Gellert, Pascal; Ponomareva, Yuliya; Braun, Thomas; Uchida, Shizuka; Weiss, Scott T; Sharma, Amitabh; Renz, Harald

    2018-06-07

    Allergic asthma is a chronic inflammatory disease dominated by a CD4+ T helper 2 (Th2) cell signature. The immune response amplifies in self-enforcing loops, promoting Th2-driven cellular immunity and leaving the host unable to terminate inflammation. Posttranscriptional mechanisms, including microRNAs (miRs), are pivotal in maintaining immune homeostasis. Since an altered expression of various miRs has been associated with T cell-driven diseases, including asthma, we hypothesized that miRs control mechanisms ensuring Th2 stability and maintenance in the lung. We isolated murine CD4+ Th2 cells from allergic inflamed lungs and profiled gene and miR expression. Instead of focusing on the magnitude of miR differential expression, here we addressed the secondary consequences for the set of molecular interactions in the cell, the interactome. We developed the Impact of Differential Expression Across Layers, a network-based algorithm to prioritize disease-relevant miRs based on the central role of their targets in the molecular interactome. This method identified 5 Th2-related miRs (mir27b, mir206, mir106b, mir203, and mir23b) whose antagonization led to a sharp reduction of the Th2 phenotype. Overall, a systems biology tool was developed and validated, highlighting the role of miRs in Th2-driven immune response. This result offers potentially novel approaches for therapeutic interventions.
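
    The prioritisation idea described above ranks miRs by how central their targets are in the molecular interactome rather than by differential expression alone. A toy version of that idea follows (made-up interactome edges and miR-target sets, with simple degree centrality standing in for the paper's scoring):

```python
import networkx as nx

# Hypothetical protein-protein interactome and miR -> target-gene sets.
interactome = nx.Graph([("STAT6", "GATA3"), ("GATA3", "IL4"), ("IL4", "IL4R"),
                        ("IL4R", "STAT6"), ("GATA3", "IL5"), ("FOXP3", "IL2RA")])
mir_targets = {
    "mir27b": {"GATA3", "IL4"},      # targets sit in the Th2 core of the toy network
    "mir206": {"STAT6", "IL4R"},
    "mirX":   {"FOXP3"},             # peripheral target -> lower priority
}

centrality = nx.degree_centrality(interactome)

def mir_priority(targets):
    """Average centrality of a miR's targets in the interactome."""
    known = [centrality[g] for g in targets if g in centrality]
    return sum(known) / len(known) if known else 0.0

ranking = sorted(mir_targets, key=lambda m: mir_priority(mir_targets[m]), reverse=True)
for mir in ranking:
    print(f"{mir}: priority = {mir_priority(mir_targets[mir]):.2f}")
```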

  4. Development and validation testing of a short nutrition questionnaire to identify dietary risk factors in preschoolers aged 12–36 months

    Directory of Open Access Journals (Sweden)

    Niamh Rice

    2015-06-01

    Full Text Available Background: Although imbalances in dietary intakes can have short and longer term influences on the health of preschool children, few tools exist to quickly and easily identify nutritional risk in otherwise healthy young children. Objectives: To develop and test the validity of a parent-administered questionnaire (NutricheQ) as a means of evaluating dietary risk in young children (12–36 months). Design: Following a comprehensive development process and internal reliability assessment, the NutricheQ questionnaire was validated in a cohort of 371 Irish preschool children as part of the National Preschool Nutrition Survey. Dietary risk was rated on a scale ranging from 0 to 22 from 11 questions, with a higher score indicating higher risk. Results: Children with higher NutricheQ scores had significantly (p<0.05) lower mean daily intakes of key nutrients such as iron, zinc, vitamin D, riboflavin, niacin, folate, phosphorous, potassium, carotene, retinol, and dietary fibre. They also had lower (p<0.05) intakes of vegetables, fish and fish dishes, meat and infant/toddler milks and higher intakes of processed foods and non-milk beverages, confectionery, sugars and savoury snack foods, indicative of poorer dietary quality. Area under the curve values of 84.7% and 75.6% were achieved for ‘medium’ and ‘high’ dietary risk when compared with expert risk ratings, indicating good consistency between the two methods. Conclusion: NutricheQ is a valid method of quickly assessing dietary quality in preschoolers and in identifying those at increased nutritional risk. In Context: Analysis of data from national food and nutrition surveys typically identifies shortfalls in dietary intakes or quality of young children. This can relate to intakes of micronutrients such as iron or vitamin D as well as to the balance of macronutrients they consume (e.g. fat or sugar). Alongside this lie concerns regarding overweight and obesity and physical inactivity. This combination of

  5. A Systematic Approach to Identify Promising New Items for Small to Medium Enterprises: A Case Study

    Directory of Open Access Journals (Sweden)

    Sukjae Jeong

    2016-11-01

    Full Text Available Despite the growing importance of identifying new business items for small and medium enterprises (SMEs), most previous studies focus on conglomerates. The paucity of empirical studies has also led to limited real-life applications. Hence, this study proposes a systematic approach to find new business items (NBIs) that help the prospective SMEs develop, evaluate, and select viable business items to survive the competitive environment. The proposed approach comprises two stages: (1) the classification of diversification of SMEs; and (2) the searching and screening of business items. In the first stage, SMEs are allocated to five groups, based on their internal technological competency and external market conditions. In the second stage, based on the types of SMEs identified in the first stage, a set of alternative business items is derived by combining the results of portfolio analysis and benchmarking analysis. After deriving new business items, a market and technology-driven matrix analysis is utilized to screen suitable business items, and the Bruce Merrifield-Ohe (BMO) method is used to categorize and identify prospective items based on market attractiveness and internal capability. To illustrate the applicability of the proposed approach, a case study is presented.

  6. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Science.gov (United States)

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  7. Reliability and validity of risk analysis

    International Nuclear Information System (INIS)

    Aven, Terje; Heide, Bjornar

    2009-01-01

    In this paper we investigate to what extent risk analysis meets the scientific quality requirements of reliability and validity. We distinguish between two types of approaches within risk analysis, relative frequency-based approaches and Bayesian approaches. The former category includes both traditional statistical inference methods and the so-called probability of frequency approach. Depending on the risk analysis approach, the aim of the analysis is different, the results are presented in different ways and consequently the meaning of the concepts reliability and validity are not the same.

  8. A functional glycoproteomics approach identifies CD13 as a novel E-selectin ligand in breast cancer.

    Science.gov (United States)

    Carrascal, M A; Silva, M; Ferreira, J A; Azevedo, R; Ferreira, D; Silva, A M N; Ligeiro, D; Santos, L L; Sackstein, R; Videira, P A

    2018-05-17

    The glycan moieties sialyl-Lewis-X and/or -A (sLeX/A) are the primary ligands for E-selectin, regulating subsequent tumor cell extravasation into distant organs. However, the nature of the glycoprotein scaffolds displaying these glycans in breast cancer remains unclear and constitutes the focus of the present investigation. We isolated glycoproteins that bind E-selectin from the CF1_T breast cancer cell line, derived from a patient with ductal carcinoma. Proteins were identified using a bottom-up proteomics approach by nanoLC-orbitrap LTQ-MS/MS. Data were curated using bioinformatics tools to highlight clinically relevant glycoproteins, which were validated by flow cytometry, Western blot, immunohistochemistry and in-situ proximity ligation assays in clinical samples. We observed that the CF1_T cell line expressed sLeX, but not sLeA, and the E-selectin reactivity was mainly on N-glycans. MS and bioinformatics analysis of the targeted glycoproteins, when narrowed down to the most clinically relevant species in breast cancer, identified CD44 glycoprotein (HCELL) and CD13 as key E-selectin ligands. Additionally, the co-expression of sLeX-CD44 and sLeX-CD13 was confirmed in clinical breast cancer tissue samples. Both CD44 and CD13 glycoforms display sLeX in breast cancer and bind E-selectin, suggesting a key role in metastasis development. Such observations provide a novel molecular rationale for developing targeted therapeutics. While HCELL expression in breast cancer has been previously reported, this is the first study indicating that CD13 functions as an E-selectin ligand in breast cancer. This observation supports previous associations of CD13 with metastasis and draws attention to this glycoprotein as an anti-cancer target. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Graphical approach to assess the soil fertility evaluation model validity for rice (case study: southern area of Merapi Mountain, Indonesia)

    Science.gov (United States)

    Julianto, E. A.; Suntoro, W. A.; Dewi, W. S.; Partoyo

    2018-03-01

    Climate change has been reported to exacerbate land resources degradation, including soil fertility decline. Appropriate assessment of the validity of soil fertility evaluation could reduce the risk of climate change effects on plant cultivation. This study aims to assess the validity of a Soil Fertility Evaluation Model using a graphical approach. The models evaluated were the Indonesian Soil Research Center (PPT) version model, the FAO Unesco version model, and the Kyuma version model. Each model was then correlated with rice production (dry grain weight/GKP). The goodness of fit of each model can be tested to evaluate the quality and validity of a model, together with the coefficient of determination (R2). This research used the EViews 9 programme with a graphical approach. The results obtained three curves, namely actual, fitted, and residual curves. If the actual and fitted curves are widely apart or irregular, this means that the quality of the model is not good, or that many other factors are still not included in the model (large residual). Conversely, if the actual and fitted curves show exactly the same shape, it means that all factors have already been included in the model. Modification of the standard soil fertility evaluation models can improve the quality and validity of a model.
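    The graphical goodness-of-fit check described here (actual, fitted and residual curves plus R2) can be illustrated with a minimal Python sketch in place of EViews; the fertility scores and yields below are synthetic and the variable names are assumptions.

        # Hedged sketch: actual-vs-fitted-vs-residual plot and R^2 for a soil
        # fertility score regressed against rice yield (synthetic data, hypothetical names).
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        fertility_score = rng.uniform(1, 10, 50)
        yield_gkp = 2.0 + 0.8 * fertility_score + rng.normal(0, 0.5, 50)

        coef = np.polyfit(fertility_score, yield_gkp, deg=1)   # simple linear fit
        fitted = np.polyval(coef, fertility_score)
        residual = yield_gkp - fitted
        r2 = 1 - residual.var() / yield_gkp.var()

        order = np.argsort(fertility_score)
        fig, ax = plt.subplots(2, 1, sharex=True)
        ax[0].plot(fertility_score[order], yield_gkp[order], label="actual")
        ax[0].plot(fertility_score[order], fitted[order], label="fitted")
        ax[0].legend(); ax[0].set_title(f"R2 = {r2:.2f}")
        ax[1].plot(fertility_score[order], residual[order], label="residual")
        ax[1].axhline(0, color="k", lw=0.5); ax[1].legend()
        plt.show()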

  10. Structural exploration for the refinement of anticancer matrix metalloproteinase-2 inhibitor designing approaches through robust validated multi-QSARs

    Science.gov (United States)

    Adhikari, Nilanjan; Amin, Sk. Abdul; Saha, Achintya; Jha, Tarun

    2018-03-01

    Matrix metalloproteinase-2 (MMP-2) is a promising pharmacological target for designing potential anticancer drugs. MMP-2 plays critical functions in apoptosis by cleaving the DNA repair enzyme poly(ADP-ribose) polymerase (PARP). Moreover, MMP-2 expression triggers the vascular endothelial growth factor (VEGF), having a positive influence on tumor size, invasion, and angiogenesis. Therefore, there is an urgent need to develop potential MMP-2 inhibitors without toxicity but with better pharmacokinetic properties. In this article, robust validated multi-quantitative structure-activity relationship (QSAR) modeling approaches were attempted on a dataset of 222 MMP-2 inhibitors to explore the important structural and pharmacophoric requirements for higher MMP-2 inhibition. Different validated regression and classification-based QSARs, pharmacophore mapping and 3D-QSAR techniques were performed. These results were challenged and subjected to further validation to explain 24 in-house MMP-2 inhibitors and to judge the reliability of these models further. All these models were individually validated internally as well as externally and were supported and validated by each other. These results were further justified by molecular docking analysis. The modeling techniques adopted here not only help to explore the necessary structural and pharmacophoric requirements but also provide overall validation and refinement techniques for designing potential MMP-2 inhibitors.

  11. CFD validation experiments for hypersonic flows

    Science.gov (United States)

    Marvin, Joseph G.

    1992-01-01

    A roadmap for CFD code validation is introduced. The elements of the roadmap are consistent with air-breathing vehicle design requirements and related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on an evaluation criteria, recommendations for an initial CFD validation data base are given and gaps identified where future experiments could provide new validation data.

  12. Methods for identifying 30 chronic conditions: application to administrative data.

    Science.gov (United States)

    Tonelli, Marcello; Wiebe, Natasha; Fortin, Martin; Guthrie, Bruce; Hemmelgarn, Brenda R; James, Matthew T; Klarenbach, Scott W; Lewanczuk, Richard; Manns, Braden J; Ronksley, Paul; Sargious, Peter; Straus, Sharon; Quan, Hude

    2015-04-17

    Multimorbidity is common and associated with poor clinical outcomes and high health care costs. Administrative data are a promising tool for studying the epidemiology of multimorbidity. Our goal was to derive and apply a new scheme for using administrative data to identify the presence of chronic conditions and multimorbidity. We identified validated algorithms that use ICD-9 CM/ICD-10 data to ascertain the presence or absence of 40 morbidities. Algorithms with both positive predictive value and sensitivity ≥70% were graded as "high validity"; those with positive predictive value ≥70% and sensitivity <70% were graded as "moderate validity". To show proof of concept, we applied identified algorithms with high to moderate validity to inpatient and outpatient claims and utilization data from 574,409 people residing in Edmonton, Canada during the 2008/2009 fiscal year. Of the 40 morbidities, we identified 30 that could be identified with high to moderate validity. Approximately one quarter of participants had identified multimorbidity (2 or more conditions), one quarter had a single identified morbidity and the remaining participants were not identified as having any of the 30 morbidities. We identified a panel of 30 chronic conditions that can be identified from administrative data using validated algorithms, facilitating the study and surveillance of multimorbidity. We encourage other groups to use this scheme, to facilitate comparisons between settings and jurisdictions.
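    A minimal sketch of the code-prefix idea behind such case-finding algorithms, assuming illustrative ICD-10 prefixes and a toy claims table (these are not the validated algorithms used in the study):

        # Hedged sketch: counting chronic conditions per person from ICD-10 claims
        # using simple code-prefix rules; prefixes and data are illustrative only.
        import pandas as pd

        algorithms = {            # condition -> ICD-10 code prefixes (illustrative)
            "diabetes":     ("E10", "E11"),
            "hypertension": ("I10",),
            "copd":         ("J44",),
        }

        claims = pd.DataFrame({
            "person_id": [1, 1, 2, 3, 3, 3],
            "icd10":     ["E11.9", "I10", "J44.1", "E10.1", "I10", "J44.9"],
        })

        def conditions(codes):
            # return the set of conditions whose prefixes match any claim code
            return {name for name, prefixes in algorithms.items()
                    if any(code.startswith(prefixes) for code in codes)}

        counts = claims.groupby("person_id")["icd10"].apply(lambda c: len(conditions(c)))
        multimorbid = counts[counts >= 2]          # 2 or more identified conditions
        print(counts, multimorbid, sep="\n")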

  13. Structural identifiability of cyclic graphical models of biological networks with latent variables.

    Science.gov (United States)

    Wang, Yulin; Lu, Na; Miao, Hongyu

    2016-06-13

    Graphical models have long been used to describe biological networks for a variety of important tasks such as the determination of key biological parameters, and the structure of a graphical model ultimately determines whether such unknown parameters can be unambiguously obtained from experimental observations (i.e., the identifiability problem). Limited by resources or technical capacities, complex biological networks are usually partially observed in experiment, which thus introduces latent variables into the corresponding graphical models. A number of previous studies have tackled the parameter identifiability problem for graphical models such as linear structural equation models (SEMs) with or without latent variables. However, the limited resolution and efficiency of existing approaches necessarily calls for further development of novel structural identifiability analysis algorithms. An efficient structural identifiability analysis algorithm is developed in this study for a broad range of network structures. The proposed method adopts Wright's path coefficient method to generate identifiability equations in the form of symbolic polynomials, and then converts these symbolic equations to binary matrices (called identifiability matrices). Several matrix operations are introduced for identifiability matrix reduction with system equivalency maintained. Based on the reduced identifiability matrices, the structural identifiability of each parameter is determined. A number of benchmark models are used to verify the validity of the proposed approach. Finally, the network module for influenza A virus replication is employed as a real example to illustrate the application of the proposed approach in practice. The proposed approach can deal with cyclic networks with latent variables. The key advantage is that it intentionally avoids symbolic computation and is thus highly efficient. Also, this method is capable of determining the identifiability of each single parameter and

  14. Yeast Augmented Network Analysis (YANA): a new systems approach to identify therapeutic targets for human genetic diseases [v1; ref status: indexed, http://f1000r.es/3gk]

    Directory of Open Access Journals (Sweden)

    David J. Wiley

    2014-06-01

    Full Text Available Genetic interaction networks that underlie most human diseases are highly complex and poorly defined. Better-defined networks will allow identification of a greater number of therapeutic targets. Here we introduce our Yeast Augmented Network Analysis (YANA) approach and test it with the X-linked spinal muscular atrophy (SMA) disease gene UBA1. First, we express UBA1 and a mutant variant in fission yeast and use high-throughput methods to identify fission yeast genetic modifiers of UBA1. Second, we analyze available protein-protein interaction network databases in both fission yeast and human to construct UBA1 genetic networks. Third, from these networks we identified potential therapeutic targets for SMA. Finally, we validate one of these targets in a vertebrate (zebrafish) SMA model. This study demonstrates the power of combining synthetic and chemical genetics with a simple model system to identify human disease gene networks that can be exploited for treating human diseases.

  15. An integrated computational validation approach for potential novel miRNA prediction

    Directory of Open Access Journals (Sweden)

    Pooja Viswam

    2017-12-01

    Full Text Available MicroRNAs (miRNAs) are short, non-coding RNAs of 17–24 bp in length that regulate gene expression by targeting mRNA molecules. The regulatory functions of miRNAs are known to be majorly associated with disease phenotypes such as cancer, cell signaling, cell division, growth and other metabolisms. Novel miRNAs are defined as sequences that have no similarity with existing known sequences and lack experimental evidence. In recent decades, the advent of next-generation sequencing has allowed us to capture small RNA molecules from cells and to develop methods to estimate their expression levels. Several computational algorithms are available to predict novel miRNAs from deep sequencing data. In this work, we integrated three novel miRNA prediction programs, miRDeep, miRanalyzer and miRPRo, to compare and validate their prediction efficiency. The dicer cleavage sites, alignment density, seed conservation, minimum free energy, AU-GC percentage, secondary loop scores, false discovery rates and confidence scores will be considered for comparison and evaluation. The efficiency in identifying isomiRs and base pair mismatches in a strand-specific manner will also be considered for the computational validation. Further, the criteria and parameters for the identification of the best possible novel miRNA with minimal false positive rates were deduced.
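    A minimal sketch of one possible consensus filter across the three predictors named above; the candidate IDs, minimum-free-energy values and thresholds are invented for illustration and are not the criteria deduced in the study.

        # Hedged sketch: keep novel-miRNA candidates called by at least two tools
        # and passing a (hypothetical) minimum-free-energy cut-off.
        mirdeep     = {"cand_01", "cand_02", "cand_05"}
        miranalyzer = {"cand_02", "cand_03", "cand_05"}
        mirpro      = {"cand_02", "cand_05", "cand_07"}

        mfe = {"cand_02": -28.4, "cand_05": -17.1}   # kcal/mol, hypothetical values

        votes = {}
        for tool in (mirdeep, miranalyzer, mirpro):
            for cand in tool:
                votes[cand] = votes.get(cand, 0) + 1

        consensus = [c for c, v in votes.items()
                     if v >= 2 and mfe.get(c, 0.0) <= -20.0]
        print(sorted(consensus))      # -> ['cand_02']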

  16. An integrated chemical biology approach identifies specific vulnerability of Ewing's sarcoma to combined inhibition of Aurora kinases A and B.

    Science.gov (United States)

    Winter, Georg E; Rix, Uwe; Lissat, Andrej; Stukalov, Alexey; Müllner, Markus K; Bennett, Keiryn L; Colinge, Jacques; Nijman, Sebastian M; Kubicek, Stefan; Kovar, Heinrich; Kontny, Udo; Superti-Furga, Giulio

    2011-10-01

    Ewing's sarcoma is a pediatric cancer of the bone that is characterized by the expression of the chimeric transcription factor EWS-FLI1 that confers a highly malignant phenotype and results from the chromosomal translocation t(11;22)(q24;q12). Poor overall survival and pronounced long-term side effects associated with traditional chemotherapy necessitate the development of novel, targeted, therapeutic strategies. We therefore conducted a focused viability screen with 200 small molecule kinase inhibitors in 2 different Ewing's sarcoma cell lines. This resulted in the identification of several potential molecular intervention points. Most notably, tozasertib (VX-680, MK-0457) displayed unique nanomolar efficacy, which extended to other cell lines, but was specific for Ewing's sarcoma. Furthermore, tozasertib showed strong synergies with the chemotherapeutic drugs etoposide and doxorubicin, the current standard agents for Ewing's sarcoma. To identify the relevant targets underlying the specific vulnerability toward tozasertib, we determined its cellular target profile by chemical proteomics. We identified 20 known and unknown serine/threonine and tyrosine protein kinase targets. Additional target deconvolution and functional validation by RNAi showed simultaneous inhibition of Aurora kinases A and B to be responsible for the observed tozasertib sensitivity, thereby revealing a new mechanism for targeting Ewing's sarcoma. We further corroborated our cellular observations with xenograft mouse models. In summary, the multilayered chemical biology approach presented here identified a specific vulnerability of Ewing's sarcoma to concomitant inhibition of Aurora kinases A and B by tozasertib and danusertib, which has the potential to become a new therapeutic option.

  17. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
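    The "total error" notion can be illustrated with a simple Monte Carlo sketch, distinct from the generalized pivotal quantity test itself; the bias, precision and acceptance limit below are arbitrary illustrative values.

        # Hedged sketch: Monte Carlo illustration of "total error" -- the probability
        # that a future result falls within +/- lambda of the true value for a method
        # with a given bias and precision (all values are illustrative).
        import numpy as np

        rng = np.random.default_rng(1)
        true_value, bias, sd, lam = 100.0, 1.5, 2.0, 6.0   # same units as the measurand

        results = true_value + bias + rng.normal(0.0, sd, size=100_000)
        p_within = np.mean(np.abs(results - true_value) <= lam)
        print(f"estimated P(|error| <= {lam}) = {p_within:.3f}")
        # A "fit for purpose" rule might require this probability to exceed, e.g., 0.90.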

  18. The McGill Interactive Pediatric OncoGenetic Guidelines: An approach to identifying pediatric oncology patients most likely to benefit from a genetic evaluation.

    Science.gov (United States)

    Goudie, Catherine; Coltin, Hallie; Witkowski, Leora; Mourad, Stephanie; Malkin, David; Foulkes, William D

    2017-08-01

    Identifying cancer predisposition syndromes in children with tumors is crucial, yet few clinical guidelines exist to identify children at high risk of having germline mutations. The McGill Interactive Pediatric OncoGenetic Guidelines project aims to create a validated pediatric guideline in the form of a smartphone/tablet application using algorithms to process clinical data and help determine whether to refer a child for genetic assessment. This paper discusses the initial stages of the project, focusing on its overall structure, the methodology underpinning the algorithms, and the upcoming algorithm validation process. © 2017 Wiley Periodicals, Inc.

  19. A multi-criteria decision making approach to identify a vaccine formulation.

    Science.gov (United States)

    Dewé, Walthère; Durand, Christelle; Marion, Sandie; Oostvogels, Lidia; Devaster, Jeanne-Marie; Fourneau, Marc

    2016-01-01

    This article illustrates the use of a multi-criteria decision making approach, based on desirability functions, to identify an appropriate adjuvant composition for an influenza vaccine to be used in the elderly. The proposed adjuvant system contained two main elements: monophosphoryl lipid and α-tocopherol with squalene in an oil/water emulsion. The objective was to elicit a stronger immune response while maintaining an acceptable reactogenicity and safety profile. The study design, the statistical models, the choice of the desirability functions, the computation of the overall desirability index, and the assessment of the robustness of the ranking are all detailed in this manuscript.
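    A minimal sketch of the desirability-function idea, assuming simple linear (Derringer-type) desirabilities and invented response values; it does not reproduce the study's models or criteria.

        # Hedged sketch: combine two responses into an overall desirability index.
        import numpy as np

        def desirability_larger_is_better(y, lo, hi):
            # 0 below lo, 1 above hi, linear in between
            return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

        def desirability_smaller_is_better(y, lo, hi):
            return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

        # candidate formulations: (immune response titre, reactogenicity score), invented
        formulations = {"A": (620.0, 2.8), "B": (540.0, 1.9), "C": (700.0, 3.6)}

        overall = {}
        for name, (titre, reacto) in formulations.items():
            d1 = desirability_larger_is_better(titre, lo=400.0, hi=800.0)
            d2 = desirability_smaller_is_better(reacto, lo=1.0, hi=4.0)
            overall[name] = float(np.sqrt(d1 * d2))      # geometric mean of desirabilities

        print(sorted(overall.items(), key=lambda kv: -kv[1]))   # ranked formulations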

  20. Optimizing and Validating a Brief Assessment for Identifying Children of Service Members at Risk for Psychological Health Problems Following Parent Deployment

    Science.gov (United States)

    2016-07-01

    PRINCIPAL INVESTIGATOR: Julie Wargo Aikins, PhD; CONTRACTING ORGANIZATION: Wayne State... Cited references include: Behar, L.B. (1997). The Preschool Behavior Questionnaire. Journal of Abnormal Child Psychology, 5, 265-275; Journal of Family Therapy, 21, 313-323.

  1. Application and validation of case-finding algorithms for identifying individuals with human immunodeficiency virus from administrative data in British Columbia, Canada.

    Directory of Open Access Journals (Sweden)

    Bohdan Nosyk

    Full Text Available To define a population-level cohort of individuals infected with the human immunodeficiency virus (HIV) in the province of British Columbia from available registries and administrative datasets using a validated case-finding algorithm. Individuals were identified for possible cohort inclusion from the BC Centre for Excellence in HIV/AIDS (CfE) drug treatment program (antiretroviral therapy) and laboratory testing datasets (plasma viral load (pVL) and CD4 diagnostic test results), the BC Centre for Disease Control (CDC) provincial HIV surveillance database (positive HIV tests), as well as databases held by the BC Ministry of Health (MoH): the Discharge Abstract Database (hospitalizations), the Medical Services Plan (physician billing) and PharmaNet databases (additional HIV-related medications). A validated case-finding algorithm was applied to distinguish true HIV cases from those likely to have been misclassified. The sensitivity of the algorithms was assessed as the proportion of confirmed cases (those with records in the CfE, CDC and MoH databases) positively identified by each algorithm. A priori hypotheses were generated and tested to verify excluded cases. A total of 25,673 individuals were identified as having at least one HIV-related health record. Among 9,454 unconfirmed cases, the selected case-finding algorithm identified 849 individuals believed to be HIV-positive. The sensitivity of this algorithm among confirmed cases was 88%. Those excluded from the cohort were more likely to be female (44.4% vs. 22.5%; p<0.01), had a lower mortality rate (2.18 per 100 person years (100PY) vs. 3.14/100PY; p<0.01), and had lower median rates of health service utilization (days of medications dispensed: 9745/100PY vs. 10266/100PY; p<0.01; days of inpatient care: 29/100PY vs. 98/100PY; p<0.01; physician billings: 602/100PY vs. 2,056/100PY; p<0.01). The application of validated case-finding algorithms and subsequent hypothesis testing provided a strong framework for
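    A minimal sketch of the sensitivity calculation described above, using hypothetical person IDs in place of the linked registry data:

        # Hedged sketch: algorithm sensitivity = proportion of confirmed cases flagged.
        confirmed = {"p01", "p02", "p03", "p04", "p05", "p06", "p07", "p08"}
        flagged_by_algorithm = {"p01", "p02", "p03", "p04", "p05", "p06", "p07", "p99"}

        true_positives = confirmed & flagged_by_algorithm
        sensitivity = len(true_positives) / len(confirmed)
        print(f"sensitivity = {sensitivity:.2f}")   # 7/8 = 0.88 with these toy sets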

  2. Genetic screens to identify pathogenic gene variants in the common cancer predisposition Lynch syndrome

    DEFF Research Database (Denmark)

    Drost, Mark; Lützen, Anne; van Hees, Sandrine

    2013-01-01

    In many individuals suspected of the common cancer predisposition Lynch syndrome, variants of unclear significance (VUS), rather than obviously pathogenic mutations, are identified in one of the DNA mismatch repair (MMR) genes. The uncertainty of whether such VUS inactivate MMR, and therefore … function. When a residue identified as mutated in an individual suspected of Lynch syndrome is listed as critical in such a reverse diagnosis catalog, there is a high probability that the corresponding human VUS is pathogenic. To investigate the applicability of this approach, we have generated … Nearly half of these critical residues match with VUS previously identified in individuals suspected of Lynch syndrome. This aids in the assignment of pathogenicity to these human VUS and validates the approach described here as a diagnostic tool. In a wider perspective, this work provides a model

  3. Validity as a social imperative for assessment in health professions education: a concept analysis.

    Science.gov (United States)

    Marceau, Mélanie; Gallagher, Frances; Young, Meredith; St-Onge, Christina

    2018-06-01

    Assessment can have far-reaching consequences for future health care professionals and for society. Thus, it is essential to establish the quality of assessment. Few modern approaches to validity are well situated to ensure the quality of complex assessment approaches, such as authentic and programmatic assessments. Here, we explore and delineate the concept of validity as a social imperative in the context of assessment in health professions education (HPE) as a potential framework for examining the quality of complex and programmatic assessment approaches. We conducted a concept analysis using Rodgers' evolutionary method to describe the concept of validity as a social imperative in the context of assessment in HPE. Supported by an academic librarian, we developed and executed a search strategy across several databases for literature published between 1995 and 2016. From a total of 321 citations, we identified 67 articles that met our inclusion criteria. Two team members analysed the texts using a specified approach to qualitative data analysis. Consensus was achieved through full team discussions. Attributes that characterise the concept were: (i) demonstration of the use of evidence considered credible by society to document the quality of assessment; (ii) validation embedded through the assessment process and score interpretation; (iii) documented validity evidence supporting the interpretation of the combination of assessment findings, and (iv) demonstration of a justified use of a variety of evidence (quantitative and qualitative) to document the quality of assessment strategies. The emerging concept of validity as a social imperative highlights some areas of focus in traditional validation frameworks, whereas some characteristics appear unique to HPE and move beyond traditional frameworks. The study reflects the importance of embedding consideration for society and societal concerns throughout the assessment and validation process, and may represent a

  4. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction-, classification-, time series forecasting-, modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allow insight......) is critically necessary for the inclusion of the sampling errors incurred in all 'future' situations in which the validated model must perform. Logically, therefore, all one data set re-sampling approaches for validation, especially cross-validation and leverage-corrected validation, should be terminated...

  5. Validated questionnaires heighten detection of difficult asthma comorbidities.

    Science.gov (United States)

    Radhakrishna, Naghmeh; Tay, Tunn Ren; Hore-Lacy, Fiona; Stirling, Robert; Hoy, Ryan; Dabscheck, Eli; Hew, Mark

    2017-04-01

    Multiple extra-pulmonary comorbidities contribute to difficult asthma, but their diagnosis can be challenging and time consuming. Previous data on comorbidity detection have focused on clinical assessment, which may miss certain conditions. We aimed to locate relevant validated screening questionnaires to identify extra-pulmonary comorbidities that contribute to difficult asthma, and evaluate their performance during a difficult asthma evaluation. MEDLINE was searched to identify key extra-pulmonary comorbidities that contribute to difficult asthma. Screening questionnaires were chosen based on ease of use, presence of a cut-off score, and adequate validation to help systematically identify comorbidities. In a consecutive series of 86 patients referred for systematic evaluation of difficult asthma, questionnaires were administered prior to clinical consultation. Six difficult asthma comorbidities and corresponding screening questionnaires were found: sinonasal disease (allergic rhinitis and chronic rhinosinusitis), vocal cord dysfunction, dysfunctional breathing, obstructive sleep apnea, anxiety and depression, and gastro-oesophageal reflux disease. When the questionnaires were added to the referring clinician's impression, the detection of all six comorbidities was significantly enhanced. The average time for questionnaire administration was approximately 40 minutes. The use of validated screening questionnaires heightens detection of comorbidities in difficult asthma. The availability of data from a battery of questionnaires prior to consultation can save time and allow clinicians to systematically assess difficult asthma patients and to focus on areas of particular concern. Such an approach would ensure that all contributing comorbidities have been addressed before significant treatment escalation is considered.
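    A minimal sketch of screening with validated questionnaires ahead of consultation, assuming placeholder domain names and cut-off scores (the instruments' real cut-offs are not reproduced here):

        # Hedged sketch: flag possible comorbidity domains when questionnaire scores
        # meet or exceed their cut-offs; names and thresholds are placeholders.
        cutoffs = {"sinonasal": 8, "reflux": 8, "anxiety_depression": 11, "sleep_apnea": 3}
        patient_scores = {"sinonasal": 12, "reflux": 5, "anxiety_depression": 13, "sleep_apnea": 4}

        flagged = [domain for domain, score in patient_scores.items()
                   if score >= cutoffs[domain]]
        print("screen-positive domains:", flagged)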

  6. Validation of Land Surface Temperature from Sentinel-3

    Science.gov (United States)

    Ghent, D.

    2017-12-01

    One of the main objectives of the Sentinel-3 mission is to measure sea- and land-surface temperature with high-end accuracy and reliability in support of environmental and climate monitoring in an operational context. Calibration and validation are thus key criteria for operationalization within the framework of the Sentinel-3 Mission Performance Centre (S3MPC). Land surface temperature (LST) has a long heritage of satellite observations which have facilitated our understanding of land surface and climate change processes, such as desertification, urbanization, deforestation and land/atmosphere coupling. These observations have been acquired from a variety of satellite instruments on platforms in both low-earth orbit and in geostationary orbit. Retrieval accuracy can be a challenge though; surface emissivities can be highly variable owing to the heterogeneity of the land, and atmospheric effects caused by the presence of aerosols and by water vapour absorption can give a bias to the underlying LST. As such, a rigorous validation is critical in order to assess the quality of the data and the associated uncertainties. Validation of the level-2 SL_2_LST product, which became freely available on an operational basis from 5th July 2017 builds on an established validation protocol for satellite-based LST. This set of guidelines provides a standardized framework for structuring LST validation activities. The protocol introduces a four-pronged approach which can be summarised thus: i) in situ validation where ground-based observations are available; ii) radiance-based validation over sites that are homogeneous in emissivity; iii) intercomparison with retrievals from other satellite sensors; iv) time-series analysis to identify artefacts on an interannual time-scale. This multi-dimensional approach is a necessary requirement for assessing the performance of the LST algorithm for the Sea and Land Surface Temperature Radiometer (SLSTR) which is designed around biome

  7. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    Science.gov (United States)

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches to model validation and the thought processes displayed during the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  8. Automated ensemble assembly and validation of microbial genomes

    Science.gov (United States)

    2014-01-01

    Background The continued democratization of DNA sequencing has sparked a new wave of development of genome assembly and assembly validation methods. As individual research labs, rather than centralized centers, begin to sequence the majority of new genomes, it is important to establish best practices for genome assembly. However, recent evaluations such as GAGE and the Assemblathon have concluded that there is no single best approach to genome assembly. Instead, it is preferable to generate multiple assemblies and validate them to determine which is most useful for the desired analysis; this is a labor-intensive process that is often impossible or unfeasible. Results To encourage best practices supported by the community, we present iMetAMOS, an automated ensemble assembly pipeline; iMetAMOS encapsulates the process of running, validating, and selecting a single assembly from multiple assemblies. iMetAMOS packages several leading open-source tools into a single binary that automates parameter selection and execution of multiple assemblers, scores the resulting assemblies based on multiple validation metrics, and annotates the assemblies for genes and contaminants. We demonstrate the utility of the ensemble process on 225 previously unassembled Mycobacterium tuberculosis genomes as well as a Rhodobacter sphaeroides benchmark dataset. On these real data, iMetAMOS reliably produces validated assemblies and identifies potential contamination without user intervention. In addition, intelligent parameter selection produces assemblies of R. sphaeroides comparable to or exceeding the quality of those from the GAGE-B evaluation, affecting the relative ranking of some assemblers. Conclusions Ensemble assembly with iMetAMOS provides users with multiple, validated assemblies for each genome. Although computationally limited to small or mid-sized genomes, this approach is the most effective and reproducible means for generating high-quality assemblies and enables users to

  9. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    Science.gov (United States)

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
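    Stage 2 of the IPPC approach rests on a Mahalanobis distance between an individual's activation peak and the distribution of control-group peaks; the sketch below shows that computation on made-up peak coordinates, not the published pipeline.

        # Hedged sketch: Mahalanobis distance of one individual's peak from the
        # control-group peak distribution (coordinates are invented MNI-like triples).
        import numpy as np
        from scipy.spatial.distance import mahalanobis

        control_peaks = np.array([[-44, -60, -8], [-46, -58, -10], [-42, -62, -6],
                                  [-45, -61, -9], [-43, -59, -7], [-44, -57, -11]])
        patient_peak = np.array([-30, -70, 2])

        mu = control_peaks.mean(axis=0)
        cov = np.cov(control_peaks, rowvar=False)
        d = mahalanobis(patient_peak, mu, np.linalg.inv(cov))
        print(f"Mahalanobis distance = {d:.2f}")   # large values suggest an atypical peak location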

  10. Analysis of market competitive structure: a new methodological approach based on usage

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

    This paper proposes a new methodological approach to identify market competitive structure, applying the usage-situation concept in positioning analysis. Dimensions used by consumers to classify products are identified using Correspondence Analysis, and competitive groups are formed. Results are validated with Discriminant Analysis. (Author) 23 refs

  11. How well do discharge diagnoses identify hospitalised patients with community-acquired infections? - a validation study

    DEFF Research Database (Denmark)

    Henriksen, Daniel Pilsgaard; Nielsen, Stig Lønberg; Laursen, Christian Borbjerg

    2014-01-01

    BACKGROUND: Credible measures of disease incidence, trends and mortality can be obtained through surveillance using manual chart review, but this is both time-consuming and expensive. ICD-10 discharge diagnoses are used as surrogate markers of infection, but knowledge on the validity of infections in general is sparse. The aim of the study was to determine how well ICD-10 discharge diagnoses identify patients with community-acquired infections in a medical emergency department (ED), overall and related to sites of infection and patient characteristics. METHODS: We manually reviewed 5977 patients … RESULTS: The sensitivity of the ICD-10 diagnoses was 79.9% (95%CI: 78.1-81.3%), specificity 83.9% (95%CI: 82.6-85.1%), positive likelihood ratio 4.95 (95%CI: 4.58-5.36) and negative likelihood ratio 0.24 (95%CI: 0.22-0.26). The two most common sites of infection, the lower respiratory tract and urinary tract, had positive likelihood…
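    The reported accuracy measures follow directly from a 2x2 table of coded diagnosis versus chart-review reference standard; the sketch below uses invented counts (not the study's data) to show the computation.

        # Hedged sketch: sensitivity, specificity and likelihood ratios from a 2x2 table.
        tp, fn = 1200, 300     # infection on chart review: coded / not coded as infection
        fp, tn = 700, 3777     # no infection on chart review: coded / not coded

        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        lr_pos = sens / (1 - spec)
        lr_neg = (1 - sens) / spec
        print(f"sensitivity={sens:.3f} specificity={spec:.3f} "
              f"LR+={lr_pos:.2f} LR-={lr_neg:.2f}")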

  12. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram), and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description included composition ranges typical for coating alloys and, hence, allowed for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fraction, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.

  13. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. The primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases from concept to operation and maintenance. Each phase has specific V and V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle

  14. Relative validity and reproducibility of a food frequency questionnaire for identifying the dietary patterns of toddlers in New Zealand.

    Science.gov (United States)

    Mills, Virginia C; Skidmore, Paula M L; Watson, Emily O; Taylor, Rachael W; Fleming, Elizabeth A; Heath, Anne-Louise M

    2015-04-01

    Dietary patterns provide insight into relationships between diet and disease. Food frequency questionnaires (FFQs) can identify dietary patterns in adults, but similar analyses have not been performed for toddlers. The aim of the Eating Assessment in Toddlers study was to evaluate the relative validity and reproducibility of dietary patterns from an FFQ developed for toddlers aged 12 to 24 months. Participants were 160 toddlers aged 12 to 24 months and their primary caregiver who completed an FFQ twice, approximately 5 weeks apart (FFQ1 and FFQ2). A 5-day weighed food record was collected on nonconsecutive days between FFQ administrations. Principal component analysis identified three major dietary patterns similar across FFQ1, FFQ2, and the 5-day weighed food record. The sweet foods and fries pattern was characterized by high intakes of sweet foods, fries and roast potato and kumara (sweet potato), butter and margarines, processed meat, sweet drinks, and fruit or milk drinks. The vegetables and meat pattern was characterized by high intakes of vegetables, meat, eggs and beans, and fruit. The milk and fruit pattern was characterized by high intakes of milk and milk products and fruit, and low intakes of breastmilk and infant and follow-up formula. The FFQ (FFQ1) correctly classified 43.1% to 51.0% of toddlers into the same quartile of pattern score as the 5-day weighed food record, and Pearson correlations ranged from 0.56 to 0.68 for the three patterns. Reliability coefficients ranged from 0.71 to 0.72 for all three dietary patterns. The Eating Assessment in Toddlers study FFQ shows acceptable relative validity and high reproducibility for identifying dietary patterns in toddlers. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
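    A minimal sketch of deriving pattern scores by principal component analysis and cross-classifying FFQ against food-record quartiles, on synthetic intake data (the food groups and values are not the study's):

        # Hedged sketch: PCA-based dietary pattern scores and quartile agreement.
        import numpy as np
        import pandas as pd
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(2)
        n = 160
        food_groups = ["sweet_foods", "fries", "vegetables", "meat", "milk", "fruit"]
        ffq = pd.DataFrame(rng.gamma(2.0, 1.0, size=(n, len(food_groups))), columns=food_groups)
        record = ffq + rng.normal(0, 0.5, size=ffq.shape)        # correlated "food record"

        score_ffq = PCA(n_components=1).fit_transform(ffq)[:, 0]
        score_rec = PCA(n_components=1).fit_transform(record)[:, 0]
        if np.corrcoef(score_ffq, score_rec)[0, 1] < 0:          # PCA sign is arbitrary; align
            score_rec = -score_rec

        r = np.corrcoef(score_ffq, score_rec)[0, 1]
        q_ffq = pd.qcut(score_ffq, 4, labels=False)
        q_rec = pd.qcut(score_rec, 4, labels=False)
        agreement = np.mean(q_ffq == q_rec)
        print(f"same-quartile agreement = {agreement:.1%}, Pearson r = {r:.2f}")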

  15. Post-partum depression in Kinshasa, Democratic Republic of Congo: validation of a concept using a mixed-methods cross-cultural approach.

    Science.gov (United States)

    Bass, Judith K; Ryder, Robert W; Lammers, Marie-Christine; Mukaba, Thibaut N; Bolton, Paul A

    2008-12-01

    To determine if a post-partum depression syndrome exists among mothers in Kinshasa, Democratic Republic of Congo, by adapting and validating standard screening instruments. Using qualitative interviewing techniques, we interviewed a convenience sample of 80 women living in a large peri-urban community to better understand local conceptions of mental illness. We used this information to adapt two standard depression screeners, the Edinburgh Post-partum Depression Scale and the Hopkins Symptom Checklist. In a subsequent quantitative study, we identified another 133 women with and without the local depression syndrome and used this information to validate the adapted screening instruments. Based on the qualitative data, we found a local syndrome that closely approximates the Western model of major depressive disorder. The women we interviewed, representative of the local populace, considered this an important syndrome among new mothers because it negatively affects women and their young children. Women (n = 41) identified as suffering from this syndrome had statistically significantly higher depression severity scores on both adapted screeners than women identified as not having this syndrome (n = 20; P … depression and validated instruments to screen for this disorder. As the importance of compromised mental health in developing world populations becomes recognized, the methods described in this report will be useful more widely.

  16. On the validity of the incremental approach to estimate the impact of cities on air quality

    Science.gov (United States)

    Thunis, Philippe

    2018-01-01

    The question of how much cities are the sources of their own air pollution is not only theoretical as it is critical to the design of effective strategies for urban air quality planning. In this work, we assess the validity of the commonly used incremental approach to estimate the likely impact of cities on their air pollution. With the incremental approach, the city impact (i.e. the concentration change generated by the city emissions) is estimated as the concentration difference between a rural background and an urban background location, also known as the urban increment. We show that the city impact is in reality made up of the urban increment and two additional components and consequently two assumptions need to be fulfilled for the urban increment to be representative of the urban impact. The first assumption is that the rural background location is not influenced by emissions from within the city whereas the second requires that background concentration levels, obtained with zero city emissions, are equal at both locations. Because the urban impact is not measurable, the SHERPA modelling approach, based on a full air quality modelling system, is used in this work to assess the validity of these assumptions for some European cities. Results indicate that for PM2.5, these two assumptions are far from being fulfilled for many large or medium city sizes. For this type of cities, urban increments are largely underestimating city impacts. Although results are in better agreement for NO2, similar issues are met. In many situations the incremental approach is therefore not an adequate estimate of the urban impact on air pollution. This poses issues in terms of interpretation when these increments are used to define strategic options in terms of air quality planning. We finally illustrate the interest of comparing modelled and measured increments to improve our confidence in the model results.

  17. Clustering approaches to identifying gene expression patterns from DNA microarray data.

    Science.gov (United States)

    Do, Jin Hwan; Choi, Dong-Kug

    2008-04-30

    Effective analysis methods are essential for the large amounts of gene expression data generated by microarray experiments. In this review we focus on clustering techniques. The biological rationale for this approach is the fact that many co-expressed genes are co-regulated, and identifying co-expressed genes could aid in functional annotation of novel genes, de novo identification of transcription factor binding sites and elucidation of complex biological pathways. Co-expressed genes are usually identified in microarray experiments by clustering techniques. There are many such methods, and the results obtained even for the same datasets may vary considerably depending on the algorithms and metrics for dissimilarity measures used, as well as on user-selectable parameters such as desired number of clusters and initial values. Therefore, biologists who want to interpret microarray data should be aware of the weaknesses and strengths of the clustering methods used. In this review, we survey the basic principles of clustering of DNA microarray data from crisp clustering algorithms such as hierarchical clustering, K-means and self-organizing maps, to complex clustering algorithms like fuzzy clustering.
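    A minimal sketch of one of the crisp clustering approaches surveyed here (K-means) applied to a small synthetic gene-by-sample expression matrix:

        # Hedged sketch: K-means clustering of a synthetic expression matrix
        # (genes x samples); two co-expression groups are planted and recovered.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        profile_a = rng.normal(2.0, 0.3, size=(50, 10))    # 50 up-regulated genes
        profile_b = rng.normal(-1.0, 0.3, size=(50, 10))   # 50 down-regulated genes
        expression = np.vstack([profile_a, profile_b])

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(expression)
        for k in range(2):
            print(f"cluster {k}: {np.sum(labels == k)} genes")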

  18. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping rate operation, with test results over-predicted within a reasonable margin of less than a factor of two. Some limitations of the methodology have also been identified for sources operating at a fixed rate or coming with a small number of dominant harmonics and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  19. Can Medicaid Claims Validly Ascertain Foster Care Status?

    Science.gov (United States)

    Raghavan, Ramesh; Brown, Derek S; Allaire, Benjamin T

    2017-08-01

    Medicaid claims have been used to identify populations of children in foster care in the current literature; however, the ability of such an approach to validly ascertain a foster care population is unknown. This study linked children in the National Survey of Child and Adolescent Well-Being-I to their Medicaid claims from 36 states using their Social Security numbers. Using this match, we examined discordance between caregiver report of foster care placement and the foster care eligibility code contained in the child's Medicaid claims. Only 73% of youth placed in foster care for at least a year displayed a Medicaid code for foster care eligibility. Half of all youth coming into contact with child welfare displayed discordance between caregiver report and Medicaid claims. Children with emergency department utilization, and those in primary care case management health insurance arrangements, had the highest odds of accurate ascertainment. The use of Medicaid claims to identify a cohort of children in foster care results in high rates of underascertainment. Supplementing administrative data with survey data is one way to enhance validity of ascertainment.

  20. Measuring multi-joint stiffness during single movements: numerical validation of a novel time-frequency approach.

    Science.gov (United States)

    Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R

    2012-01-01

    This study presents and validates a Time-Frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates and affected by different noise levels. Our method obtains results comparable with two well-known regressive techniques. We also test how the technique can identify the viscoelastic component of non-linear and higher than second order systems with a non-parametrical approach. The technique proposed here is very impervious to noise and can be used easily for both postural and movement tasks. Estimations of stiffness profiles are possible with only one perturbation, making our method a useful tool for estimating limb stiffness during motor learning and adaptation tasks, and for understanding the modulation of stiffness in individuals with neurodegenerative diseases.
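    A minimal sketch of the underlying idea: track the dominant post-perturbation oscillation frequency over time with a spectrogram and convert it to a stiffness profile via k = m(2*pi*f)^2. This uses a plain (not reassigned) spectrogram and synthetic data, so it is only a loose analogue of the published estimator.

        # Hedged sketch: time-varying dominant frequency of a damped oscillation,
        # converted to a stiffness estimate for an assumed inertia m.
        import numpy as np
        from scipy.signal import spectrogram

        fs, m = 1000.0, 2.0                       # sampling rate [Hz], assumed inertia
        t = np.arange(0, 2.0, 1 / fs)
        f_true = 4.0                              # oscillation frequency after the pulse [Hz]
        response = np.exp(-2.0 * t) * np.sin(2 * np.pi * f_true * t)

        f, tt, Sxx = spectrogram(response, fs=fs, nperseg=512, noverlap=448)
        f_peak = f[np.argmax(Sxx, axis=0)]        # dominant frequency in each time bin
        k_est = m * (2 * np.pi * f_peak) ** 2     # stiffness profile over time
        print(np.round(f_peak[:5], 2), np.round(k_est[:5], 1))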

  2. The Baby TALK Model: An Innovative Approach to Identifying High-Risk Children and Families

    Science.gov (United States)

    Villalpando, Aimee Hilado; Leow, Christine; Hornstein, John

    2012-01-01

    This research report examines the Baby TALK model, an innovative early childhood intervention approach used to identify, recruit, and serve young children who are at-risk for developmental delays, mental health needs, and/or school failure, and their families. The report begins with a description of the model. This description is followed by an…

  3. Validity of the Male Depression Risk Scale in a representative Canadian sample: sensitivity and specificity in identifying men with recent suicide attempt.

    Science.gov (United States)

    Rice, Simon M; Ogrodniczuk, John S; Kealy, David; Seidler, Zac E; Dhillon, Haryana M; Oliffe, John L

    2017-12-22

    Clinical practice and the literature have supported the existence of a phenotypic sub-type of depression in men. While a number of self-report rating scales have been developed in order to empirically test the male depression construct, psychometric validation of these scales is limited. To confirm the psychometric properties of the multidimensional Male Depression Risk Scale (MDRS-22) and to develop clinical cut-off scores for the MDRS-22. Data were obtained from an online sample of 1000 Canadian men (median age (M) = 49.63, standard deviation (SD) = 14.60). Confirmatory factor analysis (CFA) was used to replicate the established six-factor model of the MDRS-22. Psychometric values of the MDRS subscales were comparable to the widely used Patient Health Questionnaire-9. CFA model fit indices indicated adequate model fit for the six-factor MDRS-22 model. ROC curve analysis indicated the MDRS-22 was effective for identifying those with a recent (previous four weeks) suicide attempt (area under curve (AUC) value = 0.837). The MDRS-22 cut-off identified proportionally more (84.62%) cases of recent suicide attempt relative to the PHQ-9 moderate range (53.85%). The MDRS-22 is the first male-sensitive depression scale to be psychometrically validated using CFA techniques in independent and cross-national samples. Additional studies should identify differential item functioning and evaluate cross-cultural effects.
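    A minimal sketch of the ROC analysis described above, with a cut-off chosen by Youden's J on synthetic questionnaire scores (not MDRS-22 data):

        # Hedged sketch: AUC and a Youden-optimal cut-off for a screening score.
        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(4)
        y = np.r_[np.zeros(950, dtype=int), np.ones(50, dtype=int)]     # 5% recent attempt
        scores = np.r_[rng.normal(8, 4, 950), rng.normal(15, 4, 50)]    # questionnaire totals

        auc = roc_auc_score(y, scores)
        fpr, tpr, thresholds = roc_curve(y, scores)
        best = np.argmax(tpr - fpr)                                     # Youden's J
        print(f"AUC = {auc:.3f}; suggested cut-off = {thresholds[best]:.1f} "
              f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")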

  4. Semi-automated knowledge discovery: identifying and profiling human trafficking

    Science.gov (United States)

    Poelmans, Jonas; Elzinga, Paul; Ignatov, Dmitry I.; Kuznetsov, Sergei O.

    2012-11-01

    We propose an iterative and human-centred knowledge discovery methodology based on formal concept analysis. The proposed approach recognizes the important role of the domain expert in mining real-world enterprise applications and makes use of specific domain knowledge, including human intelligence and domain-specific constraints. Our approach was empirically validated at the Amsterdam-Amstelland police to identify suspects and victims of human trafficking in 266,157 suspicious activity reports. Based on guidelines of the Attorney Generals of the Netherlands, we first defined multiple early warning indicators that were used to index the police reports. Using concept lattices, we revealed numerous previously unknown human trafficking and loverboy suspects. In-depth investigation by the police confirmed their involvement in illegal activities and resulted in actual arrests. Our human-centred approach was embedded into operational policing practice and is now successfully used on a daily basis to cope with the vastly growing amount of unstructured information.

  5. Using a distribution and conservation status weighted hotspot approach to identify areas in need of conservation action to benefit Idaho bird species

    Science.gov (United States)

    Haines, Aaron M.; Leu, Matthias; Svancara, Leona K.; Wilson, Gina; Scott, J. Michael

    2010-01-01

    Identification of biodiversity hotspots (hereafter, hotspots) has become a common strategy to delineate important areas for wildlife conservation. However, the use of hotspots has not often incorporated important habitat types, ecosystem services, anthropogenic activity, or consistency in identifying important conservation areas. The purpose of this study was to identify hotspots to improve avian conservation efforts for Species of Greatest Conservation Need (SGCN) in the state of Idaho, United States. We evaluated multiple approaches to define hotspots and used a unique approach based on weighting species by their distribution size and conservation status to identify hotspot areas. All hotspot approaches identified bodies of water (Bear Lake, Grays Lake, and American Falls Reservoir) as important hotspots for Idaho avian SGCN, but we found that the weighted approach produced more congruent hotspot areas when compared to other hotspot approaches. To incorporate anthropogenic activity into hotspot analysis, we grouped species based on their sensitivity to specific human threats (i.e., urban development, agriculture, fire suppression, grazing, roads, and logging) and identified ecological sections within Idaho that may require specific conservation actions to address these human threats using the weighted approach. The Snake River Basalts and Overthrust Mountains ecological sections were important areas for potential implementation of conservation actions to conserve biodiversity. Our approach to identifying hotspots may be useful as part of a larger conservation strategy to aid land managers or local governments in applying conservation actions on the ground.
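
    A minimal sketch (not the authors' implementation) of the weighting idea described above: each species contributes to a grid cell's hotspot score in proportion to its conservation-status rank and in inverse proportion to its distribution size. The presence matrix, status ranks and the top-10% hotspot threshold are assumptions for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical inputs: presence.iloc[i, j] = 1 if species j occurs in grid cell i
presence = pd.DataFrame(np.random.default_rng(1).integers(0, 2, (100, 5)),
                        columns=["sp_a", "sp_b", "sp_c", "sp_d", "sp_e"])
range_cells = presence.sum(axis=0)                 # distribution size (cells occupied)
status_rank = pd.Series({"sp_a": 3, "sp_b": 1, "sp_c": 2, "sp_d": 3, "sp_e": 1})

# Weight: smaller ranges and higher conservation concern contribute more
weights = status_rank / range_cells
cell_score = presence.mul(weights, axis=1).sum(axis=1)

# Hotspots: e.g. the top 10% of weighted cell scores
hotspots = cell_score[cell_score >= cell_score.quantile(0.90)].index
print(len(hotspots), "candidate hotspot cells")
```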

  6. A comprehensive approach to identify reliable reference gene candidates to investigate the link between alcoholism and endocrinology in Sprague-Dawley rats.

    Directory of Open Access Journals (Sweden)

    Faten A Taki

    Full Text Available Gender and hormonal differences are often correlated with alcohol dependence and related complications like addiction and breast cancer. Estrogen (E2) is an important sex hormone because it serves as a key signaling molecule in organism-level signaling pathways. Alcoholism has been reported to affect estrogen receptor signaling; however, identifying the players involved in such a multi-faceted syndrome is complex and requires an interdisciplinary approach. In many situations, preliminary investigations include straightforward yet informative biotechniques such as gene expression analyses using quantitative real-time PCR (qRT-PCR). The validity of qRT-PCR-based conclusions is affected by the choice of reliable internal controls. With this in mind, we compiled a list of 15 commonly used housekeeping genes (HKGs) as potential reference gene candidates in rat biological models. A comprehensive comparison among 5 statistical approaches (geNorm, dCt method, NormFinder, BestKeeper, and RefFinder) was performed to identify the minimal number as well as the most stable reference genes required for reliable normalization in experimental rat groups that comprised sham-operated (SO) rats and ovariectomized rats in the absence (OVX) or presence of E2 (OVXE2). These rat groups were subdivided into subgroups that received alcohol in a liquid diet or an isocaloric control liquid diet for 12 weeks. Our results showed that U87, 5S rRNA, GAPDH, and U5a were the most reliable candidates for reference genes in heart and brain tissue. However, a different gene stability ranking was specific to each tissue-input combination. The present preliminary findings highlight the variability in reference gene rankings across different experimental conditions and analytic methods and constitute a fundamental step for gene expression assays.
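
    A hedged sketch of one of the stability measures mentioned above: a geNorm-style stability value M, the average standard deviation of a candidate gene's pairwise log-expression ratios against the other candidates (lower M means more stable). The Ct matrix below is synthetic, and this is a simplified stand-in for the geNorm/NormFinder/BestKeeper/RefFinder comparison performed in the study.

```python
import numpy as np
import pandas as pd

# Hypothetical Cq/Ct values: rows = samples, columns = candidate reference genes
rng = np.random.default_rng(2)
ct = pd.DataFrame(rng.normal(22, 1.0, (24, 4)),
                  columns=["U87", "5S_rRNA", "GAPDH", "U5a"])

def genorm_m(ct):
    """geNorm-style stability M: for each gene, the average standard deviation
    of its log expression ratio against every other candidate (lower = more stable)."""
    expr = -ct                     # relative log2 expression is proportional to -Cq
    m = {}
    for g in expr.columns:
        sds = [(expr[g] - expr[h]).std() for h in expr.columns if h != g]
        m[g] = float(np.mean(sds))
    return pd.Series(m).sort_values()

print(genorm_m(ct))                # most stable candidates first
```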

  7. Network Security Validation Using Game Theory

    Science.gov (United States)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFRs) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validating these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity of the distributed systems that depend on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented along with an example that demonstrates the application of the approach.
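
    A toy illustration of the game-theoretic idea (not the paper's worked example): model the interaction as a matrix game between a defender choosing a security requirement set and an attacker choosing an attack type, and check whether the defender's worst-case (maximin) security level meets the required minimum. The payoff values below are assumptions.

```python
import numpy as np

# Hypothetical payoff matrix: rows = defender configurations, columns = attack types.
# Entries are the network "security utility" to the defender under each pairing.
payoff = np.array([
    [0.9, 0.4, 0.6],   # firewall hardening
    [0.5, 0.8, 0.7],   # intrusion detection tuning
    [0.6, 0.5, 0.9],   # patching policy
])

# Defender's guaranteed security level against a worst-case attacker (pure maximin)
worst_case = payoff.min(axis=1)
best_row = int(worst_case.argmax())
print(f"requirement set {best_row} guarantees security level {worst_case[best_row]:.2f}")
# If this guaranteed level falls below the required minimum, the NFR is not validated.
```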

  8. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach

    Science.gov (United States)

    Petersen, D.; Naveed, P.; Ragheb, A.; Niedieker, D.; El-Mashtoly, S. F.; Brechmann, T.; Kötting, C.; Schmiegel, W. H.; Freier, E.; Pox, C.; Gerwert, K.

    2017-06-01

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and thereby in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of misplacement of the sample and spectral dilution of marker bands (especially in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods, such as leave-one-sample-out and leave-one-spectrum-out cross-validation schemes, were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally, the applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.
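
    A minimal sketch of the leave-one-patient-out scheme described above (not the study's actual classifier or spectra): all spectra from one patient are held out together so that patient-level information cannot leak between training and test folds. The data, labels and classifier choice are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(600, 50))       # stand-in Raman spectra: 600 spectra x 50 bins
y = rng.integers(0, 2, 600)          # 0 = low-risk lesion, 1 = high-risk lesion
patient = rng.integers(0, 40, 600)   # patient ID for each spectrum

# All spectra of one patient are held out together, preventing patient-level leakage
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, groups=patient, cv=LeaveOneGroupOut())
print(f"leave-one-patient-out accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```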

  9. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach.

    Science.gov (United States)

    Petersen, D; Naveed, P; Ragheb, A; Niedieker, D; El-Mashtoly, S F; Brechmann, T; Kötting, C; Schmiegel, W H; Freier, E; Pox, C; Gerwert, K

    2017-06-15

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and thereby in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of misplacement of the sample and spectral dilution of marker bands (especially in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods, such as leave-one-sample-out and leave-one-spectrum-out cross-validation schemes, were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%, and cancer from normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally, the applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.

  10. Calibrating and Validating a Simulation Model to Identify Drivers of Urban Land Cover Change in the Baltimore, MD Metropolitan Region

    Directory of Open Access Journals (Sweden)

    Claire Jantz

    2014-09-01

    Full Text Available We build upon much of the accumulated knowledge of the widely used SLEUTH urban land change model and offer advances. First, we use SLEUTH’s exclusion/attraction layer to identify and test different urban land cover change drivers; second, we leverage SLEUTH’s self-modification capability to incorporate a demographic model; and third, we develop a validation procedure to quantify the influence of land cover change drivers and assess uncertainty. We found that, contrary to our a priori expectations, new development is not attracted to areas serviced by existing or planned water and sewer infrastructure. However, information about where population and employment growth is likely to occur did improve model performance. These findings point to the dominant role of centrifugal forces in post-industrial cities like Baltimore, MD. We successfully developed a demographic model that allowed us to constrain the SLEUTH model forecasts and address uncertainty related to the dynamic relationship between changes in population and employment and urban land use. Finally, we emphasize the importance of model validation. In this work the validation procedure played a key role in rigorously assessing the impacts of different exclusion/attraction layers and in assessing uncertainty related to population and employment forecasts.

  11. A Generic Approach for Inversion of Surface Reflectance over Land: Overview, Application and Validation Using MODIS and LANDSAT8 Data

    Science.gov (United States)

    Vermote, E.; Roger, J. C.; Justice, C. O.; Franch, B.; Claverie, M.

    2016-01-01

    This paper presents a generic approach developed to derive surface reflectance over land from a variety of sensors. This technique builds on the extensive dataset acquired by the Terra platform by combining MODIS and MISR to derive an explicit and dynamic map of band ratios between the blue and red channels, and is a refinement of the operational approach used for MODIS and LANDSAT over the past 15 years. We present the generic approach, its application to MODIS and LANDSAT data, and its validation using the AERONET data.

  12. Cross validation of two partitioning-based sampling approaches in mesocosms containing PCB contaminated field sediment, biota, and activated carbon amendment

    DEFF Research Database (Denmark)

    Nørgaard Schmidt, Stine; Wang, Alice P.; Gidley, Philip T

    2017-01-01

    …with multiple thicknesses of silicone and in situ pre-equilibrium sampling with low-density polyethylene (LDPE) loaded with performance reference compounds were applied independently to measure polychlorinated biphenyls (PCBs) in mesocosms with (1) New Bedford Harbor sediment (MA, USA), (2) sediment and biota…, and (3) activated carbon amended sediment and biota. The aim was to cross-validate the two different sampling approaches. Around 100 PCB congeners were quantified in the two sampling polymers, and the results confirmed the good precision of both methods and were in overall good agreement with recently… published silicone-to-LDPE partition ratios. Further, the methods yielded Cfree in good agreement for all three experiments. The average ratio between Cfree determined by the two methods was a factor of 1.4±0.3 (range: 0.6-2.0), and the results thus cross-validated the two sampling approaches. For future…

  13. Xtalk: a path-based approach for identifying crosstalk between signaling pathways

    Science.gov (United States)

    Tegge, Allison N.; Sharp, Nicholas; Murali, T. M.

    2016-01-01

    Motivation: Cells communicate with their environment via signal transduction pathways. On occasion, the activation of one pathway can produce an effect downstream of another pathway, a phenomenon known as crosstalk. Existing computational methods to discover such pathway pairs rely on simple overlap statistics. Results: We present Xtalk, a path-based approach for identifying pairs of pathways that may crosstalk. Xtalk computes the statistical significance of the average length of multiple short paths that connect receptors in one pathway to the transcription factors in another. By design, Xtalk reports the precise interactions and mechanisms that support the identified crosstalk. We applied Xtalk to signaling pathways in the KEGG and NCI-PID databases. We manually curated a gold standard set of 132 crosstalking pathway pairs and a set of 140 pairs that did not crosstalk, for which Xtalk achieved an area under the receiver operator characteristic curve of 0.65, a 12% improvement over the closest competing approach. The area under the receiver operator characteristic curve varied with the pathway, suggesting that crosstalk should be evaluated on a pathway-by-pathway level. We also analyzed an extended set of 658 pathway pairs in KEGG and a set of more than 7000 pathway pairs in NCI-PID. For the top-ranking pairs, we found substantial support in the literature (81% for KEGG and 78% for NCI-PID). We provide examples of networks computed by Xtalk that accurately recovered known mechanisms of crosstalk. Availability and implementation: The XTALK software is available at http://bioinformatics.cs.vt.edu/~murali/software. Crosstalk networks are available at http://graphspace.org/graphs?tags=2015-bioinformatics-xtalk. Contact: ategge@vt.edu, murali@cs.vt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26400040
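
    The core statistic described above can be illustrated with a simplified stand-in (not the Xtalk implementation): compute the mean shortest-path length from the receptors of one pathway to the transcription factors of another on an interaction network, and assess significance with a permutation test over random node sets of the same size. The toy network and node sets below are assumptions.

```python
import networkx as nx
import numpy as np

def mean_path_length(G, sources, targets):
    lengths = []
    for s in sources:
        d = nx.single_source_shortest_path_length(G, s)
        lengths += [d[t] for t in targets if t in d]
    return np.mean(lengths) if lengths else np.inf

def crosstalk_pvalue(G, receptors_a, tfs_b, n_perm=1000, seed=0):
    """P(random node sets of the same size are at least as close) -- a simple
    permutation stand-in for a path-length significance test."""
    rng = np.random.default_rng(seed)
    observed = mean_path_length(G, receptors_a, tfs_b)
    nodes = list(G.nodes())
    null = []
    for _ in range(n_perm):
        r = rng.choice(nodes, size=len(receptors_a), replace=False)
        t = rng.choice(nodes, size=len(tfs_b), replace=False)
        null.append(mean_path_length(G, r, t))
    return observed, float(np.mean(np.array(null) <= observed))

# Toy interactome with hypothetical receptor and transcription-factor node sets
G = nx.erdos_renyi_graph(200, 0.03, seed=1)
obs, p = crosstalk_pvalue(G, receptors_a=[0, 1, 2], tfs_b=[50, 51])
print(f"mean path length = {obs:.2f}, permutation p = {p:.3f}")
```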

  14. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of…

  15. MicroRNA Expression Profiling to Identify and Validate Reference Genes for the Relative Quantification of microRNA in Rectal Cancer

    DEFF Research Database (Denmark)

    Eriksen, Anne Haahr Mellergaard; Andersen, Rikke Fredslund; Pallisgaard, Niels

    2016-01-01

    … management. Real-time quantitative polymerase chain reaction (RT-qPCR) is commonly used when measuring miRNA expression. Appropriate normalisation of RT-qPCR data is important to ensure reliable results. The aim of the present study was to identify stably expressed miRNAs applicable as normaliser candidates … in future studies of miRNA expression in rectal cancer. MATERIALS AND METHODS: We performed high-throughput miRNA profiling (OpenArray®) on ten pairs of laser micro-dissected rectal cancer tissue and adjacent stroma. A global mean expression normalisation strategy was applied to identify the most stably … In the miRNA profiling experiment, miR-645, miR-193a-5p, miR-27a and let-7g were identified as stably expressed, both in malignant and stromal tissue. In addition, NormFinder confirmed high expression stability for the four miRNAs. In the RT-qPCR based validation experiments, no significant difference …

  16. [Diagnostic validity of attention deficit/hyperactivity disorder: from phenomenology to neurobiology (I)].

    Science.gov (United States)

    Trujillo-Orrego, N; Pineda, D A; Uribe, L H

    2012-03-01

    The diagnostic criteria for attention deficit/hyperactivity disorder (ADHD) were defined by the American Psychiatric Association in the fourth version of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) and by the World Health Organization in the ICD-10. The American Psychiatric Association used an internal validity analysis to select specific behavioral symptoms associated with the disorder and to build five cross-cultural criteria for its use in the categorical diagnosis. The DSM has been utilized by clinicians and researchers as a valid and stable approach since 1968. We did a systematic review of the scientific literature in Spanish and English, aimed at identifying the historical origins that support ADHD as a psychiatric construct. This comprehensive review explores the concepts of minimal brain dysfunction, hyperactivity, inattention, and impulsivity from 1932 to 2011. This paper summarizes all the DSM versions that include the definition of ADHD or its equivalent, and points out the statistical and methodological approaches implemented for defining ADHD as a valid epidemiological and psychometric construct. Finally, the paper discusses considerations and suggestions for new versions of the manual.

  17. An integrated approach for signal validation in nuclear power plants

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Kerlin, T.W.; Gloeckler, O.; Frei, Z.; Qualls, L.; Morgenstern, V.

    1987-08-01

    A signal validation system, based on several parallel signal processing modules, is being developed at the University of Tennessee. The major modules perform (1) general consistency checking (GCC) of a set of redundant measurements, (2) multivariate data-driven modeling of dynamic signal components for maloperation detection, (3) process empirical modeling for prediction and redundancy generation, (4) jump, pulse, noise detection, and (5) an expert system for qualitative signal validation. A central database stores information related to sensors, diagnostics rules, past system performance, subsystem models, etc. We are primarily concerned with signal validation during steady-state operation and slow degradations. In general, the different modules will perform signal validation during all operating conditions. The techniques have been successfully tested using PWR steam generator simulation, and efforts are currently underway in applying the techniques to Millstone-III operational data. These methods could be implemented in advanced reactors, including advanced liquid metal reactors
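
    As a hedged illustration of the general consistency checking (GCC) of redundant measurements mentioned above (the actual module is more elaborate), the sketch below flags a redundant channel whose deviation from the set median exceeds a tolerance and averages the remaining channels into a validated estimate. The tolerance and readings are assumptions.

```python
import numpy as np

def consistency_check(readings, tol):
    """Flag redundant channels that deviate from the set median by more than tol,
    and return a validated estimate from the remaining (consistent) channels."""
    readings = np.asarray(readings, dtype=float)
    deviation = np.abs(readings - np.median(readings))
    consistent = deviation <= tol
    validated = readings[consistent].mean() if consistent.any() else np.nan
    return validated, np.where(~consistent)[0]

# Four redundant steam generator level channels; channel 2 is drifting
value, suspects = consistency_check([51.2, 50.8, 47.1, 51.0], tol=1.5)
print(f"validated level = {value:.2f} %, suspect channels: {suspects.tolist()}")
```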

  18. Comparative Validation of Building Simulation Software

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    The scope of this subtask is to perform a comparative validation of building simulation software for buildings with a double skin façade. The outline of the results of the comparative validation identifies the areas where no correspondence is achieved, i.e. calculation of the air flow rate… …is that the comparative validation can be regarded as the main argument to continue the validation of the building simulation software for buildings with a double skin façade with the empirical validation test cases.

  19. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    Science.gov (United States)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  20. Modelling Creativity: Identifying Key Components through a Corpus-Based Approach.

    Science.gov (United States)

    Jordanous, Anna; Keller, Bill

    2016-01-01

    Creativity is a complex, multi-faceted concept encompassing a variety of related aspects, abilities, properties and behaviours. If we wish to study creativity scientifically, then a tractable and well-articulated model of creativity is required. Such a model would be of great value to researchers investigating the nature of creativity and in particular, those concerned with the evaluation of creative practice. This paper describes a unique approach to developing a suitable model of how creative behaviour emerges that is based on the words people use to describe the concept. Using techniques from the field of statistical natural language processing, we identify a collection of fourteen key components of creativity through an analysis of a corpus of academic papers on the topic. Words are identified which appear significantly often in connection with discussions of the concept. Using a measure of lexical similarity to help cluster these words, a number of distinct themes emerge, which collectively contribute to a comprehensive and multi-perspective model of creativity. The components provide an ontology of creativity: a set of building blocks which can be used to model creative practice in a variety of domains. The components have been employed in two case studies to evaluate the creativity of computational systems and have proven useful in articulating achievements of this work and directions for further research.
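
    A hedged sketch of the keyword-identification step described above (not the authors' exact pipeline): a log-likelihood (G²) statistic scores how strongly a word is over-represented in a corpus of creativity papers relative to a reference corpus; high-scoring words would then be clustered by lexical similarity into component themes. The word and corpus counts below are hypothetical.

```python
import math

def log_likelihood(a, b, total_a, total_b):
    """Dunning's G2 keyword statistic: is a word over-represented in corpus A
    (creativity papers) relative to corpus B (reference corpus)?"""
    e_a = total_a * (a + b) / (total_a + total_b)   # expected counts
    e_b = total_b * (a + b) / (total_a + total_b)
    g2 = 0.0
    if a:
        g2 += 2 * a * math.log(a / e_a)
    if b:
        g2 += 2 * b * math.log(b / e_b)
    return g2

# Hypothetical counts: (occurrences in creativity corpus, occurrences in reference corpus)
counts = {"novelty": (420, 80), "evaluation": (310, 150), "the": (52000, 61000)}
total_a, total_b = 1_000_000, 1_200_000
for word, (a, b) in counts.items():
    print(word, round(log_likelihood(a, b, total_a, total_b), 1))
# High-G2 words would then be clustered by lexical similarity into component themes.
```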

  1. Integrating modelling and phenotyping approaches to identify and screen complex traits - Illustration for transpiration efficiency in cereals.

    Science.gov (United States)

    Chenu, K; van Oosterom, E J; McLean, G; Deifel, K S; Fletcher, A; Geetika, G; Tirfessa, A; Mace, E S; Jordan, D R; Sulman, R; Hammer, G L

    2018-02-21

    Following advances in genetics, genomics, and phenotyping, trait selection in breeding is limited by our ability to understand interactions within the plants and with their environments, and to target traits of most relevance for the target population of environments. We propose an integrated approach that combines insights from crop modelling, physiology, genetics, and breeding to identify traits valuable for yield gain in the target population of environments, develop relevant high-throughput phenotyping platforms, and identify genetic controls and their values in production environments. This paper uses transpiration efficiency (biomass produced per unit of water used) as an example of a complex trait of interest to illustrate how the approach can guide modelling, phenotyping, and selection in a breeding program. We believe that this approach, by integrating insights from diverse disciplines, can increase the resource use efficiency of breeding programs for improving yield gains in target populations of environments.

  2. A Multiomics Approach to Identify Genes Associated with Childhood Asthma Risk and Morbidity.

    Science.gov (United States)

    Forno, Erick; Wang, Ting; Yan, Qi; Brehm, John; Acosta-Perez, Edna; Colon-Semidey, Angel; Alvarez, Maria; Boutaoui, Nadia; Cloutier, Michelle M; Alcorn, John F; Canino, Glorisa; Chen, Wei; Celedón, Juan C

    2017-10-01

    Childhood asthma is a complex disease. In this study, we aim to identify genes associated with childhood asthma through a multiomics "vertical" approach that integrates multiple analytical steps using linear and logistic regression models. In a case-control study of childhood asthma in Puerto Ricans (n = 1,127), we used adjusted linear or logistic regression models to evaluate associations between several analytical steps of omics data, including genome-wide (GW) genotype data, GW methylation, GW expression profiling, cytokine levels, asthma-intermediate phenotypes, and asthma status. At each point, only the top genes/single-nucleotide polymorphisms/probes/cytokines were carried forward for subsequent analysis. In step 1, asthma modified the gene expression-protein level association for 1,645 genes; pathway analysis showed an enrichment of these genes in the cytokine signaling system (n = 269 genes). In steps 2-3, expression levels of 40 genes were associated with intermediate phenotypes (asthma onset age, forced expiratory volume in 1 second, exacerbations, eosinophil counts, and skin test reactivity); of those, methylation of seven genes was also associated with asthma. Of these seven candidate genes, IL5RA was also significant in analytical steps 4-8. We then measured plasma IL-5 receptor α levels, which were associated with asthma age of onset and moderate-severe exacerbations. In addition, in silico database analysis showed that several of our identified IL5RA single-nucleotide polymorphisms are associated with transcription factors related to asthma and atopy. This approach integrates several analytical steps and is able to identify biologically relevant asthma-related genes, such as IL5RA. It differs from other methods that rely on complex statistical models with various assumptions.

  3. A CFD validation roadmap for hypersonic flows

    Science.gov (United States)

    Marvin, Joseph G.

    1993-01-01

    A roadmap for computational fluid dynamics (CFD) code validation is developed. The elements of the roadmap are consistent with air-breathing vehicle design requirements and are related to the important flow path components: forebody, inlet, combustor, and nozzle. Building block and benchmark validation experiments are identified along with their test conditions and measurements. Based on evaluation criteria, recommendations for an initial CFD validation database are given and gaps are identified where future experiments would provide the needed validation data.

  4. Identifying food-related life style segments by a cross-culturally valid scaling device

    DEFF Research Database (Denmark)

    Brunsø, Karen; Grunert, Klaus G.

    1994-01-01

    …food-related life style in a cross-culturally valid way. To this end, we have collected a pool of 202 items, collected data in three countries, and have constructed scales based on cross-culturally stable patterns. These scales have then been subjected to a number of tests of reliability and validity. We have … then applied the set of scales to a fourth country, Germany, based on a representative sample of 1000 respondents. The scales had, with a few exceptions, moderately good reliabilities. A cluster analysis led to the identification of 5 segments, which differed on all 23 scales.

  5. Validation and calibration of structural models that combine information from multiple sources.

    Science.gov (United States)

    Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A

    2017-02-01

    Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
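
    When calibration is viewed as estimation, as discussed above, a simple formulation is least-squares selection of model parameters that minimize the discrepancy between model outputs and calibration targets. The toy structural model, parameters and targets below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical structural model: two parameters map to three observable outputs
def model_outputs(theta):
    incidence, progression = theta
    return np.array([
        incidence,                        # observed incidence
        incidence * progression * 10.0,   # observed 10-year prevalence
        incidence * progression * 0.5,    # observed mortality
    ])

calibration_targets = np.array([0.02, 0.15, 0.0075])

# Calibration viewed as estimation: minimize the output-target discrepancy
fit = least_squares(lambda th: model_outputs(th) - calibration_targets,
                    x0=[0.01, 0.5], bounds=([0.0, 0.0], [1.0, 1.0]))
print("calibrated parameters:", fit.x)
```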

  6. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
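
    A minimal stand-in for the workflow described above (RELAP5 itself is not run here): an evolutionary global optimizer, SciPy's differential_evolution in place of the genetic algorithm, searches uncertain input parameters to minimize a weighted, normalized discrepancy across multiple system response quantities. The surrogate model, bounds, weights and measurements are placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Experimental system response quantities (SRQs): max flow rate and oscillation period
srq_exp = np.array([1.8, 12.0])          # hypothetical measurements
weights = np.array([1.0, 0.5])           # relative importance in the fitness function

def run_code(params):
    """Placeholder for a system thermal-hydraulic code run returning the simulated
    SRQs as a function of two uncertain input parameters."""
    loss_coeff, heat_loss = params
    return np.array([2.4 - 0.8 * loss_coeff, 9.0 + 4.0 * heat_loss])

def fitness(params):
    # Weighted normalized discrepancy between simulated and measured SRQs
    srq_sim = run_code(params)
    return np.sum(weights * ((srq_sim - srq_exp) / srq_exp) ** 2)

result = differential_evolution(fitness, bounds=[(0.0, 2.0), (0.0, 1.0)], seed=1)
print("calibrated inputs:", result.x, "fitness:", round(result.fun, 6))
```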

  7. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  8. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

    Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is building the product correctly, and validation is building the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each one with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'what to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth and development environment. A guideline is presented to classify the particular system using these factors and to show how they lead to the selection of the V and V class. The second step is to determine 'how to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods that are recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'what to do', as well as for specialists interested in 'how to do it'. Finally

  9. Validation of cross sections for Monte Carlo simulation of the photoelectric effect

    CERN Document Server

    Han, Min Cheol; Pia, Maria Grazia; Basaglia, Tullio; Batic, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo

    2016-01-01

    Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the ones best reproducing experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations are not surp...

  10. Modeling Run Test Validity: A Meta-Analytic Approach

    National Research Council Canada - National Science Library

    Vickers, Ross

    2002-01-01

    .... This study utilized data from 166 samples (N = 5,757) to test the general hypothesis that differences in testing methods could account for the cross-situational variation in validity. Only runs >2 km...

  11. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  12. Experimental validation of Monte Carlo calculations for organ dose

    International Nuclear Information System (INIS)

    Yalcintas, M.G.; Eckerman, K.F.; Warner, G.G.

    1980-01-01

    The problem of validating estimates of absorbed dose due to photon energy deposition is examined. The computational approaches used for the estimation of photon energy deposition are reviewed. The limited data available for validation of these approaches are discussed, and suggestions are made as to how better validation information might be obtained.

  13. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    International Nuclear Information System (INIS)

    Sison Escaño, Mary Clare; Arevalo, Ryan Lacdao; Kasai, Hideaki; Gyenge, Elod

    2014-01-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4− on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements. (topical review)

  14. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    Science.gov (United States)

    Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki

    2014-09-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  15. Validation of the probabilistic approach for the analysis of PWR transients

    International Nuclear Information System (INIS)

    Amesz, J.; Francocci, G.F.; Clarotti, C.

    1978-01-01

    This paper reviews the pilot study currently being carried out on the validation of probabilistic methodology with real data from the operational records of the PWR power station at Obrigheim (KWO, Germany), in operation since 1969. The aim of this analysis is to validate the a priori predictions of reactor transients performed by a probabilistic methodology against the a posteriori analysis of transients that actually occurred at the power station. Two levels of validation have been distinguished: (a) validation of the rate of occurrence of initiating events; (b) validation of the transient-parameter amplitude (i.e., overpressure) caused by the above-mentioned initiating events. The paper describes the a priori calculations performed using a fault-tree analysis by means of a probabilistic code (SALP 3) and event trees coupled with a PWR system deterministic computer code (LOOP 7). Finally, the principal results of these analyses are presented and critically reviewed.

  16. Design and Implementation Content Validity Study: Development of an instrument for measuring Patient-Centered Communication

    Directory of Open Access Journals (Sweden)

    Vahid Zamanzadeh

    2015-06-01

    Full Text Available ABSTRACT Introduction: The importance of content validity in instrument psychometrics and its relevance to reliability have made it an essential step in instrument development. This article attempts to give an overview of the content validity process and to explain the complexity of this process by introducing an example. Methods: We carried out a methodological study to examine the content validity of the patient-centered communication instrument through a two-step process (development and judgment). In the first step, domain determination, sampling (item generation) and instrument formation were carried out; in the second step, the content validity ratio, content validity index and modified kappa statistic were computed. Suggestions of the expert panel and item impact scores were used to examine the instrument's face validity. Results: From a set of 188 items, the content validity process identified seven dimensions: trust building (eight items), informational support (seven items), emotional support (five items), problem solving (seven items), patient activation (10 items), intimacy/friendship (six items) and spirituality strengthening (14 items). The content validity study revealed that this instrument enjoys an appropriate level of content validity. The overall content validity index of the instrument using the universal agreement approach was low; however, the instrument can be advocated with respect to the high number of content experts (which makes consensus difficult) and the high value of the S-CVI with the average approach, which was equal to 0.93. Conclusion: This article illustrates acceptable quantitative indices for the content validity of a new instrument and outlines them during the design and psychometric evaluation of a patient-centered communication measuring instrument.
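
    The quantities named above follow standard definitions, illustrated below with a hypothetical expert panel: Lawshe's content validity ratio CVR = (ne − N/2)/(N/2), the item-level content validity index (I-CVI) as the proportion of experts rating an item 3 or 4 on a 4-point relevance scale, and the scale-level S-CVI with the average approach (S-CVI/Ave) as the mean of the I-CVIs.

```python
# Content validity quantities for items rated by an expert panel (hypothetical data).
def cvr(n_essential, n_experts):
    """Lawshe's content validity ratio: CVR = (ne - N/2) / (N/2)."""
    return (n_essential - n_experts / 2) / (n_experts / 2)

def i_cvi(relevance_ratings):
    """Item-level CVI: proportion of experts rating the item 3 or 4 on a 4-point
    relevance scale."""
    return sum(r >= 3 for r in relevance_ratings) / len(relevance_ratings)

# Hypothetical panel of 10 experts rating three items
items = {"trust_1": [4, 4, 3, 4, 3, 4, 4, 3, 4, 2],
         "info_2":  [3, 4, 4, 4, 4, 3, 3, 4, 4, 4],
         "emo_3":   [2, 3, 4, 3, 2, 4, 3, 3, 4, 3]}
icvis = {k: i_cvi(v) for k, v in items.items()}
s_cvi_ave = sum(icvis.values()) / len(icvis)        # S-CVI with the average approach
print(icvis, "S-CVI/Ave =", round(s_cvi_ave, 2))
print("CVR for 8 of 10 experts rating an item essential:", cvr(8, 10))
```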

  17. Developing an instrument to identify MBChB students' approaches ...

    African Journals Online (AJOL)

    The constructs of deep, surface and achieving approaches to learning are well defined in the literature and amply supported by research. Quality learning results from a deep approach to learning, and a deep-achieving approach to learning is regarded as the most adaptive approach institutionally. It is therefore felt that ...

  18. The validity of the first and second generation Microsoft Kinect™ for identifying joint center locations during static postures.

    Science.gov (United States)

    Xu, Xu; McGorry, Raymond W

    2015-07-01

    The Kinect™ sensor released by Microsoft is a low-cost, portable, and marker-less motion tracking system for the video game industry. Since the first generation Kinect sensor was released in 2010, many studies have been conducted to examine the validity of this sensor when used to measure body movement in different research areas. In 2014, Microsoft released the second generation Kinect sensor for computer use, with better resolution for the depth sensor. However, very few studies have performed a direct comparison between all the Kinect sensor-identified joint center locations and their corresponding motion tracking system-identified counterparts, the result of which may provide some insight into the error of Kinect-identified segment lengths and joint angles, as well as the feasibility of adapting inverse dynamics to Kinect-identified joint centers. The purpose of the current study is first to propose a method to align the coordinate system of the Kinect sensor with respect to the global coordinate system of a motion tracking system, and then to examine the accuracy of the Kinect sensor-identified coordinates of joint locations during 8 standing and 8 sitting postures of daily activities. The results indicate the proposed alignment method can effectively align the Kinect sensor with respect to the motion tracking system. The accuracy level of the Kinect-identified joint center location is posture-dependent and joint-dependent. For the upright standing posture, the average error across all the participants and all Kinect-identified joint centers is 76 mm and 87 mm for the first and second generation Kinect sensor, respectively. In general, standing postures can be identified with better accuracy than sitting postures, and the identification accuracy of the joints of the upper extremities is better than for the lower extremities. This result may provide some information regarding the feasibility of using the Kinect sensor in future studies.
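
    A common way to perform the coordinate-frame alignment described above, shown here as a hedged sketch rather than the authors' exact procedure, is a rigid (Kabsch/SVD) fit of paired joint-center observations from the two systems, after which residual distances give a joint-center error. The point sets below are synthetic.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst (Kabsch)."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: known rotation/translation between mocap and Kinect frames
rng = np.random.default_rng(4)
mocap = rng.normal(size=(20, 3))                    # joint centers in mocap frame [m]
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
kinect = (R_true @ mocap.T).T + np.array([0.1, -0.05, 2.0])

R, t = rigid_align(kinect, mocap)                   # align Kinect frame to mocap frame
aligned = (R @ kinect.T).T + t
error_mm = 1000 * np.linalg.norm(aligned - mocap, axis=1).mean()
print(f"mean joint-center error after alignment: {error_mm:.2f} mm")   # ~0
```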

  19. Distinguishing between forensic science and forensic pseudoscience: testing of validity and reliability, and approaches to forensic voice comparison.

    Science.gov (United States)

    Morrison, Geoffrey Stewart

    2014-05-01

    In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle.

  20. Construct Validity of the Holistic Complementary and Alternative Medicines Questionnaire (HCAMQ)—An Investigation Using Modern Psychometric Approaches

    Directory of Open Access Journals (Sweden)

    Paula Kersten

    2011-01-01

    Full Text Available The scientific basis of efficacy studies of complementary medicine requires the availability of validated measures. The Holistic Complementary and Alternative Medicine Questionnaire (HCAMQ) is one such measure. This article aimed to examine its construct validity, using a modern psychometric approach. The HCAMQ was completed by 221 patients (mean age 66.8 years, SD 8.29; 58% female) with chronic stable pain predominantly from a single joint (hip or knee) of mechanical origin, waiting for a hip (40%) or knee (60%) joint replacement, on enrolment in a study investigating the effects of acupuncture and placebo controls. The HCAMQ contains a Holistic Health (HH) subscale (five items) and a CAM subscale (six items). Validity of the subscales was tested using Cronbach's alphas, factor analysis, Mokken scaling and Rasch analysis, which did not support the original two-factor structure of the scale. A five-item HH subscale and a four-item CAM subscale (worded in a negative direction) fitted the Rasch model and were unidimensional (χ2=8.44, P=0.39, PSI=0.69 versus χ2=17.33, P=0.03, PSI=0.77). Two CAM items (worded in the positive direction) had significant misfit. In conclusion, we have shown that the original two-factor structure of the HCAMQ could not be supported but that two valid shortened subscales can be used, one for HH Beliefs (four-item HH) and the other for CAM Beliefs (four-item CAM). It is recommended that consideration is given to rewording the two discarded positively worded CAM questions to enhance construct validity.

  1. Cost model validation: a technical and cultural approach

    Science.gov (United States)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  2. Identifying the Critical Links in Road Transportation Networks: Centrality-based approach utilizing structural properties

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, Supriya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Surface transportation road networks share structural properties with other complex networks (e.g., social networks, information networks, biological networks, and so on). This research investigates the structural properties of road networks for possible correlation with traffic characteristics such as link flows that are determined independently. Additionally, we define a criticality index for the links of the road network that identifies their relative importance in the network. We tested our hypotheses on two sample road networks. Results show that correlation exists between link flows and the centrality measures of a road link (a dual-graph approach is followed), and the criticality index is found to be effective in identifying the vulnerable nodes for one test network.
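
    A hedged sketch of one way to compute a link-level centrality and compare it with independently observed link flows (the study follows a dual-graph approach; edge betweenness on the primal graph is used here as a simple stand-in). The toy grid network, synthetic flows and the combined criticality index are assumptions.

```python
import networkx as nx
import numpy as np

# Toy road network: nodes are intersections, edges are links with observed flows
G = nx.grid_2d_graph(5, 5)                       # 5x5 grid of intersections
rng = np.random.default_rng(5)
flows = {e: rng.integers(200, 2000) for e in G.edges()}

# Link centrality via edge betweenness (shortest-path usage of each link)
ebc = nx.edge_betweenness_centrality(G)

x = np.array([ebc[e] for e in G.edges()])
y = np.array([flows[e] for e in G.edges()], dtype=float)
r = np.corrcoef(x, y)[0, 1]
print(f"correlation between edge betweenness and link flow: r = {r:.2f}")
# With real traffic counts instead of random flows, this correlation tests the hypothesis.

# A simple criticality index could combine both, e.g. normalized centrality x flow
crit = {e: ebc[e] / max(ebc.values()) * flows[e] / y.max() for e in G.edges()}
top = sorted(crit, key=crit.get, reverse=True)[:3]
print("most critical links:", top)
```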

  3. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V&V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V&V phases from concept to operation and maintenance. Each phase has specific V&V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V&V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig

  4. Can children identify and achieve goals for intervention? A randomized trial comparing two goal-setting approaches.

    Science.gov (United States)

    Vroland-Nordstrand, Kristina; Eliasson, Ann-Christin; Jacobsson, Helén; Johansson, Ulla; Krumlinde-Sundholm, Lena

    2016-06-01

    The efficacy of two different goal-setting approaches (children's self-identified goals and goals identified by parents) was compared within a goal-directed, task-oriented intervention. In this assessor-blinded parallel randomized trial, 34 children with disabilities (13 males, 21 females; mean age 9y, SD 1y 4mo) were randomized using concealed allocation to one of two 8-week, goal-directed, task-oriented intervention groups with different goal-setting approaches: (1) children's self-identified goals (n=18) using the Perceived Efficacy and Goal-Setting System, or (2) goals identified by parents (n=16) using the Canadian Occupational Performance Measure (COPM). Participants were recruited through eight paediatric rehabilitation centres and randomized between October 2011 and May 2013. The primary outcome measure was Goal Attainment Scaling, and the secondary measure was the COPM performance scale (COPM-P). Data were collected pre- and post-intervention and at the 5-month follow-up. There was no evidence of a difference in mean characteristics at baseline between groups. There was evidence of an increase in mean goal attainment (mean T score) in both groups after intervention (child-goal group: estimated mean difference [EMD] 27.84, 95% CI 22.93-32.76; parent-goal group: EMD 21.42, 95% CI 16.16-26.67). There was no evidence of a difference in the mean T scores post-intervention between the two groups (EMD 6.42, 95% CI -0.80 to 13.65). These results were sustained at the 5-month follow-up. Children's self-identified goals are achievable to the same extent as parent-identified goals and remain stable over time. Thus children can be trusted to identify their own goals for intervention, thereby influencing their involvement in their intervention programmes. © 2015 Mac Keith Press.

  5. Integrated Pathway-Based Approach Identifies Association between Genomic Regions at CTCF and CACNB2 and Schizophrenia

    NARCIS (Netherlands)

    Juraeva, Dilafruz; Haenisch, Britta; Zapatka, Marc; Frank, Josef; Witt, Stephanie H.; Mühleisen, Thomas W.; Treutlein, Jens; Strohmaier, Jana; Meier, Sandra; Degenhardt, Franziska; Giegling, Ina; Ripke, Stephan; Leber, Markus; Lange, Christoph; Schulze, Thomas G.; Mössner, Rainald; Nenadic, Igor; Sauer, Heinrich; Rujescu, Dan; Maier, Wolfgang; Børglum, Anders; Ophoff, Roel; Cichon, Sven; Nöthen, Markus M.; Rietschel, Marcella; Mattheisen, Manuel; Brors, Benedikt; Kahn, René S.; Cahn, Wiepke; Linszen, Don H.; de Haan, Lieuwe; van Os, Jim; Krabbendam, Lydia; Myin-Germeys, Inez; Wiersma, Durk; Bruggeman, Richard; Mors, O.; Børglum, A. D.; Mortensen, P. B.; Pedersen, C. B.; Demontis, D.; Grove, J.; Mattheisen, M.; Hougaard, D. M.

    2014-01-01

    In the present study, an integrated hierarchical approach was applied to: (1) identify pathways associated with susceptibility to schizophrenia; (2) detect genes that may be potentially affected in these pathways since they contain an associated polymorphism; and (3) annotate the functional

  6. A simplified approach to the pooled analysis of calibration of clinical prediction rules for systematic reviews of validation studies

    Directory of Open Access Journals (Sweden)

    Dimitrov BD

    2015-04-01

    Full Text Available Borislav D Dimitrov,1,2 Nicola Motterlini,2,† Tom Fahey2 1Academic Unit of Primary Care and Population Sciences, University of Southampton, Southampton, United Kingdom; 2HRB Centre for Primary Care Research, Department of General Medicine, Division of Population Health Sciences, Royal College of Surgeons in Ireland, Dublin, Ireland †Nicola Motterlini passed away on November 11, 2012. Objective: Estimating calibration performance of clinical prediction rules (CPRs) in systematic reviews of validation studies is not possible when predicted values are neither published nor accessible, or when insufficient or no individual participant or patient data are available. Our aims were to describe a simplified approach for outcomes prediction and calibration assessment and evaluate its functionality and validity. Study design and methods: Methodological study of systematic reviews of validation studies of CPRs: (a) the ABCD2 rule for prediction of 7 day stroke; and (b) the CRB-65 rule for prediction of 30 day mortality. Predicted outcomes in a sample validation study were computed by CPR distribution patterns (“derivation model”). As confirmation, a logistic regression model (with derivation study coefficients) was applied to CPR-based dummy variables in the validation study. Meta-analysis of validation studies provided pooled estimates of “predicted:observed” risk ratios (RRs), 95% confidence intervals (CIs), and indexes of heterogeneity (I2) on forest plots (fixed and random effects models), with and without adjustment of intercepts. The above approach was also applied to the CRB-65 rule. Results: Our simplified method, applied to the ABCD2 rule in three risk strata (low, 0–3; intermediate, 4–5; high, 6–7 points), indicated that predictions are identical to those computed by a univariate, CPR-based logistic regression model. Discrimination was good (c-statistics = 0.61–0.82); however, calibration in some studies was low. In such cases with miscalibration, the under
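
    To illustrate the pooled "predicted:observed" ratio idea in a hedged way (the study names, counts and fixed-effect weighting below are invented for illustration and are not the authors' data or exact procedure), an inverse-variance pooling of log risk ratios might look like:

    ```python
    import numpy as np

    # Hypothetical validation studies: predicted and observed event counts out of n patients.
    studies = [
        {"name": "Study 1", "predicted": 18, "observed": 15, "n": 250},
        {"name": "Study 2", "predicted": 40, "observed": 52, "n": 600},
        {"name": "Study 3", "predicted": 9,  "observed": 8,  "n": 120},
    ]

    log_rr, weights = [], []
    for s in studies:
        rr = (s["predicted"] / s["n"]) / (s["observed"] / s["n"])
        var = 1.0 / s["predicted"] + 1.0 / s["observed"]  # rough variance of log(RR)
        log_rr.append(np.log(rr))
        weights.append(1.0 / var)

    # Fixed-effect (inverse-variance) pooling of the log risk ratios.
    pooled_log_rr = np.average(log_rr, weights=weights)
    se = np.sqrt(1.0 / np.sum(weights))
    pooled_rr = np.exp(pooled_log_rr)
    ci_low, ci_high = np.exp(pooled_log_rr - 1.96 * se), np.exp(pooled_log_rr + 1.96 * se)

    print(f"Pooled predicted:observed RR = {pooled_rr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
    ```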

  7. Guiding Development Based Approach Practicum Vertebrates Taxonomy Scientific Study Program for Students of Biology Education

    Science.gov (United States)

    Arieska, M.; Syamsurizal, S.; Sumarmin, R.

    2018-04-01

    Students have difficulty identifying and describing vertebrate animals and are less skilled in the science process during practical work. Expertise in scientific skills can be increased, among other ways, through practical activities guided by a practicum manual based on the scientific approach. This study aims to produce a valid vertebrate taxonomy practicum guide for biology education students of PGRI STKIP West Sumatra. The study uses the Plomp development model, consisting of three phases: initial investigation, prototyping, and assessment. The data collection instrument used in this study was a practicum guide validation sheet. Data were analyzed descriptively based on data obtained from the field. The developed vertebrate taxonomy practicum guide obtained a validity value of 3.22, in the very valid category. This research and development thus produced a very valid, scientific-approach-based vertebrate taxonomy practicum guide.

  8. Towards natural language question generation for the validation of ontologies and mappings.

    Science.gov (United States)

    Ben Abacha, Asma; Dos Reis, Julio Cesar; Mrabet, Yassine; Pruski, Cédric; Da Silveira, Marcos

    2016-08-08

    The increasing number of open-access ontologies and their key role in several applications such as decision-support systems highlight the importance of their validation. Human expertise is crucial for the validation of ontologies from a domain point-of-view. However, the growing number of ontologies and their fast evolution over time make manual validation challenging. We propose a novel semi-automatic approach based on the generation of natural language (NL) questions to support the validation of ontologies and their evolution. The proposed approach includes the automatic generation, factorization and ordering of NL questions from medical ontologies. The final validation and correction are performed by submitting these questions to domain experts and automatically analyzing their feedback. We also propose a second approach for the validation of mappings impacted by ontology changes. The method exploits the context of the changes to propose correction alternatives presented as Multiple Choice Questions. This research provides a question optimization strategy to maximize the validation of ontology entities with a reduced number of questions. We evaluate our approach for the validation of three medical ontologies. We also evaluate the feasibility and efficiency of our mappings validation approach in the context of ontology evolution. These experiments are performed with different versions of SNOMED-CT and ICD9. The obtained experimental results suggest the feasibility and adequacy of our approach to support the validation of interconnected and evolving ontologies. Results also suggest that taking into account RDFS and OWL entailment helps reduce the number of questions and validation time. The application of our approach to validate mapping evolution also shows the difficulty of adapting mapping evolution over time and highlights the importance of semi-automatic validation.

  9. On the Validity of Continuum Computational Fluid Dynamics Approach Under Very Low-Pressure Plasma Spray Conditions

    Science.gov (United States)

    Ivchenko, Dmitrii; Zhang, Tao; Mariaux, Gilles; Vardelle, Armelle; Goutier, Simon; Itina, Tatiana E.

    2018-01-01

    Plasma spray physical vapor deposition aims to substantially evaporate powders in order to produce coatings with various microstructures. This is achieved by powder vapor condensation onto the substrate and/or by deposition of fine melted powder particles and nanoclusters. The deposition process typically operates at pressures ranging between 10 and 200 Pa. In addition to experimental work, numerical simulations are performed to better understand the process and optimize the experimental conditions. However, the combination of high temperatures and low pressure with shock waves initiated by supersonic expansion of the hot gas in the low-pressure medium casts doubt on the applicability of the continuum approach for the simulation of such a process. This work investigates (1) effects of the pressure dependence of thermodynamic and transport properties on computational fluid dynamics (CFD) predictions and (2) the validity of the continuum approach for thermal plasma flow simulation under very low-pressure conditions. The study compares the flow fields predicted with a continuum approach using CFD software with those obtained by a kinetic-based approach using a direct simulation Monte Carlo method (DSMC). It also shows how the presence of high gradients can contribute to prediction errors for typical PS-PVD conditions.
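
    A common first check on continuum validity, not taken from this paper (whose specific conditions and property models are not reproduced here), is to estimate the Knudsen number from an assumed mean free path and characteristic length; values above roughly 0.01–0.1 are usually taken to signal breakdown of the continuum assumption. The gas properties and length scale below are illustrative assumptions.

    ```python
    import math

    def mean_free_path(temperature_k, pressure_pa, molecule_diameter_m):
        """Hard-sphere estimate: lambda = k*T / (sqrt(2) * pi * d^2 * p)."""
        k_b = 1.380649e-23  # Boltzmann constant, J/K
        return k_b * temperature_k / (math.sqrt(2) * math.pi * molecule_diameter_m**2 * pressure_pa)

    # Illustrative PS-PVD-like conditions (assumed values, not from the paper):
    T = 8000.0      # gas temperature, K
    p = 100.0       # chamber pressure, Pa
    d = 3.7e-10     # effective molecular diameter of the plasma gas, m (assumption)
    L = 0.05        # characteristic length, e.g. jet or substrate scale, m (assumption)

    lam = mean_free_path(T, p, d)
    knudsen = lam / L
    print(f"mean free path = {lam:.3e} m, Kn = {knudsen:.3e}")

    if knudsen > 0.01:
        print("Continuum (CFD) assumption questionable; a kinetic approach such as DSMC may be needed.")
    else:
        print("Continuum assumption likely acceptable for this length scale.")
    ```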

  10. Evidence Based Validation of Indian Traditional Medicine – Way Forward

    Directory of Open Access Journals (Sweden)

    Pulok K Mukherjee

    2016-01-01

    Full Text Available Evidence based validation of the ethno-pharmacological claims on traditional medicine (TM) is the need of the day for its globalization and reinforcement. Combining the unique features of identifying biomarkers that are highly conserved across species, this can offer an innovative approach to biomarker-driven drug discovery and development. TMs are an integral component of alternative health care systems. India has a rich wealth of TMs and the potential to accept the challenge to meet the global demand for them. Ayurveda, Yoga, Unani, Siddha and Homeopathy (AYUSH) medicine are the major healthcare systems in Indian Traditional Medicine. The plant species mentioned in the ancient texts of these systems may be explored with modern scientific approaches for better leads in healthcare. TM is one of the best sources of chemical diversity for finding new drugs and leads. Authentication and scientific validation of medicinal plants is a fundamental requirement of industry and other organizations dealing with herbal drugs. Quality control (QC) of botanicals, validated processes of manufacturing, customer awareness and post marketing surveillance are the key points, which could ensure the quality, safety and efficacy of TM. For globalization of TM, there is a need for harmonization with respect to its chemical and metabolite profiling, standardization, QC, scientific validation, documentation and regulatory aspects of TM. Therefore, the utmost attention is necessary for the promotion and development of TM through global collaboration and co-ordination by national and international programmes.

  11. Validating Measures of Mathematical Knowledge for Teaching

    Science.gov (United States)

    Kane, Michael

    2007-01-01

    According to Schilling, Blunk, and Hill, the set of papers presented in this journal issue had two main purposes: (1) to use an argument-based approach to evaluate the validity of the tests of mathematical knowledge for teaching (MKT), and (2) to critically assess the author's version of an argument-based approach to validation (Kane, 2001, 2004).…

  12. Biographical approach to human resource management

    Directory of Open Access Journals (Sweden)

    Ratković-Njegovan Biljana

    2015-01-01

    Full Text Available The paper discusses the importance of the biographical approach to managing human resources, which is especially important in the first, anticipatory stage of organizational socialization, in which the job interview is performed. The biographical principle is based on a broader and more complex approach to the candidate, which enables him to present his working career, personal qualities, professional knowledge and skills, social skills, interests and aspirations. The biographical approach allows an individual who has applied for a certain job to reflect on, identify and present their work and life path in their own way. The organization, in turn, receives valid information through the biographical method to predict the future behavior of candidates and their performance.

  13. SPARQL-enabled identifier conversion with Identifiers.org.

    Science.gov (United States)

    Wimalaratne, Sarala M; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-06-01

    On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. © The Author 2015. Published by Oxford University Press.
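
    As a hedged illustration of how such a conversion service might be queried (the owl:sameAs pattern and the example UniProt URI are assumptions for illustration, not the documented data model of the service; SPARQLWrapper is used only as a generic SPARQL client), a request to the public endpoint cited above could look like:

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON

    # Public endpoint named in the record; the query itself is illustrative only.
    endpoint = SPARQLWrapper("http://identifiers.org/services/sparql")

    query = """
    PREFIX owl: <http://www.w3.org/2002/07/owl#>
    SELECT DISTINCT ?target WHERE {
        # Hypothetical pattern: ask for identifier variants equivalent to a
        # source URI for the same database record.
        <http://identifiers.org/uniprot/P12345> owl:sameAs ?target .
    }
    """

    endpoint.setQuery(query)
    endpoint.setReturnFormat(JSON)
    results = endpoint.query().convert()

    for binding in results["results"]["bindings"]:
        print(binding["target"]["value"])
    ```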

  14. SPARQL-enabled identifier conversion with Identifiers.org

    Science.gov (United States)

    Wimalaratne, Sarala M.; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-01-01

    Motivation: On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use their own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. Results: We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. Availability and implementation: The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. Contact: sarala@ebi.ac.uk PMID:25638809

  15. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    Science.gov (United States)

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
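
    A greatly simplified sketch of the name-discrepancy check at the heart of such a system (the helper names are hypothetical, and a real system must also handle OCR confidence, fonts and label layouts) might normalize both names and flag any mismatch for human review:

    ```python
    import re

    def normalize_name(name: str) -> str:
        """Upper-case and collapse punctuation/whitespace so 'Smith, John A.' matches 'SMITH JOHN A'."""
        cleaned = re.sub(r"[^A-Z ]", " ", name.upper())
        return " ".join(cleaned.split())

    def needs_human_review(lis_name: str, ocr_name: str) -> bool:
        """Flag the tube for inspection when the OCR'd label name and the LIS name disagree."""
        return normalize_name(lis_name) != normalize_name(ocr_name)

    # Illustrative use: the tube label was read by OCR, the order came from the LIS.
    print(needs_human_review("Smith, John A.", "SMITH JOHN A"))   # False -> passes
    print(needs_human_review("Smith, John A.", "SMYTH JOHN A"))   # True  -> route to human inspection
    ```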

  16. Development and validation of the Approach-Iron Skill Test for use in golf.

    Science.gov (United States)

    Robertson, Samuel John; Burnett, Angus F; Newton, Robert U

    2013-01-01

    The primary aim of this study was to develop and validate a golf-specific approach-iron test for use with elite and high-level amateur golfers. Elite (n=26) and high-level amateur (n=23) golfers were recruited for this study. The 'Approach-Iron Skill Test' requires players to hit a total of 27 shots. Specifically, three shots are hit at each of nine targets on a specially constructed driving range in a randomised order. A real-time launch monitor positioned behind the player, measured the carry distance for each of these shots. A scoring system was developed based on the percentage error index of each shot, meaning that 81 points was the maximum score possible (with a maximum of three points per shot). Two rounds of the test were performed. For both rounds of the test, elite-level golfers scored significantly higher than their high-level amateur counterparts (56.3 ± 5.6 and 58.5 ± 4.6 points versus 46.0 ± 6.3 and 46.1 ± 6.7 points, respectively) (P<0.05). For both elite and high-level players, 95% limits of agreement statistics also indicated that the test showed good test-retest reliability (2.1 ± 7.9 and 0.2 ± 10.8, respectively). Due to the clinimetric properties of the test, we conclude that the Approach-Iron Skill Test is suitable for further examination with the players examined in this study.
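
    The abstract does not give the exact mapping from percentage error to points, so the bands below are purely a hypothetical illustration of a 0–3-points-per-shot scheme that sums to a maximum of 81 over 27 shots:

    ```python
    def shot_points(carry_m: float, target_m: float) -> int:
        """Score one approach shot from its percentage error (hypothetical bands, 0-3 points)."""
        error_pct = abs(carry_m - target_m) / target_m * 100
        if error_pct <= 5:
            return 3
        if error_pct <= 10:
            return 2
        if error_pct <= 20:
            return 1
        return 0

    def test_score(shots: list[tuple[float, float]]) -> int:
        """Sum over the 27 (carry, target) pairs; maximum possible score is 81."""
        return sum(shot_points(carry, target) for carry, target in shots)

    # Illustrative partial round: three of the 27 shots.
    example = [(98.0, 100.0), (112.0, 120.0), (61.0, 80.0)]
    print(test_score(example))  # 3 + 2 + 0 = 5 points for these three shots
    ```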

  17. A network analysis of the Chinese medicine Lianhua-Qingwen formula to identify its main effective components.

    Science.gov (United States)

    Wang, Chun-Hua; Zhong, Yi; Zhang, Yan; Liu, Jin-Ping; Wang, Yue-Fei; Jia, Wei-Na; Wang, Guo-Cai; Li, Zheng; Zhu, Yan; Gao, Xiu-Mei

    2016-02-01

    Chinese medicine is known to treat complex diseases with multiple components and multiple targets. However, the main effective components and their related key targets and functions remain to be identified. Herein, a network analysis method was developed to identify the main effective components and key targets of a Chinese medicine, Lianhua-Qingwen Formula (LQF). The LQF is commonly used for the prevention and treatment of viral influenza in China. It is composed of 11 herbs, gypsum and menthol, with 61 compounds being identified in our previous work. In this paper, these 61 candidate compounds were used to find their related targets and construct the predicted-target (PT) network. An influenza-related protein-protein interaction (PPI) network was constructed and integrated with the PT network. Then the compound-effective target (CET) network and compound-ineffective target (CIT) network were extracted, respectively. A novel approach was developed to identify effective components by comparing the CET and CIT networks. As a result, 15 main effective components were identified along with 61 corresponding targets. Seven of these main effective components were further experimentally validated to have antiviral efficacy in vitro. The main effective component-target (MECT) network was further constructed with the main effective components and their key targets. Gene Ontology (GO) analysis of the MECT network predicted key functions such as NO production being modulated by the LQF. Interestingly, five effective components were experimentally tested and exhibited inhibitory effects on NO production in LPS-induced RAW 264.7 cells. In summary, we have developed a novel approach to identify the main effective components in the Chinese medicine LQF and experimentally validated some of the predictions.

  18. Cross-species multiple environmental stress responses: An integrated approach to identify candidate genes for multiple stress tolerance in sorghum (Sorghum bicolor (L.) Moench) and related model species.

    Directory of Open Access Journals (Sweden)

    Adugna Abdi Woldesemayat

    associated with different traits that are responsive to multiple stresses. Ontology mapping was used to validate the identified genes, while reconstruction of the phylogenetic tree was instrumental to infer the evolutionary relationship of the sorghum orthologs. The results also show specific genes responsible for various interrelated components of drought response mechanism such as drought tolerance, drought avoidance and drought escape. We submit that this approach is novel and, to our knowledge, has not been used previously in any other research; it enables us to perform cross-species queries for genes that are likely to be associated with multiple stress tolerance, as a means to identify novel targets for engineering stress resistance in sorghum and, possibly, in other crop species.

  19. A Proteomic Approach Identifies Candidate Early Biomarkers to Predict Severe Dengue in Children.

    Directory of Open Access Journals (Sweden)

    Dang My Nhi

    2016-02-01

    Full Text Available Severe dengue with severe plasma leakage (SD-SPL) is the most frequent severe form of dengue. Plasma biomarkers for early predictive diagnosis of SD-SPL are required in primary clinics for the prevention of dengue death. Among 63 confirmed dengue pediatric patients recruited, a hospital-based longitudinal study detected six SD-SPL and ten dengue with warning signs (DWS) cases. To identify the specific proteins increased or decreased in SD-SPL plasma obtained 6-48 hours before the shock, compared with DWS, isobaric tags for relative and absolute quantification (iTRAQ) technology was applied using four patients from each group. Validation was undertaken in 6 SD-SPL and 10 DWS patients. Nineteen plasma proteins exhibited significantly different relative concentrations (p<0.05), with five over-expressed and fourteen under-expressed in SD-SPL compared with DWS. The individual proteins were classified into blood coagulation, vascular regulation, cellular transport-related processes or immune response. The immunoblot quantification showed angiotensinogen and antithrombin III significantly increased in SD-SPL whole plasma of the early stage compared with DWS subjects. Even with this small number of samples, antithrombin III predicted SD-SPL before shock occurrence with accuracy. Proteins identified here may serve as candidate predictive markers to diagnose SD-SPL for timely clinical management. Since the number of subjects was small, further studies are needed to confirm all these biomarkers.

  20. Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control. Part 1; New Technologies and Validation Approach

    Science.gov (United States)

    Ku, Jentung; Ottenstein, Laura; Douglas, Donya; Hoang, Triem

    2010-01-01

    Under NASA's New Millennium Program Space Technology 8 (ST 8) Project, four experiments - Thermal Loop, Dependable Microprocessor, SAILMAST, and UltraFlex - were conducted to advance the maturity of individual technologies from proof of concept to prototype demonstration in a relevant environment, i.e. from a technology readiness level (TRL) of 3 to a level of 6. This paper presents the new technologies and validation approach of the Thermal Loop experiment. The Thermal Loop is an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers designed for future small system applications requiring low mass, low power, and compactness. The MLHP retains all features of state-of-the-art loop heat pipes (LHPs) and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. Details of the thermal loop concept, technical advances, benefits, objectives, level 1 requirements, and performance characteristics are described. Also included in the paper are descriptions of the test articles and mathematical modeling used for the technology validation. An MLHP breadboard was built and tested in the laboratory and thermal vacuum environments for TRL 4 and TRL 5 validations, and an MLHP proto-flight unit was built and tested in a thermal vacuum chamber for the TRL 6 validation. In addition, an analytical model was developed to simulate the steady state and transient behaviors of the MLHP during various validation tests. Capabilities and limitations of the analytical model are also addressed.

  1. Development and Validation of Triarchic Construct Scales from the Psychopathic Personality Inventory

    Science.gov (United States)

    Hall, Jason R.; Drislane, Laura E.; Patrick, Christopher J.; Morano, Mario; Lilienfeld, Scott O.; Poythress, Norman G.

    2014-01-01

    The Triarchic model of psychopathy describes this complex condition in terms of distinct phenotypic components of boldness, meanness, and disinhibition. Brief self-report scales designed specifically to index these psychopathy facets have thus far demonstrated promising construct validity. The present study sought to develop and validate scales for assessing facets of the Triarchic model using items from a well-validated existing measure of psychopathy—the Psychopathic Personality Inventory (PPI). A consensus rating approach was used to identify PPI items relevant to each Triarchic facet, and the convergent and discriminant validity of the resulting PPI-based Triarchic scales were evaluated in relation to multiple criterion variables (i.e., other psychopathy inventories, antisocial personality disorder features, personality traits, psychosocial functioning) in offender and non-offender samples. The PPI-based Triarchic scales showed good internal consistency and related to criterion variables in ways consistent with predictions based on the Triarchic model. Findings are discussed in terms of implications for conceptualization and assessment of psychopathy. PMID:24447280

  2. Genome-wide local ancestry approach identifies genes and variants associated with chemotherapeutic susceptibility in African Americans.

    Directory of Open Access Journals (Sweden)

    Heather E Wheeler

    Full Text Available Chemotherapeutic agents are used in the treatment of many cancers, yet variable resistance and toxicities among individuals limit successful outcomes. Several studies have indicated outcome differences associated with ancestry among patients with various cancer types. Using both traditional SNP-based and newly developed gene-based genome-wide approaches, we investigated the genetics of chemotherapeutic susceptibility in lymphoblastoid cell lines derived from 83 African Americans, a population for which there is a disparity in the number of genome-wide studies performed. To account for population structure in this admixed population, we incorporated local ancestry information into our association model. We tested over 2 million SNPs and identified 325, 176, 240, and 190 SNPs that were suggestively associated with cytarabine-, 5'-deoxyfluorouridine (5'-DFUR)-, carboplatin-, and cisplatin-induced cytotoxicity, respectively (p≤10^-4). Importantly, some of these variants are found only in populations of African descent. We also show that cisplatin-susceptibility SNPs are enriched for carboplatin-susceptibility SNPs. Using a gene-based genome-wide association approach, we identified 26, 11, 20, and 41 suggestive candidate genes for association with cytarabine-, 5'-DFUR-, carboplatin-, and cisplatin-induced cytotoxicity, respectively (p≤10^-3). Fourteen of these genes showed evidence of association with their respective chemotherapeutic phenotypes in the Yoruba from Ibadan, Nigeria (p<0.05), including TP53I11, COPS5 and GAS8, which are known to be involved in tumorigenesis. Although our results require further study, we have identified variants and genes associated with chemotherapeutic susceptibility in African Americans by using an approach that incorporates local ancestry information.
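
    The core idea of a local-ancestry-aware association test can be sketched as a regression of the phenotype on genotype dosage while adjusting for local ancestry at the same locus. The data, column names, and use of ordinary least squares below are illustrative assumptions, not the authors' pipeline:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 83  # cell lines, matching the record; the data themselves are synthetic

    # Hypothetical data: cytotoxicity phenotype, SNP genotype dosage (0/1/2),
    # and local ancestry (copies of African-ancestry alleles at the locus, 0/1/2).
    df = pd.DataFrame({
        "cytotoxicity": rng.normal(size=n),
        "genotype": rng.integers(0, 3, size=n),
        "local_ancestry": rng.integers(0, 3, size=n),
    })

    # Test the SNP while adjusting for local ancestry at the same locus.
    X = sm.add_constant(df[["genotype", "local_ancestry"]])
    model = sm.OLS(df["cytotoxicity"], X).fit()
    print(model.pvalues["genotype"])  # ancestry-adjusted SNP association p-value
    ```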

  3. A model-based design and validation approach with OMEGA-UML and the IF toolset

    Science.gov (United States)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent, embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and thus their software architecture is more complex to design and analyse. In this context, it is important to have well-defined design methodologies which should be supported by (1) high-level design concepts that allow designers to master the design complexity, (2) concepts for the expression of non-functional requirements and (3) analysis tools that can verify or invalidate that the system under development will be able to conform to its requirements. We illustrate here such an approach for the design of complex embedded systems using a small case study as a running example for illustration purposes. We briefly present the important concepts of the OMEGA-RT UML profile, we show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  4. Identifying Cancer Driver Genes Using Replication-Incompetent Retroviral Vectors

    Directory of Open Access Journals (Sweden)

    Victor M. Bii

    2016-10-01

    Full Text Available Identifying novel genes that drive tumor metastasis and drug resistance has significant potential to improve patient outcomes. High-throughput sequencing approaches have identified cancer genes, but distinguishing driver genes from passengers remains challenging. Insertional mutagenesis screens using replication-incompetent retroviral vectors have emerged as a powerful tool to identify cancer genes. Unlike replicating retroviruses and transposons, replication-incompetent retroviral vectors lack additional mutagenesis events that can complicate the identification of driver mutations from passenger mutations. They can also be used for almost any human cancer due to the broad tropism of the vectors. Replication-incompetent retroviral vectors have the ability to dysregulate nearby cancer genes via several mechanisms including enhancer-mediated activation of gene promoters. The integrated provirus acts as a unique molecular tag for nearby candidate driver genes which can be rapidly identified using well established methods that utilize next generation sequencing and bioinformatics programs. Recently, retroviral vector screens have been used to efficiently identify candidate driver genes in prostate, breast, liver and pancreatic cancers. Validated driver genes can be potential therapeutic targets and biomarkers. In this review, we describe the emergence of retroviral insertional mutagenesis screens using replication-incompetent retroviral vectors as a novel tool to identify cancer driver genes in different cancer types.

  5. A generalised chemical precipitation modelling approach in wastewater treatment applied to calcite

    DEFF Research Database (Denmark)

    Mbamba, Christian Kazadi; Batstone, Damien J.; Flores Alsina, Xavier

    2015-01-01

    , the present study aims to identify a broadly applicable precipitation modelling approach. The study uses two experimental platforms applied to calcite precipitating from synthetic aqueous solutions to identify and validate the model approach. Firstly, dynamic pH titration tests are performed to define...... an Arrhenius-style correction of kcryst. The influence of magnesium (a common and representative added impurity) on kcryst was found to be significant but was considered an optional correction because of a lesser influence as compared to that of temperature. Other variables such as ionic strength and pH were...
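
    The record only mentions that an Arrhenius-style temperature correction of the crystallization rate constant kcryst was adopted; the functional form is standard, but the parameter values below are generic assumptions used for illustration, not the fitted values from the study:

    ```python
    import math

    R = 8.314  # universal gas constant, J/(mol*K)

    def k_cryst(temperature_k: float,
                k_ref: float = 1.0e-3,           # rate constant at the reference temperature (assumed)
                temperature_ref_k: float = 298.15,
                activation_energy: float = 40e3  # apparent activation energy, J/mol (assumed)
                ) -> float:
        """Arrhenius-style temperature correction of a crystallization rate constant."""
        return k_ref * math.exp(-activation_energy / R * (1.0 / temperature_k - 1.0 / temperature_ref_k))

    for T in (288.15, 298.15, 308.15, 318.15):
        print(f"T = {T - 273.15:.0f} C  ->  kcryst = {k_cryst(T):.3e}")
    ```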

  6. IDENTIFYING DEMENTIA IN ELDERLY POPULATION : A CAMP APPROACH

    Directory of Open Access Journals (Sweden)

    Anand P

    2015-06-01

    Full Text Available BACKGROUND: Dementia is an emerging medico-social problem affecting the elderly, and poses a challenge to clinicians and caregivers. It is usually identified at a late stage, where management becomes difficult. AIM: The aim of the camp was to identify dementia in the elderly population participating in a screening camp. MATERIAL AND METHODS: The geriatric clinic and the department of psychiatry jointly organised a screening camp to detect dementia in the elderly over five days in September 2014, to commemorate World Alzheimer's Day. The invitation regarding the camp was sent to all senior citizen forums and also published in a leading Kannada daily newspaper. The Mini Mental Status Examination and the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) criteria were used to identify dementia. RESULTS: More elderly males than females participated in the camp, and dementia was identified in 36% of elderly participants with education below 9th standard. Dementia was found in 18% of our study population. CONCLUSION: The camp helped identify elderly people suffering from dementia and also created awareness about it. Hypertension and diabetes mellitus were common comorbidities in the study population. Our study suggests that organising screening camps will help identify elderly people living with dementia.

  7. The miRNA Pull Out Assay as a Method to Validate the miR-28-5p Targets Identified in Other Tumor Contexts in Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Milena Rizzo

    2017-01-01

    Full Text Available miR-28-5p is an intragenic miRNA which is underexpressed in several tumor types, showing a tumor suppressor (TS) activity. Routinely, the known miR-28-5p targets are validated in specific tumor contexts, but it is unclear whether these targets are also being regulated in other tumor types. To this end, we adopted the miRNA pull out assay to capture the miR-28-5p targets in DU-145 prostate cancer (PCa) cells. Firstly, we demonstrated that miR-28-5p acts as a TS-miRNA in PCa, affecting cell proliferation, survival, and apoptosis. Secondly, we evaluated the enrichment of the 10 validated miR-28-5p targets in the pull out sample. We showed that E2F6, TEX-261, MAPK1, MPL, N4BP1, and RAP1B, but not BAG1, OTUB1, MAD2L1, and p21, were significantly enriched, suggesting that not all the miR-28-5p targets are regulated by this miRNA in PCa. We then verified whether the miR-28-5p-interacting targets were regulated by this miRNA. We selected E2F6, the most enriched target in the pull out sample, and demonstrated that miR-28-5p downregulated E2F6 at the protein level, suggesting that our approach was effective. In general terms, these findings support the miRNA pull out assay as a useful method to identify context-specific miRNA targets.

  8. Validation of case-finding algorithms derived from administrative data for identifying adults living with human immunodeficiency virus infection.

    Directory of Open Access Journals (Sweden)

    Tony Antoniou

    Full Text Available OBJECTIVE: We sought to validate a case-finding algorithm for human immunodeficiency virus (HIV) infection using administrative health databases in Ontario, Canada. METHODS: We constructed 48 case-finding algorithms using combinations of physician billing claims, hospital and emergency room separations and prescription drug claims. We determined the test characteristics of each algorithm over various time frames for identifying HIV infection, using data abstracted from the charts of 2,040 randomly selected patients receiving care at two medical practices in Toronto, Ontario as the reference standard. RESULTS: With the exception of algorithms using only a single physician claim, the specificity of all algorithms exceeded 99%. An algorithm consisting of three physician claims over a three year period had a sensitivity and specificity of 96.2% (95% CI 95.2%-97.9%) and 99.6% (95% CI 99.1%-99.8%), respectively. Application of the algorithm to the province of Ontario identified 12,179 HIV-infected patients in care for the period spanning April 1, 2007 to March 31, 2009. CONCLUSIONS: Case-finding algorithms generated from administrative data can accurately identify adults living with HIV. A relatively simple "3 claims in 3 years" definition can be used for assembling a population-based cohort and facilitating future research examining trends in health service use and outcomes among HIV-infected adults in Ontario.
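
    A minimal sketch of how a "3 claims in 3 years" rule could be applied to a claims extract (column names and the assumption that rows are already restricted to HIV-coded claims are illustrative; the published algorithm's exact codes are not reproduced here):

    ```python
    import pandas as pd

    # Hypothetical physician-claims extract: one row per HIV-coded billing claim.
    claims = pd.DataFrame({
        "patient_id": [1, 1, 1, 2, 2, 3],
        "service_date": pd.to_datetime([
            "2007-05-01", "2008-02-10", "2009-11-30",   # patient 1: 3 claims within 3 years
            "2007-01-15", "2012-06-01",                  # patient 2: too few / too far apart
            "2008-03-20",                                # patient 3: single claim
        ]),
    })

    def meets_3_in_3(dates: pd.Series, window_days: int = 3 * 365) -> bool:
        """True if any three claims fall within a rolling window of roughly three years."""
        d = dates.sort_values().reset_index(drop=True)
        return any((d[i + 2] - d[i]).days <= window_days for i in range(len(d) - 2))

    flags = claims.groupby("patient_id")["service_date"].apply(meets_3_in_3)
    print(flags)  # patient 1 -> True, patients 2 and 3 -> False
    ```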

  9. Tiered High-Throughput Screening Approach to Identify ...

    Science.gov (United States)

    High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprised of 1,074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single concentration screen were retested in concentration-response. Due to high false positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using

  10. External validation of approaches to prediction of falls during hospital rehabilitation stays and development of a new simpler tool

    Directory of Open Access Journals (Sweden)

    Angela Vratsistas-Curto

    2017-12-01

    Full Text Available Objectives: To test the external validity of 4 approaches to fall prediction in a rehabilitation setting (Predict_FIRST, Ontario Modified STRATIFY (OMS), physiotherapists' judgement of fall risk (PT_Risk), and falls in the past year (Past_Falls)), and to develop and test the validity of a simpler tool for fall prediction in rehabilitation (Predict_CM2). Participants: A total of 300 consecutively-admitted rehabilitation inpatients. Methods: Prospective inception cohort study. Falls during the rehabilitation stay were monitored. Potential predictors were extracted from medical records. Results: Forty-one patients (14%) fell during their rehabilitation stay. The external validity, area under the receiver operating characteristic curve (AUC), for predicting future fallers was: 0.71 (95% confidence interval (95% CI): 0.61–0.81) for OMS (Total_Score); 0.66 (95% CI: 0.57–0.74) for Predict_FIRST; 0.65 (95% CI: 0.57–0.73) for PT_Risk; and 0.52 for Past_Falls (95% CI: 0.46–0.60). A simple 3-item tool (Predict_CM2) was developed from the most predictive individual items (impaired mobility/transfer ability, impaired cognition, and male sex). The accuracy of Predict_CM2 was 0.73 (95% CI: 0.66–0.81), comparable to OMS (Total_Score) (p = 0.52), significantly better than Predict_FIRST (p = 0.04) and Past_Falls (p < 0.001), and approaching significantly better than PT_Risk (p = 0.09). Conclusion: Predict_CM2 is a simpler screening tool with similar accuracy for predicting fallers in rehabilitation to OMS (Total_Score) and better accuracy than Predict_FIRST or Past_Falls. External validation of Predict_CM2 is required.
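
    To make the screening-tool evaluation concrete (the data below are entirely synthetic and the equal one-point-per-item scoring is a simplifying assumption; the real Predict_CM2 weights and cohort are not reproduced here), one could score the three items and compute the AUC against observed falls like this:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n = 300  # cohort size matching the record; the data themselves are synthetic

    # Three binary items, one point each (assumed scoring).
    impaired_mobility = rng.integers(0, 2, n)
    impaired_cognition = rng.integers(0, 2, n)
    male_sex = rng.integers(0, 2, n)
    score = impaired_mobility + impaired_cognition + male_sex  # 0-3 points

    # Synthetic outcome loosely related to the score, roughly 14% fallers overall.
    fall_prob = 0.05 + 0.06 * score
    fell = rng.random(n) < fall_prob

    print(f"AUC of the 3-item score: {roc_auc_score(fell, score):.2f}")
    ```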

  11. Structured methodology review identified seven (RETREAT) criteria for selecting qualitative evidence synthesis approaches.

    Science.gov (United States)

    Booth, Andrew; Noyes, Jane; Flemming, Kate; Gerhardus, Ansgar; Wahlster, Philip; van der Wilt, Gert Jan; Mozygemba, Kati; Refolo, Pietro; Sacchini, Dario; Tummers, Marcia; Rehfuess, Eva

    2018-07-01

    To compare and contrast different methods of qualitative evidence synthesis (QES) against criteria identified from the literature and to map their attributes to inform selection of the most appropriate QES method to answer research questions addressed by qualitative research. Electronic databases, citation searching, and a study register were used to identify studies reporting QES methods. Attributes compiled from 26 methodological papers (2001-2014) were used as a framework for data extraction. Data were extracted into summary tables by one reviewer and then considered within the author team. We identified seven considerations determining choice of methods from the methodological literature, encapsulated within the mnemonic Review question-Epistemology-Time/Timescale-Resources-Expertise-Audience and purpose-Type of data. We mapped 15 different published QES methods against these seven criteria. The final framework focuses on stand-alone QES methods but may also hold potential when integrating quantitative and qualitative data. These findings offer a contemporary perspective as a conceptual basis for future empirical investigation of the advantages and disadvantages of different methods of QES. It is hoped that this will inform appropriate selection of QES approaches. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Mathematical modeling and validation in physiology applications to the cardiovascular and respiratory systems

    CERN Document Server

    Bachar, Mostafa; Kappel, Franz

    2013-01-01

    This volume synthesizes theoretical and practical aspects of both the mathematical and life science viewpoints needed for modeling of the cardiovascular-respiratory system specifically and physiological systems generally.  Theoretical points include model design, model complexity and validation in the light of available data, as well as control theory approaches to feedback delay and Kalman filter applications to parameter identification. State of the art approaches using parameter sensitivity are discussed for enhancing model identifiability through joint analysis of model structure and data. Practical examples illustrate model development at various levels of complexity based on given physiological information. The sensitivity-based approaches for examining model identifiability are illustrated by means of specific modeling  examples. The themes presented address the current problem of patient-specific model adaptation in the clinical setting, where data is typically limited.

  13. Global approach for the validation of an in-line Raman spectroscopic method to determine the API content in real-time during a hot-melt extrusion process.

    Science.gov (United States)

    Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E

    2017-08-15

    Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes are expanding rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real-time. The method was validated based on a univariate and a multivariate approach, and the analytical performances of the obtained models were compared. Moreover, on the one hand, in-line data were correlated with the real API concentration present in the sample, quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at ±5%. This method meets the requirements of the European Pharmacopeia norms for the uniformity of content of single-dose preparations. The validation proves that future results will fall within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its ability to be used in routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
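
    As a hedged sketch of the total-error (accuracy profile) idea, the snippet below computes, for a set of hypothetical replicate recoveries at one concentration level, a simplified β-expectation tolerance interval and checks it against ±15% acceptance limits; the data and the one-level formula are assumptions for illustration, not the authors' validation results:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical relative recoveries (%) from replicate in-line measurements
    # at a single API concentration level.
    recovery = np.array([98.5, 101.2, 99.8, 102.4, 97.9, 100.6, 99.1, 101.7])
    bias = recovery - 100.0          # relative bias, %
    beta = 0.95                      # expectation level
    limits = 15.0                    # acceptance limits, +/- %

    n = len(bias)
    mean_bias = bias.mean()
    s = bias.std(ddof=1)

    # Simplified one-level beta-expectation tolerance interval.
    t = stats.t.ppf((1 + beta) / 2, df=n - 1)
    half_width = t * s * np.sqrt(1 + 1 / n)
    low, high = mean_bias - half_width, mean_bias + half_width

    print(f"tolerance interval for relative bias: [{low:.1f}%, {high:.1f}%]")
    print("within +/-15% acceptance limits:", -limits < low and high < limits)
    ```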

  14. A Human Proximity Operations System test case validation approach

    Science.gov (United States)

    Huber, Justin; Straub, Jeremy

    A Human Proximity Operations System (HPOS) poses numerous risks in a real world environment. These risks range from mundane tasks such as avoiding walls and fixed obstacles to the critical need to keep people and processes safe in the context of the HPOS's situation-specific decision making. Validating the performance of an HPOS, which must operate in a real-world environment, is an ill-posed problem due to the complexity that is introduced by erratic (non-computer) actors. In order to prove the HPOS's usefulness, test cases must be generated to simulate possible actions of these actors, so the HPOS can be shown to be able to perform safely in environments where it will be operated. The HPOS must demonstrate its ability to be as safe as a human, across a wide range of foreseeable circumstances. This paper evaluates the use of test cases to validate HPOS performance and utility. It considers an HPOS's safe performance in the context of a common human activity, moving through a crowded corridor, and extrapolates (based on this) to the suitability of using test cases for AI validation in other areas of prospective application.

  15. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and In-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of a PWR, BWR, CANDU and VVER reactors. It also provides an overview of the ex-vessel corium retention (core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: - Containment Thermal-hydraulics Phenomena; - Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; - Aerosol and Fission Product Behaviour Phenomena; - Iodine Chemistry Phenomena; - Core Melt Distribution and Behaviour in Containment Phenomena; - Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories (as done for the phenomena). An experiment synopsis is provided for each test. Along with a test description

  16. An Immunohistochemical Approach to Identify the Sex of Young Marine Turtles.

    Science.gov (United States)

    Tezak, Boris M; Guthrie, Kathleen; Wyneken, Jeanette

    2017-08-01

    Marine turtles exhibit temperature-dependent sex determination (TSD). During critical periods of embryonic development, the nest's thermal environment directs whether an embryo will develop as a male or female. At warmer sand temperatures, nests tend to produce female-biased sex ratios. The rapid increase of global temperature highlights the need for a clear assessment of its effects on sea turtle sex ratios. However, estimating hatchling sex ratios at rookeries remains imprecise due to the lack of sexual dimorphism in young marine turtles. We rely mainly upon laparoscopic procedures to verify hatchling sex; however, in some species, morphological sex can be ambiguous even at the histological level. Recent studies using immunohistochemical (IHC) techniques identified that embryonic snapping turtle (Chelydra serpentina) ovaries overexpressed a particular cold-induced RNA-binding protein in comparison to testes. This feature allows the identification of females vs. males. We modified this technique to successfully identify the sexes of loggerhead sea turtle (Caretta caretta) hatchlings, and independently confirmed the results by standard histological and laparoscopic methods that reliably identify sex in this species. We next tested the CIRBP IHC method on gonad samples from leatherback turtles (Dermochelys coriacea). Leatherbacks display delayed gonad differentiation, when compared to other sea turtles, making hatchling gonads difficult to sex using standard H&E stain histology. The IHC approach was successful in both C. caretta and D. coriacea samples, offering a much-needed tool to establish baseline hatchling sex ratios, particularly for assessing impacts of climate change effects on leatherback turtle hatchlings and sea turtle demographics. Anat Rec, 300:1512-1518, 2017. © 2017 Wiley Periodicals, Inc.

  17. An affinity pull-down approach to identify the plant cyclic nucleotide interactome

    KAUST Repository

    Donaldson, Lara Elizabeth; Meier, Stuart Kurt

    2013-01-01

    Cyclic nucleotides (CNs) are intracellular second messengers that play an important role in mediating physiological responses to environmental and developmental signals, in species ranging from bacteria to humans. In response to these signals, CNs are synthesized by nucleotidyl cyclases and then act by binding to and altering the activity of downstream target proteins known as cyclic nucleotide-binding proteins (CNBPs). A number of CNBPs have been identified across kingdoms including transcription factors, protein kinases, phosphodiesterases, and channels, all of which harbor conserved CN-binding domains. In plants however, few CNBPs have been identified as homology searches fail to return plant sequences with significant matches to known CNBPs. Recently, affinity pull-down techniques have been successfully used to identify CNBPs in animals and have provided new insights into CN signaling. The application of these techniques to plants has not yet been extensively explored and offers an alternative approach toward the unbiased discovery of novel CNBP candidates in plants. Here, an affinity pull-down technique for the identification of the plant CN interactome is presented. In summary, the method involves an extraction of plant proteins which is incubated with a CN-bait, followed by a series of increasingly stringent elutions that eliminates proteins in a sequential manner according to their affinity to the bait. The eluted and bait-bound proteins are separated by one-dimensional gel electrophoresis, excised, and digested with trypsin after which the resultant peptides are identified by mass spectrometry - techniques that are commonplace in proteomics experiments. The discovery of plant CNBPs promises to provide valuable insight into the mechanism of CN signal transduction in plants. © Springer Science+Business Media New York 2013.

  19. Identifying the Return on Investment for Army Migration to a Modular Open Systems Approach for Future and Legacy Systems

    Science.gov (United States)

    2017-04-05

    Identifying the Return on Investment for Army Migration to a Modular Open Systems Approach for Future and Legacy Systems. Phillip Minor. ... Authorization Act (NDAA) of 2015, cites the modular open systems approach (MOSA) as both a business and technical strategy to reduce the cost of system ... access the service over the network. Combine the advances cited above with the emergence of systems developed using the modular open systems approach ...

  20. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  1. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was carried out within the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, and nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches for identifying component ageing from operational data. Engineering considerations are outside the scope of the present study.
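
    As a generic illustration of the nonparametric side of such an analysis (not the procedure used in the case study itself), the sketch below applies the Laplace trend test to a set of component failure times; a strongly positive statistic suggests failures cluster late in the observation window, which is consistent with ageing. The failure times and window length are invented for the example.

```python
import math

def laplace_trend_test(failure_times, observation_end):
    """Laplace test statistic for a trend in failure times on (0, T].

    U >> 0 suggests inter-failure times are shortening (possible ageing);
    U << 0 suggests reliability growth; |U| below ~1.96 is consistent with
    a homogeneous Poisson process at the 5% level.
    """
    n = len(failure_times)
    t_bar = sum(failure_times) / n
    return (t_bar - observation_end / 2) / (observation_end * math.sqrt(1.0 / (12 * n)))

# Invented operating-experience data: failure times (in years) over a 10-year window.
failures = [2.1, 4.8, 6.5, 7.9, 8.7, 9.3, 9.8]
u = laplace_trend_test(failures, observation_end=10.0)
print(f"Laplace statistic U = {u:.2f}")  # a clearly positive U hints at ageing
```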

  2. Rejoinder: A Construct Validity Approach to the Assessment of Narcissism.

    Science.gov (United States)

    Miller, Joshua D; Lynam, Donald R; Campbell, W Keith

    2016-02-01

    In this rejoinder, we comment on Wright's response to our reanalysis and reinterpretation of the data presented by Wright and colleagues. Two primary differences characterize these perspectives. First, the conceptualization of grandiose narcissism differs such that emotional and ego vulnerability, dysregulation, and pervasive impairments are more characteristic of Wright's conception, likely due to the degree to which it is tied to clinical observations. Our conceptualization is closer to psychopathy and describes an extraverted, dominant, and antagonistic individual who is relatively less likely to be found in clinical settings. Second, our approach to construct validation differs in that we take an empirical perspective that focuses on the degree to which inventories yield scores consistent with a priori predictions. The grandiose dimension of the Pathological Narcissism Inventory (PNI-G) yields data that fail to align with expert ratings of narcissistic personality disorder and grandiose narcissism. We suggest that caution should be taken in treating the PNI-G as a gold standard measure of pathological narcissism, that revision of the PNI-G is required before it can serve as a stand-alone measure of grandiose narcissism, and that the PNI-G should be buttressed by other scales when being used as a measure of grandiose narcissism. © The Author(s) 2015.

  3. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the software verification and validation (SVV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), developed within the Korea Nuclear Instrumentation and Control System (KNICS) project. The SVV criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they have been considered as the acceptance framework to be provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), formal verification and theorem proving, and automated testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; a New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA) and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase.

  4. Validation of a Smartphone-Based Approach to In Situ Cognitive Fatigue Assessment

    Science.gov (United States)

    Linden, Mark

    2017-01-01

    Background Acquired Brain Injuries (ABIs) can result in multiple detrimental cognitive effects, such as reduced memory capability, concentration, and planning. These effects can lead to cognitive fatigue, which can exacerbate the symptoms of ABIs and hinder management and recovery. Assessing cognitive fatigue is difficult due to the largely subjective nature of the condition and existing assessment approaches. Traditional methods of assessment use self-assessment questionnaires delivered in a medical setting, but recent work has attempted to employ more objective cognitive tests as a way of evaluating cognitive fatigue. However, these tests are still predominantly delivered within a medical environment, limiting their utility and efficacy. Objective The aim of this research was to investigate how cognitive fatigue can be accurately assessed in situ, during the quotidian activities of life. It was hypothesized that this assessment could be achieved through the use of mobile assistive technology to assess working memory, sustained attention, information processing speed, reaction time, and cognitive throughput. Methods The study used a bespoke smartphone app to track daily cognitive performance, in order to assess potential levels of cognitive fatigue. Twenty-one participants with no prior reported brain injuries took part in a two-week study, resulting in 81 individual testing instances being collected. The smartphone app delivered three cognitive tests on a daily basis: (1) Spatial Span to measure visuospatial working memory; (2) Psychomotor Vigilance Task (PVT) to measure sustained attention, information processing speed, and reaction time; and (3) a Mental Arithmetic Test to measure cognitive throughput. A smartphone-optimized version of the Mental Fatigue Scale (MFS) self-assessment questionnaire was used as a baseline to assess the validity of the three cognitive tests, as the questionnaire has already been validated in multiple peer-reviewed studies. Results

  5. Validation of a Smartphone-Based Approach to In Situ Cognitive Fatigue Assessment.

    Science.gov (United States)

    Price, Edward; Moore, George; Galway, Leo; Linden, Mark

    2017-08-17

    Acquired Brain Injuries (ABIs) can result in multiple detrimental cognitive effects, such as reduced memory capability, concentration, and planning. These effects can lead to cognitive fatigue, which can exacerbate the symptoms of ABIs and hinder management and recovery. Assessing cognitive fatigue is difficult due to the largely subjective nature of the condition and existing assessment approaches. Traditional methods of assessment use self-assessment questionnaires delivered in a medical setting, but recent work has attempted to employ more objective cognitive tests as a way of evaluating cognitive fatigue. However, these tests are still predominantly delivered within a medical environment, limiting their utility and efficacy. The aim of this research was to investigate how cognitive fatigue can be accurately assessed in situ, during the quotidian activities of life. It was hypothesized that this assessment could be achieved through the use of mobile assistive technology to assess working memory, sustained attention, information processing speed, reaction time, and cognitive throughput. The study used a bespoke smartphone app to track daily cognitive performance, in order to assess potential levels of cognitive fatigue. Twenty-one participants with no prior reported brain injuries took part in a two-week study, resulting in 81 individual testing instances being collected. The smartphone app delivered three cognitive tests on a daily basis: (1) Spatial Span to measure visuospatial working memory; (2) Psychomotor Vigilance Task (PVT) to measure sustained attention, information processing speed, and reaction time; and (3) a Mental Arithmetic Test to measure cognitive throughput. A smartphone-optimized version of the Mental Fatigue Scale (MFS) self-assessment questionnaire was used as a baseline to assess the validity of the three cognitive tests, as the questionnaire has already been validated in multiple peer-reviewed studies. The most highly correlated results
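
    A minimal sketch of the kind of validation step these two records describe, i.e. correlating a daily objective test score against the MFS self-report, assuming made-up per-day values and a rank correlation; none of the numbers or variable names come from the study.

```python
from scipy.stats import spearmanr

# Hypothetical daily data for one participant: mean PVT reaction time (ms)
# and self-reported Mental Fatigue Scale (MFS) score for the same day.
pvt_mean_rt = [312, 298, 345, 360, 301, 330, 322, 355]
mfs_score = [9.5, 8.0, 13.5, 15.0, 9.0, 12.5, 11.0, 14.5]

rho, p_value = spearmanr(pvt_mean_rt, mfs_score)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A positive correlation would indicate that slower reaction times track
# higher self-reported fatigue, supporting the test as an in situ proxy.
```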

  6. Reverse Vaccinology: An Approach for Identifying Leptospiral Vaccine Candidates

    Directory of Open Access Journals (Sweden)

    Odir A. Dellagostin

    2017-01-01

    Full Text Available Leptospirosis is a major public health problem with an incidence of over one million human cases each year. It is a globally distributed, zoonotic disease and is associated with significant economic losses in farm animals. Leptospirosis is caused by pathogenic Leptospira spp. that can infect a wide range of domestic and wild animals. Given the inability to control the cycle of transmission among animals and humans, there is an urgent demand for a new vaccine. Inactivated whole-cell vaccines (bacterins) are routinely used in livestock and domestic animals, however, protection is serovar-restricted and short-term only. To overcome these limitations, efforts have focused on the development of recombinant vaccines, with partial success. Reverse vaccinology (RV) has been successfully applied to many infectious diseases. A growing number of leptospiral genome sequences are now available in public databases, providing an opportunity to search for prospective vaccine antigens using RV. Several promising leptospiral antigens were identified using this approach, although only a few have been characterized and evaluated in animal models. In this review, we summarize the use of RV for leptospirosis and discuss the need for potential improvements for the successful development of a new vaccine towards reducing the burden of human and animal leptospirosis.

  7. Checklists for external validity

    DEFF Research Database (Denmark)

    Dyrvig, Anne-Kirstine; Kidholm, Kristian; Gerke, Oke

    2014-01-01

    ... to an implementation setting. In this paper, currently available checklists on external validity are identified, assessed and used as a basis for proposing a new improved instrument. METHOD: A systematic literature review was carried out in Pubmed, Embase and Cinahl on English-language papers without time restrictions. The retrieved checklist items were assessed for (i) the methodology used in primary literature, justifying inclusion of each item; and (ii) the number of times each item appeared in checklists. RESULTS: Fifteen papers were identified, presenting a total of 21 checklists for external validity, yielding a total of 38 checklist items. Empirical support was considered the most valid methodology for item inclusion. Assessment of methodological justification showed that none of the items were supported empirically. Other kinds of literature justified the inclusion of 22 of the items, and 17 items were included...

  8. Protein Correlation Profiles Identify Lipid Droplet Proteins with High Confidence*

    Science.gov (United States)

    Krahmer, Natalie; Hilger, Maximiliane; Kory, Nora; Wilfling, Florian; Stoehr, Gabriele; Mann, Matthias; Farese, Robert V.; Walther, Tobias C.

    2013-01-01

    Lipid droplets (LDs) are important organelles in energy metabolism and lipid storage. Their cores are composed of neutral lipids that form a hydrophobic phase and are surrounded by a phospholipid monolayer that harbors specific proteins. Most well-established LD proteins perform important functions, particularly in cellular lipid metabolism. Morphological studies show LDs in close proximity to and interacting with membrane-bound cellular organelles, including the endoplasmic reticulum, mitochondria, peroxisomes, and endosomes. Because of these close associations, it is difficult to purify LDs to homogeneity. Consequently, the confident identification of bona fide LD proteins via proteomics has been challenging. Here, we report a methodology for LD protein identification based on mass spectrometry and protein correlation profiles. Using LD purification and quantitative, high-resolution mass spectrometry, we identified LD proteins by correlating their purification profiles to those of known LD proteins. Application of the protein correlation profile strategy to LDs isolated from Drosophila S2 cells led to the identification of 111 LD proteins in a cellular LD fraction in which 1481 proteins were detected. LD localization was confirmed in a subset of identified proteins via microscopy of the expressed proteins, thereby validating the approach. Among the identified LD proteins were both well-characterized LD proteins and proteins not previously known to be localized to LDs. Our method provides a high-confidence LD proteome of Drosophila cells and a novel approach that can be applied to identify LD proteins of other cell types and tissues. PMID:23319140
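
    The core computation described here, correlating each detected protein's purification profile with the profile of established LD proteins, can be sketched in a few lines. The fraction intensities and the 0.9 cut-off below are fabricated placeholders, not values from the paper.

```python
import numpy as np

# Relative intensity of each protein across purification fractions
# (values are fabricated for illustration).
known_ld_profiles = np.array([
    [0.05, 0.10, 0.25, 0.60],   # established LD protein A
    [0.04, 0.12, 0.30, 0.54],   # established LD protein B
])
consensus = known_ld_profiles.mean(axis=0)

candidates = {
    "protein_X": np.array([0.06, 0.11, 0.28, 0.55]),  # co-purifies with LDs
    "protein_Y": np.array([0.70, 0.20, 0.07, 0.03]),  # behaves like a contaminant
}

for name, profile in candidates.items():
    r = np.corrcoef(profile, consensus)[0, 1]
    call = "putative LD protein" if r > 0.9 else "likely co-purifying contaminant"
    print(f"{name}: r = {r:.2f} -> {call}")
```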

  9. Evidence for validity within workplace assessment: the Longitudinal Evaluation of Performance (LEP).

    Science.gov (United States)

    Prescott-Clements, Linda; van der Vleuten, Cees P M; Schuwirth, Lambert W T; Hurst, Yvonne; Rennie, James S

    2008-05-01

    The drive towards valid and reliable assessment methods for health professions' training is becoming increasingly focused towards authentic models of workplace performance assessment. This study investigates the validity of such a method, longitudinal evaluation of performance (LEP), which has been implemented in the assessment of postgraduate dental trainees in Scotland. Although it is similar in format to the mini-CEX (mini clinical evaluation exercise) and other tools that use global ratings for assessing performance in the workplace, a number of differences exist in the way in which the LEP has been implemented. These include the use of a reference point for evaluators' judgement that represents the standard expected upon completion of the training, flexibility, a greater range of cases assessed and the use of frequency scores within feedback to identify trainees' progress over time. A range of qualitative and quantitative data were collected and analysed from 2 consecutive cohorts of trainees in Scotland (2002-03 and 2003-04). There is rich evidence supporting the validity, educational impact and feasibility of the LEP. In particular, a great deal of support was given by trainers for the use of a fixed reference point for judgements, despite initial concerns that this might be demotivating to trainees. Trainers were highly positive about this approach and considered it useful in identifying trainees' progress and helping to drive learning. The LEP has been successful in combining a strong formative approach to continuous assessment with the collection of evidence on performance within the workplace that (alongside other tools within an assessment system) can contribute towards a summative decision regarding competence.

  10. Reverse-translational biomarker validation of Abnormal Repetitive Behaviors in mice: an illustration of the 4P's modeling approach.

    Science.gov (United States)

    Garner, Joseph P; Thogerson, Collette M; Dufour, Brett D; Würbel, Hanno; Murray, James D; Mench, Joy A

    2011-06-01

    The NIMH's new strategic plan, with its emphasis on the "4P's" (Prediction, Pre-emption, Personalization, and Populations) and biomarker-based medicine requires a radical shift in animal modeling methodology. In particular 4P's models will be non-determinant (i.e. disease severity will depend on secondary environmental and genetic factors); and validated by reverse-translation of animal homologues to human biomarkers. A powerful consequence of the biomarker approach is that different closely related disorders have a unique fingerprint of biomarkers. Animals can be validated as a highly specific model of a single disorder by matching this 'fingerprint'; or as a model of a symptom seen in multiple disorders by matching common biomarkers. Here we illustrate this approach with two Abnormal Repetitive Behaviors (ARBs) in mice: stereotypies and barbering (hair pulling). We developed animal versions of the neuropsychological biomarkers that distinguish human ARBs, and tested the fingerprint of the different mouse ARBs. As predicted, the two mouse ARBs were associated with different biomarkers. Both barbering and stereotypy could be discounted as models of OCD (even though they are widely used as such), due to the absence of limbic biomarkers which are characteristic of OCD and hence are necessary for a valid model. Conversely barbering matched the fingerprint of trichotillomania (i.e. selective deficits in set-shifting), suggesting it may be a highly specific model of this disorder. In contrast stereotypies were correlated only with a biomarker (deficits in response shifting) correlated with stereotypies in multiple disorders, suggesting that animal stereotypies model stereotypies in multiple disorders. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Novel Approach for Ensuring Increased Validity in Home Blood Pressure Monitoring

    DEFF Research Database (Denmark)

    Wagner, Stefan Rahr; Toftegaard, Thomas Skjødeberg; Bertelsen, Olav Wedege

    This paper proposes a novel technique to increase the validity of home blood pressure monitoring by using various sensor technologies as part of an intelligent environment platform in the home of the user. A range of recommendations exists on how to obtain a valid blood pressure but with the devi...

  12. Linear Unlearning for Cross-Validation

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Larsen, Jan

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. In this paper we suggest linear unlearning of examples as an approach to approximative cross-validation. Further, we discuss ... time series prediction benchmark demonstrate the potential of the linear unlearning technique...

  13. Construct Validity of the Holistic Complementary and Alternative Medicines Questionnaire (HCAMQ)—An Investigation Using Modern Psychometric Approaches

    Science.gov (United States)

    Kersten, Paula; White, P. J.; Tennant, A.

    2011-01-01

    The scientific basis of efficacy studies of complementary medicine requires the availability of validated measures. The Holistic Complementary and Alternative Medicine Questionnaire (HCAMQ) is one such measure. This article aimed to examine its construct validity, using a modern psychometric approach. The HCAMQ was completed by 221 patients (mean age 66.8, SD 8.29, 58% females) with chronic stable pain predominantly from a single joint (hip or knee) of mechanical origin, waiting for a hip (40%) or knee (60%) joint replacement, on enrolment in a study investigating the effects of acupuncture and placebo controls. The HCAMQ contains a Holistic Health (HH) Subscale (five items) and a CAM subscale (six items). Validity of the subscales was tested using Cronbach alpha's, factor analysis, Mokken scaling and Rasch analysis, which did not support the original two-factor structure of the scale. A five-item HH subscale and a four-item CAM subscale (worded in a negative direction) fitted the Rasch model and were unidimensional (χ2 = 8.44, P = 0.39, PSI = 0.69 versus χ2 = 17.33, P = 0.03, PSI = 0.77). Two CAM items (worded in the positive direction) had significant misfit. In conclusion, we have shown that the original two-factor structure of the HCAMQ could not be supported but that two valid shortened subscales can be used, one for HH Beliefs (four-item HH), and the other for CAM Beliefs (four-item CAM). It is recommended that consideration is given to rewording the two discarded positively worded CAM questions to enhance construct validity. PMID:19793835

  14. Identifying and prioritizing barriers to implementation of smart energy city projects in Europe: An empirical approach

    International Nuclear Information System (INIS)

    Mosannenzadeh, Farnaz; Di Nucci, Maria Rosaria; Vettorato, Daniele

    2017-01-01

    Successful implementation of smart energy city projects in Europe is crucial for a sustainable transition of urban energy systems and the improvement of quality of life for citizens. We aim to develop a systematic classification and analysis of the barriers hindering successful implementation of smart energy city projects. Through an empirical approach, we investigated 43 communities implementing smart and sustainable energy city projects under the Sixth and Seventh Framework Programmes of the European Union. Validated through literature review, we identified 35 barriers categorized in policy, administrative, legal, financial, market, environmental, technical, social, and information-and-awareness dimensions. We prioritized these barriers, using a novel multi-dimensional methodology that simultaneously analyses barriers based on frequency, level of impact, causal relationship among barriers, origin, and scale. The results indicate that the key barriers are lacking or fragmented political support on the long term at the policy level, and lack of good cooperation and acceptance among project partners, insufficient external financial support, lack of skilled and trained personnel, and fragmented ownership at the project level. The outcome of the research should aid policy-makers to better understand and prioritize implementation barriers to develop effective action and policy interventions towards more successful implementation of smart energy city projects. - Highlights: • A solid empirical study on the implementation of European smart energy city projects. • We found 35 barriers in nine dimensions; e.g. policy, legal, financial, and social. • We suggested a new multi-dimensional methodology to prioritize barriers. • Lacking or fragmented political support on the long term is a key barrier. • We provided insights for action for project coordinators and policy makers.

  15. Site characterization and validation - validation drift fracture data, stage 4

    International Nuclear Information System (INIS)

    Bursey, G.; Gale, J.; MacLeod, R.; Straahle, A.; Tiren, S.

    1991-08-01

    This report describes the mapping procedures and the data collected during fracture mapping in the validation drift. Fracture characteristics examined include orientation, trace length, termination mode, and fracture minerals. These data have been compared and analysed together with fracture data from the D-boreholes to determine the adequacy of the borehole mapping procedures and to assess the nature and degree of orientation bias in the borehole data. The analysis of the validation drift data also includes a series of corrections to account for orientation, truncation, and censoring biases. This analysis has identified at least 4 geologically significant fracture sets in the rock mass defined by the validation drift. An analysis of the fracture orientations in both the good rock and the H-zone has defined groups of 7 clusters and 4 clusters, respectively. Subsequent analysis of the fracture patterns in five consecutive sections along the validation drift further identified heterogeneity through the rock mass, with respect to fracture orientations. These results are in stark contrast to the results from the D-borehole analysis, where a strong orientation bias resulted in a consistent pattern of measured fracture orientations through the rock. In the validation drift, fractures in the good rock also display a greater mean variance in length than those in the H-zone. These results provide strong support for a distinction being made between fractures in the good rock and the H-zone, and possibly between different areas of the good rock itself, for discrete modelling purposes. (au) (20 refs.)

  16. A stepwise approach to identify intellectual disabilities in the criminal justice system

    Directory of Open Access Journals (Sweden)

    Valentina Cabral Iversen

    2010-07-01

    Full Text Available A significant proportion of prison inmates have an IQ level corresponding to intellectual disability (ID) or borderline ID. These persons are rarely identified and subsequently not offered any compensation for their learning and comprehension deficits. The purpose of this study was to explore and help provide methods for better identification of ID at an early stage during criminal proceedings. 143 randomly selected prisoners serving sentences in prisons were assessed using the Wechsler Abbreviated Scale of Intelligence (WASI) and the Hayes Ability Screening Index (HASI), while a semi-structured interview was carried out to obtain data on health as well as social and criminological issues. A total of 10.8% (n = 15) of the participants showed an IQ below 70. From previous analyses of the semi-structured interview, a checklist was extracted and found to have good predictive validity for ID (AUC = 93%). The resulting identification referred 32% (n = 46) of the sample for comprehensive assessment. Within this group, all participants with an IQ below 70 were included. Identification through this checklist, the screening and a full assessment is essential in improving the quality of the services.

  17. Identifying determinants of medication adherence following myocardial infarction using the Theoretical Domains Framework and the Health Action Process Approach.

    Science.gov (United States)

    Presseau, Justin; Schwalm, J D; Grimshaw, Jeremy M; Witteman, Holly O; Natarajan, Madhu K; Linklater, Stefanie; Sullivan, Katrina; Ivers, Noah M

    2017-10-01

    Despite evidence-based recommendations, adherence to secondary prevention medications post-myocardial infarction (MI) remains low. Taking medication requires behaviour change, and using behavioural theories to identify what factors determine adherence could help to develop novel adherence interventions. The aim was to compare the utility of different behaviour theory-based approaches for identifying modifiable determinants of medication adherence post-MI that could be targeted by interventions. Two studies were conducted with patients 0-2, 3-12, 13-24 or 25-36 weeks post-MI. Study 1: 24 patients were interviewed about barriers and facilitators to medication adherence. Interviews were conducted and coded using the Theoretical Domains Framework. Study 2: 201 patients answered a telephone questionnaire assessing Health Action Process Approach constructs to predict intention and medication adherence (MMAS-8). Study 1: the domains identified were Beliefs about Consequences, Memory/Attention/Decision Processes, Behavioural Regulation, Social Influences and Social Identity. Study 2: 64, 59, 42 and 58% reported high adherence at 0-2, 3-12, 13-24 and 25-36 weeks. Social Support and Action Planning predicted adherence at all time points, though the relationship between Action Planning and adherence decreased over time. Using two behaviour theory-based approaches provided complementary findings and identified modifiable factors that could be targeted to help translate intention into action to improve medication adherence post-MI.

  18. A physarum-inspired prize-collecting steiner tree approach to identify subnetworks for drug repositioning.

    Science.gov (United States)

    Sun, Yahui; Hameed, Pathima Nusrath; Verspoor, Karin; Halgamuge, Saman

    2016-12-05

    Drug repositioning can reduce the time, costs and risks of drug development by identifying new therapeutic effects for known drugs. It is challenging to reposition drugs as pharmacological data is large and complex. Subnetwork identification has already been used to simplify the visualization and interpretation of biological data, but it has not been applied to drug repositioning so far. In this paper, we fill this gap by proposing a new Physarum-inspired Prize-Collecting Steiner Tree algorithm to identify subnetworks for drug repositioning. Drug Similarity Networks (DSN) are generated using the chemical, therapeutic, protein, and phenotype features of drugs. In DSNs, vertex prizes and edge costs represent the similarities and dissimilarities between drugs respectively, and terminals represent drugs in the cardiovascular class, as defined in the Anatomical Therapeutic Chemical classification system. A new Physarum-inspired Prize-Collecting Steiner Tree algorithm is proposed in this paper to identify subnetworks. We apply both the proposed algorithm and the widely-used GW algorithm to identify subnetworks in our 18 generated DSNs. In these DSNs, our proposed algorithm identifies subnetworks with an average Rand Index of 81.1%, while the GW algorithm can only identify subnetworks with an average Rand Index of 64.1%. We select 9 subnetworks with high Rand Index to find drug repositioning opportunities. 10 frequently occurring drugs in these subnetworks are identified as candidates to be repositioned for cardiovascular diseases. We find evidence to support previous discoveries that nitroglycerin, theophylline and acarbose may be able to be repositioned for cardiovascular diseases. Moreover, we identify seven previously unknown drug candidates that also may interact with the biological cardiovascular system. These discoveries show our proposed Prize-Collecting Steiner Tree approach as a promising strategy for drug repositioning.
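
    The prize-collecting objective underlying this work can be stated compactly: a candidate subnetwork is scored by the vertex prizes it collects minus the edge costs it pays (the usual penalty for excluded terminals is omitted here for brevity). The toy drug-similarity graph and scoring helper below are purely illustrative and do not reproduce the Physarum-inspired solver itself.

```python
import networkx as nx

# Toy drug-similarity network: vertex prizes = similarity to the terminal set,
# edge costs = dissimilarity between drugs (all values invented).
g = nx.Graph()
g.add_nodes_from([("drugA", {"prize": 3.0}), ("drugB", {"prize": 2.5}),
                  ("drugC", {"prize": 0.5}), ("drugD", {"prize": 2.0})])
g.add_weighted_edges_from([("drugA", "drugB", 1.0), ("drugB", "drugC", 2.5),
                           ("drugB", "drugD", 1.2), ("drugA", "drugD", 2.0)],
                          weight="cost")

def pcst_score(graph, subtree_nodes):
    """Prize-collecting score of a node set: collected prizes minus the cost of connecting them."""
    sub = graph.subgraph(subtree_nodes)
    tree = nx.minimum_spanning_tree(sub, weight="cost")  # cheapest way to connect the nodes
    prizes = sum(graph.nodes[n]["prize"] for n in subtree_nodes)
    costs = sum(d["cost"] for _, _, d in tree.edges(data=True))
    return prizes - costs

print(pcst_score(g, {"drugA", "drugB", "drugD"}))  # 3.0 + 2.5 + 2.0 - (1.0 + 1.2) = 5.3
```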

  19. Cross validation in LULOO

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Nørgård, Peter Magnus; Hansen, Lars Kai

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. Linear unlearning of examples has recently been suggested as an approach to approximative cross-validation. Here we briefly review the linear unlearning scheme, dubbed LULOO, and we illustrate it on a system identification example. Further, we address the possibility of extracting confidence information (error bars) from the LULOO ensemble.
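
    For a purely linear model the idea of 'unlearning' one example has a closed form: leave-one-out residuals can be recovered from the full-data fit through the hat matrix, with no retraining. The sketch below shows that shortcut for ordinary least squares as a loose analogy to LULOO, which extends the idea approximately to trained neural networks; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(30), rng.normal(size=(30, 2))])  # design matrix with bias column
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.3, size=30)

# Full-data least-squares fit.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
h = np.diag(H)

# Leave-one-out residuals without retraining: e_i / (1 - h_ii).
loo_residuals = residuals / (1.0 - h)
print("approximate LOO mean squared error:", np.mean(loo_residuals ** 2))
```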

  20. Validation of the Spanish Version of the Emotional Skills Assessment Process (ESAP) with College Students in Mexico

    Science.gov (United States)

    Teliz Triujeque, Rosalia

    2009-01-01

    The major purpose of the study was to determine the construct validity of the Spanish version of the Emotional Skills Assessment Process (ESAP) in a targeted population of agriculture college students in Mexico. The ESAP is a self assessment approach that helps students to identify and understand emotional intelligence skills relevant for…

  1. Who’s Who at the Border? A rights-based approach to identifying human trafficking at international borders

    Directory of Open Access Journals (Sweden)

    Marika McAdam

    2013-09-01

    Full Text Available International borders are widely touted as bastions in the fight against trafficking in persons. This article acknowledges the important role border officials play in preventing human trafficking, but calls for expectations to be tempered by deference to the conceptual complexity of cross-border trafficking and the migration processes involved. The fact that many trafficked victims begin their journeys as irregular or smuggled migrants highlights the challenge posed to border officials in identifying trafficked persons among the people they encounter. Indicators of trafficking generally relate to the exploitation phase, leaving border officials with little guidance as to how persons vulnerable to trafficking can be accurately identified before any exploitation has occurred. Ultimately, this paper advocates a pragmatic rights-based approach in designating anti-trafficking functions to border officials. A rights-based approach to border control acknowledges the core work of border officials as being to uphold border integrity, while ensuring that their performance of this role does not jeopardise the rights of those they intercept nor result in missed opportunities for specialists to identify trafficked persons and other vulnerable people among them.

  2. Development and Validation of the Behavioral Tendencies Questionnaire.

    Directory of Open Access Journals (Sweden)

    Nicholas T Van Dam

    Full Text Available At a fundamental level, taxonomy of behavior and behavioral tendencies can be described in terms of approach, avoid, or equivocate (i.e., neither approach nor avoid). While there are numerous theories of personality, temperament, and character, few seem to take advantage of parsimonious taxonomy. The present study sought to implement this taxonomy by creating a questionnaire based on a categorization of behavioral temperaments/tendencies first identified in Buddhist accounts over fifteen hundred years ago. Items were developed using historical and contemporary texts of the behavioral temperaments, described as "Greedy/Faithful", "Aversive/Discerning", and "Deluded/Speculative". To both maintain this categorical typology and benefit from the advantageous properties of forced-choice response format (e.g., reduction of response biases), binary pairwise preferences for items were modeled using Latent Class Analysis (LCA). One sample (n1 = 394) was used to estimate the item parameters, and the second sample (n2 = 504) was used to classify the participants using the established parameters and cross-validate the classification against multiple other measures. The cross-validated measure exhibited good nomothetic span (construct-consistent relationships with related measures) that seemed to corroborate the ideas present in the original Buddhist source documents. The final 13-block questionnaire created from the best performing items (the Behavioral Tendencies Questionnaire or BTQ) is a psychometrically valid questionnaire that is historically consistent, based in behavioral tendencies, and promises practical and clinical utility particularly in settings that teach and study meditation practices such as Mindfulness Based Stress Reduction (MBSR).

  3. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data was used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE...
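
    The GLUE step mentioned in this abstract (sample parameter sets, score each against observed discharge, keep the 'behavioural' ones) can be sketched generically. The toy single-reservoir model, the Nash-Sutcliffe likelihood measure and the 0.6 acceptance threshold below are illustrative choices, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def linear_reservoir(rainfall, k):
    """Toy rainfall-runoff model: single linear reservoir with recession constant k."""
    storage, flow = 0.0, []
    for p in rainfall:
        storage += p
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

rain = rng.gamma(shape=0.8, scale=5.0, size=200)
observed = linear_reservoir(rain, k=0.3) + rng.normal(scale=0.5, size=200)

# GLUE: Monte Carlo sampling of k, retain the "behavioural" parameter sets.
samples = rng.uniform(0.05, 0.95, size=2000)
scores = np.array([nash_sutcliffe(linear_reservoir(rain, k), observed) for k in samples])
behavioural = samples[scores > 0.6]
print(f"{behavioural.size} behavioural sets, k in "
      f"[{behavioural.min():.2f}, {behavioural.max():.2f}]")
```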

  4. A data-driven modeling approach to identify disease-specific multi-organ networks driving physiological dysregulation.

    Directory of Open Access Journals (Sweden)

    Warren D Anderson

    2017-07-01

    Full Text Available Multiple physiological systems interact throughout the development of a complex disease. Knowledge of the dynamics and connectivity of interactions across physiological systems could facilitate the prevention or mitigation of organ damage underlying complex diseases, many of which are currently refractory to available therapeutics (e.g., hypertension). We studied the regulatory interactions operating within and across organs throughout disease development by integrating in vivo analysis of gene expression dynamics with a reverse engineering approach to infer data-driven dynamic network models of multi-organ gene regulatory influences. We obtained experimental data on the expression of 22 genes across five organs, over a time span that encompassed the development of autonomic nervous system dysfunction and hypertension. We pursued a unique approach for identification of continuous-time models that jointly described the dynamics and structure of multi-organ networks by estimating a sparse subset of ∼12,000 possible gene regulatory interactions. Our analyses revealed that an autonomic dysfunction-specific multi-organ sequence of gene expression activation patterns was associated with a distinct gene regulatory network. We analyzed the model structures for adaptation motifs, and identified disease-specific network motifs involving genes that exhibited aberrant temporal dynamics. Bioinformatic analyses identified disease-specific single nucleotide variants within or near transcription factor binding sites upstream of key genes implicated in maintaining physiological homeostasis. Our approach illustrates a novel framework for investigating the pathogenesis through model-based analysis of multi-organ system dynamics and network properties. Our results yielded novel candidate molecular targets driving the development of cardiovascular disease, metabolic syndrome, and immune dysfunction.
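
    A heavily simplified stand-in for the sparse, data-driven inference step (not the authors' continuous-time estimator) is to regress each gene's expression change on all genes' expression levels with an L1 penalty, so that only a handful of candidate regulatory influences survive. The simulated data, dimensions and penalty below are placeholders.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_timepoints, n_genes = 40, 22

# Simulated expression time series; in the real setting these would come
# from the multi-organ measurements.
expr = rng.normal(size=(n_timepoints, n_genes))
d_expr = np.diff(expr, axis=0)            # finite-difference "rate of change"

# For each target gene, fit a sparse model of its change from all genes' levels.
adjacency = np.zeros((n_genes, n_genes))
for target in range(n_genes):
    model = Lasso(alpha=0.1).fit(expr[:-1], d_expr[:, target])
    adjacency[target] = model.coef_       # row = inferred regulators of `target`

print("non-zero inferred interactions:", int(np.count_nonzero(adjacency)))
```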

  5. Validation of an employee satisfaction model: A structural equation model approach

    Directory of Open Access Journals (Sweden)

    Ophillia Ledimo

    2015-01-01

    Full Text Available The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling approach (SEM). A cross-sectional quantitative survey design was used to collect data from a random sample (n = 759) of permanent employees of a parastatal organisation. Data was collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of SEM analysis, the three domains and latent variables of employee satisfaction were specified as organisational strategy, policies and procedures, and outcomes. Confirmatory factor analysis of the latent variables was conducted, and the path coefficients of the latent variables of the employee satisfaction model indicated a satisfactory fit for all these variables. The goodness-of-fit measure of the model indicated both absolute and incremental goodness-of-fit; confirming the relationships between the latent and manifest variables. It also indicated that the latent variables, organisational strategy, policies and procedures, and outcomes, are the main indicators of employee satisfaction. This study adds to the knowledge base on employee satisfaction and makes recommendations for future research.

  6. Validity and Reliability in Social Science Research

    Science.gov (United States)

    Drost, Ellen A.

    2011-01-01

    In this paper, the author aims to provide novice researchers with an understanding of the general problem of validity in social science research and to acquaint them with approaches to developing strong support for the validity of their research. She provides insight into these two important concepts, namely (1) validity; and (2) reliability, and…

  7. Genomic approach to therapeutic target validation identifies a glucose-lowering GLP1R variant protective for coronary heart disease

    Science.gov (United States)

    Scott, Robert A.; Freitag, Daniel F.; Li, Li; Chu, Audrey Y.; Surendran, Praveen; Young, Robin; Grarup, Niels; Stancáková, Alena; Chen, Yuning; V.Varga, Tibor; Yaghootkar, Hanieh; Luan, Jian'an; Zhao, Jing Hua; Willems, Sara M.; Wessel, Jennifer; Wang, Shuai; Maruthur, Nisa; Michailidou, Kyriaki; Pirie, Ailith; van der Lee, Sven J.; Gillson, Christopher; Olama, Ali Amin Al; Amouyel, Philippe; Arriola, Larraitz; Arveiler, Dominique; Aviles-Olmos, Iciar; Balkau, Beverley; Barricarte, Aurelio; Barroso, Inês; Garcia, Sara Benlloch; Bis, Joshua C.; Blankenberg, Stefan; Boehnke, Michael; Boeing, Heiner; Boerwinkle, Eric; Borecki, Ingrid B.; Bork-Jensen, Jette; Bowden, Sarah; Caldas, Carlos; Caslake, Muriel; Cupples, L. Adrienne; Cruchaga, Carlos; Czajkowski, Jacek; den Hoed, Marcel; Dunn, Janet A.; Earl, Helena M.; Ehret, Georg B.; Ferrannini, Ele; Ferrieres, Jean; Foltynie, Thomas; Ford, Ian; Forouhi, Nita G.; Gianfagna, Francesco; Gonzalez, Carlos; Grioni, Sara; Hiller, Louise; Jansson, Jan-Håkan; Jørgensen, Marit E.; Jukema, J. Wouter; Kaaks, Rudolf; Kee, Frank; Kerrison, Nicola D.; Key, Timothy J.; Kontto, Jukka; Kote-Jarai, Zsofia; Kraja, Aldi T.; Kuulasmaa, Kari; Kuusisto, Johanna; Linneberg, Allan; Liu, Chunyu; Marenne, Gaëlle; Mohlke, Karen L.; Morris, Andrew P.; Muir, Kenneth; Müller-Nurasyid, Martina; Munroe, Patricia B.; Navarro, Carmen; Nielsen, Sune F.; Nilsson, Peter M.; Nordestgaard, Børge G.; Packard, Chris J.; Palli, Domenico; Panico, Salvatore; Peloso, Gina M.; Perola, Markus; Peters, Annette; Poole, Christopher J.; Quirós, J. Ramón; Rolandsson, Olov; Sacerdote, Carlotta; Salomaa, Veikko; Sánchez, María-José; Sattar, Naveed; Sharp, Stephen J.; Sims, Rebecca; Slimani, Nadia; Smith, Jennifer A.; Thompson, Deborah J.; Trompet, Stella; Tumino, Rosario; van der A, Daphne L.; van der Schouw, Yvonne T.; Virtamo, Jarmo; Walker, Mark; Walter, Klaudia; Abraham, Jean E.; Amundadottir, Laufey T.; Aponte, Jennifer L.; Butterworth, Adam S.; Dupuis, Josée; Easton, Douglas F.; Eeles, Rosalind A.; Erdmann, Jeanette; Franks, Paul W.; Frayling, Timothy M.; Hansen, Torben; Howson, Joanna M. M.; Jørgensen, Torben; Kooner, Jaspal; Laakso, Markku; Langenberg, Claudia; McCarthy, Mark I.; Pankow, James S.; Pedersen, Oluf; Riboli, Elio; Rotter, Jerome I.; Saleheen, Danish; Samani, Nilesh J.; Schunkert, Heribert; Vollenweider, Peter; O'Rahilly, Stephen; Deloukas, Panos; Danesh, John; Goodarzi, Mark O.; Kathiresan, Sekar; Meigs, James B.; Ehm, Margaret G.; Wareham, Nicholas J.; Waterworth, Dawn M.

    2016-01-01

    Regulatory authorities have indicated that new drugs to treat type 2 diabetes (T2D) should not be associated with an unacceptable increase in cardiovascular risk. Human genetics may be able to inform development of antidiabetic therapies by predicting cardiovascular and other health endpoints. We therefore investigated the association of variants in 6 genes that encode drug targets for obesity or T2D with a range of metabolic traits in up to 11,806 individuals by targeted exome sequencing, and follow-up in 39,979 individuals by targeted genotyping, with additional in silico follow-up in consortia. We used these data to first compare associations of variants in genes encoding drug targets with the effects of pharmacological manipulation of those targets in clinical trials. We then tested the association of those variants with disease outcomes, including coronary heart disease, to predict cardiovascular safety of these agents. A low-frequency missense variant (Ala316Thr;rs10305492) in the gene encoding glucagon-like peptide-1 receptor (GLP1R), the target of GLP1R agonists, was associated with lower fasting glucose and lower T2D risk, consistent with GLP1R agonist therapies. The minor allele was also associated with protection against heart disease, thus providing evidence that GLP1R agonists are not likely to be associated with an unacceptable increase in cardiovascular risk. Our results provide an encouraging signal that these agents may be associated with benefit, a question currently being addressed in randomised controlled trials. Genetic variants associated with metabolic traits and multiple disease outcomes can be used to validate therapeutic targets at an early stage in the drug development process. PMID:27252175

  8. Accurately identifying patients who are excellent candidates or unsuitable for a medication: a novel approach

    Directory of Open Access Journals (Sweden)

    South C

    2017-12-01

    Full Text Available Objective: The objective of the study was to determine whether a unique analytic approach – as a proof of concept – could identify individual depressed outpatients (using 30 baseline clinical and demographic variables) who are very likely (75% certain) to not benefit (NB) or to remit (R), accepting that without sufficient certainty, no prediction (NP) would be made. Methods: Patients from the Combining Medications to Enhance Depression Outcomes trial treated with escitalopram (S-CIT) + placebo (n = 212) or S-CIT + bupropion-SR (n = 206) were analyzed separately to assess replicability. For each treatment, the elastic net was used to identify subsets of predictive baseline measures for R and NB, separately. Two different equations that estimate the likelihood of remission and no benefit were developed for each patient. The ratio of these two numbers characterized likely outcomes for each patient. Results: The two treatment cells had comparable rates of remission (40%) and no benefit (22%). In S-CIT + bupropion-SR, 11 were predicted NB, of which 82% were correct; 26 were predicted R – 85% correct (169 had NP). For S-CIT + placebo, 13 were predicted NB – 69% correct; 44 were predicted R – 75% correct (155 were NP). Overall, 94/418 (22%) patients were identified with a meaningful degree of certainty (69%–85% correct). Different variable sets with some overlap were predictive of remission and no benefit within and across treatments, despite comparable outcomes. Conclusion: In two separate analyses with two...
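
    The decision rule described here, predicting remission or no benefit only when the model is sufficiently certain and otherwise making no prediction, can be mimicked with off-the-shelf tools. The penalised logistic model, simulated predictors and the 0.75/0.25 cut-offs below are a generic sketch, not the authors' fitted equations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 30))                      # 30 baseline clinical/demographic features
y = (X[:, 0] - X[:, 1] + rng.normal(size=400) > 0)  # True = remission, False = no benefit (simulated)

# Elastic-net-penalised logistic regression (the elasticnet penalty requires the saga solver).
clf = LogisticRegression(penalty="elasticnet", l1_ratio=0.5, solver="saga",
                         C=1.0, max_iter=5000).fit(X, y)

proba_remit = clf.predict_proba(X)[:, 1]
labels = np.where(proba_remit >= 0.75, "predict remission",
                  np.where(proba_remit <= 0.25, "predict no benefit", "no prediction"))
for tag in ("predict remission", "predict no benefit", "no prediction"):
    print(tag, int((labels == tag).sum()))
```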

  9. To Be or Not to Be Associated: Power study of four statistical modeling approaches to identify parasite associations in cross-sectional studies

    Directory of Open Access Journals (Sweden)

    Elise eVaumourin

    2014-05-01

    Full Text Available A growing number of studies are reporting simultaneous infections by parasites in many different hosts. The detection of whether these parasites are significantly associated is important in medicine and epidemiology. Numerous approaches to detect associations are available, but only a few provide statistical tests. Furthermore, they generally test for an overall detection of association and do not identify which parasite is associated with which other one. Here, we developed a new approach, the association screening approach, to detect the overall and the detail of multi-parasite associations. We studied the power of this new approach and of three other known ones (i.e. the generalized chi-square, the network and the multinomial GLM approaches) to identify parasite associations either due to parasite interactions or to confounding factors. We applied these four approaches to detect associations within two populations of multi-infected hosts: 1) rodents infected with Bartonella sp., Babesia microti and Anaplasma phagocytophilum and 2) a bovine population infected with Theileria sp. and Babesia sp. We found that the best power is obtained with the screening model and the generalized chi-square test. The differentiation between associations due to confounding factors and those due to parasite interactions was not possible. The screening approach significantly identified associations between Bartonella doshiae and B. microti, and between T. parva, T. mutans and T. velifera. Thus, the screening approach was relevant to test the overall presence of parasite associations and identify the parasite combinations that are significantly over- or under-represented. Unravelling whether the associations are due to real biological interactions or confounding factors should be further investigated. Nevertheless, in the age of genomics and the advent of new technologies, it is a considerable asset to speed up research focusing on the mechanisms driving interactions.
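
    As a generic illustration of testing whether two parasites co-occur more often than expected by chance (not the authors' association screening model), one can compare observed co-infection counts against independence with an exact test; the counts below are invented.

```python
from scipy.stats import fisher_exact

# Invented cross-sectional counts of hosts by infection status with parasites A and B.
#                        B present   B absent
co_infection_table = [[30,         20],    # A present
                      [15,         85]]    # A absent

odds_ratio, p_value = fisher_exact(co_infection_table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
# An odds ratio well above 1 with a small p-value flags A and B as positively
# associated; whether that reflects a true interaction or a shared confounder
# (e.g. host age or habitat) cannot be decided from the table alone.
```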

  10. Development and initial validation of the determinants of physical activity questionnaire.

    Science.gov (United States)

    Taylor, Natalie; Lawton, Rebecca; Conner, Mark

    2013-06-11

    Physical activity interventions are more likely to be effective if they target causal determinants of behaviour change. Targeting requires accurate identification of specific theoretical determinants of physical activity. Two studies were undertaken to develop and validate the Determinants of Physical Activity Questionnaire. In Study 1, 832 male and female university staff and students were recruited from 49 universities across the UK and completed the 66-item measure, which is based on the Theoretical Domains Framework. Confirmatory factor analysis was undertaken on a calibration sample to generate the model, which resulted in a loss of 31 items. A validation sample was used to cross-validate the model. 20 new items were added and Study 2 tested the revised model in a sample of 466 male and female university students together with a physical activity measure. The final model consisted of 11 factors and 34 items, and CFA produced a reasonable fit χ2 (472) = 852.3, p < .001, CFI = .933, SRMR = .105, RMSEA = .042 (CI = .037-.046), as well as generally acceptable levels of discriminant validity, internal consistency, and test-retest reliability. Eight subscales significantly differentiated between high and low exercisers, indicating that those who exercise less report more barriers for physical activity. A theoretically underpinned measure of determinants of physical activity has been developed with reasonable reliability and validity. Further work is required to test the measure amongst a more representative sample. This study provides an innovative approach to identifying potential barriers to physical activity. This approach illustrates a method for moving from diagnosing implementation difficulties to designing and evaluating interventions.

  11. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    OpenAIRE

    Dori Barnett

    2012-01-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about da...

  12. Validation of a Consensus Method for Identifying Delirium from Hospital Records

    Science.gov (United States)

    Kuhn, Elvira; Du, Xinyi; McGrath, Keith; Coveney, Sarah; O'Regan, Niamh; Richardson, Sarah; Teodorczuk, Andrew; Allan, Louise; Wilson, Dan; Inouye, Sharon K.; MacLullich, Alasdair M. J.; Meagher, David; Brayne, Carol; Timmons, Suzanne; Davis, Daniel

    2014-01-01

    Background Delirium is increasingly considered to be an important determinant of trajectories of cognitive decline. Therefore, analyses of existing cohort studies measuring cognitive outcomes could benefit from methods to ascertain a retrospective delirium diagnosis. This study aimed to develop and validate such a method for delirium detection using routine medical records in UK and Ireland. Methods A point prevalence study of delirium provided the reference-standard ratings for delirium diagnosis. Blinded to study results, clinical vignettes were compiled from participants' medical records in a standardised manner, describing any relevant delirium symptoms recorded in the whole case record for the period leading up to case-ascertainment. An expert panel rated each vignette as unlikely, possible, or probable delirium and disagreements were resolved by consensus. Results From 95 case records, 424 vignettes were abstracted by 5 trained clinicians. There were 29 delirium cases according to the reference standard. Median age of subjects was 76.6 years (interquartile range 54.6 to 82.5). Against the original study DSM-IV diagnosis, the chart abstraction method gave a positive likelihood ratio (LR) of 7.8 (95% CI 5.7–12.0) and the negative LR of 0.45 (95% CI 0.40–0.47) for probable delirium (sensitivity 0.58 (95% CI 0.53–0.62); specificity 0.93 (95% CI 0.90–0.95); AUC 0.86 (95% CI 0.82–0.89)). The method diagnosed possible delirium with positive LR 3.5 (95% CI 2.9–4.3) and negative LR 0.15 (95% CI 0.11–0.21) (sensitivity 0.89 (95% CI 0.85–0.91); specificity 0.75 (95% CI 0.71–0.79); AUC 0.86 (95% CI 0.80–0.89)). Conclusions This chart abstraction method can retrospectively diagnose delirium in hospitalised patients with good accuracy. This has potential for retrospectively identifying delirium in cohort studies where routine medical records are available. This example of record linkage between hospitalisations and epidemiological data may lead to

  13. Validation of a consensus method for identifying delirium from hospital records.

    Directory of Open Access Journals (Sweden)

    Elvira Kuhn

Full Text Available Delirium is increasingly considered to be an important determinant of trajectories of cognitive decline. Therefore, analyses of existing cohort studies measuring cognitive outcomes could benefit from methods to ascertain a retrospective delirium diagnosis. This study aimed to develop and validate such a method for delirium detection using routine medical records in UK and Ireland. A point prevalence study of delirium provided the reference-standard ratings for delirium diagnosis. Blinded to study results, clinical vignettes were compiled from participants' medical records in a standardised manner, describing any relevant delirium symptoms recorded in the whole case record for the period leading up to case-ascertainment. An expert panel rated each vignette as unlikely, possible, or probable delirium and disagreements were resolved by consensus. From 95 case records, 424 vignettes were abstracted by 5 trained clinicians. There were 29 delirium cases according to the reference standard. Median age of subjects was 76.6 years (interquartile range 54.6 to 82.5). Against the original study DSM-IV diagnosis, the chart abstraction method gave a positive likelihood ratio (LR) of 7.8 (95% CI 5.7-12.0) and a negative LR of 0.45 (95% CI 0.40-0.47) for probable delirium (sensitivity 0.58 (95% CI 0.53-0.62); specificity 0.93 (95% CI 0.90-0.95); AUC 0.86 (95% CI 0.82-0.89)). The method diagnosed possible delirium with positive LR 3.5 (95% CI 2.9-4.3) and negative LR 0.15 (95% CI 0.11-0.21) (sensitivity 0.89 (95% CI 0.85-0.91); specificity 0.75 (95% CI 0.71-0.79); AUC 0.86 (95% CI 0.80-0.89)). This chart abstraction method can retrospectively diagnose delirium in hospitalised patients with good accuracy. This has potential for retrospectively identifying delirium in cohort studies where routine medical records are available. This example of record linkage between hospitalisations and epidemiological data may lead to further insights into the inter-relationship between acute

  14. The Metacognitive Anger Processing (MAP) Scale - Validation in a Mixed Clinical and a Forensic In-Patient Sample

    DEFF Research Database (Denmark)

    Moeller, Stine Bjerrum; Bech, Per

    2018-01-01

    BACKGROUND: The metacognitive approach by Wells and colleagues has gained empirical support with a broad range of symptoms. The Metacognitive Anger Processing (MAP) scale was developed to provide a metacognitive measure on anger (Moeller, 2016). In the preliminary validation, three components were...... identified (positive beliefs, negative beliefs and rumination) to be positively correlated with the anger. AIMS: To validate the MAP in a sample of mixed clinical patients (n = 88) and a sample of male forensic patients (n = 54). METHOD: The MAP was administered together with measures of metacognition, anger......, rumination, anxiety and depressive symptoms. RESULTS: The MAP showed acceptable scalability and excellent reliability. Convergent validity was evidenced using the general metacognitive measure (MCQ-30), and concurrent validity was supported using two different anger measures (STAXI-2 and NAS). CONCLUSIONS...

  15. Validation of Inhibition Effect in the Cellulose Hydrolysis: a Dynamic Modelling Approach

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Tsai, Chien-Tai; Meyer, Anne S.

    2011-01-01

    Enzymatic hydrolysis is one of the main steps in the processing of bioethanol from lignocellulosic raw materials. However, complete understanding of the underlying phenomena is still under development. Hence, this study has focused on validation of the inhibition effects in the cellulosic biomass...... for parameter estimation (calibration) and validation purposes. The model predictions using calibrated parameters have shown good agreement with the validation data sets, which provides credibility to the model structure and the parameter values....

  16. A generalized approach for producing, quantifying, and validating citizen science data from wildlife images.

    Science.gov (United States)

    Swanson, Alexandra; Kosmala, Margaret; Lintott, Chris; Packer, Craig

    2016-06-01

    Citizen science has the potential to expand the scope and scale of research in ecology and conservation, but many professional researchers remain skeptical of data produced by nonexperts. We devised an approach for producing accurate, reliable data from untrained, nonexpert volunteers. On the citizen science website www.snapshotserengeti.org, more than 28,000 volunteers classified 1.51 million images taken in a large-scale camera-trap survey in Serengeti National Park, Tanzania. Each image was circulated to, on average, 27 volunteers, and their classifications were aggregated using a simple plurality algorithm. We validated the aggregated answers against a data set of 3829 images verified by experts and calculated 3 certainty metrics-level of agreement among classifications (evenness), fraction of classifications supporting the aggregated answer (fraction support), and fraction of classifiers who reported "nothing here" for an image that was ultimately classified as containing an animal (fraction blank)-to measure confidence that an aggregated answer was correct. Overall, aggregated volunteer answers agreed with the expert-verified data on 98% of images, but accuracy differed by species commonness such that rare species had higher rates of false positives and false negatives. Easily calculated analysis of variance and post-hoc Tukey tests indicated that the certainty metrics were significant indicators of whether each image was correctly classified or classifiable. Thus, the certainty metrics can be used to identify images for expert review. Bootstrapping analyses further indicated that 90% of images were correctly classified with just 5 volunteers per image. Species classifications based on the plurality vote of multiple citizen scientists can provide a reliable foundation for large-scale monitoring of African wildlife. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
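The aggregation logic described above can be illustrated with a short sketch: a plurality vote over volunteer labels plus the three certainty metrics. It assumes evenness is computed as Pielou's index, and the example votes are invented.

```python
from collections import Counter
import math

def aggregate_image(classifications):
    """Aggregate volunteer species labels for one image by plurality vote and
    compute three certainty metrics (illustrative sketch, not the authors' code)."""
    counts = Counter(classifications)
    answer, top = counts.most_common(1)[0]
    n = len(classifications)
    # Pielou-style evenness of the label distribution (high = strong disagreement).
    probs = [c / n for c in counts.values()]
    shannon = -sum(p * math.log(p) for p in probs)
    evenness = shannon / math.log(len(counts)) if len(counts) > 1 else 0.0
    fraction_support = top / n
    fraction_blank = counts.get("nothing here", 0) / n
    return answer, evenness, fraction_support, fraction_blank

votes = ["wildebeest"] * 22 + ["buffalo"] * 3 + ["nothing here"] * 2
print(aggregate_image(votes))
```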

  17. Identifying perinatal risk factors for infant maltreatment: an ecological approach

    Directory of Open Access Journals (Sweden)

    Hallisey Elaine J

    2006-12-01

    Full Text Available Abstract Background Child maltreatment and its consequences are a persistent problem throughout the world. Public health workers, human services officials, and others are interested in new and efficient ways to determine which geographic areas to target for intervention programs and resources. To improve assessment efforts, selected perinatal factors were examined, both individually and in various combinations, to determine if they are associated with increased risk of infant maltreatment. State of Georgia birth records and abuse and neglect data were analyzed using an area-based, ecological approach with the census tract as a surrogate for the community. Cartographic visualization suggested some correlation exists between risk factors and child maltreatment, so bivariate and multivariate regression were performed. The presence of spatial autocorrelation precluded the use of traditional ordinary least squares regression, therefore a spatial regression model coupled with maximum likelihood estimation was employed. Results Results indicate that all individual factors or their combinations are significantly associated with increased risk of infant maltreatment. The set of perinatal risk factors that best predicts infant maltreatment rates are: mother smoked during pregnancy, families with three or more siblings, maternal age less than 20 years, births to unmarried mothers, Medicaid beneficiaries, and inadequate prenatal care. Conclusion This model enables public health to take a proactive stance, to reasonably predict areas where poor outcomes are likely to occur, and to therefore more efficiently allocate resources. U.S. states that routinely collect the variables the National Center for Health Statistics (NCHS defines for birth certificates can easily identify areas that are at high risk for infant maltreatment. The authors recommend that agencies charged with reducing child maltreatment target communities that demonstrate the perinatal risks

  18. Validity and Reliability of the Questionnaire for Assessing Women’s Reproductive History in Azar Cohort Study

    Directory of Open Access Journals (Sweden)

    Mohammad Zakaria Pezeshki

    2017-06-01

Full Text Available This study was done to evaluate the validity and reliability of the women's reproductive history questionnaire to be used in the Azar Cohort study, a cohort conducted by Tabriz University of Medical Sciences in Shabestar county to identify risk factors of non-communicable diseases. Content and face validity were evaluated by ten experts in the field and quantified as the content validity index (CVI) and content validity ratio (CVR). To assess reliability, using a test-retest approach, the kappa statistic was calculated for categorical variables and the intra-class correlation coefficient (ICC) was used for the quantitative items. The calculated CVI and CVR were 0.91 and 0.94, respectively. Reliability for all items was high: the ICC was 0.99 and the kappa statistic was equal to 1. The final version of the questionnaire was redesigned in 26 items with 7 subscales.
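For illustration, the sketch below shows how the validity and reliability statistics named above (Lawshe's CVR, item-level CVI, and Cohen's kappa for test-retest of a categorical item) are typically computed; all ratings are hypothetical.

```python
# Illustrative sketch of the validity/reliability statistics mentioned above;
# all input data below are hypothetical.
def content_validity_ratio(n_essential, n_experts):
    # Lawshe's CVR: (n_e - N/2) / (N/2)
    return (n_essential - n_experts / 2) / (n_experts / 2)

def item_cvi(relevance_ratings, relevant_levels=(3, 4)):
    # Proportion of experts rating the item as relevant on a 4-point scale.
    return sum(r in relevant_levels for r in relevance_ratings) / len(relevance_ratings)

def cohen_kappa(test, retest):
    # Test-retest agreement for a categorical item.
    n = len(test)
    categories = set(test) | set(retest)
    po = sum(a == b for a, b in zip(test, retest)) / n
    pe = sum((test.count(c) / n) * (retest.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

print(content_validity_ratio(n_essential=9, n_experts=10))
print(item_cvi([4, 4, 3, 4, 3, 4, 4, 3, 4, 4]))
print(cohen_kappa(["yes", "no", "yes", "yes"], ["yes", "no", "yes", "yes"]))
```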

  19. Empirical Validation of Listening Proficiency Guidelines

    Science.gov (United States)

    Cox, Troy L.; Clifford, Ray

    2014-01-01

    Because listening has received little attention and the validation of ability scales describing multidimensional skills is always challenging, this study applied a multistage, criterion-referenced approach that used a framework of aligned audio passages and listening tasks to explore the validity of the ACTFL and related listening proficiency…

  20. Integrative screening approach identifies regulators of polyploidization and targets for acute megakaryocytic leukemia

    Science.gov (United States)

    Wen, Qiang; Goldenson, Benjamin; Silver, Serena J.; Schenone, Monica; Dancik, Vladimir; Huang, Zan; Wang, Ling-Zhi; Lewis, Timothy; An, W. Frank; Li, Xiaoyu; Bray, Mark-Anthony; Thiollier, Clarisse; Diebold, Lauren; Gilles, Laure; Vokes, Martha S.; Moore, Christopher B.; Bliss-Moreau, Meghan; VerPlank, Lynn; Tolliday, Nicola J.; Mishra, Rama; Vemula, Sasidhar; Shi, Jianjian; Wei, Lei; Kapur, Reuben; Lopez, Cécile K.; Gerby, Bastien; Ballerini, Paola; Pflumio, Francoise; Gilliland, D. Gary; Goldberg, Liat; Birger, Yehudit; Izraeli, Shai; Gamis, Alan S.; Smith, Franklin O.; Woods, William G.; Taub, Jeffrey; Scherer, Christina A.; Bradner, James; Goh, Boon-Cher; Mercher, Thomas; Carpenter, Anne E.; Gould, Robert J.; Clemons, Paul A.; Carr, Steven A.; Root, David E.; Schreiber, Stuart L.; Stern, Andrew M.; Crispino, John D.

    2012-01-01

    Summary The mechanism by which cells decide to skip mitosis to become polyploid is largely undefined. Here we used a high-content image-based screen to identify small-molecule probes that induce polyploidization of megakaryocytic leukemia cells and serve as perturbagens to help understand this process. We found that dimethylfasudil (diMF, H-1152P) selectively increased polyploidization, mature cell-surface marker expression, and apoptosis of malignant megakaryocytes. A broadly applicable, highly integrated target identification approach employing proteomic and shRNA screening revealed that a major target of diMF is Aurora A kinase (AURKA), which has not been studied extensively in megakaryocytes. Moreover, we discovered that MLN8237 (Alisertib), a selective inhibitor of AURKA, induced polyploidization and expression of mature megakaryocyte markers in AMKL blasts and displayed potent anti-AMKL activity in vivo. This research provides the rationale to support clinical trials of MLN8237 and other inducers of polyploidization in AMKL. Finally, we have identified five networks of kinases that regulate the switch to polyploidy. PMID:22863010

  1. Calibrated photostimulated luminescence is an effective approach to identify irradiated orange during storage

    Science.gov (United States)

    Jo, Yunhee; Sanyal, Bhaskar; Chung, Namhyeok; Lee, Hyun-Gyu; Park, Yunji; Park, Hae-Jun; Kwon, Joong-Ho

    2015-06-01

Photostimulated luminescence (PSL) has been employed as a fast screening method for various irradiated foods. In this study the potential use of PSL was evaluated to identify oranges irradiated with gamma rays, electron beam, and X-rays (0-2 kGy) and stored under different conditions for 6 weeks. The effects of light conditions (natural light, artificial light, and dark) and storage temperatures (4 and 20 °C) on PSL photon counts (PCs) during post-irradiation periods were studied. Non-irradiated samples always showed negative values of PCs, while irradiated oranges exhibited intermediate results on first PSL measurements; however, the irradiated samples had much higher PCs. The PCs of all the samples declined as the storage time increased. Calibrated second PSL measurements showed a PSL ratio <10 for the irradiated samples after 3 weeks of irradiation, confirming their irradiation status under all the storage conditions. Calibrated PSL and sample storage in the dark at 4 °C were found to be the most suitable approaches to identify irradiated oranges during storage.
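A hedged sketch of the two-step screening logic described above follows. The lower/upper screening thresholds are assumed values of the kind commonly used in PSL food screening, not figures from this study; only the calibrated-ratio cut-off of 10 comes from the abstract.

```python
# Hedged sketch of two-step PSL screening. The screening thresholds (LOWER, UPPER)
# are assumptions for illustration; the ratio cut-off of 10 follows the abstract.
LOWER, UPPER = 700, 5000        # photon counts per measurement window (assumed)

def first_screen(photon_counts):
    if photon_counts < LOWER:
        return "negative"
    if photon_counts > UPPER:
        return "positive"
    return "intermediate"

def calibrated_verdict(initial_counts, counts_after_calibration_dose):
    # Second, calibrated measurement: a small ratio means the sample already
    # carried a large radiation-induced signal before the calibration dose.
    ratio = counts_after_calibration_dose / initial_counts
    return "irradiated" if ratio < 10 else "non-irradiated"

print(first_screen(1200))              # -> intermediate
print(calibrated_verdict(1200, 4800))  # -> irradiated (ratio = 4)
```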

  2. An Investigation to Validate the Grammar and Phonology Screening (GAPS) Test to Identify Children with Specific Language Impairment

    Science.gov (United States)

    van der Lely, Heather K. J.; Payne, Elisabeth; McClelland, Alastair

    2011-01-01

    Background The extraordinarily high incidence of grammatical language impairments in developmental disorders suggests that this uniquely human cognitive function is “fragile”. Yet our understanding of the neurobiology of grammatical impairments is limited. Furthermore, there is no “gold-standard” to identify grammatical impairments and routine screening is not undertaken. An accurate screening test to identify grammatical abilities would serve the research, health and education communities, further our understanding of developmental disorders, and identify children who need remediation, many of whom are currently un-diagnosed. A potential realistic screening tool that could be widely administered is the Grammar and Phonology Screening (GAPS) test – a 10 minute test that can be administered by professionals and non-professionals alike. Here we provide a further step in evaluating the validity and accuracy (sensitivity and specificity) of the GAPS test in identifying children who have Specific Language Impairment (SLI). Methods and Findings We tested three groups of children; two groups aged 3;6–6:6, a typically developing (n = 30) group, and a group diagnosed with SLI: (n = 11) (Young (Y)-SLI), and a further group aged 6;9–8;11 with SLI (Older (O)-SLI) (n = 10) who were above the test age norms. We employed a battery of language assessments including the GAPS test to assess the children's language abilities. For Y-SLI children, analyses revealed a sensitivity and specificity at the 5th and 10th percentile of 1.00 and 0.98, respectively, and for O-SLI children at the 10th and 15th percentile .83 and .90, respectively. Conclusions The findings reveal that the GAPS is highly accurate in identifying impaired vs. non-impaired children up to 6;8 years, and has moderate-to-high accuracy up to 9 years. The results indicate that GAPS is a realistic tool for the early identification of grammatical abilities and impairment in young children. A larger

  3. Biomarkers of systemic lupus erythematosus identified using mass spectrometry-based proteomics: a systematic review.

    Science.gov (United States)

    Nicolaou, Orthodoxia; Kousios, Andreas; Hadjisavvas, Andreas; Lauwerys, Bernard; Sokratous, Kleitos; Kyriacou, Kyriacos

    2017-05-01

    Advances in mass spectrometry technologies have created new opportunities for discovering novel protein biomarkers in systemic lupus erythematosus (SLE). We performed a systematic review of published reports on proteomic biomarkers identified in SLE patients using mass spectrometry-based proteomics and highlight their potential disease association and clinical utility. Two electronic databases, MEDLINE and EMBASE, were systematically searched up to July 2015. The methodological quality of studies included in the review was performed according to Preferred Reporting Items for Systematic Reviews and Meta-analyses guidelines. Twenty-five studies were included in the review, identifying 241 SLE candidate proteomic biomarkers related to various aspects of the disease including disease diagnosis and activity or pinpointing specific organ involvement. Furthermore, 13 of the 25 studies validated their results for a selected number of biomarkers in an independent cohort, resulting in the validation of 28 candidate biomarkers. It is noteworthy that 11 candidate biomarkers were identified in more than one study. A significant number of potential proteomic biomarkers that are related to a number of aspects of SLE have been identified using mass spectrometry proteomic approaches. However, further studies are required to assess the utility of these biomarkers in routine clinical practice. © 2016 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.

  4. Identifying and Evaluating External Validity Evidence for Passing Scores

    Science.gov (United States)

    Davis-Becker, Susan L.; Buckendahl, Chad W.

    2013-01-01

    A critical component of the standard setting process is collecting evidence to evaluate the recommended cut scores and their use for making decisions and classifying students based on test performance. Kane (1994, 2001) proposed a framework by which practitioners can identify and evaluate evidence of the results of the standard setting from (1)…

  5. Modeling and experimental validation of a Hybridized Energy Storage System for automotive applications

    Science.gov (United States)

    Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona

    2013-11-01

This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS) consisting of a parallel connection of a lead acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC has been tackled via the equivalent electric circuit based approach. Experimental tests are designed for identification purposes. Parameters of the PbA battery model are identified as a function of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS has been assembled at the Center for Automotive Research, The Ohio State University, and used as a test-bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of the battery hybridization on the vehicle fuel economy and mitigation of the battery stress.
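As a rough illustration of the equivalent-circuit modelling idea, the sketch below simulates an assumed battery branch (OCV source, series resistance, one RC pair) in parallel with a double-layer capacitor under a Start&Stop-like current pulse; all parameter values are invented and this is not the authors' identified model.

```python
# Minimal sketch (assumed parameter values) of an equivalent-circuit HESS:
# battery (OCV + R0 + one RC pair) in parallel with a double-layer capacitor,
# discretised with forward Euler.
import numpy as np

dt = 0.01                       # s
t = np.arange(0, 30, dt)
i_load = np.where((t > 5) & (t < 7), 150.0, 5.0)    # cranking-like current pulse, A

# Assumed illustrative parameters
ocv, r0, r1, c1 = 12.6, 0.010, 0.005, 2000.0         # battery: OCV, ohmic R, RC pair
c_dlc, r_dlc = 500.0, 0.002                           # double-layer capacitor branch

v_rc, v_dlc = 0.0, ocv
v_bus = np.empty_like(t)
for k, i in enumerate(i_load):
    # Split the load current between the two branches (parallel connection).
    v_batt = ocv - v_rc
    i_dlc = (v_dlc - v_batt + r0 * i) / (r0 + r_dlc)  # two-branch current divider
    i_batt = i - i_dlc
    v_rc += dt * (i_batt / c1 - v_rc / (r1 * c1))     # RC branch dynamics
    v_dlc += dt * (-i_dlc / c_dlc)                    # capacitor discharges when sourcing
    v_bus[k] = v_dlc - r_dlc * i_dlc
print("minimum bus voltage during the pulse:", v_bus.min().round(2))
```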

  6. Validating a mass balance accounting approach to using 7Be measurements to estimate event-based erosion rates over an extended period at the catchment scale

    Science.gov (United States)

    Porto, Paolo; Walling, Des E.; Cogliandro, Vanessa; Callegari, Giovanni

    2016-07-01

    Use of the fallout radionuclides cesium-137 and excess lead-210 offers important advantages over traditional methods of quantifying erosion and soil redistribution rates. However, both radionuclides provide information on longer-term (i.e., 50-100 years) average rates of soil redistribution. Beryllium-7, with its half-life of 53 days, can provide a basis for documenting short-term soil redistribution and it has been successfully employed in several studies. However, the approach commonly used introduces several important constraints related to the timing and duration of the study period. A new approach proposed by the authors that overcomes these constraints has been successfully validated using an erosion plot experiment undertaken in southern Italy. Here, a further validation exercise undertaken in a small (1.38 ha) catchment is reported. The catchment was instrumented to measure event sediment yields and beryllium-7 measurements were employed to document the net soil loss for a series of 13 events that occurred between November 2013 and June 2015. In the absence of significant sediment storage within the catchment's ephemeral channel system and of a significant contribution from channel erosion to the measured sediment yield, the estimates of net soil loss for the individual events could be directly compared with the measured sediment yields to validate the former. The close agreement of the two sets of values is seen as successfully validating the use of beryllium-7 measurements and the new approach to obtain estimates of net soil loss for a sequence of individual events occurring over an extended period at the scale of a small catchment.

  7. Non-destructive inspection approach using ultrasound to identify the material state for amorphous and semi-crystalline materials

    Science.gov (United States)

    Jost, Elliott; Jack, David; Moore, David

    2018-04-01

    At present, there are many methods to identify the temperature and phase of a material using invasive techniques. However, most current methods require physical contact or implicit methods utilizing light reflectance of the specimen. This work presents a nondestructive inspection method using ultrasonic wave technology that circumvents these disadvantages to identify phase change regions and infer the temperature state of a material. In the present study an experiment is performed to monitor the time of flight within a wax as it undergoes melting and the subsequent cooling. Results presented in this work show a clear relationship between a material's speed of sound and its temperature. The phase change transition of the material is clear from the time of flight results, and in the case of the investigated material, this change in the material state occurs over a range of temperatures. The range of temperatures over which the wax material melts is readily identified by speed of sound represented as a function of material temperature. The melt temperature, obtained acoustically, is validated using Differential Scanning Calorimetry (DSC), which uses shifts in heat flow rates to identify phase transition temperature ranges. The investigated ultrasonic NDE method has direct applications in many industries, including oil and gas, food and beverage, and polymer composites, in addition to many implications for future capabilities of nondestructive inspection of multi-phase materials.

  8. Calibrated photostimulated luminescence is an effective approach to identify irradiated orange during storage

    International Nuclear Information System (INIS)

    Jo, Yunhee; Sanyal, Bhaskar; Chung, Namhyeok; Lee, Hyun-Gyu; Park, Yunji; Park, Hae-Jun; Kwon, Joong-Ho

    2015-01-01

Photostimulated luminescence (PSL) has been employed as a fast screening method for various irradiated foods. In this study the potential use of PSL was evaluated to identify oranges irradiated with gamma rays, electron beam, and X-rays (0–2 kGy) and stored under different conditions for 6 weeks. The effects of light conditions (natural light, artificial light, and dark) and storage temperatures (4 and 20 °C) on PSL photon counts (PCs) during post-irradiation periods were studied. Non-irradiated samples always showed negative values of PCs, while irradiated oranges exhibited intermediate results on first PSL measurements; however, the irradiated samples had much higher PCs. The PCs of all the samples declined as the storage time increased. Calibrated second PSL measurements showed a PSL ratio <10 for the irradiated samples after 3 weeks of irradiation, confirming their irradiation status under all the storage conditions. Calibrated PSL and sample storage in the dark at 4 °C were found to be the most suitable approaches to identify irradiated oranges during storage. - Highlights: • Photostimulated luminescence (PSL) was studied to identify irradiated oranges for quarantine application. • PSL detection efficiency was compared among gamma, electron, and X-ray irradiation during the shelf-life of oranges. • PSL properties of samples were characterized using standard samples. • Calibrated PSL gave a clear verdict on irradiation, extending the potential of the PSL technique.

  9. Learn the Lagrangian: A Vector-Valued RKHS Approach to Identifying Lagrangian Systems.

    Science.gov (United States)

    Cheng, Ching-An; Huang, Han-Pang

    2016-12-01

We study the modeling of Lagrangian systems with multiple degrees of freedom. Based on system dynamics, canonical parametric models require ad hoc derivations and sometimes simplification for a computable solution; on the other hand, due to the lack of prior knowledge of the system's structure, modern nonparametric models in machine learning face the curse of dimensionality, especially in learning large systems. In this paper, we bridge this gap by unifying the theories of Lagrangian systems and vector-valued reproducing kernel Hilbert spaces. We reformulate Lagrangian systems with kernels that embed the governing Euler-Lagrange equation-the Lagrangian kernels-and show that these kernels span a subspace capturing the Lagrangian's projection as inverse dynamics. By this property, our model uses only inputs and outputs, as in machine learning, and inherits the structured form of system dynamics, thereby removing the need for mundane derivations for new systems as well as the generalization problem of learning from scratch. In effect, it learns the system's Lagrangian, a simpler task than directly learning the dynamics. To demonstrate, we applied the proposed kernel to identify robot inverse dynamics in simulations and experiments. Our results present a competitive novel approach to identifying Lagrangian systems, despite using only inputs and outputs.

  10. Brief Report: Using the Internet to Identify Persons with Cognitive Impairment for Participation in Clinical Trials

    Directory of Open Access Journals (Sweden)

    Lindsay F. Morra

    2017-04-01

Full Text Available Identifying, recruiting, and enrolling persons in clinical trials of dementia treatments is extremely difficult. One approach to first-wave screening of potential participants is the use of online assessment tools. Initial studies using the Dementia Risk Assessment (DRA), which includes a previously validated recognition memory test, support the use of this self-administered assessment to identify individuals with "suspected MCI" or "suspected dementia." In this study, we identified between 71 and 622 persons with suspected dementia and between 128 and 1653 persons with suspected mild cognitive impairment (depending on specific criteria) over the course of 22 months. Assessment tools that can inexpensively and easily identify individuals with higher than average risk for cognitive impairment can facilitate recruitment for large-scale clinical trials for dementia treatments.

  11. Identifying problematic Internet users: development and validation of the Internet Motive Questionnaire for Adolescents (IMQ-A).

    Science.gov (United States)

    Bischof-Kastner, Christina; Kuntsche, Emmanuel; Wolstein, Jörg

    2014-10-09

    Internationally, up to 15.1% of intensive Internet use among adolescents is dysfunctional. To provide a basis for early intervention and preventive measures, understanding the motives behind intensive Internet use is important. This study aims to develop a questionnaire, the Internet Motive Questionnaire for Adolescents (IMQ-A), as a theory-based measurement for identifying the underlying motives for high-risk Internet use. More precisely, the aim was to confirm the 4-factor structure (ie, social, enhancement, coping, and conformity motives) as well as its construct and concurrent validity. Another aim was to identify the motivational differences between high-risk and low-risk Internet users. A sample of 101 German adolescents (female: 52.5%, 53/101; age: mean 15.9, SD 1.3 years) was recruited. High-risk users (n=47) and low-risk users (n=54) were identified based on a screening measure for online addiction behavior in children and adolescents (Online-Suchtverhalten-Skala, OSVK-S). Here, "high-risk" Internet use means use that exceeds the level of intensive Internet use (OSVK-S sum score ≥7). The confirmatory factor analysis confirmed the IMQ-A's 4-factor structure. A reliability analysis revealed good internal consistencies of the subscales (.71 up to .86). Moreover, regression analyses confirmed that the enhancement and coping motive groups significantly predicted high-risk Internet consumption and the OSVK-S sum score. A mixed-model ANOVA confirmed that adolescents mainly access the Internet for social motives, followed by enhancement and coping motives, and that high-risk users access the Internet more frequently for coping and enhancement motives than low-risk users. Low-risk users were primarily motivated socially. The IMQ-A enables the assessment of motives related to adolescent Internet use and thus the identification of populations at risk. The questionnaire enables the development of preventive measures or early intervention programs, especially dealing
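To illustrate how motive scores can be related to high-risk use, the sketch below fits a logistic regression of a simulated binary outcome on four motive scores; this is not the authors' analysis and all data are simulated.

```python
# Minimal sketch (not the published analysis): predicting high-risk Internet use
# from four motive scores with logistic regression. Data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 101
# Columns: social, enhancement, coping, conformity motive scores (simulated).
X = rng.normal(size=(n, 4))
# Simulate an outcome driven mainly by enhancement and coping motives.
logit = 0.2 * X[:, 0] + 1.0 * X[:, 1] + 1.2 * X[:, 2] + 0.1 * X[:, 3]
y = (logit + rng.normal(scale=0.5, size=n)) > 0

model = LogisticRegression().fit(X, y)
for name, coef in zip(["social", "enhancement", "coping", "conformity"], model.coef_[0]):
    print(f"{name:12s} {coef:+.2f}")
```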

  12. A Systematic Approach to Identify Sources of Abnormal Interior Noise for a High-Speed Train

    Directory of Open Access Journals (Sweden)

    Jie Zhang

    2018-01-01

Full Text Available A systematic approach to identify sources of abnormal interior noise occurring in a high-speed train is presented and applied in this paper to resolve a particular noise issue. This approach is developed based on a number of previous dealings with similar noise problems. The particular noise issue occurs in a Chinese high-speed train: measurements show a difference of 7 dB(A) in overall Sound Pressure Level (SPL) between two nominally identical VIP cabins at 250 km/h. The systematic approach is applied to identify the root cause of the 7 dB(A) difference. Well-planned measurements are performed in both VIP cabins. Sound pressure contributions, either in terms of frequency band or in terms of facing area, are analyzed. Order analysis is also carried out. Based on these analyses, it is found that the problematic frequency is the sleeper passing frequency of the train, and that an area on the roof contributes the most. In order to determine what causes that area to be the main contributor without disassembling the structure of the roof, measured noise and vibration data for different train speeds are further analyzed. It is then reasoned that the roof is the main contributor because of the sound pressure behind the panel. At this point, panels of the roof are removed, revealing that a hole of 300 cm2 for running cables is present behind that area without proper sound insulation. This study can provide a basis for abnormal interior noise analysis and control of high-speed trains.

  13. Referencing Science: Teaching Undergraduates to Identify, Validate, and Utilize Peer-Reviewed Online Literature

    Science.gov (United States)

    Berzonsky, William A.; Richardson, Katherine D.

    2008-01-01

    Accessibility of online scientific literature continues to expand due to the advent of scholarly databases and search engines. Studies have shown that undergraduates favor using online scientific literature to address research questions, but they often do not have the skills to assess the validity of research articles. Undergraduates generally are…

  14. Online cross-validation-based ensemble learning.

    Science.gov (United States)

    Benkeser, David; Ju, Cheng; Lendle, Sam; van der Laan, Mark

    2018-01-30

Online estimators update a current estimate with a new incoming batch of data without having to revisit past data, thereby providing streaming estimates that are scalable to big data. We develop flexible, ensemble-based online estimators of an infinite-dimensional target parameter, such as a regression function, in the setting where data are generated sequentially by a common conditional data distribution given summary measures of the past. This setting encompasses a wide range of time-series models and, as a special case, models for independent and identically distributed data. Our estimator considers a large library of candidate online estimators and uses online cross-validation to identify the algorithm with the best performance. We show that by basing estimates on the cross-validation-selected algorithm, we are asymptotically guaranteed to perform as well as the true, unknown best-performing algorithm. We provide extensions of this approach including online estimation of the optimal ensemble of candidate online estimators. We illustrate excellent performance of our methods using simulations and a real data example where we make streaming predictions of infectious disease incidence using data from a large database. Copyright © 2017 John Wiley & Sons, Ltd.
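A toy sketch of the online cross-validation idea follows: each incoming batch is first used to score every candidate online learner, then used to update it, and the candidate with the lowest cumulative loss so far is the current selection. The candidate learners and the data stream are invented for illustration.

```python
# Toy sketch of online cross-validation ("test then train") over a stream of batches.
import numpy as np

class OnlineMean:
    """Trivial candidate: predicts the running mean."""
    def __init__(self):
        self.n, self.mean = 0, 0.0
    def predict(self, X):
        return np.full(len(X), self.mean)
    def update(self, X, y):
        for v in y:
            self.n += 1
            self.mean += (v - self.mean) / self.n

class OnlineLinear:
    """Candidate: one-dimensional least-mean-squares regression."""
    def __init__(self, lr=0.05):
        self.w, self.b, self.lr = 0.0, 0.0, lr
    def predict(self, X):
        return self.w * X[:, 0] + self.b
    def update(self, X, y):
        for x, t in zip(X[:, 0], y):
            err = (self.w * x + self.b) - t
            self.w -= self.lr * err * x
            self.b -= self.lr * err

candidates = {"mean": OnlineMean(), "linear": OnlineLinear()}
cum_loss = {name: 0.0 for name in candidates}

rng = np.random.default_rng(1)
for _ in range(50):                      # stream of batches
    X = rng.uniform(-1, 1, size=(20, 1))
    y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=20)
    for name, model in candidates.items():
        cum_loss[name] += float(np.mean((model.predict(X) - y) ** 2))  # test
        model.update(X, y)                                             # then train
print(min(cum_loss, key=cum_loss.get), cum_loss)
```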

  15. The hierarchy-by-interval approach to identifying important models that need improvement in severe-accident simulation codes

    International Nuclear Information System (INIS)

    Heames, T.J.; Khatib-Rahbar, M.; Kelly, J.E.

    1995-01-01

    The hierarchy-by-interval (HBI) methodology was developed to determine an appropriate phenomena identification and ranking table for an independent peer review of severe-accident computer codes. The methodology is described, and the results of a specific code review are presented. Use of this systematic and structured approach ensures that important code models that need improvement are identified and prioritized, which allows code sponsors to more effectively direct limited resources in future code development. In addition, critical phenomenological areas that need more fundamental work, such as experimentation, are identified

  16. Assessing the validity of discourse analysis: transdisciplinary convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  17. Using optical markers of nondysplastic rectal epithelial cells to identify patients with ulcerative colitis-associated neoplasia.

    Science.gov (United States)

    Bista, Rajan K; Brentnall, Teresa A; Bronner, Mary P; Langmead, Christopher J; Brand, Randall E; Liu, Yang

    2011-12-01

    Current surveillance guidelines for patients with long-standing ulcerative colitis (UC) recommend repeated colonoscopy with random biopsies, which is time-consuming, discomforting, and expensive. A less invasive strategy is to identify neoplasia by analyzing biomarkers from the more accessible rectum to predict the need for a full colonoscopy. The goal of this pilot study was to evaluate whether optical markers of rectal mucosa derived from a novel optical technique, partial-wave spectroscopic microscopy (PWS), could identify UC patients with high-grade dysplasia (HGD) or cancer (CA) present anywhere in their colon. Banked frozen nondysplastic mucosal rectal biopsies were used from 28 UC patients (15 without dysplasia and 13 with concurrent HGD or CA). The specimen slides were made using a touch prep method and underwent PWS analysis. We divided the patients into two groups: 13 as a training set and an independent 15 as a validation set. We identified six optical markers, ranked by measuring the information gain with respect to the outcome of cancer. The most effective markers were selected by maximizing the cross-validated training accuracy of a Naive Bayes classifier. The optimal classifier was applied to the validation data yielding 100% sensitivity and 75% specificity. Our results indicate that the PWS-derived optical markers can accurately predict UC patients with HGD/CA through assessment of rectal epithelial cells. By aiming for high sensitivity, our approach could potentially simplify the surveillance of UC patients and improve overall resource utilization by identifying patients with HGD/CA who should proceed with colonoscopy. Copyright © 2011 Crohn's & Colitis Foundation of America, Inc.
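A minimal sketch of the marker-selection idea described above, assuming mutual information as the information-gain measure and a Gaussian naive Bayes classifier; the marker data are simulated stand-ins, not the PWS measurements.

```python
# Hedged sketch: rank candidate markers by mutual information with the outcome,
# then fit a naive Bayes classifier on the top-ranked markers. Data simulated.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n, n_markers = 28, 6
X = rng.normal(size=(n, n_markers))
y = (X[:, 0] + 0.8 * X[:, 2] + rng.normal(scale=0.5, size=n)) > 0  # outcome: HGD/CA

gain = mutual_info_classif(X, y, random_state=0)
top = np.argsort(gain)[::-1][:2]                 # keep the two most informative markers
clf = GaussianNB()
print("selected markers:", top)
print("cross-validated accuracy:", cross_val_score(clf, X[:, top], y, cv=4).mean())
```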

  18. Validity and Reliability of the 8-Item Work Limitations Questionnaire.

    Science.gov (United States)

    Walker, Timothy J; Tullar, Jessica M; Diamond, Pamela M; Kohl, Harold W; Amick, Benjamin C

    2017-12-01

    Purpose To evaluate factorial validity, scale reliability, test-retest reliability, convergent validity, and discriminant validity of the 8-item Work Limitations Questionnaire (WLQ) among employees from a public university system. Methods A secondary analysis using de-identified data from employees who completed an annual Health Assessment between the years 2009-2015 tested research aims. Confirmatory factor analysis (CFA) (n = 10,165) tested the latent structure of the 8-item WLQ. Scale reliability was determined using a CFA-based approach while test-retest reliability was determined using the intraclass correlation coefficient. Convergent/discriminant validity was tested by evaluating relations between the 8-item WLQ with health/performance variables for convergent validity (health-related work performance, number of chronic conditions, and general health) and demographic variables for discriminant validity (gender and institution type). Results A 1-factor model with three correlated residuals demonstrated excellent model fit (CFI = 0.99, TLI = 0.99, RMSEA = 0.03, and SRMR = 0.01). The scale reliability was acceptable (0.69, 95% CI 0.68-0.70) and the test-retest reliability was very good (ICC = 0.78). Low-to-moderate associations were observed between the 8-item WLQ and the health/performance variables while weak associations were observed between the demographic variables. Conclusions The 8-item WLQ demonstrated sufficient reliability and validity among employees from a public university system. Results suggest the 8-item WLQ is a usable alternative for studies when the more comprehensive 25-item WLQ is not available.

  19. Sensor Selection and Data Validation for Reliable Integrated System Health Management

    Science.gov (United States)

    Garg, Sanjay; Melcher, Kevin J.

    2008-01-01

    -select iteration, and the final selection analysis. The knowledge base required for productive use of S4 consists of system design information and heritage experience together with a focus on components with health implications. The sensor suite down-selection is an iterative process for identifying a group of sensors that provide good fault detection and isolation for targeted fault scenarios. In the final selection analysis, a statistical evaluation algorithm provides the final robustness test for each down-selected sensor suite. NASA GRC has developed an approach to sensor data qualification that applies empirical relationships, threshold detection techniques, and Bayesian belief theory to a network of sensors related by physics (i.e., analytical redundancy) in order to identify the failure of a given sensor within the network. This data quality validation approach extends the state-of-the-art, from red-lines and reasonableness checks that flag a sensor after it fails, to include analytical redundancy-based methods that can identify a sensor in the process of failing. The focus of this effort is on understanding the proper application of analytical redundancy-based data qualification methods for onboard use in monitoring Upper Stage sensors.

  20. The Identification and Validation of Novel Small Proteins in Pseudomonas Putida KT-2440

    DEFF Research Database (Denmark)

    Yang, Xiaochen; Long, Katherine

    2014-01-01

and activities and may lead to the discovery of novel antimicrobial agents. Our project focuses on the identification, validation and characterization of novel s-proteins in the bacterium Pseudomonas putida KT-2440. As there is virtually no information on s-proteins in pseudomonads, the first step......, total protein samples are prepared, fractionated, and analyzed with mass spectrometry (MS/MS). The MS/MS data are compared to a custom database containing >80000 putative sORF sequences to identify candidates for validation. A total of 56 and 22 putative sORFs were obtained from MS/MS data...... and bioinformatics prediction, respectively, where there is no overlap between the putative sORFs obtained from the two approaches. The sequences encoding the putative sORFs will be integrated onto the Tn7 site on the chromosome as well as on a plasmid expression vector for validation....

  1. IDENTIFYING DEMENTIA IN ELDERLY POPULATION : A CAMP APPROACH

    OpenAIRE

    Anand P; Chaukimath; Srikanth; Koli

    2015-01-01

BACKGROUND: Dementia is an emerging medico-social problem affecting the elderly, and poses a challenge to clinicians and caregivers. It is usually identified at a late stage, when management becomes difficult. AIM: The aim of the camp was to identify dementia in the elderly population participating in a screening camp. MATERIAL AND METHODS: The geriatric clinic and department of psychiatry jointly organised a screening camp to detect dementia in the elderly for five days in Sept...

  2. Neuroproteomics and Systems Biology Approach to Identify Temporal Biomarker Changes Post Experimental Traumatic Brain Injury in Rats

    Directory of Open Access Journals (Sweden)

    Firas H Kobeissy

    2016-11-01

Full Text Available Traumatic brain injury (TBI) represents a critical health problem of which diagnosis, management and treatment remain challenging. TBI is a contributing factor in approximately 1/3 of all injury-related deaths in the United States. The Centers for Disease Control and Prevention (CDC) estimate that 1.7 million people suffer a TBI in the United States annually. Efforts continue to focus on elucidating the complex molecular mechanisms underlying TBI pathophysiology and defining sensitive and specific biomarkers that can aid in improving patient management and care. Recently, the area of neuroproteomics-systems biology is proving to be a prominent tool in biomarker discovery for central nervous system (CNS) injury and other neurological diseases. In this work, we employed the controlled cortical impact (CCI) model of experimental TBI in the rat to assess the temporal-global proteome changes after the acute (1 day) and, for the first time, the subacute (7 days) post-injury time frame, using the established CAX-PAGE LC-MS/MS platform for protein separation combined with discrete systems biology analyses to identify temporal biomarker changes related to this rat TBI model. Rather than focusing on any one individual molecular entity, we used an in silico systems biology approach to understand the global dynamics that govern proteins that are differentially altered post-injury. In addition, gene ontology analysis of the proteomic data was conducted in order to categorize the proteins by molecular function, biological process, and cellular localization. Results show alterations in several proteins related to inflammatory responses and oxidative stress in both the acute (1 day) and subacute (7 days) periods post-TBI. Moreover, results suggest a differential upregulation of neuroprotective proteins at 7 days post-CCI involved in cellular functions such as neurite growth, regeneration, and axonal guidance. Our study is amongst the first to assess the temporal neuroproteome

  3. Assessment of validity with polytrauma Veteran populations.

    Science.gov (United States)

    Bush, Shane S; Bass, Carmela

    2015-01-01

    Veterans with polytrauma have suffered injuries to multiple body parts and organs systems, including the brain. The injuries can generate a triad of physical, neurologic/cognitive, and emotional symptoms. Accurate diagnosis is essential for the treatment of these conditions and for fair allocation of benefits. To accurately diagnose polytrauma disorders and their related problems, clinicians take into account the validity of reported history and symptoms, as well as clinical presentations. The purpose of this article is to describe the assessment of validity with polytrauma Veteran populations. Review of scholarly and other relevant literature and clinical experience are utilized. A multimethod approach to validity assessment that includes objective, standardized measures increases the confidence that can be placed in the accuracy of self-reported symptoms and physical, cognitive, and emotional test results. Due to the multivariate nature of polytrauma and the multiple disciplines that play a role in diagnosis and treatment, an ideal model of validity assessment with polytrauma Veteran populations utilizes neurocognitive, neurological, neuropsychiatric, and behavioral measures of validity. An overview of these validity assessment approaches as applied to polytrauma Veteran populations is presented. Veterans, the VA, and society are best served when accurate diagnoses are made.

  4. Development and validation of cell-based luciferase reporter gene assays for measuring neutralizing anti-drug antibodies against interferon beta.

    Science.gov (United States)

    Hermanrud, Christina; Ryner, Malin; Luft, Thomas; Jensen, Poul Erik; Ingenhoven, Kathleen; Rat, Dorothea; Deisenhammer, Florian; Sørensen, Per Soelberg; Pallardy, Marc; Sikkema, Dan; Bertotti, Elisa; Kramer, Daniel; Creeke, Paul; Fogdell-Hahn, Anna

    2016-03-01

    Neutralizing anti-drug antibodies (NAbs) against therapeutic interferon beta (IFNβ) in people with multiple sclerosis (MS) are measured with cell-based bioassays. The aim of this study was to redevelop and validate two luciferase reporter-gene bioassays, LUC and iLite, using a cut-point approach to identify NAb positive samples. Such an approach is favored by the pharmaceutical industry and governmental regulatory agencies as it has a clear statistical basis and overcomes the limitations of the current assays based on the Kawade principle. The work was conducted following the latest assay guidelines. The assays were re-developed and validated as part of the "Anti-Biopharmaceutical Immunization: Prediction and analysis of clinical relevance to minimize the risk" (ABIRISK) consortium and involved a joint collaboration between four academic laboratories and two pharmaceutical companies. The LUC assay was validated at Innsbruck Medical University (LUCIMU) and at Rigshospitalet (LUCRH) Copenhagen, and the iLite assay at Karolinska Institutet, Stockholm. For both assays, the optimal serum sample concentration in relation to sensitivity and recovery was 2.5% (v/v) in assay media. A Shapiro-Wilk test indicated a normal distribution for the majority of runs, allowing a parametric approach for cut-point calculation to be used, where NAb positive samples could be identified with 95% confidence. An analysis of means and variances indicated that a floating cut-point should be used for all assays. The assays demonstrated acceptable sensitivity for being cell-based assays, with a confirmed limit of detection in neat serum of 1519 ng/mL for LUCIMU, 814 ng/mL for LUCRH, and 320 ng/mL for iLite. Use of the validated cut-point assay, in comparison with the previously used Kawade method, identified 14% more NAb positive samples. In conclusion, implementation of the cut-point design resulted in increased sensitivity to detect NAbs. However, the clinical significance of these low
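The parametric cut-point idea referred to above can be sketched as follows, assuming a one-sided 95% normal cut point (mean + 1.645 SD of antibody-negative responses) and a plate-wise floating correction; the multiplier and the simulated data are illustrative assumptions, not the validated assay parameters.

```python
# Minimal sketch of a parametric cut-point calculation: after checking approximate
# normality (Shapiro-Wilk), set the cut point so that roughly 5% of antibody-negative
# samples would be flagged. The 1.645 multiplier and the data are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
negative_controls = rng.normal(loc=100.0, scale=8.0, size=60)   # assay response units

w, p = stats.shapiro(negative_controls)
print(f"Shapiro-Wilk p = {p:.3f} (normality not rejected if p > 0.05)")

cut_point = negative_controls.mean() + 1.645 * negative_controls.std(ddof=1)
print(f"screening cut point = {cut_point:.1f}")

# A "floating" cut point is re-centred on each plate's own negative controls:
plate_controls = rng.normal(loc=95.0, scale=8.0, size=8)
floating_cut = plate_controls.mean() + (cut_point - negative_controls.mean())
print(f"plate-specific cut point = {floating_cut:.1f}")
```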

  5. Enhancing hit identification in Mycobacterium tuberculosis drug discovery using validated dual-event Bayesian models.

    Directory of Open Access Journals (Sweden)

    Sean Ekins

Full Text Available High-throughput screening (HTS) in whole cells is widely pursued to find compounds active against Mycobacterium tuberculosis (Mtb) for further development towards new tuberculosis (TB) drugs. Hit rates from these screens, usually conducted at 10 to 25 µM concentrations, typically range from less than 1% to the low single digits. New approaches to increase the efficiency of hit identification are urgently needed to learn from past screening data. The pharmaceutical industry has for many years taken advantage of computational approaches to optimize compound libraries for in vitro testing, a practice not fully embraced by academic laboratories in the search for new TB drugs. Adapting these proven approaches, we have recently built and validated Bayesian machine learning models for predicting compounds with activity against Mtb based on publicly available large-scale HTS data from the Tuberculosis Antimicrobial Acquisition Coordinating Facility. We now demonstrate the largest prospective validation to date in which we computationally screened 82,403 molecules with these Bayesian models, assayed a total of 550 molecules in vitro, and identified 124 actives against Mtb. Individual hit rates for the different datasets varied from 15-28%. We have identified several FDA approved and late stage clinical candidate kinase inhibitors with activity against Mtb which may represent starting points for further optimization. The computational models developed herein and the commercially available molecules derived from them are now available to any group pursuing Mtb drug discovery.
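The general workflow, a Bayesian-style classifier over binary substructure fingerprints used to rank a screening library, can be sketched as below; this is not the published model, and the fingerprints and labels are random stand-ins.

```python
# Hedged sketch: Bernoulli naive Bayes over binary fingerprints to prioritise
# compounds for whole-cell testing. Fingerprints and labels are random stand-ins.
import numpy as np
from sklearn.naive_bayes import BernoulliNB

rng = np.random.default_rng(4)
n_train, n_screen, n_bits = 2000, 500, 256
X_train = rng.integers(0, 2, size=(n_train, n_bits))      # binary fingerprints
y_train = rng.random(n_train) < 0.05                       # ~5% actives (HTS-like)

model = BernoulliNB().fit(X_train, y_train)

X_screen = rng.integers(0, 2, size=(n_screen, n_bits))
scores = model.predict_proba(X_screen)[:, 1]
top_50 = np.argsort(scores)[::-1][:50]                     # compounds to assay first
print("highest predicted-active score:", scores[top_50[0]])
```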

  6. Validation of the Physician-Pharmacist Collaborative Index for physicians in Malaysia.

    Science.gov (United States)

    Sellappans, Renukha; Ng, Chirk Jenn; Lai, Pauline Siew Mei

    2015-12-01

    Establishing a collaborative working relationship between doctors and pharmacists is essential for the effective provision of pharmaceutical care. The Physician-Pharmacist Collaborative Index (PPCI) was developed to assess the professional exchanges between doctors and pharmacists. Two versions of the PPCI was developed: one for physicians and one for pharmacists. However, these instruments have not been validated in Malaysia. To determine the validity and reliability of the PPCI for physicians in Malaysia. An urban tertiary hospital in Malaysia. This prospective study was conducted from June to August 2014. Doctors were grouped as either a "collaborator" or a "non-collaborator". Collaborators were doctors who regularly worked with one particular clinical pharmacist in their ward, while non-collaborators were doctors who interacted with any random pharmacist who answered the general pharmacy telephone line whenever they required assistance on medication-related enquiries, as they did not have a clinical pharmacist in their ward. Collaborators were firstly identified by the clinical pharmacist he/she worked with, then invited to participate in this study through email, as it was difficult to locate and approach them personally. Non-collaborators were sampled conveniently by approaching them in person as these doctors could be easily sampled from any wards without a clinical pharmacist. The PPCI for physicians was administered at baseline and 2 weeks later. Validity (face validity, factor analysis and discriminative validity) and reliability (internal consistency and test-retest) of the PPCI for physicians. A total of 116 doctors (18 collaborators and 98 non-collaborators) were recruited. Confirmatory factor analysis confirmed that the PPCI for physicians was a 3-factor model. The correlation of the mean domain scores ranged from 0.711 to 0.787. "Collaborators" had significantly higher scores compared to "non-collaborators" (81.4 ± 10.1 vs. 69.3 ± 12.1, p Malaysia.

  7. Development and validation of a multidimensional measure of lean manufacturing

    Directory of Open Access Journals (Sweden)

    Juan A. Marin-Garcia

    2010-02-01

Full Text Available In the last 30 years of research on lean manufacturing, many different questionnaires have been proposed to assess the degree of use of the concept. The set of items used has changed considerably from one investigation to another. Until now, there has been no clear movement converging towards the use, by researchers, of a few instruments whose validity and reliability have been compared in different settings. In fact, the majority of investigations are based on ad-hoc questionnaires, and only a few of them present a questionnaire validation, checking only unidimensionality and Cronbach's alpha. Nevertheless, there seems to be a consensus in identifying 5 big constructs that compose lean manufacturing (TQM, JIT, TPM, supply chain management and high-involvement). Our research consisted of identifying and summarizing the models that have been published previously to aggregate the items into constructs or sub-scales of constructs. We then developed an integrating questionnaire, starting from the items that appeared in previous investigations. Finally, we validated the sub-scales and models through confirmatory factor analysis, using data from a sample of Spanish Sheltered Work Centres (N=128). Of all the proposed models, the best fit is obtained with the first-order model with 20 sub-scales. Our investigation contributes an integrating vision of the published models and a verification of the validity and reliability of the lean manufacturing sub-scales proposed by other investigators. Due to its confirmatory approach, it can serve to generalize studies that had been carried out in contexts with samples different from the one we used for replication.

  8. Clinical staff nurse leadership: Identifying gaps in competency development.

    Science.gov (United States)

    Franks-Meeks, Sherron

    2018-01-01

    To date, there has been no development of a complete, applicable inventory of clinical staff nurse (CSN) leadership role competencies through a valid and reliable methodology. Further, the CSN has not been invited to engage in the identification, definition, or development of their own leadership competencies. Compare existing leadership competencies to identify and highlight gaps in clinical staff nurse leadership role competency development and validation. Literature review. The CSN has not participated in the development of CSN leadership role competencies, nor have the currently identified CSN leadership role competencies been scientifically validated through research. Finally, CSN leadership role competencies are incomplete and do not reflect the CSN perspective. © 2017 Wiley Periodicals, Inc.

  9. [Relevance and validity of a new French composite index to measure poverty on a geographical level].

    Science.gov (United States)

    Challier, B; Viel, J F

    2001-02-01

    A number of disease conditions are influenced by deprivation. Geographical measurement of deprivation can provide an independent contribution to individual measures by accounting for the social context. Such a geographical approach, based on deprivation indices, is classical in Great Britain but scarcely used in France. The objective of this work was to build and validate an index readily usable in French municipalities and cantons. Socioeconomic data (unemployment, occupations, housing specifications, income, etc.) were derived from the 1990 census of municipalities and cantons in the Doubs departement. A new index was built by principal components analysis on the municipality data. The validity of the new index was checked and tested for correlations with British deprivation indices. Principal components analysis on municipality data identified four components (explaining 76% of the variance). Only the first component (CP1 explaining 42% of the variance) was retained. Content validity (wide choice of potential deprivation items, correlation between items and CP1: 0.52 to 0.96) and construct validity (CP1 socially relevant; Cronbach's alpha=0.91; correlation between CP1 and three out of four British indices ranging from 0.73 to 0.88) were sufficient. Analysis on canton data supported that on municipality data. The validation of the new index being satisfactory, the user will have to make a choice. The new index, CP1, is closer to the local background and was derived from data from a French departement. It is therefore better adapted to more descriptive approaches such as health care planning. To examine the relationship between deprivation and health with a more etiological approach, the British indices (anteriority, international comparisons) would be more appropriate, but CP1, once validated in various health problem situations, should be most useful for French studies.
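For illustration, the sketch below builds an area-level index as the first principal component of standardised indicators, the same construction described above; the indicator names and data are hypothetical, not the published index.

```python
# Illustrative sketch of a deprivation index built as the first principal
# component of standardised area-level indicators.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
indicators = ["unemployment", "no_car", "overcrowding", "low_income"]
X = rng.normal(size=(200, len(indicators)))               # 200 hypothetical municipalities

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=1)
cp1 = pca.fit_transform(Z)[:, 0]                          # deprivation score per area

print("variance explained by CP1:", pca.explained_variance_ratio_[0])
print("loadings:", dict(zip(indicators, pca.components_[0].round(2))))
```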

  10. Liquid Water from First Principles: Validation of Different Sampling Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Mundy, C J; Kuo, W; Siepmann, J; McGrath, M J; Vondevondele, J; Sprik, M; Hutter, J; Parrinello, M; Mohamed, F; Krack, M; Chen, B; Klein, M

    2004-05-20

    A series of first principles molecular dynamics and Monte Carlo simulations were carried out for liquid water to assess the validity and reproducibility of different sampling approaches. These simulations include Car-Parrinello molecular dynamics simulations using the program CPMD with different values of the fictitious electron mass in the microcanonical and canonical ensembles, Born-Oppenheimer molecular dynamics using the programs CPMD and CP2K in the microcanonical ensemble, and Metropolis Monte Carlo using CP2K in the canonical ensemble. With the exception of one simulation for 128 water molecules, all other simulations were carried out for systems consisting of 64 molecules. It is found that the structural and thermodynamic properties of these simulations are in excellent agreement with each other as long as adiabatic sampling is maintained in the Car-Parrinello molecular dynamics simulations either by choosing a sufficiently small fictitious mass in the microcanonical ensemble or by Nosé-Hoover thermostats in the canonical ensemble. Using the Becke-Lee-Yang-Parr exchange and correlation energy functionals and norm-conserving Troullier-Martins or Goedecker-Teter-Hutter pseudopotentials, simulations at a fixed density of 1.0 g/cm³ and a temperature close to 315 K yield a height of the first peak in the oxygen-oxygen radial distribution function of about 3.0, a classical constant-volume heat capacity of about 70 J K⁻¹ mol⁻¹, and a self-diffusion constant of about 0.1 Å²/ps.

  11. Novel VEGFR-2 kinase inhibitor identified by the back-to-front approach

    Science.gov (United States)

    Sanphanya, Kingkan; Phowichit, Suwadee; Wattanapitayakul, Suvara K.; Fokin, Valery V.; Vajragupta, Opa

    2013-01-01

    A novel lead was developed as a VEGFR-2 inhibitor by the back-to-front approach. Docking experiments indicated that the 3-chloromethylphenylurea motif occupied the back pocket of the VEGFR-2 kinase. An attempt to enhance the binding affinity of 1 was made by expanding the structure to access the front pocket, using a triazole as linker. A library of 1,4-(disubstituted)-1H-1,2,3-triazoles was screened in silico and one lead compound (VH02) was identified, with an enzymatic IC50 against VEGFR-2 of 0.56 μM. VH02 showed an antiangiogenic effect by inhibiting the tube formation of HUVEC cells (EA.hy926) at 0.3 μM, a concentration 13 times lower than its cytotoxic dose. The enzymatic and cellular activities suggest the potential of VH02 as a lead for further optimization. PMID:23562241

  12. Identifying the critical financial ratios for stocks evaluation: A fuzzy delphi approach

    Science.gov (United States)

    Mokhtar, Mazura; Shuib, Adibah; Mohamad, Daud

    2014-12-01

    Stocks evaluation has always been an interesting and challenging problem for both researchers and practitioners. Generally, the evaluation can be made based on a set of financial ratios. Nevertheless, there are a variety of financial ratios that can be considered and if all ratios in the set are placed into the evaluation process, data collection would be more difficult and time consuming. Thus, the objective of this paper is to identify the most important financial ratios upon which to focus in order to evaluate the stock's performance. For this purpose, a survey was carried out using an approach which is based on an expert judgement, namely the Fuzzy Delphi Method (FDM). The results of this study indicated that return on equity, return on assets, net profit margin, operating profit margin, earnings per share and debt to equity are the most important ratios.
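
    As a rough sketch of the screening logic behind the Fuzzy Delphi Method (not the authors' survey instrument), each expert rating is mapped to a triangular fuzzy number, ratings are aggregated, and a ratio is retained when its defuzzified score exceeds a consensus threshold; the linguistic scale, threshold, and ratings below are illustrative assumptions.

      import numpy as np

      # Illustrative linguistic scale -> triangular fuzzy numbers (l, m, u)
      SCALE = {
          "low":       (0.0, 0.0, 0.25),
          "medium":    (0.25, 0.5, 0.75),
          "high":      (0.5, 0.75, 1.0),
          "very high": (0.75, 1.0, 1.0),
      }
      THRESHOLD = 0.7  # assumed consensus cut-off for keeping a ratio

      def fuzzy_delphi(ratings):
          """Aggregate expert ratings (scale labels) into one triangular number and
          defuzzify it with the simple centroid (l + m + u) / 3."""
          tri = np.array([SCALE[r] for r in ratings], dtype=float)
          l, m, u = tri[:, 0].min(), tri[:, 1].mean(), tri[:, 2].max()  # common aggregation rule
          return (l + m + u) / 3.0

      survey = {
          "return on equity":   ["high", "very high", "high", "very high"],
          "current ratio":      ["low", "medium", "low", "medium"],
          "earnings per share": ["very high", "high", "high", "high"],
      }
      for ratio, ratings in survey.items():
          score = fuzzy_delphi(ratings)
          print(f"{ratio:18s} score={score:.2f} keep={score >= THRESHOLD}")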

  13. Matrine Is Identified as a Novel Macropinocytosis Inducer by a Network Target Approach

    Directory of Open Access Journals (Sweden)

    Bo Zhang

    2018-01-01

    Full Text Available Comprehensively understanding the pharmacological functions of natural products is a key issue to be addressed for the discovery of new drugs. Unlike some single-target drugs, natural products always exert diverse therapeutic effects through acting on a “network” that consists of multiple targets, making it necessary to develop a systematic approach, e.g., network pharmacology, to reveal the pharmacological functions of natural products and infer their mechanisms of action. In this work, to identify the “network target” of a natural product, we perform a functional analysis of matrine, a marketed drug in China extracted from the medicinal herb Ku-Shen (Radix Sophorae Flavescentis). Here, the network target of matrine was first predicted by drugCIPHER, a genome-wide target prediction method. Based on the network target of matrine, we performed a functional gene set enrichment analysis to computationally identify the potential pharmacological functions of matrine, most of which are supported by literature evidence, including the neurotoxicity and neuropharmacological activities of matrine. Furthermore, computational results demonstrated that matrine has the potential for the induction of macropinocytosis and the regulation of ATP metabolism. Our experimental data revealed that the large vesicles induced by matrine are consistent with the typical characteristics of macropinosomes. Our verification results also suggested that matrine could decrease cellular ATP levels. These findings demonstrate the availability and effectiveness of the network target strategy for identifying the comprehensive pharmacological functions of natural products.

  14. Validação da Escala de Abordagens de Aprendizagem (EABAP em uma amostra Brasileira Validation of the Learning Approach Scale (LAS in a Brazilian sample

    Directory of Open Access Journals (Sweden)

    Cristiano Mauro Assis Gomes

    2011-01-01

    Full Text Available The present study aimed to validate the Learning Approach Scale (EABAP/LAS), built to measure the surface and deep dimensions of learning, in a Brazilian sample. Seven hundred and sixteen students from junior and high schools in the city of Belo Horizonte took part in the research. Exploratory factor analysis was used to identify the dimensions and to select the best items; of the 27 original items, 17 were kept. Confirmatory factor analysis indicated the adequacy of a solution with one general factor and two specific factors representing the deep and surface approaches (χ2=281.50; df=116; CFI=0.94; GFI=0.95; RMSEA=0.05). The implications of the results for the theory of learning approaches, as well as the potential of the EABAP for Educational Psychology, are discussed.

  15. Development and initial validation of a cessation fatigue scale.

    Science.gov (United States)

    Mathew, Amanda R; Heckman, Bryan W; Meier, Ellen; Carpenter, Matthew J

    2017-07-01

    Smoking cessation fatigue, or tiredness of attempting to quit smoking, has been posited as a latent construct encompassing loss of motivation, loss of hope in cessation success, decreased self-efficacy, and exhaustion of self-control resources. Despite the potential clinical impact of characterizing cessation fatigue, there is currently no validated measure to assess it. Using a rational scale development approach, we developed a cessation fatigue measure and examined its reliability and construct validity in relation to a) smokers' experience of a recently failed quit attempt (QA) and b) readiness to engage in a subsequent QA. Data were drawn from an online cross-sectional survey of 484 smokers who relapsed from a QA within the past 30 days. Exploratory factor analysis identified three factors within the 17-item Cessation Fatigue Scale (CFS), which we labeled: emotional exhaustion, pessimism, and devaluation. High internal consistency was observed for each factor and across the full scale. As expected, CFS overall was positively associated with withdrawal severity and difficulty quitting. CFS was negatively associated with previously validated measures of intention to quit, self-efficacy, and abstinence-related motivational engagement, even after adjusting for nicotine dependence. Findings provide initial validation for a new tool to assess cessation fatigue and contribute needed information on a theory-driven component of cessation-related motivation and relapse risk. Copyright © 2017. Published by Elsevier B.V.

  16. Fingerprint-Based Machine Learning Approach to Identify Potent and Selective 5-HT2BR Ligands

    Directory of Open Access Journals (Sweden)

    Krzysztof Rataj

    2018-05-01

    Full Text Available The identification of subtype-selective GPCR (G-protein coupled receptor) ligands is a challenging task. In this study, we developed a computational protocol to find compounds with 5-HT2BR versus 5-HT1BR selectivity. Our approach employs the hierarchical combination of machine learning methods, docking, and multiple scoring methods. First, we applied machine learning tools to filter a large database of druglike compounds by the new Neighbouring Substructures Fingerprint (NSFP). This two-dimensional fingerprint contains information on the connectivity of the substructural features of a compound. Preselected subsets of the database were then subjected to docking calculations. The main indicators of compounds’ selectivity were their different interactions with the secondary binding pockets of both target proteins, while binding modes within the orthosteric binding pocket were preserved. The combined methodology of ligand-based and structure-based methods was validated prospectively, resulting in the identification of hits with nanomolar affinity and ten-fold to ten thousand-fold selectivities.
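
    The NSFP fingerprint and the full hierarchical protocol are specific to this study; purely as a hedged illustration of the ligand-based filtering stage, the sketch below substitutes standard Morgan fingerprints from RDKit and a random-forest classifier trained to separate selective from non-selective compounds. The SMILES strings and labels are placeholders.

      import numpy as np
      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem
      from sklearn.ensemble import RandomForestClassifier

      def morgan_fp(smiles, n_bits=2048):
          """Morgan (ECFP4-like) bit vector as a numpy array; a stand-in for the study's NSFP."""
          mol = Chem.MolFromSmiles(smiles)
          fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=n_bits)
          arr = np.zeros((n_bits,), dtype=np.int8)
          DataStructs.ConvertToNumpyArray(fp, arr)
          return arr

      # placeholder training data: (SMILES, 1 = 5-HT2BR-selective, 0 = not)
      train = [("CCOc1ccccc1", 1), ("c1ccncc1", 0), ("CC(=O)Nc1ccc(O)cc1", 0), ("CCN(CC)CC", 1)]
      X = np.vstack([morgan_fp(s) for s, _ in train])
      y = np.array([label for _, label in train])

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      candidate = "COc1ccc2[nH]ccc2c1"   # hypothetical screening compound
      print("selectivity score:", clf.predict_proba([morgan_fp(candidate)])[0, 1])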

  17. Identification and validation of novel small proteins in Pseudomonas putida

    DEFF Research Database (Denmark)

    Yang, Xiaochen; Ingemann Jensen, Sheila; Wulff, Tune

    2016-01-01

    Small proteins of fifty amino acids or less have been understudied due to difficulties that impede their annotation and detection. In order to obtain information on small open reading frames (sORFs) in P. putida, bioinformatic and proteomic approaches were used to identify putative sORFs in the well-characterized strain KT2440. A plasmid-based system was established for sORF validation, enabling expression of C-terminal sequential peptide affinity (SPA) tagged variants and their detection via protein immunoblotting. Out of 22 tested putative sORFs, the expression of fourteen sORFs was confirmed, where all except one are novel. All of the validated sORFs except one are located adjacent to annotated genes on the same strand and three are in close proximity to genes with known functions. These include an ABC transporter operon and the two transcriptional regulators Fis...

  18. A Systems Genetic Approach to Identify Low Dose Radiation-Induced Lymphoma Susceptibility/DOE2013FinalReport

    Energy Technology Data Exchange (ETDEWEB)

    Balmain, Allan [University of California, San Francisco; Song, Ihn Young [University of California, San Francisco

    2013-05-15

    The ultimate goal of this project is to identify the combinations of genetic variants that confer an individual's susceptibility to the effects of low dose (0.1 Gy) gamma-radiation, in particular with regard to tumor development. In contrast to the known effects of high dose radiation in cancer induction, the responses to low dose radiation (defined as 0.1 Gy or less) are much less well understood, and have been proposed to involve a protective anti-tumor effect in some in vivo scientific models. These conflicting results confound attempts to develop predictive models of the risk of exposure to low dose radiation, particularly when combined with the strong effects of inherited genetic variants on both radiation effects and cancer susceptibility. We have used a Systems Genetics approach in mice that combines genetic background analysis with responses to low and high dose radiation, in order to develop insights that will allow us to reconcile these disparate observations. Using this comprehensive approach we have analyzed normal tissue gene expression (in this case the skin and thymus), together with the changes that take place in this gene expression architecture a) in response to low or high- dose radiation and b) during tumor development. Additionally, we have demonstrated that using our expression analysis approach in our genetically heterogeneous/defined radiation-induced tumor mouse models can uniquely identify genes and pathways relevant to human T-ALL, and uncover interactions between common genetic variants of genes which may lead to tumor susceptibility.

  19. DISCRETIZATION APPROACH USING RAY-TESTING MODEL IN PARTING LINE AND PARTING SURFACE GENERATION

    Institute of Scientific and Technical Information of China (English)

    HAN Jianwen; JIAN Bin; YAN Guangrong; LEI Yi

    2007-01-01

    Automatic cavity design based on the ray-testing model involves surface classification, 3D parting line and parting surface generation, and demoldability analysis, which helps to select the optimal parting direction and optimal parting line. A new ray-testing approach is presented to classify part surfaces into core/cavity surfaces and undercut surfaces by automatically identifying the visibility of each surface. A simple, direct and efficient algorithm to identify surface visibility is developed. The algorithm is robust and adapts to rather complicated geometry, so it is valuable in computer-aided mold design systems. To validate the efficiency of the approach, an experimental program was implemented. Case studies show that the approach is practical and valuable in automatic parting line and parting surface generation.
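
    The paper's ray-testing model is part of a CAD system and is not reproduced here; as a simplified sketch of the underlying idea, the code below casts a ray from a sample point on a face along a candidate parting direction and uses a Möller-Trumbore intersection test against the rest of the mesh to decide whether the point is visible (unoccluded) from that direction. The mesh data and tolerances are illustrative.

      import numpy as np

      def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
          """Moeller-Trumbore ray/triangle intersection; True for a hit in front of the origin."""
          e1, e2 = v1 - v0, v2 - v0
          p = np.cross(direction, e2)
          det = np.dot(e1, p)
          if abs(det) < eps:
              return False                      # ray parallel to triangle plane
          inv = 1.0 / det
          s = origin - v0
          u = np.dot(s, p) * inv
          if u < 0.0 or u > 1.0:
              return False
          q = np.cross(s, e1)
          v = np.dot(direction, q) * inv
          if v < 0.0 or u + v > 1.0:
              return False
          t = np.dot(e2, q) * inv
          return t > eps                        # hit strictly in front of the sample point

      def visible_along(point, direction, triangles):
          """A surface sample point is 'visible' along the parting direction if the test ray
          escapes the part without hitting any other triangle (candidate core/cavity surface)."""
          return not any(ray_hits_triangle(point, direction, *tri) for tri in triangles)

      # illustrative data: one blocking triangle above the sample point
      tris = [(np.array([-1.0, -1.0, 1.0]), np.array([1.0, -1.0, 1.0]), np.array([0.0, 1.0, 1.0]))]
      print(visible_along(np.zeros(3), np.array([0.0, 0.0, 1.0]), tris))   # False: occluded
      print(visible_along(np.zeros(3), np.array([0.0, 0.0, -1.0]), tris))  # True: visible from below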

  20. Identifying thresholds for ecosystem-based management.

    Directory of Open Access Journals (Sweden)

    Jameal F Samhouri

    Full Text Available BACKGROUND: One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. METHODOLOGY/PRINCIPAL FINDINGS: To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. CONCLUSIONS/SIGNIFICANCE: For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management.
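
    As a simplified, hedged illustration of the threshold idea (not the authors' ecosystem-model analysis), the sketch below fits a smoothing spline to a simulated attribute-versus-pressure curve and takes the pressure at which the fitted curvature is largest as the estimated utility threshold; the data and smoothing parameter are assumptions.

      import numpy as np
      from scipy.interpolate import UnivariateSpline

      rng = np.random.default_rng(1)
      pressure = np.linspace(0.0, 1.0, 80)
      # simulated indicator: roughly flat at low pressure, collapsing beyond ~0.6
      attribute = 1.0 / (1.0 + np.exp(25 * (pressure - 0.6))) + rng.normal(0, 0.03, pressure.size)

      spline = UnivariateSpline(pressure, attribute, s=0.05)   # smoothing parameter is an assumption
      curvature = np.abs(spline.derivative(n=2)(pressure))     # second derivative of the fitted curve
      threshold = pressure[np.argmax(curvature)]
      print(f"estimated utility threshold at pressure ~ {threshold:.2f}")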

  1. Identifying key performance indicators for nursing and midwifery care using a consensus approach.

    Science.gov (United States)

    McCance, Tanya; Telford, Lorna; Wilson, Julie; Macleod, Olive; Dowd, Audrey

    2012-04-01

    The aim of this study was to gain consensus on key performance indicators that are appropriate and relevant for nursing and midwifery practice in the current policy context. There is continuing demand to demonstrate effectiveness and efficiency in health and social care and to communicate this at boardroom level. Whilst there is substantial literature on the use of clinical indicators and nursing metrics, there is less evidence relating to indicators that reflect the patient experience. A consensus approach was used to identify relevant key performance indicators. A nominal group technique was used comprising two stages: a workshop involving all grades of nursing and midwifery staff in two HSC trusts in Northern Ireland (n = 50), followed by a regional Consensus Conference (n = 80). During the workshop, potential key performance indicators were identified. This was used as the basis for the Consensus Conference, which involved two rounds of consensus. Analysis was based on aggregated scores that were then ranked. Stage one identified 38 potential indicators and stage two prioritised the eight top-ranked indicators as a core set for nursing and midwifery. The relevance and appropriateness of these indicators were confirmed with nurses and midwives working in a range of settings and from the perspective of service users. The eight indicators identified do not conform to the majority of other nursing metrics generally reported in the literature. Furthermore, they are strategically aligned to work on the patient experience and are reflective of the fundamentals of nursing and midwifery practice, with the focus on person-centred care. Nurses and midwives have a significant contribution to make in determining the extent to which these indicators are achieved in practice. Furthermore, measurement of such indicators provides an opportunity to evidence the unique impact of nursing/midwifery care on the patient experience. © 2011 Blackwell Publishing Ltd.

  2. Multiple Score Comparison: a network meta-analysis approach to comparison and external validation of prognostic scores

    Directory of Open Access Journals (Sweden)

    Sarah R. Haile

    2017-12-01

    Full Text Available Abstract Background Prediction models and prognostic scores have been increasingly popular in both clinical practice and clinical research settings, for example to aid in risk-based decision making or control for confounding. In many medical fields, a large number of prognostic scores are available, but practitioners may find it difficult to choose between them due to lack of external validation as well as lack of comparisons between them. Methods Borrowing methodology from network meta-analysis, we describe an approach to Multiple Score Comparison meta-analysis (MSC which permits concurrent external validation and comparisons of prognostic scores using individual patient data (IPD arising from a large-scale international collaboration. We describe the challenges in adapting network meta-analysis to the MSC setting, for instance the need to explicitly include correlations between the scores on a cohort level, and how to deal with many multi-score studies. We propose first using IPD to make cohort-level aggregate discrimination or calibration scores, comparing all to a common comparator. Then, standard network meta-analysis techniques can be applied, taking care to consider correlation structures in cohorts with multiple scores. Transitivity, consistency and heterogeneity are also examined. Results We provide a clinical application, comparing prognostic scores for 3-year mortality in patients with chronic obstructive pulmonary disease using data from a large-scale collaborative initiative. We focus on the discriminative properties of the prognostic scores. Our results show clear differences in performance, with ADO and eBODE showing higher discrimination with respect to mortality than other considered scores. The assumptions of transitivity and local and global consistency were not violated. Heterogeneity was small. Conclusions We applied a network meta-analytic methodology to externally validate and concurrently compare the prognostic properties

  3. Tools to identify the men with prostate cancer most appropriate for active surveillance?

    Directory of Open Access Journals (Sweden)

    Robert H Getzenberg

    2014-02-01

    Full Text Available A great deal of effort is underway to identify, with greater precision than is afforded to us today, those men with prostate cancer who are suitable for active surveillance. In the manuscript by Irshad et al., the authors evaluate a novel set of genes associated with senescence and aging as tools that can provide guidance regarding the indolent nature of an individual's prostate cancer, with validation using both mRNA and protein analyses. While additional studies are required to understand the full impact of these findings, the innovative approach taken enhances our understanding of distinct phenotypes of prostate cancer.

  4. 78 FR 20672 - Literature Review ApproachIdentifying Research Needs for Assessing Safe Use of High Intakes of...

    Science.gov (United States)

    2013-04-05

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Literature Review Approach... Needs for Assessing Safe Use of High Intakes of Folic Acid,'' for review of the pertinent literature... folate and folic acid, screening of the literature was undertaken to identify the potential adverse...

  5. Identifying Opportunities for Decision Support Systems in Support of Regional Resource Use Planning: An Approach Through Soft Systems Methodology.

    Science.gov (United States)

    Zhu; Dale

    2000-10-01

    Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.

  6. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta

    Directory of Open Access Journals (Sweden)

    Barbara Gasse

    2017-06-01

    Full Text Available Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  7. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta.

    Science.gov (United States)

    Gasse, Barbara; Prasad, Megana; Delgado, Sidney; Huckert, Mathilde; Kawczynski, Marzena; Garret-Bernardin, Annelyse; Lopez-Cazaux, Serena; Bailleul-Forestier, Isabelle; Manière, Marie-Cécile; Stoetzel, Corinne; Bloch-Zupan, Agnès; Sire, Jean-Yves

    2017-01-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene ( MMP20 ) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  8. Multibody dynamical modeling for spacecraft docking process with spring-damper buffering device: A new validation approach

    Science.gov (United States)

    Daneshjou, Kamran; Alibakhshi, Reza

    2018-01-01

    In the current manuscript, the process of spacecraft docking, one of the main risky operations in an on-orbit servicing mission, is modeled based on unconstrained multibody dynamics. A spring-damper buffering device is utilized here in the docking probe-cone system for micro-satellites. Because impact occurs inevitably during the docking process and the motion characteristics of multibody systems are strongly affected by it, a continuous contact force model needs to be considered. The spring-damper buffering device, which keeps the spacecraft stable in orbit when impact occurs, connects a base (cylinder) inserted in the chaser satellite to the end of the docking probe. Furthermore, by considering a revolute joint equipped with a torsional shock absorber between the base and the chaser satellite, the docking probe can experience both translational and rotational motions simultaneously. Although the spacecraft docking process with buffering mechanisms may be modeled by constrained multibody dynamics, this paper presents a simple and efficient formulation that eliminates the surplus generalized coordinates and solves the impact docking problem based on unconstrained Lagrangian mechanics. In an example problem, the model is first verified by comparing the computed results with those recently reported in the literature. Second, the accuracy of the presented model is also evaluated according to a new alternative validation approach based on the constrained multibody formulation. This proposed verification approach can be applied to indirectly solve constrained multibody problems with minimum effort. The time history of the impact force, the influence of system flexibility, and the physical interaction between the shock absorber and the penetration depth caused by impact are the issues followed in this paper. Third, the MATLAB/SIMULINK multibody dynamic analysis software is applied to build an impact docking model to validate computed results and
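
    The manuscript's unconstrained Lagrangian formulation is not reproduced here; as a rough sketch of the kind of continuous contact force model such simulations rely on, the snippet below evaluates a Hunt-Crossley-style force in which the stiffness term acts on the penetration depth and the damping term scales with both penetration and its rate. The constants are illustrative, not the paper's values.

      def contact_force(delta, delta_dot, k=1.0e6, c=0.3, n=1.5):
          """Continuous (Hunt-Crossley-like) contact force: F = k*delta**n + c*delta**n*delta_dot,
          returned only while the bodies interpenetrate (delta > 0). Units and constants are illustrative."""
          if delta <= 0.0:
              return 0.0
          return k * delta**n + c * delta**n * delta_dot

      # example: 0.5 mm penetration, approach speed 0.2 m/s
      print(contact_force(5e-4, 0.2))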

  9. A Selected Reaction Monitoring Mass Spectrometry Protocol for Validation of Proteomic Biomarker Candidates in Studies of Psychiatric Disorders.

    Science.gov (United States)

    Reis-de-Oliveira, Guilherme; Garcia, Sheila; Guest, Paul C; Cassoli, Juliana S; Martins-de-Souza, Daniel

    2017-01-01

    Most biomarker candidates arising from proteomic studies of psychiatric disorders have not progressed for use in clinical studies due to insufficient validation steps. Here we describe a selected reaction monitoring mass spectrometry (SRM-MS) approach that could be used as a follow-up validation tool for proteins identified in blood serum or plasma. This protocol specifically covers the stages of peptide selection and optimization. The increasing application of SRM-MS should enable fast, sensitive, and robust methods with the potential for use in clinical studies involving sampling of serum or plasma. Understanding the molecular mechanisms and identifying potential biomarkers for risk assessment, diagnosis, prognosis, and prediction of drug response go toward the implementation of translational medicine strategies for improved treatment of patients with psychiatric disorders and other debilitating diseases.

  10. The validation of synthetic spectra used in the performance evaluation of radionuclide identifiers

    International Nuclear Information System (INIS)

    Flynn, A.; Boardman, D.; Reinhard, M.I.

    2013-01-01

    This work has evaluated synthetic gamma-ray spectra created by the RASE sampler using experimental data. The RASE sampler resamples experimental data to create large data libraries which are subsequently available for use in evaluation of radionuclide identification algorithms. A statistical evaluation of the synthetic energy bins has shown the variation to follow a Poisson distribution identical to experimental data. The minimum amount of statistics required in each base spectrum to ensure the subsequent use of the base spectrum in the generation of statistically robust synthetic data was determined. A requirement that the simulated acquisition time of the synthetic spectra was not more than 4% of the acquisition time of the base spectrum was also determined. Further validation of RASE was undertaken using two different radionuclide identification algorithms. - Highlights: • A validation of synthetic data created in order to evaluate radionuclide identification systems has been carried out. • Statistical analysis has shown that the data accurately represents experimental data. • A limit to the amount of data which could be created using this method was evaluated. • Analysis of the synthetic gamma spectra show identical results to analysis carried out with experimental data
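
    A minimal sketch of the resampling idea evaluated here (not the RASE sampler itself): a long-acquisition base spectrum is scaled to a shorter simulated acquisition time and each energy bin is redrawn from a Poisson distribution, the counting statistics that the paper verifies. The bin counts and times below are illustrative and respect the reported 4% limit on simulated acquisition time.

      import numpy as np

      def synthetic_spectrum(base_counts, base_time, sim_time, rng=None):
          """Draw one synthetic spectrum: scale the base spectrum to the simulated acquisition
          time and resample every energy bin from a Poisson distribution."""
          rng = rng or np.random.default_rng()
          expected = np.asarray(base_counts, dtype=float) * (sim_time / base_time)
          return rng.poisson(expected)

      # illustrative base spectrum: 1024 bins acquired for 3600 s, resampled at 60 s (< 4% of base)
      rng = np.random.default_rng(0)
      base = rng.poisson(500.0, size=1024)
      synth = synthetic_spectrum(base, base_time=3600.0, sim_time=60.0, rng=rng)
      print(synth[:5], synth.sum())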

  11. Identifying predictors of physics item difficulty: A linear regression approach

    Science.gov (United States)

    Mesic, Vanes; Muratovic, Hasnija

    2011-06-01

    Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinary difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained, quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. Foremost, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal physics knowledge
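
    As a minimal sketch of the final modeling step described above (not the authors' analysis), the snippet below regresses simulated Rasch item difficulties on content-analysis features of each item and reports the share of difficulty variance explained; the feature names and data are placeholders.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(2)
      n_items = 123
      # placeholder item features from the content analysis: automaticity, complexity, modality, interference
      X = rng.normal(size=(n_items, 4))
      true_beta = np.array([-0.8, 1.1, 0.4, 0.6])
      rasch_difficulty = X @ true_beta + rng.normal(0, 0.7, n_items)   # simulated Rasch difficulties

      model = LinearRegression().fit(X, rasch_difficulty)
      r2 = model.score(X, rasch_difficulty)
      print("explained difficulty variance (R^2):", round(r2, 3))
      print("feature weights:", np.round(model.coef_, 2))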

  12. Identifying predictors of physics item difficulty: A linear regression approach

    Directory of Open Access Journals (Sweden)

    Hasnija Muratovic

    2011-06-01

    Full Text Available Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinary difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained, quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. Foremost, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal

  13. Verification and Validation of a Fingerprint Image Registration Software

    Directory of Open Access Journals (Sweden)

    Liu Yan

    2006-01-01

    Full Text Available The need for reliable identification and authentication is driving the increased use of biometric devices and systems. Verification and validation techniques applicable to these systems are rather immature and ad hoc, yet the consequences of the wide deployment of biometric systems could be significant. In this paper we discuss an approach towards validation and reliability estimation of fingerprint registration software. Our validation approach includes the following three steps: (a) the validation of the source code with respect to the system requirements specification; (b) the validation of the optimization algorithm, which is at the core of the registration system; and (c) the automation of testing. Since the optimization algorithm is heuristic in nature, mathematical analysis and test results are used to estimate the reliability and perform failure analysis of the image registration module.

  14. A novel approach to sequence validating protein expression clones with automated decision making

    Directory of Open Access Journals (Sweden)

    Mohr Stephanie E

    2007-06-01

    Full Text Available Abstract Background Whereas the molecular assembly of protein expression clones is readily automated and routinely accomplished in high throughput, sequence verification of these clones is still largely performed manually, an arduous and time consuming process. The ultimate goal of validation is to determine if a given plasmid clone matches its reference sequence sufficiently to be "acceptable" for use in protein expression experiments. Given the accelerating increase in availability of tens of thousands of unverified clones, there is a strong demand for rapid, efficient and accurate software that automates clone validation. Results We have developed an Automated Clone Evaluation (ACE) system – the first comprehensive, multi-platform, web-based plasmid sequence verification software package. ACE automates the clone verification process by defining each clone sequence as a list of multidimensional discrepancy objects, each describing a difference between the clone and its expected sequence including the resulting polypeptide consequences. To evaluate clones automatically, this list can be compared against user acceptance criteria that specify the allowable number of discrepancies of each type. This strategy allows users to re-evaluate the same set of clones against different acceptance criteria as needed for use in other experiments. ACE manages the entire sequence validation process including contig management, identifying and annotating discrepancies, determining if discrepancies correspond to polymorphisms and clone finishing. Designed to manage thousands of clones simultaneously, ACE maintains a relational database to store information about clones at various completion stages, project processing parameters and acceptance criteria. In a direct comparison, the automated analysis by ACE took less time and was more accurate than a manual analysis of a 93 gene clone set. Conclusion ACE was designed to facilitate high throughput clone sequence
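
    ACE itself is a full web application; as a toy sketch of the core decision rule described above, each clone is represented by a list of discrepancy objects and is accepted only when, ignoring known polymorphisms, the count of every discrepancy type stays within user-defined acceptance criteria. The field names and thresholds are illustrative.

      from collections import Counter
      from dataclasses import dataclass

      @dataclass
      class Discrepancy:
          kind: str          # e.g. "silent", "missense", "frameshift"
          position: int
          is_polymorphism: bool = False

      def acceptable(discrepancies, criteria):
          """Accept a clone when, ignoring known polymorphisms, the number of discrepancies
          of every type is within the allowed maximum for that type."""
          counts = Counter(d.kind for d in discrepancies if not d.is_polymorphism)
          return all(counts.get(kind, 0) <= maximum for kind, maximum in criteria.items()) \
              and all(kind in criteria for kind in counts)

      criteria = {"silent": 2, "missense": 0, "frameshift": 0}     # illustrative acceptance criteria
      clone = [Discrepancy("silent", 154), Discrepancy("missense", 301, is_polymorphism=True)]
      print(acceptable(clone, criteria))   # True: one silent change, the missense is a known polymorphism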

  15. Development of a clinician reputation metric to identify appropriate problem-medication pairs in a crowdsourced knowledge base.

    Science.gov (United States)

    McCoy, Allison B; Wright, Adam; Rogith, Deevakar; Fathiamini, Safa; Ottenbacher, Allison J; Sittig, Dean F

    2014-04-01

    Correlation of data within electronic health records is necessary for implementation of various clinical decision support functions, including patient summarization. A key type of correlation is linking medications to clinical problems; while some databases of problem-medication links are available, they are not robust and depend on problems and medications being encoded in particular terminologies. Crowdsourcing represents one approach to generating robust knowledge bases across a variety of terminologies, but more sophisticated approaches are necessary to improve accuracy and reduce manual data review requirements. We sought to develop and evaluate a clinician reputation metric to facilitate the identification of appropriate problem-medication pairs through crowdsourcing without requiring extensive manual review. We retrieved medications from our clinical data warehouse that had been prescribed and manually linked to one or more problems by clinicians during e-prescribing between June 1, 2010 and May 31, 2011. We identified measures likely to be associated with the percentage of accurate problem-medication links made by clinicians. Using logistic regression, we created a metric for identifying clinicians who had made greater than or equal to 95% appropriate links. We evaluated the accuracy of the approach by comparing links made by those physicians identified as having appropriate links to a previously manually validated subset of problem-medication pairs. Of 867 clinicians who asserted a total of 237,748 problem-medication links during the study period, 125 had a reputation metric that predicted the percentage of appropriate links greater than or equal to 95%. These clinicians asserted a total of 2464 linked problem-medication pairs (983 distinct pairs). Compared to a previously validated set of problem-medication pairs, the reputation metric achieved a specificity of 99.5% and marginally improved the sensitivity of previously described knowledge bases. A
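
    As a hedged sketch of the metric-building step (not the study's code), the snippet below fits a logistic regression of clinician-level measures on whether at least 95% of a clinician's reviewed problem-medication links were appropriate, and uses the fitted probability as the reputation metric; the measures and data are hypothetical.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n_clinicians = 867
      # hypothetical clinician-level measures: log link volume, years of e-prescribing, specialty flag
      X = np.column_stack([
          rng.normal(5, 1, n_clinicians),
          rng.uniform(0, 10, n_clinicians),
          rng.integers(0, 2, n_clinicians),
      ])
      # outcome: 1 if at least 95% of the clinician's manually reviewed links were appropriate
      y = (rng.random(n_clinicians) < 1 / (1 + np.exp(-(0.8 * X[:, 1] - 4)))).astype(int)

      model = LogisticRegression(max_iter=1000).fit(X, y)
      reputation = model.predict_proba(X)[:, 1]          # reputation metric per clinician
      trusted = reputation >= 0.95                       # clinicians whose links are taken as appropriate
      print("clinicians passing the reputation cut-off:", int(trusted.sum()))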

  16. Validating CDIAC's population-based approach to the disaggregation of within-country CO2 emissions

    International Nuclear Information System (INIS)

    Cushman, R.M.; Beauchamp, J.J.; Brenkert, A.L.

    1998-01-01

    The Carbon Dioxide Information Analysis Center produces and distributes a data base of CO2 emissions from fossil-fuel combustion and cement production, expressed as global, regional, and national estimates. CDIAC also produces a companion data base, expressed on a one-degree latitude-longitude grid. To do this gridding, emissions within each country are spatially disaggregated according to the distribution of population within that country. Previously, the lack of within-country emissions data prevented a validation of this approach. But emissions inventories are now becoming available for most US states. An analysis of these inventories confirms that population distribution explains most, but not all, of the variance in the distribution of CO2 emissions within the US. Additional sources of variance (coal production, non-carbon energy sources, and interstate electricity transfers) are explored, with the hope that the spatial disaggregation of emissions can be improved.
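
    A toy sketch of the disaggregation rule being validated (not CDIAC's production code): a national emissions total is spread over the country's grid cells in proportion to each cell's share of the national population; the numbers are illustrative.

      import numpy as np

      def disaggregate_emissions(national_total, cell_population):
          """Allocate a national CO2 emissions total to grid cells proportionally to population."""
          pop = np.asarray(cell_population, dtype=float)
          return national_total * pop / pop.sum()

      # illustrative country: four one-degree cells, 10 Mt C emitted nationally
      cells = [1.2e6, 0.3e6, 4.5e6, 0.1e6]           # population per grid cell
      print(disaggregate_emissions(10.0, cells))     # Mt C per cell, sums to 10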

  17. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  18. The Dutch Linguistic Intraoperative Protocol: a valid linguistic approach to awake brain surgery.

    Science.gov (United States)

    De Witte, E; Satoer, D; Robert, E; Colle, H; Verheyen, S; Visch-Brink, E; Mariën, P

    2015-01-01

    Intraoperative direct electrical stimulation (DES) is increasingly used in patients operated on for tumours in eloquent areas. Although a positive impact of DES on postoperative linguistic outcome is generally advocated, information about the neurolinguistic methods applied in awake surgery is scarce. We developed for the first time a standardised Dutch linguistic test battery (measuring phonology, semantics, syntax) to reliably identify the critical language zones in detail. A normative study was carried out in a control group of 250 native Dutch-speaking healthy adults. In addition, the clinical application of the Dutch Linguistic Intraoperative Protocol (DuLIP) was demonstrated by means of anatomo-functional models and five case studies. A set of DuLIP tests was selected for each patient depending on the tumour location and degree of linguistic impairment. DuLIP is a valid test battery for pre-, intraoperative and postoperative language testing and facilitates intraoperative mapping of eloquent language regions that are variably located. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Identifying patients with myasthenia for epidemiological research by linkage of automated registers

    DEFF Research Database (Denmark)

    Pedersen, Emil Greve; Hallas, Jesper; Hansen, Klaus

    2011-01-01

    We validated a new method of identifying patients with incident myasthenia in automated Danish registers for the purpose of conducting epidemiological studies of the disorder.

  20. A hybrid life cycle and multi-criteria decision analysis approach for identifying sustainable development strategies of Beijing's taxi fleet

    International Nuclear Information System (INIS)

    Cai, Yanpeng; Applegate, Scott; Yue, Wencong; Cai, Jianying; Wang, Xuan; Liu, Gengyuan; Li, Chunhui

    2017-01-01

    To identify and evaluate sustainable strategies for the taxi fleet in Beijing in terms of economic, policy, and environmental implications, a hybrid approach was developed through incorporating multi-criteria decision analysis (MCDA) methods within a general life-cycle analysis (LCA) framework. The approach can (a) help comprehensively evaluate environmental impacts of multiple types of vehicles, (b) facilitate analysis of environmental, economic and policy features of such vehicles, and (c) identify desirable taxi fleet development strategies for the city. The developed approach represented an improvement of the decision-making capability for taxi implementation based on multiple available technologies and their performance that can be specifically tailored to Beijing. The results demonstrated that the proposed approach could comprehensively reflect multiple implications of strategies for the taxi fleet in Beijing to reduce air pollution in the city. The results also indicated that the electric vehicle powered with the year 2020 electricity projections would be the ideal solution, outranking the other alternatives. The conventional vehicle ranked the lowest among the alternatives. The plug-in hybrid vehicle powered by 2020 electricity projections ranked third, followed by the plug-in hybrid vehicle ranking fourth, and the hybrid vehicle ranking fifth. - Highlights: • A hybrid approach was proposed for evaluating sustainable strategies of Beijing's taxi fleet. • This approach was based on the combination of multi-criteria decision analysis methods and life-cycle assessment. • Environmental, economic and policy performances of multiple strategies were compared. • Detailed responses were collected from taxi drivers and local residents through interviews. • The electric vehicle would be the ideal solution for the Beijing taxi fleet.

  1. Identifying genes that mediate anthracycline toxicity in immune cells

    Directory of Open Access Journals (Sweden)

    Amber eFrick

    2015-04-01

    Full Text Available The role of the immune system in response to chemotherapeutic agents remains elusive. The interpatient variability observed in immune and chemotherapeutic cytotoxic responses is likely, at least in part, due to complex genetic differences. Through the use of a panel of genetically diverse mouse inbred strains, we developed a drug screening platform aimed at identifying genes underlying these chemotherapeutic cytotoxic effects on immune cells. Using genome-wide association studies (GWAS), we identified four genome-wide significant quantitative trait loci (QTL) that contributed to the sensitivity of doxorubicin and idarubicin in immune cells. Of particular interest, a locus on chromosome 16 was significantly associated with cell viability following idarubicin administration (p = 5.01x10-8). Within this QTL lies App, which encodes amyloid beta precursor protein. Comparison of dose-response curves verified that T-cells in App knockout mice were more sensitive to idarubicin than those of C57BL/6J control mice (p < 0.05). In conclusion, the cellular screening approach coupled with GWAS led to the identification and subsequent validation of a gene involved in T-cell viability after idarubicin treatment. Previous studies have suggested a role for App in in vitro and in vivo cytotoxicity to anticancer agents; the overexpression of App enhances resistance, while the knockdown of this gene is deleterious to cell viability. Thus, further investigations should include performing mechanistic studies, validating additional genes from the GWAS, including Ppfia1 and Ppfibp1, and ultimately translating the findings to in vivo and human studies.

  2. Developing and validating a method for monitoring and tracking changes in southern pine beetle hazard at the landscape level

    Science.gov (United States)

    Ronald Billings; L. Allen Smith; Jin Zhu; Shailu Verma; Nick Kouchoukos; Joon Heo

    2010-01-01

    The objective of this research project is to develop and validate a method for using satellite images and digital geospatial data to map the distribution of southern pine beetle (SPB) habitats across the pinelands of east Texas. Our approach builds on a work that used photo interpretation and discriminant analysis to identify and evaluate environmental conditions...

  3. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use the state of the art uncertainty analysis applying different turbulence models and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.

  4. What do conscientious people do? Development and validation of the Behavioral Indicators of Conscientiousness (BIC).

    Science.gov (United States)

    Jackson, Joshua J; Wood, Dustin; Bogg, Tim; Walton, Kate E; Harms, Peter D; Roberts, Brent W

    2010-08-01

    Typical assessments of personality traits collapse behaviors, thoughts, and feelings into a single measure without distinguishing between these different manifestations. To address this lack of specification, the current study develops and validates a measure that assesses a number of broad behaviors associated with the personality trait of conscientiousness (the Behavioral Indicators of Conscientiousness; BIC). Findings suggest that the lower-order structure of conscientious behaviors is mostly similar to the lower-order structure in extant trait measures. Furthermore, a daily diary method was used to validate the BIC against frequency counts of conscientious behavior. Overall, the results identify specific behaviors that conscientious individuals tend to perform and highlight possible advantages of this approach over broad trait assessment.

  5. High throughput sequencing and proteomics to identify immunogenic proteins of a new pathogen: the dirty genome approach.

    Science.gov (United States)

    Greub, Gilbert; Kebbi-Beghdadi, Carole; Bertelli, Claire; Collyn, François; Riederer, Beat M; Yersin, Camille; Croxatto, Antony; Raoult, Didier

    2009-12-23

    With the availability of new generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion needs a costly and time-consuming gap closure, especially when containing highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, lack of release of genome data during gap closure stage is clearly medically counterproductive. We thus investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae was retrieved even with relatively short reads from Genome Sequencer 20 and Solexa. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to elaborate the first steps of an ELISA. This work constitutes the proof of principle for a dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics to rapidly identify new immunogenic proteins useful to develop in the future specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty genome sequencing/proteomic approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful to develop DNA based diagnostic tests. All these diagnostic tools will allow further evaluations of the pathogenic potential of this obligate intracellular bacterium.

  6. The HepTestContest: a global innovation contest to identify approaches to hepatitis B and C testing.

    Science.gov (United States)

    Tucker, Joseph D; Meyers, Kathrine; Best, John; Kaplan, Karyn; Pendse, Razia; Fenton, Kevin A; Andrieux-Meyer, Isabelle; Figueroa, Carmen; Goicochea, Pedro; Gore, Charles; Ishizaki, Azumi; Khwairakpam, Giten; Miller, Veronica; Mozalevskis, Antons; Ninburg, Michael; Ocama, Ponsiano; Peeling, Rosanna; Walsh, Nick; Colombo, Massimo G; Easterbrook, Philippa

    2017-11-01

    Innovation contests are a novel approach to elicit good ideas and innovative practices in various areas of public health. There remains limited published literature on approaches to deliver hepatitis testing. The purpose of this innovation contest was to identify examples of different hepatitis B and C approaches to support countries in their scale-up of hepatitis testing and to supplement development of formal recommendations on service delivery in the 2017 World Health Organization hepatitis B and C testing guidelines. This contest involved four steps: 1) establishment of a multisectoral steering committee to coordinate a call for contest entries; 2) dissemination of the call for entries through diverse media (Facebook, Twitter, YouTube, email listservs, academic journals); 3) independent ranking of submissions by a panel of judges according to pre-specified criteria (clarity of testing model, innovation, effectiveness, next steps) using a 1-10 scale; 4) recognition of highly ranked entries through presentation at international conferences, commendation certificate, and inclusion as a case study in the WHO 2017 testing guidelines. The innovation contest received 64 entries from 27 countries and took a total of 4 months to complete. Sixteen entries were directly included in the WHO testing guidelines. The entries covered testing in different populations, including primary care patients (n = 5), people who inject drugs (PWID) (n = 4), pregnant women (n = 4), general populations (n = 4), high-risk groups (n = 3), relatives of people living with hepatitis B and C (n = 2), migrants (n = 2), incarcerated individuals (n = 2), workers (n = 2), and emergency department patients (n = 2). A variety of different testing delivery approaches were employed, including integrated HIV-hepatitis testing (n = 12); integrated testing with harm reduction and addiction services (n = 9); use of electronic medical records to support targeted testing (n = 8

  7. 77 FR 27135 - HACCP Systems Validation

    Science.gov (United States)

    2012-05-09

    ... validation, the journal article should identify E. coli O157:H7 and other pathogens as the hazard that the..., or otherwise processes ground beef may determine that E. coli O157:H7 is not a hazard reasonably... specifications that require that the establishment's suppliers apply validated interventions to address E. coli...

  8. Approaches to Validation of Models for Low Gravity Fluid Behavior

    Science.gov (United States)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data are described by empirical correlation rather than fundamental relation; detailed measurements of the flow field have not been made; free surface shapes are observed only through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, but the zero-gravity time available has been only seconds.

  9. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    Directory of Open Access Journals (Sweden)

    Dori Barnett

    2012-06-01

    Full Text Available A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis. Implications for future research directions and policy and practice in the field of special and alternative education are discussed.

  10. Evaluating response shift in training evaluation: comparing the retrospective pretest with an adapted measurement invariance approach in a classroom management training program.

    Science.gov (United States)

    Piwowar, Valentina; Thiel, Felicitas

    2014-10-01

    Response shift (RS) can threaten the internal validity of pre-post designs. As RS may indicate a redefinition of the target construct, its occurrence in training evaluation is rather likely. The most common approach to deal with RS is to implement a retrospective pretest (then-test) instead of the traditional pretest. In health psychology, an adapted measurement invariance approach (MIad) was developed as an alternative technique to study RS. Results produced by identifying RS with the two approaches were rarely studied simultaneously or within an experimental framework. To study RS in two different treatment conditions and compare results produced by both techniques in identifying various types of RS, we further studied validity aspects of the then-test. We evaluated RS by applying the then-test procedure (TP) and the measurement invariance approach (MIad) within an experimental design: participants either attended a short-term or a long-term classroom management training program. Participants were 146 student teachers in their first year of master's study. Pre (before training), post, and then self-ratings (after training) on classroom management knowledge were administered. Results indicated that the two approaches do not yield the same results. The MIad identified more RS, including group-specific RS, whereas the TP found less RS and only little evidence for group-specific RS. Further research is needed to study the usability and validity of the respective approaches. In particular, the usability of the then-test seems to be challenged. © The Author(s) 2014.

  11. Validation of Simulation Models without Knowledge of Parameters Using Differential Algebra

    Directory of Open Access Journals (Sweden)

    Björn Haffke

    2015-01-01

    Full Text Available This study deals with the external validation of simulation models using methods from differential algebra. Without any system identification or iterative numerical methods, this approach provides evidence that the equations of a model can represent measured and simulated sets of data. This is very useful to check if a model is, in general, suitable. In addition, the application of this approach to verification of the similarity between the identifiable parameters of two models with different sets of input and output measurements is demonstrated. We present a discussion on how the method can be used to find parameter deviations between any two models. The advantage of this method is its applicability to nonlinear systems as well as its algorithmic nature, which makes it easy to automate.

  12. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin

    2016-04-27

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  13. The Earthquake‐Source Inversion Validation (SIV) Project

    KAUST Repository

    Mai, Paul Martin; Schorlemmer, Danijel; Page, Morgan; Ampuero, Jean‐Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran Kumar; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish Chandra; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite-fault earthquake source inversions infer the (time-dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake-source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward-modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source-model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake-source imaging problem.

  14. The Earthquake‐Source Inversion Validation (SIV) Project

    Science.gov (United States)

    Mai, P. Martin; Schorlemmer, Danijel; Page, Morgan T.; Ampuero, Jean-Paul; Asano, Kimiyuki; Causse, Mathieu; Custodio, Susana; Fan, Wenyuan; Festa, Gaetano; Galis, Martin; Gallovic, Frantisek; Imperatori, Walter; Käser, Martin; Malytskyy, Dmytro; Okuwaki, Ryo; Pollitz, Fred; Passone, Luca; Razafindrakoto, Hoby N. T.; Sekiguchi, Haruko; Song, Seok Goo; Somala, Surendra N.; Thingbaijam, Kiran K. S.; Twardzik, Cedric; van Driel, Martin; Vyas, Jagdish C.; Wang, Rongjiang; Yagi, Yuji; Zielke, Olaf

    2016-01-01

    Finite‐fault earthquake source inversions infer the (time‐dependent) displacement on the rupture surface from geophysical data. The resulting earthquake source models document the complexity of the rupture process. However, multiple source models for the same earthquake, obtained by different research teams, often exhibit remarkable dissimilarities. To address the uncertainties in earthquake‐source inversion methods and to understand strengths and weaknesses of the various approaches used, the Source Inversion Validation (SIV) project conducts a set of forward‐modeling exercises and inversion benchmarks. In this article, we describe the SIV strategy, the initial benchmarks, and current SIV results. Furthermore, we apply statistical tools for quantitative waveform comparison and for investigating source‐model (dis)similarities that enable us to rank the solutions, and to identify particularly promising source inversion approaches. All SIV exercises (with related data and descriptions) and statistical comparison tools are available via an online collaboration platform, and we encourage source modelers to use the SIV benchmarks for developing and testing new methods. We envision that the SIV efforts will lead to new developments for tackling the earthquake‐source imaging problem.

  15. Identifying MMORPG Bots: A Traffic Analysis Approach

    Science.gov (United States)

    Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin

    2008-12-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
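
    The first traffic feature above, regularity in the release time of client commands, can be quantified without inspecting packet payloads. A minimal sketch, assuming per-client command timestamps are available; the traces and the coefficient-of-variation statistic below are illustrative, not the authors' implementation:

        import numpy as np

        def command_regularity(timestamps):
            """Coefficient of variation of inter-command intervals.

            Human players tend to produce irregular intervals (high CV), while
            simple bots issue commands on a near-fixed schedule (low CV).
            """
            intervals = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
            return intervals.std() / intervals.mean()

        # Hypothetical traces: a bot firing roughly every 0.5 s vs. a human player.
        rng = np.random.default_rng(0)
        bot_trace = np.cumsum(0.5 + rng.normal(0, 0.01, 200))
        human_trace = np.cumsum(rng.exponential(0.8, 200))
        print(command_regularity(bot_trace), command_regularity(human_trace))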

  16. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Full Text Available Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by (1) the regularity in the release time of client commands, (2) the trend and magnitude of traffic burstiness in multiple time scales, and (3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.

  17. On the validity of the effective field theory approach to SM precision tests

    Energy Technology Data Exchange (ETDEWEB)

    Contino, Roberto [EPFL, Lausanne (Switzerland). Inst. de Theorie des Phenomenes Physiques; CERN, Geneva (Switzerland). Theoretical Physics Dept.; Falkowski, Adam [Paris-11 Univ., 91 - Orsay (France). Lab. de Physique Theorique; Goertz, Florian; Riva, Francesco [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Grojean, Christophe [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-09-15

    We discuss the conditions for an effective field theory (EFT) to give an adequate low-energy description of an underlying physics beyond the Standard Model (SM). Starting from the EFT where the SM is extended by dimension-6 operators, experimental data can be used without further assumptions to measure (or set limits on) the EFT parameters. The interpretation of these results requires instead a set of broad assumptions (e.g. power counting rules) on the UV dynamics. This allows one to establish, in a bottom-up approach, the validity range of the EFT description, and to assess the error associated with the truncation of the EFT series. We give a practical prescription on how experimental results could be reported, so that they admit a maximally broad range of theoretical interpretations. Namely, the experimental constraints on dimension-6 operators should be reported as functions of the kinematic variables that set the relevant energy scale of the studied process. This is especially important for hadron collider experiments where collisions probe a wide range of energy scales.

  18. Identifying reports of randomized controlled trials (RCTs) via a hybrid machine learning and crowdsourcing approach.

    Science.gov (United States)

    Wallace, Byron C; Noel-Storr, Anna; Marshall, Iain J; Cohen, Aaron M; Smalheiser, Neil R; Thomas, James

    2017-11-01

    Identifying all published reports of randomized controlled trials (RCTs) is an important aim, but it requires extensive manual effort to separate RCTs from non-RCTs, even using current machine learning (ML) approaches. We aimed to make this process more efficient via a hybrid approach using both crowdsourcing and ML. We trained a classifier to discriminate between citations that describe RCTs and those that do not. We then adopted a simple strategy of automatically excluding citations deemed very unlikely to be RCTs by the classifier and deferring to crowdworkers otherwise. Combining ML and crowdsourcing provides a highly sensitive RCT identification strategy (our estimates suggest 95%-99% recall) with substantially less effort (we observed a reduction of around 60%-80%) than relying on manual screening alone. Hybrid crowd-ML strategies warrant further exploration for biomedical curation/annotation tasks. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
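
    The triage logic described above (automatically exclude citations the classifier deems very unlikely to be RCTs, defer the rest to crowdworkers) can be sketched as follows. The classifier choice, feature representation and probability threshold are assumptions for illustration, not the authors' pipeline:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical training data: citation texts labelled 1 (RCT) / 0 (non-RCT).
        train_texts = ["randomized controlled trial of drug A", "case report of rare toxicity"]
        train_labels = [1, 0]

        clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        clf.fit(train_texts, train_labels)

        def triage(citations, exclude_below=0.05):
            """Auto-exclude low-probability citations; defer the rest to the crowd."""
            probs = clf.predict_proba(citations)[:, 1]
            auto_excluded = [c for c, p in zip(citations, probs) if p < exclude_below]
            to_crowd = [c for c, p in zip(citations, probs) if p >= exclude_below]
            return auto_excluded, to_crowd

        auto_out, crowd = triage(["double-blind randomised trial", "retrospective chart review"])
        print(len(auto_out), "auto-excluded,", len(crowd), "sent to crowdworkers")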

  19. A multicriteria approach to identify investment opportunities for the exploitation of the clean development mechanism

    International Nuclear Information System (INIS)

    Diakoulaki, D.; Georgiou, P.; Tourkolias, C.; Georgopoulou, E.; Lalas, D.; Mirasgedis, S.; Sarafidis, Y.

    2007-01-01

    The aim of the present paper is to investigate the prospects for the exploitation of the Kyoto Protocol's Clean Development Mechanism (CDM) in Greece. The paper addresses three questions: in which country, what kind of investment, and with which economic and environmental return? The proposed approach is based on a multicriteria analysis for identifying priority countries and interesting investment opportunities in each priority country. These opportunities are then evaluated through a conventional financial analysis in order to assess their economic and environmental attractiveness. To this purpose, the IRR of a typical project in each investment category is calculated by taking into account country-specific parameters, such as baseline emission factors, load factors, costs, energy prices, etc. The results reveal substantial differences in the economic and environmental return of different types of projects in different host countries and show that for the full exploitation of the CDM a multifaceted approach to decision-making is necessary.
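
    The financial screening step, calculating the IRR of a typical project, amounts to finding the discount rate at which the net present value of the project's cash flows vanishes. A sketch with hypothetical cash flows (the figures are invented, not taken from the study):

        from scipy.optimize import brentq

        def npv(rate, cash_flows):
            """Net present value of a cash-flow series, year 0 first."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        def irr(cash_flows):
            """Internal rate of return: the discount rate at which NPV = 0."""
            return brentq(lambda r: npv(r, cash_flows), -0.99, 10.0)

        # Hypothetical CDM project: upfront investment, then yearly revenues
        # including certified emission reduction (CER) sales.
        flows = [-1_000_000] + [180_000] * 10
        print(f"IRR = {irr(flows):.1%}")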

  20. Using Optical Markers of Non-dysplastic Rectal Epithelial Cells to Identify Patients With Ulcerative Colitis (UC) - Associated Neoplasia

    Science.gov (United States)

    Bista, Rajan K.; Brentnall, Teresa A.; Bronner, Mary P.; Langmead, Christopher J.; Brand, Randall E.; Liu, Yang

    2011-01-01

    BACKGROUND Current surveillance guidelines for patients with long-standing ulcerative colitis (UC) recommend repeated colonoscopy with random biopsies, which is time-consuming, discomforting and expensive. A less invasive strategy is to identify neoplasia by analyzing biomarkers from the more accessible rectum to predict the need for a full colonoscopy. The goal of this pilot study is to evaluate whether optical markers of rectal mucosa derived from a novel optical technique – partial-wave spectroscopic microscopy (PWS) could identify UC patients with high-grade dysplasia (HGD) or cancer (CA) present anywhere in their colon. METHODS Banked frozen non-dysplastic mucosal rectal biopsies were used from 28 UC patients (15 without dysplasia and 13 with concurrent HGD or CA). The specimen slides were made using a touch prep method and underwent PWS analysis. We divided the patients into two groups: 13 as a training set and an independent 15 as a validation set. RESULTS We identified six optical markers, ranked by measuring the information gain with respect to the outcome of cancer. The most effective markers were selected by maximizing the cross validated training accuracy of a Naive Bayes classifier. The optimal classifier was applied to the validation data yielding 100% sensitivity and 75% specificity. CONCLUSIONS Our results indicate that the PWS-derived optical markers can accurately predict UC patients with HGD/CA through assessment of rectal epithelial cells. By aiming for a high sensitivity, our approach could potentially simplify the surveillance of UC patients and improve overall resource utilization by identifying patients with HGD/CA who should proceed with colonoscopy. PMID:21351200
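
    The marker-selection procedure described above (rank optical markers by information gain, then keep the subset that maximizes cross-validated training accuracy of a Naive Bayes classifier) can be sketched generically. The random matrices below stand in for the PWS measurements and are assumptions, not patient data:

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score

        # Placeholder training set: rows = patients, columns = candidate optical markers.
        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(13, 6))
        y_train = np.array([0] * 7 + [1] * 6)   # 0 = no HGD/CA, 1 = HGD/CA

        # Rank markers by information gain (mutual information) with the outcome.
        ranking = np.argsort(mutual_info_classif(X_train, y_train))[::-1]

        # Keep the prefix of the ranking that maximizes cross-validated accuracy.
        best_subset, best_score = ranking[:1], -np.inf
        for k in range(1, len(ranking) + 1):
            subset = ranking[:k]
            score = cross_val_score(GaussianNB(), X_train[:, subset], y_train, cv=3).mean()
            if score > best_score:
                best_subset, best_score = subset, score
        print("selected markers:", best_subset, "cv accuracy:", round(best_score, 2))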

  1. Expression profiling identifies genes involved in emphysema severity

    Directory of Open Access Journals (Sweden)

    Bowman Rayleen V

    2009-09-01

    Full Text Available Abstract Chronic obstructive pulmonary disease (COPD) is a major public health problem. The aim of this study was to identify genes involved in emphysema severity in COPD patients. Gene expression profiling was performed on total RNA extracted from non-tumor lung tissue from 30 smokers with emphysema. Class comparison analysis based on gas transfer measurement was performed to identify differentially expressed genes. Genes were then selected for technical validation by quantitative reverse transcriptase-PCR (qRT-PCR) if also represented on microarray platforms used in previously published emphysema studies. Genes technically validated advanced to tests of biological replication by qRT-PCR using an independent test set of 62 lung samples. Class comparison identified 98 differentially expressed genes. Gene expression profiling of lung from emphysema patients identified seven candidate genes associated with emphysema severity, including COL6A3, SERPINF1, ZNHIT6, NEDD4, CDKN2A, NRN1 and GSTM3.

  2. Forward genetic screening for regulators involved in cholesterol synthesis using validation-based insertional mutagenesis.

    Directory of Open Access Journals (Sweden)

    Wei Jiang

    Full Text Available Somatic cell genetics is a powerful approach for unraveling the regulatory mechanisms of cholesterol metabolism. However, it is difficult to identify the mutant gene(s) because cells are usually mutagenized chemically or physically. To identify important genes controlling cholesterol biosynthesis, an unbiased forward genetics approach named validation-based insertional mutagenesis (VBIM) was used to isolate and characterize the 25-hydroxycholesterol (25-HC)-resistant and SR-12813-resistant mutants. Here we report that five mutant cell lines were isolated. Among these, four sterol-resistant mutants either contain a truncated NH2-terminal domain of sterol regulatory element-binding protein 2 (SREBP-2) terminating at amino acid (aa) 400, or harbor an overexpressed SREBP cleavage-activating protein (SCAP). In addition, one SR-12813-resistant mutant was identified that contains a truncated COOH-terminal catalytic domain of 3-hydroxy-3-methylglutaryl-coenzyme A (HMG-CoA) reductase. This study demonstrates that the VBIM system can be a powerful tool to screen for novel regulatory genes in cholesterol biosynthesis.

  3. International Harmonization and Cooperation in the Validation of Alternative Methods.

    Science.gov (United States)

    Barroso, João; Ahn, Il Young; Caldeira, Cristiane; Carmichael, Paul L; Casey, Warren; Coecke, Sandra; Curren, Rodger; Desprez, Bertrand; Eskes, Chantra; Griesinger, Claudius; Guo, Jiabin; Hill, Erin; Roi, Annett Janusch; Kojima, Hajime; Li, Jin; Lim, Chae Hyung; Moura, Wlamir; Nishikawa, Akiyoshi; Park, HyeKyung; Peng, Shuangqing; Presgrave, Octavio; Singer, Tim; Sohn, Soo Jung; Westmoreland, Carl; Whelan, Maurice; Yang, Xingfen; Yang, Ying; Zuang, Valérie

    The development and validation of scientific alternatives to animal testing is important not only from an ethical perspective (implementation of the 3Rs), but also to improve safety assessment decision making with the use of mechanistic information of higher relevance to humans. To be effective in these efforts, it is, however, imperative that validation centres, industry, regulatory bodies, academia and other interested parties ensure strong international cooperation, cross-sector collaboration and intense communication in the design, execution, and peer review of validation studies. Such an approach is critical to achieve harmonized and more transparent approaches to method validation, peer review and recommendation, which will ultimately expedite the international acceptance of valid alternative methods or strategies by regulatory authorities and their implementation and use by stakeholders. It also makes it possible to achieve greater efficiency and effectiveness by avoiding duplication of effort and leveraging limited resources. To achieve these goals, the International Cooperation on Alternative Test Methods (ICATM) was established in 2009 by validation centres from Europe, the USA, Canada and Japan. ICATM was later joined by Korea in 2011 and currently also counts Brazil and China as observers. This chapter describes the existing differences across world regions and major efforts carried out for achieving consistent international cooperation and harmonization in the validation and adoption of alternative approaches to animal testing.

  4. Participatory approach to identify interventions to improve the health, safety, and work productivity of smallholder women vegetable farmers in the Gambia.

    Science.gov (United States)

    Vanderwal, Londa; Rautiainen, Risto; Ramirez, Marizen; Kuye, Rex; Peek-Asa, Corinne; Cook, Thomas; Culp, Kennith; Donham, Kelley

    2011-03-01

    This paper describes the qualitative, community-based participatory approach used to identify culturally-acceptable and sustainable interventions to improve the occupational health, safety, and productivity of smallholder women vegetable farmers in The Gambia (West Africa). This approach was used to conduct: 1) analysis of the tasks and methods traditionally used in vegetable production, and 2) selection of interventions. The most arduous garden tasks that were amenable to interventions were identified, and the interventions were selected through a participatory process for further evaluation. Factors contributing to the successful implementation of the participatory approach used in this study included the following: 1) ensuring that cultural norms were respected and observed; 2) working closely with the existing garden leadership structure; and 3) research team members working with the subjects for an extended period of time to gain first-hand understanding of the selected tasks and to build credibility with the subjects.

  5. Hidden Markov model approach for identifying the modular framework of the protein backbone.

    Science.gov (United States)

    Camproux, A C; Tuffery, P; Chevrolat, J P; Boisvieux, J F; Hazout, S

    1999-12-01

    The hidden Markov model (HMM) was used to identify recurrent short 3D structural building blocks (SBBs) describing protein backbones, independently of any a priori knowledge. Polypeptide chains are decomposed into a series of short segments defined by their inter-alpha-carbon distances. Basically, the model takes into account the sequentiality of the observed segments and assumes that each one corresponds to one of several possible SBBs. Fitting the model to a database of non-redundant proteins allowed us to decode proteins in terms of 12 distinct SBBs with different roles in protein structure. Some SBBs correspond to classical regular secondary structures. Others correspond to a significant subdivision of their bounding regions previously considered to be a single pattern. The major contribution of the HMM is that this model implicitly takes into account the sequential connections between SBBs and thus describes the most probable pathways by which the blocks are connected to form the framework of the protein structures. Validation of the SBBs code was performed by extracting SBB series repeated in recoding proteins and examining their structural similarities. Preliminary results on the sequence specificity of SBBs suggest promising perspectives for the prediction of SBBs or series of SBBs from the protein sequences.
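
    The core modelling step, fitting a hidden Markov model whose states are the structural building blocks and whose observations are inter-alpha-carbon distance descriptors of consecutive backbone segments, can be sketched with a generic Gaussian HMM. The 12 states come from the abstract; the library choice (hmmlearn), data shapes and placeholder observations are assumptions, not the authors' code:

        import numpy as np
        from hmmlearn.hmm import GaussianHMM   # assumed library choice

        # Placeholder observations: one row per backbone segment, columns are
        # inter-alpha-carbon distance descriptors; several chains concatenated.
        rng = np.random.default_rng(1)
        segments = rng.normal(size=(500, 4))
        chain_lengths = [200, 150, 150]        # lengths of the individual chains

        model = GaussianHMM(n_components=12, covariance_type="diag", n_iter=100)
        model.fit(segments, lengths=chain_lengths)

        # Decode one chain into its most probable series of structural building blocks.
        sbb_series = model.predict(segments[:200])
        print(sbb_series[:20])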

  6. Derivation and Cross-Validation of Cutoff Scores for Patients With Schizophrenia Spectrum Disorders on WAIS-IV Digit Span-Based Performance Validity Measures.

    Science.gov (United States)

    Glassmire, David M; Toofanian Ross, Parnian; Kinney, Dominique I; Nitch, Stephen R

    2016-06-01

    Two studies were conducted to identify and cross-validate cutoff scores on the Wechsler Adult Intelligence Scale-Fourth Edition Digit Span-based embedded performance validity (PV) measures for individuals with schizophrenia spectrum disorders. In Study 1, normative scores were identified on Digit Span-embedded PV measures among a sample of patients (n = 84) with schizophrenia spectrum diagnoses who had no known incentive to perform poorly and who put forth valid effort on external PV tests. Previously identified cutoff scores resulted in unacceptable false positive rates and lower cutoff scores were adopted to maintain specificity levels ≥90%. In Study 2, the revised cutoff scores were cross-validated within a sample of schizophrenia spectrum patients (n = 96) committed as incompetent to stand trial. Performance on Digit Span PV measures was significantly related to Full Scale IQ in both studies, indicating the need to consider the intellectual functioning of examinees with psychotic spectrum disorders when interpreting scores on Digit Span PV measures. © The Author(s) 2015.
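
    The normative logic of Study 1, lowering the embedded-validity cutoff until specificity in a known-valid clinical sample reaches at least 90%, can be expressed generically. The scores below are invented placeholders, not WAIS-IV data:

        import numpy as np

        def highest_cutoff_with_specificity(valid_scores, min_specificity=0.90):
            """Highest failing cutoff (score <= cutoff flagged as invalid performance)
            that keeps the false-positive rate among genuinely valid performers at or
            below 1 - min_specificity."""
            valid_scores = np.asarray(valid_scores)
            for cutoff in sorted(set(valid_scores), reverse=True):
                specificity = np.mean(valid_scores > cutoff)
                if specificity >= min_specificity:
                    return cutoff, specificity
            return None, None

        # Hypothetical Digit Span-based scores from patients with no incentive to feign.
        scores = [6, 7, 7, 8, 8, 8, 9, 9, 10, 10, 11, 5, 6, 7, 9, 8]
        print(highest_cutoff_with_specificity(scores))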

  7. Issues in developing valid assessments of speech pathology students' performance in the workplace.

    Science.gov (United States)

    McAllister, Sue; Lincoln, Michelle; Ferguson, Alison; McAllister, Lindy

    2010-01-01

    Workplace-based learning is a critical component of professional preparation in speech pathology. A validated assessment of this learning is seen to be 'the gold standard', but it is difficult to develop because of design and validation issues. These issues include the role and nature of judgement in assessment, challenges in measuring quality, and the relationship between assessment and learning. Valid assessment of workplace-based performance needs to capture the development of competence over time and account for both occupation-specific and generic competencies. This paper reviews important conceptual issues in the design of valid and reliable workplace-based assessments of competence, including assessment content, process, impact on learning, measurement issues, and validation strategies. It then goes on to share what has been learned about quality assessment and validation of a workplace-based performance assessment using competency-based ratings. The outcomes of a four-year national development and validation of an assessment tool are described. A literature review of issues in conceptualizing, designing, and validating workplace-based assessments was conducted. Key factors to consider in the design of a new tool were identified and built into the cycle of design, trialling, and data analysis in the validation stages of the development process. This paper provides an accessible overview of factors to consider in the design and validation of workplace-based assessment tools. It presents strategies used in the development and national validation of a tool, COMPASS, used in every speech pathology programme in Australia, New Zealand, and Singapore. The paper also describes Rasch analysis, a model-based statistical approach which is useful for establishing validity and reliability of assessment tools. Through careful attention to conceptual and design issues in the development and trialling of workplace-based assessments, it has been possible to develop the

  8. Identifying natural compounds as multi-target-directed ligands against Alzheimer's disease: an in silico approach.

    Science.gov (United States)

    Ambure, Pravin; Bhat, Jyotsna; Puzyn, Tomasz; Roy, Kunal

    2018-04-23

    Alzheimer's disease (AD) is a multi-factorial disease, which can be simply outlined as an irreversible and progressive neurodegenerative disorder with an unclear root cause. It is a major cause of dementia in elderly people. In the present study, utilizing the structural and biological activity information of ligands for five important and most studied vital targets (i.e. cyclin-dependent kinase 5, β-secretase, monoamine oxidase B, glycogen synthase kinase 3β, acetylcholinesterase) that are believed to be effective against AD, we have developed five classification models using the linear discriminant analysis (LDA) technique. Considering the importance of data curation, we have paid particular attention to chemical and biological data curation, which is a difficult task, especially in the case of large datasets. Thus, to ease the curation process we have designed Konstanz Information Miner (KNIME) workflows, which are made available at http://teqip.jdvu.ac.in/QSAR_Tools/ . The developed models were appropriately validated based on the predictions for experiment-derived data from test sets, as well as true external set compounds including known multi-target compounds. The domain of applicability for each classification model was checked based on a confidence estimation approach. Further, these validated models were employed for screening of natural compounds collected from the InterBioScreen natural database (https://www.ibscreen.com/natural-compounds). The natural compounds that were categorized as 'actives' in at least two of the five developed classification models were considered multi-target leads, and these compounds were further screened using a drug-likeness filter and molecular docking, and then thoroughly analyzed using molecular dynamics studies. Finally, the most promising multi-target natural compounds against AD are suggested.
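
    The per-target workflow (fit an LDA classifier on ligand descriptors, check the applicability domain of screened compounds, and keep predicted actives) can be sketched as follows. The descriptor matrices, the simple leverage-style domain check and all values are assumptions for illustration, not the published models:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Placeholder descriptors (rows = ligands, columns = molecular descriptors)
        # and activity classes for one target, e.g. acetylcholinesterase.
        rng = np.random.default_rng(7)
        X_train, y_train = rng.normal(size=(120, 10)), rng.integers(0, 2, 120)
        X_screen = rng.normal(size=(30, 10))          # natural compounds to screen

        clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
        clf.fit(X_train, y_train)

        # Crude applicability-domain check: flag screened compounds whose standardized
        # descriptors fall far outside the training distribution.
        scaler = clf.named_steps["standardscaler"]
        inside_domain = np.all(np.abs(scaler.transform(X_screen)) < 3.0, axis=1)

        predictions = clf.predict(X_screen)
        actives = np.where((predictions == 1) & inside_domain)[0]
        print("candidate actives within the applicability domain:", actives)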

  9. Selection, calibration, and validation of models of tumor growth.

    Science.gov (United States)

    Lima, E A B F; Oden, J T; Hormuth, D A; Yankeelov, T E; Almeida, R C

    2016-11-01

    This paper presents general approaches for addressing some of the most important issues in predictive computational oncology concerned with developing classes of predictive models of tumor growth: first, the process of developing mathematical models of vascular tumors evolving in the complex, heterogeneous macroenvironment of living tissue; second, the selection of the most plausible models among these classes, given relevant observational data; third, the statistical calibration and validation of models in these classes; and finally, the prediction of key Quantities of Interest (QOIs) relevant to patient survival and the effect of various therapies. The most challenging aspect of this endeavor is that all of these issues often involve confounding uncertainties: in observational data, in model parameters, in model selection, and in the features targeted in the prediction. Our approach can be referred to as "model agnostic" in that no single model is advocated; rather, a general approach is presented that explores powerful mixture-theory representations of tissue behavior while accounting for a range of relevant biological factors, which leads to many potentially predictive models. Representative classes are then identified which provide a starting point for the implementation of the Occam Plausibility Algorithm (OPAL), which enables the modeler to select the most plausible models (for given data) and to determine if the model is a valid tool for predicting tumor growth and morphology (in vivo). All of these approaches account for uncertainties in the model, the observational data, the model parameters, and the target QOI. We demonstrate these processes by comparing a list of models for tumor growth, including reaction-diffusion models, phase-field models, and models with and without mechanical deformation effects, for glioma growth measured in murine experiments. Examples are provided that exhibit quite acceptable predictions of tumor growth in laboratory

  10. Development and validation of cell-based luciferase reporter gene assays for measuring neutralizing anti-drug antibodies against interferon beta

    DEFF Research Database (Denmark)

    Hermanrud, Christina; Ryner, Malin; Luft, Thomas

    2016-01-01

    Neutralizing anti-drug antibodies (NAbs) against therapeutic interferon beta (IFNβ) in people with multiple sclerosis (MS) are measured with cell-based bioassays. The aim of this study was to redevelop and validate two luciferase reporter-gene bioassays, LUC and iLite, using a cut-point approach to identify NAb positive samples. Such an approach is favored by the pharmaceutical industry and governmental regulatory agencies as it has a clear statistical basis and overcomes the limitations of the current assays based on the Kawade principle. The work was conducted following the latest assay guidelines… a normal distribution for the majority of runs, allowing a parametric approach for cut-point calculation to be used, where NAb positive samples could be identified with 95% confidence. An analysis of means and variances indicated that a floating cut-point should be used for all assays. The assays…
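
    The parametric cut-point idea can be illustrated with a generic sketch: when responses of NAb-negative samples are approximately normal, a cut-point giving roughly 95% confidence is the mean shifted by 1.645 standard deviations, and a floating cut-point re-anchors that offset to each run's negative-control mean. All numbers and the response direction below are placeholders, not assay data:

        import numpy as np

        def parametric_cutpoint(drug_naive_responses, confidence_z=1.645):
            """Fixed cut-point from a validation set of NAb-negative samples.

            Assumes approximately normally distributed responses; here a response
            below the cut-point is called NAb positive (lower signal = neutralization).
            """
            responses = np.asarray(drug_naive_responses, dtype=float)
            return responses.mean() - confidence_z * responses.std(ddof=1)

        def floating_cutpoint(fixed_offset, run_negative_control_mean):
            """Re-anchor the validated offset to each run's negative-control mean."""
            return run_negative_control_mean + fixed_offset

        # Hypothetical normalized luciferase responses of NAb-negative samples.
        negatives = np.random.default_rng(3).normal(loc=1.0, scale=0.08, size=50)
        cut = parametric_cutpoint(negatives)
        offset = cut - negatives.mean()
        print(cut, floating_cutpoint(offset, run_negative_control_mean=0.95))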

  11. New approaches for identifying and testing potential new anti-asthma agents.

    Science.gov (United States)

    Licari, Amelia; Castagnoli, Riccardo; Brambilla, Ilaria; Marseglia, Alessia; Tosca, Maria Angela; Marseglia, Gian Luigi; Ciprandi, Giorgio

    2018-01-01

    Asthma is a chronic disease with significant heterogeneity in clinical features, disease severity, pattern of underlying disease mechanisms, and responsiveness to specific treatments. While the majority of asthmatic patients are controlled by standard pharmacological strategies, a significant subgroup has limited therapeutic options representing a major unmet need. Ongoing asthma research aims to better characterize distinct clinical phenotypes, molecular endotypes, associated reliable biomarkers, and also to develop a series of new effective targeted treatment modalities. Areas covered: The expanding knowledge on the pathogenetic mechanisms of asthma has allowed researchers to investigate a range of new treatment options matched to patient profiles. The aim of this review is to provide a comprehensive and updated overview of the currently available, new and developing approaches for identifying and testing potential treatment options for asthma management. Expert opinion: Future therapeutic strategies for asthma require the identification of reliable biomarkers that can help with diagnosis and endotyping, in order to determine the most effective drug for the right patient phenotype. Furthermore, in addition to the identification of clinical and inflammatory phenotypes, it is expected that a better understanding of the mechanisms of airway remodeling will likely optimize asthma targeted treatment.

  12. On the validation of risk analysis-A commentary

    International Nuclear Information System (INIS)

    Rosqvist, Tony

    2010-01-01

    Aven and Heide (2009) [1] provided interesting views on the reliability and validation of risk analysis. The four validation criteria presented are contrasted with modelling features related to the relative frequency-based and Bayesian approaches to risk analysis. In this commentary I would like to bring forth some issues on validation that partly confirm and partly suggest changes in the interpretation of the introduced validation criteria, especially in the context of low probability-high consequence systems. The mental model of an expert in assessing probabilities is argued to be a key notion in understanding the validation of a risk analysis.

  13. Using distant supervised learning to identify protein subcellular localizations from full-text scientific articles.

    Science.gov (United States)

    Zheng, Wu; Blake, Catherine

    2015-10-01

    Databases of curated biomedical knowledge, such as the protein-locations reflected in the UniProtKB database, provide an accurate and useful resource to researchers and decision makers. Our goal is to augment the manual efforts currently used to curate knowledge bases with automated approaches that leverage the increased availability of full-text scientific articles. This paper describes experiments that use distant supervised learning to identify protein subcellular localizations, which are important to understand protein function and to identify candidate drug targets. Experiments consider Swiss-Prot, the manually annotated subset of the UniProtKB protein knowledge base, and 43,000 full-text articles from the Journal of Biological Chemistry that contain just under 11.5 million sentences. The system achieves 0.81 precision and 0.49 recall at sentence level and an accuracy of 57% on held-out instances in a test set. Moreover, the approach identifies 8210 instances that are not in the UniProtKB knowledge base. Manual inspection of the 50 most likely relations showed that 41 (82%) were valid. These results have immediate benefit to researchers interested in protein function, and suggest that distant supervision should be explored to complement other manual data curation efforts. Copyright © 2015 Elsevier Inc. All rights reserved.
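
    The distant-supervision step, weakly labelling sentences as positive whenever they mention a protein-location pair already recorded in the knowledge base, can be sketched as follows; the miniature knowledge base and sentences are invented for illustration, not Swiss-Prot content:

        # Known (protein, subcellular location) pairs, standing in for Swiss-Prot entries.
        knowledge_base = {("TP53", "nucleus"), ("COX1", "mitochondrion")}

        sentences = [
            "TP53 accumulates in the nucleus after DNA damage.",
            "We purified COX1 from the plasma membrane fraction.",
        ]

        def distant_labels(sentences, kb):
            """Weakly label sentences: positive if a known protein-location pair co-occurs."""
            examples = []
            for s in sentences:
                for protein, location in kb:
                    if protein in s and location in s:
                        examples.append((s, protein, location, 1))
                        break
                else:
                    examples.append((s, None, None, 0))
            return examples

        for example in distant_labels(sentences, knowledge_base):
            print(example)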

  14. Identification, definition and quantification of goods and services provided by marine biodiversity: implications for the ecosystem approach.

    Science.gov (United States)

    Beaumont, N J; Austen, M C; Atkins, J P; Burdon, D; Degraer, S; Dentinho, T P; Derous, S; Holm, P; Horton, T; van Ierland, E; Marboe, A H; Starkey, D J; Townsend, M; Zarzycki, T

    2007-03-01

    This paper identifies and defines ecosystem goods and services provided by marine biodiversity. Case studies have been used to provide an insight into the practical issues associated with the assessment of marine ecosystem goods and services at specific locations. The aim of this research was to validate the definitions of goods and services, and to identify knowledge gaps and likely difficulties of quantifying the goods and services. A validated theoretical framework for the assessment of goods and services is detailed, and examples of the goods and services at a variety of case study areas are documented. These results will enable future assessments of marine ecosystem goods and services. It is concluded that the utilisation of this goods and services approach has the capacity to play a fundamental role in the Ecosystem Approach, by enabling the pressures and demands of society, the economy and the environment to be integrated into environmental management.

  15. Identifying Psoriasis and Psoriatic Arthritis Patients in Retrospective Databases When Diagnosis Codes Are Not Available: A Validation Study Comparing Medication/Prescriber Visit-Based Algorithms with Diagnosis Codes.

    Science.gov (United States)

    Dobson-Belaire, Wendy; Goodfield, Jason; Borrelli, Richard; Liu, Fei Fei; Khan, Zeba M

    2018-01-01

    Using diagnosis code-based algorithms is the primary method of identifying patient cohorts for retrospective studies; nevertheless, many databases lack reliable diagnosis code information. To develop precise algorithms based on medication claims/prescriber visits (MCs/PVs) to identify psoriasis (PsO) patients and psoriatic patients with arthritic conditions (PsO-AC), a proxy for psoriatic arthritis, in Canadian databases lacking diagnosis codes. Algorithms were developed using medications with narrow indication profiles in combination with prescriber specialty to define PsO and PsO-AC. For a 3-year study period from July 1, 2009, algorithms were validated using the PharMetrics Plus database, which contains both adjudicated medication claims and diagnosis codes. Positive predictive value (PPV), negative predictive value (NPV), sensitivity, and specificity of the developed algorithms were assessed using diagnosis code as the reference standard. Chosen algorithms were then applied to Canadian drug databases to profile the algorithm-identified PsO and PsO-AC cohorts. In the selected database, 183,328 patients were identified for validation. The highest PPVs for PsO (85%) and PsO-AC (65%) occurred when a predictive algorithm of two or more MCs/PVs was compared with the reference standard of one or more diagnosis codes. NPV and specificity were high (99%-100%), whereas sensitivity was low (≤30%). Reducing the number of MCs/PVs or increasing diagnosis claims decreased the algorithms' PPVs. We have developed an MC/PV-based algorithm to identify PsO patients with a high degree of accuracy, but accuracy for PsO-AC requires further investigation. Such methods allow researchers to conduct retrospective studies in databases in which diagnosis codes are absent. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
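
    The reported validation metrics (PPV, NPV, sensitivity and specificity of the medication/prescriber-visit algorithm against diagnosis codes as the reference standard) follow directly from a 2x2 confusion matrix. A generic sketch with invented patient flags:

        import numpy as np

        def diagnostic_metrics(algorithm_positive, reference_positive):
            """PPV, NPV, sensitivity and specificity from two boolean arrays."""
            a = np.asarray(algorithm_positive, dtype=bool)
            r = np.asarray(reference_positive, dtype=bool)
            tp, fp = np.sum(a & r), np.sum(a & ~r)
            fn, tn = np.sum(~a & r), np.sum(~a & ~r)
            return {
                "PPV": tp / (tp + fp),
                "NPV": tn / (tn + fn),
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
            }

        # Hypothetical flags: algorithm = two or more qualifying claims/visits,
        # reference = at least one PsO diagnosis code.
        algo = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
        ref  = [1, 1, 0, 0, 0, 1, 0, 0, 1, 0]
        print(diagnostic_metrics(algo, ref))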

  16. Development and validation of a tool for identifying women with low bone mineral density and low-impact fractures: the São Paulo Osteoporosis Risk Index (SAPORI).

    Science.gov (United States)

    Pinheiro, M M; Reis Neto, E T; Machado, F S; Omura, F; Szejnfeld, J; Szejnfeld, V L

    2012-04-01

    The performance of the São Paulo Osteoporosis Risk Index (SAPORI) was tested in 1,915 women from the original cohort, São Paulo Osteoporosis Study (SAPOS) (N = 4332). This new tool was able to identify women with low bone density (spine and hip) and low-impact fracture, with an area under the receiving operator curve (ROC) of 0.831, 0.724, and 0.689, respectively. A number of studies have demonstrated the clinical relevance of risk factors for identifying individuals at risk of fracture (Fx) and osteoporosis (OP). The SAPOS is an epidemiological study for the assessment of risk factors for Fx and low bone density in women from the community of the metropolitan area of São Paulo, Brazil. The aim of the present study was to develop and validate a tool for identifying women at higher risk for OP and low-impact Fx. A total of 4,332 pre-, peri-, and postmenopausal women were analyzed through a questionnaire addressing risk factors for OP and Fx. All of them performed bone densitometry at the lumbar spine and proximal femur (DPX NT, GE-Lunar). Following the identification of the main risk factors for OP and Fx through multivariate and logistic regression, respectively, the SAPORI was designed and subsequently validated on a second cohort of 1,915 women from the metropolitan community of São Paulo. The performance of this tool was assessed through ROC analysis. The main and significant risk factors associated with low bone density and low-impact Fx were low body weight, advanced age, Caucasian ethnicity, family history of hip Fx, current smoking, and chronic use of glucocorticosteroids. Hormonal replacement therapy and regular physical activity in the previous year played a protective role (p < 0.05). After the statistical adjustments, the SAPORI was able to identify women with low bone density (T-score ≤ -2 standard deviations) in the femur, with 91.4% sensitivity, 52% specificity, and an area under the ROC of 0.831 (p < 0.001). At the lumbar spine
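
    A risk index of this kind is typically built by fitting a logistic model on the candidate risk factors and judging discrimination by the area under the ROC curve. A rough sketch on simulated data (the covariates, coefficients and cohort below are invented, not SAPOS data):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(42)
        n = 500
        # Simulated risk factors: age, body weight, current smoking, prior HRT use.
        X = np.column_stack([
            rng.normal(62, 9, n), rng.normal(68, 12, n),
            rng.integers(0, 2, n), rng.integers(0, 2, n),
        ])
        # Simulated outcome loosely driven by age and low weight (illustration only).
        logit = 0.06 * (X[:, 0] - 62) - 0.05 * (X[:, 1] - 68) - 1.2
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        model = LogisticRegression(max_iter=1000).fit(X, y)
        scores = model.predict_proba(X)[:, 1]
        print("AUC:", round(roc_auc_score(y, scores), 3))

        # Choose the threshold giving ~91% sensitivity, then read off specificity.
        fpr, tpr, thresholds = roc_curve(y, scores)
        idx = np.argmin(np.abs(tpr - 0.914))
        print("sensitivity:", round(tpr[idx], 3), "specificity:", round(1 - fpr[idx], 3))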

  17. A comprehensive approach to identifying repurposed drugs to treat SCN8A epilepsy.

    Science.gov (United States)

    Atkin, Talia A; Maher, Chani M; Gerlach, Aaron C; Gay, Bryant C; Antonio, Brett M; Santos, Sonia C; Padilla, Karen M; Rader, JulieAnn; Krafte, Douglas S; Fox, Matthew A; Stewart, Gregory R; Petrovski, Slavé; Devinsky, Orrin; Might, Matthew; Petrou, Steven; Goldstein, David B

    2018-04-01

    Many previous studies of drug repurposing have relied on literature review followed by evaluation of a limited number of candidate compounds. Here, we demonstrate the feasibility of a more comprehensive approach using high-throughput screening to identify inhibitors of a gain-of-function mutation in the SCN8A gene associated with severe pediatric epilepsy. We developed cellular models expressing wild-type or R1872Q-mutant Nav1.6 sodium channels encoded by SCN8A. Voltage clamp experiments in HEK-293 cells expressing the SCN8A R1872Q mutation demonstrated a leftward shift in sodium channel activation as well as delayed inactivation; both changes are consistent with a gain-of-function mutation. We next developed a fluorescence-based sodium flux assay and used it to assess an extensive library of approved drugs, including a panel of antiepileptic drugs, for inhibitory activity in the mutated cell line. Lead candidates were evaluated in follow-on studies to generate concentration-response curves for inhibiting sodium influx. Select compounds of clinical interest were evaluated by electrophysiology to further characterize drug effects on wild-type and mutant sodium channel functions. The screen identified 90 drugs that significantly inhibited sodium influx in the R1872Q cell line. Four drugs of potential clinical interest (amitriptyline, carvedilol, nilvadipine, and carbamazepine) were further investigated and demonstrated concentration-dependent inhibition of sodium channel currents. A comprehensive drug repurposing screen identified potential new candidates for the treatment of epilepsy caused by the R1872Q mutation in the SCN8A gene. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.

  18. A Machine Learning Approach to Identifying Placebo Responders in Late-Life Depression Trials.

    Science.gov (United States)

    Zilcha-Mano, Sigal; Roose, Steven P; Brown, Patrick J; Rutherford, Bret R

    2018-01-11

    Despite efforts to identify characteristics associated with medication-placebo differences in antidepressant trials, few consistent findings have emerged to guide participant selection in drug development settings and differential therapeutics in clinical practice. Limitations in the methodologies used, particularly searching for a single moderator while treating all other variables as noise, may partially explain the failure to generate consistent results. The present study tested whether interactions between pretreatment patient characteristics, rather than a single-variable solution, may better predict who is most likely to benefit from placebo versus medication. Data were analyzed from 174 patients aged 75 years and older with unipolar depression who were randomly assigned to citalopram or placebo. Model-based recursive partitioning analysis was conducted to identify the most robust significant moderators of placebo versus citalopram response. The greatest signal detection between medication and placebo in favor of medication was among patients with fewer years of education (≤12) who suffered from a longer duration of depression since their first episode (>3.47 years) (B = 2.53, t(32) = 3.01, p = 0.004). Compared with medication, placebo had the greatest response for those who were more educated (>12 years), to the point where placebo almost outperformed medication (B = -0.57, t(96) = -1.90, p = 0.06). Machine learning approaches capable of evaluating the contributions of multiple predictor variables may be a promising methodology for identifying placebo versus medication responders. Duration of depression and education should be considered in the efforts to modulate placebo magnitude in drug development settings and in clinical practice. Copyright © 2018 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
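
    Model-based recursive partitioning is usually run with dedicated R packages (e.g. partykit); as a rough, hedged analogue, one can grow a shallow tree on baseline covariates and estimate a drug-placebo contrast within each leaf. A toy sketch on simulated data that merely illustrates the idea of interaction-defined subgroups, not the authors' analysis:

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(0)
        n = 174
        education = rng.integers(8, 20, n)      # years of education
        duration = rng.exponential(4.0, n)      # years since first depressive episode
        drug = rng.integers(0, 2, n)            # 1 = citalopram, 0 = placebo
        # Simulated outcome: drug benefit concentrated in low-education, long-duration patients.
        improvement = (drug * (2.0 * ((education <= 12) & (duration > 3.5)) + 0.3)
                       + rng.normal(0, 1.5, n))

        X = np.column_stack([education, duration])
        # Grow a shallow tree on the covariates, then estimate a per-leaf treatment contrast.
        tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=20).fit(X, improvement)
        leaves = tree.apply(X)
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            effect = improvement[mask & (drug == 1)].mean() - improvement[mask & (drug == 0)].mean()
            print(f"leaf {leaf}: n={mask.sum():3d}, drug-placebo difference={effect:+.2f}")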

  19. Identifying Similarities in Cognitive Subtest Functional Requirements: An Empirical Approach

    Science.gov (United States)

    Frisby, Craig L.; Parkin, Jason R.

    2007-01-01

    In the cognitive test interpretation literature, a Rational/Intuitive, Indirect Empirical, or Combined approach is typically used to construct conceptual taxonomies of the functional (behavioral) similarities between subtests. To address shortcomings of these approaches, the functional requirements for 49 subtests from six individually…

  20. Fingerprints of zones in boreholes. An approach to identify the characteristics of structures

    Energy Technology Data Exchange (ETDEWEB)

    Straeng, Thomas; Waenstedt, Stefan; Tiren, Sven (GEOSIGMA AB (Sweden))

    2010-11-15

    The classification of geophysical borehole data in order to identify and characterize structures that intersect the borehole is an important part of 3D modelling of the structural pattern of the bedrock. The objective of this study is to test a statistical approach, cluster analysis, on site data in order to see whether it is possible to classify a complex data set of geological and geophysical borehole data so as to identify borehole intersects with increased brittle deformation, i.e. brittle deformation zones. The base data used in the study have been provided and delivered by SKB and consist of borehole logging data from the cored borehole KFM03A. The statistical method chosen for this study, cluster analysis using the K-means method, groups data into a pre-defined number of clusters with the goal of minimizing the variance within each cluster and maximizing the variance between clusters. The idea is that the data can and should be grouped into two categories, i.e. two clusters, corresponding to homogeneous bedrock (matrix) in one cluster and open fractures in the other. The analysis also includes a repetition of the cluster analysis with a stepwise refined data set to see whether strongly accentuated features could be identified. The results show that K-means cluster analysis yields clusters that could represent the spatial distribution of the bedrock matrix and the location of fractures, respectively, down the borehole. The results were compared with fracture frequency data (from core mapping) and also with the geological Single Hole Interpretation of KFM03A performed by SKB. The fracture zones identified in the Single Hole Interpretation process are all indicated in the cluster analysis results. The cluster analysis revealed eight additional possible zones. A majority of these are smaller than 5 metres (section width in the borehole) but they are still pronounced in the analysis. Based on the geophysical data, these sections should be taken into
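
    The clustering step described above, grouping depth samples of the geophysical logs into a pre-defined number of clusters so that within-cluster variance is minimized, maps directly onto standard K-means. The log values, sampling interval and the synthetic "fractured" intervals below are placeholders, not the KFM03A data:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Placeholder borehole logs sampled every 0.1 m: each row is a depth point and
        # each column one geophysical log (e.g. resistivity, natural gamma, sonic).
        rng = np.random.default_rng(5)
        logs = rng.normal(0.0, 0.3, size=(2000, 3))   # intact bedrock (matrix) response
        logs[600:640] += [-3.0, 2.5, -2.0]            # synthetic fractured interval
        logs[1500:1510] += [-3.0, 2.5, -2.0]          # a second, thinner interval

        X = StandardScaler().fit_transform(logs)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # The minority cluster is interpreted as candidate fracture sections.
        fracture_cluster = np.argmin(np.bincount(labels))
        candidate_depths = np.where(labels == fracture_cluster)[0] * 0.1
        print(candidate_depths.min(), candidate_depths.max(), len(candidate_depths))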

  1. [Design and validation of the CSR-Hospital-SP scale to measure corporate social responsibility].

    Science.gov (United States)

    Mira, José Joaquín; Lorenzo, Susana; Navarro, Isabel; Pérez-Jover, Virtudes; Vitaller, Julián

    2013-01-01

    To design and validate a scale (CSR-Hospital-SP) to determine health professionals' views on the approach of management to corporate social responsibility (CSR) in their hospital. The literature was reviewed to identify the main CSR scales and select the dimensions to be evaluated. The initial version of the scale consisted of 25 items. A convenience sample of a minimum of 224 health professionals working in five public hospitals in five autonomous regions was invited to respond. Floor and ceiling effects, internal consistency, reliability, and construct validity were analyzed. A total of 233 health professionals responded. The CSR-Hospital-SP scale had 20 items grouped into four factors. The item-total correlation was higher than 0.30; all factor loadings were greater than 0.50; 59.57% of the variance was explained; Cronbach's alpha was 0.90; Spearman-Brown's coefficient was 0.82. The CSR-Hospital-SP scale is a tool designed for hospitals that implement accountability mechanisms and promote socially responsible management approaches. Copyright © 2012 SESPAS. Published by Elsevier Espana. All rights reserved.
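
    The internal-consistency statistic reported here, Cronbach's alpha, has a simple closed form; a sketch on invented Likert responses (not the study's data):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1).sum()
            total_variance = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_variances / total_variance)

        # Hypothetical 1-5 Likert responses of 10 professionals to 4 scale items.
        responses = np.array([
            [4, 4, 5, 4], [3, 3, 3, 2], [5, 5, 4, 5], [2, 2, 3, 2], [4, 5, 4, 4],
            [3, 2, 3, 3], [5, 4, 5, 5], [1, 2, 2, 1], [4, 4, 4, 3], [3, 3, 2, 3],
        ])
        print(round(cronbach_alpha(responses), 2))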

  2. Identifying Green Infrastructure from Social Media and Crowdsourcing- An Image Based Machine-Learning Approach.

    Science.gov (United States)

    Rai, A.; Minsker, B. S.

    2016-12-01

    In this work we introduce a novel dataset, GRID (GReen Infrastructure Detection Dataset), and a framework for identifying urban green storm water infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties and cities typically do not know where GI is located, making study of its impacts or siting of new GI difficult. We use object recognition methods (template matching, a sliding window approach, and the Random Hough Forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding window method outperformed the other methods and achieved an average F-measure, a combined metric of precision and recall, of 0.78.
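
    The sliding-window screening approach mentioned above can be sketched independently of any particular classifier: scan the image with a fixed-size window and keep the patches a scoring function accepts. The window size, stride, threshold and the toy "greenness" scorer are assumptions for illustration, not the authors' detector:

        import numpy as np

        def sliding_window_detect(image, classifier, window=64, stride=32, threshold=0.5):
            """Scan an image with a fixed-size window and return boxes the classifier accepts.

            `classifier` is any callable mapping a (window, window, channels) patch to a
            score that the patch contains the target green-infrastructure type.
            """
            detections = []
            h, w = image.shape[:2]
            for top in range(0, h - window + 1, stride):
                for left in range(0, w - window + 1, stride):
                    patch = image[top:top + window, left:left + window]
                    score = classifier(patch)
                    if score >= threshold:
                        detections.append((top, left, window, score))
            return detections

        def greenness(patch):
            # Dummy scorer: mean of the green channel, standing in for a trained model.
            return patch[..., 1].mean()

        image = np.random.rand(256, 256, 3)   # placeholder aerial image
        print(len(sliding_window_detect(image, greenness, threshold=0.5)))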

  3. Validation of candidate genes putatively associated with resistance to SCMV and MDMV in maize (Zea mays L.) by expression profiling

    DEFF Research Database (Denmark)

    Uzarowska, Anna; Dionisio, Giuseppe; Sarholz, Barbara

    2009-01-01

    Background: The potyviruses sugarcane mosaic virus (SCMV) and maize dwarf mosaic virus (MDMV) are major pathogens of maize worldwide. Two loci, Scmv1 and Scmv2, have earlier been shown to confer complete resistance to SCMV. Custom-made microarrays containing previously identified SCMV resistance… the effectiveness and reliability of the combination of different expression profiling approaches for the identification and validation of candidate genes. Genes identified in this study represent possible future targets for manipulation of SCMV resistance in maize.

  4. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples identified (traditional, transitional, and egalitarian), but nothing is known about wedding planning for remarrying individuals. The author conducted semistructured interviews with 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. The analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding using MAXQDA 2007's TextPortraits tool.

  5. Structural Validation of the Holistic Wellness Assessment

    Science.gov (United States)

    Brown, Charlene; Applegate, E. Brooks; Yildiz, Mustafa

    2015-01-01

    The Holistic Wellness Assessment (HWA) is a relatively new assessment instrument based on an emergent transdisciplinary model of wellness. This study validated the factor structure identified via exploratory factor analysis (EFA), assessed test-retest reliability, and investigated concurrent validity of the HWA in three separate samples. The…

  6. Box-ticking and Olympic high jumping - Physicians' perceptions and acceptance of national physician validation systems.

    Science.gov (United States)

    Sehlbach, Carolin; Govaerts, Marjan J B; Mitchell, Sharon; Rohde, Gernot G U; Smeenk, Frank W J M; Driessen, Erik W

    2018-05-24

    National physician validation systems aim to ensure lifelong learning through periodic appraisals of physicians' competence. Their effectiveness is determined by physicians' acceptance of and commitment to the system. This study, therefore, sought to explore physicians' perceptions and self-reported acceptance of validation across three different physician validation systems in Europe. Using a constructivist grounded-theory approach, we conducted semi-structured interviews with 32 respiratory specialists from three countries with markedly different validation systems: Germany, which has a mandatory, credit-based system oriented to continuing professional development; Denmark, with mandatory annual dialogs and ensuing, non-compulsory activities; and the UK, with a mandatory, portfolio-based revalidation system. We analyzed interview data with a view to identifying factors influencing physicians' perceptions and acceptance. Factors that influenced acceptance were the assessment's authenticity and alignment of its requirements with clinical practice, physicians' beliefs about learning, perceived autonomy, and organizational support. Users' acceptance levels determine any system's effectiveness. To support lifelong learning effectively, national physician validation systems must be carefully designed and integrated into daily practice. Involving physicians in their design may render systems more authentic and improve alignment between individual ambitions and the systems' goals, thereby promoting acceptance.

  7. High throughput sequencing and proteomics to identify immunogenic proteins of a new pathogen: the dirty genome approach.

    Directory of Open Access Journals (Sweden)

    Gilbert Greub

    Full Text Available BACKGROUND: With the availability of new generation sequencing technologies, bacterial genome projects have undergone a major boost. Still, chromosome completion needs a costly and time-consuming gap closure, especially when the genome contains highly repetitive elements. However, incomplete genome data may be sufficiently informative to derive the pursued information. For emerging pathogens, i.e. newly identified pathogens, withholding genome data during the gap closure stage is clearly medically counterproductive. METHODS/PRINCIPAL FINDINGS: We thus investigated the feasibility of a dirty genome approach, i.e. the release of unfinished genome sequences to develop serological diagnostic tools. We showed that almost the whole genome sequence of the emerging pathogen Parachlamydia acanthamoebae was retrieved even with relatively short reads from Genome Sequencer 20 and Solexa. The bacterial proteome was analyzed to select immunogenic proteins, which were then expressed and used to develop the first steps of an ELISA. CONCLUSIONS/SIGNIFICANCE: This work constitutes the proof of principle for a dirty genome approach, i.e. the use of unfinished genome sequences of pathogenic bacteria, coupled with proteomics, to rapidly identify new immunogenic proteins that are useful for developing specific diagnostic tests such as ELISA, immunohistochemistry and direct antigen detection. Although applied here to an emerging pathogen, this combined dirty genome sequencing/proteomic approach may be used for any pathogen for which better diagnostics are needed. These genome sequences may also be very useful to develop DNA-based diagnostic tests. All these diagnostic tools will allow further evaluations of the pathogenic potential of this obligate intracellular bacterium.

  8. Cross-validated detection of crack initiation in aerospace materials

    Science.gov (United States)

    Vanniamparambil, Prashanth A.; Cuadra, Jefferson; Guclu, Utku; Bartoli, Ivan; Kontsos, Antonios

    2014-03-01

    A cross-validated nondestructive evaluation approach was employed to detect in situ the onset of damage in an aluminum alloy compact tension specimen. The approach consisted of the coordinated use primarily of acoustic emission, combined with infrared thermography and digital image correlation methods. Tensile loads were applied while the specimen was continuously monitored with the nondestructive methods. Crack initiation was witnessed visually and was confirmed by the characteristic load drop accompanying the ductile fracture process. The full-field deformation map provided by the nondestructive approach validated the formation of a pronounced plasticity zone near the crack tip. At the time of crack initiation, a burst in the temperature field ahead of the crack tip as well as a sudden increase in the acoustic recordings were observed. Although such experiments have been attempted and reported before in the literature, the presented approach provides for the first time a cross-validated nondestructive dataset that can be used for quantitative analyses of the crack initiation information content. It further allows future development of automated procedures for real-time identification of damage precursors, including the rarely explored crack incubation stage in fatigue conditions.

  9. Sterilization validation for medical devices at IRASM microbiological laboratory—Practical approaches

    International Nuclear Information System (INIS)

    Trandafir, Laura; Alexandru, Mioara; Constantin, Mihai; Ioniţă, Anca; Zorilă, Florina; Moise, Valentin

    2012-01-01

    EN ISO 11137 establishes requirements for setting or substantiating the dose needed to achieve the desired sterility assurance level. Validation studies can be designed specifically for different types of products, and each product needs distinct protocols for bioburden determination and sterility testing. The Microbiological Laboratory of the Irradiation Processing Center (IRASM) deals with different types of products, mainly using the VDmax 25 method. In terms of microbiological evaluation, the most challenging product was cotton gauze. A special situation for establishing the sterilization validation method arises when cotton is packed in large quantities. The VDmax 25 method cannot be applied to items with an average bioburden of more than 1000 CFU/pack, irrespective of the weight of the package. This is a limitation of the method and implies increased costs for the manufacturer when other methods must be chosen. For microbiological tests, culture conditions should be selected for both bioburden determination and sterility testing. Details about the selection criteria are given. - Highlights: ► The paper presents aspects and results of the sterilization validation process. ► Critical aspects that can lead to failure of the process are emphasized. ► Method limitations are discussed.
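
    As a minimal illustration of the applicability rule mentioned above (the VDmax 25 method cannot be used when the average bioburden exceeds 1000 CFU per item), the following sketch checks that criterion for a set of hypothetical bioburden counts; an actual substantiation follows the full procedure of the EN ISO 11137 series.

      # Sketch: eligibility check for the VDmax 25 substantiation method based
      # on average bioburden. The counts below are hypothetical; a real
      # assessment follows the EN ISO 11137 series in full.
      def average_bioburden(cfu_counts):
          return sum(cfu_counts) / len(cfu_counts)

      def vdmax25_applicable(cfu_counts, limit=1000.0):
          return average_bioburden(cfu_counts) <= limit

      samples = [420.0, 610.0, 1350.0, 880.0]   # CFU per item (hypothetical)
      print(average_bioburden(samples), vdmax25_applicable(samples))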

  10. DNA enrichment approaches to identify unauthorized genetically modified organisms (GMOs).

    Science.gov (United States)

    Arulandhu, Alfred J; van Dijk, Jeroen P; Dobnik, David; Holst-Jensen, Arne; Shi, Jianxin; Zel, Jana; Kok, Esther J

    2016-07-01

    With the increased global production of different genetically modified (GM) plant varieties, chances increase that unauthorized GM organisms (UGMOs) may enter the food chain. At the same time, the detection of UGMOs is a challenging task because of the limited sequence information that will generally be available. PCR-based methods are available to detect and quantify known UGMOs in specific cases. If this approach is not feasible, DNA enrichment of the unknown adjacent sequences of known GMO elements is one way to detect the presence of UGMOs in a food or feed product. These enrichment approaches are also known as chromosome walking or gene walking (GW). In recent years, enrichment approaches have been coupled with next generation sequencing (NGS) analysis and implemented in, amongst others, the medical and microbiological fields. The present review will provide an overview of these approaches and an evaluation of their applicability in the identification of UGMOs in complex food or feed samples.

  11. Earth Observation for Citizen Science Validation, or Citizen Science for Earth Observation Validation? The Role of Quality Assurance of Volunteered Observations

    Directory of Open Access Journals (Sweden)

    Didier G. Leibovici

    2017-10-01

    Full Text Available Environmental policy involving citizen science (CS) is of growing interest. In support of this open data stream of information, validation or quality assessment of CS geo-located data against their intended usage for evidence-based policy making needs a flexible and easily adaptable data curation process that ensures transparency. Addressing these needs, this paper describes an approach for automatic quality assurance as proposed by the Citizen OBservatory WEB (COBWEB) FP7 project. This approach is based upon a workflow composition that combines different quality controls, each belonging to one of seven categories or “pillars”. Each pillar focuses on a specific dimension in the types of reasoning algorithms for CS data qualification. These pillars attribute values to a range of quality elements belonging to three complementary quality models. Additional data from various sources, such as Earth Observation (EO) data, are often included as part of the inputs of quality controls within the pillars. However, qualified CS data can also contribute to the validation of EO data; the question of validation can therefore be considered as “two sides of the same coin”. Based on an invasive species CS study concerning Fallopia japonica (Japanese knotweed), the paper discusses the flexibility and usefulness of qualifying CS data, either when using an EO data product for validation within the quality assurance process, or when validating an EO data product that describes the risk of occurrence of the plant. Both validation paths are found to be improved by quality assurance of the CS data. Addressing the reliability of CS open data, the paper also describes issues and limitations of the role of quality assurance for validation that arise from the quality of secondary data used within the automatic workflow (e.g., error propagation), paving the route to improvements in the approach.
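
    To make the idea of workflow composition concrete, the sketch below chains a few quality controls, each standing in for a "pillar", and lets them attribute values to quality elements on a CS record. The pillar names, checks, and thresholds are invented for illustration and are not the COBWEB implementation.

      # Sketch: composing quality controls into a QA workflow that annotates a
      # citizen-science record with quality elements. Names and thresholds are
      # invented for illustration; this is not the COBWEB implementation.
      from typing import Callable, Dict, List

      QualityControl = Callable[[dict], Dict[str, float]]

      def location_check(record: dict) -> Dict[str, float]:
          ok = record.get("gps_accuracy_m", float("inf")) <= 30
          return {"positional_quality": 1.0 if ok else 0.3}

      def evidence_check(record: dict) -> Dict[str, float]:
          ok = record.get("photo_attached", False)
          return {"thematic_quality": 0.9 if ok else 0.5}

      def run_workflow(record: dict, controls: List[QualityControl]) -> dict:
          quality: Dict[str, float] = {}
          for control in controls:   # each control contributes quality elements
              quality.update(control(record))
          return {**record, "quality": quality}

      obs = {"species": "Fallopia japonica", "gps_accuracy_m": 12,
             "photo_attached": True}
      print(run_workflow(obs, [location_check, evidence_check])["quality"])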

  12. PEANO, a toolbox for real-time process signal validation and estimation

    International Nuclear Information System (INIS)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real-time process signal validation and condition monitoring, has been developed. This system analyses signals such as the readings of process monitoring sensors, computes their expected values, and raises an alert if the real values deviate from the expected ones by more than the limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques: Artificial Neural Networks and Fuzzy Logic models can be combined to exploit the learning and generalisation capability of the first technique together with the approximate reasoning embedded in the second. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region where the validation process has to be performed. The possibilistic approach (rather than a probabilistic one) allows a "don't know" classification that results in fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster into which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability is increased compared to a single network working over the entire operating region, and the ability to identify abnormal conditions, in which the system cannot operate with satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)
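
    The clustering-plus-specialised-network idea can be outlined with generic stand-ins: cluster the operating region, train one estimator per cluster, and route each new state to the model for its region, flagging states far from all clusters as "don't know". The sketch below uses KMeans and linear regressors purely as placeholders for PEANO's possibilistic clustering and neural networks.

      # Sketch: route each plant state to the estimator for its operating
      # region, flag unknown regions, and compare measured vs. estimated
      # signals. KMeans and linear regression are stand-ins, not PEANO itself.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      X = rng.normal(size=(500, 4))                # sensor readings (training)
      y = X @ np.array([0.5, -0.2, 0.1, 0.3])      # signal to be validated

      clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
      models = [
          LinearRegression().fit(X[clusters.labels_ == c], y[clusters.labels_ == c])
          for c in range(3)
      ]

      def validate(x: np.ndarray, measured: float, limit: float = 0.2) -> str:
          distances = np.linalg.norm(clusters.cluster_centers_ - x, axis=1)
          c = int(distances.argmin())
          if distances[c] > 3.0:                   # far from every cluster
              return "don't know (unforeseen condition)"
          expected = float(models[c].predict(x.reshape(1, -1))[0])
          return "ok" if abs(measured - expected) <= limit else "possible faulty sensor"

      print(validate(X[0], measured=float(y[0])))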

  14. The validation index: a new metric for validation of segmentation algorithms using two or more expert outlines with application to radiotherapy planning.

    Science.gov (United States)

    Juneja, Prabhjot; Evans, Philp M; Harris, Emma J

    2013-08-01

    Validation is required to ensure automated segmentation algorithms are suitable for radiotherapy target definition. In the absence of true segmentation, algorithmic segmentation is validated against expert outlining of the region of interest. Multiple experts are used to overcome inter-expert variability. Several approaches have been studied in the literature, but the most appropriate way to combine the information from multiple expert outlines into a single metric for validation is unclear. None considers a metric that can be tailored to case-specific requirements in radiotherapy planning. The validation index (VI), a new validation metric that uses the experts' level of agreement, was developed. A control parameter was introduced for the validation of segmentations required for different radiotherapy scenarios: for targets close to organs-at-risk and for difficult-to-discern targets, where large variation between experts is expected. VI was evaluated using two simulated idealized cases and data from two clinical studies. VI was compared with the commonly used pair-wise Dice similarity coefficient (DSC) and found to be more sensitive than the pair-wise DSC to changes in agreement between experts. VI was shown to be adaptable to specific radiotherapy planning scenarios.
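
    For reference, the pair-wise DSC baseline mentioned above measures the overlap between the algorithmic segmentation and each expert outline, typically averaged over experts; the VI metric itself is not reproduced here because its full definition is not given in this record. A minimal sketch on synthetic binary masks:

      # Sketch: pair-wise Dice similarity coefficient between an automated
      # segmentation and several expert outlines (binary masks). The VI metric
      # itself is not reproduced here; masks are synthetic.
      import numpy as np

      def dice(a: np.ndarray, b: np.ndarray) -> float:
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      algorithm = np.zeros((64, 64), dtype=bool)
      algorithm[16:48, 16:48] = True
      experts = []
      for shift in (0, 2, 4):                # simulated inter-expert variability
          mask = np.zeros_like(algorithm)
          mask[16 + shift:48 + shift, 16:48] = True
          experts.append(mask)

      pairwise = [dice(algorithm, e) for e in experts]
      print([round(d, 3) for d in pairwise], round(float(np.mean(pairwise)), 3))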

  15. Expert validation of fit-for-purpose guidelines for designing programmes of assessment

    Directory of Open Access Journals (Sweden)

    Dijkstra Joost

    2012-04-01

    Full Text Available Background An assessment programme, a purposeful mix of assessment activities, is necessary to achieve a complete picture of assessee competence. High quality assessment programmes exist; however, design requirements for such programmes are still unclear. We developed design guidelines based on an earlier developed framework that identified the areas to be covered. A fitness-for-purpose approach defining quality was adopted to develop and validate the guidelines. Methods First, ideas were generated in a brainstorm, followed by structured interviews with 9 international assessment experts. Then, guidelines were fine-tuned through analysis of the interviews. Finally, validation was based on expert consensus via member checking. Results In total 72 guidelines were developed, and in this paper the most salient guidelines are discussed. The guidelines are related and grouped per layer of the framework. Some guidelines were so generic that they are applicable in any design consideration. These are: the principle of proportionality, the requirement that rationales underpin each decision, and the requirement of expertise. Logically, many guidelines focus on practical aspects of assessment. Some guidelines were found to be clear and concrete; others were less straightforward and were phrased more as issues for contemplation. Conclusions The set of guidelines is comprehensive and not bound to a specific context or educational approach. Following the fitness-for-purpose principle, the guidelines are eclectic, requiring expert judgement to use them appropriately in different contexts. Further validation studies to test practicality are required.

  16. A Complementary Bioinformatics Approach to Identify Potential Plant Cell Wall Glycosyltransferase-Encoding Genes

    DEFF Research Database (Denmark)

    Egelund, Jack; Skjøt, Michael; Geshi, Naomi

    2004-01-01

    Plant cell wall (CW) synthesizing enzymes can be divided into the glycan (i.e. cellulose and callose) synthases, which are multimembrane spanning proteins located at the plasma membrane, and the glycosyltransferases (GTs), which are Golgi localized single membrane spanning proteins, believed....... Although much is known with regard to composition and fine structures of the plant CW, only a handful of CW biosynthetic GT genes-all classified in the CAZy system-have been characterized. In an effort to identify CW GTs that have not yet been classified in the CAZy database, a simple bioinformatics...... approach was adopted. First, the entire Arabidopsis proteome was run through the Transmembrane Hidden Markov Model 2.0 server and proteins containing one or, more rarely, two transmembrane domains within the N-terminal 150 amino acids were collected. Second, these sequences were submitted...
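
    The first filtering step described above (keeping proteins with one, or more rarely two, predicted transmembrane domains located within the N-terminal 150 amino acids) can be expressed as a simple filter over parsed predictions. In this sketch the record format and accession names are hypothetical, not actual TMHMM 2.0 output.

      # Sketch: keep proteins whose predicted transmembrane helices match the
      # screening criterion above (1, rarely 2, helices starting within the
      # N-terminal 150 residues). Records and accessions are hypothetical.
      from typing import Dict, List, Tuple

      def is_gt_candidate(tm_helices: List[Tuple[int, int]],
                          n_window: int = 150) -> bool:
          if len(tm_helices) not in (1, 2):
              return False
          return all(start <= n_window for start, _ in tm_helices)

      predictions: Dict[str, List[Tuple[int, int]]] = {
          "At1g01234": [(12, 34)],              # single N-terminal helix: kept
          "At3g05678": [(12, 34), (300, 322)],  # helix outside window: rejected
          "At5g09876": [],                      # no helix (soluble): rejected
      }
      print([acc for acc, tms in predictions.items() if is_gt_candidate(tms)])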

  17. A study on a systematic approach of verification and validation of a computerized procedure system: ImPRO

    International Nuclear Information System (INIS)

    Qin, Wei; Seong, Poong Hyun

    2003-01-01

    Paper-based procedures (PBP) and computerized procedure systems (CPS) are studied to demonstrate why it is necessary to develop a CPS for NPP I and C systems. A computerized procedure system is essentially a software system, and all the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities can be categorized into product quality and process quality. In order to achieve product quality, the process quality of a software system should also be considered and achieved. The characteristics of CPSs are described in order to analyse the product and process of an example CPS, ImPRO. At the same time, several main product and process issues are analysed from the Verification and Validation (V and V) point of view. It is concluded that V and V activities can themselves be regarded as a software development process; this point of view is then applied to the V and V activities of ImPRO as a systematic approach to its V and V. To support and realize this approach, suitable testing technologies and testing strategies are suggested

  18. The Treatment Validity of Autism Screening Instruments

    Science.gov (United States)

    Livanis, Andrew; Mouzakitis, Angela

    2010-01-01

    Treatment validity is a frequently neglected topic of screening instruments used to identify autism spectrum disorders. Treatment validity, however, should represent an important aspect of these instruments to link the resulting data to the selection of interventions as well as make decisions about treatment length and intensity. Research…

  19. Validity and Fairness

    Science.gov (United States)

    Kane, Michael

    2010-01-01

    This paper presents the author's critique on Xiaoming Xi's article, "How do we go about investigating test fairness?," which lays out a broad framework for studying fairness as comparable validity across groups within the population of interest. Xi proposes to develop a fairness argument that would identify and evaluate potential fairness-based…

  20. Fish age validation by radiometric analysis of otoliths

    International Nuclear Information System (INIS)

    Fenton, G.E.

    1992-01-01

    Radiochemical analysis of aragonitic fish otoliths provides a useful approach to validating ages obtained by more common methods. The history of applications of radiometry using short-lived natural isotopes to clams, Nautilus, living corals and fish otoliths is briefly reviewed. The biogeochemical assumptions required for successful use of these techniques are discussed, and the appropriate mathematical treatments required for data analysis are outlined. Novel normalization techniques designed to widen the validity of this approach are proposed. Desirable lines of further research are also briefly discussed. 38 refs., 1 tab
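
    As one example of the mathematical treatment involved, otolith-core dating with a parent-daughter pair such as 226Ra/210Pb is often written, under the simplifying assumptions of a closed system and negligible initial 210Pb, as an ingrowth relation from which the age follows directly. This is a textbook form given here for illustration, not a result of the reviewed paper:

      \frac{A_{\mathrm{Pb\text{-}210}}}{A_{\mathrm{Ra\text{-}226}}} = 1 - e^{-\lambda_{\mathrm{Pb}} t}
      \qquad\Longrightarrow\qquad
      t = -\frac{1}{\lambda_{\mathrm{Pb}}}\,\ln\!\left(1 - \frac{A_{\mathrm{Pb\text{-}210}}}{A_{\mathrm{Ra\text{-}226}}}\right),
      \qquad \lambda_{\mathrm{Pb}} = \frac{\ln 2}{t_{1/2}} \approx \frac{\ln 2}{22.3\ \mathrm{yr}}

    where A denotes activity and t_{1/2} is the half-life of 210Pb.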