WorldWideScience

Sample records for validation approach identifying

  1. Multipronged approach to identify and validate a novel upstream regulator of Sncg in mouse retinal ganglion cells.

    Science.gov (United States)

    Chintalapudi, Sumana R; Morales-Tirado, Vanessa M; Williams, Robert W; Jablonski, Monica M

    2016-02-01

    Loss of retinal ganglion cells (RGCs) is one of the hallmarks of retinal neurodegenerative diseases, glaucoma being one of the most common. Mechanistic studies on RGCs are hindered by the lack of sufficient primary cells and consensus regarding their signature markers. Recently, γ-synuclein (SNCG) has been shown to be highly expressed in the somas and axons of RGCs. In various mouse models of glaucoma, downregulation of Sncg gene expression correlates with RGC loss. To investigate the role of Sncg in RGCs, we used a novel systems genetics approach to identify a gene that modulates Sncg expression, followed by confirmatory studies in both healthy and diseased retinae. We found that chromosome 1 harbors an expression quantitative trait locus that modulates Sncg expression in the mouse retina, and identified the prefoldin-2 (PFDN2) gene as the candidate upstream modulator of Sncg expression. Our immunohistochemical analyses revealed similar expression patterns in both mouse and human healthy retinae, with PFDN2 colocalizing with SNCG in RGCs and their axons. In contrast, in retinae from glaucoma subjects, SNCG levels were significantly reduced, although PFDN2 levels were maintained. Using a novel flow cytometry-based RGC isolation method, we obtained viable populations of murine RGCs. Knocking down Pfdn2 expression in primary murine RGCs significantly reduced Sncg expression, confirming that Pfdn2 regulates Sncg expression in murine RGCs. Gene Ontology analysis indicated shared mitochondrial function associated with Sncg and Pfdn2. These data solidify the relationship between Sncg and Pfdn2 in RGCs, and provide a novel mechanism for maintaining RGC health. © 2015 FEBS.

  2. Genomic approach to therapeutic target validation identifies a glucose-lowering GLP1R variant protective for coronary heart disease

    Science.gov (United States)

    Scott, Robert A.; Freitag, Daniel F.; Li, Li; Chu, Audrey Y.; Surendran, Praveen; Young, Robin; Grarup, Niels; Stancáková, Alena; Chen, Yuning; V.Varga, Tibor; Yaghootkar, Hanieh; Luan, Jian'an; Zhao, Jing Hua; Willems, Sara M.; Wessel, Jennifer; Wang, Shuai; Maruthur, Nisa; Michailidou, Kyriaki; Pirie, Ailith; van der Lee, Sven J.; Gillson, Christopher; Olama, Ali Amin Al; Amouyel, Philippe; Arriola, Larraitz; Arveiler, Dominique; Aviles-Olmos, Iciar; Balkau, Beverley; Barricarte, Aurelio; Barroso, Inês; Garcia, Sara Benlloch; Bis, Joshua C.; Blankenberg, Stefan; Boehnke, Michael; Boeing, Heiner; Boerwinkle, Eric; Borecki, Ingrid B.; Bork-Jensen, Jette; Bowden, Sarah; Caldas, Carlos; Caslake, Muriel; Cupples, L. Adrienne; Cruchaga, Carlos; Czajkowski, Jacek; den Hoed, Marcel; Dunn, Janet A.; Earl, Helena M.; Ehret, Georg B.; Ferrannini, Ele; Ferrieres, Jean; Foltynie, Thomas; Ford, Ian; Forouhi, Nita G.; Gianfagna, Francesco; Gonzalez, Carlos; Grioni, Sara; Hiller, Louise; Jansson, Jan-Håkan; Jørgensen, Marit E.; Jukema, J. Wouter; Kaaks, Rudolf; Kee, Frank; Kerrison, Nicola D.; Key, Timothy J.; Kontto, Jukka; Kote-Jarai, Zsofia; Kraja, Aldi T.; Kuulasmaa, Kari; Kuusisto, Johanna; Linneberg, Allan; Liu, Chunyu; Marenne, Gaëlle; Mohlke, Karen L.; Morris, Andrew P.; Muir, Kenneth; Müller-Nurasyid, Martina; Munroe, Patricia B.; Navarro, Carmen; Nielsen, Sune F.; Nilsson, Peter M.; Nordestgaard, Børge G.; Packard, Chris J.; Palli, Domenico; Panico, Salvatore; Peloso, Gina M.; Perola, Markus; Peters, Annette; Poole, Christopher J.; Quirós, J. Ramón; Rolandsson, Olov; Sacerdote, Carlotta; Salomaa, Veikko; Sánchez, María-José; Sattar, Naveed; Sharp, Stephen J.; Sims, Rebecca; Slimani, Nadia; Smith, Jennifer A.; Thompson, Deborah J.; Trompet, Stella; Tumino, Rosario; van der A, Daphne L.; van der Schouw, Yvonne T.; Virtamo, Jarmo; Walker, Mark; Walter, Klaudia; Abraham, Jean E.; Amundadottir, Laufey T.; Aponte, Jennifer L.; Butterworth, Adam S.; Dupuis, Josée; Easton, Douglas F.; Eeles, Rosalind A.; Erdmann, Jeanette; Franks, Paul W.; Frayling, Timothy M.; Hansen, Torben; Howson, Joanna M. M.; Jørgensen, Torben; Kooner, Jaspal; Laakso, Markku; Langenberg, Claudia; McCarthy, Mark I.; Pankow, James S.; Pedersen, Oluf; Riboli, Elio; Rotter, Jerome I.; Saleheen, Danish; Samani, Nilesh J.; Schunkert, Heribert; Vollenweider, Peter; O'Rahilly, Stephen; Deloukas, Panos; Danesh, John; Goodarzi, Mark O.; Kathiresan, Sekar; Meigs, James B.; Ehm, Margaret G.; Wareham, Nicholas J.; Waterworth, Dawn M.

    2016-01-01

    Regulatory authorities have indicated that new drugs to treat type 2 diabetes (T2D) should not be associated with an unacceptable increase in cardiovascular risk. Human genetics may be able to inform development of antidiabetic therapies by predicting cardiovascular and other health endpoints. We therefore investigated the association of variants in 6 genes that encode drug targets for obesity or T2D with a range of metabolic traits in up to 11,806 individuals by targeted exome sequencing, and follow-up in 39,979 individuals by targeted genotyping, with additional in silico follow-up in consortia. We used these data to first compare associations of variants in genes encoding drug targets with the effects of pharmacological manipulation of those targets in clinical trials. We then tested the association of those variants with disease outcomes, including coronary heart disease, to predict cardiovascular safety of these agents. A low-frequency missense variant (Ala316Thr; rs10305492) in the gene encoding glucagon-like peptide-1 receptor (GLP1R), the target of GLP1R agonists, was associated with lower fasting glucose and lower T2D risk, consistent with GLP1R agonist therapies. The minor allele was also associated with protection against heart disease, thus providing evidence that GLP1R agonists are not likely to be associated with an unacceptable increase in cardiovascular risk. Our results provide an encouraging signal that these agents may be associated with benefit, a question currently being addressed in randomised controlled trials. Genetic variants associated with metabolic traits and multiple disease outcomes can be used to validate therapeutic targets at an early stage in the drug development process. PMID:27252175

  3. Identifying bully victims: definitional versus behavioral approaches.

    Science.gov (United States)

    Green, Jennifer Greif; Felix, Erika D; Sharkey, Jill D; Furlong, Michael J; Kras, Jennifer E

    2013-06-01

    Schools frequently assess bullying and the Olweus Bully/Victimization Questionnaire (BVQ; Olweus, 1996) is the most widely adopted tool for this purpose. The BVQ is a self-report survey that uses a definitional measurement method--describing "bullying" as involving repeated, intentional aggression in a relationship where there is an imbalance of power and then asking respondents to indicate how frequently they experienced this type of victimization. Few studies have examined BVQ validity and whether this definitional method truly identifies the repetition and power differential that distinguish bullying from other forms of peer victimization. This study examined the concurrent validity of the BVQ definitional question among 435 students reporting peer victimization. BVQ definitional responses were compared with responses to a behavioral measure that did not use the term "bullying" but, instead, included items that asked about its defining characteristics (repetition, intentionality, power imbalance). Concordance between the two approaches was moderate, with an area under the receiver operating characteristic curve of .72. BVQ responses were more strongly associated with students indicating repeated victimization and multiple forms of victimization than with power imbalance in their relationship with the bully. Findings indicate that the BVQ is a valid measure of repeated victimization and a broad range of victimization experiences but may not detect the more subtle and complex power imbalances that distinguish bullying from other forms of peer victimization. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  4. Distributed design approach in persistent identifiers systems

    Science.gov (United States)

    Golodoniuc, Pavel; Car, Nicholas; Klump, Jens

    2017-04-01

    The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementations, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have catered for identifier uniqueness, integrity, persistence, and trustworthiness, regardless of the identifier's application domain, the scope of which has expanded significantly in the past two decades. Since many PID systems have been largely conceived and developed by small communities, or even a single organisation, they have faced challenges in gaining widespread adoption and, most importantly, the ability to survive change of technology. This has left a legacy of identifiers that still exist and are being used but which have lost their resolution service. We believe that one of the causes of once successful PID systems fading is their reliance on a centralised technical infrastructure or a governing authority. Golodoniuc et al. (2016) proposed an approach to the development of PID systems that combines the use of (a) the Handle system, as a distributed system for the registration and first-degree resolution of persistent identifiers, and (b) the PID Service (Golodoniuc et al., 2015), to enable fine-grained resolution to different information object representations. The proposed approach solved the problem of guaranteed first-degree resolution of identifiers, but left fine-grained resolution and information delivery under the control of a single authoritative source, posing risk to the long-term availability of information resources. Herein, we develop these approaches further and explore the potential of large-scale decentralisation at all levels: (i) persistent identifiers and information resources registration; (ii) identifier resolution; and (iii) data delivery. To achieve large-scale decentralisation

  5. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: absolute stable region, valid region, and invalid region. Secondly, while identifying the chatter stability lobes, three different regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to get exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method. It takes only about 10 minutes to get exact chatter stability lobes. Since it is based on the discretization method, the proposed approach can be used for different immersion conditions, including low-immersion cutting, and can be directly implemented in the workshop to make machining parameter selection more efficient.
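
    For illustration, a minimal Python sketch of the coarse-to-fine idea described above; the `is_stable` predicate is a hypothetical stand-in for the full-discretization stability test, and all grid limits are invented:

    ```python
    import numpy as np
    from scipy.ndimage import binary_dilation

    def is_stable(speed_rpm, depth_mm):
        # Hypothetical stand-in for the full-discretization stability test
        # (spectral radius of the transition matrix < 1).
        return depth_mm < 2.0 + 1.5 * abs(np.sin(speed_rpm / 1500.0))

    # Coarse pass: classify a sparse grid of (depth, speed) points.
    speeds = np.linspace(2000, 12000, 40)
    depths = np.linspace(0.1, 5.0, 20)
    coarse = np.array([[is_stable(s, d) for s in speeds] for d in depths])

    # Valid region: cells adjacent to both stable and unstable points, i.e.
    # cells the stability boundary passes through. Cells far from the
    # boundary (absolute stable / invalid regions) are skipped entirely.
    near_boundary = binary_dilation(coarse) & binary_dilation(~coarse)

    # Fine pass: bisect on cutting depth only inside the valid region.
    lobes = {}
    for j, s in enumerate(speeds):
        rows = np.flatnonzero(near_boundary[:, j])
        if rows.size == 0:
            continue
        lo, hi = depths[rows.min()] - 0.3, depths[rows.max()] + 0.3
        for _ in range(20):  # bisection to refine the boundary depth
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if is_stable(s, mid) else (lo, mid)
        lobes[s] = 0.5 * (lo + hi)  # stability boundary at this speed
    ```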

  6. WSRC approach to validation of criticality safety computer codes

    International Nuclear Information System (INIS)

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (K-eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be: (1) repeatable; (2) demonstrated with defined confidence; and (3) valid over an identified range of neutronic conditions (area of applicability). The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems whose principal second isotope is 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed
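
    As a worked illustration of point (2), "demonstrated with defined confidence": a common practice is to estimate the code bias and a one-sided lower tolerance limit from benchmark calculations. A sketch under a normality assumption, with invented K-eff values (generic practice, not the specific WSRC procedure):

    ```python
    import numpy as np
    from scipy.stats import norm, nct

    # Calculated K-eff for a set of critical benchmark experiments
    # (invented values; a real suite spans reflection, concentration,
    # and moderation ranges as described above).
    keff = np.array([0.9982, 1.0005, 0.9991, 1.0012, 0.9978,
                     0.9969, 1.0003, 0.9987, 0.9996, 0.9974])
    n = len(keff)
    bias = keff.mean() - 1.0  # benchmarks are critical, so K = 1 exactly

    # One-sided lower tolerance limit: bound 95% of the population with
    # 95% confidence, assuming normally distributed calculation results.
    delta = norm.ppf(0.95) * np.sqrt(n)           # noncentrality parameter
    k_factor = nct.ppf(0.95, n - 1, delta) / np.sqrt(n)
    lower_limit = keff.mean() - k_factor * keff.std(ddof=1)

    print(f"bias = {bias:+.4f}, lower tolerance limit = {lower_limit:.4f}")
    ```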

  7. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  8. An Argument Approach to Observation Protocol Validity

    Science.gov (United States)

    Bell, Courtney A.; Gitomer, Drew H.; McCaffrey, Daniel F.; Hamre, Bridget K.; Pianta, Robert C.; Qi, Yi

    2012-01-01

    This article develops a validity argument approach for use on observation protocols currently used to assess teacher quality for high-stakes personnel and professional development decisions. After defining the teaching quality domain, we articulate an interpretive argument for observation protocols. To illustrate the types of evidence that might…

  9. Identifying Primary Spontaneous Pneumothorax from Administrative Databases: A Validation Study

    Directory of Open Access Journals (Sweden)

    Eric Frechette

    2016-01-01

    Full Text Available Introduction. Primary spontaneous pneumothorax (PSP) is a disorder commonly encountered in healthy young individuals. There is no differentiation between PSP and secondary pneumothorax (SP) in the current version of the International Classification of Diseases (ICD-10). This complicates the conduct of epidemiological studies on the subject. Objective. To validate the accuracy of an algorithm that identifies cases of PSP from administrative databases. Methods. The charts of 150 patients who consulted the emergency room (ER) with a recorded main diagnosis of pneumothorax were reviewed to define the type of pneumothorax that occurred. The corresponding hospital administrative data collected during previous hospitalizations and ER visits were processed through the proposed algorithm. The results were compared over two different age groups. Results. There were 144 cases of pneumothorax correctly coded (96%). The results obtained from the PSP algorithm demonstrated a significantly higher sensitivity (97% versus 81%, p=0.038) and positive predictive value (87% versus 46%, p<0.001) in patients under 40 years of age than in older patients. Conclusions. The proposed algorithm is adequate to identify cases of PSP from administrative databases in the age group classically associated with the disease. This makes possible its utilization in large population-based studies.
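
    For reference, the sensitivity and positive predictive value reported above are simple functions of the chart-review labels and the algorithm flags; a sketch with toy data standing in for the two age strata:

    ```python
    def sensitivity_and_ppv(truth, flagged):
        """truth: chart-review PSP labels; flagged: algorithm output (1 = PSP)."""
        tp = sum(1 for t, f in zip(truth, flagged) if t and f)
        fn = sum(1 for t, f in zip(truth, flagged) if t and not f)
        fp = sum(1 for t, f in zip(truth, flagged) if f and not t)
        return tp / (tp + fn), tp / (tp + fp)

    # Toy labels standing in for the two age strata of the chart review.
    cohorts = {
        "<40 years":  ([1, 1, 1, 0, 0, 1], [1, 1, 1, 0, 1, 1]),
        ">=40 years": ([1, 0, 0, 1, 0, 0], [1, 1, 0, 0, 1, 0]),
    }
    for group, (truth, flagged) in cohorts.items():
        sens, ppv = sensitivity_and_ppv(truth, flagged)
        print(f"{group}: sensitivity = {sens:.0%}, PPV = {ppv:.0%}")
    ```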

  10. Identifying MMORPG Bots: A Traffic Analysis Approach

    Science.gov (United States)

    Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin

    2008-12-01

    Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
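
    As an illustration of feature 1), the regularity of command release times can be summarized by the dispersion of inter-command intervals. A sketch with synthetic traces; the decision threshold is illustrative, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def command_regularity(timestamps):
        """Coefficient of variation (CV) of inter-command intervals.
        Timer-driven bots show far lower dispersion than humans."""
        gaps = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
        return gaps.std() / gaps.mean()

    # Synthetic traces: a bot issuing a command every ~0.5 s vs. a human.
    bot_trace = np.arange(0.0, 60.0, 0.5) + rng.normal(0.0, 0.01, 120)
    human_trace = np.cumsum(rng.exponential(0.5, 120))

    for name, trace in [("bot", bot_trace), ("human", human_trace)]:
        cv = command_regularity(trace)
        label = "bot-like" if cv < 0.2 else "human-like"  # illustrative cutoff
        print(f"{name}: CV = {cv:.2f} -> {label}")
    ```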

  11. Identifying MMORPG Bots: A Traffic Analysis Approach

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2008-11-01

    Full Text Available Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.

  12. An approach to identify urban groundwater recharge

    Directory of Open Access Journals (Sweden)

    E. Vázquez-Suñé

    2010-10-01

    Full Text Available Evaluating the proportion in which waters from different origins are mixed in a given water sample is relevant for many hydrogeological problems, such as quantifying total recharge, assessing groundwater pollution risks, or managing water resources. Our work is motivated by urban hydrogeology, where waters with different chemical signatures can be identified (losses from water supply and sewage networks, infiltration from surface runoff and other water bodies, lateral aquifer inflows, ...). The relative contribution of different sources to total recharge can be quantified by means of solute mass balances, but application is hindered by the large number of potential origins. Hence the need to incorporate data from a large number of conservative species, the uncertainty in source concentrations and measurement errors. We present a methodology to compute mixing ratios and end-member composition, which consists of (i) identification of potential recharge sources, (ii) selection of tracers, (iii) characterization of the hydrochemical composition of potential recharge sources and mixed water samples, and (iv) computation of mixing ratios and reevaluation of end-members. The analysis performed on a data set of samples from the Barcelona city aquifers suggests that the main contributors to total recharge are water supply network losses (22%), sewage network losses (30%), rainfall, concentrated in the non-urbanized areas (17%), runoff infiltration (20%), and the Besòs River (11%). Regarding species, halogens (chloride, fluoride and bromide), sulfate, total nitrogen, and stable isotopes (18O, 2H, and 34S) behaved quite conservatively. Boron, residual alkalinity, EDTA and Zn did not. Yet, including these species in the computations did not significantly affect the proportion estimates.
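
    Step (iv) reduces to a constrained least-squares problem: find non-negative end-member fractions that sum to one and reproduce the observed tracer concentrations. A minimal sketch, with invented tracer values rather than the Barcelona data:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Columns: end-members (supply network, sewage, rain/runoff, river);
    # rows: conservative tracers. All values are invented placeholders.
    A = np.array([[110.0, 180.0, 15.0, 90.0],    # chloride (mg/L)
                  [ 60.0, 120.0, 10.0, 75.0],    # sulfate (mg/L)
                  [ -5.8,  -6.1, -4.2, -7.0]])   # delta-18O (per mil)
    b = np.array([120.0, 70.0, -5.9])            # mixed groundwater sample

    # Enforce sum(fractions) == 1 by appending a heavily weighted row of
    # ones; nnls then keeps every mixing fraction non-negative.
    w = 1e3
    A_aug = np.vstack([A, w * np.ones(A.shape[1])])
    b_aug = np.append(b, w)
    fractions, residual = nnls(A_aug, b_aug)
    print(dict(zip(["supply", "sewage", "rain/runoff", "river"],
                   np.round(fractions, 3))))
    ```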

  13. Identifying and Validating a Model of Interpersonal Performance Dimensions

    National Research Council Canada - National Science Library

    Carpenter, Tara

    2004-01-01

    .... Two studies were then completed to validate the proposed taxonomy. In the first study empirical evidence for the taxonomy was gathered using a content analysis of critical incidents taken from a job analysis...

  14. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided and show how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  15. Quality data validation: Comprehensive approach to environmental data validation

    International Nuclear Information System (INIS)

    Matejka, L.A. Jr.

    1993-01-01

    Environmental data validation consists of an assessment of three major areas: analytical method validation; field procedures and documentation review; evaluation of the level of achievement of data quality objectives based in part on PARCC parameters analysis and expected applications of data. A program utilizing matrix association of required levels of validation effort and analytical levels versus applications of this environmental data was developed in conjunction with DOE-ID guidance documents to implement actions under the Federal Facilities Agreement and Consent Order in effect at the Idaho National Engineering Laboratory. This was an effort to bring consistent quality to the INEL-wide Environmental Restoration Program and database in an efficient and cost-effective manner. This program, documenting all phases of the review process, is described here

  16. Identifying and Evaluating External Validity Evidence for Passing Scores

    Science.gov (United States)

    Davis-Becker, Susan L.; Buckendahl, Chad W.

    2013-01-01

    A critical component of the standard setting process is collecting evidence to evaluate the recommended cut scores and their use for making decisions and classifying students based on test performance. Kane (1994, 2001) proposed a framework by which practitioners can identify and evaluate evidence of the results of the standard setting from (1)…

  17. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982, "Development of a Conservative Model Validation Approach for Reliable Analysis": ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account ...

  18. A probabilistic approach for validating protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Wang, Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary-structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.
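
    A toy illustration of the underlying idea: estimate a global reference offset as the median deviation, then flag shifts that fall far outside residue-specific expected distributions. The expected-shift table below is invented; PANAV itself uses secondary-structure-specific distributions over 3-6 residue fragments:

    ```python
    import numpy as np

    # Invented residue-specific expected CA shifts (mean, sd) in ppm.
    EXPECTED_CA = {"ALA": (53.1, 1.9), "GLY": (45.4, 1.3), "LEU": (55.6, 2.1)}

    def validate_ca_shifts(assignments):
        """assignments: list of (residue_name, observed_CA_shift_ppm)."""
        devs = np.array([obs - EXPECTED_CA[res][0] for res, obs in assignments])
        offset = np.median(devs)  # systematic part suggests mis-referencing
        for (res, obs), dev in zip(assignments, devs):
            z = (dev - offset) / EXPECTED_CA[res][1]
            if abs(z) > 3.0:      # residual outlier: possible mis-assignment
                print(f"suspicious {res} CA shift {obs:.1f} ppm (z = {z:+.1f})")
        return offset

    offset = validate_ca_shifts([("ALA", 54.9), ("GLY", 47.3), ("LEU", 57.2),
                                 ("ALA", 61.0), ("GLY", 47.1)])
    print(f"estimated reference offset: {offset:+.2f} ppm")
    ```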

  19. 'Omics' approaches in tomato aimed at identifying candidate genes ...

    African Journals Online (AJOL)

    adriana

    2013-12-04

    Dec 4, 2013 ... approaches could be combined in order to identify candidate genes for the genetic control of ascorbic ... applied to other traits under the complex control of many ... Engineering increased vitamin C levels in ...

  20. A Computer Vision Approach to Identify Einstein Rings and Arcs

    Science.gov (United States)

    Lee, Chien-Hsiu

    2017-03-01

    Einstein rings are rare gems of strong lensing phenomena; the ring images can be used to probe the underlying lens gravitational potential at every position angle, tightly constraining the lens mass profile. In addition, the magnified images also enable us to probe high-z galaxies with enhanced resolution and signal-to-noise ratios. However, only a handful of Einstein rings have been reported, either from serendipitous discoveries or visual inspections of hundreds of thousands of massive galaxies or galaxy clusters. In the era of large sky surveys, an automated approach to identify ring patterns in the big data to come is in high demand. Here, we present an Einstein ring recognition approach based on computer vision techniques. The workhorse is the circle Hough transform, which recognises circular patterns or arcs in images. We propose a two-tier approach: first pre-select massive galaxies associated with multiple blue objects as possible lenses, then use the Hough transform to identify circular patterns. As a proof of concept, we apply our approach to SDSS, with high completeness, albeit with low purity. We also apply our approach to other lenses in the DES, HSC-SSP, and UltraVISTA surveys, illustrating the versatility of our approach.
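
    The workhorse step maps directly onto OpenCV's circle Hough transform. A sketch on a single cutout, with illustrative tuning parameters rather than values from the paper:

    ```python
    import cv2
    import numpy as np

    # Grayscale cutout centred on a candidate massive galaxy (pre-selected
    # for multiple nearby blue objects, as in the two-tier scheme above).
    img = cv2.imread("candidate_cutout.png", cv2.IMREAD_GRAYSCALE)
    img = cv2.medianBlur(img, 5)  # suppress noise before the transform

    circles = cv2.HoughCircles(
        img, cv2.HOUGH_GRADIENT,
        dp=1.5,            # inverse ratio of accumulator resolution
        minDist=20,        # minimum separation between detected centres (px)
        param1=100,        # Canny upper threshold used internally
        param2=30,         # accumulator threshold: lower finds more circles
        minRadius=5, maxRadius=40)  # plausible Einstein radii in pixels

    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            print(f"ring/arc candidate at ({x}, {y}), radius {r} px")
    ```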

  1. Application of Monte Carlo cross-validation to identify pathway cross-talk in neonatal sepsis.

    Science.gov (United States)

    Zhang, Yuxia; Liu, Cui; Wang, Jingna; Li, Xingxia

    2018-03-01

    To explore genetic pathway cross-talk in neonates with sepsis, an integrated approach was used in this paper. Gene expression profiles and biological signaling pathways were first integrated to explore the potential relationships between pathways and genes differentially expressed between normal uninfected neonates and neonates with sepsis. For each pathway, a score was obtained from the gene expression data by quantitatively analyzing pathway cross-talk. The paired pathways with high cross-talk were identified by random forest classification. The purpose of the work was to find the pairs of pathways best able to discriminate sepsis samples from normal samples. The results identified 10 pairs of pathways that were probably able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways were identified through analysis of the extensive literature. Impact statement: To find the best pairs of pathways able to discriminate sepsis samples from normal samples, a random forest classifier, discriminating scores derived from differentially expressed genes of significantly associated paired pathways, and Monte Carlo cross-validation were applied in this paper. Ten pairs of pathways were probably able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways ((7) IL-6 Signaling and Phospholipase C Signaling (PLC); (8) Glucocorticoid Receptor (GR) Signaling and Dendritic Cell Maturation) were identified through analysis of the extensive literature.
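
    The evaluation scheme described here, a random forest scored under Monte Carlo cross-validation, corresponds to repeated random train/test splits, which scikit-learn exposes as ShuffleSplit. A sketch on synthetic stand-in data; the real features are per-sample discriminating scores of pathway pairs:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import ShuffleSplit, cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_pairs = 120, 50              # samples x candidate pathway pairs
    X = rng.normal(size=(n_samples, n_pairs)) # stand-in discriminating scores
    y = rng.integers(0, 2, n_samples)         # 1 = sepsis, 0 = control

    # Monte Carlo cross-validation: many random 70/30 splits.
    mccv = ShuffleSplit(n_splits=100, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    scores = cross_val_score(clf, X, y, cv=mccv, scoring="roc_auc")
    print(f"mean AUC over {mccv.n_splits} random splits: {scores.mean():.2f}")

    # Rank pathway pairs by forest importance after a full fit.
    clf.fit(X, y)
    top10 = np.argsort(clf.feature_importances_)[::-1][:10]
    print("top pathway-pair indices:", top10)
    ```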

  2. A knowledge-driven approach to cluster validity assessment.

    Science.gov (United States)

    Bolshakova, Nadia; Azuaje, Francisco; Cunningham, Pádraig

    2005-05-15

    This paper presents an approach to assessing cluster validity based on similarity knowledge extracted from the Gene Ontology. The program is freely available for non-profit use on request from the authors.

  3. A statistical approach to identify candidate cues for nestmate recognition

    DEFF Research Database (Denmark)

    van Zweden, Jelle; Pontieri, Luigi; Pedersen, Jes Søe

    2014-01-01

    normalization, centroid, and distance calculation is most diagnostic to discriminate between NMR cues and other compounds. We find that using a "global centroid" instead of a "colony centroid" significantly improves the analysis. One reason may be that this new approach, unlike previous ones, provides ... than for F. exsecta, possibly due to less than ideal datasets. Nonetheless, some compound sets performed better than others, showing that this approach can be used to identify candidate compounds to be tested in bio-assays, and eventually crack the sophisticated code that governs nestmate recognition....

  4. Validation of assessment tools for identifying trauma symptomatology in young children exposed to trauma

    DEFF Research Database (Denmark)

    Schandorph Løkkegaard, Sille; Elmose, Mette; Elklit, Ask

    There is a lack of validated, developmentally sensitive Danish assessment tools for preschool and young school children exposed to psychological trauma. Consequently, young traumatised children are at risk of not being identified. The purpose of this project is to validate three assessment tools that identify trauma symptomatology in young children: a caregiver interview called the Diagnostic Infant and Preschool Assessment (DIPA), a structured play test called the Odense Child Trauma Screening (OCTS), and a child questionnaire called the Darryl Cartoon Test. Three validity studies were conducted...

  5. An Objective Approach to Identify Spectral Distinctiveness for Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Yeou-Jiunn Chen

    2013-01-01

    Full Text Available To facilitate the process of developing speech perception, speech-language pathologists have to teach a subject with hearing loss the differences between two syllables by manually enhancing acoustic cues of speech. However, this process is time consuming and difficult. Thus, this study proposes an objective approach to automatically identify the regions of spectral distinctiveness between two syllables, which is used for speech-perception training. To accurately represent the characteristics of speech, mel-frequency cepstrum coefficients are selected as analytical parameters. The mismatch between two syllables in the time domain is handled by dynamic time warping. Further, a filter bank is adopted to estimate the components in different frequency bands, which are also represented as mel-frequency cepstrum coefficients. The spectral distinctiveness in different frequency bands is then easily estimated by using Euclidean metrics. Finally, a morphological gradient operator is applied to automatically identify the regions of spectral distinctiveness. To evaluate the proposed approach, the identified regions are manipulated and the manipulated syllables are measured by a closed-set speech-perception test. The experimental results demonstrated that the identified regions of spectral distinctiveness are very useful in speech perception, which indeed can help speech-language pathologists in speech-perception training.
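
    A condensed sketch of this pipeline, using librosa for the MFCCs and DTW alignment and SciPy's morphological gradient for the final step; file names and parameters are illustrative:

    ```python
    import librosa
    import numpy as np
    from scipy.ndimage import morphological_gradient

    # Load the two syllables to be contrasted.
    y1, sr = librosa.load("syllable_a.wav", sr=16000)
    y2, _  = librosa.load("syllable_b.wav", sr=16000)
    m1 = librosa.feature.mfcc(y=y1, sr=sr, n_mfcc=13)  # (13, frames)
    m2 = librosa.feature.mfcc(y=y2, sr=sr, n_mfcc=13)

    # Handle the timing mismatch with dynamic time warping.
    _, wp = librosa.sequence.dtw(X=m1, Y=m2)  # warping path (index pairs)
    wp = wp[::-1]                             # start-to-end order

    # Band-wise (per-coefficient) distance along the aligned path.
    dist = np.abs(m1[:, wp[:, 0]] - m2[:, wp[:, 1]])  # (13, path length)

    # Morphological gradient highlights sharp changes in distinctiveness.
    grad = morphological_gradient(dist, size=(1, 5))
    regions = np.flatnonzero(grad.mean(axis=0) > grad.mean() + grad.std())
    print("frames with high spectral distinctiveness:", regions)
    ```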

  6. A robust approach to QMU, validation, and conservative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, Daniel Joseph; Paez, Thomas Lee; Bauman, Lara E

    2013-01-01

    A systematic approach to defining margin is developed that incorporates statistical information and accommodates data uncertainty, but does not require assumptions about specific forms of the tails of distributions. This approach extends to calculations underlying validation assessment and quantitatively conservative predictions.

  7. Validation of search filters for identifying pediatric studies in PubMed

    NARCIS (Netherlands)

    Leclercq, Edith; Leeflang, Mariska M. G.; van Dalen, Elvira C.; Kremer, Leontien C. M.

    2013-01-01

    To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity,

  8. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used over the last two decades in the design phase to reduce a product's environmental impacts through the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can reflect the relationship between design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
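
    A minimal one-at-a-time version of such a sensitivity analysis: perturb each design parameter and rank parameters by the normalized change in the impact score. The impact_model below is a hypothetical placeholder for the LCA calculation:

    ```python
    def impact_model(params):
        # Hypothetical LCA stand-in: maps design parameters to a single
        # environmental impact score (e.g. kg CO2-eq).
        return (4.2 * params["area"] * params["thickness"]
                + 12.0 * params["area"] * params["cu_frac"])

    baseline = {"area": 0.02, "thickness": 1.6, "cu_frac": 0.12}  # illustrative
    base_impact = impact_model(baseline)

    sensitivities = {}
    for name in baseline:
        bumped = dict(baseline)
        bumped[name] *= 1.10                     # +10% perturbation
        rel_change = (impact_model(bumped) - base_impact) / base_impact
        sensitivities[name] = rel_change / 0.10  # normalized sensitivity

    for name, s in sorted(sensitivities.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name}: normalized sensitivity {s:+.2f}")
    ```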

  9. Identifying technology innovations for marginalized smallholders-A conceptual approach.

    Science.gov (United States)

    Malek, Mohammad Abdul; Gatzweiler, Franz W; Von Braun, Joachim

    2017-05-01

    This paper contributes to the existing literature by providing a theoretical and conceptual background for identifying the idle potential of marginal rural areas and people by means of technological and institutional innovations. The approach follows an ex-ante assessment for identifying suitable technology and institutional innovations for marginalized smallholders in marginal areas, divided into three main parts (mapping, surveying and evaluating) and several steps. Finally, it contributes to the inclusion of marginalized smallholders through an improved understanding of the interactions between technology needs, farming systems, ecological resources and poverty characteristics in the different segments of the poor, and by linking these insights with productivity-enhancing technologies.

  10. A model-based approach to identify binding sites in CLIP-Seq data.

    Directory of Open Access Journals (Sweden)

    Tao Wang

    Full Text Available Cross-linking immunoprecipitation coupled with high-throughput sequencing (CLIP-Seq) has made it possible to identify the targeting sites of RNA-binding proteins in various cell culture systems and tissue types on a genome-wide scale. Here we present a novel model-based approach (MiClip) to identify high-confidence protein-RNA binding sites from CLIP-seq datasets. This approach assigns a probability score for each potential binding site to help prioritize subsequent validation experiments. The MiClip algorithm has been tested in both HITS-CLIP and PAR-CLIP datasets. In the HITS-CLIP dataset, the signal/noise ratios of miRNA seed motif enrichment produced by the MiClip approach are between 17% and 301% higher than those by the ad hoc method for the top 10 most enriched miRNAs. In the PAR-CLIP dataset, the MiClip approach can identify ∼50% more validated binding targets than the original ad hoc method and two recently published methods. To facilitate the application of the algorithm, we have released an R package, MiClip (http://cran.r-project.org/web/packages/MiClip/index.html), and a public web-based graphical user interface software (http://galaxy.qbrc.org/tool_runner?tool_id=mi_clip) for customized analysis.

  11. Identifying Careless Responding With the Psychopathic Personality Inventory-Revised Validity Scales.

    Science.gov (United States)

    Marcus, David K; Church, Abere Sawaqdeh; O'Connell, Debra; Lilienfeld, Scott O

    2018-01-01

    The Psychopathic Personality Inventory-Revised (PPI-R) includes validity scales that assess Deviant Responding (DR), Virtuous Responding, and Inconsistent Responding. We examined the utility of these scales for identifying careless responding using data from two online studies that examined correlates of psychopathy in college students (Sample 1: N = 583; Sample 2: N = 454). Compared with those below the cut scores, those above the cut on the DR scale yielded consistently lower validity coefficients when PPI-R scores were correlated with corresponding scales from the Triarchic Psychopathy Measure. The other three PPI-R validity scales yielded weaker and less consistent results. Participants who completed the studies in an inordinately brief amount of time scored significantly higher on the DR and Virtuous Responding scales than other participants. Based on the findings from the current studies, researchers collecting PPI-R data online should consider identifying and perhaps screening out respondents with elevated scores on the DR scale.

  12. A Novel Approach to Identifying Trajectories of Mobility Change in Older Adults.

    Directory of Open Access Journals (Sweden)

    Rachel E Ward

    Full Text Available To validate trajectories of late-life mobility change using a novel approach designed to overcome the constraints of modest sample size and few follow-up time points. Using clinical reasoning and distribution-based methodology, we identified trajectories of mobility change (Late Life Function and Disability Instrument) across 2 years in 391 participants aged ≥65 years from a prospective cohort study designed to identify modifiable impairments predictive of mobility in late life. We validated our approach using model fit indices and by comparing baseline mobility-related factors between trajectories. Model fit indices confirmed that the optimal number of trajectories was between 4 and 6. Mobility-related factors varied across trajectories, with the most unfavorable values in poor mobility trajectories and the most favorable in high mobility trajectories. These factors included leg strength, trunk extension endurance, knee flexion range of motion, limb velocity, physical performance measures, and the number and prevalence of medical conditions including osteoarthritis and back pain. Our findings support the validity of this approach and may facilitate the investigation of a broader scope of research questions within aging populations of varied sizes and traits.

  13. Preliminary Validation of a New Clinical Tool for Identifying Problem Video Game Playing

    Science.gov (United States)

    King, Daniel Luke; Delfabbro, Paul H.; Zajac, Ian T.

    2011-01-01

    Research has estimated that between 6 and 13% of individuals who play video games do so excessively. However, the methods and definitions used to identify "problem" video game players often vary considerably. This research presents preliminary validation data for a new measure of problematic video game play called the Problem Video Game…

  14. Identifying Gifted Students in Puerto Rico: Validation of a Spanish Translation of the Gifted Rating Scales

    Science.gov (United States)

    Rosado, Javier I.; Pfeiffer, Steven; Petscher, Yaacov

    2015-01-01

    The challenge of correctly identifying gifted students is a critical issue. Gifted education in Puerto Rico is marked by insufficient support and a lack of appropriate identification methods. This study examined the reliability and validity of a Spanish translation of the "Gifted Rating Scales-School Form" (GRS) with a sample of 618…

  15. Novel approaches to identify protective malaria vaccine candidates

    Directory of Open Access Journals (Sweden)

    Wan Ni Chia

    2014-11-01

    Full Text Available Efforts to develop vaccines against malaria have been the focus of substantial research activities for decades. Several categories of candidate vaccines are currently being developed for protection against malaria, based on antigens corresponding to the pre-erythrocytic, blood-stage or sexual stages of the parasite. Long-lasting sterile protection from Plasmodium falciparum sporozoite challenge has been observed in humans following vaccination with whole parasite formulations, clearly demonstrating that a protective immune response targeting predominantly the pre-erythrocytic stages can develop against malaria. However, most of the vaccine candidates currently being investigated, which are mostly subunit vaccines, have not been able to induce substantial (>50%) protection thus far. This is due to the fact that the antigens responsible for protection against the different parasite stages are still unknown and relevant correlates of protection have remained elusive. For a vaccine to be developed in a timely manner, novel approaches are required. In this article, we review the novel approaches that have been developed to identify the antigens for the development of an effective malaria vaccine.

  16. Identifying biomarkers for asthma diagnosis using targeted metabolomics approaches.

    Science.gov (United States)

    Checkley, William; Deza, Maria P; Klawitter, Jost; Romero, Karina M; Klawitter, Jelena; Pollard, Suzanne L; Wise, Robert A; Christians, Uwe; Hansel, Nadia N

    2016-12-01

    The diagnosis of asthma in children is challenging and relies on a combination of clinical factors and biomarkers including methacholine challenge, lung function, bronchodilator responsiveness, and presence of airway inflammation. No single test is diagnostic. We sought to identify a pattern of inflammatory biomarkers that was unique to asthma using a targeted metabolomics approach combined with data science methods. We conducted a nested case-control study of 100 children living in a peri-urban community in Lima, Peru. We defined cases as children with current asthma, and controls as children with no prior history of asthma and normal lung function. We further categorized enrollment following a factorial design to enroll equal numbers of children as either overweight or not. We obtained a fasting venous blood sample to characterize a comprehensive panel of targeted markers using a metabolomics approach based on high performance liquid chromatography-mass spectrometry. A statistical comparison of targeted metabolites between children with asthma (n = 50) and healthy controls (n = 49) revealed distinct patterns in relative concentrations of several metabolites: children with asthma had approximately 40-50% lower relative concentrations of ascorbic acid, 2-isopropylmalic acid, shikimate-3-phosphate, and 6-phospho-d-gluconate when compared to children without asthma, and 70% lower relative concentrations of reduced glutathione (all comparisons statistically significant). Serum concentration thresholds (one metabolite > 13 077 normalized counts/second and betaine ≤ 16 47 121 normalized counts/second) discriminated between the two groups. By using a metabolomics approach applied to serum, we were able to discriminate between children with and without asthma by revealing different metabolic patterns. These results suggest that serum metabolomics may represent a diagnostic tool for asthma and may be helpful for distinguishing asthma phenotypes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. A discrete wavelet spectrum approach for identifying non-monotonic trends in hydroclimate data

    Science.gov (United States)

    Sang, Yan-Fang; Sun, Fubao; Singh, Vijay P.; Xie, Ping; Sun, Jian

    2018-01-01

    The hydroclimatic process is changing non-monotonically and identifying its trends is a great challenge. Building on the discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961-2013 and found that the DWS approach detected both the warming and the warming hiatus in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e. the widely defined climate timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and was not detected by the Mann-Kendall test. Comparatively, the DWS approach overcame the problem and detected those significant non-monotonic trends at 380 stations, which helped understand and interpret the spatiotemporal variability in the hydroclimatic process. Our results suggest that non-monotonic trends of hydroclimate time series and their significance should be carefully identified, and the DWS approach proposed has the potential for wide use in the hydrological and climate sciences.
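
    The core operation, separating a slowly varying and possibly non-monotonic trend from noise with the discrete wavelet transform, can be sketched with PyWavelets. The wavelet, decomposition level, and synthetic series are illustrative; the published DWS approach adds a significance test against reference noise spectra:

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(1)
    years = np.arange(1961, 2014)
    # Synthetic series: warming, then a hiatus-like flattening, plus noise.
    signal = np.where(years < 1998, 0.02 * (years - 1961), 0.74)
    series = signal + rng.normal(0, 0.15, years.size)

    # Multilevel DWT; the deepest approximation carries the trend.
    wavelet, level = "db4", 3
    coeffs = pywt.wavedec(series, wavelet, level=level)
    trend_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(trend_only, wavelet)[: series.size]

    # Energy of approximation vs. detail coefficients gives a simple
    # spectrum-style summary of how much variance the trend explains.
    energies = [float(np.sum(c ** 2)) for c in coeffs]
    print("approx/detail energies:", np.round(energies, 2))
    ```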

  18. A discrete wavelet spectrum approach for identifying non-monotonic trends in hydroclimate data

    Directory of Open Access Journals (Sweden)

    Y.-F. Sang

    2018-01-01

    Full Text Available The hydroclimatic process is changing non-monotonically and identifying its trends is a great challenge. Building on the discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961–2013 and found that the DWS approach detected both the warming and the warming hiatus in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e. the widely defined climate timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and was not detected by the Mann–Kendall test. Comparatively, the DWS approach overcame the problem and detected those significant non-monotonic trends at 380 stations, which helped understand and interpret the spatiotemporal variability in the hydroclimatic process. Our results suggest that non-monotonic trends of hydroclimate time series and their significance should be carefully identified, and the DWS approach proposed has the potential for wide use in the hydrological and climate sciences.

  19. A tripartite approach identifies the major sunflower seed albumins.

    Science.gov (United States)

    Jayasena, Achala S; Franke, Bastian; Rosengren, Johan; Mylne, Joshua S

    2016-03-01

    We have used a combination of genomic, transcriptomic, and proteomic approaches to identify the napin-type albumin genes in sunflower and define their contributions to the seed albumin pool. Seed protein content is determined by the expression of what are typically large gene families. A major class of seed storage proteins is the napin-type, water soluble albumins. In this work we provide a comprehensive analysis of the napin-type albumin content of the common sunflower (Helianthus annuus) by analyzing a draft genome, a transcriptome and performing a proteomic analysis of the seed albumin fraction. We show that although sunflower contains at least 26 genes for napin-type albumins, only 15 of these are present at the mRNA level. We found protein evidence for 11 of these but the albumin content of mature seeds is dominated by the encoded products of just three genes. So despite high genetic redundancy for albumins, only a small sub-set of this gene family contributes to total seed albumin content. The three genes identified as producing the majority of sunflower seed albumin are potential future candidates for manipulation through genetics and breeding.

  20. Integrative biology approach identifies cytokine targeting strategies for psoriasis.

    Science.gov (United States)

    Perera, Gayathri K; Ainali, Chrysanthi; Semenova, Ekaterina; Hundhausen, Christian; Barinaga, Guillermo; Kassen, Deepika; Williams, Andrew E; Mirza, Muddassar M; Balazs, Mercedesz; Wang, Xiaoting; Rodriguez, Robert Sanchez; Alendar, Andrej; Barker, Jonathan; Tsoka, Sophia; Ouyang, Wenjun; Nestle, Frank O

    2014-02-12

    Cytokines are critical checkpoints of inflammation. The treatment of human autoimmune disease has been revolutionized by targeting inflammatory cytokines as key drivers of disease pathogenesis. Despite this, there exist numerous pitfalls when translating preclinical data into the clinic. We developed an integrative biology approach combining human disease transcriptome data sets with clinically relevant in vivo models in an attempt to bridge this translational gap. We chose interleukin-22 (IL-22) as a model cytokine because of its potentially important proinflammatory role in epithelial tissues. Injection of IL-22 into normal human skin grafts produced marked inflammatory skin changes resembling human psoriasis. Injection of anti-IL-22 monoclonal antibody in a human xenotransplant model of psoriasis, developed specifically to test potential therapeutic candidates, efficiently blocked skin inflammation. Bioinformatic analysis integrating both the IL-22 and anti-IL-22 cytokine transcriptomes and mapping them onto a psoriasis disease gene coexpression network identified key cytokine-dependent hub genes. Using knockout mice and small-molecule blockade, we show that one of these hub genes, the so far unexplored serine/threonine kinase PIM1, is a critical checkpoint for human skin inflammation and potential future therapeutic target in psoriasis. Using in silico integration of human data sets and biological models, we were able to identify a new target in the treatment of psoriasis.

  1. Identifying prognostic features by bottom-up approach and correlating to drug repositioning.

    Directory of Open Access Journals (Sweden)

    Wei Li

    Full Text Available Traditionally, a top-down method was used to identify prognostic features in cancer research. That is to say, genes differentially expressed in cancer versus normal tissue were identified first, and then tested for survival prediction power. The problem is that prognostic features identified from one set of patient samples can rarely be transferred to other datasets. We apply a bottom-up approach in this study: survival-correlated or clinical-stage-correlated genes were selected first and additionally prioritized by their network topology, so that a small set of features can be used as a prognostic signature. Gene expression profiles of a cohort of 221 hepatocellular carcinoma (HCC) patients were used as a training set, and the 'bottom-up' approach was applied to discover gene-expression signatures associated with survival in both tumor and adjacent non-tumor tissues, and compared with the 'top-down' approach. The results were validated in a second cohort of 82 patients, which was used as a testing set. Two sets of gene signatures separately identified in tumor and adjacent non-tumor tissues by the bottom-up approach were developed in the training cohort. These two signatures were associated with overall survival times of HCC patients and the robustness of each was validated in the testing set, and each predictive performance was better than that of gene expression signatures reported previously. Moreover, genes in these two prognostic signatures gave some indications for drug repositioning in HCC: some approved drugs targeting these markers have alternative indications for hepatocellular carcinoma. Using the bottom-up approach, we have developed two prognostic gene signatures with a limited number of genes that are associated with overall survival times of patients with HCC. Furthermore, prognostic markers in these two signatures have the potential to be therapeutic targets.
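
    The selection step, screening genes by survival correlation and then prioritizing by network topology, might be sketched as follows, using lifelines for the univariate Cox screen and networkx for centrality; column names and the interaction network are hypothetical:

    ```python
    import networkx as nx
    import pandas as pd
    from lifelines import CoxPHFitter

    def bottom_up_signature(expr, surv, network, p_cut=0.01, top_k=20):
        """expr: DataFrame (samples x genes); surv: DataFrame with
        'time'/'event' columns (same index); network: nx.Graph of
        gene-gene interactions keyed by gene symbol."""
        survival_genes = []
        for gene in expr.columns:              # univariate Cox screen
            df = pd.concat([surv[["time", "event"]], expr[[gene]]], axis=1)
            cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
            if cph.summary.loc[gene, "p"] < p_cut:
                survival_genes.append(gene)
        # Prioritize the survival-correlated genes by network topology.
        degree = nx.degree_centrality(network)
        ranked = sorted(survival_genes, key=lambda g: degree.get(g, 0.0),
                        reverse=True)
        return ranked[:top_k]                  # compact prognostic signature
    ```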

  2. Identifying perinatal risk factors for infant maltreatment: an ecological approach

    Directory of Open Access Journals (Sweden)

    Elaine J. Hallisey

    2006-12-01

    Full Text Available Background Child maltreatment and its consequences are a persistent problem throughout the world. Public health workers, human services officials, and others are interested in new and efficient ways to determine which geographic areas to target for intervention programs and resources. To improve assessment efforts, selected perinatal factors were examined, both individually and in various combinations, to determine if they are associated with increased risk of infant maltreatment. State of Georgia birth records and abuse and neglect data were analyzed using an area-based, ecological approach with the census tract as a surrogate for the community. Cartographic visualization suggested some correlation exists between risk factors and child maltreatment, so bivariate and multivariate regression were performed. The presence of spatial autocorrelation precluded the use of traditional ordinary least squares regression, therefore a spatial regression model coupled with maximum likelihood estimation was employed. Results Results indicate that all individual factors or their combinations are significantly associated with increased risk of infant maltreatment. The set of perinatal risk factors that best predicts infant maltreatment rates are: mother smoked during pregnancy, families with three or more siblings, maternal age less than 20 years, births to unmarried mothers, Medicaid beneficiaries, and inadequate prenatal care. Conclusion This model enables public health to take a proactive stance, to reasonably predict areas where poor outcomes are likely to occur, and to therefore more efficiently allocate resources. U.S. states that routinely collect the variables the National Center for Health Statistics (NCHS) defines for birth certificates can easily identify areas that are at high risk for infant maltreatment. The authors recommend that agencies charged with reducing child maltreatment target communities that demonstrate the perinatal risks
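
    The modelling choice described, testing for spatial autocorrelation and fitting a spatial lag model by maximum likelihood where OLS is ruled out, maps onto the PySAL stack. A hedged sketch with hypothetical file and column names:

    ```python
    import geopandas as gpd
    from libpysal.weights import Queen
    from esda.moran import Moran
    from spreg import ML_Lag

    # Hypothetical census-tract data with a maltreatment rate and the
    # six perinatal risk factors named above.
    gdf = gpd.read_file("tracts_with_rates.shp")
    w = Queen.from_dataframe(gdf)      # contiguity-based spatial weights
    w.transform = "r"                  # row-standardize the weights

    y = gdf[["maltreatment_rate"]].values
    X = gdf[["smoked", "three_plus_sibs", "teen_mother",
             "unmarried", "medicaid", "inadequate_care"]].values

    # Moran's I: is OLS ruled out by spatial autocorrelation in y?
    print("Moran's I p-value:", Moran(y.flatten(), w).p_sim)

    # Spatial lag model, maximum likelihood estimation.
    model = ML_Lag(y, X, w=w, name_y="maltreatment_rate")
    print(model.summary)
    ```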

  3. Developing an instrument to identify MBChB students' approaches ...

    African Journals Online (AJOL)

    The constructs of deep, surface and achieving approaches to learning are well defined in the literature and amply supported by research. Quality learning results from a deep approach to learning, and a deep-achieving approach to learning is regarded as the most adaptive approach institutionally. It is therefore felt that ...

  4. Reverse Vaccinology: An Approach for Identifying Leptospiral Vaccine Candidates

    Directory of Open Access Journals (Sweden)

    Odir A. Dellagostin

    2017-01-01

    Full Text Available Leptospirosis is a major public health problem with an incidence of over one million human cases each year. It is a globally distributed, zoonotic disease and is associated with significant economic losses in farm animals. Leptospirosis is caused by pathogenic Leptospira spp. that can infect a wide range of domestic and wild animals. Given the inability to control the cycle of transmission among animals and humans, there is an urgent demand for a new vaccine. Inactivated whole-cell vaccines (bacterins) are routinely used in livestock and domestic animals; however, protection is serovar-restricted and short-term only. To overcome these limitations, efforts have focused on the development of recombinant vaccines, with partial success. Reverse vaccinology (RV) has been successfully applied to many infectious diseases. A growing number of leptospiral genome sequences are now available in public databases, providing an opportunity to search for prospective vaccine antigens using RV. Several promising leptospiral antigens were identified using this approach, although only a few have been characterized and evaluated in animal models. In this review, we summarize the use of RV for leptospirosis and discuss the need for potential improvements for the successful development of a new vaccine towards reducing the burden of human and animal leptospirosis.

  5. The validity of using ICD-9 codes and pharmacy records to identify patients with chronic obstructive pulmonary disease

    Directory of Open Access Journals (Sweden)

    Lee Todd A

    2011-02-01

    Full Text Available Abstract Background Administrative data are often used to identify patients with chronic obstructive pulmonary disease (COPD), yet the validity of this approach is unclear. We sought to develop a predictive model utilizing administrative data to accurately identify patients with COPD. Methods Sequential logistic regression models were constructed using 9573 patients with postbronchodilator spirometry at two Veterans Affairs medical centers (2003-2007). COPD was defined as: (1) FEV1/FVC < 0.70. Results 4564 of 9573 patients (47.7%) had an FEV1/FVC < 0.70. Conclusion Commonly used definitions of COPD in observational studies misclassify the majority of patients as having COPD. Using multiple diagnostic codes in combination with pharmacy data improves the ability to accurately identify patients with COPD.
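
    The general shape of such a model, diagnosis codes plus pharmacy claims feeding a logistic regression that is judged against spirometry, can be sketched as follows. The data, feature names, and coefficients are invented for illustration; this is not the study's actual model.

```python
# Sketch: logistic regression combining ICD-9 code counts and pharmacy
# claims to predict spirometry-confirmed COPD. Invented data and
# coefficients; evaluated by sensitivity and positive predictive value.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 5000
n_copd_codes = rng.poisson(1.0, n)          # COPD ICD-9 codes per patient
inhaler_fill = rng.integers(0, 2, n)        # any bronchodilator claim
age = rng.normal(65, 10, n)

logit = -4 + 1.2 * n_copd_codes + 1.5 * inhaler_fill + 0.03 * age
has_copd = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([n_copd_codes, inhaler_fill, age])
X_tr, X_te, y_tr, y_te = train_test_split(X, has_copd, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

pred = clf.predict_proba(X_te)[:, 1] >= 0.5
tp = np.sum(pred & y_te)
fp = np.sum(pred & ~y_te)
fn = np.sum(~pred & y_te)
print(f"sensitivity = {tp / (tp + fn):.2f}, PPV = {tp / (tp + fp):.2f}")
```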

  6. Identifying predictors of physics item difficulty: A linear regression approach

    Science.gov (United States)

    Mesic, Vanes; Muratovic, Hasnija

    2011-06-01

    Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinarily difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained, quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. Foremost, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal physics knowledge.
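
    A minimal version of the final step, regressing Rasch item difficulties on content-analysis features, is sketched below with statsmodels. The feature codings and effect sizes are invented; only the overall recipe (binary item features, an OLS fit, explained variance) follows the study.

```python
# Sketch: regress Rasch item difficulties on binary content-analysis
# features of the items. Invented codings and effects; the printed R^2
# plays the role of the 61.2% explained variance reported above.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_items = 123
names = ["automaticity", "complexity", "modality",
         "divergent_thinking", "intuition_interference"]
features = rng.integers(0, 2, size=(n_items, len(names))).astype(float)
true_effects = np.array([-0.8, 0.9, 0.4, 0.7, 0.6])
difficulty = features @ true_effects + rng.normal(scale=0.6, size=n_items)

fit = sm.OLS(difficulty, sm.add_constant(features)).fit()
print(fit.summary(xname=["const"] + names))
print(f"explained item-difficulty variance: {fit.rsquared:.1%}")
```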

  7. Identifying predictors of physics item difficulty: A linear regression approach

    Directory of Open Access Journals (Sweden)

    Hasnija Muratovic

    2011-06-01

    Full Text Available Large-scale assessments of student achievement in physics are often approached with an intention to discriminate students based on the attained level of their physics competencies. Therefore, for purposes of test design, it is important that items display an acceptable discriminatory behavior. To that end, it is recommended to avoid extraordinarily difficult and very easy items. Knowing the factors that influence physics item difficulty makes it possible to model the item difficulty even before the first pilot study is conducted. Thus, by identifying predictors of physics item difficulty, we can improve the test-design process. Furthermore, we get additional qualitative feedback regarding the basic aspects of student cognitive achievement in physics that are directly responsible for the obtained, quantitative test results. In this study, we conducted a secondary analysis of data that came from two large-scale assessments of student physics achievement at the end of compulsory education in Bosnia and Herzegovina. Foremost, we explored the concept of “physics competence” and performed a content analysis of 123 physics items that were included within the above-mentioned assessments. Thereafter, an item database was created. Items were described by variables which reflect some basic cognitive aspects of physics competence. For each of the assessments, Rasch item difficulties were calculated in separate analyses. In order to make the item difficulties from different assessments comparable, a virtual test equating procedure had to be implemented. Finally, a regression model of physics item difficulty was created. It has been shown that 61.2% of item difficulty variance can be explained by factors which reflect the automaticity, complexity, and modality of the knowledge structure that is relevant for generating the most probable correct solution, as well as by the divergence of required thinking and interference effects between intuitive and formal physics knowledge.

  8. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    Directory of Open Access Journals (Sweden)

    Yamada Yoichi

    2012-12-01

    Full Text Available Abstract Background We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO term are upregulated or downregulated. However, MIMGO has not yet been validated on a real microarray dataset using all available GO terms. Findings We combined Gene Set Enrichment Analysis (GSEA) with MIMGO to identify differentially expressed GO terms in a yeast cell cycle microarray dataset. GSEA followed by MIMGO (GSEA + MIMGO) correctly identified differentially expressed GO terms in the dataset. Conclusions MIMGO is a reliable method to identify differentially expressed GO terms comprehensively.

  9. Automatic address validation and health record review to identify homeless Social Security disability applicants.

    Science.gov (United States)

    Erickson, Jennifer; Abbott, Kenneth; Susienka, Lucinda

    2018-06-01

    Homeless patients face a variety of obstacles in pursuit of basic social services. Acknowledging this, the Social Security Administration directs employees to prioritize homeless patients and handle their disability claims with special care. However, under existing manual processes for identification of homelessness, many homeless patients never receive the special service to which they are entitled. In this paper, we explore address validation and automatic annotation of electronic health records to improve identification of homeless patients. We developed a sample of claims containing medical records at the moment of arrival in a single office. Using address validation software, we reconciled patient addresses with public directories of homeless shelters, veterans' hospitals and clinics, and correctional facilities. Other tools annotated electronic health records. We trained random forests to identify homeless patients and validated each model with 10-fold cross validation. For our finished model, the area under the receiver operating characteristic curve was 0.942. The random forest improved sensitivity from 0.067 to 0.879 but decreased positive predictive value to 0.382. Presumed false positive classifications bore many characteristics of homelessness. Organizations could use these methods to prompt early collection of information necessary to avoid labor-intensive attempts to reestablish contact with homeless individuals. Annually, such methods could benefit tens of thousands of patients who are homeless, destitute, and in urgent need of assistance. We were able to identify many more homeless patients through a combination of automatic address validation and natural language processing of unstructured electronic health records. Copyright © 2018. Published by Elsevier Inc.
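
    The core of the classification experiment, a random forest over address-validation and chart-annotation features evaluated with 10-fold cross-validation, can be sketched with scikit-learn. The features and labels below are synthetic stand-ins, not the study's data.

```python
# Sketch: random forest flagging likely-homeless claims, validated with
# 10-fold cross-validation. Address-validation and record-annotation
# features are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict, cross_val_score

rng = np.random.default_rng(4)
n = 2000
X = np.column_stack([
    rng.integers(0, 2, n),   # address matches a shelter directory
    rng.integers(0, 2, n),   # undeliverable or facility address
    rng.poisson(0.3, n),     # "homeless" mentions found in record text
])
y = ((X[:, 0] == 1) | (X[:, 2] > 0)) & (rng.random(n) < 0.9)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc").mean()
pred = cross_val_predict(clf, X, y, cv=10)

tp = np.sum(pred & y)
fp = np.sum(pred & ~y)
fn = np.sum(~pred & y)
print(f"AUC = {auc:.3f}, sensitivity = {tp / (tp + fn):.3f}, "
      f"PPV = {tp / (tp + fp):.3f}")
```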

  10. [Validation of the AUDIT test for identifying risk consumption and alcohol use disorders in women].

    Science.gov (United States)

    Pérula de Torres, L A; Fernández-García, J A; Arias-Vega, R; Muriel-Palomino, M; Márquez-Rebollo, E; Ruiz-Moral, R

    2005-11-30

    To validate the AUDIT test for identifying women with excess alcohol consumption and/or dependency syndrome (DS). Descriptive study to validate a test. Two primary care centres and a county drug-dependency centre. 414 women aged 18 to 75 recruited at the clinic. Interventions. Social and personal details were obtained through personal interview, alcohol consumption was quantified, and the AUDIT and MALT questionnaires were filled in. Then the semi-structured SCAN interview was conducted (gold standard; DSM-IV and ICD-10 criteria), and laboratory analyses were requested (GGT, GOT, GPT, MCV). 186 patients were given a follow-up appointment three to four weeks later (retest). Intra-observer reliability was evaluated with the Kappa index, internal consistency with Cronbach's alpha, and criterion validity with indexes of sensitivity and specificity, predictive values and likelihood ratios. To evaluate the diagnostic performance of the test and the most effective cut-off point, a ROC analysis was run. 11.4% (95% CI, 8.98-13.81) were diagnosed with alcohol abuse (0.5%) or DS (10.9%). The Kappa coefficients of the AUDIT items ranged between 0.685 and 0.795. The AUDIT is a questionnaire with good psychometric properties. It is reliable and valid for the detection of risk consumption and DS in women.
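
    The ROC step generalizes well beyond the AUDIT. A minimal sketch with scikit-learn follows, using simulated scores and the Youden index to pick the most effective cut-off point; the score distributions and prevalence are assumptions.

```python
# Sketch: ROC analysis of a screening score against a gold-standard
# diagnosis, choosing the cut-off that maximizes the Youden index
# (sensitivity + specificity - 1). Scores are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(5)
n = 414
has_disorder = rng.random(n) < 0.114           # ~11.4% prevalence
audit_score = np.where(has_disorder,
                       rng.normal(12, 4, n),   # cases score higher
                       rng.normal(4, 3, n)).clip(0, 40)

fpr, tpr, thresholds = roc_curve(has_disorder, audit_score)
best = np.argmax(tpr - fpr)                    # Youden index
print(f"AUC = {roc_auc_score(has_disorder, audit_score):.3f}")
print(f"best cut-off >= {thresholds[best]:.1f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```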

  11. Identifying Similarities in Cognitive Subtest Functional Requirements: An Empirical Approach

    Science.gov (United States)

    Frisby, Craig L.; Parkin, Jason R.

    2007-01-01

    In the cognitive test interpretation literature, a Rational/Intuitive, Indirect Empirical, or Combined approach is typically used to construct conceptual taxonomies of the functional (behavioral) similarities between subtests. To address shortcomings of these approaches, the functional requirements for 49 subtests from six individually…

  12. Validation of de-identified record linkage to ascertain hospital admissions in a cohort study

    Directory of Open Access Journals (Sweden)

    English Dallas R

    2011-04-01

    Full Text Available Abstract Background Cohort studies can provide valuable evidence of cause and effect relationships but are subject to loss of participants over time, limiting the validity of findings. Computerised record linkage offers a passive and ongoing method of obtaining health outcomes from existing routinely collected data sources. However, the quality of record linkage is reliant upon the availability and accuracy of common identifying variables. We sought to develop and validate a method for linking a cohort study to a state-wide hospital admissions dataset with limited availability of unique identifying variables. Methods A sample of 2000 participants from a cohort study (n = 41 514) was linked to a state-wide hospitalisations dataset in Victoria, Australia, using the national health insurance (Medicare) number and demographic data as identifying variables. Availability of the health insurance number was limited in both datasets; therefore linkage was undertaken both with and without use of this number and agreement tested between both algorithms. Sensitivity was calculated for a sub-sample of 101 participants with a hospital admission confirmed by medical record review. Results Of the 2000 study participants, 85% were found to have a record in the hospitalisations dataset when the national health insurance number and sex were used as linkage variables, and 92% when demographic details only were used. When agreement between the two methods was tested, the disagreement fraction was 9%, mainly due to "false positive" links when demographic details only were used. A final algorithm that used multiple combinations of identifying variables resulted in a match proportion of 87%. Sensitivity of this final linkage was 95%. Conclusions High quality record linkage of cohort data with a hospitalisations dataset that has limited identifiers can be achieved using combinations of a national health insurance number and demographic data as identifying variables.
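
    Multi-pass deterministic linkage of this kind is easy to express with pandas: try the strongest key first, then fall back to demographics for records the first pass could not link. The toy tables and column names below are assumptions, not the study's datasets.

```python
# Sketch: multi-pass deterministic linkage with an agreement check.
# Pass 1 uses the insurance number + sex; pass 2 falls back to
# demographics for records pass 1 could not link. Toy data.
import pandas as pd

cohort = pd.DataFrame({
    "id": [1, 2, 3, 4],
    "medicare_no": ["A1", None, "C3", "D4"],   # limited availability
    "sex": ["F", "M", "F", "M"],
    "dob": ["1950-01-01", "1948-05-12", "1955-07-30", "1960-11-02"],
    "postcode": ["3000", "3121", "3000", "3056"],
})
hospital = pd.DataFrame({
    "medicare_no": ["A1", "B2", "C3"],
    "sex": ["F", "M", "F"],
    "dob": ["1950-01-01", "1948-05-12", "1955-07-30"],
    "postcode": ["3000", "3121", "3000"],
})

# Pass 1: national health insurance number + sex.
pass1 = cohort.dropna(subset=["medicare_no"]).merge(
    hospital, on=["medicare_no", "sex"])
# Pass 2: demographics only, for cohort members pass 1 did not link.
unlinked = cohort[~cohort["id"].isin(pass1["id"])]
pass2 = unlinked.merge(hospital, on=["sex", "dob", "postcode"])

linked_ids = set(pass1["id"]) | set(pass2["id"])
print(f"match proportion: {len(linked_ids) / len(cohort):.0%}")
```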

  13. DNA enrichment approaches to identify unauthorized genetically modified organisms (GMOs).

    Science.gov (United States)

    Arulandhu, Alfred J; van Dijk, Jeroen P; Dobnik, David; Holst-Jensen, Arne; Shi, Jianxin; Zel, Jana; Kok, Esther J

    2016-07-01

    With the increased global production of different genetically modified (GM) plant varieties, chances increase that unauthorized GM organisms (UGMOs) may enter the food chain. At the same time, the detection of UGMOs is a challenging task because of the limited sequence information that will generally be available. PCR-based methods are available to detect and quantify known UGMOs in specific cases. If this approach is not feasible, DNA enrichment of the unknown adjacent sequences of known GMO elements is one way to detect the presence of UGMOs in a food or feed product. These enrichment approaches are also known as chromosome walking or gene walking (GW). In recent years, enrichment approaches have been coupled with next generation sequencing (NGS) analysis and implemented in, amongst others, the medical and microbiological fields. The present review will provide an overview of these approaches and an evaluation of their applicability in the identification of UGMOs in complex food or feed samples.

  14. Research Resource: A Dual Proteomic Approach Identifies Regulated Islet Proteins During β-Cell Mass Expansion In Vivo

    DEFF Research Database (Denmark)

    Horn, Signe; Kirkegaard, Jeannette S.; Hoelper, Soraya

    2016-01-01

    to be upregulated as a response to pregnancy. These included several proteins not previously associated with pregnancy-induced islet expansion, such as CLIC1, STMN1, MCM6, PPIB, NEDD4, and HLTF. Confirming the validity of our approach, we also identified proteins encoded by genes known to be associated

  15. Validity of administrative database code algorithms to identify vascular access placement, surgical revisions, and secondary patency.

    Science.gov (United States)

    Al-Jaishi, Ahmed A; Moist, Louise M; Oliver, Matthew J; Nash, Danielle M; Fleet, Jamie L; Garg, Amit X; Lok, Charmaine E

    2018-03-01

    We assessed the validity of physician billing codes and hospital admission records using International Classification of Diseases, 10th revision codes to identify vascular access placement, secondary patency, and surgical revisions in administrative data. We included adults (≥18 years) with a vascular access placed between 1 April 2004 and 31 March 2013 at the University Health Network, Toronto. Our reference standard was a prospective vascular access database (VASPRO) that contains information on vascular access type and dates of placement, dates for failure, and any revisions. We used VASPRO to assess the validity of different administrative coding algorithms by calculating the sensitivity, specificity, and positive predictive values of vascular access events. The sensitivity (95% confidence interval) of the best performing algorithm to identify arteriovenous access placement was 86% (83%, 89%) and specificity was 92% (89%, 93%). The corresponding numbers to identify catheter insertion were 84% (82%, 86%) and 84% (80%, 87%), respectively. The sensitivity of the best performing coding algorithm to identify arteriovenous access surgical revisions was 81% (67%, 90%) and specificity was 89% (87%, 90%). The algorithms capturing arteriovenous access placement and catheter insertion had positive predictive values greater than 90%, while the arteriovenous access surgical revisions algorithm had a positive predictive value of 20%. The duration of arteriovenous access secondary patency was on average 578 (553, 603) days in VASPRO and 555 (530, 580) days in administrative databases. Administrative data algorithms have fair to good operating characteristics to identify vascular access placement and arteriovenous access secondary patency. The low positive predictive value of the surgical revisions algorithm suggests that administrative data should only be used to rule out the occurrence of an event.
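
    Validation studies like this one reduce to a confusion matrix against the reference standard. A small sketch, computing sensitivity, specificity, and PPV with 95% Wilson confidence intervals via statsmodels, is shown below with invented counts.

```python
# Sketch: sensitivity, specificity, and PPV with 95% Wilson confidence
# intervals for a coding algorithm judged against a reference standard
# (in the spirit of the VASPRO comparison; counts are made up).
from statsmodels.stats.proportion import proportion_confint

tp, fn = 430, 70    # algorithm vs. reference standard
tn, fp = 460, 40

def rate_with_ci(successes, total, label):
    lo, hi = proportion_confint(successes, total, alpha=0.05, method="wilson")
    print(f"{label}: {successes / total:.0%} (95% CI {lo:.0%}-{hi:.0%})")

rate_with_ci(tp, tp + fn, "sensitivity")
rate_with_ci(tn, tn + fp, "specificity")
rate_with_ci(tp, tp + fp, "PPV")
```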

  16. IDENTIFYING DEMENTIA IN ELDERLY POPULATION: A CAMP APPROACH

    OpenAIRE

    Anand P; Chaukimath; Srikanth; Koli

    2015-01-01

    BACKGROUND: Dementia is an emerging medico-social problem affecting the elderly, and poses a challenge to clinicians and caregivers. It is usually identified at a late stage, when management becomes difficult. AIM: The aim of the camp was to identify dementia in the elderly population participating in a screening camp. MATERIAL AND METHODS: The geriatric clinic and department of psychiatry jointly organised a screening camp to detect dementia in the elderly for five days in Sept...

  17. IDENTIFYING DEMENTIA IN ELDERLY POPULATION: A CAMP APPROACH

    Directory of Open Access Journals (Sweden)

    Anand P

    2015-06-01

    Full Text Available BACKGROUND: Dementia is an emerging medico-social problem affecting the elderly, and poses a challenge to clinicians and caregivers. It is usually identified at a late stage, when management becomes difficult. AIM: The aim of the camp was to identify dementia in the elderly population participating in a screening camp. MATERIAL AND METHODS: The geriatric clinic and department of psychiatry jointly organised a screening camp to detect dementia in the elderly for five days in September 2014 to commemorate World Alzheimer's Day. The invitation regarding the camp was sent to all senior citizen forums and also published in a leading Kannada daily newspaper. The Mini-Mental State Examination and Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) criteria were used to identify dementia. RESULTS: More elderly males than females participated in the camp, and dementia was identified in 36% of the elderly with education less than 9th standard. Dementia was found in 18% of our study population. CONCLUSION: The camp helped identify elderly people suffering from dementia and also created awareness about it. Hypertension and diabetes mellitus were common comorbidities in the study population. Our study suggests that organising screening camps will help identify elderly people living with dementia.

  18. Validation of Slosh Modeling Approach Using STAR-CCM+

    Science.gov (United States)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to higher slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right cylinder tank and a right cylinder with a single ring baffle.

  19. Identifying barriers to and facilitators of tuberculosis contact investigation in Kampala, Uganda: a behavioral approach.

    Science.gov (United States)

    Ayakaka, Irene; Ackerman, Sara; Ggita, Joseph M; Kajubi, Phoebe; Dowdy, David; Haberer, Jessica E; Fair, Elizabeth; Hopewell, Philip; Handley, Margaret A; Cattamanchi, Adithya; Katamba, Achilles; Davis, J Lucian

    2017-03-09

    The World Health Organization recommends routine household tuberculosis contact investigation in high-burden countries, but adoption has been limited. We sought to identify barriers to and facilitators of TB contact investigation during its introduction in Kampala, Uganda. We collected cross-sectional qualitative data through focus group discussions and interviews with stakeholders, addressing three core activities of contact investigation: arranging household screening visits through index TB patients, visiting households to screen contacts and refer them to clinics, and evaluating at-risk contacts coming to clinics. We analyzed the data using a validated theory of behavior change, the Capability, Opportunity, and Motivation determine Behavior (COM-B) model, and sought to identify targeted interventions using the related Behavior Change Wheel implementation framework. We led seven focus-group discussions with 61 health-care workers, two with 21 lay health workers (LHWs), and one with four household contacts of newly diagnosed TB patients. In addition, we performed 32 interviews with household contacts from 14 households of newly diagnosed TB patients. Commonly noted barriers included stigma, limited knowledge about TB among contacts, insufficient time and space in clinics for counselling, mistrust of health-center staff among index patients and contacts, and high travel costs for LHWs and contacts. The most important facilitators identified were the personalized and enabling services provided by LHWs. We identified education, persuasion, enablement, modeling of health-positive behaviors, incentivization, and restructuring of the service environment as relevant intervention functions with potential to alleviate barriers to and enhance facilitators of TB contact investigation. The use of a behavioral theory and a validated implementation framework provided a comprehensive approach for systematically identifying barriers to and facilitators of TB contact

  20. Identifying Meaningful Behaviors for Social Competence: A Contextual Approach.

    Science.gov (United States)

    Warnes, Emily D.; Sheridan, Susan M.; Geske, Jenenne; Warnes, William A.

    An exploratory study was conducted which assessed behaviors that characterize social competence in the 2nd and 5th grades. A contextual approach was used to gather information from 2nd and 5th grade children and their parents and teachers regarding the behaviors they perceived to be important for getting along well with peers. Data were gathered…

  1. Relative validity of a food frequency questionnaire to identify dietary patterns in an adult Mexican population.

    Science.gov (United States)

    Denova-Gutiérrez, Edgar; Tucker, Katherine L; Salmerón, Jorge; Flores, Mario; Barquera, Simón

    2016-01-01

    To examine the validity of a semi-quantitative food frequency questionnaire (SFFQ) to identify dietary patterns in an adult Mexican population. A 140-item SFFQ and two 24-hour dietary recalls (24DRs) were administered. Foods were categorized into 29 food groups used to derive dietary patterns via factor analysis. Pearson and intraclass correlation coefficients between dietary pattern scores identified from the SFFQ and 24DRs were assessed. Pattern 1 was high in snacks, fast food, soft drinks, processed meats and refined grains; pattern 2 was high in fresh vegetables, fresh fruits, and dairy products; and pattern 3 was high in legumes, eggs, sweetened foods and sugars. Pearson correlation coefficients between the SFFQ and the 24DRs for these patterns were 0.66 (P<0.001), 0.41 (P<0.001) and 0.29 (P=0.193), respectively. Our data indicate reasonable validity of the SFFQ, using factor analysis, to derive major dietary patterns in comparison with two 24DRs.
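
    The analysis pipeline, factor analysis on food-group intakes followed by correlation of the pattern scores from the two instruments, can be sketched as follows. Simulated intakes stand in for the SFFQ and 24DR data, and the use of scikit-learn's FactorAnalysis (rather than the authors' statistical package) is an assumption.

```python
# Sketch: derive dietary patterns by factor analysis of food-group
# intakes, then correlate SFFQ pattern scores with 24-hour-recall
# scores. Simulated intakes; 29 food groups and 3 patterns follow the
# study design.
import numpy as np
from scipy.stats import pearsonr
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(6)
n_subjects, n_groups = 300, 29
sffq = rng.gamma(2.0, 1.0, size=(n_subjects, n_groups))
recall = sffq + rng.normal(scale=1.0, size=sffq.shape)   # noisier 24DRs

fa = FactorAnalysis(n_components=3, random_state=0)
sffq_scores = fa.fit_transform(sffq)
recall_scores = fa.transform(recall)   # score recalls on the same loadings

for k in range(3):
    r, p = pearsonr(sffq_scores[:, k], recall_scores[:, k])
    print(f"pattern {k + 1}: Pearson r = {r:.2f} (P = {p:.3f})")
```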

  2. Relative validity of a food frequency questionnaire to identify dietary patterns in an adult Mexican population

    Directory of Open Access Journals (Sweden)

    Edgar Denova-Gutiérrez

    2016-12-01

    Full Text Available Objective. To examine the validity of a semi-quantitative food frequency questionnaire (SFFQ) to identify dietary patterns in an adult Mexican population. Materials and methods. A 140-item SFFQ and two 24-hour dietary recalls (24DRs) were administered. Foods were categorized into 29 food groups used to derive dietary patterns via factor analysis. Pearson and intraclass correlation coefficients between dietary pattern scores identified from the SFFQ and 24DRs were assessed. Results. Pattern 1 was high in snacks, fast food, soft drinks, processed meats and refined grains; pattern 2 was high in fresh vegetables, fresh fruits, and dairy products; and pattern 3 was high in legumes, eggs, sweetened foods and sugars. Pearson correlation coefficients between the SFFQ and the 24DRs for these patterns were 0.66 (P<0.001), 0.41 (P<0.001) and 0.29 (P=0.193), respectively. Conclusions. Our data indicate reasonable validity of the SFFQ, using factor analysis, to derive major dietary patterns in comparison with two 24DRs.

  3. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  4. Validation of search filters for identifying pediatric studies in PubMed.

    Science.gov (United States)

    Leclercq, Edith; Leeflang, Mariska M G; van Dalen, Elvira C; Kremer, Leontien C M

    2013-03-01

    To identify and validate PubMed search filters for retrieving studies including children and to develop a new pediatric search filter for PubMed. We developed 2 different datasets of studies to evaluate the performance of the identified pediatric search filters, expressed in terms of sensitivity, precision, specificity, accuracy, and number needed to read (NNR). An optimal search filter will have a high sensitivity and high precision with a low NNR. In addition to the PubMed Limits: All Child: 0-18 years filter (in May 2012 renamed to PubMed Filter Child: 0-18 years), 6 search filters for identifying studies including children were identified: 3 developed by Kastner et al, 1 by BestBets, 1 by the Child Health Field, and 1 by the Cochrane Childhood Cancer Group. Three search filters (Cochrane Childhood Cancer Group, Child Health Field, and BestBets) had the highest sensitivity (99.3%, 99.5%, and 99.3%, respectively) but a lower precision (64.5%, 68.4%, and 66.6%, respectively) compared with the other search filters. Two Kastner search filters had a high precision (93.0% and 93.7%, respectively) but a low sensitivity (58.5% and 44.8%, respectively); they failed to identify many pediatric studies in our datasets. The search terms responsible for false-positive results in the reference dataset were determined. With these data, we developed a new search filter for identifying studies with children in PubMed with an optimal sensitivity (99.5%) and precision (69.0%). Search filters to identify studies including children either have a low sensitivity or a low precision with a high NNR. A new pediatric search filter with a high sensitivity and a low NNR has been developed. Copyright © 2013 Mosby, Inc. All rights reserved.
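
    The three headline metrics are simple functions of retrieval counts; in particular, the number needed to read is the reciprocal of precision. A tiny sketch with invented counts:

```python
# Sketch: search-filter performance on a labelled reference set.
# NNR (number needed to read) = 1 / precision. Counts are invented.
relevant_retrieved = 995      # pediatric studies the filter found
relevant_total = 1000         # pediatric studies in the reference set
retrieved = 1442              # everything the filter returned

sensitivity = relevant_retrieved / relevant_total
precision = relevant_retrieved / retrieved
nnr = 1 / precision
print(f"sensitivity={sensitivity:.1%} precision={precision:.1%} NNR={nnr:.2f}")
```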

  5. Tiered High-Throughput Screening Approach to Identify ...

    Science.gov (United States)

    High-throughput screening (HTS) for potential thyroid-disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprised of 1,074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single-concentration screen were retested in concentration-response. Due to the high false-positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using

  6. A novel scientific approach in identifying talents among female ...

    African Journals Online (AJOL)

    This study determined the most significant physical fitness and anthro-energy intake components for identifying talent among female adolescent field hockey players. 45 players from the Terengganu sports academy were assessed on physical fitness and anthro-energy intake measurements. The first rotated PCAs presented 8 ...

  7. Identifying the "Truly Disadvantaged": A Comprehensive Biosocial Approach

    Science.gov (United States)

    Barnes, J. C.; Beaver, Kevin M.; Connolly, Eric J.; Schwartz, Joseph A.

    2016-01-01

    There has been significant interest in examining the developmental factors that predispose individuals to chronic criminal offending. This body of research has identified some social-environmental risk factors as potentially important. At the same time, the research producing these results has generally failed to employ genetically sensitive…

  8. Identifying the Determinants of Chronic Absenteeism: A Bioecological Systems Approach

    Science.gov (United States)

    Gottfried, Michael A.; Gee, Kevin A.

    2017-01-01

    Background/Context: Chronic school absenteeism is a pervasive problem across the US; in early education, it is most rampant in kindergarten and its consequences are particularly detrimental, often leading to poorer academic, behavioral and developmental outcomes later in life. Though prior empirical research has identified a broad range of…

  9. Photostimulated luminescence (PSL): A new approach to identifying irradiated foods

    International Nuclear Information System (INIS)

    Sanderson, D.C.W.

    1991-01-01

    PSL, and particularly Anti-Stokes luminescence is a highly specific indicator of energy storage in systems which have been exposed to ionising radiation. The preliminary work illustrated here demonstrates the radiation response of food analogues and the manner in which the phenomenon complements existing tests for irradiated herbs and spices. There appears to be considerable potential for further extension of this approach to a wider range of foods and food components. 5 figs

  10. A Lexical Approach to Identifying Dimensions of Organizational Culture

    Science.gov (United States)

    Chapman, Derek S.; Reeves, Paige; Chapin, Michelle

    2018-01-01

    A comprehensive measure of organizational culture was developed using a lexical approach, a method typically employed within the study of personality. 1761 adjectives were narrowed down and factor analyzed, resulting in the identification of a nine-factor solution to organizational culture, including the dimensions of: Innovative, Dominant, Pace, Friendly, Prestigious, Trendy, Corporate Social Responsibility, Traditional, and Diverse. Composed of the 135 adjectives most frequently used in describing organizational culture by current employees of several hundred organizations, the Lexical Organizational Culture Scale (LOCS) was found to predict employee commitment, job satisfaction, job search behaviors, and subjective fit better than earlier scales of organizational culture. PMID:29922200

  11. The validity of register data to identify children with atopic dermatitis, asthma or allergic rhinoconjunctivitis.

    Science.gov (United States)

    Stensballe, Lone Graff; Klansø, Lotte; Jensen, Andreas; Haerskjold, Ann; Thomsen, Simon Francis; Simonsen, Jacob

    2017-09-01

    The incidence of atopic dermatitis, wheezing, asthma and allergic rhinoconjunctivitis has been increasing. Register-based studies are essential for research in subpopulations with specific diseases and facilitate epidemiological studies to identify causes and evaluate interventions. Algorithms have been developed to identify children with atopic dermatitis, asthma or allergic rhinoconjunctivitis using register information on disease-specific dispensed prescribed medication and hospital contacts, but the validity of the algorithms has not been evaluated. This study validated the algorithms against gold-standard in-depth telephone interviews with the caretaker about physician-diagnosed atopic dermatitis, wheezing, asthma or allergic rhinoconjunctivitis in the child. The algorithms defined each of the three atopic diseases using register-based information on disease-specific hospital contacts and/or filled prescriptions of disease-specific medication. Confirmative answers to questions about physician-diagnosed atopic disease were used as the gold standard for the comparison with the algorithms, resulting in sensitivities and specificities with 95% confidence intervals. The interviews with the caretakers of the 454 included Danish children born 1997-2003 were carried out May-September 2015; the mean age of the children at the time of the interview was 15.2 years (standard deviation 1.3 years). For the algorithm capturing children with atopic dermatitis, the sensitivity was 74.1% (95% confidence interval: 66.9%-80.2%) and the specificity 73.0% (67.3%-78.0%). For the algorithm capturing children with asthma, both the sensitivity of 84.1% (78.0%-88.8%) and the specificity of 81.6% (76.5%-85.8%) were high compared with physician-diagnosed asthmatic bronchitis (recurrent wheezing). The sensitivity remained high when capturing physician-diagnosed asthma: 83.3% (74.3%-89.6%); however, the specificity declined to 66.0% (60.9%-70.8%). For allergic rhinoconjunctivitis, the sensitivity

  12. MicroRNA expression profiling to identify and validate reference genes for relative quantification in colorectal cancer.

    LENUS (Irish Health Repository)

    Chang, Kah Hoong

    2010-01-01

    BACKGROUND: Advances in high-throughput technologies and bioinformatics have transformed gene expression profiling methodologies. The results of microarray experiments are often validated using reverse transcription quantitative PCR (RT-qPCR), which is the most sensitive and reproducible method to quantify gene expression. Appropriate normalisation of RT-qPCR data using stably expressed reference genes is critical to ensure accurate and reliable results. Mi(cro)RNA expression profiles have been shown to be more accurate in disease classification than mRNA expression profiles. However, few reports detailed a robust identification and validation strategy for suitable reference genes for normalisation in miRNA RT-qPCR studies. METHODS: We adopt and report a systematic approach to identify the most stable reference genes for miRNA expression studies by RT-qPCR in colorectal cancer (CRC). High-throughput miRNA profiling was performed on ten pairs of CRC and normal tissues. By using the mean expression value of all expressed miRNAs, we identified the most stable candidate reference genes for subsequent validation. As such the stability of a panel of miRNAs was examined on 35 tumour and 39 normal tissues. The effects of normalisers on the relative quantity of established oncogenic (miR-21 and miR-31) and tumour suppressor (miR-143 and miR-145) target miRNAs were assessed. RESULTS: In the array experiment, miR-26a, miR-345, miR-425 and miR-454 were identified as having expression profiles closest to the global mean. From a panel of six miRNAs (let-7a, miR-16, miR-26a, miR-345, miR-425 and miR-454) and two small nucleolar RNA genes (RNU48 and Z30), miR-16 and miR-345 were identified as the most stably expressed reference genes. The combined use of miR-16 and miR-345 to normalise expression data enabled detection of a significant dysregulation of all four target miRNAs between tumour and normal colorectal tissue. CONCLUSIONS: Our study demonstrates that the top six most

  13. MicroRNA expression profiling to identify and validate reference genes for relative quantification in colorectal cancer

    LENUS (Irish Health Repository)

    Chang, Kah Hoong

    2010-04-29

    Abstract Background Advances in high-throughput technologies and bioinformatics have transformed gene expression profiling methodologies. The results of microarray experiments are often validated using reverse transcription quantitative PCR (RT-qPCR), which is the most sensitive and reproducible method to quantify gene expression. Appropriate normalisation of RT-qPCR data using stably expressed reference genes is critical to ensure accurate and reliable results. Mi(cro)RNA expression profiles have been shown to be more accurate in disease classification than mRNA expression profiles. However, few reports detailed a robust identification and validation strategy for suitable reference genes for normalisation in miRNA RT-qPCR studies. Methods We adopt and report a systematic approach to identify the most stable reference genes for miRNA expression studies by RT-qPCR in colorectal cancer (CRC). High-throughput miRNA profiling was performed on ten pairs of CRC and normal tissues. By using the mean expression value of all expressed miRNAs, we identified the most stable candidate reference genes for subsequent validation. As such the stability of a panel of miRNAs was examined on 35 tumour and 39 normal tissues. The effects of normalisers on the relative quantity of established oncogenic (miR-21 and miR-31) and tumour suppressor (miR-143 and miR-145) target miRNAs were assessed. Results In the array experiment, miR-26a, miR-345, miR-425 and miR-454 were identified as having expression profiles closest to the global mean. From a panel of six miRNAs (let-7a, miR-16, miR-26a, miR-345, miR-425 and miR-454) and two small nucleolar RNA genes (RNU48 and Z30), miR-16 and miR-345 were identified as the most stably expressed reference genes. The combined use of miR-16 and miR-345 to normalise expression data enabled detection of a significant dysregulation of all four target miRNAs between tumour and normal colorectal tissue. Conclusions Our study demonstrates that the top six most
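
    The mean-expression-value step described in both records can be sketched in a few lines: compute the per-sample global mean of all expressed miRNAs and rank candidates by how tightly they track it. The Cq values below are simulated, and the SD-of-difference stability measure is a simplifying assumption.

```python
# Sketch: rank candidate reference miRNAs by how closely their Cq
# profile tracks the per-sample global mean of all expressed miRNAs
# (small SD of the difference = stable). Simulated Cq values.
import numpy as np

rng = np.random.default_rng(7)
n_mirnas, n_samples = 200, 74            # e.g. 35 tumour + 39 normal
mirnas = [f"miR-{i}" for i in range(1, n_mirnas + 1)]
cq = rng.normal(25, 2, size=(n_mirnas, n_samples))
cq += rng.normal(scale=0.3, size=(n_mirnas, 1))   # per-miRNA offsets

global_mean = cq.mean(axis=0)                     # per-sample mean expression
stability = (cq - global_mean).std(axis=1)        # low = tracks the mean
ranked = sorted(zip(stability, mirnas))
print("most stable candidates:", [m for _, m in ranked[:6]])
```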

  14. Multimodal approach to identifying malingered posttraumatic stress disorder: a review.

    Science.gov (United States)

    Ali, Shahid; Jabeen, Shagufta; Alam, Farzana

    2015-01-01

    The primary aim of this article is to aid clinicians in differentiating true posttraumatic stress disorder from malingered posttraumatic stress disorder. Posttraumatic stress disorder and malingering are defined, and prevalence rates are explored. Similarities and differences in diagnostic criteria between the fourth and fifth editions of the Diagnostic and Statistical Manual of Mental Disorders are described for posttraumatic stress disorder. Possible motivations for malingering posttraumatic stress disorder are discussed, and common characteristics of malingered posttraumatic stress disorder are described. A multimodal approach is described for evaluating posttraumatic stress disorder, including interview techniques, collection of collateral data, and psychometric and physiologic testing, that should allow clinicians to distinguish between those patients who are truly suffering from posttraumatic disorder and those who are malingering the illness.

  15. Identifying Pathogenicity Islands in Bacterial Pathogenomics Using Computational Approaches

    Directory of Open Access Journals (Sweden)

    Dongsheng Che

    2014-01-01

    Full Text Available High-throughput sequencing technologies have made it possible to study bacteria through analyzing their genome sequences. For instance, comparative genome sequence analyses can reveal phenomena such as gene loss, gene gain, or gene exchange in a genome. By analyzing pathogenic bacterial genomes, we can discover that pathogenic genomic regions in many pathogenic bacteria are horizontally transferred from other bacteria; these regions are also known as pathogenicity islands (PAIs). PAIs have some detectable properties, such as having different genomic signatures than the rest of the host genome and containing mobility genes so that they can be integrated into the host genome. In this review, we discuss various pathogenicity island-associated features and current computational approaches for the identification of PAIs. Existing pathogenicity island databases and related computational resources are also discussed, so that researchers may find them useful for studies of bacterial evolution and pathogenicity mechanisms.
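
    One of the simplest detectable properties mentioned above, an atypical genomic signature, can be illustrated with a sliding-window GC-content scan. The random sequence, window size, and z-score threshold are all illustrative; real PAI finders combine many such signals.

```python
# Sketch: flag genomic windows whose GC content deviates from the
# genome background, one compositional signature used in PAI detection.
# Random sequence; window size and z-threshold are illustrative.
import numpy as np

rng = np.random.default_rng(8)
genome = "".join(rng.choice(list("ACGT"), size=100_000))
win = 5000

gc = np.array([
    (genome[i:i + win].count("G") + genome[i:i + win].count("C")) / win
    for i in range(0, len(genome) - win + 1, win)
])
z = (gc - gc.mean()) / gc.std()
for i, score in enumerate(z):
    if abs(score) > 2:   # atypical composition: candidate island
        print(f"window {i * win}-{i * win + win}: GC={gc[i]:.3f} (z={score:+.1f})")
```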

  16. Omics Approach to Identify Factors Involved in Brassica Disease Resistance.

    Science.gov (United States)

    Francisco, Marta; Soengas, Pilar; Velasco, Pablo; Bhadauria, Vijai; Cartea, Maria E; Rodríguez, Victor M

    2016-01-01

    Understanding plants' defense mechanisms and their responses to biotic stresses is of fundamental importance for the development of resistant crop varieties and more productive agriculture. The Brassica genus involves a large variety of economically important species and cultivars used as vegetables, oilseeds, forage and ornamentals. Damage caused by pathogen attack negatively affects various aspects of plant growth, development, and crop productivity. Over the last few decades, advances in plant physiology, genetics, and molecular biology have greatly improved our understanding of plant responses to biotic stress conditions. In this regard, various 'omics' technologies enable qualitative and quantitative monitoring of the abundance of various biological molecules in a high-throughput manner, and thus allow determination of their variation between different biological states on a genomic scale. In this review, we describe advances in 'omic' tools (genomics, transcriptomics, proteomics and metabolomics) in view of the conventional and modern approaches being used to elucidate the molecular mechanisms that underlie Brassica disease resistance.

  17. Identifying Human Phenotype Terms by Combining Machine Learning and Validation Rules

    Directory of Open Access Journals (Sweden)

    Manuel Lobo

    2017-01-01

    Full Text Available Named-Entity Recognition is commonly used to identify biological entities such as proteins, genes, and chemical compounds found in scientific articles. The Human Phenotype Ontology (HPO) is an ontology that provides a standardized vocabulary for phenotypic abnormalities found in human diseases. This article presents the Identifying Human Phenotypes (IHP) system, tuned to recognize HPO entities in unstructured text. IHP uses Stanford CoreNLP for text processing and applies Conditional Random Fields trained with a rich feature set, which includes linguistic, orthographic, morphologic, lexical, and context features created for the machine learning-based classifier. However, the main novelty of IHP is its validation step, based on a set of carefully crafted manual rules such as negative connotation analysis, which combined with a dictionary can filter incorrectly identified entities, find missed entities, and combine adjacent entities. The performance of IHP was evaluated using the recently published HPO Gold Standardized Corpora (GSC), on which the Bio-LarK CR system had obtained the best F-measure of 0.56. IHP achieved an F-measure of 0.65 on the GSC. Due to inconsistencies found in the GSC, an extended version of the GSC was created, adding 881 entities and modifying 4 entities. IHP achieved an F-measure of 0.863 on the new GSC.

  18. Medical chart validation of an algorithm for identifying multiple sclerosis relapse in healthcare claims.

    Science.gov (United States)

    Chastek, Benjamin J; Oleen-Burkey, Merrikay; Lopez-Bresnahan, Maria V

    2010-01-01

    Relapse is a common measure of disease activity in relapsing-remitting multiple sclerosis (MS). The objective of this study was to test the content validity of an operational algorithm for detecting relapse in claims data. A claims-based relapse detection algorithm was tested by comparing its detection rate over a 1-year period with relapses identified based on medical chart review. According to the algorithm, MS patients in a US healthcare claims database who had either (1) a primary claim for MS during hospitalization or (2) a corticosteroid claim following an MS-related outpatient visit were designated as having a relapse. Patient charts were examined for explicit indication of relapse or care suggestive of relapse. Positive and negative predictive values were calculated. Medical charts were reviewed for 300 MS patients, half of whom had a relapse according to the algorithm. The claims-based criteria correctly classified 67.3% of patients with relapses (positive predictive value) and 70.0% of patients without relapses (negative predictive value; kappa 0.373), supporting the content validity of the operational algorithm. Limitations of the algorithm include lack of differentiation between relapsing-remitting MS and other types, and that it does not incorporate measures of function and disability. The claims-based algorithm appeared to successfully detect moderate-to-severe MS relapse. This validated definition can be applied to future claims-based MS studies.

  19. An information-theoretic approach to assess practical identifiability of parametric dynamical systems.

    Science.gov (United States)

    Pant, Sanjay; Lombardi, Damiano

    2015-10-01

    A new approach for assessing parameter identifiability of dynamical systems in a Bayesian setting is presented. The concept of Shannon entropy is employed to measure the inherent uncertainty in the parameters. The expected reduction in this uncertainty is seen as the amount of information one expects to gain about the parameters due to the availability of noisy measurements of the dynamical system. Such expected information gain is interpreted in terms of the variance of a hypothetical measurement device that can measure the parameters directly, and is related to practical identifiability of the parameters. If the individual parameters are unidentifiable, correlation between parameter combinations is assessed through conditional mutual information to determine which sets of parameters can be identified together. The information theoretic quantities of entropy and information are evaluated numerically through a combination of Monte Carlo and k-nearest neighbour methods in a non-parametric fashion. Unlike many methods to evaluate identifiability proposed in the literature, the proposed approach takes the measurement-noise into account and is not restricted to any particular noise-structure. Whilst computationally intensive for large dynamical systems, it is easily parallelisable and is non-intrusive as it does not necessitate re-writing of the numerical solvers of the dynamical system. The application of such an approach is presented for a variety of dynamical systems--ranging from systems governed by ordinary differential equations to partial differential equations--and, where possible, validated against results previously published in the literature. Copyright © 2015 Elsevier Inc. All rights reserved.
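
    The non-parametric entropy estimation at the heart of this approach can be illustrated with the Kozachenko-Leonenko k-nearest-neighbour estimator (a standard choice, though the paper's exact estimator may differ). The sketch below checks it against the closed-form entropy of a standard normal.

```python
# Sketch: Kozachenko-Leonenko k-nearest-neighbour estimate of Shannon
# (differential) entropy, the kind of non-parametric estimator such
# identifiability analyses rely on. Checked against the closed form
# for a standard normal: H = 0.5 * log(2 * pi * e) per dimension.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=4):
    n, d = samples.shape
    dist, _ = cKDTree(samples).query(samples, k=k + 1)  # column 0 is self
    eps = dist[:, k]                                    # k-th NN distance
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # unit-ball volume
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

rng = np.random.default_rng(9)
x = rng.normal(size=(5000, 1))
print(f"estimated H = {knn_entropy(x):.3f}, "
      f"exact H = {0.5 * np.log(2 * np.pi * np.e):.3f}")
```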

  20. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    Science.gov (United States)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises: how do we identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses? A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. In this respect, working at
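
    Steps of this kind, cluster the standardized indicators and then check the robustness of the grouping before interpreting clusters as patterns, can be sketched with scikit-learn. The indicators are simulated, and the silhouette score stands in for the fuller robustness and validation steps described above.

```python
# Sketch: cluster-based pattern recognition. Standardize vulnerability
# indicators, cluster, and use silhouette scores to gauge grouping
# robustness before reading clusters as "vulnerability patterns".
# Indicator data are simulated.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(10)
# e.g. aridity, soil degradation, poverty, market access for 500 regions
indicators = np.vstack([
    rng.normal(loc=c, scale=0.7, size=(250, 4))
    for c in ([0, 0, 0, 0], [2, 2, -1, 1])
])
X = StandardScaler().fit_transform(indicators)

for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette={silhouette_score(X, labels):.3f}")
```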

  1. Human Development in Romania: A Comparative Approach to Identifying Shortcomings

    Directory of Open Access Journals (Sweden)

    Robert STEFAN

    2017-12-01

    Full Text Available Following the research carried out by the economist Mahbub ul Haq, derived from the studies of Amartya Sen on human capabilities, in 1990 the United Nations Development Programme (UNDP) published its first Human Development Report. It introduced the notion that the development of a country is not merely equal to economic growth, but has the ultimate purpose of enriching human life by expanding people's choices. Thus, Human Development seeks to reveal the fundamental role of human life: that of reaching its full potential. Even 28 years after the fall of communism, the political environment in Romania continues to be unsupportive of proper development. This study seeks to identify the shortcomings of the primary dimensions of Human Development in Romania and to make a firm call to action.

  2. Computational approaches to identify functional genetic variants in cancer genomes

    DEFF Research Database (Denmark)

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris

    2013-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor but only a minority of these drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype.

  3. How well do discharge diagnoses identify hospitalised patients with community-acquired infections? - a validation study

    DEFF Research Database (Denmark)

    Henriksen, Daniel Pilsgaard; Nielsen, Stig Lønberg; Laursen, Christian Borbjerg

    2014-01-01

    BACKGROUND: Credible measures of disease incidence, trends and mortality can be obtained through surveillance using manual chart review, but this is both time-consuming and expensive. ICD-10 discharge diagnoses are used as surrogate markers of infection, but knowledge on the validity of infections in general is sparse. The aim of the study was to determine how well ICD-10 discharge diagnoses identify patients with community-acquired infections in a medical emergency department (ED), overall and related to sites of infection and patient characteristics. METHODS: We manually reviewed 5977 patients. RESULTS: The sensitivity of the ICD-10 diagnoses was 79.9% (95%CI: 78.1-81.3%), specificity 83.9% (95%CI: 82.6-85.1%), positive likelihood ratio 4.95 (95%CI: 4.58-5.36) and negative likelihood ratio 0.24 (95%CI: 0.22-0.26). The two most common sites of infection, the lower respiratory tract and urinary tract, had positive likelihood

  4. An Integrative data mining approach to identifying Adverse ...

    Science.gov (United States)

    The Adverse Outcome Pathway (AOP) framework is a tool for making biological connections and summarizing key information across different levels of biological organization to connect biological perturbations at the molecular level to adverse outcomes for an individual or population. Computational approaches to explore and determine these connections can accelerate the assembly of AOPs. By leveraging the wealth of publicly available data covering chemical effects on biological systems, computationally-predicted AOPs (cpAOPs) were assembled via data mining of high-throughput screening (HTS) in vitro data, in vivo data and other disease phenotype information. Frequent Itemset Mining (FIM) was used to find associations between the gene targets of ToxCast HTS assays and disease data from the Comparative Toxicogenomics Database (CTD), using the chemicals as the common aggregators between datasets. The method was also used to map gene expression data to disease data from CTD. A cpAOP network was defined by considering genes and diseases as nodes and FIM associations as edges. This network contained 18,283 gene-to-disease associations for the ToxCast data and 110,253 for CTD gene expression. Two case studies show the value of the cpAOP network by extracting subnetworks focused either on fatty liver disease or the Aryl Hydrocarbon Receptor (AHR). The subnetwork surrounding fatty liver disease included many genes known to play a role in this disease. When querying the cpAOP
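
    The FIM step can be sketched with the mlxtend package (an assumption; the record does not name an implementation): chemicals are transactions, and gene targets and diseases are items, so frequent gene-disease itemsets become candidate cpAOP edges.

```python
# Sketch: Frequent Itemset Mining with chemicals as transactions and
# gene/disease annotations as items, a stand-in for the ToxCast/CTD
# aggregation. Assumes the mlxtend package; data are invented.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# One row per chemical; columns are gene targets and disease annotations.
transactions = pd.DataFrame(
    [[1, 1, 1, 0], [1, 1, 1, 0], [1, 0, 1, 1], [0, 1, 0, 1], [1, 1, 1, 0]],
    columns=["AHR", "PPARG", "fatty_liver", "cardiotoxicity"],
    dtype=bool,
)

itemsets = apriori(transactions, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
# Each gene -> disease rule is a candidate edge in the cpAOP network.
print(rules[["antecedents", "consequents", "support", "confidence"]])
```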

  5. Validation of two case definitions to identify pressure ulcers using hospital administrative data.

    Science.gov (United States)

    Ho, Chester; Jiang, Jason; Eastwood, Cathy A; Wong, Holly; Weaver, Brittany; Quan, Hude

    2017-08-28

    Pressure ulcer development is a quality of care indicator, as pressure ulcers are potentially preventable. Yet pressure ulcers are a leading cause of morbidity, discomfort and additional healthcare costs for inpatients. Methods are lacking for accurate surveillance of pressure ulcers in hospitals to track occurrences and evaluate care improvement strategies. The main study aim was to validate the hospital discharge abstract database (DAD) in recording pressure ulcers against nursing consult reports, and to calculate the prevalence of pressure ulcers in Alberta, Canada in the DAD. We hypothesised that a more inclusive case definition for pressure ulcers would enhance the validity of cases identified in administrative data for research and quality improvement purposes. A cohort of patients with pressure ulcers was identified from enterostomal (ET) nursing consult documents at a large university hospital in 2011. There were 1217 patients with pressure ulcers in ET nursing documentation who were linked to a corresponding record in the DAD to validate correct and accurate identification of pressure ulcer occurrence, using two case definitions for pressure ulcer. Using pressure ulcer definition 1 (7 codes), prevalence was 1.4%, and using definition 2 (29 codes), prevalence was 4.2% after adjusting for misclassifications. The results were lower than expected. Definition 1 sensitivity was 27.7% and specificity was 98.8%, while definition 2 sensitivity was 32.8% and specificity was 95.9%. Pressure ulcer occurrence in both the DAD and ET consultation increased with age, number of comorbidities and length of stay. The DAD underestimates pressure ulcer prevalence. Since various codes are used to record pressure ulcers in the DAD, the case definition with more codes captures more pressure ulcer cases, and may be useful for monitoring facility trends. However, low sensitivity suggests that this data source may not be accurate for determining overall prevalence, and should be cautiously compared with other...
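
    The validity metrics reported above follow from a standard 2x2 comparison against the reference standard. A short Python sketch with invented counts (only the formulas reflect the study's definitions):

        # Sensitivity, specificity, PPV and prevalence from a 2x2 validation table.
        # Counts below are invented for illustration, not the study's data.
        def validity(tp, fp, fn, tn):
            total = tp + fp + fn + tn
            return {"sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp),
                    "ppv": tp / (tp + fp),
                    "prevalence": (tp + fn) / total}

        print(validity(tp=400, fp=120, fn=817, tn=2800))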

  6. Newer Approaches to Identify Potential Untoward Effects in Functional Foods.

    Science.gov (United States)

    Marone, Palma Ann; Birkenbach, Victoria L; Hayes, A Wallace

    2016-01-01

    Globalization has greatly accelerated the numbers and variety of food and beverage products available worldwide. The exchange among greater numbers of countries, manufacturers, and products in the United States and worldwide has necessitated enhanced quality measures for nutritional products for larger populations increasingly reliant on functionality. These functional foods, those that provide benefit beyond basic nutrition, are increasingly being used for their potential to alleviate food insufficiency while enhancing quality and longevity of life. In the United States alone, a steady import increase of greater than 15% per year or 24 million shipments, over 70% products of which are food related, is regulated under the Food and Drug Administration (FDA). This unparalleled growth has resulted in the need for faster, cheaper, and better safety and efficacy screening methods in the form of harmonized guidelines and recommendations for product standardization. In an effort to meet this need, the in vitro toxicology testing market has similarly grown with an anticipatory 15% increase between 2010 and 2015 of US$1.3 to US$2.7 billion. Although traditionally occupying a small fraction of the market behind pharmaceuticals and cosmetic/household products, the scope of functional food testing, including additives/supplements, ingredients, residues, contact/processing, and contaminants, is potentially expansive. Similarly, as functional food testing has progressed, so has the need to identify potential adverse factors that threaten the safety and quality of these products. © The Author(s) 2015.

  7. A consequence index approach to identifying radiological sabotage targets

    International Nuclear Information System (INIS)

    Altman, W.D.; Hockert, J.W.

    1988-01-01

    One of the threats of concern to facilities using significant quantities of radioactive material is radiological sabotage. Both the Department of Energy (DOE) and the U.S. Nuclear Regulatory Commission have issued guidance to facilities for radiological sabotage protection. At those facilities where the inventories of radioactive materials change frequently, there is an operational need for a technically defensible method of determining whether or not the inventory of radioactive material at a given facility poses a potential radiological sabotage risk. In order to determine quickly whether a building is a potential radiological sabotage target, Lawrence Livermore National Laboratory (LLNL) has developed a radiological sabotage consequence index that provides a conservative estimate of the maximum potential off-site consequences of a radiological sabotage attempt involving the facility. This radiological sabotage consequence index can be used by safeguards and security staff to rapidly determine whether a change in building operations poses a potential radiological sabotage risk. In those cases where such a potential risk is identified, a more detailed radiological sabotage vulnerability analysis can be performed.
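
    The record does not give the index formula; one plausible form (an assumption for illustration only, not LLNL's actual index) is a sum of building inventories over sabotage screening thresholds, with an index of 1 or more triggering the detailed vulnerability analysis:

        # Hypothetical consequence-index screening; all quantities are invented.
        def consequence_index(inventory_ci, threshold_ci):
            # Ratio sum: each isotope's on-hand activity over its screening threshold.
            return sum(inventory_ci[n] / threshold_ci[n] for n in inventory_ci)

        inventory = {"Cs-137": 120.0, "Co-60": 40.0}     # curies on hand
        threshold = {"Cs-137": 1000.0, "Co-60": 300.0}   # screening thresholds
        ci = consequence_index(inventory, threshold)
        print(ci, "-> detailed analysis needed" if ci >= 1 else "-> screened out")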

  8. Bioinformatics approaches for identifying new therapeutic bioactive peptides in food

    Directory of Open Access Journals (Sweden)

    Nora Khaldi

    2012-10-01

    Full Text Available ABSTRACT: The traditional methods for mining foods for bioactive peptides are tedious and long. As in the drug industry, the time needed to identify and deliver a commercial health ingredient that reduces disease symptoms can be anywhere between 5 and 10 years. Reducing this time and effort is crucial in order to create new commercially viable products with clear and important health benefits. In the past few years, bioinformatics, the science that brings together fast computational biology and efficient genome mining, has emerged as the long-awaited solution to this problem. By quickly mining food genomes for characteristics of certain food therapeutic ingredients, researchers can potentially find new ones in a matter of a few weeks. Yet, surprisingly, very little success has been achieved so far using bioinformatics in mining for food bioactives. The absence of food-specific bioinformatic mining tools, the slow integration of experimental mining and bioinformatics, and the important differences between experimental platforms are some of the reasons for the slow progress of bioinformatics in the field of functional food and, more specifically, in bioactive peptide discovery. In this paper I discuss some methods that could be easily translated, using a rational peptide bioinformatics design, to food bioactive peptide mining. I highlight the need for an integrated food peptide database. I also discuss how to better integrate experimental work with bioinformatics in order to improve the mining of food for bioactive peptides, thereby achieving higher success rates.

  9. Identifying problematic concepts in SNOMED CT using a lexical approach.

    Science.gov (United States)

    Agrawal, Ankur; Perl, Yehoshua; Elhanan, Gai

    2013-01-01

    SNOMED CT (SCT) has been endorsed as a premier clinical terminology by many organizations with a perceived use within electronic health records and clinical information systems. However, there are indications that, at the moment, SCT is not optimally structured for its intended use by healthcare practitioners. A study is conducted to investigate the extent of inconsistencies among the concepts in SCT. A group auditing technique to improve the quality of SCT is introduced that can help identify problematic concepts with high probability. Positional similarity sets are defined: groups of concepts whose fully specified names are lexically similar and differ in exactly one corresponding word position. A manual audit of a sample of such sets found 38% of the sets exhibiting one or more inconsistent concepts. Group auditing techniques such as this can thus be very helpful in assuring the quality of SCT, which will help expedite its adoption as a reference terminology for clinical purposes.
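
    A minimal Python sketch of building positional similarity sets, using toy concept names rather than actual SNOMED CT content: names sharing every word except one at the same position fall into the same set.

        # Group fully specified names that differ in exactly one word position.
        from collections import defaultdict

        names = ["fracture of left femur", "fracture of right femur",
                 "fracture of left tibia", "neoplasm of skin"]

        groups = defaultdict(list)
        for name in names:
            words = name.split()
            for i in range(len(words)):
                # Key = the name with position i wildcarded; names sharing a key
                # differ only at that word position.
                key = (i, tuple(words[:i]), tuple(words[i + 1:]))
                groups[key].append(name)

        sets = [g for g in groups.values() if len(g) > 1]
        print(sets)  # [['fracture of left femur', 'fracture of right femur'], ...]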

  10. A landscape ecology approach identifies important drivers of urban biodiversity.

    Science.gov (United States)

    Turrini, Tabea; Knop, Eva

    2015-04-01

    Cities are growing rapidly worldwide, yet a mechanistic understanding of the impact of urbanization on biodiversity is lacking. We assessed the impact of urbanization on arthropod diversity (species richness and evenness) and abundance in a study of six cities and nearby intensively managed agricultural areas. Within the urban ecosystem, we disentangled the relative importance of two key landscape factors affecting biodiversity, namely the amount of vegetated area and patch isolation. To do so, we a priori selected sites that independently varied in the amount of vegetated area in the surrounding landscape at the 500-m scale and patch isolation at the 100-m scale, and we held local patch characteristics constant. As indicator groups, we used bugs, beetles, leafhoppers, and spiders. Compared to intensively managed agricultural ecosystems, urban ecosystems supported a higher abundance of most indicator groups, a higher number of bug species, and a lower evenness of bug and beetle species. Within cities, a high amount of vegetated area increased species richness and abundance of most arthropod groups, whereas evenness showed no clear pattern. Patch isolation played only a limited role in urban ecosystems, which contrasts findings from agro-ecological studies. Our results show that urban areas can harbor a similar arthropod diversity and abundance compared to intensively managed agricultural ecosystems. Further, negative consequences of urbanization on arthropod diversity can be mitigated by providing sufficient vegetated space in the urban area, while patch connectivity is less important in an urban context. This highlights the need for applying a landscape ecological approach to understand the mechanisms shaping urban biodiversity and underlines the potential of appropriate urban planning for mitigating biodiversity loss. © 2015 John Wiley & Sons Ltd.

  11. A new simplex chemometric approach to identify olive oil blends with potentially high traceability.

    Science.gov (United States)

    Semmar, N; Laroussi-Mezghani, S; Grati-Kamoun, N; Hammami, M; Artaud, J

    2016-10-01

    Olive oil blends (OOBs) are complex matrices combining different cultivars at variable proportions. Although qualitative determination of OOBs has been the subject of several chemometric works, quantitative evaluation of their contents remains poorly developed because of traceability difficulties concerning co-occurring cultivars. Around this question, we recently published an original simplex approach that helps develop predictive models of the proportions of co-occurring cultivars from chemical profiles of the resulting blends (Semmar & Artaud, 2015). Beyond predictive model construction and validation, this paper presents an extension based on prediction error analysis to statistically define the blends with the highest predictability among all the possible ones that can be made by mixing cultivars at different proportions. This provides an interesting way to identify a priori labeled commercial products with potentially high traceability, taking into account the natural chemical variability of the different constitutive cultivars. Copyright © 2016 Elsevier Ltd. All rights reserved.
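
    Assuming the linear mixing model implied above, cultivar proportions in a blend can be recovered by non-negative least squares projected onto the simplex. A Python sketch with invented profiles (not the published models):

        # Estimate cultivar proportions from a blend's chemical profile.
        import numpy as np
        from scipy.optimize import nnls

        # Columns = mean chemical profiles of three pure cultivars (made up).
        pure = np.array([[0.70, 0.60, 0.65],
                         [0.10, 0.20, 0.12],
                         [0.20, 0.20, 0.23]])
        blend = pure @ np.array([0.5, 0.3, 0.2])   # synthetic blend profile

        p, _ = nnls(pure, blend)                   # non-negative least squares
        p /= p.sum()                               # project onto the simplex
        print(p.round(3))                          # ~[0.5 0.3 0.2]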

  12. Market potential of nanoremediation in Europe - Market drivers and interventions identified in a deliberative scenario approach.

    Science.gov (United States)

    Bartke, Stephan; Hagemann, Nina; Harries, Nicola; Hauck, Jennifer; Bardos, Paul

    2018-04-01

    A deliberative expert-based scenario approach is applied to better understand the likely determinants of the evolution of the market for nanoparticle use in remediation in Europe until 2025. An initial set of factors was obtained from a literature review and complemented by a workshop and key-informant interviews. In further expert-engaging formats - focus groups, workshops, conferences, surveys - this initial set of factors was condensed, and the engaged experts scored the factors by their likely importance in influencing the market development. An interaction matrix was obtained, identifying the factors most active in shaping the market development in Europe by 2025, namely "Science-Policy-Interface" and "Validated information on nanoparticle application potential". Based on these, potential scenarios were determined and the development of the factors discussed. Conclusions are offered on achievable interventions to enhance nanoremediation deployment. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Transaortic TAVI Is a Valid Alternative to Transapical Approach.

    Science.gov (United States)

    O' Sullivan, Katie E; Hurley, Eoghan T; Segurado, Ricardo; Sugrue, Declan; Hurley, John P

    2015-05-01

    Transcatheter aortic valve implantation (TAVI) can be performed via a number of different anatomical approaches based on patient characteristics and operator choice. The aim of this study was to compare procedural outcomes between the transaortic (TAo) and transapical (TA) approaches in an effort to establish whether any differences exist. A systematic review and meta-analysis of the current literature reporting outcomes for patients undergoing TAo and TA TAVI was performed to compare outcomes using each vascular approach to valve deployment. A total of 10 studies and 1736 patients were included: 193 patients underwent TAo and 1543 TA TAVI. No significant difference in 30-day mortality was identified (TAo 9.4% vs. TA 10.4%, p = 0.7). There were no significant differences identified between TAo and TA TAVI in procedural success rate (96.3% vs. 93.7%, p = 0.3), stroke and transient ischemic attack (TIA) incidence (1.8% vs. 2.3%, p = 0.7), major bleeding (5.8% vs. 5.5%, p = 0.97) or pacemaker insertion rates (6.1% vs. 7.4%, p = 0.56). In addition, the incidence of clinically significant paravalvular regurgitation (PVR) was the same between groups (6.7% vs. 11%, p = 0.43). Comparison of the TAo and TA approaches revealed equivalent outcomes in 30-day mortality, procedural success, major bleeding, stroke/TIA incidence, pacemaker insertion rates and paravalvular leak. Heart teams should be familiar with the use of both TA and TAo access and tailor their selection on a case-by-case basis. © 2015 Wiley Periodicals, Inc.

  14. Use of AUDIT-based measures to identify unhealthy alcohol use and alcohol dependence in primary care: a validation study.

    Science.gov (United States)

    Johnson, J Aaron; Lee, Anna; Vinson, Daniel; Seale, J Paul

    2013-01-01

    As programs for screening, brief intervention, and referral to treatment (SBIRT) for unhealthy alcohol use disseminate, evidence-based approaches for identifying patients with unhealthy alcohol use and alcohol dependence (AD) are needed. While the National Institute on Alcohol Abuse and Alcoholism Clinician Guide suggests use of a single alcohol screening question (SASQ) for screening and Diagnostic and Statistical Manual checklists for assessment, many SBIRT programs use alcohol use disorders identification test (AUDIT) "zones" for screening and assessment. Validation data for these zones are limited. This study used primary care data from a bi-ethnic southern U.S. population to examine the ability of the AUDIT zones and other AUDIT-based approaches to identify unhealthy alcohol use and dependence. Existing data were analyzed from interviews with 625 female and male adult drinkers presenting to 5 southeastern primary care practices. Timeline follow-back was used to identify at-risk drinking, and the diagnostic interview schedule was used to identify alcohol abuse and dependence. Validity measures compared performance of AUDIT, AUDIT-C, and AUDIT dependence domain scores, with and without a 30-day binge drinking measure, for detecting unhealthy alcohol use and dependence. Optimal AUDIT scores for detecting unhealthy alcohol use were lower than current commonly used cutoffs (5 for men, 3 for women). Improved performance was obtained by combining AUDIT cutoffs of 6 for men and 4 for women with a 30-day binge drinking measure. AUDIT scores of 15 for men and 13 for women detected AD with 100% specificity but low sensitivity (20 and 18%, respectively). AUDIT dependence subscale scores of 2 or more showed similar specificity (99%) and slightly higher sensitivity (31% for men, 24% for women). Combining lower AUDIT cutoff scores and binge drinking measures may increase the detection of unhealthy alcohol use in primary care. Use of lower cutoff scores and dependence subscale...

  15. A Systematic Review of Validated Methods for Identifying Cerebrovascular Accident or Transient Ischemic Attack Using Administrative Data

    Science.gov (United States)

    Andrade, Susan E.; Harrold, Leslie R.; Tjia, Jennifer; Cutrona, Sarah L.; Saczynski, Jane S.; Dodd, Katherine S.; Goldberg, Robert J.; Gurwitz, Jerry H.

    2012-01-01

    Purpose To perform a systematic review of the validity of algorithms for identifying cerebrovascular accidents (CVAs) or transient ischemic attacks (TIAs) using administrative and claims data. Methods PubMed and Iowa Drug Information Service (IDIS) searches of the English language literature were performed to identify studies published between 1990 and 2010 that evaluated the validity of algorithms for identifying CVAs (ischemic and hemorrhagic strokes, intracranial hemorrhage and subarachnoid hemorrhage) and/or TIAs in administrative data. Two study investigators independently reviewed the abstracts and articles to determine relevant studies according to pre-specified criteria. Results A total of 35 articles met the criteria for evaluation. Of these, 26 articles provided data to evaluate the validity of stroke, 7 reported the validity of TIA, 5 reported the validity of intracranial bleeds (intracerebral hemorrhage and subarachnoid hemorrhage), and 10 studies reported the validity of algorithms to identify the composite endpoints of stroke/TIA or cerebrovascular disease. Positive predictive values (PPVs) varied depending on the specific outcomes and algorithms evaluated. Specific algorithms to evaluate the presence of stroke and intracranial bleeds were found to have high PPVs (80% or greater). Algorithms to evaluate TIAs in adult populations were generally found to have PPVs of 70% or greater. Conclusions The algorithms and definitions to identify CVAs and TIAs using administrative and claims data differ greatly in the published literature. The choice of the algorithm employed should be determined by the stroke subtype of interest. PMID:22262598

  16. The metabolomic approach identifies a biological signature of low-dose chronic exposure to Cesium 137

    International Nuclear Information System (INIS)

    Grison, S.; Grandcolas, L.; Martin, J.C.

    2012-01-01

    Reports have described apparent biological effects of 137Cs (the most persistent dispersed radionuclide) irradiation in people living in Chernobyl-contaminated territory. The sensitive analytical technology described here should now help assess the relation of this contamination to the observed effects. A rat model chronically exposed to 137Cs through drinking water was developed to identify biomarkers of radiation-induced metabolic disorders, and the biological impact was evaluated by a metabolomic approach that allowed us to detect several hundred metabolites in biofluids and assess their association with disease states. After collection of plasma and urine from contaminated and non-contaminated rats at the end of the 9-month contamination period, analysis with a liquid chromatography coupled to mass spectrometry (LC-MS) system detected 742 features in urine and 1309 in plasma. Biostatistical discriminant analysis extracted a subset of 26 metabolite signals (2 urinary, 4 plasma non-polar, and 19 plasma polar metabolites) that in combination were able to predict from 68% up to 94% of the contaminated rats, depending on the prediction method used, with a misclassification rate as low as 5.3%. The difference in this metabolic score between the contaminated and non-contaminated rats was highly significant (P=0.019 after ANOVA cross-validation). In conclusion, our proof-of-principle study demonstrated for the first time the usefulness of a metabolomic approach for addressing biological effects of chronic low-dose contamination. We can conclude that a metabolomic signature discriminated 137Cs-contaminated from control animals in our model. Further validation is nevertheless required, together with full annotation of the metabolic indicators. (author)

  17. Rejoinder: A Construct Validity Approach to the Assessment of Narcissism.

    Science.gov (United States)

    Miller, Joshua D; Lynam, Donald R; Campbell, W Keith

    2016-02-01

    In this rejoinder, we comment on Wright's response to our reanalysis and reinterpretation of the data presented by Wright and colleagues. Two primary differences characterize these perspectives. First, the conceptualization of grandiose narcissism differs such that emotional and ego vulnerability, dysregulation, and pervasive impairments are more characteristic of Wright's conception, likely due to the degree to which it is tied to clinical observations. Our conceptualization is closer to psychopathy and describes an extraverted, dominant, and antagonistic individual who is relatively less likely to be found in clinical settings. Second, our approach to construct validation differs in that we take an empirical perspective that focuses on the degree to which inventories yield scores consistent with a priori predictions. The grandiose dimension of the Pathological Narcissism Inventory (PNI-G) yields data that fail to align with expert ratings of narcissistic personality disorder and grandiose narcissism. We suggest that caution should be taken in treating the PNI-G as a gold standard measure of pathological narcissism, that revision of the PNI-G is required before it can serve as a stand-alone measure of grandiose narcissism, and that the PNI-G should be buttressed by other scales when being used as a measure of grandiose narcissism. © The Author(s) 2015.

  18. The KNICS approach for verification and validation of safety software

    International Nuclear Information System (INIS)

    Cha, Kyung Ho; Sohn, Han Seong; Lee, Jang Soo; Kim, Jang Yeol; Cheon, Se Woo; Lee, Young Joon; Hwang, In Koo; Kwon, Kee Choon

    2003-01-01

    This paper presents the verification and validation (VV) approach for the safety software of the POSAFE-Q Programmable Logic Controller (PLC) prototype and the Plant Protection System (PPS) prototype, which consists of the Reactor Protection System (RPS) and the Engineered Safety Features-Component Control System (ESF-CCS), in the development of the Korea Nuclear Instrumentation and Control System (KNICS). The software VV (SVV) criteria and requirements are selected from IEEE Std. 7-4.3.2, IEEE Std. 1012, IEEE Std. 1028 and BTP-14, and they have been considered as the acceptance framework to be provided within the SVV procedures. SVV techniques, including Review and Inspection (R and I), Formal Verification and Theorem Proving, and Automated Testing, are applied to the safety software, and automated SVV tools support the SVV tasks. Software Inspection Support and Requirement Traceability (SIS-RT) supports R and I and traceability analysis; a New Symbolic Model Verifier (NuSMV), Statemate MAGNUM (STM) ModelCertifier, and the Prototype Verification System (PVS) are used for formal verification; and McCabe and Cantata++ are utilized for static and dynamic software testing. In addition, dedication of Commercial-Off-The-Shelf (COTS) software and firmware, Software Safety Analysis (SSA) and evaluation of Software Configuration Management (SCM) are being performed for the PPS prototype in the software requirements phase.

  19. Shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity.

    Directory of Open Access Journals (Sweden)

    J R Managbanag

    Full Text Available BACKGROUND: Identification of genes that modulate longevity is a major focus of aging-related research and an area of intense public interest. In addition to facilitating an improved understanding of the basic mechanisms of aging, such genes represent potential targets for therapeutic intervention in multiple age-associated diseases, including cancer, heart disease, diabetes, and neurodegenerative disorders. To date, however, targeted efforts at identifying longevity-associated genes have been limited by a lack of predictive power, and useful algorithms for candidate gene-identification have also been lacking. METHODOLOGY/PRINCIPAL FINDINGS: We have utilized a shortest-path network analysis to identify novel genes that modulate longevity in Saccharomyces cerevisiae. Based on a set of previously reported genes associated with increased life span, we applied a shortest-path network algorithm to a pre-existing protein-protein interaction dataset in order to construct a shortest-path longevity network. To validate this network, the replicative aging potential of 88 single-gene deletion strains corresponding to predicted components of the shortest-path longevity network was determined. Here we report that the single-gene deletion strains identified by our shortest-path longevity analysis are significantly enriched for mutations conferring either increased or decreased replicative life span, relative to a randomly selected set of 564 single-gene deletion strains or to the current data set available for the entire haploid deletion collection. Further, we report the identification of previously unknown longevity genes, several of which function in a conserved longevity pathway believed to mediate life span extension in response to dietary restriction. CONCLUSIONS/SIGNIFICANCE: This work demonstrates that shortest-path network analysis is a useful approach toward identifying genetic determinants of longevity and represents the first application of
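
    A toy Python version of the shortest-path construction: known longevity genes seed the search, and every gene on a shortest path between seed pairs in a (hypothetical) PPI graph becomes a candidate. Gene names are real yeast genes, but the edges are invented.

        # Collect all genes lying on shortest paths between seed-gene pairs.
        import itertools
        import networkx as nx

        ppi = nx.Graph([("SIR2", "FOB1"), ("FOB1", "TOR1"), ("TOR1", "SCH9"),
                        ("SIR2", "HST2"), ("HST2", "SCH9"), ("TOR1", "RPL31A")])
        seeds = ["SIR2", "SCH9"]

        network = set(seeds)
        for a, b in itertools.combinations(seeds, 2):
            for path in nx.all_shortest_paths(ppi, a, b):
                network.update(path)  # intermediate genes are new candidates

        print(network - set(seeds))  # predicted longevity-gene candidates, e.g. HST2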

  20. The bottom-up approach to integrative validity: a new perspective for program evaluation.

    Science.gov (United States)

    Chen, Huey T

    2010-08-01

    The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model includes the concepts of internal and external validity and, within it, the preeminence of internal validity as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that over-emphasis on internal validity reduces an evaluation's usefulness and contributes to the gulf between academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive alternative, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and is therefore a preferable alternative for evaluation of health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate advances in external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence, and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  1. Nutrient profiling can help identify foods of good nutritional quality for their price: a validation study with linear programming.

    Science.gov (United States)

    Maillot, Matthieu; Ferguson, Elaine L; Drewnowski, Adam; Darmon, Nicole

    2008-06-01

    Nutrient profiling ranks foods based on their nutrient content and may help identify foods with a good nutritional quality for their price. This hypothesis was tested using diet modeling with linear programming. Analyses were undertaken using food intake data from the nationally representative French INCA (enquête Individuelle et Nationale sur les Consommations Alimentaires) survey and its associated food composition and price database. For each food, a nutrient profile score was defined as the ratio between the previously published nutrient density score (NDS) and the limited nutrient score (LIM); a nutritional-quality-for-price indicator was developed and calculated from the relationship between its NDS:LIM and energy cost (in euro/100 kcal). We developed linear programming models to design diets that fulfilled increasing levels of nutritional constraints at a minimal cost. The median NDS:LIM values of foods selected in modeled diets increased as the levels of nutritional constraints increased (P = 0.005). In addition, the proportion of foods with a good nutritional-quality-for-price indicator was higher (P ...). The congruence between the linear programming and nutrient profiling approaches indicates that nutrient profiling can help identify foods of good nutritional quality for their price. Linear programming is a useful tool for testing nutrient profiling systems and validating the concept of nutrient profiling.
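
    A minimal linear program in the spirit of the diet models described: minimize cost subject to nutrient constraints at fixed energy. All coefficients are invented; only the structure mirrors the approach.

        # Pick food quantities (in 100 kcal units) minimizing cost subject to
        # protein and fiber floors and a fixed total energy.
        import numpy as np
        from scipy.optimize import linprog

        cost = np.array([0.30, 0.80, 0.50])      # euro per 100 kcal, 3 foods
        protein = np.array([2.0, 8.0, 5.0])      # g per 100 kcal
        fiber = np.array([1.5, 0.2, 2.5])        # g per 100 kcal

        # Constraints: >= 50 g protein, >= 25 g fiber, total energy = 2000 kcal.
        A_ub = -np.array([protein, fiber])       # linprog uses <=, so negate
        b_ub = -np.array([50.0, 25.0])
        A_eq = np.ones((1, 3))
        b_eq = [20.0]                            # 20 x 100 kcal units

        res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=(0, None))
        print(res.x, res.fun)                    # optimal quantities, minimal cost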

  2. Prediction potential of candidate biomarker sets identified and validated on gene expression data from multiple datasets

    Directory of Open Access Journals (Sweden)

    Karacali Bilge

    2007-10-01

    Full Text Available Abstract Background Independently derived expression profiles of the same biological condition often have few genes in common. In this study, we created populations of expression profiles from publicly available microarray datasets of cancer (breast, lymphoma and renal) samples linked to clinical information with an iterative machine learning algorithm. ROC curves were used to assess the prediction error of each profile for classification. We compared the prediction error of profiles correlated with molecular phenotype against profiles correlated with relapse-free status. Prediction error of profiles identified with supervised univariate feature selection algorithms were compared to profiles selected randomly from (a) all genes on the microarray platform and (b) a list of known disease-related genes (a priori selection). We also determined the relevance of expression profiles on test arrays from independent datasets, measured on either the same or different microarray platforms. Results Highly discriminative expression profiles were produced on both simulated gene expression data and expression data from breast cancer and lymphoma datasets on the basis of ER and BCL-6 expression, respectively. Use of relapse-free status to identify profiles for prognosis prediction resulted in poorly discriminative decision rules. Supervised feature selection resulted in more accurate classifications than random or a priori selection; however, the difference in prediction error decreased as the number of features increased. These results held when decision rules were applied across datasets to samples profiled on the same microarray platform. Conclusion Our results show that many gene sets predict molecular phenotypes accurately. Given this, expression profiles identified using different training datasets should be expected to show little agreement. In addition, we demonstrate the difficulty in predicting relapse directly from microarray data using supervised machine...

  3. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

    Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is to build the product correctly, and validation is to build the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each one with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V, and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'what to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth and development environment. A guideline is presented to classify the particular system using these factors and to show how they lead to the selection of the V and V class. The second step is to determine 'how to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'what to do', as well as for specialists interested in 'how to do it'. Finally...

  4. The validity of the family history method for identifying Alzheimer disease.

    Science.gov (United States)

    Li, G; Aryan, M; Silverman, J M; Haroutunian, V; Perl, D P; Birstein, S; Lantz, M; Marin, D B; Mohs, R C; Davis, K L

    1997-05-01

    To examine the validity of the family history method for identifying Alzheimer disease (AD) by comparing family history and neuropathological diagnoses. Seventy-seven former residents of the Jewish Home and Hospital for the Aged, New York, NY, with neuropathological evaluations on record were blindly assessed for the presence of dementia and, if present, the type of dementia through family informants by telephone interviews. The Alzheimer's Disease Risk Questionnaire was used to collect demographic information and screen for possible dementia. If dementia was suspected, the Dementia Questionnaire was administered to assess the course and type of dementia, i.e., primary progressive dementia (PPD, likely AD), multiple infarct dementia, mixed dementia (i.e., PPD and multiple infarct dementia), and other dementias based on the modified Diagnostic and Statistical Manual of Mental Disorders, Third Edition, criteria. Sixty (77.9%) of 77 elderly subjects were classified as having dementia and 17 (22.1%) were without dementia by family history evaluation. Of the 60 elderly subjects with dementia, 57 (95%) were found at autopsy to have had neuropathological changes related to dementia. The sensitivity of the family history diagnosis for dementia with related neuropathological change was 0.84 (57 of 68) and the specificity was 0.67 (6 of 9). Using family history information to differentiate the type of dementia, the sensitivity for definite or probable AD (with or without another condition) was 0.69 (36 of 51) and the specificity was 0.73 (19 of 26). The majority (9 of 15) of patients testing false negative for PPD had a history of stroke associated with onset of memory changes, excluding a diagnosis of PPD. Identifying dementia, in general, and AD, in particular, has an acceptable sensitivity and specificity. As is true for direct clinical diagnosis, the major issue associated with misclassifying AD in a family history assessment is the masking effects of a coexisting non

  5. EEG-based motor network biomarkers for identifying target patients with stroke for upper limb rehabilitation and its construct validity.

    Directory of Open Access Journals (Sweden)

    Chun-Chuan Chen

    Full Text Available Rehabilitation is the main therapeutic approach for reducing poststroke functional deficits in the affected upper limb; however, significant between-patient variability in rehabilitation efficacy indicates the need to target patients who are likely to have clinically significant improvement after treatment. Many studies have determined robust predictors of recovery and treatment gains and have yielded good results using linear approaches. Evidence has emerged that nonlinearity is a crucial aspect of studying inter-areal communication in human brains and that abnormality of oscillatory activities in the motor system is linked to pathological states. In this study, we hypothesized that combinations of linear and nonlinear (cross-frequency) network connectivity parameters are favourable biomarkers for stratifying patients for upper limb rehabilitation with increased accuracy. We identified the biomarkers by using 37 prerehabilitation electroencephalogram (EEG) datasets recorded during a movement task, through effective connectivity and logistic regression analyses. The predictive power of these biomarkers was then tested using 16 independent datasets (i.e. construct validation). In addition, 14 right-handed healthy subjects were enrolled for comparison. The results show that the beta plus gamma or theta network features provided the best classification accuracy of 92%. The predictive value and the sensitivity of these biomarkers were 81.3% and 90.9%, respectively. Subcortical lesion, time poststroke and initial Wolf Motor Function Test (WMFT) score were identified as the most significant clinical variables affecting the classification accuracy of this predictive model. Moreover, 12 of 14 normal controls were classified as having favourable recovery. In conclusion, EEG-based linear and nonlinear motor network biomarkers are robust and can help clinical decision making.
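
    A sketch of the classification step, logistic regression on connectivity-style features with cross-validation; the features and labels here are simulated, not the study's EEG data.

        # Logistic regression on simulated connectivity features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 37                                   # patients, as in the training set
        X = rng.normal(size=(n, 4))              # e.g. beta, gamma, theta couplings
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)) > 0

        clf = LogisticRegression()
        print(cross_val_score(clf, X, y.astype(int), cv=5).mean())  # CV accuracy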

  6. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Full Text Available Background. Evidence rankings do not consider equally internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighting studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to January 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce and provide information that will help guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of their research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about "real-world" consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making.

  7. Liquid Water from First Principles: Validation of Different Sampling Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Mundy, C J; Kuo, W; Siepmann, J; McGrath, M J; Vondevondele, J; Sprik, M; Hutter, J; Parrinello, M; Mohamed, F; Krack, M; Chen, B; Klein, M

    2004-05-20

    A series of first principles molecular dynamics and Monte Carlo simulations were carried out for liquid water to assess the validity and reproducibility of different sampling approaches. These simulations include Car-Parrinello molecular dynamics simulations using the program CPMD with different values of the fictitious electron mass in the microcanonical and canonical ensembles, Born-Oppenheimer molecular dynamics using the programs CPMD and CP2K in the microcanonical ensemble, and Metropolis Monte Carlo using CP2K in the canonical ensemble. With the exception of one simulation for 128 water molecules, all other simulations were carried out for systems consisting of 64 molecules. It is found that the structural and thermodynamic properties of these simulations are in excellent agreement with each other as long as adiabatic sampling is maintained in the Car-Parrinello molecular dynamics simulations, either by choosing a sufficiently small fictitious mass in the microcanonical ensemble or by Nosé-Hoover thermostats in the canonical ensemble. Using the Becke-Lee-Yang-Parr exchange and correlation energy functionals and norm-conserving Troullier-Martins or Goedecker-Teter-Hutter pseudopotentials, simulations at a fixed density of 1.0 g/cm³ and a temperature close to 315 K yield a height of the first peak in the oxygen-oxygen radial distribution function of about 3.0, a classical constant-volume heat capacity of about 70 J K⁻¹ mol⁻¹, and a self-diffusion constant of about 0.1 Å²/ps.

  8. A Practical Approach to Validating a PD Model

    NARCIS (Netherlands)

    Medema, L.; Koning, de R.; Lensink, B.W.

    2009-01-01

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  9. A practical approach to validating a PD model

    NARCIS (Netherlands)

    Medema, Lydian; Koning, Ruud H.; Lensink, Robert; Medema, M.

    The capital adequacy framework Basel II aims to promote the adoption of stronger risk management practices by the banking industry. The implementation makes validation of credit risk models more important. Lenders therefore need a validation methodology to convince their supervisors that their

  10. An integrative -omics approach to identify functional sub-networks in human colorectal cancer.

    Directory of Open Access Journals (Sweden)

    Rod K Nibbe

    2010-01-01

    Full Text Available Emerging evidence indicates that gene products implicated in human cancers often cluster together in "hot spots" in protein-protein interaction (PPI networks. Additionally, small sub-networks within PPI networks that demonstrate synergistic differential expression with respect to tumorigenic phenotypes were recently shown to be more accurate classifiers of disease progression when compared to single targets identified by traditional approaches. However, many of these studies rely exclusively on mRNA expression data, a useful but limited measure of cellular activity. Proteomic profiling experiments provide information at the post-translational level, yet they generally screen only a limited fraction of the proteome. Here, we demonstrate that integration of these complementary data sources with a "proteomics-first" approach can enhance the discovery of candidate sub-networks in cancer that are well-suited for mechanistic validation in disease. We propose that small changes in the mRNA expression of multiple genes in the neighborhood of a protein-hub can be synergistically associated with significant changes in the activity of that protein and its network neighbors. Further, we hypothesize that proteomic targets with significant fold change between phenotype and control may be used to "seed" a search for small PPI sub-networks that are functionally associated with these targets. To test this hypothesis, we select proteomic targets having significant expression changes in human colorectal cancer (CRC from two independent 2-D gel-based screens. Then, we use random walk based models of network crosstalk and develop novel reference models to identify sub-networks that are statistically significant in terms of their functional association with these proteomic targets. Subsequently, using an information-theoretic measure, we evaluate synergistic changes in the activity of identified sub-networks based on genome-wide screens of mRNA expression in CRC
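
    A minimal random-walk-with-restart sketch of the network crosstalk idea: a proteomic hit seeds the walk on a toy PPI graph, and steady-state visit probabilities score candidate subnetwork members. The graph and parameters are invented.

        # Random walk with restart (RWR) from a proteomic seed node.
        import numpy as np
        import networkx as nx

        g = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("A", "C"), ("D", "E")])
        nodes = list(g.nodes)
        W = nx.to_numpy_array(g, nodelist=nodes)
        W /= W.sum(axis=0, keepdims=True)        # column-normalized transitions

        restart = 0.3
        seed = np.array([1.0 if n == "A" else 0.0 for n in nodes])  # proteomic hit
        p = seed.copy()
        for _ in range(100):                     # iterate to (near) convergence
            p = (1 - restart) * W @ p + restart * seed

        print(dict(zip(nodes, p.round(3))))      # high scores = candidate subnetwork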

  11. Diagnostic accuracy of prostate health index to identify aggressive prostate cancer. An Institutional validation study.

    Science.gov (United States)

    Morote, J; Celma, A; Planas, J; Placer, J; Ferrer, R; de Torres, I; Pacciuci, R; Olivan, M

    2016-01-01

    New generations of tumor markers used to detect prostate cancer (PCa) should be able to discriminate men with aggressive PCa from those without PCa or with nonaggressive tumors. The objective of this study was to validate the Prostate Health Index (PHI) as a marker of aggressive PCa in one academic institution. PHI was assessed in 357 men scheduled for prostatic biopsy between June 2013 and July 2014 in one academic institution. Thereafter, a subset of 183 men younger than 75 years with total PSA (tPSA) between 3.0 and 10.0 ng/mL, scheduled for their first prostatic biopsy, was retrospectively selected for this study. Twelve-core TRUS-guided biopsy, under local anaesthesia, was performed in all cases. Total PSA, free PSA (fPSA), [-2]proPSA (p2PSA) and prostate volume were determined before the procedure, and %fPSA, PSA density (PSAd) and PHI were calculated. Tumors were considered aggressive if any Gleason pattern 4 was found. PHI was compared to %fPSA and PSAd through their ROC curves. Thresholds to detect 90% and 95% of all tumors and 95% and 100% of aggressive tumors were estimated, and rates of unnecessary avoided biopsies were calculated and compared. The rate of PCa detection was 37.2% (68) and the rate of aggressive tumors was 24.6% (45). The PHI area under the curve was higher than those of %fPSA and PSAd for detecting any PCa (0.749 vs 0.606 and 0.668, respectively) or only aggressive tumors (0.786 vs 0.677 and 0.708, respectively); however, significant differences were not found. The avoided biopsy rates to detect 95% of aggressive tumors were 20.2% for PHI, 14.8% for %fPSA, and 23.5% for PSAd. Moreover, to detect all aggressive tumors, these rates dropped to 4.9% for PHI, 9.3% for %fPSA, and 7.9% for PSAd. PHI seems a good marker for PCa diagnosis. However, PHI was not superior to %fPSA and PSAd in identifying at least 95% of aggressive tumors. Copyright © 2016 AEU. Published by Elsevier España, S.L.U. All rights reserved.
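
    The abstract does not spell out the index formula; the commonly used definition (an assumption here, not stated in the record) is PHI = (p2PSA / fPSA) x sqrt(tPSA), with p2PSA in pg/mL and fPSA, tPSA in ng/mL:

        # PHI from the conventional definition; patient values are illustrative.
        import math

        def phi(p2psa_pg_ml, fpsa_ng_ml, tpsa_ng_ml):
            # Conventional mixed units: p2PSA in pg/mL, fPSA and tPSA in ng/mL.
            return (p2psa_pg_ml / fpsa_ng_ml) * math.sqrt(tpsa_ng_ml)

        print(round(phi(15.0, 0.6, 5.2), 1))  # ~57.0 for these invented values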

  12. Validation of a Consensus Method for Identifying Delirium from Hospital Records

    Science.gov (United States)

    Kuhn, Elvira; Du, Xinyi; McGrath, Keith; Coveney, Sarah; O'Regan, Niamh; Richardson, Sarah; Teodorczuk, Andrew; Allan, Louise; Wilson, Dan; Inouye, Sharon K.; MacLullich, Alasdair M. J.; Meagher, David; Brayne, Carol; Timmons, Suzanne; Davis, Daniel

    2014-01-01

    Background Delirium is increasingly considered to be an important determinant of trajectories of cognitive decline. Therefore, analyses of existing cohort studies measuring cognitive outcomes could benefit from methods to ascertain a retrospective delirium diagnosis. This study aimed to develop and validate such a method for delirium detection using routine medical records in UK and Ireland. Methods A point prevalence study of delirium provided the reference-standard ratings for delirium diagnosis. Blinded to study results, clinical vignettes were compiled from participants' medical records in a standardised manner, describing any relevant delirium symptoms recorded in the whole case record for the period leading up to case-ascertainment. An expert panel rated each vignette as unlikely, possible, or probable delirium and disagreements were resolved by consensus. Results From 95 case records, 424 vignettes were abstracted by 5 trained clinicians. There were 29 delirium cases according to the reference standard. Median age of subjects was 76.6 years (interquartile range 54.6 to 82.5). Against the original study DSM-IV diagnosis, the chart abstraction method gave a positive likelihood ratio (LR) of 7.8 (95% CI 5.7–12.0) and the negative LR of 0.45 (95% CI 0.40–0.47) for probable delirium (sensitivity 0.58 (95% CI 0.53–0.62); specificity 0.93 (95% CI 0.90–0.95); AUC 0.86 (95% CI 0.82–0.89)). The method diagnosed possible delirium with positive LR 3.5 (95% CI 2.9–4.3) and negative LR 0.15 (95% CI 0.11–0.21) (sensitivity 0.89 (95% CI 0.85–0.91); specificity 0.75 (95% CI 0.71–0.79); AUC 0.86 (95% CI 0.80–0.89)). Conclusions This chart abstraction method can retrospectively diagnose delirium in hospitalised patients with good accuracy. This has potential for retrospectively identifying delirium in cohort studies where routine medical records are available. This example of record linkage between hospitalisations and epidemiological data may lead to
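
    The reported likelihood ratios follow directly from sensitivity and specificity; recomputing from the rounded point estimates approximately reproduces them.

        # LR+ = sens / (1 - spec); LR- = (1 - sens) / spec.
        def lrs(sens, spec):
            return sens / (1 - spec), (1 - sens) / spec

        print(lrs(0.58, 0.93))  # probable delirium: ~ (8.3, 0.45); paper: 7.8, 0.45
        print(lrs(0.89, 0.75))  # possible delirium: ~ (3.6, 0.15); paper: 3.5, 0.15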

  13. Validation of a consensus method for identifying delirium from hospital records.

    Directory of Open Access Journals (Sweden)

    Elvira Kuhn

    Full Text Available Delirium is increasingly considered to be an important determinant of trajectories of cognitive decline. Therefore, analyses of existing cohort studies measuring cognitive outcomes could benefit from methods to ascertain a retrospective delirium diagnosis. This study aimed to develop and validate such a method for delirium detection using routine medical records in the UK and Ireland. A point prevalence study of delirium provided the reference-standard ratings for delirium diagnosis. Blinded to study results, clinical vignettes were compiled from participants' medical records in a standardised manner, describing any relevant delirium symptoms recorded in the whole case record for the period leading up to case-ascertainment. An expert panel rated each vignette as unlikely, possible, or probable delirium, and disagreements were resolved by consensus. From 95 case records, 424 vignettes were abstracted by 5 trained clinicians. There were 29 delirium cases according to the reference standard. Median age of subjects was 76.6 years (interquartile range 54.6 to 82.5). Against the original study DSM-IV diagnosis, the chart abstraction method gave a positive likelihood ratio (LR) of 7.8 (95% CI 5.7-12.0) and a negative LR of 0.45 (95% CI 0.40-0.47) for probable delirium (sensitivity 0.58 (95% CI 0.53-0.62); specificity 0.93 (95% CI 0.90-0.95); AUC 0.86 (95% CI 0.82-0.89)). The method diagnosed possible delirium with positive LR 3.5 (95% CI 2.9-4.3) and negative LR 0.15 (95% CI 0.11-0.21) (sensitivity 0.89 (95% CI 0.85-0.91); specificity 0.75 (95% CI 0.71-0.79); AUC 0.86 (95% CI 0.80-0.89)). This chart abstraction method can retrospectively diagnose delirium in hospitalised patients with good accuracy. This has potential for retrospectively identifying delirium in cohort studies where routine medical records are available. This example of record linkage between hospitalisations and epidemiological data may lead to further insights into the inter-relationship between acute...

  14. A goal-based approach for qualification of new technologies: Foundations, tool support, and industrial validation

    International Nuclear Information System (INIS)

    Sabetzadeh, Mehrdad; Falessi, Davide; Briand, Lionel; Di Alesio, Stefano

    2013-01-01

    New technologies typically involve innovative aspects that are not addressed by the existing normative standards and hence are not assessable through common certification procedures. To ensure that new technologies can be implemented in a safe and reliable manner, a specific kind of assessment is performed, which in many industries, e.g., the energy sector, is known as Technology Qualification (TQ). TQ aims at demonstrating with an acceptable level of confidence that a new technology will function within specified limits. Expert opinion plays an important role in TQ, both to identify the safety and reliability evidence that needs to be developed and to interpret the evidence provided. Since there are often multiple experts involved in TQ, it is crucial to apply a structured process for eliciting expert opinions, and to use this information systematically when analyzing the satisfaction of the technology's safety and reliability objectives. In this paper, we present a goal-based approach for TQ. Our approach enables analysts to quantitatively reason about the satisfaction of the technology's overall goals and further to identify the aspects that must be improved to increase goal satisfaction. The approach is founded on three main components: goal models, expert elicitation, and probabilistic simulation. We describe a tool, named Modus, that we have developed in support of our approach. We provide an extensive empirical validation of our approach through two industrial case studies and a survey
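
    A toy version of the probabilistic-simulation component: expert-elicited satisfaction levels for leaf goals, modeled here as triangular distributions, are propagated to a weighted top goal by Monte Carlo. The distributions and weights are invented, not the paper's models.

        # Monte Carlo propagation of expert-elicited leaf-goal satisfaction.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 10_000
        # Each leaf goal elicited as triangular (pessimistic, most likely, optimistic).
        leaves = [rng.triangular(0.5, 0.8, 0.95, N),   # evidence quality
                  rng.triangular(0.4, 0.7, 0.9, N)]    # test coverage
        weights = [0.6, 0.4]                           # goal-model contribution links

        top = sum(w * s for w, s in zip(weights, leaves))
        print(f"P(top goal satisfaction >= 0.7) = {(top >= 0.7).mean():.2f}")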

  15. Approaches to Demonstrating the Reliability and Validity of Core Diagnostic Criteria for Chronic Pain.

    Science.gov (United States)

    Bruehl, Stephen; Ohrbach, Richard; Sharma, Sonia; Widerstrom-Noga, Eva; Dworkin, Robert H; Fillingim, Roger B; Turk, Dennis C

    2016-09-01

    The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks-American Pain Society Pain Taxonomy (AAPT) is designed to be an evidence-based multidimensional chronic pain classification system that will facilitate more comprehensive and consistent chronic pain diagnoses, and thereby enhance research, clinical communication, and ultimately patient care. Core diagnostic criteria (dimension 1) for individual chronic pain conditions included in the initial version of AAPT will be the focus of subsequent empirical research to evaluate and provide evidence for their reliability and validity. Challenges to validating diagnostic criteria in the absence of clear and identifiable pathophysiological mechanisms are described. Based in part on previous experience regarding the development of evidence-based diagnostic criteria for psychiatric disorders, headache, and specific chronic pain conditions (fibromyalgia, complex regional pain syndrome, temporomandibular disorders, pain associated with spinal cord injuries), several potential approaches for documentation of the reliability and validity of the AAPT diagnostic criteria are summarized. The AAPT is designed to be an evidence-based multidimensional chronic pain classification system. Conceptual and methodological issues related to demonstrating the reliability and validity of the proposed AAPT chronic pain diagnostic criteria are discussed. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  16. Modeling Run Test Validity: A Meta-Analytic Approach

    National Research Council Canada - National Science Library

    Vickers, Ross

    2002-01-01

    .... This study utilized data from 166 samples (N = 5,757) to test the general hypothesis that differences in testing methods could account for the cross-situational variation in validity. Only runs >2 km...

  17. An integrated approach for signal validation in nuclear power plants

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Kerlin, T.W.; Gloeckler, O.; Frei, Z.; Qualls, L.; Morgenstern, V.

    1987-08-01

    A signal validation system, based on several parallel signal processing modules, is being developed at the University of Tennessee. The major modules perform (1) general consistency checking (GCC) of a set of redundant measurements, (2) multivariate data-driven modeling of dynamic signal components for maloperation detection, (3) process empirical modeling for prediction and redundancy generation, (4) jump, pulse, and noise detection, and (5) an expert system for qualitative signal validation. A central database stores information related to sensors, diagnostic rules, past system performance, subsystem models, etc. We are primarily concerned with signal validation during steady-state operation and slow degradations, although, in general, the different modules will perform signal validation during all operating conditions. The techniques have been successfully tested using a PWR steam generator simulation, and efforts are currently underway to apply the techniques to Millstone-III operational data. These methods could be implemented in advanced reactors, including advanced liquid metal reactors.
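
    A minimal sketch of the core idea behind the general consistency checking (GCC) module: flag redundant channels that deviate from the median of the set beyond an allowed band. The threshold and readings are invented.

        # Consistency check over redundant measurements of the same process variable.
        import numpy as np

        def gcc(readings, band=1.5):
            readings = np.asarray(readings, dtype=float)
            mid = np.median(readings)              # robust validated estimate
            ok = np.abs(readings - mid) <= band    # per-channel status flags
            return mid, ok

        estimate, status = gcc([503.2, 502.8, 510.9, 503.0])  # e.g. level, mm
        print(estimate, status)  # third channel flagged inconsistent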

  18. Cost model validation: a technical and cultural approach

    Science.gov (United States)

    Hihn, J.; Rosenberg, L.; Roust, K.; Warfield, K.

    2001-01-01

    This paper summarizes how JPL's parametric mission cost model (PMCM) has been validated using both formal statistical methods and a variety of peer and management reviews in order to establish organizational acceptance of the cost model estimates.

  19. Combining phenotypic and proteomic approaches to identify membrane targets in a ‘triple negative’ breast cancer cell type

    Directory of Open Access Journals (Sweden)

    Rust Steven

    2013-02-01

    Background: The continued discovery of therapeutic antibodies, which address unmet medical needs, requires the continued discovery of tractable antibody targets. Multiple protein-level target discovery approaches are available and these can be used in combination to extensively survey relevant cell membranomes. In this study, the MDA-MB-231 cell line was selected for membranome survey as it is a ‘triple negative’ breast cancer cell line, which represents a cancer subtype that is aggressive and has few treatment options. Methods: The MDA-MB-231 breast carcinoma cell line was used to explore three membranome target discovery approaches, which were used in parallel to cross-validate the significance of identified antigens. A proteomic approach, which used membrane protein enrichment followed by protein identification by mass spectrometry, was used alongside two phenotypic antibody screening approaches. The first phenotypic screening approach was based on hybridoma technology and the second was based on phage display technology. Antibodies isolated by the phenotypic approaches were tested for cell specificity as well as internalisation, and the targets identified were compared to each other as well as to those identified by the proteomic approach. An anti-CD73 antibody derived from the phage display-based phenotypic approach was tested for binding to other ‘triple negative’ breast cancer cell lines and tested for tumour growth inhibitory activity in a MDA-MB-231 xenograft model. Results: All of the approaches identified multiple cell surface markers, including integrins, CD44, EGFR, CD71, galectin-3, CD73 and BCAM, some of which had been previously confirmed as being tractable to antibody therapy. In total, 40 cell surface markers were identified for further study. In addition to cell surface marker identification, the phenotypic antibody screening approaches provided reagent antibodies for target validation studies. This is illustrated

  20. Identifying food-related life style segments by a cross-culturally valid scaling device

    DEFF Research Database (Denmark)

    Brunsø, Karen; Grunert, Klaus G.

    1994-01-01

    -related life style in a cross-culturally valid way. To this end, we have collected a pool of 202 items, collected data in three countries, and have constructed scales based on cross-culturally stable patterns. These scales have then been subjected to a number of tests of reliability and validity. We have… then applied the set of scales to a fourth country, Germany, based on a representative sample of 1000 respondents. The scales had, with a few exceptions, moderately good reliabilities. A cluster analysis led to the identification of 5 segments, which differed on all 23 scales…

  1. Mixed Integer Linear Programming based machine learning approach identifies regulators of telomerase in yeast.

    Science.gov (United States)

    Poos, Alexandra M; Maicher, André; Dieckmann, Anna K; Oswald, Marcus; Eils, Roland; Kupiec, Martin; Luke, Brian; König, Rainer

    2016-06-02

    Understanding telomere length maintenance mechanisms is central in cancer biology, as their dysregulation is one of the hallmarks of immortalization of cancer cells. Important for this well-balanced control is the transcriptional regulation of the telomerase genes. We integrated Mixed Integer Linear Programming models into a comparative machine learning based approach to identify regulatory interactions that best explain the discrepancy of telomerase transcript levels in yeast mutants with deleted regulators showing aberrant telomere length, when compared to mutants with normal telomere length. We uncover novel regulators of telomerase expression, several of which affect histone levels or modifications. In particular, our results point to the transcription factors Sum1, Hst1 and Srb2 as being important for the regulation of EST1 transcription, and we validated the effect of Sum1 experimentally. We assembled our machine learning method into a user-friendly R package that can be straightforwardly applied to similar problems integrating gene regulator binding information and expression profiles of samples of, e.g., different phenotypes, diseases or treatments. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
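
    The flavour of a MILP formulation for this kind of problem can be sketched in a few lines. The toy model below is hypothetical (it is not the authors' R package): binary variables select a sparse set of regulators whose binding pattern best separates mutants with aberrant telomerase transcript levels from the rest, and the solver minimises misfit plus a sparsity penalty.

```python
import numpy as np
import pulp

rng = np.random.default_rng(1)
n_mutants, n_regulators = 30, 8
B = rng.integers(0, 2, (n_mutants, n_regulators))    # regulator j bound in mutant i?
y = [int(v) for v in rng.integers(0, 2, n_mutants)]  # aberrant transcript level?

prob = pulp.LpProblem("regulator_selection", pulp.LpMinimize)
s = [pulp.LpVariable(f"s{j}", cat=pulp.LpBinary) for j in range(n_regulators)]
e = [pulp.LpVariable(f"e{i}", lowBound=0) for i in range(n_mutants)]

# e_i bounds the absolute misfit between the selected-regulator score and y_i.
for i in range(n_mutants):
    score = pulp.lpSum(int(B[i, j]) * s[j] for j in range(n_regulators))
    prob += e[i] >= y[i] - score
    prob += e[i] >= score - y[i]

LAMBDA = 0.1  # sparsity weight (hypothetical)
prob += pulp.lpSum(e) + LAMBDA * pulp.lpSum(s)  # objective
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("selected regulators:",
      [j for j in range(n_regulators) if s[j].value() > 0.5])
```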

  2. Designing and determining validity and reliability of a questionnaire to identify factors affecting nutritional behavior among patients with metabolic syndrome

    Directory of Open Access Journals (Sweden)

    Naseh Esmaeili

    2017-06-01

    Background: A number of studies have shown a clear relationship between diet and components of metabolic syndrome. Based on the Theory of Reasoned Action (TRA), attitude and subjective norm are factors affecting behavioral intention and subsequently behavior. The aim of the present study was to design a valid questionnaire identifying factors affecting nutritional behavior among patients with metabolic syndrome. Materials and Methods: Following a literature review, six focus group discussions and interviews with nutrition specialists were performed to develop an instrument based on the Theory of Reasoned Action. To determine validity of the instrument, content and face validity analyses were conducted with a panel of 15 experts, and to determine reliability, Cronbach's alpha coefficient was calculated. Results: A draft 100-item questionnaire was developed; after evaluation of validity and reliability, the final questionnaire included 46 items: 17 items for attitude, 13 items for subjective norms and 16 items for behavioral intention. For the final questionnaire, the average content validity index was 0.92 and Cronbach's alpha coefficient was 0.85. Conclusion: Based on the results of the current study, the developed questionnaire is a valid and reliable instrument and can be used to identify factors affecting nutritional behavior among people with metabolic syndrome based on the Theory of Reasoned Action.
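
    Cronbach's alpha, the reliability coefficient used above, is straightforward to compute from an item-response matrix; a minimal numpy version, assuming complete responses to a single scale (simulated data, not the study's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: shape (n_respondents, n_items), e.g. Likert responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# 100 respondents answering a 17-item attitude subscale (simulated).
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))  # shared trait drives all items
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (100, 17))), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```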

  3. Referencing Science: Teaching Undergraduates to Identify, Validate, and Utilize Peer-Reviewed Online Literature

    Science.gov (United States)

    Berzonsky, William A.; Richardson, Katherine D.

    2008-01-01

    Accessibility of online scientific literature continues to expand due to the advent of scholarly databases and search engines. Studies have shown that undergraduates favor using online scientific literature to address research questions, but they often do not have the skills to assess the validity of research articles. Undergraduates generally are…

  4. Predictive Validity of an Empirical Approach for Selecting Promising Message Topics: A Randomized-Controlled Study

    Science.gov (United States)

    Lee, Stella Juhyun; Brennan, Emily; Gibson, Laura Anne; Tan, Andy S. L.; Kybert-Momjian, Ani; Liu, Jiaying; Hornik, Robert

    2016-01-01

    Several message topic selection approaches propose that messages based on beliefs pretested and found to be more strongly associated with intentions will be more effective in changing population intentions and behaviors when used in a campaign. This study aimed to validate the underlying causal assumption of these approaches which rely on cross-sectional belief–intention associations. We experimentally tested whether messages addressing promising themes as identified by the above criterion were more persuasive than messages addressing less promising themes. Contrary to expectations, all messages increased intentions. Interestingly, mediation analyses showed that while messages deemed promising affected intentions through changes in targeted promising beliefs, messages deemed less promising also achieved persuasion by influencing nontargeted promising beliefs. Implications for message topic selection are discussed. PMID:27867218

  5. 32 species validation of a new Illumina paired-end approach for the development of microsatellites.

    Directory of Open Access Journals (Sweden)

    Stacey L Lance

    Development and optimization of novel species-specific microsatellites, or simple sequence repeats (SSRs), remains an important step for studies in ecology, evolution, and behavior. Numerous approaches exist for identifying new SSRs that vary widely in terms of both time and cost investments. A recent approach of using paired-end Illumina sequence data in conjunction with the bioinformatics pipeline PAL_FINDER has the potential to substantially reduce the cost and labor investment while also improving efficiency. However, the approach does not appear to have been widely adopted, perhaps due to concerns over its broad applicability across taxa. Therefore, to validate the utility of the approach we developed SSRs for 32 species representing 30 families, 25 orders, 11 classes, and six phyla, and optimized SSRs for 13 of the species. Overall, the Illumina paired-end (IPE) method worked extremely well and we identified thousands of SSRs for all species (mean = 128,485), with 17% of loci being potentially amplifiable loci, and 25% of these met our most stringent criteria designed to avoid SSRs associated with repetitive elements. Approximately 61% of screened primers yielded strong amplification of a single locus.
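
    The core of SSR discovery, scanning sequence reads for short tandem repeats, can be expressed compactly with a backreference regex. A toy sketch (not the PAL_FINDER pipeline itself):

```python
import re

# Find perfect 2-6 bp motifs tandemly repeated at least 6 times.
SSR_RE = re.compile(r"([ACGT]{2,6}?)\1{5,}")

def find_ssrs(read: str):
    for m in SSR_RE.finditer(read):
        motif, span = m.group(1), m.span()
        n_repeats = (span[1] - span[0]) // len(motif)
        yield motif, n_repeats, span

read = "TTG" + "CA" * 8 + "GGATT" + "AG" * 7 + "CCT"
for motif, n_repeats, span in find_ssrs(read):
    print(f"motif={motif} x{n_repeats} at {span}")
```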

  6. The relative validity and reproducibility of an iron food frequency questionnaire for identifying iron-related dietary patterns in young women.

    Science.gov (United States)

    Beck, Kathryn L; Kruger, Rozanne; Conlon, Cathryn A; Heath, Anne-Louise M; Coad, Jane; Matthys, Christophe; Jones, Beatrix; Stonehouse, Welma

    2012-08-01

    Using food frequency data to identify dietary patterns is a newly emerging approach to assessing the relationship between dietary intake and iron status. Food frequency questionnaires should be assessed for validity and reproducibility before use. We aimed to investigate the relative validity and reproducibility of an iron food frequency questionnaire (FeFFQ) specifically designed to identify iron-related dietary patterns. Participants completed the FeFFQ at baseline (FeFFQ1) and 1 month later (FeFFQ2) to assess reproducibility. A 4-day weighed diet record (4DDR) was completed between these assessments to determine validity. Foods appearing in the 4DDR were classified into the same 144 food groupings as the FeFFQ. Factor analysis was used to determine dietary patterns from FeFFQ1, FeFFQ2, and the 4DDR. A convenience sample of women (n=115) aged 18 to 44 years living in Auckland, New Zealand, during 2009. Agreement between diet pattern scores was compared using correlation coefficients, Bland-Altman analysis, cross-classification, and the weighted κ statistic. A "healthy" and a "sandwich and drinks" dietary pattern were identified from all three dietary assessments. Correlation coefficients between FeFFQ1 and the 4DDR diet pattern scores (validity) were 0.34 for the healthy, and 0.62 for the sandwich and drinks pattern (both Ps<.001). The FeFFQ classified >50% of participants into the correct tertile and <10% into the opposite tertile for both the healthy and sandwich and drinks diet pattern scores when compared with the 4DDR and FeFFQ2. The FeFFQ appears to be a reproducible and relatively valid method for identifying dietary patterns, and could be used to investigate the relationship between dietary patterns and iron status. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
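
    Deriving dietary patterns from food-group frequencies is typically done with factor analysis; a minimal sketch with scikit-learn, using simulated intake data and two retained factors as in the study (the food groupings and loadings are invented):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_women, n_food_groups = 115, 144
freq = rng.poisson(2.0, (n_women, n_food_groups)).astype(float)  # intakes/week

X = StandardScaler().fit_transform(freq)
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
scores = fa.fit_transform(X)   # per-woman diet pattern scores
loadings = fa.components_      # (2, 144) food-group loadings

# Food groups loading strongly on factor 0 characterise that pattern.
top = np.argsort(-np.abs(loadings[0]))[:10]
print("top food groups for pattern 1:", top)
print("first participant's pattern scores:", scores[0])
```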

  7. Development and Initial Validation of the Need Satisfaction and Need Support at Work Scales: A Validity-Focused Approach

    Directory of Open Access Journals (Sweden)

    Susanne Tafvelin

    2018-01-01

    Although the relevance of employee need satisfaction and manager need support has been examined, the integration of self-determination theory (SDT) into work and organizational psychology has been hampered by the lack of validated measures. The purpose of the current study was to develop and validate measures of employees' perception of need satisfaction (NSa-WS) and need support (NSu-WS) at work that were grounded in SDT. We used three Swedish samples (total N = 1,430) to develop and validate our scales. We used a confirmatory approach including expert panels to assess item content relevance, confirmatory factor analysis for factorial validity, and associations with theoretically warranted outcomes to assess criterion-related validity. Scale reliability was also assessed. We found evidence of content, factorial, and criterion-related validity of our two scales of need satisfaction and need support at work. Further, the scales demonstrated high internal consistency. Our newly developed scales may be used in research and practice to further our understanding regarding how satisfaction and support of employee basic needs influence employee motivation, performance, and well-being. Our study makes a contribution to the current literature by providing (1) scales that are specifically designed for the work context, (2) an example of how expert panels can be used to assess content validity, and (3) testing of theoretically derived hypotheses that, although SDT is built on them, have not been examined before.

  8. Content Validity and Psychometric Properties of the Nomination Scale for Identifying Football Talent (NSIFT): Application to Coaches, Parents and Players

    Directory of Open Access Journals (Sweden)

    Alejandro Prieto-Ayuso

    2017-01-01

    The identification of football talent is a critical issue both for clubs and the families of players. However, despite its importance in a sporting, economic and social sense, there appears to be a lack of instruments that can reliably measure talent performance. The aim of this study was to design and validate the Nomination Scale for Identifying Football Talent (NSIFT), with the aim of optimising the processes for identifying said talent. The scale was first validated through expert judgment, and then statistically, by means of an exploratory factor analysis (EFA), confirmatory factor analysis (CFA), internal reliability and convergent validity. The results reveal the presence of three factors in the scale's factor matrix, with these results being confirmed by the CFA. The scale revealed suitable internal reliability and homogeneity indices. Convergent validity showed that it is teammates who are best able to identify football talent, followed by coaches and parents. It can be concluded that the NSIFT is suitable for use in the football world. Future studies should seek to confirm these results in different contexts by means of further CFAs.

  9. Identifying inhibitory compounds in lignocellulosic biomass hydrolysates using an exometabolomics approach

    NARCIS (Netherlands)

    Zha, Y.; Westerhuis, J.A.; Muilwijk, B.; Overkamp, K.M.; Nijmeijer, B.M.; Coulier, L.; Smilde, A.K.; Punt, P.J.

    2014-01-01

    Background: Inhibitors are formed that reduce the fermentation performance of fermenting yeast during the pretreatment process of lignocellulosic biomass. An exometabolomics approach was applied to systematically identify inhibitors in lignocellulosic biomass hydrolysates.Results: We studied the

  11. Approaches to Validation of Models for Low Gravity Fluid Behavior

    Science.gov (United States)

    Chato, David J.; Marchetta, Jeffery; Hochstein, John I.; Kassemi, Mohammad

    2005-01-01

    This paper details the authors' experiences with the validation of computer models to predict low gravity fluid behavior. It reviews the literature of low gravity fluid behavior as a starting point for developing a baseline set of test cases. It examines the authors' attempts to validate their models against these cases and the issues they encountered. The main issues seem to be that: most of the data is described by empirical correlation rather than fundamental relation; detailed measurements of the flow field have not been made; free surface shapes are observed but through thick plastic cylinders, and are therefore subject to a great deal of optical distortion; and heat transfer process time constants are on the order of minutes to days, but the zero-gravity time available has been only seconds.

  12. Using Cluster Ensemble and Validation to Identify Subtypes of Pervasive Developmental Disorders

    OpenAIRE

    Shen, Jess J.; Lee, Phil Hyoun; Holden, Jeanette J.A.; Shatkay, Hagit

    2007-01-01

    Pervasive Developmental Disorders (PDD) are neurodevelopmental disorders characterized by impairments in social interaction, communication and behavior. Given the diversity and varying severity of PDD, diagnostic tools attempt to identify homogeneous subtypes within PDD. Identifying subtypes can lead to targeted etiology studies and to effective type-specific intervention. Cluster analysis can suggest coherent subsets in data; however, different methods and assumptions lead to different resu...

  13. A Human Proximity Operations System test case validation approach

    Science.gov (United States)

    Huber, Justin; Straub, Jeremy

    A Human Proximity Operations System (HPOS) poses numerous risks in a real-world environment. These risks range from mundane tasks such as avoiding walls and fixed obstacles to the critical need to keep people and processes safe in the context of the HPOS's situation-specific decision making. Validating the performance of an HPOS, which must operate in a real-world environment, is an ill-posed problem due to the complexity introduced by erratic (non-computer) actors. In order to prove the HPOS's usefulness, test cases must be generated to simulate possible actions of these actors, so the HPOS can be shown to be able to perform safely in the environments where it will be operated. The HPOS must demonstrate its ability to be as safe as a human, across a wide range of foreseeable circumstances. This paper evaluates the use of test cases to validate HPOS performance and utility. It considers an HPOS's safe performance in the context of a common human activity, moving through a crowded corridor, and extrapolates (based on this) to the suitability of using test cases for AI validation in other areas of prospective application.

  14. Development and validation of a questionnaire to identify severe maternal morbidity in epidemiological surveys

    Directory of Open Access Journals (Sweden)

    Parpinelli Mary A

    2010-07-01

    Objective: to develop and validate a questionnaire on severe maternal morbidity and to evaluate the maternal recall of complications related to pregnancy and childbirth. Design: validity of a questionnaire as a diagnostic instrument. Setting: a third-level referral maternity in Campinas, Brazil. Population: 386 survivors of severe maternal complications and 123 women who delivered without major complications between 2002 and 2007. Methods: eligible women were traced and interviewed by telephone on the occurrence of obstetric complications and events related to their treatment. Their answers were compared with their medical records as the gold standard. Sensitivity, specificity and likelihood ratios, plus their corresponding 95% confidence intervals, were used as the main estimators of accuracy. Main outcomes: diagnosis of severe maternal morbidity associated with past pregnancies, including hemorrhage, eclampsia, infections, jaundice and related procedures (hysterectomy, admission to ICU, blood transfusion, laparotomy, inter-hospital transfer, mechanical ventilation and postpartum stay above seven days). Results: women did not accurately recall the occurrence of obstetric complications, especially hemorrhage and infection. The likelihood ratios were … Conclusion: process indicators are better recalled by women than obstetric complications and should be considered when applying a questionnaire on severe maternal morbidity.
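
    The accuracy estimators named above reduce to simple arithmetic over a 2x2 table of questionnaire answers against the medical-record gold standard; a minimal sketch with illustrative counts (not the study's data), including a log-method confidence interval for the positive likelihood ratio:

```python
import math

def diagnostic_accuracy(tp, fp, fn, tn):
    se = tp / (tp + fn)      # sensitivity
    sp = tn / (tn + fp)      # specificity
    lr_pos = se / (1 - sp)   # positive likelihood ratio
    lr_neg = (1 - se) / sp   # negative likelihood ratio
    # 95% CI for LR+ via the log method (Simel formula).
    log_se = math.sqrt(1/tp - 1/(tp + fn) + 1/fp - 1/(fp + tn))
    ci = (math.exp(math.log(lr_pos) - 1.96 * log_se),
          math.exp(math.log(lr_pos) + 1.96 * log_se))
    return se, sp, lr_pos, lr_neg, ci

# Hypothetical: recalled hemorrhage vs. hemorrhage in the medical record.
se, sp, lrp, lrn, ci = diagnostic_accuracy(tp=60, fp=20, fn=40, tn=120)
print(f"Se={se:.2f} Sp={sp:.2f} "
      f"LR+={lrp:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}) LR-={lrn:.2f}")
```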

  15. The validation of synthetic spectra used in the performance evaluation of radionuclide identifiers

    International Nuclear Information System (INIS)

    Flynn, A.; Boardman, D.; Reinhard, M.I.

    2013-01-01

    This work has evaluated synthetic gamma-ray spectra created by the RASE sampler using experimental data. The RASE sampler resamples experimental data to create large data libraries which are subsequently available for use in evaluation of radionuclide identification algorithms. A statistical evaluation of the synthetic energy bins has shown the variation to follow a Poisson distribution identical to experimental data. The minimum amount of statistics required in each base spectrum to ensure the subsequent use of the base spectrum in the generation of statistically robust synthetic data was determined. A requirement that the simulated acquisition time of the synthetic spectra was not more than 4% of the acquisition time of the base spectrum was also determined. Further validation of RASE was undertaken using two different radionuclide identification algorithms. - Highlights: • A validation of synthetic data created in order to evaluate radionuclide identification systems has been carried out. • Statistical analysis has shown that the data accurately represents experimental data. • A limit to the amount of data which could be created using this method was evaluated. • Analysis of the synthetic gamma spectra show identical results to analysis carried out with experimental data
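
    The resampling idea, drawing synthetic spectra whose bin counts are Poisson-distributed around a scaled high-statistics base spectrum, can be sketched generically as follows (an illustration of the principle, not the RASE sampler itself):

```python
import numpy as np

def resample_spectrum(base_counts, t_base, t_synth, rng):
    """Draw one synthetic spectrum from a high-statistics base spectrum.

    base_counts: per-energy-bin counts acquired over t_base seconds.
    t_synth: simulated acquisition time (kept well below t_base,
             e.g. <= 4% as the evaluation above requires).
    """
    expected = base_counts * (t_synth / t_base)  # scale to shorter acquisition
    return rng.poisson(expected)                 # Poisson noise per bin

rng = np.random.default_rng(42)
base = rng.poisson(10_000, size=1024)            # hypothetical 1024-bin base spectrum
library = [resample_spectrum(base, t_base=3600, t_synth=60, rng=rng)
           for _ in range(100)]                  # library of 100 synthetic spectra
print(library[0][:8])
```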

  16. Development and validation of a multi-locus DNA metabarcoding method to identify endangered species in complex samples.

    Science.gov (United States)

    Arulandhu, Alfred J; Staats, Martijn; Hagelaar, Rico; Voorhuijzen, Marleen M; Prins, Theo W; Scholtens, Ingrid; Costessi, Adalberto; Duijsings, Danny; Rechenmann, François; Gaspar, Frédéric B; Barreto Crespo, Maria Teresa; Holst-Jensen, Arne; Birck, Matthew; Burns, Malcolm; Haynes, Edward; Hochegger, Rupert; Klingl, Alexander; Lundberg, Lisa; Natale, Chiara; Niekamp, Hauke; Perri, Elena; Barbante, Alessandra; Rosec, Jean-Philippe; Seyfarth, Ralf; Sovová, Tereza; Van Moorleghem, Christoff; van Ruth, Saskia; Peelen, Tamara; Kok, Esther

    2017-10-01

    DNA metabarcoding provides great potential for species identification in complex samples such as food supplements and traditional medicines. Such a method would aid Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) enforcement officers to combat wildlife crime by preventing illegal trade of endangered plant and animal species. The objective of this research was to develop a multi-locus DNA metabarcoding method for forensic wildlife species identification and to evaluate the applicability and reproducibility of this approach across different laboratories. A DNA metabarcoding method was developed that makes use of 12 DNA barcode markers that have demonstrated universal applicability across a wide range of plant and animal taxa and that facilitate the identification of species in samples containing degraded DNA. The DNA metabarcoding method was developed based on Illumina MiSeq amplicon sequencing of well-defined experimental mixtures, for which a bioinformatics pipeline with user-friendly web-interface was developed. The performance of the DNA metabarcoding method was assessed in an international validation trial by 16 laboratories, in which the method was found to be highly reproducible and sensitive enough to identify species present in a mixture at 1% dry weight content. The advanced multi-locus DNA metabarcoding method assessed in this study provides reliable and detailed data on the composition of complex food products, including information on the presence of CITES-listed species. The method can provide improved resolution for species identification, while verifying species with multiple DNA barcodes contributes to an enhanced quality assurance. © The Authors 2017. Published by Oxford University Press.

  17. Validating the Modified Drug Adherence Work-Up (M-DRAW) Tool to Identify and Address Barriers to Medication Adherence

    Directory of Open Access Journals (Sweden)

    Sun Lee

    2017-09-01

    Barriers to medication adherence stem from multiple factors. An effective and convenient tool is needed to identify these barriers so that clinicians can provide a tailored, patient-centered consultation with patients. The Modified Drug Adherence Work-up Tool (M-DRAW) was developed as a 13-item checklist questionnaire to identify barriers to medication adherence. The response scale was a 4-point Likert scale of frequency of occurrence (1 = never to 4 = often). The checklist was accompanied by a GUIDE that provided corresponding motivational interview-based intervention strategies for each identified barrier. The current pilot study examined the psychometric properties of the M-DRAW checklist (reliability, responsiveness and discriminant validity) in patients taking one or more prescription medication(s) for chronic conditions. A cross-sectional sample of 26 patients was recruited between December 2015 and March 2016 at an academic medical center pharmacy in Southern California. A priming question that assessed self-reported adherence was used to separate participants into the control group of 17 “adherers” (65.4%) and the intervention group of nine “unintentional and intentional non-adherers” (34.6%). Comparable baseline characteristics were observed between the two groups. The M-DRAW checklist showed acceptable reliability (13 items; alpha = 0.74) for identifying factors and barriers leading to medication non-adherence. Discriminant validity of the tool and the priming question was established by the four-fold number of barriers to adherence identified within the self-selected intervention group compared to the control group (4.4 versus 1.2 barriers, p < 0.05). The current study did not investigate construct validity due to the small sample size and challenges in follow-up with patients. Future testing of the tool will include construct validation.

  18. Validity of a family-centered approach for assessing infants' social-emotional wellbeing and their developmental context: a prospective cohort study

    NARCIS (Netherlands)

    Hielkema, Margriet; De Winter, Andrea F.; Reijneveld, Sijmen A.

    2017-01-01

    Background: Family-centered care seems promising in preventive pediatrics, but evidence is lacking as to whether this type of care is also valid as a means to identify risks to infants' social-emotional development. We aimed to examine the validity of such a family-centered approach. Methods: We

  19. Identifying and prioritizing barriers to implementation of smart energy city projects in Europe: An empirical approach

    International Nuclear Information System (INIS)

    Mosannenzadeh, Farnaz; Di Nucci, Maria Rosaria; Vettorato, Daniele

    2017-01-01

    Successful implementation of smart energy city projects in Europe is crucial for a sustainable transition of urban energy systems and the improvement of quality of life for citizens. We aim to develop a systematic classification and analysis of the barriers hindering successful implementation of smart energy city projects. Through an empirical approach, we investigated 43 communities implementing smart and sustainable energy city projects under the Sixth and Seventh Framework Programmes of the European Union. We identified 35 barriers, validated through literature review, categorized into policy, administrative, legal, financial, market, environmental, technical, social, and information-and-awareness dimensions. We prioritized these barriers using a novel multi-dimensional methodology that simultaneously analyses barriers based on frequency, level of impact, causal relationship among barriers, origin, and scale. The results indicate that the key barriers are lacking or fragmented long-term political support at the policy level, and, at the project level, lack of good cooperation and acceptance among project partners, insufficient external financial support, lack of skilled and trained personnel, and fragmented ownership. The outcome of the research should aid policy-makers to better understand and prioritize implementation barriers in order to develop effective action and policy interventions towards more successful implementation of smart energy city projects. - Highlights: • A solid empirical study on the implementation of European smart energy city projects. • We found 35 barriers in nine dimensions, e.g. policy, legal, financial, and social. • We suggested a new multi-dimensional methodology to prioritize barriers. • Lacking or fragmented long-term political support is a key barrier. • We provided insights for action for project coordinators and policy makers.

  20. Gene-based Association Approach Identifies Genes Across Stress Traits in Fruit Flies

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Edwards, Stefan McKinnon; Sarup, Pernille Merete

    Identification of genes explaining variation in quantitative traits or genetic risk factors of human diseases requires both good phenotypic and genotypic data, but also efficient statistical methods. Genome-wide association studies may reveal association between phenotypic variation and variation… approach grouping variants according to gene position, thus lowering the number of statistical tests performed and increasing the probability of identifying genes with small to moderate effects. Using this approach we identify numerous genes associated with different types of stresses in Drosophila… melanogaster, but also identify common genes that affect the stress traits…

  1. Validation of MIMGO: a method to identify differentially expressed GO terms in a microarray dataset

    OpenAIRE

    Yamada, Yoichi; Sawada, Hiroki; Hirotani, Ken-ichi; Oshima, Masanobu; Satou, Kenji

    2012-01-01

    Background: We previously proposed an algorithm for the identification of GO terms that commonly annotate genes whose expression is upregulated or downregulated in some microarray data compared with other microarray data. We call these “differentially expressed GO terms” and have named the algorithm “matrix-assisted identification method of differentially expressed GO terms” (MIMGO). MIMGO can also identify microarray data in which genes annotated with a differentially expressed GO...

  2. Validation of standard operating procedures in a multicenter retrospective study to identify -omics biomarkers for chronic low back pain.

    Directory of Open Access Journals (Sweden)

    Concetta Dagostino

    Chronic low back pain (CLBP) is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for huge societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine). Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for recruitment of large numbers of participants among different centres (clinical and laboratory) to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs) for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres) retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1) blood collection, (2) sample processing and storage, (3) shipping details and (4) cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicenter study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large -omics-based multicenter studies.

  3. New approach for validating the segmentation of 3D data applied to individual fibre extraction

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2017-01-01

    We present two approaches for validating the segmentation of 3D data. The first approach consists of comparing the amount of estimated material to a value provided by the manufacturer. The second approach consists of comparing the segmented results to those obtained from imaging modalities

  4. Validity of various epidemiological approaches to assessing radon health risk

    International Nuclear Information System (INIS)

    Conrath, S.M.

    1990-01-01

    In this paper various epidemiologic study designs are defined and evaluated for their utility in assessing radon health risk. The strengths and limitations of these approaches are addressed. Common pitfalls and errors of epidemiologic method are delineated, with examples of causes and remedies.

  5. Outbreaks source: A new mathematical approach to identify their possible location

    Science.gov (United States)

    Buscema, Massimo; Grossi, Enzo; Breda, Marco; Jefferson, Tom

    2009-11-01

    Classical epidemiology has generally relied on the description and explanation of the occurrence of infectious diseases in relation to the time of occurrence of events rather than to the place of occurrence. In recent times, computer-generated dot maps have facilitated the modeling of the spread of infectious epidemic diseases, either with classical statistics approaches or with artificial “intelligent systems”. Few attempts, however, have been made so far to identify the origin of the epidemic spread rather than its evolution by mathematical topology methods. We report on the use of a new artificial intelligence method (the H-PST Algorithm) and we compare this new technique with other well-known algorithms to identify the source of three examples of infectious disease outbreaks derived from the literature. The H-PST algorithm is a new system able to project a distance matrix of points (events) into a bi-dimensional space, with the generation of a new point, named the hidden unit. This new hidden unit deforms the original Euclidean space and transforms it into a new space (cognitive space). The cost function of this transformation is the minimization of the differences between the original distance matrix among the assigned points and the distance matrix of the same points projected into the bi-dimensional map (or any different set of constraints). For many reasons we will discuss, the position of the hidden unit turns out to target the outbreak source in many epidemics much better than the other classic algorithms specifically targeted for this task. Compared with the main algorithms known in location theory, the hidden unit was within yards of the outbreak source in the first example (the 2007 epidemic of Chikungunya fever in Italy). The hidden unit was located in the river between the two village epicentres of the spread, exactly where the index case was living. Equally in the second (the 1967 foot and mouth disease epidemic in England), and the third (1854 London Cholera epidemic
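
    The abstract benchmarks the hidden unit against classic location-theory estimators. The most common of these, the geometric median (the point minimising the summed distance to all case locations), makes a simple baseline source estimate; a minimal Weiszfeld-iteration sketch on simulated case coordinates:

```python
import numpy as np

def geometric_median(points, n_iter=200, eps=1e-9):
    """Weiszfeld iterations: the classic location-theory point estimate."""
    x = points.mean(axis=0)
    for _ in range(n_iter):
        d = np.linalg.norm(points - x, axis=1)
        w = 1.0 / np.maximum(d, eps)              # inverse-distance weights
        x_new = (points * w[:, None]).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < eps:
            break
        x = x_new
    return x

rng = np.random.default_rng(3)
true_source = np.array([2.0, 1.0])
cases = true_source + rng.normal(0, 0.8, (60, 2))  # simulated case locations
print("estimated source:", geometric_median(cases))
```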

  6. Approaches to Validate and Manipulate RNA Targets with Small Molecules in Cells.

    Science.gov (United States)

    Childs-Disney, Jessica L; Disney, Matthew D

    2016-01-01

    RNA has become an increasingly important target for therapeutic interventions and for chemical probes that dissect and manipulate its cellular function. Emerging targets include human RNAs that have been shown to directly cause cancer, metabolic disorders, and genetic disease. In this review, we describe various routes to obtain bioactive compounds that target RNA, with a particular emphasis on the development of small molecules. We use these cases to describe approaches that are being developed for target validation, which include target-directed cleavage, classic pull-down experiments, and covalent cross-linking. Thus, tools are available to design small molecules to target RNA and to identify the cellular RNAs that are their targets.

  7. [Development and validation of an algorithm to identify cancer recurrences from hospital databases].

    Science.gov (United States)

    Manzanares-Laya, S; Burón, A; Murta-Nascimento, C; Servitja, S; Castells, X; Macià, F

    2014-01-01

    Hospital cancer registries and hospital databases are valuable and efficient sources of information for research into cancer recurrences. The aim of this study was to develop and validate algorithms for the detection of breast cancer recurrence. A retrospective observational study was conducted on breast cancer cases from the cancer registry of a third-level university hospital diagnosed between 2003 and 2009. Different probable cancer recurrence algorithms were obtained by linking the hospital databases and constructing several operational definitions, with their corresponding sensitivity, specificity, positive predictive value and negative predictive value. A total of 1,523 patients were diagnosed with breast cancer between 2003 and 2009. A request for bone gammagraphy more than 6 months after the first oncological treatment showed the highest sensitivity (53.8%) and negative predictive value (93.8%), and a pathology test more than 6 months after the diagnosis showed the highest specificity (93.8%) and negative predictive value (92.6%). The combination of different definitions increased the specificity and the positive predictive value, but decreased the sensitivity. Several diagnostic algorithms were obtained, and the different definitions could be useful depending on the interest and resources of the researcher. A higher positive predictive value could be interesting for a quick estimation of the number of cases, and a higher negative predictive value for a more exact estimation if more resources are available. It is a versatile and adaptable tool for other types of tumors, as well as for the needs of the researcher. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  8. The Clinical Validation of the Athlete Sleep Screening Questionnaire: an Instrument to Identify Athletes that Need Further Sleep Assessment.

    Science.gov (United States)

    Bender, Amy M; Lawson, Doug; Werthner, Penny; Samuels, Charles H

    2018-06-04

    Previous research has established that general sleep screening questionnaires are not valid and reliable in an athlete population. The Athlete Sleep Screening Questionnaire (ASSQ) was developed to address this need. While the initial validation of the ASSQ has been established, its clinical validity had yet to be determined. The main objective of the current study was to evaluate the clinical validity of the ASSQ. Canadian National Team athletes (N = 199; mean age 24.0 ± 4.2 years, 62% females; from 23 sports) completed the ASSQ. A subset of athletes (N = 46) was randomized to the clinical validation sub-study, which required subjects to complete the ASSQ at times 2 and 3 and to have a clinical sleep interview with a sleep medicine physician (SMP), who rated each subject's category of clinical sleep problem and provided recommendations to improve sleep. To assess clinical validity, the SMP category of clinical sleep problem was compared to the ASSQ. The internal consistency (Cronbach's alpha = 0.74) and test-retest reliability (r = 0.86) of the ASSQ were acceptable. The ASSQ demonstrated good agreement with the SMP (Cohen's kappa = 0.84), which yielded a diagnostic sensitivity of 81%, specificity of 93%, positive predictive value of 87%, and negative predictive value of 90%. In total, 25.1% of athletes were identified as having clinically relevant sleep disturbances that required further clinical sleep assessment. Sleep improved from baseline at time 1 to after the recommendations at time 3. Sleep screening athletes with the ASSQ provides a method of accurately determining which athletes would benefit from preventative measures and which athletes suffer from clinically significant sleep problems. The process of sleep screening athletes and providing recommendations improves sleep and offers a clinical intervention output that is simple and efficient for teams and athletes to implement.
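
    The agreement statistic used here, Cohen's kappa between the questionnaire classification and the physician's rating, is a one-liner with scikit-learn (illustrative labels, not the study data):

```python
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical categories: 0 = no sleep problem, 1 = clinically relevant problem.
assq_category     = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
physician_rating  = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0]

kappa = cohen_kappa_score(assq_category, physician_rating)
print(f"Cohen's kappa = {kappa:.2f}")
print(confusion_matrix(physician_rating, assq_category))  # rows: physician
```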

  9. Validating the Modified Drug Adherence Work-Up (M-DRAW) Tool to Identify and Address Barriers to Medication Adherence.

    Science.gov (United States)

    Lee, Sun; Bae, Yuna H; Worley, Marcia; Law, Anandi

    2017-09-08

    Barriers to medication adherence stem from multiple factors. An effective and convenient tool is needed to identify these barriers so that clinicians can provide a tailored, patient-centered consultation with patients. The Modified Drug Adherence Work-up Tool (M-DRAW) was developed as a 13-item checklist questionnaire to identify barriers to medication adherence. The response scale was a 4-point Likert scale of frequency of occurrence (1 = never to 4 = often). The checklist was accompanied by a GUIDE that provided corresponding motivational interview-based intervention strategies for each identified barrier. The current pilot study examined the psychometric properties of the M-DRAW checklist (reliability, responsiveness and discriminant validity) in patients taking one or more prescription medication(s) for chronic conditions. A cross-sectional sample of 26 patients was recruited between December 2015 and March 2016 at an academic medical center pharmacy in Southern California. A priming question that assessed self-reported adherence was used to separate participants into the control group of 17 "adherers" (65.4%), and into the intervention group of nine "unintentional and intentional non-adherers" (34.6%). Comparable baseline characteristics were observed between the two groups. The M-DRAW checklist showed acceptable reliability (13 item; alpha = 0.74) for identifying factors and barriers leading to medication non-adherence. Discriminant validity of the tool and the priming question was established by the four-fold number of barriers to adherence identified within the self-selected intervention group compared to the control group (4.4 versus 1.2 barriers, p tool will include construct validation.

  10. The Laboratory-Based Intermountain Validated Exacerbation (LIVE) Score Identifies Chronic Obstructive Pulmonary Disease Patients at High Mortality Risk

    Directory of Open Access Journals (Sweden)

    Denitza P. Blagev

    2018-06-01

    Background: Identifying COPD patients at high risk for mortality or healthcare utilization remains a challenge. A robust system for identifying high-risk COPD patients using Electronic Health Record (EHR) data would empower targeting interventions aimed at ensuring guideline compliance and multimorbidity management. The purpose of this study was to empirically derive, validate, and characterize subgroups of COPD patients based on routinely collected clinical data widely available within the EHR. Methods: Cluster analysis was used in 5,006 patients with COPD at Intermountain to identify clusters based on a large collection of clinical variables. Recursive Partitioning (RP) was then used to determine a preferred tree that assigned patients to clusters based on a parsimonious variable subset. The mortality, COPD exacerbations, and comorbidity profile of the identified groups were examined. The findings were validated in an independent Intermountain cohort and in external cohorts from the United States Veterans Affairs (VA) and University of Chicago Medicine systems. Measurements and Main Results: The RP algorithm identified five LIVE Scores based on laboratory values: albumin, creatinine, chloride, potassium, and hemoglobin. The groups were characterized by increasing risk of mortality. The lowest-risk LIVE Score 5 had 8% 4-year mortality vs. 56% in the highest-risk LIVE Score 1 (p < 0.001). These findings were validated in the VA cohort (n = 83,134), an expanded Intermountain cohort (n = 48,871) and in the University of Chicago system (n = 3,236). Higher mortality groups also had higher COPD exacerbation rates and comorbidity rates. Conclusions: In large clinical datasets across different organizations, the LIVE Score utilizes existing laboratory data for COPD patients, and may be used to stratify risk for mortality and COPD exacerbations.
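
    The recursive-partitioning step, assigning patients to risk groups from a handful of routine laboratory values, can be reproduced mechanically with a shallow decision tree. A toy sketch with scikit-learn on simulated labs and arbitrary group labels (not Intermountain data, so the resulting splits are meaningless except as illustration):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
n = 5006
# Simulated routine labs: albumin, creatinine, chloride, potassium, hemoglobin.
X = np.column_stack([
    rng.normal(3.8, 0.5, n),   # albumin (g/dL)
    rng.normal(1.0, 0.4, n),   # creatinine (mg/dL)
    rng.normal(102, 4, n),     # chloride (mmol/L)
    rng.normal(4.2, 0.5, n),   # potassium (mmol/L)
    rng.normal(13.5, 1.8, n),  # hemoglobin (g/dL)
])
cluster = rng.integers(1, 6, n)  # stand-in for the derived risk groups 1..5

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=100)
tree.fit(X, cluster)
print(export_text(tree, feature_names=[
    "albumin", "creatinine", "chloride", "potassium", "hemoglobin"]))
```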

  11. An Integrated Approach to Establish Validity and Reliability of Reading Tests

    Science.gov (United States)

    Razi, Salim

    2012-01-01

    This study presents the process of developing and establishing the reliability and validity of a reading test by administering an integrative approach, as conventional reliability and validity measures only superficially reveal the difficulty of a reading test. In this respect, analysing the vocabulary frequency of the test is regarded as a more eligible way…

  12. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  13. Innovative Approach to Validation of Ultraviolet (UV) Reactors for Disinfection in Drinking Water Systems

    Science.gov (United States)

    Slide presentation at Conference: ASCE 7th Civil Engineering Conference in the Asian Region. USEPA, in partnership with the Cadmus Group, Carollo Engineers, and other State & Industry collaborators, is evaluating new approaches for validating UV reactors to meet groundwater & sur...

  14. How to Identify High-Risk APS Patients: Clinical Utility and Predictive Values of Validated Scores.

    Science.gov (United States)

    Oku, Kenji; Amengual, Olga; Yasuda, Shinsuke; Atsumi, Tatsuya

    2017-08-01

    Antiphospholipid syndrome (APS) is a clinical disorder characterised by thrombosis and/or pregnancy morbidity in the persistence of antiphospholipid (aPL) antibodies that are pathogenic and have pro-coagulant activities. Thrombosis in APS tends to recur and requires prophylaxis; however, the stereotypical treatment of APS patients is inadequate, and stratification of thrombotic risk is important, as aPL are prevalently observed in various diseases and in the elderly population. It has previously been shown that multiple positive aPL or high-titre aPL correlate with thrombotic events. To advance the stratification of thrombotic risk in APS patients and to analyse those risks quantitatively, the antiphospholipid score (aPL-S) and the Global Anti-phospholipid Syndrome Score (GAPSS) were defined. These scores were derived from large patient cohort data, and either a detailed aPL profile (aPL-S) or a simplified aPL profile combined with classical thrombotic risk factors (GAPSS) was put into a scoring system. Both the aPL-S and GAPSS have shown a degree of accuracy in identifying high-risk APS patients, especially those at a high risk of thrombosis. However, there are several areas requiring improvement, or at least that clinicians should be aware of, before these instruments are applied in clinical practice. One such issue is standardisation of the aPL tests, including general testing of phosphatidylserine-dependent antiprothrombin antibodies (aPS/PT). Additionally, clinicians may need to be aware of the patient's medical history, particularly with respect to the incidence of SLE, which influences the cutoff value for identifying high-risk patients.

  15. Validity of the Neuromuscular Recovery Scale: a measurement model approach.

    Science.gov (United States)

    Velozo, Craig; Moorhouse, Michael; Ardolino, Elizabeth; Lorenz, Doug; Suter, Sarah; Basso, D Michele; Behrman, Andrea L

    2015-08-01

    To determine how well the Neuromuscular Recovery Scale (NRS) items fit the Rasch, 1-parameter, partial-credit measurement model. Confirmatory factor analysis (CFA) and principal components analysis (PCA) of residuals were used to determine dimensionality. The Rasch, 1-parameter, partial-credit rating scale model was used to determine rating scale structure, person/item fit, point-measure item correlations, item discrimination, and measurement precision. Seven NeuroRecovery Network clinical sites. Outpatients (N=188) with spinal cord injury. Not applicable. NRS. While the NRS met 1 of 3 CFA criteria, the PCA revealed that the Rasch measurement dimension explained 76.9% of the variance. Ten of 11 items and 91% of the patients fit the Rasch model, with 9 of 11 items showing high discrimination. Sixty-nine percent of the ratings met criteria. The items showed a logical item-difficulty order, with Stand retraining as the easiest item and Walking as the most challenging item. The NRS showed no ceiling or floor effects and separated the sample into almost 5 statistically distinct strata; individuals with an American Spinal Injury Association Impairment Scale (AIS) D classification showed the most ability, and those with an AIS A classification showed the least ability. Items not meeting the rating scale criteria appear to be related to the low frequency counts. The NRS met many of the Rasch model criteria for construct validity. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
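
    For intuition, the Rasch model places persons and items on a single logit scale, with P(success) = exp(theta - b) / (1 + exp(theta - b)) for person ability theta and item difficulty b. A minimal joint-estimation sketch for the dichotomous case (the study used the partial-credit extension, which adds per-item step thresholds; the data here are simulated):

```python
import numpy as np

def fit_rasch(X, n_iter=1000, lr=0.5):
    """Dichotomous Rasch model fitted by gradient ascent on the joint
    likelihood. X: (n_persons, n_items) matrix of 0/1 responses.
    Note: persons scoring all-0 or all-1 drift toward +/- infinity;
    real implementations treat such extreme scores separately."""
    theta = np.zeros(X.shape[0])  # person abilities
    b = np.zeros(X.shape[1])      # item difficulties
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        resid = X - p                     # observed minus expected
        theta += lr * resid.mean(axis=1)  # averaged gradients for stable steps
        b -= lr * resid.mean(axis=0)
        b -= b.mean()                     # identifiability constraint
    return theta, b

rng = np.random.default_rng(0)
true_theta = rng.normal(0, 1, 188)   # 188 outpatients, as in the study
true_b = np.linspace(-2, 2, 11)      # 11 items of increasing difficulty
P = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random(P.shape) < P).astype(float)
theta_hat, b_hat = fit_rasch(X)
print("recovered item difficulty order:", np.argsort(b_hat))
```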

  16. [Is psychoprophylaxis a valid approach for heart surgery in children?].

    Science.gov (United States)

    Pereira Ruschel, P; Pierini Cidade, D; Daudt, N S; Rossi Filho, R I

    1995-10-01

    To validate the hypothesis that psychological preparation of children who will undergo cardiac surgery may improve the outcome. Sixty patients, with ages ranging between 3 and 10 years, submitted to heart surgery for treatment of congenital heart defects, were evaluated. They were divided into 2 groups: experimental and control. A questionnaire was designed for collecting data about psychological and clinical aspects of each patient. The following data were found to be statistically significant: acceptance of peripheral vein puncture in the surgical group (chi 2 = 11.59, p < 0.05), calm awakening following general anesthesia (chi 2 = 9.64, p < 0.05), cooperation with the physiotherapy staff (chi 2 = 13.30, p < 0.05), coping with parents' absence (chi 2 = 9.64, p < 0.05), acceptance of fluid restriction (chi 2 = 17.78, p < 0.05) and cooperation with removal of stitches and pacemaker electrodes (chi 2 = 19.20, p < 0.05). There was no statistically significant difference in demand for sedation, cooperation at removal of the orotracheal tube and during examination, necessity of reintubation, or occurrence of clinical complications. However, the prepared group showed a slight tendency to have fewer postoperative complications (20%) than the control (27%). It was found that children who had adequate psychological preparation prior to the correction of congenital heart defects coped better psychologically with the imposed trauma.

  17. The node-weighted Steiner tree approach to identify elements of cancer-related signaling pathways.

    Science.gov (United States)

    Sun, Yahui; Ma, Chenkai; Halgamuge, Saman

    2017-12-28

    Cancer constitutes a momentous health burden in our society. Critical information on cancer may be hidden in its signaling pathways. However, even though a large amount of money has been spent on cancer research, some critical information on cancer-related signaling pathways still remains elusive. Hence, new work towards a complete understanding of cancer-related signaling pathways will greatly benefit the prevention, diagnosis, and treatment of cancer. We propose the node-weighted Steiner tree approach to identify important elements of cancer-related signaling pathways at the level of proteins. This new approach has advantages over previous approaches since it is fast in processing large protein-protein interaction networks. We apply this new approach to identify important elements of two well-known cancer-related signaling pathways: PI3K/Akt and MAPK. First, we generate a node-weighted protein-protein interaction network using protein and signaling pathway data. Second, we modify and use two preprocessing techniques and a state-of-the-art Steiner tree algorithm to identify a subnetwork in the generated network. Third, we propose two new metrics to select important elements from this subnetwork. On a commonly used personal computer, this new approach takes less than 2 s to identify the important elements of the PI3K/Akt and MAPK signaling pathways in a large node-weighted protein-protein interaction network with 16,843 vertices and 1,736,922 edges. We further analyze and demonstrate the significance of these identified elements to cancer signal transduction by exploring previously reported experimental evidence. Our node-weighted Steiner tree approach is shown to be both fast and effective at identifying important elements of cancer-related signaling pathways. Furthermore, it may provide new perspectives into the identification of signaling pathways for other human diseases.
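
    A standard way to attack the node-weighted variant is to fold node weights into edge weights and then run an edge-weighted Steiner-tree approximation. The sketch below shows this generic reduction with networkx on an invented toy interaction network (it is not the authors' algorithm or data; the gene names are for flavour only):

```python
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

# Toy protein-protein interaction network with node weights
# (lower weight = stronger prior relevance to the pathway).
node_w = {"EGFR": 1, "GRB2": 2, "SOS1": 2, "RAS": 1, "RAF1": 1,
          "MEK1": 1, "ERK2": 1, "AKT1": 1, "PIK3CA": 2, "PTEN": 3}
G = nx.Graph()
G.add_nodes_from(node_w)
G.add_edges_from([("EGFR", "GRB2"), ("GRB2", "SOS1"), ("SOS1", "RAS"),
                  ("RAS", "RAF1"), ("RAF1", "MEK1"), ("MEK1", "ERK2"),
                  ("EGFR", "PIK3CA"), ("PIK3CA", "AKT1"), ("PTEN", "AKT1"),
                  ("RAS", "PIK3CA")])

# Reduction: each edge inherits the average of its endpoints' node weights.
for u, v in G.edges:
    G[u][v]["weight"] = (node_w[u] + node_w[v]) / 2

terminals = ["EGFR", "ERK2", "AKT1"]         # proteins that must be connected
T = steiner_tree(G, terminals, weight="weight")
print("subnetwork nodes:", sorted(T.nodes))  # terminals plus Steiner nodes
```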

  18. Validation of the ITS2 region as a novel DNA barcode for identifying medicinal plant species.

    Science.gov (United States)

    Chen, Shilin; Yao, Hui; Han, Jianping; Liu, Chang; Song, Jingyuan; Shi, Linchun; Zhu, Yingjie; Ma, Xinye; Gao, Ting; Pang, Xiaohui; Luo, Kun; Li, Ying; Li, Xiwen; Jia, Xiaocheng; Lin, Yulin; Leon, Christine

    2010-01-07

    The plant working group of the Consortium for the Barcode of Life recommended the two-locus combination of rbcL+matK as the plant barcode, yet the combination was shown to successfully discriminate among 907 samples from 550 species at the species level with a probability of 72%. The group admits that the two-locus barcode is far from perfect due to the low identification rate, and the search is not over. Here, we compared seven candidate DNA barcodes (psbA-trnH, matK, rbcL, rpoC1, ycf5, ITS2, and ITS) from medicinal plant species. Our ranking criteria included PCR amplification efficiency, differential intra- and inter-specific divergences, and the DNA barcoding gap. Our data suggest that the second internal transcribed spacer (ITS2) of nuclear ribosomal DNA represents the most suitable region for DNA barcoding applications. Furthermore, we tested the discrimination ability of ITS2 in more than 6600 plant samples belonging to 4800 species from 753 distinct genera and found that the rate of successful identification with the ITS2 was 92.7% at the species level. The ITS2 region can be potentially used as a standard DNA barcode to identify medicinal plants and their closely related species. We also propose that ITS2 can serve as a novel universal barcode for the identification of a broader range of plant taxa.

  19. GPM ground validation via commercial cellular networks: an exploratory approach

    Science.gov (United States)

    Rios Gaona, Manuel Felipe; Overeem, Aart; Leijnse, Hidde; Brasjen, Noud; Uijlenhoet, Remko

    2016-04-01

    The suitability of commercial microwave link networks for ground validation of GPM (Global Precipitation Measurement) data is evaluated here. Two state-of-the-art rainfall products are compared over the land surface of the Netherlands for a period of 7 months, i.e., rainfall maps from commercial cellular communication networks and Integrated Multi-satellite Retrievals for GPM (IMERG). Commercial microwave link networks are nowadays the core component in telecommunications worldwide. Rainfall rates can be retrieved from measurements of attenuation between transmitting and receiving antennas. If adequately set up, these networks enable rainfall monitoring tens of meters above the ground at high spatiotemporal resolutions (temporal sampling of seconds to tens of minutes, and spatial sampling of hundreds of meters to tens of kilometers). The GPM mission is the successor of TRMM (the Tropical Rainfall Measurement Mission). For two years now, IMERG has offered rainfall estimates across the globe (180°W - 180°E and 60°N - 60°S) at spatiotemporal resolutions of 0.1° x 0.1° every 30 min. These two data sets are compared against a Dutch gauge-adjusted radar data set, considered to be the ground truth given its accuracy, spatiotemporal resolution and availability. The suitability of microwave link networks for satellite rainfall evaluation is of special interest given the independent character of this technique, its high spatiotemporal resolutions and its availability. These are valuable assets for water management and the modeling of floods, landslides, and weather extremes, especially in places where rain gauge networks are scarce or poorly maintained, or where weather radar networks are too expensive to acquire and/or maintain.
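
    Rainfall retrieval from a link rests on the near power-law relation between specific attenuation k (dB/km) and rain rate R (mm/h), R = a·k^b. A minimal sketch (the coefficients, wet-antenna correction and signal levels below are illustrative; in practice a and b depend on the link's frequency and polarization):

```python
def rain_rate_from_link(rsl_db, baseline_db, length_km,
                        a=2.5, b=1.0, wet_antenna_db=1.0):
    """Estimate path-average rain rate (mm/h) from one link measurement.

    rsl_db: received signal level during the interval (dB).
    baseline_db: dry-weather reference level (dB).
    a, b: power-law coefficients (hypothetical, frequency-dependent values).
    """
    attenuation = max(baseline_db - rsl_db - wet_antenna_db, 0.0)
    k = attenuation / length_km  # specific attenuation (dB/km)
    return a * k ** b            # R = a * k^b

# A 3 km link whose signal dropped 9 dB below its dry baseline:
print(f"{rain_rate_from_link(rsl_db=-54.0, baseline_db=-45.0, length_km=3.0):.1f} mm/h")
```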

  20. Development and validation of a screening procedure to identify speech-language delay in toddlers with cleft palate

    DEFF Research Database (Denmark)

    Jørgensen, Line Dahl; Willadsen, Elisabeth

    2017-01-01

The purpose of this study was to develop and validate a clinically useful speech-language screening procedure for young children with cleft palate +/- cleft lip (CP) to identify those in need of speech-language intervention. Twenty-two children with CP were assigned to a +/- need for intervention condition based on assessment of consonant inventory using a real-time listening procedure in combination with parent-reported expressive vocabulary. These measures allowed evaluation of early speech-language skills found to correlate significantly with later speech-language difficulties in longitudinal studies of children with CP. The external validity of this screening procedure was evaluated by comparing the +/- need for intervention assignment determined by the screening procedure to experienced speech-language pathologists' (SLPs') clinical judgment of whether or not a child needed early...

  1. Development of the methodology and approaches to validate safety and accident management

    International Nuclear Information System (INIS)

    Asmolov, V.G.

    1997-01-01

The article compares the development of the methodology and approaches used to validate nuclear power plant safety and accident management in Russia and in advanced industrial countries. It demonstrates that the development of safety validation methods is dialectically related to the accumulation of a knowledge base on processes and events during NPP normal operation, transients, and emergencies, including severe accidents. The article describes the Russian severe accident research program (1987-1996), whose implementation allowed Russia to reach the world level of safety validation efforts, and presents future high-priority study areas. Problems related to possible approaches to methodological accident management development are discussed. (orig.)

  2. Real-time process signal validation based on neuro-fuzzy and possibilistic approach

    International Nuclear Information System (INIS)

    Figedy, S.; Fantoni, P.F.; Hoffmann, M.

    2001-01-01

Real-time process signal validation is an application field where the use of fuzzy logic and Artificial Neural Networks can improve the diagnostics of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region where the validation process is to be performed. The possibilistic approach allows fast detection of unforeseen plant conditions. Specialized Artificial Neural Networks are used, one for each fuzzy cluster. This offers two main advantages: accuracy and generalization capability are increased compared to a single network working over the entire operating region, and the ability to identify abnormal conditions, where the system cannot operate with satisfactory accuracy, is improved. The system analyzes signals, such as the readings of process monitoring sensors, computes their expected values, and raises an alert if actual values deviate from the expected ones by more than the allowed limits. A reliability level for the current analysis is also produced. The model has been tested on simulated data from a PWR-type nuclear power plant, monitoring safety-related reactor variables over the entire power-flow operating map, and has been installed under real conditions at a BWR nuclear reactor. (Authors)

  3. Validation of a search strategy to identify nutrition trials in PubMed using the relative recall method.

    Science.gov (United States)

    Durão, Solange; Kredo, Tamara; Volmink, Jimmy

    2015-06-01

    To develop, assess, and maximize the sensitivity of a search strategy to identify diet and nutrition trials in PubMed using relative recall. We developed a search strategy to identify diet and nutrition trials in PubMed. We then constructed a gold standard reference set to validate the identified trials using the relative recall method. Relative recall was calculated by dividing the number of references from the gold standard our search strategy identified by the total number of references in the gold standard. Our gold standard comprised 298 trials, derived from 16 included systematic reviews. The initial search strategy identified 242 of 298 references, with a relative recall of 81.2% [95% confidence interval (CI): 76.3%, 85.5%]. We analyzed titles and abstracts of the 56 missed references for possible additional terms. We then modified the search strategy accordingly. The relative recall of the final search strategy was 88.6% (95% CI: 84.4%, 91.9%). We developed a search strategy to identify diet and nutrition trials in PubMed with a high relative recall (sensitivity). This could be useful for establishing a nutrition trials register to support the conduct of future research, including systematic reviews. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
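
The relative recall arithmetic is simple enough to show directly. A minimal sketch reproducing the 242-of-298 figure; the Wilson score interval used here is an assumption, since the abstract does not state which interval method the authors applied:

```python
# Minimal sketch of the relative recall calculation: the share of
# gold-standard references retrieved by the search strategy, with a
# Wilson score interval (an assumed CI method, which approximates but
# may not exactly reproduce the interval reported in the abstract).
from math import sqrt

def relative_recall(found, total, z=1.96):
    p = found / total
    denom = 1 + z**2 / total
    centre = p + z**2 / (2 * total)
    margin = z * sqrt(p * (1 - p) / total + z**2 / (4 * total**2))
    return p, (centre - margin) / denom, (centre + margin) / denom

# Numbers from the abstract: 242 of 298 gold-standard trials retrieved
p, lo, hi = relative_recall(242, 298)
print(f"recall = {p:.1%} (95% CI {lo:.1%}, {hi:.1%})")  # ~81.2%
```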

  4. The CARPEDIEM Algorithm: A Rule-Based System for Identifying Heart Failure Phenotype with a Precision Public Health Approach

    Directory of Open Access Journals (Sweden)

    Michela Franchini

    2018-01-01

Modern medicine remains dependent on the accurate evaluation of a patient's health state, recognizing that disease is a process that evolves over time and interacts with many factors unique to that patient. The CARPEDIEM project represents a concrete attempt to address these issues by developing reproducible algorithms to support accuracy in the detection of complex diseases. This study aims to establish and validate the CARPEDIEM approach and algorithm for identifying patients presenting with, or at risk of, heart failure (HF) by studying 153,393 subjects in Italy; the approach is based on administrative patient information flow databases and does not rely on the electronic health record to accomplish its goals. The resulting algorithm was validated in a two-stage process, comparing predicted results with (1) HF diagnoses as identified by general practitioners (GPs) among the reference cohort and (2) HF diagnoses as identified by cardiologists within a randomly sampled subpopulation of 389 patients. The sources of data used to detect HF cases are numerous and were standardized for this study. The accuracy and the predictive values of the algorithm with respect to the GP and clinical standards are highly consistent with those from previous studies. In particular, the algorithm is more efficient in detecting the more severe cases of HF according to both the GPs' validation (specificity increases with the number of comorbidities) and the external validation (NYHA: II–IV; HF severity index: 2, 3). Positive and negative predictive values reveal that the CARPEDIEM algorithm is most consistent with clinical evaluation performed in the specialist setting, while it shows a greater ability to rule out false-negative HF cases within GP practice, probably as a consequence of the different HF prevalence in the two care settings. Further development includes analyzing the clinical features of false-positive and -negative predictions, to explore the natural

  5. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

analyses and hypothesis tests as a part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted through this feedback. (4) We include a "model supplement term" when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system, providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 °C, modeling the decomposition is critical to assessing a weapon's response. In the validation analysis it is indicated that the model tends to "exaggerate" the effect of temperature changes when compared to the experimental results. The data, however, are too few and too restricted in terms of experimental design to make confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for this apparent bias by constructing a model supplement term for use in the model
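
The "model supplement term" is, in essence, a fitted bias correction added to the model. A hypothetical sketch, with an invented toy model and a linear-in-temperature supplement assumed purely for illustration:

```python
# Hypothetical sketch of a "model supplement term": fit an additive bias
# correction to the discrepancy between model output and experiments,
# then apply it to model-based predictions. The toy model, data, and
# linear-in-temperature form are all invented for illustration.
import numpy as np

temps = np.array([300.0, 350.0, 400.0, 450.0])   # experiment conditions (K)
measured = np.array([1.00, 1.45, 2.10, 2.95])    # toy observations
model = lambda T: 0.004 * T - 0.2                # toy foam-response model

residual = measured - model(temps)               # apparent model bias
coef = np.polyfit(temps, residual, deg=1)        # supplement term ~ a*T + b
supplement = np.poly1d(coef)

T_new = 425.0
print("raw model:      ", model(T_new))
print("with supplement:", model(T_new) + supplement(T_new))
```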

  6. Candidate gene linkage approach to identify DNA variants that predispose to preterm birth

    DEFF Research Database (Denmark)

    Bream, Elise N A; Leppellere, Cara R; Cooper, Margaret E

    2013-01-01

Background: The aim of this study was to identify genetic variants contributing to preterm birth (PTB) using a linkage candidate gene approach. Methods: We studied 99 single-nucleotide polymorphisms (SNPs) for 33 genes in 257 families with PTBs segregating. Nonparametric and parametric analyses were...... through the infant and/or the mother in the etiology of PTB.

  7. The Baby TALK Model: An Innovative Approach to Identifying High-Risk Children and Families

    Science.gov (United States)

    Villalpando, Aimee Hilado; Leow, Christine; Hornstein, John

    2012-01-01

    This research report examines the Baby TALK model, an innovative early childhood intervention approach used to identify, recruit, and serve young children who are at-risk for developmental delays, mental health needs, and/or school failure, and their families. The report begins with a description of the model. This description is followed by an…

  8. Identifying Core Mobile Learning Faculty Competencies Based Integrated Approach: A Delphi Study

    Science.gov (United States)

    Elbarbary, Rafik Said

    2015-01-01

This study uses the integrated approach as a conceptual framework to identify, categorize, and rank the key components of mobile learning core competencies for Egyptian faculty members in higher education. The field investigation framework used a four-round Delphi technique to determine the importance rating of each component of the core competencies…

  9. Validation of protein models by a neural network approach

    Directory of Open Access Journals (Sweden)

    Fantucci Piercarlo

    2008-01-01

Background: The development and improvement of reliable computational methods designed to evaluate the quality of protein models is relevant in the context of protein structure refinement, which has recently been identified as one of the bottlenecks limiting the quality and usefulness of protein structure prediction. Results: In this contribution, we present a computational method (Artificial Intelligence Decoys Evaluator: AIDE) which is able to consistently discriminate between correct and incorrect protein models. In particular, the method is based on neural networks that use as input 15 structural parameters, including energy, solvent-accessible surface, hydrophobic contacts, and secondary structure content. The results obtained with AIDE on a set of decoy structures were evaluated using statistical indicators such as Pearson correlation coefficients, Znat, fraction enrichment, and ROC plots. AIDE's performance turned out to be comparable and often complementary to available state-of-the-art learning-based methods. Conclusion: In light of the results obtained with AIDE, as well as its comparison with available learning-based methods, we conclude that AIDE can be successfully used to evaluate the quality of protein structures. The use of AIDE in combination with other evaluation tools is expected to further enhance protein refinement efforts.

  10. Identifying bioaccumulative halogenated organic compounds using a nontargeted analytical approach: seabirds as sentinels.

    Directory of Open Access Journals (Sweden)

    Christopher J Millow

Persistent organic pollutants (POPs) are typically monitored via targeted mass spectrometry, which potentially identifies only a fraction of the contaminants actually present in environmental samples. With new anthropogenic compounds continuously introduced to the environment, novel and proactive approaches that provide a comprehensive alternative to targeted methods are needed in order to more completely characterize the diversity of known and unknown compounds likely to cause adverse effects. Nontargeted mass spectrometry attempts to extensively screen for compounds, providing a feasible approach for identifying contaminants that warrant future monitoring. We employed a nontargeted analytical method using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOF-MS) to characterize halogenated organic compounds (HOCs) in California Black skimmer (Rynchops niger) eggs. Our study identified 111 HOCs; 84 of these compounds were regularly detected via targeted approaches, while 27 were classified as typically unmonitored or unknown. Typically unmonitored compounds of note in bird eggs included tris(4-chlorophenyl)methane (TCPM), tris(4-chlorophenyl)methanol (TCPMOH), triclosan, permethrin, heptachloro-1'-methyl-1,2'-bipyrrole (MBP), as well as four halogenated unknown compounds that could not be identified through database searching or the literature. The presence of these compounds in Black skimmer eggs suggests they are persistent, bioaccumulative, potentially biomagnifying, and maternally transferring. Our results highlight the utility and importance of employing nontargeted analytical tools to assess true contaminant burdens in organisms, as well as demonstrate the value of using environmental sentinels to proactively identify novel contaminants.

  11. Identifying seizure onset zone from electrocorticographic recordings: A machine learning approach based on phase locking value.

    Science.gov (United States)

    Elahian, Bahareh; Yeasin, Mohammed; Mudigoudar, Basanagoud; Wheless, James W; Babajani-Feremi, Abbas

    2017-10-01

Using a novel technique based on phase locking value (PLV), we investigated the potential for features extracted from electrocorticographic (ECoG) recordings to serve as biomarkers to identify the seizure onset zone (SOZ). We computed the PLV between the phase of the amplitude of high gamma activity (80-150 Hz) and the phase of lower frequency rhythms (4-30 Hz) from ECoG recordings obtained from 10 patients with epilepsy (21 seizures). We extracted five features from the PLV and used a machine learning approach based on logistic regression to build a model that classifies electrodes as SOZ or non-SOZ. More than 96% of electrodes identified as the SOZ by our algorithm were within the resected area in six seizure-free patients. In four non-seizure-free patients, more than 31% of the identified SOZ electrodes by our algorithm were outside the resected area. In addition, we observed that the seizure outcome in non-seizure-free patients correlated with the number of non-resected SOZ electrodes identified by our algorithm. This machine learning approach, based on features extracted from the PLV, effectively identified electrodes within the SOZ. The approach has the potential to assist clinicians in surgical decision-making when pre-surgical intracranial recordings are utilized. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
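
The core feature, a phase locking value between the phase of the high-gamma amplitude envelope and the phase of lower-frequency rhythms, can be sketched as follows; the filter order and the synthetic test signal are illustrative assumptions, not the authors' preprocessing:

```python
# Minimal sketch of a phase locking value (PLV) between the phase of the
# high-gamma amplitude envelope and a low-frequency phase, in the spirit
# of the feature described above. Band edges and filter order are
# illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv_high_gamma_vs_low(x, fs):
    env = np.abs(hilbert(bandpass(x, 80, 150, fs)))        # high-gamma envelope
    phi_env = np.angle(hilbert(bandpass(env, 4, 30, fs)))  # phase of the envelope
    phi_low = np.angle(hilbert(bandpass(x, 4, 30, fs)))    # low-frequency phase
    return np.abs(np.mean(np.exp(1j * (phi_env - phi_low))))

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 8 * t) + 0.5 * np.random.randn(t.size)  # toy "ECoG"
print(plv_high_gamma_vs_low(x, fs))
```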

  12. Mining user-generated geographic content: an interactive, crowdsourced approach to validation and supervision

    NARCIS (Netherlands)

    Ostermann, F.O.; Garcia Chapeton, Gustavo Adolfo; Zurita-Milla, R.; Kraak, M.J.; Bergt, A.; Sarjakoski, T.; van Lammeren, R.; Rip, F.

    2017-01-01

This paper describes a pilot study that implements a novel approach to validating data mining tasks by using the crowd to train a classifier. This hybrid approach to processing successfully addresses challenges faced during human curation or machine processing of user-generated geographic content.

  13. Calibrating and Validating a Simulation Model to Identify Drivers of Urban Land Cover Change in the Baltimore, MD Metropolitan Region

    Directory of Open Access Journals (Sweden)

    Claire Jantz

    2014-09-01

We build upon much of the accumulated knowledge of the widely used SLEUTH urban land change model and offer advances. First, we use SLEUTH’s exclusion/attraction layer to identify and test different urban land cover change drivers; second, we leverage SLEUTH’s self-modification capability to incorporate a demographic model; and third, we develop a validation procedure to quantify the influence of land cover change drivers and assess uncertainty. We found that, contrary to our a priori expectations, new development is not attracted to areas serviced by existing or planned water and sewer infrastructure. However, information about where population and employment growth is likely to occur did improve model performance. These findings point to the dominant role of centrifugal forces in post-industrial cities like Baltimore, MD. We successfully developed a demographic model that allowed us to constrain the SLEUTH model forecasts and address uncertainty related to the dynamic relationship between changes in population and employment and urban land use. Finally, we emphasize the importance of model validation. In this work the validation procedure played a key role in rigorously assessing the impacts of different exclusion/attraction layers and in assessing uncertainty related to population and employment forecasts.

  14. Derivation and validation of the automated search algorithms to identify cognitive impairment and dementia in electronic health records.

    Science.gov (United States)

    Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen

    2017-02-01

Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from electronic medical records (EMRs), providing an opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by the Mayo Clinic Study of Aging. Second, they were hospitalized in the intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in the intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved respective sensitivity and specificity of 97% and 99%. EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. The validity of the first and second generation Microsoft Kinect™ for identifying joint center locations during static postures.

    Science.gov (United States)

    Xu, Xu; McGorry, Raymond W

    2015-07-01

The Kinect™ sensor released by Microsoft is a low-cost, portable, and marker-less motion tracking system for the video game industry. Since the first generation Kinect sensor was released in 2010, many studies have been conducted to examine the validity of this sensor when used to measure body movement in different research areas. In 2014, Microsoft released the second generation Kinect sensor for computers, with better depth-sensor resolution. However, very few studies have performed a direct comparison between all the Kinect sensor-identified joint center locations and their corresponding motion tracking system-identified counterparts; such a comparison could provide insight into errors in Kinect-identified segment lengths and joint angles, as well as into the feasibility of applying inverse dynamics to Kinect-identified joint centers. The purpose of the current study is to first propose a method to align the coordinate system of the Kinect sensor with respect to the global coordinate system of a motion tracking system, and then to examine the accuracy of the Kinect sensor-identified coordinates of joint locations during 8 standing and 8 sitting postures of daily activities. The results indicate the proposed alignment method can effectively align the Kinect sensor with respect to the motion tracking system. The accuracy level of the Kinect-identified joint center location is posture-dependent and joint-dependent. For the upright standing posture, the average error across all the participants and all Kinect-identified joint centers is 76 mm and 87 mm for the first and second generation Kinect sensor, respectively. In general, standing postures can be identified with better accuracy than sitting postures, and the identification accuracy of the joints of the upper extremities is better than for the lower extremities. This result may provide some information regarding the feasibility of using the Kinect sensor in future studies. Copyright © 2015 Elsevier
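
One standard way to perform the coordinate-system alignment described above is the Kabsch algorithm, a least-squares rigid rotation and translation estimated from paired joint positions; the abstract does not name the authors' exact method, so this is a generic sketch:

```python
# Minimal sketch of aligning Kinect coordinates to a motion tracking
# system's global frame with the Kabsch algorithm (rigid rotation +
# translation from paired joint positions). A generic least-squares
# alignment, not necessarily the authors' exact procedure.
import numpy as np

def kabsch_align(kinect_pts, mocap_pts):
    """Return R, t such that R @ kinect + t best matches mocap (Nx3 arrays)."""
    ck, cm = kinect_pts.mean(axis=0), mocap_pts.mean(axis=0)
    H = (kinect_pts - ck).T @ (mocap_pts - cm)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = cm - R @ ck
    return R, t

# Synthetic check: recover a known rotation + translation
kinect = np.random.rand(8, 3)
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
mocap = kinect @ R_true.T + np.array([0.1, -0.2, 0.05])
R, t = kabsch_align(kinect, mocap)
print(np.allclose(R @ kinect.T + t[:, None], mocap.T))  # True
```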

  16. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    Directory of Open Access Journals (Sweden)

    William S Beatty

The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result

  17. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    Science.gov (United States)

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted

  18. A comparison of approaches for finding minimum identifying codes on graphs

    Science.gov (United States)

    Horan, Victoria; Adachi, Steve; Bak, Stanley

    2016-05-01

In order to formulate mathematical conjectures likely to be true, a number of base cases must be determined. However, many combinatorial problems are NP-hard, and their computational complexity makes this research approach difficult using a standard brute-force approach on a typical computer. One sample problem explored is that of finding a minimum identifying code. To work around the computational issues, a variety of methods is explored, consisting of a parallel computing approach using MATLAB, an adiabatic quantum optimization approach using a D-Wave quantum annealing processor, and satisfiability modulo theories (SMT) with corresponding SMT solvers. Each of these methods requires the problem to be formulated in a unique manner. In this paper, we address the challenges of computing solutions to this NP-hard problem with respect to each of these methods.
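
For the small base cases the paper targets, the problem statement itself can be made concrete with a brute-force baseline: find the smallest vertex set C such that every vertex has a nonempty, unique signature N[v] ∩ C. This exponential sketch is for illustration only and is not one of the paper's three methods:

```python
# Brute-force baseline for the minimum identifying code problem: the
# smallest vertex set C such that every vertex's closed neighborhood
# intersected with C is nonempty and unique. Exponential in |V|, so
# suitable only for small base cases.
from itertools import combinations

def closed_neighborhood(adj, v):
    return adj[v] | {v}

def is_identifying_code(adj, code):
    sigs = [frozenset(closed_neighborhood(adj, v) & code) for v in adj]
    return all(sigs) and len(set(sigs)) == len(sigs)   # nonempty and distinct

def minimum_identifying_code(adj):
    vertices = list(adj)
    for k in range(1, len(vertices) + 1):
        for cand in combinations(vertices, k):
            if is_identifying_code(adj, set(cand)):
                return set(cand)
    return None   # e.g. graphs with twin vertices admit no identifying code

# 4-cycle: vertices 0-3, edges 01, 12, 23, 30; minimum code has size 3
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(minimum_identifying_code(adj))
```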

  19. Validation of new prognostic and predictive scores by sequential testing approach

    International Nuclear Information System (INIS)

    Nieder, Carsten; Haukland, Ellinor; Pawinski, Adam; Dalhaug, Astrid

    2010-01-01

    Background and Purpose: For practitioners, the question arises how their own patient population differs from that used in large-scale analyses resulting in new scores and nomograms and whether such tools actually are valid at a local level and thus can be implemented. A recent article proposed an easy-to-use method for the in-clinic validation of new prediction tools with a limited number of patients, a so-called sequential testing approach. The present study evaluates this approach in scores related to radiation oncology. Material and Methods: Three different scores were used, each predicting short overall survival after palliative radiotherapy (bone metastases, brain metastases, metastatic spinal cord compression). For each scenario, a limited number of consecutive patients entered the sequential testing approach. The positive predictive value (PPV) was used for validation of the respective score and it was required that the PPV exceeded 80%. Results: For two scores, validity in the own local patient population could be confirmed after entering 13 and 17 patients, respectively. For the third score, no decision could be reached even after increasing the sample size to 30. Conclusion: In-clinic validation of new predictive tools with sequential testing approach should be preferred over uncritical adoption of tools which provide no significant benefit to local patient populations. Often the necessary number of patients can be reached within reasonable time frames even in small oncology practices. In addition, validation is performed continuously as the data are collected. (orig.)
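
The abstract does not spell out the stopping boundaries, so as a concrete stand-in, a classical Wald sequential probability ratio test (SPRT) on consecutive prediction outcomes conveys the flavor of sequential in-clinic validation; the hypotheses and error rates below are illustrative, and the cited method's exact boundaries may differ:

```python
# Hypothetical sketch of sequential in-clinic validation: a Wald SPRT on
# consecutive patients flagged "short survival" by a score, testing
# H0: PPV <= 0.6 against H1: PPV >= 0.8. All thresholds are illustrative
# stand-ins, not the boundaries of the cited method.
from math import log

def sprt_ppv(outcomes, p0=0.6, p1=0.8, alpha=0.05, beta=0.2):
    upper, lower = log((1 - beta) / alpha), log(beta / (1 - alpha))
    llr = 0.0   # cumulative log-likelihood ratio
    for n, correct in enumerate(outcomes, start=1):
        llr += log(p1 / p0) if correct else log((1 - p1) / (1 - p0))
        if llr >= upper:
            return f"accept H1 (score valid) after {n} patients"
        if llr <= lower:
            return f"accept H0 (score not valid) after {n} patients"
    return "no decision yet - keep collecting patients"

# 1 = prediction confirmed (patient died within the predicted window)
print(sprt_ppv([1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1]))
```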

  20. Validation of new prognostic and predictive scores by sequential testing approach

    Energy Technology Data Exchange (ETDEWEB)

    Nieder, Carsten [Radiation Oncology Unit, Nordland Hospital, Bodo (Norway); Inst. of Clinical Medicine, Univ. of Tromso (Norway); Haukland, Ellinor; Pawinski, Adam; Dalhaug, Astrid [Radiation Oncology Unit, Nordland Hospital, Bodo (Norway)

    2010-03-15

    Background and Purpose: For practitioners, the question arises how their own patient population differs from that used in large-scale analyses resulting in new scores and nomograms and whether such tools actually are valid at a local level and thus can be implemented. A recent article proposed an easy-to-use method for the in-clinic validation of new prediction tools with a limited number of patients, a so-called sequential testing approach. The present study evaluates this approach in scores related to radiation oncology. Material and Methods: Three different scores were used, each predicting short overall survival after palliative radiotherapy (bone metastases, brain metastases, metastatic spinal cord compression). For each scenario, a limited number of consecutive patients entered the sequential testing approach. The positive predictive value (PPV) was used for validation of the respective score and it was required that the PPV exceeded 80%. Results: For two scores, validity in the own local patient population could be confirmed after entering 13 and 17 patients, respectively. For the third score, no decision could be reached even after increasing the sample size to 30. Conclusion: In-clinic validation of new predictive tools with sequential testing approach should be preferred over uncritical adoption of tools which provide no significant benefit to local patient populations. Often the necessary number of patients can be reached within reasonable time frames even in small oncology practices. In addition, validation is performed continuously as the data are collected. (orig.)

  1. Alternative approaches for identifying acute systemic toxicity: Moving from research to regulatory testing.

    Science.gov (United States)

    Hamm, Jon; Sullivan, Kristie; Clippinger, Amy J; Strickland, Judy; Bell, Shannon; Bhhatarai, Barun; Blaauboer, Bas; Casey, Warren; Dorman, David; Forsby, Anna; Garcia-Reyero, Natàlia; Gehen, Sean; Graepel, Rabea; Hotchkiss, Jon; Lowit, Anna; Matheson, Joanna; Reaves, Elissa; Scarano, Louis; Sprankle, Catherine; Tunkel, Jay; Wilson, Dan; Xia, Menghang; Zhu, Hao; Allen, David

    2017-06-01

    Acute systemic toxicity testing provides the basis for hazard labeling and risk management of chemicals. A number of international efforts have been directed at identifying non-animal alternatives for in vivo acute systemic toxicity tests. A September 2015 workshop, Alternative Approaches for Identifying Acute Systemic Toxicity: Moving from Research to Regulatory Testing, reviewed the state-of-the-science of non-animal alternatives for this testing and explored ways to facilitate implementation of alternatives. Workshop attendees included representatives from international regulatory agencies, academia, nongovernmental organizations, and industry. Resources identified as necessary for meaningful progress in implementing alternatives included compiling and making available high-quality reference data, training on use and interpretation of in vitro and in silico approaches, and global harmonization of testing requirements. Attendees particularly noted the need to characterize variability in reference data to evaluate new approaches. They also noted the importance of understanding the mechanisms of acute toxicity, which could be facilitated by the development of adverse outcome pathways. Workshop breakout groups explored different approaches to reducing or replacing animal use for acute toxicity testing, with each group crafting a roadmap and strategy to accomplish near-term progress. The workshop steering committee has organized efforts to implement the recommendations of the workshop participants. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Three novel approaches to structural identifiability analysis in mixed-effects models.

    Science.gov (United States)

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept, since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models previously not

  3. Validation of ATR FT-IR to identify polymers of plastic marine debris, including those ingested by marine organisms

    Science.gov (United States)

    Jung, Melissa R.; Horgen, F. David; Orski, Sara V.; Rodriguez, Viviana; Beers, Kathryn L.; Balazs, George H.; Jones, T. Todd; Work, Thierry M.; Brignac, Kayla C.; Royer, Sarah-Jeanne; Hyrenbach, David K.; Jensen, Brenda A.; Lynch, Jennifer M.

    2018-01-01

Polymer identification of plastic marine debris can help identify its sources, degradation, and fate. We optimized and validated a fast, simple, and accessible technique, attenuated total reflectance Fourier transform infrared spectroscopy (ATR FT-IR), to identify polymers contained in plastic ingested by sea turtles. Spectra of consumer good items with known resin identification codes #1–6 and several #7 plastics were compared to standard and raw manufactured polymers. High temperature size exclusion chromatography measurements confirmed that ATR FT-IR could differentiate these polymers. Discriminating high-density (HDPE) from low-density polyethylene (LDPE) is challenging, but a clear step-by-step guide is provided that identified 78% of ingested PE samples. The optimal cleaning methods consisted of wiping ingested pieces with water or cutting. Of 828 ingested plastic pieces from 50 Pacific sea turtles, 96% were identified by ATR FT-IR as HDPE, LDPE, unknown PE, polypropylene (PP), PE and PP mixtures, polystyrene, polyvinyl chloride, and nylon.
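
The matching step in such work amounts to comparing a measured spectrum against a reference library. A minimal sketch with synthetic stand-in spectra (the toy band positions are loosely based on common PE, PP, and PS absorptions); real ATR FT-IR identification additionally involves baseline correction and expert review of diagnostic bands:

```python
# Minimal sketch of spectral library matching in the spirit of ATR FT-IR
# polymer identification: score a sample spectrum against reference
# polymer spectra by correlation and report the best match. The spectra
# below are synthetic stand-ins, not real polymer data.
import numpy as np

def match_polymer(sample, library):
    """library: dict name -> absorbance array on the same wavenumber grid."""
    scores = {name: np.corrcoef(sample, ref)[0, 1] for name, ref in library.items()}
    return max(scores, key=scores.get), scores

wavenumbers = np.linspace(600, 4000, 500)
def peak(center, width=30):   # toy Gaussian absorption band
    return np.exp(-((wavenumbers - center) / width) ** 2)

library = {"PE (toy)": peak(2915) + peak(2848) + peak(719),
           "PP (toy)": peak(2950) + peak(1376) + peak(841),
           "PS (toy)": peak(3026) + peak(1601) + peak(698)}
sample = library["PE (toy)"] + 0.05 * np.random.randn(wavenumbers.size)
print(match_polymer(sample, library)[0])   # "PE (toy)"
```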

  4. An integrated approach to validation of safeguards and security program performance

    International Nuclear Information System (INIS)

    Altman, W.D.; Hunt, J.S.; Hockert, J.W.

    1988-01-01

Department of Energy (DOE) requirements for safeguards and security programs are becoming increasingly performance oriented. Master Safeguards and Security Agreements specify performance levels for systems protecting DOE security interests. In order to measure and validate security system performance, Lawrence Livermore National Laboratory (LLNL) has developed cost-effective validation tools and a comprehensive validation approach that synthesizes information gained from different activities, such as force-on-force exercises, limited-scope performance tests, equipment testing, vulnerability analyses, and computer modeling, into an overall assessment of the performance of the protection system. The analytic approach employs logic diagrams adapted from the fault and event trees used in probabilistic risk assessment. The synthesis of the results from the various validation activities is accomplished using a method developed by LLNL based upon Bayes' theorem.

  5. A Systematic Approach to Determining the Identifiability of Multistage Carcinogenesis Models.

    Science.gov (United States)

    Brouwer, Andrew F; Meza, Rafael; Eisenberg, Marisa C

    2017-07-01

Multistage clonal expansion (MSCE) models of carcinogenesis are continuous-time Markov process models often used to relate cancer incidence to biological mechanism. Identifiability analysis determines what model parameter combinations can, theoretically, be estimated from given data. We use a systematic approach, based on differential algebra methods traditionally used for deterministic ordinary differential equation (ODE) models, to determine identifiable combinations for a generalized subclass of MSCE models with any number of preinitiation stages and one clonal expansion. Additionally, we determine the identifiable combinations of the generalized MSCE model with up to four clonal expansion stages, and conjecture the results for any number of clonal expansion stages. The results improve upon previous work in a number of ways and provide a framework to find the identifiable combinations for further variations on the MSCE models. Finally, our approach, which takes advantage of the Kolmogorov backward equations for the probability generating functions of the Markov process, demonstrates that identifiability methods used in engineering and mathematics for systems of ODEs can be applied to continuous-time Markov processes. © 2016 Society for Risk Analysis.

  6. Relative validity and reproducibility of a food frequency questionnaire for identifying the dietary patterns of toddlers in New Zealand.

    Science.gov (United States)

    Mills, Virginia C; Skidmore, Paula M L; Watson, Emily O; Taylor, Rachael W; Fleming, Elizabeth A; Heath, Anne-Louise M

    2015-04-01

Dietary patterns provide insight into relationships between diet and disease. Food frequency questionnaires (FFQs) can identify dietary patterns in adults, but similar analyses have not been performed for toddlers. The aim of the Eating Assessment in Toddlers study was to evaluate the relative validity and reproducibility of dietary patterns from an FFQ developed for toddlers aged 12 to 24 months. Participants were 160 toddlers aged 12 to 24 months and their primary caregivers, who completed an FFQ twice, approximately 5 weeks apart (FFQ1 and FFQ2). A 5-day weighed food record was collected on nonconsecutive days between FFQ administrations. Principal component analysis identified three major dietary patterns similar across FFQ1, FFQ2, and the 5-day weighed food record. The sweet foods and fries pattern was characterized by high intakes of sweet foods, fries and roast potato and kumara (sweet potato), butter and margarines, processed meat, sweet drinks, and fruit or milk drinks. The vegetables and meat pattern was characterized by high intakes of vegetables, meat, eggs and beans, and fruit. The milk and fruit pattern was characterized by high intakes of milk and milk products and fruit, and low intakes of breastmilk and infant and follow-up formula. The FFQ (FFQ1) correctly classified 43.1% to 51.0% of toddlers into the same quartile of pattern score as the 5-day weighed food record, and Pearson correlations ranged from 0.56 to 0.68 for the three patterns. Reliability coefficients ranged from 0.71 to 0.72 for all three dietary patterns. The Eating Assessment in Toddlers study FFQ shows acceptable relative validity and high reproducibility for identifying dietary patterns in toddlers. Copyright © 2015 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
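
The quartile cross-classification used above to judge relative validity is easy to make concrete. A minimal sketch with simulated pattern scores standing in for the FFQ and weighed-food-record scores:

```python
# Minimal sketch of quartile cross-classification agreement: the share
# of toddlers placed in the same quartile of a dietary-pattern score by
# the FFQ and by the weighed food record. Scores are simulated stand-ins.
import numpy as np

def same_quartile_pct(score_a, score_b):
    qa = np.searchsorted(np.quantile(score_a, [0.25, 0.5, 0.75]), score_a)
    qb = np.searchsorted(np.quantile(score_b, [0.25, 0.5, 0.75]), score_b)
    return np.mean(qa == qb)

rng = np.random.default_rng(0)
ffq = rng.normal(size=160)                      # FFQ pattern scores
wfr = 0.6 * ffq + 0.8 * rng.normal(size=160)    # correlated food-record scores
print(f"{same_quartile_pct(ffq, wfr):.1%} classified into the same quartile")
```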

  7. Online-Based Approaches to Identify Real Journals and Publishers from Hijacked Ones.

    Science.gov (United States)

    Asadi, Amin; Rahbar, Nader; Asadi, Meisam; Asadi, Fahime; Khalili Paji, Kokab

    2017-02-01

The aim of the present paper was to introduce online-based approaches for evaluating scientific journals and publishers and differentiating them from hijacked ones, regardless of discipline. With the advent of open-access journals, many hijacked journals and publishers have deceitfully assumed the mantle of authenticity in order to take advantage of researchers and students. Although these hijacked journals and publishers can often be identified by examining their advertisement techniques and websites, such checks do not always result in identification. Certain online-based approaches, such as consulting the Master Journal List provided by Thomson Reuters or the Scopus database, or using the DOI of a paper, can help certify that a journal or publisher is genuine. It is indispensable that inexperienced students and researchers know these methods so that they can identify hijacked journals and publishers with a higher level of probability.

  8. Maximal Predictability Approach for Identifying the Right Descriptors for Electrocatalytic Reactions.

    Science.gov (United States)

    Krishnamurthy, Dilip; Sumaria, Vaidish; Viswanathan, Venkatasubramanian

    2018-02-01

Density functional theory (DFT) calculations are routinely used to identify new material candidates that approach activity near fundamental limits imposed by thermodynamics or scaling relations. DFT calculations carry inherent uncertainty, which limits the ability to delineate materials (distinguishability) that possess high activity. The development of error-estimation capabilities in DFT has enabled uncertainty propagation through activity-prediction models. In this work, we demonstrate an approach to propagating uncertainty through thermodynamic activity models, leading to a probability distribution of the computed activity and thereby its expectation value. A new metric, prediction efficiency, is defined, which provides a quantitative measure of the ability to distinguish the activity of materials and can be used to identify the optimal descriptor(s) ΔG_opt. We demonstrate the framework for four important electrochemical reactions: hydrogen evolution, chlorine evolution, oxygen reduction, and oxygen evolution. Future studies could utilize expected activity and prediction efficiency to significantly improve the prediction accuracy of highly active material candidates.
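
The propagation idea can be sketched with a toy volcano model: sample descriptor values from an assumed DFT error distribution, map each sample to activity, and compare candidates on the resulting distributions. Everything numeric below is an illustrative assumption:

```python
# Hypothetical sketch of propagating DFT uncertainty through a simple
# activity volcano: sample descriptor values from an error distribution,
# map each sample to activity, and compare two candidate materials via
# full distributions rather than point estimates. The volcano shape,
# optimum, and error bar are invented for illustration.
import numpy as np

def volcano_activity(dg, dg_opt=0.1, slope=1.0):
    """Toy Sabatier volcano: activity falls off linearly on either side."""
    return -slope * np.abs(dg - dg_opt)

rng = np.random.default_rng(1)
sigma_dft = 0.2                                        # assumed DFT error (eV)
candidates = {"material A": 0.05, "material B": 0.25}  # computed ΔG (eV)

samples = {name: volcano_activity(rng.normal(dg, sigma_dft, 10_000))
           for name, dg in candidates.items()}
for name, act in samples.items():
    print(name, "expected activity:", act.mean().round(3))
# Probability that A is actually more active than B:
print((samples["material A"] > samples["material B"]).mean())
```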

  9. Validation of Simulation Codes for Future Systems: Motivations, Approach and the Role of Nuclear Data

    International Nuclear Information System (INIS)

    G. Palmiotti; M. Salvatores; G. Aliberti

    2007-01-01

The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case for reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives, and approach. A validation effort is in particular necessary in the frame of advanced (e.g., Generation-IV or GNEP) reactor and associated fuel cycle assessment and design.

  10. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    OpenAIRE

    Dori Barnett

    2012-01-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about da...

  11. An Approach to Identify and Characterize a Subunit Candidate Shigella Vaccine Antigen.

    Science.gov (United States)

    Pore, Debasis; Chakrabarti, Manoj K

    2016-01-01

    Shigellosis remains a serious issue throughout the developing countries, particularly in children under the age of 5. Numerous strategies have been tested to develop vaccines targeting shigellosis; unfortunately despite several years of extensive research, no safe, effective, and inexpensive vaccine against shigellosis is available so far. Here, we illustrate in detail an approach to identify and establish immunogenic outer membrane proteins from Shigella flexneri 2a as subunit vaccine candidates.

  12. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Science.gov (United States)

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  13. A Systematic Approach to Identify Promising New Items for Small to Medium Enterprises: A Case Study

    Directory of Open Access Journals (Sweden)

    Sukjae Jeong

    2016-11-01

Despite the growing importance of identifying new business items for small and medium enterprises (SMEs), most previous studies focus on conglomerates. The paucity of empirical studies has also led to limited real-life applications. Hence, this study proposes a systematic approach to find new business items (NBIs) that help prospective SMEs develop, evaluate, and select viable business items to survive the competitive environment. The proposed approach comprises two stages: (1) the classification of diversification of SMEs; and (2) the searching and screening of business items. In the first stage, SMEs are allocated to five groups, based on their internal technological competency and external market conditions. In the second stage, based on the types of SMEs identified in the first stage, a set of alternative business items is derived by combining the results of portfolio analysis and benchmarking analysis. After deriving new business items, a market and technology-driven matrix analysis is utilized to screen suitable business items, and the Bruce Merrifield-Ohe (BMO) method is used to categorize and identify prospective items based on market attractiveness and internal capability. To illustrate the applicability of the proposed approach, a case study is presented.

  14. Integrated systems approach identifies risk regulatory pathways and key regulators in coronary artery disease.

    Science.gov (United States)

    Zhang, Yan; Liu, Dianming; Wang, Lihong; Wang, Shuyuan; Yu, Xuexin; Dai, Enyu; Liu, Xinyi; Luo, Shanshun; Jiang, Wei

    2015-12-01

Coronary artery disease (CAD) is the most common type of heart disease. However, the molecular mechanisms of CAD remain elusive. Regulatory pathways are known to play crucial roles in many pathogenic processes. Thus, inferring risk regulatory pathways is an important step toward elucidating the mechanisms underlying CAD. With advances in high-throughput data, we developed an integrated systems approach to identify CAD risk regulatory pathways and key regulators. First, a CAD-related core subnetwork was identified from a curated transcription factor (TF) and microRNA (miRNA) regulatory network based on a random walk algorithm. Second, candidate risk regulatory pathways were extracted from the subnetwork by applying a breadth-first search (BFS) algorithm. Then, risk regulatory pathways were prioritized based on multiple CAD-associated data sources. Finally, we also proposed a new measure to prioritize upstream regulators. We inferred that phosphatase and tensin homolog (PTEN) may be a key regulator in the dysregulation of risk regulatory pathways. This study goes a step beyond the identification of disease subnetworks or modules: from the risk regulatory pathways, we can understand the flow of regulatory information in the initiation and progression of the disease. Our approach helps to uncover the potential etiology of CAD. We developed an integrated systems approach to identify risk regulatory pathways. We proposed a new measure to prioritize the key regulators in CAD. PTEN may be a key regulator in dysregulation of the risk regulatory pathways.
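
The subnetwork-extraction step rests on a random walk; a common formulation is the random walk with restart shown below on a toy graph. The seed choice, restart probability, and network are illustrative, not the study's curated TF-miRNA network:

```python
# Minimal sketch of random walk with restart (RWR) for extracting a
# disease-related core subnetwork: walk from seed genes on a toy graph;
# nodes with high steady-state probability form the core. The graph,
# seeds, and restart probability are illustrative assumptions.
import numpy as np

def random_walk_restart(adj, seeds, restart=0.7, tol=1e-10):
    W = adj / adj.sum(axis=0, keepdims=True)   # column-normalize adjacency
    p0 = np.zeros(adj.shape[0])
    p0[seeds] = 1.0 / len(seeds)               # restart distribution
    p = p0.copy()
    while True:
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next

adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
scores = random_walk_restart(adj, seeds=[0])
print(scores.round(3))   # proximity of each node to the seed set
```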

  15. An approach to identify issues affecting ERP implementation in Indian SMEs

    Directory of Open Access Journals (Sweden)

    Rana Basu

    2012-06-01

Purpose: The purpose of this paper is to present the findings of a study based on a comprehensive compilation of the literature and subsequent analysis of ERP implementation success issues in the context of Indian small and medium-scale enterprises (SMEs). This paper explores the existing literature, highlights issues in ERP implementation, and then applies TOPSIS (technique for order preference by similarity to ideal solution) to prioritize the issues affecting successful implementation of ERP. Design/methodology/approach: Based on the literature review, certain issues leading to successful ERP implementation were identified, and Pareto analysis (the 80-20 rule) was applied to identify the key issues. Following extraction of the key issues, a TOPSIS-based survey was carried out in Indian small and medium-scale enterprises. Findings: Based on the review of the literature, 25 issues were identified; Pareto analysis was then used to extract the key issues, which were prioritized by applying the TOPSIS method. Research limitations/implications: Besides the identified issues, there may be other issues that need to be explored. There is scope to enhance this study by considering different types of industries and by extending the number of respondents. Practical implications: By identifying key issues for SMEs, managers can better prioritize issues to make the implementation process smooth and free of disruption. ERP vendors can take inputs from this study to adapt their implementation approach when targeting small-scale enterprises. Originality/value: No published literature follows a similar approach to identifying the critical issues affecting ERP in small and mid-sized companies in India or in any developing economy.
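
Since the prioritization here hinges on TOPSIS, a minimal sketch may be useful; the decision matrix, weights, and criterion directions below are invented examples, not the study's survey data:

```python
# Minimal TOPSIS sketch of the prioritization step described above: rank
# alternatives (here, ERP implementation issues) by relative closeness
# to an ideal solution. All numbers are invented for illustration.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
    v = matrix / np.linalg.norm(matrix, axis=0) * weights   # normalize, weight
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)             # relative closeness in [0, 1]

# Three issues scored on (impact on success, ease of mitigation, cost)
issues = np.array([[9.0, 4.0, 7.0],
                   [6.0, 8.0, 3.0],
                   [7.0, 6.0, 5.0]])
closeness = topsis(issues, weights=np.array([0.5, 0.3, 0.2]),
                   benefit=np.array([True, True, False]))
print(closeness.argsort()[::-1])   # issues ranked highest priority first
```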

  16. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them all simultaneously due to limited resources. A feasible way is to identify the central and influential indicators and improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, by integrating the evidential reasoning approach and interval 2-tuple linguistic variables, the assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors", and "number of operations/procedures" are significant influential indicators. Also, the indicators "length of stay", "bed occupancy", and "financial measures" play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could serve as a reference for healthcare administrators seeking to enhance the performance of their healthcare institutions.
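
DEMATEL itself is a short matrix computation: normalize the expert direct-influence matrix, form the total-relation matrix T = N(I - N)^{-1}, and read off prominence (D + R) and cause-effect (D - R) scores. A sketch with invented influence scores:

```python
# Minimal DEMATEL sketch: from a direct-influence matrix, compute the
# total-relation matrix T = N (I - N)^-1, then prominence (D + R) and
# relation (D - R) to separate cause indicators from effect indicators.
# The influence scores are invented, not the study's expert data.
import numpy as np

def dematel(direct):
    n = direct.shape[0]
    N = direct / max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
    T = N @ np.linalg.inv(np.eye(n) - N)     # total-relation matrix
    D, R = T.sum(axis=1), T.sum(axis=0)      # influence given / received
    return T, D + R, D - R                   # prominence, net cause(+)/effect(-)

direct = np.array([[0, 3, 2, 1],
                   [1, 0, 3, 2],
                   [0, 1, 0, 3],
                   [1, 2, 1, 0]], dtype=float)
T, prominence, relation = dematel(direct)
print("prominence:", prominence.round(2))
print("relation:  ", relation.round(2))
```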

  17. The development and validation of a two-tiered multiple-choice instrument to identify alternative conceptions in earth science

    Science.gov (United States)

    Mangione, Katherine Anna

This study sought to determine the reliability and validity of a two-tiered, multiple-choice instrument designed to identify alternative conceptions in earth science. Additionally, this study sought to identify alternative conceptions in earth science held by preservice teachers, to investigate relationships between self-reported confidence scores and understanding of earth science concepts, and to describe relationships between content knowledge, alternative conceptions, and planning instruction in the science classroom. Eighty-seven preservice teachers enrolled in the MAT program participated in this study. Sixty-eight participants were female, twelve were male, and seven chose not to answer. Forty-seven participants were in the elementary certification program, five were in the middle school certification program, and twenty-nine were pursuing secondary certification. Results indicate that the two-tiered, multiple-choice format can be a reliable and valid method for identifying alternative conceptions. Preservice teachers in all certification areas who participated in this study may possess common alternative conceptions previously identified in the literature, including: all rivers flow north to south; the shadow of the Earth covers the Moon, causing lunar phases; the Sun is always directly overhead at noon; weather can be predicted by animal coverings; and seasons are caused by the Earth's proximity to the Sun. Statistical analyses indicated differences, though not all of them significant, among subgroups according to gender and certification area. Males generally outperformed females, and preservice teachers pursuing middle school certification had the highest scores on the questionnaire, followed by those obtaining secondary certification; elementary preservice teachers scored lowest. Additionally, self-reported scores of confidence in one's answers and understanding of the earth science concept in question were analyzed. There was a

  18. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    Science.gov (United States)

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
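
    The heart of the described system is the comparison between the LIS patient name and the OCR-extracted label name, with every discrepancy deferred to a human. The paper does not detail its matching logic, so the following is only a plausible sketch: the normalization scheme and the similarity threshold are both assumptions here.

```python
# Hedged sketch of the discrepancy check the abstract describes: compare the
# OCR-extracted label name against the LIS name and pass only confident matches.
# The normalization rules and the 0.95 threshold are illustrative assumptions.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    # Uppercase, strip punctuation, and sort tokens so that "Last, First"
    # and "First Last" orderings compare equal.
    parts = name.upper().replace(",", " ").replace(".", " ").split()
    return " ".join(sorted(parts))

def check_sample(lis_name: str, ocr_name: str, threshold: float = 0.95) -> str:
    score = SequenceMatcher(None, normalize(lis_name), normalize(ocr_name)).ratio()
    return "PASS" if score >= threshold else "HUMAN REVIEW"

print(check_sample("Doe, Jane A", "JANE A DOE"))  # PASS (same name, reordered)
print(check_sample("Doe, Jane A", "JANE A DOW"))  # HUMAN REVIEW (possible mislabel)
```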

  19. Validation of case-finding algorithms derived from administrative data for identifying adults living with human immunodeficiency virus infection.

    Directory of Open Access Journals (Sweden)

    Tony Antoniou

    Full Text Available OBJECTIVE: We sought to validate a case-finding algorithm for human immunodeficiency virus (HIV) infection using administrative health databases in Ontario, Canada. METHODS: We constructed 48 case-finding algorithms using combinations of physician billing claims, hospital and emergency room separations and prescription drug claims. We determined the test characteristics of each algorithm over various time frames for identifying HIV infection, using data abstracted from the charts of 2,040 randomly selected patients receiving care at two medical practices in Toronto, Ontario as the reference standard. RESULTS: With the exception of algorithms using only a single physician claim, the specificity of all algorithms exceeded 99%. An algorithm consisting of three physician claims over a three year period had a sensitivity and specificity of 96.2% (95% CI 95.2%-97.9%) and 99.6% (95% CI 99.1%-99.8%), respectively. Application of the algorithm to the province of Ontario identified 12,179 HIV-infected patients in care for the period spanning April 1, 2007 to March 31, 2009. CONCLUSIONS: Case-finding algorithms generated from administrative data can accurately identify adults living with HIV. A relatively simple "3 claims in 3 years" definition can be used for assembling a population-based cohort and facilitating future research examining trends in health service use and outcomes among HIV-infected adults in Ontario.
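
    The "3 claims in 3 years" definition and the reported test characteristics are easy to operationalize. A minimal sketch, assuming each patient is represented by a list of HIV physician-claim dates and a chart-review reference label; the data structures and the 365-day year approximation are illustrative, not the study's code:

```python
# Sketch of the "3 claims in 3 years" case definition and its validation metrics.
from datetime import date, timedelta

def meets_case_definition(claim_dates: list[date], n_claims: int = 3,
                          window_years: int = 3) -> bool:
    """True if any window of `window_years` contains >= n_claims HIV physician claims."""
    dates = sorted(claim_dates)
    window = timedelta(days=365 * window_years)  # approximate; ignores leap days
    for i in range(len(dates) - n_claims + 1):
        if dates[i + n_claims - 1] - dates[i] <= window:
            return True
    return False

def sensitivity_specificity(predicted: list[bool], truth: list[bool]) -> tuple[float, float]:
    # `truth` would come from the chart-abstraction reference standard.
    tp = sum(p and t for p, t in zip(predicted, truth))
    tn = sum(not p and not t for p, t in zip(predicted, truth))
    fp = sum(p and not t for p, t in zip(predicted, truth))
    fn = sum(not p and t for p, t in zip(predicted, truth))
    return tp / (tp + fn), tn / (tn + fp)
```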

  20. Validation of a fracture mechanics approach to nuclear transportation cask design through a drop test program

    International Nuclear Information System (INIS)

    Sorenson, K.B.

    1986-01-01

    Sandia National Laboratories (SNL), under contract to the Department of Energy, is conducting a research program to develop and validate a fracture mechanics approach to cask design. A series of drop tests of a transportation cask is planned for the summer of 1986 as the method for benchmarking and, thereby, validating the fracture mechanics approach. This paper presents the drop test plan and the background leading to its development, including the structural analyses, material characterization, and non-destructive evaluation (NDE) techniques necessary for defining the test plan properly.

  1. Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction

    Directory of Open Access Journals (Sweden)

    Peter E Larsen

    2016-01-01

    Full Text Available In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established, however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict plant root-mycorrhizal fungi sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungus) interaction and leveraged more than 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120,000 experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensor systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor binding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and jasmonic acid. This multi-omic computational modeling approach for predicting complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.

  2. Alternative socio-centric approach for model validation - a way forward for socio-hydrology

    Science.gov (United States)

    van Emmerik, Tim; Elshafei, Yasmina; Mahendran, Roobavannan; Kandasamy, Jaya; Pande, Saket; Sivapalan, Murugesu

    2017-04-01

    To better understand and mitigate the impacts of humans on the water cycle, the importance of studying the co-evolution of coupled human-water systems has been recognized. Because of its unique system dynamics, the Murrumbidgee river basin (part of the larger Murray-Darling basin, Australia) is one of the main study areas in the emerging field of socio-hydrology. In recent years, various historical and modeling studies have contributed to gaining a better understanding of this system's behavior. Kandasamy et al. (2014) performed a historical study on the development of this coupled human-water system. They identified four eras, providing historical context for the observed "pendulum" swing: first an exclusive focus on agricultural development, followed by increasing environmental awareness, subsequent efforts to mitigate environmental damage, and finally efforts to restore environmental health. A modeling effort by Van Emmerik et al. (2014) focused on reconstructing hydrological, economic, and societal dynamics and their feedbacks. A measure of changing societal values was included by introducing environmental awareness as an endogenously modeled variable, which resulted in capturing the co-evolution between economic development and environmental health. Later work by Elshafei et al. (2015) modeled and analyzed the two-way feedbacks of land use management and land degradation in two other Australian coupled systems. A composite variable, community sensitivity, was used to measure changing community sentiment, such that the model was capable of isolating the two-way feedbacks in the coupled system. As socio-hydrology adopts a holistic approach, it is often required to introduce (hydrologically) unconventional variables, such as environmental awareness or community sensitivity. It is the subject of ongoing debate how such variables can be validated, as there is no standardized data set available from hydrological or statistical agencies. Recent research (Wei et al. 2017) has provided

  3. An integrated chemical biology approach identifies specific vulnerability of Ewing's sarcoma to combined inhibition of Aurora kinases A and B.

    Science.gov (United States)

    Winter, Georg E; Rix, Uwe; Lissat, Andrej; Stukalov, Alexey; Müllner, Markus K; Bennett, Keiryn L; Colinge, Jacques; Nijman, Sebastian M; Kubicek, Stefan; Kovar, Heinrich; Kontny, Udo; Superti-Furga, Giulio

    2011-10-01

    Ewing's sarcoma is a pediatric cancer of the bone that is characterized by the expression of the chimeric transcription factor EWS-FLI1 that confers a highly malignant phenotype and results from the chromosomal translocation t(11;22)(q24;q12). Poor overall survival and pronounced long-term side effects associated with traditional chemotherapy necessitate the development of novel, targeted, therapeutic strategies. We therefore conducted a focused viability screen with 200 small molecule kinase inhibitors in 2 different Ewing's sarcoma cell lines. This resulted in the identification of several potential molecular intervention points. Most notably, tozasertib (VX-680, MK-0457) displayed unique nanomolar efficacy, which extended to other cell lines, but was specific for Ewing's sarcoma. Furthermore, tozasertib showed strong synergies with the chemotherapeutic drugs etoposide and doxorubicin, the current standard agents for Ewing's sarcoma. To identify the relevant targets underlying the specific vulnerability toward tozasertib, we determined its cellular target profile by chemical proteomics. We identified 20 known and unknown serine/threonine and tyrosine protein kinase targets. Additional target deconvolution and functional validation by RNAi showed simultaneous inhibition of Aurora kinases A and B to be responsible for the observed tozasertib sensitivity, thereby revealing a new mechanism for targeting Ewing's sarcoma. We further corroborated our cellular observations with xenograft mouse models. In summary, the multilayered chemical biology approach presented here identified a specific vulnerability of Ewing's sarcoma to concomitant inhibition of Aurora kinases A and B by tozasertib and danusertib, which has the potential to become a new therapeutic option.

  4. A contemporary approach to validity arguments: a practical guide to Kane's framework.

    Science.gov (United States)

    Cook, David A; Brydges, Ryan; Ginsburg, Shiphra; Hatala, Rose

    2015-06-01

    Assessment is central to medical education and the validation of assessments is vital to their use. Earlier validity frameworks suffer from a multiplicity of types of validity or failure to prioritise among sources of validity evidence. Kane's framework addresses both concerns by emphasising key inferences as the assessment progresses from a single observation to a final decision. Evidence evaluating these inferences is planned and presented as a validity argument. We aim to offer a practical introduction to the key concepts of Kane's framework that educators will find accessible and applicable to a wide range of assessment tools and activities. All assessments are ultimately intended to facilitate a defensible decision about the person being assessed. Validation is the process of collecting and interpreting evidence to support that decision. Rigorous validation involves articulating the claims and assumptions associated with the proposed decision (the interpretation/use argument), empirically testing these assumptions, and organising evidence into a coherent validity argument. Kane identifies four inferences in the validity argument: Scoring (translating an observation into one or more scores); Generalisation (using the score[s] as a reflection of performance in a test setting); Extrapolation (using the score[s] as a reflection of real-world performance), and Implications (applying the score[s] to inform a decision or action). Evidence should be collected to support each of these inferences and should focus on the most questionable assumptions in the chain of inference. Key assumptions (and needed evidence) vary depending on the assessment's intended use or associated decision. Kane's framework applies to quantitative and qualitative assessments, and to individual tests and programmes of assessment. Validation focuses on evaluating the key claims, assumptions and inferences that link assessment scores with their intended interpretations and uses. The Implications

  5. A physarum-inspired prize-collecting steiner tree approach to identify subnetworks for drug repositioning.

    Science.gov (United States)

    Sun, Yahui; Hameed, Pathima Nusrath; Verspoor, Karin; Halgamuge, Saman

    2016-12-05

    Drug repositioning can reduce the time, costs and risks of drug development by identifying new therapeutic effects for known drugs. It is challenging to reposition drugs as pharmacological data is large and complex. Subnetwork identification has already been used to simplify the visualization and interpretation of biological data, but it has not been applied to drug repositioning so far. In this paper, we fill this gap by proposing a new Physarum-inspired Prize-Collecting Steiner Tree algorithm to identify subnetworks for drug repositioning. Drug Similarity Networks (DSN) are generated using the chemical, therapeutic, protein, and phenotype features of drugs. In DSNs, vertex prizes and edge costs represent the similarities and dissimilarities between drugs respectively, and terminals represent drugs in the cardiovascular class, as defined in the Anatomical Therapeutic Chemical classification system. A new Physarum-inspired Prize-Collecting Steiner Tree algorithm is proposed in this paper to identify subnetworks. We apply both the proposed algorithm and the widely-used GW algorithm to identify subnetworks in our 18 generated DSNs. In these DSNs, our proposed algorithm identifies subnetworks with an average Rand Index of 81.1%, while the GW algorithm can only identify subnetworks with an average Rand Index of 64.1%. We select 9 subnetworks with high Rand Index to find drug repositioning opportunities. 10 frequently occurring drugs in these subnetworks are identified as candidates to be repositioned for cardiovascular diseases. We find evidence to support previous discoveries that nitroglycerin, theophylline and acarbose may be able to be repositioned for cardiovascular diseases. Moreover, we identify seven previously unknown drug candidates that also may interact with the biological cardiovascular system. These discoveries show our proposed Prize-Collecting Steiner Tree approach as a promising strategy for drug repositioning.
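
    The Rand index used above to score identified subnetworks measures pairwise agreement between two groupings of drugs. A small sketch with toy labels, assuming scikit-learn's rand_score (available from version 0.24); the labels are placeholders, not the paper's DSN output:

```python
# Sketch of the evaluation metric named in the abstract: the Rand index between
# an identified subnetwork partition and the reference drug classes.
from sklearn.metrics import rand_score

reference_classes = ["cardio", "cardio", "other", "other", "cardio"]  # e.g. ATC classes
subnetwork_labels = [1, 1, 2, 2, 2]  # subnetwork membership found by the algorithm
print(f"Rand index: {rand_score(reference_classes, subnetwork_labels):.3f}")
```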

  6. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.
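
    The Pareto-frontier definition used above (an input set is kept if no other set fits all calibration targets as well or better) translates directly into a dominance filter, as in the following sketch. Each row holds one input set's goodness-of-fit errors across targets, lower being better; the numbers are illustrative, not from either model:

```python
# Minimal sketch of the Pareto-frontier idea: keep an input set only if no other
# set is at least as good on every target and strictly better on at least one.
import numpy as np

def pareto_frontier(errors: np.ndarray) -> np.ndarray:
    """Return a boolean mask over rows marking non-dominated input sets."""
    n = errors.shape[0]
    on_frontier = np.ones(n, dtype=bool)
    for i in range(n):
        others = errors[np.arange(n) != i]
        dominated = ((others <= errors[i]).all(axis=1) &
                     (others < errors[i]).any(axis=1)).any()
        on_frontier[i] = not dominated
    return on_frontier

# Toy example: 4 input sets, 2 calibration targets (columns are GOF errors).
errors = np.array([[0.10, 0.30], [0.20, 0.20], [0.15, 0.35], [0.30, 0.10]])
print(pareto_frontier(errors))  # [ True  True False  True ]; row 2 is dominated by row 0
```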

  7. A Methodology for Validating Safety Heuristics Using Clinical Simulations: Identifying and Preventing Possible Technology-Induced Errors Related to Using Health Information Systems

    Science.gov (United States)

    Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher

    2013-01-01

    Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902

  8. Screening and syndromic approaches to identify gonorrhea and chlamydial infection among women.

    Science.gov (United States)

    Sloan, N L; Winikoff, B; Haberland, N; Coggins, C; Elias, C

    2000-03-01

    The standard diagnostic tools to identify sexually transmitted infections are often expensive and have laboratory and infrastructure requirements that make them unavailable to family planning and primary health-care clinics in developing countries. Therefore, inexpensive, accessible tools that rely on symptoms, signs, and/or risk factors have been developed to identify and treat reproductive tract infections without the need for laboratory diagnostics. Studies were reviewed that used standard diagnostic tests to identify gonorrhea and cervical chlamydial infection among women and that provided adequate information about the usefulness of the tools for screening. Aggregation of the studies' results suggest that risk factors, algorithms, and risk scoring for syndromic management are poor indicators of gonorrhea and chlamydial infection in samples of both low and high prevalence and, consequently, are not effective mechanisms with which to identify or manage these conditions. The development and evaluation of other approaches to identify gonorrhea and chlamydial infections, including inexpensive and simple laboratory screening tools, periodic universal treatment, and other alternatives must be given priority.

  9. Fatigue Equivalent Stress State Approach Validation in Non-conservative Criteria: a Comparative Study

    Directory of Open Access Journals (Sweden)

    Kévin Martial Tsapi Tchoupou

    Full Text Available This paper is concerned with fatigue prediction models for estimating the multiaxial fatigue limit. An equivalent loading approach with zero out-of-phase angles, intended for fatigue limit evaluation under multiaxial loading, is used. Based on experimental data found in the literature, the equivalent stress is validated within the Crossland and Sines criteria and its predictions are compared to those of existing multiaxial fatigue criteria; results over 87 experimental items show that the equivalent stress approach is very efficient.

  10. Relative criterion for validity of a semiclassical approach to the dynamics near quantum critical points.

    Science.gov (United States)

    Wang, Qian; Qin, Pinquan; Wang, Wen-ge

    2015-10-01

    Based on an analysis of Feynman's path integral formulation of the propagator, a relative criterion is proposed for validity of a semiclassical approach to the dynamics near critical points in a class of systems undergoing quantum phase transitions. It is given by an effective Planck constant, in the relative sense that a smaller effective Planck constant implies better performance of the semiclassical approach. Numerical tests of this relative criterion are given in the XY model and in the Dicke model.

  11. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers, and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbations coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping-rate operation, with test results over-predicted within a reasonable margin of less than a factor of two. Some limitations of the methodology have also been identified for sources operating at a fixed rate or exhibiting a small number of dominant harmonics, and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  12. A new approach for the validation of skeletal muscle modelling using MRI data

    Science.gov (United States)

    Böl, Markus; Sturmat, Maike; Weichert, Christine; Kober, Cornelia

    2011-05-01

    Active and passive experiments on skeletal muscles are in general carried out either on isolated muscles or on whole muscle packages, such as the arm or the leg. Both methods have advantages and disadvantages. In experiments on isolated muscles, no information about the surrounding tissues is considered, which leads to an insufficient specification of the isolated muscle. In particular, the shape and fibre directions of an embedded muscle are completely different from those of the same muscle in isolation. An explicit advantage, in contrast, is the possibility of studying the mechanical characteristics in a unique, isolated way. In experiments on muscle packages, the aforementioned pros and cons reverse: the whole surrounding tissue contributes to the mechanical characteristics of the muscle, which are then much more difficult to identify, but an embedded muscle reflects a much more realistic situation than an isolated one. Thus, in the proposed work we suggest, to our knowledge for the first time, a technique that allows the study of single skeletal muscles inside a muscle package without any computation of the tissue around the muscle of interest. In doing so, we use magnetic resonance imaging data of an upper arm during contraction. By applying a three-dimensional continuum constitutive muscle model we are able to study the biceps brachii inside the upper arm and validate the modelling approach by optical experiments.

  13. A multi-indicator approach for identifying shoreline sewage pollution hotspots adjacent to coral reefs.

    Science.gov (United States)

    Abaya, Leilani M; Wiegner, Tracy N; Colbert, Steven L; Beets, James P; Carlson, Kaile'a M; Kramer, K Lindsey; Most, Rebecca; Couch, Courtney S

    2018-04-01

    Sewage pollution is contributing to the global decline of coral reefs. Identifying locations where it is entering waters near reefs is therefore a management priority. Our study documented shoreline sewage pollution hotspots in a coastal community with a fringing coral reef (Puakō, Hawai'i) using dye tracer studies, sewage indicator measurements, and a pollution scoring tool. Sewage reached shoreline waters within 9 h to 3 d. Fecal indicator bacteria concentrations were high and variable, and δ15N macroalgal values were indicative of sewage at many stations. Shoreline nutrient concentrations were two times higher than those in upland groundwater. Pollution hotspots were identified with a scoring tool using three sewage indicators. It confirmed known locations of sewage pollution from dye tracer studies. Our study highlights the need for a multi-indicator approach and scoring tool to identify sewage pollution hotspots. This approach will be useful for other coastal communities grappling with sewage pollution. Copyright © 2018 Elsevier Ltd. All rights reserved.
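
    A hedged sketch of what a three-indicator scoring tool of this kind might look like: each station's fecal indicator bacteria, δ15N, and nutrient measurements are scored against thresholds and summed into a shoreline pollution score. All thresholds, units, and cut-offs below are invented placeholders, not the study's calibrated values.

```python
# Illustrative three-indicator pollution scoring tool (placeholder thresholds).
def score_station(fib_cfu_per_100ml: float, d15n_permil: float,
                  nutrients_umol: float) -> int:
    score = 0
    # Each indicator contributes 0 (low), 1 (moderate), or 2 (high) points.
    score += 2 if fib_cfu_per_100ml > 100 else (1 if fib_cfu_per_100ml > 10 else 0)
    score += 2 if d15n_permil > 9 else (1 if d15n_permil > 6 else 0)
    score += 2 if nutrients_umol > 20 else (1 if nutrients_umol > 10 else 0)
    return score  # e.g. a total >= 4 could flag a sewage "hotspot"

print(score_station(fib_cfu_per_100ml=250, d15n_permil=9.5, nutrients_umol=12))  # 5
```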

  14. Non-Destructive Approaches for the Validation of Visually Observed Spatial Patterns of Decay

    Science.gov (United States)

    Johnston, Brian; McKinley, Jennifer; Warke, Patricia; Ruffell, Alastair

    2017-04-01

    Historical structures are regarded as a built legacy that is passed down through the generations, and as such the conservation and restoration of these buildings is of great importance to governmental, religious and charitable organisations. As these groups play the role of custodians of this built heritage, they are keen that the approaches employed in studies of stone condition are non-destructive in nature. Determining sections of facades requiring repair work is often achieved through a visual condition inspection of the stonework by a specialist. However, these reports focus upon identifying blocks requiring restorative action rather than determining spatial trends that lead to the identification of causes. This fixation on decay occurring at the block scale results in the spatial distribution of weathering present at the larger 'wall' scale appearing to have developed chaotically. Recent work has shown the importance of adopting a geomorphological focus when undertaking visual inspection of the facades of historical buildings to overcome this issue. Once trends have been ascertained, they can be used to bolster remedial strategies that target the sources of decay rather than just undertaking an aesthetic treatment of symptoms. Visual inspection of the study site, Fitzroy Presbyterian Church in Belfast, using the geomorphologically driven approach revealed three features suggestive of decay extending beyond the block scale: firstly, the influence of architectural features on the susceptibility of blocks to decay; secondly, the impact of the seasonal fluctuation in groundwater rise and the influence of aspect upon this process; and finally, the interconnectivity of blocks, due to deteriorating mortar and poor repointing, providing conduits for the passage of moisture. Once these patterns were identified, it proved necessary to validate the outcome of the visual inspection using other techniques. In this study

  15. Validity and Relative Ability of 4 Balance Tests to Identify Fall Status of Older Adults With Type 2 Diabetes.

    Science.gov (United States)

    Marques, Alda; Silva, Alexandre; Oliveira, Ana; Cruz, Joana; Machado, Ana; Jácome, Cristina

    The Berg Balance Scale (BBS), the Balance Evaluation Systems Test (BESTest), the Mini-BESTest, and the Brief-BESTest are useful tests to assess balance; however, their clinimetric properties have not been studied well in older adults with type 2 diabetes (T2D). This study compared the validity and relative ability of the BBS, BESTest, Mini-BESTest, and Brief-BESTest to identify fall status in older adults with T2D. This study involved a cross-sectional design. Sixty-six older adults with T2D (75 ± 7.6 years) were included and asked to report the number of falls during the previous 12 months and to complete the Activities-specific Balance Confidence scale. The BBS and the BESTest were administered, and the Mini-BESTest and Brief-BESTest scores were computed based on the BESTest performance. Receiver operating characteristics were used to assess the ability of each balance test to differentiate between participants with and without a history of falls. The 4 balance tests were able to identify fall status (areas under the curve = 0.74-0.76), with similar sensitivity (60%-67%) and specificity (71%-76%). The 4 balance tests were able to differentiate between older adults with T2D with and without a history of falls. As the BBS and the BESTest require longer application time, the Brief-BESTest may be an appropriate choice to use in clinical practice to detect fall risk.
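
    The receiver operating characteristic analysis reported above can be reproduced in outline with scikit-learn. The balance scores and fall labels below are simulated stand-ins for the BBS/BESTest data, which the abstract does not include:

```python
# Sketch of an ROC analysis for a balance test's ability to identify fall status.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
fallers = rng.normal(44, 6, 30)       # hypothetical balance scores, fallers
non_fallers = rng.normal(50, 6, 36)   # hypothetical balance scores, non-fallers
scores = np.concatenate([fallers, non_fallers])
fell = np.array([1] * 30 + [0] * 36)

# Lower balance scores should indicate fallers, so negate the scores for the ROC.
auc = roc_auc_score(fell, -scores)
fpr, tpr, thresholds = roc_curve(fell, -scores)
best = np.argmax(tpr - fpr)  # Youden's J picks a cut-point balancing sens/spec
print(f"AUC={auc:.2f} sensitivity={tpr[best]:.2f} specificity={1 - fpr[best]:.2f}")
```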

  16. Improving accuracy for identifying related PubMed queries by an integrated approach.

    Science.gov (United States)

    Lu, Zhiyong; Wilbur, W John

    2009-10-01

    PubMed is the most widely used tool for searching biomedical literature online. As with many other online search tools, a user often types a series of multiple related queries before retrieving satisfactory results to fulfill a single information need. Meanwhile, it is also a common phenomenon to see a user type queries on unrelated topics in a single session. In order to study PubMed users' search strategies, it is necessary to be able to automatically separate unrelated queries and group together related queries. Here, we report a novel approach combining both lexical and contextual analyses for segmenting PubMed query sessions and identifying related queries and compare its performance with the previous approach based solely on concept mapping. We experimented with our integrated approach on sample data consisting of 1539 pairs of consecutive user queries in 351 user sessions. The prediction results of 1396 pairs agreed with the gold-standard annotations, achieving an overall accuracy of 90.7%. This demonstrates that our approach is significantly better than the previously published method. By applying this approach to a one day query log of PubMed, we found that a significant proportion of information needs involved more than one PubMed query, and that most of the consecutive queries for the same information need are lexically related. Finally, the proposed PubMed distance is shown to be an accurate and meaningful measure for determining the contextual similarity between biological terms. The integrated approach can play a critical role in handling real-world PubMed query log data as is demonstrated in our experiments.
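
    The lexical half of such an approach can be illustrated with a simple term-overlap score on consecutive queries, splitting the session when the score falls below a cutoff. This is only a hedged sketch: the published method also uses contextual (PubMed-derived) similarity, and the whitespace tokenizer and 0.2 cutoff here are assumptions.

```python
# Illustrative lexical relatedness check for consecutive PubMed queries.
def jaccard(q1: str, q2: str) -> float:
    t1, t2 = set(q1.lower().split()), set(q2.lower().split())
    return len(t1 & t2) / len(t1 | t2) if t1 | t2 else 0.0

session = ["breast cancer brca1", "brca1 mutation penetrance", "zebrafish fin regeneration"]
for a, b in zip(session, session[1:]):
    related = jaccard(a, b) >= 0.2  # below the cutoff: start of a new information need
    print(f"{a!r} -> {b!r}: {'related' if related else 'new information need'}")
```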

  17. Validation of a LES turbulence modeling approach on a steady engine head flow

    NARCIS (Netherlands)

    Huijnen, V.; Somers, L.M.T.; Baert, R.S.G.; Goey, de L.P.H.; Dias, V.

    2005-01-01

    The application of the LES turbulence modeling approach in the Kiva-environment is validated on a complex geometry. Results for the steady flow in a realistic geometry of a production type heavy-duty diesel engine head with 120 mm cylinder bore are presented. The bulk Reynolds number is Reb = 1 fl

  18. Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs

    Science.gov (United States)

    Purba, J. H.

    2018-02-01

    Initiating events of a nuclear power plant being evaluated need to be identified before probabilistic safety assessment is applied to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching out the initiating events of next-generation nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis for their development. This study proposes another type of MLD approach for finding high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps, starting from the top event, representing the final objective of the safety functions, down to the basic event, representing the goal of the MLD development, which is an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirm that the proposed MLD is feasible for finding HTGR initiating events.

  19. A multicriteria approach to identify investment opportunities for the exploitation of the clean development mechanism

    International Nuclear Information System (INIS)

    Diakoulaki, D.; Georgiou, P.; Tourkolias, C.; Georgopoulou, E.; Lalas, D.; Mirasgedis, S.; Sarafidis, Y.

    2007-01-01

    The aim of the present paper is to investigate the prospects for the exploitation of the Kyoto Protocol's Clean Development Mechanism (CDM) in Greece. The paper addresses three questions: in which country, what kind of investment, and with which economic and environmental return? The proposed approach is based on a multicriteria analysis for identifying priority countries and interesting investment opportunities in each priority country. These opportunities are then evaluated through a conventional financial analysis in order to assess their economic and environmental attractiveness. To this purpose, the IRR of a typical project in each investment category is calculated by taking into account country-specific parameters, such as baseline emission factors, load factors, costs, energy prices, etc. The results reveal substantial differences in the economic and environmental return of different types of projects in different host countries and show that for the full exploitation of the CDM a multifaceted approach to decision-making is necessary

  20. Interpretative approaches to identifying sources of hydrocarbons in complex contaminated environments

    International Nuclear Information System (INIS)

    Sauer, T.C.; Brown, J.S.; Boehm, P.D.

    1993-01-01

    Recent advances in analytical instrumental hardware and software have permitted the use of more sophisticated approaches in identifying or fingerprinting sources of hydrocarbons in complex matrix environments. In natural resource damage assessments and contaminated site investigations of both terrestrial and aquatic environments, chemical fingerprinting has become an important interpretative tool. The alkyl homologues of the major polycyclic and heterocyclic aromatic hydrocarbons (e.g., phenanthrenes/anthracenes, dibenzothiophenes, chrysenes) have been found to be the most valuable hydrocarbons for differentiating hydrocarbon sources, but there are other hydrocarbon analytes, such as the chemical biomarkers steranes and triterpanes and the alkyl homologues of benzene, and chemical methodologies, such as scanning UV fluorescence, that have been found to be useful in certain environments. This presentation focuses on recent data interpretative approaches for hydrocarbon source identification assessments. Selection of appropriate target analytes and data quality requirements are discussed, and example cases, including results from the Arabian Gulf War oil spill, are presented

  1. Identifying reports of randomized controlled trials (RCTs) via a hybrid machine learning and crowdsourcing approach.

    Science.gov (United States)

    Wallace, Byron C; Noel-Storr, Anna; Marshall, Iain J; Cohen, Aaron M; Smalheiser, Neil R; Thomas, James

    2017-11-01

    Identifying all published reports of randomized controlled trials (RCTs) is an important aim, but it requires extensive manual effort to separate RCTs from non-RCTs, even using current machine learning (ML) approaches. We aimed to make this process more efficient via a hybrid approach using both crowdsourcing and ML. We trained a classifier to discriminate between citations that describe RCTs and those that do not. We then adopted a simple strategy of automatically excluding citations deemed very unlikely to be RCTs by the classifier and deferring to crowdworkers otherwise. Combining ML and crowdsourcing provides a highly sensitive RCT identification strategy (our estimates suggest 95%-99% recall) with substantially less effort (we observed a reduction of around 60%-80%) than relying on manual screening alone. Hybrid crowd-ML strategies warrant further exploration for biomedical curation/annotation tasks. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
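
    The triage strategy described, automatically excluding citations the classifier deems very unlikely to be RCTs and deferring the rest to crowdworkers, can be sketched in a few lines. The classifier choice, the tiny training snippets, and the exclusion threshold below are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of a hybrid crowd-ML triage: auto-exclude confident non-RCTs,
# send borderline citations to the crowd.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_titles = ["A randomized controlled trial of drug X",
                "Case report: rare presentation of disease Y",
                "Double-blind placebo-controlled study of Z",
                "Review of imaging methods"]
train_is_rct = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression()).fit(train_titles, train_is_rct)

def triage(citation: str, exclude_below: float = 0.05) -> str:
    p_rct = clf.predict_proba([citation])[0, 1]
    # A very conservative threshold keeps recall high, as the abstract reports;
    # everything not confidently excluded goes to crowdworkers.
    return "auto-exclude" if p_rct < exclude_below else "send to crowd"

print(triage("A pragmatic randomised trial of exercise therapy"))
```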

  2. Identifying western yellow-billed cuckoo breeding habitat with a dual modelling approach

    Science.gov (United States)

    Johnson, Matthew J.; Hatten, James R.; Holmes, Jennifer A.; Shafroth, Patrick B.

    2017-01-01

    The western population of the yellow-billed cuckoo (Coccyzus americanus) was recently listed as threatened under the federal Endangered Species Act. Yellow-billed cuckoo conservation efforts require the identification of features and area requirements associated with high quality, riparian forest habitat at spatial scales that range from nest microhabitat to landscape, as well as lower-suitability areas that can be enhanced or restored. Spatially explicit models inform conservation efforts by increasing ecological understanding of a target species, especially at landscape scales. Previous yellow-billed cuckoo modelling efforts derived plant-community maps from aerial photography, an expensive and oftentimes inconsistent approach. Satellite models can remotely map vegetation features (e.g., vegetation density, heterogeneity in vegetation density or structure) across large areas with near perfect repeatability, but they usually cannot identify plant communities. We used aerial photos and satellite imagery, and a hierarchical spatial scale approach, to identify yellow-billed cuckoo breeding habitat along the Lower Colorado River and its tributaries. Aerial-photo and satellite models identified several key features associated with yellow-billed cuckoo breeding locations: (1) a 4.5 ha core area of dense cottonwood-willow vegetation, (2) a large native, heterogeneously dense forest (72 ha) around the core area, and (3) moderately rough topography. The odds of yellow-billed cuckoo occurrence decreased rapidly as the amount of tamarisk cover increased or when cottonwood-willow vegetation was limited. We achieved model accuracies of 75–80% in the project area the following year after updating the imagery and location data. The two model types had very similar probability maps, largely predicting the same areas as high quality habitat. While each model provided unique information, a dual-modelling approach provided a more complete picture of yellow-billed cuckoo habitat

  3. A multi-criteria decision making approach to identify a vaccine formulation.

    Science.gov (United States)

    Dewé, Walthère; Durand, Christelle; Marion, Sandie; Oostvogels, Lidia; Devaster, Jeanne-Marie; Fourneau, Marc

    2016-01-01

    This article illustrates the use of a multi-criteria decision making approach, based on desirability functions, to identify an appropriate adjuvant composition for an influenza vaccine to be used in the elderly. The proposed adjuvant system contained two main elements: monophosphoryl lipid and α-tocopherol with squalene in an oil/water emulsion. The objective was to elicit a stronger immune response while maintaining an acceptable reactogenicity and safety profile. The study design, the statistical models, the choice of the desirability functions, the computation of the overall desirability index, and the assessment of the robustness of the ranking are all detailed in this manuscript.
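
    The desirability-function machinery the abstract refers to follows the familiar Derringer-Suich pattern: map each response onto [0, 1] and combine the individual desirabilities with a geometric mean. A minimal sketch, in which the response names, values, bounds, and shapes are invented placeholders rather than the vaccine study's models:

```python
# Sketch of desirability functions and an overall desirability index.
import numpy as np

def larger_is_better(y: float, low: float, high: float, s: float = 1.0) -> float:
    """Desirability rising from 0 at `low` to 1 at `high` (e.g. immune response)."""
    return float(np.clip((y - low) / (high - low), 0, 1)) ** s

def smaller_is_better(y: float, low: float, high: float, s: float = 1.0) -> float:
    """Desirability falling from 1 at `low` to 0 at `high` (e.g. reactogenicity)."""
    return float(np.clip((high - y) / (high - low), 0, 1)) ** s

# Overall desirability index: geometric mean of the individual desirabilities.
d_immuno = larger_is_better(y=1.8, low=1.0, high=2.5)      # hypothetical fold-rise in titres
d_reacto = smaller_is_better(y=0.12, low=0.05, high=0.30)  # hypothetical adverse-event rate
overall = (d_immuno * d_reacto) ** 0.5
print(f"D = {overall:.2f}")  # candidate formulations would be ranked by D
```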

  4. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    Directory of Open Access Journals (Sweden)

    Dori Barnett

    2012-06-01

    Full Text Available A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis. Implications for future research directions and policy and practice in the field of special and alternative education are discussed.

  5. A Proteomic Approach Identifies Candidate Early Biomarkers to Predict Severe Dengue in Children.

    Directory of Open Access Journals (Sweden)

    Dang My Nhi

    2016-02-01

    Full Text Available Severe dengue with severe plasma leakage (SD-SPL) is the most frequent form of severe dengue. Plasma biomarkers for the early predictive diagnosis of SD-SPL are required in primary clinics for the prevention of dengue death. Among 63 confirmed pediatric dengue patients recruited, this hospital-based longitudinal study detected six SD-SPL and ten dengue with warning signs (DWS) cases. To identify the specific proteins increased or decreased in SD-SPL plasma obtained 6-48 hours before the shock compared with DWS, isobaric tags for relative and absolute quantification (iTRAQ) technology was performed using four patients from each group. Validation was undertaken in 6 SD-SPL and 10 DWS patients. Nineteen plasma proteins exhibited significantly different relative concentrations (p<0.05), with five over-expressed and fourteen under-expressed in SD-SPL compared with DWS. Each protein was classified into blood coagulation, vascular regulation, cellular transport-related processes, or immune response categories. Immunoblot quantification showed angiotensinogen and antithrombin III significantly increased in the whole plasma of early-stage SD-SPL compared with DWS subjects. Even with this small number of samples, antithrombin III predicted SD-SPL accurately before shock occurrence. The proteins identified here may serve as candidate predictive markers to diagnose SD-SPL for timely clinical management. Since the number of subjects is small, further studies are needed to confirm all these biomarkers.

  6. Identifying approaches for assessing methodological and reporting quality of systematic reviews

    DEFF Research Database (Denmark)

    Pussegoda, Kusala; Turner, Lucy; Garritty, Chantelle

    2017-01-01

    BACKGROUND: The methodological quality and completeness of reporting of systematic reviews (SRs) is fundamental to optimal implementation of evidence-based health care and the reduction of research waste. Methods exist to appraise SRs, yet little is known about how they are used in SRs or where there are potential gaps in research best-practice guidance materials. The aims of this study are to identify reports assessing the methodological quality (MQ) and/or reporting quality (RQ) of a cohort of SRs and to assess their number, general characteristics, and approaches to 'quality' assessment over time. Quality assessment tools or reporting guidelines used as a proxy to assess RQ were used in 80% (61/76) of identified reports. These included two reporting guidelines (PRISMA and QUOROM), five quality assessment tools (AMSTAR, R-AMSTAR, OQAQ, Mulrow, Sacks), and GRADE criteria. The remaining 24% (18/76) of reports developed their own

  7. Identifying Mother-Child Interaction Styles Using a Person-Centered Approach.

    Science.gov (United States)

    Nelson, Jackie A; O'Brien, Marion; Grimm, Kevin J; Leerkes, Esther M

    2014-05-01

    Parent-child conflict in the context of a supportive relationship has been discussed as a potentially constructive interaction pattern; the current study is the first to test this using a holistic analytic approach. Interaction styles, defined as mother-child conflict in the context of maternal sensitivity, were identified and described with demographic and stress-related characteristics of families. Longitudinal associations were tested between interaction styles and children's later social competence. Participants included 814 partnered mothers with a first-grade child. Latent profile analysis identified agreeable, dynamic, and disconnected interaction styles. Mothers' intimacy with a partner, depressive symptoms, and authoritarian childrearing beliefs, along with children's later conflict with a best friend and externalizing problems, were associated with group membership. Notably, the dynamic style, characterized by high sensitivity and high conflict, included families who experienced psychological and relational stressors. Findings are discussed with regard to how family stressors shape parent-child interaction patterns.

  8. Identifying the Critical Links in Road Transportation Networks: Centrality-based approach utilizing structural properties

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, Supriya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Surface transportation road networks share structural properties with other complex networks (e.g., social networks, information networks, biological networks, and so on). This research investigates the structural properties of road networks for any possible correlation with traffic characteristics, such as link flows, that are determined independently. Additionally, we define a criticality index for the links of the road network that identifies their relative importance in the network. We tested our hypotheses with two sample road networks. Results show that correlation exists between the link flows and the centrality measures of a link of the road network (a dual graph approach is followed), and the criticality index is found to be effective for one test network in identifying the vulnerable nodes.
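
    The dual-graph idea maps each road link to a node, so link importance can be computed with ordinary node-centrality measures. A small sketch using networkx's line graph on a toy network; the network and any implied correlation with link flows are placeholders, not the study's data:

```python
# Sketch of link centrality via the dual graph (networkx line graph).
import networkx as nx

roads = nx.Graph()
roads.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

dual = nx.line_graph(roads)               # nodes of `dual` are the links of `roads`
centrality = nx.betweenness_centrality(dual)

# Rank links by centrality as one ingredient of a criticality index.
for link, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(link, round(c, 3))
```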

  9. Identifying natural compounds as multi-target-directed ligands against Alzheimer's disease: an in silico approach.

    Science.gov (United States)

    Ambure, Pravin; Bhat, Jyotsna; Puzyn, Tomasz; Roy, Kunal

    2018-04-23

    Alzheimer's disease (AD) is a multi-factorial disease, which can be simply outlined as an irreversible and progressive neurodegenerative disorder with an unclear root cause. It is a major cause of dementia in elderly people. In the present study, utilizing the structural and biological activity information of ligands for five important and most-studied vital targets (i.e. cyclin-dependent kinase 5, β-secretase, monoamine oxidase B, glycogen synthase kinase 3β, and acetylcholinesterase) that are believed to be effective against AD, we have developed five classification models using the linear discriminant analysis (LDA) technique. Considering the importance of data curation, we have given particular attention to chemical and biological data curation, which is a difficult task, especially in the case of big data-sets. Thus, to ease the curation process we have designed Konstanz Information Miner (KNIME) workflows, which are made available at http://teqip.jdvu.ac.in/QSAR_Tools/ . The developed models were appropriately validated based on the predictions for experiment-derived data from test sets, as well as true external set compounds including known multi-target compounds. The domain of applicability for each classification model was checked based on a confidence estimation approach. Further, these validated models were employed for screening of natural compounds collected from the InterBioScreen natural database ( https://www.ibscreen.com/natural-compounds ). The natural compounds that were categorized as 'actives' in at least two of the five developed classification models were considered multi-target leads, and these compounds were further screened using the drug-like filter and molecular docking technique, and then thoroughly analyzed using molecular dynamics studies. Finally, the most promising multi-target natural compounds against AD are suggested.
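
    One of the five classification models can be sketched with scikit-learn's linear discriminant analysis; the descriptor matrix and activity labels below are random placeholders standing in for the curated ligand data described above:

```python
# Minimal LDA classification sketch (placeholder data, not the curated sets).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))  # 10 hypothetical molecular descriptors per ligand
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"external-set accuracy: {lda.score(X_te, y_te):.2f}")

# Screening step: keep compounds predicted 'active'; a compound flagged active
# by at least two of five such target models would be a multi-target lead.
actives = X_te[lda.predict(X_te) == 1]
```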

  10. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Directory of Open Access Journals (Sweden)

    Victor Trevino

    2016-04-01

    Full Text Available The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell

  11. A Network Biology Approach Identifies Molecular Cross-Talk between Normal Prostate Epithelial and Prostate Carcinoma Cells.

    Science.gov (United States)

    Trevino, Victor; Cassese, Alberto; Nagy, Zsuzsanna; Zhuang, Xiaodong; Herbert, John; Antczak, Philipp; Clarke, Kim; Davies, Nicholas; Rahman, Ayesha; Campbell, Moray J; Guindani, Michele; Bicknell, Roy; Vannucci, Marina; Falciani, Francesco

    2016-04-01

    The advent of functional genomics has enabled the genome-wide characterization of the molecular state of cells and tissues, virtually at every level of biological organization. The difficulty in organizing and mining this unprecedented amount of information has stimulated the development of computational methods designed to infer the underlying structure of regulatory networks from observational data. These important developments had a profound impact in biological sciences since they triggered the development of a novel data-driven investigative approach. In cancer research, this strategy has been particularly successful. It has contributed to the identification of novel biomarkers, to a better characterization of disease heterogeneity and to a more in depth understanding of cancer pathophysiology. However, so far these approaches have not explicitly addressed the challenge of identifying networks representing the interaction of different cell types in a complex tissue. Since these interactions represent an essential part of the biology of both diseased and healthy tissues, it is of paramount importance that this challenge is addressed. Here we report the definition of a network reverse engineering strategy designed to infer directional signals linking adjacent cell types within a complex tissue. The application of this inference strategy to prostate cancer genome-wide expression profiling data validated the approach and revealed that normal epithelial cells exert an anti-tumour activity on prostate carcinoma cells. Moreover, by using a Bayesian hierarchical model integrating genetics and gene expression data and combining this with survival analysis, we show that the expression of putative cell communication genes related to focal adhesion and secretion is affected by epistatic gene copy number variation and it is predictive of patient survival. Ultimately, this study represents a generalizable approach to the challenge of deciphering cell communication networks

  12. Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data

    International Nuclear Information System (INIS)

    Palmiotti, G.; Salvatores, M.; Aliberti, G.

    2007-01-01

    The validation of advanced simulation tools will still play a very significant role in several areas of reactor system analysis. This is the case in reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives and approach. A validation effort is in particular necessary in the frame of advanced (e.g. Generation-IV or GNEP) reactor and associated fuel cycle assessment and design. Validation of simulation codes is complementary to the 'verification' process. In fact, 'verification' addresses the question 'are we solving the equations correctly?' while validation addresses the question 'are we solving the correct equations with the correct parameters?'. Verification implies comparisons with 'reference' equation solutions or with analytical solutions, when they exist. Most of what is called 'numerical validation' falls into this category. Validation strategies differ according to the relative weight of the methods and of the parameters that enter into the simulation tools. Most validation is based on experiments, and the field of neutronics, where a 'robust' physics description model exists as a function of 'input' parameters that are not fully known, will be the focus of this paper. In fact, in the case of reactor core, shielding and fuel cycle physics, the model (theory) is well established (the Boltzmann and Bateman equations) and the parameters are the nuclear cross-sections, decay data, etc. Two types of validation approach can be and have been used: (a) mock-up experiments ('global' validation), which require a very close experimental simulation of a reference configuration; bias factors cannot be extrapolated beyond the reference configuration; (b) use of 'clean', 'representative' integral experiments (the 'bias factor and adjustment' method), which allows one to define bias factors and uncertainties and can be used for a wide range of applications. It

  13. An Investigation to Validate the Grammar and Phonology Screening (GAPS) Test to Identify Children with Specific Language Impairment

    Science.gov (United States)

    van der Lely, Heather K. J.; Payne, Elisabeth; McClelland, Alastair

    2011-01-01

    Background. The extraordinarily high incidence of grammatical language impairments in developmental disorders suggests that this uniquely human cognitive function is "fragile". Yet our understanding of the neurobiology of grammatical impairments is limited. Furthermore, there is no "gold standard" to identify grammatical impairments, and routine screening is not undertaken. An accurate screening test to identify grammatical abilities would serve the research, health and education communities, further our understanding of developmental disorders, and identify children who need remediation, many of whom are currently undiagnosed. A potential realistic screening tool that could be widely administered is the Grammar and Phonology Screening (GAPS) test, a 10-minute test that can be administered by professionals and non-professionals alike. Here we provide a further step in evaluating the validity and accuracy (sensitivity and specificity) of the GAPS test in identifying children who have Specific Language Impairment (SLI). Methods and Findings. We tested three groups of children: two groups aged 3;6-6;6, a typically developing group (n = 30) and a group diagnosed with SLI (Young (Y)-SLI; n = 11), and a further group aged 6;9-8;11 with SLI (Older (O)-SLI; n = 10) who were above the test age norms. We employed a battery of language assessments, including the GAPS test, to assess the children's language abilities. For Y-SLI children, analyses revealed a sensitivity and specificity at the 5th and 10th percentiles of 1.00 and 0.98, respectively, and for O-SLI children at the 10th and 15th percentiles of 0.83 and 0.90, respectively. Conclusions. The findings reveal that the GAPS is highly accurate in identifying impaired vs. non-impaired children up to 6;8 years, and has moderate-to-high accuracy up to 9 years. The results indicate that GAPS is a realistic tool for the early identification of grammatical abilities and impairment in young children. A larger

  14. Validating the Copenhagen Psychosocial Questionnaire (COPSOQ-II) Using Set-ESEM: Identifying Psychosocial Risk Factors in a Sample of School Principals.

    Science.gov (United States)

    Dicke, Theresa; Marsh, Herbert W; Riley, Philip; Parker, Philip D; Guo, Jiesi; Horwood, Marcus

    2018-01-01

School principals worldwide report high levels of strain and attrition, resulting in a shortage of qualified principals. It is thus crucial to identify psychosocial risk factors that reflect principals' occupational wellbeing. For this purpose, we used the Copenhagen Psychosocial Questionnaire (COPSOQ-II), a widely used self-report measure covering multiple psychosocial factors identified by leading occupational stress theories. We evaluated the COPSOQ-II regarding factor structure and longitudinal, discriminant, and convergent validity using latent structural equation modeling in a large sample of Australian school principals (N = 2,049). Results reveal that confirmatory factor analysis produced marginally acceptable model fit. A novel approach we call set exploratory structural equation modeling (set-ESEM), where cross-loadings were only allowed within a priori defined sets of factors, fit well and was more parsimonious than a full ESEM. Further multitrait-multimethod models based on the set-ESEM confirm the importance of a principal's psychosocial risk factors: stressors and depression were related to demands and ill-being, while confidence and autonomy were related to wellbeing. We also show that working in the private sector was associated with low psychosocial risk, while other demographics had little effect. Finally, we identify five latent risk profiles (high risk to no risk) of school principals based on all psychosocial factors. Overall, the research presented here closes the theory-application gap for a strong multi-dimensional measure of psychosocial risk factors.

  15. I know what you want to know: the impact of interviewees' ability to identify criteria on interview performance and construct-related validity

    NARCIS (Netherlands)

    Melchers, K.G.; Klehe, U.-C.; Richter, G.M.; Kleinmann, M.; König, C.J.; Lievens, F.

    2009-01-01

    The current study tested whether candidates' ability to identify the targeted interview dimensions fosters their interview success as well as the interviews' convergent and discriminant validity. Ninety-two interviewees participated in a simulated structured interview developed to measure three

  16. A unified approach to validation, reliability, and education study design for surgical technical skills training.

    Science.gov (United States)

    Sweet, Robert M; Hananel, David; Lawrenz, Frances

    2010-02-01

To present modern educational psychology theory and apply these concepts to the validity and reliability of surgical skills training and assessment. In a series of cross-disciplinary meetings, we applied a unified set of behavioral science principles and theory to medical technical skills education, given recent advances in behavioral psychology and statistics. While validation of individual simulation tools is important, each tool is only one piece of a multimodal curriculum that in and of itself deserves examination and study. We propose concurrent validation throughout the design of a simulation-based curriculum rather than once it is complete. We embrace the concept that validity and curriculum development are interdependent, ongoing processes that are never truly complete. Individual predictive, construct, content, and face validity aspects should not be considered separately but as interdependent and complementary toward an end application. Such an approach could help guide our acceptance and appropriate application of these exciting new training and assessment tools for technical skills training in medicine.

  17. An innovative and integrated approach based on DNA walking to identify unauthorised GMOs.

    Science.gov (United States)

    Fraiture, Marie-Alice; Herman, Philippe; Taverniers, Isabel; De Loose, Marc; Deforce, Dieter; Roosens, Nancy H

    2014-03-15

In the coming years, the frequency of unauthorised genetically modified organisms (GMOs) being present in the European food and feed chain will increase significantly. Therefore, we have developed a strategy to identify unauthorised GMOs containing a pCAMBIA family vector, which is frequently present in transgenic plants. This integrated approach is performed in two successive steps on Bt rice grains. First, the potential presence of unauthorised GMOs is assessed by qPCR with SYBR®Green technology targeting the terminator 35S pCAMBIA element. Second, its presence is confirmed via the characterisation of the junction between the transgenic cassette and the rice genome. To this end, a DNA walking strategy is applied using a first reverse primer followed by two semi-nested PCR rounds, with primers that are each nested relative to the previous reverse primer. This approach allows rapid identification of the transgene flanking region and can easily be implemented by enforcement laboratories. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Accurately identifying patients who are excellent candidates or unsuitable for a medication: a novel approach

    Directory of Open Access Journals (Sweden)

    South C

    2017-12-01

Full Text Available Objective: The objective of the study was to determine whether a unique analytic approach – as a proof of concept – could identify individual depressed outpatients (using 30 baseline clinical and demographic variables) who are very likely (75% certain) to not benefit (NB) or to remit (R), accepting that without sufficient certainty, no prediction (NP) would be made. Methods: Patients from the Combining Medications to Enhance Depression Outcomes trial treated with escitalopram (S-CIT) + placebo (n=212) or S-CIT + bupropion-SR (n=206) were analyzed separately to assess replicability. For each treatment, the elastic net was used to identify subsets of predictive baseline measures for R and NB, separately. Two different equations that estimate the likelihood of remission and of no benefit were developed for each patient. The ratio of these two numbers characterized likely outcomes for each patient. Results: The two treatment cells had comparable rates of remission (40%) and no benefit (22%). In S-CIT + bupropion-SR, 11 patients were predicted NB, of which 82% were correct; 26 were predicted R, 85% correct (169 had NP). For S-CIT + placebo, 13 were predicted NB, 69% correct; 44 were predicted R, 75% correct (155 were NP). Overall, 94/418 (22%) patients were identified with a meaningful degree of certainty (69%–85% correct). Different variable sets with some overlap were predictive of remission and no benefit within and across treatments, despite comparable outcomes. Conclusion: In two separate analyses with two
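
    A minimal sketch of the modelling idea, assuming scikit-learn and synthetic data (the paper fits separate equations for remission and no benefit; this simplified version uses a single elastic-net logistic model and a 75% certainty threshold):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))  # 30 baseline clinical/demographic variables
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)  # 1 = remit

    # Elastic net selects a sparse subset of predictive baseline measures.
    model = LogisticRegression(penalty="elasticnet", solver="saga",
                               l1_ratio=0.5, max_iter=5000).fit(X, y)

    proba = model.predict_proba(X)[:, 1]
    # Call R or NB only when sufficiently certain; otherwise no prediction (NP).
    calls = np.where(proba >= 0.75, "R", np.where(proba <= 0.25, "NB", "NP"))
    print(dict(zip(*np.unique(calls, return_counts=True))))
    ```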

  19. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions can be estimated or bounded by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers.

  20. A Systematic Approach to Identify Sources of Abnormal Interior Noise for a High-Speed Train

    Directory of Open Access Journals (Sweden)

    Jie Zhang

    2018-01-01

Full Text Available A systematic approach to identify sources of abnormal interior noise occurring in a high-speed train is presented and applied in this paper to resolve a particular noise issue. This approach was developed on the basis of a number of previous dealings with similar noise problems. The particular noise issue occurs in a Chinese high-speed train: a difference of 7 dB(A) in overall Sound Pressure Level (SPL) was measured between two nominally identical VIP cabins at 250 km/h. The systematic approach is applied to identify the root cause of this 7 dB(A) difference. Well-planned measurements are performed in both VIP cabins. Sound pressure contributions, either in terms of frequency band or in terms of facing area, are analyzed. Order analysis is also carried out. Based on these analyses, it is found that the problematic frequency is the sleeper passing frequency of the train, and that an area on the roof contributes the most. In order to determine what causes that area to be the main contributor without disassembling the structure of the roof, measured noise and vibration data for different train speeds are further analyzed. It is then reasoned that the roof is the main contributor because of the sound pressure behind its panels. At this point, panels of the roof were removed, revealing a 300 cm² hole for running cables behind the identified area, without proper sound insulation. This study can provide a basis for abnormal interior noise analysis and control in high-speed trains.
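
    The sleeper passing frequency itself is simple arithmetic: train speed divided by sleeper spacing. A one-line check (the abstract does not give the spacing; 0.6 m is assumed here as a typical value):

    ```python
    v = 250 / 3.6  # 250 km/h expressed in m/s
    d = 0.6        # assumed sleeper spacing in metres (not stated in the abstract)
    print(f"sleeper passing frequency ≈ {v / d:.1f} Hz")  # ≈ 115.7 Hz
    ```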

  1. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    Science.gov (United States)

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
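
    The Stage 2 comparison reduces to a Mahalanobis distance between an individual's activation peak and the distribution of control peaks. A minimal sketch with invented coordinates (not the study's data):

    ```python
    import numpy as np
    from scipy.spatial.distance import mahalanobis

    # Hypothetical peak coordinates (x, y, z in mm) from control participants
    # for one spelling-related region, and one peak from the patient.
    control_peaks = np.array([[-44, -58, 42], [-40, -62, 40], [-46, -55, 44],
                              [-42, -60, 38], [-45, -57, 41]], dtype=float)
    patient_peak = np.array([-30, -70, 50], dtype=float)

    cov_inv = np.linalg.inv(np.cov(control_peaks, rowvar=False))
    d = mahalanobis(patient_peak, control_peaks.mean(axis=0), cov_inv)
    print(f"Mahalanobis distance = {d:.2f}")  # large values flag atypical peaks
    ```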

  2. A Multiomics Approach to Identify Genes Associated with Childhood Asthma Risk and Morbidity.

    Science.gov (United States)

    Forno, Erick; Wang, Ting; Yan, Qi; Brehm, John; Acosta-Perez, Edna; Colon-Semidey, Angel; Alvarez, Maria; Boutaoui, Nadia; Cloutier, Michelle M; Alcorn, John F; Canino, Glorisa; Chen, Wei; Celedón, Juan C

    2017-10-01

    Childhood asthma is a complex disease. In this study, we aim to identify genes associated with childhood asthma through a multiomics "vertical" approach that integrates multiple analytical steps using linear and logistic regression models. In a case-control study of childhood asthma in Puerto Ricans (n = 1,127), we used adjusted linear or logistic regression models to evaluate associations between several analytical steps of omics data, including genome-wide (GW) genotype data, GW methylation, GW expression profiling, cytokine levels, asthma-intermediate phenotypes, and asthma status. At each point, only the top genes/single-nucleotide polymorphisms/probes/cytokines were carried forward for subsequent analysis. In step 1, asthma modified the gene expression-protein level association for 1,645 genes; pathway analysis showed an enrichment of these genes in the cytokine signaling system (n = 269 genes). In steps 2-3, expression levels of 40 genes were associated with intermediate phenotypes (asthma onset age, forced expiratory volume in 1 second, exacerbations, eosinophil counts, and skin test reactivity); of those, methylation of seven genes was also associated with asthma. Of these seven candidate genes, IL5RA was also significant in analytical steps 4-8. We then measured plasma IL-5 receptor α levels, which were associated with asthma age of onset and moderate-severe exacerbations. In addition, in silico database analysis showed that several of our identified IL5RA single-nucleotide polymorphisms are associated with transcription factors related to asthma and atopy. This approach integrates several analytical steps and is able to identify biologically relevant asthma-related genes, such as IL5RA. It differs from other methods that rely on complex statistical models with various assumptions.

  3. Validation of Sustainable Development Practices Scale Using the Bayesian Approach to Item Response Theory

    Directory of Open Access Journals (Sweden)

    Martin Hernani Merino

    2014-12-01

Full Text Available There has been growing recognition of the importance of creating performance measurement tools for the economic, social and environmental management of micro and small enterprises (MSEs). In this context, this study aims to validate an instrument to assess perceptions of sustainable development practices by MSEs by means of a Graded Response Model (GRM) with a Bayesian approach to Item Response Theory (IRT). The results, based on a sample of 506 university students in Peru, suggest that a valid measurement instrument was achieved. At the end of the paper, methodological and managerial contributions are presented.
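
    For readers unfamiliar with the GRM, each ordered response category is assigned a probability derived from cumulative logistic curves. A minimal sketch of those category probabilities (illustrative parameter values only; the paper estimates them with a Bayesian approach):

    ```python
    import numpy as np

    def grm_probs(theta, a, b):
        """Category probabilities under Samejima's Graded Response Model.
        theta: latent trait; a: discrimination; b: ordered thresholds (K-1)."""
        p_star = 1 / (1 + np.exp(-a * (theta - np.asarray(b))))  # P(X >= k)
        p_star = np.concatenate(([1.0], p_star, [0.0]))
        return p_star[:-1] - p_star[1:]                          # P(X = k)

    # Hypothetical 5-point Likert item on sustainable-development practices.
    print(grm_probs(theta=0.3, a=1.4, b=[-1.5, -0.5, 0.4, 1.2]).round(3))
    ```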

  4. Validity of Purchasing Power Parity in BRICS under a DFA Approach

    Directory of Open Access Journals (Sweden)

    Emmanuel Numapau Gyamfi

    2017-02-01

Full Text Available This study tests the validity of the purchasing power parity (PPP) theory in Brazil, Russia, India, Macao-China and South Africa. We examine the real exchange rates of these countries for mean reversion. The Hurst exponent, evaluated by Detrended Fluctuation Analysis (DFA) in a rolling window, is our mean-reversion measure, used to determine the validity of the PPP theory among these countries through time. Our results show persistence in real exchange rates, an indication that does not support the PPP theory in the five countries. The study contributes to the extant literature on the PPP theory in BRICS by using the DFA approach in a rolling window through time.
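
    A compact version of the DFA estimator can be written in a few lines of NumPy (a sketch of standard first-order DFA, not the authors' code); in the paper this would be applied repeatedly over a rolling window of the real exchange rate series:

    ```python
    import numpy as np

    def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
        """First-order DFA scaling exponent (≈ Hurst exponent) of series x."""
        y = np.cumsum(x - np.mean(x))            # integrated profile
        flucts = []
        for s in scales:
            n = len(y) // s
            segs = y[:n * s].reshape(n, s)
            t = np.arange(s)
            rms = []
            for seg in segs:                     # detrend each segment linearly
                coef = np.polyfit(t, seg, 1)
                rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            flucts.append(np.sqrt(np.mean(rms)))
        # slope of log F(s) vs log s is the DFA exponent
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

    rng = np.random.default_rng(1)
    print(round(dfa_exponent(rng.normal(size=2048)), 2))  # ≈ 0.5 for white noise
    ```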

  5. Calibrated photostimulated luminescence is an effective approach to identify irradiated orange during storage

    International Nuclear Information System (INIS)

    Jo, Yunhee; Sanyal, Bhaskar; Chung, Namhyeok; Lee, Hyun-Gyu; Park, Yunji; Park, Hae-Jun; Kwon, Joong-Ho

    2015-01-01

Photostimulated luminescence (PSL) has been employed as a fast screening method for various irradiated foods. In this study, the potential use of PSL was evaluated to identify oranges irradiated with gamma rays, electron beam and X-rays (0–2 kGy) and stored under different conditions for 6 weeks. The effects of light conditions (natural light, artificial light, and dark) and storage temperatures (4 and 20 °C) on PSL photon counts (PCs) during post-irradiation periods were studied. Non-irradiated samples always showed negative values of PCs, while irradiated oranges exhibited intermediate results on first PSL measurements; however, the irradiated samples had much higher PCs. The PCs of all samples declined as storage time increased. Calibrated second PSL measurements showed a PSL ratio <10 for the irradiated samples after 3 weeks of irradiation, confirming their irradiation status under all storage conditions. Calibrated PSL and sample storage in the dark at 4 °C were found to be the most suitable approaches for identifying irradiated oranges during storage. - Highlights: • Photostimulated luminescence (PSL) was studied to identify irradiated oranges for quarantine applications. • PSL detection efficiency was compared among gamma, electron, and X-ray irradiation during the shelf-life of oranges. • PSL properties of samples were characterized using standard samples. • Calibrated PSL gave a clear verdict on irradiation, extending the potential of the PSL technique.

  6. Structured methodology review identified seven (RETREAT) criteria for selecting qualitative evidence synthesis approaches.

    Science.gov (United States)

    Booth, Andrew; Noyes, Jane; Flemming, Kate; Gerhardus, Ansgar; Wahlster, Philip; van der Wilt, Gert Jan; Mozygemba, Kati; Refolo, Pietro; Sacchini, Dario; Tummers, Marcia; Rehfuess, Eva

    2018-07-01

    To compare and contrast different methods of qualitative evidence synthesis (QES) against criteria identified from the literature and to map their attributes to inform selection of the most appropriate QES method to answer research questions addressed by qualitative research. Electronic databases, citation searching, and a study register were used to identify studies reporting QES methods. Attributes compiled from 26 methodological papers (2001-2014) were used as a framework for data extraction. Data were extracted into summary tables by one reviewer and then considered within the author team. We identified seven considerations determining choice of methods from the methodological literature, encapsulated within the mnemonic Review question-Epistemology-Time/Timescale-Resources-Expertise-Audience and purpose-Type of data. We mapped 15 different published QES methods against these seven criteria. The final framework focuses on stand-alone QES methods but may also hold potential when integrating quantitative and qualitative data. These findings offer a contemporary perspective as a conceptual basis for future empirical investigation of the advantages and disadvantages of different methods of QES. It is hoped that this will inform appropriate selection of QES approaches. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Matrine Is Identified as a Novel Macropinocytosis Inducer by a Network Target Approach

    Directory of Open Access Journals (Sweden)

    Bo Zhang

    2018-01-01

Full Text Available Comprehensively understanding the pharmacological functions of natural products is a key issue to be addressed for the discovery of new drugs. Unlike some single-target drugs, natural products always exert diverse therapeutic effects through acting on a “network” that consists of multiple targets, making it necessary to develop a systematic approach, e.g., network pharmacology, to reveal the pharmacological functions of natural products and infer their mechanisms of action. In this work, to identify the “network target” of a natural product, we perform a functional analysis of matrine, a drug marketed in China and extracted from the medicinal herb Ku-Shen (Radix Sophorae Flavescentis). Here, the network target of matrine was first predicted by drugCIPHER, a genome-wide target prediction method. Based on the network target of matrine, we performed a functional gene set enrichment analysis to computationally identify the potential pharmacological functions of matrine, most of which are supported by literature evidence, including the neurotoxicity and neuropharmacological activities of matrine. Furthermore, computational results demonstrated that matrine has the potential for the induction of macropinocytosis and the regulation of ATP metabolism. Our experimental data revealed that the large vesicles induced by matrine are consistent with the typical characteristics of macropinosomes. Our verification results also suggested that matrine could decrease cellular ATP levels. These findings demonstrate the availability and effectiveness of the network target strategy for identifying the comprehensive pharmacological functions of natural products.
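
    The gene set enrichment step rests on a simple overlap test; a common variant uses the hypergeometric tail probability (the numbers below are invented for illustration):

    ```python
    from scipy.stats import hypergeom

    # Hypothetical numbers: genome of 20,000 genes, 150 genes in a pathway,
    # 300 predicted matrine targets, 12 of which fall in the pathway.
    M, K, n, k = 20000, 150, 300, 12
    p = hypergeom.sf(k - 1, M, K, n)  # P(overlap >= k) under a random draw
    print(f"enrichment p-value = {p:.2e}")
    ```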

  8. Fingerprint-Based Machine Learning Approach to Identify Potent and Selective 5-HT2BR Ligands

    Directory of Open Access Journals (Sweden)

    Krzysztof Rataj

    2018-05-01

Full Text Available The identification of subtype-selective GPCR (G-protein coupled receptor) ligands is a challenging task. In this study, we developed a computational protocol to find compounds with 5-HT2BR versus 5-HT1BR selectivity. Our approach employs a hierarchical combination of machine learning methods, docking, and multiple scoring methods. First, we applied machine learning tools to filter a large database of druglike compounds by the new Neighbouring Substructures Fingerprint (NSFP). This two-dimensional fingerprint contains information on the connectivity of the substructural features of a compound. Preselected subsets of the database were then subjected to docking calculations. The main indicators of compounds' selectivity were their different interactions with the secondary binding pockets of both target proteins, while binding modes within the orthosteric binding pocket were preserved. The combined methodology of ligand-based and structure-based methods was validated prospectively, resulting in the identification of hits with nanomolar affinity and ten-fold to ten-thousand-fold selectivities.
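
    A minimal ligand-based step in the spirit of this pipeline, sketched with RDKit and scikit-learn; since the NSFP fingerprint is specific to this work, a standard Morgan fingerprint is used as a stand-in, and the molecules and labels are toy placeholders:

    ```python
    import numpy as np
    from rdkit import Chem
    from rdkit.Chem import AllChem
    from sklearn.svm import SVC

    smiles = ["CCO", "c1ccccc1O", "CCN(CC)CC", "c1ccc2[nH]ccc2c1"]  # toy molecules
    labels = [0, 1, 0, 1]                     # hypothetical 5-HT2BR-selective = 1

    def fingerprint(smi, n_bits=1024):
        mol = Chem.MolFromSmiles(smi)
        return np.array(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits))

    X = np.array([fingerprint(s) for s in smiles])
    clf = SVC(kernel="rbf").fit(X, labels)
    print(clf.decision_function(X))           # higher = more "selective-like"
    ```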

  9. Algorithms to identify colonic ischemia, complications of constipation and irritable bowel syndrome in medical claims data: development and validation.

    Science.gov (United States)

    Sands, Bruce E; Duh, Mei-Sheng; Cali, Clorinda; Ajene, Anuli; Bohn, Rhonda L; Miller, David; Cole, J Alexander; Cook, Suzanne F; Walker, Alexander M

    2006-01-01

    A challenge in the use of insurance claims databases for epidemiologic research is accurate identification and verification of medical conditions. This report describes the development and validation of claims-based algorithms to identify colonic ischemia, hospitalized complications of constipation, and irritable bowel syndrome (IBS). From the research claims databases of a large healthcare company, we selected at random 120 potential cases of IBS and 59 potential cases each of colonic ischemia and hospitalized complications of constipation. We sought the written medical records and were able to abstract 107, 57, and 51 records, respectively. We established a 'true' case status for each subject by applying standard clinical criteria to the available chart data. Comparing the insurance claims histories to the assigned case status, we iteratively developed, tested, and refined claims-based algorithms that would capture the diagnoses obtained from the medical records. We set goals of high specificity for colonic ischemia and hospitalized complications of constipation, and high sensitivity for IBS. The resulting algorithms substantially improved on the accuracy achievable from a naïve acceptance of the diagnostic codes attached to insurance claims. The specificities for colonic ischemia and serious complications of constipation were 87.2 and 92.7%, respectively, and the sensitivity for IBS was 98.9%. U.S. commercial insurance claims data appear to be usable for the study of colonic ischemia, IBS, and serious complications of constipation. (c) 2005 John Wiley & Sons, Ltd.

  10. Evaluation of validity of Integrated Management of Childhood Illness guidelines in identifying edema of nutritional causes among Egyptian children.

    Science.gov (United States)

    El Habashy, Safinaz A; Mohamed, Maha H; Amin, Dina A; Marzouk, Diaa; Farid, Mohammed N

    2015-12-01

The aim of this study was to assess the validity of the Integrated Management of Childhood Illness (IMCI) algorithm to detect the edematous type of malnutrition in Egyptian infants and children ranging in age from 2 months to 5 years. This study was carried out by surveying 23 082 children aged between 2 months and 5 years visiting the pediatric outpatient clinic, Ain Shams University Hospital, over a period of 6 months. Thirty-eight patients with edema of both feet on their primary visit were enrolled in the study. Every child was assessed using the IMCI algorithm 'assess and classify' by the same physician, together with a systematic clinical evaluation with all relevant investigations. Twenty-two patients (57.9%) were proven to have a nutritional etiology. The 'weight for age' sign had a sensitivity of 95.5%, a specificity of 56%, and a diagnostic accuracy of 78.95% in the identification of nutritional edema among all cases of bipedal edema. Combinations of the IMCI symptoms 'pallor, visible severe wasting, fever, diarrhea' and 'weight for age' increased the sensitivity to 100%, but with a low specificity of 38% and a diagnostic accuracy of 73.68%. Bipedal edema and low weight for age as part of the IMCI algorithm can identify edema of nutritional etiology with 100% sensitivity, but with 37% specificity. Revisions need to be made to the IMCI guidelines published in 2010 by the Egyptian Ministry of Health in the light of the new WHO guidelines of 2014.

  11. Hidden Markov model approach for identifying the modular framework of the protein backbone.

    Science.gov (United States)

    Camproux, A C; Tuffery, P; Chevrolat, J P; Boisvieux, J F; Hazout, S

    1999-12-01

The hidden Markov model (HMM) was used to identify recurrent short 3D structural building blocks (SBBs) describing protein backbones, independently of any a priori knowledge. Polypeptide chains are decomposed into a series of short segments defined by their inter-alpha-carbon distances. Basically, the model takes into account the sequentiality of the observed segments and assumes that each one corresponds to one of several possible SBBs. Fitting the model to a database of non-redundant proteins allowed us to decode proteins in terms of 12 distinct SBBs with different roles in protein structure. Some SBBs correspond to classical regular secondary structures. Others correspond to a significant subdivision of their bounding regions previously considered to be a single pattern. The major contribution of the HMM is that this model implicitly takes into account the sequential connections between SBBs and thus describes the most probable pathways by which the blocks are connected to form the framework of the protein structures. Validation of the SBB code was performed by extracting SBB series repeated in recoded proteins and examining their structural similarities. Preliminary results on the sequence specificity of SBBs suggest promising perspectives for the prediction of SBBs or series of SBBs from the protein sequences.
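
    A minimal sketch of the modelling idea with the hmmlearn library (not the authors' code): hidden states stand in for the structural building blocks, and the observations are per-segment inter-alpha-carbon distance descriptors, faked here with random numbers:

    ```python
    import numpy as np
    from hmmlearn import hmm

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))  # toy stand-in for per-segment distance features
    lengths = [250, 250]           # two "protein chains" concatenated in X

    # 12 hidden states, mirroring the 12 SBBs found in the paper.
    model = hmm.GaussianHMM(n_components=12, covariance_type="diag", n_iter=25)
    model.fit(X, lengths)
    print(model.predict(X[:250])[:20])  # decode one chain into SBB-like labels
    ```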

  12. A stepwise approach to identify intellectual disabilities in the criminal justice system

    Directory of Open Access Journals (Sweden)

    Valentina Cabral Iversen

    2010-07-01

Full Text Available A significant proportion of prison inmates have an IQ level corresponding to intellectual disability (ID) or borderline ID. These persons are rarely identified and subsequently are not offered any compensation for their learning and comprehension deficits. The purpose of this study was to explore and help provide methods for better identification of ID at an early stage during criminal proceedings. 143 randomly selected prisoners serving sentences were assessed using the Wechsler Abbreviated Scale of Intelligence (WASI) and the Hayes Ability Screening Index (HASI), while a semi-structured interview was carried out to obtain data on health as well as social and criminological issues. A total of 10.8% (n = 15) of the participants showed an IQ below 70. From previous analyses of the semi-structured interview, a checklist was extracted and found to have good predictive validity for ID (AUC = 93%). The resulting identification referred 32% (n = 46) of the sample for comprehensive assessment. Within this group, all participants with an IQ below 70 were included. Identification through this checklist, the screening and a full assessment is essential in improving the quality of services.

  13. An affinity pull-down approach to identify the plant cyclic nucleotide interactome

    KAUST Repository

    Donaldson, Lara Elizabeth; Meier, Stuart Kurt

    2013-01-01

Cyclic nucleotides (CNs) are intracellular second messengers that play an important role in mediating physiological responses to environmental and developmental signals, in species ranging from bacteria to humans. In response to these signals, CNs are synthesized by nucleotidyl cyclases and then act by binding to and altering the activity of downstream target proteins known as cyclic nucleotide-binding proteins (CNBPs). A number of CNBPs have been identified across kingdoms, including transcription factors, protein kinases, phosphodiesterases, and channels, all of which harbor conserved CN-binding domains. In plants, however, few CNBPs have been identified, as homology searches fail to return plant sequences with significant matches to known CNBPs. Recently, affinity pull-down techniques have been successfully used to identify CNBPs in animals and have provided new insights into CN signaling. The application of these techniques to plants has not yet been extensively explored and offers an alternative approach toward the unbiased discovery of novel CNBP candidates in plants. Here, an affinity pull-down technique for the identification of the plant CN interactome is presented. In summary, the method involves preparing an extract of plant proteins, which is incubated with a CN-bait, followed by a series of increasingly stringent elutions that eliminates proteins in a sequential manner according to their affinity to the bait. The eluted and bait-bound proteins are separated by one-dimensional gel electrophoresis, excised, and digested with trypsin, after which the resultant peptides are identified by mass spectrometry - techniques that are commonplace in proteomics experiments. The discovery of plant CNBPs promises to provide valuable insight into the mechanism of CN signal transduction in plants. © Springer Science+Business Media New York 2013.

  14. Identifying key performance indicators for nursing and midwifery care using a consensus approach.

    Science.gov (United States)

    McCance, Tanya; Telford, Lorna; Wilson, Julie; Macleod, Olive; Dowd, Audrey

    2012-04-01

    The aim of this study was to gain consensus on key performance indicators that are appropriate and relevant for nursing and midwifery practice in the current policy context. There is continuing demand to demonstrate effectiveness and efficiency in health and social care and to communicate this at boardroom level. Whilst there is substantial literature on the use of clinical indicators and nursing metrics, there is less evidence relating to indicators that reflect the patient experience. A consensus approach was used to identify relevant key performance indicators. A nominal group technique was used comprising two stages: a workshop involving all grades of nursing and midwifery staff in two HSC trusts in Northern Ireland (n = 50); followed by a regional Consensus Conference (n = 80). During the workshop, potential key performance indicators were identified. This was used as the basis for the Consensus Conference, which involved two rounds of consensus. Analysis was based on aggregated scores that were then ranked. Stage one identified 38 potential indicators and stage two prioritised the eight top-ranked indicators as a core set for nursing and midwifery. The relevance and appropriateness of these indicators were confirmed with nurses and midwives working in a range of settings and from the perspective of service users. The eight indicators identified do not conform to the majority of other nursing metrics generally reported in the literature. Furthermore, they are strategically aligned to work on the patient experience and are reflective of the fundamentals of nursing and midwifery practice, with the focus on person-centred care. Nurses and midwives have a significant contribution to make in determining the extent to which these indicators are achieved in practice. Furthermore, measurement of such indicators provides an opportunity to evidence of the unique impact of nursing/midwifery care on the patient experience. © 2011 Blackwell Publishing Ltd.

  16. Validation of the TAPS-1: A Four-Item Screening Tool to Identify Unhealthy Substance Use in Primary Care.

    Science.gov (United States)

    Gryczynski, Jan; McNeely, Jennifer; Wu, Li-Tzy; Subramaniam, Geetha A; Svikis, Dace S; Cathers, Lauretta A; Sharma, Gaurav; King, Jacqueline; Jelstrom, Eve; Nordeck, Courtney D; Sharma, Anjalee; Mitchell, Shannon G; O'Grady, Kevin E; Schwartz, Robert P

    2017-09-01

    The Tobacco, Alcohol, Prescription Medication, and Other Substance use (TAPS) tool is a combined two-part screening and brief assessment developed for adult primary care patients. The tool's first-stage screening component (TAPS-1) consists of four items asking about past 12-month use for four substance categories, with response options of never, less than monthly, monthly, weekly, and daily or almost daily. To validate the TAPS-1 in primary care patients. Participants completed the TAPS tool in self- and interviewer-administered formats, in random order. In this secondary analysis, the TAPS-1 was evaluated against DSM-5 substance use disorder (SUD) criteria to determine optimal cut-points for identifying unhealthy substance use at three severity levels (problem use, mild SUD, and moderate-to-severe SUD). Two thousand adult patients at five primary care sites. DSM-5 SUD criteria were determined via the modified Composite International Diagnostic Interview. Oral fluid was used as a biomarker of recent drug use. Optimal frequency-of-use cut-points on the self-administered TAPS-1 for identifying SUDs were ≥ monthly use for tobacco and alcohol (sensitivity = 0.92 and 0.71, specificity = 0.80 and 0.85, AUC = 0.86 and 0.78, respectively) and any reported use for illicit drugs and prescription medication misuse (sensitivity = 0.93 and 0.89, specificity = 0.85 and 0.91, AUC = 0.89 and 0.90, respectively). The performance of the interviewer-administered format was similar. When administered first, the self-administered format yielded higher disclosure rates for past 12-month alcohol use, illicit drug use, and prescription medication misuse. Frequency of use alone did not provide sufficient information to discriminate between gradations of substance use problem severity. Among those who denied drug use on the TAPS-1, less than 4% had a drug-positive biomarker. The TAPS-1 can identify unhealthy substance use in primary care patients with a high level of accuracy

  17. Identifying problematic Internet users: development and validation of the Internet Motive Questionnaire for Adolescents (IMQ-A).

    Science.gov (United States)

    Bischof-Kastner, Christina; Kuntsche, Emmanuel; Wolstein, Jörg

    2014-10-09

Internationally, up to 15.1% of intensive Internet use among adolescents is dysfunctional. To provide a basis for early intervention and preventive measures, understanding the motives behind intensive Internet use is important. This study aims to develop a questionnaire, the Internet Motive Questionnaire for Adolescents (IMQ-A), as a theory-based measurement for identifying the underlying motives for high-risk Internet use. More precisely, the aim was to confirm the 4-factor structure (ie, social, enhancement, coping, and conformity motives) as well as its construct and concurrent validity. Another aim was to identify the motivational differences between high-risk and low-risk Internet users. A sample of 101 German adolescents (female: 52.5%, 53/101; age: mean 15.9, SD 1.3 years) was recruited. High-risk users (n=47) and low-risk users (n=54) were identified based on a screening measure for online addiction behavior in children and adolescents (Online-Suchtverhalten-Skala, OSVK-S). Here, "high-risk" Internet use means use that exceeds the level of intensive Internet use (OSVK-S sum score ≥7). The confirmatory factor analysis confirmed the IMQ-A's 4-factor structure. A reliability analysis revealed good internal consistencies of the subscales (0.71 up to 0.86). Moreover, regression analyses confirmed that the enhancement and coping motive groups significantly predicted high-risk Internet consumption and the OSVK-S sum score. A mixed-model ANOVA confirmed that adolescents mainly access the Internet for social motives, followed by enhancement and coping motives, and that high-risk users access the Internet more frequently for coping and enhancement motives than low-risk users. Low-risk users were primarily motivated socially. The IMQ-A enables the assessment of motives related to adolescent Internet use and thus the identification of populations at risk. The questionnaire enables the development of preventive measures or early intervention programs, especially dealing

  18. Hurricane Sandy Economic Impacts Assessment: A Computable General Equilibrium Approach and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Boero, Riccardo [Los Alamos National Laboratory; Edwards, Brian Keith [Los Alamos National Laboratory

    2017-08-07

    Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model degree of accuracy and the assessed total damage caused by Hurricane Sandy.

  19. Electromagnetic scattering problems -Numerical issues and new experimental approaches of validation

    Energy Technology Data Exchange (ETDEWEB)

    Geise, Robert; Neubauer, Bjoern; Zimmer, Georg [University of Braunschweig, Institute for Electromagnetic Compatibility, Schleinitzstrasse 23, 38106 Braunschweig (Germany)

    2015-03-10

Electromagnetic scattering problems, i.e. the question of how radiated energy spreads when impinging on an object, are an essential part of wave propagation. Though Maxwell's differential equations, as the starting point, are actually quite simple, the integral formulation of an object's boundary conditions, and thus the solution for the unknown induced currents, can in most cases only be solved numerically. As a timely topic of practical importance, the scattering of rotating wind turbines is discussed, the numerical description of which is still based on rigorous approximations with as yet unspecified accuracy. In this context, the issue of validating numerical solutions is addressed, both with reference simulations and, in particular, with the experimental approach of scaled measurements. For the latter, the idea of an incremental validation is proposed, allowing a step-by-step validation of required new mathematical models in scattering theory.

  20. Novel VEGFR-2 kinase inhibitor identified by the back-to-front approach

    Science.gov (United States)

    Sanphanya, Kingkan; Phowichit, Suwadee; Wattanapitayakul, Suvara K.; Fokin, Valery V.; Vajragupta, Opa

    2013-01-01

A novel lead was developed as a VEGFR-2 inhibitor by the back-to-front approach. A docking experiment indicated that the 3-chloromethylphenylurea motif occupies the back pocket of the VEGFR-2 kinase. An attempt to enhance the binding affinity of compound 1 was made by expanding the structure to access the front pocket, using a triazole as the linker. A library of 1,4-(disubstituted)-1H-1,2,3-triazoles was screened in silico and one lead compound (VH02) was identified, with an enzymatic IC50 against VEGFR-2 of 0.56 μM. VH02 showed an antiangiogenic effect by inhibiting tube formation of HUVEC cells (EA.hy926) at 0.3 μM, 13 times lower than its cytotoxic dose. The enzymatic and cellular activities suggest the potential of VH02 as a lead for further optimization. PMID:23562241

  1. Identifying niche-mediated regulatory factors of stem cell phenotypic state: a systems biology approach.

    Science.gov (United States)

    Ravichandran, Srikanth; Del Sol, Antonio

    2017-02-01

Understanding how the cellular niche controls the stem cell phenotype is often hampered by the complexity of variegated niche composition, its dynamics, and nonlinear stem cell-niche interactions. Here, we propose a systems biology view that considers stem cell-niche interactions as a many-body problem amenable to simplification by the concept of mean-field approximation. This enables approximation of the niche effect on stem cells as a constant field that induces sustained activation/inhibition of specific stem cell signaling pathways in all stem cells within heterogeneous populations exhibiting the same phenotype (niche determinants). This view offers a new basis for the development of single cell-based computational approaches for identifying niche determinants, which has potential applications in regenerative medicine and tissue engineering. © 2017 The Authors. FEBS Letters published by John Wiley & Sons Ltd on behalf of Federation of European Biochemical Societies.

  2. Metabolomic approach to identifying bioactive compounds in berries: advances toward fruit nutritional enhancement.

    Science.gov (United States)

    Stewart, Derek; McDougall, Gordon J; Sungurtas, Julie; Verrall, Susan; Graham, Julie; Martinussen, Inger

    2007-06-01

    Plant polyphenolics continue to be the focus of attention with regard to their putative impact on human health. An increasing and ageing human population means that the focus on nutrition and nutritional enhancement or optimisation of our foodstuffs is paramount. Using the raspberry as a model, we have shown how modern metabolic profiling approaches can be used to identify the changes in the level of beneficial polyphenolics in fruit breeding segregating populations and how the level of these components is determined by genetic and/or environmental control. Interestingly, the vitamin C content appeared to be significantly influenced by environment (growth conditions) whilst the content of the polyphenols such as cyanidin, pelargonidin and quercetin glycosides appeared much more tightly regulated, suggesting a rigorous genetic control. Preliminary metabolic profiling showed that the fruit polyphenolic profiles divided into two gross groups segregating on the basis of relative levels of cyanidin-3-sophoroside and cyanidin-3-rutinoside, compounds implicated as conferring human health benefits.

  3. Identifying the critical financial ratios for stocks evaluation: A fuzzy delphi approach

    Science.gov (United States)

    Mokhtar, Mazura; Shuib, Adibah; Mohamad, Daud

    2014-12-01

Stock evaluation has always been an interesting and challenging problem for both researchers and practitioners. Generally, the evaluation can be made based on a set of financial ratios. Nevertheless, there is a variety of financial ratios that can be considered, and if all ratios in the set were placed into the evaluation process, data collection would be more difficult and time consuming. Thus, the objective of this paper is to identify the most important financial ratios upon which to focus in order to evaluate a stock's performance. For this purpose, a survey was carried out using an approach based on expert judgement, namely the Fuzzy Delphi Method (FDM). The results of this study indicate that return on equity, return on assets, net profit margin, operating profit margin, earnings per share and debt to equity are the most important ratios.
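
    One common FDM variant aggregates each expert's triangular fuzzy rating (l, m, u) by taking the minimum lower bound, the geometric mean of the modes, and the maximum upper bound, then defuzzifies and applies a cut-off. A sketch with hypothetical ratings (the paper's exact variant and threshold may differ):

    ```python
    import numpy as np

    # Hypothetical expert ratings for one financial ratio, already mapped to
    # triangular fuzzy numbers (l, m, u) on a 0-1 importance scale.
    ratings = np.array([(0.5, 0.7, 0.9), (0.7, 0.9, 1.0), (0.5, 0.7, 0.9),
                        (0.7, 0.9, 1.0), (0.7, 0.9, 1.0)])

    l = ratings[:, 0].min()                   # most pessimistic lower bound
    m = np.exp(np.log(ratings[:, 1]).mean())  # geometric mean of the modes
    u = ratings[:, 2].max()                   # most optimistic upper bound

    score = (l + m + u) / 3                   # simple defuzzification
    print(f"defuzzified importance = {score:.3f}, keep = {score >= 0.7}")
    ```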

  4. Identifying Personal Goals of Patients With Long Term Condition: A Service Design Thinking Approach.

    Science.gov (United States)

    Lee, Eunji; Gammon, Deede

    2017-01-01

Care for patients with long-term conditions is often characterized as fragmented and ineffective, and fails to engage the resources of patients and their families in the care process. Information and communication technology can potentially help bridge the gap between patients' lives and resources and the services provided by professionals. However, there is little attention on how to identify and incorporate patients' individual needs, values, preferences and care goals into digitally driven care settings. We conducted a case study, in which healthcare professionals and patients participated, applying a service design thinking approach. Using examples from their own experiences, the participants were able to elaborate personal goals of patients with long-term conditions that could potentially be incorporated into digitally driven care plans.

  5. A Complementary Bioinformatics Approach to Identify Potential Plant Cell Wall Glycosyltransferase-Encoding Genes

    DEFF Research Database (Denmark)

    Egelund, Jack; Skjøt, Michael; Geshi, Naomi

    2004-01-01

Plant cell wall (CW) synthesizing enzymes can be divided into the glycan (i.e. cellulose and callose) synthases, which are multimembrane-spanning proteins located at the plasma membrane, and the glycosyltransferases (GTs), which are Golgi-localized single membrane-spanning proteins, believed.... Although much is known with regard to the composition and fine structures of the plant CW, only a handful of CW biosynthetic GT genes (all classified in the CAZy system) have been characterized. In an effort to identify CW GTs that have not yet been classified in the CAZy database, a simple bioinformatics approach was adopted. First, the entire Arabidopsis proteome was run through the Transmembrane Hidden Markov Model 2.0 server and proteins containing one or, more rarely, two transmembrane domains within the N-terminal 150 amino acids were collected. Second, these sequences were submitted

  6. Xtalk: a path-based approach for identifying crosstalk between signaling pathways

    Science.gov (United States)

    Tegge, Allison N.; Sharp, Nicholas; Murali, T. M.

    2016-01-01

Motivation: Cells communicate with their environment via signal transduction pathways. On occasion, the activation of one pathway can produce an effect downstream of another pathway, a phenomenon known as crosstalk. Existing computational methods to discover such pathway pairs rely on simple overlap statistics. Results: We present Xtalk, a path-based approach for identifying pairs of pathways that may crosstalk. Xtalk computes the statistical significance of the average length of multiple short paths that connect receptors in one pathway to the transcription factors in another. By design, Xtalk reports the precise interactions and mechanisms that support the identified crosstalk. We applied Xtalk to signaling pathways in the KEGG and NCI-PID databases. We manually curated a gold-standard set of 132 crosstalking pathway pairs and a set of 140 pairs that did not crosstalk, on which Xtalk achieved an area under the receiver operating characteristic curve of 0.65, a 12% improvement over the closest competing approach. The area under the receiver operating characteristic curve varied with the pathway, suggesting that crosstalk should be evaluated on a pathway-by-pathway level. We also analyzed an extended set of 658 pathway pairs in KEGG and a set of more than 7,000 pathway pairs in NCI-PID. For the top-ranking pairs, we found substantial support in the literature (81% for KEGG and 78% for NCI-PID). We provide examples of networks computed by Xtalk that accurately recovered known mechanisms of crosstalk. Availability and implementation: The XTALK software is available at http://bioinformatics.cs.vt.edu/~murali/software. Crosstalk networks are available at http://graphspace.org/graphs?tags=2015-bioinformatics-xtalk. Contact: ategge@vt.edu, murali@cs.vt.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26400040
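
    The core statistic is easy to emulate with networkx: compute the average length of short receptor-to-TF paths, then compare it against a permutation null (a simplified sketch using shortest paths on a random graph; Xtalk itself scores multiple short paths on curated interactomes):

    ```python
    import random
    import networkx as nx

    random.seed(1)
    G = nx.gnm_random_graph(200, 800, seed=1)  # toy stand-in for an interactome
    receptors, tfs = [0, 1, 2], [50, 51, 52]   # hypothetical pathway node sets

    def avg_path_len(G, sources, targets):
        lens = [nx.shortest_path_length(G, s, t)
                for s in sources for t in targets if nx.has_path(G, s, t)]
        return sum(lens) / len(lens)

    observed = avg_path_len(G, receptors, tfs)
    nodes = list(G)
    null = [avg_path_len(G, random.sample(nodes, 3), random.sample(nodes, 3))
            for _ in range(200)]
    p = sum(n <= observed for n in null) / len(null)  # shorter = crosstalk-like
    print(f"observed mean path length = {observed:.2f}, empirical p = {p:.2f}")
    ```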

  7. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    Science.gov (United States)

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The validation strategy was applied to a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of qualitative screening methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method is valid for detecting and identifying 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.

  8. A generalized approach for producing, quantifying, and validating citizen science data from wildlife images.

    Science.gov (United States)

    Swanson, Alexandra; Kosmala, Margaret; Lintott, Chris; Packer, Craig

    2016-06-01

Citizen science has the potential to expand the scope and scale of research in ecology and conservation, but many professional researchers remain skeptical of data produced by nonexperts. We devised an approach for producing accurate, reliable data from untrained, nonexpert volunteers. On the citizen science website www.snapshotserengeti.org, more than 28,000 volunteers classified 1.51 million images taken in a large-scale camera-trap survey in Serengeti National Park, Tanzania. Each image was circulated to, on average, 27 volunteers, and their classifications were aggregated using a simple plurality algorithm. We validated the aggregated answers against a data set of 3,829 images verified by experts and calculated three certainty metrics to measure confidence that an aggregated answer was correct: level of agreement among classifications (evenness), fraction of classifications supporting the aggregated answer (fraction support), and fraction of classifiers who reported "nothing here" for an image that was ultimately classified as containing an animal (fraction blank). Overall, aggregated volunteer answers agreed with the expert-verified data on 98% of images, but accuracy differed by species commonness, such that rare species had higher rates of false positives and false negatives. Easily calculated analysis of variance and post-hoc Tukey tests indicated that the certainty metrics were significant indicators of whether each image was correctly classified or classifiable. Thus, the certainty metrics can be used to identify images for expert review. Bootstrapping analyses further indicated that 90% of images were correctly classified with just five volunteers per image. Species classifications based on the plurality vote of multiple citizen scientists can provide a reliable foundation for large-scale monitoring of African wildlife. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
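
    The aggregation logic is straightforward to reproduce; a sketch of the plurality vote and the three certainty metrics (evenness computed here as Pielou's index, an assumption about the paper's exact formula):

    ```python
    import math
    from collections import Counter

    def aggregate(votes):
        """Plurality answer plus certainty metrics for one image."""
        counts = Counter(votes)
        answer, top = counts.most_common(1)[0]
        n = len(votes)
        probs = [c / n for c in counts.values()]
        evenness = (-sum(p * math.log(p) for p in probs) / math.log(len(counts))
                    if len(counts) > 1 else 0.0)        # 0 = full agreement
        fraction_support = top / n
        fraction_blank = counts.get("nothing here", 0) / n
        return answer, evenness, fraction_support, fraction_blank

    votes = ["wildebeest"] * 22 + ["buffalo"] * 3 + ["nothing here"] * 2
    print(aggregate(votes))  # ('wildebeest', ..., 0.81, 0.07)
    ```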

  9. Identifying Green Infrastructure from Social Media and Crowdsourcing- An Image Based Machine-Learning Approach.

    Science.gov (United States)

    Rai, A.; Minsker, B. S.

    2016-12-01

In this work, we introduce a novel dataset, GRID (GReen Infrastructure Detection Dataset), and a framework for identifying urban green stormwater infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties, and cities typically do not know where GI is located, making study of its impacts or siting of new GI difficult. We use object recognition learning methods (template matching, the sliding window approach, and the Random Hough Forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding window method outperformed the other methods, achieving an average F-measure (a combined precision-recall metric) of 0.78.

  10. OPTICAL CROSS-CORRELATION FILTERS: AN ECONOMICAL APPROACH FOR IDENTIFYING SNe Ia AND ESTIMATING THEIR REDSHIFTS

    International Nuclear Information System (INIS)

    Scolnic, Daniel M.; Riess, Adam G.; Huber, Mark E.; Rest, Armin; Stubbs, Christopher W.; Tonry, John L.

    2009-01-01

    Large photometric surveys of transient phenomena, such as the Panoramic Survey Telescope and Rapid Response System and the Large Synoptic Survey Telescope, will locate thousands to millions of Type Ia supernova (SN Ia) candidates per year, a rate prohibitive for acquiring spectroscopy to determine each candidate's type and redshift. In response, we have developed an economical approach to identifying SNe Ia and their redshifts using an uncommon type of optical filter which has multiple, discontinuous passbands on a single substrate. Observation of a supernova through a specially designed pair of these 'cross-correlation filters' measures the approximate amplitude and phase of the cross-correlation between the spectrum and a SN Ia template, a quantity typically used to determine the redshift and type of a high-redshift SN Ia. Simulating the use of these filters, we obtain a sample of SNe Ia which is ∼98% pure, with individual redshifts measured to σ_z = 0.01 precision. The advantages of this approach over standard broadband photometric methods are that it is insensitive to reddening, that it is independent of the color data used for subsequent distance determinations (which reduces selection or interpretation bias), and that its use of spectral features makes it more reliable. A great advantage over long-slit spectroscopy comes from increased throughput, enhanced multiplexing, and reduced setup time, resulting in a net gain in speed of up to ∼30 times. This approach is also insensitive to host galaxy contamination. Prototype filters were built and successfully used on Magellan with LDSS-3 to characterize three SuperNova Legacy Survey candidates. We discuss how these filters can provide critical information for the upcoming photometric supernova surveys.
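
    The quantity the filters approximate photometrically, the cross-correlation between a spectrum and a SN Ia template over trial redshifts, can be sketched in its spectroscopic form with NumPy (a toy analogue, not the filter design itself; grids and thresholds below are arbitrary):

        import numpy as np

        def xcorr_redshift(wave_obs, flux_obs, wave_tmpl, flux_tmpl,
                           z_grid=np.linspace(0.0, 1.0, 1001)):
            """Estimate z by maximizing the normalized cross-correlation of
            an observed spectrum with a template shifted to each trial z."""
            best_z, best_cc = None, -np.inf
            f_obs = (flux_obs - flux_obs.mean()) / flux_obs.std()
            for z in z_grid:
                # Shift the template to trial z, resample onto the observed
                # wavelength grid, and mask the non-overlapping ends.
                f_t = np.interp(wave_obs, wave_tmpl * (1 + z), flux_tmpl,
                                left=np.nan, right=np.nan)
                ok = ~np.isnan(f_t)
                if ok.sum() < 50:
                    continue
                f_t = (f_t[ok] - f_t[ok].mean()) / f_t[ok].std()
                cc = np.mean(f_obs[ok] * f_t)
                if cc > best_cc:
                    best_z, best_cc = z, cc
            return best_z, best_cc

        # Toy usage: a fake template and a "spectrum" that is it at z = 0.3.
        wave_t = np.linspace(3000, 7000, 2000)
        flux_t = np.exp(-0.5 * ((wave_t - 4000) / 300) ** 2)
        wave_o = np.linspace(4000, 8000, 2000)
        flux_o = np.interp(wave_o, wave_t * 1.3, flux_t) + 0.01
        print(xcorr_redshift(wave_o, flux_o, wave_t, flux_t))  # ~(0.3, ...)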

  11. A statistical approach for identifying the ionospheric footprint of magnetospheric boundaries from SuperDARN observations

    Directory of Open Access Journals (Sweden)

    G. Lointier

    2008-02-01

    Identifying and tracking the projection of magnetospheric regions on the high-latitude ionosphere is of primary importance for studying the Solar Wind-Magnetosphere-Ionosphere system and for space weather applications. By its unique spatial coverage and temporal resolution, the Super Dual Auroral Radar Network (SuperDARN) provides key parameters, such as the Doppler spectral width, which allows the monitoring of the ionospheric footprint of some magnetospheric boundaries in near real-time. In this study, we present the first results of a statistical approach for monitoring these magnetospheric boundaries. The singular value decomposition is used as a data reduction tool to describe the backscattered echoes with a small set of parameters. One of these is strongly correlated with the Doppler spectral width, and can thus be used as a proxy for it. Based on this, we propose a Bayesian classifier for identifying the spectral width boundary, which is classically associated with the Polar Cap boundary. The results are in good agreement with previous studies. Two advantages of the method are that it can be applied in near real-time and that it can select the appropriate threshold level for the boundary detection.
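
    The data-reduction step can be pictured with a small NumPy sketch: each echo becomes a feature vector, the SVD supplies a compact basis, and the leading coefficients act as proxies for quantities such as the spectral width. Everything below (the data, and the threshold rule standing in for the paper's Bayesian classifier) is invented for illustration:

        import numpy as np

        # Rows = radar echoes, columns = raw measurements (hypothetical).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(5000, 12))

        # Center and decompose; right singular vectors give a reduced basis.
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

        # Keep the first k components: a small set of parameters per echo.
        k = 3
        params = Xc @ Vt[:k].T          # shape (5000, k)

        # One component may track Doppler spectral width; thresholding it
        # mimics a crude (non-Bayesian) boundary classifier.
        proxy = params[:, 0]
        is_polar_cap_side = proxy > np.quantile(proxy, 0.8)
        print(params.shape, is_polar_cap_side.mean())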

  12. Integrative screening approach identifies regulators of polyploidization and targets for acute megakaryocytic leukemia

    Science.gov (United States)

    Wen, Qiang; Goldenson, Benjamin; Silver, Serena J.; Schenone, Monica; Dancik, Vladimir; Huang, Zan; Wang, Ling-Zhi; Lewis, Timothy; An, W. Frank; Li, Xiaoyu; Bray, Mark-Anthony; Thiollier, Clarisse; Diebold, Lauren; Gilles, Laure; Vokes, Martha S.; Moore, Christopher B.; Bliss-Moreau, Meghan; VerPlank, Lynn; Tolliday, Nicola J.; Mishra, Rama; Vemula, Sasidhar; Shi, Jianjian; Wei, Lei; Kapur, Reuben; Lopez, Cécile K.; Gerby, Bastien; Ballerini, Paola; Pflumio, Francoise; Gilliland, D. Gary; Goldberg, Liat; Birger, Yehudit; Izraeli, Shai; Gamis, Alan S.; Smith, Franklin O.; Woods, William G.; Taub, Jeffrey; Scherer, Christina A.; Bradner, James; Goh, Boon-Cher; Mercher, Thomas; Carpenter, Anne E.; Gould, Robert J.; Clemons, Paul A.; Carr, Steven A.; Root, David E.; Schreiber, Stuart L.; Stern, Andrew M.; Crispino, John D.

    2012-01-01

    Summary The mechanism by which cells decide to skip mitosis to become polyploid is largely undefined. Here we used a high-content image-based screen to identify small-molecule probes that induce polyploidization of megakaryocytic leukemia cells and serve as perturbagens to help understand this process. We found that dimethylfasudil (diMF, H-1152P) selectively increased polyploidization, mature cell-surface marker expression, and apoptosis of malignant megakaryocytes. A broadly applicable, highly integrated target identification approach employing proteomic and shRNA screening revealed that a major target of diMF is Aurora A kinase (AURKA), which has not been studied extensively in megakaryocytes. Moreover, we discovered that MLN8237 (Alisertib), a selective inhibitor of AURKA, induced polyploidization and expression of mature megakaryocyte markers in AMKL blasts and displayed potent anti-AMKL activity in vivo. This research provides the rationale to support clinical trials of MLN8237 and other inducers of polyploidization in AMKL. Finally, we have identified five networks of kinases that regulate the switch to polyploidy. PMID:22863010

  13. Modelling Creativity: Identifying Key Components through a Corpus-Based Approach.

    Science.gov (United States)

    Jordanous, Anna; Keller, Bill

    2016-01-01

    Creativity is a complex, multi-faceted concept encompassing a variety of related aspects, abilities, properties and behaviours. If we wish to study creativity scientifically, then a tractable and well-articulated model of creativity is required. Such a model would be of great value to researchers investigating the nature of creativity and, in particular, those concerned with the evaluation of creative practice. This paper describes a unique approach to developing such a model of how creative behaviour emerges, based on the words people use to describe the concept. Using techniques from the field of statistical natural language processing, we identify a collection of fourteen key components of creativity through an analysis of a corpus of academic papers on the topic. Words are identified which appear significantly often in connection with discussions of the concept. Using a measure of lexical similarity to help cluster these words, a number of distinct themes emerge, which collectively contribute to a comprehensive and multi-perspective model of creativity. The components provide an ontology of creativity: a set of building blocks which can be used to model creative practice in a variety of domains. The components have been employed in two case studies to evaluate the creativity of computational systems and have proven useful in articulating the achievements of this work and directions for further research.

  14. Calibrated photostimulated luminescence is an effective approach to identify irradiated orange during storage

    Science.gov (United States)

    Jo, Yunhee; Sanyal, Bhaskar; Chung, Namhyeok; Lee, Hyun-Gyu; Park, Yunji; Park, Hae-Jun; Kwon, Joong-Ho

    2015-06-01

    Photostimulated luminescence (PSL) has been employed as a fast screening method for various irradiated foods. In this study, the potential of PSL to identify oranges irradiated with gamma rays, electron beam, and X-rays (0-2 kGy) and stored under different conditions for 6 weeks was evaluated. The effects of light conditions (natural light, artificial light, and dark) and storage temperatures (4 and 20 °C) on PSL photon counts (PCs) during post-irradiation periods were studied. Non-irradiated samples always screened negative, while irradiated oranges gave intermediate results on first PSL measurements, though with much higher PCs than the non-irradiated samples. The PCs of all the samples declined as the storage time increased. Calibrated second PSL measurements showed PSL ratios <10 for the irradiated samples even 3 weeks after irradiation, confirming their irradiation status under all the storage conditions. Calibrated PSL and sample storage in the dark at 4 °C were found to be the most suitable approaches to identifying irradiated oranges during storage.

  15. Clustering approaches to identifying gene expression patterns from DNA microarray data.

    Science.gov (United States)

    Do, Jin Hwan; Choi, Dong-Kug

    2008-04-30

    Effective analysis methods are essential for handling the large amounts of gene expression data generated by microarrays. In this review we focus on clustering techniques. The biological rationale for this approach is the fact that many co-expressed genes are co-regulated, and identifying co-expressed genes could aid in functional annotation of novel genes, de novo identification of transcription factor binding sites and elucidation of complex biological pathways. Co-expressed genes are usually identified in microarray experiments by clustering techniques. There are many such methods, and the results obtained even for the same datasets may vary considerably depending on the algorithms and metrics for dissimilarity measures used, as well as on user-selectable parameters such as the desired number of clusters and initial values. Therefore, biologists who want to interpret microarray data should be aware of the weaknesses and strengths of the clustering methods used. In this review, we survey the basic principles of clustering of DNA microarray data, from crisp clustering algorithms such as hierarchical clustering, K-means and self-organizing maps, to complex clustering algorithms like fuzzy clustering.
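
    As a concrete illustration of one of the crisp methods surveyed, the sketch below hierarchically clusters a toy expression matrix with SciPy, using correlation distance, a common choice for expression profiles; genes landing in the same cluster are candidate co-expressed groups. The data and cluster count are arbitrary:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Toy matrix: 100 genes x 8 conditions, two underlying profiles.
        rng = np.random.default_rng(1)
        base1 = np.sin(np.linspace(0, 3, 8))
        base2 = np.cos(np.linspace(0, 3, 8))
        expr = np.vstack([base1 + rng.normal(0, 0.2, (50, 8)),
                          base2 + rng.normal(0, 0.2, (50, 8))])

        # Average-linkage clustering on correlation distance (1 - Pearson r).
        Z = linkage(expr, method="average", metric="correlation")
        labels = fcluster(Z, t=2, criterion="maxclust")  # ask for 2 clusters
        print(np.bincount(labels)[1:])  # cluster sizes, e.g. [50 50]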

  16. Identifying barriers to chronic disease reporting in Chicago Public Schools: a mixed-methods approach.

    Science.gov (United States)

    Rivkina, Victoria; Tapke, David E; Cardenas, Lilliana D; Harvey-Gintoft, Blair; Whyte, Stephanie A; Gupta, Ruchi S

    2014-12-06

    Chronic disease among school-aged children is a public health concern, particularly for asthma and food allergy. In Chicago Public Schools (CPS), rates of asthma and food allergy among students are underreported. The aim of this study was to determine the barriers to chronic disease reporting as experienced by CPS parents and school nurses. A mixed-methods approach included focus groups and key informant interviews with parents and school nurses, and a cross-sectional survey was completed by parents. Qualitative data analysis was performed and survey data were analyzed to determine the significant demographic and knowledge variables associated with successfully completing the reporting process. The three main barriers identified were 1) a lack of parental process knowledge; 2) limited communication from schools; and 3) insufficient availability of school nurses. Parents were significantly more likely to successfully complete the reporting process if they knew about special accommodations for chronic diseases, understood the need for physician verification, and/or knew the school nurse. These findings suggest that increasing parental knowledge of the reporting process will allow schools to better identify and manage their students' chronic conditions. A parent-focused intervention informed by these results has been completed.

  17. A Neural Network Approach for Identifying Particle Pitch Angle Distributions in Van Allen Probes Data

    Science.gov (United States)

    Souza, V. M.; Vieira, L. E. A.; Medeiros, C.; Da Silva, L. A.; Alves, L. R.; Koga, D.; Sibeck, D. G.; Walsh, B. M.; Kanekal, S. G.; Jauer, P. R.

    2016-01-01

    Analysis of particle pitch angle distributions (PADs) has been used as a means to comprehend a multitude of different physical mechanisms that lead to flux variations in the Van Allen belts and also to particle precipitation into the upper atmosphere. In this work we developed a neural network-based data clustering methodology that automatically identifies distinct PAD types in an unsupervised way using particle flux data. One can promptly identify and locate three well-known PAD types in both time and radial distance, namely, 90°-peaked, butterfly, and flattop distributions. In order to illustrate the applicability of our methodology, we used relativistic electron flux data from the whole month of November 2014, acquired from the Relativistic Electron-Proton Telescope instrument on board the Van Allen Probes, but it is emphasized that our approach can also be used with multiplatform spacecraft data. Our PAD classification results are in reasonably good agreement with those obtained by standard statistical fitting algorithms. The proposed methodology has potential use for monitoring the Van Allen belts.

  18. Learn the Lagrangian: A Vector-Valued RKHS Approach to Identifying Lagrangian Systems.

    Science.gov (United States)

    Cheng, Ching-An; Huang, Han-Pang

    2016-12-01

    We study the modeling of Lagrangian systems with multiple degrees of freedom. Based on system dynamics, canonical parametric models require ad hoc derivations and sometimes simplification for a computable solution; on the other hand, due to the lack of prior knowledge of the system's structure, modern nonparametric models in machine learning face the curse of dimensionality, especially in learning large systems. In this paper, we bridge this gap by unifying the theories of Lagrangian systems and vector-valued reproducing kernel Hilbert spaces. We reformulate Lagrangian systems with kernels that embed the governing Euler-Lagrange equation (the Lagrangian kernels) and show that these kernels span a subspace capturing the Lagrangian's projection as inverse dynamics. By this property, our model uses only inputs and outputs, as in machine learning, and inherits the structured form of system dynamics, thereby removing the need for mundane derivations for new systems as well as the generalization problem in learning from scratch. In effect, it learns the system's Lagrangian, a simpler task than directly learning the dynamics. To demonstrate, we applied the proposed kernel to identify robot inverse dynamics in simulations and experiments. Our results present a competitive novel approach to identifying Lagrangian systems, despite using only inputs and outputs.
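
    The Lagrangian kernels themselves require the construction developed in the paper, but the learning setup (inputs and outputs only) can be sketched with a generic stand-in: kernel ridge regression from joint positions, velocities, and accelerations to torques. The RBF kernel and toy dynamics below are assumptions, not the structured Lagrangian kernel:

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        rng = np.random.default_rng(2)

        # Toy 2-DOF system: inputs (q, qdot, qddot), outputs torques.
        X = rng.normal(size=(500, 6))    # [q1, q2, q1d, q2d, q1dd, q2dd]
        tau = np.stack([np.sin(X[:, 0]) + X[:, 4],   # stand-in dynamics
                        np.cos(X[:, 1]) + X[:, 5]], axis=1)

        model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5)
        model.fit(X[:400], tau[:400])
        print(np.abs(model.predict(X[400:]) - tau[400:]).mean())  # test error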

  19. Gamma ray self-attenuation correction: a simple numerical approach and its validation

    International Nuclear Information System (INIS)

    Agarwal, Chhavi; Poi, Sanhita; Mhatre, Amol; Goswami, A.

    2009-03-01

    A hybrid Monte Carlo method for gamma ray attenuation correction has been developed. The method has been applied to some common counting geometries like cylinder, box, sphere and disc. The method has been validated theoretically and experimentally over a wide range of transmittance and sample-to-detector distances. The advantage of the approach is that it is common to all sample geometries and can be used at all sample-to-detector distances. (author)
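
    The Monte Carlo idea can be illustrated for a cylindrical sample viewed side-on by a distant detector: sample random emission points, compute each photon's path length through the sample, and average the transmission. A simplified parallel-beam sketch (an approximation valid only at large sample-to-detector distances; mu and the dimensions are arbitrary):

        import numpy as np

        def self_attenuation_factor(mu, radius, n=200_000, seed=0):
            """Monte Carlo estimate of the self-attenuation factor for a
            cylinder viewed side-on by a far detector (parallel-beam limit).

            mu: linear attenuation coefficient (1/cm); radius in cm.
            Returns the mean transmission <exp(-mu * path)>.
            """
            rng = np.random.default_rng(seed)
            # Uniform points in the circular cross-section (the axial
            # coordinate drops out for a beam perpendicular to the axis).
            r = radius * np.sqrt(rng.random(n))
            theta = 2 * np.pi * rng.random(n)
            x, y = r * np.cos(theta), r * np.sin(theta)
            # Chord length from (x, y) to the surface along +x.
            path = np.sqrt(radius**2 - y**2) - x
            return np.exp(-mu * path).mean()

        print(self_attenuation_factor(mu=0.2, radius=3.0))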

  20. Validation of an employee satisfaction model: A structural equation model approach

    OpenAIRE

    Ophillia Ledimo; Nico Martins

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling approach (SEM). A cross-sectional quantitative survey design was used to collect data from a random sample of (n=759) permanent employees of a parastatal organisation. Data was collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of ...

  1. A comprehensive approach to identifying repurposed drugs to treat SCN8A epilepsy.

    Science.gov (United States)

    Atkin, Talia A; Maher, Chani M; Gerlach, Aaron C; Gay, Bryant C; Antonio, Brett M; Santos, Sonia C; Padilla, Karen M; Rader, JulieAnn; Krafte, Douglas S; Fox, Matthew A; Stewart, Gregory R; Petrovski, Slavé; Devinsky, Orrin; Might, Matthew; Petrou, Steven; Goldstein, David B

    2018-04-01

    Many previous studies of drug repurposing have relied on literature review followed by evaluation of a limited number of candidate compounds. Here, we demonstrate the feasibility of a more comprehensive approach using high-throughput screening to identify inhibitors of a gain-of-function mutation in the SCN8A gene associated with severe pediatric epilepsy. We developed cellular models expressing the wild-type or an R1872Q-mutant Nav1.6 sodium channel encoded by SCN8A. Voltage clamp experiments in HEK-293 cells expressing the SCN8A R1872Q mutation demonstrated a leftward shift in sodium channel activation as well as delayed inactivation; both changes are consistent with a gain-of-function mutation. We next developed a fluorescence-based sodium flux assay and used it to assess an extensive library of approved drugs, including a panel of antiepileptic drugs, for inhibitory activity in the mutated cell line. Lead candidates were evaluated in follow-on studies to generate concentration-response curves for inhibiting sodium influx. Select compounds of clinical interest were evaluated by electrophysiology to further characterize drug effects on wild-type and mutant sodium channel functions. The screen identified 90 drugs that significantly inhibited sodium influx in the R1872Q cell line. Four drugs of potential clinical interest (amitriptyline, carvedilol, nilvadipine, and carbamazepine) were further investigated and demonstrated concentration-dependent inhibition of sodium channel currents. A comprehensive drug repurposing screen identified potential new candidates for the treatment of epilepsy caused by the R1872Q mutation in the SCN8A gene. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.
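
    Concentration-response curves such as those generated for the lead candidates are commonly summarized by fitting a four-parameter Hill (logistic) equation. A generic SciPy sketch on synthetic data, not the study's measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(conc, bottom, top, ic50, slope):
            """Four-parameter logistic: response vs. concentration."""
            return bottom + (top - bottom) / (1 + (conc / ic50) ** slope)

        conc = np.logspace(-9, -4, 10)                      # molar
        rng = np.random.default_rng(3)
        resp = hill(conc, 5, 100, 1e-6, 1.2) + rng.normal(0, 2, conc.size)

        popt, _ = curve_fit(hill, conc, resp, p0=[0, 100, 1e-6, 1.0])
        print(f"fitted IC50 = {popt[2]:.2e} M")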

  2. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    International Nuclear Information System (INIS)

    Ruan, D; Yang, Y; Cao, M; Hu, P; Low, D

    2014-01-01

    Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation with electron density has placed it as an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual, bulk-assignment, or registration-based schemes to identify bones, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with MR image patches, and the corresponding label is extracted (during training) from simulation CT aligned to the UTE. Within each class (bone vs. nonbone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both bone and nonbone dictionaries, and subsequently assigned to the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and it has shown generally good performance, with a Dice similarity coefficient greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust to consistent foreign objects (e.g., a headset) and to artefacts related to Gibbs ringing and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine. The scheme
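
    The per-class dictionaries and residual-based assignment map directly onto standard sparse-coding primitives. A schematic with scikit-learn using random stand-in patches (the UTE preprocessing, patch extraction, and parameter tuning are omitted, and the data are invented):

        import numpy as np
        from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

        rng = np.random.default_rng(4)
        bone = rng.normal(1.0, 0.5, size=(1000, 64))     # stand-in 8x8 patches
        nonbone = rng.normal(-1.0, 0.5, size=(1000, 64))

        # Learn one overcomplete dictionary per class.
        dicts = {}
        for name, data in (("bone", bone), ("nonbone", nonbone)):
            dl = MiniBatchDictionaryLearning(n_components=32, alpha=1.0,
                                             random_state=0)
            dicts[name] = dl.fit(data).components_

        def classify(patch):
            """Assign the class whose dictionary reconstructs the patch best."""
            residuals = {}
            for name, D in dicts.items():
                code = sparse_encode(patch[None, :], D, algorithm="omp",
                                     n_nonzero_coefs=5)
                residuals[name] = np.linalg.norm(patch - code @ D)
            return min(residuals, key=residuals.get)

        print(classify(bone[0]), classify(nonbone[0]))  # 'bone' 'nonbone'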

  3. An Immunohistochemical Approach to Identify the Sex of Young Marine Turtles.

    Science.gov (United States)

    Tezak, Boris M; Guthrie, Kathleen; Wyneken, Jeanette

    2017-08-01

    Marine turtles exhibit temperature-dependent sex determination (TSD). During critical periods of embryonic development, the nest's thermal environment directs whether an embryo will develop as a male or female. At warmer sand temperatures, nests tend to produce female-biased sex ratios. The rapid increase of global temperature highlights the need for a clear assessment of its effects on sea turtle sex ratios. However, estimating hatchling sex ratios at rookeries remains imprecise due to the lack of sexual dimorphism in young marine turtles. We rely mainly upon laparoscopic procedures to verify hatchling sex; however, in some species, morphological sex can be ambiguous even at the histological level. Recent studies using immunohistochemical (IHC) techniques found that embryonic snapping turtle (Chelydra serpentina) ovaries overexpress a particular cold-induced RNA-binding protein (CIRBP) in comparison to testes. This feature allows females to be distinguished from males. We modified this technique to successfully identify the sexes of loggerhead sea turtle (Caretta caretta) hatchlings, and independently confirmed the results by standard histological and laparoscopic methods that reliably identify sex in this species. We next tested the CIRBP IHC method on gonad samples from leatherback turtles (Dermochelys coriacea). Leatherbacks display delayed gonad differentiation when compared to other sea turtles, making hatchling gonads difficult to sex using standard H&E stain histology. The IHC approach was successful in both C. caretta and D. coriacea samples, offering a much-needed tool to establish baseline hatchling sex ratios, particularly for assessing impacts of climate change effects on leatherback turtle hatchlings and sea turtle demographics. Anat Rec, 300:1512-1518, 2017. © 2017 Wiley Periodicals, Inc.

  4. SU-E-J-212: Identifying Bones From MRI: A Dictionary Learning and Sparse Regression Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ruan, D; Yang, Y; Cao, M; Hu, P; Low, D [UCLA, Los Angeles, CA (United States)]

    2014-06-01

    Purpose: To develop an efficient and robust scheme to identify bony anatomy based on MRI-only simulation images. Methods: MRI offers important soft tissue contrast and functional information, yet its lack of correlation with electron density has placed it as an auxiliary modality to CT in radiotherapy simulation and adaptation. An effective scheme to identify bony anatomy is an important first step towards an MR-only simulation/treatment paradigm and would satisfy most practical purposes. We utilize a UTE acquisition sequence to achieve visibility of the bone. In contrast to manual, bulk-assignment, or registration-based schemes to identify bones, we propose a novel learning-based approach for improved robustness to MR artefacts and environmental changes. Specifically, local information is encoded with MR image patches, and the corresponding label is extracted (during training) from simulation CT aligned to the UTE. Within each class (bone vs. nonbone), an overcomplete dictionary is learned so that typical patches within the proper class can be represented as a sparse combination of the dictionary entries. For testing, an acquired UTE-MRI is divided into patches using a sliding scheme, where each patch is sparsely regressed against both bone and nonbone dictionaries, and subsequently assigned to the class with the smaller residual. Results: The proposed method has been applied to the pilot site of brain imaging and it has shown generally good performance, with a Dice similarity coefficient greater than 0.9 in a cross-validation study using 4 datasets. Importantly, it is robust to consistent foreign objects (e.g., a headset) and to artefacts related to Gibbs ringing and field heterogeneity. Conclusion: A learning perspective has been developed for inferring bone structures based on UTE MRI. The imaging setting is subject to minimal motion effects and the post-processing is efficient. The improved efficiency and robustness enable a first translation to an MR-only routine. The scheme

  5. A novel approach identifying hybrid sterility QTL on the autosomes of Drosophila simulans and D. mauritiana.

    Science.gov (United States)

    Dickman, Christopher T D; Moehring, Amanda J

    2013-01-01

    When species interbreed, the hybrid offspring that are produced are often sterile. If only one hybrid sex is sterile, it is almost always the heterogametic (XY or ZW) sex. Taking this trend into account, the predominant model used to explain the genetic basis of F1 sterility involves a deleterious interaction between recessive sex-linked loci from one species and dominant autosomal loci from the other species. This model is difficult to evaluate, however, as only a handful of loci influencing interspecies hybrid sterility have been identified, and their autosomal genetic interactors have remained elusive. One hindrance to their identification has been the overwhelming effect of the sex chromosome in mapping studies, which could 'mask' the ability to accurately map autosomal factors. Here, we use a novel approach employing attached-X chromosomes to create reciprocal backcross interspecies hybrid males that have a non-recombinant sex chromosome and recombinant autosomes. The heritable variation in phenotype is thus solely caused by differences in the autosomes, thereby allowing us to accurately identify the number and location of autosomal sterility loci. In one direction of backcross, all males were sterile, indicating that sterility could be entirely induced by the sex chromosome complement in these males. In the other direction, we identified nine quantitative trait loci that account for a surprisingly large amount (56%) of the autosome-induced phenotypic variance in sterility, with a large contribution of autosome-autosome epistatic interactions. These loci are capable of acting dominantly, and thus could contribute to F1 hybrid sterility.

  6. A novel approach identifying hybrid sterility QTL on the autosomes of Drosophila simulans and D. mauritiana.

    Directory of Open Access Journals (Sweden)

    Christopher T D Dickman

    When species interbreed, the hybrid offspring that are produced are often sterile. If only one hybrid sex is sterile, it is almost always the heterogametic (XY or ZW) sex. Taking this trend into account, the predominant model used to explain the genetic basis of F1 sterility involves a deleterious interaction between recessive sex-linked loci from one species and dominant autosomal loci from the other species. This model is difficult to evaluate, however, as only a handful of loci influencing interspecies hybrid sterility have been identified, and their autosomal genetic interactors have remained elusive. One hindrance to their identification has been the overwhelming effect of the sex chromosome in mapping studies, which could 'mask' the ability to accurately map autosomal factors. Here, we use a novel approach employing attached-X chromosomes to create reciprocal backcross interspecies hybrid males that have a non-recombinant sex chromosome and recombinant autosomes. The heritable variation in phenotype is thus solely caused by differences in the autosomes, thereby allowing us to accurately identify the number and location of autosomal sterility loci. In one direction of backcross, all males were sterile, indicating that sterility could be entirely induced by the sex chromosome complement in these males. In the other direction, we identified nine quantitative trait loci that account for a surprisingly large amount (56%) of the autosome-induced phenotypic variance in sterility, with a large contribution of autosome-autosome epistatic interactions. These loci are capable of acting dominantly, and thus could contribute to F1 hybrid sterility.

  7. A Machine Learning Approach to Identifying Placebo Responders in Late-Life Depression Trials.

    Science.gov (United States)

    Zilcha-Mano, Sigal; Roose, Steven P; Brown, Patrick J; Rutherford, Bret R

    2018-01-11

    Despite efforts to identify characteristics associated with medication-placebo differences in antidepressant trials, few consistent findings have emerged to guide participant selection in drug development settings and differential therapeutics in clinical practice. Limitations in the methodologies used, particularly searching for a single moderator while treating all other variables as noise, may partially explain the failure to generate consistent results. The present study tested whether interactions between pretreatment patient characteristics, rather than a single-variable solution, may better predict who is most likely to benefit from placebo versus medication. Data were analyzed from 174 patients aged 75 years and older with unipolar depression who were randomly assigned to citalopram or placebo. Model-based recursive partitioning analysis was conducted to identify the most robust significant moderators of placebo versus citalopram response. The greatest signal detection between medication and placebo in favor of medication was among patients with fewer years of education (≤12) who suffered from a longer duration of depression since their first episode (>3.47 years) (B = 2.53, t(32) = 3.01, p = 0.004). Compared with medication, placebo had the greatest response for those who were more educated (>12 years), to the point where placebo almost outperformed medication (B = -0.57, t(96) = -1.90, p = 0.06). Machine learning approaches capable of evaluating the contributions of multiple predictor variables may be a promising methodology for identifying placebo versus medication responders. Duration of depression and education should be considered in the efforts to modulate placebo magnitude in drug development settings and in clinical practice. Copyright © 2018 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  8. Developing and Validating Personas in e-Commerce: A Heuristic Approach

    Science.gov (United States)

    Thoma, Volker; Williams, Bryn

    A multi-method persona development process in a large e-commerce business is described. Personas are fictional representations of customers that describe typical user attributes to facilitate a user-centered approach in interaction design. In the current project persona attributes were derived from various data sources, such as stakeholder interviews, user tests and interviews, data mining, customer surveys, and ethnographic (direct observation, diary studies) research. The heuristic approach of using these data sources conjointly allowed for an early validation of relevant persona dimensions.

  9. Identifying applicants suitable to a career in nursing: a value-based approach to undergraduate selection.

    Science.gov (United States)

    Traynor, Marian; Galanouli, Despina; Roberts, Martin; Leonard, Lawrence; Gale, Thomas

    2017-06-01

    The aim of this study was to complement existing evidence on the suitability of Multiple Mini Interviews as a potential tool for the selection of nursing candidates on to a BSc (Hons) nursing programme. This study aimed to trial the Multiple Mini Interview approach to recruitment with a group of first-year nursing students (already selected using traditional interviews), using a cross-sectional validation design. This paper reports on the evaluation of the participants' detailed scores from the Multiple Mini Interview stations, their original interview scores, and their end-of-year results. This study took place in March 2015. Scores from the seven Multiple Mini Interview stations were analysed to show the internal structure, reliability and generalizability of the stations. Original selection scores from interviews and in-course assessment were correlated with the MMI scores, and variation by students' age, gender and disability status was explored. Reliability of the Multiple Mini Interview score was moderate (G = 0.52). The Multiple Mini Interview score provided better differentiation between more able students than did the original interview score, but neither score was correlated with the module results. Multiple Mini Interview scores were positively associated with students' age but not their gender or disability status. The Multiple Mini Interview reported in this study offers a selection process that is based on the values and personal attributes regarded as desirable for a career in nursing and does not necessarily predict academic success. Its moderate reliability indicates the need for further improvement but it is capable of discriminating between candidates and shows little evidence of bias. © 2016 John Wiley & Sons Ltd.

  10. Developing and validating the Youth Conduct Problems Scale-Rwanda: a mixed methods approach.

    Directory of Open Access Journals (Sweden)

    Lauren C Ng

    This study developed and validated the Youth Conduct Problems Scale-Rwanda (YCPS-R). Qualitative free listing (n = 74) and key informant interviews (n = 47) identified local conduct problems, which were compared to existing standardized conduct problem scales and used to develop the YCPS-R. The YCPS-R was cognitively tested by 12 youth and caregiver participants, and assessed for test-retest and inter-rater reliability in a sample of 64 youth. Finally, a purposive sample of 389 youth and their caregivers were enrolled in a validity study. Validity was assessed by comparing YCPS-R scores to conduct disorder, which was diagnosed with the Mini International Neuropsychiatric Interview for Children, and functional impairment scores on the World Health Organization Disability Assessment Schedule Child Version. ROC analyses assessed the YCPS-R's ability to discriminate between youth with and without conduct disorder. Qualitative data identified a local presentation of youth conduct problems that did not match previously standardized measures. Therefore, the YCPS-R was developed solely from local conduct problems. Cognitive testing indicated that the YCPS-R was understandable and required little modification. The YCPS-R demonstrated good reliability, construct, criterion, and discriminant validity, and fair classification accuracy. The YCPS-R is a locally derived measure of Rwandan youth conduct problems that demonstrated good psychometric properties and could be used for further research.

  11. Validity and reliability of three definitions of hip osteoarthritis: cross sectional and longitudinal approach.

    Science.gov (United States)

    Reijman, M; Hazes, J M W; Pols, H A P; Bernsen, R M D; Koes, B W; Bierma-Zeinstra, S M A

    2004-11-01

    To compare the reliability and validity in a large open population of three frequently used radiological definitions of hip osteoarthritis (OA): Kellgren and Lawrence grade, minimal joint space (MJS), and Croft grade; and to investigate whether the validity of the three definitions of hip OA is sex dependent. Participants from the Rotterdam study (aged ≥ 55 years, n = 3585) were evaluated. The inter-rater reliability was tested in a random set of 148 x-rays. The validity was expressed as the ability to identify patients who show clinical symptoms of hip OA (construct validity) and as the ability to predict total hip replacement (THR) at follow up (predictive validity). Inter-rater reliability was similar for the Kellgren and Lawrence grade and MJS (kappa statistics 0.68 and 0.62, respectively) but lower for Croft's grade (kappa statistic 0.51). The Kellgren and Lawrence grade and MJS showed the strongest associations with clinical symptoms of hip OA. Sex appeared to be an effect modifier for the Kellgren and Lawrence and MJS definitions, with women showing a stronger association between grading and symptoms than men. However, the sex dependency was attributed to differences in height between women and men. The Kellgren and Lawrence grade showed the highest predictive value for THR at follow up. Based on these findings, Kellgren and Lawrence still appears to be a useful OA definition for epidemiological studies focusing on the presence of hip OA.
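
    Inter-rater reliability of the kind reported here is conventionally quantified with Cohen's kappa, which corrects raw agreement for chance; given two raters' gradings the computation is a single call (the grades below are illustrative only):

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical Kellgren & Lawrence grades (0-4) from two raters on
        # the same set of hip x-rays.
        rater_a = [0, 1, 2, 2, 0, 3, 1, 0, 2, 4, 1, 1]
        rater_b = [0, 1, 2, 1, 0, 3, 1, 0, 2, 3, 1, 2]
        print(round(cohen_kappa_score(rater_a, rater_b), 2))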

  12. Antibiotic Resistances in Livestock: A Comparative Approach to Identify an Appropriate Regression Model for Count Data

    Directory of Open Access Journals (Sweden)

    Anke Hüls

    2017-05-01

    Antimicrobial resistance in livestock is a matter of general concern. To develop hygiene measures and methods for resistance prevention and control, epidemiological studies on a population level are needed to detect factors associated with antimicrobial resistance in livestock holdings. In general, regression models are used to describe these relationships between environmental factors and resistance outcomes. Besides the study design, the correlation structures of the different outcomes of antibiotic resistance and structural zero measurements on the resistance outcome as well as on the exposure side are challenges for the epidemiological model building process. The use of appropriate regression models that acknowledge these complexities is essential to assure valid epidemiological interpretations. The aims of this paper are (i) to explain the model building process by comparing several competing models for count data (negative binomial model, quasi-Poisson model, zero-inflated model, and hurdle model) and (ii) to compare these models using data from a cross-sectional study on antibiotic resistance in animal husbandry. These goals are essential to evaluate which model is most suitable to identify potential prevention measures. The dataset used as an example in our analyses was generated initially to study the prevalence and associated factors for the appearance of cefotaxime-resistant Escherichia coli in 48 German fattening pig farms. For each farm, the outcome was the count of samples with resistant bacteria. There was almost no overdispersion and only moderate evidence of excess zeros in the data. Our analyses show that it is essential to evaluate regression models in studies analyzing the relationship between environmental factors and antibiotic resistances in livestock. After model comparison based on evaluation of model predictions, Akaike information criterion, and Pearson residuals, here the hurdle model was judged to be the most appropriate.
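
    In Python, a similar model comparison can be sketched with statsmodels: fit Poisson, negative binomial, and zero-inflated variants, then compare by AIC. The data below are synthetic; the study's quasi-Poisson and hurdle models have no direct one-line statsmodels equivalent, so this is only a partial analogue:

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedPoisson

        rng = np.random.default_rng(5)
        n = 200
        x = rng.normal(size=n)                  # e.g. a farm-level exposure
        X = sm.add_constant(x)
        counts = rng.negative_binomial(2, 1 / (1 + np.exp(0.5 * x)), size=n)

        poisson = sm.Poisson(counts, X).fit(disp=0)
        negbin = sm.NegativeBinomial(counts, X).fit(disp=0)
        zip_mod = ZeroInflatedPoisson(counts, X).fit(disp=0)

        for name, m in [("Poisson", poisson), ("NegBin", negbin),
                        ("ZIP", zip_mod)]:
            print(f"{name:8s} AIC = {m.aic:.1f}")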

  13. A systems immunology approach identifies the collective impact of 5 miRs in Th2 inflammation.

    Science.gov (United States)

    Kılıç, Ayşe; Santolini, Marc; Nakano, Taiji; Schiller, Matthias; Teranishi, Mizue; Gellert, Pascal; Ponomareva, Yuliya; Braun, Thomas; Uchida, Shizuka; Weiss, Scott T; Sharma, Amitabh; Renz, Harald

    2018-06-07

    Allergic asthma is a chronic inflammatory disease dominated by a CD4+ T helper 2 (Th2) cell signature. The immune response amplifies in self-enforcing loops, promoting Th2-driven cellular immunity and leaving the host unable to terminate inflammation. Posttranscriptional mechanisms, including microRNAs (miRs), are pivotal in maintaining immune homeostasis. Since an altered expression of various miRs has been associated with T cell-driven diseases, including asthma, we hypothesized that miRs control mechanisms ensuring Th2 stability and maintenance in the lung. We isolated murine CD4+ Th2 cells from allergic inflamed lungs and profiled gene and miR expression. Instead of focusing on the magnitude of miR differential expression, here we addressed the secondary consequences for the set of molecular interactions in the cell, the interactome. We developed the Impact of Differential Expression Across Layers, a network-based algorithm to prioritize disease-relevant miRs based on the central role of their targets in the molecular interactome. This method identified 5 Th2-related miRs (mir27b, mir206, mir106b, mir203, and mir23b) whose antagonization led to a sharp reduction of the Th2 phenotype. Overall, a systems biology tool was developed and validated, highlighting the role of miRs in Th2-driven immune response. This result offers potentially novel approaches for therapeutic interventions.
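
    The prioritization logic, ranking miRs by how central their targets are in the interactome rather than by fold change alone, can be mocked up with networkx. The graph, target sets, and scoring rule below are invented and far simpler than the published algorithm:

        import networkx as nx

        # Toy protein-protein interactome.
        G = nx.erdos_renyi_graph(200, 0.04, seed=6)
        centrality = nx.betweenness_centrality(G)

        # Hypothetical miR -> predicted-target mappings (node ids).
        mir_targets = {
            "mir27b": [1, 5, 9, 44, 80],
            "mir206": [3, 7, 150, 151],
            "mir23b": [12, 60, 61, 62, 100, 101],
        }

        # Score each miR by the mean centrality of its targets.
        scores = {mir: sum(centrality[t] for t in ts) / len(ts)
                  for mir, ts in mir_targets.items()}
        for mir, s in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{mir}: {s:.4f}")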

  14. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    Science.gov (United States)

    Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki

    2014-09-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  15. Fingerprints of zones in boreholes. An approach to identify the characteristics of structures

    Energy Technology Data Exchange (ETDEWEB)

    Straeng, Thomas; Waenstedt, Stefan; Tiren, Sven (GEOSIGMA AB (Sweden))

    2010-11-15

    The classification of geophysical borehole data in order to identify and characterize structures that intersect the borehole is an important part of 3D modelling of the structural pattern of the bedrock. The objective of this study is to test a statistical approach, cluster analysis, on site data to see whether a complex data set (geological and geophysical borehole data) can be classified to identify borehole intervals with increased brittle deformation, i.e. brittle deformation zones. The base data used in the study have been provided and delivered by SKB and consist of borehole logging data from the cored borehole KFM03A. The statistical method chosen for this study, cluster analysis using the K-means method, groups data into a pre-defined number of clusters with the goal of minimizing the variance of data within each group and maximizing the variance between clusters. The idea is that data can and should be grouped into two categories, two clusters, corresponding to homogeneous bedrock (matrix) in one cluster and open fractures in the other. The analysis also includes a repetition of the cluster analysis with a stepwise refined data set to see whether strongly accentuated features could be identified. The results show that K-means cluster analysis produces clusters that could represent the spatial distribution of bedrock matrix and the location of fractures, respectively, down the borehole. The results were compared with fracture frequency data (from core mapping) and also with the geological Single Hole Interpretation of KFM03A performed by SKB. The fracture zones identified in the Single Hole Interpretation process are all indicated in the cluster analysis results. The cluster analysis revealed eight additional possible zones. A majority of these are smaller than 5 metres (section width in the borehole) but they are still pronounced in the analysis. Based on the geophysical data, these sections should be taken into
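
    A minimal version of this analysis (with invented log values): standardize the borehole logs, run K-means with two clusters, and read the labels down the hole as a matrix-versus-fracture indicator:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)
        # Stand-in geophysical logs sampled every 0.1 m down the borehole:
        # columns = resistivity, sonic velocity, natural gamma.
        depth = np.arange(0, 100, 0.1)
        logs = rng.normal(size=(depth.size, 3))
        logs[300:330] += [-2.0, -1.5, 0.5]      # a planted fractured interval

        X = StandardScaler().fit_transform(logs)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # Contiguous runs of the minority cluster flag candidate fracture zones.
        fracture_cluster = np.argmin(np.bincount(labels))
        print(depth[labels == fracture_cluster][:5])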

  16. Integrative microRNA and proteomic approaches identify novel osteoarthritis genes and their collaborative metabolic and inflammatory networks.

    Directory of Open Access Journals (Sweden)

    Dimitrios Iliopoulos

    BACKGROUND: Osteoarthritis is a multifactorial disease characterized by destruction of the articular cartilage due to genetic, mechanical and environmental components, affecting more than 100 million individuals all over the world. Despite the high prevalence of the disease, the absence of large-scale molecular studies limits our ability to understand the molecular pathobiology of osteoarthritis and identify targets for drug development. METHODOLOGY/PRINCIPAL FINDINGS: In this study we integrated genetic, bioinformatic and proteomic approaches in order to identify new genes and their collaborative networks involved in osteoarthritis pathogenesis. MicroRNA profiling of patient-derived osteoarthritic cartilage in comparison to normal cartilage revealed a 16-microRNA osteoarthritis gene signature. Using reverse-phase protein arrays in the same tissues we detected 76 differentially expressed proteins between osteoarthritic and normal chondrocytes. Proteins not previously implicated in the genesis of osteoarthritis, such as SOX11, FGF23, KLF6, WWOX and GDF15, were identified. Integration of microRNA and proteomic data with microRNA gene-target prediction algorithms generated a potential "interactome" network consisting of 11 microRNAs and 58 proteins linked by 414 potential functional associations. Comparison of the molecular and clinical data revealed specific microRNAs (miR-22, miR-103) and proteins (PPARA, BMP7, IL1B) to be highly correlated with Body Mass Index (BMI). Experimental validation revealed that miR-22 regulated PPARA and BMP7 expression and its inhibition blocked inflammatory and catabolic changes in osteoarthritic chondrocytes. CONCLUSIONS/SIGNIFICANCE: Our findings indicate that obesity and inflammation are related to osteoarthritis, a metabolic disease affected by microRNA deregulation. Gene network approaches provide new insights for elucidating the complexity of diseases such as osteoarthritis. The integration of microRNA, proteomic

  17. Probing the dynamics of identified neurons with a data-driven modeling approach.

    Directory of Open Access Journals (Sweden)

    Thomas Nowotny

    2008-07-01

    In controlling animal behavior the nervous system has to perform within the operational limits set by the requirements of each specific behavior. The implications for the corresponding range of suitable network, single neuron, and ion channel properties have remained elusive. In this article we approach the question of how tightly constrained the properties of neuronal systems may be at the level of single neurons. We used large data sets of the activity of isolated, identified invertebrate cells and built an accurate conductance-based model for this cell type using customized automated parameter estimation techniques. By direct inspection of the data we found that the variability of the neurons is larger when they are isolated from the circuit than when in the intact system. Furthermore, the responses of the neurons to perturbations appear to be more consistent than their autonomous behavior under stationary conditions. In the developed model, the constraints on different parameters that enforce appropriate model dynamics vary widely, from some very tightly controlled parameters to others that are almost arbitrary. The model also allows us to predict the effect of blocking selected ionic currents and to prove that the origin of irregular dynamics in the neuron model is proper chaoticity, and that this chaoticity is typical in an appropriate sense. Our results indicate that data-driven models are useful tools for the in-depth analysis of neuronal dynamics. The better consistency of responses to perturbations, in the real neurons as well as in the model, suggests a paradigm shift away from measuring autonomous dynamics alone towards protocols of controlled perturbations. Our predictions for the impact of channel blockers on the neuronal dynamics and the proof of chaoticity underscore the wide scope of our approach.

  18. Determining the optimal approach to identifying individuals with chronic obstructive pulmonary disease: The DOC study.

    Science.gov (United States)

    Ronaldson, Sarah J; Dyson, Lisa; Clark, Laura; Hewitt, Catherine E; Torgerson, David J; Cooper, Brendan G; Kearney, Matt; Laughey, William; Raghunath, Raghu; Steele, Lisa; Rhodes, Rebecca; Adamson, Joy

    2018-06-01

    Early identification of chronic obstructive pulmonary disease (COPD) results in patients receiving appropriate management for their condition at an earlier stage in their disease. The Determining the Optimal approach to identifying individuals with COPD (DOC) study was a case-finding study to enhance early identification of COPD in primary care, which evaluated the diagnostic accuracy of a series of simple lung function tests and symptom-based case-finding questionnaires. Current smokers aged 35 or over were invited to undertake a series of case-finding tools, which comprised lung function tests (specifically, spirometry, microspirometry, peak flow meter, and WheezoMeter) and several case-finding questionnaires. The effectiveness of these tests, individually or in combination, in identifying small airways obstruction was evaluated against the gold standard of spirometry, with the quality of spirometry tests assessed by independent overreaders. The study was conducted with general practices in the Yorkshire and Humberside area in the UK. Six hundred eighty-one individuals met the inclusion criteria, and 444 participants completed their study appointments. A total of 216 (49%) with good-quality spirometry readings were included in the analysis. The most effective case-finding tools were found to be the peak flow meter alone, the peak flow meter plus WheezoMeter, and microspirometry alone. In addition to the main analysis, where the severity of airflow obstruction was based on fixed ratios and percent of predicted values, sensitivity analyses were conducted using lower limit of normal values. This research informs the choice of test for COPD identification; case-finding with the peak flow meter or microspirometer could be used routinely in primary care for suspected COPD patients. Only those testing positive on these tests would move on to full spirometry, thereby reducing unnecessary spirometric testing. © 2018 John Wiley

  19. Coalitional game theory as a promising approach to identify candidate autism genes.

    Science.gov (United States)

    Gupta, Anika; Sun, Min Woo; Paskov, Kelley Marie; Stockham, Nate Tyler; Jung, Jae-Yoon; Wall, Dennis Paul

    2018-01-01

    Despite mounting evidence for the strong role of genetics in the phenotypic manifestation of Autism Spectrum Disorder (ASD), the specific genes responsible for the variable forms of ASD remain undefined. ASD may be best explained by a combinatorial genetic model with varying epistatic interactions across many small effect mutations. Coalitional or cooperative game theory is a technique that studies the combined effects of groups of players, known as coalitions, seeking to identify players who tend to improve the performance (the relationship to a specific disease phenotype) of any coalition they join. This method has been previously shown to boost biologically informative signal in gene expression data but to date has not been applied to the search for cooperative mutations among putative ASD genes. We describe our approach to highlight genes relevant to ASD using coalitional game theory on alteration data of 1,965 fully sequenced genomes from 756 multiplex families. Alterations were encoded into binary matrices for ASD (case) and unaffected (control) samples, indicating likely gene-disrupting, inherited mutations in altered genes. To determine individual gene contributions given an ASD phenotype, a "player" metric, referred to as the Shapley value, was calculated for each gene in the case and control cohorts. Sixty-seven genes were found to have significantly elevated player scores and likely represent significant contributors to the genetic coordination underlying ASD. Using network and cross-study analysis, we found that these genes are involved in biological pathways known to be affected in the autism cases and that a subset directly interact with several genes known to have strong associations to autism. These findings suggest that coalitional game theory can be applied to large-scale genomic data to identify hidden yet influential players in complex polygenic disorders such as autism.
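
    Exact Shapley values require a sum over all coalitions, so in practice they are approximated by sampling random player orderings and averaging each player's marginal contribution. A generic Monte Carlo sketch; the characteristic function v below is a placeholder mimicking an epistatic payoff, not the study's phenotype model:

        import numpy as np

        def shapley_mc(players, v, n_samples=2000, seed=8):
            """Monte Carlo Shapley values: average marginal contribution of
            each player over random orderings. `v` maps a frozenset of
            players to a coalition payoff."""
            rng = np.random.default_rng(seed)
            phi = {p: 0.0 for p in players}
            for _ in range(n_samples):
                order = rng.permutation(players)
                coalition, value = frozenset(), v(frozenset())
                for p in order:
                    new = coalition | {p}
                    new_value = v(new)
                    phi[p] += new_value - value
                    coalition, value = new, new_value
            return {p: s / n_samples for p, s in phi.items()}

        # Placeholder payoff: gene 2 only helps in combination with gene 0,
        # mimicking an epistatic interaction.
        def v(coalition):
            base = 0.1 * len(coalition)
            return base + (1.0 if {0, 2} <= coalition else 0.0)

        print(shapley_mc(list(range(5)), v))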

  20. A functional glycoproteomics approach identifies CD13 as a novel E-selectin ligand in breast cancer.

    Science.gov (United States)

    Carrascal, M A; Silva, M; Ferreira, J A; Azevedo, R; Ferreira, D; Silva, A M N; Ligeiro, D; Santos, L L; Sackstein, R; Videira, P A

    2018-05-17

    The glycan moieties sialyl-Lewis-X and/or -A (sLeX/A) are the primary ligands for E-selectin, regulating subsequent tumor cell extravasation into distant organs. However, the nature of the glycoprotein scaffolds displaying these glycans in breast cancer remains unclear and constitutes the focus of the present investigation. We isolated glycoproteins that bind E-selectin from the CF1_T breast cancer cell line, derived from a patient with ductal carcinoma. Proteins were identified using a bottom-up proteomics approach by nanoLC-orbitrap LTQ-MS/MS. Data were curated using bioinformatics tools to highlight clinically relevant glycoproteins, which were validated by flow cytometry, Western blot, immunohistochemistry and in-situ proximity ligation assays in clinical samples. We observed that the CF1_T cell line expressed sLeX, but not sLeA, and the E-selectin reactivity was mainly on N-glycans. MS and bioinformatics analysis of the targeted glycoproteins, when narrowed down to the most clinically relevant species in breast cancer, identified the CD44 glycoprotein (HCELL) and CD13 as key E-selectin ligands. Additionally, the co-expression of sLeX-CD44 and sLeX-CD13 was confirmed in clinical breast cancer tissue samples. Both CD44 and CD13 glycoforms display sLeX in breast cancer and bind E-selectin, suggesting a key role in metastasis development. Such observations provide a novel molecular rationale for developing targeted therapeutics. While HCELL expression in breast cancer has been previously reported, this is the first study indicating that CD13 functions as an E-selectin ligand in breast cancer. This observation supports previous associations of CD13 with metastasis and draws attention to this glycoprotein as an anti-cancer target. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. MrTADFinder: A network modularity based approach to identify topologically associating domains in multiple resolutions.

    Directory of Open Access Journals (Sweden)

    Koon-Kiu Yan

    2017-07-01

    Genome-wide proximity ligation based assays such as Hi-C have revealed that eukaryotic genomes are organized into structural units called topologically associating domains (TADs). From a visual examination of the chromosomal contact map, however, it is clear that the organization of the domains is not simple or obvious. Instead, TADs exhibit various length scales and, in many cases, a nested arrangement. Here, by exploiting the resemblance between TADs in a chromosomal contact map and densely connected modules in a network, we formulate TAD identification as a network optimization problem and propose an algorithm, MrTADFinder, to identify TADs from intra-chromosomal contact maps. MrTADFinder is based on the network-science concept of modularity. A key component of it is deriving an appropriate background model for contacts in a random chain, by numerically solving a set of matrix equations. The background model preserves the observed coverage of each genomic bin as well as the distance dependence of the contact frequency for any pair of bins exhibited by the empirical map. Also, by introducing a tunable resolution parameter, MrTADFinder provides a self-consistent approach for identifying TADs at different length scales, hence the acronym "Mr" standing for Multiple Resolutions. We then apply MrTADFinder to various Hi-C datasets. The identified domain boundaries are marked by characteristic signatures in chromatin marks and transcription factors (TFs) that are consistent with earlier work. Moreover, by calling TADs at different length scales, we observe that boundary signatures change with resolution, with different chromatin features having different characteristic length scales. Furthermore, we report an enrichment of HOT (high-occupancy target) regions near TAD boundaries and investigate the role of different TFs in determining boundaries at various resolutions. To further explore the interplay between TADs and epigenetic marks, as tumor mutational
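
    The driving analogy (TADs as densely connected modules in a contact network) can be demonstrated with off-the-shelf modularity optimization: threshold a toy contact map into a weighted graph and partition it with networkx. MrTADFinder's background model and resolution parameter are not reproduced here:

        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        rng = np.random.default_rng(9)
        n = 60
        contacts = rng.poisson(1.0, size=(n, n))
        for a, b in [(0, 20), (20, 40), (40, 60)]:   # three planted "TADs"
            contacts[a:b, a:b] += rng.poisson(8.0, size=(b - a, b - a))
        contacts = np.triu(contacts, 1) + np.triu(contacts, 1).T

        # Bins become nodes; contact counts become edge weights.
        G = nx.from_numpy_array(contacts)
        communities = greedy_modularity_communities(G, weight="weight")
        print([sorted(c)[:3] for c in communities])  # leading bins per domain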

  2. Identifying diffused nitrate sources in a stream in an agricultural field using a dual isotopic approach

    International Nuclear Information System (INIS)

    Ding, Jingtao; Xi, Beidou; Gao, Rutai; He, Liansheng; Liu, Hongliang; Dai, Xuanli; Yu, Yijun

    2014-01-01

    Nitrate (NO3−) pollution is a severe problem in aquatic systems in the Taihu Lake Basin in China. A dual isotope approach (δ15N-NO3− and δ18O-NO3−) was applied to identify diffused NO3− inputs in a stream in an agricultural field in the basin in 2013. The site-specific isotopic characteristics of five NO3− sources (atmospheric deposition, AD; NO3− derived from soil organic matter nitrification, NS; NO3− derived from chemical fertilizer nitrification, NF; groundwater, GW; and manure and sewage, M and S) were identified. NO3− concentrations in the stream during the rainy season [mean ± standard deviation (SD) = 2.5 ± 0.4 mg/L] were lower than those during the dry season (mean ± SD = 4.0 ± 0.5 mg/L), whereas the δ18O-NO3− values during the rainy season (mean ± SD = +12.3 ± 3.6‰) were higher than those during the dry season (mean ± SD = +0.9 ± 1.9‰). Both chemical and isotopic characteristics indicated that mixing with atmospheric NO3− resulted in the high δ18O values during the rainy season, whereas NS and M and S were the dominant NO3− sources during the dry season. A Bayesian model was used to determine the contribution of each NO3− source to total stream NO3−. Results showed that reduced-N nitrification in soil zones (including soil organic matter and fertilizer) was the main NO3− source throughout the year. M and S contributed more NO3− during the dry season (22.4%) than during the rainy season (17.8%). AD generated substantial amounts of NO3− in May (18.4%), June (29.8%), and July (24.5%). With this assessment of the temporal variation of diffused NO3− sources in the agricultural field, improved agricultural management practices can be implemented to protect the water resource and avoid further water quality deterioration in the Taihu Lake Basin. - Highlights: • The isotopic characteristics of potential NO3− sources were identified. • Mixing with atmospheric NO3− resulted
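
    The Bayesian source-apportionment step can be illustrated with a deliberately simplified two-source mixing model; the isotope values below are invented, and the study itself apportioned five sources using both δ15N and δ18O.

        import numpy as np

        # Hypothetical end-member means/SDs for delta-15N (per mil).
        mu_soil, sd_soil = 5.0, 2.0        # soil-derived nitrate
        mu_ms, sd_ms = 12.0, 3.0           # manure and sewage
        obs, obs_sd = 8.0, 1.0             # stream measurement

        f = np.linspace(0.0, 1.0, 1001)    # fraction contributed by soil
        mix_mu = f * mu_soil + (1.0 - f) * mu_ms
        mix_sd = np.sqrt((f * sd_soil) ** 2 + ((1.0 - f) * sd_ms) ** 2
                         + obs_sd ** 2)
        like = np.exp(-0.5 * ((obs - mix_mu) / mix_sd) ** 2) / mix_sd
        post = like / np.trapz(like, f)    # flat prior on the fraction
        print("posterior mean soil fraction:", np.trapz(f * post, f))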

  3. New approaches for identifying and testing potential new anti-asthma agents.

    Science.gov (United States)

    Licari, Amelia; Castagnoli, Riccardo; Brambilla, Ilaria; Marseglia, Alessia; Tosca, Maria Angela; Marseglia, Gian Luigi; Ciprandi, Giorgio

    2018-01-01

    Asthma is a chronic disease with significant heterogeneity in clinical features, disease severity, patterns of underlying disease mechanisms, and responsiveness to specific treatments. While the majority of asthmatic patients are controlled by standard pharmacological strategies, a significant subgroup has limited therapeutic options, representing a major unmet need. Ongoing asthma research aims to better characterize distinct clinical phenotypes and molecular endotypes, to identify associated reliable biomarkers, and to develop a series of new effective targeted treatment modalities. Areas covered: The expanding knowledge on the pathogenetic mechanisms of asthma has allowed researchers to investigate a range of new treatment options matched to patient profiles. The aim of this review is to provide a comprehensive and updated overview of the currently available, new and developing approaches for identifying and testing potential treatment options for asthma management. Expert opinion: Future therapeutic strategies for asthma require the identification of reliable biomarkers that can help with diagnosis and endotyping, in order to determine the most effective drug for the right patient phenotype. Furthermore, in addition to the identification of clinical and inflammatory phenotypes, it is expected that a better understanding of the mechanisms of airway remodeling will likely optimize targeted asthma treatment.

  4. Identifying Meaning Components in the Translation of Medical Terms from English into Indonesian: A Semantic Approach

    Directory of Open Access Journals (Sweden)

    I Gusti Agung Sri Rwa Jayantini

    2017-10-01

    Full Text Available This paper focuses on identifying meaning components in the translation of English medical terms into Indonesian. The data used in this study are the English medical term disorder and its Indonesian equivalent penyakit (disease). The two terms were purposively chosen as the data of the present study, which is comparative research investigating lexical meaning in two different languages. An investigation involving a particular term in one language and its equivalent in the other language is worth doing, since the lexicons of every language carry their own specific concepts that may be synonymous yet are not always interchangeable in all contexts. The analysis of a term into its meaning components, carried out by means of semantic theories of lexical meaning, is called decomposition (Löbner 2013). Here, the meaning components of the two compared terms are demonstrated through a semantic approach, particularly Natural Semantic Metalanguage (NSM), supported by an investigation of their synonyms and of how the terms are used in different contexts. The results show that the meaning components of a particular term in one language, like the English term disorder, are not always found in the Indonesian term penyakit; conversely, some meaning components of the Indonesian term do not always exist in the English term.

  5. Parallel approach to identifying the well-test interpretation model using a neurocomputer

    Science.gov (United States)

    May, Edward A., Jr.; Dagli, Cihan H.

    1996-03-01

    The well test is one of the primary diagnostic and predictive tools used in the analysis of oil and gas wells. In these tests, a pressure recording device is placed in the well and the pressure response is recorded over time under controlled flow conditions. The interpreted results are indicators of the well's ability to flow and of the damage done to the formation surrounding the wellbore during drilling and completion. The results are used for many purposes, including reservoir modeling (simulation) and economic forecasting. The first step in the analysis is the identification of the Well-Test Interpretation (WTI) model, which determines the appropriate solution method. Misidentification of the WTI model occurs due to noise and non-ideal reservoir conditions. Previous studies have shown that a feed-forward neural network using the backpropagation algorithm can be used to identify the WTI model. One drawback to this approach, however, is training time, which can run into days of CPU time on personal computers. In this paper a similar neural network is applied using both a personal computer and a neurocomputer. Input data processing, network design, and performance are discussed and compared. The results show that the neurocomputer greatly eases the burden of training and allows the network to outperform a similar network running on a personal computer.
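
    A compact modern analogue of such a classifier can be sketched with a small multilayer perceptron; the feature count, class count, and synthetic data below are placeholders for the preprocessed pressure-derivative inputs and WTI model classes of the original work.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.normal(size=(600, 32))       # stand-in well-test features
        y = rng.integers(0, 5, size=600)     # five hypothetical WTI classes

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                            random_state=0)
        clf.fit(X[:480], y[:480])
        print("holdout accuracy:", clf.score(X[480:], y[480:]))

    On synthetic noise the accuracy is near chance; the point is only the train/evaluate pattern, which on modern hardware takes seconds rather than the days of CPU time reported here.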

  6. Pharmacy patronage: identifying key factors in the decision making process using the determinant attribute approach.

    Science.gov (United States)

    Franic, Duska M; Haddock, Sarah M; Tucker, Leslie Tootle; Wooten, Nathan

    2008-01-01

    To use the determinant attribute approach, a research method commonly used in marketing to identify the wants of various consumer groups, to evaluate consumer pharmacy choice when having a prescription order filled in different pharmacy settings. Cross-sectional. Community independent, grocery store, community chain, and discount store pharmacies in Georgia between April 2005 and April 2006. Convenience sample of adult pharmacy consumers (n = 175). Survey measuring consumer preferences on 26 attributes encompassing general pharmacy site features (16 items), pharmacist characteristics (5 items), and pharmacy staff characteristics (5 items). 26 potential determinant attributes for pharmacy selection. A total of 175 consumers were surveyed at community independent (n = 81), grocery store (n = 44), community chain (n = 27), or discount store (n = 23) pharmacy settings. The attributes of pharmacists and staff at all four pharmacy settings were shown to affect pharmacy patronage motives, although consumers frequenting non-community independent pharmacies were also motivated by secondary convenience factors (e.g., hours of operation and prescription coverage). Most consumers do not perceive pharmacies as merely prescription-distribution centers that vary only by convenience. Prescriptions are not just another economic good. Pharmacy personnel influence pharmacy selection; therefore, optimal staff selection and training are likely the greatest asset and most important investment for ensuring pharmacy success.

  7. Identifying medication error chains from critical incident reports: a new analytic approach.

    Science.gov (United States)

    Huckels-Baumgart, Saskia; Manser, Tanja

    2014-10-01

    Research into the distribution of medication errors usually focuses on isolated stages within the medication use process. Our study aimed to provide a novel process-oriented approach to medication incident analysis focusing on medication error chains. The study was conducted at a 900-bed teaching hospital in Switzerland. All 1,591 medication errors reported between 2009 and 2012 were categorized using the NCC MERP Medication Error Index and the WHO Classification for Patient Safety methodology. In order to identify medication error chains, each reported medication incident was allocated to the relevant stage of the hospital medication use process. Only 25.8% of the reported medication errors were detected before they propagated through the medication use process. The majority of medication errors (74.2%) formed an error chain encompassing two or more stages. The most frequent error chain comprised preparation up to and including medication administration (45.2%). "Non-consideration of documentation/prescribing" during drug preparation was the most frequent contributor to "wrong dose" during medication administration. Medication error chains provide important insights for detecting and stopping medication errors before they reach the patient. Existing and new safety barriers need to be extended to interrupt error chains and to improve patient safety. © 2014, The American College of Clinical Pharmacology.

  8. Omics Approaches for Identifying Physiological Adaptations to Genome Instability in Aging.

    Science.gov (United States)

    Edifizi, Diletta; Schumacher, Björn

    2017-11-04

    DNA damage causally contributes to aging and age-related diseases. The declining functioning of tissues and organs during aging can lead to the increased risk of succumbing to aging-associated diseases. Congenital syndromes that are caused by heritable mutations in DNA repair pathways lead to cancer susceptibility and accelerated aging, thus underlining the importance of genome maintenance for withstanding aging. High-throughput mass-spectrometry-based approaches have recently contributed to identifying signalling response networks and gaining a more comprehensive understanding of the physiological adaptations occurring upon unrepaired DNA damage. The insulin-like signalling pathway has been implicated in a DNA damage response (DDR) network that includes epidermal growth factor (EGF)-, AMP-activated protein kinases (AMPK)- and the target of rapamycin (TOR)-like signalling pathways, which are known regulators of growth, metabolism, and stress responses. The same pathways, together with the autophagy-mediated proteostatic response and the decline in energy metabolism have also been found to be similarly regulated during natural aging, suggesting striking parallels in the physiological adaptation upon persistent DNA damage due to DNA repair defects and long-term low-level DNA damage accumulation occurring during natural aging. These insights will be an important starting point to study the interplay between signalling networks involved in progeroid syndromes that are caused by DNA repair deficiencies and to gain new understanding of the consequences of DNA damage in the aging process.

  9. Assessment of a Novel Approach to Identify Trichiasis Cases Using Community Treatment Assistants in Tanzania.

    Science.gov (United States)

    Greene, Gregory S; West, Sheila K; Mkocha, Harran; Munoz, Beatriz; Merbs, Shannath L

    2015-12-01

    Simple surgical intervention advocated by the World Health Organization can alleviate trachomatous trichiasis (TT) and prevent subsequent blindness. A large backlog of TT cases remain unidentified and untreated. To increase identification and referral of TT cases, a novel approach using standard screening questions, a card, and simple training for Community Treatment Assistants (CTAs) to use during Mass Drug Administration (MDA) was developed and evaluated in Kongwa District, a trachoma-endemic area of central Tanzania. A community randomized trial was conducted in 36 communities during MDA. CTAs in intervention villages received an additional half-day of training and a TT screening card in addition to the training received by CTAs in villages assigned to usual care. All MDA participants 15 years and older were screened for TT, and senior TT graders confirmed case status by evaluating all screened-positive cases. A random sample of those who screened negative for TT and those who did not present at MDA were also evaluated by the master graders. Intervention CTAs identified 5.6 times as many cases (n = 50) as those assigned to usual care (n = 9, p < 0.05). While specificity was above 90% for both groups, the sensitivity of the novel screening tool was 31.2% compared to 5.6% for the usual care group (p < 0.05). CTAs appear to be viable resources for the identification of TT cases. Additional training and use of a TT screening card significantly increased the ability of CTAs to recognize and refer TT cases during MDA; however, further efforts are needed to improve case detection and reduce the number of false positive cases.

  10. Omics Approaches for Identifying Physiological Adaptations to Genome Instability in Aging

    Directory of Open Access Journals (Sweden)

    Diletta Edifizi

    2017-11-01

    Full Text Available DNA damage causally contributes to aging and age-related diseases. The declining functioning of tissues and organs during aging can lead to the increased risk of succumbing to aging-associated diseases. Congenital syndromes that are caused by heritable mutations in DNA repair pathways lead to cancer susceptibility and accelerated aging, thus underlining the importance of genome maintenance for withstanding aging. High-throughput mass-spectrometry-based approaches have recently contributed to identifying signalling response networks and gaining a more comprehensive understanding of the physiological adaptations occurring upon unrepaired DNA damage. The insulin-like signalling pathway has been implicated in a DNA damage response (DDR) network that includes epidermal growth factor (EGF)-, AMP-activated protein kinases (AMPK)- and the target of rapamycin (TOR)-like signalling pathways, which are known regulators of growth, metabolism, and stress responses. The same pathways, together with the autophagy-mediated proteostatic response and the decline in energy metabolism have also been found to be similarly regulated during natural aging, suggesting striking parallels in the physiological adaptation upon persistent DNA damage due to DNA repair defects and long-term low-level DNA damage accumulation occurring during natural aging. These insights will be an important starting point to study the interplay between signalling networks involved in progeroid syndromes that are caused by DNA repair deficiencies and to gain new understanding of the consequences of DNA damage in the aging process.

  11. Prevalence and characteristics of asthma–COPD overlap syndrome identified by a stepwise approach

    Directory of Open Access Journals (Sweden)

    Inoue H

    2017-06-01

    Full Text Available Hiromasa Inoue,1 Takahide Nagase,2 Satoshi Morita,3 Atsushi Yoshida,4 Tatsunori Jinnai,4 Masakazu Ichinose5 1Department of Pulmonary Medicine, Graduate School of Medical and Dental Sciences, Kagoshima University, Kagoshima; 2Department of Respiratory Medicine, Graduate School of Medicine, The University of Tokyo, Tokyo; 3Department of Biomedical Statistics and Bioinformatics, Kyoto University Graduate School of Medicine, Kyoto; 4Medical Department, AstraZeneca K.K., Osaka; 5Department of Respiratory Medicine, Tohoku University Graduate School of Medicine, Sendai, Japan. Background and objective: There is increasing recognition of asthma–COPD overlap syndrome (ACOS), which shares some features of both asthma and COPD; however, the prevalence and characteristics of ACOS are not well understood. The aim of this study was to investigate the prevalence of ACOS among patients with COPD and its characteristics using a stepwise approach, as stated in the recent report of the Global Initiative for Asthma (GINA) and the Global Initiative for Chronic Obstructive Lung Disease (GOLD). Methods: This multicenter, cross-sectional, observational study enrolled outpatients who were receiving medical treatment for COPD. Clinical data, including spirometry results, were retrieved from medical records. For symptom assessment, patients were asked to complete the Clinical COPD Questionnaire and the modified British Medical Research Council questionnaire. Results: Of the 1,008 patients analyzed, 167 (16.6%) had syndromic features of ACOS. Of the total number of patients, 93 (9.2%) and 42 (4.2%) also had a predefined clinical variability of ≥12%/≥200 mL and ≥12%/≥400 mL in forced expiratory volume in 1 second (FEV1), respectively, and were therefore identified as having ACOS. Conversely, the number of patients who had either a syndromic or a spirometric feature of ACOS was 595 (59.0%; ≥12%/≥200 mL FEV1 clinical variability) and 328 (32.5%; ≥12%/≥400 m

  12. A zone-based approach to identifying urban land uses using nationally-available data

    Science.gov (United States)

    Falcone, James A.

    Accurate identification of urban land use is essential for many applications in environmental study, ecological assessment, and urban planning, among other fields. However, because physical surfaces of land cover types are not necessarily related to their use and economic function, differentiating among thematically-detailed urban land uses (single-family residential, multi-family residential, commercial, industrial, etc.) using remotely-sensed imagery is a challenging task, particularly over large areas. Because the process requires an interpretation of tone/color, size, shape, pattern, and neighborhood association elements within a scene, it has traditionally been accomplished via manual interpretation of aerial photography or high-resolution satellite imagery. Although success has been achieved for localized areas using various automated techniques based on high-spatial or high-spectral resolution data, few detailed (Anderson Level II equivalent or greater) urban land use mapping products have successfully been created via automated means for broad (multi-county or larger) areas, and no such product exists today for the United States. In this study I argue that by employing a zone-based approach it is feasible to map thematically-detailed urban land use classes over large areas using appropriate combinations of non-image based predictor data which are nationally and publicly available. The approach presented here uses U.S. Census block groups as the basic unit of geography, and predicts the percent of each of ten land use types---nine of them urban---for each block group based on a number of data sources, to include census data, nationally-available point locations of features from the USGS Geographic Names Information System, historical land cover, and metrics which characterize spatial pattern, context (e.g. distance to city centers or other features), and measures of spatial autocorrelation. The method was demonstrated over a four-county area surrounding the

  13. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    Science.gov (United States)

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  14. A Spatial Approach to Identify Slum Areas in East Wara Sub-Districts, South Sulawesi

    Science.gov (United States)

    Anurogo, W.; Lubis, M. Z.; Pamungkas, D. S.; Hartono; Ibrahim, F. M.

    2017-12-01

    The spatial approach is one of the main approaches of geography; its analysis emphasizes the existence of space that serves to accommodate human activities. The dynamic development of a city area brings many impacts to the urban community's own life patterns. The development of the city center, which is the center of economic activity, becomes an attraction for the community and can bring a high flow of labor from both within the city itself and from outside the city area, thus causing a high flow of urbanization. Urbanization has caused an explosion in urban populations, and one implication is the clumping of labor in major cities in Indonesia. Another impact of the high urbanization flow into cities is the problem of urban settlements: the more people come into a city, the worse the quality of its existing settlements becomes if they are not managed properly. This study aims to determine the location of slum areas in East Wara Sub-District using remote sensing technology and a Geographic Information System (GIS). Parameters used to identify slum areas were partially extracted from remote sensing data; for parameters that cannot be extracted from remote sensing data, information was obtained from field surveys, with information retrieval based on reference data. Analysis of the slum-settlement parameters indicates that the largest slum areas of East Wara Sub-District are located in Pontap village, which contains both slum classes, 'very slum' and 'slum'; the 'slum' class is also found in Surutangga village. The analysis shows that the 'very slum' settlement class covers 46.324 ha, located only in Pontap village, whereas the 'slum' class, found in parts of Pontap and Surutangga villages, covers 37.797 ha. The class of slum settlement areas has the largest proportion of the area among other classes in East Wara Sub-District. The class of slum settlement areas has an

  15. The GOES-R/JPSS Approach for Identifying Hazardous Low Clouds: Overview and Operational Impacts

    Science.gov (United States)

    Calvert, Corey; Pavolonis, Michael; Lindstrom, Scott; Gravelle, Chad; Terborg, Amanda

    2017-04-01

    Low ceiling and visibility is a weather hazard that nearly every forecaster, in nearly every National Weather Service (NWS) Weather Forecast Office (WFO), must regularly address. In addition, national forecast centers such as the Aviation Weather Center (AWC), the Alaska Aviation Weather Unit (AAWU) and the Ocean Prediction Center (OPC) are responsible for issuing low-ceiling- and visibility-related products. As such, reliable methods for detecting and characterizing hazardous low clouds are needed. Traditionally, hazardous areas of Fog/Low Stratus (FLS) are identified using a simple stand-alone satellite product constructed by subtracting the 3.9 and 11 μm brightness temperatures. However, the 3.9-11 μm brightness temperature difference (BTD) has several major limitations. In an effort to address these limitations, the GOES-R Algorithm Working Group (AWG) developed an approach that fuses satellite observations, Numerical Weather Prediction (NWP) model output, Sea Surface Temperature (SST) analyses, and other data sets (e.g., digital surface elevation maps, surface emissivity maps, and surface type maps) to determine the probability that hazardous low clouds are present, using a naïve Bayesian classifier. In addition, recent research has focused on blending geostationary (e.g., GOES-R) and low-earth-orbit (e.g., JPSS) satellite data to further improve the products. The FLS algorithm takes an enterprise approach in that it can utilize satellite data from a variety of current and future operational sensors and NWP data from a variety of models. The FLS products are available in AWIPS/N-AWIPS/AWIPS-II and have been evaluated within NWS operations over the last four years as part of the Satellite Proving Ground. Forecaster feedback has been predominantly positive, and references to these products within Area Forecast Discussions (AFDs) indicate that the products are influencing operational forecasts. At the request of the NWS, the FLS products are currently being
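
    The fusion step can be sketched as a naïve Bayesian update over discretized predictors; the feature names, conditional probabilities, and prior below are hypothetical, and the operational algorithm fuses many more satellite, NWP, and ancillary inputs.

        # Classes: hazardous low cloud present (1) or absent (0).
        p_btd = {1: {"low": 0.7, "mid": 0.2, "high": 0.1},   # 3.9-11 um BTD bin
                 0: {"low": 0.2, "mid": 0.3, "high": 0.5}}
        p_rh = {1: {"moist": 0.8, "dry": 0.2},               # NWP humidity bin
                0: {"moist": 0.3, "dry": 0.7}}
        prior = {1: 0.1, 0: 0.9}

        def posterior(btd, rh):
            # Naive Bayes: multiply class-conditional likelihoods, normalize.
            scores = {c: prior[c] * p_btd[c][btd] * p_rh[c][rh] for c in (0, 1)}
            return scores[1] / sum(scores.values())

        print("P(hazardous low cloud):", posterior("low", "moist"))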

  16. Identifying Country-Specific Cultures of Physics Education: A differential item functioning approach

    Science.gov (United States)

    Mesic, Vanes

    2012-11-01

    In international large-scale assessments of educational outcomes, student achievement is often represented by unidimensional constructs. This approach allows for drawing general conclusions about country rankings with respect to the given achievement measure, but it typically does not provide the specific diagnostic information necessary for systematic comparisons and improvements of educational systems. Useful information can be obtained by exploring the differences in national profiles of student achievement between low-achieving and high-achieving countries. In this study, we aimed to identify the relative weaknesses and strengths of eighth graders' physics achievement in Bosnia and Herzegovina in comparison to the achievement of their peers from Slovenia. For this purpose, we ran a secondary analysis of Trends in International Mathematics and Science Study (TIMSS) 2007 data. The student sample consisted of 4,220 students from Bosnia and Herzegovina and 4,043 students from Slovenia. After analysing the cognitive demands of TIMSS 2007 physics items, the corresponding differential item functioning (DIF)/differential group functioning contrasts were estimated. Approximately 40% of items exhibited large DIF contrasts, indicating significant differences between the cultures of physics education in Bosnia and Herzegovina and Slovenia. The relative strength of students from Bosnia and Herzegovina proved to be mainly associated with the topic area 'Electricity and magnetism'. Classes of items which required knowledge of the experimental method, counterintuitive thinking, proportional reasoning and/or the use of complex knowledge structures proved to be differentially easier for students from Slovenia. In the light of the presented results, the common practice of ranking countries with respect to universally established cognitive categories seems potentially misleading.
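
    A standard way to estimate such DIF contrasts is the Mantel-Haenszel procedure; the counts below are invented (not TIMSS data) and serve only to show the computation.

        import numpy as np

        def mantel_haenszel_or(tables):
            """Common odds ratio across ability strata for one item.
            Each 2x2 table: rows = group (reference/focal),
            columns = item answered correctly/incorrectly."""
            num = sum(t[0, 0] * t[1, 1] / t.sum() for t in tables)
            den = sum(t[0, 1] * t[1, 0] / t.sum() for t in tables)
            return num / den

        strata = [np.array([[30, 20], [22, 28]]),
                  np.array([[40, 15], [33, 22]]),
                  np.array([[25, 5], [20, 10]])]
        or_mh = mantel_haenszel_or(strata)
        # The ETS delta scale (-2.35 * ln OR) is commonly used to flag
        # large DIF, e.g., |delta| >= 1.5.
        print("MH odds ratio:", or_mh, "delta:", -2.35 * np.log(or_mh))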

  17. A spatial modeling approach to identify potential butternut restoration sites in Mammoth Cave National Park

    Science.gov (United States)

    Thompson, L.M.; Van Manen, F.T.; Schlarbaum, S.E.; DePoy, M.

    2006-01-01

    Incorporation of disease resistance is nearly complete for several important North American hardwood species threatened by exotic fungal diseases. The next important step toward species restoration would be to develop reliable tools to delineate ideal restoration sites on a landscape scale. We integrated spatial modeling and remote sensing techniques to delineate potential restoration sites for Butternut (Juglans cinerea L.) trees, a hardwood species being decimated by an exotic fungus, in Mammoth Cave National Park (MCNP), Kentucky. We first developed a multivariate habitat model to determine optimum Butternut habitats within MCNP. Habitat characteristics of 54 known Butternut locations were used in combination with eight topographic and land use data layers to calculate an index of habitat suitability based on Mahalanobis distance (D2). We used a bootstrapping technique to test the reliability of model predictions. Based on a threshold value for the D2 statistic, 75.9% of the Butternut locations were correctly classified, indicating that the habitat model performed well. Because Butternut seedlings require extensive amounts of sunlight to become established, we used canopy cover data to refine our delineation of favorable areas for Butternut restoration. Areas with the most favorable conditions to establish Butternut seedlings were limited to 291.6 ha. Our study provides a useful reference on the amount and location of favorable Butternut habitat in MCNP and can be used to identify priority areas for future Butternut restoration. Given the availability of relevant habitat layers and accurate location records, our approach can be applied to other tree species and areas. © 2006 Society for Ecological Restoration International.
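
    The suitability index can be sketched directly from the definition of the Mahalanobis distance; the habitat variables and values below are invented stand-ins for the eight topographic and land-use layers used in the study.

        import numpy as np

        rng = np.random.default_rng(1)
        # Stand-in habitat variables at 54 known locations
        # (e.g., elevation, slope, a land-cover index).
        known = rng.normal(loc=[300.0, 10.0, 0.5],
                           scale=[50.0, 3.0, 0.2], size=(54, 3))
        mu = known.mean(axis=0)
        vi = np.linalg.inv(np.cov(known, rowvar=False))

        def d2(cell):
            """Squared Mahalanobis distance of a landscape cell from the
            multivariate mean of known sites; smaller = more suitable."""
            diff = cell - mu
            return float(diff @ vi @ diff)

        print(d2(np.array([310.0, 9.0, 0.55])))   # typical cell: small D2
        print(d2(np.array([600.0, 25.0, 0.10])))  # atypical cell: large D2

    Thresholding D2, as the study did to classify 75.9% of locations correctly, then delineates candidate restoration sites.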

  18. Assessment of a Novel Approach to Identify Trichiasis Cases Using Community Treatment Assistants in Tanzania.

    Directory of Open Access Journals (Sweden)

    Gregory S Greene

    2015-12-01

    Full Text Available Simple surgical intervention advocated by the World Health Organization can alleviate trachomatous trichiasis (TT) and prevent subsequent blindness. A large backlog of TT cases remain unidentified and untreated. To increase identification and referral of TT cases, a novel approach using standard screening questions, a card, and simple training for Community Treatment Assistants (CTAs) to use during Mass Drug Administration (MDA) was developed and evaluated in Kongwa District, a trachoma-endemic area of central Tanzania. A community randomized trial was conducted in 36 communities during MDA. CTAs in intervention villages received an additional half-day of training and a TT screening card in addition to the training received by CTAs in villages assigned to usual care. All MDA participants 15 years and older were screened for TT, and senior TT graders confirmed case status by evaluating all screened-positive cases. A random sample of those who screened negative for TT and those who did not present at MDA were also evaluated by the master graders. Intervention CTAs identified 5.6 times as many cases (n = 50) as those assigned to usual care (n = 9, p < 0.05). While specificity was above 90% for both groups, the sensitivity of the novel screening tool was 31.2% compared to 5.6% for the usual care group (p < 0.05). CTAs appear to be viable resources for the identification of TT cases. Additional training and use of a TT screening card significantly increased the ability of CTAs to recognize and refer TT cases during MDA; however, further efforts are needed to improve case detection and reduce the number of false positive cases.

  19. Predicting Fish Growth Potential and Identifying Water Quality Constraints: A Spatially-Explicit Bioenergetics Approach

    Science.gov (United States)

    Budy, Phaedra; Baker, Matthew; Dahle, Samuel K.

    2011-10-01

    Anthropogenic impairment of water bodies represents a global environmental concern, yet few attempts have successfully linked fish performance to thermal habitat suitability and fewer have distinguished co-varying water quality constraints. We interfaced fish bioenergetics, field measurements, and Thermal Remote Imaging to generate a spatially-explicit, high-resolution surface of fish growth potential, and next employed a structured hypothesis to detect relationships among measures of fish performance and co-varying water quality constraints. Our thermal surface of fish performance captured the amount and spatial-temporal arrangement of thermally-suitable habitat for three focal species in an extremely heterogeneous reservoir, but interpretation of this pattern was initially confounded by seasonal covariation of water residence time and water quality. Subsequent path analysis revealed that in terms of seasonal patterns in growth potential, catfish and walleye responded to temperature, positively and negatively, respectively; crappie and walleye responded to eutrophy (negatively). At the high eutrophy levels observed in this system, some desired fishes appear to suffer from excessive cultural eutrophication within the context of elevated temperatures whereas others appear to be largely unaffected or even enhanced. Our overall findings do not lead to the conclusion that this system is degraded by pollution; however, they do highlight the need to use a sensitive focal species in the process of determining allowable nutrient loading and as integrators of habitat suitability across multiple spatial and temporal scales. We provide an integrated approach useful for quantifying fish growth potential and identifying water quality constraints on fish performance at spatial scales appropriate for whole-system management.
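
    The growth-potential surface rests on a bioenergetics mass balance; the sketch below uses a generic Wisconsin-style formulation with purely illustrative parameters, not the species-specific models of the study.

        # growth = consumption - (respiration + waste + SDA), all in
        # relative energy units per gram per day; parameters are toy values.

        def growth_potential(temp_c, p_cmax=0.6):
            cmax = max(0.25 * temp_c * (1.0 - temp_c / 30.0), 0.0)  # thermal dome
            consumption = p_cmax * cmax
            respiration = 0.02 * 1.08 ** temp_c
            waste = 0.25 * consumption       # egestion + excretion
            sda = 0.10 * consumption         # specific dynamic action
            return consumption - (respiration + waste + sda)

        # Map a (tiny) thermal surface to growth potential, cell by cell.
        thermal_surface = [[18.0, 22.0], [26.0, 31.0]]
        print([[round(growth_potential(t), 3) for t in row]
               for row in thermal_surface])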

  20. External validity of sentiment mining reports: Can current methods identify demographic biases, event biases, and manipulation of reviews?

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Bloemen, Oscar

    2014-01-01

    Many publications in sentiment mining provide new techniques for improved accuracy in extracting features and corresponding sentiments in texts. For the external validity of these sentiment reports, i.e., the applicability of the results to target audiences, it is important to well analyze data of

  1. Use of the "Intervention Selection Profile-Social Skills" to Identify Social Skill Acquisition Deficits: A Preliminary Validation Study

    Science.gov (United States)

    Kilgus, Stephen P.; von der Embse, Nathaniel P.; Scott, Katherine; Paxton, Sara

    2015-01-01

    The purpose of this investigation was to develop and initially validate the "Intervention Selection Profile-Social Skills" (ISP-SS), a novel brief social skills assessment method intended for use at Tier 2. Participants included 54 elementary school teachers and their 243 randomly selected students. Teachers rated students on two rating…

  2. Identifying the Barriers to Using Games and Simulations in Education: Creating a Valid and Reliable Survey Instrument

    Science.gov (United States)

    Justice, Lenora Jean

    2012-01-01

    The purpose of this study was to create a valid and reliable instrument to measure teacher perceived barriers to the adoption of games and simulations in instruction. Previous research, interviews with educators, a focus group, an expert review, and a think aloud protocol were used to design a survey instrument. After finalization, the survey was…

  3. Identifying the Return on Investment for Army Migration to a Modular Open Systems Approach for Future and Legacy Systems

    Science.gov (United States)

    2017-04-05

    The National Defense Authorization Act (NDAA) of 2015 cites the modular open systems approach (MOSA) as both a business and technical strategy to reduce the cost of system ... access the service over the network. Combine the advances cited above with the emergence of systems developed using the modular open systems approach

  4. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    International Nuclear Information System (INIS)

    Sison Escaño, Mary Clare; Arevalo, Ryan Lacdao; Kasai, Hideaki; Gyenge, Elod

    2014-01-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for the development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4− on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements. (topical review)
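
    The screening quantity in such DFT studies is typically an adsorption energy; a common definition (notation ours, a sketch of convention rather than this review's specific formulation) is

        E_{\mathrm{ads}} \;=\; E_{\mathrm{BH_4/surface}} \;-\; E_{\mathrm{surface}} \;-\; E_{\mathrm{BH_4}},

    where a more negative E_ads indicates stronger binding of the borohydride species to the candidate catalyst surface, one of the descriptors correlated against experimental activity.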

  5. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases, from concept to operation and maintenance. Each phase has specific V and V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. Lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle.

  6. On the validity of the incremental approach to estimate the impact of cities on air quality

    Science.gov (United States)

    Thunis, Philippe

    2018-01-01

    The question of how much cities are the sources of their own air pollution is not only theoretical: it is critical to the design of effective strategies for urban air quality planning. In this work, we assess the validity of the commonly used incremental approach to estimating the likely impact of cities on their air pollution. With the incremental approach, the city impact (i.e., the concentration change generated by the city emissions) is estimated as the concentration difference between a rural background and an urban background location, also known as the urban increment. We show that the city impact is in reality made up of the urban increment and two additional components, and consequently two assumptions need to be fulfilled for the urban increment to be representative of the urban impact. The first assumption is that the rural background location is not influenced by emissions from within the city, whereas the second requires that background concentration levels, obtained with zero city emissions, are equal at both locations. Because the urban impact is not measurable, the SHERPA modelling approach, based on a full air quality modelling system, is used in this work to assess the validity of these assumptions for some European cities. Results indicate that for PM2.5, these two assumptions are far from being fulfilled for many large or medium-sized cities. For such cities, urban increments largely underestimate city impacts. Although results are in better agreement for NO2, similar issues arise. In many situations the incremental approach is therefore not an adequate estimate of the urban impact on air pollution. This poses interpretation issues when these increments are used to define strategic options for air quality planning. We finally illustrate the value of comparing modelled and measured increments to improve confidence in the model results.
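
    In symbols (notation ours, not the paper's): writing C_u and C_r for the urban- and rural-background concentrations with full emissions, and C_u^0, C_r^0 for the same locations with city emissions set to zero, the city impact decomposes as

        \Delta C_{\mathrm{city}} \;=\; C_u - C_u^{0}
        \;=\; \underbrace{(C_u - C_r)}_{\text{urban increment}}
        \;+\; \underbrace{(C_r - C_r^{0})}_{\text{city influence at the rural site}}
        \;+\; \underbrace{(C_r^{0} - C_u^{0})}_{\text{background difference}} .

    The incremental approach equates the impact with the first term alone, which is valid only when the last two terms vanish, i.e., exactly the two assumptions stated above.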

  7. A multivariate and stochastic approach to identify key variables to rank dairy farms on profitability.

    Science.gov (United States)

    Atzori, A S; Tedeschi, L O; Cannas, A

    2013-05-01

    The economic efficiency of dairy farms is the main goal of farmers. The objective of this work was to use routinely available information at the dairy farm level to develop an index of profitability to rank dairy farms and to assist the decision-making process of farmers to increase the economic efficiency of the entire system. A stochastic modeling approach was used to study the relationships between inputs and profitability (i.e., income over feed cost; IOFC) of dairy cattle farms. The IOFC was calculated as: milk revenue + value of male calves + culling revenue - herd feed costs. Two databases were created. The first one was a development database, created from technical and economic variables collected on 135 dairy farms. The second one was a synthetic database (sDB) created from 5,000 synthetic dairy farms using the Monte Carlo technique and based on the characteristics of the development database data. The sDB was used to develop a ranking index as follows: (1) principal component analysis (PCA), excluding IOFC, was used to identify principal components (sPC); and (2) coefficient estimates of a multiple regression of the IOFC on the sPC were obtained. Then, the eigenvectors of the sPC were used to compute the principal component values for the original 135 dairy farms, which were used with the multiple regression coefficient estimates to predict IOFC (dRI; ranking index from the development database). The dRI was used to rank the original 135 dairy farms. The PCA explained 77.6% of the sDB variability, and 4 sPC were selected. Based on their significant variables, the sPC were associated with herd profile, milk quality and payment, poor management, and reproduction. The mean IOFC in the sDB was 0.1377 ± 0.0162 euros per liter of milk (€/L). The dRI explained 81% of the variability of the IOFC calculated for the 135 original farms. When the number of farms below and above 1 standard deviation (SD) of the dRI was calculated, we found that 21
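
    The two-step construction of the ranking index can be sketched as follows; the synthetic variables, dimensions, and response below are placeholders, not the study's farm data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        X_synth = rng.normal(size=(5000, 10))            # synthetic farms
        iofc = 0.01 * X_synth[:, :4].sum(axis=1) + rng.normal(0, 0.02, 5000)

        scaler = StandardScaler().fit(X_synth)
        pca = PCA(n_components=4).fit(scaler.transform(X_synth))  # the sPC
        scores = pca.transform(scaler.transform(X_synth))
        reg = LinearRegression().fit(scores, iofc)                # IOFC ~ sPC

        # Apply the same eigenvectors and coefficients to the "real" farms.
        X_real = rng.normal(size=(135, 10))
        rank_index = reg.predict(pca.transform(scaler.transform(X_real)))
        print("top-ranked farm:", int(np.argmax(rank_index)))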

  8. Validating a conceptual model for an inter-professional approach to shared decision making: a mixed methods study

    Science.gov (United States)

    Légaré, France; Stacey, Dawn; Gagnon, Susie; Dunn, Sandy; Pluye, Pierre; Frosch, Dominick; Kryworuchko, Jennifer; Elwyn, Glyn; Gagnon, Marie-Pierre; Graham, Ian D

    2011-01-01

    Rationale, aims and objectives: Following increased interest in having inter-professional (IP) health care teams engage patients in decision making, we developed a conceptual model for an IP approach to shared decision making (SDM) in primary care. We assessed the validity of the model with stakeholders in Canada. Methods: In 15 individual interviews and 7 group interviews with 79 stakeholders, we asked them to: (1) propose changes to the IP-SDM model; (2) identify barriers and facilitators to the model's implementation in clinical practice; and (3) assess the model using a theory appraisal questionnaire. We performed a thematic analysis of the transcripts and a descriptive analysis of the questionnaires. Results: Stakeholders suggested placing the patient at the model's centre; extending the concept of family to include significant others; clarifying outcomes; highlighting the concept of time; merging the micro, meso and macro levels in one figure; and recognizing the influence of the environment and emotions. The most common barriers identified were time constraints, insufficient resources and an imbalance of power among health professionals. The most common facilitators were education and training in inter-professionalism and SDM, motivation to achieve an IP approach to SDM, and mutual knowledge and understanding of disciplinary roles. Most stakeholders considered that the concepts and the relationships between them were clear, and rated the model as logical, testable, having a clear schematic representation, and being relevant to inter-professional collaboration, SDM and primary care. Conclusions: Stakeholders validated the new IP-SDM model for primary care settings and proposed few modifications. Future research should assess whether the model helps implement SDM in IP clinical practice. PMID:20695950

  9. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity while taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of these methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or the β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
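
    For a normal measurement model, the generalized-pivotal-quantity machinery can be sketched by Monte Carlo; the sample statistics, reference value, and acceptance limit below are hypothetical, and the authors' exact test statistic may differ.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        tau, n, xbar, s = 100.0, 30, 100.4, 1.8  # reference value, data summary
        lam = 4.0                                # "close" = within +/- lam

        B = 100_000
        z = rng.standard_normal(B)
        chi2 = rng.chisquare(n - 1, B)
        g_sigma = s * np.sqrt((n - 1) / chi2)              # GPQ for sigma
        g_delta = (xbar - tau) - z * g_sigma / np.sqrt(n)  # GPQ for bias
        # GPQ for the proportion of future results within +/- lam of truth.
        g_p = (norm.cdf((lam - g_delta) / g_sigma)
               - norm.cdf((-lam - g_delta) / g_sigma))
        print("95% lower bound on proportion:", np.quantile(g_p, 0.05))
        # Declare the method fit for purpose if this bound exceeds, say, 0.90.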

  10. Computational Design and Discovery of Ni-Based Alloys and Coatings: Thermodynamic Approaches Validated by Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zi-Kui [Pennsylvania State University; Gleeson, Brian [University of Pittsburgh; Shang, Shunli [Pennsylvania State University; Gheno, Thomas [University of Pittsburgh; Lindwall, Greta [Pennsylvania State University; Zhou, Bi-Cheng [Pennsylvania State University; Liu, Xuan [Pennsylvania State University; Ross, Austin [Pennsylvania State University

    2018-04-23

    This project developed computational tools that can complement and support experimental efforts, enabling discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram) modeling, and experimental investigations of compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stability. The developed description covers composition ranges typical of coating alloys and hence allows prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool, ESPEI/pycalphad, for more rapid discovery and development of new materials.
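
    pycalphad is open source; a minimal usage sketch along the following lines shows the kind of calculation involved (the TDB file name, components, and conditions are placeholders, and the call reflects pycalphad's documented equilibrium API rather than this project's scripts).

        from pycalphad import Database, equilibrium, variables as v

        db = Database("ni_alloy.tdb")       # placeholder thermodynamic database
        comps = ["NI", "AL", "CR", "VA"]
        phases = list(db.phases.keys())

        # Phase equilibria over a temperature sweep at fixed composition.
        eq = equilibrium(db, comps, phases,
                         {v.X("AL"): 0.10, v.X("CR"): 0.08,
                          v.T: (600, 1800, 25), v.P: 101325, v.N: 1})
        print(eq.Phase.values.squeeze())    # stable phases along the sweep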

  11. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach

    Science.gov (United States)

    Petersen, D.; Naveed, P.; Ragheb, A.; Niedieker, D.; El-Mashtoly, S. F.; Brechmann, T.; Kötting, C.; Schmiegel, W. H.; Freier, E.; Pox, C.; Gerwert, K.

    2017-06-01

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and therewith in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of misplacement of the sample and spectral dilution of marker bands (especially in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods, such as leave-one-sample-out and leave-one-spectrum-out cross-validation schemes, were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%; cancer and normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples.

  12. Raman fiber-optical method for colon cancer detection: Cross-validation and outlier identification approach.

    Science.gov (United States)

    Petersen, D; Naveed, P; Ragheb, A; Niedieker, D; El-Mashtoly, S F; Brechmann, T; Kötting, C; Schmiegel, W H; Freier, E; Pox, C; Gerwert, K

    2017-06-15

    Endoscopy plays a major role in the early recognition of cancers that are not externally accessible, and therewith in increasing the survival rate. Raman spectroscopic fiber-optical approaches can help to decrease the impact on the patient, increase objectivity in tissue characterization, reduce expenses and provide a significant time advantage in endoscopy. In gastroenterology, early recognition of malignant and precursor lesions is relevant. Instantaneous and precise differentiation between adenomas as precursor lesions for cancer and hyperplastic polyps on the one hand, and between high- and low-risk alterations on the other hand, is important. Raman fiber-optical measurements of colon biopsy samples taken during colonoscopy were carried out during a clinical study, and samples of adenocarcinoma (22), tubular adenomas (141), hyperplastic polyps (79) and normal tissue (101) from 151 patients were analyzed. This allows us to focus on the bioinformatic analysis and to set the stage for Raman endoscopic measurements. Since spectral differences between normal and cancerous biopsy samples are small, special care has to be taken in data analysis. Using a leave-one-patient-out cross-validation scheme, three different outlier identification methods were investigated to decrease the influence of systematic errors, such as a residual risk of misplacement of the sample and spectral dilution of marker bands (especially in cancerous tissue), and therewith optimize the experimental design. Furthermore, other validation methods, such as leave-one-sample-out and leave-one-spectrum-out cross-validation schemes, were compared with leave-one-patient-out cross-validation. High-risk lesions were differentiated from low-risk lesions with a sensitivity of 79%, specificity of 74% and an accuracy of 77%; cancer and normal tissue with a sensitivity of 79%, specificity of 83% and an accuracy of 81%. Additionally applied outlier identification enabled us to improve the recognition of neoplastic biopsy samples. Copyright
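
    The grouping logic of leave-one-patient-out cross-validation is easy to reproduce with standard tooling; the spectra, labels, and patient IDs below are synthetic placeholders, and the study's actual classifier and preprocessing are not specified here.

        import numpy as np
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(4)
        X = rng.normal(size=(300, 600))        # spectra x wavenumbers
        y = rng.integers(0, 2, size=300)       # high- vs low-risk label
        patient = rng.integers(0, 30, size=300)

        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        # All spectra of one patient form the test fold, so patient-specific
        # signal cannot leak from training into testing.
        scores = cross_val_score(clf, X, y, groups=patient,
                                 cv=LeaveOneGroupOut())
        print("mean accuracy:", scores.mean())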

  13. Comparison of methodological approaches to identify economic activity regularities in transition economy

    Directory of Open Access Journals (Sweden)

    Jitka Poměnková

    2011-01-01

    Full Text Available The presented paper focuses on the consideration and evaluation of methodological approaches for analyzing the cyclical structure of economic activity in a transition economy. As a starting point, time-domain analysis is applied, followed by a frequency-domain approach. Both approaches are viewed from methodological as well as application points of view, and their advantages and disadvantages are discussed. Subsequently, a time-frequency domain approach is added and applied to real data. On the basis of the obtained results, a recommendation is formulated. All of the discussed methodological approaches are also considered from the perspective of their capability to evaluate the behaviour of the business cycle during the global economic crisis, before and after 2008. The empirical part of the paper deals with gross domestic product data for the Czech Republic from 1996/Q1 to 2010/Q2.
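
    A frequency-domain pass over such data typically starts from a periodogram; the series below is synthetic (a 20-quarter cycle plus noise), standing in for the Czech GDP data.

        import numpy as np
        from scipy.signal import periodogram

        rng = np.random.default_rng(5)
        t = np.arange(58)                  # 1996/Q1-2010/Q2 spans 58 quarters
        growth = 0.5 * np.sin(2 * np.pi * t / 20) + rng.normal(0, 0.3, t.size)

        freqs, power = periodogram(growth, fs=4.0)   # fs=4 -> cycles per year
        dominant = freqs[np.argmax(power[1:]) + 1]   # skip the zero frequency
        print("dominant cycle length: %.1f years" % (1.0 / dominant))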

  14. Experimental Validation of a Differential Variational Inequality-Based Approach for Handling Friction and Contact in Vehicle

    Science.gov (United States)

    2015-11-20

    This report concerns experimental validation of a differential variational inequality (DVI)-based approach for handling friction and contact in vehicle simulation, with terrain modeled using the discrete element method (DEM). The validation experiments include ... sinkage, and single wheel tests. 1.1. Modeling Frictional Contact via Differential Variational Inequalities: Consider a three-dimensional (3D) system of

  15. Neuroproteomics and Systems Biology Approach to Identify Temporal Biomarker Changes Post Experimental Traumatic Brain Injury in Rats

    Directory of Open Access Journals (Sweden)

    Firas H Kobeissy

    2016-11-01

    Full Text Available Traumatic brain injury (TBI) represents a critical health problem whose diagnosis, management and treatment remain challenging. TBI is a contributing factor in approximately one-third of all injury-related deaths in the United States. The Centers for Disease Control and Prevention (CDC) estimate that 1.7 million people suffer a TBI in the United States annually. Efforts continue to focus on elucidating the complex molecular mechanisms underlying TBI pathophysiology and defining sensitive and specific biomarkers that can aid in improving patient management and care. Recently, the area of neuroproteomics-systems biology is proving to be a prominent tool in biomarker discovery for central nervous system (CNS) injury and other neurological diseases. In this work, we employed the controlled cortical impact (CCI) model of experimental TBI in rats to assess the temporal-global proteome changes in the acute (1 day) and, for the first time, subacute (7 days) post-injury time frames, using the established CAX-PAGE LC-MS/MS platform for protein separation combined with discrete systems biology analyses to identify temporal biomarker changes related to this rat TBI model. Rather than focusing on any one individual molecular entity, we used an in silico systems biology approach to understand the global dynamics that govern proteins that are differentially altered post-injury. In addition, gene ontology analysis of the proteomic data was conducted in order to categorize the proteins by molecular function, biological process, and cellular localization. Results show alterations in several proteins related to inflammatory responses and oxidative stress in both the acute (1 day) and subacute (7 days) periods post-TBI. Moreover, results suggest a differential upregulation of neuroprotective proteins at 7 days post-CCI involved in cellular functions such as neurite growth, regeneration, and axonal guidance. Our study is amongst the first to assess temporal neuroproteome

  16. Identifying Creatively Gifted Students: Necessity of a Multi-Method Approach

    Science.gov (United States)

    Ambrose, Laura; Machek, Greg R.

    2015-01-01

    The process of identifying students as creatively gifted provides numerous challenges for educators. Although many schools assess for creativity in identifying students for gifted and talented services, the relationship between creativity and giftedness is often not fully understood. This article reviews commonly used methods of creativity…

  17. A systems genetics approach identifies genes and pathways for type 2 diabetes in human islets

    DEFF Research Database (Denmark)

    Taneera, Jalal; Lang, Stefan; Sharma, Amitabh

    2012-01-01

    Close to 50 genetic loci have been associated with type 2 diabetes (T2D), but they explain only 15% of the heritability. In an attempt to identify additional T2D genes, we analyzed global gene expression in human islets from 63 donors. Using 48 genes located near T2D risk variants, we identified ...

  18. Validating the TeleStroke Mimic Score: A Prediction Rule for Identifying Stroke Mimics Evaluated Over Telestroke Networks.

    Science.gov (United States)

    Ali, Syed F; Hubert, Gordian J; Switzer, Jeffrey A; Majersik, Jennifer J; Backhaus, Roland; Shepard, L Wylie; Vedala, Kishore; Schwamm, Lee H

    2018-03-01

    Up to 30% of acute stroke evaluations are deemed stroke mimics, and these are common in telestroke as well. We recently published a risk prediction score for use during telestroke encounters to differentiate stroke mimics from ischemic cerebrovascular disease, derived and validated in the Partners TeleStroke Network. Using data from 3 distinct US and European telestroke networks, we sought to externally validate the TeleStroke Mimic (TM) score in a broader population. We evaluated the TM score in 1930 telestroke consults from the University of Utah, Georgia Regents University, and the German TeleMedical Project for Integrative Stroke Care Network. We report the area under the curve in receiver-operating characteristic curve analysis with 95% confidence interval for our previously derived TM score, in which lower TM scores correspond with a higher likelihood of being a stroke mimic. Based on final diagnosis at the end of the telestroke consultation, there were 630 of 1930 (32.6%) stroke mimics in the external validation cohort. All 6 variables included in the score were significantly different between patients with ischemic cerebrovascular disease versus stroke mimics. The TM score performed well (area under curve, 0.72; 95% confidence interval, 0.70-0.73), and its ability to identify stroke mimics during telestroke consultation in these diverse cohorts was similar to its performance in our original cohort. Predictive decision-support tools like the TM score may help highlight key clinical differences between mimics and patients with stroke during complex, time-critical telestroke evaluations. © 2018 American Heart Association, Inc.
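
    As an illustration of the headline statistic above, the following is a minimal sketch (not the authors' code) of computing an area under the ROC curve with a bootstrap 95% confidence interval for a score in which lower values indicate a mimic; the scores and labels are simulated placeholders, not TM-score data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    scores = rng.normal(20, 5, 500)                   # hypothetical risk scores
    is_mimic = (scores + rng.normal(0, 6, 500)) < 18  # hypothetical labels

    # Lower scores indicate mimics, so negate the score for roc_auc_score.
    auc = roc_auc_score(is_mimic, -scores)

    boot = []
    for _ in range(2000):
        idx = rng.integers(0, len(scores), len(scores))
        if is_mimic[idx].all() or not is_mimic[idx].any():
            continue  # skip degenerate resamples containing a single class
        boot.append(roc_auc_score(is_mimic[idx], -scores[idx]))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"AUC = {auc:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```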

  19. Identifying target groups for environmentally sustainable transport: assessment of different segmentation approaches

    DEFF Research Database (Denmark)

    Haustein, Sonja; Hunecke, Marcel

    2013-01-01

    Recently, the use of attitude-based market segmentation to promote environmentally sustainable transport has significantly increased. The segmentation of the population into meaningful groups sharing similar attitudes and preferences provides valuable information about how green measures should...... and behavioural segmentations are compared regarding marketing criteria. Although none of the different approaches can claim absolute superiority, attitudinal approaches show advantages in providing starting points for interventions to reduce car use....

  20. Identifying subgroups among poor prognosis patients with nonseminomatous germ cell cancer by tree modelling: a validation study.

    NARCIS (Netherlands)

    M.R. van Dijk (Merel); E.W. Steyerberg (Ewout); S.P. Stenning; J.D.F. Habbema (Dik)

    2004-01-01

    BACKGROUND: In order to target intensive treatment strategies for poor prognosis patients with non-seminomatous germ cell cancer, those with the poorest prognosis should be identified. These patients might profit most from more intensive treatment strategies. For

  1. Vertebrae classification models - Validating classification models that use morphometrics to identify ancient salmonid (Oncorhynchus spp.) vertebrae to species

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Using morphometric characteristics of modern salmonid (Oncorhynchus spp.) vertebrae, we have developed classification models to identify salmonid vertebrae to the...

  2. A New Approach for Identifying Ionospheric Gradients in the Context of the Gagan System

    Science.gov (United States)

    Kudala, Ravi Chandra

    The Indian Space Research Organization and the Airports Authority of India are jointly implementing the Global Positioning System (GPS) aided GEO Augmented Navigation (GAGAN) system in order to meet the following required navigation performance (RNP) parameters: integrity, continuity, accuracy, and availability (for aircraft operations). Such a system provides the user with orbit, clock, and ionospheric corrections in addition to ranging signals via the geostationary earth orbit satellite (GEOSAT). The equatorial ionization anomaly (EIA), due to rapid non-uniform electron-ion recombination that persists on the Indian subcontinent, causes ionospheric gradients. Ionospheric gradients represent the most severe threat to high-integrity differential GNSS systems such as GAGAN. In order to ensure integrity under conditions of an ionospheric storm, the following three objectives must be met: careful monitoring, error bounding, and sophisticated storm-front modeling. The first objective is met by continuously tracking data due to storms and, on quiet days, determining precise estimates of the threat parameters from reference monitoring stations. The second objective is met by quantifying the above estimates of threat parameters due to storms through maximum and minimum typical thresholds. In the context of GAGAN, this work proposes a new method for identifying ionospheric gradients, in addition to determining an appropriate upper bound, in order to sufficiently understand error during storm days. Initially, carrier phase data of the GAGAN network from Indian TEC stations for both storm and quiet days were used to estimate ionospheric spatial and temporal gradients (the vertical ionospheric gradient (σVIG) and the rate of the TEC index (ROTI), respectively) in multiple viewing directions. Along similar lines, using the carrier to noise ratio (C/N0) for the same data, the carrier to noise ratio index (σCNRI) was derived. Subsequently, the one-to-one relationship between

  3. Derivation and validation of REASON: a risk score identifying candidates to screen for peripheral arterial disease using ankle brachial index.

    Science.gov (United States)

    Ramos, Rafel; Baena-Díez, Jose Miguel; Quesada, Miquel; Solanas, Pascual; Subirana, Isaac; Sala, Joan; Alzamora, Maite; Forès, Rosa; Masiá, Rafel; Elosua, Roberto; Grau, María; Cordón, Ferran; Pera, Guillem; Rigo, Fernando; Martí, Ruth; Ponjoan, Anna; Cerezo, Carlos; Brugada, Ramon; Marrugat, Jaume

    2011-02-01

    The recommendation of screening with ankle brachial index (ABI) in asymptomatic individuals is controversial. The aims of the present study were to develop and validate a pre-screening test to select candidates for ABI measurement in the Spanish population 50-79 years old, and to compare its predictive capacity to current Inter-Society Consensus (ISC) screening criteria. Two population-based cross-sectional studies were used to develop (n = 4046) and validate (n = 3285) a regression model to predict ABI < 0.9; the model showed a better predictive capacity than ISC guidelines, and similar sensitivity. This resulted in fewer patients screened per diagnosis of ABI < 0.9 (10.6 vs. 8.75) and a lower proportion of the population aged 50-79 years candidate to ABI screening (63.3% vs. 55.0%). This model provides accurate ABI < 0.9 risk estimates for ages 50-79, with a better predictive capacity than that of ISC criteria. Its use could reduce possible harms and unnecessary work-ups of ABI screening as a risk stratification strategy in primary prevention of peripheral vascular disease. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  4. Identifying patients with severe sepsis using administrative claims: patient-level validation of the angus implementation of the international consensus conference definition of severe sepsis.

    Science.gov (United States)

    Iwashyna, Theodore J; Odden, Andrew; Rohde, Jeffrey; Bonham, Catherine; Kuhn, Latoya; Malani, Preeti; Chen, Lena; Flanders, Scott

    2014-06-01

    Severe sepsis is a common and costly problem. Although consistently defined clinically by consensus conference since 1991, there have been several different implementations of the severe sepsis definition using ICD-9-CM codes for research. We conducted a single center, patient-level validation of 1 common implementation of the severe sepsis definition, the so-called "Angus" implementation. Administrative claims for all hospitalizations for patients initially admitted to general medical services from an academic medical center in 2009-2010 were reviewed. On the basis of ICD-9-CM codes, hospitalizations were sampled for review by 3 internal medicine-trained hospitalists. Chart reviews were conducted with a structured instrument, and the gold standard was the hospitalists' summary clinical judgment on whether the patient had severe sepsis. Three thousand one hundred forty-six (13.5%) hospitalizations met ICD-9-CM criteria for severe sepsis by the Angus implementation (Angus-positive) and 20,142 (86.5%) were Angus-negative. Chart reviews were performed for 92 randomly selected Angus-positive and 19 randomly selected Angus-negative hospitalizations. Reviewers had a κ of 0.70. The Angus implementation's positive predictive value was 70.7% [95% confidence interval (CI): 51.2%, 90.5%]. The negative predictive value was 91.5% (95% CI: 79.0%, 100%). The sensitivity was 50.4% (95% CI: 14.8%, 85.7%). Specificity was 96.3% (95% CI: 92.4%, 100%). Two alternative ICD-9-CM implementations had high positive predictive values but substantially lower sensitivities. The Angus implementation of the international consensus conference definition of severe sepsis offers a reasonable but imperfect approach to identifying patients with severe sepsis when compared with a gold standard of structured review of the medical chart by trained hospitalists.
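
    For reference, the operating characteristics quoted above follow from a standard 2x2 comparison against the chart-review gold standard. The sketch below computes them from illustrative cell counts; the study's own estimates additionally account for the stratified sampling of Angus-positive and Angus-negative charts.

    ```python
    # Illustrative 2x2 counts (not the study's exact data):
    # tp/fp from reviewed Angus-positive charts, fn/tn from Angus-negative ones.
    def diagnostics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
        }

    for name, value in diagnostics(tp=65, fp=27, fn=2, tn=17).items():
        print(f"{name}: {value:.1%}")
    ```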

  5. Clinically Detectable Dental Identifiers Observed in Intra-oral Photographs and Extra-oral Radiographs, Validated for Human Identification Purposes.

    Science.gov (United States)

    Angelakopoulos, Nikolaos; Franco, Ademir; Willems, Guy; Fieuws, Steffen; Thevissen, Patrick

    2017-07-01

    Screening the prevalence and pattern of dental identifiers contributes toward the process of human identification. This research investigated the uniqueness of clinical dental identifiers in photographs and radiographs. Panoramic and lateral cephalometric radiographs and five intra-oral photographs of 1727 subjects were used. In a target set, two observers examined different subjects. In a subset, both observers examined the same subjects (source set). The distance between source and target subjects was quantified for each identifier. The percentage of subjects in the target set being at least as close as the correct subject was assessed. The number of molars (34.6%), missing teeth (42%), and displaced teeth (59.9%) were the most unique identifiers in photographs and panoramic and lateral cephalometric radiographs, respectively. The pattern of rotated teeth (14.9%) was the most unique in photographs, while displaced teeth was in panoramic (37.6%) and lateral cephalometric (54.8%) radiographs. Morphological identifiers were the most unique, highlighting their importance for human identifications. © 2016 American Academy of Forensic Sciences.

  6. A multi-source satellite data approach for modelling Lake Turkana water level: calibration and validation using satellite altimetry data

    Directory of Open Access Journals (Sweden)

    N. M. Velpuri

    2012-01-01

    Full Text Available Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE of 0.80 during the validation period (2004–2009. Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1–2 m. The lake level fluctuated in the range up to 4 m between the years 1998–2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated
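
    The two agreement metrics used for validation here are simple to reproduce; below is a minimal sketch with hypothetical lake-level series, computing Pearson's correlation coefficient and the Nash-Sutcliffe Coefficient of Efficiency (NSCE) from their standard definitions.

    ```python
    import numpy as np

    # Hypothetical lake-level series (m): altimetry vs. water balance model.
    observed = np.array([1.2, 1.5, 1.9, 2.4, 2.1, 1.7, 1.4, 1.3])
    modelled = np.array([1.1, 1.6, 2.0, 2.3, 2.0, 1.8, 1.5, 1.2])

    r = np.corrcoef(observed, modelled)[0, 1]
    nsce = 1.0 - np.sum((observed - modelled) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)
    print(f"Pearson r = {r:.2f}, NSCE = {nsce:.2f}")
    ```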

  7. A multi-source satellite data approach for modelling Lake Turkana water level: Calibration and validation using satellite altimetry data

    Science.gov (United States)

    Velpuri, N.M.; Senay, G.B.; Asante, K.O.

    2012-01-01

    Lake Turkana is one of the largest desert lakes in the world and is characterized by high degrees of inter- and intra-annual fluctuations. The hydrology and water balance of this lake have not been well understood due to its remote location and unavailability of reliable ground truth datasets. Managing surface water resources is a great challenge in areas where in-situ data are either limited or unavailable. In this study, multi-source satellite-driven data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, and a digital elevation dataset were used to model Lake Turkana water levels from 1998 to 2009. Due to the unavailability of reliable lake level data, an approach is presented to calibrate and validate the water balance model of Lake Turkana using a composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data. Model validation results showed that the satellite-driven water balance model can satisfactorily capture the patterns and seasonal variations of the Lake Turkana water level fluctuations with a Pearson's correlation coefficient of 0.90 and a Nash-Sutcliffe Coefficient of Efficiency (NSCE) of 0.80 during the validation period (2004-2009). Model error estimates were within 10% of the natural variability of the lake. Our analysis indicated that fluctuations in Lake Turkana water levels are mainly driven by lake inflows and over-the-lake evaporation. Over-the-lake rainfall contributes only up to 30% of lake evaporative demand. During the modelling time period, Lake Turkana showed seasonal variations of 1-2 m. The lake level fluctuated in the range up to 4 m between the years 1998-2009. This study demonstrated the usefulness of satellite altimetry data to calibrate and validate the satellite-driven hydrological model for Lake Turkana without using any in-situ data. Furthermore, for Lake Turkana, we identified and outlined opportunities and challenges of using a calibrated satellite-driven water balance

  8. Development of Novel Random Network Theory-Based Approaches to Identify Network Interactions among Nitrifying Bacteria

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Cindy

    2015-07-17

    The interactions among different microbial populations in a community could play more important roles in determining ecosystem functioning than species numbers and their abundances, but very little is known about such network interactions at a community level. The goal of this project is to develop novel framework approaches and associated software tools to characterize the network interactions in microbial communities based on large-scale, high-throughput metagenomics data, and to apply these approaches to understand the impacts of environmental changes (e.g., climate change, contamination) on network interactions among different nitrifying populations and associated microbial communities.

  9. Cross sectional study of performance indicators for English Primary Care Trusts: testing construct validity and identifying explanatory variables

    Directory of Open Access Journals (Sweden)

    Lilford Richard

    2006-06-01

    Full Text Available Abstract Background The performance of Primary Care Trusts in England is assessed and published using a number of different performance indicators. Our study has two broad purposes. Firstly, to find out whether pairs of indicators that purport to measure similar aspects of quality are correlated (as would be expected if they are both valid measures of the same construct). Secondly, we wanted to find out whether broad (global) indicators correlated with any particular features of Primary Care Trusts, such as expenditure per capita. Methods Cross sectional quantitative analysis using data from six 2004/05 PCT performance indicators for 303 English Primary Care Trusts from four sources in the public domain: Star Rating, aggregated Quality and Outcomes Framework scores, Dr Foster mortality index, Dr Foster equity index (heart by-pass and hip replacements), NHS Litigation Authority Risk Management standards and Patient Satisfaction scores from the Star Ratings. Forward stepwise multiple regression analysis was used to determine the effect of Primary Care Trust characteristics on performance. Results Star Rating and Quality and Outcomes Framework total, both summary measures of global quality, were not correlated with each other (F = 0.66, p = 0.57). There were however positive correlations between Quality and Outcomes Framework total and patient satisfaction (r = 0.61). Conclusion Performance assessment in healthcare remains on the Government's agenda, with new core and developmental standards set to replace the Star Ratings in 2006. Yet the results of this analysis provide little evidence that the current indicators have sufficient construct validity to measure the underlying concept of quality, except when the specific area of screening is considered.

  10. Validity of consumer-grade activity monitor to identify manual wheelchair propulsion in standardized activities of daily living.

    Science.gov (United States)

    Leving, Marika T; Horemans, Henricus L D; Vegter, Riemer J K; de Groot, Sonja; Bussmann, Johannes B J; van der Woude, Lucas H V

    2018-01-01

    A hypoactive lifestyle contributes to the development of secondary complications and lower quality of life in wheelchair users. There is a need for objective and user-friendly physical activity monitors for wheelchair-dependent individuals in order to increase physical activity through self-monitoring, goal setting, and feedback provision. The aim was to determine the validity of Activ8 Activity Monitors to (1) distinguish two classes of activities: independent wheelchair propulsion versus other non-propulsive wheelchair-related activities; and (2) distinguish five wheelchair-related classes of activities differing by movement intensity level: sitting in a wheelchair (hands may be moving but the wheelchair remains stationary), maneuvering, and normal, high-speed, or assisted wheelchair propulsion. Sixteen able-bodied individuals performed sixteen standardized 60-s activities of daily living. Each participant was equipped with a set of two Activ8 Professional Activity Monitors, one at the right forearm and one at the right wheel. Task classification by the Activ8 monitors was validated using video recordings. For overall agreement, sensitivity, and positive predictive value, outcomes above 90% are considered excellent, between 70% and 90% good, and below 70% unsatisfactory. Division into two classes resulted in an overall agreement of 82.1%, sensitivity of 77.7%, and positive predictive value of 78.2%; 84.5% of the total duration of all tasks was classified identically by the Activ8 and the video material. Division into five classes resulted in an overall agreement of 56.6%, sensitivity of 52.8%, and positive predictive value of 51.9%; 59.8% of the total duration of all tasks was classified identically by the Activ8 and the video material. The Activ8 system proved suitable for distinguishing between active wheelchair propulsion and other non-propulsive wheelchair-related activities. The ability of the current system and algorithms to distinguish five various wheelchair-related activities

  11. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V&V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V&V phases from concept to operation and maintenance. Each phase has specific V&V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V&V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig.

  12. Design of character-based DNA barcode motif for species identification: A computational approach and its validation in fishes.

    Science.gov (United States)

    Chakraborty, Mohua; Dhar, Bishal; Ghosh, Sankar Kumar

    2017-11-01

    The DNA barcodes are generally interpreted using distance-based and character-based methods. The former uses clustering of comparable groups based on relative genetic distance, while the latter is based on the presence or absence of discrete nucleotide substitutions. The distance-based approach has a limitation in defining a universal species boundary across taxa, as the rate of mtDNA evolution is not constant throughout the taxa. The character-based approach more accurately defines species boundaries using a unique set of nucleotide characters. Character-based analysis of the full-length barcode has some inherent limitations, such as the need to sequence the full-length barcode, the use of a sparse data matrix, and the lack of a uniform diagnostic position for each group. A short continuous stretch of the fragment can be used to overcome these limitations. Here, we observe that a 154-bp fragment from the transversion-rich domain of 1367 COI barcode sequences can successfully delimit species in the three most diverse orders of freshwater fishes. This fragment is used to design species-specific barcode motifs for 109 species by the character-based method, which successfully identifies the correct species using a pattern-matching program. The motifs also correctly identify geographically isolated populations of the Cypriniformes species. Further, this region is validated as a species-specific mini-barcode for freshwater fishes by successful PCR amplification and sequencing of the motif (154 bp) using the designed primers. We anticipate that the use of such motifs will enhance the diagnostic power of the DNA barcode, and that the mini-barcode approach will greatly benefit field-based systems of rapid species identification. © 2017 John Wiley & Sons Ltd.
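
    To make the character-based idea concrete, here is a minimal sketch of the pattern-matching step: a species is called only when a query fragment carries that species' diagnostic nucleotides at fixed positions within the 154-bp region. The species, positions, and bases below are hypothetical placeholders, not the motifs designed in the paper.

    ```python
    # Hypothetical diagnostic positions (0-based) and bases in a 154-bp fragment.
    DIAGNOSTICS = {
        "Labeo rohita": {12: "A", 47: "T", 103: "G"},
        "Catla catla":  {12: "G", 47: "C", 103: "G"},
    }

    def identify(fragment: str) -> str:
        """Return the species whose diagnostic characters all match, else 'no match'."""
        for species, sites in DIAGNOSTICS.items():
            if all(fragment[pos] == base for pos, base in sites.items()):
                return species
        return "no match"

    query = list("A" * 154)
    query[12], query[47], query[103] = "G", "C", "G"
    print(identify("".join(query)))  # -> Catla catla
    ```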

  13. A Contextual Approach to the Assessment of Social Skills: Identifying Meaningful Behaviors for Social Competence

    Science.gov (United States)

    Warnes, Emily D.; Sheridan, Susan M.; Geske, Jenenne; Warnes, William A.

    2005-01-01

    An exploratory study was conducted which assessed behaviors that characterize social competence in the second and fifth grades. A contextual approach was used to gather information from second- and fifth-grade children and their parents and teachers regarding the behaviors they perceived to be important for getting along well with peers. Data were…

  14. A probabilistic approach to identify putative drug targets in biochemical networks.

    NARCIS (Netherlands)

    Murabito, E.; Smalbone, K.; Swinton, J.; Westerhoff, H.V.; Steuer, R.

    2011-01-01

    Network-based drug design holds great promise in clinical research as a way to overcome the limitations of traditional approaches in the development of drugs with high efficacy and low toxicity. This novel strategy aims to study how a biochemical network as a whole, rather than its individual

  15. Identifying multiple outliers in linear regression: robust fit and clustering approach

    International Nuclear Information System (INIS)

    Robiah Adnan; Mohd Nor Mohamad; Halim Setan

    2001-01-01

    This research provides a clustering-based approach for determining potential candidates for outliers. It is a modification of the method proposed by Serbert et al. (1988), and is based on using the single-linkage clustering algorithm to group the standardized predicted and residual values of a data set fit by least trimmed squares (LTS). (Author)
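
    A minimal sketch of the idea follows, with an ordinary least-squares fit standing in for the LTS fit and illustrative data: predicted and residual values are standardized and single-linkage clustered, and points falling outside the dominant cluster are flagged as outlier candidates.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, 50)
    y = 2.0 * x + rng.normal(0, 1, 50)
    y[:3] += 15.0                      # plant three outliers

    b1, b0 = np.polyfit(x, y, 1)       # stand-in for a robust LTS fit
    pred = b0 + b1 * x
    resid = y - pred

    # Standardize predicted and residual values, then single-linkage cluster.
    Z = np.column_stack([(pred - pred.mean()) / pred.std(),
                         (resid - resid.mean()) / resid.std()])
    labels = fcluster(linkage(Z, method="single"), t=2.5, criterion="distance")

    dominant = np.bincount(labels).argmax()
    print("candidate outliers:", np.where(labels != dominant)[0])
    ```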

  16. Validating CDIAC's population-based approach to the disaggregation of within-country CO2 emissions

    International Nuclear Information System (INIS)

    Cushman, R.M.; Beauchamp, J.J.; Brenkert, A.L.

    1998-01-01

    The Carbon Dioxide Information Analysis Center produces and distributes a data base of CO2 emissions from fossil-fuel combustion and cement production, expressed as global, regional, and national estimates. CDIAC also produces a companion data base, expressed on a one-degree latitude-longitude grid. To do this gridding, emissions within each country are spatially disaggregated according to the distribution of population within that country. Previously, the lack of within-country emissions data prevented a validation of this approach. But emissions inventories are now becoming available for most US states. An analysis of these inventories confirms that population distribution explains most, but not all, of the variance in the distribution of CO2 emissions within the US. Additional sources of variance (coal production, non-carbon energy sources, and interstate electricity transfers) are explored, with the hope that the spatial disaggregation of emissions can be improved
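
    The gridding step itself is a simple proportional allocation; the sketch below illustrates it for one country with hypothetical numbers (a national emissions total spread over five cells in proportion to population).

    ```python
    import numpy as np

    national_co2 = 1500.0                                    # kt C, hypothetical
    cell_population = np.array([120.0, 45.0, 300.0, 10.0, 525.0])

    # Each cell receives emissions proportional to its population share.
    cell_co2 = national_co2 * cell_population / cell_population.sum()
    print(cell_co2)   # emissions allocated to each one-degree grid cell
    ```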

  17. Ageing nuclear plants inspection, maintenance and performance monitoring eased by Data Validation and Reconciliation (DVR) approach

    International Nuclear Information System (INIS)

    Quang, A. Tran; Chares, R.

    2012-01-01

    With the long-term operation of nuclear power plants, decisions to replace or repair should be optimized from an economic, human resource, and safety point of view. In that frame, operation requires continuous inspection and testing of systems in order to detect degradations and failures at an early stage. Nevertheless, although such assiduous vigilance ensures safety, it is not optimized in terms of manpower and maintenance costs. Preventive strategies show the same types of drawbacks. On top of that, these maintenance procedures rely on process measurements whose accuracy, availability, and reliability cannot always be ensured. The present paper introduces a new approach to the maintenance management of ageing installations and suggests a method that overcomes the current drawbacks. Data Validation and Reconciliation (DVR) is an advanced and innovative technology that relies on process data statistics, thermodynamics, and on-site process knowledge. Benefits of real-time applications are also presented. (author)

  18. Identify too big to fail banks and capital insurance: An equilibrium approach

    OpenAIRE

    Katerina Ivanov

    2017-01-01

    The objective of this paper is to develop a rational expectation equilibrium model of capital insurance to identify too big to fail banks. The main results of this model include (1) too big to fail banks can be identified explicitly by a systemic risk measure, loss betas, of all banks in the entire financial sector; (2) the too big to fail feature can be largely justified by a high level of loss beta; (3) the capital insurance proposal benefits market participants and reduces the systemic risk; ...

  19. A Transcriptomic Approach to Identify Novel Drug Efflux Pumps in Bacteria.

    Science.gov (United States)

    Li, Liping; Tetu, Sasha G; Paulsen, Ian T; Hassan, Karl A

    2018-01-01

    The core genomes of most bacterial species include a large number of genes encoding putative efflux pumps. The functional roles of most of these pumps are unknown; however, they are often under tight regulatory control and expressed in response to their substrates. Therefore, one way to identify pumps that function in antimicrobial resistance is to examine the transcriptional responses of efflux pump genes to antimicrobial shock. By conducting complete transcriptomic experiments following antimicrobial shock treatments, it may be possible to identify novel drug efflux pumps encoded in bacterial genomes. In this chapter we describe a complete workflow for conducting transcriptomic analyses by RNA sequencing to determine transcriptional changes in bacteria responding to antimicrobials.
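
    A minimal sketch of the downstream screening logic is shown below: flag pump genes whose normalized transcript counts rise sharply after antimicrobial shock. The gene names and counts are placeholders; a real analysis would test replicated counts with a framework such as DESeq2 or edgeR.

    ```python
    import numpy as np

    genes = ["pumpA", "pumpB", "pumpC"]            # hypothetical pump genes
    control = np.array([200.0, 50.0, 400.0])       # normalized counts, untreated
    shocked = np.array([210.0, 900.0, 430.0])      # normalized counts, post-shock

    log2fc = np.log2(shocked / control)
    for gene, fc in zip(genes, log2fc):
        if fc > 2.0:                               # >4-fold induction
            print(f"{gene}: candidate drug efflux pump (log2FC = {fc:.1f})")
    ```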

  20. A User Centered Innovation Approach Identifying Key User Values for the E-Newspaper

    OpenAIRE

    Carina Ihlström Eriksson; Jesper Svensson

    2009-01-01

    We have studied the pre-adoption phase of the e-newspaper, i.e. a newspaper published with e-paper technology. The research question of this article is: In what way can a user centered innovation process contribute to identifying key values in mobile innovations? The aim of this article is threefold: firstly, to identify key values for the e-newspaper, secondly, to examine the intention to adopt a new possible innovation and thirdly, to explore user centered design processes ability to captur...

  1. Particle Swarm Based Approach of a Real-Time Discrete Neural Identifier for Linear Induction Motors

    Directory of Open Access Journals (Sweden)

    Alma Y. Alanis

    2013-01-01

    Full Text Available This paper focuses on a discrete-time neural identifier applied to a linear induction motor (LIM) whose model is assumed to be unknown. This neural identifier is robust in the presence of external and internal uncertainties. The proposed scheme is based on a discrete-time recurrent high-order neural network (RHONN) trained with a novel algorithm based on the extended Kalman filter (EKF) and particle swarm optimization (PSO), using an online series-parallel configuration. Real-time results are included in order to illustrate the applicability of the proposed scheme.

  2. Identifying Ghanaian Pre-Service Teachers' Readiness for Computer Use: A Technology Acceptance Model Approach

    Science.gov (United States)

    Gyamfi, Stephen Adu

    2016-01-01

    This study extends the technology acceptance model to identify factors that influence technology acceptance among pre-service teachers in Ghana. Data from 380 usable questionnaires were tested against the research model. Utilising the extended technology acceptance model (TAM) as a research framework, the study found that: pre-service teachers'…

  3. An interdisciplinary approach to identify adaptation strategies that enhance flood resilience and urban liveability

    DEFF Research Database (Denmark)

    Rogers, B. C.; Bertram, N.; Gunn, Alex

    This paper provides guidance on how to identify and design the most suitable climate adaptation strategies for enhancing the liveability and flood resilience of urban catchments. It presents findings from a case study of Elwood, a coastal Melbourne suburb regularly affected by flooding. The resea...

  4. An integrated computational validation approach for potential novel miRNA prediction

    Directory of Open Access Journals (Sweden)

    Pooja Viswam

    2017-12-01

    Full Text Available MicroRNAs (miRNAs) are short, non-coding RNAs of 17-24 bp in length that regulate gene expression by targeting mRNA molecules. The regulatory functions of miRNAs are known to be majorly associated with disease phenotypes such as cancer, as well as with cell signaling, cell division, growth, and other metabolic processes. Novel miRNAs are defined as sequences which do not have any similarity with the existing known sequences and lack any experimental evidence. In recent decades, the advent of next-generation sequencing has allowed us to capture the small RNA molecules from cells and to develop methods to estimate their expression levels. Several computational algorithms are available to predict novel miRNAs from deep sequencing data. In this work, we integrated three novel miRNA prediction programs, miRDeep, miRanalyzer and miRPRo, to compare and validate their prediction efficiency. The dicer cleavage sites, alignment density, seed conservation, minimum free energy, AU-GC percentage, secondary loop scores, false discovery rates and confidence scores will be considered for comparison and evaluation. Efficiency in identifying isomiRs and base pair mismatches in a strand-specific manner will also be considered for the computational validation. Further, the criteria and parameters for the identification of the best possible novel miRNA with minimal false positive rates were deduced.
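
    One simple way to realize the integration step described above is to retain only candidates called by all three predictors; the sketch below illustrates this with placeholder candidate names.

    ```python
    mirdeep     = {"cand-1", "cand-2", "cand-5", "cand-9"}
    miranalyzer = {"cand-2", "cand-5", "cand-7"}
    mirpro      = {"cand-2", "cand-4", "cand-5"}

    # Candidates supported by all three programs form the high-confidence set.
    high_confidence = mirdeep & miranalyzer & mirpro
    print(sorted(high_confidence))   # ['cand-2', 'cand-5']
    ```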

  5. Optimizing and Validating a Brief Assessment for Identifying Children of Service Members at Risk for Psychological Health Problems Following Parent Deployment

    Science.gov (United States)

    2016-07-01

    Report documentation for the project "Optimizing and Validating a Brief Assessment for Identifying Children of Service Members at Risk for Psychological Health Problems Following Parent Deployment"; Principal Investigator: Julie Wargo Aikins, PhD; Contracting Organization: Wayne State... Cited references include Behar, L.B. (1997), The Preschool Behavior Questionnaire, Journal of Abnormal Child Psychology, 5, 265-275, and Journal of Family Therapy, 21, 313-323.

  6. Validation of rearrangement break points identified by paired-end sequencing in natural populations of Drosophila melanogaster.

    Science.gov (United States)

    Cridland, Julie M; Thornton, Kevin R

    2010-01-13

    Several recent studies have focused on the evolution of recently duplicated genes in Drosophila. Currently, however, little is known about the evolutionary forces acting upon duplications that are segregating in natural populations. We used a high-throughput, paired-end sequencing platform (Illumina) to identify structural variants in a population sample of African D. melanogaster. Polymerase chain reaction and sequencing confirmation of duplications detected by multiple, independent paired ends showed that paired-end sequencing reliably uncovered the break points of structural rearrangements and allowed us to identify a number of tandem duplications segregating within a natural population. Our confirmation experiments show that rates of confirmation are very high, even at modest coverage. Our results also compare well with previous studies using microarrays (Emerson J, Cardoso-Moreira M, Borevitz JO, Long M. 2008. Natural selection shapes genome wide patterns of copy-number polymorphism in Drosophila melanogaster. Science. 320:1629-1631; Dopman EB, Hartl DL. 2007. A portrait of copy-number polymorphism in Drosophila melanogaster. Proc Natl Acad Sci U S A. 104:19920-19925), both of which give us confidence in the results of this study and confirm previous microarray results. We were also able to identify whole-gene duplications, such as a novel duplication of Or22a, an olfactory receptor, and identify copy-number differences in genes previously known to be under positive selection, like Cyp6g1, which confers resistance to dichlorodiphenyltrichloroethane. Several "hot spots" of duplications were detected in this study, which indicate that particular regions of the genome may be more prone to generating duplications. Finally, population frequency analysis of confirmed events also showed an excess of rare variants in our population, which indicates that duplications segregating in the population may be deleterious and ultimately destined to be lost from the

  7. Identifying Onboarding Heuristics for Free-to-Play Mobile Games: A Mixed Methods Approach

    DEFF Research Database (Denmark)

    Thomsen, Line Ebdrup; Weigert Petersen, Falko; Mirza-Babaei, Pejman

    2016-01-01

    The onboarding phase of Free-to-Play mobile games, covering the first few minutes of play, typically sees a substantial retention rate amongst players. It is therefore crucial to the success of these games that the onboarding phase promotes engagement to the widest degree possible. In this paper ...... of puzzle games, base builders and arcade games, and utilize different onboarding phase design approaches. Results showcase how heuristics can be used to design engaging onboarding phases in mobile games....

  8. A Guttman-Based Approach to Identifying Cumulativeness Applied to Chimpanzee Culture

    OpenAIRE

    Graber, RB; de Cock, DR; Burton, ML

    2012-01-01

    Human culture appears to build on itself-that is, to be to some extent cumulative. Whether this property is shared by culture in the common chimpanzee is controversial. The question previously has been approached, qualitatively (and inconclusively), by debating whether any chimpanzee culture traits have resulted from individuals building on one another's work ("ratcheting"). The fact that the chimpanzees at different sites have distinctive repertoires of traits affords a different avenue of a...

  9. Provocative Endoscopy to Identify Bleeding Site in Upper Gastrointestinal Bleeding: A Novel Approach in Transarterial Embolization.

    Science.gov (United States)

    Kamo, Minobu; Fuwa, Sokun; Fukuda, Katsuyuki; Fujita, Yoshiyuki; Kurihara, Yasuyuki

    2016-07-01

    This report describes a novel approach to endoscopically induce bleeding by removing a clot from the bleeding site during angiography for upper gastrointestinal (UGI) hemorrhage. This procedure enabled accurate identification of the bleeding site, allowing for successful targeted embolization despite a negative initial angiogram. Provocative endoscopy may be a feasible and useful option for angiography of obscure bleeding sites in patients with UGI arterial hemorrhage. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.

  10. Optimization and experimental validation of a thermal cycle that maximizes entropy coefficient fisher identifiability for lithium iron phosphate cells

    Science.gov (United States)

    Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam

    2016-03-01

    This article presents a framework for optimizing the thermal cycle used to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We optimize a thermal cycle to maximize parameter identifiability for these cells. This optimization proceeds with respect to the coefficients of a Fourier discretization of the thermal cycle. Finally, we compare the estimated parameters using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared to 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
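
    To convey the identifiability idea, here is a minimal sketch under a deliberately simplified measurement model (an assumption for illustration, not the authors' model): with terminal voltage linear in the entropy coefficient, V_k = a + c*T_k + noise, the Fisher information of c grows with the centered temperature excursion of the cycle, so the optimizer favours cycles with large, well-distributed temperature swings.

    ```python
    import numpy as np

    sigma_v = 1e-4                          # voltage noise std (V), assumed
    t = np.linspace(0.0, 2 * 3600.0, 200)   # a 2-hour thermal cycle
    temperature = 25.0 + 10.0 * np.sin(2 * np.pi * t / t[-1])  # candidate (degC)

    # Fisher information of c (with the offset a profiled out) for a linear
    # model: I(c) = sum_k (T_k - mean(T))^2 / sigma^2; CRLB std = 1/sqrt(I).
    fisher_info = np.sum((temperature - temperature.mean()) ** 2) / sigma_v ** 2
    print(f"CRLB std of entropy coefficient: {1 / np.sqrt(fisher_info):.2e} V/degC")
    ```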

  11. An integrated approach to identify the origin of PM10 exceedances.

    Science.gov (United States)

    Amodio, M; Andriani, E; de Gennaro, G; Demarinis Loiotile, A; Di Gilio, A; Placentino, M C

    2012-09-01

    This study aimed to develop an integrated approach for the characterization of particulate matter (PM) pollution events in the South of Italy. PM(10) and PM(2.5) daily samples were collected from June to November 2008 at an urban background site located in Bari (Puglia Region, South of Italy). Meteorological data, particle size distributions and atmospheric dispersion conditions were also monitored in order to provide information concerning the different features of PM sources. The collected data suggested four indicators for characterizing different PM(10) exceedances: the PM(2.5)/PM(10) ratio, natural radioactivity, aerosol maps with back-trajectory analysis, and particle size distributions. These were considered in order to evaluate the contribution of local anthropogenic sources and to determine the different origins of intrusive air masses coming from long-range transport, such as African dust outbreaks and aerosol particles from Central and Eastern Europe. The obtained results were confirmed by applying principal component analysis to the particle number concentration dataset and by the chemical characterization of the samples (PM(10) and PM(2.5)). The integrated approach for PM study suggested in this paper can be useful to support air quality managers in the development of cost-effective control strategies and the application of more suitable risk management approaches.
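
    As an example of how the first indicator can be operationalized, the sketch below screens an exceedance day by its PM2.5/PM10 ratio; the 0.4 and 0.6 thresholds are illustrative choices, not values from the paper.

    ```python
    def classify_exceedance(pm25: float, pm10: float) -> str:
        """Rough origin screen for a PM10 exceedance from the fine-to-coarse ratio."""
        ratio = pm25 / pm10
        if ratio < 0.4:
            return "coarse-dominated (e.g., possible African dust outbreak)"
        if ratio > 0.6:
            return "fine-dominated (combustion / long-range anthropogenic aerosol)"
        return "mixed local and transported sources"

    print(classify_exceedance(pm25=18.0, pm10=62.0))
    ```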

  12. Surveying Libraries to Identify Best Practices for a Menu Approach for Library Instruction Requests

    Directory of Open Access Journals (Sweden)

    Candice Benjes-Small

    2009-11-01

    Full Text Available A challenging situation has developed in regard to library instruction. With increases in both the quantity of information and the variety of information technologies available to researchers, the information literacy landscape is getting more complex, while the time allotted for library instruction remains essentially the same. To market the breadth of content available for library instruction sessions and to promote collaboration between librarians and teaching faculty in creating optimal sessions, Radford University adopted an 'a la carte menu' approach to library instruction requests in 2004. Since the late 1990s a number of community colleges and universities have included some type of menu in their instruction request forms or documentation, and the authors wanted to understand what approach these institutions had taken and whether the menus were effective in marketing instruction and improving communication between library instructors and teaching faculty. They analyzed forty-seven adaptations of the menu available on the web and surveyed the librarians who created them. In this article the authors present the findings of the web analysis and the survey, and give recommendations for using the menu approach to library instruction requests.

  13. Identify too big to fail banks and capital insurance: An equilibrium approach

    Directory of Open Access Journals (Sweden)

    Katerina Ivanov

    2017-09-01

    Full Text Available The objective of this paper is to develop a rational expectation equilibrium model of capital insurance to identify too big to fail banks. The main results of this model include: (1) too big to fail banks can be identified explicitly by a systemic risk measure, loss betas, of all banks in the entire financial sector; (2) the too big to fail feature can be largely justified by a high level of loss beta; (3) the capital insurance proposal benefits market participants and reduces the systemic risk; (4) the implicit guarantee subsidy can be estimated endogenously; and lastly, (5) the capital insurance proposal can be used to resolve the moral hazard issue. We implement this model and document that the too big to fail issue has been considerably reduced in the pro-crisis period. As a result, the capital insurance proposal could be a useful macro-regulation innovation policy tool

  14. Astronomy and big data: a data clustering approach to identifying uncertain galaxy morphology

    CERN Document Server

    Edwards, Kieran Jay

    2014-01-01

    With the onset of massive cosmological data collection through media such as the Sloan Digital Sky Survey (SDSS), galaxy classification has been accomplished for the most part with the help of citizen science communities like Galaxy Zoo. Seeking the wisdom of the crowd for such Big Data processing has proved extremely beneficial. However, an analysis of one of the Galaxy Zoo morphological classification data sets has shown that a significant majority of all classified galaxies are labelled as "Uncertain". This book reports on how to use data mining, more specifically clustering, to identify galaxies that the public has classified with some degree of uncertainty as to whether they belong to one morphology type or another. The book shows the importance of transitions between different data mining techniques in an insightful workflow. It demonstrates that clustering enables the identification of discriminating features in the analysed data sets, adopting a novel feature selection algorithm called Incremental Feature Select...

  15. A GIS Approach to Identifying Socially and Medically Vulnerable Older Adult Populations in South Florida.

    Science.gov (United States)

    Hames, Elizabeth; Stoler, Justin; Emrich, Christopher T; Tewary, Sweta; Pandya, Naushira

    2017-11-10

    We define, map, and analyze geodemographic patterns of socially and medically vulnerable older adults within the tri-county region of South Florida. We apply principal components analysis (PCA) to a set of previously identified indicators of social and medical vulnerability at the census tract level. We create and map age-stratified vulnerability scores using a geographic information system (GIS), and use spatial analysis techniques to identify patterns and interactions between social and medical vulnerability. Key factors contributing to social vulnerability in areas with higher numbers of older adults include age, large household size, and Hispanic ethnicity. Medical vulnerability in these same areas is driven by disease burden, access to emergency cardiac services, availability of nursing home and hospice beds, access to home health care, and available mental health services. Age-dependent areas of social vulnerability emerge in Broward County, whereas age-dependent areas of medical vulnerability emerge in Palm Beach County. Older-adult social and medical vulnerability interact differently throughout the study area. Spatial analysis of older adult social and medical vulnerability using PCA and GIS can help identify age-dependent pockets of vulnerability that are not easily identifiable in a population-wide analysis; improve our understanding of the dynamic spatial organization of health care, health care needs, access to care, and outcomes; and ultimately serve as a tool for health care planning. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
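
    A minimal sketch of the scoring step follows: standardized tract-level indicators are reduced by PCA and a leading component is used as a summary vulnerability score, which would then be mapped in a GIS. The data here are random placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    indicators = rng.normal(size=(400, 8))    # 400 census tracts x 8 indicators

    scores = PCA(n_components=3).fit_transform(
        StandardScaler().fit_transform(indicators))
    vulnerability = scores[:, 0]              # first component as a summary score
    print(vulnerability[:5])
    ```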

  16. An International Approach to identify the root causes of Childhood Leukemia

    International Nuclear Information System (INIS)

    Laurier, Dominique

    2015-01-01

    Living near a nuclear site is one of the risk factors studied for childhood leukemia. The IRSN and BfS brought together scientists from several disciplines under the aegis of the European Association MelodiGLO to take stock of the existing epidemiological studies, their contributions and their limitations as well as to identify the analysis and research avenues to provide more robust answers. (author)

  17. Identifying predictors, moderators, and mediators of antidepressant response in major depressive disorder: neuroimaging approaches.

    Science.gov (United States)

    Phillips, Mary L; Chase, Henry W; Sheline, Yvette I; Etkin, Amit; Almeida, Jorge R C; Deckersbach, Thilo; Trivedi, Madhukar H

    2015-02-01

    Despite significant advances in neuroscience and treatment development, no widely accepted biomarkers are available to inform diagnostics or identify preferred treatments for individuals with major depressive disorder. In this critical review, the authors examine the extent to which multimodal neuroimaging techniques can identify biomarkers reflecting key pathophysiologic processes in depression and whether such biomarkers may act as predictors, moderators, and mediators of treatment response that might facilitate development of personalized treatments based on a better understanding of these processes. The authors first highlight the most consistent findings from neuroimaging studies using different techniques in depression, including structural and functional abnormalities in two parallel neural circuits: serotonergically modulated implicit emotion regulation circuitry, centered on the amygdala and different regions in the medial prefrontal cortex; and dopaminergically modulated reward neural circuitry, centered on the ventral striatum and medial prefrontal cortex. They then describe key findings from the relatively small number of studies indicating that specific measures of regional function and, to a lesser extent, structure in these neural circuits predict treatment response in depression. Limitations of existing studies include small sample sizes, use of only one neuroimaging modality, and a focus on identifying predictors rather than moderators and mediators of differential treatment response. By addressing these limitations and, most importantly, capitalizing on the benefits of multimodal neuroimaging, future studies can yield moderators and mediators of treatment response in depression to facilitate significant improvements in shorter- and longer-term clinical and functional outcomes.

  18. Identifying Predictors, Moderators, and Mediators of Antidepressant Response in Major Depressive Disorder: Neuroimaging Approaches

    Science.gov (United States)

    Phillips, Mary L.; Chase, Henry W.; Sheline, Yvette I.; Etkin, Amit; Almeida, Jorge R.C.; Deckersbach, Thilo; Trivedi, Madhukar H.

    2015-01-01

    Objective Despite significant advances in neuroscience and treatment development, no widely accepted biomarkers are available to inform diagnostics or identify preferred treatments for individuals with major depressive disorder. Method In this critical review, the authors examine the extent to which multimodal neuroimaging techniques can identify biomarkers reflecting key pathophysiologic processes in depression and whether such biomarkers may act as predictors, moderators, and mediators of treatment response that might facilitate development of personalized treatments based on a better understanding of these processes. Results The authors first highlight the most consistent findings from neuroimaging studies using different techniques in depression, including structural and functional abnormalities in two parallel neural circuits: serotonergically modulated implicit emotion regulation circuitry, centered on the amygdala and different regions in the medial prefrontal cortex; and dopaminergically modulated reward neural circuitry, centered on the ventral striatum and medial prefrontal cortex. They then describe key findings from the relatively small number of studies indicating that specific measures of regional function and, to a lesser extent, structure in these neural circuits predict treatment response in depression. Conclusions Limitations of existing studies include small sample sizes, use of only one neuroimaging modality, and a focus on identifying predictors rather than moderators and mediators of differential treatment response. By addressing these limitations and, most importantly, capitalizing on the benefits of multimodal neuroimaging, future studies can yield moderators and mediators of treatment response in depression to facilitate significant improvements in shorter- and longer-term clinical and functional outcomes. PMID:25640931

  19. A target based approach identifies genomic predictors of breast cancer patient response to chemotherapy

    Directory of Open Access Journals (Sweden)

    Hallett Robin M

    2012-05-01

    Full Text Available Abstract Background The efficacy of chemotherapy regimens in breast cancer patients is variable and unpredictable. Whether individual patients either achieve long-term remission or suffer recurrence after therapy may be dictated by intrinsic properties of their breast tumors, including genetic lesions and consequent aberrant transcriptional programs. Global gene expression profiling provides a powerful tool to identify such tumor-intrinsic transcriptional programs, whose analyses provide insight into the underlying biology of individual patient tumors. For example, multi-gene expression signatures have been identified that can predict the likelihood of disease recurrence, and thus guide patient prognosis. Whereas such prognostic signatures are being introduced in the clinical setting, similar signatures that predict sensitivity or resistance to chemotherapy are not currently clinically available. Methods We used gene expression profiling to identify genes that were co-expressed with genes whose transcripts encode the protein targets of commonly used chemotherapeutic agents. Results Here, we present target-based expression indices that predict breast tumor response to anthracycline and taxane based chemotherapy. Indeed, these signatures were independently predictive of chemotherapy response after adjusting for standard clinicopathological variables such as age, grade, and estrogen receptor status in a cohort of 488 breast cancer patients treated with adriamycin and taxotere/taxol. Conclusions Importantly, our findings suggest the practicality of developing target-based indices that predict response to therapeutics, as well as highlight the possibility of using gene signatures to guide the use of chemotherapy during treatment of breast cancer patients.

  20. Prospective validation of a predictive model that identifies homeless people at risk of re-presentation to the emergency department.

    Science.gov (United States)

    Moore, Gaye; Hepworth, Graham; Weiland, Tracey; Manias, Elizabeth; Gerdtz, Marie Frances; Kelaher, Margaret; Dunt, David

    2012-02-01

    To prospectively evaluate the accuracy of a predictive model to identify homeless people at risk of re-presentation to an emergency department. A prospective cohort analysis utilised one month of data from a Principal Referral Hospital in Melbourne, Australia. All visits involving people classified as homeless were included, excluding those who died. Homelessness was defined as living on the streets, in crisis accommodation, in boarding houses or residing in unstable housing. Rates of re-presentation, defined as the total number of visits to the same emergency department within 28 days of discharge from hospital, were measured. Performance of the risk screening tool was assessed by calculating sensitivity, specificity, positive and negative predictive values and likelihood ratios. Over the study period (April 1, 2009 to April 30, 2009), 3298 presentations from 2888 individuals were recorded. The homeless population accounted for 10% (n=327) of all visits and 7% (n=211) of all patients. A total of 90 (43%) homeless people re-presented to the emergency department. The predictive model included nine variables and achieved 98% (CI, 0.92-0.99) sensitivity and 66% (CI, 0.57-0.74) specificity. The positive predictive value was 68% and the negative predictive value was 98%. The positive likelihood ratio was 2.9 (CI, 2.2-3.7) and the negative likelihood ratio was 0.03 (CI, 0.01-0.13). The high emergency department re-presentation rate for people who were homeless identifies unresolved psychosocial health needs. The emergency department remains a vital access point for homeless people, particularly after hours. The risk screening tool is key to identifying medical and social aspects of a homeless patient's presentation to assist early identification and referral. Copyright © 2012 College of Emergency Nursing Australasia Ltd. Published by Elsevier Ltd. All rights reserved.
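
    The metrics reported above follow directly from a 2×2 screening table. As a quick illustration (in Python), here is a minimal sketch with hypothetical counts chosen to be consistent with the reported rates; it is not the study's data or code.

```python
# Illustrative 2x2 screening-tool evaluation (counts are hypothetical).
def screening_metrics(tp, fp, fn, tn):
    """Return the standard diagnostic accuracy metrics for a 2x2 table."""
    sensitivity = tp / (tp + fn)                # true positive rate
    specificity = tn / (tn + fp)                # true negative rate
    ppv = tp / (tp + fp)                        # positive predictive value
    npv = tn / (tn + fn)                        # negative predictive value
    lr_pos = sensitivity / (1 - specificity)    # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity    # negative likelihood ratio
    return dict(sensitivity=sensitivity, specificity=specificity,
                ppv=ppv, npv=npv, lr_pos=lr_pos, lr_neg=lr_neg)

# Made-up counts roughly reproducing the reported 98%/66%/68%/98% values:
print(screening_metrics(tp=88, fp=41, fn=2, tn=80))
```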

  1. A model-based design and validation approach with OMEGA-UML and the IF toolset

    Science.gov (United States)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent, embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies, supported by (1) high-level design concepts allowing designers to master the design complexity, (2) concepts for the expression of non-functional requirements, and (3) analysis tools allowing one to verify or refute that the system under development will be able to conform to its requirements. We illustrate such an approach for the design of complex embedded systems by means of a small case study used as a running example. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  2. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches

    Science.gov (United States)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With the increasing level of complexity and automation in automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Experimental validation of three different temperature model approaches is therefore carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, attention is paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.

  3. Innovative approach for in-vivo ablation validation on multimodal images

    Science.gov (United States)

    Shahin, O.; Karagkounis, G.; Carnegie, D.; Schlaefer, A.; Boctor, E.

    2014-03-01

    Radiofrequency ablation (RFA) is an important therapeutic procedure for small hepatic tumors. To make sure that the target tumor is effectively treated, RFA monitoring is essential. While several imaging modalities can observe the ablation procedure, it is not clear how ablated lesions on the images correspond to actual necroses. This uncertainty contributes to the high local recurrence rates (up to 55%) after radiofrequency ablative therapy. This study investigates a novel approach to correlate images of ablated lesions with actual necroses. We mapped both intraoperative images of the lesion and a slice through the actual necrosis in a common reference frame. An electromagnetic tracking system was used to accurately match lesion slices from different imaging modalities. To minimize the liver deformation effect, the tracking reference frame was defined inside the tissue by anchoring an electromagnetic sensor adjacent to the lesion. A validation test was performed using a phantom and proved that the end-to-end accuracy of the approach was within 2 mm. In an in-vivo experiment, intraoperative magnetic resonance imaging (MRI) and ultrasound (US) ablation images were correlated to gross and histopathology. The results indicate that the proposed method can accurately correlate in-vivo ablations on different modalities. Ultimately, this will improve the interpretation of the ablation monitoring and reduce the recurrence rates associated with RFA.
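
    Mapping slices from different modalities into one tracked reference frame is, at its core, a rigid point-based registration problem. The following Python sketch shows a generic Kabsch-style least-squares fit between corresponding fiducial points; it illustrates the underlying technique under simplified assumptions and is not the authors' implementation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Kabsch algorithm: least-squares rotation R and translation t such
    that dst ~ R @ src + t, for corresponding 3-D points (one per row)."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Illustrative fiducials in an ultrasound frame mapped to a tracker frame:
src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = rigid_transform(src, dst)
print(np.allclose(src @ R.T + t, dst))           # True
```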

  4. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    Science.gov (United States)

    David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera

    2017-04-01

    This study provides guidance that enables hydrological researchers to produce probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. Hydrological model residual errors are commonly heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.

  5. Integrated Pathway-Based Approach Identifies Association between Genomic Regions at CTCF and CACNB2 and Schizophrenia

    NARCIS (Netherlands)

    Juraeva, Dilafruz; Haenisch, Britta; Zapatka, Marc; Frank, Josef; Witt, Stephanie H.; Mühleisen, Thomas W.; Treutlein, Jens; Strohmaier, Jana; Meier, Sandra; Degenhardt, Franziska; Giegling, Ina; Ripke, Stephan; Leber, Markus; Lange, Christoph; Schulze, Thomas G.; Mössner, Rainald; Nenadic, Igor; Sauer, Heinrich; Rujescu, Dan; Maier, Wolfgang; Børglum, Anders; Ophoff, Roel; Cichon, Sven; Nöthen, Markus M.; Rietschel, Marcella; Mattheisen, Manuel; Brors, Benedikt; Kahn, René S.; Cahn, Wiepke; Linszen, Don H.; de Haan, Lieuwe; van Os, Jim; Krabbendam, Lydia; Myin-Germeys, Inez; Wiersma, Durk; Bruggeman, Richard; Mors, O.; Børglum, A. D.; Mortensen, P. B.; Pedersen, C. B.; Demontis, D.; Grove, J.; Mattheisen, M.; Hougaard, D. M.

    2014-01-01

    In the present study, an integrated hierarchical approach was applied to: (1) identify pathways associated with susceptibility to schizophrenia; (2) detect genes that may be potentially affected in these pathways since they contain an associated polymorphism; and (3) annotate the functional

  6. 78 FR 20672 - Literature Review Approach - Identifying Research Needs for Assessing Safe Use of High Intakes of...

    Science.gov (United States)

    2013-04-05

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Literature Review Approach... Needs for Assessing Safe Use of High Intakes of Folic Acid,'' for review of the pertinent literature... folate and folic acid, screening of the literature was undertaken to identify the potential adverse...

  7. Exome-first approach identified a novel gross deletion associated with Lowe syndrome.

    Science.gov (United States)

    Watanabe, Miki; Nakagawa, Ryuji; Kohmoto, Tomohiro; Naruto, Takuya; Suga, Ken-Ichi; Goji, Aya; Horikawa, Hideaki; Masuda, Kiyoshi; Kagami, Shoji; Imoto, Issei

    2016-01-01

    Lowe syndrome (LS) is an X-linked disorder affecting the eyes, nervous system and kidneys, typically caused by missense or nonsense/frameshift OCRL mutations. We report a 6-month-old male clinically suspected to have LS, but without the Fanconi-type renal dysfunction. Using a targeted-exome sequencing-first approach, LS was diagnosed by the identification of a deletion involving 1.7 Mb at Xq25-q26.1, encompassing the entire OCRL gene and neighboring loci.

  8. [The general methodological approaches identifying strategic positions in developing healthy lifestyle of population].

    Science.gov (United States)

    Dorofeev, S B; Babenko, A I

    2017-01-01

    The article analyses national and international publications concerning methodological aspects of elaborating a systematic approach to a healthy lifestyle of the population. This scope of inquiry plays a key role in the development of human capital. Costs related to a healthy lifestyle are to be considered a personal investment in future income through the physical increment of human capital. The definitions of a healthy lifestyle, its categories and supportive factors are to be considered in the process of developing healthy lifestyle strategies and programs. The implementation of particular strategies entails the application of comprehensive information and educational programs meant for various categories of population; therefore, different motivation techniques are to be considered for children, adolescents, the able-bodied population and the elderly. This approach should result in establishing particular responsibilities for national government, territorial administrations, health care administrations, employers and the population itself. The necessity of complex legislative measures is emphasized. Recent social hygienic studies have focused mostly on particular aspects of the development of a healthy lifestyle of the population; hence the demand for long-term exploration of organizational and functional models implementing medical preventive measures on the basis of comprehensive information analysis using statistical, sociological and professional expertise.

  9. Integrating modelling and phenotyping approaches to identify and screen complex traits - Illustration for transpiration efficiency in cereals.

    Science.gov (United States)

    Chenu, K; van Oosterom, E J; McLean, G; Deifel, K S; Fletcher, A; Geetika, G; Tirfessa, A; Mace, E S; Jordan, D R; Sulman, R; Hammer, G L

    2018-02-21

    Following advances in genetics, genomics, and phenotyping, trait selection in breeding is limited by our ability to understand interactions within the plants and with their environments, and to target traits of most relevance for the target population of environments. We propose an integrated approach that combines insights from crop modelling, physiology, genetics, and breeding to identify traits valuable for yield gain in the target population of environments, develop relevant high-throughput phenotyping platforms, and identify genetic controls and their values in production environments. This paper uses transpiration efficiency (biomass produced per unit of water used) as an example of a complex trait of interest to illustrate how the approach can guide modelling, phenotyping, and selection in a breeding program. We believe that this approach, by integrating insights from diverse disciplines, can increase the resource use efficiency of breeding programs for improving yield gains in target populations of environments.

  10. Identifying "social smoking" U.S. young adults using an empirically-driven approach.

    Science.gov (United States)

    Villanti, Andrea C; Johnson, Amanda L; Rath, Jessica M; Williams, Valerie; Vallone, Donna M; Abrams, David B; Hedeker, Donald; Mermelstein, Robin J

    2017-07-01

    The phenomenon of "social smoking" emerged in the past decade as an important area of research, largely due to its high prevalence in young adults. The purpose of this study was to identify classes of young adult ever smokers based on measures of social and contextual influences on tobacco use. Latent class models were developed using social smoking measures, and not the frequency or quantity of tobacco use. Data come from a national sample of young adult ever smokers aged 18-24 (Truth Initiative Young Adult Cohort Study, N=1564). The optimal models identified three latent classes: Class 1 - nonsmokers (52%); Class 2 - social smokers (18%); and Class 3 - smokers (30%). Nearly 60% of the "social smoker" class self-identified as a social smoker, 30% as an ex-smoker/tried smoking, and 12% as a non-smoker. The "social smoker" class was most likely to report using tobacco mainly or only with others. Past 30-day cigarette use was highest in the "smoker" class. Hookah use was highest in the "social smoker" class. Other tobacco and e-cigarette use was similar in the "social smoker" and "smoker" classes. Past 30-day tobacco and e-cigarette use was present for all products in the "non-smoker" class. Young adult social smokers emerge empirically as a sizable, distinct class from other smokers, even without accounting for tobacco use frequency or intensity. The prevalence of hookah use in "social smokers" indicates a group for which the social aspect of tobacco use could drive experimentation and progression to regular use. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Using a Systematic Approach to Identifying Organizational Factors in Root Cause Analysis

    International Nuclear Information System (INIS)

    Gallogly, Kay Wilde

    2011-01-01

    This presentation set the scene for the second discussion session. In her presentation, the author observed that: - Investigators do not see the connection between the analysis tools available and the identification of HOF. Most investigators use the tools in a cursory manner and so do not derive the full benefits of the tools. Some tools are used for presentation purposes as opposed to being used for analytical purposes e.g. event and causal factors charts. In some cases, the report will indicate that specific analytical tools were used in the investigation but the analysis is not in the body of the report. - Some investigators are documenting HOF causes but do not recognize them as such. This indicates a lack of understanding of HOF. - Other investigators focus on technical issues because of their own comfort level. - The culture of the Organisation will affect the depth of the investigation and therefore the use of the analytical tools to pursue HOF issues. - The author contends that if analysis tools are applied systematically to gather factually based data, then HOF issues can be identified. The use of factual information (without judgement and subjectivity) is important to maintain the credibility of the investigation especially when HOF issues are identified. - Systematic use of tools assists in better communication of the issues to foster greater understanding and acceptance by senior management. - Barrier Analysis, Change Analysis, and TWIN (Task Demands, Work Environment, Individual Capabilities, and Human Nature) all offer the opportunity to identify HOF issues if the analyst pursues this line of investigation. It was illustrated that many elements of the TWIN Error Precursors are themselves Organisational in nature. - The TWIN model applied to the Anatomy of an Event will help to distinguish those issues which are Organisational (Latent Organisational Weaknesses, Error Precursors and Flawed Defences) and those which are human factors (Active Errors).

  12. Domestic Violence and Pregnancy: A CBPR Coalition Approach to Identifying Needs and Informing Policy.

    Science.gov (United States)

    Bright, Candace Forbes; Bagley, Braden; Pulliam, Ivie; Newton, Amy Swetha

    2018-01-01

    Community engagement, the collaborative process of addressing issues that impact the well-being of a community, is a strategic effort to address community issues. The Gulf States Health Policy Center (GS-HPC) formed the Hattiesburg Area Health Coalition (HAHC) in November 2014 for the purpose of addressing policies impacting the health of Forrest and Lamar counties in Mississippi. To chronicle the community-based participatory research (CBPR) process used in HAHC's identification of infant and maternal health as a policy area, domestic violence in pregnancy as a priority area within infant and maternal health, and a community action plan (CAP) regarding this priority area. HAHC reviewed data and identified infant and maternal health as a priority area. They then conducted a policy scan of local prenatal health care to determine the policy area of domestic violence in pregnancy. HAHC developed a CAP identifying three goals with regard to domestic violence and pregnancy that together informed policy. Changes included the development of materials specific to resources available in the area. The materials and recommended changes will first be implemented by Southeast Mississippi Rural Health Initiative (SeMRHI) through a screening question for all pregnant patients, and the adoption of policies for providing information and referrals. The lack of community-level data was a challenge to HAHC in identifying focus and priority areas, but this was overcome by shared leadership and community engagement. After completion of the CAP, 100% of expectant mothers receiving prenatal care in the area will be screened for domestic violence.

  13. Using combined morphological, allometric and molecular approaches to identify species of the genus Raillietiella (Pentastomida).

    Directory of Open Access Journals (Sweden)

    Crystal Kelehear

    Full Text Available Taxonomic studies of parasites can be severely compromised if the host species affects parasite morphology; an uncritical analysis might recognize multiple taxa simply because of phenotypically plastic responses of parasite morphology to host physiology. Pentastomids of the genus Raillietiella are endoparasitic crustaceans primarily infecting the respiratory system of carnivorous reptiles, but also recorded from bufonid anurans. The delineation of pentastomids at the generic level is clear, but the taxonomic status of many species is not. We collected raillietiellids from lungs of the invasive cane toad (Rhinella marina), the invasive Asian house gecko (Hemidactylus frenatus), and a native tree frog (Litoria caerulea) in tropical Australia, and employed a combination of genetic analyses, and traditional and novel morphological methods to clarify their identity. Conventional analyses of parasite morphology (which focus on raw values of morphological traits) revealed two discrete clusters in terms of pentastome hook size, implying two different species of pentastomes: one from toads and a tree frog (Raillietiella indica) and another from lizards (Raillietiella frenatus). However, these clusters disappeared in allometric analyses that took pentastome body size into account, suggesting that only a single pentastome taxon may be involved. Our molecular data revealed no genetic differences between parasites in toads versus lizards, confirming that there was only one species: R. frenatus. This pentastome (previously known only from lizards) clearly is also capable of maturing in anurans. Our analyses show that the morphological features used in pentastomid taxonomy change as the parasite transitions through developmental stages in the definitive host. To facilitate valid descriptions of new species of pentastomes, future taxonomic work should include both morphological measurements (incorporating quantitative measures of body size and hook bluntness and
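
    The allometric step described above amounts to regressing log hook size on log body size and asking whether any grouping survives in the residuals. A minimal Python sketch with synthetic (made-up) data illustrates the idea; it is not the study's dataset or analysis code.

```python
import numpy as np

# Synthetic illustration: raw hook sizes form two apparent clusters that
# vanish once body size is accounted for (all values are invented).
rng = np.random.default_rng(2)
body = np.concatenate([rng.normal(4.0, 0.3, 30),     # parasites from one host
                       rng.normal(7.0, 0.5, 30)])    # parasites from a larger host
hook = 0.12 * body ** 0.8 * rng.lognormal(0, 0.05, 60)  # one shared allometry

# Allometric adjustment: regress log(hook) on log(body), inspect residuals.
slope, intercept = np.polyfit(np.log(body), np.log(hook), 1)
resid = np.log(hook) - (slope * np.log(body) + intercept)
print(f"allometric exponent ~ {slope:.2f}")
# Both groups centre near zero -> no size-independent separation:
print(f"residual means: {resid[:30].mean():+.3f} vs {resid[30:].mean():+.3f}")
```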

  14. Identifying the determinants of South Africa's extensive and intensive trade margins: A gravity model approach

    OpenAIRE

    Matthee, Marianne; Santana-Gallego, Maria

    2017-01-01

    Background: The significance of the paper is twofold. Firstly, it adds to the small but growing body of literature focusing on the decomposition of South Africa’s export growth. Secondly, it identifies the determinants of the intensive and extensive margins of South Africa’s exports – a topic that (as far as the authors are aware) has not been explored before. Aim: This paper aims to investigate a wide range of market access determinants that affect South Africa’s export growth along ...

  15. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  16. Development and validation of the Approach-Iron Skill Test for use in golf.

    Science.gov (United States)

    Robertson, Samuel John; Burnett, Angus F; Newton, Robert U

    2013-01-01

    The primary aim of this study was to develop and validate a golf-specific approach-iron test for use with elite and high-level amateur golfers. Elite (n=26) and high-level amateur (n=23) golfers were recruited for this study. The 'Approach-Iron Skill Test' requires players to hit a total of 27 shots. Specifically, three shots are hit at each of nine targets on a specially constructed driving range in a randomised order. A real-time launch monitor positioned behind the player, measured the carry distance for each of these shots. A scoring system was developed based on the percentage error index of each shot, meaning that 81 points was the maximum score possible (with a maximum of three points per shot). Two rounds of the test were performed. For both rounds of the test, elite-level golfers scored significantly higher than their high-level amateur counterparts (56.3 ± 5.6 and 58.5 ± 4.6 points versus 46.0 ± 6.3 and 46.1 ± 6.7 points, respectively) (P<0.05). For both elite and high-level players, 95% limits of agreement statistics also indicated that the test showed good test-retest reliability (2.1 ± 7.9 and 0.2 ± 10.8, respectively). Due to the clinimetric properties of the test, we conclude that the Approach-Iron Skill Test is suitable for further examination with the players examined in this study.
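
    The abstract specifies a maximum of three points per shot based on each shot's percentage error index (27 shots × 3 points = 81), but not the exact banding. The Python sketch below therefore uses hypothetical error bands purely to illustrate how such a scoring rule could be computed; the published test should be consulted for the real thresholds.

```python
def shot_points(target_m, carry_m):
    """Award 0-3 points for one shot from its percentage error.

    The 3-point maximum per shot comes from the abstract; the error
    bands below are hypothetical placeholders, since the abstract does
    not give the exact scoring bands.
    """
    error_pct = abs(carry_m - target_m) / target_m * 100
    if error_pct <= 5:
        return 3
    if error_pct <= 10:
        return 2
    if error_pct <= 20:
        return 1
    return 0

shots = [(120, 118), (150, 163), (90, 92)]   # (target, carry) in metres
print(sum(shot_points(t, c) for t, c in shots), "of", 3 * len(shots))
```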

  17. Validating emotional attention regulation as a component of emotional intelligence: A Stroop approach to individual differences in tuning in to and out of nonverbal cues.

    Science.gov (United States)

    Elfenbein, Hillary Anger; Jang, Daisung; Sharma, Sudeep; Sanchez-Burks, Jeffrey

    2017-03-01

    Emotional intelligence (EI) has captivated researchers and the public alike, but it has been challenging to establish its components as objective abilities. Self-report scales lack divergent validity from personality traits, and few ability tests have objectively correct answers. We adapt the Stroop task to introduce a new facet of EI called emotional attention regulation (EAR), which involves focusing emotion-related attention for the sake of information processing rather than for the sake of regulating one's own internal state. EAR includes 2 distinct components. First, tuning in to nonverbal cues involves identifying nonverbal cues while ignoring alternate content, that is, emotion recognition under conditions of distraction by competing stimuli. Second, tuning out of nonverbal cues involves ignoring nonverbal cues while identifying alternate content, that is, the ability to interrupt emotion recognition when needed to focus attention elsewhere. An auditory test of valence included positive and negative words spoken in positive and negative vocal tones. A visual test of approach-avoidance included green- and red-colored facial expressions depicting happiness and anger. The error rates for incongruent trials met the key criteria for establishing the validity of an EI test, in that the measure demonstrated test-retest reliability, convergent validity with other EI measures, divergent validity from factors such as general processing speed and, for the most part, personality, and predictive validity in this case for well-being. By demonstrating that facets of EI can be validly theorized and empirically assessed, results also speak to the validity of EI more generally. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. Identifying the greatest team and captain—A complex network approach to cricket matches

    Science.gov (United States)

    Mukherjee, Satyam

    2012-12-01

    We consider all Test matches played between 1877 and 2010 and One Day International (ODI) matches played between 1971 and 2010. We form directed and weighted networks of teams and also of their captains. The success of a team (or captain) is determined by the ‘quality’ of the wins, not simply by the number of wins. We apply the diffusion-based PageRank algorithm to the networks to assess the importance of the wins, and rank the respective teams and captains. Our analysis identifies Australia as the best team in both forms of cricket, Test and ODI. Steve Waugh is identified as the best captain in Test cricket and Ricky Ponting is the best captain in the ODI format. We also compare our ranking scheme with an existing ranking scheme, the Reliance ICC ranking. Our method does not depend on ‘external’ criteria in the ranking of teams (captains). The purpose of this paper is to introduce a revised ranking of cricket teams and to quantify the success of the captains.
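
    The ranking idea can be reproduced with standard network tools: build a directed, weighted graph in which each edge points from the losing team to the winning team (so that PageRank credit flows toward quality wins) and run PageRank. A minimal Python sketch with invented results follows; team names and weights are illustrative only.

```python
import networkx as nx

# Hypothetical head-to-head results: an edge A -> B, weighted by the
# number of such results, means team A lost to team B, so PageRank
# "credit" flows towards the winners (all values are made up).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("England", "Australia", 5),
    ("India", "Australia", 3),
    ("England", "India", 2),
    ("Australia", "India", 1),
])

ranks = nx.pagerank(G, alpha=0.85, weight="weight")
for team, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{team}: {score:.3f}")
```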

  19. A model-based approach for identifying signatures of ancient balancing selection in genetic data.

    Science.gov (United States)

    DeGiorgio, Michael; Lohmueller, Kirk E; Nielsen, Rasmus

    2014-08-01

    While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates.

  20. A model-based approach for identifying signatures of ancient balancing selection in genetic data.

    Directory of Open Access Journals (Sweden)

    Michael DeGiorgio

    2014-08-01

    Full Text Available While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates.

  1. Genetic Susceptibility to Vitiligo: GWAS Approaches for Identifying Vitiligo Susceptibility Genes and Loci

    Science.gov (United States)

    Shen, Changbing; Gao, Jing; Sheng, Yujun; Dou, Jinfa; Zhou, Fusheng; Zheng, Xiaodong; Ko, Randy; Tang, Xianfa; Zhu, Caihong; Yin, Xianyong; Sun, Liangdan; Cui, Yong; Zhang, Xuejun

    2016-01-01

    Vitiligo is an autoimmune disease with a strong genetic component, characterized by areas of depigmented skin resulting from loss of epidermal melanocytes. Genetic factors are known to play key roles in vitiligo through discoveries in association studies and family studies. Previously, vitiligo susceptibility genes were mainly revealed through linkage analysis and candidate gene studies. Recently, our understanding of the genetic basis of vitiligo has been rapidly advancing through genome-wide association study (GWAS). More than 40 robust susceptibility loci have been identified and confirmed to be associated with vitiligo by using GWAS. Most of these associated genes participate in important pathways involved in the pathogenesis of vitiligo. Many susceptibility loci with unknown functions in the pathogenesis of vitiligo have also been identified, indicating that additional molecular mechanisms may contribute to the risk of developing vitiligo. In this review, we summarize the key loci that are of genome-wide significance, which have been shown to influence vitiligo risk. These genetic loci may help build the foundation for genetic diagnosis and personalize treatment for patients with vitiligo in the future. However, substantial additional studies, including gene-targeted and functional studies, are required to confirm the causality of the genetic variants and their biological relevance in the development of vitiligo. PMID:26870082

  2. Validation of mercury tip-switch and accelerometer activity sensors for identifying resting and active behavior in bears

    Science.gov (United States)

    Jasmine Ware,; Rode, Karyn D.; Pagano, Anthony M.; Bromaghin, Jeffrey F.; Robbins, Charles T.; Joy Erlenbach,; Shannon Jensen,; Amy Cutting,; Nicole Nicassio-Hiskey,; Amy Hash,; Owen, Megan A.; Heiko Jansen,

    2015-01-01

    Activity sensors are often included in wildlife transmitters and can provide information on the behavior and activity patterns of animals remotely. However, interpreting activity-sensor data relative to animal behavior can be difficult if animals cannot be continuously observed. In this study, we examined the performance of a mercury tip-switch and a tri-axial accelerometer housed in collars to determine whether sensor data can be accurately classified as resting and active behaviors and whether data are comparable for the 2 sensor types. Five captive bears (3 polar [Ursus maritimus] and 2 brown [U. arctos horribilis]) were fitted with a collar specially designed to internally house the sensors. The bears’ behaviors were recorded, classified, and then compared with sensor readings. A separate tri-axial accelerometer that sampled continuously at a higher frequency and provided raw acceleration values from 3 axes was also mounted on the collar to compare with the lower resolution sensors. Both accelerometers more accurately identified resting and active behaviors at time intervals ranging from 1 minute to 1 hour (≥91.1% accuracy) compared with the mercury tip-switch (range = 75.5–86.3%). However, mercury tip-switch accuracy improved when sampled at longer intervals (e.g., 30–60 min). Data from the lower resolution accelerometer, but not the mercury tip-switch, accurately predicted the percentage of time spent resting during an hour. Although the number of bears available for this study was small, our results suggest that these activity sensors can remotely identify resting versus active behaviors across most time intervals. We recommend that investigators consider both study objectives and the variation in accuracy of classifying resting and active behaviors reported here when determining sampling interval.
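
    Although the exact classification procedure is not given in the abstract, interval-based activity classification from a tri-axial accelerometer is typically a matter of summarising movement within each interval and thresholding it. A hedged Python sketch follows, with illustrative sampling rate and threshold values that would need calibration against observed behaviour.

```python
import numpy as np

def classify_intervals(ax, ay, az, fs_hz, interval_s, threshold):
    """Label each interval 'resting' or 'active' from tri-axial acceleration.

    Uses the standard deviation of the acceleration magnitude within
    each interval as a simple movement index; the threshold is an
    illustrative placeholder, not a published value.
    """
    mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    n = int(fs_hz * interval_s)
    labels = []
    for start in range(0, len(mag) - n + 1, n):
        movement = mag[start:start + n].std()
        labels.append("active" if movement > threshold else "resting")
    return labels

# Example: 10 minutes of synthetic 1 Hz data, classified per minute.
rng = np.random.default_rng(0)
ax = rng.normal(0, 0.02, 600)
ay = rng.normal(0, 0.02, 600)
az = 1 + rng.normal(0, 0.02, 600)        # gravity on one axis at rest
print(classify_intervals(ax, ay, az, fs_hz=1, interval_s=60, threshold=0.1))
```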

  3. Validity of High School Physics Module With Character Values Using Process Skill Approach In STKIP PGRI West Sumatera

    Science.gov (United States)

    Anaperta, M.; Helendra, H.; Zulva, R.

    2018-04-01

    This study aims to describe the validity of a physics module with character-oriented values using a process skills approach for dynamic electricity material in high school (SMA/MA) and vocational school (SMK) physics. The research is of the development type. The module development follows the model proposed by Plomp, which consists of (1) a preliminary research phase, (2) a prototyping phase, and (3) an assessment phase; this study covers the preliminary investigation and design phases. Data on validity were collected through observation and questionnaires. In the preliminary investigation phase, curriculum analysis, student analysis, and concept analysis were conducted. In the design phase, module designs were produced for SMA/MA and SMK subjects on dynamic electricity material, followed by formative evaluation comprising self-evaluation and prototyping (expert reviews, one-to-one, and small-group evaluations), at which stage the validation was carried out. The research data were obtained through a module validation sheet, the results of which show that the module is valid.

  4. A Systematic Approach to Identify Candidate Transcription Factors that Control Cell Identity

    Directory of Open Access Journals (Sweden)

    Ana C. D’Alessio

    2015-11-01

    Full Text Available Hundreds of transcription factors (TFs) are expressed in each cell type, but cell identity can be induced through the activity of just a small number of core TFs. Systematic identification of these core TFs for a wide variety of cell types is currently lacking and would establish a foundation for understanding the transcriptional control of cell identity in development, disease, and cell-based therapy. Here, we describe a computational approach that generates an atlas of candidate core TFs for a broad spectrum of human cells. The potential impact of the atlas was demonstrated via cellular reprogramming efforts where candidate core TFs proved capable of converting human fibroblasts to retinal pigment epithelial-like cells. These results suggest that candidate core TFs from the atlas will prove a useful starting point for studying transcriptional control of cell identity and reprogramming in many human cell types.

  5. Identifying the optimal supply temperature in district heating networks - A modelling approach

    DEFF Research Database (Denmark)

    Mohammadi, Soma; Bojesen, Carsten

    2014-01-01

    The aim of this study is to develop a model for the thermo-hydraulic calculation of low-temperature DH systems. The modelling is performed with emphasis on transient heat transfer in pipe networks. A pseudo-dynamic approach is adopted to model the district heating network (DHN) behaviour, which estimates the temperature dynamically while the flow and pressure are calculated on the basis of steady-state conditions. The implicit finite element method is applied to simulate the transient temperature behaviour in the network. Pipe network heat losses, the pressure drop in the network and the return temperature to the plant are calculated in the developed model. The model will eventually serve as a basis for finding the optimal supply temperature in an existing DHN in later work. The modelling results are used as decision support for existing DHNs, proposing possible modifications to operate at the optimal supply temperature.
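
    For the heat-loss part of such a model, the steady-state temperature drop along a pipe follows an exponential decay of the water-to-ground temperature difference. A minimal Python sketch of that textbook relation is given below; the coefficient and flow values are illustrative, and the study's full model additionally treats transient behaviour with an implicit finite element scheme.

```python
import numpy as np

def pipe_outlet_temperature(t_in, t_ground, u_per_m, m_dot, cp, length_m):
    """Steady-state water temperature at the end of a buried pipe segment.

    Classic exponential decay of the water-to-ground temperature
    difference: T(L) = Tg + (Tin - Tg) * exp(-U' L / (m_dot * cp)),
    with U' the heat-loss coefficient per metre of pipe (W/(m*K)).
    All numbers below are illustrative, not the study's network data.
    """
    return t_ground + (t_in - t_ground) * np.exp(
        -u_per_m * length_m / (m_dot * cp))

# 55 degC supply, 8 degC ground, 0.3 W/(m*K), 2 kg/s of water over 2 km:
print(f"{pipe_outlet_temperature(55.0, 8.0, 0.3, 2.0, 4186.0, 2000.0):.1f} degC")
```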

  6. Identifying Silence Climate in Organizations in the Framework of Contemporary Management Approaches

    Directory of Open Access Journals (Sweden)

    Mustafa Emre Civelek

    2015-10-01

    Full Text Available Dynamic competition conditions today mean that businesses face varied problems with each passing day. At this point, current management approaches include studies that shed light on the new problems of businesses. Organizational Silence, a concept that has recently begun to be voiced in the business world, has come up in this context. Organizational silence can be described as the behavior of employees keeping silent about certain negativities in an organization for various reasons. Since knowledge sharing in modern organizations is of capital importance for responding rapidly to changes in a competitive environment, the spread of this behavior to the organization's culture and climate presents a threat of indifference. In this study, the concept of Organizational Silence is defined and the effects of a perceived silence climate on the management of organizations are discussed.

  7. Identifying Silence Climate in Organizations in the Framework of Contemporary Management Approaches

    Directory of Open Access Journals (Sweden)

    Mustafa Emre Civelek

    2015-12-01

    Full Text Available Dynamic competition conditions today mean that businesses face varied problems with each passing day. At this point, current management approaches include studies that shed light on the new problems of businesses. Organizational Silence, a concept that has recently begun to be voiced in the business world, has come up in this context. Organizational silence can be described as the behavior of employees keeping silent about certain negativities in an organization for various reasons. Since knowledge sharing in modern organizations is of capital importance for responding rapidly to changes in a competitive environment, the spread of this behavior to the organization's culture and climate presents a threat of indifference. In this study, the concept of Organizational Silence is defined and the effects of a perceived silence climate on the management of organizations are discussed.

  8. Utility of the Conners' Adult ADHD Rating Scale validity scales in identifying simulated attention-deficit hyperactivity disorder and random responding.

    Science.gov (United States)

    Walls, Brittany D; Wallace, Elizabeth R; Brothers, Stacey L; Berry, David T R

    2017-12-01

    Recent concern about malingered self-report of symptoms of attention-deficit hyperactivity disorder (ADHD) in college students has resulted in an urgent need for scales that can detect feigning of this disorder. The present study provided further validation data for a recently developed validity scale for the Conners' Adult ADHD Rating Scale (CAARS), the CAARS Infrequency Index (CII), as well as for the Inconsistency Index (INC). The sample included 139 undergraduate students: 21 individuals with diagnoses of ADHD, 29 individuals responding honestly, 54 individuals responding randomly (full or half), and 35 individuals instructed to feign. Overall, the INC showed moderate sensitivity to random responding (.44-.63) and fairly high specificity to ADHD (.86-.91). The CII demonstrated modest sensitivity to feigning (.31-.46) and excellent specificity to ADHD (.91-.95). Sequential application of validity scales had correct classification rates of honest (93.1%), ADHD (81.0%), feigning (57.1%), half random (42.3%), and full random (92.9%). The present study suggests that the CII is modestly sensitive (true positive rate) to feigned ADHD symptoms, and highly specific (true negative rate) to ADHD. Additionally, this study highlights the utility of applying the CAARS validity scales in a sequential manner for identifying feigning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
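
    The sequential application of validity scales described above can be pictured as a simple decision cascade: screen for inconsistent (random) responding first, then for infrequent (feigned) symptom endorsement. The Python sketch below uses placeholder cutoffs only; the published CAARS Inconsistency Index and CII thresholds must be used in practice.

```python
def classify_protocol(inc_score, cii_score, inc_cutoff=8, cii_cutoff=4):
    """Sequential validity screen: inconsistency first, then infrequency.

    Cutoff values here are placeholders for illustration only; the
    published CAARS INC and CII thresholds should be used in practice.
    """
    if inc_score >= inc_cutoff:          # step 1: random responding?
        return "possible random responding"
    if cii_score >= cii_cutoff:          # step 2: symptom over-reporting?
        return "possible feigning"
    return "no validity concerns"

print(classify_protocol(inc_score=3, cii_score=5))   # -> possible feigning
```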

  9. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modeling heteroscedastic residual errors

    Science.gov (United States)

    McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George

    2017-03-01

    Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find the choice of heteroscedastic error modeling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
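
    The fixed-λ Box-Cox schemes at the heart of this comparison are straightforward to apply: residuals are computed on the transformed scale, where their variance is approximately constant across flow magnitudes. A minimal Python sketch follows, with made-up flow values; λ = 0 reduces to the log scheme and requires strictly positive flows (zero flows need λ > 0 or an offset).

```python
import numpy as np

def boxcox(q, lam):
    """Box-Cox transform; lam=0 gives the log scheme."""
    q = np.asarray(q, dtype=float)
    if lam == 0.0:
        return np.log(q)                 # requires strictly positive flows
    return (q**lam - 1.0) / lam

# Residual errors are modelled on the transformed scale, where their
# variance is (approximately) constant across flow magnitudes:
q_obs = np.array([0.5, 2.0, 10.0, 50.0])   # observed daily flows (made up)
q_sim = np.array([0.6, 1.8, 12.0, 45.0])   # simulated flows (made up)
for lam in (0.2, 0.5):                      # the Pareto optimal fixed values
    eta = boxcox(q_obs, lam) - boxcox(q_sim, lam)
    print(f"lambda={lam}: transformed residuals {np.round(eta, 3)}")
```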

  10. Identifying dominant controls on hydrologic parameter transfer from gauged to ungauged catchments: a comparative hydrology approach

    Science.gov (United States)

    Singh, R.; Archfield, S.A.; Wagener, T.

    2014-01-01

    Daily streamflow information is critical for solving various hydrologic problems, though observations of continuous streamflow for model calibration are available at only a small fraction of the world’s rivers. One approach to estimate daily streamflow at an ungauged location is to transfer rainfall–runoff model parameters calibrated at a gauged (donor) catchment to an ungauged (receiver) catchment of interest. Central to this approach is the selection of a hydrologically similar donor. No single metric or set of metrics of hydrologic similarity has been demonstrated to consistently select a suitable donor catchment. We design an experiment to diagnose the dominant controls on successful hydrologic model parameter transfer. We calibrate a lumped rainfall–runoff model to 83 stream gauges across the United States. All locations are USGS reference gauges with minimal human influence. Parameter sets from the calibrated models are then transferred to each of the other catchments and the performance of the transferred parameters is assessed. This transfer experiment is carried out both at the scale of the entire US and for six geographic regions. We use classification and regression tree (CART) analysis to determine the relationship between catchment similarity and performance of transferred parameters. Similarity is defined using physical/climatic catchment characteristics, as well as streamflow response characteristics (signatures such as baseflow index and runoff ratio). Across the entire US, successful parameter transfer is governed by similarity in elevation and climate, and high similarity in streamflow signatures. Controls vary for different geographic regions, though: geology, followed by drainage, topography and climate, constitutes the dominant set of similarity metrics in forested eastern mountains and plateaus, whereas agricultural land use relates most strongly with successful parameter transfer in the humid plains.
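
    The CART step relates donor-receiver similarity measures to transfer performance. As a hedged illustration of that workflow (not the study's data or feature set), the Python sketch below fits a shallow regression tree to synthetic similarity features and prints its decision rules.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# Hypothetical donor-receiver pairs: each row holds similarity measures
# (absolute differences in elevation, aridity, baseflow index) and the
# observed performance of the transferred parameters (all values made up).
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 3))
# Performance degrades mainly with climate and signature dissimilarity:
y = 1.0 - 0.5 * X[:, 1] - 0.4 * X[:, 2] + rng.normal(0, 0.05, 200)

tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["d_elev", "d_aridity", "d_bfi"]))
```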

  11. Identifying seminal papers in the Australasian Journal on Ageing 1982-2011: a Delphi consensus approach.

    Science.gov (United States)

    Parkinson, Lynne; Richardson, Kristy; Sims, Jane; Wells, Yvonne; Naganathan, Vasi; Brooke, Elizabeth; Lindley, Richard

    2013-10-01

    The aim of this study was to identify seminal Australasian Journal on Ageing papers published over 30 years through a Delphi consensus process. The main data collection was a three-round Delphi consensus study with 38 past and current members of the Australasian Journal on Ageing Editorial Board, Editorial Team and Management Committee. Three papers were agreed as top-ranking. One of the top-ranking articles was also highly cited. One article was published in the 1990s, two in 2001. While it is difficult to judge how well the top-ranking papers represent seminal papers arising over 30 years, these papers do represent three different research strengths in Australasia, span three different disciplines, and reflect some of the diversity that characterises ageing research in Australasia over this period. © 2013 ACOTA.

  12. An improved approach to identify irradiated dog feed by electron paramagnetic resonance study and thermoluminescence measurements

    Energy Technology Data Exchange (ETDEWEB)

    Sanyal, Bhaskar, E-mail: bhaskar_sanyal@rediffmail.co [Food Technology Division, Bhabha Atomic Research Centre, Mumbai-400 085 (India); Chawla, S.P.; Sharma, Arun [Food Technology Division, Bhabha Atomic Research Centre, Mumbai-400 085 (India)

    2011-05-15

    In the present study, probably for the first time, a detailed analysis of the radiation induced radical species and thermoluminescence measurements of irradiated dog feed are reported. The EPR spectrum of non-irradiated ready-to-eat dog feed was characterized by a singlet at g = 2.0047 ± 0.0003. Irradiated samples exhibited a complex EPR spectrum. During high power (50.0 mW) EPR spectroscopy, a visible change in the shape of the EPR spectrum was observed and characterized by EPR spectrum simulation technique. An axially symmetric anisotropic signal with g∥ = 2.0028 and g⊥ = 1.9976 was identified. However, a negligible change in the matrix of irradiated edible dog chew was observed using EPR spectroscopy. Therefore, thermoluminescence study of the isolated minerals from dog chew was carried out. The composition of the poly-minerals was studied using SEM and EDX analysis and a complete verdict on identification of irradiation is proposed.

  13. An improved approach to identify irradiated dog feed by electron paramagnetic resonance study and thermoluminescence measurements

    International Nuclear Information System (INIS)

    Sanyal, Bhaskar; Chawla, S.P.; Sharma, Arun

    2011-01-01

    In the present study, probably for the first time, a detailed analysis of the radiation induced radical species and thermoluminescence measurements of irradiated dog feed are reported. The EPR spectrum of non-irradiated ready-to-eat dog feed was characterized by a singlet at g = 2.0047 ± 0.0003. Irradiated samples exhibited a complex EPR spectrum. During high power (50.0 mW) EPR spectroscopy, a visible change in the shape of the EPR spectrum was observed and characterized by EPR spectrum simulation technique. An axially symmetric anisotropic signal with g∥ = 2.0028 and g⊥ = 1.9976 was identified. However, a negligible change in the matrix of irradiated edible dog chew was observed using EPR spectroscopy. Therefore, thermoluminescence study of the isolated minerals from dog chew was carried out. The composition of the poly-minerals was studied using SEM and EDX analysis and a complete verdict on identification of irradiation is proposed.

  14. One Health approach to identify research needs in bovine and human babesioses: workshop report

    Directory of Open Access Journals (Sweden)

    McElwain Terry F

    2010-04-01

    Full Text Available Abstract Background Babesia are emerging health threats to humans and animals in the United States. A collaborative effort of multiple disciplines to attain optimal health for people, animals and our environment, otherwise known as the One Health concept, was taken during a research workshop held in April 2009 to identify gaps in scientific knowledge regarding babesioses. The impetus for this analysis was the increased risk for outbreaks of bovine babesiosis, also known as Texas cattle fever, associated with the re-infestation of the U.S. by cattle fever ticks. Results The involvement of wildlife in the ecology of cattle fever ticks jeopardizes the ability of state and federal agencies to keep the national herd free of Texas cattle fever. Similarly, there has been a progressive increase in the number of cases of human babesiosis over the past 25 years due to an increase in the white-tailed deer population. Human babesiosis due to cattle-associated Babesia divergens and Babesia divergens-like organisms have begun to appear in residents of the United States. Research needs for human and bovine babesioses were identified and are presented herein. Conclusions The translation of this research is expected to provide veterinary and public health systems with the tools to mitigate the impact of bovine and human babesioses. However, economic, political, and social commitments are urgently required, including increased national funding for animal and human Babesia research, to prevent the re-establishment of cattle fever ticks and the increasing problem of human babesiosis in the United States.

  15. TU-H-206-01: An Automated Approach for Identifying Geometric Distortions in Gamma Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Mann, S; Nelson, J [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Samei, E [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-06-15

    Purpose: To develop a clinically-deployable, automated process for detecting artifacts in routine nuclear medicine (NM) quality assurance (QA) bar phantom images. Methods: An artifact detection algorithm was created to analyze bar phantom images as part of an ongoing QA program. A low noise, high resolution reference image was acquired from an x-ray of the bar phantom with a Philips Digital Diagnost system utilizing image stitching. NM bar images, acquired for 5 million counts over a 512×512 matrix, were registered to the template image by maximizing mutual information (MI). The MI index was used as an initial test for artifacts; low values indicate an overall presence of distortions regardless of their spatial location. Images with low MI scores were further analyzed for bar linearity, periodicity, alignment, and compression to locate differences with respect to the template. Findings from each test were spatially correlated and locations failing multiple tests were flagged as potential artifacts requiring additional visual analysis. The algorithm was initially deployed for GE Discovery 670 and Infinia Hawkeye gamma cameras. Results: The algorithm successfully identified clinically relevant artifacts from both systems previously unnoticed by technologists performing the QA. Average MI indices for artifact-free images are 0.55. Images with MI indices < 0.50 have shown 100% sensitivity and specificity for artifact detection when compared with a thorough visual analysis. Correlation of geometric tests confirms the ability to spatially locate the most likely image regions containing an artifact regardless of initial phantom orientation. Conclusion: The algorithm shows the potential to detect gamma camera artifacts that may be missed by routine technologist inspections. Detection and subsequent correction of artifacts ensures maximum image quality and may help to identify failing hardware before it impacts clinical workflow. Going forward, the algorithm is being
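
    The initial artifact test rests on computing a mutual information index between the acquired bar image and the registered template. A generic Python sketch of histogram-based mutual information is shown below; it illustrates the quantity involved, though its values (in nats) are not calibrated to the study's 0.55/0.50 index scale.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """MI between two equally sized images, from their joint histogram."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()                    # joint probability
    px = pxy.sum(axis=1, keepdims=True)        # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Illustrative check: a distorted copy scores lower than an exact copy.
rng = np.random.default_rng(0)
template = rng.random((512, 512))
print(mutual_information(template, template))                      # high
print(mutual_information(template, np.roll(template, 40, axis=0)))  # low
```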

  16. A molecular systems approach to modelling human skin pigmentation: identifying underlying pathways and critical components.

    Science.gov (United States)

    Raghunath, Arathi; Sambarey, Awanti; Sharma, Neha; Mahadevan, Usha; Chandra, Nagasuma

    2015-04-29

    Ultraviolet radiation (UV) serves as an environmental stress for human skin and results in melanogenesis, with the pigment melanin having protective effects against UV induced damage. This involves a dynamic and complex regulation of various biological processes that results in the expression of melanin in the outermost layers of the epidermis, where it can exert its protective effect. A comprehensive understanding of the underlying cross talk among different signalling molecules and cell types is only possible through a systems perspective. Increasing incidences of both melanoma and non-melanoma skin cancers underscore the need to better comprehend UV mediated effects on skin pigmentation at a systems level, so as to ultimately evolve knowledge-based strategies for efficient protection and prevention of skin diseases. A network model for UV-mediated skin pigmentation in the epidermis was constructed and subjected to shortest path analysis. Virtual knock-outs were carried out to identify essential signalling components. We describe a network model for UV-mediated skin pigmentation in the epidermis. The model consists of 265 components (nodes) and 429 directed interactions among them, capturing the manner in which one component influences the other and channels information. Through shortest path analysis, we identify novel signalling pathways relevant to pigmentation. Virtual knock-outs or perturbations of specific nodes in the network have led to the identification of alternate modes of signalling as well as enabled determining essential nodes in the process. The model presented provides a comprehensive picture of UV mediated signalling manifesting in human skin pigmentation. A systems perspective helps provide a holistic purview of interconnections and complexity in the processes leading to pigmentation. The model described here is extensive yet amenable to expansion as new data is gathered. Through this study, we provide a list of important proteins essential
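
    Shortest-path analysis and virtual knock-outs of this kind map naturally onto standard graph operations: remove a node and test whether the stimulus can still reach the output. The Python sketch below uses a toy signalling chain with illustrative node names, not the 265-node model from the study.

```python
import networkx as nx

# Toy directed signalling network (node names are illustrative only).
G = nx.DiGraph([
    ("UV", "p53"), ("p53", "POMC"), ("POMC", "alpha-MSH"),
    ("alpha-MSH", "MC1R"), ("MC1R", "MITF"), ("MITF", "TYR"),
    ("UV", "NO"), ("NO", "MITF"),
])

def virtual_knockout(graph, node, source="UV", target="TYR"):
    """Does removing `node` disconnect the stimulus from the output?"""
    g = graph.copy()
    g.remove_node(node)
    return not nx.has_path(g, source, target)

print(nx.shortest_path(G, "UV", "TYR"))          # shortest signalling route
essential = [n for n in G if n not in ("UV", "TYR") and virtual_knockout(G, n)]
print("essential nodes:", essential)             # -> ['MITF'] in this toy net
```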

  17. An open simulation approach to identify chances and limitations for vulnerable road user (VRU) active safety.

    Science.gov (United States)

    Seiniger, Patrick; Bartels, Oliver; Pastor, Claus; Wisch, Marcus

    2013-01-01

    It is commonly agreed that active safety will have a significant impact on reducing accident figures for pedestrians and probably also bicyclists. However, the chances and limitations of active safety systems have so far been derived only from accident data and the current state of the art, using proprietary simulation models. The objective of this article is to investigate these chances and limitations by developing an open simulation model. This article introduces a simulation model incorporating accident kinematics, driving dynamics, driver reaction times, pedestrian dynamics, performance parameters of different autonomous emergency braking (AEB) generations, as well as legal and logical limitations. The level of detail in available pedestrian accident data is limited. Relevant variables, especially the timing of the pedestrian's appearance and the pedestrian's moving speed, are estimated using assumptions. The model in this article uses the fact that a pedestrian and a vehicle in an accident must have been in the same spot at the same time and defines the impact position as a relevant accident parameter, which is usually available from accident data. The calculations done within the model identify the possible timing available for braking by an AEB system, as well as the possible speed reduction, for different accident scenarios and system configurations. The simulation model identifies the lateral impact position of the pedestrian as a significant parameter for system performance, and the system layout is designed to brake when the accident becomes unavoidable by the vehicle driver. Scenarios with a pedestrian running from behind an obstruction are the most demanding and will very likely never be avoidable for all vehicle speeds due to physical limits. Scenarios with an unobstructed walking person will very likely be treatable over a wide speed range for next-generation AEB systems.
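
    A toy kinematic sketch of the braking calculation such a model performs: given the time budget before impact and an assumed system delay and deceleration, estimate the residual impact speed. All parameter values are illustrative assumptions, not the article's.

    ```python
    def speed_at_impact(v0, t_available, t_delay=0.3, decel=9.0):
        """Residual speed (m/s) after braking for the usable part of the time budget."""
        t_braking = max(0.0, t_available - t_delay)
        return max(0.0, v0 - decel * t_braking)

    v0 = 50 / 3.6                      # initial speed: 50 km/h in m/s
    for ttc in (0.5, 1.0, 1.5):        # time between AEB trigger and impact (s)
        v = speed_at_impact(v0, ttc)
        print(f"trigger {ttc:.1f} s before impact: residual speed {v * 3.6:.0f} km/h")
    ```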

  18. PREDICT-PD: An online approach to prospectively identify risk indicators of Parkinson's disease.

    Science.gov (United States)

    Noyce, Alastair J; R'Bibo, Lea; Peress, Luisa; Bestwick, Jonathan P; Adams-Carr, Kerala L; Mencacci, Niccolo E; Hawkes, Christopher H; Masters, Joseph M; Wood, Nicholas; Hardy, John; Giovannoni, Gavin; Lees, Andrew J; Schrag, Anette

    2017-02-01

    A number of early features can precede the diagnosis of Parkinson's disease (PD). To test an online, evidence-based algorithm to identify risk indicators of PD in the UK population. Participants aged 60 to 80 years without PD completed an online survey and keyboard-tapping task annually over 3 years, and underwent smell tests and genotyping for glucocerebrosidase (GBA) and leucine-rich repeat kinase 2 (LRRK2) mutations. Risk scores were calculated based on the results of a systematic review of risk factors and early features of PD, and individuals were grouped into higher (above the 85th centile), medium, and lower (below the 15th centile) risk groups. Previously defined indicators of increased risk of PD ("intermediate markers"), including smell loss, rapid eye movement-sleep behavior disorder, and finger-tapping speed, and incident PD were used as outcomes. The correlation of risk scores with intermediate markers and the movement of individuals between risk groups were assessed each year and prospectively. Exploratory Cox regression analyses with incident PD as the dependent variable were performed. A total of 1323 participants were recruited at baseline and >79% completed assessments each year. Annual risk scores were correlated with intermediate markers of PD each year, and baseline scores were correlated with intermediate markers during follow-up (all P values < 0.001). Incident PD diagnoses during follow-up were significantly associated with baseline risk score (hazard ratio = 4.39, P = .045). GBA variants or G2019S LRRK2 mutations were found in 47 participants, and the predictive power for incident PD was improved by the addition of genetic variants to risk scores. The online PREDICT-PD algorithm is a unique and simple method to identify indicators of PD risk. © 2017 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of the International Parkinson and Movement Disorder Society.
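
    A sketch of the exploratory survival analysis described above, fitting a Cox model of incident PD on baseline risk score with the lifelines package; the data frame and its column names are synthetic placeholders, not the PREDICT-PD data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(3)
    n = 1323
    df = pd.DataFrame({
        "risk_score": rng.normal(0.0, 1.0, n),        # baseline risk score
        "followup_years": rng.uniform(0.5, 3.0, n),   # time under observation
        "incident_pd": rng.binomial(1, 0.05, n),      # 1 = diagnosed during follow-up
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="incident_pd")
    cph.print_summary()   # the hazard ratio for risk_score is exp(coef)
    ```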

  19. Validating the UNICEF/Washington Group Child Functioning Module for Fijian schools to identify seeing, hearing and walking difficulties.

    Science.gov (United States)

    Sprunt, Beth; Hoq, Monsurul; Sharma, Umesh; Marella, Manjula

    2017-09-20

    This study investigated the seeing, hearing and walking questions of the UNICEF/Washington Group Child Functioning Module and the inter-rater reliability between teachers and parents as proxy respondents. Cross-sectional diagnostic accuracy study, two-gate design with representative sampling, comparing Module responses to reference standard assessments for 472 primary-aged students in Fiji. Receiver operating characteristic curves were constructed to determine the area under the curve and optimal cut-off points. Areas under the curves ranged from 0.823 to 0.889, indicating "good" diagnostic accuracy. Inter-rater reliability between parent and teacher responses was "good" to "excellent". The optimal cut-off determined by the Youden Index was "some difficulty"; however, a wide spread of impairment levels was found in this category, with most children having either no impairment or substantial impairment. The diagnostic accuracy of the Module seeing, hearing and walking questions appears acceptable with either parents or teachers as proxy respondents. For education systems, use of the cut-off "some difficulty" with accompanying clinical assessment may be important to capture children who require services and learning supports and to avoid potentially misleading categorization. Given the high proportion of the sample drawn from special schools, further research is required to test the Module in mainstream schools. Implications for rehabilitation: Identification of children who are at risk of disability in Fiji is important to enable planning, monitoring and evaluating access to quality inclusive education. The UNICEF/Washington Group Child Functioning Module appears to be a practical and effective tool that can be used by teachers to identify children at risk of disability. Children identified on the UNICEF/Washington Group Child Functioning Module as having "some difficulty" or higher levels of difficulty in relation to vision, hearing or walking should be referred for further assessment
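
    A minimal sketch of choosing a cut-off by maximizing the Youden Index from a ROC curve, as in the study, using scikit-learn; the labels and scores below are synthetic.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, 200)               # reference standard (impairment yes/no)
    scores = y_true * 1.2 + rng.normal(0, 1, 200)  # proxy-reported difficulty scores

    fpr, tpr, thresholds = roc_curve(y_true, scores)
    print("AUC:", auc(fpr, tpr))

    youden_j = tpr - fpr   # Youden Index J = sensitivity + specificity - 1
    print("optimal cut-off (max Youden J):", thresholds[np.argmax(youden_j)])
    ```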

  20. COSMID: A Web-based Tool for Identifying and Validating CRISPR/Cas Off-target Sites

    Directory of Open Access Journals (Sweden)

    Thomas J Cradick

    2014-01-01

    Precise genome editing using engineered nucleases can significantly facilitate biological studies and disease treatment. In particular, clustered regularly interspaced short palindromic repeats (CRISPR) with CRISPR-associated (Cas) proteins are a potentially powerful tool for modifying a genome by targeted cleavage of DNA sequences complementary to designed guide strand RNAs. Although CRISPR/Cas systems can have on-target cleavage rates close to the transfection rates, they may also have relatively high off-target cleavage at similar genomic sites that contain one or more base pair mismatches, insertions, or deletions relative to the guide strand. We have developed a bioinformatics-based tool, COSMID (CRISPR Off-target Sites with Mismatches, Insertions, and Deletions), that searches genomes for potential off-target sites (http://crispr.bme.gatech.edu). Based on the user-supplied guide strand and input parameters, COSMID identifies potential off-target sites with the specified number of mismatched bases and insertions or deletions when compared with the guide strand. For each site, amplification primers optimal for the chosen application are also given as output. This ranked list of potential off-target sites assists the choice and evaluation of intended target sites, thus helping the design of CRISPR/Cas systems with minimal off-target effects, as well as the identification and quantification of CRISPR/Cas-induced off-target cleavage in cells.
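
    A naive sketch of the kind of mismatch-tolerant genome scan COSMID performs; the real tool also handles insertions, deletions, PAM rules and primer design, whereas this toy version counts substitutions only, on an invented sequence.

    ```python
    def off_target_sites(genome, guide, max_mismatches=3):
        """Yield (position, site, mismatches) for near-matches of the guide."""
        k = len(guide)
        for i in range(len(genome) - k + 1):
            window = genome[i:i + k]
            mismatches = sum(a != b for a, b in zip(window, guide))
            if mismatches <= max_mismatches:
                yield i, window, mismatches

    genome = "ACGTACGTTGACCTGAAACGTACGATGACCTG"
    guide = "ACGTACGT"
    for pos, site, mm in off_target_sites(genome, guide, max_mismatches=2):
        print(pos, site, mm)
    ```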

  1. A novel approach to sequence validating protein expression clones with automated decision making

    Directory of Open Access Journals (Sweden)

    Mohr Stephanie E

    2007-06-01

    Background: Whereas the molecular assembly of protein expression clones is readily automated and routinely accomplished in high throughput, sequence verification of these clones is still largely performed manually, an arduous and time-consuming process. The ultimate goal of validation is to determine if a given plasmid clone matches its reference sequence sufficiently to be "acceptable" for use in protein expression experiments. Given the accelerating increase in availability of tens of thousands of unverified clones, there is a strong demand for rapid, efficient and accurate software that automates clone validation. Results: We have developed an Automated Clone Evaluation (ACE) system – the first comprehensive, multi-platform, web-based plasmid sequence verification software package. ACE automates the clone verification process by defining each clone sequence as a list of multidimensional discrepancy objects, each describing a difference between the clone and its expected sequence, including the resulting polypeptide consequences. To evaluate clones automatically, this list can be compared against user acceptance criteria that specify the allowable number of discrepancies of each type. This strategy allows users to re-evaluate the same set of clones against different acceptance criteria as needed for use in other experiments. ACE manages the entire sequence validation process, including contig management, identifying and annotating discrepancies, determining if discrepancies correspond to polymorphisms, and clone finishing. Designed to manage thousands of clones simultaneously, ACE maintains a relational database to store information about clones at various completion stages, project processing parameters and acceptance criteria. In a direct comparison, the automated analysis by ACE took less time and was more accurate than a manual analysis of a 93-gene clone set. Conclusion: ACE was designed to facilitate high throughput clone sequence
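
    A hypothetical sketch of ACE's core idea as described above: represent a clone as a list of discrepancy objects and accept or reject it against user acceptance criteria. The class and field names are invented for illustration, not taken from ACE itself.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Discrepancy:
        kind: str       # e.g. "silent", "missense", "frameshift"
        position: int   # coordinate within the clone

    def acceptable(discrepancies, criteria):
        """criteria maps each discrepancy kind to its maximum allowed count."""
        counts = {}
        for d in discrepancies:
            counts[d.kind] = counts.get(d.kind, 0) + 1
        return all(counts.get(kind, 0) <= limit for kind, limit in criteria.items())

    clone = [Discrepancy("silent", 112), Discrepancy("missense", 241)]
    criteria = {"silent": 3, "missense": 1, "frameshift": 0}
    print(acceptable(clone, criteria))   # True under these acceptance criteria
    ```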

  2. A macroepigenetic approach to identify factors responsible for the autism epidemic in the United States

    Directory of Open Access Journals (Sweden)

    Dufault Renee

    2012-04-01

    The number of children ages 6 to 21 in the United States receiving special education services under the autism disability category increased 91% between 2005 and 2010, while the number of children receiving special education services overall declined by 5%. The demand for special education services continues to rise in disability categories associated with pervasive developmental disorders. Neurodevelopment can be adversely impacted when gene expression is altered by dietary transcription factors, such as zinc insufficiency or deficiency, or by exposure to toxic substances found in our environment, such as mercury or organophosphate pesticides. Gene expression patterns differ geographically between populations and within populations. Gene variants of paraoxonase-1 are associated with autism in North America, but not in Italy, indicating regional specificity in gene-environment interactions. In the current review, we utilize a novel macroepigenetic approach to compare variations in diet and toxic substance exposure between these two geographical populations to determine the likely factors responsible for the autism epidemic in the United States.

  3. Biomechanical approaches to identify and quantify injury mechanisms and risk factors in women's artistic gymnastics.

    Science.gov (United States)

    Bradshaw, Elizabeth J; Hume, Patria A

    2012-09-01

    Targeted injury prevention strategies, based on biomechanical analyses, have the potential to help reduce the incidence and severity of gymnastics injuries. This review outlines the potential benefits of biomechanics research to contribute to injury prevention strategies for women's artistic gymnastics by identification of mechanisms of injury and quantification of the effects of injury risk factors. One hundred and twenty-three articles were retained for review after searching electronic databases using key words, including 'gymnastic', 'biomech*', and 'inj*', and delimiting by language and relevance to the paper's aim. Impact load can be measured biomechanically by the use of instrumented equipment (e.g. beatboard), instrumentation on the gymnast (accelerometers), or by landings on force plates. We need further information on injury mechanisms and risk factors in gymnastics, and practical methods of monitoring training loads. We have not yet shown, beyond a theoretical approach, how biomechanical analysis of gymnastics can help reduce injury risk through injury prevention interventions. Given the high magnitude of impact load, both acute and accumulative, coaches should monitor impact loads per training session, taking into consideration training quality and quantity, such as the control of rotation and the height from which the landings are executed.

  4. Identifying endogenous neural stem cells in the adult brain in vitro and in vivo: novel approaches.

    Science.gov (United States)

    Rueger, Maria Adele; Androutsellis-Theotokis, Andreas

    2013-01-01

    In the 1960s, Joseph Altman reported that the adult mammalian brain is capable of generating new neurons. Today it is understood that some of these neurons are derived from uncommitted cells in the subventricular zone lining the lateral ventricles, and the dentate gyrus of the hippocampus. The first area generates new neuroblasts that migrate to the olfactory bulb, whereas hippocampal neurogenesis seems to play roles in particular types of learning and memory. Some of these uncommitted (immature) cells are able to divide, and their progeny can generate all three major cell types of the nervous system: neurons, astrocytes, and oligodendrocytes; these properties define such cells as neural stem cells. Although the roles of these cells are not yet clear, it is accepted that they affect functions including olfaction and learning/memory. Experiments with insults to the central nervous system also show that neural stem cells are quickly mobilized after injury and in various disorders, proliferating and migrating to injury sites. This suggests a role of endogenous neural stem cells in disease. New pools of stem cells are being discovered, suggesting an even more important role for these cells. To understand these cells and to coax them to contribute to tissue repair, it would be very useful to be able to image them in the living organism. Here we discuss advances in imaging approaches as well as new concepts that emerge from stem cell biology, with emphasis on the interface between imaging and stem cells.

  5. An Integrated Bioinformatics and Computational Biology Approach Identifies New BH3-Only Protein Candidates.

    Science.gov (United States)

    Hawley, Robert G; Chen, Yuzhong; Riz, Irene; Zeng, Chen

    2012-05-04

    In this study, we utilized an integrated bioinformatics and computational biology approach in search of new BH3-only proteins belonging to the BCL2 family of apoptotic regulators. The BH3 (BCL2 homology 3) domain mediates specific binding interactions among various BCL2 family members. It is composed of an amphipathic α-helical region of approximately 13 residues that has only a few amino acids that are highly conserved across all members. Using a generalized motif, we performed a genome-wide search for novel BH3-containing proteins in the NCBI Consensus Coding Sequence (CCDS) database. In addition to known pro-apoptotic BH3-only proteins, 197 proteins were recovered that satisfied the search criteria. These were categorized according to α-helical content and predictive binding to BCL-xL (encoded by BCL2L1) and MCL-1, two representative anti-apoptotic BCL2 family members, using position-specific scoring matrix models. Notably, the list is enriched for proteins associated with autophagy as well as a broad spectrum of cellular stress responses such as endoplasmic reticulum stress, oxidative stress, antiviral defense, and the DNA damage response. Several potential novel BH3-containing proteins are highlighted. In particular, the analysis strongly suggests that the apoptosis inhibitor and DNA damage response regulator, AVEN, which was originally isolated as a BCL-xL-interacting protein, is a functional BH3-only protein representing a distinct subclass of BCL2 family members.
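
    A toy sketch of scanning a protein sequence with a position-specific scoring matrix, the scoring approach named in the abstract; the matrix here is random and the sequence invented, so the scores carry no biological meaning and serve only to show the mechanics.

    ```python
    import numpy as np

    residues = "ACDEFGHIKLMNPQRSTVWY"
    rng = np.random.default_rng(2)
    pssm = rng.normal(0, 1, (13, 20))   # placeholder 13-residue BH3-like model

    def window_score(window):
        return sum(pssm[i, residues.index(aa)] for i, aa in enumerate(window))

    protein = "MKTLLDIAQEIGRRLQEAGDELDSLV"
    best = max((window_score(protein[i:i + 13]), i) for i in range(len(protein) - 12))
    print("best window score %.2f at position %d" % best)
    ```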

  6. How to Identify the Most Powerful Node in Complex Networks? A Novel Entropy Centrality Approach

    Directory of Open Access Journals (Sweden)

    Tong Qiao

    2017-11-01

    Centrality is one of the most studied concepts in network analysis. Although an abundance of methods for measuring centrality in social networks has been proposed, each approach characterizes only a limited part of what it implies for an actor to be "vital" to the network. In this paper, a novel mechanism is proposed to quantitatively measure centrality using a re-defined entropy centrality model, which is based on decompositions of a graph into subgraphs and analysis of the entropy of neighbor nodes. By design, the re-defined entropy centrality, which describes associations among node pairs and captures the process of influence propagation, can be interpreted as a measure of actor potential for communication activity. We evaluate the efficiency of the proposed model using four real-world datasets with varied sizes and densities and three artificial networks constructed by models including Barabasi-Albert, Erdos-Renyi and Watts-Strogatz. The four datasets are Zachary's karate club, USAir97, the Collaboration network and the Email network URV, respectively. Extensive experimental results prove the effectiveness of the proposed method.
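
    A sketch inspired by, but not identical to, the re-defined entropy centrality: it scores each node by the Shannon entropy of its neighbours' degrees, demonstrated on Zachary's karate club, one of the paper's datasets.

    ```python
    import math
    import networkx as nx

    def neighbour_degree_entropy(G, node):
        """Shannon entropy of the degree distribution over a node's neighbours."""
        degrees = [G.degree(n) for n in G.neighbors(node)]
        total = sum(degrees)
        if total == 0:
            return 0.0
        probs = [d / total for d in degrees]
        return -sum(p * math.log2(p) for p in probs if p > 0)

    G = nx.karate_club_graph()   # Zachary's karate club, as used in the paper
    scores = {n: neighbour_degree_entropy(G, n) for n in G.nodes}
    print(sorted(scores, key=scores.get, reverse=True)[:5])   # five top-ranked nodes
    ```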

  7. A Novel Entropy-Based Centrality Approach for Identifying Vital Nodes in Weighted Networks

    Directory of Open Access Journals (Sweden)

    Tong Qiao

    2018-04-01

    Measuring centrality has recently attracted increasing attention, with algorithms ranging from those that simply calculate the number of immediate neighbors and the shortest paths to those that are complicated iterative refinement processes and objective dynamical approaches. Indeed, vital node identification allows us to understand the roles that different nodes play in the structure of a network. However, quantifying centrality in complex networks with various topological structures is not an easy task. In this paper, we introduce a novel definition of entropy-based centrality, which is applicable to weighted directed networks. By design, the total power of a node is divided into two parts: its local power and its indirect power. The local power can be obtained by integrating the structural entropy, which reveals the communication activity and popularity of each node, and the interaction frequency entropy, which indicates its accessibility. In addition, the process of influence propagation can be captured by two-hop subnetworks, resulting in the indirect power. In order to evaluate the performance of the entropy-based centrality, we use four weighted real-world networks with various instance sizes, degree distributions, and densities. These networks are adolescent health, Bible, United States (US) airports, and Hep-th, respectively. Extensive analytical results demonstrate that the entropy-based centrality outperforms degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality.

  8. Experimental approaches to identify cellular G-quadruplex structures and functions.

    Science.gov (United States)

    Di Antonio, Marco; Rodriguez, Raphaël; Balasubramanian, Shankar

    2012-05-01

    Guanine-rich nucleic acids can fold into non-canonical DNA secondary structures called G-quadruplexes. The formation of these structures can interfere with the biology that is crucial to sustain cellular homeostasis and metabolism via mechanisms that include transcription, translation, splicing, telomere maintenance and DNA recombination. Thus, due to their implication in several biological processes and their possible role in promoting genomic instability, G-quadruplex-forming sequences have emerged as potential therapeutic targets. There has been a growing interest in the development of synthetic molecules and biomolecules for sensing G-quadruplex structures in cellular DNA. In this review, we summarise and discuss recent methods developed for cellular imaging of G-quadruplexes, and the application of experimental genomic approaches to detect G-quadruplexes throughout genomic DNA. In particular, we discuss the use of engineered small molecules and natural proteins to enable pull-down, ChIP-Seq, ChIP-chip and fluorescence imaging of G-quadruplex structures in cellular DNA. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. A bioinformatics approach for identifying transgene insertion sites using whole genome sequencing data.

    Science.gov (United States)

    Park, Doori; Park, Su-Hyun; Ban, Yong Wook; Kim, Youn Shic; Park, Kyoung-Cheul; Kim, Nam-Soo; Kim, Ju-Kon; Choi, Ik-Young

    2017-08-15

    Genetically modified crops (GM crops) have been developed to improve the agricultural traits of modern crop cultivars. Safety assessments of GM crops are of paramount importance in research at developmental stages and before releasing transgenic plants into the marketplace. Sequencing technology is developing rapidly, with higher output and labor efficiencies, and will eventually replace existing methods for the molecular characterization of genetically modified organisms. To detect the transgene insertion locations in three GM rice genomes, Illumina sequencing reads were mapped to both the rice genome and the plasmid sequence; reads mapping to both were then used to characterize the junction sites between plant and transgene sequences by sequence alignment. Herein, we present a next generation sequencing (NGS)-based molecular characterization method, using transgenic rice plants SNU-Bt9-5, SNU-Bt9-30, and SNU-Bt9-109. Specifically, using bioinformatics tools, we detected the precise insertion locations and copy numbers of transfer DNA, genetic rearrangements, and the absence of backbone sequences, results which were equivalent to those obtained from Southern blot analyses. NGS methods have been suggested as an effective means of characterizing and detecting transgene insertion locations in genomes. Our results demonstrate the use of a combination of NGS technology and bioinformatics approaches that offers cost- and time-effective methods for assessing the safety of transgenic plants.
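
    A toy sketch of the junction-finding idea described above: a read that aligns partly to the host genome and partly to the plasmid (T-DNA) spans an insertion junction. Real pipelines work on BAM alignments from a read mapper; this version uses exact substring matching on invented sequences.

    ```python
    def spans_junction(read, genome, plasmid, min_part=8):
        """True if the read splits into a genome-matching and a plasmid-matching part."""
        for split in range(min_part, len(read) - min_part + 1):
            left, right = read[:split], read[split:]
            if (left in genome and right in plasmid) or (left in plasmid and right in genome):
                return True
        return False

    genome = "TTGACCTGAAACGTACGATG"
    plasmid = "GGCCAATTCCGGAATT"
    read = "AAACGTACGATG" + "GGCCAATT"   # genome-derived part + T-DNA-derived part
    print(spans_junction(read, genome, plasmid))   # True: the read crosses the junction
    ```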

  10. Identifying determinants of medication adherence following myocardial infarction using the Theoretical Domains Framework and the Health Action Process Approach.

    Science.gov (United States)

    Presseau, Justin; Schwalm, J D; Grimshaw, Jeremy M; Witteman, Holly O; Natarajan, Madhu K; Linklater, Stefanie; Sullivan, Katrina; Ivers, Noah M

    2017-10-01

    Despite evidence-based recommendations, adherence with secondary prevention medications post-myocardial infarction (MI) remains low. Taking medication requires behaviour change, and using behavioural theories to identify what factors determine adherence could help to develop novel adherence interventions. Objective: compare the utility of different behaviour theory-based approaches for identifying modifiable determinants of medication adherence post-MI that could be targeted by interventions. Two studies were conducted with patients 0-2, 3-12, 13-24 or 25-36 weeks post-MI. Study 1: 24 patients were interviewed about barriers and facilitators to medication adherence. Interviews were conducted and coded using the Theoretical Domains Framework. Study 2: 201 patients answered a telephone questionnaire assessing Health Action Process Approach constructs to predict intention and medication adherence (MMAS-8). Study 1: domains identified: Beliefs about Consequences, Memory/Attention/Decision Processes, Behavioural Regulation, Social Influences and Social Identity. Study 2: 64, 59, 42 and 58% reported high adherence at 0-2, 3-12, 13-24 and 25-36 weeks, respectively. Social Support and Action Planning predicted adherence at all time points, though the relationship between Action Planning and adherence decreased over time. Using two behaviour theory-based approaches provided complementary findings and identified modifiable factors that could be targeted to help translate intention into action to improve medication adherence post-MI.

  11. Cross-species multiple environmental stress responses: An integrated approach to identify candidate genes for multiple stress tolerance in sorghum (Sorghum bicolor (L.) Moench) and related model species.

    Directory of Open Access Journals (Sweden)

    Adugna Abdi Woldesemayat

    associated with different traits that are responsive to multiple stresses. Ontology mapping was used to validate the identified genes, while reconstruction of the phylogenetic tree was instrumental in inferring the evolutionary relationship of the sorghum orthologs. The results also show specific genes responsible for various interrelated components of the drought response mechanism, such as drought tolerance, drought avoidance and drought escape. We submit that this approach is novel and, to our knowledge, has not been used previously in any other research; it enables us to perform cross-species queries for genes that are likely to be associated with multiple stress tolerance, as a means to identify novel targets for engineering stress resistance in sorghum and possibly in other crop species.

  12. Post Mortem Validation of MRI-Identified Veins on the Surface of the Cerebral Cortex as Potential Landmarks for Neurosurgery

    Directory of Open Access Journals (Sweden)

    Günther Grabner

    2017-06-01

    Background and Objective: Image-guided neurosurgery uses information from a wide spectrum of methods to inform the neurosurgeon's judgement about which tissue to resect and which to spare. Imaging data are registered to the patient's skull so that they correspond to the intraoperative macro- and microscopic view. The correspondence between imaging and optical systems breaks down during surgery, however, as a result of cerebro-spinal fluid drainage, tissue resection, and gravity-based brain shift. In this work we investigate whether a map of surface veins, automatically segmented from MRI, could serve as an additional reference system. Methods: Gradient-echo based T2*-weighted imaging was performed on two human cadaver heads using a 7 Tesla MRI scanner. Automatic vessel segmentation was performed using the Frangi vesselness filter, and surface renderings of vessels were compared with photographs of the surface of the brain following craniotomy. Results: A high level of correspondence was established between vessel maps and the post-autopsy photographs. Corresponding veins, including the prominent superior anastomotic veins, could be identified in all brain lobes. Conclusion: Automatic surface vessel segmentation is feasible, and the high correspondence to post-autopsy photographs indicates that such vein maps could be used as an additional reference system for image-guided neurosurgery in order to maintain the correspondence between imaging and optical systems. This has the advantage over a skull-based reference system that veins are clearly visible to the surgeon and move and deform with the underlying tissue, potentially making this surface net of landmarks robust to brain shift.
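
    A minimal sketch of the vessel-enhancement step using the Frangi vesselness filter as implemented in scikit-image; the study applied it to 3D T2*-weighted data, whereas this example runs on a placeholder 2D array, and the thresholding rule is an assumption for illustration.

    ```python
    import numpy as np
    from skimage.filters import frangi

    image = np.random.rand(128, 128)   # placeholder for a T2*-weighted slice
    vesselness = frangi(image)         # high response along tubular structures
    mask = vesselness > vesselness.mean() + 2 * vesselness.std()   # crude threshold
    print(mask.sum(), "candidate vessel pixels")
    ```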

  13. Proteomics approach to identify unique xylem sap proteins in Pierce's disease-tolerant Vitis species.

    Science.gov (United States)

    Basha, Sheikh M; Mazhar, Hifza; Vasanthaiah, Hemanth K N

    2010-03-01

    Pierce's disease (PD) is a destructive bacterial disease of grapes caused by Xylella fastidiosa, which is xylem-confined. The tolerance level to this disease varies among Vitis species. Our research was aimed at identifying unique xylem sap proteins present in PD-tolerant Vitis species. The results showed wide variation in xylem sap protein composition, where a set of polypeptides with pI between 4.5 and 4.7 and M(r) of 31 kDa was present in abundance in muscadine (Vitis rotundifolia, PD-tolerant), at reduced levels in Florida hybrid bunch (Vitis spp., PD-tolerant), and absent in bunch grapes (Vitis vinifera, PD-susceptible). Liquid chromatography/mass spectrometry/mass spectrometry analysis of these proteins revealed their similarity to beta-1,3-glucanase, peroxidase, and a subunit of oxygen-evolving enhancer protein 1, which are known to play a role in defense and oxygen generation. In addition, the amount of free amino acids and soluble sugars was found to be significantly lower in the xylem sap of muscadine genotypes compared to V. vinifera genotypes, indicating that the higher nutritional value of bunch grape sap may be more suitable for Xylella growth. These data suggest that the presence of these unique proteins in xylem sap is vital for PD tolerance in muscadine and Florida hybrid bunch grapes.

  14. Ab initio Thermodynamic Approach to Identify Mixed Solid Sorbents for CO2 Capture Technology

    Directory of Open Access Journals (Sweden)

    Yuhua Duan

    2015-10-01

    Because the current technologies for capturing CO2 are still too energy intensive, new materials must be developed that can capture CO2 reversibly with acceptable energy costs. At a given CO2 pressure, the turnover temperature (Tt) of the capture reaction of an individual solid is fixed. Such a Tt may be outside the operating temperature range (ΔTo) for a practical capture technology. To adjust Tt to fit the practical ΔTo, in this study, three mixing schemes are explored by combining thermodynamic database mining with first-principles density functional theory and phonon lattice dynamics calculations. Our calculated results demonstrate that by mixing different types of solids, it is possible to shift Tt into the range of practical operating temperature conditions. According to the requirements imposed by the pre- and post-combustion technologies, and based on our calculated thermodynamic properties for the CO2 capture reactions by the mixed solids of interest, we were able to identify the mixing ratios of two or more solids to form new sorbent materials for which lower capture energy costs are expected at the desired pressure and temperature conditions.
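
    A back-of-the-envelope sketch of the turnover-temperature idea: Tt is the temperature at which the free energy of the capture reaction crosses zero, dG(T) = dH - T*dS = 0, so Tt = dH/dS. The enthalpy and entropy values below are illustrative assumptions, not the paper's computed data.

    ```python
    def turnover_temperature(dH_kJ_per_mol, dS_J_per_mol_K):
        """Tt in kelvin, from dG(T) = dH - T*dS = 0."""
        return dH_kJ_per_mol * 1000.0 / dS_J_per_mol_K

    # Illustrative values for a strong and a weak CO2 sorbent:
    print(f"Tt(strong) = {turnover_temperature(-178.0, -160.0):.0f} K")
    print(f"Tt(weak)   = {turnover_temperature(-89.0, -140.0):.0f} K")
    # A mixed phase with intermediate effective dH and dS lands between the
    # two, which is the lever the mixing schemes exploit.
    ```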

  15. General Practice Clinical Data Help Identify Dementia Hotspots: A Novel Geospatial Analysis Approach.

    Science.gov (United States)

    Bagheri, Nasser; Wangdi, Kinley; Cherbuin, Nicolas; Anstey, Kaarin J

    2018-01-01

    We have a poor understanding of whether dementia clusters geographically, how this occurs, and how dementia may relate to socio-demographic factors. To shed light on these important questions, this study aimed to compute a dementia risk score for individuals to assess spatial variation of dementia risk, identify significant clusters (hotspots), and explore their association with socioeconomic status. We used clinical records from 16 general practices (468 Statistical Area Level 1 units, N = 14,746) in the city of west Adelaide, Australia, covering 1 January 2012 to 31 December 2014. Dementia risk was estimated using The Australian National University-Alzheimer's Disease Risk Index. Hotspot analyses were applied to examine potential clusters in dementia risk at the small-area level. Significant hotspots were observed in eastern and southern areas, while coldspots were observed in the western area within the study perimeter. Additionally, significant hotspots were observed in low socio-economic communities. We found dementia risk scores increased with age, sex (female), high cholesterol, no physical activity, living alone (widowed, divorced, separated, or never married), and co-morbidities such as diabetes and depression. In contrast, smoking was associated with a lower dementia risk score. The identification of dementia risk clusters may provide insight into possible geographical variations in risk factors for dementia and quantify these risks at the community level. As such, this research may enable policy makers to tailor early prevention strategies to the correct individuals within their precise locations.

  16. An improved approach to identify irradiated spices using electronic nose, FTIR, and EPR spectroscopy.

    Science.gov (United States)

    Sanyal, Bhaskar; Ahn, Jae-Jun; Maeng, Jeong-Hwan; Kyung, Hyun-Kyu; Lim, Ha-Kyeong; Sharma, Arun; Kwon, Joong-Ho

    2014-09-01

    Changes in cumin and chili powder from India resulting from electron-beam irradiation were investigated using 3 analytical methods: electronic nose (E-nose), Fourier transform infrared (FTIR) spectroscopy, and electron paramagnetic resonance (EPR) spectroscopy. The spices had been exposed to 6 to 14 kGy doses recommended for microbial decontamination. The E-nose measured a clear difference in the flavor patterns of the irradiated spices in comparison with the nonirradiated samples. Principal component analysis further showed a dose-dependent variation. FTIR spectra of the samples showed strong absorption bands at 3425, 3007 to 2854, and 1746 cm(-1). However, both nonirradiated and irradiated spice samples had comparable patterns, without any noteworthy changes in functional groups. EPR spectroscopy of the irradiated samples showed a radiation-specific triplet signal at g = 2.006 with a hyperfine coupling constant of 3 mT, confirming the results obtained with the E-nose technique. Thus, the E-nose was found to be a potential tool to identify irradiated spices. © 2014 Institute of Food Technologists®
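
    A sketch of the principal component analysis step used to visualize dose-dependent separation of the E-nose flavour patterns, with a synthetic sensor matrix standing in for the real measurements.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    dose = np.repeat([0, 6, 10, 14], 5)                       # kGy, 5 replicates each
    X = rng.normal(0, 0.1, (20, 8)) + dose[:, None] * 0.05    # 8 sensor channels

    pca = PCA(n_components=2)
    scores = pca.fit_transform(X)
    print("explained variance ratios:", pca.explained_variance_ratio_)
    # Plotting scores[:, 0] against scores[:, 1], coloured by dose, would show
    # the dose-dependent clustering reported in the abstract.
    ```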

  17. Identifying and Classifying Mobile Business Models Based on Meta-Synthesis Approach

    Directory of Open Access Journals (Sweden)

    Porrandokht Niroomand

    2012-03-01

    The advent of mobile technology has provided unique opportunities for the development and creation of businesses and has created new job opportunities. The current research tries to familiarize entrepreneurs who are running businesses, especially in the area of mobile services, with business models that can prepare them to implement new ideas and designs as they enter the business market. A search of the literature shows no previous papers that identify, categorize and analyze mobile business models; consequently, this paper is novel in that respect. The first part of this paper presents a review of the concepts and theories about the different mobile generations, mobile commerce and business models. Afterwards, 92 models drawn from 33 papers and books are compared, interpreted, translated and combined based on two different criteria: expert opinion and kind of product. In the classification of models presented by experts, the models are classified based on criteria such as business fields, business partners, the rate of dynamism, the kind of activity, the focus areas, the mobile generations, transparency, the type of operator activities, marketing and advertisements. The models classified based on the kind of product have been analyzed at four different areas of mobile commerce: content production, technology (software and hardware), network, and synthetic (combined) offerings.

  18. Identifying Associations Between Brain Imaging Phenotypes and Genetic Factors via A Novel Structured SCCA Approach.

    Science.gov (United States)

    Du, Lei; Zhang, Tuo; Liu, Kefei; Yan, Jingwen; Yao, Xiaohui; Risacher, Shannon L; Saykin, Andrew J; Han, Junwei; Guo, Lei; Shen, Li

    2017-06-01

    Brain imaging genetics is attracting increasing attention because it can reveal associations between genetic factors and the structures or functions of the human brain. Sparse canonical correlation analysis (SCCA) is a powerful bi-multivariate association identification technique in imaging genetics. Many SCCA methods have been proposed to capture different types of structured imaging genetic relationships. These methods either use the group lasso to recover the group structure, or employ the graph/network-guided fused lasso to find the network structure. However, the group lasso methods have limited generalizability because prior knowledge is often incomplete or unavailable in the real world. The graph/network-guided methods are sensitive to the sign of the sample correlation, which may be incorrectly estimated. We introduce a new SCCA model using a novel graph-guided pairwise group lasso penalty, and propose an efficient optimization algorithm. The proposed method has a strong upper bound on the grouping effect for both positively and negatively correlated variables. We show that our method performs as well as or better than two state-of-the-art SCCA methods on both synthetic and real neuroimaging genetics data. In particular, our method identifies stronger canonical correlations and captures better canonical loading profiles, showing its promise for revealing biologically meaningful imaging genetic associations.

  19. A systems approach identifies networks and genes linking sleep and stress: implications for neuropsychiatric disorders.

    Science.gov (United States)

    Jiang, Peng; Scarpa, Joseph R; Fitzpatrick, Karrie; Losic, Bojan; Gao, Vance D; Hao, Ke; Summa, Keith C; Yang, He S; Zhang, Bin; Allada, Ravi; Vitaterna, Martha H; Turek, Fred W; Kasarskis, Andrew

    2015-05-05

    Sleep dysfunction and stress susceptibility are comorbid complex traits that often precede and predispose patients to a variety of neuropsychiatric diseases. Here, we demonstrate the multilevel organization of the genetic landscape, candidate genes, and molecular networks associated with 328 stress and sleep traits in a chronically stressed population of 338 (C57BL/6J × A/J) F2 mice. We constructed striatal gene co-expression networks, revealing functionally and cell-type-specific gene co-regulations important for stress and sleep. Using a composite ranking system, we identified the network modules most relevant for 15 independent phenotypic categories, highlighting a mitochondria/synaptic module that links sleep and stress. The key network regulators of this module are overrepresented with genes implicated in neuropsychiatric diseases. Our work suggests that the interplay among sleep, stress, and neuropathology emerges from genetic influences on gene expression and their collective organization through complex molecular networks, providing a framework for interrogating the mechanisms underlying sleep, stress susceptibility, and related neuropsychiatric disorders. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Identifying overrepresented concepts in gene lists from literature: a statistical approach based on Poisson mixture model

    Directory of Open Access Journals (Sweden)

    Zhai Chengxiang

    2010-05-01

    Background: Large-scale genomic studies often identify large gene lists, for example, the genes sharing the same expression patterns. The interpretation of these gene lists is generally achieved by extracting concepts overrepresented in the gene lists. This analysis often depends on manual annotation of genes based on controlled vocabularies, in particular, Gene Ontology (GO). However, the annotation of genes is a labor-intensive process, and the vocabularies are generally incomplete, leaving some important biological domains inadequately covered. Results: We propose a statistical method that uses the primary literature, i.e. free text, as the source to perform overrepresentation analysis. The method is based on the statistical framework of a mixture model and addresses methodological flaws in several existing programs. We implemented this method within a literature mining system, BeeSpace, taking advantage of its analysis environment, and added features that facilitate the interactive analysis of gene sets. Through experimentation with several datasets, we showed that our program can effectively summarize the important conceptual themes of large gene sets, even when traditional GO-based analysis does not yield informative results. Conclusions: We conclude that the current work provides biologists with a tool that effectively complements the existing ones for overrepresentation analysis from genomic experiments. Our program, Genelist Analyzer, is freely available at: http://workerbee.igb.uiuc.edu:8080/BeeSpace/Search.jsp
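
    A compact sketch of fitting a two-component Poisson mixture by expectation-maximization, the style of model the method builds on to separate background from overrepresented term counts; the data and initial values are illustrative.

    ```python
    import numpy as np

    def poisson_mixture_em(counts, iters=100):
        counts = np.asarray(counts, dtype=float)
        lam = np.array([counts.mean() * 0.5, counts.mean() * 2.0])   # initial rates
        pi = np.array([0.5, 0.5])                                    # mixing weights
        for _ in range(iters):
            # E-step: responsibilities from log Poisson likelihoods
            # (the k! term is constant across components and cancels).
            log_post = np.log(pi) + counts[:, None] * np.log(lam) - lam
            log_post -= log_post.max(axis=1, keepdims=True)
            resp = np.exp(log_post)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights and rates
            pi = resp.mean(axis=0)
            lam = (resp * counts[:, None]).sum(axis=0) / resp.sum(axis=0)
        return pi, lam

    counts = [0, 1, 0, 2, 1, 0, 9, 12, 11, 1, 0, 2]   # per-concept term frequencies
    print(poisson_mixture_em(counts))   # background vs overrepresented components
    ```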

  1. Systems Biology Genetic Approach Identifies Serotonin Pathway as a Possible Target for Obstructive Sleep Apnea: Results from a Literature Search Review

    Directory of Open Access Journals (Sweden)

    Ram Jagannathan

    2017-01-01

    Rationale. The overall validity of existing genetic biomarkers in the diagnosis of obstructive sleep apnea (OSA) remains unclear. The objective of this systematic genetic study is to identify "novel" biomarkers for OSA using a systems biology approach. Methods. Candidate genes for OSA were extracted from the PubMed, MEDLINE, and Embase search engines and the DisGeNET database. The gene ontology (GO) analyses and candidate gene prioritization were performed using the Enrichr tool. Genes pertaining to the top 10 pathways were extracted and used for Ingenuity Pathway Analysis. Results. In total, we identified 153 genes. The top 10 pathways associated with OSA include (i) serotonin receptor interaction, (ii) pathways in cancer, (iii) AGE-RAGE signaling in diabetes, (iv) infectious diseases, (v) serotonergic synapse, (vi) inflammatory bowel disease, (vii) HIF-1 signaling pathway, (viii) PI3K-AKT signaling pathway, (ix) regulation of lipolysis in adipocytes, and (x) rheumatoid arthritis. After removing the overlapping genes, we identified 23 candidate genes, of which >30% were involved in the serotonin pathway. Among these, 4 serotonin-pathway genes (SLC6A4, HTR2C, HTR2A, and HTR1B) were strongly associated with OSA. Conclusions. This preliminary report identifies several potential candidate genes associated with OSA and also describes the possible regulatory mechanisms.

  2. A Proteomics Approach to Identify New Putative Cardiac Intercalated Disk Proteins.

    Directory of Open Access Journals (Sweden)

    Siddarth Soni

    Synchronous beating of the heart is dependent on the efficient functioning of the cardiac intercalated disk (ID). The ID is composed of a complex protein network enabling electrical continuity and chemical communication between individual cardiomyocytes. Recently, several studies have shed light on increasingly prevalent cardiac diseases involving the ID. Insufficient knowledge of its composition makes it difficult to study these disease mechanisms in more detail, and we therefore aim here to expand the ID proteome. Using a combination of general membrane enrichment, in-depth quantitative proteomics and an intracellular-location-driven bioinformatics approach, we set out to discover new putative ID proteins in rat ventricular tissue. General membrane isolation, enriched among others also with ID proteins as based on the presence of the established markers connexin-43 and n-cadherin, was performed using centrifugation. By mass spectrometry, we quantitatively evaluated the levels of 3455 proteins in the enriched membrane fraction (EMF) and its counterpart, the soluble cytoplasmic fraction. These data were stringently filtered to generate a final set of 97 enriched, putative ID proteins. These included Cx43 and n-cadherin, but also many interesting novel candidates. We selected 4 candidates (Flotillin-2 (FLOT2), Nexilin (NEXN), Popeye-domain-containing protein 2 (POPDC2) and thioredoxin-related transmembrane protein 2 (TMX2)) and confirmed their co-localization with n-cadherin in the ID of human and rat heart cryo-sections and isolated dog cardiomyocytes. The presented proteomics dataset of putative new ID proteins is a valuable resource for future research into this important molecular intersection of the heart.

  3. Harmonization of clinical laboratories in Africa: a multidisciplinary approach to identify innovative and sustainable technical solutions.

    Science.gov (United States)

    Putoto, Giovanni; Cortese, Antonella; Pecorari, Ilaria; Musi, Roberto; Nunziata, Enrico

    2015-06-01

    In an effective and efficient health system, laboratory medicine should play a critical role. This is not the case in Africa, where there is a lack of demand for diagnostic exams due to mistrust of health laboratory performance. Doctors with Africa CUAMM (Collegio Universitario Aspiranti Medici Missionari) is a non-profit organization working mainly in sub-Saharan Africa (Angola, Ethiopia, Mozambique, Sierra Leone, South Sudan, Tanzania and Uganda) to help and sustain local health systems. Doctors with Africa CUAMM has advocated the need for a harmonized model for health laboratories to assess and evaluate the performance of the facilities in which they operate. In order to develop a harmonized model for African health laboratories, previous attempts at strengthening them through standardization were taken into consideration and reviewed. A survey of four Italian clinicians experienced in the field was then performed to try to understand the actual needs of health facilities. Finally, a market survey was conducted to find new technologies able to update the resulting model. Comparison of actual laboratories with the developed standard - which represents the best setting any African health laboratory could aim for - allowed shortcomings in expected services to be identified and interventions subsequently prioritized. The most appropriate equipment was proposed to perform the envisaged techniques. The suitability of appliances was evaluated in consideration of recognized international recommendations, reported experiences in the field, and the availability of innovative solutions that can be performed on site in rural areas but require minimal sample preparation and little technical expertise. The present work has developed a new, up-to-date, harmonized model for African health laboratories. The authors suggest lists of procedures to address the major African health problems - HIV/AIDS, malaria, tuberculosis (TB) - at each level of the pyramidal health system.

  4. An Evolutionary Genomic Approach to Identify Genes Involved in Human Birth Timing

    Science.gov (United States)

    Orabona, Guilherme; Morgan, Thomas; Haataja, Ritva; Hallman, Mikko; Puttonen, Hilkka; Menon, Ramkumar; Kuczynski, Edward; Norwitz, Errol; Snegovskikh, Victoria; Palotie, Aarno; Fellman, Vineta; DeFranco, Emily A.; Chaudhari, Bimal P.; McGregor, Tracy L.; McElroy, Jude J.; Oetjens, Matthew T.; Teramo, Kari; Borecki, Ingrid; Fay, Justin; Muglia, Louis

    2011-01-01

    Coordination of fetal maturation with birth timing is essential for mammalian reproduction. In humans, preterm birth is a disorder of profound global health significance. The signals initiating parturition in humans have remained elusive, due to divergence in physiological mechanisms between humans and model organisms typically studied. Because of relatively large human head size and narrow birth canal cross-sectional area compared to other primates, we hypothesized that genes involved in parturition would display accelerated evolution along the human and/or higher primate phylogenetic lineages to decrease the length of gestation and promote delivery of a smaller fetus that transits the birth canal more readily. Further, we tested whether current variation in such accelerated genes contributes to preterm birth risk. Evidence from allometric scaling of gestational age suggests human gestation has been shortened relative to other primates. Consistent with our hypothesis, many genes involved in reproduction show human acceleration in their coding or adjacent noncoding regions. We screened >8,400 SNPs in 150 human accelerated genes in 165 Finnish preterm and 163 control mothers for association with preterm birth. In this cohort, the most significant association was in FSHR, and 8 of the 10 most significant SNPs were in this gene. Further evidence for association of a linkage disequilibrium block of SNPs in FSHR, rs11686474, rs11680730, rs12473870, and rs1247381 was found in African Americans. By considering human acceleration, we identified a novel gene that may be associated with preterm birth, FSHR. We anticipate other human accelerated genes will similarly be associated with preterm birth risk and elucidate essential pathways for human parturition. PMID:21533219
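
    A sketch of a per-SNP case-control association test of the kind used in such a screen: a chi-square test on a 2x2 table of allele counts. The counts below are synthetic, not the study's FSHR data.

    ```python
    from scipy.stats import chi2_contingency

    # Rows: preterm cases, term controls; columns: counts of allele A and allele a.
    table = [[120, 210],
             [ 90, 236]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3g}")
    ```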

  5. Molecular subtypes of osteosarcoma identified by reducing tumor heterogeneity through an interspecies comparative approach

    Science.gov (United States)

    Scott, Milcah C.; Sarver, Aaron L.; Gavin, Katherine J.; Thayanithy, Venugopal; Getzy, David M.; Newman, Robert A.; Cutter, Gary R.; Lindblad-Toh, Kerstin; Kisseberth, William C.; Hunter, Lawrence E.; Subramanian, Subbaya; Breen, Matthew; Modiano, Jaime F.

    2011-01-01

    The heterogeneous and chaotic nature of osteosarcoma has confounded accurate molecular classification, prognosis, and prediction for this tumor. The occurrence of spontaneous osteosarcoma is largely confined to humans and dogs. While the clinical features are remarkably similar in both species, the organization of dogs into defined breeds provides a more homogeneous genetic background that may increase the likelihood of uncovering molecular subtypes for this complex disease. We thus hypothesized that molecular profiles derived from canine osteosarcoma would aid in the molecular subclassification of this disease when applied to humans. To test the hypothesis, we performed genome-wide gene expression profiling in a cohort of dogs with osteosarcoma, primarily from high-risk breeds. To further reduce inter-sample heterogeneity, we assessed tumor-intrinsic properties through use of an extensive panel of osteosarcoma-derived cell lines. We observed strong differential gene expression that segregated samples into two groups with differential survival probabilities. Groupings were characterized by the inversely correlated expression of genes associated with G2/M transition and DNA damage checkpoint and microenvironment-interaction categories. This signature was preserved in data from whole tumor samples of three independent dog osteosarcoma cohorts, with stratification into the two expected groups. Significantly, this restricted signature partially overlapped a previously defined, predictive signature for soft tissue sarcomas, and it unmasked orthologous molecular subtypes and their corresponding natural histories in five independent data sets from human patients with osteosarcoma. Our results indicate that the narrower genetic diversity of dogs can be utilized to group complex human osteosarcoma into biologically and clinically relevant molecular subtypes. This in turn may enhance prognosis and prediction, and identify relevant therapeutic targets. PMID:21621658

  6. Validity of the Male Depression Risk Scale in a representative Canadian sample: sensitivity and specificity in identifying men with recent suicide attempt.

    Science.gov (United States)

    Rice, Simon M; Ogrodniczuk, John S; Kealy, David; Seidler, Zac E; Dhillon, Haryana M; Oliffe, John L

    2017-12-22

    Clinical practice and the literature have supported the existence of a phenotypic sub-type of depression in men. While a number of self-report rating scales have been developed to empirically test the male depression construct, psychometric validation of these scales is limited. The aims were to confirm the psychometric properties of the multidimensional Male Depression Risk Scale (MDRS-22) and to develop clinical cut-off scores for the MDRS-22. Data were obtained from an online sample of 1000 Canadian men (mean age (M) = 49.63, standard deviation (SD) = 14.60). Confirmatory factor analysis (CFA) was used to replicate the established six-factor model of the MDRS-22. Psychometric values of the MDRS subscales were comparable to those of the widely used Patient Health Questionnaire-9. CFA model fit indices indicated adequate fit for the six-factor MDRS-22 model. ROC curve analysis indicated the MDRS-22 was effective for identifying those with a recent (previous four weeks) suicide attempt (area under the curve (AUC) = 0.837). The MDRS-22 cut-off identified proportionally more (84.62%) cases of recent suicide attempt relative to the PHQ-9 moderate range (53.85%). The MDRS-22 is the first male-sensitive depression scale to be psychometrically validated using CFA techniques in independent, cross-national samples. Additional studies should identify differential item functioning and evaluate cross-cultural effects.

  7. Validation of transcutaneous bilirubin nomogram for identifying neonatal hyperbilirubinemia in healthy Chinese term and late-preterm infants: a multicenter study

    Directory of Open Access Journals (Sweden)

    Zhangbin Yu

    2014-06-01

    OBJECTIVE: to prospectively validate a previously constructed transcutaneous bilirubin (TcB) nomogram for identifying severe hyperbilirubinemia in healthy Chinese term and late-preterm infants. METHODS: this was a multicenter study that included 9,174 healthy term and late-preterm infants in eight hospitals in China. TcB measurements were performed using a JM-103 bilirubinometer. TcB values were plotted on a previously developed TcB nomogram to identify the predictive ability for subsequent significant hyperbilirubinemia. RESULTS: in the present study, 972 neonates (10.6%) developed significant hyperbilirubinemia. The 40th percentile of the nomogram could identify all neonates who were at risk of significant hyperbilirubinemia, but with a low positive predictive value (PPV) (18.9%). Of the 453 neonates above the 95th percentile, 275 subsequently developed significant hyperbilirubinemia, giving a high PPV (60.7%) but low sensitivity (28.3%). The 75th percentile was highly specific (81.9%) and moderately sensitive (79.8%). The area under the curve (AUC) for the TcB nomogram was 0.875. CONCLUSIONS: this study validated the previously developed TcB nomogram, which could be used to predict subsequent significant hyperbilirubinemia in healthy Chinese term and late-preterm infants. However, combining the TcB nomogram and clinical risk factors could improve the predictive accuracy for severe hyperbilirubinemia, which was not assessed in this study. Further studies are necessary to confirm this combination.
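
    A small sketch recomputing the screening metrics quoted above from the abstract's own counts (453 neonates above the 95th percentile, 275 of whom developed significant hyperbilirubinemia, out of 972 cases among 9,174 infants); the specificity value is derived arithmetic, not a reported figure.

    ```python
    def screening_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)
        ppv = tp / (tp + fp)
        specificity = tn / (tn + fp)
        return sensitivity, ppv, specificity

    tp = 275               # above 95th percentile and became hyperbilirubinemic
    fp = 453 - tp          # above 95th percentile, did not
    fn = 972 - tp          # cases missed by this cut-off
    tn = 9174 - 972 - fp   # unaffected infants below the cut-off
    sens, ppv, spec = screening_metrics(tp, fp, fn, tn)
    print(f"sensitivity {sens:.1%}, PPV {ppv:.1%}, specificity {spec:.1%}")
    ```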

  8. The hierarchy-by-interval approach to identifying important models that need improvement in severe-accident simulation codes

    International Nuclear Information System (INIS)

    Heames, T.J.; Khatib-Rahbar, M.; Kelly, J.E.

    1995-01-01

    The hierarchy-by-interval (HBI) methodology was developed to determine an appropriate phenomena identification and ranking table for an independent peer review of severe-accident computer codes. The methodology is described, and the results of a specific code review are presented. Use of this systematic and structured approach ensures that important code models that need improvement are identified and prioritized, which allows code sponsors to more effectively direct limited resources in future code development. In addition, critical phenomenological areas that need more fundamental work, such as experimentation, are identified

  9. Validation of a Smartphone-Based Approach to In Situ Cognitive Fatigue Assessment

    Science.gov (United States)

    Linden, Mark

    2017-01-01

    Background: Acquired Brain Injuries (ABIs) can result in multiple detrimental cognitive effects, such as reduced memory capability, concentration, and planning. These effects can lead to cognitive fatigue, which can exacerbate the symptoms of ABIs and hinder management and recovery. Assessing cognitive fatigue is difficult due to the largely subjective nature of the condition and existing assessment approaches. Traditional methods of assessment use self-assessment questionnaires delivered in a medical setting, but recent work has attempted to employ more objective cognitive tests as a way of evaluating cognitive fatigue. However, these tests are still predominantly delivered within a medical environment, limiting their utility and efficacy. Objective: The aim of this research was to investigate how cognitive fatigue can be accurately assessed in situ, during the quotidian activities of life. It was hypothesized that this assessment could be achieved through the use of mobile assistive technology to assess working memory, sustained attention, information processing speed, reaction time, and cognitive throughput. Methods: The study used a bespoke smartphone app to track daily cognitive performance, in order to assess potential levels of cognitive fatigue. Twenty-one participants with no prior reported brain injuries took part in a two-week study, resulting in 81 individual testing instances being collected. The smartphone app delivered three cognitive tests on a daily basis: (1) Spatial Span to measure visuospatial working memory; (2) Psychomotor Vigilance Task (PVT) to measure sustained attention, information processing speed, and reaction time; and (3) a Mental Arithmetic Test to measure cognitive throughput. A smartphone-optimized version of the Mental Fatigue Scale (MFS) self-assessment questionnaire was used as a baseline to assess the validity of the three cognitive tests, as the questionnaire has already been validated in multiple peer-reviewed studies.

  10. Validation of a Smartphone-Based Approach to In Situ Cognitive Fatigue Assessment.

    Science.gov (United States)

    Price, Edward; Moore, George; Galway, Leo; Linden, Mark

    2017-08-17

    Acquired Brain Injuries (ABIs) can result in multiple detrimental cognitive effects, such as reduced memory capability, concentration, and planning. These effects can lead to cognitive fatigue, which can exacerbate the symptoms of ABIs and hinder management and recovery. Assessing cognitive fatigue is difficult due to the largely subjective nature of the condition and existing assessment approaches. Traditional methods of assessment use self-assessment questionnaires delivered in a medical setting, but recent work has attempted to employ more objective cognitive tests as a way of evaluating cognitive fatigue. However, these tests are still predominantly delivered within a medical environment, limiting their utility and efficacy. The aim of this research was to investigate how cognitive fatigue can be accurately assessed in situ, during the quotidian activities of life. It was hypothesized that this assessment could be achieved through the use of mobile assistive technology to assess working memory, sustained attention, information processing speed, reaction time, and cognitive throughput. The study used a bespoke smartphone app to track daily cognitive performance, in order to assess potential levels of cognitive fatigue. Twenty-one participants with no prior reported brain injuries took part in a two-week study, resulting in 81 individual testing instances being collected. The smartphone app delivered three cognitive tests on a daily basis: (1) Spatial Span to measure visuospatial working memory; (2) Psychomotor Vigilance Task (PVT) to measure sustained attention, information processing speed, and reaction time; and (3) a Mental Arithmetic Test to measure cognitive throughput. A smartphone-optimized version of the Mental Fatigue Scale (MFS) self-assessment questionnaire was used as a baseline to assess the validity of the three cognitive tests, as the questionnaire has already been validated in multiple peer-reviewed studies.
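
    Validating the app's cognitive tests against the MFS baseline reduces, at its simplest, to correlating each test metric with the questionnaire score across testing instances. A minimal sketch of that step (Python; the arrays are illustrative placeholders, not study data):

      import numpy as np
      from scipy.stats import pearsonr

      # Illustrative placeholder data: one value per testing instance.
      mfs_score = np.array([8, 14, 11, 20, 6, 17, 9, 13])    # MFS questionnaire score
      pvt_mean_rt = np.array([310, 365, 342, 410, 295, 388, 325, 350])  # PVT mean RT, ms

      # A positive correlation would indicate that slower reactions
      # accompany higher self-reported cognitive fatigue.
      r, p = pearsonr(mfs_score, pvt_mean_rt)
      print(f"Pearson r = {r:.2f}, p = {p:.3f}")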

  11. Microbial production of polyhydroxybutyrate with tailor-made properties: an integrated modelling approach and experimental validation.

    Science.gov (United States)

    Penloglou, Giannis; Chatzidoukas, Christos; Kiparissides, Costas

    2012-01-01

    The microbial production of polyhydroxybutyrate (PHB) is a complex process in which the final quantity and quality of the PHB depend on a large number of process operating variables. Consequently, the design and optimal dynamic operation of a microbial process for the efficient production of PHB with tailor-made molecular properties is an extremely interesting problem. The present study investigates how key process operating variables (i.e., nutritional and aeration conditions) affect the biomass production rate and the PHB accumulation in the cells and its associated molecular weight distribution. A combined metabolic/polymerization/macroscopic modelling approach, relating the process performance and product quality with the process variables, was developed and validated using an extensive series of experiments and measurements. The model predicts the dynamic evolution of the biomass growth, the polymer accumulation, the consumption of carbon and nitrogen sources and the average molecular weights of the PHB in a bioreactor, under batch and fed-batch operating conditions. The proposed integrated model was used for the model-based optimization of the production of PHB with tailor-made molecular properties in Azohydromonas lata bacteria. The process optimization led to a high intracellular PHB accumulation (up to 95% g PHB per g DCW) and the production of different grades (i.e., different molecular weight distributions) of PHB. Copyright © 2011 Elsevier Inc. All rights reserved.
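
    The macroscopic part of such a combined model is typically expressed as coupled differential equations for biomass growth and product formation. The abstract does not give the authors' equations, so the sketch below uses generic logistic-growth and Luedeking-Piret kinetics purely as an illustration (Python; all parameter values are invented):

      from scipy.integrate import solve_ivp

      # Generic bioprocess kinetics (NOT the authors' model): logistic biomass
      # growth plus Luedeking-Piret product formation.
      mu_max, X_max = 0.25, 12.0   # 1/h, g/L (invented values)
      alpha, beta = 0.4, 0.02      # growth- and non-growth-associated yields

      def rhs(t, y):
          X, P = y                       # biomass and product concentrations, g/L
          dX = mu_max * X * (1 - X / X_max)
          dP = alpha * dX + beta * X     # Luedeking-Piret rate law
          return [dX, dP]

      sol = solve_ivp(rhs, (0, 48), [0.1, 0.0])
      X48, P48 = sol.y[:, -1]
      print(f"after 48 h: biomass {X48:.2f} g/L, PHB {P48:.2f} g/L")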

  12. Test validation of environmental barrier coating (EBC) durability and damage tolerance modeling approach

    Science.gov (United States)

    Abdul-Aziz, Ali; Najafi, Ali; Abdi, Frank; Bhatt, Ramakrishna T.; Grady, Joseph E.

    2014-03-01

    Protection of Ceramic Matrix Composites (CMCs) is an important concern for engine manufacturers and aerospace companies seeking to improve the durability of hot engine components. CMCs are typically porous materials, which permits some desirable infiltration and leads to strength enhancements. However, they experience various durability issues, such as degradation due to coating oxidation. These concerns are being addressed by introducing a high-temperature protective system, the Environmental Barrier Coating (EBC), that can operate in high-temperature applications. In this paper, linear elastic progressive failure analyses are performed to evaluate the conditions that would cause crack initiation in the EBC. The analysis determines the overall failure sequence under tensile loading conditions in the different material layers, including the EBC and CMC, in an attempt to develop a life/failure model. A 3D finite element model of a dogbone specimen is constructed for the analyses. Damage initiation, propagation, and final failure are captured using a progressive failure model under tensile loading conditions at room temperature. It is expected that this study will establish a process for using a computational approach, validated at the specimen level, to reliably predict component-level performance without resorting to extensive testing.
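
    The essence of a progressive failure analysis can be conveyed with a one-dimensional toy: all layers share an applied strain, a layer fails once its stress exceeds its strength, and failed layers carry no further load. The sketch below (Python) uses invented layer properties and is not the paper's finite element model:

      # 1D progressive-failure toy: layers under a shared, ramped tensile strain.
      layers = [                        # (name, modulus in GPa, strength in MPa)
          ("EBC topcoat", 80.0, 100.0),
          ("bond coat", 150.0, 200.0),
          ("CMC substrate", 220.0, 350.0),
      ]
      failed = set()

      for step in range(1, 200):
          strain = step * 1e-5                  # ramp the applied strain
          for name, modulus, strength in layers:
              stress = modulus * 1e3 * strain   # stress in MPa (GPa -> MPa)
              if name not in failed and stress >= strength:
                  failed.add(name)
                  print(f"strain {strain:.2e}: {name} fails at {stress:.0f} MPa")
          if len(failed) == len(layers):        # final failure of the laminate
              break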

  13. Validation of an employee satisfaction model: A structural equation model approach

    Directory of Open Access Journals (Sweden)

    Ophillia Ledimo

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling (SEM) approach. A cross-sectional quantitative survey design was used to collect data from a random sample (n=759) of permanent employees of a parastatal organisation. Data were collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of SEM analysis, the three domains and latent variables of employee satisfaction were specified as organisational strategy, policies and procedures, and outcomes. Confirmatory factor analysis of the latent variables was conducted, and the path coefficients of the latent variables of the employee satisfaction model indicated a satisfactory fit for all these variables. The goodness-of-fit measures of the model indicated both absolute and incremental goodness-of-fit, confirming the relationships between the latent and manifest variables. They also indicated that the latent variables organisational strategy, policies and procedures, and outcomes are the main indicators of employee satisfaction. This study adds to the knowledge base on employee satisfaction and makes recommendations for future research.
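
    The CFA/SEM workflow described here can be outlined in a few lines with an SEM package. The sketch below assumes the third-party semopy library and substitutes random placeholder data for the ESS responses, so the model specification and item names are purely illustrative (Python):

      import numpy as np
      import pandas as pd
      import semopy  # assumed available; any SEM package would do

      # Illustrative lavaan-style specification: three latent domains
      # (strategy, policies and procedures, outcomes), three items each.
      desc = """
      strategy =~ s1 + s2 + s3
      policies =~ p1 + p2 + p3
      outcomes =~ o1 + o2 + o3
      outcomes ~ strategy + policies
      """

      # Random placeholder data standing in for the n=759 ESS responses.
      rng = np.random.default_rng(0)
      data = pd.DataFrame(rng.normal(size=(759, 9)),
                          columns=["s1", "s2", "s3", "p1", "p2", "p3",
                                   "o1", "o2", "o3"])

      model = semopy.Model(desc)
      model.fit(data)
      print(semopy.calc_stats(model))  # absolute and incremental fit indices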

  14. Acoustic telemetry validates a citizen science approach for monitoring sharks on coral reefs.

    Science.gov (United States)

    Vianna, Gabriel M S; Meekan, Mark G; Bornovski, Tova H; Meeuwig, Jessica J

    2014-01-01

    Citizen science is promoted as a simple and cost-effective alternative to traditional approaches for the monitoring of populations of marine megafauna. However, the reliability of datasets collected by these initiatives often remains poorly quantified. We compared datasets of shark counts collected by professional dive guides with acoustic telemetry data from tagged sharks collected at the same coral reef sites over a period of five years. There was a strong correlation between the number of grey reef sharks (Carcharhinus amblyrhynchos) observed by dive guides and the telemetry data at both daily and monthly intervals, suggesting that variation in relative abundance of sharks was detectable in datasets collected by dive guides in a similar manner to data derived from telemetry at these time scales. There was no correlation between the number or mean depth of sharks recorded by telemetry and the presence of tourist divers, suggesting that the behaviour of sharks was not affected by the presence of divers during our study. Data recorded by dive guides showed that current strength and temperature were important drivers of the relative abundance of sharks at monitored sites. Our study validates the use of datasets of shark abundance collected by professional dive guides in frequently-visited dive sites in Palau, and supports the participation of experienced recreational divers as contributors to long-term monitoring programs of shark populations.
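
    The core of this validation is an aggregation-and-correlation step: dive-guide counts and telemetry detections are binned to a common interval and then correlated. A minimal sketch (Python/pandas; both series are invented placeholders, not the Palau data):

      import numpy as np
      import pandas as pd

      # Invented placeholder series: daily shark counts from dive guides and
      # daily acoustic detections of tagged sharks at the same site.
      idx = pd.date_range("2010-01-01", periods=365, freq="D")
      rng = np.random.default_rng(1)
      seasonal = 5 + 2 * np.sin(np.arange(365) / 58.0)   # shared seasonal signal
      guide = pd.Series(rng.poisson(seasonal), index=idx)
      telemetry = pd.Series(rng.poisson(seasonal * 3), index=idx)

      # Correlate at daily and monthly intervals, as in the study design.
      daily_r = guide.corr(telemetry)
      monthly_r = guide.resample("MS").mean().corr(telemetry.resample("MS").mean())
      print(f"daily r = {daily_r:.2f}, monthly r = {monthly_r:.2f}")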

  15. On the validity of the effective field theory approach to SM precision tests

    Energy Technology Data Exchange (ETDEWEB)

    Contino, Roberto [EPFL, Lausanne (Switzerland). Inst. de Theorie des Phenomenes Physiques; CERN, Geneva (Switzerland). Theoretical Physics Dept.; Falkowski, Adam [Paris-11 Univ., 91 - Orsay (France). Lab. de Physique Theorique; Goertz, Florian; Riva, Francesco [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Grojean, Christophe [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-09-15

    We discuss the conditions for an effective field theory (EFT) to give an adequate low-energy description of an underlying physics beyond the Standard Model (SM). Starting from the EFT where the SM is extended by dimension-6 operators, experimental data can be used without further assumptions to measure (or set limits on) the EFT parameters. The interpretation of these results requires instead a set of broad assumptions (e.g. power counting rules) on the UV dynamics. This allows one to establish, in a bottom-up approach, the validity range of the EFT description, and to assess the error associated with the truncation of the EFT series. We give a practical prescription on how experimental results could be reported, so that they admit a maximally broad range of theoretical interpretations. Namely, the experimental constraints on dimension-6 operators should be reported as functions of the kinematic variables that set the relevant energy scale of the studied process. This is especially important for hadron collider experiments where collisions probe a wide range of energy scales.
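
    The reporting prescription can be made concrete with a purely schematic toy: constraints are quoted bin-by-bin in the kinematic variable that sets the energy scale, so a theorist can later use only the bins lying well below the cutoff of a given UV model. The sketch below (Python) uses invented event counts and a simple Gaussian counting limit, not the paper's statistical procedure:

      import math

      # Invented bins in the kinematic variable E that sets the energy scale:
      # (E_low GeV, E_high GeV, expected SM events, observed events).
      bins = [(200, 400, 1520, 1555),
              (400, 800, 310, 298),
              (800, 1600, 42, 51)]

      for e_lo, e_hi, expected, observed in bins:
          # Schematic Gaussian 95% CL upper limit on extra (EFT-induced) events.
          limit = max(observed - expected, 0.0) + 1.96 * math.sqrt(expected)
          print(f"{e_lo}-{e_hi} GeV: <= {limit:.0f} extra events (95% CL)")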

  16. Macrophage polarisation: an immunohistochemical approach for identifying M1 and M2 macrophages.

    Directory of Open Access Journals (Sweden)

    Mário Henrique M Barros

    Macrophage polarization is increasingly recognised as an important pathogenetic factor in inflammatory and neoplastic diseases. Proinflammatory M1 macrophages promote T helper (Th) 1 responses and show tumoricidal activity. M2 macrophages contribute to tissue repair and promote Th2 responses. CD68 and CD163 are used to identify macrophages in tissue sections. However, characterisation of polarised macrophages in situ has remained difficult. Macrophage polarisation is regulated by transcription factors, pSTAT1 and RBP-J for M1, and CMAF for M2. We reasoned that double-labelling immunohistochemistry for the detection of macrophage markers together with transcription factors may be suitable to characterise macrophage polarisation in situ. To test this hypothesis, we have studied conditions associated with Th1- and Th2-predominant immune responses: infectious mononucleosis and Crohn's disease for Th1 and allergic nasal polyps, oxyuriasis, wound healing and foreign body granulomas for predominant Th2 response. In all situations, CD163+ cells usually outnumbered CD68+ cells. Moreover, CD163+ cells, usually considered as M2 macrophages, co-expressing pSTAT1 and RBP-J were found in all conditions examined. The numbers of putative M1 macrophages were higher in Th1- than in Th2-associated diseases, while more M2 macrophages were seen in Th2- than in Th1-related disorders. In most Th1-related diseases, the balance of M1 over M2 cells was shifted towards M1 cells, while the reverse was observed for Th2-related conditions. Hierarchical cluster analysis revealed two distinct clusters: cluster I included Th1 diseases together with cases with high numbers of CD163+pSTAT1+, CD68+pSTAT1+, CD163+RBP-J+ and CD68+RBP-J+ macrophages; cluster II comprised Th2 conditions together with cases displaying high numbers of CD163+CMAF+ and CD68+CMAF+ macrophages. These results suggest that the detection of pSTAT1, RBP-J, and CMAF in the context of CD68 or CD163 expression is a suitable tool for characterising macrophage polarisation in situ.

  17. Macrophage polarisation: an immunohistochemical approach for identifying M1 and M2 macrophages.

    Science.gov (United States)

    Barros, Mário Henrique M; Hauck, Franziska; Dreyer, Johannes H; Kempkes, Bettina; Niedobitek, Gerald

    2013-01-01

    Macrophage polarization is increasingly recognised as an important pathogenetic factor in inflammatory and neoplastic diseases. Proinflammatory M1 macrophages promote T helper (Th) 1 responses and show tumoricidal activity. M2 macrophages contribute to tissue repair and promote Th2 responses. CD68 and CD163 are used to identify macrophages in tissue sections. However, characterisation of polarised macrophages in situ has remained difficult. Macrophage polarisation is regulated by transcription factors, pSTAT1 and RBP-J for M1, and CMAF for M2. We reasoned that double-labelling immunohistochemistry for the detection of macrophage markers together with transcription factors may be suitable to characterise macrophage polarisation in situ. To test this hypothesis, we have studied conditions associated with Th1- and Th2-predominant immune responses: infectious mononucleosis and Crohn's disease for Th1 and allergic nasal polyps, oxyuriasis, wound healing and foreign body granulomas for predominant Th2 response. In all situations, CD163+ cells usually outnumbered CD68+ cells. Moreover, CD163+ cells, usually considered as M2 macrophages, co-expressing pSTAT1 and RBP-J were found in all conditions examined. The numbers of putative M1 macrophages were higher in Th1- than in Th2-associated diseases, while more M2 macrophages were seen in Th2- than in Th1-related disorders. In most Th1-related diseases, the balance of M1 over M2 cells was shifted towards M1 cells, while the reverse was observed for Th2-related conditions. Hierarchical cluster analysis revealed two distinct clusters: cluster I included Th1 diseases together with cases with high numbers of CD163+pSTAT1+, CD68+pSTAT1+, CD163+RBP-J+ and CD68+RBP-J+ macrophages; cluster II comprised Th2 conditions together with cases displaying high numbers of CD163+CMAF+ and CD68+CMAF+ macrophages. These results suggest that the detection of pSTAT1, RBP-J, and CMAF in the context of CD68 or CD163 expression is a suitable tool for characterising macrophage polarisation in situ.
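
    The hierarchical cluster analysis described above can be reproduced in outline: each case becomes a vector of double-positive macrophage counts, and agglomerative clustering groups the cases. A minimal sketch (Python/SciPy; the count matrix is an invented placeholder):

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      # Rows: cases; columns: double-positive macrophage counts
      # (CD163+pSTAT1+, CD68+pSTAT1+, CD163+RBP-J+, CD68+RBP-J+,
      #  CD163+CMAF+, CD68+CMAF+). All values are invented placeholders.
      counts = np.array([
          [120,  90, 110,  85,  20,  15],   # Th1-like case
          [135, 100, 125,  95,  25,  18],   # Th1-like case
          [ 30,  25,  35,  28, 140, 110],   # Th2-like case
          [ 22,  18,  30,  24, 155, 120],   # Th2-like case
      ])

      # Ward linkage on standardised counts, cut into two clusters.
      z = (counts - counts.mean(axis=0)) / counts.std(axis=0)
      tree = linkage(z, method="ward")
      print(fcluster(tree, t=2, criterion="maxclust"))  # e.g. [1 1 2 2]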

  18. External validation of fatty liver index for identifying ultrasonographic fatty liver in a large-scale cross-sectional study in Taiwan.

    Directory of Open Access Journals (Sweden)

    Bi-Ling Yang

    The fatty liver index (FLI) is an algorithm involving the waist circumference, body mass index, and serum levels of triglyceride and gamma-glutamyl transferase used to identify fatty liver. Although some studies have attempted to validate the FLI, few have been conducted for external validation among Asians. We attempted to validate the FLI for predicting ultrasonographic fatty liver in Taiwanese subjects. We enrolled consecutive subjects who received health check-up services at the Taipei Veterans General Hospital from 2002 to 2009. Ultrasonography was applied to diagnose fatty liver. The ability of the FLI to detect ultrasonographic fatty liver was assessed by analyzing the area under the receiver operating characteristic (AUROC) curve. Among the 29,797 subjects enrolled in this study, fatty liver was diagnosed in 44.5% of the population. Subjects with ultrasonographic fatty liver had a significantly higher FLI than those without fatty liver by multivariate analysis (odds ratio 1.045; 95% confidence interval, CI, 1.044-1.047; p < 0.001). Moreover, the FLI had the best discriminative ability to identify patients with ultrasonographic fatty liver (AUROC 0.827; 95% CI 0.822-0.831). An FLI < 25 for males (negative likelihood ratio, LR-, 0.32) and < 10 for females (LR- 0.26) rules out ultrasonographic fatty liver, whereas an FLI ≥ 35 for males (positive likelihood ratio, LR+, 3.12) and ≥ 20 for females (LR+ 4.43) rules it in. The FLI could accurately identify ultrasonographic fatty liver in a large-scale population in Taiwan, but with lower cut-off values than in Western populations; the cut-off values were also lower in females than in males.
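
    For reference, the FLI itself is a logistic-regression score on a 0-100 scale. A minimal implementation using the commonly cited coefficients from the original Bedogni et al. (2006) publication is sketched below (Python); those coefficients come from that earlier work, not from this validation study:

      import math

      def fatty_liver_index(tg_mg_dl, bmi, ggt_u_l, waist_cm):
          """FLI (0-100) with the Bedogni et al. (2006) coefficients."""
          z = (0.953 * math.log(tg_mg_dl) + 0.139 * bmi
               + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
          return 100.0 * math.exp(z) / (1.0 + math.exp(z))

      # Example: a male subject; an FLI >= 35 would rule in fatty liver
      # under the Taiwanese cut-offs reported above.
      print(f"FLI = {fatty_liver_index(tg_mg_dl=150, bmi=27, ggt_u_l=40, waist_cm=95):.1f}")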

  19. Non-destructive inspection approach using ultrasound to identify the material state for amorphous and semi-crystalline materials

    Science.gov (United States)

    Jost, Elliott; Jack, David; Moore, David

    2018-04-01

    At present, many methods exist to identify the temperature and phase of a material using invasive techniques; most require physical contact with the specimen or rely on indirect optical measurements of its light reflectance. This work presents a nondestructive inspection method using ultrasonic wave technology that circumvents these disadvantages to identify phase-change regions and infer the temperature state of a material. In the present study an experiment is performed to monitor the time of flight within a wax as it undergoes melting and subsequent cooling. Results presented in this work show a clear relationship between a material's speed of sound and its temperature. The phase-change transition of the material is clear from the time-of-flight results, and in the case of the investigated material, this change in the material state occurs over a range of temperatures. The range of temperatures over which the wax melts is readily identified from the speed of sound represented as a function of material temperature. The melt temperature, obtained acoustically, is validated using Differential Scanning Calorimetry (DSC), which uses shifts in heat flow rates to identify phase transition temperature ranges. The investigated ultrasonic NDE method has direct applications in many industries, including oil and gas, food and beverage, and polymer composites, in addition to many implications for future capabilities of nondestructive inspection of multi-phase materials.
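
    Assuming a pulse-echo configuration (the abstract does not specify the setup), the measured quantity is the round-trip time of flight, from which the speed of sound follows as c = 2d/TOF; a phase change then appears as an anomaly in the speed-of-sound-versus-temperature curve. A minimal sketch (Python; the thickness and readings are invented placeholders):

      # Pulse-echo: the wave traverses the specimen twice, so c = 2 * d / TOF.
      thickness_m = 0.020                   # specimen thickness (invented)

      # (temperature C, round-trip time of flight in microseconds) -- placeholders
      readings = [(30, 30.8), (40, 31.5), (50, 33.0), (60, 38.5), (70, 39.2)]

      speeds = [(T, 2 * thickness_m / (tof * 1e-6)) for T, tof in readings]
      for (T1, c1), (T2, c2) in zip(speeds, speeds[1:]):
          slope = (c2 - c1) / (T2 - T1)     # d(speed of sound)/dT
          flag = "  <- possible phase-change interval" if abs(slope) > 10 else ""
          print(f"{T1}-{T2} C: dc/dT = {slope:.1f} (m/s)/C{flag}")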

  20. Identifying Opportunities for Decision Support Systems in Support of Regional Resource Use Planning: An Approach Through Soft Systems Methodology.

    Science.gov (United States)

    Zhu; Dale

    2000-10-01

    Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.

  1. Development and validation testing of a short nutrition questionnaire to identify dietary risk factors in preschoolers aged 12–36 months

    Directory of Open Access Journals (Sweden)

    Niamh Rice

    2015-06-01

    Background: although imbalances in dietary intakes can have short- and longer-term influences on the health of preschool children, few tools exist to quickly and easily identify nutritional risk in otherwise healthy young children. Objectives: to develop and test the validity of a parent-administered questionnaire (NutricheQ) as a means of evaluating dietary risk in young children (12-36 months). Design: following a comprehensive development process and internal reliability assessment, the NutricheQ questionnaire was validated in a cohort of 371 Irish preschool children as part of the National Preschool Nutrition Survey. Dietary risk was rated from 11 questions on a scale ranging from 0 to 22, with a higher score indicating higher risk. Results: children with higher NutricheQ scores had significantly (p < 0.05) lower mean daily intakes of key nutrients such as iron, zinc, vitamin D, riboflavin, niacin, folate, phosphorus, potassium, carotene, retinol, and dietary fibre. They also had lower (p < 0.05) intakes of vegetables, fish and fish dishes, meat, and infant/toddler milks, and higher intakes of processed foods, non-milk beverages, confectionery, sugars, and savoury snack foods, indicative of poorer dietary quality. Area-under-the-curve values of 84.7% and 75.6% were achieved for 'medium' and 'high' dietary risk when compared with expert risk ratings, indicating good consistency between the two methods. Conclusion: NutricheQ is a valid method of quickly assessing dietary quality in preschoolers and of identifying those at increased nutritional risk. In context: analyses of data from national food and nutrition surveys typically identify shortfalls in the dietary intakes or dietary quality of young children, relating to intakes of micronutrients such as iron or vitamin D as well as to the balance of macronutrients consumed (e.g. fat or sugar), alongside concerns regarding overweight, obesity, and physical inactivity.
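
    The headline validation statistic is an area under the ROC curve comparing the questionnaire score against expert risk ratings; once scores and ratings are paired, that computation is a one-liner. A minimal sketch (Python/scikit-learn; the data are synthetic placeholders, not survey data):

      import numpy as np
      from sklearn.metrics import roc_auc_score

      # Synthetic placeholders: NutricheQ-style scores (0-22, from 11 questions)
      # and a binary expert rating of "high dietary risk" for the same children.
      rng = np.random.default_rng(2)
      scores = rng.integers(0, 23, size=371)
      expert_high_risk = (scores + rng.normal(0, 4, size=371)) > 14

      auc = roc_auc_score(expert_high_risk, scores)
      print(f"AUC vs expert rating = {auc:.3f}")  # high with these placeholders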

  2. Can children identify and achieve goals for intervention? A randomized trial comparing two goal-setting approaches.

    Science.gov (United States)

    Vroland-Nordstrand, Kristina; Eliasson, Ann-Christin; Jacobsson, Helén; Johansson, Ulla; Krumlinde-Sundholm, Lena

    2016-06-01

    The efficacy of two different goal-setting approaches (children's self-identified goals and goals identified by parents) was compared within a goal-directed, task-oriented intervention. In this assessor-blinded parallel randomized trial, 34 children with disabilities (13 males, 21 females; mean age 9y, SD 1y 4mo) were randomized using concealed allocation to one of two 8-week, goal-directed, task-oriented intervention groups with different goal-setting approaches: (1) children's self-identified goals (n=18) using the Perceived Efficacy and Goal-Setting System, or (2) goals identified by parents (n=16) using the Canadian Occupational Performance Measure (COPM). Participants were recruited through eight paediatric rehabilitation centres and randomized between October 2011 and May 2013. The primary outcome measure was Goal Attainment Scaling and the secondary measure the COPM performance scale (COPM-P). Data were collected pre- and post-intervention and at the 5-month follow-up. There was no evidence of a difference in mean characteristics at baseline between groups. There was evidence of an increase in mean goal attainment (mean T score) in both groups after intervention (child-goal group: estimated mean difference [EMD] 27.84, 95% CI 22.93-32.76; parent-goal group: EMD 21.42, 95% CI 16.16-26.67). There was no evidence of a difference in the mean T scores post-intervention between the two groups (EMD 6.42, 95% CI -0.80 to 13.65). These results were sustained at the 5-month follow-up. Children's self-identified goals are achievable to the same extent as parent-identified goals and remain stable over time. Thus children can be trusted to identify their own goals for intervention, thereby influencing their involvement in their intervention programmes. © 2015 Mac Keith Press.
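
    The primary outcome, Goal Attainment Scaling, converts a set of goal ratings on the -2 to +2 scale into a standardised T score. A minimal sketch of the standard Kiresuk-Sherman formula (Python; the ratings are invented, and the conventional inter-goal correlation rho = 0.3 is assumed):

      import math

      def gas_t_score(ratings, weights=None, rho=0.3):
          """Kiresuk-Sherman GAS T score from goal ratings on the -2..+2 scale."""
          if weights is None:
              weights = [1.0] * len(ratings)   # equal weights if none supplied
          sum_wx = sum(w * x for w, x in zip(weights, ratings))
          sum_w2 = sum(w * w for w in weights)
          sum_w = sum(weights)
          return 50.0 + 10.0 * sum_wx / math.sqrt((1 - rho) * sum_w2
                                                  + rho * sum_w ** 2)

      print(gas_t_score([0, 0, 0]))            # goals met as expected -> 50.0
      print(round(gas_t_score([1, 1, 1]), 1))  # goals exceeded -> above 50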

  3. Peptidomic approach identifies cruzioseptins, a new family of potent antimicrobial peptides in the splendid leaf frog, Cruziohyla calcarifer

    OpenAIRE

    Proaño Bolaños, Carolina; Zhou, Mei; Wang, Lei; Luis, Coloma; Chen, Tianbao; Shaw, Christopher

    2016-01-01

    Phyllomedusine frogs are an extraordinary source of biologically active peptides. At least 8 families of antimicrobial peptides have been reported in this frog clade, the dermaseptins being the most diverse. By a peptidomic approach, integrating molecular cloning, Edman degradation sequencing and tandem mass spectrometry, a new family of antimicrobial peptides has been identified in Cruziohyla calcarifer. These 15 novel antimicrobial peptides of 20–32 residues in length are named cruzioseptin...

  4. Personality subtypes in adolescents with eating disorders: validation of a classification approach.

    Science.gov (United States)

    Thompson-Brenner, Heather; Eddy, Kamryn T; Satir, Dana A; Boisseau, Christina L; Westen, Drew

    2008-02-01

    Research has identified three personality subtypes in adults with eating disorders (EDs): a high-functioning, an undercontrolled, and an overcontrolled group. The current study investigated whether similar personality prototypes exist in adolescents with EDs, and whether these personality prototypes show relationships to external correlates indicative of diagnostic validity. Experienced clinicians from an adolescent practice-research network provided data on ED symptoms, DSM-IV comorbidity, personality pathology, and family and developmental history for 120 adolescent patients with EDs. Consistent with the findings from the adult literature, three types of personality pathology emerged in adolescents: High-functioning/Perfectionistic, Emotionally Dysregulated, and Avoidant/Depressed. The High-functioning prototype showed negative associations with comorbidity and positive associations with treatment response. The Emotionally Dysregulated prototype was specifically associated with externalizing Axis I and Cluster B Axis II disorders, poor school functioning, and adverse events in childhood. The Avoidant/Depressed prototype showed specific associations with internalizing Axis I and Clusters A Axis II disorders, poor peer relationships, poor maternal relationships, and internalizing disorders in first-degree relatives. These data support the presence of at least three diagnostically meaningful personality prototypes in adolescents with EDs, similar to those found previously in adults. Diagnosis of adolescents with EDs may be usefully supplemented by the assessment of personality style.

  5. The Dutch Linguistic Intraoperative Protocol: a valid linguistic approach to awake brain surgery.

    Science.gov (United States)

    De Witte, E; Satoer, D; Robert, E; Colle, H; Verheyen, S; Visch-Brink, E; Mariën, P

    2015-01-01

    Intraoperative direct electrical stimulation (DES) is increasingly used in patients operated on for tumours in eloquent areas. Although a positive impact of DES on postoperative linguistic outcome is generally advocated, information about the neurolinguistic methods applied in awake surgery is scarce. We developed for the first time a standardised Dutch linguistic test battery (measuring phonology, semantics, and syntax) to reliably identify the critical language zones in detail. A normative study was carried out in a control group of 250 native Dutch-speaking healthy adults. In addition, the clinical application of the Dutch Linguistic Intraoperative Protocol (DuLIP) was demonstrated by means of anatomo-functional models and five case studies. A set of DuLIP tests was selected for each patient depending on the tumour location and degree of linguistic impairment. DuLIP is a valid test battery for pre-, intra-, and postoperative language testing and facilitates intraoperative mapping of eloquent language regions that are variably located. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. A bottom-up approach to identifying the maximum operational adaptive capacity of water resource systems to a changing climate

    Science.gov (United States)

    Culley, S.; Noble, S.; Yates, A.; Timbs, M.; Westra, S.; Maier, H. R.; Giuliani, M.; Castelletti, A.

    2016-09-01

    Many water resource systems have been designed assuming that the statistical characteristics of future inflows are similar to those of the historical record. This assumption is no longer valid due to large-scale changes in the global climate, potentially causing declines in water resource system performance, or even complete system failure. Upgrading system infrastructure to cope with climate change can require substantial financial outlay, so it might be preferable to optimize existing system performance when possible. This paper builds on decision scaling theory by proposing a bottom-up approach to designing optimal feedback control policies for a water system exposed to a changing climate. This approach not only describes optimal operational policies for a range of potential climatic changes but also enables an assessment of a system's upper limit of its operational adaptive capacity, beyond which upgrades to infrastructure become unavoidable. The approach is illustrated using the Lake Como system in Northern Italy—a regulated system with a complex relationship between climate and system performance. By optimizing system operation under different hydrometeorological states, it is shown that the system can continue to meet its minimum performance requirements for more than three times as many states as it can under current operations. Importantly, a single management policy, no matter how robust, cannot fully utilize existing infrastructure as effectively as an ensemble of flexible management policies that are updated as the climate changes.
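
    The bottom-up logic can be illustrated by stress-testing a system over a grid of hydrometeorological states and recording where performance stays above the required threshold; the boundary of that region marks the operational adaptive capacity. A schematic sketch (Python; the performance function, thresholds, and ranges are invented stand-ins, not the Lake Como model):

      import numpy as np

      # Invented stand-in for simulated system performance (0-1) as a function
      # of a temperature anomaly dT (C) and a precipitation multiplier fP.
      def performance(dT, fP, adapted_policy):
          base = 0.9 - 0.08 * dT + 0.5 * (fP - 1.0)
          bonus = 0.12 if adapted_policy else 0.0   # re-optimised operations
          return float(np.clip(base + bonus, 0.0, 1.0))

      threshold = 0.7
      states = [(dT, fP) for dT in np.arange(0.0, 4.5, 0.5)
                         for fP in np.arange(0.7, 1.21, 0.1)]

      ok_current = sum(performance(dT, fP, False) >= threshold for dT, fP in states)
      ok_adapted = sum(performance(dT, fP, True) >= threshold for dT, fP in states)
      print(f"states meeting requirements: {ok_current} (current policy) "
            f"vs {ok_adapted} (ensemble of adapted policies)")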

  7. Evaluation of multiple approaches to identify genome-wide polymorphisms in closely related genotypes of sweet cherry (Prunus avium L.)

    Directory of Open Access Journals (Sweden)

    Seanna Hewitt

    Identification of genetic polymorphisms and subsequent development of molecular markers is important for marker-assisted breeding of superior cultivars of economically important species. Sweet cherry (Prunus avium L.) is an economically important non-climacteric tree fruit crop in the Rosaceae family and has undergone a genetic bottleneck due to breeding, resulting in limited genetic diversity in the germplasm that is utilized for breeding new cultivars. Therefore, it is critical to recognize the best platforms for identifying genome-wide polymorphisms that can help identify, and consequently preserve, the diversity in a genetically constrained species. For the identification of polymorphisms in five closely related genotypes of sweet cherry, a gel-based approach (TRAP), reduced representation sequencing (TRAPseq), a 6k cherry SNParray, and whole genome sequencing (WGS) were evaluated. All platforms facilitated detection of polymorphisms among the genotypes with variable efficiency. In assessing multiple SNP detection platforms, this study has demonstrated that a combination of appropriate approaches is necessary for efficient polymorphism identification, especially between closely related cultivars of a species. The information generated in this study provides a valuable resource for future genetic and genomic studies in sweet cherry, and the insights gained from the evaluation of multiple approaches can be utilized for other closely related species with limited genetic diversity in the breeding germplasm. Keywords: Polymorphisms, Prunus avium, Next-generation sequencing, Target region amplification polymorphism (TRAP), Genetic diversity, SNParray, Reduced representation sequencing, Whole genome sequencing (WGS)
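
    Comparing detection platforms ultimately reduces to set operations on the variant calls each platform produces: the intersection gives cross-platform consensus and the differences show what each platform adds. A schematic sketch (Python; the variant identifiers are invented placeholders):

      # Invented placeholder variant calls, keyed (chromosome, position).
      calls = {
          "TRAP":     {("chr1", 10234), ("chr2", 88412)},
          "TRAPseq":  {("chr1", 10234), ("chr2", 88412), ("chr3", 5521),
                       ("chr5", 70410)},
          "SNParray": {("chr1", 10234), ("chr3", 5521), ("chr6", 91002)},
          "WGS":      {("chr1", 10234), ("chr2", 88412), ("chr3", 5521),
                       ("chr5", 70410), ("chr6", 91002), ("chr8", 3307)},
      }

      shared = set.intersection(*calls.values())
      print(f"detected by every platform: {sorted(shared)}")
      for name, variants in calls.items():
          others = set.union(*(v for n, v in calls.items() if n != name))
          print(f"{name}: {len(variants)} calls, {len(variants - others)} unique")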

  8. Measuring multi-joint stiffness during single movements: numerical validation of a novel time-frequency approach.

    Science.gov (United States)

    Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R

    2012-01-01

    This study presents and validates a Time-Frequency technique for measuring 2-dimensional multijoint arm stiffness throughout a single planar movement as well as during static posture. It is proposed as an alternative to current regressive methods, which require numerous repetitions to obtain average stiffness on a small segment of the hand trajectory. The method is based on the analysis of the reassigned spectrogram of the arm's response to impulsive perturbations and can estimate arm stiffness on a trial-by-trial basis. Analytic and empirical methods are first derived and tested through modal analysis on synthetic data. The technique's accuracy and robustness are assessed by modeling the estimation of stiffness time profiles changing at different rates.
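
    The underlying idea, recovering a time-varying natural frequency from the response to an impulsive perturbation and converting it to stiffness via k = m*omega^2, can be sketched with an ordinary spectrogram (the paper's reassigned spectrogram sharpens the time-frequency ridge). The sketch below (Python/SciPy) uses a synthetic single-degree-of-freedom response with invented parameters, not the multijoint arm model:

      import numpy as np
      from scipy.signal import spectrogram

      # Synthetic impulse response of a mass-spring system (invented values).
      m, k = 1.0, 4000.0                     # kg, N/m -> f0 = sqrt(k/m)/2pi ~ 10 Hz
      fs = 500.0
      t = np.arange(0, 4, 1 / fs)
      x = np.exp(-1.5 * t) * np.sin(np.sqrt(k / m) * t)   # lightly damped ringing

      # Track the dominant frequency over time and convert it to stiffness.
      f, tt, S = spectrogram(x, fs=fs, nperseg=500, noverlap=450)
      f_peak = f[S.argmax(axis=0)]           # ridge of the ordinary spectrogram
      k_est = m * (2 * np.pi * f_peak) ** 2  # k = m * omega^2, per time slice
      print(f"median stiffness estimate: {np.median(k_est):.0f} N/m (true {k:.0f})")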