WorldWideScience

Sample records for identify cis-expression quantitative

  1. Genome-wide haplotype analysis of cis expression quantitative trait loci in monocytes.

    Directory of Open Access Journals (Sweden)

    Sophie Garnier

Full Text Available In order to assess whether gene expression variability could be influenced by several SNPs acting in cis, either through additive or more complex haplotype effects, a systematic genome-wide search for cis haplotype expression quantitative trait loci (eQTL) was conducted in a sample of 758 individuals, part of the Cardiogenics Transcriptomic Study, for which genome-wide monocyte expression and GWAS data were available. 19,805 RNA probes were assessed for cis haplotypic regulation through investigation of ~2.1 × 10⁹ haplotypic combinations. 2,650 probes demonstrated haplotypic p-values more than 10⁴-fold smaller than the best single-SNP p-value. Replication of significant haplotype effects was tested for 412 probes for which SNPs (or proxies) that defined the detected haplotypes were available in the Gutenberg Health Study, composed of 1,374 individuals. At the Bonferroni-corrected level of 1.2 × 10⁻⁴ (≈0.05/412), 193 haplotypic signals replicated. 1000 Genomes imputation was then conducted, and 105 haplotypic signals still remained more informative than imputed SNPs. In-depth analysis of these 105 cis eQTL revealed that at 76 loci the genetic associations were compatible with additive effects of several SNPs, while for the 29 remaining regions the data could be compatible with a more complex haplotypic pattern. As 24 of the 105 cis eQTL have previously been reported as disease-associated loci, this work highlights the need for conducting haplotype-based and 1000 Genomes-imputed cis eQTL analyses before commencing functional studies at disease-associated loci.
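The two filtering steps described in this abstract reduce to simple arithmetic; a minimal sketch (the probe p-values below are hypothetical, not values from the study):

```python
def passes_haplotype_filter(hap_p, best_snp_p, fold=1e4):
    """Keep a probe when its haplotype p-value is more than `fold`
    times smaller than its best single-SNP p-value."""
    return hap_p < best_snp_p / fold

def bonferroni_threshold(alpha=0.05, n_tests=412):
    """Replication significance level used in the study: 0.05/412 ~ 1.2e-4."""
    return alpha / n_tests

# hypothetical probe: haplotype signal 10^6-fold stronger than the best SNP
print(passes_haplotype_filter(1e-12, 1e-6))   # True
print(round(bonferroni_threshold(), 6))       # 0.000121
```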

  2. Assessing quantitative EEG spectrograms to identify non-epileptic events.

    Science.gov (United States)

    Goenka, Ajay; Boro, Alexis; Yozawitz, Elissa

    2017-09-01

    To evaluate the sensitivity and specificity of quantitative EEG (QEEG) spectrograms in order to distinguish epileptic from non-epileptic events. Seventeen patients with paroxysmal non-epileptic events, captured during EEG monitoring, were retrospectively assessed using QEEG spectrograms. These patients were compared to a control group of 13 consecutive patients (ages 25-60 years) with epileptic seizures of similar semiology. Assessment of raw EEG was employed as the gold standard against which epileptic and non-epileptic events were validated. QEEG spectrograms, available using Persyst 12 EEG system integration software, were each assessed with respect to their usefulness to distinguish epileptic from non-epileptic seizures. The given spectrogram was interpreted as indicating a seizure if, at the time of the clinically identified event, it showed a visually significant change from baseline. Eighty-two clinically identified paroxysmal events were analysed (46 non-epileptic and 36 epileptic). The "seizure detector trend analysis" spectrogram correctly classified 33/46 (71%) non-epileptic events (no seizure indicated during a clinically identified event) vs. 29/36 (81%) epileptic seizures (seizure indicated during a clinically identified event) (p=0.013). Similarly, "rhythmicity spectrogram", FFT spectrogram, "asymmetry relative spectrogram", and integrated-amplitude EEG spectrogram detected 28/46 (61%), 30/46 (65%), 22/46 (48%) and 27/46 (59%) non-epileptic events vs. 27/36 (75%), 25/36 (69%), 25/36 (69%) and 27/36 (75%) epileptic events, respectively. High sensitivities and specificities for QEEG seizure detection analyses suggest that QEEG may have a role at the bedside to facilitate early differentiation between epileptic seizures and non-epileptic events in order to avoid unnecessary administration of antiepileptic drugs and possible iatrogenic consequences.
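Treating "seizure indicated during a clinically identified event" as the positive call, the per-spectrogram percentages above are standard sensitivity/specificity arithmetic; a small sketch using the seizure-detector-trend counts reported in the abstract:

```python
def sensitivity(true_pos, total_pos):
    """Fraction of epileptic seizures flagged by the spectrogram."""
    return true_pos / total_pos

def specificity(true_neg, total_neg):
    """Fraction of non-epileptic events with no seizure indicated."""
    return true_neg / total_neg

# counts for the "seizure detector trend analysis" spectrogram
sens = sensitivity(29, 36)   # 29/36 epileptic seizures detected
spec = specificity(33, 46)   # 33/46 non-epileptic events correctly negative
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")
```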

  3. A Quantitative Study Identifying Political Strategies Used by Principals of Dual Language Programs

    Science.gov (United States)

    Girard, Guadalupe

    2017-01-01

    Purpose. The purpose of this quantitative study was to identify the external and internal political strategies used by principals that allow them to successfully navigate the political environment surrounding dual language programs. Methodology. This quantitative study used descriptive research to collect, analyze, and report data that identified…

  4. Data in support of quantitative proteomics to identify potential virulence regulators in Paracoccidioides brasiliensis isolates

    Directory of Open Access Journals (Sweden)

    Alexandre Keiji Tashima

    2015-12-01

Full Text Available Species of the genus Paracoccidioides are the etiologic agents of paracoccidioidomycosis (PCM), a systemic mycosis endemic in Latin America. Few virulence factors have been identified in these fungi. This paper describes supporting data from the quantitative proteomics of attenuated and virulent Paracoccidioides brasiliensis isolates [1]. The protein compositions of two isolates of the Pb18 strain showing distinct infection profiles were quantitatively assessed by stable isotopic dimethyl labeling and proteomic analysis. The mass spectrometry data and the analysis dataset have been deposited in the ProteomeXchange Consortium via the PRIDE partner repository with identifier PXD000804.

  5. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  6. Using Extreme Phenotype Sampling to Identify the Rare Causal Variants of Quantitative Traits in Association Studies

    OpenAIRE

    Li, Dalin; Lewinger, Juan Pablo; Gauderman, William J.; Murcray, Cassandra Elizabeth; Conti, David

    2011-01-01

Variants identified in recent genome-wide association studies based on the common-disease common-variant hypothesis are far from fully explaining the heritability of complex traits. Rare variants may, in part, explain some of the missing heritability. Here, we explored the advantage of extreme phenotype sampling in rare-variant analysis and refined this design framework for future large-scale association studies on quantitative traits. We first proposed a power calculation approach fo...

  7. A comparison of visual and quantitative methods to identify interstitial lung abnormalities

    OpenAIRE

Kliment, Corrine R.; Araki, Tetsuro; Doyle, Tracy J.; Gao, Wei; Dupuis, Josée; Latourelle, Jeanne C.; Zazueta, Oscar E.; Fernandez, Isis E.; Nishino, Mizuki; Okajima, Yuka; Ross, James C.; Estépar, Raúl San José; Diaz, Alejandro A.; Lederer, David J.; Schwartz, David A.

    2015-01-01

Background: Evidence suggests that individuals with interstitial lung abnormalities (ILA) on a chest computed tomogram (CT) may have an increased risk of developing a clinically significant interstitial lung disease (ILD). Although methods used to identify individuals with ILA on chest CT have included both automated quantitative and qualitative visual inspection methods, there has been no direct comparison between these two methods. To investigate this relationship, we created lung density met...

  8. Identifying specific protein interaction partners using quantitative mass spectrometry and bead proteomes

    Science.gov (United States)

    Trinkle-Mulcahy, Laura; Boulon, Séverine; Lam, Yun Wah; Urcia, Roby; Boisvert, François-Michel; Vandermoere, Franck; Morrice, Nick A.; Swift, Sam; Rothbauer, Ulrich; Leonhardt, Heinrich; Lamond, Angus

    2008-01-01

    The identification of interaction partners in protein complexes is a major goal in cell biology. Here we present a reliable affinity purification strategy to identify specific interactors that combines quantitative SILAC-based mass spectrometry with characterization of common contaminants binding to affinity matrices (bead proteomes). This strategy can be applied to affinity purification of either tagged fusion protein complexes or endogenous protein complexes, illustrated here using the well-characterized SMN complex as a model. GFP is used as the tag of choice because it shows minimal nonspecific binding to mammalian cell proteins, can be quantitatively depleted from cell extracts, and allows the integration of biochemical protein interaction data with in vivo measurements using fluorescence microscopy. Proteins binding nonspecifically to the most commonly used affinity matrices were determined using quantitative mass spectrometry, revealing important differences that affect experimental design. These data provide a specificity filter to distinguish specific protein binding partners in both quantitative and nonquantitative pull-down and immunoprecipitation experiments. PMID:18936248

  9. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  10. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  11. Identifying Contributors of DNA Mixtures by Means of Quantitative Information of STR Typing

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2012-01-01

Estimating the weight of evidence in forensic genetics is often done in terms of a likelihood ratio, LR. The LR evaluates the probability of the observed evidence under competing hypotheses. Most often, probabilities used in the LR only consider the evidence from the genomic variation identified using polymorphic genetic markers. However, modern typing techniques supply additional quantitative data, which contain very important information about the observed evidence. This is particularly true for cases of DNA mixtures, where more than one individual has contributed to the observed biological stain. This article presents a method for including the quantitative information of short tandem repeat (STR) DNA mixtures in the LR. Also, an efficient algorithmic method for finding the best matching combination of DNA mixture profiles is derived and implemented in an on-line tool for two...

  12. Identifying the genes underlying quantitative traits: a rationale for the QTN programme.

    Science.gov (United States)

    Lee, Young Wha; Gould, Billie A; Stinchcombe, John R

    2014-01-01

    The goal of identifying the genes or even nucleotides underlying quantitative and adaptive traits has been characterized as the 'QTN programme' and has recently come under severe criticism. Part of the reason for this criticism is that much of the QTN programme has asserted that finding the genes and nucleotides for adaptive and quantitative traits is a fundamental goal, without explaining why it is such a hallowed goal. Here we outline motivations for the QTN programme that offer general insight, regardless of whether QTNs are of large or small effect, and that aid our understanding of the mechanistic dynamics of adaptive evolution. We focus on five areas: (i) vertical integration of insight across different levels of biological organization, (ii) genetic parallelism and the role of pleiotropy in shaping evolutionary dynamics, (iii) understanding the forces maintaining genetic variation in populations, (iv) distinguishing between adaptation from standing variation and new mutation, and (v) the role of genomic architecture in facilitating adaptation. We argue that rather than abandoning the QTN programme, we should refocus our efforts on topics where molecular data will be the most effective for testing hypotheses about phenotypic evolution.

  13. Link-based quantitative methods to identify differentially coexpressed genes and gene Pairs

    Directory of Open Access Journals (Sweden)

    Ye Zhi-Qiang

    2011-08-01

Full Text Available Abstract Background Differential coexpression analysis (DCEA) is increasingly used for investigating the global transcriptional mechanisms underlying phenotypic changes. Current DCEA methods mostly adopt a gene connectivity-based strategy to estimate differential coexpression, which is characterized by comparing the numbers of gene neighbors in different coexpression networks. Although it simplifies the calculation, this strategy mixes up the identities of a gene's different coexpression neighbors and fails to distinguish significant differential coexpression changes from trivial ones. In particular, correlation reversal is easily missed, although it probably indicates remarkable biological significance. Results We developed two link-based quantitative methods, DCp and DCe, to identify differentially coexpressed genes and gene pairs (links). By uniquely exploiting the quantitative coexpression change of each gene pair in the coexpression networks, both methods proved superior to currently popular methods in simulation studies. Re-mining a publicly available type 2 diabetes (T2D) expression dataset from the perspective of differential coexpression analysis led to discoveries beyond those from differential expression analysis. Conclusions This work pointed out a critical weakness of current popular DCEA methods and proposed two link-based DCEA algorithms that will contribute to the development of DCEA and help extend it to a broader spectrum.
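The link-level idea (quantify the change of each gene pair, rather than count neighbors) can be illustrated with plain Pearson correlations. This toy sketch is not the published DCp/DCe algorithm, only the underlying principle, including the correlation-reversal case:

```python
import numpy as np

def link_change(g1_a, g2_a, g1_b, g2_b):
    """Coexpression change of one gene pair between conditions a and b:
    absolute difference of Pearson correlations, plus a reversal flag
    set when the correlation changes sign."""
    r_a = np.corrcoef(g1_a, g2_a)[0, 1]
    r_b = np.corrcoef(g1_b, g2_b)[0, 1]
    return abs(r_a - r_b), bool(r_a * r_b < 0)

# toy expression profiles: positively correlated in condition a,
# correlation-reversed in condition b
g1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
g2 = np.array([1.1, 1.9, 3.2, 3.8, 5.0])
dc, reversed_link = link_change(g1, g2, g1, -g2)
print(dc > 1.9, reversed_link)   # True True
```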

  14. Identifying the genes underlying quantitative traits: a rationale for the QTN programme

    Science.gov (United States)

    Lee, Young Wha; Gould, Billie A.; Stinchcombe, John R.

    2014-01-01

    The goal of identifying the genes or even nucleotides underlying quantitative and adaptive traits has been characterized as the ‘QTN programme’ and has recently come under severe criticism. Part of the reason for this criticism is that much of the QTN programme has asserted that finding the genes and nucleotides for adaptive and quantitative traits is a fundamental goal, without explaining why it is such a hallowed goal. Here we outline motivations for the QTN programme that offer general insight, regardless of whether QTNs are of large or small effect, and that aid our understanding of the mechanistic dynamics of adaptive evolution. We focus on five areas: (i) vertical integration of insight across different levels of biological organization, (ii) genetic parallelism and the role of pleiotropy in shaping evolutionary dynamics, (iii) understanding the forces maintaining genetic variation in populations, (iv) distinguishing between adaptation from standing variation and new mutation, and (v) the role of genomic architecture in facilitating adaptation. We argue that rather than abandoning the QTN programme, we should refocus our efforts on topics where molecular data will be the most effective for testing hypotheses about phenotypic evolution. PMID:24790125

  15. Quantitative Proteomics Identifies Activation of Hallmark Pathways of Cancer in Patient Melanoma.

    Science.gov (United States)

    Byrum, Stephanie D; Larson, Signe K; Avaritt, Nathan L; Moreland, Linley E; Mackintosh, Samuel G; Cheung, Wang L; Tackett, Alan J

    2013-03-01

    Molecular pathways regulating melanoma initiation and progression are potential targets of therapeutic development for this aggressive cancer. Identification and molecular analysis of these pathways in patients has been primarily restricted to targeted studies on individual proteins. Here, we report the most comprehensive analysis of formalin-fixed paraffin-embedded human melanoma tissues using quantitative proteomics. From 61 patient samples, we identified 171 proteins varying in abundance among benign nevi, primary melanoma, and metastatic melanoma. Seventy-three percent of these proteins were validated by immunohistochemistry staining of malignant melanoma tissues from the Human Protein Atlas database. Our results reveal that molecular pathways involved with tumor cell proliferation, motility, and apoptosis are mis-regulated in melanoma. These data provide the most comprehensive proteome resource on patient melanoma and reveal insight into the molecular mechanisms driving melanoma progression.

  16. Quantitative proteomics identify molecular targets that are crucial in larval settlement and metamorphosis of Bugula neritina

    KAUST Repository

    Zhang, Huoming

    2011-01-07

    The marine invertebrate Bugula neritina has a biphasic life cycle that consists of a swimming larval stage and a sessile juvenile and adult stage. The attachment of larvae to the substratum and their subsequent metamorphosis have crucial ecological consequences. Despite many studies on this species, little is known about the molecular mechanism of these processes. Here, we report a comparative study of swimming larvae and metamorphosing individuals at 4 and 24 h postattachment using label-free quantitative proteomics. We identified more than 1100 proteins at each stage, 61 of which were differentially expressed. Specifically, proteins involved in energy metabolism and structural molecules were generally down-regulated, whereas proteins involved in transcription and translation, the extracellular matrix, and calcification were strongly up-regulated during metamorphosis. Many tightly regulated novel proteins were also identified. Subsequent analysis of the temporal and spatial expressions of some of the proteins and an assay of their functions indicated that they may have key roles in metamorphosis of B. neritina. These findings not only provide molecular evidence with which to elucidate the substantial changes in morphology and physiology that occur during larval attachment and metamorphosis but also identify potential targets for antifouling treatment. © 2011 American Chemical Society.

  17. Incorporation of unique molecular identifiers in TruSeq adapters improves the accuracy of quantitative sequencing.

    Science.gov (United States)

    Hong, Jungeui; Gresham, David

    2017-11-01

    Quantitative analysis of next-generation sequencing (NGS) data requires discriminating duplicate reads generated by PCR from identical molecules that are of unique origin. Typically, PCR duplicates are identified as sequence reads that align to the same genomic coordinates using reference-based alignment. However, identical molecules can be independently generated during library preparation. Misidentification of these molecules as PCR duplicates can introduce unforeseen biases during analyses. Here, we developed a cost-effective sequencing adapter design by modifying Illumina TruSeq adapters to incorporate a unique molecular identifier (UMI) while maintaining the capacity to undertake multiplexed, single-index sequencing. Incorporation of UMIs into TruSeq adapters (TrUMIseq adapters) enables identification of bona fide PCR duplicates as identically mapped reads with identical UMIs. Using TrUMIseq adapters, we show that accurate removal of PCR duplicates results in improved accuracy of both allele frequency (AF) estimation in heterogeneous populations using DNA sequencing and gene expression quantification using RNA-Seq.
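The deduplication rule the UMI adapters enable can be sketched in a few lines; the read tuples here are a hypothetical minimal representation, not the TrUMIseq processing pipeline itself:

```python
def dedup_by_umi(reads):
    """Keep one read per (chrom, pos, strand, umi) key: identical
    mapping + identical UMI = PCR duplicate; identical mapping with
    a different UMI is treated as an independent molecule."""
    seen = set()
    kept = []
    for chrom, pos, strand, umi in reads:
        key = (chrom, pos, strand, umi)
        if key not in seen:
            seen.add(key)
            kept.append((chrom, pos, strand, umi))
    return kept

reads = [
    ("chr1", 100, "+", "ACGT"),  # original molecule
    ("chr1", 100, "+", "ACGT"),  # PCR duplicate -> removed
    ("chr1", 100, "+", "TTAG"),  # same coordinates, new UMI -> kept
]
print(len(dedup_by_umi(reads)))  # 2
```

Coordinate-only deduplication would collapse all three reads to one, discarding the independently generated third molecule and biasing allele-frequency estimates.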

  18. A quantitative metric to identify critical elements within seafood supply networks.

    Science.gov (United States)

    Plagányi, Éva E; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda

    2014-01-01

    A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.
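A toy rendering of the scoring idea (large throughput plus high connectivity); the published SCI formula is more elaborate, so the weighting and the chain elements below are illustrative assumptions only:

```python
def sci_scores(elements):
    """elements: name -> (throughput, n_links). Score each element by
    throughput weighted by its connectivity; the chain-level summary
    metric is the sum of the element scores."""
    scores = {name: tput * links for name, (tput, links) in elements.items()}
    return scores, sum(scores.values())

chain = {  # hypothetical lobster-chain elements, equal throughput
    "fisher": (10.0, 1),
    "processor": (10.0, 3),
    "airport": (10.0, 4),
    "consumer": (10.0, 2),
}
scores, total = sci_scores(chain)
print(max(scores, key=scores.get))  # airport
```

With equal throughput, the best-connected element (the airport) dominates the scores, mirroring the abstract's identification of airports and processors as critical elements.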

  19. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant.

    Directory of Open Access Journals (Sweden)

    William S Beatty

Full Text Available The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods; however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications of using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual-level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in the migration chronologies identified by each approach were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual-level approaches. Although we did not detect differences in migration parameters between the approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result...
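The net-displacement approach can be caricatured without the nonlinear model: compute each fix's displacement from the first fix and take the day it first crosses half the seasonal maximum as a crude stand-in for a fitted curve's midpoint (real analyses use geographic distance and fit, e.g., logistic curves of displacement versus time; the track below is invented):

```python
import math

def net_displacement(track):
    """Displacement of each fix from the first fix (planar toy;
    real tracks need great-circle distance)."""
    x0, y0 = track[0]
    return [math.hypot(x - x0, y - y0) for x, y in track]

def crude_midpoint(days, disp):
    """First day on which displacement exceeds half its maximum."""
    half = max(disp) / 2
    return next(d for d, nd in zip(days, disp) if nd >= half)

days = [0, 10, 20, 30, 40]
track = [(0, 0), (0, 50), (0, 400), (0, 900), (0, 1000)]
print(crude_midpoint(days, net_displacement(track)))  # 30
```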

  20. Quantitative and qualitative approaches to identifying migration chronology in a continental migrant

    Science.gov (United States)

    Beatty, William S.; Kesler, Dylan C.; Webb, Elisabeth B.; Raedeke, Andrew H.; Naylor, Luke W.; Humburg, Dale D.

    2013-01-01

    The degree to which extrinsic factors influence migration chronology in North American waterfowl has not been quantified, particularly for dabbling ducks. Previous studies have examined waterfowl migration using various methods, however, quantitative approaches to define avian migration chronology over broad spatio-temporal scales are limited, and the implications for using different approaches have not been assessed. We used movement data from 19 female adult mallards (Anas platyrhynchos) equipped with solar-powered global positioning system satellite transmitters to evaluate two individual level approaches for quantifying migration chronology. The first approach defined migration based on individual movements among geopolitical boundaries (state, provincial, international), whereas the second method modeled net displacement as a function of time using nonlinear models. Differences in migration chronologies identified by each of the approaches were examined with analysis of variance. The geopolitical method identified mean autumn migration midpoints at 15 November 2010 and 13 November 2011, whereas the net displacement method identified midpoints at 15 November 2010 and 14 November 2011. The mean midpoints for spring migration were 3 April 2011 and 20 March 2012 using the geopolitical method and 31 March 2011 and 22 March 2012 using the net displacement method. The duration, initiation date, midpoint, and termination date for both autumn and spring migration did not differ between the two individual level approaches. Although we did not detect differences in migration parameters between the different approaches, the net displacement metric offers broad potential to address questions in movement ecology for migrating species. Ultimately, an objective definition of migration chronology will allow researchers to obtain a comprehensive understanding of the extrinsic factors that drive migration at the individual and population levels. As a result, targeted

  1. A Quantitative Metric to Identify Critical Elements within Seafood Supply Networks

    Science.gov (United States)

    Plagányi, Éva E.; van Putten, Ingrid; Thébaud, Olivier; Hobday, Alistair J.; Innes, James; Lim-Camacho, Lilly; Norman-López, Ana; Bustamante, Rodrigo H.; Farmery, Anna; Fleming, Aysha; Frusher, Stewart; Green, Bridget; Hoshino, Eriko; Jennings, Sarah; Pecl, Gretta; Pascoe, Sean; Schrobback, Peggy; Thomas, Linda

    2014-01-01

    A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical. PMID:24633147

  2. A quantitative metric to identify critical elements within seafood supply networks.

    Directory of Open Access Journals (Sweden)

    Éva E Plagányi

Full Text Available A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates, as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions as well as under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to an additional four real-world Australian commercial fishery and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically-based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.

  3. Quantitative high-throughput screen identifies inhibitors of the Schistosoma mansoni redox cascade.

    Directory of Open Access Journals (Sweden)

    Anton Simeonov

    2008-01-01

    Full Text Available Schistosomiasis is a tropical disease associated with high morbidity and mortality, currently affecting over 200 million people worldwide. Praziquantel is the only drug used to treat the disease, and with its increased use the probability of developing drug resistance has grown significantly. The Schistosoma parasites can survive for up to decades in the human host due in part to a unique set of antioxidant enzymes that continuously degrade the reactive oxygen species produced by the host's innate immune response. Two principal components of this defense system have recently been identified in S. mansoni as thioredoxin/glutathione reductase (TGR) and peroxiredoxin (Prx), and as such these enzymes present attractive new targets for anti-schistosomiasis drug development. Inhibition of TGR/Prx activity was screened in a dual-enzyme format, with reducing equivalents being transferred from NADPH to glutathione via a TGR-catalyzed reaction and then to hydrogen peroxide via a Prx-catalyzed step. A fully automated quantitative high-throughput screening (qHTS) experiment was performed against a collection of 71,028 compounds tested as 7- to 15-point concentration series at 5 microL reaction volume in 1536-well plate format. In order to generate a robust data set and to minimize the effect of compound autofluorescence, apparent reaction rates derived from a kinetic read were utilized instead of end-point measurements. Actives identified from the screen, along with previously untested analogues, were subjected to confirmatory experiments using the screening assay and subsequently against the individual targets in secondary assays. Several novel active series were identified which inhibited TGR at a range of potencies, with IC50 values ranging from micromolar to the assay response limit (approximately 25 nM). This is, to our knowledge, the first report of a large-scale HTS to identify lead compounds for a helminthic disease, and provides a paradigm that can be used to jump
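
    The IC50 values reported above come from concentration-response series like the 7- to 15-point dilutions described. A minimal sketch of how an IC50 can be read off such a series, assuming a standard Hill-type inhibition curve and log-linear interpolation; the function names and the assumed IC50 of 1 uM are illustrative, not taken from the screen:

    ```python
    import math

    def hill(conc, ic50, slope=1.0):
        """Fractional enzyme activity remaining at a given inhibitor concentration."""
        return 1.0 / (1.0 + (conc / ic50) ** slope)

    def estimate_ic50(concs, activities):
        """Log-linear interpolation of the concentration giving 50% activity.

        concs must be in ascending order; returns None if 50% is never crossed.
        """
        points = list(zip(concs, activities))
        for (c1, a1), (c2, a2) in zip(points, points[1:]):
            if a1 >= 0.5 >= a2:
                f = (a1 - 0.5) / (a1 - a2)
                return 10 ** (math.log10(c1) + f * (math.log10(c2) - math.log10(c1)))
        return None

    if __name__ == "__main__":
        # Hypothetical 7-point dilution series (micromolar) generated from an
        # assumed IC50 of 1 uM; interpolation recovers that value.
        concs = [0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0]
        acts = [hill(c, ic50=1.0) for c in concs]
        print(round(estimate_ic50(concs, acts), 2))
    ```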

  4. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    International Nuclear Information System (INIS)

    Coupaud, Sylvie; McLean, Alan N.; Allan, David B.

    2009-01-01

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments in the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites, at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than in thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than males or those with thoracic SCI, respectively. (orig.)

  5. Role of peripheral quantitative computed tomography in identifying disuse osteoporosis in paraplegia

    Energy Technology Data Exchange (ETDEWEB)

    Coupaud, Sylvie [University of Glasgow, Centre for Rehabilitation Engineering, Department of Mechanical Engineering, Glasgow (United Kingdom); Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom); McLean, Alan N.; Allan, David B. [Southern General Hospital, Queen Elizabeth National Spinal Injuries Unit, Glasgow (United Kingdom)

    2009-10-15

    Disuse osteoporosis is a major long-term health consequence of spinal cord injury (SCI) that still needs to be addressed. Its management in SCI should begin with accurate diagnosis, followed by targeted treatments in the most vulnerable subgroups. We present data quantifying disuse osteoporosis in a cross-section of the Scottish paraplegic population to identify subgroups with lowest bone mineral density (BMD). Forty-seven people with chronic SCI at levels T2-L2 were scanned using peripheral quantitative computed tomography at four tibial sites and two femoral sites, at the Queen Elizabeth National Spinal Injuries Unit, Glasgow (UK). At the distal epiphyses, trabecular BMD (BMDtrab), total BMD, total bone cross-sectional area (CSA) and bone mineral content (BMC) were determined. In the diaphyses, cortical BMD, total bone CSA, cortical CSA and BMC were calculated. Bone, muscle and fat CSAs were estimated in the lower leg and thigh. BMDtrab decreased exponentially with time since injury at different rates in the tibia and femur. At most sites, female paraplegics had significantly lower BMC, total bone CSA and muscle CSA than male paraplegics. Subjects with lumbar SCI tended to have lower bone values and smaller muscle CSAs than in thoracic SCI. At the distal epiphyses of the tibia and femur, there is generally a rapid and extensive reduction in BMDtrab after SCI. Female subjects, and those with lumbar SCI, tend to have lower bone values than males or those with thoracic SCI, respectively. (orig.)

  6. Trichloroethylene and Cancer: Systematic and Quantitative Review of Epidemiologic Evidence for Identifying Hazards

    Directory of Open Access Journals (Sweden)

    Cheryl Siegel Scott

    2011-11-01

    Full Text Available We conducted a meta-analysis focusing on studies with high potential for trichloroethylene (TCE) exposure to provide quantitative evaluations of the evidence for associations between TCE exposure and kidney, liver, and non-Hodgkin lymphoma (NHL) cancers. A systematic review documenting essential design features, exposure assessment approaches, statistical analyses, and potential sources of confounding and bias identified twenty-four cohort and case-control studies on TCE and the three cancers of interest with high potential for exposure, including five recently published case-control studies of kidney cancer or NHL. Fixed- and random-effects models were fitted to the data on overall exposure and on the highest exposure group. Sensitivity analyses examined the influence of individual studies and of alternative risk estimate selections. For overall TCE exposure and kidney cancer, the summary relative risk (RRm) estimate from the random effects model was 1.27 (95% CI: 1.13, 1.43), with a higher RRm for the highest exposure groups (1.58, 95% CI: 1.28, 1.96). The RRm estimates were not overly sensitive to alternative risk estimate selections or to removal of an individual study. There was no apparent heterogeneity or publication bias. For NHL, RRm estimates for overall exposure and for the highest exposure group, respectively, were 1.23 (95% CI: 1.07, 1.42) and 1.43 (95% CI: 1.13, 1.82) and, for liver cancer, 1.29 (95% CI: 1.07, 1.56) and 1.28 (95% CI: 0.93, 1.77). Our findings provide strong support for a causal association between TCE exposure and kidney cancer. The support is strong but less robust for NHL, where issues of study heterogeneity, potential publication bias, and weaker exposure-response results contribute uncertainty, and more limited for liver cancer, where only cohort studies with small numbers of cases were available.
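
    The summary relative risks above come from a random-effects model. A minimal sketch of DerSimonian-Laird random-effects pooling from per-study relative risks and 95% CIs, a standard method for this kind of meta-analysis; the three input studies below are hypothetical, not the paper's data:

    ```python
    import math

    def random_effects_pool(rrs, lcls, ucls):
        """DerSimonian-Laird pooling of relative risks given 95% CI bounds.

        Returns (pooled RR, lower 95% bound, upper 95% bound).
        """
        y = [math.log(r) for r in rrs]                       # log relative risks
        se = [(math.log(u) - math.log(l)) / (2 * 1.96)       # SE from CI width
              for l, u in zip(lcls, ucls)]
        w = [1 / s ** 2 for s in se]                         # fixed-effect weights
        ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)              # between-study variance
        wr = [1 / (s ** 2 + tau2) for s in se]               # random-effects weights
        pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
        se_pooled = math.sqrt(1 / sum(wr))
        return (math.exp(pooled),
                math.exp(pooled - 1.96 * se_pooled),
                math.exp(pooled + 1.96 * se_pooled))

    if __name__ == "__main__":
        # Hypothetical per-study RRs with 95% CIs (illustrative only)
        rr, lo, hi = random_effects_pool([1.1, 1.4, 1.3],
                                         [0.8, 1.0, 0.9],
                                         [1.5, 2.0, 1.9])
        print(round(rr, 2), round(lo, 2), round(hi, 2))
    ```

    With homogeneous inputs the between-study variance collapses to zero and the result matches the fixed-effect pooled estimate, which is why the sensitivity analyses in the record compare both model families.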

  7. Trichloroethylene and Cancer: Systematic and Quantitative Review of Epidemiologic Evidence for Identifying Hazards

    Science.gov (United States)

    Scott, Cheryl Siegel; Jinot, Jennifer

    2011-01-01

    We conducted a meta-analysis focusing on studies with high potential for trichloroethylene (TCE) exposure to provide quantitative evaluations of the evidence for associations between TCE exposure and kidney, liver, and non-Hodgkin lymphoma (NHL) cancers. A systematic review documenting essential design features, exposure assessment approaches, statistical analyses, and potential sources of confounding and bias identified twenty-four cohort and case-control studies on TCE and the three cancers of interest with high potential for exposure, including five recently published case-control studies of kidney cancer or NHL. Fixed- and random-effects models were fitted to the data on overall exposure and on the highest exposure group. Sensitivity analyses examined the influence of individual studies and of alternative risk estimate selections. For overall TCE exposure and kidney cancer, the summary relative risk (RRm) estimate from the random effects model was 1.27 (95% CI: 1.13, 1.43), with a higher RRm for the highest exposure groups (1.58, 95% CI: 1.28, 1.96). The RRm estimates were not overly sensitive to alternative risk estimate selections or to removal of an individual study. There was no apparent heterogeneity or publication bias. For NHL, RRm estimates for overall exposure and for the highest exposure group, respectively, were 1.23 (95% CI: 1.07, 1.42) and 1.43 (95% CI: 1.13, 1.82) and, for liver cancer, 1.29 (95% CI: 1.07, 1.56) and 1.28 (95% CI: 0.93, 1.77). Our findings provide strong support for a causal association between TCE exposure and kidney cancer. The support is strong but less robust for NHL, where issues of study heterogeneity, potential publication bias, and weaker exposure-response results contribute uncertainty, and more limited for liver cancer, where only cohort studies with small numbers of cases were available. PMID:22163205

  8. A genome-wide association study identifies protein quantitative trait loci (pQTLs).

    Directory of Open Access Journals (Sweden)

    David Melzer

    2008-05-01

    Full Text Available There is considerable evidence that human genetic variation influences gene expression. Genome-wide studies have revealed that mRNA levels are associated with genetic variation in or close to the gene coding for those mRNA transcripts (cis effects) and elsewhere in the genome (trans effects). The role of genetic variation in determining protein levels has not been systematically assessed. Using a genome-wide association approach we show that common genetic variation influences levels of clinically relevant proteins in human serum and plasma. We evaluated the role of 496,032 polymorphisms on levels of 42 proteins measured in 1200 fasting individuals from the population-based InCHIANTI study. Proteins included insulin, several interleukins, adipokines, chemokines, and liver function markers that are implicated in many common diseases including metabolic, inflammatory, and infectious conditions. We identified eight cis effects, including variants in or near the IL6R (p = 1.8x10^-57), CCL4L1 (p = 3.9x10^-21), IL18 (p = 6.8x10^-13), LPA (p = 4.4x10^-10), GGT1 (p = 1.5x10^-7), SHBG (p = 3.1x10^-7), CRP (p = 6.4x10^-6) and IL1RN (p = 7.3x10^-6) genes, all associated with their respective protein products with effect sizes ranging from 0.19 to 0.69 standard deviations per allele. Mechanisms implicated include altered rates of cleavage of bound to unbound soluble receptor (IL6R), altered secretion rates of different sized proteins (LPA), variation in gene copy number (CCL4L1) and altered transcription (GGT1). We identified one novel trans effect: an association between ABO blood group and tumour necrosis factor alpha (TNF-alpha) levels (p = 6.8x10^-40), but this finding was not present when TNF-alpha was measured using a different assay, or in a second study, suggesting an assay-specific association. Our results show that protein levels share some of the features of the genetics of gene expression. These include the presence of strong genetic effects in cis

  9. Protein quantitative trait locus study in obesity during weight-loss identifies a leptin regulator

    DEFF Research Database (Denmark)

    Carayol, Jérôme; Chabert, Christian; Di Cara, Alessandro

    2017-01-01

    of an organism. Proteome analysis especially can provide new insights into the molecular mechanisms of complex traits like obesity. The role of genetic variation in determining protein level variation has not been assessed in obesity. To address this, we design a large-scale protein quantitative trait locus (p...

  10. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    KAUST Repository

    Marquet, P.

    2016-05-03

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, and is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.
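
    The coupling of the QPS to thickness and refractive index described above follows the standard quantitative phase imaging relation phi = (2*pi/lambda) * (n_cell - n_medium) * d. A minimal sketch inverting that relation for thickness; the function name and the sample refractive indices are illustrative assumptions, not values from the record:

    ```python
    import math

    def phase_to_thickness(phase_rad, wavelength_nm, n_cell, n_medium):
        """Invert phi = (2*pi/lambda) * (n_cell - n_medium) * d for thickness d (nm)."""
        return phase_rad * wavelength_nm / (2 * math.pi * (n_cell - n_medium))

    if __name__ == "__main__":
        # A full 2*pi phase shift at 682 nm with an assumed index contrast of 0.03
        d_nm = phase_to_thickness(2 * math.pi, 682.0, n_cell=1.37, n_medium=1.34)
        print(round(d_nm))
    ```

    The same ambiguity the record alludes to is visible here: a measured phase alone cannot separate thickness from refractive index, so one of the two must be known or decoupled experimentally.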

  11. iTRAQ-Based Quantitative Proteomics Identifies Potential Regulatory Proteins Involved in Chicken Eggshell Brownness.

    Directory of Open Access Journals (Sweden)

    Guangqi Li

    Full Text Available Brown eggs are popular in many countries and consumers regard eggshell brownness as an important indicator of egg quality. However, the potential regulatory proteins and detailed molecular mechanisms regulating eggshell brownness have yet to be clearly defined. In the present study, we performed quantitative proteomics analysis with iTRAQ technology in the shell gland epithelium of hens laying dark and light brown eggs to investigate the candidate proteins and molecular mechanisms underlying variation in chicken eggshell brownness. The results indicated 147 differentially expressed proteins between these two groups, among which 65 and 82 proteins were significantly up-regulated in the light and dark groups, respectively. Functional analysis indicated that in the light group, the down-regulated iron-sulfur cluster assembly protein (Iba57) would decrease the synthesis of protoporphyrin IX; furthermore, the up-regulated solute carrier family 25 (mitochondrial carrier; adenine nucleotide translocator), member 5 protein (SLC25A5) and the down-regulated translocator protein (TSPO) would lead to increased amounts of protoporphyrin IX transported into the mitochondrial matrix to form heme with iron, which is supplied by ovotransferrin protein (TF). In other words, chickens from the light group produce less protoporphyrin IX, which is mainly used for heme synthesis. Therefore, the exported protoporphyrin IX available for eggshell deposition and brownness is reduced in the light group. The current study provides valuable information to elucidate variation of chicken eggshell brownness, and demonstrates the feasibility and sensitivity of iTRAQ-based quantitative proteomics analysis in providing useful insights into the molecular mechanisms underlying brown eggshell pigmentation.

  12. Quantitative Proteomics Analysis Identifies Mitochondria as Therapeutic Targets of Multidrug-Resistance in Ovarian Cancer

    Science.gov (United States)

    Chen, Xiulan; Wei, Shasha; Ma, Ying; Lu, Jie; Niu, Gang; Xue, Yanhong; Chen, Xiaoyuan; Yang, Fuquan

    2014-01-01

    Doxorubicin is a widely used chemotherapeutic agent for the treatment of a variety of solid tumors. However, resistance to this anticancer drug is a major obstacle to the effective treatment of tumors. As mitochondria play important roles in cell life and death, we anticipated that mitochondria may be related to drug resistance. Here, a stable isotope labeling by amino acids in cell culture (SILAC)-based quantitative proteomic strategy was applied to compare mitochondrial protein expression in doxorubicin-sensitive OVCAR8 cells and their doxorubicin-resistant variant, NCI_ADR/RES cells. A total of 2085 proteins were quantified, of which 122 proteins displayed significant changes in the NCI_ADR/RES cells. These proteins participate in a variety of cell processes including cell apoptosis, substance metabolism, transport, detoxification and drug metabolism. qRT-PCR and western blotting were then applied to validate the differentially expressed proteins quantified by SILAC. Further functional studies with RNAi demonstrated that TOP1MT, a mitochondrial protein that participates in DNA repair, was involved in doxorubicin resistance in NCI_ADR/RES cells. Besides the proteomic study, electron microscopy and fluorescence analysis also showed that mitochondrial morphology and localization were greatly altered in NCI_ADR/RES cells. Mitochondrial membrane potential was also decreased in NCI_ADR/RES cells. All these results indicate that mitochondrial function is impaired in doxorubicin-resistant cells and that mitochondria play an important role in doxorubicin resistance. This research provides new information about doxorubicin resistance, indicating that mitochondria could be therapeutic targets of doxorubicin resistance in ovarian cancer cells. PMID:25285166

  13. Quantitative high-throughput screening identifies 8-hydroxyquinolines as cell-active histone demethylase inhibitors.

    Directory of Open Access Journals (Sweden)

    Oliver N F King

    2010-11-01

    Full Text Available Small molecule modulators of epigenetic processes are currently sought as basic probes for biochemical mechanisms, and as starting points for development of therapeutic agents. N(ε)-Methylation of lysine residues on histone tails is one of a number of post-translational modifications that together enable transcriptional regulation. Histone lysine demethylases antagonize the action of histone methyltransferases in a site- and methylation state-specific manner. N(ε)-Methyllysine demethylases that use 2-oxoglutarate as co-factor are associated with diverse human diseases, including cancer, inflammation and X-linked mental retardation; they are proposed as targets for the therapeutic modulation of transcription. There are few reports on the identification of templates that are amenable to development as potent inhibitors in vivo, and large diverse collections have yet to be exploited for the discovery of demethylase inhibitors. High-throughput screening of a ∼236,000-member collection of diverse molecules arrayed as dilution series was used to identify inhibitors of the JMJD2 (KDM4) family of 2-oxoglutarate-dependent histone demethylases. Initial screening hits were prioritized by a combination of cheminformatics, counterscreening using a coupled assay enzyme, and orthogonal confirmatory detection of inhibition by mass spectrometric assays. Follow-up studies were carried out on one of the series identified, the 8-hydroxyquinolines, which were shown by crystallographic analyses to inhibit by binding to the active-site Fe(II) and to modulate demethylation at the H3K9 locus in a cell-based assay. These studies demonstrate that diverse compound screening can yield novel inhibitors of 2-oxoglutarate-dependent histone demethylases and provide starting points for the development of potent and selective agents to interrogate epigenetic regulation.

  14. Single-cell quantitative HER2 measurement identifies heterogeneity and distinct subgroups within traditionally defined HER2-positive patients.

    Science.gov (United States)

    Onsum, Matthew D; Geretti, Elena; Paragas, Violette; Kudla, Arthur J; Moulis, Sharon P; Luus, Lia; Wickham, Thomas J; McDonagh, Charlotte F; MacBeath, Gavin; Hendriks, Bart S

    2013-11-01

    Human epidermal growth factor receptor 2 (HER2) is an important biomarker for breast and gastric cancer prognosis and patient treatment decisions. HER2 positivity, as defined by IHC or fluorescent in situ hybridization testing, remains an imprecise predictor of patient response to HER2-targeted therapies. Challenges to correct HER2 assessment and patient stratification include intratumoral heterogeneity, lack of quantitative and/or objective assays, and differences between measuring HER2 amplification at the protein versus gene level. We developed a novel immunofluorescence method for quantitation of HER2 protein expression at the single-cell level on FFPE patient samples. Our assay uses automated image analysis to identify and classify tumor versus non-tumor cells, as well as quantitate the HER2 staining for each tumor cell. The HER2 staining level is converted to HER2 protein expression using a standard cell pellet array stained in parallel with the tissue sample. This approach allows assessment of HER2 expression and heterogeneity within a tissue section at the single-cell level. By using this assay, we identified distinct subgroups of HER2 heterogeneity within traditional definitions of HER2 positivity in both breast and gastric cancers. Quantitative assessment of intratumoral HER2 heterogeneity may offer an opportunity to improve the identification of patients likely to respond to HER2-targeted therapies. The broad applicability of the assay was demonstrated by measuring HER2 expression profiles on multiple tumor types, and on normal and diseased heart tissues. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  15. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    Science.gov (United States)

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Metastasis-related plasma membrane proteins of human breast cancer cells identified by comparative quantitative mass spectrometry

    DEFF Research Database (Denmark)

    Leth-Larsen, Rikke; Lund, Rikke; Hansen, Helle V

    2009-01-01

    The spread of cancer cells from a primary tumor to form metastasis at distant sites is a complex multi-step process. The cancer cell proteins, and plasma membrane proteins in particular, involved in this process are poorly defined and a study of the very early events of the metastatic process using...... clinical samples or in vitro assays is not feasible. We have used a unique model system consisting of two isogenic human breast cancer cell lines that are equally tumorigenic in mice, but while one gives rise to metastasis, the other disseminates single cells that remain dormant at distant organs. Membrane...... purification and comparative quantitative LC-MS/MS proteomic analysis identified 13 membrane proteins that were expressed at higher levels and 3 that were under-expressed in the metastatic compared to the non-metastatic cell line from a total of 1919 identified protein entries. Among the proteins were ecto-5...

  17. A Quantitative RNAi Screen for JNK Modifiers Identifies Pvr as a Novel Regulator of Drosophila Immune Signaling

    Science.gov (United States)

    Bond, David; Foley, Edan

    2009-01-01

    Drosophila melanogaster responds to gram-negative bacterial challenges through the IMD pathway, a signal transduction cassette that is driven by the coordinated activities of JNK, NF-κB and caspase modules. While many modifiers of NF-κB activity were identified in cell culture and in vivo assays, the regulatory apparatus that determines JNK inputs into the IMD pathway is relatively unexplored. In this manuscript, we present the first quantitative screen of the entire genome of Drosophila for novel regulators of JNK activity in the IMD pathway. We identified a large number of gene products that negatively or positively impact on JNK activation in the IMD pathway. In particular, we identified the Pvr receptor tyrosine kinase as a potent inhibitor of JNK activation. In a series of in vivo and cell culture assays, we demonstrated that activation of the IMD pathway drives JNK-dependent expression of the Pvr ligands, Pvf2 and Pvf3, which in turn act through the Pvr/ERK MAP kinase pathway to attenuate the JNK and NF-κB arms of the IMD pathway. Our data illuminate a poorly understood arm of a critical and evolutionarily conserved innate immune response. Furthermore, given the pleiotropic involvement of JNK in eukaryotic cell biology, we believe that many of the novel regulators identified in this screen are of interest beyond immune signaling. PMID:19893628

  18. Relationship between thin cap fibroatheroma identified by virtual histology and angioscopic yellow plaque in quantitative analysis with colorimetry.

    Science.gov (United States)

    Yamamoto, Masanori; Takano, Masamichi; Okamatsu, Kentaro; Murakami, Daisuke; Inami, Shigenobu; Xie, Yong; Seimiya, Koji; Ohba, Takayoshi; Seino, Yoshihiko; Mizuno, Kyoichi

    2009-03-01

    Thin cap fibroatheroma (TCFA) is considered to be a vulnerable plaque. Virtual Histology-intravascular ultrasound (VH-IVUS) can precisely identify TCFA in vivo. Intense yellow plaque on angioscopy, determined by quantitative colorimetry with the L*a*b* color space, corresponds with histological TCFA; in particular, a plaque with a color b value >23 indicates an atheroma with a thin fibrous cap. The relationship between TCFA identified by VH-IVUS and angioscopic TCFA determined by quantitative colorimetry was investigated. Fifty-seven culprit plaques in 57 patients were evaluated by VH-IVUS and angioscopy. VH-TCFA was defined as a plaque with a necrotic core >10% of plaque area without overlying fibrous tissue, and angioscopic TCFA as a plaque with a b value >23. The frequency of angioscopic TCFA was higher in the VH-TCFA group than in the VH-non-TCFA group (74% vs 23%, P=0.0002). Moreover, yellow color intensity (b value) significantly correlated with plaque classification on VH-IVUS. When TCFA detected with angioscopy was used as the gold standard, the sensitivity, specificity, and accuracy of VH-IVUS for TCFA were 68%, 81%, and 75%, respectively. VH-TCFA strongly correlated with angioscopic TCFA determined by a quantitative analysis with colorimetry.
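
    The sensitivity, specificity, and accuracy figures above are standard confusion-matrix quantities. A minimal sketch of their computation; the counts below are hypothetical, chosen only to approximate the reported 68%/81%/75% on 57 plaques, and are not the study's raw data:

    ```python
    def diagnostic_metrics(tp, fn, fp, tn):
        """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix."""
        sens = tp / (tp + fn)                 # true positives among all diseased
        spec = tn / (tn + fp)                 # true negatives among all healthy
        acc = (tp + tn) / (tp + fn + fp + tn) # correct calls among all cases
        return sens, spec, acc

    if __name__ == "__main__":
        # Hypothetical split of 57 plaques (28 angioscopic TCFA, 29 non-TCFA)
        sens, spec, acc = diagnostic_metrics(tp=19, fn=9, fp=5, tn=24)
        print(round(sens, 2), round(spec, 2), round(acc, 2))
    ```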

  19. Digital X-ray radiogrammetry better identifies osteoarthritis patients with a low bone mineral density than quantitative ultrasound

    Energy Technology Data Exchange (ETDEWEB)

    Goerres, Gerhard W. [University Hospital Zurich, Institute of Diagnostic Radiology, Department of Medical Radiology, Zurich (Switzerland); University Hospital Zurich, Osteoporosis Center, Zurich (Switzerland); Frey, Diana; Studer, Annina; Hauser, Dagmar; Zilic, Nathalie [University Hospital Zurich, Osteoporosis Center, Zurich (Switzerland); Hany, Thomas F. [University Hospital Zurich, Institute of Nuclear Medicine, Department of Medical Radiology, Zurich (Switzerland); Seifert, Burkhardt [University of Zurich, Department of Biostatistics, Zurich (Switzerland); Haeuselmann, Hans J. [Center for Rheumatology and Bone Disease, Klinik im Park, Zurich (Switzerland); Michel, Beat A.; Uebelhart, Daniel [University Hospital Zurich, Osteoporosis Center, Zurich (Switzerland); University Hospital Zurich, Department of Rheumatology and Institute of Physical Medicine, Zurich (Switzerland); Hans, Didier [University Hospital Geneva, Division of Nuclear Medicine, Geneva (Switzerland)

    2007-04-15

    This study assessed the ability of quantitative ultrasound (QUS) and digital X-ray radiogrammetry (DXR) to identify osteopenia and osteoporosis in patients with knee osteoarthritis (OA). One hundred and sixty-one patients with painful knee OA (81 men, 80 women; age 62.6±9.2 years, range 40-82 years) were included in this cross-sectional study and underwent dual-energy X-ray absorptiometry (DXA) of both hips and the lumbar spine, QUS of the phalanges and calcanei of both hands and heels, and DXR using radiographs of both hands. The unpaired t-test, Mann-Whitney U test, ROC analysis and Spearman's rank correlation were used for comparisons and correlation of methods. Using DXA as the reference standard, we defined a low bone mineral density (BMD) as a T-score ≤-1.0 at the lumbar spine or proximal femur. In contrast to phalangeal or calcaneal QUS, DXR was able to discriminate patients with a low BMD at the lumbar spine (p<0.0001) or hips (p<0.0001). ROC analysis showed that DXR had acceptable predictive power in identifying OA patients with a low hip BMD (sensitivity 70%, specificity 71%). Therefore, DXR used as a screening tool could help in identifying patients with knee OA for DXA. (orig.)

  20. Quantitative trait loci (QTL) study identifies novel genomic regions associated to Chiari-like malformation in Griffon Bruxellois dogs.

    Directory of Open Access Journals (Sweden)

    Philippe Lemay

    Full Text Available Chiari-like malformation (CM) is a developmental abnormality of the craniocervical junction that is common in the Griffon Bruxellois (GB) breed, with an estimated prevalence of 65%. This disease is characterized by overcrowding of the neural parenchyma at the craniocervical junction and disturbance of cerebrospinal fluid (CSF) flow. The most common clinical sign is pain, either as a direct consequence of CM or neuropathic pain as a consequence of secondary syringomyelia. The etiology of CM remains unknown but genetic factors play an important role. To investigate the genetic complexity of the disease, a quantitative trait locus (QTL) approach was adopted. A total of 14 quantitative skull and atlas measurements were taken and tested for association to CM. Six traits were found to be associated to CM and were subjected to a whole-genome association study using the Illumina canine high-density bead chip in 74 GB dogs (50 affected and 24 controls). Linear and mixed regression analyses identified associated single nucleotide polymorphisms (SNPs) on 5 Canis familiaris autosomes (CFAs): CFA2, CFA9, CFA12, CFA14 and CFA24. A reconstructed haplotype of 0.53 Mb on CFA2 strongly associated to the height of the cranial fossa (diameter F), and a haplotype of 2.5 Mb on CFA14 associated to both the height of the rostral part of the caudal cranial fossa (AE) and the height of the brain (FG), remained significantly associated to CM after 10,000 permutations, strengthening their candidacy for this disease (P = 0.0421 and P = 0.0094, respectively). The CFA2 QTL harbours the Sall-1 gene, which is an excellent candidate since its orthologue in humans is mutated in Townes-Brocks syndrome, which has previously been associated to Chiari malformation I. Our study demonstrates the implication of multiple traits in the etiology of CM and has successfully identified two new QTL associated to CM and a potential candidate gene.

  1. Identifying Quantitative Trait Loci (QTLs) and Developing Diagnostic Markers Linked to Orange Rust Resistance in Sugarcane (Saccharum spp.).

    Science.gov (United States)

    Yang, Xiping; Islam, Md S; Sood, Sushma; Maya, Stephanie; Hanson, Erik A; Comstock, Jack; Wang, Jianping

    2018-01-01

    Sugarcane (Saccharum spp.) is an important economic crop, contributing up to 80% of the table sugar used in the world, and has become a promising feedstock for biofuel production. Sugarcane production has been threatened by many diseases, and fungicide applications for disease control are being phased out in favor of sustainable agriculture. Orange rust is one of the major diseases impacting sugarcane production worldwide. Identifying quantitative trait loci (QTLs) and developing diagnostic markers are valuable for breeding programs to expedite release of superior sugarcane cultivars for disease control. In this study, an F1 segregating population derived from a cross between two hybrid sugarcane clones, CP95-1039 and CP88-1762, was evaluated for orange rust resistance in replicated trials. Three QTLs controlling orange rust resistance in sugarcane (qORR109, qORR4 and qORR102) were identified for the first time, explaining 58%, 12% and 8% of the phenotypic variation, respectively. We also characterized 1,574 sugarcane putative resistance (R) genes. These putative R genes and the simple sequence repeats in the QTL intervals were further used to develop diagnostic markers for marker-assisted selection of orange rust resistance. A PCR-based resistance-gene-derived marker, G1, was developed, which showed significant association with orange rust resistance. The putative QTLs and the marker developed in this study can be effectively utilized in sugarcane breeding programs to facilitate the selection process, thus contributing to sustainable control of orange rust.

  2. Discrimination and resilience and the needs of people who identify as Transgender: A narrative review of quantitative research studies.

    Science.gov (United States)

    McCann, Edward; Brown, Michael

    2017-12-01

    To examine discrimination and resilience experiences of people who identify as transgender and establish potential health service responses. People who identify as transgender face many challenges in society in terms of the knowledge, understanding and acceptance of a person's gender identity. A narrative review of quantitative empirical research. A comprehensive search of CINAHL, MEDLINE, PsycINFO and Sociological Abstracts electronic databases from 2006-2016 was conducted. The search yielded 1,478 papers and following the application of rigorous inclusion and exclusion criteria a total of 19 papers were included in the review. The findings reveal that there is a need to ensure that the needs of transgender people are represented, fully integrated and clearly linked to outcomes that improve their health and quality of life. Discrimination experiences can result in poorer health outcomes; however, many people have developed resilience and positive coping strategies. Nurses need to recognise and respond appropriately to the care and treatment needs of this population. Comprehensive nursing assessments and plans of care that encompass all aspects of the person should be in place supported by clear policy guidelines and evidence-based research. The education requirements of practitioners are outlined. © 2017 John Wiley & Sons Ltd.

  3. Quantitative trait loci identified for blood chemistry components of an advanced intercross line of chickens under heat stress.

    Science.gov (United States)

    Van Goor, Angelica; Ashwell, Christopher M; Persia, Michael E; Rothschild, Max F; Schmidt, Carl J; Lamont, Susan J

    2016-04-14

    Heat stress in poultry results in considerable economic losses and is a concern for both animal health and welfare. Physiological changes occur during periods of heat stress, including changes in blood chemistry components. A highly advanced intercross line, created from a broiler (heat susceptible) by Fayoumi (heat resistant) cross, was exposed to daily heat cycles for seven days starting at 22 days of age. Blood components measured pre-heat treatment and on the seventh day of heat treatment included pH, pCO2, pO2, base excess, HCO3, TCO2, K, Na, ionized Ca, hematocrit, hemoglobin, sO2, and glucose. A genome-wide association study (GWAS) for these traits and their calculated changes was conducted to identify quantitative trait loci (QTL) using a 600 K SNP panel. There were significant increases in pH, base excess, HCO3, TCO2, ionized Ca, hematocrit, hemoglobin, and sO2, and significant decreases in pCO2 and glucose after 7 days of heat treatment. Heritabilities ranged from 0.01-0.21 for pre-heat measurements, 0.01-0.23 for measurements taken during heat, and 0.00-0.10 for the calculated change due to heat treatment. All blood components were highly correlated within measurement days, but not correlated between measurement days. The GWAS revealed 61 QTL for all traits, located on GGA (Gallus gallus chromosome) 1, 3, 6, 9, 10, 12-14, 17, 18, 21-28, and Z. A functional analysis of the genes in these QTL regions identified the Angiopoietin pathway as significant. The QTL that co-localized for three or more traits were on GGA10, 22, 26, 28, and Z and revealed candidate genes for birds' response to heat stress. The results of this study contribute to our knowledge of levels and heritabilities of several blood components of chickens under thermoneutral and heat stress conditions. Most components responded to heat treatment. Mapped QTL may serve as markers for genomic selection to enhance heat tolerance in poultry. The Angiopoietin pathway is likely involved in the

  4. Quantitative Secretomic Analysis Identifies Extracellular Protein Factors That Modulate the Metastatic Phenotype of Non-Small Cell Lung Cancer.

    Science.gov (United States)

    Hu, Rongkuan; Huffman, Kenneth E; Chu, Michael; Zhang, Yajie; Minna, John D; Yu, Yonghao

    2016-02-05

    Lung cancer is the leading cause of cancer-related deaths for men and women in the United States, with non-small cell lung cancer (NSCLC) representing 85% of all diagnoses. Late stage detection, metastatic disease and lack of actionable biomarkers contribute to the high mortality rate. Proteins in the extracellular space are known to be critically involved in regulating every stage of the pathogenesis of lung cancer. To investigate the mechanism by which secreted proteins contribute to the pathogenesis of NSCLC, we performed quantitative secretomic analysis of two isogenic NSCLC cell lines (NCI-H1993 and NCI-H2073) and an immortalized human bronchial epithelial cell line (HBEC3-KT) as control. H1993 was derived from a chemo-naïve metastatic tumor, while H2073 was derived from the primary tumor after etoposide/cisplatin therapy. From the conditioned media of these three cell lines, we identified and quantified 2713 proteins, including a series of proteins involved in regulating inflammatory response, programmed cell death and cell motion. Gene Ontology (GO) analysis indicates that a number of proteins overexpressed in H1993 media are involved in biological processes related to cancer metastasis, including cell motion, cell-cell adhesion and cell migration. RNA interference (RNAi)-mediated knock down of a number of these proteins, including SULT2B1, CEACAM5, SPRR3, AGR2, S100P, and S100A14, leads to dramatically reduced migration of these cells. In addition, meta-analysis of survival data indicates NSCLC patients whose tumors express higher levels of several of these secreted proteins, including SULT2B1, CEACAM5, SPRR3, S100P, and S100A14, have a worse prognosis. Collectively, our results provide a potential molecular link between deregulated secretome and NSCLC cell migration/metastasis. In addition, the identification of these aberrantly secreted proteins might facilitate the development of biomarkers for early detection of this devastating disease.

  5. Genome-wide Association Study to Identify Quantitative Trait Loci for Meat and Carcass Quality Traits in Berkshire

    Science.gov (United States)

    Iqbal, Asif; Kim, You-Sam; Kang, Jun-Mo; Lee, Yun-Mi; Rai, Rajani; Jung, Jong-Hyun; Oh, Dong-Yup; Nam, Ki-Chang; Lee, Hak-Kyo; Kim, Jong-Joo

    2015-01-01

    Meat and carcass quality attributes are of crucial importance, influencing consumer preference and profitability in the pork industry. A set of 400 Berkshire pigs born between 2012 and 2013 was collected from the Dasan breeding farm (Namwon, Chonbuk province, Korea). To perform genome-wide association studies (GWAS), eleven meat and carcass quality traits were considered, including carcass weight, backfat thickness, pH value after 24 hours (pH24), Commission Internationale de l’Eclairage lightness in meat color (CIE L), redness in meat color (CIE a), yellowness in meat color (CIE b), filtering, drip loss, heat loss, shear force and marbling score. All 400 animals were genotyped with the Porcine 62K SNP BeadChips (Illumina Inc., USA). A SAS general linear model procedure (SAS version 9.2) was used to pre-adjust the animal phenotypes before GWAS, with sire and sex effects as fixed effects and slaughter age as a covariate. After fitting the fixed and covariate factors in the model, the residuals of the phenotype were regressed on the additive effects of each single nucleotide polymorphism (SNP) under a linear regression model (PLINK version 1.07). The significant SNPs after permutation testing at a chromosome-wise level were subjected to stepwise regression analysis to determine the best set of SNP markers. A total of 55 significant (p<0.05) SNPs or quantitative trait loci (QTL) were detected on various chromosomes. The QTLs explained from 5.06% to 8.28% of the total phenotypic variation of the traits. Some QTLs with pleiotropic effects were also identified; a pair of significant QTL for pH24 was also found to affect both CIE L and drip loss percentage. The significant QTL, after characterization of the functional candidate genes in or around the QTL regions, may be effectively and efficiently used in marker-assisted selection to achieve enhanced genetic improvement of the traits considered. PMID:26580276

  6. Genome-wide Association Study to Identify Quantitative Trait Loci for Meat and Carcass Quality Traits in Berkshire

    Directory of Open Access Journals (Sweden)

    Asif Iqbal

    2015-11-01

    Full Text Available Meat and carcass quality attributes are of crucial importance, influencing consumer preference and profitability in the pork industry. A set of 400 Berkshire pigs born between 2012 and 2013 was collected from the Dasan breeding farm (Namwon, Chonbuk province, Korea). To perform genome-wide association studies (GWAS), eleven meat and carcass quality traits were considered, including carcass weight, backfat thickness, pH value after 24 hours (pH24), Commission Internationale de l’Eclairage lightness in meat color (CIE L), redness in meat color (CIE a), yellowness in meat color (CIE b), filtering, drip loss, heat loss, shear force and marbling score. All 400 animals were genotyped with the Porcine 62K SNP BeadChips (Illumina Inc., USA). A SAS general linear model procedure (SAS version 9.2) was used to pre-adjust the animal phenotypes before GWAS, with sire and sex effects as fixed effects and slaughter age as a covariate. After fitting the fixed and covariate factors in the model, the residuals of the phenotype were regressed on the additive effects of each single nucleotide polymorphism (SNP) under a linear regression model (PLINK version 1.07). The significant SNPs after permutation testing at a chromosome-wise level were subjected to stepwise regression analysis to determine the best set of SNP markers. A total of 55 significant (p<0.05) SNPs or quantitative trait loci (QTL) were detected on various chromosomes. The QTLs explained from 5.06% to 8.28% of the total phenotypic variation of the traits. Some QTLs with pleiotropic effects were also identified; a pair of significant QTL for pH24 was also found to affect both CIE L and drip loss percentage. The significant QTL, after characterization of the functional candidate genes in or around the QTL regions, may be effectively and efficiently used in marker-assisted selection to achieve enhanced genetic improvement of the traits considered.
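The two-stage procedure described in this record (pre-adjust each phenotype for fixed effects and a covariate with a general linear model, then regress the residuals on each SNP's additive genotype coding) can be sketched as follows. This is a minimal ordinary-least-squares illustration of the idea, not the actual SAS/PLINK pipeline, and all function and variable names are hypothetical.

```python
import numpy as np

def preadjust_phenotype(y, covariates):
    """Stage 1: regress the phenotype on fixed effects/covariates
    (e.g. sire, sex, slaughter age) and keep the residuals."""
    X = np.column_stack([np.ones(len(y)), covariates])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

def snp_association(residuals, genotypes):
    """Stage 2: simple linear regression of the residuals on the
    additive genotype coding (0/1/2 minor-allele counts) of one SNP.
    Returns the estimated SNP effect and its t-statistic."""
    g = genotypes - genotypes.mean()
    r = residuals - residuals.mean()
    beta = (g @ r) / (g @ g)          # additive SNP effect
    resid = r - beta * g
    n = len(r)
    se = np.sqrt((resid @ resid) / (n - 2) / (g @ g))
    return beta, beta / se
```

In a real GWAS this second stage is repeated for every SNP on the chip and the resulting p-values are corrected by permutation, as the study describes.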

  7. In vivo quantitative phosphoproteomic profiling identifies novel regulators of castration-resistant prostate cancer growth

    DEFF Research Database (Denmark)

    Jiang, Nan; Hjorth-Jensen, Kim; Hekmat, Omid

    2015-01-01

    Prostate cancer remains a leading cause of cancer-related mortality worldwide owing to our inability to treat effectively castration-resistant tumors. To understand the signaling mechanisms sustaining castration-resistant growth, we implemented a mass spectrometry-based quantitative proteomic app...

  8. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    KAUST Repository

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, Christian; Jourdain, P.; Magistretti, Pierre J.

    2016-01-01

    parallelization and automation processes, represents an appealing imaging modality to both identify original cellular biomarkers of diseases as well to explore the underlying pathophysiological processes.

  9. Factor analysis in the Genetics of Asthma International Network family study identifies five major quantitative asthma phenotypes

    NARCIS (Netherlands)

    Pillai, S. G.; Tang, Y.; van den Oord, E.; Klotsman, M.; Barnes, K.; Carlsen, K.; Gerritsen, J.; Lenney, W.; Silverman, M.; Sly, P.; Sundy, J.; Tsanakas, J.; von Berg, A.; Whyte, M.; Ortega, H. G.; Anderson, W. H.; Helms, P. J.

    Background Asthma is a clinically heterogeneous disease caused by a complex interaction between genetic susceptibility and diverse environmental factors. In common with other complex diseases the lack of a standardized scheme to evaluate the phenotypic variability poses challenges in identifying the

  10. Quantitative proteomics identifies central players in erlotinib resistance of the non-small cell lung cancer cell line HCC827

    DEFF Research Database (Denmark)

    Jacobsen, Kirstine; Lund, Rikke Raaen; Beck, Hans Christian

    Background: Erlotinib (Tarceva®, Roche) has significantly changed the treatment of non-small cell lung cancer (NSCLC) as 70% of patients show significant tumor regression when treated. However, all patients relapse due to development of acquired resistance, which in 43-50% of cases are caused...... by a secondary mutation (T790M) in EGFR. Importantly, a majority of resistance cases are still unexplained. Our aim is to identify novel resistance mechanisms in erlotinib-resistant subclones of the NSCLC cell line HCC827. Materials & Methods: We established 3 erlotinib-resistant subclones (resistant to 10, 20...... or other EGFR or KRAS mutations, potentiating the identification of novel resistance mechanisms. We identified 2875 cytoplasmic proteins present in all 4 cell lines. Of these 87, 56 and 23 are upregulated >1.5 fold; and 117, 72 and 32 are downregulated >1.5 fold, respectively, in the 3 resistant clones...

  11. IBT-based quantitative proteomics identifies potential regulatory proteins involved in pigmentation of purple sea cucumber, Apostichopus japonicus.

    Science.gov (United States)

    Xing, Lili; Sun, Lina; Liu, Shilin; Li, Xiaoni; Zhang, Libin; Yang, Hongsheng

    2017-09-01

    Sea cucumbers are an important economic species and exhibit high yield value among aquaculture animals. Purple sea cucumbers are very rare and beautiful and have stable hereditary patterns. In this study, isobaric tags (IBT) were first used to reveal the molecular mechanism of pigmentation in the body wall of the purple sea cucumber. We analyzed the proteomes of purple sea cucumber in early pigmentation stage (Pa), mid pigmentation stage (Pb) and late pigmentation stage (Pc), resulting in the identification of 5580 proteins, including 1099 differentially expressed proteins in Pb:Pa and 339 differentially expressed proteins in Pc:Pb. GO and KEGG analyses linked the differentially expressed proteins to terms and pathways involved in pigment synthesis and regulation in purple sea cucumbers, including "melanogenesis", "melanosome", "melanoma", "pigment-biosynthetic process", "epidermis development", "Ras-signaling pathway", "Wnt-signaling pathway", "response to UV light", and "tyrosine metabolism". The large number of differentially expressed proteins identified here should be highly useful in further elucidating the mechanisms underlying pigmentation in sea cucumbers. Furthermore, these results may also provide a basis for further identification of proteins involved in resistance mechanisms against melanoma, albinism, UV damage, and other diseases in sea cucumbers. Copyright © 2017. Published by Elsevier Inc.

  12. A quantitative analysis of statistical power identifies obesity end points for improved in vivo preclinical study design.

    Science.gov (United States)

    Selimkhanov, J; Thompson, W C; Guo, J; Hall, K D; Musante, C J

    2017-08-01

    The design of well-powered in vivo preclinical studies is a key element in building the knowledge of disease physiology for the purpose of identifying and effectively testing potential antiobesity drug targets. However, as a result of the complexity of the obese phenotype, there is limited understanding of the variability within and between study animals of macroscopic end points such as food intake and body composition. This, combined with limitations inherent in the measurement of certain end points, presents challenges to study design that can have significant consequences for an antiobesity program. Here, we analyze a large, longitudinal study of mouse food intake and body composition during diet perturbation to quantify the variability and interaction of the key metabolic end points. To demonstrate how conclusions can change as a function of study size, we show that a simulated preclinical study properly powered for one end point may lead to false conclusions based on secondary end points. We then propose the guidelines for end point selection and study size estimation under different conditions to facilitate proper power calculation for a more successful in vivo study design.
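As a rough illustration of the kind of end-point power analysis this record advocates, the sketch below computes the approximate power of a two-sided two-sample comparison of means under a normal approximation, then searches for the smallest group size that reaches a target power. The z-test simplification and the function names are my assumptions, not the paper's method.

```python
from scipy.stats import norm

def power_two_sample(effect, sd, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test to detect a
    mean difference `effect`, given the residual SD and group size."""
    se = sd * (2.0 / n_per_group) ** 0.5   # SE of the difference in means
    z_crit = norm.ppf(1 - alpha / 2)
    ncp = effect / se                      # standardized detectable signal
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

def n_for_power(effect, sd, target=0.8, alpha=0.05):
    """Smallest per-group n whose approximate power reaches `target`."""
    n = 2
    while power_two_sample(effect, sd, n, alpha) < target:
        n += 1
    return n
```

Running the search for a standardized effect of 0.5 at 80% power reproduces the textbook answer of roughly 63 animals per group, which makes concrete why noisy end points such as food intake can demand infeasibly large studies.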

  13. Integration analysis of quantitative proteomics and transcriptomics data identifies potential targets of frizzled-8 protein-related antiproliferative factor in vivo.

    Science.gov (United States)

    Yang, Wei; Kim, Yongsoo; Kim, Taek-Kyun; Keay, Susan K; Kim, Kwang Pyo; Steen, Hanno; Freeman, Michael R; Hwang, Daehee; Kim, Jayoung

    2012-12-01

    What's known on the subject? and What does the study add? Interstitial cystitis (IC) is a prevalent and debilitating pelvic disorder generally accompanied by chronic pain combined with chronic urinating problems. Over one million Americans are affected, especially middle-aged women. However, its aetiology or mechanism remains unclear. No efficient drug has been provided to patients. Several urinary biomarker candidates have been identified for IC; among the most promising is antiproliferative factor (APF), whose biological activity is detectable in urine specimens from >94% of patients with both ulcerative and non-ulcerative IC. The present study identified several important mediators of the effect of APF on bladder cell physiology, suggesting several candidate drug targets against IC. In an attempt to identify potential proteins and genes regulated by APF in vivo, and to possibly expand the APF-regulated network identified by stable isotope labelling by amino acids in cell culture (SILAC), we performed an integration analysis of our own SILAC data and the microarray data of Gamper et al. (2009) BMC Genomics 10: 199. Notably, two of the proteins (i.e. MAPKSP1 and GSPT1) that are down-regulated by APF are involved in the activation of mTORC1, suggesting that the mammalian target of rapamycin (mTOR) pathway is potentially a critical pathway regulated by APF in vivo. Several components of the mTOR pathway are currently being studied as potential therapeutic targets in other diseases. Our analysis suggests that this pathway might also be relevant in the design of diagnostic tools and medications targeting IC. • To enhance our understanding of the interstitial cystitis urine biomarker antiproliferative factor (APF), as well as interstitial cystitis biology more generally at the systems level, we reanalyzed recently published large-scale quantitative proteomics and in vivo transcriptomics data sets using an integration analysis tool that we have developed. • To

  14. How powerful are summary-based methods for identifying expression-trait associations under different genetic architectures?

    Science.gov (United States)

    Veturi, Yogasudha; Ritchie, Marylyn D

    2018-01-01

    Transcriptome-wide association studies (TWAS) have recently been employed as an approach that can draw upon the advantages of genome-wide association studies (GWAS) and gene expression studies to identify genes associated with complex traits. Unlike standard GWAS, summary level data suffices for TWAS and offers improved statistical power. Two popular TWAS methods include either (a) imputing the cis genetic component of gene expression from smaller sized studies (using multi-SNP prediction or MP) into much larger effective sample sizes afforded by GWAS - TWAS-MP or (b) using summary-based Mendelian randomization - TWAS-SMR. Although these methods have been effective at detecting functional variants, it remains unclear how extensive variability in the genetic architecture of complex traits and diseases impacts TWAS results. Our goal was to investigate the different scenarios under which these methods yielded enough power to detect significant expression-trait associations. In this study, we conducted extensive simulations based on 6000 randomly chosen, unrelated Caucasian males from Geisinger's MyCode population to compare the power to detect cis expression-trait associations (within 500 kb of a gene) using the above-described approaches. To test TWAS across varying genetic backgrounds, we simulated gene expression and phenotype using different numbers of quantitative trait loci per gene and different cis-expression/trait heritabilities under genetic models that differentiate the effect of causality from that of pleiotropy. For each gene, on a training set ranging from 100 to 1000 individuals, we either (a) estimated regression coefficients with gene expression as the response using five different methods: LASSO, elastic net, Bayesian LASSO, Bayesian spike-slab, and Bayesian ridge regression or (b) performed eQTL analysis. 
We then sampled with replacement 50,000, 150,000, and 300,000 individuals respectively from the testing set of the remaining 5000 individuals and conducted GWAS on each
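A bare-bones sketch of the TWAS-MP idea described above (train a sparse cis-expression predictor in the smaller eQTL panel, impute expression into the GWAS cohort from genotypes alone, then test the imputed expression against the trait) might look like this. The LASSO penalty value, the simulated sample sizes, and the function names are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import Lasso

def twas_mp(geno_train, expr_train, geno_gwas, pheno_gwas, alpha=0.01):
    """TWAS via multi-SNP prediction (sketch):
    1) learn cis-SNP weights for expression in the eQTL training panel,
    2) impute expression into the GWAS cohort from genotypes alone,
    3) test the imputed expression against the phenotype."""
    weights = Lasso(alpha=alpha, max_iter=10000).fit(geno_train, expr_train).coef_
    imputed = geno_gwas @ weights
    r, p = stats.pearsonr(imputed, pheno_gwas)
    return r, p
```

The key point the simulations probe is that the training step uses only a few hundred individuals, while the association step can draw on a GWAS-scale cohort, so power depends jointly on the cis heritability of expression and on the expression-trait effect.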

  15. Quantitative Analysis of {sup 18}F-Fluorodeoxyglucose Positron Emission Tomography Identifies Novel Prognostic Imaging Biomarkers in Locally Advanced Pancreatic Cancer Patients Treated With Stereotactic Body Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yi [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo (Japan); Song, Jie; Pollom, Erqi; Alagappan, Muthuraman [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Shirato, Hiroki [Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo (Japan); Chang, Daniel T.; Koong, Albert C. [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Stanford Cancer Institute, Stanford, California (United States); Li, Ruijiang, E-mail: rli2@stanford.edu [Department of Radiation Oncology, Stanford University, Palo Alto, California (United States); Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo (Japan); Stanford Cancer Institute, Stanford, California (United States)

    2016-09-01

    Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board–approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT {sup 18}F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and the concordance index (CI) was used to evaluate survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), which improved upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6).
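The concordance index used above to score the imaging signature can be computed, in its simplest Harrell form, as below. This is a generic textbook implementation (pairs with tied survival times are skipped), not the code used in the study.

```python
from itertools import combinations

def concordance_index(times, events, risk_scores):
    """Harrell's C-index: among usable pairs (the shorter survival time
    is an observed event, not a censoring), the fraction where the higher
    risk score belongs to the patient who died earlier; tied scores
    count 1/2. Returns 0.5 for a random predictor, 1.0 for a perfect one."""
    concordant, usable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i            # orient the pair so i has the shorter time
        if times[i] == times[j] or not events[i]:
            continue               # pair not comparable under right-censoring
        usable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1.0
        elif risk_scores[i] == risk_scores[j]:
            concordant += 0.5
    return concordant / usable
```

A signature CI of 0.66 versus 0.48-0.64 for the conventional indices, as reported above, therefore reflects a modest but consistent gain in ranking patients by survival.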

  16. Identification of three novel OA1 gene mutations identified in three families misdiagnosed with congenital nystagmus and carrier status determination by real-time quantitative PCR assay

    Directory of Open Access Journals (Sweden)

    Hamel Christian

    2003-01-01

    Full Text Available Abstract Background X-linked ocular albinism type 1 (OA1) is caused by mutations in the OA1 gene, which encodes a membrane glycoprotein localised to melanosomes. OA1 mainly affects pigment production in the eye, resulting in optic changes associated with albinism, including hypopigmentation of the retina, nystagmus, strabismus, foveal hypoplasia, abnormal crossing of the optic fibers and reduced visual acuity. Affected Caucasian males usually appear to have normal skin and hair pigment. Results We identified three previously undescribed mutations consisting of two intragenic deletions (one encompassing exon 6, the other exons 7–8) and a point mutation (310delG) in exon 2. We report the development of a new method for diagnosis of heterozygous deletions in the OA1 gene based on measurement of gene copy number using real-time quantitative PCR from genomic DNA. Conclusion The identification of OA1 mutations in families earlier reported as families with hereditary nystagmus indicates that ocular albinism type 1 is probably underdiagnosed. Our method of real-time quantitative PCR of OA1 exons, with a DMD exon as external standard, performed on the LightCycler™ allows quick and accurate carrier-status assessment for at-risk females.
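The carrier-status logic rests on a simple dosage calculation: a heterozygous deletion carrier has one copy of the target exon instead of two, which shifts its normalized Ct by about one cycle relative to the two-copy reference. A minimal comparative-Ct (2^-ΔΔCt) sketch, assuming equal amplification efficiencies and using hypothetical Ct values:

```python
def relative_copy_number(ct_target, ct_reference,
                         ct_target_cal, ct_reference_cal,
                         calibrator_copies=2):
    """Estimate target-gene copy number by the comparative-Ct (2^-ΔΔCt)
    method: Ct values for the test sample and for a calibrator of known
    copy number, each normalized to a two-copy reference locus (the study
    used a DMD exon as the external standard)."""
    delta_sample = ct_target - ct_reference        # ΔCt, test sample
    delta_cal = ct_target_cal - ct_reference_cal   # ΔCt, calibrator
    return calibrator_copies * 2.0 ** -(delta_sample - delta_cal)
```

For example, a sample whose target ΔCt is one cycle higher than the calibrator's yields a copy number of about 1, flagging a heterozygous deletion carrier.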

  17. Quantitative proteomics identifies Gemin5, a scaffolding protein involved in ribonucleoprotein assembly, as a novel partner for eukaryotic initiation factor 4E

    DEFF Research Database (Denmark)

    Fierro-Monti, Ivo; Mohammed, Shabaz; Matthiesen, Rune

    2006-01-01

    Protein complexes are dynamic entities; identification and quantitation of their components is critical in elucidating functional roles under specific cellular conditions. We report the first quantitative proteomic analysis of the human cap-binding protein complex. Components and proteins......-starved tumorigenic human mesenchymal stromal cells, attested to their activated translational states. The WD-repeat, scaffolding-protein Gemin5 was identified as a novel eIF4E binding partner, which interacted directly with eIF4E through a motif (YXXXXLPhi) present in a number of eIF4E-interacting partners. Elevated...... levels of Gemin5:eIF4E complexes were found in phorbol ester treated HEK293 cells. Gemin5 and eIF4E co-localized to cytoplasmic P-bodies in human osteosarcoma U2OS cells. Interaction between eIF4E and Gemin5 and their co-localization to the P-bodies, may serve to recruit capped mRNAs to these RNP...

  18. Integrating genome-wide association study and expression quantitative trait loci data identifies multiple genes and gene set associated with neuroticism.

    Science.gov (United States)

    Fan, Qianrui; Wang, Wenyu; Hao, Jingcan; He, Awen; Wen, Yan; Guo, Xiong; Wu, Cuiyan; Ning, Yujie; Wang, Xi; Wang, Sen; Zhang, Feng

    2017-08-01

    Neuroticism is a fundamental personality trait with a significant genetic determinant. To identify novel susceptibility genes for neuroticism, we conducted an integrative analysis of genomic and transcriptomic data from genome-wide association studies (GWAS) and expression quantitative trait locus (eQTL) studies. GWAS summary data were derived from published studies of neuroticism involving a total of 170,906 subjects. An eQTL dataset containing 927,753 eQTLs was obtained from an eQTL meta-analysis of 5311 samples. Integrative analysis of GWAS and eQTL data was conducted with the summary data-based Mendelian randomization (SMR) analysis software. To identify neuroticism-associated gene sets, the SMR analysis results were further subjected to gene set enrichment analysis (GSEA), using the gene set annotation dataset (containing 13,311 annotated gene sets) of the GSEA Molecular Signatures Database. SMR single-gene analysis identified 6 significant genes for neuroticism, including MSRA (p value=2.27×10-10), MGC57346 (p value=6.92×10-7), BLK (p value=1.01×10-6), XKR6 (p value=1.11×10-6), C17ORF69 (p value=1.12×10-6) and KIAA1267 (p value=4.00×10-6). Gene set enrichment analysis observed a significant association for the Chr8p23 gene set (false discovery rate=0.033). Our results provide novel clues for studies of the genetic mechanisms of neuroticism. Copyright © 2017. Published by Elsevier Inc.
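The SMR test used in this record combines, for each gene's top cis-eQTL, the SNP's effect on expression with the same SNP's effect on the trait from GWAS summary statistics; under the null hypothesis the statistic is approximately chi-square with one degree of freedom. A minimal sketch of the commonly published form of the statistic, with illustrative betas and standard errors:

```python
from scipy.stats import chi2

def smr_test(beta_gwas, se_gwas, beta_eqtl, se_eqtl):
    """SMR statistic from summary data: the two z-scores are combined as
    T = (z_gwas^2 * z_eqtl^2) / (z_gwas^2 + z_eqtl^2), which is
    approximately chi-square(1) under the null of no mediated effect."""
    z_gwas = beta_gwas / se_gwas
    z_eqtl = beta_eqtl / se_eqtl
    t_smr = (z_gwas**2 * z_eqtl**2) / (z_gwas**2 + z_eqtl**2)
    return t_smr, chi2.sf(t_smr, df=1)
```

Note that T is bounded by the smaller of the two squared z-scores, so a gene can only reach a small SMR p-value when both the eQTL and the GWAS signals are strong, which is what makes the test conservative and summary-data friendly.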

  19. Quantitative and mixed analyses to identify factors that affect cervical cancer screening uptake among lesbian and bisexual women and transgender men.

    Science.gov (United States)

    Johnson, Michael J; Mueller, Martina; Eliason, Michele J; Stuart, Gail; Nemeth, Lynne S

    2016-12-01

    The purposes of this study were to measure the prevalence of, and identify factors associated with, cervical cancer screening among a sample of lesbian, bisexual and queer women, and transgender men. Past research has found that lesbian, bisexual and queer women underuse cervical screening services. Because deficient screening remains the most significant risk factor for cervical cancer, it is essential to understand the differences between routine and nonroutine screeners. A convergent-parallel mixed methods design was used. A convenience sample of 21- to 65-year-old lesbian and bisexual women and transgender men was recruited in the USA from August-December 2014. Quantitative data were collected via a 48-item Internet questionnaire (N = 226), and qualitative data were collected through in-depth telephone interviews (N = 20) and open-ended questions on the Internet questionnaire. Seventy-three per cent of the sample were routine cervical screeners. The results showed that a constellation of factors influence the use of cervical cancer screening among lesbian, bisexual and queer women. Some of those factors overlap with the general female population, whereas others are specific to the lesbian, bisexual or queer identity. Routine screeners reported feeling more welcome in the health care setting, while nonroutine screeners reported more discrimination related to their sexual orientation and gender expression. Routine screeners were also more likely to be 'out' to their provider. The quantitative and qualitative factors were also compared and contrasted. Many of the factors identified in this study to influence cervical cancer screening relate to the health care environment and to interactions between the patient and provider. Nurses should be involved with creating welcoming environments for lesbian, bisexual and queer women and their partners. Moreover, nurses play a large role in patient education and should promote self-care behaviours among lesbian women and transgender

  20. Genome-wide association mapping identifies the genetic basis of discrete and quantitative variation in sexual weaponry in a wild sheep population.

    Science.gov (United States)

    Johnston, Susan E; McEwan, John C; Pickering, Natalie K; Kijas, James W; Beraldi, Dario; Pilkington, Jill G; Pemberton, Josephine M; Slate, Jon

    2011-06-01

    Understanding the genetic architecture of phenotypic variation in natural populations is a fundamental goal of evolutionary genetics. Wild Soay sheep (Ovis aries) have an inherited polymorphism for horn morphology in both sexes, controlled by a single autosomal locus, Horns. The majority of males have large normal horns, but a small number have vestigial, deformed horns, known as scurs; females have either normal horns, scurs or no horns (polled). Given that scurred males and polled females have reduced fitness within each sex, it is counterintuitive that the polymorphism persists within the population. Therefore, identifying the genetic basis of horn type will provide a vital foundation for understanding why the different morphs are maintained in the face of natural selection. We conducted a genome-wide association study using ∼36,000 single nucleotide polymorphisms (SNPs) and determined the main candidate for Horns to be RXFP2, an autosomal gene with a known involvement in determining primary sex characters in humans and mice. Evidence from additional SNPs in and around RXFP2 supports a new model of horn-type inheritance in Soay sheep, and for the first time, sheep with the same horn phenotype but different underlying genotypes can be identified. In addition, RXFP2 was shown to be an additive quantitative trait locus (QTL) for horn size in normal-horned males, accounting for up to 76% of the additive genetic variation in this trait. This finding contrasts markedly with genome-wide association studies of quantitative traits in humans and some model species, where it is often observed that mapped loci explain only a modest proportion of the overall genetic variation. © 2011 Blackwell Publishing Ltd.
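
    The abstract above reports that RXFP2 explains up to 76% of the additive genetic variation in horn size. For a biallelic QTL with purely additive gene action, that contribution follows directly from the allele frequency and effect size. A minimal sketch of the standard calculation (the allele frequency and effect values below are illustrative, not taken from the Soay data):

```python
def qtl_additive_variance(p, a):
    """Additive genetic variance contributed by a biallelic QTL with
    allele frequency p and additive effect a (half the difference
    between the two homozygote means), assuming no dominance:
    V_QTL = 2 * p * (1 - p) * a**2."""
    return 2.0 * p * (1.0 - p) * a ** 2

def proportion_explained(p, a, total_additive_variance):
    """Share of the trait's total additive genetic variance
    accounted for by the QTL."""
    return qtl_additive_variance(p, a) / total_additive_variance

# e.g. a locus at intermediate frequency (p = 0.3) with effect a = 1.0
# contributes 2 * 0.3 * 0.7 = 0.42 units of additive variance
v = qtl_additive_variance(0.3, 1.0)
```

    The formula makes explicit why a common allele of large effect, as at RXFP2, can dominate the additive variance, whereas rare or small-effect loci, typical of human GWAS hits, cannot.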

  1. Review of quantitative phase-digital holographic microscopy: promising novel imaging technique to resolve neuronal network activity and identify cellular biomarkers of psychiatric disorders

    KAUST Repository

    Marquet, Pierre; Depeursinge, Christian; Magistretti, Pierre J.

    2014-01-01

    Quantitative phase microscopy (QPM) has recently emerged as a new powerful quantitative imaging technique well suited to noninvasively explore a transparent specimen with a nanometric axial sensitivity. In this review, we expose the recent

  2. Single-cell-type quantitative proteomic and ionomic analysis of epidermal bladder cells from the halophyte model plant Mesembryanthemum crystallinum to identify salt-responsive proteins.

    Science.gov (United States)

    Barkla, Bronwyn J; Vera-Estrella, Rosario; Raymond, Carolyn

    2016-05-10

    Epidermal bladder cells (EBC) are large single-celled, specialized, and modified trichomes found on the aerial parts of the halophyte Mesembryanthemum crystallinum. Recent development of a simple but high-throughput technique to extract the contents from these cells has provided an opportunity to conduct detailed single-cell-type analyses of their molecular characteristics at high resolution to gain insight into the role of these cells in the salt tolerance of the plant. In this study, we carry out large-scale complementary quantitative proteomic studies using both a label-based (DIGE) and a label-free (GeLC-MS) approach to identify salt-responsive proteins in the EBC extract. Additionally, we perform an ionomics analysis (ICP-MS) to follow changes in the amounts of 27 different elements. Using these methods, we were able to identify 54 proteins and nine elements that showed statistically significant changes in the EBC from salt-treated plants. GO enrichment analysis identified a large number of transport proteins but also proteins involved in photosynthesis, primary metabolism and Crassulacean acid metabolism (CAM). Validation of results by western blot, confocal microscopy and enzyme analysis helped to strengthen the findings and further our understanding of the role of these specialized cells. As expected, EBC accumulated large quantities of sodium; however, the most abundant element was chloride, suggesting that sequestration of this ion into the EBC vacuole is just as important for salt tolerance. This single-cell-type omics approach shows that epidermal bladder cells of M. crystallinum are metabolically active modified trichomes, with primary metabolism supporting cell growth, ion accumulation, compatible solute synthesis and CAM. Data are available via ProteomeXchange with identifier PXD004045.

  3. Quantitative multiplex quantum dot in-situ hybridisation based gene expression profiling in tissue microarrays identifies prognostic genes in acute myeloid leukaemia

    Energy Technology Data Exchange (ETDEWEB)

    Tholouli, Eleni [Department of Haematology, Manchester Royal Infirmary, Oxford Road, Manchester, M13 9WL (United Kingdom); MacDermott, Sarah [The Medical School, The University of Manchester, Oxford Road, M13 9PT Manchester (United Kingdom); Hoyland, Judith [School of Biomedicine, Faculty of Medical and Human Sciences, The University of Manchester, Oxford Road, M13 9PT Manchester (United Kingdom); Yin, John Liu [Department of Haematology, Manchester Royal Infirmary, Oxford Road, Manchester, M13 9WL (United Kingdom); Byers, Richard, E-mail: richard.byers@cmft.nhs.uk [School of Cancer and Enabling Sciences, Faculty of Medical and Human Sciences, The University of Manchester, Stopford Building, Oxford Road, M13 9PT Manchester (United Kingdom)

    2012-08-24

    Highlights: ► Development of a quantitative high-throughput in situ expression profiling method. ► Application to a tissue microarray of 242 AML bone marrow samples. ► Identification of HOXA4, HOXA9, Meis1 and DNMT3A as prognostic markers in AML. -- Abstract: Measurement and validation of microarray gene signatures in routine clinical samples is problematic and a rate-limiting step in translational research. In order to facilitate measurement of microarray-identified gene signatures in routine clinical tissue, a novel method combining quantum dot based oligonucleotide in situ hybridisation (QD-ISH) and post-hybridisation spectral image analysis was used for multiplex in-situ transcript detection in archival bone marrow trephine samples from patients with acute myeloid leukaemia (AML). Tissue microarrays were prepared into which white cell pellets were spiked as a standard. Tissue microarrays were made using routinely processed bone marrow trephines from 242 patients with AML. QD-ISH was performed for six candidate prognostic genes using triplex QD-ISH for DNMT1, DNMT3A, DNMT3B, and for HOXA4, HOXA9, Meis1. Scrambled oligonucleotides were used to correct for background staining, followed by normalisation of expression against the expression values for the white cell pellet standard. Survival analysis demonstrated that low expression of HOXA4 was associated with poorer overall survival (p = 0.009), whilst high expression of HOXA9 (p < 0.0001), Meis1 (p = 0.005) and DNMT3A (p = 0.04) was associated with early treatment failure. These results demonstrate application of a standardised, quantitative multiplex QD-ISH method for identification of prognostic markers in formalin-fixed paraffin-embedded clinical samples, facilitating measurement of gene expression signatures in routine clinical samples.

  4. Differentially expressed genes of Tetrahymena thermophila in response to tributyltin (TBT) identified by suppression subtractive hybridization and real time quantitative PCR.

    Science.gov (United States)

    Feng, Lifang; Miao, Wei; Wu, Yuxuan

    2007-02-15

    Tributyltin (TBT) is widely used in antifouling paints, agricultural biocides, and plastic stabilizers around the world, resulting in serious pollution of aquatic environments. However, biomonitors for detecting TBT in freshwater have been lacking. We constructed a suppression subtractive hybridization library of Tetrahymena thermophila exposed to TBT and screened out 101 Expressed Sequence Tags whose expression was significantly up- or down-regulated by TBT treatment. From these, a series of genes related to TBT toxicity was discovered, such as the glutathione-S-transferase gene (down-regulated), the plasma membrane Ca2+ ATPase isoform 3 gene (up-regulated) and NgoA (up-regulated). Furthermore, their expression under different concentrations of TBT (0.5-40 ppb) was measured by real time fluorescent quantitative PCR. The differentially expressed genes of T. thermophila in response to TBT were thus identified, providing the basis for developing Tetrahymena into a sensitive, rapid and convenient TBT biomonitor in freshwater based on an rDNA inducible expression system.

  5. Quantitative in vivo analyses reveal calcium-dependent phosphorylation sites and identifies a novel component of the Toxoplasma invasion motor complex.

    Directory of Open Access Journals (Sweden)

    Thomas Nebl

    2011-09-01

    Apicomplexan parasites depend on the invasion of host cells for survival and proliferation. Calcium-dependent signaling pathways appear to be essential for micronemal release and gliding motility, yet the targets of activated kinases remain largely unknown. We have characterized calcium-dependent phosphorylation events during Toxoplasma host cell invasion. Stimulation of live tachyzoites with Ca²⁺-mobilizing drugs leads to phosphorylation of numerous parasite proteins, as shown by differential 2-DE display of [³²P]-labeled protein extracts. Multi-dimensional Protein Identification Technology (MudPIT) identified ∼546 phosphorylation sites on over 300 Toxoplasma proteins, including 10 sites on the actomyosin invasion motor. Using stable isotope labeling by amino acids in cell culture (SILAC)-based quantitative LC-MS/MS analyses, we monitored changes in the abundance and phosphorylation of the invasion motor complex and defined Ca²⁺-dependent phosphorylation patterns on three of its components: GAP45, MLC1 and MyoA. Furthermore, calcium-dependent phosphorylation of six residues across GAP45, MLC1 and MyoA is correlated with invasion motor activity. By analyzing proteins that appear to associate more strongly with the invasion motor upon calcium stimulation, we have also identified a novel 15-kDa calmodulin-like protein that likely represents the MyoA essential light chain of the Toxoplasma invasion motor. This suggests that invasion motor activity could be regulated not only by phosphorylation but also by the direct binding of calcium ions to this new component.

  6. Review of quantitative phase-digital holographic microscopy: promising novel imaging technique to resolve neuronal network activity and identify cellular biomarkers of psychiatric disorders

    KAUST Repository

    Marquet, Pierre

    2014-09-22

    Quantitative phase microscopy (QPM) has recently emerged as a new powerful quantitative imaging technique well suited to noninvasively explore a transparent specimen with a nanometric axial sensitivity. In this review, we expose the recent developments of quantitative phase-digital holographic microscopy (QP-DHM). Quantitative phase-digital holographic microscopy (QP-DHM) represents an important and efficient quantitative phase method to explore cell structure and dynamics. In a second part, the most relevant QPM applications in the field of cell biology are summarized. A particular emphasis is placed on the original biological information, which can be derived from the quantitative phase signal. In a third part, recent applications obtained, with QP-DHM in the field of cellular neuroscience, namely the possibility to optically resolve neuronal network activity and spine dynamics, are presented. Furthermore, potential applications of QPM related to psychiatry through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high risk developmental trajectories for psychiatric disorders, are discussed.

  7. Quantitation of heteroplasmy of mtDNA sequence variants identified in a population of AD patients and controls by array-based resequencing.

    Science.gov (United States)

    Coon, Keith D; Valla, Jon; Szelinger, Szabolics; Schneider, Lonnie E; Niedzielko, Tracy L; Brown, Kevin M; Pearson, John V; Halperin, Rebecca; Dunckley, Travis; Papassotiropoulos, Andreas; Caselli, Richard J; Reiman, Eric M; Stephan, Dietrich A

    2006-08-01

    The role of mitochondrial dysfunction in the pathogenesis of Alzheimer's disease (AD) has been well documented. Though evidence for the role of mitochondria in AD seems incontrovertible, the impact of mitochondrial DNA (mtDNA) mutations in AD etiology remains controversial. Although mutations in mitochondrially encoded genes have repeatedly been implicated in the pathogenesis of AD, many of these studies have been plagued by lack of replication as well as by potential contamination from nuclear-encoded mitochondrial pseudogenes. To assess the role of mtDNA mutations in the pathogenesis of AD, while avoiding the pitfalls of nuclear-encoded mitochondrial pseudogenes encountered in previous investigations and showcasing the benefits of a novel resequencing technology, we sequenced the entire coding region (15,452 bp) of mtDNA from 19 extremely well-characterized AD patients and 18 age-matched, unaffected controls utilizing a new, reliable, high-throughput array-based resequencing technique, the Human MitoChip. High-throughput, array-based DNA resequencing of the entire mtDNA coding region from platelets of 37 subjects revealed the presence of 208 loci displaying a total of 917 sequence variants. There were no statistically significant differences in overall mutational burden between cases and controls; however, 265 independent sites of statistically significant change between cases and controls were identified. Changed sites were found in genes associated with complexes I (30.2%), III (3.0%), IV (33.2%), and V (9.1%) as well as tRNA (10.6%) and rRNA (14.0%). Despite their statistical significance, the subtle nature of the observed changes makes it difficult to determine whether they represent true functional variants involved in AD etiology or merely naturally occurring dissimilarity. Regardless, this study demonstrates the tremendous value of this novel mtDNA resequencing platform, which avoids the pitfalls of erroneously amplifying nuclear-encoded mtDNA pseudogenes, and

  8. Quantitative real-time PCR identifies a critical region of deletion on 22q13 related to prognosis in oral cancer

    DEFF Research Database (Denmark)

    Reis, Patricia P; Rogatto, Silvia R; Kowalski, Luiz P

    2002-01-01

    Quantitative real time PCR was performed on genomic DNA from 40 primary oral carcinomas and the normal adjacent tissues. The target genes ECGFB, DIA1, BIK, and PDGFB and the microsatellite markers D22S274 and D22S277, mapped on 22q13, were selected according to our previous loss of heterozygosity findings in head and neck tumors. Quantitative PCR relies on the comparison of the amount of product generated from a target gene and that generated from a disomic reference gene (GAPDH, a housekeeping gene). Reactions were performed with normal controls in triplicate, using the 7700 Sequence Detection … 0018) for patients with DIA1 gene loss. Relative copy number losses detected in these sequences may be related to disease progression and a worse prognosis in patients with oral cancer.
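
    The quantitation described above compares target-gene amplification against a disomic reference gene (GAPDH) in tumour versus normal tissue. A minimal sketch of the comparative-Ct (2^-ΔΔCt) arithmetic commonly used for such relative copy-number estimates, assuming roughly 100% amplification efficiency; the cycle-threshold values below are hypothetical:

```python
def relative_copy_number(ct_target_tumor, ct_ref_tumor,
                         ct_target_normal, ct_ref_normal):
    """Comparative-Ct (2^-ddCt) estimate of target copy number in tumour
    relative to normal tissue, normalised to a disomic reference gene
    such as GAPDH. Assumes the PCR product doubles every cycle, so a
    one-cycle delay corresponds to half the starting template."""
    ddct = ((ct_target_tumor - ct_ref_tumor)
            - (ct_target_normal - ct_ref_normal))
    return 2.0 ** -ddct

# target crosses threshold one cycle later in tumour than in normal
# tissue (reference unchanged) -> ratio 0.5, i.e. loss of one copy
ratio = relative_copy_number(26.0, 25.0, 25.0, 25.0)  # -> 0.5
```

    In practice each Ct is the mean of triplicate reactions, as in the record above, before being entered into the formula.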

  9. Use of quantitative molecular diagnostic methods to identify causes of diarrhoea in children: a reanalysis of the GEMS case-control study.

    Science.gov (United States)

    Liu, Jie; Platts-Mills, James A; Juma, Jane; Kabir, Furqan; Nkeze, Joseph; Okoi, Catherine; Operario, Darwin J; Uddin, Jashim; Ahmed, Shahnawaz; Alonso, Pedro L; Antonio, Martin; Becker, Stephen M; Blackwelder, William C; Breiman, Robert F; Faruque, Abu S G; Fields, Barry; Gratz, Jean; Haque, Rashidul; Hossain, Anowar; Hossain, M Jahangir; Jarju, Sheikh; Qamar, Farah; Iqbal, Najeeha Talat; Kwambana, Brenda; Mandomando, Inacio; McMurry, Timothy L; Ochieng, Caroline; Ochieng, John B; Ochieng, Melvin; Onyango, Clayton; Panchalingam, Sandra; Kalam, Adil; Aziz, Fatima; Qureshi, Shahida; Ramamurthy, Thandavarayan; Roberts, James H; Saha, Debasish; Sow, Samba O; Stroup, Suzanne E; Sur, Dipika; Tamboura, Boubou; Taniuchi, Mami; Tennant, Sharon M; Toema, Deanna; Wu, Yukun; Zaidi, Anita; Nataro, James P; Kotloff, Karen L; Levine, Myron M; Houpt, Eric R

    2016-09-24

    Diarrhoea is the second leading cause of mortality in children worldwide, but establishing the cause can be complicated by diverse diagnostic approaches and varying test characteristics. We used quantitative molecular diagnostic methods to reassess causes of diarrhoea in the Global Enteric Multicenter Study (GEMS). GEMS was a study of moderate to severe diarrhoea in children younger than 5 years in Africa and Asia. We used quantitative real-time PCR (qPCR) to test for 32 enteropathogens in stool samples from cases and matched asymptomatic controls from GEMS, and compared pathogen-specific attributable incidences with those found with the original GEMS microbiological methods, including culture, EIA, and reverse-transcriptase PCR. We calculated revised pathogen-specific burdens of disease and assessed causes in individual children. We analysed 5304 sample pairs. For most pathogens, incidence was greater with qPCR than with the original methods, particularly for adenovirus 40/41 (around five times), Shigella spp or enteroinvasive Escherichia coli (EIEC) and Campylobacter jejuni or C coli (around two times), and heat-stable enterotoxin-producing E coli ([ST-ETEC] around 1·5 times). The six most attributable pathogens became, in descending order, Shigella spp, rotavirus, adenovirus 40/41, ST-ETEC, Cryptosporidium spp, and Campylobacter spp. Pathogen-attributable diarrhoeal burden was 89·3% (95% CI 83·2-96·0) at the population level, compared with 51·5% (48·0-55·0) in the original GEMS analysis. The top six pathogens accounted for 77·8% (74·6-80·9) of all attributable diarrhoea. With use of model-derived quantitative cutoffs to assess individual diarrhoeal cases, 2254 (42·5%) of 5304 cases had one diarrhoea-associated pathogen detected and 2063 (38·9%) had two or more, with Shigella spp and rotavirus being the pathogens most strongly associated with diarrhoea in children with mixed infections. A quantitative molecular diagnostic approach improved population

  10. Isobaric Tags for Relative and Absolute Quantification (iTRAQ)-Based Untargeted Quantitative Proteomic Approach To Identify Change of the Plasma Proteins by Salbutamol Abuse in Beef Cattle.

    Science.gov (United States)

    Zhang, Kai; Tang, Chaohua; Liang, Xiaowei; Zhao, Qingyu; Zhang, Junmin

    2018-01-10

    Salbutamol, a selective β2-agonist, endangers the safety of animal products as a result of illegal use in food animals. In this study, an iTRAQ-based untargeted quantitative proteomic approach was applied to screen potential protein biomarkers in plasma of cattle before and after treatment with salbutamol for 21 days. A total of 62 plasma proteins were significantly affected by salbutamol treatment, which can be used as potential biomarkers to screen for the illegal use of salbutamol in beef cattle. Enzyme-linked immunosorbent assay measurements of five selected proteins demonstrated the reliability of iTRAQ-based proteomics in screening of candidate biomarkers among the plasma proteins. The plasma samples collected before and after salbutamol treatment were well-separated by principal component analysis (PCA) using the differentially expressed proteins. These results suggested that an iTRAQ-based untargeted quantitative proteomic strategy combined with PCA pattern recognition methods can discriminate differences in plasma protein profiles collected before and after salbutamol treatment.
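
    The PCA step above separates pre- and post-treatment plasma samples in the space of differentially expressed proteins. A minimal sketch of that pattern-recognition idea using an SVD of the centred data matrix; the two-"protein" abundance matrix below is a toy example, not the cattle data:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples (rows of X) onto the top principal components
    of the column-centred data, computed via singular value
    decomposition."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# four "pre-treatment" and four "post-treatment" samples; the first
# protein's abundance shifts with treatment, the second does not
X = np.array([[1.0, 0.2], [1.1, 0.1], [0.9, 0.3], [1.0, 0.0],
              [3.0, 0.2], [3.1, 0.1], [2.9, 0.3], [3.0, 0.0]])
scores = pca_scores(X, n_components=1).ravel()
# the two treatment groups land on opposite sides of the first
# principal component (the sign of a component is arbitrary)
```

    With real iTRAQ data the matrix has hundreds of protein columns, but the same projection yields the pre/post separation the abstract reports.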

  11. Quantitative Approach Based on Wearable Inertial Sensors to Assess and Identify Motion and Errors in Techniques Used during Training of Transfers of Simulated c-Spine-Injured Patients

    Directory of Open Access Journals (Sweden)

    Karina Lebel

    2018-01-01

    Patients with suspected spinal cord injuries undergo numerous transfers throughout treatment and care. Effective c-spine stabilization is crucial to minimize the impacts of the suspected injury. Healthcare professionals are trained to perform those transfers using simulation; however, feedback on the manoeuvre is subjective. This paper proposes a quantitative approach to measure the efficacy of c-spine stabilization and provide objective feedback during training. Methods. 3D wearable motion sensors are positioned on a simulated patient to capture the motion of the head and trunk during a training scenario. Spatial and temporal indicators associated with the motion can then be derived from the signals. The approach was developed and tested on data obtained from 21 paramedics performing the log-roll, a transfer technique commonly performed during prehospital and hospital care. Results. In this scenario, 55% of the c-spine motion could be explained by the rescuers' difficulty in maintaining head and trunk alignment during the rotation part of the log-roll and in initiating specific phases of the motion synchronously. Conclusion. The proposed quantitative approach has the potential to be used for personalized feedback during training sessions and could even be embedded into simulation mannequins to provide an innovative training solution.

  12. A quantitative analysis of 2-D gels identifies proteins in which labeling is increased following long-term sensitization in Aplysia

    International Nuclear Information System (INIS)

    Castellucci, V.F.; Kennedy, T.E.; Kandel, E.R.; Goelet, P.

    1988-01-01

    Long-term memory for sensitization of the gill- and siphon-withdrawal reflex in Aplysia, produced by 4 days of training, is associated with increased synaptic efficacy of the connection between the sensory and motor neurons. This training is also accompanied by neuronal growth; there is an increase in the number of synaptic varicosities per sensory neuron and in the number of active zones. Such structural changes may be due to changes in the rates of synthesis of certain proteins. We have searched for proteins in which the rates of [³⁵S]methionine labeling are altered during the maintenance phase of long-term memory for sensitization by using computer-assisted quantitative 2-D gel analysis. This method has allowed us to detect 4 proteins in which labeling is altered after 4 days of sensitization training

  13. Quantitative proteomics as a tool to identify resistance mechanisms in erlotinib-resistant subclones of the non-small cell lung cancer cell line HCC827

    DEFF Research Database (Denmark)

    Jacobsen, Kirstine

    , which in 43-50% of cases are caused by a secondary mutation (T790M) in EGFR. Importantly, a majority of resistance cases are still unexplained (Lin & Bivona, 2012). Our aim is to identify novel resistance mechanisms – and potentially new drug targets – in erlotinib-resistant subclones of the NSCLC cell … of erlotinib, and in biological triplicates on a Q-Exactive mass spectrometer. Only proteins identified with a minimum of 2 unique peptides and in a minimum of 2 of 3 replicates were accepted. Results: Importantly, the resistant clones did not acquire the T790M or other EGFR or KRAS mutations, potentiating … the identification of novel resistance mechanisms. We identified 2875 cytoplasmic proteins present in all 4 cell lines. Of these, 87, 56 and 23 are upregulated >1.5 fold; and 117, 72 and 32 are downregulated >1.5 fold, respectively, in the 3 resistant clones compared to the parental cell line. By network analysis, we…

  14. Quantitative Glycoproteomic Analysis Identifies Platelet-Induced Increase of Monocyte Adhesion via the Up-Regulation of Very Late Antigen 5.

    Science.gov (United States)

    Huang, Jiqing; Kast, Juergen

    2015-08-07

    Physiological stimuli, such as thrombin, or pathological stimuli, such as lysophosphatidic acid (LPA), activate platelets circulating in blood. Once activated, platelets bind to monocytes via P-selectin-PSGL-1 interactions but also release the stored contents of their granules. These platelet releasates, in addition to direct platelet binding, activate monocytes and facilitate their recruitment to atherosclerotic sites. Consequently, understanding the changes platelet releasates induce in monocyte membrane proteins is critical. We studied the glycoproteome changes of THP-1 monocytic cells affected by LPA- or thrombin-induced platelet releasates. We employed lectin affinity chromatography combined with filter-aided sample preparation to achieve high coverage of glycoproteins and membrane proteins as well as high protein sequence coverage. Using stable isotope labeling by amino acids in cell culture, we quantified 1715 proteins, including 852 membrane and 500 glycoproteins, identifying the up-regulation of multiple proteins involved in monocyte extracellular matrix binding and transendothelial migration. Flow cytometry indicated expression changes of integrin α5, integrin β1, PECAM-1, and PSGL-1. The observed increase in monocyte adhesion to fibronectin was determined to be mediated by the up-regulation of very late antigen 5 via a P-selectin-PSGL-1 independent mechanism. This novel aspect could be validated on CD14+ human primary monocytes, highlighting the benefits of the improved enrichment method regarding high membrane protein coverage and reliable quantification.

  15. Primary genome scan to identify putative quantitative trait loci for feedlot growth rate, feed intake, and feed efficiency of beef cattle.

    Science.gov (United States)

    Nkrumah, J D; Sherman, E L; Li, C; Marques, E; Crews, D H; Bartusiak, R; Murdoch, B; Wang, Z; Basarab, J A; Moore, S S

    2007-12-01

    Feed intake and feed efficiency of beef cattle are economically relevant traits. The study was conducted to identify QTL for feed intake and feed efficiency of beef cattle by using genotype information from 100 microsatellite markers and 355 SNP genotyped across 400 progeny of 20 Angus, Charolais, or Alberta Hybrid bulls. Traits analyzed include feedlot ADG, daily DMI, feed-to-gain ratio [F:G, which is the reciprocal of the efficiency of gain (G:F)], and residual feed intake (RFI). A mixed model with sire as random and QTL effects as fixed was used to generate an F-statistic profile across and within families for each trait along each chromosome, followed by empirical permutation tests to determine significance thresholds for QTL detection. Putative QTL for ADG (chromosome-wise P < 0.05) were detected across families on chromosomes 5 (130 cM), 6 (42 cM), 7 (84 cM), 11 (20 cM), 14 (74 cM), 16 (22 cM), 17 (9 cM), 18 (46 cM), 19 (53 cM), and 28 (23 cM). For DMI, putative QTL that exceeded the chromosome-wise P < 0.05 threshold were detected on chromosomes 1 (93 cM), 3 (123 cM), 15 (31 cM), 17 (81 cM), 18 (49 cM), 20 (56 cM), and 26 (69 cM) in the across-family analyses. Putative across-family QTL influencing F:G that exceeded the chromosome-wise P < 0.05 threshold were detected on chromosomes 3 (62 cM), 5 (129 cM), 7 (27 cM), 11 (16 cM), 16 (30 cM), 17 (81 cM), 22 (72 cM), 24 (55 cM), and 28 (24 cM). Putative QTL influencing RFI that exceeded the chromosome-wise P < 0.05 threshold were detected on chromosomes 1 (90 cM), 5 (129 cM), 7 (22 cM), 8 (80 cM), 12 (89 cM), 16 (41 cM), 17 (19 cM), and 26 (48 cM) in the across-family analyses. In addition, a total of 4, 6, 1, and 8 chromosomes showed suggestive evidence (chromosome-wise, P < 0.10) for putative ADG, DMI, F:G, and RFI QTL, respectively. Most of the QTL detected across families were also detected within families, although the locations across families were not necessarily the locations within families, which is
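
    The scan above builds an F-statistic profile along each chromosome and then uses empirical permutation tests to set chromosome-wise significance thresholds. A stripped-down sketch of that permutation logic, using the maximum squared marker-phenotype correlation as a simple stand-in for the mixed-model F-statistic peak (the family structure and marker maps of the actual study are not modelled):

```python
import numpy as np

def max_assoc_stat(genotypes, phenotype):
    """Largest squared correlation between the phenotype and any marker
    column: a simple stand-in for the peak of an F-statistic profile
    along a chromosome."""
    g = (genotypes - genotypes.mean(axis=0)) / genotypes.std(axis=0)
    p = (phenotype - phenotype.mean()) / phenotype.std()
    r = g.T @ p / len(p)          # per-marker correlation with the trait
    return float((r ** 2).max())

def permutation_threshold(genotypes, phenotype, n_perm=200,
                          alpha=0.05, seed=0):
    """Chromosome-wise empirical threshold: shuffling the phenotype
    breaks any genotype-phenotype link while preserving marker
    correlations, so the (1 - alpha) quantile of the null maxima is the
    bar the observed peak must clear."""
    rng = np.random.default_rng(seed)
    null_max = [max_assoc_stat(genotypes, rng.permutation(phenotype))
                for _ in range(n_perm)]
    return float(np.quantile(null_max, 1.0 - alpha))
```

    Taking the maximum over markers inside each permutation is what makes the threshold chromosome-wise rather than per-marker, mirroring the "chromosome-wise P < 0.05" criterion quoted in the abstract.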

  16. Quantitative Analysis of ¹⁸F-Fluorodeoxyglucose Positron Emission Tomography Identifies Novel Prognostic Imaging Biomarkers in Locally Advanced Pancreatic Cancer Patients Treated With Stereotactic Body Radiation Therapy

    International Nuclear Information System (INIS)

    Cui, Yi; Song, Jie; Pollom, Erqi; Alagappan, Muthuraman; Shirato, Hiroki; Chang, Daniel T.; Koong, Albert C.; Li, Ruijiang

    2016-01-01

    Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board–approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT ¹⁸F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and concordance index (CI) was used to evaluate the survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), which improved upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6
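
    Prediction accuracy above is summarized as a concordance index (CI). For illustration, here is a pure-Python sketch of Harrell's C for right-censored survival data, the usual definition behind such CI figures; the toy times, event flags, and risk scores below are invented:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C: among comparable patient pairs, the fraction where
    the higher predicted risk belongs to the patient with the shorter
    observed survival. A pair (i, j) is comparable only if the earlier
    time is an observed event (events[i] == 1), not a censoring time.
    Tied risk scores count as half-concordant."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# perfectly anti-ordered risks give CI = 1.0; a constant score gives
# the chance level 0.5, which is why CI values near 0.5 (like the
# 0.48-0.64 range quoted above) indicate weak discrimination
```

    A CI of 0.66, as reported for the proposed signature, therefore means roughly two of every three comparable patient pairs are ranked correctly by the model.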

  17. High-fidelity target sequencing of individual molecules identified using barcode sequences: de novo detection and absolute quantitation of mutations in plasma cell-free DNA from cancer patients.

    Science.gov (United States)

    Kukita, Yoji; Matoba, Ryo; Uchida, Junji; Hamakawa, Takuya; Doki, Yuichiro; Imamura, Fumio; Kato, Kikuya

    2015-08-01

    Circulating tumour DNA (ctDNA) is an emerging field of cancer research. However, current ctDNA analysis is usually restricted to one or a few mutation sites due to technical limitations. In the case of massively parallel DNA sequencers, the number of false positives caused by a high read error rate is a major problem. In addition, the final sequence reads do not represent the original DNA population due to the global amplification step during template preparation. We established a high-fidelity target sequencing system for individual molecules identified in plasma cell-free DNA using barcode sequences; this system consists of the following two steps. (i) A novel target sequencing method that adds barcode sequences by adaptor ligation. This method uses linear amplification to eliminate the errors introduced during the early cycles of polymerase chain reaction. (ii) The monitoring and removal of erroneous barcode tags. This process identifies the individual molecules that have been sequenced and allows the number of mutations to be quantitated in absolute terms. Using plasma cell-free DNA from patients with gastric or lung cancer, we demonstrated that the system achieved near complete elimination of false positives and enabled de novo detection and absolute quantitation of mutations in plasma cell-free DNA. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
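
    The barcode scheme above tags each original molecule so that sequencer errors can be told apart from true mutations: reads sharing a barcode derive from one molecule, a per-position majority vote within each barcode family suppresses random read errors, and counting families gives absolute quantitation. A simplified sketch of that grouping step (the adaptor-ligation chemistry and erroneous-tag monitoring are not modelled, and `min_family` is an illustrative cutoff):

```python
from collections import Counter, defaultdict

def consensus_by_barcode(reads, min_family=3):
    """Collapse (barcode, sequence) read pairs into one consensus
    sequence per original molecule. Families with fewer than min_family
    reads are discarded as unreliable; within a family, each position is
    called by majority vote, which removes random sequencing errors that
    appear in only a minority of the family's reads."""
    families = defaultdict(list)
    for barcode, seq in reads:
        families[barcode].append(seq)
    consensus = {}
    for barcode, seqs in families.items():
        if len(seqs) < min_family:
            continue
        consensus[barcode] = "".join(
            Counter(column).most_common(1)[0][0] for column in zip(*seqs))
    return consensus

# three reads of one molecule (one carrying a read error at the last
# position) and an unsupported singleton barcode
reads = [("AATG", "ACGT"), ("AATG", "ACGT"), ("AATG", "ACGA"),
         ("CCGA", "ACGT")]
molecules = consensus_by_barcode(reads)  # -> {'AATG': 'ACGT'}
```

    The number of distinct consensus sequences carrying a variant is then a direct count of mutant molecules in the original plasma sample, which is the basis of the absolute quantitation described above.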

  18. Integrative genomic analysis identifies ancestry-related expression quantitative trait loci on DNA polymerase β and supports the association of genetic ancestry with survival disparities in head and neck squamous cell carcinoma.

    Science.gov (United States)

    Ramakodi, Meganathan P; Devarajan, Karthik; Blackman, Elizabeth; Gibbs, Denise; Luce, Danièle; Deloumeaux, Jacqueline; Duflo, Suzy; Liu, Jeffrey C; Mehra, Ranee; Kulathinal, Rob J; Ragin, Camille C

    2017-03-01

    African Americans with head and neck squamous cell carcinoma (HNSCC) have a lower survival rate than whites. This study investigated the functional importance of ancestry-informative single-nucleotide polymorphisms (SNPs) in HNSCC and also examined the effect of functionally important genetic elements on racial disparities in HNSCC survival. Ancestry-informative SNPs, RNA sequencing, methylation, and copy number variation data for 316 oral cavity and laryngeal cancer patients were analyzed across 178 DNA repair genes. The results of expression quantitative trait locus (eQTL) analyses were also replicated with a Gene Expression Omnibus (GEO) data set. The effects of eQTLs on overall survival (OS) and disease-free survival (DFS) were evaluated. Five ancestry-related SNPs were identified as significant cis-eQTLs in the DNA polymerase β (POLB) gene at the false discovery rate (FDR) threshold and were associated with genetic ancestry (P = .002). An association was also observed between these eQTLs and OS. These findings show that ancestry-related alleles could act as eQTLs in HNSCC and support the association of ancestry-related genetic factors with survival disparities in patients diagnosed with oral cavity and laryngeal cancer. Cancer 2017;123:849-60. © 2016 American Cancer Society.

  19. Quantitative proteomics identifies altered O-GlcNAcylation of structural, synaptic and memory-associated proteins in Alzheimer's disease: Brain protein O-GlcNAcylation in Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Sheng [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Yang, Feng [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Petyuk, Vladislav A. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Shukla, Anil K. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Monroe, Matthew E. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Gritsenko, Marina A. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Rodland, Karin D. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Smith, Richard D. [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Qian, Wei-Jun [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA; Gong, Cheng-Xin [New York State Institute for Basic Research in Developmental Disabilities, Staten Island, New York USA; Liu, Tao [Biological Sciences Division, Pacific Northwest National Laboratory, Richland WA USA

    2017-07-28

    Protein modification by O-linked beta-N-acetylglucosamine (O-GlcNAc) is emerging as an important factor in the pathogenesis of sporadic Alzheimer’s disease. Herein we report the most comprehensive, quantitative proteomics analysis for protein O-GlcNAcylation in post-mortem human brains with and without Alzheimer’s using isobaric tandem mass tags labeling, chemoenzymatic photocleavage enrichment and liquid chromatography coupled to mass spectrometry. A total of 1,850 O-GlcNAc peptides covering 1,094 O-GlcNAcylation sites were identified from 530 proteins in the human brain. 128 O-GlcNAc peptides covering 78 proteins were altered significantly in Alzheimer’s brain as compared to controls (q<0.05). Moreover, alteration of the O-GlcNAc peptide abundance could be attributed more to O-GlcNAcylation level than to protein level changes. The altered O-GlcNAcylated proteins belong to several structural and functional categories, including synaptic proteins, cytoskeleton proteins, and memory-associated proteins. These findings suggest that dysregulation of O-GlcNAcylation of multiple brain proteins may be involved in the development of sporadic Alzheimer’s disease.

  20. Quantitative Non-canonical Amino Acid Tagging (QuaNCAT) Proteomics Identifies Distinct Patterns of Protein Synthesis Rapidly Induced by Hypertrophic Agents in Cardiomyocytes, Revealing New Aspects of Metabolic Remodeling*

    Science.gov (United States)

    Liu, Rui; Kenney, Justin W.; Manousopoulou, Antigoni; Johnston, Harvey E.; Kamei, Makoto; Woelk, Christopher H.; Xie, Jianling; Schwarzer, Michael; Proud, Christopher G.

    2016-01-01

    Cardiomyocytes undergo growth and remodeling in response to specific pathological or physiological conditions. In the former, myocardial growth is a risk factor for cardiac failure, and faster protein synthesis is a major factor driving cardiomyocyte growth. Our goal was to quantify the rapid effects of different pro-hypertrophic stimuli on the synthesis of specific proteins in ARVC and to determine whether such effects are caused by alterations in mRNA abundance or in the translation of specific mRNAs. Cardiomyocytes have very low rates of protein synthesis, which poses a challenging problem for studying changes in the synthesis of specific proteins, one that also applies to other nondividing primary cells. To study the rates of accumulation of specific proteins in these cells, we developed an optimized version of the Quantitative Noncanonical Amino acid Tagging LC/MS proteomic method to label and selectively enrich newly synthesized proteins in these primary cells, while eliminating the suppressive effects of pre-existing and highly abundant nonisotope-tagged polypeptides. Our data revealed that a classical pathological stimulus (phenylephrine; PE) and insulin, which has recently been identified as also contributing to the development of pathological cardiac hypertrophy, both increased the synthesis of proteins involved in, e.g., glycolysis, the Krebs cycle and beta-oxidation, as well as sarcomeric components. However, insulin increased the synthesis of many metabolic enzymes to a greater extent than PE. Using a novel validation method, we confirmed that synthesis of selected candidates is indeed up-regulated by PE and insulin. Synthesis of all proteins studied was up-regulated by signaling through mammalian target of rapamycin complex 1 without changes in their mRNA levels, showing the key importance of translational control in the rapid effects of hypertrophic stimuli. Expression of PKM2 was up-regulated in rat hearts following TAC. This isoform possesses specific regulatory

  1. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  2. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
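    The watts-per-organism bookkeeping proposed above can be made concrete with a toy calculation: divide the environmental power supply by the population and compare with metabolic demand. The 10% margin used below to separate "extreme" from "plush" is an invented illustration, not a threshold from the paper:

    ```python
    def habitability_watts(supply_w, n_organisms, demand_w_per_organism):
        """Express habitability in watts per organism and classify the
        environment by how far supply exceeds per-organism demand."""
        per_organism = supply_w / n_organisms
        ratio = per_organism / demand_w_per_organism
        if ratio < 1.0:
            label = "uninhabitable"  # power supply below demand
        elif ratio < 1.1:
            label = "extreme"        # supply only barely exceeds demand
        else:
            label = "plush"          # comfortable energetic surplus
        return per_organism, label

    # hypothetical microbial habitat: 1 mW feeding a million cells
    per_cell, label = habitability_watts(1e-3, 1e6, 9.5e-10)
    print(per_cell, label)
    ```

    The point of the framework is that "extreme" and "plush" sit on one continuum of this ratio rather than being qualitatively different categories.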

  3. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  4. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department have begun.

  5. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape (obesity, cachexia) and the accompanying variations in counting efficiency, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used to calculate an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and, in the case of aligning opposite views, by cross-correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements was below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2 D%) and patients with lymphedema (2.05 ± 2.5 D%) was highly significant using prefascial lymphography and sc injection. Clearance-curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)
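    At its core, the transmission-based correction amounts to dividing emission counts by the measured photon survival fraction for each region. A first-order, per-region sketch with illustrative numbers (the real method builds a full pixel-wise correction matrix and aligns it via centers of gravity):

    ```python
    def absorption_corrected_uptake(emission, transmission, blank):
        """Correct emission counts for body absorption, region by region:
        the transmission/blank ratio is the fraction of photons that
        survive the patient, so dividing by it restores the true counts."""
        corrected = []
        for e, t, b in zip(emission, transmission, blank):
            survival = t / b          # e.g. 2500/10000 -> 25% get through
            corrected.append(e / survival)
        return corrected

    # hypothetical single region: 500 detected counts, 25% photon survival
    print(absorption_corrected_uptake([500.0], [2500.0], [10000.0]))  # [2000.0]
    ```

    This is why obesity or cachexia no longer biases the uptake percentage: the patient-specific attenuation is measured rather than assumed.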

  6. Quantitative Thermochronology

    Science.gov (United States)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach built around a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate it to Earth processes Essential background material to aid understanding and using thermochronological data Provides a thorough treatise on numerical modeling of heat transport in the Earth's crust Supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching

  7. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  8. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
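    The calibration idea, mapping semi-quantitative index scores onto peer failure-rate data, can be sketched as a log-linear interpolation between anchor points. The anchor scores and rates below are hypothetical, not values from any pipeline data set:

    ```python
    import math

    def calibrate(index_scores, anchor_scores, anchor_rates):
        """Map semi-quantitative index scores to failure frequencies by
        fitting log(rate) = a + b * score through two anchor points
        taken from peer-pipeline failure statistics."""
        (s1, s2), (r1, r2) = anchor_scores, anchor_rates
        b = (math.log(r2) - math.log(r1)) / (s2 - s1)
        a = math.log(r1) - b * s1
        return [math.exp(a + b * s) for s in index_scores]

    # hypothetical anchors: index 20 ~ 1e-5 failures/km-yr, index 80 ~ 1e-3
    rates = calibrate([20, 50, 80], (20, 80), (1e-5, 1e-3))
    print(rates)  # the midpoint score 50 maps to ~1e-4 failures/km-yr
    ```

    A log-linear map is a natural first choice because index schemes are typically additive while failure rates span orders of magnitude; a real calibration would fit the whole peer failure-rate distribution rather than two points.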

  9. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in communication of climate model output: a language which accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output.

  10. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    To identify proteins involved in Ang-(1-7) signaling, we performed a mass spectrometry-based, time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide

  11. Quantitative nature of overexpression experiments

    Science.gov (United States)

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  12. 78 FR 64202 - Quantitative Messaging Research

    Science.gov (United States)

    2013-10-28

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... comments. Please submit your comments using only one method and identify that it is for the ``Quantitative...

  13. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  14. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  15. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  16. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which cover the basic concepts and meaning of analytical chemistry, SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (with an introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  17. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =_ε b, which we think of as saying that “a is approximately equal to b up to an error of ε”. We present four interesting examples of quantitative equational theories whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed

  18. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures, and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  19. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  20. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework specifies the elements that comprise it. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  1. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
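    The 5 x 5 risk matrix mentioned above reduces to a simple lookup once probability and consequence have each been scored. A sketch with illustrative bin edges (real matrices define their own cell-by-cell rankings):

    ```python
    def risk_rank(probability, consequence):
        """Classic 5 x 5 risk-matrix lookup: probability and consequence
        are each scored 1-5 and their product is binned into a rank.
        The bin edges here are illustrative, not a standard."""
        if not (1 <= probability <= 5 and 1 <= consequence <= 5):
            raise ValueError("scores must lie in 1..5")
        score = probability * consequence
        if score >= 15:
            return "high"
        if score >= 6:
            return "medium"
        return "low"

    print(risk_rank(4, 5), risk_rank(2, 4), risk_rank(1, 3))  # high medium low
    ```

    The paper's point is precisely that decisions get made long before scores like these exist; the lookup only formalizes judgments the team has already been acting on pre-quantitatively.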

  2. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

    Quantitative Secondary Electron Detection (QSED) using an array of solid state device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples (FIG. 3). Methods and devices effect a quantitative detection of secondary electrons with an array comprising a number of solid state detectors. The array senses the number of secondary electrons with a plurality of solid state detectors, counting the number of secondary electrons with a time-to-digital converter circuit in counter mode.

  3. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from the other fields of systems biology. Commanding huge resources, quantitative proteomics handles vast amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use isotope labels (tags) of various kinds. In this review we consider the most popular and effective methods, employing chemical modification of proteins as well as metabolic and enzymatic isotope labeling.

  4. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of ECB quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  5. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
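    The extrapolation to ⟨x²⟩ = 0 can be sketched numerically: in the high-temperature limit ⟨x²⟩ grows roughly linearly with temperature, so ln(area) falls linearly with T and the intercept of a least-squares line gives an area freed of the site-dependent recoil-free fraction. The sketch below uses synthetic data and a simple classical approximation (zero-point motion ignored); it illustrates the idea, not the author's exact procedure:

    ```python
    import math

    def area_at_zero_msd(temps, areas):
        """Fit ln(area) = intercept + slope * T by least squares and
        return exp(intercept), the area extrapolated toward <x^2> = 0
        in the classical high-temperature approximation."""
        n = len(temps)
        xbar = sum(temps) / n
        ybar = sum(math.log(a) for a in areas) / n
        slope = (sum(t * math.log(a) for t, a in zip(temps, areas))
                 - n * xbar * ybar) / (sum(t * t for t in temps) - n * xbar * xbar)
        intercept = ybar - slope * xbar
        return math.exp(intercept)

    # synthetic site obeying A(T) = 2.0 * exp(-0.004 T): intercept recovers 2.0
    temps = [100, 150, 200, 250]
    areas = [2.0 * math.exp(-0.004 * t) for t in temps]
    print(round(area_at_zero_msd(temps, areas), 6))  # 2.0
    ```

    Comparing the extrapolated areas of two sites (e.g. Fe³⁺ vs Fe²⁺) then yields a concentration ratio undistorted by their different thermal motion.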

  6. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  7. Quantitative Mapping of Large Area Graphene Conductance

    DEFF Research Database (Denmark)

    Buron, Jonas Christian Due; Petersen, Dirch Hjorth; Bøggild, Peter

    2012-01-01

    We present quantitative mapping of large area graphene conductance by terahertz time-domain spectroscopy and micro four point probe. We observe a clear correlation between the techniques and identify the observed systematic differences to be directly related to imperfections of the graphene sheet...

  8. Values in Qualitative and Quantitative Research

    Science.gov (United States)

    Duffy, Maureen; Chenail, Ronald J.

    2008-01-01

    The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…

  9. 78 FR 52166 - Quantitative Messaging Research

    Science.gov (United States)

    2013-08-22

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... The survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...

  10. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  11. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  12. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.
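
    A common stylometric ingredient of the kind used in such analyses is a vector of function-word frequencies compared across texts (the Latin word list and the Euclidean distance below are illustrative assumptions, not the paper's actual feature set or classifier):

    ```python
    import math
    from collections import Counter

    # Hypothetical list of high-frequency Latin function words.
    FUNCTION_WORDS = ["et", "in", "non", "ut", "cum", "ad", "sed", "est"]

    def style_vector(text):
        """Relative frequencies of common function words in a passage."""
        tokens = text.lower().split()
        counts = Counter(tokens)
        n = max(len(tokens), 1)
        return [counts[w] / n for w in FUNCTION_WORDS]

    def style_distance(text_a, text_b):
        """Euclidean distance between two style vectors (smaller = stylistically closer)."""
        va, vb = style_vector(text_a), style_vector(text_b)
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(va, vb)))
    ```

    Attribution studies then compare such distances between a disputed text and candidate authors' corpora.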

  13. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  14. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with 99mTc pyrophosphate and 99mTc methylene diphosphonate in the years between 1977 and 1979. Course control examinations were carried out in 12 patients. The collective of patients presented with primary skeletal tumours, metastases, inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the "region of interest" technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as far as diagnosis, course control, and differential diagnosis are concerned. Quantitative skeletal scintiscanning by means of the "region of interest" technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG) [de

  15. Thoughts on identifiers

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    As business processes and information transactions have become an inextricably intertwined with the Web, the importance of assignment, registration, discovery, and maintenance of identifiers has increased. In spite of this, integrated frameworks for managing identifiers have been slow to emerge. Instead, identification systems arise (quite naturally) from immediate business needs without consideration for how they fit into larger information architectures. In addition, many legacy identifier systems further complicate the landscape, making it difficult for content managers to select and deploy identifier systems that meet both the business case and long term information management objectives. This presentation will outline a model for evaluating identifier applications and the functional requirements of the systems necessary to support them. The model is based on a layered analysis of the characteristics of identifier systems, including: * Functional characteristics * Technology * Policy * Business * Social T...

  16. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET]

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100g/min; right = 25.6 ± 7.0 μmol/100g/min) were slightly reduced compared to the ipsilateral hemispherical rate (left = 30.4 ± 6.8 μmol/100g/min; right = 29.5 ± 7.2 μmol/100g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right to left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  18. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

    The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of "characterization problems" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  19. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  20. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
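
    The birth-death-with-diffusion model that QuaSSE fits can be illustrated with a small forward simulation (a sketch only: the logistic speciation function, constant extinction rate, and all parameter values below are invented for illustration, not taken from the paper):

    ```python
    import math
    import random

    def simulate_trait_dependent_diversification(x0=0.0, sigma=0.1,
                                                 t_max=5.0, dt=0.01, seed=1):
        """Forward-simulate lineages whose trait x evolves by Brownian motion
        (diffusion rate sigma) while speciation rate lambda(x) increases
        logistically with x and extinction rate mu is constant."""
        random.seed(seed)
        spec = lambda x: 0.2 + 0.3 / (1.0 + math.exp(-x))  # lambda(x), assumed
        ext = lambda x: 0.1                                 # mu, assumed constant
        lineages = [x0]
        t = 0.0
        while t < t_max and 0 < len(lineages) < 10000:
            nxt = []
            for x in lineages:
                u = random.random()
                if u < spec(x) * dt:                 # speciation: two daughters
                    nxt.extend([x, x])
                elif u < (spec(x) + ext(x)) * dt:    # extinction: lineage dies
                    continue
                else:                                # otherwise trait diffuses
                    nxt.append(x + random.gauss(0.0, sigma * math.sqrt(dt)))
            lineages = nxt
            t += dt
        return lineages
    ```

    With these assumed rates, high-trait lineages speciate faster, so the surviving clade's trait distribution drifts upward; QuaSSE works in the opposite direction, inferring such rate functions from an observed tree and tip traits.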

  1. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
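
    As an illustration of the first method listed, the Internal Standard method rests on the proportionality I_a/I_s = k·(w_a/w_s) between peak intensities and weight fractions; once k is calibrated on mixtures of known composition, an unknown analyte fraction follows directly (a textbook sketch, not the paper's specific procedure):

    ```python
    def internal_standard_fraction(i_analyte, i_standard, w_standard, k):
        """Return the analyte weight fraction w_a from the measured peak
        intensity ratio I_a/I_s, the known spiked-in standard fraction w_s,
        and the calibration constant k (determined on known mixtures)."""
        return (i_analyte / i_standard) * w_standard / k
    ```

    For example, with k = 2.0, a 10% standard addition, and equal analyte and standard peak intensities, the analyte weight fraction is 0.05.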

  2. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields
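
    The dose measurement described above, integrating the ion beam current and converting the collected charge to a particle count, can be sketched as follows (assuming singly charged ions, complete collection, and no resputtering loss, which the report's model would correct for):

    ```python
    # Elementary charge in coulombs.
    E_CHARGE = 1.602176634e-19

    def implanted_ions(current_samples_a, dt_s, charge_state=1):
        """Estimate the number of implanted ions from beam-current samples
        (amperes) taken at a fixed interval dt_s (seconds), using trapezoidal
        integration of the current and dividing the charge by q*e."""
        q = sum((current_samples_a[i] + current_samples_a[i + 1]) / 2.0 * dt_s
                for i in range(len(current_samples_a) - 1))
        return q / (charge_state * E_CHARGE)
    ```

    A steady 1 μA beam collected for 1 s thus delivers about 6.24 × 10^12 singly charged ions, illustrating the submicrogram portioning the report investigates.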

  3. Quantitative cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Thelen, M.; Dueber, C.; Wolff, P.; Erbel, R.; Hoffmann, T.

    1985-06-01

    The scope and limitations of quantitative cardiac CT have been evaluated in a series of experimental and clinical studies. The left ventricular muscle mass was estimated by computed tomography in 19 dogs (using volumetric methods, measurements in two axes and planes and reference volume). There was good correlation with anatomical findings. The end-diastolic volume of the left ventricle was estimated in 22 patients with cardiomyopathies; using angiography as a reference, CT led to systematic under-estimation. It is also shown that ECG-triggered magnetic resonance tomography results in improved visualisation and may be expected to improve measurements of cardiac morphology.

  4. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  5. Quantitative trait loci (QTL) mapping for inflorescence length traits in ...

    African Journals Online (AJOL)

    Lablab purpureus (L.) sweet is an ancient legume species whose immature pods serve as a vegetable in south and south-east Asia. The objective of this study is to identify quantitative trait loci (QTLs) associated with quantitative traits such as inflorescence length, peduncle length from branch to axil, peduncle length from ...

  6. Identifying Strategic Scientific Opportunities

    Science.gov (United States)

    As NCI's central scientific strategy office, CRS collaborates with the institute's divisions, offices, and centers to identify research opportunities to advance NCI's vision for the future of cancer research.

  7. Identifying Breast Cancer Oncogenes

    Science.gov (United States)

    2011-10-01

    Grant W81XWH-08-1-0767 TITLE: Identifying Breast Cancer Oncogenes PRINCIPAL INVESTIGATOR: Yashaswi Shrestha (Dana-Farber). ...cells we observed that it promoted transformation of HMLE cells, suggesting a tumor suppressive role of Merlin in breast cancer (Figure 4B)...

  8. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  9. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent-time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability or normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are: Tumor volume, histopathologic differentiation and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)

  10. Identifying Knowledge and Communication

    Directory of Open Access Journals (Sweden)

    Eduardo Coutinho Lourenço de Lima

    2006-12-01

    Full Text Available In this paper, I discuss how the principle of identifying knowledge which Strawson advances in 'Singular Terms and Predication' (1961) and in 'Identifying Reference and Truth-Values' (1964) turns out to constrain communication. The principle states that a speaker's use of a referring expression should invoke identifying knowledge on the part of the hearer, if the hearer is to understand what the speaker is saying, and also that, in so referring, speakers are attentive to hearers' epistemic states. In contrasting it with Russell's Principle (Evans 1982), as well as with the principle of identifying descriptions (Donnellan 1970), I try to show that the principle of identifying knowledge, ultimately a condition for understanding, makes sense only in a situation of conversation. This allows me to conclude that the cooperative feature of communication (Grice 1975) and reference (Clark and Wilkes-Gibbs 1986) holds also at the understanding level. Finally, I discuss where Strawson's views seem to be unsatisfactory, and suggest how they might be improved.

  11. Quantitative trait loci mapping for stomatal traits in interspecific ...

    Indian Academy of Sciences (India)

    M. Sumathi

    2018-02-23

    Feb 23, 2018 ... Journal of Genetics ... QTL analysis was carried out to identify the chromosomal regions affecting stomatal traits. Keywords: linkage map; quantitative trait loci; stomata; stress.

  12. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system and the volumetric percentage of perlite in nodular cast irons, porosity and average grain size in high-density sintered pellets of UO2, and grain size of ferritic steel. Techniques adopted are described and results obtained are compared with the corresponding ones by the direct counting process: counting of systematic points (grid) to measure volume, and the intersections method, utilizing a circumference of known radius for the average grain size. The adopted technique for nodular cast iron resulted from the small difference in optical reflectivity of graphite and perlite. Porosity evaluation of sintered UO2 pellets is also analyzed [pt
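
    The grid point-counting method mentioned above can be sketched in a few lines (a toy illustration on a synthetic binary image; the Micro-Videomat hardware and its reflectivity thresholding are not modelled):

    ```python
    def point_count_fraction(image, step=4):
        """Estimate a phase's volume fraction by systematic point counting.

        `image` is a 2D list of booleans (True where the phase of interest
        is present).  By stereology, the fraction of grid points hitting
        the phase (P_P) estimates its volume fraction (V_V)."""
        hits = total = 0
        for i in range(0, len(image), step):
            for j in range(0, len(image[0]), step):
                total += 1
                hits += image[i][j]
        return hits / total

    # Synthetic micrograph: the left half of a 100x100 field is one phase.
    img = [[j < 50 for j in range(100)] for _ in range(100)]
    ```

    On this synthetic field the 4-pixel grid samples 13 of 25 columns inside the phase, giving an estimate of 0.52 against the true area fraction of 0.50; a finer grid tightens the estimate.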

  13. Identifying and Managing Risk.

    Science.gov (United States)

    Abraham, Janice M.

    1999-01-01

    The role of the college or university chief financial officer in institutional risk management is (1) to identify risk (physical, casualty, fiscal, business, reputational, workplace safety, legal liability, employment practices, general liability), (2) to develop a campus plan to reduce and control risk, (3) to transfer risk, and (4) to track and…

  14. Internally readable identifying tag

    International Nuclear Information System (INIS)

    Jefferts, K.B.; Jefferts, E.R.

    1980-01-01

    A method of identifying non-metallic objects by means of X-ray equipment is described in detail. A small metal pin with a number of grooves cut in a pre-determined equi-spaced pattern is implanted into the non-metallic object and by decoding the groove patterns using X-ray equipment, the object is uniquely identified. A specific example of such an application is in studying the migratory habits of fish. The pin inserted into the snout of the fish is 0.010 inch in diameter, 0.040 inch in length with 8 possible positions for grooves if spaced 0.005 inch apart. With 6 of the groove positions available for data, the capacity is 2^6 = 64 combinations; clearly longer pins would increase the data capacity. This method of identification is a major advance over previous techniques which necessitated destruction of the fish in order to recover the identification tag. (UK)
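
    The capacity arithmetic above (6 data positions giving 2^6 = 64 combinations) amounts to a binary encoding of groove presence/absence (the decoder below is hypothetical; the X-ray readout itself is not modelled):

    ```python
    def tag_capacity(data_positions):
        """Number of distinct IDs encodable with binary groove positions."""
        return 2 ** data_positions

    def decode_grooves(pattern):
        """Read a groove pattern (list of 0/1, most significant bit first)
        as an integer identifier."""
        value = 0
        for bit in pattern:
            value = value * 2 + bit
        return value
    ```

    Adding one groove position per pin length increment doubles the capacity, which is why longer pins increase the data capacity so quickly.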

  15. Identifying Breast Cancer Oncogenes

    Science.gov (United States)

    2010-10-01

    tyrosine kinases with an SH3, SH2 and catalytic domain, it lacks a native myristylation signal shared by most members of this class [14], [38]. The...therapeutics and consequently, improve clinical outcomes. We aim to identify novel drivers of breast oncogenesis. We hypothesize that a kinase gain-of...human mammary epithelial cells. A pBabe-Puro-Myr-Flag kinase open reading frame (ORF) library was screened in immortalized human mammary epithelial

  16. Rock disposal problems identified

    Energy Technology Data Exchange (ETDEWEB)

    Knox, R

    1978-06-01

    Mathematical models are the only way of examining the return of radioactivity from nuclear waste to the environment over long periods of time. Work in Britain has helped identify areas where more basic data are required, but initial results look very promising for final disposal of high-level waste in hard rock repositories. A report by the National Radiological Protection Board on a recent study is examined.

  17. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/μm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
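
    Given the homogeneous stretching to about 2.3 kb/μm described above, converting a measured inter-probe distance into an estimated kilobase separation is a single multiplication (a minimal sketch; real QDFM measurements also involve calibration and imaging error):

    ```python
    # Stretching factor reported for QDFM: ~2.3 kilobase pairs per micrometer.
    KB_PER_UM = 2.3

    def fiber_distance_kb(distance_um):
        """Map a measured inter-probe distance on a stretched DNA fiber
        (micrometers) to an estimated separation in kilobase pairs."""
        return distance_um * KB_PER_UM
    ```

    So two probe signals measured 10 μm apart on a fiber correspond to roughly 23 kb of sequence between their binding sites.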

  18. Identifying phenomenal consciousness.

    Science.gov (United States)

    Schier, Elizabeth

    2009-03-01

    This paper examines the possibility of finding evidence that phenomenal consciousness is independent of access. The suggestion reviewed is that we should look for isomorphisms between phenomenal and neural activation spaces. It is argued that the fact that phenomenal spaces are mapped via verbal report is no problem for this methodology. The fact that activation and phenomenal space are mapped via different means does not mean that they cannot be identified. The paper finishes by examining how data addressing this theoretical question could be obtained.

  19. A quantitative philology of introspection

    Directory of Open Access Journals (Sweden)

    Carlos Diuk

    2012-09-01

    Full Text Available The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the "Axial Age", saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy - which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single "arrow of time" in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the 20th century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus.
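
    As a minimal illustration of a semantic-similarity measure (the article's actual statistical metric is not specified in this abstract), a bag-of-words cosine similarity between two passages can be computed as:

    ```python
    import math
    from collections import Counter

    def cosine_similarity(text_a, text_b):
        """Cosine similarity between the word-count vectors of two passages:
        identical passages score 1.0, passages sharing no words score 0.0."""
        a = Counter(text_a.lower().split())
        b = Counter(text_b.lower().split())
        shared = set(a) & set(b)
        dot = sum(a[w] * b[w] for w in shared)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0
    ```

    Tracking such a similarity between each text and a reference corpus of introspective vocabulary, ordered by composition date, is one simple way to turn a high-level concept into a time series.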

  20. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger. INTRODUCTION. Animal molecular sexing ....

  1. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  2. List identifies threatened ecosystems

    Science.gov (United States)

    Showstack, Randy

    2012-09-01

    The International Union for Conservation of Nature (IUCN) announced on 9 September that it will develop a new Red List of Ecosystems that will identify which ecosystems are vulnerable or endangered. The list, which is modeled on the group's Red List of Threatened Species™, could help to guide conservation activities and influence policy processes such as the Convention on Biological Diversity, according to the group. “We will assess the status of marine, terrestrial, freshwater, and subterranean ecosystems at local, regional, and global levels,” stated Jon Paul Rodriguez, leader of IUCN's Ecosystems Red List Thematic Group. “The assessment can then form the basis for concerted implementation action so that we can manage them sustainably if their risk of collapse is low or restore them if they are threatened and then monitor their recovery.”

  3. Global Microbial Identifier

    DEFF Research Database (Denmark)

    Wielinga, Peter; Hendriksen, Rene S.; Aarestrup, Frank Møller

    2017-01-01

    ) will likely also enable a much better understanding of the pathogenesis of the infection and the molecular basis of the host response to infection. But the full potential of these advances will only transpire if the data in this area become transferable and thereby comparable, preferably in open-source...... of microorganisms, for the identification of relevant genes and for the comparison of genomes to detect outbreaks and emerging pathogens. To harness the full potential of WGS, a shared global database of genomes linked to relevant metadata and the necessary software tools needs to be generated, hence the global...... microbial identifier (GMI) initiative. This tool will ideally be used, amongst others, in the diagnosis of infectious diseases in humans and animals, in the identification of microorganisms in food and environment, and to track and trace microbial agents in all arenas globally. This will require...

  4. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single photon. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  5. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. If you wish to perfectly take up the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  6. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer and the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly

  7. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  8. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  9. Radiograph identifying means

    International Nuclear Information System (INIS)

    Sheldon, A.D.

    1983-01-01

    A flexible character-indentable plastics embossing tape is backed by and bonded to a lead strip, not more than 0.025 inches thick, to form a tape suitable for identifying radiographs. The lead strip is itself backed by a relatively thin and flimsy plastics or fabric strip which, when removed, allows the lead plastic tape to be pressure-bonded to the surface to be radiographed. A conventional tape-embossing gun is used to indent the desired characters in succession into the lead-backed tape, without necessarily severing the lead; and then the backing strip is peeled away to expose the layer of adhesive which pressure-bonds the indented tape to the object to be radiographed. X-rays incident on the embossed tape will cause the raised characters to show up dark on the subsequently-developed film, whilst the raised side areas will show up white. Each character will thus stand out on the developed film. (author)

  10. Quantitative phosphoproteomics to characterize signaling networks

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Blagoev, Blagoy

    2012-01-01

    for analyzing protein phosphorylation at a system-wide scale and has become the intuitive strategy for comprehensive characterization of signaling networks. Contemporary phosphoproteomics use highly optimized procedures for sample preparation, mass spectrometry and data analysis algorithms to identify......Reversible protein phosphorylation is involved in the regulation of most, if not all, major cellular processes via dynamic signal transduction pathways. During the last decade quantitative phosphoproteomics have evolved from a highly specialized area to a powerful and versatile platform...... and quantify thousands of phosphorylations, thus providing extensive overviews of the cellular signaling networks. As a result of these developments quantitative phosphoproteomics have been applied to study processes as diverse as immunology, stem cell biology and DNA damage. Here we review the developments...

  11. Quantitative trait loci associated with anthracnose resistance in sorghum

    Science.gov (United States)

    With an aim to develop a durable resistance to the fungal disease anthracnose, two unique genetic sources of resistance were selected to create genetic mapping populations to identify regions of the sorghum genome that encode anthracnose resistance. A series of quantitative trait loci were identifi...

  12. Quantitative Trait Locus and Brain Expression of HLA-DPA1 Offers Evidence of Shared Immune Alterations in Psychiatric Disorders

    Directory of Open Access Journals (Sweden)

    Ling Z. Morgan

    2016-03-01

    Full Text Available Genome-wide association studies of schizophrenia encompassing the major histocompatibility locus (MHC) were highly significant following genome-wide correction. This broad region implicates many genes, including the MHC complex class II. Within this interval we examined the expression of two MHC II genes (HLA-DPA1 and HLA-DRB1) in brain from individual subjects with schizophrenia (SZ), bipolar disorder (BD), major depressive disorder (MDD), and controls by differential gene expression methods. A third MHC II mRNA, CD74, was studied outside of the MHC II locus, as it interacts within the same immune complex. Exon microarrays were performed in anterior cingulate cortex (ACC) in BD compared to controls, and both HLA-DPA1 and CD74 were decreased in expression in BD. The expression of HLA-DPA1 and CD74 was reduced in hippocampus, amygdala, and dorsolateral prefrontal cortex regions in SZ and BD compared to controls by specific qPCR assay. We found several novel HLA-DPA1 mRNA variants spanning HLA-DPA1 exons 2-3-4, as suggested by exon microarrays. The intronic rs9277341 SNP was a significant cis expression quantitative trait locus (eQTL) that was associated with the total expression of HLA-DPA1 in five brain regions. A biomarker study of MHC II mRNAs was conducted in SZ, BD, MDD, and control lymphoblastic cell lines (LCL) by qPCR assay of 87 subjects. There was significantly decreased expression of HLA-DPA1 and CD74 in BD, and trends for reductions in SZ in LCLs. The discovery of multiple splicing variants in brain for HLA-DPA1 is important, as the HLA-DPA1 gene is highly conserved, there are no reported splicing variants, and its functions in brain are unknown. Future work on the function and localization of MHC Class II proteins in brain will help to understand the role of these alterations in neuropsychiatric disorders. The HLA-DPA1 eQTL is located within a large linkage disequilibrium block that has an irrefutable association with schizophrenia. Future
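
    A cis-eQTL association of the kind reported above (a SNP genotype predicting total gene expression) is commonly tested with an additive linear model of expression on minor-allele dosage. A hedged sketch on simulated data (effect size, noise level and sample size are invented for illustration; real analyses adjust for covariates):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated cohort: minor-allele dosages (0/1/2) at a hypothetical cis SNP
# and log-expression values carrying an additive genetic effect.
n = 200
genotype = rng.integers(0, 3, size=n).astype(float)
expression = 5.0 - 0.4 * genotype + rng.normal(0.0, 0.5, size=n)

# Additive-model cis-eQTL test: slope and t-statistic of expression ~ dosage
x = genotype - genotype.mean()
y = expression - expression.mean()
slope = (x @ y) / (x @ x)                              # per-allele effect estimate
residuals = y - slope * x
se = np.sqrt((residuals @ residuals) / ((n - 2) * (x @ x)))
t_stat = slope / se                                    # large |t| => eQTL signal
```

    Haplotype-based tests, as in the Cardiogenics study above, generalize this by regressing on combinations of phased alleles rather than single-SNP dosages.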

  13. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  14. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  15. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  16. Quantitative Information Flow as Safety and Liveness Hyperproperties

    Directory of Open Access Journals (Sweden)

    Hirotoshi Yasuoka

    2012-07-01

    Full Text Available We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self composition.
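
    For deterministic programs, one standard quantitative information flow measure, min-entropy leakage under a uniform prior, reduces to the base-2 logarithm of the number of distinguishable outputs. A small sketch of that reduction (the example programs are illustrative and not drawn from the paper):

```python
import math

def min_entropy_leakage(program, secrets):
    """Min-entropy leakage (bits) of a deterministic program under a uniform
    prior on the secret: log2 of the number of distinct observable outputs."""
    return math.log2(len({program(s) for s in secrets}))

secrets = range(256)

# A password check leaks exactly one bit (match / no match)
leak_check = min_entropy_leakage(lambda s: s == 42, secrets)

# Revealing the low four bits of the secret leaks four bits
leak_nibble = min_entropy_leakage(lambda s: s & 0xF, secrets)
```

    Checking whether such a leakage bound holds for all runs of a system is what makes these properties hyperproperties rather than ordinary trace properties.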

  17. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    OpenAIRE

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-01-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connecti...

  18. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE

    OpenAIRE

    Ojagbemi, A.

    2017-01-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan...

  19. Enhancing quantitative approaches for assessing community resilience

    Science.gov (United States)

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  20. Quantitative Assessment of the IT Agile Transformation

    Directory of Open Access Journals (Sweden)

    Orłowski Cezary

    2017-03-01

    Full Text Available The aim of this paper is to present the quantitative perspective of the agile transformation processes in IT organisations. The phenomenon of agile transformation becomes a complex challenge for an IT organisation since it has not been analysed in detail so far. There is no research on the readiness of IT organisations to realise agile transformation processes. Such processes also prove to have uncontrolled character. Therefore, to minimise the risk of failure referring to the realisation of transformation processes, it is necessary to monitor them. It is also necessary to identify and analyse such processes to ensure their continuous character.

  1. SPARQL-enabled identifier conversion with Identifiers.org

    Science.gov (United States)

    Wimalaratne, Sarala M.; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-01-01

    Motivation: On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use its own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. Results: We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. Availability and implementation: The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. Contact: sarala@ebi.ac.uk PMID:25638809

  2. SPARQL-enabled identifier conversion with Identifiers.org.

    Science.gov (United States)

    Wimalaratne, Sarala M; Bolleman, Jerven; Juty, Nick; Katayama, Toshiaki; Dumontier, Michel; Redaschi, Nicole; Le Novère, Nicolas; Hermjakob, Henning; Laibe, Camille

    2015-06-01

    On the semantic web, in life sciences in particular, data is often distributed via multiple resources. Each of these sources is likely to use its own International Resource Identifier for conceptually the same resource or database record. The lack of correspondence between identifiers introduces a barrier when executing federated SPARQL queries across life science data. We introduce a novel SPARQL-based service to enable on-the-fly integration of life science data. This service uses the identifier patterns defined in the Identifiers.org Registry to generate a plurality of identifier variants, which can then be used to match source identifiers with target identifiers. We demonstrate the utility of this identifier integration approach by answering queries across major producers of life science Linked Data. The SPARQL-based identifier conversion service is available without restriction at http://identifiers.org/services/sparql. © The Author 2015. Published by Oxford University Press.
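
    The conversion described above rests on expanding one identifier into its known URI variants and intersecting variant sets. A toy sketch of that matching idea (the variant forms below are illustrative; the real service derives them from Identifiers.org Registry patterns and performs the expansion inside SPARQL):

```python
def identifier_variants(prefix, local_id):
    """Expand one database record identifier into common URI variants,
    in the spirit of pattern-based conversion (forms shown are illustrative)."""
    return {
        f"{prefix}:{local_id}",                        # compact (CURIE-like) form
        f"http://identifiers.org/{prefix}/{local_id}", # Identifiers.org URI
        f"urn:miriam:{prefix}:{local_id}",             # legacy MIRIAM URN
    }

# Match a source identifier against a target dataset by intersecting variant sets
source_variants = identifier_variants("uniprot", "P12345")
target_identifiers = {
    "http://identifiers.org/uniprot/P12345",
    "http://example.org/unrelated/record",
}
matches = source_variants & target_identifiers
```

    In a federated SPARQL query, the variant set would typically be bound with a VALUES clause so each remote endpoint can match whichever form it stores.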

  3. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  4. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  5. Quantitative radiation monitors for containment and surveillance

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1983-01-01

    Quantitative radiation monitors make it possible to differentiate between shielded and unshielded nuclear materials. The hardness of the gamma-ray spectrum is the attribute that characterizes bare or shielded material. Separate high- and low-energy gamma-ray regions are obtained from a single-channel analyzer through its window and discriminator outputs. The monitor counts both outputs and computes a ratio of the high- and low-energy region counts whenever an alarm occurs. The ratio clearly differentiates between shielded and unshielded nuclear material so that the net alarm count may be identified with a small quantity of unshielded material or a large quantity of shielded material. Knowledge of the diverted quantity helps determine whether an inventory should be called to identify the loss
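
    The shielded/unshielded decision described above can be sketched as a simple ratio test on the two single-channel-analyzer outputs. The threshold and count values below are illustrative, not figures from the report:

```python
def classify_alarm(high_counts, low_counts, hardness_threshold=0.8):
    """Classify a monitor alarm from the ratio of high- to low-energy
    window counts. Shielding preferentially absorbs low-energy gamma rays,
    hardening the spectrum and raising the ratio. Threshold is illustrative."""
    ratio = high_counts / low_counts
    label = "shielded" if ratio > hardness_threshold else "unshielded"
    return label, ratio

# Bare material: soft spectrum dominated by low-energy emission
label_bare, r_bare = classify_alarm(high_counts=120, low_counts=900)
# Shielded material: low-energy counts strongly suppressed
label_shielded, r_shielded = classify_alarm(high_counts=90, low_counts=60)
```

    Combined with the net alarm count, the ratio lets the operator infer whether an alarm corresponds to a small unshielded quantity or a large shielded one.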

  6. A Large-Scale Quantitative Proteomic Approach to Identifying Sulfur Mustard-Induced Protein Phosphorylation Cascades

    Science.gov (United States)

    2010-01-01

    snapshot of SM-induced toxicity. Over the past few years, innovations in systems biology and biotechnology have led to important advances in our under...perturbations. SILAC has been used to study tumor metastasis (3, 4), focal adhesion-associated proteins, growth factor signaling, and insulin regulation (5...stained with colloidal Coomassie blue. After it was destained, the gel lane was excised into six regions, and each region was cut into 1 mm cubes

  7. Quantitative proteomics identify molecular targets that are crucial in larval settlement and metamorphosis of bugula neritina

    KAUST Repository

    Zhang, Huoming; Wong, Yuehim; Wang, Hao; Chen, Zhangfan; Arellano, Shawn M.; Ravasi, Timothy; Qian, Peiyuan

    2011-01-01

    The marine invertebrate Bugula neritina has a biphasic life cycle that consists of a swimming larval stage and a sessile juvenile and adult stage. The attachment of larvae to the substratum and their subsequent metamorphosis have crucial ecological

  8. Quantitative Genetics Identifies Cryptic Genetic Variation Involved in the Paternal Regulation of Seed Development

    NARCIS (Netherlands)

    Pires, Nuno D.; Bemer, Marian; Müller, Lena M.; Baroux, Célia; Spillane, Charles; Grossniklaus, Ueli

    2016-01-01

    Embryonic development requires a correct balancing of maternal and paternal genetic information. This balance is mediated by genomic imprinting, an epigenetic mechanism that leads to parent-of-origin-dependent gene expression. The parental conflict (or kinship) theory proposes that imprinting can

  9. Identifying and Retaining Quality Naval Officers: A Quantitative Analysis of Job Matching and Lateral Transfers

    Science.gov (United States)

    2017-03-01

    ...career match of naval officers, and therefore, the quality of Navy personnel. The benefit of this study is to contribute to the Navy’s efforts to...aligning officers with a more suitable community at the onset of their careers, the Navy stands to benefit from the gains associated with retaining

  10. Quantitative proteomics by amino acid labeling identifies novel NHR-49 regulated proteins in C. elegans

    DEFF Research Database (Denmark)

    Fredens, Julius; Færgeman, Nils J.

    2012-01-01

    in the nematode Caenorhabditis elegans. We have recently shown that C. elegans can be completely labeled with heavy-labeled lysine by feeding worms on prelabeled lysine auxotroph Escherichia coli for just one generation. We applied this methodology to examine the organismal response to functional loss or RNAi...... gene knockdown by RNAi provides a powerful tool with broad implications for C. elegans biology....

  11. Quantitative Genetics Identifies Cryptic Genetic Variation Involved in the Paternal Regulation of Seed Development.

    Directory of Open Access Journals (Sweden)

    Nuno D Pires

    2016-01-01

    Full Text Available Embryonic development requires a correct balancing of maternal and paternal genetic information. This balance is mediated by genomic imprinting, an epigenetic mechanism that leads to parent-of-origin-dependent gene expression. The parental conflict (or kinship) theory proposes that imprinting can evolve due to a conflict between maternal and paternal alleles over resource allocation during seed development. One assumption of this theory is that paternal alleles can regulate seed growth; however, paternal effects on seed size are often very low or non-existent. We demonstrate that there is a pool of cryptic genetic variation in the paternal control of Arabidopsis thaliana seed development. Such cryptic variation can be exposed in seeds that maternally inherit a medea mutation, suggesting that MEA acts as a maternal buffer of paternal effects. Genetic mapping using recombinant inbred lines, and a novel method for the mapping of parent-of-origin effects using whole-genome sequencing of segregant bulks, indicate that there are at least six loci with small, paternal effects on seed development. Together, our analyses reveal the existence of a pool of hidden genetic variation on the paternal control of seed development that is likely shaped by parental conflict.
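
    The bulk-based mapping mentioned above amounts to comparing allele frequencies between pooled segregant groups sequenced as bulks. A toy sketch of that comparison (read counts are invented; real analyses smooth the signal along the genome and assess significance):

```python
def allele_freq_delta(alt_a, depth_a, alt_b, depth_b):
    """Difference in alternate-allele frequency between segregant bulks A and B."""
    return alt_a / depth_a - alt_b / depth_b

# A SNP unlinked to the trait segregates ~50:50 in both bulks:
delta_neutral = allele_freq_delta(alt_a=52, depth_a=100, alt_b=48, depth_b=100)
# A SNP linked to a hypothetical paternal-effect locus is skewed in bulk A:
delta_linked = allele_freq_delta(alt_a=85, depth_a=100, alt_b=50, depth_b=100)
```

    Peaks in the frequency difference along the genome point to candidate loci; mapping parent-of-origin effects additionally requires tracking which parent contributed each allele.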

  12. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [3H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against the 3H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [14C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time are used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparisons of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)

  13. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives---solar in particular---that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative---especially with nonscience students or the general public---is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, ``back-of-the-envelope'' approach to quantitative understanding of energy issues.
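
    The "10,000 times" figure quoted above is itself a good back-of-envelope exercise. A quick check under standard textbook values (the ~18 TW world consumption rate is an approximation):

```python
import math

solar_constant = 1361.0                      # W/m^2 at top of atmosphere
earth_radius = 6.371e6                       # m
cross_section = math.pi * earth_radius ** 2  # disc area intercepting sunlight, m^2
solar_power = solar_constant * cross_section # ~1.7e17 W reaching Earth
human_power = 1.8e13                         # W, ~18 TW world primary-energy rate

ratio = solar_power / human_power            # on the order of 1e4
```

    The ratio comes out near 10^4, consistent with the estimate cited in the talk.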

  14. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  15. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  16. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  17. La quantité en islandais moderne

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    Full Text Available The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of modern Icelandic. From the phonological point of view it seems that one cannot hope to contribute anything new, the theoretical possibilities having been practically exhausted, as we recalled in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is without doubt the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is, however, still premature to speak of true quantitative zones, since neither their limits nor their geographical extent are known.

  18. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  19. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  20. Quantitative proteomics by amino acid labeling in C. elegans

    DEFF Research Database (Denmark)

    Fredens, Julius; Engholm-Keller, Kasper; Giessing, Anders

    2011-01-01

    We demonstrate labeling of Caenorhabditis elegans with heavy isotope-labeled lysine by feeding them with heavy isotope-labeled Escherichia coli. Using heavy isotope-labeled worms and quantitative proteomics methods, we identified several proteins that are regulated in response to loss or RNAi-mediated knockdown of the nuclear hormone receptor 49 in C. elegans. The combined use of quantitative proteomics and selective gene knockdown is a powerful tool for C. elegans biology.

  1. Ebola Virus Infection Modelling and Identifiability Problems

    Directory of Open Access Journals (Sweden)

    Van-Kinh Nguyen

    2015-04-01

    Full Text Available The recent outbreaks of Ebola virus (EBOV infections have underlined the impact of the virus as a major threat for human health. Due to the high biosafety classification of EBOV (level 4, basic research is very limited. Therefore, the development of new avenues of thinking to advance quantitative comprehension of the virus and its interaction with the host cells is urgently needed to tackle this lethal disease. Mathematical modelling of the EBOV dynamics can be instrumental to interpret Ebola infection kinetics on quantitative grounds. To the best of our knowledge, a mathematical modelling approach to unravel the interaction between EBOV and the host cells is still missing. In this paper, a mathematical model based on differential equations is used to represent the basic interactions between EBOV and wild-type Vero cells in vitro. Parameter sets that represent infectivity of pathogens are estimated for EBOV infection and compared with influenza virus infection kinetics. The average infecting time of wild-type Vero cells in EBOV is slower than in influenza infection. Simulation results suggest that the slow infecting time of EBOV could be compensated by its efficient replication. This study reveals several identifiability problems and what kind of experiments are necessary to advance the quantification of EBOV infection. A first mathematical approach of EBOV dynamics and the estimation of standard parameters in viral infection kinetics is the key contribution of this work, paving the way for future modelling work on EBOV infection.
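
Differential-equation models of the kind described are usually target-cell-limited models: coupled equations for target cells, infected cells, and free virus. A minimal sketch of such a model follows; the structure is the standard one from viral-kinetics modelling, and the parameter values are illustrative placeholders, not the paper's EBOV estimates.

```python
# Target-cell-limited infection model: target cells T, infected cells I,
# free virus V, integrated with a simple Euler scheme. Parameter values
# are illustrative placeholders, not the paper's estimates.

def simulate(beta=3e-7, delta=0.5, p=100.0, c=2.0,
             T0=1e6, V0=10.0, dt=0.01, days=20.0):
    T, I, V = T0, 0.0, V0
    for _ in range(int(days / dt)):
        dT = -beta * T * V              # target cells lost to infection
        dI = beta * T * V - delta * I   # infected cells appear, then die
        dV = p * I - c * V              # virions produced and cleared
        T, I, V = T + dT * dt, I + dI * dt, V + dV * dt
    return T, I, V

T_end, I_end, V_end = simulate()
print(f"target cells remaining after 20 days: {T_end:.3g}")
```

With these placeholder parameters the basic reproductive number beta*p*T0/(delta*c) is well above 1, so the simulated infection takes off and depletes the target-cell pool, the qualitative behaviour such models are fitted to.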

  2. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove… …the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars…

  3. Identification of ginseng root using quantitative X-ray microtomography

    Directory of Open Access Journals (Sweden)

    Linlin Ye

    2017-07-01

    Conclusion: This study is the first to provide evidence of the distribution characteristics of COCCs to identify four types of ginseng, with regard to species authentication and age identification, by X-ray phase-contrast microtomography quantitative imaging. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.

  4. Quantitative resistance to Botrytis cinerea from Solanum neorickii

    NARCIS (Netherlands)

    Finkers, H.J.; Bai, Y.; Berg, van den P.M.M.M.; Berloo, van R.; Meijer-Dekens, R.G.; Have, ten A.; Kan, van J.A.L.; Lindhout, P.; Heusden, van A.W.

    2008-01-01

    Tomato (Solanum lycopersicum) is susceptible to gray mold (Botrytis cinerea). Quantitative resistance to B. cinerea was previously identified in a wild relative, S. neorickii G1.1601. The 122 F3 families derived from a cross between the susceptible S. lycopersicum cv. Moneymaker and the partially

  5. Simulation and the Development of Clinical Judgment: A Quantitative Study

    Science.gov (United States)

    Holland, Susan

    2015-01-01

    The purpose of this quantitative pretest posttest quasi-experimental research study was to explore the effect of the NESD on clinical judgment in associate degree nursing students and compare the differences between groups when the Nursing Education Simulation Design (NESD) guided simulation in order to identify educational strategies promoting…

  6. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels; 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, along with the variability in quantitative analysis associated with each sampling strategy, and allow a proper number of replicates to be defined for future quantitative analyses. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  7. Quantitative health impact assessment: current practice and future directions

    NARCIS (Netherlands)

    J.L. Veerman (Lennert); J.J.M. Barendregt (Jan); J.P. Mackenbach (Johan)

    2005-01-01

    textabstractSTUDY OBJECTIVE: To assess what methods are used in quantitative health impact assessment (HIA), and to identify areas for future research and development. DESIGN: HIA reports were assessed for (1) methods used to quantify effects of policy on determinants of health

  8. The Limitations of Quantitative Social Science for Informing Public Policy

    Science.gov (United States)

    Jerrim, John; de Vries, Robert

    2017-01-01

    Quantitative social science (QSS) has the potential to make an important contribution to public policy. However it also has a number of limitations. The aim of this paper is to explain these limitations to a non-specialist audience and to identify a number of ways in which QSS research could be improved to better inform public policy.

  9. The use of quantitative risk assessment in HACCP

    NARCIS (Netherlands)

    Hoornstra, E.; Northolt, M.D.; Notermans, S.; Barendsz, A.W.

    2001-01-01

    During the hazard analysis as part of the development of a HACCP-system, first the hazards (contaminants) have to be identified and then the risks have to be assessed. Often, this assessment is restricted to a qualitative analysis. By using elements of quantitative risk assessment (QRA) the hazard

  10. Quantitative Reasoning and the Sine Function: The Case of Zac

    Science.gov (United States)

    Moore, Kevin C.

    2014-01-01

    A growing body of literature has identified quantitative and covariational reasoning as critical for secondary and undergraduate student learning, particularly for topics that require students to make sense of relationships between quantities. The present study extends this body of literature by characterizing an undergraduate precalculus…

  11. Identifying public expectations of genetic biobanks.

    Science.gov (United States)

    Critchley, Christine; Nicol, Dianne; McWhirter, Rebekah

    2017-08-01

    Understanding public priorities for biobanks is vital for maximising utility and efficiency of genetic research and maintaining respect for donors. This research directly assessed the relative importance the public place on different expectations of biobanks. Quantitative and qualitative results from a national sample of 800 Australians revealed that the majority attributed more importance to protecting privacy and ethical conduct than maximising new healthcare benefits, which was in turn viewed as more important than obtaining specific consent, benefit sharing, collaborating and sharing data. A latent class analysis identified two distinct classes displaying different patterns of expectations. One placed higher priority on behaviours that respect the donor (n = 623), the other on accelerating science (n = 278). Additional expectations derived from qualitative data included the need for biobanks to be transparent and to prioritise their research focus, educate the public and address commercialisation.

  12. Quantitative Imaging in Cancer Evolution and Ecology

    Science.gov (United States)

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral

  13. Quantitative safety goals for the regulatory process

    International Nuclear Information System (INIS)

    Joksimovic, V.; O'Donnell, L.F.

    1981-01-01

    The paper offers a brief summary of the current regulatory background in the USA, emphasizing nuclear, related to the establishment of quantitative safety goals as a way to respond to the key issue of 'how safe is safe enough'. General Atomic has taken a leading role in advocating the use of probabilistic risk assessment techniques in the regulatory process. This has led to understanding of the importance of quantitative safety goals. The approach developed by GA is discussed in the paper. It is centred around definition of quantitative safety regions. The regions were termed: design basis, safety margin or design capability, and safety research. The design basis region is bounded by the frequency of 10⁻⁴/reactor-year and consequences of no identifiable public injury. 10⁻⁴/reactor-year is associated with the total projected lifetime of a commercial US nuclear power programme. Events which have a 50% chance of happening are included in the design basis region. In the safety margin region, which extends below the design basis region, protection is provided against some events whose probability of not happening during the expected course of the US nuclear power programme is within the range of 50 to 90%. Setting the lower mean frequency of this region to 10⁻⁵/reactor-year is equivalent to offering 90% assurance that an accident of given severity will not happen. Rare events with a mean frequency below 10⁻⁵ can be predicted to occur. However, accidents predicted to have a probability of less than 10⁻⁶ are 99% certain not to happen at all, and are thus not anticipated to affect public health and safety. The area between 10⁻⁵ and 10⁻⁶ defines the frequency portion of the safety research region. Safety goals associated with individual risk to a maximum-exposed member of the public, general societal risk and property risk are proposed in the paper
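
The assurance figures quoted above follow from treating accidents as a Poisson process: the probability that an event of mean frequency f never occurs over a programme lifetime of L reactor-years is exp(-f·L). The sketch below checks the arithmetic; the ~10,000 reactor-year lifetime is implied by the abstract's figures, while the Poisson assumption is ours.

```python
# Poisson check of the assurance levels in the abstract:
# P(no event over the programme) = exp(-f * L) for mean event frequency
# f (per reactor-year) and programme lifetime L (reactor-years).
import math

LIFETIME = 1.0e4  # reactor-years, implied programme lifetime (assumption)

def prob_no_event(freq):
    return math.exp(-freq * LIFETIME)

for f in (1e-4, 1e-5, 1e-6):
    print(f"f = {f:.0e}/reactor-year -> P(no event) = {prob_no_event(f):.3f}")
```

Under this model a 10⁻⁵/reactor-year event gives about 0.90 probability of never occurring ("90% assurance") and a 10⁻⁶/reactor-year event about 0.99, matching the abstract's regions.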

  14. Identification of ginseng root using quantitative X-ray microtomography.

    Science.gov (United States)

    Ye, Linlin; Xue, Yanling; Wang, Yudan; Qi, Juncheng; Xiao, Tiqiao

    2017-07-01

    The use of X-ray phase-contrast microtomography for the investigation of Chinese medicinal materials is advantageous for its nondestructive, in situ, and three-dimensional quantitative imaging properties. The X-ray phase-contrast microtomography quantitative imaging method was used to investigate the microstructure of ginseng, and the phase-retrieval method is also employed to process the experimental data. Four different ginseng samples were collected and investigated; these were classified according to their species, production area, and sample growth pattern. The quantitative internal characteristic microstructures of ginseng were extracted successfully. The size and position distributions of the calcium oxalate cluster crystals (COCCs), important secondary metabolites that accumulate in ginseng, are revealed by the three-dimensional quantitative imaging method. The volume and amount of the COCCs in different species of the ginseng are obtained by a quantitative analysis of the three-dimensional microstructures, which shows obvious difference among the four species of ginseng. This study is the first to provide evidence of the distribution characteristics of COCCs to identify four types of ginseng, with regard to species authentication and age identification, by X-ray phase-contrast microtomography quantitative imaging. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.

  15. Fundamental quantitative security in quantum key generation

    International Nuclear Information System (INIS)

    Yuen, Horace P.

    2010-01-01

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.

  16. Quantitative image fusion in infrared radiometry

    Science.gov (United States)

    Romm, Iliya; Cukurel, Beni

    2018-05-01

    Towards high-accuracy infrared radiance estimates, measurement practices and processing techniques aimed to achieve quantitative image fusion using a set of multi-exposure images of a static scene are reviewed. The conventional non-uniformity correction technique is extended, as the original is incompatible with quantitative fusion. Recognizing the inherent limitations of even the extended non-uniformity correction, an alternative measurement methodology, which relies on estimates of the detector bias using self-calibration, is developed. Combining data from multi-exposure images, two novel image fusion techniques that ultimately provide high tonal fidelity of a photoquantity are considered: ‘subtract-then-fuse’, which conducts image subtraction in the camera output domain and partially negates the bias frame contribution common to both the dark and scene frames; and ‘fuse-then-subtract’, which reconstructs the bias frame explicitly and conducts image fusion independently for the dark and the scene frames, followed by subtraction in the photoquantity domain. The performances of the different techniques are evaluated for various synthetic and experimental data, identifying the factors contributing to potential degradation of the image quality. The findings reflect the superiority of the ‘fuse-then-subtract’ approach, conducting image fusion via per-pixel nonlinear weighted least squares optimization.
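
The per-pixel fusion idea can be illustrated with a simplified linear version: after dark-frame subtraction, each reading is modelled as the photoquantity times the exposure time, and clipped samples are excluded. The paper's actual formulation is a nonlinear weighted least squares with explicit bias-frame handling; the model, weights, and numbers below are simplifying assumptions for illustration.

```python
# Simplified linear weighted least-squares fusion for one pixel.
# Model (an illustrative simplification, not the paper's exact nonlinear
# formulation): dark-subtracted reading y_i ~ q * t_i at exposure time
# t_i; saturated readings get zero weight. Minimising
# sum_i w_i * (y_i - q*t_i)^2 gives q = sum(w*y*t) / sum(w*t^2).

SATURATION = 4095.0  # illustrative 12-bit full scale

def fuse_pixel(readings, exposure_times):
    num = den = 0.0
    for y, t in zip(readings, exposure_times):
        w = 0.0 if y >= SATURATION else 1.0  # reject clipped samples
        num += w * y * t
        den += w * t * t
    return num / den if den else float("nan")

# Synthetic pixel with photoquantity 800 units/s; the longest exposure clips.
times = [0.5, 2.0, 8.0]
readings = [800.0 * t for t in times]
readings[2] = SATURATION                     # clipped at full scale
print(fuse_pixel(readings, times))           # 800.0
```

The clipped long exposure contributes nothing, yet the short exposures still recover the photoquantity; this is the tonal-fidelity benefit that motivates multi-exposure fusion.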

  17. Quantitative stratification of diffuse parenchymal lung diseases.

    Directory of Open Access Journals (Sweden)

    Sushravya Raghunath

    Full Text Available Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients.

  18. Quantitative Stratification of Diffuse Parenchymal Lung Diseases

    Science.gov (United States)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Maldonado, Fabien; Peikert, Tobias; Moua, Teng; Ryu, Jay H.; Bartholmai, Brian J.; Robb, Richard A.

    2014-01-01

    Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients. PMID:24676019
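
The abstract does not spell out the unsupervised technique used for the stratification. As a stand-in illustration of grouping patients by radiologic feature vectors without labels, here is a minimal k-means sketch on synthetic data; the features, values, and choice of k-means are all assumptions, not the paper's method.

```python
# Minimal k-means stratification of synthetic "patients", each described
# by a made-up radiologic feature vector. The abstract's actual
# unsupervised method is not specified, so k-means stands in here.
import math

def kmeans(points, k, iters=20):
    # deterministic init: spread the initial centers across the list
    centers = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[j].append(p)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return [min(range(k), key=lambda c: math.dist(p, centers[c]))
            for p in points]

# Two synthetic "habitats", features e.g. (mean attenuation, texture score).
patients = [(0.10, 0.20), (0.15, 0.25), (0.12, 0.18),
            (0.90, 0.80), (0.85, 0.75), (0.95, 0.85)]
print(kmeans(patients, 2))  # [0, 0, 0, 1, 1, 1]
```

The self-organized groups would then be tested for correlation with clinical surrogate endpoints, as the abstract describes.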

  19. The Quantitative Nature of Autistic Social Impairment

    Science.gov (United States)

    Constantino, John N.

    2011-01-01

    Autism, like intellectual disability, represents the severe end of a continuous distribution of developmental impairments that occur in nature, that are highly inherited, and that are orthogonally related to other parameters of development. A paradigm shift in understanding the core social abnormality of autism as a quantitative trait rather than as a categorically-defined condition has key implications for diagnostic classification, the measurement of change over time, the search for underlying genetic and neurobiologic mechanisms, and public health efforts to identify and support affected children. Here a recent body of research in genetics and epidemiology is presented to examine a dimensional reconceptualization of autistic social impairment—as manifested in clinical autistic syndromes, the broader autism phenotype, and normal variation in the general population. It illustrates how traditional categorical approaches to diagnosis may lead to misclassification of subjects (especially girls and mildly affected boys in multiple-incidence autism families), which can be particularly damaging to biological studies, and proposes continued efforts to derive a standardized quantitative system by which to characterize this family of conditions. PMID:21289537

  20. Immune chromatography: a quantitative radioimmunological assay

    International Nuclear Information System (INIS)

    Davis, J.W.; Demetriades, M.; Bowen, J.M.

    1984-01-01

    Immune chromatography, a radioimmunological binding assay, employs paper chromatography to separate immune complexes from free antigen and antibodies. During chromatography free antigen and antibodies become distributed throughout the paper, while immune complexes remain near the bottoms of the strips. The chromatographic differences can be made quantitative by using either iodinated antigens or antibodies. Under these conditions, nanogram quantities of antigen can be detected, or antibodies in sera diluted several thousand-fold. The immune chromatography assay can also be performed as an indirect assay, since the paper strips are cut from nitrocellulose paper. In this case the immune components are absorbed by the paper during chromatography. Antigen is then detected with an iodinated second antibody. The indirect immune chromatography assay is particularly useful for identifying different sera that react with the same antigen. Reaction with the first serum before chromatography reduces the amount of antigen available to the second serum following chromatography. In addition to characterizing the immune chromatography procedure, we discuss the possible applications of chromatography assays for the quantitation of other types of molecular binding interactions. (Auth.)

  1. Technological innovation in neurosurgery: a quantitative study.

    Science.gov (United States)

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-07-01

    Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for patents published between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.

  2. Quantitative (real-time) PCR

    International Nuclear Information System (INIS)

    Denman, S.E.; McSweeney, C.S.

    2005-01-01

    Many nucleic acid-based probe and PCR assays have been developed for the detection and tracking of specific microbes within the rumen ecosystem. Conventional PCR assays detect PCR products at the end stage of each PCR reaction, where exponential amplification is no longer being achieved. This approach can result in different end product (amplicon) quantities being generated. In contrast, using quantitative, or real-time PCR, quantification of the amplicon is performed not at the end of the reaction, but rather during exponential amplification, where theoretically each cycle will result in a doubling of product being created. For real-time PCR, the cycle at which fluorescence is deemed to be detectable above the background during the exponential phase is termed the cycle threshold (Ct). The Ct values obtained are then used for quantitation, which will be discussed later
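
Ct values are commonly converted to relative abundances with the standard 2^(-ddCt) method, which rests on exactly the doubling-per-cycle behaviour described above. This downstream calculation is standard practice in the field rather than something spelled out in this abstract:

```python
# Relative quantification from Ct values via the standard 2^(-ddCt)
# method, assuming ~100% amplification efficiency (one doubling per
# cycle). Ct values below are illustrative.

def fold_change(ct_target_sample, ct_ref_sample,
                ct_target_control, ct_ref_control):
    d_ct_sample = ct_target_sample - ct_ref_sample    # normalise to reference
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                # compare to control
    return 2.0 ** (-dd_ct)

# Target crosses threshold 2 cycles earlier in the sample (after
# normalising to a reference gene) -> 4-fold more starting template.
print(fold_change(24.0, 20.0, 26.0, 20.0))  # 4.0
```

Because each cycle ideally doubles the amplicon, a difference of n threshold cycles corresponds to a 2^n difference in starting template.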

  3. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    Full Text Available This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.
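
Of the techniques listed, FRAP is the most directly quantitative: bleach a region, record the fluorescence recovery, and reduce the curve to a time constant and mobile fraction. A sketch under the common single-exponential recovery assumption; the model and numbers are illustrative, not taken from the paper.

```python
# FRAP analysis sketch: recover the time constant tau from an idealised
# single-exponential recovery F(t) = A * (1 - exp(-t/tau)), where A is
# the recovery plateau (mobile fraction). Illustrative assumption, not
# the paper's protocol.
import math

def estimate_tau(times, intensities, plateau):
    # linearise: ln(1 - F/A) = -t/tau, then least squares through origin
    num = den = 0.0
    for t, f in zip(times, intensities):
        num += t * math.log(1.0 - f / plateau)
        den += t * t
    return -den / num

times = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
A, tau_true = 0.8, 5.0
data = [A * (1.0 - math.exp(-t / tau_true)) for t in times]
tau = estimate_tau(times, data, A)
print(round(tau, 2), "half-time:", round(tau * math.log(2), 2))
```

The half-time tau*ln(2) and the plateau A are the numbers typically reported, and they feed directly into diffusion or binding estimates.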

  4. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  5. Qualitative discussion of quantitative radiography

    International Nuclear Information System (INIS)

    Berger, H.; Motz, J.W.

    1975-01-01

    Since radiography yields an image that can be easily related to the tested object, it is superior to many nondestructive testing techniques in revealing the size, shape, and location of certain types of discontinuities. The discussion is limited to a description of the radiographic process, examination of some of the quantitative aspects of radiography, and an outline of some of the new ideas emerging in radiography. The advantages of monoenergetic x-ray radiography and neutron radiography are noted

  6. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a method of nondestructive testing that furnishes quantitative information, permitting the detection and accurate localization of defects, internal dimension measurement, and the measurement and mapping of the density distribution. CT technology is very versatile, presenting no restrictions with respect to the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  7. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    The author deduces the equation relating coupler frequency deviation Δf and coupling coefficient β, on the basis of a coupling-cavity chain equivalent-circuit model, instead of only giving the adjusting direction in the process of matching the coupler. According to this equation, automatic measurement and quantitative display are realized on a measuring system. It contributes to the industrialization of traveling-wave accelerators for large container inspection systems.

  8. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method of quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of teaching efficiency within one group of students and of comparative teaching efficiency in two or more groups. As basic characteristics of teaching efficiency, heterogeneity, stability and total variability indices are used, both for a single group and for comparing different groups. The method is easy to use and permits ranking of the results of teaching review, which...

  9. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The sizes of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  10. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of using positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, the observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. The quantitative measurement of the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The movement model of the mapping tracer for central nervous receptors is discussed. The quantitative analysis using a steady movement model, the measurement of dopamine receptors by the reference method, the measurement of D2 receptors using 11C-Raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  11. Near Identifiability of Dynamical Systems

    Science.gov (United States)

    Hadaegh, F. Y.; Bekey, G. A.

    1987-01-01

    Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.

  12. Brain Injury Lesion Imaging Using Preconditioned Quantitative Susceptibility Mapping without Skull Stripping.

    Science.gov (United States)

    Soman, S; Liu, Z; Kim, G; Nemec, U; Holdsworth, S J; Main, K; Lee, B; Kolakowsky-Hayner, S; Selim, M; Furst, A J; Massaband, P; Yesavage, J; Adamson, M M; Spincemallie, P; Moseley, M; Wang, Y

    2018-04-01

    Identifying cerebral microhemorrhage burden can aid in the diagnosis and management of traumatic brain injury, stroke, hypertension, and cerebral amyloid angiopathy. MR imaging susceptibility-based methods are more sensitive than CT for detecting cerebral microhemorrhage, but methods other than quantitative susceptibility mapping provide results that vary with field strength and TE, require additional phase maps to distinguish blood from calcification, and depict cerebral microhemorrhages as bloom artifacts. Quantitative susceptibility mapping provides universal quantification of tissue magnetic property without these constraints but traditionally requires a mask generated by skull-stripping, which can pose challenges at tissue interfaces. We evaluated the preconditioned quantitative susceptibility mapping MR imaging method, which does not require skull-stripping, for improved depiction of brain parenchyma and pathology. Fifty-six subjects underwent brain MR imaging with a 3D multiecho gradient recalled echo acquisition. Mask-based quantitative susceptibility mapping images were created using a commonly used mask-based quantitative susceptibility mapping method, and preconditioned quantitative susceptibility images were made using precondition-based total field inversion. All images were reviewed by a neuroradiologist and a radiology resident. Ten subjects (18%), all with traumatic brain injury, demonstrated blood products on 3D gradient recalled echo imaging. All lesions were visible on preconditioned quantitative susceptibility mapping, while 6 were not visible on mask-based quantitative susceptibility mapping. Thirty-one subjects (55%) demonstrated brain parenchyma and/or lesions that were visible on preconditioned quantitative susceptibility mapping but not on mask-based quantitative susceptibility mapping. Six subjects (11%) demonstrated pons artifacts on preconditioned quantitative susceptibility mapping and mask-based quantitative susceptibility mapping.

  13. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: Describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations with offset values greater than 1 mm were identified. In addition, when the developed method was compared with the one previously studied, it was observed that the data obtained are very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)

  14. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals.

  15. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation network is proposed, which could reflect the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, the efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the method proposed in this paper can portray the real operation situation of the transportation network as well as the effects of main factors on network efficiency. We also find that the network efficiency and the importance values of the network components both are functions of demands and network structure in the transportation network.
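
The record does not give the paper's redefined efficiency formula, but a well-known quantitative network-efficiency measure in the same spirit (explicitly not the paper's own definition) is the global efficiency of Latora and Marchiori, the average inverse shortest-path distance over node pairs:

```python
# Global network efficiency (Latora-Marchiori): average inverse
# shortest-path distance over all ordered node pairs. Illustrative
# baseline only -- not the evaluation method proposed in the paper.
from collections import deque

def global_efficiency(adj):
    """adj: dict mapping node -> iterable of neighbors (unweighted)."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        # BFS hop counts from s
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(1.0 / dist[t] for t in dist if t != s)
    return total / (n * (n - 1))

# A 4-node ring: every pair is 1 or 2 hops apart.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(global_efficiency(ring))  # (8*1 + 4*0.5) / 12 ~ 0.833
```

Removing a node or link and recomputing the measure gives a simple importance score for that component, analogous to the efficiency-oriented importance measure the abstract describes.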

  16. Quantitative phosphoproteomics applied to the yeast pheromone signaling pathway

    DEFF Research Database (Denmark)

    Gruhler, Albrecht; Olsen, Jesper Velgaard; Mohammed, Shabaz

    2005-01-01

    ...of a detailed molecular view of complex biological processes. We present a quantitative modification-specific proteomic approach that combines stable isotope labeling by amino acids in cell culture (SILAC) for quantitation with IMAC for phosphopeptide enrichment and three stages of mass spectrometry (MS). Phosphopeptide fractions were analyzed by LC-MS using a linear ion trap-Fourier transform ion cyclotron resonance mass spectrometer. MS/MS and neutral loss-directed MS/MS/MS analysis allowed detection and sequencing of phosphopeptides with exceptional accuracy and specificity. Of more than 700 identified...

  17. Quantitative Evaluation of Fire and EMS Mobilization Times

    CERN Document Server

    Upson, Robert

    2010-01-01

    Quantitative Evaluation of Fire and EMS Mobilization Times presents comprehensive empirical data on fire emergency and EMS call processing and turnout times, and aims to improve the operational benchmarks of NFPA peer consensus standards through a close examination of real-world data. The book also identifies and analyzes the elements that can influence EMS mobilization response times. Quantitative Evaluation of Fire and EMS Mobilization Times is intended for practitioners as a tool for analyzing fire emergency response times and developing methods for improving them. Researchers working in a

  18. Identifying mechanistic similarities in drug responses

    KAUST Repository

    Zhao, C.

    2012-05-15

    Motivation: In early drug development, it would be beneficial to be able to identify those dynamic patterns of gene response that indicate that drugs targeting a particular gene will be likely or not to elicit the desired response. One approach would be to quantitate the degree of similarity between the responses that cells show when exposed to drugs, so that consistencies in the regulation of cellular response processes that produce success or failure can be more readily identified. Results: We track drug response using fluorescent proteins as transcription activity reporters. Our basic assumption is that drugs inducing very similar alterations in transcriptional regulation will produce similar temporal trajectories on many of the reporter proteins and hence be identified as having similarities in their mechanisms of action (MOA). The main body of this work is devoted to characterizing similarity in temporal trajectories/signals. To do so, we must first identify the key points that determine mechanistic similarity between two drug responses. Directly comparing points on the two signals is unrealistic, as it cannot handle delays and speed variations on the time axis. Hence, to capture the similarities between reporter responses, we develop an alignment algorithm that is robust to noise and time delays and is able to find all the contiguous parts of signals centered about a core alignment (reflecting a core mechanism in drug response). Applying the proposed algorithm to a range of real drug experiments shows that the result agrees well with the prior drug MOA knowledge. © The Author 2012. Published by Oxford University Press. All rights reserved.
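
The record does not reproduce the authors' alignment algorithm, but the core idea it builds on, comparing temporal trajectories while tolerating delays and speed variations on the time axis, can be sketched with classic dynamic time warping (DTW); the signals below are hypothetical:

```python
# Minimal dynamic time warping (DTW): compares two temporal signals
# while tolerating time delays and speed variations. An illustrative
# sketch, not the alignment algorithm proposed in the paper above.

def dtw_distance(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    # D[i][j] = cost of the best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend by stretching a, stretching b, or matching one-to-one
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

a = [0, 0, 1, 2, 1, 0]  # reporter trajectory
b = [0, 1, 2, 1, 0, 0]  # the same response, shifted in time
print(dtw_distance(a, b))                     # 0.0 -- DTW absorbs the delay
print(sum(abs(x - y) for x, y in zip(a, b)))  # 4 -- pointwise comparison does not
```

Because the warping path may repeat points on either axis, two trajectories that differ only by a delay align perfectly, which is exactly why direct point-by-point comparison is described as unrealistic.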

  19. The NOAA Dataset Identifier Project

    Science.gov (United States)

    de la Beaujardiere, J.; Mccullough, H.; Casey, K. S.

    2013-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) initiated a project in 2013 to assign persistent identifiers to datasets archived at NOAA and to create informational landing pages about those datasets. The goals of this project are to enable the citation of datasets used in products and results in order to help provide credit to data producers, to support traceability and reproducibility, and to enable tracking of data usage and impact. A secondary goal is to encourage the submission of datasets for long-term preservation, because only archived datasets will be eligible for a NOAA-issued identifier. A team was formed with representatives from the National Geophysical, Oceanographic, and Climatic Data Centers (NGDC, NODC, NCDC) to resolve questions including which identifier scheme to use (answer: Digital Object Identifier - DOI), whether or not to embed semantics in identifiers (no), the level of granularity at which to assign identifiers (as coarsely as reasonable), how to handle ongoing time-series data (do not break into chunks), creation mechanism for the landing page (stylesheet from formal metadata record preferred), and others. Decisions made and implementation experience gained will inform the writing of a Data Citation Procedural Directive to be issued by the Environmental Data Management Committee in 2014. Several identifiers have been issued as of July 2013, with more on the way. NOAA is now reporting the number as a metric to federal Open Government initiatives. This paper will provide further details and status of the project.

  20. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radio tracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as consequences from most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  1. Quantitative Trait Loci in Inbred Lines

    NARCIS (Netherlands)

    Jansen, R.C.

    2001-01-01

    Quantitative traits result from the influence of multiple genes (quantitative trait loci) and environmental factors. Detecting and mapping the individual genes underlying such 'complex' traits is a difficult task. Fortunately, populations obtained from crosses between inbred lines are relatively

  2. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  3. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  4. Quantitative self-assembly prediction yields targeted nanomedicines

    Science.gov (United States)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  5. Differential membrane proteomics using 18O-labeling to identify biomarkers for cholangiocarcinoma

    DEFF Research Database (Denmark)

    Kristiansen, Troels Zakarias; Harsha, H C; Grønborg, Mads

    2008-01-01

    Quantitative proteomic methodologies allow profiling of hundreds to thousands of proteins in a high-throughput fashion. This approach is increasingly applied to cancer biomarker discovery to identify proteins that are differentially regulated in cancers. Fractionation of protein samples based...

  6. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat...

  7. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 by John Wiley & Sons, Inc.
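
As a sketch of the arithmetic behind stable isotope dilution (all peak areas and concentrations below are hypothetical, not values from the protocol), the analyte is quantified from the ratio of its signal to that of a spiked isotope-labeled internal standard, scaled by a calibrator of known concentration:

```python
# Single-point isotope-dilution quantification (illustrative sketch).
# A fixed amount of isotope-labeled creatinine (internal standard, IS)
# is spiked into both calibrator and sample; concentration scales with
# the analyte/IS peak-area ratio. All numbers are hypothetical.

def quantify(area_analyte, area_is, cal_ratio, cal_conc):
    """Concentration from the sample's analyte/IS area ratio,
    scaled by a calibrator of known concentration."""
    sample_ratio = area_analyte / area_is
    return sample_ratio / cal_ratio * cal_conc

# Calibrator: 1.0 mg/dL creatinine gave an analyte/IS ratio of 0.50.
# Sample: peak areas 30000 (analyte) and 40000 (IS) -> ratio 0.75.
print(quantify(30000, 40000, 0.50, 1.0))  # 1.5 mg/dL
```

Because the internal standard is chemically identical to the analyte apart from its mass, losses during preparation and ionization affect both signals equally, which is what makes the ratio-based calculation robust.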

  8. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  9. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality.

  10. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

    In a radiograph the value of each pixel is related to the material thickness crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, the quantitative thickness map of an object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process generates some biases which prevent obtaining the quantitative information: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In the first section, we propose a complete model of the radiographic image formation process taking these biases into account. In the second section, we present an inversion scheme for this model for a single-material object, which enables obtaining the thickness map of the object crossed by the x-rays. (author)

  11. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing the temperature distribution. Analysis and comparison of these indices make it possible to evaluate the temperature distribution pattern of the back trunk against that expected in subjects who are healthy with respect to spinal problems.
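
As a hedged sketch of the kind of first-order statistical indices described (the index names and pixel temperatures below are illustrative, not the study's definitions), two symmetric ROIs can be compared through the means and spreads of their temperature distributions:

```python
# First-order statistical indices for two symmetric thermographic ROIs.
# Index definitions and temperature values are illustrative only.
import statistics

def roi_indices(left_roi, right_roi):
    """Mean-temperature asymmetry and spread difference between two
    symmetric regions of interest (lists of pixel temperatures)."""
    d_mean = abs(statistics.mean(left_roi) - statistics.mean(right_roi))
    d_spread = abs(statistics.stdev(left_roi) - statistics.stdev(right_roi))
    return d_mean, d_spread

left = [33.1, 33.4, 33.2, 33.3]   # deg C, hypothetical pixel samples
right = [32.1, 32.4, 32.2, 32.3]
d_mean, d_spread = roi_indices(left, right)
print(round(d_mean, 2))  # 1.0 -- mean-temperature asymmetry between sides
```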

  12. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for study of proteins and identification of diseases, reveals more comprehensive and accurate information of an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication process to have low-cost and small-size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  13. Identifying suitable sites for Florida panther reintroduction

    Science.gov (United States)

    Thatcher, Cindy A.; van Manen, Frank T.; Clark, Joseph D.

    2006-01-01

    A major objective of the 1995 Florida Panther (Puma concolor coryi) Recovery Plan is the establishment of 2 additional panther populations within the historic range. Our goal was to identify prospective sites for Florida panther reintroduction within the historic range based on quantitative landscape assessments. First, we delineated 86 panther home ranges using telemetry data collected from 1981 to 2001 in south Florida to develop a Mahalanobis distance (D2) habitat model, using 4 anthropogenic variables and 3 landscape variables mapped at a 500-m resolution. From that analysis, we identified 9 potential reintroduction sites of sufficient size to support a panther population. We then developed a similar D2 model at a higher spatial resolution to quantify the area of favorable panther habitat at each site. To address potential for the population to expand, we calculated the amount of favorable habitat adjacent to each prospective reintroduction site within a range of dispersal distances of female panthers. We then added those totals to the contiguous patches to estimate the total amount of effective panther habitat at each site. Finally, we developed an expert-assisted model to rank and incorporate potentially important habitat variables that were not appropriate for our empirical analysis (e.g., area of public lands, livestock density). Anthropogenic factors heavily influenced both the landscape and the expert-assisted models. Of the 9 areas we identified, the Okefenokee National Wildlife Refuge, Ozark National Forest, and Felsenthal National Wildlife Refuge regions had the highest combination of effective habitat area and expert opinion scores. Sensitivity analyses indicated that variability among key model parameters did not affect the high ranking of those sites. Those sites should be considered as starting points for the field evaluation of potential reintroduction sites.
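
The Mahalanobis distance (D2) habitat model measures how far a site's landscape variables fall from the multivariate mean of used habitat, accounting for the variables' variances and correlations. A minimal two-variable sketch with hypothetical values (the 2x2 covariance is inverted by hand, so no external libraries are needed):

```python
# Mahalanobis distance D2 = (x - mu)^T Sigma^-1 (x - mu) for two
# landscape variables. All values are hypothetical, for illustration.

def mahalanobis_d2(x, mu, cov):
    """D2 for a 2-variable case; cov is a 2x2 covariance matrix."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    # inverse of a 2x2 matrix
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    # quadratic form dx^T inv dx
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

mu = (0.6, 12.0)                 # mean forest cover, mean road density (used habitat)
cov = [[0.04, 0.0], [0.0, 9.0]]  # variances, zero covariance for simplicity
print(mahalanobis_d2((0.2, 18.0), mu, cov))  # ~8.0: far from used habitat
```

Lower D2 values indicate landscape conditions closer to those of occupied home ranges, which is how candidate cells are scored as favorable habitat.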

  14. The Case for Infusing Quantitative Literacy into Introductory Geoscience Courses

    Directory of Open Access Journals (Sweden)

    Jennifer M. Wenner

    2009-01-01

    Full Text Available We present the case for introductory geoscience courses as model venues for increasing the quantitative literacy (QL) of large numbers of the college-educated population. The geosciences provide meaningful context for a number of fundamental mathematical concepts that are revisited several times in a single course. Using some best practices from the mathematics education community surrounding problem solving, calculus reform, pre-college mathematics and five geoscience/math workshops, geoscience and mathematics faculty have identified five pedagogical ideas to increase the QL of the students who populate introductory geoscience courses. These five ideas include techniques such as: place mathematical concepts in context, use multiple representations, use technology appropriately, work in groups, and do multiple-day, in-depth problems that place quantitative skills in multiple contexts. We discuss the pedagogical underpinnings of these five ideas and illustrate some ways that the geosciences represent ideal places to use these techniques. However, the inclusion of QL in introductory courses is often met with resistance at all levels. Faculty who wish to include quantitative content must use creative means to break down barriers of public perception of geoscience as qualitative, administrative worry that enrollments will drop, and faculty resistance to change. Novel ways to infuse QL into geoscience classrooms include use of web-based resources, shadow courses, setting clear expectations, and promoting quantitative geoscience to the general public. In order to help faculty increase the QL of geoscience students, a community-built, faculty-centered web resource (Teaching Quantitative Skills in the Geosciences) houses multiple examples that implement the five best practices of QL throughout the geoscience curriculum. We direct faculty to three portions of the web resource: Teaching Quantitative Literacy, QL activities, and the 2006 workshop website.

  15. Envisioning a Quantitative Studies Center: A Liberal Arts Perspective

    Directory of Open Access Journals (Sweden)

    Gizem Karaali

    2010-01-01

    Full Text Available Several academic institutions are searching for ways to help students develop their quantitative reasoning abilities and become more adept at higher-level tasks that involve quantitative skills. In this note we study the particular way Pomona College has framed this issue within its own context and what it plans to do about it. To this end we describe our efforts as members of a campus-wide committee that was assigned the duty of investigating the feasibility of founding a quantitative studies center on our campus. These efforts involved analysis of data collected through a faculty questionnaire, discipline-specific input obtained from each departmental representative, and a survey of what some of our peer institutions are doing to tackle these issues. In our studies, we identified three critical needs where quantitative support would be most useful in our case: tutoring and mentoring for entry-level courses; support for various specialized and analytic software tools for upper-level courses; and a uniform basic training for student tutors and mentors. We surmise that our challenges can be mitigated effectively via the formation of a well-focused and -planned quantitative studies center. We believe our process, findings and final proposal will be helpful to others who are looking to resolve similar issues on their own campuses.

  16. Dynamic Quantitative T1 Mapping in Orthotopic Brain Tumor Xenografts

    Directory of Open Access Journals (Sweden)

    Kelsey Herrmann

    2016-04-01

    Full Text Available Human brain tumors such as glioblastomas are typically detected using conventional, nonquantitative magnetic resonance imaging (MRI) techniques, such as T2-weighted and contrast-enhanced T1-weighted MRI. In this manuscript, we tested whether dynamic quantitative T1 mapping by MRI can localize orthotopic glioma tumors in an objective manner. Quantitative T1 mapping was performed by MRI over multiple time points using the conventional contrast agent Optimark. We compared signal differences to determine the gadolinium concentration in tissues over time. The T1 parametric maps made it easy to identify the regions of contrast enhancement and thus tumor location. Doubling the typical human dose of contrast agent resulted in a clearer demarcation of these tumors. Therefore, T1 mapping of brain tumors is gadolinium dose dependent and improves detection of tumors by MRI. The use of T1 maps provides a quantitative means to evaluate tumor detection by gadolinium-based contrast agents over time. This dynamic quantitative T1 mapping technique will also enable future quantitative evaluation of various targeted MRI contrast agents.
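    The gadolinium concentration underlying such parametric maps follows from the linear relaxivity relation 1/T1_post = 1/T1_pre + r1·C. A hedged sketch of that conversion (the relaxivity value below is an assumed, order-of-magnitude figure for a gadolinium chelate, not a value reported in the study):

```python
def gd_concentration(t1_post_ms, t1_pre_ms, r1=4.7):
    """Contrast-agent concentration (mM) from pre/post-injection T1 (ms),
    using 1/T1_post = 1/T1_pre + r1 * C, with relaxivity r1 in s^-1 mM^-1.
    T1 values are converted from ms to s via the factor 1000."""
    return (1000.0 / t1_post_ms - 1000.0 / t1_pre_ms) / r1

# Tumor T1 shortening from ~1500 ms to ~500 ms after injection:
print(round(gd_concentration(500.0, 1500.0), 3))  # → 0.284 (mM)
```

Repeating this per voxel and per time point yields the dynamic concentration maps the abstract describes.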

  17. Quantitative phenotyping via deep barcode sequencing.

    Science.gov (United States)

    Smith, Andrew M; Heisler, Lawrence E; Mellor, Joseph; Kaper, Fiona; Thompson, Michael J; Chee, Mark; Roth, Frederick P; Giaever, Guri; Nislow, Corey

    2009-10-01

    Next-generation DNA sequencing technologies have revolutionized diverse genomics applications, including de novo genome sequencing, SNP detection, chromatin immunoprecipitation, and transcriptome analysis. Here we apply deep sequencing to genome-scale fitness profiling to evaluate yeast strain collections in parallel. This method, Barcode analysis by Sequencing, or "Bar-seq," outperforms the current benchmark barcode microarray assay in terms of both dynamic range and throughput. When applied to a complex chemogenomic assay, Bar-seq quantitatively identifies drug targets, with performance superior to the benchmark microarray assay. We also show that Bar-seq is well-suited for a multiplex format. We completely re-sequenced and re-annotated the yeast deletion collection using deep sequencing, found that approximately 20% of the barcodes and common priming sequences varied from expectation, and used this revised list of barcode sequences to improve data quality. Together, this new assay and analysis routine provide a deep-sequencing-based toolkit for identifying gene-environment interactions on a genome-wide scale.
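    At its core, Bar-seq reduces to counting barcode reads per strain in control and treatment pools and comparing them, for example as a log2 ratio. A toy sketch of that counting step (barcode names and the pseudocount are illustrative, not part of the published pipeline):

```python
from collections import Counter
from math import log2

def barseq_fitness(control_reads, treatment_reads, pseudo=1):
    """Per-strain log2 fitness ratio: treatment vs. control barcode counts.
    A pseudocount avoids division by zero for undetected barcodes."""
    ctrl = Counter(control_reads)
    trt = Counter(treatment_reads)
    strains = set(ctrl) | set(trt)
    return {s: log2((trt[s] + pseudo) / (ctrl[s] + pseudo)) for s in strains}

control = ["BC1"] * 100 + ["BC2"] * 100
drug    = ["BC1"] * 100 + ["BC2"] * 10   # BC2 strain depleted under drug
scores = barseq_fitness(control, drug)
print(round(scores["BC2"], 2))  # strongly negative: candidate drug target
```

Strains whose deletion barcodes drop out under treatment (strongly negative scores) point to the drug's target pathway.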

  18. Neuropathic pain: is quantitative sensory testing helpful?

    Science.gov (United States)

    Krumova, Elena K; Geber, Christian; Westermann, Andrea; Maier, Christoph

    2012-08-01

    Neuropathic pain arises as a consequence of a lesion or disease affecting the somatosensory system and is characterised by a combination of positive and negative sensory symptoms. Quantitative sensory testing (QST) examines sensory perception after application of different mechanical and thermal stimuli of controlled intensity, and the function of both large (A-beta) and small (A-delta and C) nerve fibres, including the corresponding central pathways. QST can be used to determine detection and pain thresholds and stimulus-response curves, and can thus detect both negative and positive sensory signs, the latter not being assessed by other methods. Like all other psychophysical tests, QST requires standardised examination, instructions and data evaluation to yield valid and reliable results. Since normative data are available, QST can also contribute to the individual diagnosis of neuropathy, especially in the case of isolated small-fibre neuropathy, in contrast to conventional electrophysiology, which assesses only large myelinated fibres. For example, detection of early stages of subclinical neuropathy in symptomatic or asymptomatic patients with diabetes mellitus can be helpful to optimise treatment and identify the diabetic foot at risk of ulceration. QST assesses the individual's sensory profile and can thus be valuable for evaluating the underlying pain mechanisms, which occur with different frequencies even within the same neuropathic pain syndromes. Furthermore, assessing the exact sensory phenotype by QST might be useful in the future to identify responders to certain treatments in accordance with the underlying pain mechanisms.

  19. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels; 2288 proteins and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, along with the variability in the quantitative analysis associated with each sampling strategy, and help define an appropriate number of replicates for future quantitative analyses.

  20. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
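    The APEX correction can be sketched in a few lines: each protein's observed spectral count is divided by its expected count per molecule (Oi) before normalizing across proteins. A simplified illustration (the counts and Oi values are invented; the real tool derives Oi from a trained peptide-detection classifier):

```python
def apex_abundance(spectral_counts, oi):
    """APEX-style relative abundance: each observed spectral count is
    normalized by the expected count per molecule (Oi), then the
    corrected values are scaled to sum to 1."""
    raw = {p: spectral_counts[p] / oi[p] for p in spectral_counts}
    total = sum(raw.values())
    return {p: v / total for p, v in raw.items()}

counts = {"protA": 50, "protB": 50}
oi = {"protA": 10.0, "protB": 2.0}   # protA's peptides are far easier to detect
abund = apex_abundance(counts, oi)
print(abund)  # protB dominates once detectability is corrected for
```

With equal raw spectral counts, the protein whose peptides are harder to detect (low Oi) is inferred to be the more abundant one, which is exactly the bias spectral counting alone misses.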

  1. Quantitative Ultrasound for Measuring Obstructive Severity in Children with Hydronephrosis.

    Science.gov (United States)

    Cerrolaza, Juan J; Peters, Craig A; Martin, Aaron D; Myers, Emmarie; Safdar, Nabile; Linguraru, Marius George

    2016-04-01

    We define sonographic biomarkers for hydronephrotic renal units that can predict the necessity of diuretic nuclear renography. We selected a cohort of 50 consecutive patients with hydronephrosis of varying severity in whom 2-dimensional sonography and diuretic mercaptoacetyltriglycine renography had been performed. A total of 131 morphological parameters were computed using quantitative image analysis algorithms. Machine learning techniques were then applied to identify ultrasound based safety thresholds that agreed with the t½ for washout. A best fit model was then derived for each threshold level of t½ that would be clinically relevant at 20, 30 and 40 minutes. Receiver operating characteristic curve analysis was performed. Sensitivity, specificity and area under the receiver operating characteristic curve were determined. Improvement obtained by the quantitative imaging method compared to the Society for Fetal Urology grading system and the hydronephrosis index was statistically verified. For the 3 thresholds considered, and at 100% sensitivity, the specificities of the quantitative imaging method were 94%, 70% and 74%, respectively. Corresponding area under the receiver operating characteristic curve values were 0.98, 0.94 and 0.94, respectively. Improvement obtained by the quantitative imaging method over the Society for Fetal Urology grade and hydronephrosis index was statistically significant. Quantitative imaging of hydronephrosis can identify thresholds of clinically significant washout times with 100% sensitivity to decrease the number of diuretic renograms in up to 62% of children. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
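    The threshold evaluation described, 100% sensitivity with the best achievable specificity, amounts to computing sensitivity and specificity at a candidate cutoff on the imaging score. A generic sketch (the scores and labels below are synthetic, not study data):

```python
def sens_spec(scores, labels, threshold):
    """Sensitivity/specificity of the rule 'score >= threshold predicts
    obstruction'. labels: 1 = washout t1/2 exceeded the clinical cutoff."""
    tp = sum(s >= threshold and y for s, y in zip(scores, labels))
    fn = sum(s < threshold and y for s, y in zip(scores, labels))
    tn = sum(s < threshold and not y for s, y in zip(scores, labels))
    fp = sum(s >= threshold and not y for s, y in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 1, 0, 1, 0]
# Pick the highest threshold that still catches every positive case:
print(sens_spec(scores, labels, 0.3))  # (1.0, 0.5)
```

Sweeping the threshold over all scores and keeping only those with sensitivity 1.0 reproduces the "safety threshold" logic: no obstructed kidney is missed, and every patient above the cutoff is spared a renogram only if the specificity is acceptable.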

  2. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single vessel disease, and 100% in double and triple vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.
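    The concordance statistic used here, Cohen's kappa for two binary raters, can be computed directly from the observed agreement and the chance agreement implied by each rater's marginal rates. A small sketch with synthetic per-vessel calls (not the study data):

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters, e.g.
    angiography vs. the stress polar map, one call per vessel (0/1)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)                 # both say "diseased"
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)          # both say "normal"
    pe = p_yes + p_no                                   # chance agreement
    return (po - pe) / (1 - pe)

angio = [1, 1, 1, 0, 0, 0, 1, 0]
spect = [1, 1, 0, 0, 0, 0, 1, 1]
print(round(cohens_kappa(angio, spect), 2))  # → 0.5
```

Values near 0.7 (as for the LAD) indicate substantially better-than-chance concordance, while values near 0.3 (as for the LCX) indicate only marginal agreement.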

  3. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage

  4. Identifying tier one key suppliers.

    Science.gov (United States)

    Wicks, Steve

    2013-01-01

    In today's global marketplace, businesses are becoming increasingly reliant on suppliers for the provision of key processes, activities, products and services in support of their strategic business goals. The result is that now, more than ever, the failure of a key supplier has potential to damage reputation, productivity, compliance and financial performance seriously. Yet despite this, there is no recognised standard or guidance for identifying a tier one key supplier base and, up to now, there has been little or no research on how to do so effectively. This paper outlines the key findings of a BCI-sponsored research project to investigate good practice in identifying tier one key suppliers, and suggests a scalable framework process model and risk matrix tool to help businesses effectively identify their tier one key supplier base.

  5. Identifying factors affecting optimal management of agricultural water

    Directory of Open Access Journals (Sweden)

    Masoud Samian

    2015-01-01

    In addition to quantitative methodology such as descriptive statistics and factor analysis, a qualitative methodology was employed for dynamic simulation among variables through Vensim software. In this study, the factor analysis technique was used together with the Kaiser-Meyer-Olkin (KMO) and Bartlett tests. From the results, four key elements were identified as factors affecting the optimal management of agricultural water in the Hamedan area. These factors were institutional and legal factors, technical and knowledge factors, economic factors and social factors.
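    The KMO measure of sampling adequacy mentioned above compares the zero-order correlations among survey items with their anti-image partial correlations. A compact sketch of the overall KMO statistic on simulated questionnaire data (the data and variable structure are illustrative only):

```python
import numpy as np

def kmo(data):
    """Kaiser-Meyer-Olkin measure of sampling adequacy for factor analysis.
    data: (n_obs, n_vars). Values near 1 favour factorability."""
    r = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(r)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                        # anti-image partial correlations
    off = ~np.eye(r.shape[0], dtype=bool)     # off-diagonal mask
    ssq_r = (r[off] ** 2).sum()
    ssq_p = (partial[off] ** 2).sum()
    return ssq_r / (ssq_r + ssq_p)

rng = np.random.default_rng(1)
factor = rng.normal(size=(300, 1))            # one latent factor
data = factor @ np.ones((1, 4)) + 0.5 * rng.normal(size=(300, 4))
print(round(kmo(data), 2))  # high: the four items share a common factor
```

A high KMO (conventionally above 0.6-0.7) indicates the correlation structure is suitable for the factor extraction used in the study; Bartlett's test plays the complementary role of rejecting an identity correlation matrix.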

  6. Football refereeing: Identifying innovative methods

    Directory of Open Access Journals (Sweden)

    Reza MohammadKazemi

    2014-08-01

    Full Text Available The aim of the present study is to identify potential innovations in the football industry. Data were collected from 10 national and international referees, assistant referees and referees' supervisors in Iran. In this study, technological innovations are identified that assist better refereeing performance. The analysis revealed a significant relationship between using new technologies and referees' performance. The results indicate that elite referees, assistant referees and supervisors agreed to use new technological innovations during the game. According to their comments, this kind of technology supports the development of referees' performance.

  7. Quantitative evaluation of dermatological antiseptics.

    Science.gov (United States)

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.
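    The EN 1276 criterion applied above reduces to a simple calculation: the log10 drop in viable count after the 5-minute exposure must be at least 5. A minimal sketch (the colony counts are invented for illustration):

```python
from math import log10

def log_reduction(n0, n_survivors):
    """log10 reduction in viable count; EN 1276 requires >= 5 within 5 min."""
    return log10(n0 / n_survivors)

inoculum = 1.5e8                                 # cfu/mL before exposure
print(log_reduction(inoculum, 1.5e3) >= 5)       # True: meets the standard
print(log_reduction(inoculum, 1.5e5) >= 5)       # False: only a 3-log kill
```

So an antiseptic passes only if fewer than 1 in 100,000 of the inoculated S. aureus cells survive the 5-minute exposure.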

  8. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
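    The observed-to-liability-scale conversion discussed in these papers can be written in a few lines using the normal liability threshold model (a textbook sketch of the standard transformation, not the authors' code):

```python
from statistics import NormalDist

def h2_liability(h2_observed, prevalence):
    """Convert heritability on the observed 0/1 disease scale to the
    liability scale, using the threshold model underlying the
    James (1971) and Reich, James and Morris (1972) framework:
    h2_l = h2_o * K(1 - K) / z^2, with K the prevalence and z the
    standard-normal density at the liability threshold."""
    nd = NormalDist()
    t = nd.inv_cdf(1 - prevalence)   # liability threshold
    z = nd.pdf(t)                    # normal density at the threshold
    return h2_observed * prevalence * (1 - prevalence) / z ** 2

# A rare disease (K = 1%) with small observed-scale heritability:
print(round(h2_liability(0.05, 0.01), 2))  # → 0.7
```

The example shows why the transformation matters for SNP-based estimates: a modest observed-scale value for a rare disease corresponds to a much larger heritability on the unobserved liability scale.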

  9. Quantitative Activities for Introductory Astronomy

    Science.gov (United States)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune"; "The Temperature on Titan"; "Rocks in the Early Solar System"; "Comets Hitting Planets"; "Ages of Meteorites"; "How Flat are Saturn's Rings?"; "Tides of the Sun and Moon on the Earth"; "The Gliese 581 Solar System"; "Buckets in the Rain"; "How Hot, Bright and Big is Betelgeuse?"; "Bombs and the Sun"; "What Forms Stars?"; "Lifetimes of Cars and Stars"; "The Mass of the Milky Way"; "How Old is the Universe?"; and "Is The Universe Speeding up or Slowing Down?"

  10. Quantitative patterns in drone wars

    Science.gov (United States)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.
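    Distinguishing a lognormal from an exponential severity distribution, as described above, can be done by comparing maximum-likelihood fits of the two families. A self-contained sketch on synthetic severities (not the drone-attack data):

```python
import math
import random

def loglik_lognormal(xs):
    """Log-likelihood of xs under the MLE-fitted lognormal distribution."""
    logs = [math.log(x) for x in xs]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / len(logs))
    return sum(-math.log(x * sigma * math.sqrt(2 * math.pi))
               - (math.log(x) - mu) ** 2 / (2 * sigma ** 2) for x in xs)

def loglik_exponential(xs):
    """Log-likelihood of xs under the MLE-fitted exponential distribution."""
    lam = len(xs) / sum(xs)
    return sum(math.log(lam) - lam * x for x in xs)

random.seed(0)
severities = [random.lognormvariate(1.5, 0.8) for _ in range(2000)]
print(loglik_lognormal(severities) > loglik_exponential(severities))  # True
```

On real event data the same comparison (ideally with an information criterion or a likelihood-ratio style test, since the families are non-nested) indicates which distributional family better describes attack severities.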

  11. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  12. Quantitative variation in natural populations

    International Nuclear Information System (INIS)

    Parsons, P.A.

    1975-01-01

    Quantitative variation is considered in natural populations using Drosophila as the example. A knowledge of such variation enables its rapid exploitation in directional selection experiments, as shown for scutellar chaeta number. Where evidence has been obtained, genetic architectures are in qualitative agreement with Mather's concept of balance for traits under stabilizing selection. Additive genetic control is found for acute environmental stresses, but not for less acute stresses, as shown by exposure to 60Co γ-rays. D. simulans probably has a narrower ecological niche than its sibling species D. melanogaster, associated with lower genetic heterogeneity. One specific environmental stress to which D. simulans is sensitive in nature is ethyl alcohol, as shown by winery data. (U.S.)

  13. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to: demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  14. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent – but often controversially discussed – in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference "Risk Management Reloaded" and this proceedings volume contribute to bridging the gap between academia – providing methodological advances – and practice – having a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  15. SOCIODEMOGRAPHIC DATA USED FOR IDENTIFYING ...

    Science.gov (United States)

    Due to unique social and demographic characteristics, various segments of the population may experience exposures different from those of the general population, which, in many cases, may be greater. When risk assessments do not characterize subsets of the general population, the populations that may experience the greatest risk remain unidentified. When such populations are not identified, the social and demographic data relevant to these populations are not considered when preparing exposure estimates, which can underestimate exposure and risk estimates for at-risk populations. Thus, it is necessary for risk or exposure assessors characterizing a diverse population to first identify and then enumerate certain groups within the general population who are at risk for greater contaminant exposures. The document entitled Sociodemographic Data Used for Identifying Potentially Highly Exposed Populations (also referred to as the Highly Exposed Populations document) assists assessors in identifying and enumerating potentially highly exposed populations. This document presents data relating to factors which potentially impact an individual's or group's exposure to environmental contaminants based on activity patterns (how time is spent), microenvironments (locations where time is spent), and other socio-demographic data such as age, gender, race and economic status. Populations potentially more exposed to various chemicals of concern, relative to the general population

  16. SNP interaction pattern identifier (SIPI)

    DEFF Research Database (Denmark)

    Lin, Hui Yi; Chen, Dung Tsa; Huang, Po Yu

    2017-01-01

    Motivation: Testing SNP-SNP interactions is considered as a key for overcoming bottlenecks of genetic association studies. However, related statistical methods for testing SNP-SNP interactions are underdeveloped. Results: We propose the SNP Interaction Pattern Identifier (SIPI), which tests 45...

  17. Identifying the Gifted Child Humorist.

    Science.gov (United States)

    Fern, Tami L.

    1991-01-01

    This study attempted to identify gifted child humorists among 1,204 children in grades 3-6. Final identification of 13 gifted child humorists was determined through application of such criteria as funniness, originality, and exemplary performance or product. The influence of intelligence, development, social factors, sex differences, family…

  18. Identifying high-risk medication

    DEFF Research Database (Denmark)

    Sædder, Eva; Brock, Birgitte; Nielsen, Lars Peter

    2014-01-01

    salicylic acid, and beta-blockers; 30 drugs or drug classes caused 82 % of all serious MEs. The top ten drugs involved in fatal events accounted for 73 % of all drugs identified. CONCLUSION: Increasing focus on seven drugs/drug classes can potentially reduce hospitalizations, extended hospitalizations...

  19. Management of COPD: Is there a role for quantitative imaging?

    International Nuclear Information System (INIS)

    Kirby, Miranda; Beek, Edwin J.R. van; Seo, Joon Beom; Biederer, Juergen; Nakano, Yasutaka; Coxson, Harvey O.; Parraga, Grace

    2017-01-01

    Highlights: • Multicentre studies with CT are enabling a better understanding of COPD phenotypes. • New pulmonary MRI techniques have emerged that provide sensitive COPD biomarkers. • OCT is the only imaging modality that can directly quantify the small airways. • Imaging may identify phenotypes for effective COPD management to improve outcomes. - Abstract: While the recent development of quantitative imaging methods has led to their increased use in the diagnosis and management of many chronic diseases, medical imaging still plays a limited role in the management of chronic obstructive pulmonary disease (COPD). In this review we highlight three pulmonary imaging modalities, computed tomography (CT), magnetic resonance imaging (MRI) and optical coherence tomography (OCT), and the COPD biomarkers derived from them that may be helpful for managing COPD patients. We discuss the current role imaging plays in COPD management as well as the potential role quantitative imaging will play by identifying imaging phenotypes that enable more effective COPD management and improved outcomes.

  20. Management of COPD: Is there a role for quantitative imaging?

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, Miranda [Department of Radiology, University of British Columbia, Vancouver (Canada); UBC James Hogg Research Center & The Institute of Heart and Lung Health, St. Paul' s Hospital, Vancouver (Canada); Beek, Edwin J.R. van [Clinical Research Imaging Centre, Queen’s Medical Research Institute, University of Edinburgh, Edinburgh (United Kingdom); Seo, Joon Beom [Department of Radiology, University of Ulsan College of Medicine, Asan Medical Center (Korea, Republic of); Biederer, Juergen [Department of Diagnostic and Interventional Radiology, University Hospital of Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Lung Research Center (DZL) (Germany); Radiologie Darmstadt, Gross-Gerau County Hospital (Germany); Nakano, Yasutaka [Division of Respiratory Medicine, Department of Internal Medicine, Shiga University of Medical Science, Shiga (Japan); Coxson, Harvey O. [Department of Radiology, University of British Columbia, Vancouver (Canada); UBC James Hogg Research Center & The Institute of Heart and Lung Health, St. Paul' s Hospital, Vancouver (Canada); Parraga, Grace, E-mail: gparraga@robarts.ca [Robarts Research Institute, The University of Western Ontario, London (Canada); Department of Medical Biophysics, The University of Western Ontario, London (Canada)

    2017-01-15

    Highlights: • Multicentre studies with CT are enabling a better understanding of COPD phenotypes. • New pulmonary MRI techniques have emerged that provide sensitive COPD biomarkers. • OCT is the only imaging modality that can directly quantify the small airways. • Imaging may identify phenotypes for effective COPD management to improve outcomes. - Abstract: While the recent development of quantitative imaging methods has led to their increased use in the diagnosis and management of many chronic diseases, medical imaging still plays a limited role in the management of chronic obstructive pulmonary disease (COPD). In this review we highlight three pulmonary imaging modalities, computed tomography (CT), magnetic resonance imaging (MRI) and optical coherence tomography (OCT), and the COPD biomarkers derived from them that may be helpful for managing COPD patients. We discuss the current role imaging plays in COPD management as well as the potential role quantitative imaging will play by identifying imaging phenotypes that enable more effective COPD management and improved outcomes.

  1. Distributed Persistent Identifiers System Design

    Directory of Open Access Journals (Sweden)

    Pavel Golodoniuc

    2017-06-01

    Full Text Available The need to identify both digital and physical objects is ubiquitous in our society. Past and present persistent identifier (PID) systems, of which there is a great variety in terms of technical and social implementation, have evolved with the advent of the Internet, which has allowed for globally unique and globally resolvable identifiers. PID systems have, by and large, catered for identifier uniqueness, integrity, and persistence, regardless of the identifier's application domain. The trustworthiness of these systems has been measured by the criteria first defined by Bütikofer (2009) and further elaborated by Golodoniuc et al. (2016) and Car et al. (2017). Since many PID systems have been conceived and developed by a single organisation, they have faced challenges to widespread adoption and, most importantly, to surviving changes of technology. We believe that one cause of once-successful PID systems fading away is the centralisation of their support infrastructure, both organisational and technical (computing and data storage systems). In this paper, we propose a PID system design that implements the pillars of a trustworthy system: ensuring identifiers' independence of any particular technology or organisation, implementation of core PID system functions, separation from data delivery, and enabling the system to adapt to future change. We propose decentralisation at all levels (persistent identifier and information object registration, resolution, and data delivery) using Distributed Hash Tables and traditional peer-to-peer networks with information replication and caching mechanisms, thus eliminating the need for a central PID data store. This will increase overall system fault tolerance, thus ensuring its trustworthiness. We also discuss important aspects of the distributed system's governance, such as the notion of the authoritative source and data integrity.
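    The decentralised resolution scheme described above can be illustrated with a minimal consistent-hash ring, in which each identifier is registered on its successor node plus replicas. This is only a sketch of the general Distributed Hash Table idea, not the authors' implementation; the node names and the example identifier below are hypothetical.

```python
import hashlib
from bisect import bisect_right

class HashRing:
    """Minimal consistent-hash ring: each node owns an arc of the hash space;
    a PID is registered on its clockwise successor node plus (replicas - 1)
    further nodes for fault tolerance, so no central registry is needed."""

    def __init__(self, nodes, replicas=2):
        self.replicas = replicas
        # Sort nodes by the hash of their name to place them on the ring.
        self.ring = sorted((self._h(n), n) for n in nodes)

    @staticmethod
    def _h(key):
        return int(hashlib.sha256(key.encode()).hexdigest(), 16)

    def resolve(self, pid):
        """Return the nodes responsible for storing/resolving this PID."""
        keys = [k for k, _ in self.ring]
        i = bisect_right(keys, self._h(pid)) % len(self.ring)
        return [self.ring[(i + j) % len(self.ring)][1]
                for j in range(self.replicas)]

# Hypothetical resolver nodes and identifier:
ring = HashRing(["node-a", "node-b", "node-c", "node-d"])
owners = ring.resolve("ark:/12345/x7qc83")
```

    Because any peer can compute the same hash, every node resolves a given identifier to the same owner set, which is what removes the single point of failure the paper criticises.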

  2. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from "Alexandru Ioan Cuza" University of Iaşi (Romania), "G. d'Annunzio" University of Chieti-Pescara (Italy), the University of Defence in Brno (Czech Republic), and Pablo de Olavide University in Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four sections, as follows. The first section deals with recent trends in social decisions; specifically, it aims to understand the driving forces behind social decisions. The second section focuses on the social and public sphere and is oriented toward recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  3. A quantitative reading of competences documents of Law new degrees.

    OpenAIRE

    Leví Orta, Genoveva del Carmen; Ramos Méndez, Eduardo

    2014-01-01

    Documents formulating the competences of degrees are key sources for the analysis, evaluation and comparison of the training profiles currently offered by different university degrees. This work makes a quantitative reading of the competence documents of the Law degree from various Spanish universities, based on the ideas of Content Analysis. The methodology has two phases. Firstly, a dictionary of concepts related to the components of competences is identified in the documentary corpus. Next, the corpus...
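    The quantitative-reading step of Content Analysis, counting how often each dictionary concept appears in a document, can be sketched as a simple term-frequency count. The mini-dictionary and the sample sentence below are hypothetical and purely illustrative; the study's actual dictionary was derived from its own documentary corpus.

```python
import re
from collections import Counter

# Hypothetical mini-dictionary mapping competence components to indicator terms.
CONCEPT_DICTIONARY = {
    "knowledge": ["knowledge", "understanding"],
    "skills": ["skill", "skills", "ability", "abilities"],
    "attitudes": ["attitude", "attitudes", "responsibility"],
}

def concept_profile(text):
    """Count occurrences of each concept's indicator terms in a document."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(tokens)
    return {concept: sum(counts[t] for t in terms)
            for concept, terms in CONCEPT_DICTIONARY.items()}

doc = "Graduates demonstrate knowledge of law and the ability to apply skills responsibly."
profile = concept_profile(doc)
```

    Aggregating such profiles across universities yields the comparable quantitative descriptions of training that the abstract refers to.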

  4. A Review on Quantitative Models for Sustainable Food Logistics Management

    Directory of Open Access Journals (Sweden)

    M. Soysal

    2012-12-01

    Full Text Available Over the last two decades, food logistics systems have seen a transition from a focus on traditional supply chain management to food supply chain management and, successively, to sustainable food supply chain management. The main aim of this study is to identify the key logistical aims in these three phases and to analyse currently available quantitative models in order to point out modelling challenges in sustainable food logistics management (SFLM). A literature review of quantitative studies is conducted, and qualitative studies are also consulted to understand the key logistical aims more clearly and to identify relevant system scope issues. Results show that research on SFLM has been developing progressively according to the needs of the food industry. However, the intrinsic characteristics of food products and processes have not yet been handled properly in the identified studies. The majority of the works reviewed have not addressed sustainability problems, apart from a few recent studies. The study therefore concludes that new and advanced quantitative models are needed that take specific SFLM requirements from practice into consideration to support business decisions and capture food supply chain dynamics.

  5. ORCID: Author Identifiers for Librarians

    Directory of Open Access Journals (Sweden)

    Robyn B. Reed

    2017-10-01

    Full Text Available Generating accurate publication lists can be challenging for researchers who have common names or who have published under name variations. This article describes ORCID and its goal of providing author identifiers that connect scholars with their research outputs. Included are the reasons for having author identifiers as well as the types of information within individual profiles. The article describes how academic libraries are playing a role in ORCID initiatives, and how publishers, institutions, and funders are employing ORCID in their workflows, and it highlights the use of ORCID by academic institutions in Pennsylvania. The purpose of the article is to provide an overview of ORCID and its uses to inform librarians about this important initiative.

  6. Device for identifying fuel assembly

    International Nuclear Information System (INIS)

    Imai, Tetsuo; Miyazawa, Tatsuo.

    1982-01-01

    Purpose: To accurately identify a symbol printed on a hanging tool at the upper part of a fuel assembly. Constitution: Optical fibers are bundled to prepare a detector, which is disposed at a predetermined position on the hanging tool; this position is set by a guide. Light emitted from an illumination lamp reaches the bottom of a groove printed on the upper surface of the tool and is divided into a weak light reflected upward and a strong light reflected from the surface below the groove. When these lights are received by the optical fibers, the fibers corresponding to grooved positions appear dark and those corresponding to ungrooved positions appear bright. Since the fuel assembly is identified by the dark and bright patterns of the optical fibers as symbols, different machining can be performed on the upper surface of the tool for each fuel assembly. (Yoshihara, H.)

  7. Identifying patient risks during hospitalization

    Directory of Open Access Journals (Sweden)

    Lucélia Ferreira Lima

    2008-12-01

    Full Text Available Objective: To identify the risks reported at a public institution and to know the main patient risks from the nursing staff point of view. Methods: A retrospective, descriptive and exploratory study. The survey was developed at a hospital in the city of Taboão da Serra, São Paulo, Brazil. The study included all nurses working in care areas who agreed to participate in the study. At the same time, sentinel events occurring in the period from July 2006 to July 2007 were identified. Results: There were 440 sentinel events reported, and the main risks included patient falls, medication errors and pressure ulcers. Sixty-five nurses were interviewed. They also reported patient falls, medication errors and pressure ulcers as the main risks. Conclusions: Risk assessment and implementation of effective preventive actions are necessary to ensure patient safety. Involvement of a multidisciplinary team is one of the steps for a successful process.

  8. Identifying High Performance ERP Projects

    OpenAIRE

    Stensrud, Erik; Myrtveit, Ingunn

    2002-01-01

    Learning from high performance projects is crucial for software process improvement. Therefore, we need to identify outstanding projects that may serve as role models. It is common to measure productivity as an indicator of performance. It is vital that productivity measurements deal correctly with variable returns to scale and multivariate data. Software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose to use Data Envelopment ...

  9. Expression profiling identifies genes involved in emphysema severity

    Directory of Open Access Journals (Sweden)

    Bowman Rayleen V

    2009-09-01

    Full Text Available Abstract Chronic obstructive pulmonary disease (COPD) is a major public health problem. The aim of this study was to identify genes involved in emphysema severity in COPD patients. Gene expression profiling was performed on total RNA extracted from non-tumor lung tissue from 30 smokers with emphysema. Class comparison analysis based on gas transfer measurement was performed to identify differentially expressed genes. Genes were then selected for technical validation by quantitative reverse transcriptase-PCR (qRT-PCR) if also represented on the microarray platforms used in previously published emphysema studies. Genes that were technically validated advanced to tests of biological replication by qRT-PCR using an independent test set of 62 lung samples. Class comparison identified 98 differentially expressed genes. Gene expression profiling of lung from emphysema patients identified seven candidate genes associated with emphysema severity, including COL6A3, SERPINF1, ZNHIT6, NEDD4, CDKN2A, NRN1 and GSTM3.

  10. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods such as perfusion techniques, which measure flow, require the introduction of a tube assembly into the gastrointestinal tract, with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux.

  11. Quantitative organ visualization using SPECT

    International Nuclear Information System (INIS)

    Kircos, L.T.; Carey, J.E. Jr.; Keyes, J.W. Jr.

    1987-01-01

    Quantitative organ visualization (QOV) was performed using single photon emission computed tomography (SPECT). Organ size was calculated from serial, contiguous ECT images taken through the organ of interest, with image boundaries determined using a maximum directional gradient edge-finding technique. Organ activity was calculated using the ECT counts bounded by the directional gradient, the imaging system efficiency, and the imaging time. The technique used to perform QOV was evaluated using phantom studies, in vivo canine liver, spleen, bladder, and kidney studies, and in vivo human bladder studies. It was demonstrated that absolute organ activity and organ size could be determined with this system, with total imaging time restricted to less than 45 min, to an accuracy of about +/- 10%, provided that the minimum dimensions of the organ are greater than the FWHM of the imaging system and the total radioactivity within the organ of interest exceeds 15 nCi/cc for dog-sized torsos. In addition, effective half-lives of approximately 1.5 hr or greater could be determined.
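    The maximum directional gradient edge-finding step can be illustrated in one dimension: along a count profile through the organ, the boundaries are placed where the gradient is most strongly positive (entering the organ) and most strongly negative (leaving it). This is a simplified sketch of the general idea on a synthetic profile, not the imaging system's actual algorithm.

```python
def max_gradient_edges(profile):
    """Locate organ boundaries along a 1-D count profile as the positions of
    the maximum positive and maximum negative central-difference gradient."""
    grad = [(profile[i + 1] - profile[i - 1]) / 2.0
            for i in range(1, len(profile) - 1)]
    rising = max(range(len(grad)), key=lambda i: grad[i]) + 1   # organ entry
    falling = min(range(len(grad)), key=lambda i: grad[i]) + 1  # organ exit
    return rising, falling

# Synthetic count profile: background, organ plateau, background.
profile = [2, 3, 2, 20, 48, 50, 49, 51, 22, 3, 2, 3]
left, right = max_gradient_edges(profile)
```

    In 2-D or 3-D the same criterion is applied along rays through the organ, and the counts inside the resulting boundary are summed to estimate activity.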

  12. Quantitative isotopes miction cystoureterography (QIMCU)

    International Nuclear Information System (INIS)

    Szy, D.A.G.; Stroetges, M.W.; Funke-Voelkers, R.

    1982-01-01

    A simple method for the quantitative evaluation of vesicoureteral reflux was developed. It allows the determination of a) the volume of reflux and b) the volume of the bladder at each point in time during the examination. The QIMCU gives insight into the dynamics of reflux, reflux volume, and actual bladder volume. Clinical application in 37 patients with 53 insufficient ureteral orifices (i.e. reflux) showed that the onset of reflux occurred in 60% of cases as early as the first five minutes of the examination, but later in the remaining 40%. The maximal reflux was found in only 26% during the first five minutes. The reflux volume exceeded 3.5 ml in more than 50% of cases. The international grading corresponds with the reflux volume determined by this method. Radionuclide cystoureterography can be used in children as well as in adults. Because the radiation exposure is low, the method can be recommended for the initial examination and for follow-up studies. (Author)

  13. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as-low-as-reasonably-achievable (ALARA) programs are described, along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, five generally apply to most programs: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) the statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole-body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and non-penetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs

  14. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Cognitive schemes of plant anatomy concepts are formed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected using a quantitative literacy test with the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test based on Marzano and the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  15. Quantitative Ultrasound Measurements at the Heel

    DEFF Research Database (Denmark)

    Daugschies, M.; Brixen, K.; Hermann, P.

    2015-01-01

    Calcaneal quantitative ultrasound can be used to predict osteoporotic fracture risk, but its ability to monitor therapy is unclear possibly because of its limited precision. We developed a quantitative ultrasound device (foot ultrasound scanner) that measures the speed of sound at the heel...... with the foot ultrasound scanner reduced precision errors by half (p quantitative ultrasound measurements is feasible. (E-mail: m.daugschies@rad.uni-kiel.de) (C) 2015 World Federation for Ultrasound in Medicine & Biology....

  16. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction: Research in the area of health has traditionally been dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other ways to approach knowledge. Aim: To discuss the complementarity of qualitative and quantitative research methods in the generation of knowledge. Contents: The purpose of quantitative research is to measure the magnitude of an event,...

  17. Sparse Linear Identifiable Multivariate Modeling

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2011-01-01

    and bench-marked on artificial and real biological data sets. SLIM is closest in spirit to LiNGAM (Shimizu et al., 2006), but differs substantially in inference, Bayesian network structure learning and model comparison. Experimentally, SLIM performs equally well or better than LiNGAM with comparable......In this paper we consider sparse and identifiable linear latent variable (factor) and linear Bayesian network models for parsimonious analysis of multivariate data. We propose a computationally efficient method for joint parameter and model inference, and model comparison. It consists of a fully...

  18. Identifying flares in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bykerk, Vivian P; Bingham, Clifton O; Choy, Ernest H

    2016-01-01

    to flare, with escalation planned in 61%. CONCLUSIONS: Flares are common in rheumatoid arthritis (RA) and are often preceded by treatment reductions. Patient/MD/DAS agreement of flare status is highest in patients worsening from R/LDA. OMERACT RA flare questions can discriminate between patients with...... Set. METHODS: Candidate flare questions and legacy measures were administered at consecutive visits to Canadian Early Arthritis Cohort (CATCH) patients between November 2011 and November 2014. The American College of Rheumatology (ACR) core set indicators were recorded. Concordance to identify flares...

  19. Quantitative Ultrasound in the assessment of Osteoporosis

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Terlizzi, Francesca de

    2009-01-01

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and its lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increasing clinical interest, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of its ability to monitor therapies has been reported; its usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices; and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, in both clinical and experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for its use in clinical practice.

  20. Quantitative topographic differentiation of the neonatal EEG.

    Science.gov (United States)

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain and may improve the precision of diagnosing cerebral dysfunctions manifested as 'disorganization', 'dysrhythmia' or 'dysmaturity'. Twenty-one healthy, full-term newborns were examined polygraphically during sleep (EEG with 8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and were described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged, and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of the derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal are largely accountable for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis can assess the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method is promising for application in clinical practice.
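    Spectral features of the kind mentioned above are commonly relative band powers. As an illustration only (not the authors' exact feature set), the sketch below computes the relative power of a frequency band with a naive DFT; the sampling rate and test signal are synthetic.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Relative power of the [f_lo, f_hi) Hz band via a naive DFT.
    Returns band power divided by total power over all non-DC bins."""
    n = len(signal)
    total, band = 0.0, 0.0
    for k in range(1, n // 2):  # skip the DC component
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        total += power
        if f_lo <= k * fs / n < f_hi:
            band += power
    return band / total if total else 0.0

# Synthetic 4 Hz tone sampled at 64 Hz; its energy falls in the 3-8 Hz band.
fs, n = 64, 128
sig = [math.sin(2 * math.pi * 4 * t / fs) for t in range(n)]
theta_like = band_power(sig, fs, 3.0, 8.0)
```

    Computing several such band ratios per derivation, alongside shape and variability measures, yields a feature vector like the 13-variable description used in the study.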

  1. Quantitative Ultrasound in the assessment of Osteoporosis

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, Giuseppe [Department of Radiology, University of Foggia, Viale L. Pinto, 71100 Foggia (Italy); Department of Radiology, Scientific Institute Hospital, San Giovanni Rotondo (Italy)], E-mail: g.guglielmi@unifg.it; Terlizzi, Francesca de [IGEA srl, Via Parmenide 10/A 41012 Carpi, MO (Italy)], E-mail: f.deterlizzi@igeamedical.com

    2009-09-15

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and its lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increasing clinical interest, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of its ability to monitor therapies has been reported; its usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices; and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, in both clinical and experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for its use in clinical practice.

  2. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves a higher risk of complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, assessing the damage to the liner due to in vivo exposure is very important. Failures are often due to excessive polyethylene wear. Glenoid liners are complex and hemispherical in shape and present challenges when assessing damage. Therefore, the analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners retrieved from revision surgery; two had an engineering background and two had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by the several observers were analyzed. A new composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by the four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way, and both had engineering backgrounds.

  3. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In recent years, autoradiography has been developed into a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. The influences of radiation quality, of backscattering in the sample and detector materials, and of the sensitivity and fading of the detectors are considered. Furthermore, questions of the quantitative evaluation of autoradiograms are dealt with, and measurement errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of the activity distribution in radioactive foil samples. (author)

  4. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    Science.gov (United States)

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  5. Quantitative influence of risk factors on blood glucose level.

    Science.gov (United States)

    Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu

    2014-01-01

    The aim of this study is to quantitatively analyze the influence of risk factors on blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and for confirming intervention indices for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back propagation (BP) neural network. Ten risk factors are screened first. The cohort is then divided into nine groups by gender and age, and, according to the minimum error principle, nine BP models are trained. The quantitative influence of each risk factor on blood glucose change is obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449), followed by cholesterol, age and triglyceride; together these four factors account for 77% of the influence of the nine screened risk factors. The sensitivity sequences can also support decisions about individual intervention. This method can be applied to the quantitative analysis of risk factors in other diseases and could potentially be used by clinical practitioners to identify populations at high risk for type 2 diabetes as well as other diseases.
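
    The sensitivity step described above can be sketched with finite differences: perturb one input at a time and average the resulting change in the network's output over the cohort. A minimal stand-in for the trained BP models (pure Python; the network weights and the three-factor cohort below are invented for illustration, not the study's data):

```python
import math

def forward(x, w1, b1, w2, b2):
    """Forward pass of a tiny one-hidden-layer BP network (sigmoid hidden units)."""
    hidden = [1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(row, x)) + b)))
              for row, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

def input_sensitivity(samples, w1, b1, w2, b2, eps=1e-4):
    """Mean absolute output change per unit change of each input
    (finite differences), averaged over the cohort: one way to rank
    risk factors by their influence on the predicted blood glucose."""
    n_in = len(samples[0])
    sens = [0.0] * n_in
    for x in samples:
        base = forward(x, w1, b1, w2, b2)
        for i in range(n_in):
            xp = list(x)
            xp[i] += eps
            sens[i] += abs(forward(xp, w1, b1, w2, b2) - base) / eps
    return [s / len(samples) for s in sens]

# Illustrative weights: 3 normalized inputs (weight, cholesterol, age), 2 hidden units
w1 = [[0.8, 0.1, 0.05], [0.5, 0.3, 0.1]]
b1 = [0.0, 0.0]
w2 = [1.0, 1.0]
b2 = 5.0
cohort = [[0.2, 0.1, 0.3], [0.5, 0.4, 0.2], [0.1, 0.6, 0.5]]
ranking = input_sensitivity(cohort, w1, b1, w2, b2)
```

    With these weights the first input dominates, mimicking the paper's finding that one factor (weight) carries the largest sensitivity.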

  6. Quantitative Identification of Construction Risk

    OpenAIRE

    Kasprowicz T.

    2017-01-01

    Risks pertaining to construction work relate to situations in which various random events may change the duration and cost of the project or worsen its quality. Because the impact of such events can differ greatly, favorable, moderate, and difficult conditions of construction work are considered. This is the first stage of the construction risk analysis. The probabilistic parameters of construction are identified and described by using the design characteristics model of the structure and...

  7. Quantitative PET of liver functions.

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance Kmet (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. The standard uptake value (SUV) from a static liver 18F-FDGal PET/CT scan can replace Kmet and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET.
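
    The standard compartment analysis referred to under point 3 rests on the one-tissue model dCt/dt = K1·Ca(t) - k2·Ct(t). A minimal forward (Euler) simulation of that model shows how a tissue time-activity curve follows from K1 and k2 (the parameter values and input function are invented; this is the textbook one-tissue model, not the authors' microvascular extension):

```python
import math

def simulate_tissue_curve(K1, k2, arterial, dt):
    """Euler integration of the one-tissue compartment model
    dC_t/dt = K1 * C_a(t) - k2 * C_t(t), with C_t(0) = 0.
    K1 in mL blood/min/mL tissue, k2 in 1/min, dt in min."""
    Ct = 0.0
    curve = []
    for Ca in arterial:
        curve.append(Ct)
        Ct += dt * (K1 * Ca - k2 * Ct)
    return curve

# Illustrative arterial input: a decaying bolus sampled every 0.1 min for 20 min
dt = 0.1
arterial = [math.exp(-0.5 * i * dt) for i in range(200)]
tissue = simulate_tissue_curve(K1=0.8, k2=0.1, arterial=arterial, dt=dt)
```

    Fitting K1 and k2 to a measured curve is the inverse problem; the review's point is that the fitted K1 must stay physiologically plausible (below hepatic perfusion).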

  9. Persistent Identifiers as Boundary Objects

    Science.gov (United States)

    Parsons, M. A.; Fox, P. A.

    2017-12-01

    In 1989, Leigh Star and Jim Griesemer defined the seminal concept of 'boundary objects'. These 'objects' are what Latour calls 'immutable mobiles' that enable communication and collaboration across difference by helping meaning to be understood in different contexts. As Star notes, they are a sort of arrangement that allows different groups to work together without (a priori) consensus. Part of the idea is to recognize and allow for the 'interpretive flexibility' that is central to much of the 'constructivist' approach in the sociology of science. Persistent Identifiers (PIDs) can clearly act as boundary objects, but people do not usually assume that they enable interpretive flexibility. After all, they are meant to be unambiguous, machine-interpretable identifiers of defined artifacts. In this paper, we argue that PIDs can fill at least two roles: 1) that of the standardized form, where there is strong agreement on what is being represented and how, and 2) that of the idealized type, a more abstract notion that allows many different representations. We further argue that these seemingly abstract conceptions actually help us implement PIDs more effectively to link data, publications, various other artifacts, and especially people. Considering PIDs as boundary objects can help us address issues such as what level of granularity is necessary for PIDs, what metadata should be directly associated with PIDs, and what purpose the PID is serving (reference, provenance, credit, etc.). In short, sociological theory can improve data sharing standards and their implementation in a way that enables broad interdisciplinary data sharing and reuse. We will illustrate this with several specific examples of Earth science data.

  10. Validating quantitative precipitation forecast for the Flood ...

    Indian Academy of Sciences (India)

    In order to issue an accurate warning for floods, a better or appropriate quantitative forecasting of precipitation is required. In view of this, the present study intends to validate the quantitative precipitation forecast (QPF) issued during the southwest monsoon season for six river catchments (basins) under the flood meteorological ...

  11. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among the hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales is essential. Quantitative remote sensing plays an important role in this respect. The

  12. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can readily ensure traceability to the SI unit system, and discussions of its accuracy and uncertainty have also begun. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual cases of quantitative NMR analysis. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it can achieve precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method easily prevents contamination of samples and avoids consumption of the sample, there are many reported cases concerning the quantitative analysis of biologically related samples and highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussions are also progressing on how to adopt this analytical method as an official method in various countries around the world. In Japan, the method is listed in the Pharmacopoeia and the Japanese Standards for Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analytical method that can ensure traceability to the SI unit system. (A.O.)
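
    The internal reference method mentioned above amounts to one algebraic relation: the analyte purity follows from the ratio of integrated signal areas, corrected for the number of contributing nuclei, molar masses and weighed masses. A sketch of the standard qNMR relation (the numerical values below are invented for illustration):

```python
def qnmr_purity(I_sample, I_std, N_sample, N_std,
                M_sample, M_std, m_sample, m_std, P_std):
    """Analyte purity by the internal-reference qNMR method:
    P = (I_s/I_r) * (N_r/N_s) * (M_s/M_r) * (m_r/m_s) * P_r,
    where I is the integrated signal area, N the number of nuclei giving
    the signal, M the molar mass, m the weighed mass, and P the purity
    (mass fraction) of sample (s) and internal reference (r)."""
    return ((I_sample / I_std) * (N_std / N_sample)
            * (M_sample / M_std) * (m_std / m_sample) * P_std)

# Illustrative numbers, not taken from the literature surveyed
purity = qnmr_purity(I_sample=0.5, I_std=1.0, N_sample=2, N_std=3,
                     M_sample=244.24, M_std=122.12, m_sample=10.0, m_std=6.53,
                     P_std=0.9999)
```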

  13. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

    The quantitative imaging of a phase object using 16 keV X rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron X-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  14. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers' dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  15. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  16. Quantitative radiomic profiling of glioblastoma represents transcriptomic expression.

    Science.gov (United States)

    Kong, Doo-Sik; Kim, Junhyung; Ryu, Gyuha; You, Hye-Jin; Sung, Joon Kyung; Han, Yong Hee; Shin, Hye-Mi; Lee, In-Hee; Kim, Sung-Tae; Park, Chul-Kee; Choi, Seung Hong; Choi, Jeong Won; Seol, Ho Jun; Lee, Jung-Il; Nam, Do-Hyun

    2018-01-19

    Quantitative imaging biomarkers have increasingly emerged in research utilizing available imaging modalities. We aimed to identify good surrogate radiomic features that can represent genetic changes of tumors, thereby establishing noninvasive means for predicting treatment outcome. From May 2012 to June 2014, we retrospectively identified 65 patients with treatment-naïve glioblastoma with available clinical information from the Samsung Medical Center data registry. Preoperative MR imaging data were obtained for all 65 patients with primary glioblastoma. A total of 82 imaging features, including first-order statistics, volume, and size features, were semi-automatically extracted from structural and physiologic images such as apparent diffusion coefficient and perfusion images. Using commercially available software, NordicICE, we performed quantitative imaging analysis and collected a dataset composed of radiophenotypic parameters. Unsupervised clustering methods revealed that the radiophenotypic dataset was composed of three clusters. Each cluster represented a distinct molecular classification of glioblastoma: classical type, proneural and neural types, and mesenchymal type. These clusters also reflected differential clinical outcomes. We found that the extracted imaging signatures do not represent copy number variation or somatic mutation. Quantitative radiomic features provide potential evidence to predict molecular phenotype and treatment outcome. Radiomic profiles represent transcriptomic phenotypes well.

  17. Identifying thresholds for ecosystem-based management.

    Directory of Open Access Journals (Sweden)

    Jameal F Samhouri

    Full Text Available BACKGROUND: One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. METHODOLOGY/PRINCIPAL FINDINGS: To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. CONCLUSIONS/SIGNIFICANCE: For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management.
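
    The nonlinearity detection at the heart of the method can be caricatured in a few lines: sample the attribute-pressure relationship and look for the point of greatest curvature. A toy sketch assuming evenly spaced, noise-free points (the real analysis fits statistical models to ecosystem-model output):

```python
def utility_threshold(pressures, responses):
    """Locate a candidate utility threshold as the pressure level where
    the discrete second derivative (curvature) of the ecosystem-attribute
    vs. pressure relationship is largest in magnitude."""
    best_i, best_curv = None, -1.0
    for i in range(1, len(responses) - 1):
        curv = abs(responses[i - 1] - 2 * responses[i] + responses[i + 1])
        if curv > best_curv:
            best_i, best_curv = i, curv
    return pressures[best_i]

# Illustrative: attribute stays flat until pressure 0.6, then collapses
pressures = [i / 10 for i in range(11)]
responses = [1.0] * 7 + [0.6, 0.3, 0.1, 0.05]
threshold = utility_threshold(pressures, responses)
```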

  18. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each

  19. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and to collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology in two major categories (channel-based and droplet-based microfluidics) and describe their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  20. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron-absorbing honeycomb collimator. By setting the neutron-absorbing honeycomb collimator between the object and the imaging system, neutrons scattered in the object were absorbed by the honeycomb material and eliminated before reaching the imaging system, whereas neutrons transmitted through the object without interaction could reach the imaging system. The image formed by purely transmitted neutrons gives quantitative information. Two honeycombs, coated with boron nitride and gadolinium oxide respectively, were prepared and evaluated for quantitative application. The relation between the neutron total cross section and the attenuation coefficient confirmed that the two were in fairly good agreement. Application to quantitative computed tomography was also successfully conducted. The new neutron radiography method using the neutron-absorbing honeycomb collimator to eliminate scattered neutrons remarkably improved the quantitativeness of neutron radiography and computed tomography. (author)
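
    Once scattered neutrons are removed by the collimator, the transmitted image obeys the exponential attenuation law I = I0·exp(-μt), so the attenuation coefficient can be recovered pixel by pixel. A minimal sketch (the beam intensities and thickness are invented):

```python
import math

def attenuation_coefficient(I0, I, thickness_cm):
    """Macroscopic attenuation coefficient mu (1/cm) from a scatter-free
    transmission measurement, I = I0 * exp(-mu * t). Only valid once
    scattered neutrons have been removed, e.g. by an absorbing honeycomb
    collimator."""
    return -math.log(I / I0) / thickness_cm

# Illustrative: a 1 cm sample transmitting 35% of the incident beam
mu = attenuation_coefficient(I0=1000.0, I=350.0, thickness_cm=1.0)
```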

  1. RECOVIR Software for Identifying Viruses

    Science.gov (United States)

    Chakravarty, Sugoto; Fox, George E.; Zhu, Dianhui

    2013-01-01

    Most single-stranded RNA (ssRNA) viruses mutate rapidly to generate a large number of strains with highly divergent capsid sequences. Determining the capsid residues or nucleotides that uniquely characterize these strains is critical to understanding the strain diversity of these viruses. RECOVIR (an acronym for "recognize viruses") software predicts the strains of some ssRNA viruses from their limited sequence data. Novel phylogenetic-tree-based databases of protein or nucleic acid residues that uniquely characterize these virus strains are created. Strains of input virus sequences (partial or complete) are predicted through residue-wise comparisons with the databases. RECOVIR uses unique characterizing residues to automatically identify strains of partial or complete capsid sequences of picornaviruses and caliciviruses, two of the most highly diverse ssRNA virus families. Partition-wise comparisons of the database residues with the corresponding residues of more than 300 complete and partial sequences of these viruses resulted in correct strain identification for all of these sequences. This study shows the feasibility of creating databases of hitherto unknown residues that uniquely characterize the capsid sequences of two of the most highly divergent ssRNA virus families. These databases enable automated strain identification from partial or complete capsid sequences of these human and animal pathogens.
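
    The residue-wise comparison RECOVIR performs can be illustrated with a toy matcher: each strain is represented by its uniquely characterizing residues (position → amino acid), and the query capsid sequence is scored against each signature. The positions, residues and strain names below are invented for illustration, not taken from the RECOVIR databases:

```python
def identify_strain(query, signatures):
    """Score each strain by the fraction of its uniquely characterizing
    residues (0-based position -> amino acid) that the query sequence
    matches; return the best-scoring strain name."""
    def score(sig):
        hits = sum(1 for pos, aa in sig.items()
                   if pos < len(query) and query[pos] == aa)
        return hits / len(sig)
    return max(signatures, key=lambda strain: score(signatures[strain]))

# Hypothetical signatures for two strains
signatures = {
    "strain_A": {2: "L", 5: "K", 8: "D"},
    "strain_B": {2: "V", 5: "R", 8: "E"},
}
best = identify_strain("MALWKMKADE", signatures)
```

    Real use would also handle partial sequences whose coordinates must first be aligned to the database's reference frame.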

  2. Informatics methods to enable sharing of quantitative imaging research data.

    Science.gov (United States)

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in the tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach that enables data sharing and promotes reuse of quantitative imaging data in the community. We performed a survey of the tools currently in use by the QIN member sites for the representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. A variety of tools are currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like jigsaw-puzzle pieces in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and have proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.
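
    Quantifying microtubule orientation relative to the growth axis reduces to an axial angle measurement: a microtubule has no head or tail, so angles are folded into [0°, 90°]. A minimal sketch of that evaluation (the segment vectors below are invented; the study's actual pipeline extracts orientations from confocal images):

```python
import math

def angle_to_axis(mt_vector, axis_vector):
    """Acute angle (degrees) between a microtubule segment and the cell
    growth axis; orientations are axial, so results fall in [0, 90]."""
    dot = mt_vector[0] * axis_vector[0] + mt_vector[1] * axis_vector[1]
    norm = math.hypot(*mt_vector) * math.hypot(*axis_vector)
    return math.degrees(math.acos(min(1.0, abs(dot) / norm)))

# Illustrative segments measured against a horizontal growth axis
axis = (1.0, 0.0)
segments = [(1.0, 0.1), (1.0, -0.05), (0.9, 0.2)]
angles = [angle_to_axis(v, axis) for v in segments]
mean_angle = sum(angles) / len(angles)
```

    A small mean angle, as here, is what "parallel to the growth axis" looks like numerically.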

  4. Incorporating assumption deviation risk in quantitative risk assessments: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Khorsandi, Jahon; Aven, Terje

    2017-01-01

    Quantitative risk assessments (QRAs) of complex engineering systems are based on numerous assumptions and expert judgments, as there is limited information available for supporting the analysis. In addition to sensitivity analyses, the concept of assumption deviation risk has been suggested as a means for explicitly considering the risk related to inaccuracies and deviations in the assumptions, which can significantly impact the results of the QRAs. However, challenges remain for its practical implementation, considering the number of assumptions and the magnitude of deviations to be considered. This paper presents an approach for integrating an assumption deviation risk analysis as part of QRAs. The approach begins with identifying the safety objectives which the QRA aims to support, and then identifies critical assumptions with respect to ensuring the objectives are met. Key issues addressed include the deviations required to violate the safety objectives, the uncertainties related to the occurrence of such events, and the strength of knowledge supporting the assessments. Three levels of assumptions are considered, which include assumptions related to the system's structural and operational characteristics, the effectiveness of the established barriers, as well as the consequence analysis process. The approach is illustrated for the case of an offshore installation. - Highlights: • An approach for assessing the risk of deviations in QRA assumptions is presented. • Critical deviations and uncertainties related to their occurrence are addressed. • The analysis promotes critical thinking about the foundation and results of QRAs. • The approach is illustrated for the case of an offshore installation.
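
    The three ingredients named above (the deviation needed to violate a safety objective, the uncertainty of its occurrence, and the strength of the supporting knowledge) lend themselves to a simple semi-quantitative ranking of assumptions. The scoring scheme and the example assumptions below are invented for illustration and are not the paper's actual scales:

```python
def assumption_risk_score(deviation_severity, deviation_likelihood, knowledge_strength):
    """Semi-quantitative score for one QRA assumption: severity of the
    deviation needed to violate a safety objective (1-5), judged
    likelihood of such a deviation (1-5), and strength of the supporting
    knowledge (1 = strong ... 3 = weak). Higher score = higher priority
    for scrutiny. Scales and weighting are illustrative only."""
    return deviation_severity * deviation_likelihood * knowledge_strength

# Hypothetical assumptions from an offshore-installation QRA
assumptions = {
    "barrier availability 99%": (4, 2, 3),
    "no simultaneous operations": (3, 2, 1),
    "leak detected within 60 s": (5, 3, 2),
}
ranked = sorted(assumptions,
                key=lambda a: assumption_risk_score(*assumptions[a]),
                reverse=True)
```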

  5. Identifying ELIXIR Core Data Resources.

    Science.gov (United States)

    Durinx, Christine; McEntyre, Jo; Appel, Ron; Apweiler, Rolf; Barlow, Mary; Blomberg, Niklas; Cook, Chuck; Gasteiger, Elisabeth; Kim, Jee-Hyub; Lopez, Rodrigo; Redaschi, Nicole; Stockinger, Heinz; Teixeira, Daniel; Valencia, Alfonso

    2016-01-01

    The core mission of ELIXIR is to build a stable and sustainable infrastructure for biological information across Europe. At the heart of this are the data resources, tools and services that ELIXIR offers to the life-sciences community, providing stable and sustainable access to biological data. ELIXIR aims to ensure that these resources are available long-term and that the life-cycles of these resources are managed such that they support the scientific needs of the life-sciences, including biological research. ELIXIR Core Data Resources are defined as a set of European data resources that are of fundamental importance to the wider life-science community and the long-term preservation of biological data. They are complete collections of generic value to life-science, are considered an authority in their field with respect to one or more characteristics, and show high levels of scientific quality and service. Thus, ELIXIR Core Data Resources are of wide applicability and usage. This paper describes the structures, governance and processes that support the identification and evaluation of ELIXIR Core Data Resources. It identifies key indicators which reflect the essence of the definition of an ELIXIR Core Data Resource and support the promotion of excellence in resource development and operation. It describes the specific indicators in more detail and explains their application within ELIXIR's sustainability strategy and science policy actions, and in capacity building, life-cycle management and technical actions. The identification process is currently being implemented and tested for the first time. The findings and outcome will be evaluated by the ELIXIR Scientific Advisory Board in March 2017. Establishing the portfolio of ELIXIR Core Data Resources and ELIXIR Services is a key priority for ELIXIR and publicly marks the transition towards a cohesive infrastructure.

  6. DIA-datasnooping and identifiability

    Science.gov (United States)

    Zaminpardaz, S.; Teunissen, P. J. G.

    2018-04-01

    In this contribution, we present and analyze datasnooping in the context of the DIA method. As the DIA method for the detection, identification and adaptation of mismodelling errors is concerned with both estimation and testing, it is the combination of the two that needs to be considered. This combination is rigorously captured by the DIA estimator. We discuss and analyze the DIA-datasnooping decision probabilities and the construction of the corresponding partitioning of misclosure space. We also investigate the circumstances under which two or more hypotheses are nonseparable in the identification step. By means of a theorem on the equivalence between the nonseparability of hypotheses and the inestimability of parameters, we demonstrate that one can forget about adapting the parameter vector for hypotheses that are nonseparable. However, as this concerns the complete vector and not necessarily functions of it, we also show that parameter functions may exist for which adaptation is still possible. It is shown what this adaptation looks like and how it changes the structure of the DIA estimator. To demonstrate the performance of the various elements of DIA-datasnooping, we apply the theory to some selected examples. We analyze how geometry changes in the measurement setup affect the testing procedure, by studying their partitioning of misclosure space, the decision probabilities and the minimal detectable and identifiable biases. The difference between these two minimal biases is highlighted by showing the difference between their corresponding contributing factors. We also show that if two alternative hypotheses, say Hi and Hj, are nonseparable, the testing procedure may have different levels of sensitivity to Hi-biases compared to the same Hj-biases.
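
    For the simplest model, repeated observations of a single unknown with equal weights and known standard deviation, the detection and identification steps of datasnooping reduce to Baarda's w-test on each observation. A minimal sketch (the observation values are invented; the paper treats the general linear model with correlated observations):

```python
import math

def w_statistics(observations, sigma):
    """Baarda w-test statistics for repeated observations of a single
    unknown with equal weights and known standard deviation sigma.
    Under the null hypothesis each w_i ~ N(0,1); identification picks
    the observation with the largest |w_i| as the outlier candidate.
    For this model the residual cofactor is sigma^2 * (1 - 1/n)."""
    n = len(observations)
    mean = sum(observations) / n
    denom = sigma * math.sqrt(1.0 - 1.0 / n)
    return [(mean - y) / denom for y in observations]

# Illustrative leveling-style data with one gross error in the fourth value
obs = [10.02, 9.98, 10.01, 10.60, 9.99, 10.00]
w = w_statistics(obs, sigma=0.02)
suspect = max(range(len(w)), key=lambda i: abs(w[i]))
```

    Note how the gross error inflates every residual via the mean; identification therefore compares the |w_i| against each other, not only against a fixed critical value.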

  7. Induced mutations for quantitative traits in rice

    International Nuclear Information System (INIS)

    Chakrabarti, B.N.

    1974-01-01

    The characteristics and frequency of micro-mutations induced in quantitative traits by radiation treatment, and the extent of the heterozygotic effects of different recessive chlorophyll-mutant genes on quantitative traits, are presented. Mutagenic treatments increased the variance of quantitative traits in all cases, although the magnitude of the increase varied depending on the treatment and the selection procedure adopted. The overall superiority of the chlorophyll-mutant heterozygotes over the corresponding wild homozygotes, as noted in two consecutive seasons, was not observed when these were grown at a high level of nitrogen fertiliser. (author)

  8. Quantitative determination of uranium by SIMS

    International Nuclear Information System (INIS)

    Kuruc, J.; Harvan, D.; Galanda, D.; Matel, L.; Aranyosiova, M.; Velic, D.

    2008-01-01

    The paper presents results of quantitative measurements of uranium-238 by secondary ion mass spectrometry (SIMS), with alpha spectrometry used as a complementary technique. Samples with a specific activity of uranium-238 were prepared by electrodeposition from an aqueous solution of UO2(NO3)2·6H2O. We attempted to apply SIMS to quantitative analysis, to search for a correlation between the intensity obtained from SIMS and the activity of uranium-238 as a function of the surface weight, and to explore the possibility of using SIMS in the quantitative analysis of environmental samples. The obtained results and correlations, as well as the results of measurements on two real samples, are presented in this paper. (authors)
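
    The search for a correlation between SIMS intensity and alpha-measured activity is, at its core, a paired-measurement correlation analysis. A minimal sketch with invented numbers (Pearson correlation in pure Python; the actual samples and count rates are those reported in the paper):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between paired measurements, e.g. SIMS U-238
    ion intensities and alpha-spectrometry activities of the same
    electrodeposited samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Illustrative paired measurements (counts/s vs. Bq); values are invented
intensity = [120.0, 250.0, 380.0, 500.0, 640.0]
activity = [0.51, 1.02, 1.55, 2.04, 2.60]
r = pearson_r(intensity, activity)
```

    A correlation close to 1 over a range of surface weights is what would justify using SIMS intensity as a quantitative proxy for activity.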

  9. A writer's guide to education scholarship: Quantitative methodologies for medical education research (part 1).

    Science.gov (United States)

    Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan

    2018-01-01

    Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 used to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. Discussion: This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
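
The screening step above relies on Cohen's kappa to measure inter-rater agreement between the two reviewers. A minimal sketch of the statistic for two raters' include/exclude decisions (the example lists in the test are hypothetical, not the review's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical decisions.

    Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e), where
    p_o is the observed agreement and p_e the agreement expected from
    each rater's marginal category frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)
```

Kappa of 1 means perfect agreement; 0 means agreement no better than chance given each reviewer's inclusion rate.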

  10. Phenotypic characterization of glioblastoma identified through shape descriptors

    Science.gov (United States)

    Chaddad, Ahmad; Desrosiers, Christian; Toews, Matthew

    2016-03-01

    This paper proposes quantitatively describing the shape of glioblastoma (GBM) tissue phenotypes as a set of shape features derived from segmentations, for the purposes of discriminating between GBM phenotypes and monitoring tumor progression. GBM patients were identified from the Cancer Genome Atlas, and quantitative MR imaging data were obtained from the Cancer Imaging Archive. Three GBM tissue phenotypes are considered, including necrosis, active tumor and edema/invasion. Volumetric tissue segmentations are obtained from registered T1-weighted (T1-WI) postcontrast and fluid-attenuated inversion recovery (FLAIR) MRI modalities. Shape features are computed from the respective tissue phenotype segmentations, and a Kruskal-Wallis test was employed to select features capable of classification with a significance level of p < 0.05. Several classifier models are employed to distinguish phenotypes, and a leave-one-out cross-validation was performed. Eight features were found statistically significant for classifying GBM phenotypes with p < 0.05; orientation was uninformative. Quantitative evaluations show that the SVM yields the highest classification accuracy of 87.50%, sensitivity of 94.59% and specificity of 92.77%. In summary, the shape descriptors proposed in this work show high performance in predicting GBM tissue phenotypes. They are thus closely linked to morphological characteristics of GBM phenotypes and could potentially be used in a computer-assisted labeling system.
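
The pipeline above (Kruskal-Wallis feature screening followed by leave-one-out classification) can be sketched with numpy alone. The nearest-centroid classifier below is a simple stand-in for the SVM actually used in the paper, and the tie handling in the rank computation is deliberately naive:

```python
import numpy as np

def kruskal_wallis_H(groups):
    """Kruskal-Wallis H statistic (no tie correction) for a list of 1-D
    samples; a large H suggests the groups differ in location, so the
    feature separates the phenotypes."""
    pooled = np.concatenate(groups)
    N = len(pooled)
    ranks = np.empty(N)
    ranks[np.argsort(pooled)] = np.arange(1, N + 1)  # 1-based ranks
    H, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        H += len(g) * (r.mean() - (N + 1) / 2.0) ** 2
        start += len(g)
    return 12.0 / (N * (N + 1)) * H

def loo_nearest_centroid(X, y):
    """Leave-one-out accuracy of a nearest-centroid classifier
    (standing in for the paper's SVM)."""
    X, y = np.asarray(X, float), np.asarray(y)
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        labels = np.unique(y[mask])
        centroids = np.array(
            [X[mask][y[mask] == c].mean(axis=0) for c in labels]
        )
        pred = labels[np.argmin(np.linalg.norm(centroids - X[i], axis=1))]
        correct += pred == y[i]
    return correct / len(y)
```

In practice one would keep only the features whose H exceeds the chi-squared critical value for p < 0.05 before training the classifier.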

  11. Preparing Tomorrow's Administrators: A Quantitative Correlation Study of the Relationship between Emotional Intelligence and Effective Leadership Practices

    Science.gov (United States)

    May-Vollmar, Kelly

    2017-01-01

    Purpose: The purpose of this quantitative correlation study was to identify whether there is a relationship between emotional intelligence and effective leadership practices, specifically with school administrators in Southern California K-12 public schools. Methods: This study was conducted using a quantitative descriptive design, correlation…

  12. 12 CFR 223.42 - What covered transactions are exempt from the quantitative limits, collateral requirements, and...

    Science.gov (United States)

    2010-01-01

    ...) Purchasing certain liquid assets. Purchasing an asset having a readily identifiable and publicly available... quantitative limits, collateral requirements, and low-quality asset prohibition? 223.42 Section 223.42 Banks... requirements, and low-quality asset prohibition? The following transactions are not subject to the quantitative...

  13. Quantitative traits in wheat (Triticum aestivum L

    African Journals Online (AJOL)

    MSS

    2012-11-13

    Nov 13, 2012 ... Of the quantitative traits in wheat, spike length, number of spikes per m2, grain mass per spike, number ... design with four liming variants along with three replications, in which the experimental field .... The sampling was done.

  14. Quantitative Fundus Autofluorescence in Recessive Stargardt Disease

    OpenAIRE

    Burke, Tomas R.; Duncker, Tobias; Woods, Russell L.; Greenberg, Jonathan P.; Zernant, Jana; Tsang, Stephen H.; Smith, R. Theodore; Allikmets, Rando; Sparrow, Janet R.; Delori, François C.

    2014-01-01

    Quantitative fundus autofluorescence (qAF) is significantly increased in Stargardt disease, consistent with previous reports of increased RPE lipofuscin. QAF will help to establish genotype-phenotype correlations and may serve as an outcome measure in clinical trials.

  15. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  16. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future prospects. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. A Quantitative Technique for Beginning Microscopists.

    Science.gov (United States)

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  18. Quantitative data extraction from transmission electron micrographs

    International Nuclear Information System (INIS)

    Sprague, J.A.

    1982-01-01

    The discussion will cover an overview of quantitative TEM, the digital image analysis process, coherent optical processing, and finally a summary of the author's views on potentially useful advances in TEM image processing

  19. Quantitative Ability as Correlates of Students' Academic ...

    African Journals Online (AJOL)

    Nekky Umera

    The introduction of quantitative topics into the secondary school economics curriculum has ... since the quality of education at any level is highly dependent on the quality and dedication of ...

  20. Laboratory technique for quantitative thermal emissivity ...

    Indian Academy of Sciences (India)

    Emission of radiation from a sample occurs due to thermal vibration of its ... Quantitative thermal emissivity measurements of geological samples ... spectral mixture modeling: A new analysis of rock and soil types at the Viking ...

  1. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  2. Qualitative vs. quantitative atopic dermatitis criteria

    DEFF Research Database (Denmark)

    Andersen, R M; Thyssen, J P; Maibach, H I

    2016-01-01

    This review summarizes historical aspects, clinical expression and pathophysiology leading to the coining of the terms atopy and atopic dermatitis, and current diagnostic criteria, and further explores the possibility of developing quantitative diagnostic criteria of atopic dermatitis (AD) based on the imp...

  3. Strategies for quantitation of phosphoproteomic data

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Thingholm, Tine Engberg

    2010-01-01

    Recent developments in phosphoproteomic sample-preparation techniques and sensitive mass spectrometry instrumentation have led to large-scale identifications of phosphoproteins and phosphorylation sites from highly complex samples. This has facilitated the implementation of different quantitation...

  4. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  5. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  6. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships...

  7. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  8. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores by chemical analysis techniques is a lengthy process. As nuclear techniques such as X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  9. Quantitative autoradiography of semiconductor base material

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1983-01-01

    Autoradiographic methods for the quantitative determination of elements of interest in semiconductor technology, and of their distribution in silicon, are described. Whereas the local concentration and distribution of phosphorus were determined with the aid of silver halide films, neutron-induced autoradiography was applied in the case of boron. Silicon disks containing diffused phosphorus or implanted or diffused boron were used as standard samples. Different possibilities for the quantitative evaluation of autoradiograms are considered and compared.

  10. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Full Text Available Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  11. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  12. Integration of hydrothermal-energy economics: related quantitative studies

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    A comparison of ten models for computing the cost of hydrothermal energy is presented. This comparison involved a detailed examination of a number of technical and economic parameters of the various quantitative models, with the objective of identifying the most important parameters for accurate estimates of the cost of hydrothermal energy. Important features of the various models, such as focus of study, applications, market sectors covered, methodology, input data requirements, and output, are compared in the document. A detailed sensitivity analysis of all the important engineering and economic parameters is carried out to determine the effect of non-consideration of individual parameters.

  13. Risk management and analysis: risk assessment (qualitative and quantitative)

    OpenAIRE

    Valentin Mazareanu

    2007-01-01

    Risk is commonly defined as the possibility of suffering a loss. Starting from this definition, risk management is a business process whose purpose is to ensure that the organization is protected against risks and their effects. In order to prioritize the identified risks, to develop a response plan and then to monitor them, we need to assess them. But at this point a question arises: should I choose a qualitative approach or a quantitative one? This paper will make a short overview over the risk eva...

  14. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  15. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
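
Foci quantitation of the kind described can be illustrated by a background-corrected integrated-intensity measurement around a focus. This is a generic sketch, not FociQuant's actual algorithm; the image in the test is synthetic and the window parameters are arbitrary:

```python
import numpy as np

def focus_intensity(img, center, radius=3, bg_width=2):
    """Background-corrected integrated intensity of one fluorescent focus.

    Sums the pixels within `radius` of `center` and subtracts the median
    of an annulus (radius .. radius + bg_width) as a local background
    estimate, a common scheme for foci measurements.
    """
    img = np.asarray(img, float)
    yy, xx = np.indices(img.shape)
    d = np.hypot(yy - center[0], xx - center[1])
    core = d <= radius
    annulus = (d > radius) & (d <= radius + bg_width)
    background = np.median(img[annulus])
    return img[core].sum() - background * core.sum()
```

Applied per cell across a plate of images, such a measurement yields the per-focus intensities that a high-throughput comparison (e.g. wild type versus mutant) would be run on.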

  16. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    Science.gov (United States)

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-01-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connectivity. From a descriptive standpoint, 3 neuronal types were found with respect to the number of dendritic stems arising from the neuronal soma: bipolar neurons (48%), multipolar neurons (45.5%) and monopolar neurons (6.5%). Within the multipolar type 2 subtypes could be distinguished, taking into account the number of dendritic spines: (a) with few spines (93%) and (b) very spiny (7%). These results indicate that the hedgehog SON is similar to that in other species except for the very spiny neurons, the significance of which is discussed. In order to characterise the main types more satisfactorily (bipolar and multipolars with few spines) we undertook a quantitative Golgi study of their dendritic fields. Although the patterns of the dendritic field are similar in both neuronal types, the differences in the location of their connectivity can reflect functional changes and alterations in relation to the synaptic afferences. PMID:1452481

  17. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    Science.gov (United States)

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-02-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connectivity. From a descriptive standpoint, 3 neuronal types were found with respect to the number of dendritic stems arising from the neuronal soma: bipolar neurons (48%), multipolar neurons (45.5%) and monopolar neurons (6.5%). Within the multipolar type 2 subtypes could be distinguished, taking into account the number of dendritic spines: (a) with few spines (93%) and (b) very spiny (7%). These results indicate that the hedgehog SON is similar to that in other species except for the very spiny neurons, the significance of which is discussed. In order to characterise the main types more satisfactorily (bipolar and multipolars with few spines) we undertook a quantitative Golgi study of their dendritic fields. Although the patterns of the dendritic field are similar in both neuronal types, the differences in the location of their connectivity can reflect functional changes and alterations in relation to the synaptic afferences.

  18. Research design: qualitative, quantitative and mixed methods approaches. Creswell John W. Sage, 320 pp, £29.00, ISBN 0761924426.

    Science.gov (United States)

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.

  19. Identification of redox-sensitive cysteines in the arabidopsis proteome using OxiTRAQ, a quantitative redox proteomics method

    KAUST Repository

    Liu, Pei; Zhang, Huoming; Wang, Hai; Xia, Yiji

    2014-01-01

    A high-throughput quantitative proteomic approach termed OxiTRAQ for identifying proteins whose thiols undergo reversible oxidative modifications in Arabidopsis cells subjected to oxidative stress. In this approach, a biotinylated thiol-reactive reagent is used for differential

  20. Using Spatial Semantics and Interactions to Identify Urban Functional Regions

    Directory of Open Access Journals (Sweden)

    Yandong Wang

    2018-03-01

    The spatial structures of cities have changed dramatically with rapid socio-economic development, in ways that are not well understood. To support urban structural analysis and rational planning, we propose a framework to identify urban functional regions and quantitatively explore the intensity of the interactions between them, thus increasing the understanding of urban structures. A method for the identification of functional regions via spatial semantics is proposed, which involves two steps: (1) the study area is classified into three types of functional regions using taxi origin/destination (O/D) flows; and (2) the spatial semantics for the three types of functional regions are demonstrated based on point-of-interest (POI) categories. To validate the existence of urban functional regions, we quantitatively explored the intensity of interactions between them. A case study using POI data and taxi trajectory data from Beijing validates the proposed framework. The results show that the proposed framework can be used to identify urban functional regions and promotes an enhanced understanding of urban structures.
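
The two-step framework above can be caricatured in a few lines: label each region by a functional type and then aggregate taxi O/D flows by those types to get inter-type interaction intensities. Note the paper classifies regions from the taxi flows themselves and uses POI categories for the semantics; the toy below simplifies this by labeling regions directly from dominant POI shares, and all numbers are hypothetical:

```python
import numpy as np

# Hypothetical inputs: a taxi O/D flow matrix between 4 regions, and each
# region's POI-category shares (residential, commercial, office).
flows = np.array([[0, 120, 30, 10],
                  [100, 0, 60, 20],
                  [25, 70, 0, 5],
                  [15, 30, 10, 0]])
poi_shares = np.array([[0.7, 0.2, 0.1],    # region 0: mostly residential
                       [0.1, 0.6, 0.3],    # region 1: mostly commercial
                       [0.2, 0.2, 0.6],    # region 2: mostly office
                       [0.8, 0.1, 0.1]])   # region 3: mostly residential

# Step 1: label each region by its dominant POI category (spatial semantics).
labels = poi_shares.argmax(axis=1)

# Step 2: interaction intensity between functional types, summing O/D flows
# over all region pairs with the corresponding type labels.
n_types = poi_shares.shape[1]
interaction = np.zeros((n_types, n_types))
for i in range(len(labels)):
    for j in range(len(labels)):
        interaction[labels[i], labels[j]] += flows[i, j]
```

The resulting type-by-type matrix is the kind of quantity used to validate that the identified functional regions interact in structurally meaningful ways.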

  1. Identifying influential directors in the United States corporate governance network

    Science.gov (United States)

    Huang, Xuqing; Vodenska, Irena; Wang, Fengzhong; Havlin, Shlomo; Stanley, H. Eugene

    2011-10-01

    The influence of directors has been one of the most engaging topics recently, but surprisingly little research has been done to quantitatively evaluate the influence and power of directors. We analyze the structure of the US corporate governance network for the 11-year period 1996-2006 based on director data from the Investor Responsibility Research Center director database, and we develop a centrality measure named the influence factor to estimate the influence of directors quantitatively. The US corporate governance network is a network of directors with nodes representing directors and links between two directors representing their service on common company boards. We assume that information flows in the network through information-sharing processes among linked directors. The influence factor assigned to a director is based on the level of information that a director obtains from the entire network. We find that, contrary to commonly accepted belief that directors of large companies, measured by market capitalization, are the most powerful, in some instances, the directors who are influential do not necessarily serve on boards of large companies. By applying our influence factor method to identify the influential people contained in the lists created by popular magazines such as Fortune, Networking World, and Treasury and Risk Management, we find that the influence factor method is consistently either the best or one of the two best methods in identifying powerful people compared to other general centrality measures that are used to denote the significance of a node in complex network theory.

  2. Identifying influential directors in the United States corporate governance network.

    Science.gov (United States)

    Huang, Xuqing; Vodenska, Irena; Wang, Fengzhong; Havlin, Shlomo; Stanley, H Eugene

    2011-10-01

    The influence of directors has been one of the most engaging topics recently, but surprisingly little research has been done to quantitatively evaluate the influence and power of directors. We analyze the structure of the US corporate governance network for the 11-year period 1996-2006 based on director data from the Investor Responsibility Research Center director database, and we develop a centrality measure named the influence factor to estimate the influence of directors quantitatively. The US corporate governance network is a network of directors with nodes representing directors and links between two directors representing their service on common company boards. We assume that information flows in the network through information-sharing processes among linked directors. The influence factor assigned to a director is based on the level of information that a director obtains from the entire network. We find that, contrary to commonly accepted belief that directors of large companies, measured by market capitalization, are the most powerful, in some instances, the directors who are influential do not necessarily serve on boards of large companies. By applying our influence factor method to identify the influential people contained in the lists created by popular magazines such as Fortune, Networking World, and Treasury and Risk Management, we find that the influence factor method is consistently either the best or one of the two best methods in identifying powerful people compared to other general centrality measures that are used to denote the significance of a node in complex network theory.
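
The influence factor above is defined through an information-sharing process whose exact form the abstract does not give. As a hedged stand-in, the following sketches a related standard measure, eigenvector centrality by power iteration, on a toy co-board adjacency matrix; it is not the paper's influence factor:

```python
import numpy as np

def eigenvector_centrality(adj, iters=200):
    """Eigenvector centrality via power iteration on a symmetric adjacency
    matrix (directors as nodes, shared board service as links). A node
    scores highly when it is linked to other high-scoring nodes."""
    adj = np.asarray(adj, float)
    v = np.ones(adj.shape[0])
    for _ in range(iters):
        v = adj @ v
        v /= np.linalg.norm(v)
    return v

# Toy network: director 0 sits on boards with everyone; directors 1 and 2
# also share one board with each other; director 3 links only to 0.
adj = np.array([[0, 1, 1, 1],
                [1, 0, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]])
scores = eigenvector_centrality(adj)
```

On this toy graph the best-connected director gets the top score and the peripheral one the lowest, which is the qualitative behavior any influence ranking of this kind should reproduce.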

  3. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described, and crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested, and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion-sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Deposits similar to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions.
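
The 'internal standard method' based on reference intensity ratios (RIRs) reduces to simple arithmetic once the RIRs are known: the weight fraction of a stone phase follows from the ratio of its diagnostic peak intensity to that of a spiked internal standard of known weight fraction. A minimal sketch (all numbers illustrative, not values from the study):

```python
def weight_fraction(i_phase, i_std, rir_phase, rir_std, x_std):
    """RIR-based internal-standard quantification:
    X_phase = (I_phase / I_std) * (RIR_std / RIR_phase) * X_std."""
    return (i_phase / i_std) * (rir_std / rir_phase) * x_std

# A phase whose peak is half as intense as the standard's, but whose
# RIR is also half the standard's, is present at the same weight
# fraction as the spiked standard (20%).
x = weight_fraction(i_phase=100.0, i_std=200.0,
                    rir_phase=2.0, rir_std=4.0, x_std=0.2)
```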

  4. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically through quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  5. Quantitative Radio-Cardiography with the Digital Autofluoroscope

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M. A.; Moussa-Mahmoud, L.; Blau, M. [Roswell Park Memorial Institute, Buffalo, NY (United States)

    1969-05-15

    The Digital Autofluoroscope was designed primarily to permit a quantitative evaluation of the rapid flow of short-lived radioisotopes through compartments within organs. To perform these studies, the instrument is operated in the dynamic mode. In this mode the patient is positioned in front of the detector, the radioactive material is administered, and the instrument automatically accumulates data in a magnetic core memory for a preset period of time varying from 30 milliseconds to 1 minute. At the end of the accumulation period, the stored information is dumped on computer-compatible digital magnetic tape, the memory is cleared, and a new accumulation cycle commences. Upon completion of a study, the tape is replayed and anatomical sites identified from the images of the distribution of the radioactive material. A memory flagging system is then used to obtain quantitative information on a regional basis. Radio-cardiograms are performed following the intravenous injection of a bolus of 10 millicuries of 99mTc, and rapid sequence recording of the cardiac inflow and outflow data is obtained at the rate of five frames per second. Upon completion of the study, the digital tape is played back and the locations of the four chambers of the heart are identified. The memory elements corresponding to each of these anatomical sites are then flagged, the data is replayed, and the inflow and outflow curves for each chamber are recorded separately. An EKG trigger device can be used to initiate every count-record cycle to permit the accumulation of data only during diastole. The resulting data is easier to interpret as changes in cardiac volume due to normal contractions are not recorded. This technique has been evaluated in 20 volunteers to establish normal values.
Over 50 patients with congenital and acquired heart disease have been studied, and the following parameters evaluated: (1) cardiac output, (2) pulmonary blood transit time, (3) pulmonary blood volume, and (4) the
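
The memory-flagging replay described above amounts to summing, frame by frame, the counts stored in the detector elements flagged for one anatomical site. A minimal sketch of that bookkeeping (the data layout is an assumption, not the instrument's actual tape format):

```python
def time_activity_curve(frames, flagged_elements):
    """Replay the stored frames and return one inflow/outflow curve:
    per frame, the total counts over the flagged memory elements."""
    return [sum(frame.get(e, 0) for e in flagged_elements)
            for frame in frames]

# Two 5-frames-per-second snapshots; keys are detector-element indices.
frames = [{(0, 0): 5, (0, 1): 2}, {(0, 0): 9, (0, 1): 1}]
curve = time_activity_curve(frames, {(0, 0)})  # -> [5, 9]
```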

  6. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits has been presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits and oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, out of which the first three explained 63% of the total variance. This helped in facilitating the choice of variables on the basis of which the genotypes’ clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield, and relatively short vegetative growth duration period, and those in cluster 9 combined moderate to low values for vegetative growth duration and moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods

  7. Quantitative microbiome profiling links gut community variation to microbial load.

    Science.gov (United States)

    Vandeputte, Doris; Kathagen, Gunter; D'hoe, Kevin; Vieira-Silva, Sara; Valles-Colomer, Mireia; Sabino, João; Wang, Jun; Tito, Raul Y; De Commer, Lindsey; Darzi, Youssef; Vermeire, Séverine; Falony, Gwen; Raes, Jeroen

    2017-11-23

    Current sequencing-based analyses of faecal microbiota quantify microbial taxa and metabolic pathways as fractions of the sample sequence library generated by each analysis. Although these relative approaches permit detection of disease-associated microbiome variation, they are limited in their ability to reveal the interplay between microbiota and host health. Comparative analyses of relative microbiome data cannot provide information about the extent or directionality of changes in taxa abundance or metabolic potential. If microbial load varies substantially between samples, relative profiling will hamper attempts to link microbiome features to quantitative data such as physiological parameters or metabolite concentrations. Saliently, relative approaches ignore the possibility that altered overall microbiota abundance itself could be a key identifier of a disease-associated ecosystem configuration. To enable genuine characterization of host-microbiota interactions, microbiome research must exchange ratios for counts. Here we build a workflow for the quantitative microbiome profiling of faecal material, through parallelization of amplicon sequencing and flow cytometric enumeration of microbial cells. We observe up to tenfold differences in the microbial loads of healthy individuals and relate this variation to enterotype differentiation. We show how microbial abundances underpin both microbiota variation between individuals and covariation with host phenotype. Quantitative profiling bypasses compositionality effects in the reconstruction of gut microbiota interaction networks and reveals that the taxonomic trade-off between Bacteroides and Prevotella is an artefact of relative microbiome analyses. Finally, we identify microbial load as a key driver of observed microbiota alterations in a cohort of patients with Crohn's disease, here associated with a low-cell-count Bacteroides enterotype (as defined through relative profiling).
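
The exchange of "ratios for counts" comes down to scaling each sample's compositional fractions by its flow-cytometry cell count. A minimal sketch (function name and numbers are illustrative):

```python
def quantitative_profile(relative_abundances, microbial_load):
    """Convert relative taxon fractions into absolute abundances by
    scaling with the flow-cytometry cell count for the sample."""
    total = sum(relative_abundances.values())
    return {taxon: microbial_load * frac / total
            for taxon, frac in relative_abundances.items()}

# Identical compositions, tenfold difference in microbial load:
# relative profiling sees no change, quantitative profiling does.
sample_a = quantitative_profile({"Bacteroides": 0.6, "Prevotella": 0.4}, 1e11)
sample_b = quantitative_profile({"Bacteroides": 0.6, "Prevotella": 0.4}, 1e10)
```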

  8. Advancing the Fork detector for quantitative spent nuclear fuel verification

    Science.gov (United States)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. 
The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms.
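
A quantitative go/no-go criterion of the kind described can be as simple as a relative tolerance on the deviation between the measured count rate and the rate predicted from the operator's declaration; the 15% threshold below is a placeholder, not the module's actual criterion:

```python
def verify_declaration(measured_rate, predicted_rate, rel_tolerance=0.15):
    """Go/no-go check: accept the declaration when the measured Fork
    signal agrees with the code-predicted rate within tolerance."""
    return abs(measured_rate - predicted_rate) / predicted_rate <= rel_tolerance

ok = verify_declaration(108.0, 100.0)   # 8% deviation -> go
bad = verify_declaration(130.0, 100.0)  # 30% deviation -> no-go
```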

  9. Characterization of early autophagy signaling by quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer Tg; Zarei, Mostafa; Sprenger, Adrian

    2014-01-01

    . To elucidate the regulation of early signaling events upon autophagy induction, we applied quantitative phosphoproteomics characterizing the temporal phosphorylation dynamics after starvation and rapamycin treatment. We obtained a comprehensive atlas of phosphorylation kinetics within the first 30 min upon...... revealing regulated phosphorylation sites on proteins involved in a wide range of cellular processes and an impact of the treatments on the kinome. To approach the potential function of the identified phosphorylation sites we performed a screen for MAP1LC3-interacting proteins and identified a group...... induction of autophagy with both treatments affecting widely different cellular processes. The identification of dynamic phosphorylation already after 2 min demonstrates that the earliest events in autophagy signaling occur rapidly after induction. The data was subjected to extensive bioinformatics analysis...

  10. Development and standardization of multiplexed antibody microarrays for use in quantitative proteomics

    Directory of Open Access Journals (Sweden)

    Sorette M

    2004-12-01

    Full Text Available Abstract Background Quantitative proteomics is an emerging field that encompasses multiplexed measurement of many known proteins in groups of experimental samples in order to identify differences between groups. Antibody arrays are a novel technology that is increasingly being used for quantitative proteomics studies due to highly multiplexed content, scalability, matrix flexibility and economy of sample consumption. Key applications of antibody arrays in quantitative proteomics studies are identification of novel diagnostic assays, biomarker discovery in trials of new drugs, and validation of qualitative proteomics discoveries. These applications require performance benchmarking, standardization and specification. Results Six dual-antibody, sandwich immunoassay arrays that measure 170 serum or plasma proteins were developed and experimental procedures refined in more than thirty quantitative proteomics studies. This report provides detailed information and specification for manufacture, qualification, assay automation, performance, assay validation and data processing for antibody arrays in large scale quantitative proteomics studies. Conclusion The present report describes development of first generation standards for antibody arrays in quantitative proteomics. Specifically, it describes the requirements of a comprehensive validation program to identify and minimize antibody cross reaction under highly multiplexed conditions; provides the rationale for the application of standardized statistical approaches to manage the data output of highly replicated assays; defines design requirements for controls to normalize sample replicate measurements; emphasizes the importance of stringent quality control testing of reagents and antibody microarrays; recommends the use of real-time monitors to evaluate sensitivity, dynamic range and platform precision; and presents survey procedures to reveal the significance of biomarker findings.
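
One of the listed design requirements, controls that normalize sample replicate measurements, can be sketched as rescaling every replicate array so its control-spot signal matches the across-array median (the array contents and analyte name are illustrative):

```python
import statistics

def normalize_arrays(arrays, control_key):
    """Rescale each replicate array so its control signal equals the
    median control signal across all arrays."""
    reference = statistics.median(a[control_key] for a in arrays)
    return [{k: v * reference / a[control_key] for k, v in a.items()}
            for a in arrays]

replicates = [{"control": 2.0, "IL6": 10.0},
              {"control": 4.0, "IL6": 10.0},
              {"control": 3.0, "IL6": 10.0}]
normalized = normalize_arrays(replicates, "control")
# Every array now reports the same control signal (the median, 3.0);
# the remaining IL6 spread reflects scaling differences, not biology.
```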

  11. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
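
The Methylation Index and binary cut-point described above are straightforward to compute from per-CpG pyrosequencing percentages; a minimal sketch (the per-site values are illustrative, not data from the study):

```python
def methylation_index(cpg_percentages):
    """Mean percent methylation across the CpG sites assayed for a gene."""
    return sum(cpg_percentages) / len(cpg_percentages)

def is_elevated(cpg_percentages, cutpoint=15.0):
    """Binary call using the >15% methylation cut-point."""
    return methylation_index(cpg_percentages) > cutpoint

tumor_dapk1 = [22.0, 35.0, 28.0, 31.0]   # four CpG sites, hypothetical
normal_dapk1 = [3.0, 6.0, 4.0, 5.0]
```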

  12. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
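
Of the morphometric tools mentioned, the hypsometric integral has a particularly compact estimator, (mean − min) / (max − min) of basin elevations; a minimal sketch over synthetic elevation samples:

```python
def hypsometric_integral(elevations):
    """Approximate hypsometric integral of a drainage basin from
    sampled elevations: (mean - min) / (max - min). Higher values
    indicate a more youthful, possibly tectonically active basin."""
    lo, hi = min(elevations), max(elevations)
    mean = sum(elevations) / len(elevations)
    return (mean - lo) / (hi - lo)

# Synthetic basins: one dominated by high ground, one by low ground.
uplifted = hypsometric_integral([200, 600, 700, 800, 900])
eroded = hypsometric_integral([200, 250, 300, 350, 900])
```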

  13. Quantitative Appearance Inspection for Film Coated Tablets.

    Science.gov (United States)

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  14. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  15. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Full Text Available Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question of whether psychology is ready to enter the "hard science club" as biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to the current methods.

  16. Quantitative proteomics of the tobacco pollen tube secretome identifies novel pollen tube guidance proteins important for fertilization

    Czech Academy of Sciences Publication Activity Database

    Hafidh, Said; Potěšil, D.; Fíla, Jan; Čapková, Věra; Zdráhal, Z.; Honys, David

    2016-01-01

    Roč. 17, MAY 3 (2016), č. článku 81. ISSN 1465-6906 R&D Projects: GA ČR GA15-22720S; GA ČR(CZ) GA14-32292S; GA ČR(CZ) GA15-16050S; GA MŠk(CZ) LD14109; GA MŠk(CZ) LM2015043; GA MŠk(CZ) ED1.1.00/02.0068; GA MŠk(CZ) LQ1601 Institutional support: RVO:61389030 Keywords : Protein secretion * Pollen tube guidance * Cell-cell signaling Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 11.313, year: 2015

  17. Quantitative High-Throughput Screening and Orthogonal Assays to Identify Modulators of the Vitamin D Receptor (SETAC)

    Science.gov (United States)

    The Vitamin D nuclear receptor (VDR) is a selective, ligand-inducible transcription factor involved in numerous biological processes such as cell proliferation, differentiation, detoxification, calcium homeostasis, neurodevelopment, immune system regulation, cardiovascular functi...

  18. Identifying Environmental Chemicals as Agonists of the Androgen Receptor by Applying a Quantitative High-throughput Screening Platform

    Science.gov (United States)

    Background: The androgen receptor (AR, NR3C4) is a nuclear receptor whose main function is acting as a transcription factor regulating gene expression for male sexual development and maintaining accessory sexual organ function. It is also a necessary component of female fertility...

  19. Quantitative imaging of epithelial cell scattering identifies specific inhibitors of cell motility and cell-cell dissociation

    NARCIS (Netherlands)

    Loerke, D.; le Duc, Q.; Blonk, I.; Kerstens, A.; Spanjaard, E.; Machacek, M.; Danuser, G.; de Rooij, J.

    2012-01-01

    The scattering of cultured epithelial cells in response to hepatocyte growth factor (HGF) is a model system that recapitulates key features of metastatic cell behavior in vitro, including disruption of cell-cell adhesions and induction of cell migration. We have developed image analysis tools that

  20. Quantitative Real-Time PCR Assays To Identify and Quantify Fecal Bifidobacterium Species in Infants Receiving a Prebiotic Infant Formula

    OpenAIRE

    Haarman, Monique; Knol, Jan

    2005-01-01

    A healthy intestinal microbiota is considered to be important for priming of the infants' mucosal and systemic immunity. Breast-fed infants typically have an intestinal microbiota dominated by different Bifidobacterium species. It has been described that allergic infants have different levels of specific Bifidobacterium species than healthy infants. For the accurate quantification of Bifidobacterium adolescentis, Bifidobacterium angulatum, Bifidobacterium bifidum, Bifidobacterium breve, Bifid...

  1. sxtA-Based Quantitative Molecular Assay To Identify Saxitoxin-Producing Harmful Algal Blooms in Marine Waters

    Science.gov (United States)

    Murray, Shauna A.; Wiese, Maria; Stüken, Anke; Brett, Steve; Kellmann, Ralf; Hallegraeff, Gustaaf; Neilan, Brett A.

    2011-01-01

    The recent identification of genes involved in the production of the potent neurotoxin and keystone metabolite saxitoxin (STX) in marine eukaryotic phytoplankton has allowed us for the first time to develop molecular genetic methods to investigate the chemical ecology of harmful algal blooms in situ. We present a novel method for detecting and quantifying the potential for STX production in marine environmental samples. Our assay detects a domain of the gene sxtA that encodes a unique enzyme putatively involved in the sxt pathway in marine dinoflagellates, sxtA4. A product of the correct size was recovered from nine strains of four species of STX-producing Alexandrium and Gymnodinium catenatum and was not detected in the non-STX-producing Alexandrium species, other dinoflagellate cultures, or an environmental sample that did not contain known STX-producing species. However, sxtA4 was also detected in the non-STX-producing strain of Alexandrium tamarense, Tasmanian ribotype. We investigated the copy number of sxtA4 in three strains of Alexandrium catenella and found it to be relatively constant among strains. Using our novel method, we detected and quantified sxtA4 in three environmental blooms of Alexandrium catenella that led to STX uptake in oysters. We conclude that this method shows promise as an accurate, fast, and cost-effective means of quantifying the potential for STX production in marine samples and will be useful for biological oceanographic research and harmful algal bloom monitoring. PMID:21841034
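
Quantifying sxtA4 in environmental samples by qPCR rests on a log-linear standard curve relating threshold cycle (Ct) to template copy number; the sketch below fits such a curve by least squares and inverts it for an unknown (the standards, slope, and intercept are illustrative, not the assay's calibration):

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate copies in an unknown."""
    return 10 ** ((ct - intercept) / slope)

# Dilution series of 1e3..1e5 copies; slope near -3.32 corresponds
# to ~100% amplification efficiency.
standards = [3.0, 4.0, 5.0]
cts = [38.0 - 3.32 * x for x in standards]
slope, intercept = fit_standard_curve(standards, cts)
estimate = copies_from_ct(24.72, slope, intercept)  # ~1e4 copies
```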

  3. Towards Quantitative Optical Cross Sections in Entomological Laser Radar - Potential of Temporal and Spherical Parameterizations for Identifying Atmospheric Fauna.

    Directory of Open Access Journals (Sweden)

    Mikkel Brydegaard

    Full Text Available In recent years, the field of remote sensing of birds and insects in the atmosphere (the aerial fauna has advanced considerably, and modern electro-optic methods now allow the assessment of the abundance and fluxes of pests and beneficials on a landscape scale. These techniques have the potential to significantly increase our understanding of, and ability to quantify and manage, the ecological environment. This paper presents a concept whereby laser radar observations of atmospheric fauna can be parameterized and table values for absolute cross sections can be catalogued to allow for the study of focal species such as disease vectors and pests. Wing-beat oscillations are parameterized with a discrete set of harmonics and the spherical scatter function is parameterized by a reduced set of symmetrical spherical harmonics. A first order spherical model for insect scatter is presented and supported experimentally, showing angular dependence of wing beat harmonic content. The presented method promises to give insights into the flight heading directions of species in the atmosphere and has the potential to shed light onto the km-range spread of pests and disease vectors.
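
Parameterizing wing-beat oscillations "with a discrete set of harmonics" amounts to evaluating the backscatter signal's spectrum at integer multiples of the fundamental wing-beat frequency. A minimal per-frequency DFT sketch over a synthetic signal (the frequencies and amplitudes are illustrative, not insect data):

```python
import cmath
import math

def harmonic_amplitudes(signal, f0, fs, n_harmonics=3):
    """Amplitude of the harmonics k*f0 of a sampled signal, via a
    single-frequency DFT (correlation with a complex exponential)."""
    n_samples = len(signal)
    amps = []
    for k in range(1, n_harmonics + 1):
        s = sum(x * cmath.exp(-2j * math.pi * k * f0 * n / fs)
                for n, x in enumerate(signal))
        amps.append(2.0 * abs(s) / n_samples)
    return amps

# 1 s of a 100 Hz wing beat with a 30% second harmonic, sampled at 2 kHz.
fs = 2000
signal = [math.cos(2 * math.pi * 100 * n / fs)
          + 0.3 * math.cos(2 * math.pi * 200 * n / fs)
          for n in range(fs)]
amps = harmonic_amplitudes(signal, f0=100, fs=fs)  # ~[1.0, 0.3, 0.0]
```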

  4. Quantitative and qualitative differences in celiac disease epitopes among durum wheat varieties identified through deep RNA-amplicon sequencing

    NARCIS (Netherlands)

    Salentijn, E.M.J.; Esselink, D.G.; Goryunova, S.V.; Meer, van der I.M.; Gilissen, L.J.W.J.; Smulders, M.J.M.

    2013-01-01

    Background - Wheat gluten is important for the industrial quality of bread wheat (Triticum aestivum L.) and durum wheat (T. turgidum L.). Gluten proteins are also the source of immunogenic peptides that can trigger a T cell reaction in celiac disease (CD) patients, leading to inflammatory responses

  5. Identifying policy target groups with qualitative and quantitative methods: the case of wildfire risk on nonindustrial private forest lands

    Science.gov (United States)

    A. Paige. Fischer

    2012-01-01

    Designing policies to harness the potential of heterogeneous target groups such as nonindustrial private forest owners to contribute to public policy goals can be challenging. The behaviors of such groups are shaped by their diverse motivations and circumstances. Segmenting heterogeneous target groups into more homogeneous subgroups may improve the chances of...

  6. Constructing high-density genetic maps for polyploid sugarcane (Saccharum spp.) and identifying quantitative trait loci controlling brown rust resistance

    Science.gov (United States)

    Sugarcane (Saccharum spp.) is an important economic crop for producing edible sugar and bioethanol. Brown rust has long been a major disease impacting sugarcane production worldwide. Resistance resources and markers linked to resistance are valuable tools for disease resistance improvement. An...

  7. Two statistics for evaluating parameter identifiability and error reduction

    Science.gov (United States)

    Doherty, John; Hunt, Randall J.

    2009-01-01

    Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero; and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics. © 2009 Elsevier B.V.
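The parameter identifiability statistic can be sketched directly from its definition: the direction cosine between a parameter's unit vector and its projection onto the solution space spanned by the leading right singular vectors of the weighted sensitivity matrix. The Jacobian and the rank-truncation rule below are synthetic illustrations, not the paper's data:

```python
import numpy as np

# Hypothetical weighted sensitivity matrix J: rows = observations, cols = parameters.
rng = np.random.default_rng(0)
J = rng.normal(size=(20, 5))
J[:, 4] *= 1e-8                  # make the last parameter nearly insensitive

# SVD of the weighted Jacobian; rows of Vt are right singular vectors.
U, s, Vt = np.linalg.svd(J, full_matrices=False)

# Illustrative truncation rule for the calibration solution space dimension.
k = np.sum(s > s[0] * 1e-6)
V_sol = Vt[:k, :]                # basis of the solution space (k x n_params)

# Identifiability of parameter i = direction cosine between unit vector e_i
# and its projection onto the solution space = length of that projection.
identifiability = np.sqrt((V_sol**2).sum(axis=0))
print(identifiability)           # values in [0, 1]; ~0 for the insensitive parameter
```

The insensitive fifth parameter lands almost entirely in the null space, so its identifiability is near zero, while the well-sensed parameters score near one.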

  8. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  9. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  10. Quantitative whole body scintigraphy - a simplified approach

    International Nuclear Information System (INIS)

    Marienhagen, J.; Maenner, P.; Bock, E.; Schoenberger, J.; Eilles, C.

    1996-01-01

    In this paper we present investigations on a simplified method of quantitative whole body scintigraphy using a dual head LFOV gamma camera and a calibration algorithm without the need for additional attenuation or scatter correction. Validation of this approach with an anthropomorphic phantom as well as in patient studies showed high accuracy in the quantification of whole body activity (102.8% and 97.72%, resp.); by contrast, organ activities were recovered with errors of up to 12%. The described method can easily be performed using commercially available software packages and is recommendable especially for quantitative whole body scintigraphy in a clinical setting. (orig.) [de

  11. Aspects of quantitative secondary ion mass spectrometry

    International Nuclear Information System (INIS)

    Grauer, R.

    1982-05-01

    Parameters which influence the formation of secondary ions during ion bombardment of a solid matrix are discussed. Quantitative SIMS analysis with the help of calibration standards necessitates stringent control of these parameters. This is particularly true of the oxygen partial pressure, which for metal analysis must be held constant even under ultra-high vacuum. The performance of the theoretical LTE model (Local Thermal Equilibrium) using internal standards is compared with analysis using external standards. The LTE model does not satisfy the requirements for quantitative analysis. (Auth.)

  12. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7

  13. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include reliability probability of detection--ultrasonics and eddy currents weldments closure effects in fatigue cracks technology transfer ultrasonic scattering theory acoustic emission ultrasonic scattering, reliability and penetrating radiation metal matrix composites ultrasonic scattering from near-surface flaws ultrasonic multiple scattering

  14. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibanez, Noelia; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp...

  15. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
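The measurement described above reduces to dividing each measured voltage difference by the known separation of the corresponding antenna pair. A minimal sketch for a one-dimensional array in a uniform field; the positions and voltages are illustrative assumptions, not values from the patent:

```python
# Antenna positions along one axis (m) and measured potentials (V);
# the values correspond to a hypothetical uniform field of ~5 V/m.
positions = [0.0, 0.1, 0.2, 0.3]
voltages  = [0.0, 0.5, 1.0, 1.5]

def field_estimates(positions, voltages):
    """Field estimate per adjacent pair = voltage difference / known separation."""
    return [(v2 - v1) / (x2 - x1)
            for (x1, v1), (x2, v2) in zip(zip(positions, voltages),
                                          zip(positions[1:], voltages[1:]))]

print(field_estimates(positions, voltages))   # approximately [5.0, 5.0, 5.0] V/m
```

The set of per-pair quantities over the region then describes the field, as in the abstract; in two or three dimensions the same division is applied along each array axis.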

  16. Identifiability and Identification of Trace Continuous Pollutant Source

    Directory of Open Access Journals (Sweden)

    Hongquan Qu

    2014-01-01

    Full Text Available Accidental pollution events often threaten people’s health and lives, and prompt identification of the pollutant source is necessary so that remedial actions can be taken. In this paper, a trace continuous pollutant source identification method is developed to identify a sudden continuous emission pollutant source in an enclosed space. The location probability model is set up first, and the identification is then realized by searching for the global optimum of the location probability. To discuss the identifiability performance of the presented method, the concept of a synergy degree of velocity fields is introduced to quantitatively analyze the impact of the velocity field on identification performance. Based on this concept, several simulation cases were conducted, and the application conditions of the method were obtained from the simulation studies. To verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The result showed that the method can identify a sudden trace continuous source when the studied situation satisfies the application conditions.

  17. Protein Correlation Profiles Identify Lipid Droplet Proteins with High Confidence*

    Science.gov (United States)

    Krahmer, Natalie; Hilger, Maximiliane; Kory, Nora; Wilfling, Florian; Stoehr, Gabriele; Mann, Matthias; Farese, Robert V.; Walther, Tobias C.

    2013-01-01

    Lipid droplets (LDs) are important organelles in energy metabolism and lipid storage. Their cores are composed of neutral lipids that form a hydrophobic phase and are surrounded by a phospholipid monolayer that harbors specific proteins. Most well-established LD proteins perform important functions, particularly in cellular lipid metabolism. Morphological studies show LDs in close proximity to and interacting with membrane-bound cellular organelles, including the endoplasmic reticulum, mitochondria, peroxisomes, and endosomes. Because of these close associations, it is difficult to purify LDs to homogeneity. Consequently, the confident identification of bona fide LD proteins via proteomics has been challenging. Here, we report a methodology for LD protein identification based on mass spectrometry and protein correlation profiles. Using LD purification and quantitative, high-resolution mass spectrometry, we identified LD proteins by correlating their purification profiles to those of known LD proteins. Application of the protein correlation profile strategy to LDs isolated from Drosophila S2 cells led to the identification of 111 LD proteins in a cellular LD fraction in which 1481 proteins were detected. LD localization was confirmed in a subset of identified proteins via microscopy of the expressed proteins, thereby validating the approach. Among the identified LD proteins were both well-characterized LD proteins and proteins not previously known to be localized to LDs. Our method provides a high-confidence LD proteome of Drosophila cells and a novel approach that can be applied to identify LD proteins of other cell types and tissues. PMID:23319140

  18. Nuclear techniques to identify allergenic metals in orthodontic brackets

    International Nuclear Information System (INIS)

    Zenobio, E.G.; Zenobio, M.A.F.; Menezes, M.A.B.C.

    2009-01-01

    The present study determines the elemental alloy composition of ten commercial brands of brackets, with particular attention to Ni, Cr, and Co, which are confirmed allergenic elements. The nuclear techniques applied in the analyses were X-ray fluorescence (XRF) - Centre National de la Recherche Scientifique, France (National Center of Scientific Research), X-ray energy spectrometry (XRES), and Instrumental Neutron Activation Analysis (INAA) - CDTN/CNEN, Brazil. The XRES and XRF techniques identified Cr in the 10 samples analyzed and Ni in eight samples. The INAA technique identified the presence of Cr (14% to 19%) and Co (42% to 2400 ppm) in all samples. The semi-quantitative analysis performed by XRF also identified Co in two samples. The techniques were effective in the identification of metals in orthodontic brackets. The elements identified in this study can be considered one of the main reasons for the allergic processes among the patients studied. This finding suggests that patients should be tested for allergy and allergenic sensitivity to metals prior to the prescription of an orthodontic device. (author)

  19. THE USEFULNESS OF USER TESTING METHODS IN IDENTIFYING PROBLEMS ON UNIVERSITY WEBSITES

    Directory of Open Access Journals (Sweden)

    Layla Hasan

    2014-10-01

    Full Text Available This paper aims to investigate the usefulness of three user testing methods (observation, and using both quantitative and qualitative data from a post-test questionnaire in terms of their ability or inability to find specific usability problems on university websites. The results showed that observation was the best method, compared to the other two, in identifying large numbers of major and minor usability problems on university websites. The results also showed that employing qualitative data from a post-test questionnaire was a useful complementary method since this identified additional usability problems that were not identified by the observation method. However, the results showed that the quantitative data from the post-test questionnaire were inaccurate and ineffective in terms of identifying usability problems on such websites.

  20. Establishment of Dimethyl Labeling-based Quantitative Acetylproteomics in Arabidopsis.

    Science.gov (United States)

    Liu, Shichang; Yu, Fengchao; Yang, Zhu; Wang, Tingliang; Xiong, Hairong; Chang, Caren; Yu, Weichuan; Li, Ning

    2018-05-01

    Protein acetylation, one of many types of post-translational modifications (PTMs), is involved in a variety of biological and cellular processes. In the present study, we applied both CsCl density gradient (CDG) centrifugation-based protein fractionation and a dimethyl-labeling-based 4C quantitative PTM proteomics workflow in the study of dynamic acetylproteomic changes in Arabidopsis. This workflow integrates the dimethyl chemical labeling with chromatography-based acetylpeptide separation and enrichment followed by mass spectrometry (MS) analysis, the extracted ion chromatogram (XIC) quantitation-based computational analysis of mass spectrometry data to measure dynamic changes of acetylpeptide level using an in-house software program, named Stable isotope-based Quantitation-Dimethyl labeling (SQUA-D), and finally the confirmation of ethylene hormone-regulated acetylation using immunoblot analysis. Eventually, using this proteomic approach, 7456 unambiguous acetylation sites were found from 2638 different acetylproteins, and 5250 acetylation sites, including 5233 sites on lysine side chain and 17 sites on protein N termini, were identified repetitively. Out of these repetitively discovered acetylation sites, 4228 sites on lysine side chain (i.e. 80.5%) are novel. These acetylproteins are exemplified by the histone superfamily, ribosomal and heat shock proteins, and proteins related to stress/stimulus responses and energy metabolism. The novel acetylproteins enriched by the CDG centrifugation fractionation contain many cellular trafficking proteins, membrane-bound receptors, and receptor-like kinases, which are mostly involved in brassinosteroid, light, gravity, and development signaling. In addition, we identified 12 highly conserved acetylation site motifs within histones, P-glycoproteins, actin depolymerizing factors, ATPases, transcription factors, and receptor-like kinases. 
Using SQUA-D software, we have quantified 33 ethylene hormone-enhanced and
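The XIC-based quantitation underlying this kind of stable-isotope analysis can be illustrated with a toy calculation: the light/heavy intensity ratio per replicate, averaged on a log2 scale. The intensities and the 2-fold "enhanced" threshold below are hypothetical, not values from the study:

```python
import math

# Hypothetical XIC areas for one acetylpeptide over three replicates:
# light = control channel, heavy = treated channel (dimethyl labels).
xic = {"light": [1.2e6, 1.1e6, 1.3e6], "heavy": [2.5e6, 2.4e6, 2.6e6]}

ratios = [h / l for l, h in zip(xic["light"], xic["heavy"])]
log2_ratios = [math.log2(r) for r in ratios]
mean_log2 = sum(log2_ratios) / len(log2_ratios)

# An illustrative call: "enhanced" if the mean log2 ratio exceeds 1 (2-fold).
print(round(mean_log2, 2), mean_log2 > 1.0)
```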

  1. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    Science.gov (United States)

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has increased exponentially. This quantitative research project used a pretest/posttest design and reviewed how an educational electronic documentation system helped nursing students to identify the accurate "related to" statement of the nursing diagnosis for the patient in the case…

  2. Bridging the Gap: Identifying Perceptions of Effective Teaching Methods for Age 50+ Baby Boomer Learners

    Science.gov (United States)

    Newberry, Sheila

    2013-01-01

    The purpose of this study was to identify effective teaching methods for age 50+ baby boomer learners. The study used a mixed methods research design. The qualitative paradigm used focus group sessions and the quantitative paradigm was completed through surveys. Fifteen age 50+ baby boomer learners and 11 faculty who teach them comprised the two…

  3. Identifying novel genes for atherosclerosis through mouse-human comparative genetics

    NARCIS (Netherlands)

    Wang, XS; Ishimori, N; Korstanje, R; Rollins, J; Paigen, B

    Susceptibility to atherosclerosis is determined by both environmental and genetic factors. Its genetic determinants have been studied by use of quantitative-trait-locus (QTL) analysis. So far, 21 atherosclerosis QTLs have been identified in the mouse: 7 in a high-fat-diet model only, 9 in a

  4. The Sampling Issues in Quantitative Research

    Science.gov (United States)

    Delice, Ali

    2010-01-01

    A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematic Education Discipline in 10…

  5. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  6. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  7. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed

  8. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp...

  9. Proteomic approaches for quantitative cancer cell signaling

    DEFF Research Database (Denmark)

    Voellmy, Franziska

    studies in an effort to contribute to the study of signaling dynamics in cancer systems. This thesis is divided into two parts. Part I begins with a brief introduction in the use of omics in systems cancer research with a focus on mass spectrometry as a means to quantitatively measure protein...

  10. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  11. Quantitative multiplex detection of pathogen biomarkers

    Energy Technology Data Exchange (ETDEWEB)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  12. Quantitative angiography after directional coronary atherectomy

    NARCIS (Netherlands)

    P.W.J.C. Serruys (Patrick); V.A.W.M. Umans (Victor); B.H. Strauss (Bradley); R-J. van Suylen (Robert-Jan); M.J.B.M. van den Brand (Marcel); H. Suryapranata (Harry); P.J. de Feyter (Pim); J.R.T.C. Roelandt (Jos)

    1991-01-01

    OBJECTIVE: To assess by quantitative analysis the immediate angiographic results of directional coronary atherectomy. To compare the effects of successful atherectomy with those of successful balloon dilatation in a series of patients with matched lesions. DESIGN: Case series.

  13. Deforestation since independence: A quantitative assessment of ...

    African Journals Online (AJOL)

    Deforestation since independence: A quantitative assessment of four decades of land-cover change in Malawi. ... pressure and demographic factors are important predictors of deforestation rate within our study area. Keywords: afforestation, Africa, deforestation, drivers, land-use change, reforestation, rural, urban ...

  14. Quantitative SPECT reconstruction of iodine-123 data

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1991-01-01

    Many clinical and research studies in nuclear medicine require quantitation of iodine-123 (123I) distribution for the determination of kinetics or localization. The objective of this study was to implement several reconstruction methods designed for single-photon emission computed tomography (SPECT) using 123I and to evaluate their performance in terms of quantitative accuracy, image artifacts, and noise. The methods consisted of four attenuation and scatter compensation schemes incorporated into both the filtered backprojection/Chang (FBP) and maximum likelihood-expectation maximization (ML-EM) reconstruction algorithms. The methods were evaluated on data acquired from a phantom containing a hot sphere of 123I activity in a lower-level background 123I distribution and nonuniform density media. For both reconstruction algorithms, nonuniform attenuation compensation combined with either scatter subtraction or Metz filtering produced images that were quantitatively accurate to within 15% of the true value. The ML-EM algorithm demonstrated quantitative accuracy comparable to FBP and smaller relative noise magnitude for all compensation schemes
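The Chang compensation mentioned in the abstract can be sketched for its simplest (first-order, uniform attenuation) case: each reconstructed pixel is rescaled by the reciprocal of the mean attenuation factor over projection angles. The circular phantom geometry and the attenuation coefficient below are illustrative assumptions, not the study's data:

```python
import math

# First-order Chang correction, uniform attenuation: multiply each pixel by
# 1 / mean_over_angles(exp(-mu * l)), where l is the path length from the
# pixel to the object boundary along each projection direction.
def chang_factor(x, y, radius, mu, n_angles=180):
    acc = 0.0
    for k in range(n_angles):
        theta = 2 * math.pi * k / n_angles
        dx, dy = math.cos(theta), math.sin(theta)
        # Distance from the interior point (x, y) to the circular boundary
        # along direction (dx, dy): positive root of |p + t*d|^2 = radius^2.
        b = x * dx + y * dy
        l = -b + math.sqrt(b * b + radius**2 - x * x - y * y)
        acc += math.exp(-mu * l)
    return 1.0 / (acc / n_angles)

# Center of a hypothetical 10 cm radius water phantom, mu ~ 0.15 /cm:
print(round(chang_factor(0.0, 0.0, 10.0, 0.15), 2))   # exp(1.5) ≈ 4.48
```

At the center every path length equals the radius, so the factor reduces to exp(mu * radius); off-center pixels get a different average, which is the point of the per-pixel correction.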

  15. Quantitative grading of store separation trajectories

    CSIR Research Space (South Africa)

    Jamison, Kevin A

    2017-09-01

    Full Text Available This paper describes the development of an automated analysis process and software that can run a multitude of separation scenarios. A key enabler for this software is the development of a quantitative grading algorithm that scores the outcome of each release...

  16. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  17. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied for the quantitative analysis through X-ray diffraction patterns of pigments and extenders mixtures, frequently used in paint industry. The results obtained have shown the usefulness of these methods, but still ask for improving their accuracy. (Author) [pt

  18. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  19. Quantitative multiplex detection of pathogen biomarkers

    Science.gov (United States)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  20. A quantitative lubricant test for deep drawing

    DEFF Research Database (Denmark)

    Olsson, David Dam; Bay, Niels; Andreasen, Jan L.

    2010-01-01

    A tribological test for deep drawing has been developed by which the performance of lubricants may be evaluated quantitatively measuring the maximum backstroke force on the punch owing to friction between tool and workpiece surface. The forming force is found not to give useful information...

  1. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  2. Quantitative muscle ultrasonography in amyotrophic lateral sclerosis.

    NARCIS (Netherlands)

    Arts, I.M.P.; Rooij, F.G. van; Overeem, S.; Pillen, S.; Janssen, H.M.; Schelhaas, H.J.; Zwarts, M.J.

    2008-01-01

    In this study, we examined whether quantitative muscle ultrasonography can detect structural muscle changes in early-stage amyotrophic lateral sclerosis (ALS). Bilateral transverse scans were made of five muscles or muscle groups (sternocleidomastoid, biceps brachii/brachialis, forearm flexor group,

  3. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Pieters, W.; Arnold, F.; Stoelinga, M.I.A.

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Therefore, penetration testing has thus far been used as a qualitative research method. To enable quantitative approaches to security risk management,

  4. QUANTITATIVE EXTRACTION OF MEIOFAUNA: A COMPARISON ...

    African Journals Online (AJOL)

    and A G DE WET. Department of Mathematical Statistics, University of Port Elizabeth. Accepted: May 1978. ABSTRACT. Two methods for the quantitative extraction of meiofauna from natural sandy sediments were investigated and compared: Cobb's decanting and sieving technique and the Oostenbrink elutriator. Both.

  5. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  6. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code in coarse network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  7. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
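
A minimal sketch of the arithmetic behind this kind of quantitation: assuming lesions are placed randomly (Poisson) along the molecules, lesion frequency can be estimated from the number-average molecular lengths of treated vs. control DNA populations measured from the gels. The function name and example lengths are illustrative, not from the paper:

```python
def lesions_per_mb(ln_treated_kb, ln_control_kb):
    """Estimate DNA lesion frequency from the number-average molecular
    lengths (in kb) of treated vs. control DNA, assuming random
    (Poisson) lesion placement: phi = 1/Ln(treated) - 1/Ln(control),
    converted here to lesions per megabase."""
    phi_per_kb = 1.0 / ln_treated_kb - 1.0 / ln_control_kb
    return phi_per_kb * 1000.0

# Illustrative values: treated DNA averages 48.5 kb, control 300 kb.
freq = lesions_per_mb(48.5, 300.0)
```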

  8. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  9. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  10. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  11. Quantitative and Qualitative Extensions of Event Structures

    NARCIS (Netherlands)

    Katoen, Joost P.

    1996-01-01

    An important application of formal methods is the specification, design, and analysis of functional aspects of (distributed) systems. Recently the study of quantitative aspects of such systems based on formal methods has come into focus. Several extensions of formal methods where the occurrence of

  12. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior

  13. Quantitative Penetration Testing with Item Response Theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle Ida Antoinette

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  14. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  15. Engaging Business Students in Quantitative Skills Development

    Science.gov (United States)

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  16. Leaderless Covert Networks : A Quantitative Approach

    NARCIS (Netherlands)

    Husslage, B.G.M.; Lindelauf, R.; Hamers, H.J.M.

    2012-01-01

    Abstract: Lindelauf et al. (2009a) introduced a quantitative approach to investigate optimal structures of covert networks. This approach used an objective function which is based on the secrecy versus information trade-off these organizations face. Sageman (2008) hypothesized that covert networks

  17. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m²) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA, and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent difference between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) had statistically significant differences with regard to quantitative parenchymal measures, including lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2). Reproducible quantitative measurements were obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE, indicating the potential for detecting and following renal disease at an earlier stage than the conventional qualitative imaging techniques.
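
The reproducibility metric quoted above (mean ± SD absolute percent difference between repeat scans) is simple to compute. The sketch below assumes the percent difference is taken relative to the mean of the two measurements, which may differ from the authors' exact definition; the measurement values are hypothetical:

```python
from statistics import mean, stdev

def absolute_percent_difference(x1, x2):
    """Scan-rescan reproducibility: absolute difference between two
    repeated measurements, expressed as a percent of their mean."""
    return abs(x1 - x2) / ((x1 + x2) / 2.0) * 100.0

# Hypothetical repeat measurements of one MR parameter across subjects:
scan1 = [23.5, 24.1, 22.8, 23.9]
scan2 = [24.2, 23.0, 23.5, 24.8]
apds = [absolute_percent_difference(a, b) for a, b in zip(scan1, scan2)]
summary = (mean(apds), stdev(apds))  # reported as mean ± SD, as in the abstract
```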

  18. Investigation of Shrinkage Defect in Castings by Quantitative Ishikawa Diagram

    Directory of Open Access Journals (Sweden)

    Chokkalingam B.

    2017-03-01

Full Text Available The metal casting process involves steps such as pattern making, moulding, and melting. Casting defects occur due to a combination of various process factors even though efforts are taken to control them. The first step in defect analysis is to identify the major casting defect among the many casting defects; the analysis is then made to find the root cause of that particular defect. Because it is especially difficult to identify the root causes of a defect, a systematic method is required to identify the root cause among the possible causes, so that specific remedial measures can be implemented to control them. This paper presents a systematic procedure to identify the root cause of shrinkage defect in an automobile body casting (SG 500/7) and control it by the application of a Pareto chart and an Ishikawa diagram with quantitative weightage. It was found that the root causes were larger volume section in the cope, insufficient feeding of the riser and insufficient poured metal in the riser. The necessary remedial measures were taken and castings were reproduced. The shrinkage defect in the castings was completely eliminated.
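
The Pareto step described above, ranking defect counts to isolate the major defect before root-cause analysis, can be sketched as follows (defect names and counts are hypothetical, not taken from the paper):

```python
def pareto_ranking(defect_counts):
    """Rank defects by frequency with cumulative percentages, to single
    out the 'vital few' defects that merit root-cause analysis."""
    total = sum(defect_counts.values())
    ranked = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)
    out, cum = [], 0
    for name, n in ranked:
        cum += n
        out.append((name, n, round(100.0 * cum / total, 1)))
    return out

# Hypothetical defect tallies from casting inspection records:
rows = pareto_ranking({"shrinkage": 46, "blowhole": 21,
                       "sand inclusion": 12, "misrun": 6})
```

In this synthetic tally, shrinkage alone accounts for over half of all defects, which is what would justify focusing the Ishikawa analysis on it.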

  19. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  20. NIH Researchers Identify OCD Risk Gene

    Science.gov (United States)

    ... News From NIH NIH Researchers Identify OCD Risk Gene Past Issues / Summer 2006 Table of Contents For ... and Alcoholism (NIAAA) have identified a previously unknown gene variant that doubles an individual's risk for obsessive- ...

  1. Ability of Slovakian Pupils to Identify Birds

    Science.gov (United States)

    Prokop, Pavol; Rodak, Rastislav

    2009-01-01

    A pupil's ability to identify common organisms is necessary for acquiring further knowledge of biology. We investigated how pupils were able to identify 25 bird species following their song, growth habits, or both features presented simultaneously. Just about 19% of birds were successfully identified by song, about 39% by growth habit, and 45% of…

  2. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis
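
One standard way to make common cause failure analysis quantitative (not necessarily the method of this paper) is a parametric model such as the beta-factor model, which splits a component's failure rate into an independent part and a part shared by all redundant components:

```python
def beta_factor_split(total_rate, beta):
    """Beta-factor model: split a component failure rate (e.g. per hour)
    into an independent part and a common-cause part shared by all
    redundant components; beta is the common-cause fraction."""
    return (1.0 - beta) * total_rate, beta * total_rate

# Illustrative numbers: 1e-3/h total failure rate, 10% common cause.
independent, common_cause = beta_factor_split(1e-3, 0.1)
```

The common-cause part is what defeats redundancy: a two-train system fails from common cause at roughly the `common_cause` rate no matter how many trains are added.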

  3. CBCL Pediatric Bipolar Disorder Profile and ADHD: Comorbidity and Quantitative Trait Loci Analysis

    Science.gov (United States)

    McGough, James J.; Loo, Sandra K.; McCracken, James T.; Dang, Jeffery; Clark, Shaunna; Nelson, Stanley F.; Smalley, Susan L.

    2008-01-01

    The pediatric bipolar disorder profile of the Child Behavior checklist is used to differentiate patterns of comorbidity and to search for quantitative trait loci in multiple affected ADHD sibling pairs. The CBCL-PBD profiling identified 8 percent of individuals with severe psychopathology and increased rates of oppositional defiant, conduct and…

  4. Workers and alate queens of Solenopsis geminata share qualitatively similar but quantitatively different venom alkaloid chemistry

    Science.gov (United States)

    The cis and trans alkaloids from body extracts of workers and alate queens of the tropical fire ant, Solenopsis geminata (Hymenoptera: Formicidae), were successfully separated by silica gel chromatography, identified, and quantitated by GC-MS analysis. Both workers and alate queens produce primarily...

  5. Complex and unstable simple elbow dislocations: a review and quantitative analysis of individual patient data

    NARCIS (Netherlands)

    de Haan, Jeroen; Schep, Niels; Tuinebreijer, Wim; den Hartog, Dennis

    2010-01-01

    The primary objective of this review of the literature with quantitative analysis of individual patient data was to identify the results of available treatments for complex elbow dislocations and unstable simple elbow dislocations. The secondary objective was to compare the results of patients with

  6. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  7. A quantitative approach to diagnosis and correction of organizational and programmatic issues

    International Nuclear Information System (INIS)

    Chiu, C.; Johnson, K.

    1997-01-01

An integrated approach to diagnosis and correction of critical Organizational and Programmatic (O and P) issues is summarized, as are the quantitative special evaluations that are used to confirm the O and P issues identified by the periodic common cause analysis and integrated safety assessments

  8. Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review

    Science.gov (United States)

    Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael

    2014-01-01

    Background: A systematic literature review was conducted on mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…

  9. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    Science.gov (United States)

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  10. A Quantitative Correlational Study of Teacher Preparation Program on Student Achievement

    Science.gov (United States)

    Dingman, Jacob Blackstone

    2010-01-01

    The purpose of this quantitative correlational study was to identify the relationship between the type of teacher preparation program and student performance on the seventh and eighth grade mathematics state assessments in rural school settings. The study included a survey of a convenience sample of 36 teachers from Colorado and Washington school…

  11. Quantitative trait loci analysis of osteocondrosis traits in the elbow joint of pigs

    DEFF Research Database (Denmark)

    Christensen, O F; Busch, M E; Gregersen, V R

    2010-01-01

    Osteochondrosis is a growth disorder in the cartilage of young animals and is characterised by lesions found in the cartilage and bone. This study identified quantitative trait loci (QTLs) associated with six osteochondrosis lesion traits in the elbow joint of finishing pigs. The traits were: thi...

  12. Investigation of the genetic association between quantitative measures of psychosis and schizophrenia

    DEFF Research Database (Denmark)

    Derks, Eske M; Vorstman, Jacob A S; Ripke, Stephan

    2012-01-01

    The presence of subclinical levels of psychosis in the general population may imply that schizophrenia is the extreme expression of more or less continuously distributed traits in the population. In a previous study, we identified five quantitative measures of schizophrenia (positive, negative, d...

  13. Annotation of loci from genome-wide association studies using tissue-specific quantitative interaction proteomics

    NARCIS (Netherlands)

Lundby, Alicia; Rossin, Elizabeth J.; Steffensen, Annette B.; Acha, Moshe Ray; Newton-Cheh, Christopher; Pfeufer, Arne; Lynch, Stacey N.; Olesen, Soren-Peter; Brunak, Soren; Ellinor, Patrick T.; Jukema, J. Wouter; Trompet, Stella; Ford, Ian; Macfarlane, Peter W.; Krijthe, Bouwe P.; Hofman, Albert; Uitterlinden, Andre G.; Stricker, Bruno H.; Nathoe, Hendrik M.; Spiering, Wilko; Daly, Mark J.; Asselbergs, Folkert W.; van der Harst, Pim; Milan, David J.; de Bakker, Paul I. W.; Lage, Kasper; Olsen, Jesper V.

    Genome-wide association studies (GWAS) have identified thousands of loci associated with complex traits, but it is challenging to pinpoint causal genes in these loci and to exploit subtle association signals. We used tissue-specific quantitative interaction proteomics to map a network of five genes

  14. Regional and Historical Minerageny of the Gemstone Complexes in Russia (Quantitative Aspects)

    Directory of Open Access Journals (Sweden)

    Polyanin V.S.

    2015-06-01

    Full Text Available The paper presents an approximate quantitative estimation of the mineragenic potentials of gems in the paleogeodynamic systems, complexes, and geological formations of Russia, which are characterized by different age and regional distribution. General trends of the changes in the scale and intensity of formation processes of the same gem deposits over geologic time were identified.

  15. Cracking anxiety in the mouse : a quantitative (epi)genetic approach

    NARCIS (Netherlands)

    Labots, M.

    2017-01-01

    The aim of this thesis was to improve existing methodologies and apply genetic strategies in order to identify (main-effect, epistatic, multiple and pleiotropic) quantitative trait loci and to decipher functional candidate genes for anxiety-related behavior and baseline blood plasma total

  16. Identification of quantitative trait loci for cadmium tolerance and accumulation in wheat

    DEFF Research Database (Denmark)

    Ci, Dunwei; Jiang, Dong; Li, Sishen

    2012-01-01

    Quantitative trait loci (QTL) for Cadmium (Cd) tolerance and accumulation in wheat (Triticum aestivum L.) were identified, using 103 recombinant inbred lines (RILs) derived from a cross of Ch×Sh at germination and seedling stages. The traits of germination, growth and physiology were measured. Cd...

  17. Buying Imported Products Online : A quantitative study about Chinese Online consumer behavior towards imported products

    OpenAIRE

    Chen, Qianqian; Wang, Yuren

    2015-01-01

    With the fast growing Chinese online marketplace and the increasing popularity of shopping imported products online in China, more and more practitioners and researchers are interested in understanding the cues that Chinese consumers use to evaluate imported products consumption online. Our quantitative study aims to identify what factors affect the behavior of Chinese online consumers towards imported products and the relationships between the identified factors and purchase intention, and t...

  18. Measurement parameter selection for quantitative isotope dilution gas chromatography/mass spectrometry

    International Nuclear Information System (INIS)

    Colby, B.N.; Rosecrance, A.E.; Colby, M.E.

    1981-01-01

    By use of the two-isotope model of isotope dilution, selection criteria were developed for identifying optimum m/z's for quantitation of compounds by gas chromatography/mass spectrometry. In addition, it was possible to predict the optimum ratio of naturally abundant to labeled compound and to identify appropriate data reduction methods. The validity of these predictions was confirmed by using experimental GC/MS data for several organic compounds
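
The routine quantitation arithmetic behind isotope dilution GC/MS can be sketched as below. This shows only the basic labeled-internal-standard calculation, not the m/z-selection and ratio-optimization criteria the paper develops; names and values are illustrative:

```python
def idms_concentration(area_native, area_labeled, spike_amount_ng, rrf=1.0):
    """Basic isotope-dilution quantitation: amount of native analyte
    from the ratio of its peak area at the chosen m/z to that of the
    co-injected isotopically labeled analogue, scaled by the spiked
    amount and a relative response factor (rrf)."""
    return spike_amount_ng * (area_native / area_labeled) / rrf

# Illustrative run: native peak 5000 counts, labeled peak 10000 counts,
# 100 ng of labeled compound spiked before extraction.
amount = idms_concentration(5000.0, 10000.0, 100.0)
```

Because the labeled analogue tracks the native compound through extraction and injection, losses cancel in the ratio; the m/z choice mainly controls how well the two signals are resolved from each other and from interferences.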

  19. Application of quantitative DTI metrics in sporadic CJD

    Directory of Open Access Journals (Sweden)

    E. Caverzasi

    2014-01-01

Full Text Available Diffusion Weighted Imaging is extremely important for the diagnosis of probable sporadic Jakob–Creutzfeldt disease, the most common human prion disease. Although visual assessment of DWI MRI is critical diagnostically, a more objective, quantifiable approach might identify the pattern of brain involvement more precisely. Furthermore, a quantitative, systematic tracking of MRI changes occurring over time might provide insights regarding the underlying histopathological mechanisms of human prion disease and provide information useful for clinical trials. The purposes of this study were: (1) to describe quantitatively the average cross-sectional pattern of reduced mean diffusivity, fractional anisotropy, atrophy and T1 relaxation in the gray matter (GM) in sporadic Jakob–Creutzfeldt disease, (2) to study changes in mean diffusivity and atrophy over time, and (3) to explore their relationship with clinical scales. Twenty-six sporadic Jakob–Creutzfeldt disease and nine control subjects had MRIs on the same scanner; seven sCJD subjects had a second scan after approximately two months. Cortical and subcortical gray matter regions were parcellated with Freesurfer. Average cortical thickness (or subcortical volume), T1 relaxation and mean diffusivity from co-registered diffusion maps were calculated in each region for each subject. Quantitatively, on cross-sectional analysis, certain brain regions were preferentially affected by reduced mean diffusivity (parietal and temporal lobes, posterior cingulate, thalamus and deep nuclei), but with relative sparing of the frontal and occipital lobes. Serial imaging surprisingly showed that mean diffusivity did not have a linear or unidirectional reduction over time, but tended to decrease initially and then reverse and increase towards normalization. Furthermore, there was a strong correlation between worsening of patient clinical function (based on modified Barthel score) and increasing mean diffusivity.

  20. Functional mapping imprinted quantitative trait loci underlying developmental characteristics

    Directory of Open Access Journals (Sweden)

    Li Gengxin

    2008-03-01

Full Text Available Abstract Background Genomic imprinting, a phenomenon referring to nonequivalent expression of alleles depending on their parental origins, has been widely observed in nature. It has been shown recently that the epigenetic modification of an imprinted gene can be detected through a genetic mapping approach. Such an approach is developed based on traditional quantitative trait loci (QTL) mapping focusing on single trait analysis. Recent studies have shown that most imprinted genes in mammals play an important role in controlling embryonic growth and post-natal development. For a developmental character such as growth, the current approach is less efficient in dissecting the dynamic genetic effect of imprinted genes during individual ontogeny. Results Functional mapping has been emerging as a powerful framework for mapping quantitative trait loci underlying complex traits showing developmental characteristics. To understand the genetic architecture of dynamic imprinted traits, we propose a mapping strategy by integrating the functional mapping approach with genomic imprinting. We demonstrate the approach through mapping imprinted QTL controlling growth trajectories in an inbred F2 population. The statistical behavior of the approach is shown through simulation studies, in which the parameters can be estimated with reasonable precision under different simulation scenarios. The utility of the approach is illustrated through real data analysis in an F2 family derived from LG/J and SM/J mouse strains. Three maternally imprinted QTLs are identified as regulating the growth trajectory of mouse body weight. Conclusion The functional iQTL mapping approach developed here provides a quantitative and testable framework for assessing the interplay between imprinted genes and a developmental process, and will have important implications for elucidating the genetic architecture of imprinted traits.
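
Functional mapping typically fits a parametric growth curve, commonly the logistic g(t) = a / (1 + b·e^(-rt)), to each QTL genotype class and tests whether the curve parameters differ between classes. A minimal numerical sketch with illustrative parameter values (not fitted to the paper's data):

```python
import math

def logistic_growth(t, a, b, r):
    """Logistic growth curve g(t) = a / (1 + b * exp(-r * t)): the kind
    of parametric trajectory fitted per QTL genotype class in
    functional mapping. Parameter values here are illustrative."""
    return a / (1.0 + b * math.exp(-r * t))

# A hypothetical body-weight trajectory sampled weekly:
weights = [logistic_growth(week, a=35.0, b=9.0, r=0.5) for week in range(11)]
```

Here a is the asymptotic weight, r the growth rate, and b sets the starting point g(0) = a/(1+b); an imprinted QTL would show up as different fitted (a, b, r) depending on the parental origin of the inherited allele.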

  1. Bringing quality and meaning to quantitative data - Bringing quantitative evidence to qualitative observation

    DEFF Research Database (Denmark)

    Karpatschof, Benny

    2007-01-01

    Based on the author's methodological theory defining the distinctive properties of quantitative and qualitative method the article demonstrates the possibilities and advantages of combining the two types of investigation in the same research project. The project being an effect study...

  2. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    Science.gov (United States)

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modifications changes within selected proteins. A quantitative proteomics approach gives a possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  3. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    Directory of Open Access Journals (Sweden)

    Agnieszka Gizak

    2016-01-01

Full Text Available Molecular and cellular biology methodology is traditionally based on the reasoning called “the mechanistic explanation”. In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems’ complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modifications changes within selected proteins. A quantitative proteomics approach gives a possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  4. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand alone control system risk reduction estimation tool to provide owners and operators of control systems with a more useable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a useable quantitative risk reduction estimation tool is not beyond reach.
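
The abstract's own definition of risk (probability of a successful attack times the value of the resulting loss) makes the cost-benefit comparison it calls for directly computable; the values below are illustrative:

```python
def risk(p_success, loss):
    """Risk as defined in the abstract: probability of a successful
    attack times the value of the resulting loss."""
    return p_success * loss

def risk_reduction(p_before, p_after, loss):
    """Expected benefit of a mitigation that lowers the attack success
    probability; compare this against the mitigation's cost."""
    return risk(p_before, loss) - risk(p_after, loss)

# Illustrative: a control that drops attack probability from 2% to 0.5%
# against a $1M loss is worth up to ~$15k per period.
benefit = risk_reduction(0.02, 0.005, 1_000_000)
```

The hard part, which the quoted work addresses with attack-tree and attack-graph engines, is estimating `p_success` credibly for a control system; the arithmetic above is only the final step.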

  5. Quantitative developmental transcriptomes of the Mediterranean sea urchin Paracentrotus lividus.

    Science.gov (United States)

    Gildor, Tsvia; Malik, Assaf; Sher, Noa; Avraham, Linor; Ben-Tabou de-Leon, Smadar

    2016-02-01

Embryonic development progresses through the timely activation of thousands of differentially activated genes. Quantitative developmental transcriptomes provide the means to relate global patterns of differentially expressed genes to the emerging body plans they generate. The sea urchin is one of the classic model systems for embryogenesis, and the models of its developmental gene regulatory networks are among the most comprehensive of their kind. Thus, the sea urchin embryo is an excellent system for studies of its global developmental transcriptional profiles. Here we produced quantitative developmental transcriptomes of the sea urchin Paracentrotus lividus (P. lividus) at seven developmental stages from the fertilized egg to prism stage. We generated a de-novo reference transcriptome and identified 29,817 genes that are expressed at this time period. We annotated and quantified gene expression at the different developmental stages and confirmed the reliability of the expression profiles by QPCR measurement of a subset of genes. The progression of embryo development is reflected in the observed global expression patterns and in our principal component analysis. Our study illuminates the rich patterns of gene expression that participate in sea urchin embryogenesis and provides an essential resource for further studies of the dynamic expression of P. lividus genes. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Quantitative assessment of videolaryngostroboscopic images in patients with glottic pathologies.

    Science.gov (United States)

    Niebudek-Bogusz, Ewa; Kopczynski, Bartosz; Strumillo, Pawel; Morawska, Joanna; Wiktorowicz, Justyna; Sliwinska-Kowalska, Mariola

    2017-07-01

    Digital imaging techniques enable exploration of novel visualization modalities of the vocal folds during phonation and definition of parameters, facilitating more precise diagnosis of voice disorders. Computer vision algorithms were applied to the analysis of videolaryngostroboscopic (VLS) images, aiming at qualitative and quantitative description of phonatory vibrations. VLS examinations were conducted for 45 females: 15 subjects with vocal nodules, 15 subjects with glottal incompetence, and 15 normophonic females. The recorded VLS images were preprocessed, the glottis area was segmented out, and the glottal cycles were identified. Glottovibrograms were built, and the glottal area waveforms (GAW) were then quantitatively described by computing the following parameters: open quotient (OQ), closing quotient (CQ), speed quotient (SQ), minimal relative glottal area (MRGA), and a new parameter termed closure difference index (CDI). Profiles of the glottal widths assessed along the glottal length differentiated the study groups (P < 0.05), supporting their use in clinical diagnostics. Results of the performed ROC curve analysis suggest that the evaluated parameters can distinguish patients with voice disorders from normophonic subjects.

  7. N reactor individual risk comparison to quantitative nuclear safety goals

    International Nuclear Information System (INIS)

    Wang, O.S.; Rainey, T.E.; Zentner, M.D.

    1990-01-01

    A full-scope level III probabilistic risk assessment (PRA) has been completed for N reactor, a US Department of Energy (DOE) production reactor located on the Hanford Reservation in the state of Washington. Sandia National Laboratories (SNL) provided the technical leadership for this work, using the state-of-the-art NUREG-1150 methodology developed for the US Nuclear Regulatory Commission (NRC). The main objectives of this effort were to assess the risks to the public and to on-site workers posed by the operation of N reactor, to identify changes to the plant that could reduce the overall risk, and to compare those risks to the proposed NRC and DOE quantitative safety goals. This paper presents the methodology adopted by Westinghouse Hanford Company (WHC) and SNL for individual health risk evaluation, its results, and a comparison to the NRC safety objectives and the DOE nuclear safety guidelines. The N reactor results are also compared with those of the five NUREG-1150 nuclear plants. Only internal events are compared here because external events are not yet reported in the current draft of NUREG-1150. This is the first full-scope level III PRA study with a detailed quantitative safety goal comparison performed for DOE production reactors.

  8. Quantitative Measures of Swallowing Deficits in Patients With Parkinson's Disease.

    Science.gov (United States)

    Ellerston, Julia K; Heller, Amanda C; Houtz, Daniel R; Kendall, Katherine A

    2016-05-01

    Dysphagia and associated aspiration pneumonia are commonly reported sequelae of Parkinson's disease (PD). Previous studies of swallowing in patients with PD have described prolonged pharyngeal transit time, delayed onset of pharyngeal transit, cricopharyngeal (CP) achalasia, reduced pharyngeal constriction, and slowed hyolaryngeal elevation. These studies, however, used inconsistent evaluation methodology, relied on qualitative analysis, and lacked a large control group, raising concerns about diagnostic precision. The purpose of this study was to investigate swallowing function in patients with PD using a norm-referenced, quantitative approach. This retrospective study includes 34 patients with a diagnosis of PD referred to a multidisciplinary voice and swallowing clinic. Modified barium swallow studies were performed using quantitative measures of pharyngeal transit time, hyoid displacement, CP sphincter opening, area of the pharynx at maximal constriction, and timing of laryngeal vestibule closure relative to bolus arrival at the CP sphincter. Reduced pharyngeal constriction was found in 30.4% of patients, and a delay in airway closure relative to arrival of the bolus at the CP sphincter was the most common abnormality, present in 62% of patients. Previously reported findings of prolonged pharyngeal transit, poor hyoid elevation, and CP achalasia were not identified as prominent features. © The Author(s) 2015.

  9. "Why am I a volunteer?": building a quantitative scale

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Cavalcante

    Full Text Available This paper aims to analyze the validity of a quantitative instrument to identify what attracts someone to volunteer work, as well as what makes them stay and what makes them quit such an activity. The theoretical framework lists aspects related to volunteer work, followed by a discussion of models for analyzing volunteer motivation. As to its objectives, this research is descriptive, since it presents the analysis of the validity of a quantitative instrument that seeks to understand and describe the reasons for volunteering at the Pastoral da Criança, a Brazilian NGO. The instrument is based on theoretical ideas by Souza, Medeiros and Fernandes (2006). Reliability (Cronbach's alpha) reached values between 0.7 and 0.8, and the Kaiser-Meyer-Olkin measure of sampling adequacy also yielded a good index: 0.74. Despite the good results for reliability and the sampling adequacy of the factor analysis, none of the variables resulted in the expected combination, namely indicators versus profile. It is necessary to improve the semantic meaning of certain factors, or even to increase the number of indicators, so as to generate additional correlations among them.
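
    The reliability statistic reported above (Cronbach's alpha) reduces to a short computation over the item-response matrix. A minimal sketch follows; the 4-item, 5-respondent Likert-style data are invented for illustration, not taken from the study.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
from statistics import variance

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item (columns)."""
    k = len(items)
    n = len(items[0])
    totals = [sum(col[r] for col in items) for r in range(n)]  # per-respondent sums
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# 4 items scored by 5 respondents (hypothetical data)
items = [[3, 4, 3, 5, 4],
         [2, 4, 3, 5, 5],
         [3, 5, 4, 5, 4],
         [2, 3, 3, 4, 4]]
print(round(cronbach_alpha(items), 3))
```

Values above roughly 0.7, as in the record, are conventionally read as acceptable internal consistency.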

  10. Can quantitative sensory testing predict responses to analgesic treatment?

    Science.gov (United States)

    Grosen, K; Fischer, I W D; Olesen, A E; Drewes, A M

    2013-10-01

    The role of quantitative sensory testing (QST) in the prediction of analgesic effect in humans has scarcely been investigated. This updated review assesses the effectiveness of QST in predicting analgesic effects in healthy volunteers, surgical patients and patients with chronic pain. A systematic review of English-language, peer-reviewed articles was conducted using PubMed and Embase (1980-2013). Additional studies were identified by chain searching. Search terms included 'quantitative sensory testing', 'sensory testing' and 'analgesics'. Studies on the relationship between QST and response to analgesic treatment in human adults were included. Appraisal of the methodological quality of the included studies was based on evaluative criteria for prognostic studies. Fourteen studies (including 720 individuals) met the inclusion criteria. Significant correlations were observed between responses to analgesics and several QST parameters, including (1) heat pain threshold in experimental human pain, (2) electrical and heat pain thresholds, pressure pain tolerance and suprathreshold heat pain in surgical patients, and (3) electrical and heat pain threshold and conditioned pain modulation in patients with chronic pain. Heterogeneity among studies was observed, especially with regard to the application of QST and the type and use of analgesics. Although promising, the current evidence is not sufficiently robust to recommend the use of any specific QST parameter in predicting analgesic response. Future studies should focus on a range of different experimental pain modalities rather than a single static pain stimulation paradigm. © 2013 European Federation of International Association for the Study of Pain Chapters.

  11. Effects of normalization on quantitative traits in association test

    Science.gov (United States)

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed, and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization was also investigated. Our results show that the rank-based transformation generally gives the best and most consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample sizes or genetic effects, the improvement in sensitivity for the rank transformation outweighs the slight increase in false positive rate. However, for large sample sizes and genetic effects, normalization may not be necessary, since the increase in sensitivity is relatively modest. PMID:20003414
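
    The rank-based transformation evaluated above is commonly implemented as a rank-based inverse normal transform: values are replaced by their ranks and mapped onto normal quantiles. A sketch follows, using the common Blom offset c = 3/8; the skewed trait values are invented, and this is one standard variant rather than necessarily the exact transform used in the study.

```python
# Rank-based inverse normal transform: rank r -> Phi^-1((r - c) / (n - 2c + 1)).
from statistics import NormalDist

def rank_inverse_normal(values, c=3/8):
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    ranks = [0.0] * n
    i = 0
    while i < n:                          # assign average ranks to tied runs
        j = i
        while j + 1 < n and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1             # 1-based average rank of the run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    nd = NormalDist()
    return [nd.inv_cdf((r - c) / (n - 2 * c + 1)) for r in ranks]

trait = [0.1, 0.4, 0.2, 8.5, 1.1, 0.3, 2.7]   # right-skewed, like many raw traits
z = rank_inverse_normal(trait)
# the transformed trait is symmetric around zero regardless of the input skew
print(round(sum(z), 6))
```

Because only ranks enter the transform, outliers such as the 8.5 above no longer dominate the association test, which is the behavior the simulations in the record reward.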

  12. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated the percentages of those areas relative to the lung area (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and the mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). For a given DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.
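
    The %CSA metric above is, at its core, a ratio: the summed cross-sectional area of vessels in a size band, expressed as a percentage of the analysed lung area. A minimal sketch, with made-up per-vessel areas and lung area standing in for the segmentation output:

```python
# %CSA: summed small-vessel cross-sectional area as a percentage of lung area.

def percent_csa(vessel_areas_mm2, lung_area_mm2, lo=0.0, hi=5.0):
    """Sum vessel cross-sections with lo <= area < hi, as % of the lung area."""
    csa = sum(a for a in vessel_areas_mm2 if lo <= a < hi)
    return 100.0 * csa / lung_area_mm2

vessels = [0.8, 1.2, 3.9, 4.5, 6.0, 7.5, 12.0]   # per-vessel areas in one slice (mm^2)
lung = 14000.0                                    # segmented lung area (mm^2)

print(round(percent_csa(vessels, lung), 4))              # %CSA for vessels < 5 mm^2
print(round(percent_csa(vessels, lung, 5.0, 10.0), 4))   # %CSA for 5-10 mm^2 vessels
```

In practice the vessel areas come from thresholding and connected-component analysis of the CT slice; the arithmetic after segmentation is as simple as shown.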

  13. Effects of normalization on quantitative traits in association test

    Directory of Open Access Journals (Sweden)

    Yap Von Bing

    2009-12-01

    Full Text Available Abstract Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed, and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization was also investigated. Our results show that the rank-based transformation generally gives the best and most consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample sizes or genetic effects, the improvement in sensitivity for the rank transformation outweighs the slight increase in false positive rate. However, for large sample sizes and genetic effects, normalization may not be necessary, since the increase in sensitivity is relatively modest.

  14. Qualitative and quantitative ultrasound assessment of gastric content.

    Science.gov (United States)

    Bisinotto, Flora Margarida Barra; Pansani, Patrícia Luísa; Silveira, Luciano Alves Matias da; Naves, Aline de Araújo; Peixoto, Ana Cristina Abdu; Lima, Hellen Moreira de; Martins, Laura Bisinotto

    2017-02-01

    Pulmonary aspiration of gastric contents is one of the most feared complications in anesthesia. Its prevention depends on preoperative fasting as well as on the identification of at-risk patients. A reliable diagnostic tool to assess gastric volume is currently lacking. The aim of this study, performed on volunteers, was to evaluate the feasibility of ultrasonography for qualitative and quantitative assessment of gastric content. A standardized gastric scanning protocol was applied to 67 healthy volunteers to assess the gastric antrum in four different situations: fasting and after ingesting clear fluid, milk, or a solid meal. Qualitative and quantitative assessment of the gastric content in the antrum was performed by a blinded sonographer. The antrum was classified as empty, or as containing clear fluid, thick fluid, or solids. Total gastric volume was predicted based on a cross-sectional area of the antrum. A p-value less than 0.05 was considered statistically significant. For each type of gastric content, the sonographic characteristics of the antrum and its content were described and illustrated. Qualitative sonographic assessment made it possible to distinguish an empty stomach from one containing different kinds of meal. The predicted gastric volume was significantly larger after the consumption of any food source compared to fasting. Bedside sonography can thus determine the nature of gastric content and estimate the difference between an empty gastric antrum and one containing food. Such information may be useful for estimating the risk of aspiration, particularly in situations when prandial status is unknown or uncertain.
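
    The "volume predicted from antral cross-sectional area" step above is typically a linear regression. As a sketch, here is the shape of such a model with coefficients in the style of Perlas et al.'s frequently cited equation; treat the coefficients as assumptions for illustration, since the study in this record fitted its own estimate.

```python
# Illustrative antral-CSA -> gastric-volume model (coefficients are assumptions,
# patterned after a published regression, not taken from this study).

def gastric_volume_ml(antral_csa_cm2, age_years):
    """Right-lateral-decubitus antral CSA (cm^2) -> estimated volume (mL)."""
    return 27.0 + 14.6 * antral_csa_cm2 - 1.28 * age_years

# hypothetical measurements for a 30-year-old volunteer
print(round(gastric_volume_ml(3.0, 30), 1))   # small fasting antrum
print(round(gastric_volume_ml(9.0, 30), 1))   # distended antrum after fluid intake
```

The point of the sketch is the workflow: trace the antrum, compute its cross-sectional area, and feed it to a fitted linear model; the specific coefficients must come from a validated regression.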

  15. Quantitative imaging features: extension of the oncology medical image database

    Science.gov (United States)

    Patel, M. N.; Looney, P. T.; Young, K. C.; Halling-Brown, M. D.

    2015-03-01

    Radiological imaging is fundamental within the healthcare industry and has become routinely adopted for diagnosis, disease monitoring and treatment planning. With the advent of digital imaging modalities and the rapid growth in both diagnostic and therapeutic imaging, the ability to harness this large influx of data is of paramount importance. The Oncology Medical Image Database (OMI-DB) was created to provide a centralized, fully annotated dataset for research. The database contains both processed and unprocessed images, associated data, and annotations, and, where applicable, expert-determined ground truths describing features of interest. Medical imaging provides the ability to detect and localize many changes that are important in determining whether a disease is present or a therapy is effective, by depicting alterations in anatomic, physiologic, biochemical or molecular processes. Quantitative imaging features are sensitive, specific, accurate and reproducible imaging measures of these changes. Here, we describe an extension to the OMI-DB whereby a range of imaging features and descriptors are pre-calculated using a high-throughput approach. The ability to calculate multiple imaging features and data from the acquired images is valuable and facilitates further research applications investigating detection, prognosis, and classification. The resultant data store contains more than 10 million quantitative features, as well as features derived from CAD predictions. These data can be used to build predictive models to aid image classification and treatment-response assessment, as well as to identify prognostic imaging biomarkers.

  16. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044) and with the "central" vs. "lateral" radiological designation (p = .035), and were marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.
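
    The final analysis step above, hierarchical clustering of normalized tumour centroids into two groups, can be sketched in pure Python. This is a generic single-linkage agglomeration over invented 2-D centroid coordinates, not the study's pipeline or data (the paper does not state its linkage criterion).

```python
# Single-linkage agglomerative clustering of centroids until two groups remain.
import math

def cluster_two_groups(points):
    """Repeatedly merge the two closest clusters (single linkage)."""
    clusters = [[i] for i in range(len(points))]
    d = lambda a, b: math.dist(points[a], points[b])
    while len(clusters) > 2:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                dist = min(d(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or dist < best[0]:
                    best = (dist, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)       # merge the closest pair
    return clusters

# hypothetical centroids already aligned to a common coordinate space
pts = [(0.1, 0.0), (0.15, 0.1), (0.05, -0.1),    # a "central"-like group
       (1.0, 0.9), (1.1, 1.0), (0.95, 1.1)]      # a "lateral"-like group
groups = [sorted(g) for g in cluster_two_groups(pts)]
print(sorted(groups))
```

Once group labels exist, they can be cross-tabulated against outcomes such as PFS, which is the association analysis the record reports.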

  17. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope based on a smartphone, using the transport-of-intensity equation method. The system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that could be adopted in the future for remote healthcare and medical diagnosis.
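
    The transport-of-intensity reconstruction named above can be sketched for the simplest case of uniform in-focus intensity, where the TIE reduces to a Poisson equation solved with FFTs. This is an illustrative minimal solver with a synthetic sinusoidal phase, not the app's implementation; wavelength, pixel size and defocus are assumed values.

```python
# Minimal TIE phase retrieval: k dI/dz = -I0 * Laplacian(phi) for uniform I0,
# solved spectrally. The synthetic phase and optics parameters are assumptions.
import numpy as np

def tie_phase(i_minus, i_plus, dz, wavelength, pixel, i0):
    """Recover phase from a +/- dz defocus pair under uniform intensity i0."""
    k = 2 * np.pi / wavelength
    didz = (i_plus - i_minus) / (2 * dz)          # central-difference dI/dz
    ny, nx = didz.shape
    qy = 2 * np.pi * np.fft.fftfreq(ny, d=pixel)
    qx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel)
    q2 = qy[:, None] ** 2 + qx[None, :] ** 2
    q2[0, 0] = 1.0                                # avoid divide-by-zero at DC
    rhs = -k * didz / i0                          # Laplacian(phi) = rhs
    phi_hat = np.fft.fft2(rhs) / (-q2)
    phi_hat[0, 0] = 0.0                           # the mean phase is unrecoverable
    return np.real(np.fft.ifft2(phi_hat))

# synthetic check: a sinusoidal phase and its weak-defocus intensity pair
n, pixel, wl, i0, dz = 64, 1e-6, 0.5e-6, 1.0, 5e-6
L = n * pixel
y, x = np.mgrid[0:n, 0:n] * pixel
phi = 0.3 * np.sin(2 * np.pi * x / L) * np.sin(2 * np.pi * y / L)
lap = -2 * (2 * np.pi / L) ** 2 * phi             # analytic Laplacian of phi
k = 2 * np.pi / wl
i_plus = i0 * (1 - dz * lap / k)                  # I(z) ~ I0(1 - z * lap / k)
i_minus = i0 * (1 + dz * lap / k)
rec = tie_phase(i_minus, i_plus, dz, wl, pixel, i0)
print(float(np.abs(rec - phi).max()) < 1e-6)
```

Real images have non-uniform intensity and noise, so practical TIE solvers add regularization and the full divergence form; the sketch only shows the core spectral inversion.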

  18. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interest in class discussion and to advance analytic skills, as well as to develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  19. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for the verification of declared nuclear material needs to be optimized in situations where both qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and the deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using the Analytic Hierarchy Process (AHP). It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
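
    The AHP quantification step mentioned above derives priority weights from a matrix of pairwise judgements. A toy illustration follows, using the common geometric-mean approximation to the principal eigenvector and Saaty's consistency ratio; the 3×3 judgements are invented, not taken from the paper.

```python
# AHP priorities from a pairwise comparison matrix (geometric-mean method),
# with a consistency-ratio check. Judgements use Saaty's 1-9 scale.
import math

def ahp_priorities(m):
    n = len(m)
    gm = [math.prod(row) ** (1 / n) for row in m]  # row geometric means
    s = sum(gm)
    w = [g / s for g in gm]                        # normalized priority weights
    # lambda_max estimated from A w ~ lambda w; random index RI = 0.58 for n = 3
    lam = sum(sum(m[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, ci / 0.58

# hypothetical pairwise judgements among three payoff factors
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_priorities(M)
print([round(x, 3) for x in w], round(cr, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the judgements are coherent enough to use the weights, e.g. as payoff values in the game-theoretic model.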

  20. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that the extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; simple phantom measurements can demonstrate how sensitive a CT system is to variations in the size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. An actual system therefore has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  1. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The supervision of credit institutions by state authorities is mostly identified with systemic risk prevention. At present, this mission is oriented toward analyzing the risk profile of credit institutions and the mechanisms and systems that provide bank management with the proper instruments to avoid and control specific banking risks. Rating systems are sophisticated measurement instruments capable of assuring these objectives, and thus success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can induce subjectivism and heterogeneity into the rating. The problem can be solved by using, complementarily, quantitative techniques such as DEA (Data Envelopment Analysis).

  2. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    A method for the quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain period of time after exercise to those immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial images, to correct for Tl-201 washout from the myocardium under the assumption that the washout is constant throughout the myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the counts in the region of interest in serial images after exercise to those immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
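
    The three indices defined above reduce to simple count ratios. A hedged sketch with invented ROI counts follows; the washout-corrected form of the redistribution factor is one plausible reading of the correction the abstract describes.

```python
# Tl-201 scintigraphy indices as count ratios (all counts invented).

def washout_factor(counts_delayed, counts_initial):
    """Fraction of the initial post-exercise counts remaining at the delayed scan."""
    return counts_delayed / counts_initial

def vitality_index(roi_counts, max_counts):
    """ROI uptake relative to the maximally perfused region."""
    return roi_counts / max_counts

def redistribution_factor(roi_delayed, roi_initial, washout):
    """Delayed/initial ROI ratio, corrected for global myocardial washout."""
    return (roi_delayed / roi_initial) / washout

# hypothetical counts: whole myocardium, plus one ischaemic ROI
w = washout_factor(counts_delayed=80_000, counts_initial=100_000)   # global washout
v = vitality_index(roi_counts=450, max_counts=900)
r = redistribution_factor(roi_delayed=400, roi_initial=420, washout=w)
print(w, v, round(r, 3))
```

In this toy example the ROI loses counts more slowly than the myocardium as a whole (r > 1), which is the pattern read as redistribution into an ischaemic area.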

  3. Nanostructured surfaces investigated by quantitative morphological studies

    International Nuclear Information System (INIS)

    Perani, Martina; Carapezzi, Stefania; Mutta, Geeta Rani; Cavalcoli, Daniela

    2016-01-01

    The morphology of different surfaces has been investigated by atomic force microscopy and quantitatively analyzed in this paper. Two different tools have been employed to this end: analysis of the height–height correlation function and determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiOxNy, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure can be of general use. Our results show that the quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from a comparison of the lateral correlation length and the grain size. (paper)
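
    The height–height correlation function used above is, for a 1-D profile, H(r) = ⟨(h(x+r) − h(x))²⟩; its saturation value gives twice the squared RMS roughness, and the lag at which it saturates gives the lateral correlation length. A sketch on a synthetic (uncorrelated) profile, not real AFM data:

```python
# 1-D height-height correlation function H(r) = mean of (h(x+r) - h(x))^2.
import random

def hhcf(h, max_lag):
    out = []
    for r in range(1, max_lag + 1):
        diffs = [(h[i + r] - h[i]) ** 2 for i in range(len(h) - r)]
        out.append(sum(diffs) / len(diffs))
    return out

random.seed(0)
# an uncorrelated rough profile: H(r) should saturate at 2*variance for all lags
h = [random.gauss(0.0, 1.5) for _ in range(20000)]
H = hhcf(h, max_lag=5)
var = sum(x * x for x in h) / len(h) - (sum(h) / len(h)) ** 2
print(all(abs(Hr - 2 * var) / (2 * var) < 0.1 for Hr in H))
```

On a correlated surface H(r) instead rises from small values before saturating, and fitting that rise yields the lateral correlation length that the paper compares against the mean grain size.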

  4. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1979-01-01

    Although the overall aim of radiobiology is to understand the biological effects of radiation, it also has the implied practical purpose of developing rational measures for the control of radiation exposure in man. The emphasis in this presentation is to show that the enormous effort expended over the years to develop quantitative dose-effect relationships in biochemical and cellular systems, animals, and human beings now seems to be paying off. The pieces appear to be falling into place, and a framework is evolving for utilizing these data. Specifically, quantitative risk assessments are discussed in terms of the cellular, animal, and human data on which they are based; their use in the development of radiation protection standards; and their present and potential impact and meaning in relation to the quantity dose equivalent and its special unit, the rem.

  5. Quantitative sputter profiling at surfaces and interfaces

    International Nuclear Information System (INIS)

    Kirschner, J.; Etzkorn, H.W.

    1981-01-01

    The key problem in quantitative sputter profiling, that of a sliding depth scale, has been solved by combined Auger/X-ray microanalysis. By means of this technique, and for the model system Ge/Si (amorphous), the following questions are treated quantitatively: the shape of the sputter profiles when sputtering through an interface and the origin of their asymmetry; the precise location of the interface plane on the depth profile; broadening effects due to the limited depth of information and their correction; the origin and amount of bombardment-induced broadening for different primary ions and energies; the depth dependence of the broadening; and the basic limits to depth resolution. Comparisons are made to recent theoretical calculations based on recoil mixing in the collision cascade, and very good agreement is found.

  6. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    Science.gov (United States)

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  7. Quantitative image processing in fluid mechanics

    Science.gov (United States)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  8. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique for analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have only qualitatively mapped whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task, and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession-averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  9. Quantitative indicators of fruit and vegetable consumption

    OpenAIRE

    Dagmar Kozelová; Dana Országhová; Milan Fiľa; Zuzana Čmiková

    2015-01-01

    The quantitative research of the market is often based on surveys and questionnaires which are finding out the behavior of customers in observed areas. Before purchasing process consumers consider where they will buy fruit and vegetables, what kind to choose and in what quantity of goods. Consumers' behavior is affected by the factors as: regional gastronomic traditions, price, product appearance, aroma, place of buying, own experience and knowledge, taste preferences as well as specific heal...

  10. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  11. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  12. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both without and with blood mixed in. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  13. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  14. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  15. Quantitative maps of groundwater resources in Africa

    International Nuclear Information System (INIS)

    MacDonald, A M; Bonsor, H C; Dochartaigh, B É Ó; Taylor, R G

    2012-01-01

    In Africa, groundwater is the major source of drinking water and its use for irrigation is forecast to increase substantially to combat growing food insecurity. Despite this, there is little quantitative information on groundwater resources in Africa, and groundwater storage is consequently omitted from assessments of freshwater availability. Here we present the first quantitative continent-wide maps of aquifer storage and potential borehole yields in Africa based on an extensive review of available maps, publications and data. We estimate total groundwater storage in Africa to be 0.66 million km³ (0.36–1.75 million km³). Not all of this groundwater storage is available for abstraction, but the estimated volume is more than 100 times estimates of annual renewable freshwater resources in Africa. Groundwater resources are unevenly distributed: the largest groundwater volumes are found in the large sedimentary aquifers in the North African countries Libya, Algeria, Egypt and Sudan. Nevertheless, for many African countries appropriately sited and constructed boreholes can support handpump abstraction (yields of 0.1–0.3 l s⁻¹), and contain sufficient storage to sustain abstraction through inter-annual variations in recharge. The maps show further that the potential for higher yielding boreholes (> 5 l s⁻¹) is much more limited. Therefore, strategies for increasing irrigation or supplying water to rapidly urbanizing cities that are predicated on the widespread drilling of high yielding boreholes are likely to be unsuccessful. As groundwater is the largest and most widely distributed store of freshwater in Africa, the quantitative maps are intended to lead to more realistic assessments of water security and water stress, and to promote a more quantitative approach to mapping of groundwater resources at national and regional level. (letter)

  16. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  17. Radioimmunoassay to quantitatively measure cell surface immunoglobulins

    International Nuclear Information System (INIS)

    Krishman, E.C.; Jewell, W.R.

    1975-01-01

    A radioimmunoassay technique developed to quantitatively measure immunoglobulins on the surface of cells is described. The amount of immunoglobulins found on different tumor cells varied from 200 to 1140 ng/10⁶ cells. Determination of immunoglobulins on the peripheral lymphocytes obtained from different cancer patients varied between 340 and 1040 ng/10⁶ cells. Cultured tumor cells, on the other hand, were found to contain negligible quantities of human IgG [pt

  18. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few untreated hairs with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, quantitative values of the concentration of zinc were derived; then the concentration of other elements was obtained by regarding zinc as an internal standard. As a result, the values of sulphur concentration for 40 samples agree well with the average value for a typical Japanese person and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was confirmed by comparing the results with those obtained by the usual internal standard method, too. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method
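    The internal-standard step described in this abstract reduces to a ratio of X-ray yields once the zinc concentration is fixed. A minimal sketch of that calculation; the yields, sensitivity factors, and zinc concentration below are hypothetical illustration values, not data from the paper:

```python
# Internal-standard quantification sketch: the concentration of element X
# follows from its sensitivity-corrected yield relative to that of zinc,
# whose concentration was fixed by the standard-free step.
# All numbers are invented for illustration.

def conc_via_internal_standard(yield_x, sens_x, yield_zn, sens_zn, conc_zn):
    """Concentration of element X relative to the zinc internal standard."""
    return (yield_x / sens_x) / (yield_zn / sens_zn) * conc_zn

# Hypothetical X-ray yields (counts) and relative sensitivity factors
conc_s = conc_via_internal_standard(yield_x=5400, sens_x=0.5,
                                    yield_zn=1200, sens_zn=1.0,
                                    conc_zn=150.0)  # zinc in ppm
print(conc_s)  # → 1350.0
```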

  19. Quantitative evaluation of dysphagia using scintigraphy

    International Nuclear Information System (INIS)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae

    1998-01-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration. We studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium at three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively. We calculated 7 quantitative parameters from scintigraphy. According to the videofluoroscopic findings, we divided patients into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration. The videofluoroscopy revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration could be reduced in all 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE were also improved after the position change. Aspiration and laryngeal penetration occurred more frequently in thin liquid swallowing than in thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed. We could decrease the chance of aspiration by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes

  20. Quantitative evaluation of dysphagia using scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae [College of Medicine, Dankook Univ., Cheonnon (Korea, Republic of)

    1998-08-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration. We studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium at three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively. We calculated 7 quantitative parameters from scintigraphy. According to the videofluoroscopic findings, we divided patients into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration. The videofluoroscopy revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration could be reduced in all 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE were also improved after the position change. Aspiration and laryngeal penetration occurred more frequently in thin liquid swallowing than in thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed. We could decrease the chance of aspiration by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes.
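    A transit-time parameter of the kind calculated in these scintigraphy studies can be derived from a region-of-interest time-activity curve. A simplified sketch: the threshold-based definition and the sample curve below are illustrative assumptions, not the authors' exact protocol.

```python
# Sketch: pharyngeal transit time (PTT) taken as the interval during which
# the pharyngeal ROI activity stays above a fraction of its peak.
# Threshold choice and data are invented for illustration.

def pharyngeal_transit_time(times, counts, threshold=0.1):
    """Time from activity first exceeding to last exceeding threshold*peak."""
    peak = max(counts)
    above = [t for t, c in zip(times, counts) if c > threshold * peak]
    return above[-1] - above[0]

# Hypothetical 0.1 s samples of pharyngeal ROI counts during one swallow
times = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
counts = [0, 5, 80, 100, 60, 8, 2, 0]
print(round(pharyngeal_transit_time(times, counts), 2))  # seconds
```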

  1. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  2. Rational quantitative safety goals: a summary

    International Nuclear Information System (INIS)

    Unwin, S.D.; Hayns, M.R.

    1984-08-01

    We introduce the notion of a Rational Quantitative Safety Goal. Such a goal reflects the imprecision and vagueness inherent in any reasonable notion of adequate safety and permits such vagueness to be incorporated into the formal regulatory decision-making process. A quantitative goal of the form "the parameter x, characterizing the safety level of the nuclear plant, shall not exceed the value x₀", for example, is of a non-rational nature in that it invokes a strict binary logic in which the parameter space underlying x is cut sharply into two portions: that containing those values of x that comply with the goal and that containing those that do not. Here, we utilize an alternative form of logic which, in accordance with any intuitively reasonable notion of safety, permits a smooth transition of a safety-determining parameter between the adequately safe and inadequately safe domains. Fuzzy set theory provides a suitable mathematical basis for the formulation of rational quantitative safety goals. The decision-making process proposed here is compatible with current risk assessment techniques and produces results in a transparent and useful format. Our methodology is illustrated with reference to the NUS Corporation risk assessment of the Limerick Generating Station

  3. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
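    One of the simplest normalization methods of the kind this review surveys is scaling each sample's metabolite intensities by its total signal, so that samples with different total amounts are compared on a common per-sample scale. A minimal sketch with a made-up intensity table:

```python
# "Sum normalization" sketch: divide every metabolite intensity in a sample
# by that sample's total intensity. The two samples below represent the same
# biology at different total amounts; values are invented for illustration.

def sum_normalize(samples):
    """Scale each sample (dict of metabolite -> intensity) to unit total."""
    out = []
    for s in samples:
        total = sum(s.values())
        out.append({m: v / total for m, v in s.items()})
    return out

raw = [{"ala": 200.0, "glc": 800.0},   # sample 1, total 1000
       {"ala": 100.0, "glc": 400.0}]   # sample 2, total 500 (half the material)
norm = sum_normalize(raw)
print(norm[0]["glc"], norm[1]["glc"])  # identical fractions after normalization
```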

  4. Quantitative trait loci and metabolic pathways

    Science.gov (United States)

    McMullen, M. D.; Byrne, P. F.; Snook, M. E.; Wiseman, B. R.; Lee, E. A.; Widstrom, N. W.; Coe, E. H.

    1998-01-01

    The interpretation of quantitative trait locus (QTL) studies is limited by the lack of information on metabolic pathways leading to most economic traits. Inferences about the roles of the underlying genes within a pathway or the nature of their interaction with other loci are generally not possible. An exception is resistance to the corn earworm Helicoverpa zea (Boddie) in maize (Zea mays L.) because of maysin, a C-glycosyl flavone synthesized in silks via a branch of the well characterized flavonoid pathway. Our results using flavone synthesis as a model QTL system indicate: (i) the importance of regulatory loci as QTLs, (ii) the importance of interconnecting biochemical pathways on product levels, (iii) evidence for "channeling" of intermediates, allowing independent synthesis of related compounds, (iv) the utility of QTL analysis in clarifying the role of specific genes in a biochemical pathway, and (v) identification of a previously unknown locus on chromosome 9S affecting flavone level. A greater understanding of the genetic basis of maysin synthesis and associated corn earworm resistance should lead to improved breeding strategies. More broadly, the insights gained in relating a defined genetic and biochemical pathway affecting a quantitative trait should enhance interpretation of the biological basis of variation for other quantitative traits. PMID:9482823

  5. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and further investigate strategies for English learning. In this paper, quantitative English learning strategies based on word networks and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for the unlearned words and the dynamic updating of the network are studied and analyzed. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and the segmented strategies outperform the other strategies. The results provide opportunities for researchers and practitioners to reconsider the way English is taught and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
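    A toy version of the weight-first idea described in this abstract: score each unlearned word by its usage frequency plus the support it receives from already-learned neighbours in the word network, then learn the highest-scoring word next. The scoring rule and the tiny graph are invented for illustration, not the paper's actual weighting scheme.

```python
# Network-guided vocabulary ordering sketch: an unlearned word's weight is
# its frequency plus one point per edge to an already-learned word.
# Weights and graph are hypothetical.

def next_word(freq, edges, learned):
    def weight(w):
        support = sum(1 for a, b in edges
                      if w in (a, b) and (a in learned or b in learned))
        return freq[w] + support
    candidates = [w for w in freq if w not in learned]
    return max(candidates, key=weight)

freq = {"the": 9, "network": 3, "word": 4, "quantum": 1}
edges = [("the", "word"), ("word", "network"), ("network", "quantum")]
learned = {"the"}
print(next_word(freq, edges, learned))  # "word": freq 4 + 1 learned neighbour
```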

  6. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Full Text Available Abstract Background Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software. This process has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness. They also included patient specific spreadsheets for tracking of target populations and for evaluation of the impact of interventions. Conclusions The study demonstrated that quantitative tools including the development of definitions of readmissions, estimation of the risk of readmission, and patient specific spreadsheets could contribute to the improvement of patient outcomes in hospitals.
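    The risk-estimation tooling this study describes (readmission risk by age, discharge status, and severity, plus flagging of target populations) can be sketched as a small stratified rate table. The strata, counts, and threshold below are hypothetical, not figures from the study.

```python
# Readmission-risk tooling sketch: estimate a readmission rate per stratum
# from historical counts, then flag strata above a review threshold.
# All figures are invented for illustration.

def risk_table(history):
    """history: dict stratum -> (readmissions, discharges)."""
    return {stratum: r / d for stratum, (r, d) in history.items()}

history = {("65+", "major"): (30, 100),
           ("65+", "minor"): (8, 160),
           ("<65", "major"): (12, 120)}
risks = risk_table(history)
flagged = [s for s, r in risks.items() if r >= 0.25]
print(flagged)  # strata whose readmission rate warrants tracking
```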

  7. Some exercises in quantitative NMR imaging

    International Nuclear Information System (INIS)

    Bakker, C.J.G.

    1985-01-01

    The articles represented in this thesis result from a series of investigations that evaluate the potential of NMR imaging as a quantitative research tool. In the first article the possible use of the proton spin-lattice relaxation time T₁ in tissue characterization, tumor recognition and monitoring tissue response to radiotherapy is explored. The next article addresses the question whether water proton spin-lattice relaxation curves of biological tissues are adequately described by a single time constant T₁, and analyzes the implications of multi-exponentiality for quantitative NMR imaging. In the third article the use of NMR imaging as a quantitative research tool is discussed on the basis of phantom experiments. The fourth article describes a method which enables unambiguous retrieval of sign information in a set of magnetic resonance images of the inversion recovery type. The next article shows how this method can be adapted to allow accurate calculation of T₁ pictures on a pixel-by-pixel basis. The sixth article, finally, describes a simulation procedure which enables a straightforward determination of NMR imaging pulse sequence parameters for optimal tissue contrast. (orig.)
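    The pixel-wise T1 calculation mentioned in this thesis abstract fits each pixel's inversion-recovery signal to the standard model S(TI) = 1 − 2·exp(−TI/T1). A simplified sketch: a grid search stands in for a proper least-squares fit, and the signal is synthesized rather than measured.

```python
# Pixel-wise T1 estimation sketch for inversion-recovery data, assuming the
# idealized signed signal model S(TI) = 1 - 2*exp(-TI/T1) (sign already
# restored). Grid search replaces a real fitting routine; data is synthetic.

import math

def fit_t1(tis, signal, t1_grid):
    """Least-squares grid search for T1 in S(TI) = 1 - 2*exp(-TI/T1)."""
    def err(t1):
        return sum((s - (1 - 2 * math.exp(-ti / t1))) ** 2
                   for ti, s in zip(tis, signal))
    return min(t1_grid, key=err)

tis = [100, 300, 600, 1200, 2500]            # inversion times, ms
true_t1 = 800.0
signal = [1 - 2 * math.exp(-ti / true_t1) for ti in tis]
grid = list(range(200, 2001, 50))            # candidate T1 values, ms
print(fit_t1(tis, signal, grid))  # recovers 800
```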

  8. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. A computer technique had been developed using MATLAB (Version 6.1) based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of fibroglandular tissue area divided by the total breast area in the mammogram. This technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I-V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r² = 0.84, p < 0.05). 71.3% of the subjective classifications were correctly reproduced using the computer technique. We have developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of large datasets of mammograms to predict breast cancer risk based on density. Furthermore it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
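    The density definition in this abstract (percent fibroglandular area over total breast area) reduces to a pixel count once the segmentation masks exist. A minimal sketch, with tiny made-up binary masks standing in for a segmented mammogram:

```python
# Breast-density sketch: given a binary mask of the breast region and a
# binary mask of fibroglandular pixels, density is the percentage of breast
# pixels that are fibroglandular. Masks are toy arrays for illustration.

def breast_density(breast_mask, fibro_mask):
    """Percent of breast-region pixels labelled fibroglandular."""
    breast_px = sum(sum(row) for row in breast_mask)
    fibro_px = sum(sum(row) for row in fibro_mask)
    return 100.0 * fibro_px / breast_px

breast = [[1, 1, 1, 1],
          [1, 1, 1, 1]]      # 8 pixels of breast region
fibro  = [[0, 1, 1, 0],
          [0, 0, 1, 0]]      # 3 fibroglandular pixels
print(breast_density(breast, fibro))  # → 37.5
```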

  9. Network analysis of quantitative proteomics on asthmatic bronchi: effects of inhaled glucocorticoid treatment

    Directory of Open Access Journals (Sweden)

    Sihlbom Carina

    2011-09-01

    Full Text Available Abstract Background Proteomic studies of respiratory disorders have the potential to identify protein biomarkers for diagnosis and disease monitoring. Utilisation of sensitive quantitative proteomic methods creates opportunities to determine individual patient proteomes. The aim of the current study was to determine if quantitative proteomics of bronchial biopsies from asthmatics can distinguish relevant biological functions and whether inhaled glucocorticoid treatment affects these functions. Methods Endobronchial biopsies were taken from untreated asthmatic patients (n = 12 and healthy controls (n = 3. Asthmatic patients were randomised to double blind treatment with either placebo or budesonide (800 μg daily for 3 months and new biopsies were obtained. Proteins extracted from the biopsies were digested and analysed using isobaric tags for relative and absolute quantitation combined with a nanoLC-LTQ Orbitrap mass spectrometer. Spectra obtained were used to identify and quantify proteins. Pathways analysis was performed using Ingenuity Pathway Analysis to identify significant biological pathways in asthma and determine how the expression of these pathways was changed by treatment. Results More than 1800 proteins were identified and quantified in the bronchial biopsies of subjects. The pathway analysis revealed acute phase response signalling, cell-to-cell signalling and tissue development associations with proteins expressed in asthmatics compared to controls. The functions and pathways associated with placebo and budesonide treatment showed distinct differences, including the decreased association with acute phase proteins as a result of budesonide treatment compared to placebo. Conclusions Proteomic analysis of bronchial biopsy material can be used to identify and quantify proteins using highly sensitive technologies, without the need for pooling of samples from several patients. Distinct pathophysiological features of asthma can be

  10. De-identifying an EHR Database

    DEFF Research Database (Denmark)

    Lauesen, Søren; Pantazos, Kostas; Lippert, Søren

    2011-01-01

    We de-identified a Danish EHR database with 437,164 patients. The goal was to generate a version with real medical records, but related to artificial persons. We developed a de-identification algorithm that uses lists of named entities, simple language analysis, and special rules. Our algorithm consists of 3 steps: collect...... lists of identifiers from the database and external resources, define a replacement for each identifier, and replace identifiers in structured data and free text. Some patient records could not be safely de-identified, so the de-identified database has 323,122 patient records with an acceptable degree...... of anonymity, readability and correctness (F-measure of 95%). The algorithm has to be adjusted for each culture, language and database....

  11. A Resource of Quantitative Functional Annotation for Homo sapiens Genes.

    Science.gov (United States)

    Taşan, Murat; Drabkin, Harold J; Beaver, John E; Chua, Hon Nian; Dunham, Julie; Tian, Weidong; Blake, Judith A; Roth, Frederick P

    2012-02-01

    The body of human genomic and proteomic evidence continues to grow at ever-increasing rates, while annotation efforts struggle to keep pace. A surprisingly small fraction of human genes have clear, documented associations with specific functions, and new functions continue to be found for characterized genes. Here we assembled an integrated collection of diverse genomic and proteomic data for 21,341 human genes and make quantitative associations of each to 4333 Gene Ontology terms. We combined guilt-by-profiling and guilt-by-association approaches to exploit features unique to the data types. Performance was evaluated by cross-validation, prospective validation, and by manual evaluation with the biological literature. Functional-linkage networks were also constructed, and their utility was demonstrated by identifying candidate genes related to a glioma FLN using a seed network from genome-wide association studies. Our annotations are presented, alongside existing validated annotations, in a publicly accessible and searchable web interface.
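    The guilt-by-association idea mentioned in this abstract can be illustrated in a few lines: a gene's score for a GO term is the fraction of its network neighbours already annotated with that term. The network, gene names, and GO term below are invented for illustration and are not the paper's actual data or scoring function.

```python
# Minimal guilt-by-association sketch: score a gene for a GO term by the
# fraction of its functional-linkage neighbours carrying that annotation.
# Graph and annotations are hypothetical.

def go_score(gene, term, edges, annotations):
    """Fraction of gene's neighbours annotated with the given GO term."""
    neighbours = ({b for a, b in edges if a == gene}
                  | {a for a, b in edges if b == gene})
    if not neighbours:
        return 0.0
    hits = sum(1 for n in neighbours if term in annotations.get(n, set()))
    return hits / len(neighbours)

edges = [("g1", "g2"), ("g1", "g3"), ("g1", "g4")]
annotations = {"g2": {"GO:0006915"}, "g3": {"GO:0006915"}, "g4": set()}
print(round(go_score("g1", "GO:0006915", edges, annotations), 2))  # → 0.67
```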

  12. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem

    DEFF Research Database (Denmark)

    Huang, Honggang; Larsen, Martin Røssel; Palmisano, Giuseppe

    2014-01-01

    in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160...... phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant...... and rigor mortis development in PM muscle. BIOLOGICAL SIGNIFICANCE: The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed...

  13. Quantitative Trait Loci for Fertility Traits in Finnish Ayrshire Cattle

    DEFF Research Database (Denmark)

    Schulman, Nina F; Sahana, Goutam; Lund, Mogens S

    2008-01-01

    A whole genome scan was carried out to detect quantitative trait loci (QTL) for fertility traits in Finnish Ayrshire cattle. The mapping population consisted of 12 bulls and 493 sons. Estimated breeding values for days open, fertility treatments, maternal calf mortality and paternal non-return rate...... combinations, which were observed significant in the regression method. Twenty-two chromosome-wise significant QTL were detected. Several of the detected QTL areas were overlapping with milk production QTL previously identified in the same population. Multi-trait QTL analyses were carried out to test...... if these effects were due to a pleiotropic QTL affecting fertility and milk yield traits or to linked QTL causing the effects. This distinction could only be made with confidence on BTA1 where a QTL affecting milk yield is linked to a pleiotropic QTL affecting days open and fertility treatments...

  14. Quantitative Global Heat Transfer in a Mach-6 Quiet Tunnel

    Science.gov (United States)

    Sullivan, John P.; Schneider, Steven P.; Liu, Tianshu; Rubal, Justin; Ward, Chris; Dussling, Joseph; Rice, Cody; Foley, Ryan; Cai, Zeimin; Wang, Bo

    2012-01-01

    This project developed quantitative methods for obtaining heat transfer from temperature-sensitive paint (TSP) measurements in the Mach-6 quiet tunnel at Purdue, which is a Ludwieg tube with a downstream valve, moderately short flow duration and low levels of heat transfer. Previous difficulties with inferring heat transfer from TSP in the Mach-6 quiet tunnel were traced to (1) the large transient heat transfer that occurs during the unusually long tunnel startup and shutdown, (2) the non-uniform thickness of the insulating coating, (3) inconsistencies and imperfections in the painting process and (4) the low levels of heat transfer observed on slender models at typical stagnation temperatures near 430 K. Repeated measurements were conducted on 7-degree half-angle sharp circular cones at zero angle of attack in order to evaluate the techniques, isolate the problems and identify solutions. An attempt at developing a two-color TSP method is also summarized.
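The abstract does not state the data-reduction scheme, but in short-duration facilities heat flux is commonly recovered from a measured surface-temperature history by treating the coated wall as a one-dimensional semi-infinite solid and inverting Duhamel's integral with the Cook-Felderman discretization. A sketch under that assumption; the insulator properties `rho`, `c`, `k` are placeholder values, not those of the actual coating:

```python
import numpy as np

def cook_felderman(t, Ts, rho=1400.0, c=1200.0, k=0.3):
    """Heat flux q(t) from a surface-temperature history Ts(t) on a
    semi-infinite slab (Cook-Felderman scheme, piecewise-linear Ts).
    t: times [s], Ts: surface-temperature rise [K]; rho [kg/m^3],
    c [J/(kg K)], k [W/(m K)] are illustrative insulator properties."""
    beta = np.sqrt(rho * c * k / np.pi)
    q = np.zeros_like(Ts)
    for n in range(1, len(t)):
        s = 0.0
        for j in range(1, n + 1):
            s += (Ts[j] - Ts[j - 1]) / (
                np.sqrt(t[n] - t[j - 1]) + np.sqrt(t[n] - t[j]))
        q[n] = 2.0 * beta * s
    return q
```

For a constant applied flux, the analytic temperature rise is Ts(t) = 2 q0 sqrt(t/pi) / sqrt(rho c k), and the scheme recovers q0; this makes a convenient self-check.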

  15. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    Science.gov (United States)

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple time points, and therefore can increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods, which provide invaluable tools for analyzing large-scale time-course data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.
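The core idea of functional mapping can be illustrated with a deliberately simplified version: fit a smooth mean curve to the longitudinal phenotypes once for the pooled sample and once per genotype class at a locus, then compare the residual sums of squares with an F-test. Real functional-mapping models use richer curves and covariance structures; everything below (polynomial basis, function names) is an illustrative sketch:

```python
import numpy as np

def functional_qtl_F(times, Y, genotypes, degree=2):
    """F-statistic for whether genotype classes need separate mean
    growth curves. times: (T,); Y: (n, T) phenotype trajectories;
    genotypes: (n,) integer class labels; degree: polynomial degree."""
    n, T = Y.shape
    X = np.vander(times, degree + 1)            # (T, p) polynomial basis
    p = X.shape[1]
    # Null model: one mean curve for all individuals.
    X_all = np.tile(X, (n, 1))
    rss0 = float(np.linalg.lstsq(X_all, Y.reshape(-1), rcond=None)[1][0])
    # Alternative: one curve per genotype class.
    groups = np.unique(genotypes)
    rss1 = 0.0
    for g in groups:
        ng = int(np.sum(genotypes == g))
        Xg = np.tile(X, (ng, 1))
        rss1 += float(np.linalg.lstsq(Xg, Y[genotypes == g].reshape(-1),
                                      rcond=None)[1][0])
    df1 = (len(groups) - 1) * p
    df2 = n * T - len(groups) * p
    return ((rss0 - rss1) / df1) / (rss1 / df2)
```

Because all time points enter one test jointly, a consistent but modest curve difference can reach significance where per-time-point tests would not, which is the power advantage the review describes.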

  16. Quantitative indicators of the impacts generated in lineal development projects

    International Nuclear Information System (INIS)

    Ospina N, Jesus Efren; Lema T, Alvaro de J.

    2002-01-01

    This work outlines a methodological proposal for elaborating quantitative indicators of the impact caused by electrical power transmission projects, from the perspective of the model of environmental administration by dimensions (physical, biotic, cultural, economic, and political). The model achieved an integral and interdisciplinary analysis, determining the degree of impact a project generates on each dimension and its relationships to the others. The indicators identified are useful tools to support planning, project formulation, decision making, and environmental studies such as environmental management plans; they allow greater efficiency in estimating administrative costs and in generating location alternatives, and may lead to better administration of economic and human resources, among others

  17. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples, measured by a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample
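The two building blocks described, a Gaussian photopeak on a polynomial background plus a polynomial channel-to-energy calibration, can be sketched as follows. This is a simplified stand-in for the code's recursive least-squares fit: with the centroid and width fixed, the amplitude and background are linear parameters, so the sketch grid-searches the centroid and solves the rest by linear least squares (all names and the first-degree background are illustrative):

```python
import numpy as np

def fit_peak(channels, counts, sigma, mu_grid):
    """Fit a Gaussian photopeak of fixed width sigma on a first-degree
    polynomial background. For each candidate centroid mu the model is
    linear in (amplitude, b0, b1); the mu with the smallest residual
    sum of squares wins. Returns (centroid, (amplitude, b0, b1))."""
    best = None
    for mu in mu_grid:
        g = np.exp(-0.5 * ((channels - mu) / sigma) ** 2)
        X = np.column_stack([g, np.ones_like(channels), channels])
        coef, rss, *_ = np.linalg.lstsq(X, counts, rcond=None)
        rss = float(rss[0])
        if best is None or rss < best[0]:
            best = (rss, mu, coef)
    return best[1], best[2]

def channel_to_energy(mu, cal):
    """Energy at fitted centroid mu via the polynomial calibration
    (coefficients highest power first), as obtained from a standard sample."""
    return np.polyval(cal, mu)
```

The fitted centroid is then mapped through the calibration polynomial and matched against tabulated gamma lines to identify the element.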

  18. Quantitative sensory testing using DFNS protocol in Europe

    DEFF Research Database (Denmark)

    Vollert, Jan; Attal, Nadine; Baron, Ralf

    2016-01-01

    Quantitative sensory testing (QST) in accordance with the DFNS (German Research Network on Neuropathic Pain) protocol assesses the function of afferent nerve fibers on the basis of 13 parameters. Within the consortia IMI (Innovative Medicines Initiative) Europain and Neuropain, QST results from...... pain research units experienced in QST across Europe can be compared for the first time. The aim of this analysis was to identify possible biases in the QST assessment between 10 centers from 8 different European countries. In total, 188 healthy subjects, 217 patients with painful polyneuropathy, and 150...... patients with painful peripheral nerve injury were included in the analysis. Mixed effects models were constructed for each of the 11 normally distributed QST parameters, with z-value as the dependent variable and center as the random effect. The I² statistic for heterogeneity was calculated, an index...
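The I² heterogeneity index mentioned here is derived from Cochran's Q: it expresses what fraction of the between-center variability exceeds what sampling error alone would produce. A minimal sketch from per-center estimates and their variances (the inputs are illustrative, not the study's data):

```python
import numpy as np

def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity index (in percent) from
    per-center effect estimates and their sampling variances."""
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    e = np.asarray(effects, dtype=float)
    pooled = np.sum(w * e) / np.sum(w)             # fixed-effect pooled mean
    Q = float(np.sum(w * (e - pooled) ** 2))       # Cochran's Q
    df = len(e) - 1
    I2 = max(0.0, (Q - df) / Q) * 100.0 if Q > 0 else 0.0
    return Q, I2
```

I² near 0% means the centers' mean z-values differ no more than chance predicts; values approaching 100% flag a systematic between-center bias in the QST assessment.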

  19. Enhancing the Connection between Conceptual Reasoning and Quantitative Skills

    Science.gov (United States)

    Ash, Linda R.; Luettmer-Strathmann, Jutta

    2004-10-01

    Introductory physics students often solve physics problems by equation hunting or pattern matching. These techniques can leave students ill-prepared to work on problems that require a careful analysis of the situation and logical reasoning using fundamental physical concepts. We have designed worksheets for challenging problems that guide the students through the conceptual part of the problem using expert problem solving techniques, namely, draw sketches, collect the available information, determine the question, identify applicable physical principles, and find relationships that can lead to a solution. The worksheets only help the student plan how to reach a solution; the actual execution of the planned solution is left to the student. Student feedback on the worksheets has been positive and suggests that they are helpful in enhancing the connection between conceptual reasoning and quantitative skills in problem solving. We are currently evaluating quizzes, exams, and homework to further investigate the effectiveness of the approach.

  20. Parameter identifiability and redundancy: theoretical considerations.

    Directory of Open Access Journals (Sweden)

    Mark P Little

    Full Text Available BACKGROUND: Models for complex biological systems may involve a large number of parameters. It may well be that some of these parameters cannot be derived from observed data via regression techniques. Such parameters are said to be unidentifiable, the remaining parameters being identifiable. Closely related to this idea is that of redundancy, that a set of parameters can be expressed in terms of some smaller set. Before data is analysed it is critical to determine which model parameters are identifiable or redundant to avoid ill-defined and poorly convergent regression. METHODOLOGY/PRINCIPAL FINDINGS: In this paper we outline general considerations on parameter identifiability, and introduce the notions of weak local identifiability and gradient weak local identifiability. These are based on local properties of the likelihood, in particular the rank of the Hessian matrix. We relate these to the notions of parameter identifiability and redundancy previously introduced by Rothenberg (Econometrica 39 (1971) 577-591) and Catchpole and Morgan (Biometrika 84 (1997) 187-196). Within the widely used exponential family, parameter irredundancy, local identifiability, gradient weak local identifiability and weak local identifiability are shown to be largely equivalent. We consider applications to a recently developed class of cancer models of Little and Wright (Math Biosciences 183 (2003) 111-134) and Little et al. (J Theoret Biol 254 (2008) 229-238) that generalize a large number of other recently used quasi-biological cancer models. CONCLUSIONS/SIGNIFICANCE: We have shown that the previously developed concepts of parameter local identifiability and redundancy are closely related to the apparently weaker properties of weak local identifiability and gradient weak local identifiability; within the widely used exponential family these concepts largely coincide.
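The rank criterion underlying these definitions can be checked numerically: a parameterization is locally identifiable at a point when the Jacobian of the model outputs with respect to the parameters has full column rank there, and rank deficiency flags redundancy. A sketch with two toy exponential models (illustrative only, not the cancer models of Little and Wright):

```python
import numpy as np

def numeric_jacobian(f, theta, eps=1e-6):
    """Forward-difference Jacobian of the output map f at theta."""
    y0 = np.asarray(f(theta), dtype=float)
    J = np.zeros((y0.size, len(theta)))
    for i in range(len(theta)):
        tp = np.array(theta, dtype=float)
        tp[i] += eps
        J[:, i] = (np.asarray(f(tp), dtype=float) - y0) / eps
    return J

def locally_identifiable(f, theta, tol=1e-5):
    """True when the output-sensitivity Jacobian has full column rank at
    theta, i.e. no parameter combination leaves the outputs unchanged
    to first order (rank deficiency indicates redundancy)."""
    J = numeric_jacobian(f, theta)
    return bool(np.linalg.matrix_rank(J, tol=tol) == len(theta))
```

For example, in y(t) = exp(-(a+b)t) the parameters enter only through their sum, so the two sensitivity columns coincide and (a, b) is redundant, whereas y(t) = a exp(-bt) with a ≠ 0 is locally identifiable.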