WorldWideScience

Sample records for amplicon quantification maq

  1. Integrated Microfluidic Nucleic Acid Isolation, Isothermal Amplification, and Amplicon Quantification

    Directory of Open Access Journals (Sweden)

    Michael G. Mauk

    2015-10-01

    Full Text Available Microfluidic components and systems for rapid (<60 min, low-cost, convenient, field-deployable sequence-specific nucleic acid-based amplification tests (NAATs are described. A microfluidic point-of-care (POC diagnostics test to quantify HIV viral load from blood samples serves as a representative and instructive example to discuss the technical issues and capabilities of “lab on a chip” NAAT devices. A portable, miniaturized POC NAAT with performance comparable to conventional PCR (polymerase-chain reaction-based tests in clinical laboratories can be realized with a disposable, palm-sized, plastic microfluidic chip in which: (1 nucleic acids (NAs are extracted from relatively large (~mL volume sample lysates using an embedded porous silica glass fiber or cellulose binding phase (“membrane” to capture sample NAs in a flow-through, filtration mode; (2 NAs captured on the membrane are isothermally (~65 °C amplified; (3 amplicon production is monitored by real-time fluorescence detection, such as with a smartphone CCD camera serving as a low-cost detector; and (4 paraffin-encapsulated, lyophilized reagents for temperature-activated release are pre-stored in the chip. Limits of Detection (LOD better than 103 virons/sample can be achieved. A modified chip with conduits hosting a diffusion-mode amplification process provides a simple visual indicator to readily quantify sample NA template. In addition, a companion microfluidic device for extracting plasma from whole blood without a centrifuge, generating cell-free plasma for chip-based molecular diagnostics, is described. Extensions to a myriad of related applications including, for example, food testing, cancer screening, and insect genotyping are briefly surveyed.

  2. The Quantification of Representative Sequences pipeline for amplicon sequencing: case study on within-population ITS1 sequence variation in a microparasite infecting Daphnia.

    Science.gov (United States)

    González-Tortuero, E; Rusek, J; Petrusek, A; Gießler, S; Lyras, D; Grath, S; Castro-Monzón, F; Wolinska, J

    2015-11-01

    Next generation sequencing (NGS) platforms are replacing traditional molecular biology protocols like cloning and Sanger sequencing. However, accuracy of NGS platforms has rarely been measured when quantifying relative frequencies of genotypes or taxa within populations. Here we developed a new bioinformatic pipeline (QRS) that pools similar sequence variants and estimates their frequencies in NGS data sets from populations or communities. We tested whether the estimated frequency of representative sequences, generated by 454 amplicon sequencing, differs significantly from that obtained by Sanger sequencing of cloned PCR products. This was performed by analysing sequence variation of the highly variable first internal transcribed spacer (ITS1) of the ichthyosporean Caullerya mesnili, a microparasite of cladocerans of the genus Daphnia. This analysis also serves as a case example of the usage of this pipeline to study within-population variation. Additionally, a public Illumina data set was used to validate the pipeline on community-level data. Overall, there was a good correspondence in absolute frequencies of C. mesnili ITS1 sequences obtained from Sanger and 454 platforms. Furthermore, analyses of molecular variance (amova) revealed that population structure of C. mesnili differs across lakes and years independently of the sequencing platform. Our results support not only the usefulness of amplicon sequencing data for studies of within-population structure but also the successful application of the QRS pipeline on Illumina-generated data. The QRS pipeline is freely available together with its documentation under GNU Public Licence version 3 at http://code.google.com/p/quantification-representative-sequences. PMID:25728529

  3. HPC-MAQ : A PARALLEL SHORT-READ REFERENCE ASSEMBLER

    Directory of Open Access Journals (Sweden)

    Veeram Venkata Siva Prasad

    2011-07-01

    Full Text Available Bioinformatics and computational biology are rooted in life sciences as well as computer and information sciences and technologies. Bioinformatics applies principles of information sciences and technologies to make the vast, diverse, and complex life sciences data more understandable and useful. Computational biology uses mathematical and computational approaches to address theoretical and experimental questions in biology. Short read sequence assembly is one of the most important steps in the analysis of biological data. There are many open source software’s available for short read sequence assembly where MAQ is one such popularly used software by the research community. In general, biological data sets generated by next generation sequencers are very huge and massive which requires tremendous amount of computational resources. The algorithm used for the short read sequence assembly is NP Hard which is computationally expensive and time consuming. Also MAQ is single threaded software which doesn't use the power of multi core and distributed computing and it doesn't scale. In this paper we report HPC-MAQ which addresses the NP-Hard related challenges of genome reference assembly and enables MAQ parallel and scalable through Hadoop which is a software framework for distributed computing.

  4. Mobile Air Quality Studies (MAQS-an international project

    Directory of Open Access Journals (Sweden)

    Sudik Claudia

    2010-04-01

    Full Text Available Abstract Due to an increasing awareness of the potential hazardousness of air pollutants, new laws, rules and guidelines have recently been implemented globally. In this respect, numerous studies have addressed traffic-related exposure to particulate matter using stationary technology so far. By contrast, only few studies used the advanced technology of mobile exposure analysis. The Mobile Air Quality Study (MAQS addresses the issue of air pollutant exposure by combining advanced high-granularity spatial-temporal analysis with vehicle-mounted, person-mounted and roadside sensors. The MAQS-platform will be used by international collaborators in order 1 to assess air pollutant exposure in relation to road structure, 2 to assess air pollutant exposure in relation to traffic density, 3 to assess air pollutant exposure in relation to weather conditions, 4 to compare exposure within vehicles between front and back seat (children positions, and 5 to evaluate "traffic zone"-exposure in relation to non-"traffic zone"-exposure. Primarily, the MAQS-platform will focus on particulate matter. With the establishment of advanced mobile analysis tools, it is planed to extend the analysis to other pollutants including NO2, SO2, nanoparticles and ozone.

  5. Application of Sharī‘ah contracts in contemporary Islamic finance: A maqāṣid perspective

    OpenAIRE

    Younes Soualhi

    2015-01-01

    This research exposes the underlying maqāṣid embedded in Sharī‘ah contracts as applied in Islamic banking and finance. It addresses the problem of not observing maqāṣid in nominated and combined Sharī‘ah contracts as well as the problem of not sufficiently imbuing maqāṣid in products developed by Islamic financial institutions. As a benchmark of the maqāṣid of wealth, the research adopts Ibn ‘Āshūr’s classification of maqāṣid to evaluate the conformity of Sharī‘ah contracts to Maqāṣid al-Shar...

  6. Maqāmāt Dalam Manthiq Al-Thayr Al-Attār

    Directory of Open Access Journals (Sweden)

    Syamsun Ni`am

    2015-12-01

    Full Text Available Abstract : Attār is one to be accounted amongst the greatest sufi poets of Persia sufis in the history of Sufism who has notable contibution to the esoteric Islam (Sufism. He packed his mystical teachings in the form of poetry and verse, -to be more interactive and expressive- though his teachings are not different with his predecessors, particularly of those who inspired by the principle of mystical union; and Attār is among the Sufis who have managed to reach the highest peak in the mystical journey. According to Attār , there are seven valleys that must be passed by spiritual traveler (sālik in his mystical journey, as mentioned in Manthiq al-Thayr. In this work he use the language of poetry to reveal the process of a servant to God through the stages (maqāmāt, and Attār  gives the parable maqāmāt is like a bird -burung flying looking for the king. Attār call maqāmāt is the term valleys, of which there are seven valleys.Keywords : Maqāmāt, seven valleys, fanā’, divine loveAbstrak : Attar adalah salah satu sufi besar Persia yang pernah dikenal dalam sejarah tasawuf. Attār telah banyak memberikan kontribusi pemikirannya dalam khasanah keilmuan Islam esoteris (tasawuf. Ia mengemas ajaran tasawufnya dalam bentuk puisi dan syair, —sehingga tampak lebih menarik— meskipun ajaran-ajaran yang disampaikan kurang lebih sama dengan para sufi pendahulunya, terutama yang menganut paham mystical union; dan Attār adalah di antara sufi yang telah berhasil mencapai puncak tertinggi dalam perjalanan mistisnya itu. Menurut Attār, ada tujuh lembah yang harus dilalui sālik dalam menempuh perjalanan mistisnya, sebagaimana disebutkan dalam Manthiq al-Thayr. Dalam karyanya ini, Attār menggunakan bahasa puisi dan syair untuk mengungkap proses seorang hamba menuju Tuhan melalui tahapan-tahapan (Maqāmāt, dan Attār memberikan perumpamaan maqāmāt tersebut seperti burung-burung yang terbang mencari rajanya. Attār  menyebut maq

  7. Measuring the performance of Islamic banks using maqāsid based model

    Directory of Open Access Journals (Sweden)

    Mustafa Omar Mohammed

    2015-12-01

    Full Text Available The vision and mission of Islamic banks were supposed to reflect the adherence of their activities and aspiration to Maqāṣid al-Sharī‘ah. However, there are contentions that Islamic banks have been converging towards conventional banking system. Efforts have been expended to reverse the tide and harmonise Islamic banking to its Sharī‘ah objectives. Hitherto, the existing conventional yardsticks have failed to measure the impact of the harmonisation exercise on Islamic banks’ performance. Therefore, using maqāṣid based yardstick to measure the performance of Islamic banks becomes imperative. This study has made use of al-Imām al-Ghazālī’s theory of Maqāṣid al-Sharī‘ah and Ibn ‘Āshūr’s reinterpretation, adopting content analysis and Sekaran (2000 behavioral science methods to develop a Maqāṣid Based Performance Evaluation Model (MPEM to measure the performance of Islamic banks. Experts’ opinions have validated the model and its acceptability. Suggestions are provided to policy makers and future research.

  8. Maqásid aš-šaría a ochrana rodiny

    Czech Academy of Sciences Publication Activity Database

    Bezoušková, Lenka

    Praha : Leges, 2013 - (Krošlák, D.; Moravčíková, M.), s. 160-174 ISBN 978-80-87576-98-4. - (Teoretik) Institutional support: RVO:68378122 Keywords : Islamic law * theory of maqásid * family law Subject RIV: AG - Legal Sciences

  9. Application of Sharī‘ah contracts in contemporary Islamic finance: A maqāṣid perspective

    Directory of Open Access Journals (Sweden)

    Younes Soualhi

    2015-12-01

    Full Text Available This research exposes the underlying maqāṣid embedded in Sharī‘ah contracts as applied in Islamic banking and finance. It addresses the problem of not observing maqāṣid in nominated and combined Sharī‘ah contracts as well as the problem of not sufficiently imbuing maqāṣid in products developed by Islamic financial institutions. As a benchmark of the maqāṣid of wealth, the research adopts Ibn ‘Āshūr’s classification of maqāṣid to evaluate the conformity of Sharī‘ah contracts to Maqāṣid al-Sharī‘ah namely, justice, circulation, transparency, and firmness. The study focuses on three markets related to the application of Sharī‘ah contracts, namely, banking, Islamic capital market, and takāful. The study concludes that, by and large, the application of Sharī‘ah contracts has observed Maqāṣid al-Sharī‘ah during its development and initial application stages of Islamic finance products; however, offering such products in the market has raised economic questions as to their viability and economic values. In addition, the malpractice of some Sharī‘ah contracts has long raised concerns as to the maqāṣid compliance of such products. The research recommends a de-sophistication of Islamic financial engineering to minimise the possibility of convergence with conventional finance. The research also emphasises product differentiation based on less complicated combined Sharī‘ah contracts.

  10. Maqâsid al-Qur’ân dan Deradikalisasi Penafsiran dalam Konteks Keindonesiaan

    Directory of Open Access Journals (Sweden)

    Ulya Fikriyati

    2015-09-01

    Full Text Available This article deals with the viewpoint that the reading of the Qur’ân by a certain generation is subject to criticism by the following generation. The article seeks to offer, as an example, the deradicalization of interpreting the so-called ‘radical’ verses of the Qur’ân in Indonesian context. Islam in Indonesia always interact with various races, ethnicities, religions and beliefs, and therefore requires a type of exegesis different from other regions such as the Middle East. For the radicalization of interpretation, this article offers what is called the maqâsid al-Qur’ân as its parameter. The maqâsid al-Qur’ân consists of seven points: 1 Hifz al-dîn wa tatwîr wasâilih, 2 Hifz al-nafs wa tatwîruhâ, 3 Hifz al-‘aql wa tatwîruh, 4 Hifz al-mâl wa tanmîyat wasâilih, 5 Hifz al-‘ird wa tatwîr al-wasâil li al-husûl ‘alayh, 6 Tahqîq al-huqûq al-insânîyah wa mâ yandarij tahtahâ, 7 Hifz al-‘âlam wa ‘imâratuhâ. As spirit and parameter, the maqâsid al-Qur’ân necessitates the dialectics of dynamic interpretation without any judgment of infidelity or heresy. If a certain reading of the Qur’anic verses deviates from these seven maqâsid al-Qur’ân above, it deserves to be examined further, but not to be immediately suppressed.

  11. Dispute management in Islamic financial services and products: A maqāṣid-based analysis

    OpenAIRE

    Umar A. Oseni

    2015-01-01

    The increasing expansion of the Islamic financial services industry beyond its original frontiers has not only come with success stories but has also been affected by the growing preference for litigation as the mode of dispute resolution. Exorbitant legal fees and cost of sustaining protracted litigation are two major challenges that require the attention of major stakeholders in the industry. This paper examines these challenges through a Maqāṣid al-Sharī‘ah focused prism considering the im...

  12. Removing Noise From Pyrosequenced Amplicons

    Directory of Open Access Journals (Sweden)

    Davenport Russell J

    2011-01-01

    Full Text Available Abstract Background In many environmental genomics applications a homologous region of DNA from a diverse sample is first amplified by PCR and then sequenced. The next generation sequencing technology, 454 pyrosequencing, has allowed much larger read numbers from PCR amplicons than ever before. This has revolutionised the study of microbial diversity as it is now possible to sequence a substantial fraction of the 16S rRNA genes in a community. However, there is a growing realisation that because of the large read numbers and the lack of consensus sequences it is vital to distinguish noise from true sequence diversity in this data. Otherwise this leads to inflated estimates of the number of types or operational taxonomic units (OTUs present. Three sources of error are important: sequencing error, PCR single base substitutions and PCR chimeras. We present AmpliconNoise, a development of the PyroNoise algorithm that is capable of separately removing 454 sequencing errors and PCR single base errors. We also introduce a novel chimera removal program, Perseus, that exploits the sequence abundances associated with pyrosequencing data. We use data sets where samples of known diversity have been amplified and sequenced to quantify the effect of each of the sources of error on OTU inflation and to validate these algorithms. Results AmpliconNoise outperforms alternative algorithms substantially reducing per base error rates for both the GS FLX and latest Titanium protocol. All three sources of error lead to inflation of diversity estimates. In particular, chimera formation has a hitherto unrealised importance which varies according to amplification protocol. We show that AmpliconNoise allows accurate estimates of OTU number. Just as importantly AmpliconNoise generates the right OTUs even at low sequence differences. We demonstrate that Perseus has very high sensitivity, able to find 99% of chimeras, which is critical when these are present at high

  13. Empirical Likelihood Inference for MA(q) Model%MA(q)模型的经验似然推断

    Institute of Scientific and Technical Information of China (English)

    陈燕红; 宋立新

    2009-01-01

    In this article we study the empirical likelihood inference for MA(q) model.We propose the moment restrictions,by which we get the empirical likelihood estimator of the model parameter,and we also propose an empirical log-likelihood ratio based on this estimator.Our result shows that the EL estimator is asymptotically normal,and the empirical log-likelihood ratio is proved to be asymptotical standard chi-square distribution.

  14. Integrated Microfluidic Nucleic Acid Isolation, Isothermal Amplification, and Amplicon Quantification

    OpenAIRE

    Mauk, Michael G.; Changchun Liu; Jinzhao Song; Bau, Haim H

    2015-01-01

    Microfluidic components and systems for rapid (<60 min), low-cost, convenient, field-deployable sequence-specific nucleic acid-based amplification tests (NAATs) are described. A microfluidic point-of-care (POC) diagnostics test to quantify HIV viral load from blood samples serves as a representative and instructive example to discuss the technical issues and capabilities of “lab on a chip” NAAT devices. A portable, miniaturized POC NAAT with performance comparable to conventional PCR (poly...

  15. The validation of the Indonesian version of psychotic symptoms ratings scale (PSYRATS, the Indonesian version of cognitive bias questionnaire for psychosis (CBQP and metacognitive ability questionnaire (MAQ

    Directory of Open Access Journals (Sweden)

    Erna Erawati

    2014-10-01

    Full Text Available Aim: The present study was to validate the Indonesian version of Psychotic Symptoms Ratings Scales (PSYRATS, the Cognitive Bias Questionnaire for Psychosis (CBQp and the Metacognitive Ability Questionnaire (MAQ as a new scale to measure the ability of metacognition of schizophrenia. Background: The PSYRATS, CBQp, and MAQ have demonstrated their usefulness for the assessment of hallucinations and delusions, cognitive biases and metacognitive ability in schizophrenia. So far no validation of the Indonesian version has been carried out. Methods: The PSYRATS, CBQp and MAQ were administered to 155 subjects with a diagnosis of schizophrenia. Factor structure, reliability, test-retest stability, and convergent validity were analyzed. Findings: We found that the all psychometric were reliable and valid. Indonesian version of the PSYRATS and CBQp Indonesian version have high reliability. The reliability of new psychometric of MAQ was Cronbachs alpha=.759 and was checked in a subsample (n=32; r=.668; p<.01. Conclusions: Similar to the original PSYRATS and CBQp, the Indonesian version of PSYRATS and CBQp have good psychometric properties. The new psychometric of MAQ is a valid instrument for assessing metacognition. Implications for future research are discussed. Keywords: Schizophrenia; Hallucination Severity; Delusion Severity; Bias Cognitive for Psychosis; Metacognition: Instrumental Study.

  16. Dispute management in Islamic financial services and products: A maqāṣid-based analysis

    Directory of Open Access Journals (Sweden)

    Umar A. Oseni

    2015-12-01

    Full Text Available The increasing expansion of the Islamic financial services industry beyond its original frontiers has not only come with success stories but has also been affected by the growing preference for litigation as the mode of dispute resolution. Exorbitant legal fees and cost of sustaining protracted litigation are two major challenges that require the attention of major stakeholders in the industry. This paper examines these challenges through a Maqāṣid al-Sharī‘ah focused prism considering the importance of the sustainable dispute management framework in Islamic financial services and products. While singling out the important higher objective (maqṣad of ḥifẓ al-māl, this study argues that preservation of wealth and financial resources requires effective means of resolving increasingly diverse disputes in the Islamic financial services industry. It is further argued that an effective dispute management framework will consider the original value proposition of Islamic financial intermediation which promotes maṣlaḥah (benefits and prevents mafsadah (hardship and ḍarar (financial harm. This makes a case for the affirmative relevance, potential adoption, and systemic modernisation of Islamic dispute management mechanisms such as ṣulḥ, taḥkīm, and muḥtasib in order to fulfil the overarching objective of protection and preservation of wealth and financial resources as one of the core objectives of Sharī‘ah.

  17. The Discourse of Medicine in the Čahār Maqāla (Four Discourses) of Nezami Aruzi of Samarghand.

    Science.gov (United States)

    Afshar, Ahmadreza

    2015-09-01

    Nezami Aruzi prepared Čahār Maqāla (Four Discourses) as a guide and admonishment for the rulers and kings. The fourth discourse of Čahār Maqāla with 12 anecdotes is devoted to the science of medicine and the characteristics of the physicians. The discourse presents the name of the eminent scientists, physicians, as well as Farsi and Arabic medical books that had professional acceptance in the medieval in Persia. The author has described how medicine was studied in the medieval in Persia and has presented notes on the physiology of the nervous system, pulse, uroscopy, fever, spiritual affairs and medical ethics. The current essay is a brief review of the medical subjects in Čahār Maqāla. PMID:26317607

  18. Freedom, Sufism, maqâm hurrîyah, hâl hurrîyah.

    OpenAIRE

    Ah. Haris Fakhrudi

    2015-01-01

    This paper will discuss the meaning of freedom in the discourse of Sufi thought, especially of Ibn ‘Arabî. This is based on the consideration that Sufism before Ibn ‘Arabî’s more focused on ritualistic orientation for students and only revealed variant of Sufi’s expressions, both on maqâmât and ahwâl. The presence of Ibn ‘Arabî, therefore, became the turning point in the discourse of Sufism by expressing his beliefs in the theoretical formulation. The doctrine of Sufism—which previously only ...

  19. Freedom, Sufism, maqâm hurrîyah, hâl hurrîyah.

    Directory of Open Access Journals (Sweden)

    Ah. Haris Fakhrudi

    2015-09-01

    Full Text Available This paper will discuss the meaning of freedom in the discourse of Sufi thought, especially of Ibn ‘Arabî. This is based on the consideration that Sufism before Ibn ‘Arabî’s more focused on ritualistic orientation for students and only revealed variant of Sufi’s expressions, both on maqâmât and ahwâl. The presence of Ibn ‘Arabî, therefore, became the turning point in the discourse of Sufism by expressing his beliefs in the theoretical formulation. The doctrine of Sufism—which previously only implicitly contained in the words of the Sufi shaykh—in the hands of Ibn ‘Arabî flashed into an open, theoretical, and obvios and thus opened the door for anyone who has a high intelligence in reflecting at once and realizing the metaphysical theories through operational forms. Therefore, this article will discuss some of the key concepts in the thought of Ibn ‘Arabî including the meaning of freedom (al-hurrîyah in Sufism, maqâm hurrîyah, and hâl hurrîyah obtained by the Sufis during their spiritual journey.

  20. Pendekatan Maqāṣid Syarī’ah: Konstruksi Terhadap Pengembangan Ilmu Ekonomi dan Keuangan Islam

    Directory of Open Access Journals (Sweden)

    Ayief Fathurrahman

    2014-12-01

    Full Text Available This article examines the problems that occur in the development process of Islamic economic and finance, both in theory and practice, especially in the Islamic banking practice in various parts of the world. To cope with the problems, the maqāṣid al-syarī’ah approach can be employed to understand the verses of the Qur’an and the sunnah, to reconsolidate the conflicting arguments, and to decide the legal standing to a case of which legal sanding is not mentioned in the Qur’an and the sunnah. The purpose is to make Islamic economic and finance relevant to the change of time and to reduce the turbulence taking place in the practice of Islamic economy and finance themselves.

  1. Sample Preparation for Fungal Community Analysis by High-Throughput Sequencing of Barcode Amplicons.

    Science.gov (United States)

    Clemmensen, Karina Engelbrecht; Ihrmark, Katarina; Durling, Mikael Brandström; Lindahl, Björn D

    2016-01-01

    Fungal species participate in vast numbers of processes in the landscape around us. However, their often cryptic growth, inside various substrates and in highly diverse species assemblages, has been a major obstacle to thorough analysis of fungal communities, hampering exhaustive description of the fungal kingdom. Recent technological developments allowing rapid, high-throughput sequencing of mixed communities from many samples at once are currently having a tremendous impact in fungal community ecology. Universal DNA extraction followed by amplification and sequencing of fungal species-level barcodes such as the nuclear internal transcribed spacer (ITS) region now enable identification and relative quantification of fungal community members across well-replicated experimental settings. Here, we present the sample preparation procedure presently used in our laboratory for fungal community analysis by high-throughput sequencing of amplified ITS2 markers. We focus on the procedure optimized for studies of total fungal communities in humus-rich soils, wood, and litter. However, this procedure can be applied to other sample types and markers. We focus on the laboratory-based part of sample preparation, that is, the procedure from the point where samples enter the laboratory until amplicons are submitted for sequencing. Our procedure comprises four main parts: (1) universal DNA extraction, (2) optimization of PCR conditions, (3) production of tagged ITS amplicons, and (4) preparation of the multiplexed amplicon mix to be sequenced. The presented procedure is independent of the specific high-throughput sequencing technology used, which makes it highly versatile. PMID:26791497

  2. Maqásid aš-šarí´a a interpretace islámského práva

    Czech Academy of Sciences Publication Activity Database

    Bezoušková, Lenka

    Vol. 1. Brno : Masarykova univerzita, 2014 - (Večeřa, M.; Machalová, T.; Valdhans, J.), s. 12-33 ISBN 978-80-210-6808-7. - (Acta Universitatis Brunensis Iuridica. řada teoretická. 469). [Dny práva 2013. Brno (CZ), 13.11.2013-14.11.2013] R&D Projects: GA ČR GA13-30299S Institutional support: RVO:68378122 Keywords : Islamic law * maqásid ash-shari´a Subject RIV: AG - Legal Sciences http://www.law.muni.cz/sborniky/dny_prava_2013/01_Aktualni_otazky.pdf

  3. Harmonising legality with morality in Islamic banking and finance: A quest for Maqāṣid al-Sharī‘ah paradigm

    Directory of Open Access Journals (Sweden)

    Luqman Zakariyah

    2015-12-01

    Full Text Available Scholars in Islamic Finance Industry (IFI have been calling for the integration of Islamic morality with legal theories in the industry. Among the reasons for this call is an unethical trend in product innovation. Implementing Islamic banking and financial practices would require adopting their undergirding Islamic legal and moral frameworks. Departing from these foundations of Islamic law could render the activities conducted under its name religiously unacceptable. Many approaches have been put forward to achieve this cause. One of the most complex yet subjective approaches is the quest for Maqāṣid al-Sharī‘ah. This paper critically examines the feasibility of harmonising morality with legality in Islamic finance. In doing so, it will reveal what constitutes morality and legality in Islamic legal theory, and critically examine the approaches of Muslim classical scholars in fusing the two elements together for the realisation and actualisation of the very objectives of Sharī‘ah. Questions of the relationship between morality and legality are raised, and samples of Islamic finance products are evaluated to expose their moral and legal dimensions. Lastly, the role of Maqāṣid al-Sharī‘ah in the process of harmonisation is discussed with some observations and reservations on the practicality of their implementation.

  4. Systems consequences of amplicon formation in human breast cancer.

    Science.gov (United States)

    Inaki, Koichiro; Menghi, Francesca; Woo, Xing Yi; Wagner, Joel P; Jacques, Pierre-Étienne; Lee, Yi Fang; Shreckengast, Phung Trang; Soon, Wendy WeiJia; Malhotra, Ankit; Teo, Audrey S M; Hillmer, Axel M; Khng, Alexis Jiaying; Ruan, Xiaoan; Ong, Swee Hoe; Bertrand, Denis; Nagarajan, Niranjan; Karuturi, R Krishna Murthy; Miranda, Alfredo Hidalgo; Liu, Edison T

    2014-10-01

    Chromosomal structural variations play an important role in determining the transcriptional landscape of human breast cancers. To assess the nature of these structural variations, we analyzed eight breast tumor samples with a focus on regions of gene amplification using mate-pair sequencing of long-insert genomic DNA with matched transcriptome profiling. We found that tandem duplications appear to be early events in tumor evolution, especially in the genesis of amplicons. In a detailed reconstruction of events on chromosome 17, we found large unpaired inversions and deletions connect a tandemly duplicated ERBB2 with neighboring 17q21.3 amplicons while simultaneously deleting the intervening BRCA1 tumor suppressor locus. This series of events appeared to be unusually common when examined in larger genomic data sets of breast cancers albeit using approaches with lesser resolution. Using siRNAs in breast cancer cell lines, we showed that the 17q21.3 amplicon harbored a significant number of weak oncogenes that appeared consistently coamplified in primary tumors. Down-regulation of BRCA1 expression augmented the cell proliferation in ERBB2-transfected human normal mammary epithelial cells. Coamplification of other functionally tested oncogenic elements in other breast tumors examined, such as RIPK2 and MYC on chromosome 8, also parallel these findings. Our analyses suggest that structural variations efficiently orchestrate the gain and loss of cancer gene cassettes that engage many oncogenic pathways simultaneously and that such oncogenic cassettes are favored during the evolution of a cancer. PMID:25186909

  5. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup;

    S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S r...... belonging to the phylum Chloroflexi. Based on knowledge about their ecophysiology, other control measures were introduced and the bulking problem was reduced after 2 months. Besides changes in the filament abundance and composition also other changes in the microbial community were observed that likely...... correlated with the bacterial species composition in 25 Danish full-scale WWTPs with nutrient removal. Examples of properties were SVI, filament index, floc size, floc strength, content of cations and amount of extracellular polymeric substances. Multivariate statistics provided several important insights...

  6. Mobile air quality studies (MAQS) in inner cities: particulate matter PM10 levels related to different vehicle driving modes and integration of data into a geographical information program

    Directory of Open Access Journals (Sweden)

    Uibel Stefanie

    2012-10-01

    Full Text Available Abstract Background Particulate matter (PM) is assumed to exert a major burden on public health. Most studies that address levels of PM use stationary measurement systems. By contrast, only a few studies measure PM concentrations under mobile conditions to analyze individual exposure situations. Methods By combining spatial-temporal analysis with a novel vehicle-mounted sensor system, the present Mobile Air Quality Study (MAQS) aimed to analyse the effects of different driving conditions in a convertible vehicle. PM10 was continuously monitored in a convertible car driven with the roof open, with the roof closed but windows open, or with roof and windows closed. Results PM10 values inside the car were nearly always higher with the roof open than with roof and windows closed, whereas no difference was seen between open and closed windows. During the day PM10 values varied, with high values before noon and occasional high median values or standard deviations due to individual factors. Vehicle speed in itself did not influence the mean value of PM10; however, at traffic speed (10–50 km/h) the standard deviation was large. No systematic difference was seen between PM10 values in stationary and mobile cars, nor was any PM10 difference observed between driving within or outside an environmental (low emission) zone. Conclusions The present study has shown the feasibility of mobile PM analysis in vehicles. Individual exposure of the occupants varies depending on factors like time of day as well as ventilation of the car; other specific factors are clearly identifiable and may relate to specific PM10 sources. This system may be used to monitor individual exposure ranges and provide recommendations for preventive measures. Although differences in PM10 levels were found under certain ventilation conditions, these differences are likely not of concern for the safety and health of passengers.

  7. Use of AmpliWax to optimize amplicon sterilization by isopsoralen.

    OpenAIRE

    De la Viuda, M; Fille, M; Ruiz, J; Aslanzadeh, J

    1996-01-01

    The photochemical inactivation of amplicons by isopsoralen (IP-10) has been suggested as a possible means to prevent PCR carryover contamination. To evaluate the technique, serial dilutions of amplicons (10(11) to 10(3)) from the Borrelia burgdorferi OSP A gene were amplified in the presence of 0, 25, 50, and 100 micrograms of IP-10 per ml for 45 cycles. The PCR products were exposed to UV light for 15 min to activate IP-10 and sterilize the amplicons. One microliter of each sterilized sample...

  8. JRC GMO-Amplicons, a collection of nucleic acid sequences related to genetically modified organisms

    OpenAIRE

    PETRILLO MAURO; ANGERS ALEXANDRE; HENRIKSSON PETER; Bonfini, Laura; PATAK DENNSTEDT Alexandre; KREYSA JOACHIM

    2015-01-01

    The DNA target sequence is the key element in designing detection methods for genetically modified organisms (GMOs). Unfortunately this information is frequently lacking, especially for unauthorized GMOs. In addition, patent sequences are generally poorly annotated, buried in complex and extensive documentation and hard to link to the corresponding GM event. Here, we present the JRC GMO-Amplicons, a database of amplicons collected by screening public nucleotide sequence databanks by in silico...

  9. Multi-factor data normalization enables the detection of copy number aberrations in amplicon sequencing data

    OpenAIRE

    Boeva, Valentina; Popova, Tatiana; Lienard, Maxime; Toffoli, Sebastien; Kamal, Maud; Le Tourneau, Christophe; Gentien, David; Servant, Nicolas; Gestraud, Pierre; Rio Frio, Thomas; Hupé, Philippe; Barillot, Emmanuel; Laes, Jean-François

    2014-01-01

    Motivation: Because of its low cost, amplicon sequencing, also known as ultra-deep targeted sequencing, is now becoming widely used in oncology for detection of actionable mutations, i.e. mutations influencing cell sensitivity to targeted therapies. Amplicon sequencing is based on the polymerase chain reaction amplification of the regions of interest, a process that considerably distorts the information on copy numbers initially present in the tumor DNA. Therefore, additional experiments such...

  10. JRC GMO-Amplicons: a collection of nucleic acid sequences related to genetically modified organisms.

    Science.gov (United States)

    Petrillo, Mauro; Angers-Loustau, Alexandre; Henriksson, Peter; Bonfini, Laura; Patak, Alex; Kreysa, Joachim

    2015-01-01

    The DNA target sequence is the key element in designing detection methods for genetically modified organisms (GMOs). Unfortunately this information is frequently lacking, especially for unauthorized GMOs. In addition, patent sequences are generally poorly annotated, buried in complex and extensive documentation and hard to link to the corresponding GM event. Here, we present the JRC GMO-Amplicons, a database of amplicons collected by screening public nucleotide sequence databanks by in silico determination of PCR amplification with reference methods for GMO analysis. The European Union Reference Laboratory for Genetically Modified Food and Feed (EU-RL GMFF) provides these methods in the GMOMETHODS database to support enforcement of EU legislation and GM food/feed control. The JRC GMO-Amplicons database is composed of more than 240 000 amplicons, which can be easily accessed and screened through a web interface. To our knowledge, this is the first attempt at pooling and collecting publicly available sequences related to GMOs in food and feed. The JRC GMO-Amplicons supports control laboratories in the design and assessment of GMO methods, providing, inter alia, in silico prediction of primer specificity and GM target coverage. The new tool can assist the laboratories in the analysis of complex issues, such as the detection and identification of unauthorized GMOs. Notably, the JRC GMO-Amplicons database allows the retrieval and characterization of GMO-related sequences included in patent documentation. Finally, it can help annotate poorly described GM sequences and identify new relevant GMO-related sequences in public databases. The JRC GMO-Amplicons is freely accessible through a web-based portal that is hosted on the EU-RL GMFF website. Database URL: http://gmo-crl.jrc.ec.europa.eu/jrcgmoamplicons/. PMID:26424080

  11. Efficient error correction for next-generation sequencing of viral amplicons

    Directory of Open Access Journals (Sweden)

    Skums Pavel

    2012-06-01

    Full Text Available Abstract Background Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. Results In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Conclusions Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm
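The k-mer counting idea underlying approaches like KEC can be sketched in a few lines. This is not the published KEC implementation; it is a minimal illustration, with `k` and `min_count` chosen arbitrarily, of how rare k-mers can flag reads that likely contain sequencing errors:

```python
from collections import Counter

def kmer_counts(reads, k=8):
    """Count all k-mers across a set of reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def flag_error_reads(reads, k=8, min_count=2):
    """Flag reads containing any k-mer rarer than min_count.

    Rare k-mers are more likely to stem from sequencing errors
    than from true low-frequency haplotypes.
    """
    counts = kmer_counts(reads, k)
    flagged = []
    for read in reads:
        kmers = (read[i:i + k] for i in range(len(read) - k + 1))
        if any(counts[km] < min_count for km in kmers):
            flagged.append(read)
    return flagged
```

In a real amplicon setting the thresholds would be calibrated against homopolymer content, position, and amplicon length, as the abstract notes.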

  12. A flexible and economical barcoding approach for highly multiplexed amplicon sequencing of diverse target genes

    Directory of Open Access Journals (Sweden)

    Craig W. Herbold

    2015-07-01

    Full Text Available High throughput sequencing of phylogenetic and functional gene amplicons provides tremendous insight into the structure and functional potential of complex microbial communities. Here, we introduce a highly adaptable and economical PCR approach to barcoding and pooling libraries of numerous target genes. In this approach, we replace gene- and sequencing platform-specific fusion primers with general, interchangeable barcoding primers, enabling nearly limitless customized barcode-primer combinations. Compared to barcoding with long fusion primers, our multiple-target gene approach is more economical because it requires fewer primers overall and is based on short primers with generally lower synthesis and purification costs. To highlight our approach, we pooled over 900 different small-subunit rRNA and functional gene amplicon libraries obtained from various environmental or host-associated microbial community samples into a single, paired-end Illumina MiSeq run. Although the amplicon regions ranged in size from approximately 290 to 720 bp, we found no significant systematic sequencing bias related to amplicon length or gene target. Our results indicate that this flexible multiplexing approach produces large, diverse and high quality sets of amplicon sequence data for modern studies in microbial ecology.
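The barcode-based pooling described above implies a demultiplexing step after sequencing. The following is a hypothetical sketch of that step (the barcode length, map layout, and function names are assumptions for illustration, not the authors' code), assigning pooled reads back to samples by their 5' barcode:

```python
def demultiplex(reads, barcode_map, bc_len=8):
    """Assign each read to a sample by its leading barcode.

    reads: iterable of sequence strings with the barcode at the 5' end.
    barcode_map: dict mapping barcode -> sample name (assumed layout).
    Returns {sample: [trimmed reads]}; unmatched reads go to '_unassigned'.
    """
    bins = {sample: [] for sample in barcode_map.values()}
    bins["_unassigned"] = []
    for read in reads:
        sample = barcode_map.get(read[:bc_len])
        if sample is None:
            bins["_unassigned"].append(read)
        else:
            bins[sample].append(read[bc_len:])  # strip the barcode
    return bins
```

Production demultiplexers additionally tolerate barcode mismatches and check both read orientations.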

  13. Integration of complete chloroplast genome sequences with small amplicon datasets improves phylogenetic resolution in Acacia.

    Science.gov (United States)

    Williams, Anna V; Miller, Joseph T; Small, Ian; Nevill, Paul G; Boykin, Laura M

    2016-03-01

    Combining whole genome data with previously obtained amplicon sequences has the potential to increase the resolution of phylogenetic analyses, particularly at low taxonomic levels or where recent divergence, rapid speciation or slow genome evolution has resulted in limited sequence variation. However, the integration of these types of data for large scale phylogenetic studies has rarely been investigated. Here we conduct a phylogenetic analysis of the whole chloroplast genome and two nuclear ribosomal loci for 65 Acacia species from across the most recent Acacia phylogeny. We then combine these data with previously generated amplicon sequences (four chloroplast loci and two nuclear ribosomal loci) for 508 Acacia species. We use several phylogenetic methods, including maximum likelihood bootstrapping (with and without constraint) and ExaBayes, in order to determine the success of combining a dataset of 4,000 bp with one of 189,000 bp. The results of our study indicate that the inclusion of whole genome data gave a far better resolved and well supported representation of the phylogenetic relationships within Acacia than using only amplicon sequences, with the greatest support observed when using a whole genome phylogeny as a constraint on the amplicon sequences. Our study therefore provides methods for optimal integration of genomic and amplicon sequences. PMID:26702955

  14. Unlabeled Oligonucleotides as Internal Temperature Controls for Genotyping by Amplicon Melting

    OpenAIRE

    Seipp, Michael T.; Durtschi, Jacob D; Liew, Michael A.; Williams, Jamie; Damjanovich, Kristy; Pont-Kingdon, Genevieve; Lyon, Elaine; Voelkerding, Karl V.; Wittwer, Carl T.

    2007-01-01

    Amplicon melting is a closed-tube method for genotyping that does not require probes, real-time analysis, or allele-specific polymerase chain reaction. However, correct differentiation of homozygous mutant and wild-type samples by melting temperature (Tm) requires high-resolution melting and closely controlled reaction conditions. When three different DNA extraction methods were used to isolate DNA from whole blood, amplicon Tm differences of 0.03 to 0.39°C attributable to the extractions wer...
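For context, amplicon Tm is often first approximated with the common empirical GC-content/length formula; the sub-degree shifts reported here (0.03 to 0.39°C) are far below that formula's accuracy, which is why high-resolution melting and tightly controlled reaction conditions are required. A rough sketch of the approximation (salt concentration and the formula's constants are standard textbook values, not from this article):

```python
import math

def approx_amplicon_tm(seq, na_mm=50.0):
    """Approximate amplicon melting temperature in degrees C.

    Uses the empirical GC-content/length formula
    Tm = 81.5 + 16.6*log10([Na+]) + 0.41*(%GC) - 675/N,
    where [Na+] is in mol/L and N is the amplicon length.
    Only a rough guide; it cannot resolve sub-degree Tm shifts.
    """
    n = len(seq)
    gc = 100.0 * sum(seq.count(b) for b in "GC") / n
    return 81.5 + 16.6 * math.log10(na_mm / 1000.0) + 0.41 * gc - 675.0 / n
```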

  15. Overexpressed Genes/ESTs and Characterization of Distinct Amplicons on 17q23 in Breast Cancer Cells

    Directory of Open Access Journals (Sweden)

    Ayse E. Erson

    2001-01-01

    Full Text Available 17q23 is a frequent site of gene amplification in breast cancer. Several lines of evidence suggest the presence of multiple amplicons on 17q23. To characterize distinct amplicons on 17q23 and localize putative oncogenes, we screened genes and expressed sequence tags (ESTs) in existing physical and radiation hybrid maps for amplification and overexpression in breast cancer cell lines by semiquantitative duplex PCR, semiquantitative duplex RT-PCR, Southern blot, and Northern blot analyses. We identified two distinct amplicons on 17q23, one including TBX2 and another proximal region including RPS6KB1 (PS6K) and MUL. In addition to these previously reported overexpressed genes, we also identified amplification and overexpression of additional uncharacterized genes and ESTs, some of which suggest potential oncogenic activity. In conclusion, we have further defined two distinct regions of gene amplification and overexpression on 17q23 with identification of new potential oncogene candidates. Based on the amplification and overexpression patterns of known and as yet unrecognized genes on 17q23, it is likely that some of these genes mapping to the discrete amplicons function as oncogenes and contribute to tumor progression in breast cancer cells.

  16. Evaluation of Hybridization Capture Versus Amplicon-Based Methods for Whole-Exome Sequencing.

    Science.gov (United States)

    Samorodnitsky, Eric; Jewell, Benjamin M; Hagopian, Raffi; Miya, Jharna; Wing, Michele R; Lyon, Ezra; Damodaran, Senthilkumar; Bhatt, Darshna; Reeser, Julie W; Datta, Jharna; Roychowdhury, Sameek

    2015-09-01

    Next-generation sequencing has aided characterization of genomic variation. While whole-genome sequencing may capture all possible mutations, whole-exome sequencing remains cost-effective and captures most phenotype-altering mutations. Initial strategies for exome enrichment utilized a hybridization-based capture approach. Recently, amplicon-based methods were designed to simplify preparation and utilize smaller DNA inputs. We evaluated two hybridization capture-based and two amplicon-based whole-exome sequencing approaches, utilizing both Illumina and Ion Torrent sequencers, comparing on-target alignment, uniformity, and variant calling. While the amplicon methods had higher on-target rates, the hybridization capture-based approaches demonstrated better uniformity. All methods identified many of the same single-nucleotide variants, but each amplicon-based method missed variants detected by the other three methods and reported additional variants discordant with all three other technologies. Many of these potential false positives or negatives appear to result from limited coverage, low variant frequency, vicinity to read starts/ends, or the need for platform-specific variant calling algorithms. All methods demonstrated effective copy-number variant calling when evaluated against a single-nucleotide polymorphism array. This study illustrates some differences between whole-exome sequencing approaches, highlights the need for selecting appropriate variant calling based on capture method, and will aid laboratories in selecting their preferred approach. PMID:26110913
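Two of the metrics compared in this evaluation, on-target alignment rate and coverage uniformity, can be computed directly from alignment data. The following is a simplified sketch (naive interval handling and an assumed data layout, not the authors' pipeline):

```python
def on_target_rate(alignments, targets):
    """Fraction of aligned bases falling within target intervals.

    alignments: list of (chrom, start, end) aligned read spans.
    targets: dict chrom -> list of (start, end) exome intervals.
    Interval arithmetic is deliberately naive (no merging of overlaps).
    """
    total = on = 0
    for chrom, start, end in alignments:
        total += end - start
        for t_start, t_end in targets.get(chrom, []):
            overlap = min(end, t_end) - max(start, t_start)
            if overlap > 0:
                on += overlap
    return on / total if total else 0.0

def coverage_cv(depths):
    """Coefficient of variation of per-base depth: lower = more uniform."""
    mean = sum(depths) / len(depths)
    var = sum((d - mean) ** 2 for d in depths) / len(depths)
    return (var ** 0.5) / mean
```

A higher on-target rate with a larger coverage CV would reproduce the amplicon-versus-hybridization trade-off the abstract describes.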

  17. Genomic and expression array profiling of chromosome 20q amplicon in human colon cancer cells

    Directory of Open Access Journals (Sweden)

    Carter Jennifer

    2005-01-01

    Full Text Available Background: Gain of the q arm of chromosome 20 in human colorectal cancer has been associated with poorer survival time and has been reported to increase in frequency from adenomas to metastasis. The increasing frequency of chromosome 20q amplification during colorectal cancer progression and the presence of this amplification in carcinomas of other tissue origin has led us to hypothesize that 20q11-13 harbors one or more genes which, when overexpressed, promote tumor invasion and metastasis. Aims: Generate genomic and expression profiles of the 20q amplicon in human cancer cell lines in order to identify genes with increased copy number and expression. Materials and Methods: Utilizing genomic sequencing clones and amplification mapping data from our lab and other previous studies, BAC/PAC tiling paths spanning the 20q amplicon and genomic microarrays were generated. Array-CGH on the custom array with human cancer cell line DNAs was performed to generate genomic profiles of the amplicon. Expression array analysis with RNA from these cell lines using commercial oligo microarrays generated expression profiles of the amplicon. The data were then combined in order to identify genes with increased copy number and expression. Results: Overexpressed genes in regions of increased copy number were identified and a list of potential novel genetic tumor markers was assembled based on biological functions of these genes. Conclusions: Performing high-resolution genomic microarray profiling in conjunction with expression analysis is an effective approach to identify potential tumor markers.

  18. Absolute estimation of initial concentrations of amplicon in a real-time RT-PCR process

    Directory of Open Access Journals (Sweden)

    Kohn Michael

    2007-10-01

    Full Text Available Abstract Background Since real-time PCR was first developed, several approaches to estimating the initial quantity of template in an RT-PCR reaction have been tried. While initially only the early thermal cycles corresponding to exponential duplication were used, lately there has been an effort to use all of the cycles in a PCR. The efforts have included both fitting empirical sigmoid curves and more elaborate mechanistic models that explore the chemical reactions taking place during each cycle. The more elaborate mechanistic models require many more parameters than can be fit from a single amplification, while the empirical models provide little insight and are difficult to tailor to specific reactants. Results We directly estimate the initial amount of amplicon using a simplified mechanistic model based on chemical reactions in the annealing step of the PCR. The basic model includes the duplication of DNA with the digestion of Taqman probe and the re-annealing between previously synthesized DNA strands of opposite orientation. By modelling the amount of Taqman probe digested and matching that with the observed fluorescence, the conversion factor between the number of fluorescing dye molecules and the observed fluorescent emission can be estimated, along with the absolute initial amount of amplicon and the rate parameter for re-annealing. The model is applied to several PCR reactions with known amounts of amplicon and is shown to work reasonably well. An expanded version of the model allows duplication of amplicon without release of fluorescent dye, by adding one more parameter to the model. The additional process is helpful in most cases where the initial primer concentration exceeds the initial probe concentration.
    Software for applying the algorithm to data may be downloaded at http://www.niehs.nih.gov/research/resources/software/pcranalyzer/ Conclusion We present proof of principle that a mechanistically based model can be fit to observations
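As a toy illustration of the fitting principle only (not the authors' annealing-step model, whose re-annealing and probe-digestion chemistry is richer), one can simulate a saturating amplification curve and recover the initial template amount by grid search against observed fluorescence:

```python
def simulate_fluorescence(n0, cycles=40, eff=0.95, capacity=1e12, scale=1e-9):
    """Simulate a saturating PCR fluorescence curve.

    Each cycle duplicates a fraction of templates; efficiency falls as
    product approaches a carrying capacity, a crude stand-in for probe
    depletion and strand re-annealing in the real mechanistic model.
    Returns per-cycle fluorescence readings.
    """
    n = float(n0)
    curve = []
    for _ in range(cycles):
        n += n * eff * (1.0 - n / capacity)
        curve.append(n * scale)
    return curve

def fit_n0(observed, candidates, **kw):
    """Grid-search the initial copy number minimizing squared error."""
    def sse(n0):
        sim = simulate_fluorescence(n0, cycles=len(observed), **kw)
        return sum((s - o) ** 2 for s, o in zip(sim, observed))
    return min(candidates, key=sse)
```

The real model fits continuous parameters (initial amplicon, re-annealing rate, fluorescence conversion factor) jointly rather than by grid search.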

  19. High-Throughput Identification and Quantification of Candida Species Using High Resolution Derivative Melt Analysis of Panfungal Amplicons

    OpenAIRE

    Mandviwala, Tasneem; Shinde, Rupali; Kalra, Apoorv; Jack D. Sobel; Akins, Robert A.

    2010-01-01

    Fungal infections pose unique challenges to molecular diagnostics; fungal molecular diagnostics consequently lags behind bacterial and viral counterparts. Nevertheless, fungal infections are often life-threatening, and early detection and identification of species is crucial to successful intervention. A high throughput PCR-based method is needed that is independent of culture, is sensitive to the level of one fungal cell per milliliter of blood or other tissue types, and is capable of detect...

  20. Using expected sequence features to improve basecalling accuracy of amplicon pyrosequencing data

    DEFF Research Database (Denmark)

    Rask, Thomas Salhøj; Petersen, Bent; Chen, Donald S.;

    2016-01-01

    Amplicon pyrosequencing targets a known genetic region and thus inherently produces reads highly anticipated to have certain features, such as conserved nucleotide sequence, and in the case of protein coding DNA, an open reading frame. Pyrosequencing errors, consisting mainly of nucleotide insertions and deletions, are on the other hand likely to disrupt open reading frames. Such an inverse relationship between errors and expectation based on prior knowledge can be used advantageously to guide the process known as basecalling, i.e. the inference of nucleotide sequence from raw sequencing data...... This probabilistic approach enables integration of basecalling into a larger model where other parameters can be incorporated, such as the likelihood for observing a full-length open reading frame at the targeted region. We apply the method to 454 amplicon pyrosequencing data obtained from a malaria......

  1. DADA2: High-resolution sample inference from Illumina amplicon data.

    Science.gov (United States)

    Callahan, Benjamin J; McMurdie, Paul J; Rosen, Michael J; Han, Andrew W; Johnson, Amy Jo A; Holmes, Susan P

    2016-07-01

    We present the open-source software package DADA2 for modeling and correcting Illumina-sequenced amplicon errors (https://github.com/benjjneb/dada2). DADA2 infers sample sequences exactly and resolves differences of as little as 1 nucleotide. In several mock communities, DADA2 identified more real variants and output fewer spurious sequences than other methods. We applied DADA2 to vaginal samples from a cohort of pregnant women, revealing a diversity of previously undetected Lactobacillus crispatus variants. PMID:27214047
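As a toy illustration only (DADA2's actual inference uses a parameterized error model and quality scores, and is far more sophisticated), the core idea of resolving genuine single-nucleotide variants from errors by relative abundance can be sketched as follows; the `min_fold` threshold is an arbitrary assumption:

```python
def hamming(a, b):
    """Mismatch count for equal-length strings; unequal lengths never merge."""
    return sum(x != y for x, y in zip(a, b)) if len(a) == len(b) else max(len(a), len(b))

def denoise(counts, min_fold=10):
    """Toy abundance-based denoiser (NOT the DADA2 error model).

    counts: dict mapping sequence -> read count. A sequence within one
    mismatch of a >= min_fold more abundant sequence is folded into it;
    otherwise it is kept as a genuine single-nucleotide variant.
    """
    kept = {}
    for seq in sorted(counts, key=counts.get, reverse=True):
        parent = next((p for p in kept
                       if hamming(seq, p) == 1
                       and kept[p] >= min_fold * counts[seq]), None)
        if parent:
            kept[parent] += counts[seq]
        else:
            kept[seq] = counts[seq]
    return kept
```

The point mirrored from the abstract: a one-nucleotide neighbor is not automatically an error; its abundance relative to the parent decides.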

  2. Massively parallel sequencing of 17 commonly used forensic autosomal STRs and amelogenin with small amplicons.

    Science.gov (United States)

    Kim, Eun Hye; Lee, Hwan Young; Yang, In Seok; Jung, Sang-Eun; Yang, Woo Ick; Shin, Kyoung-Jin

    2016-05-01

    The next-generation sequencing (NGS) method has been utilized to analyze short tandem repeat (STR) markers, which are routinely used for human identification purposes in the forensic field. Some researchers have demonstrated the successful application of the NGS system to STR typing, suggesting that NGS technology may be an alternative or additional method to overcome limitations of capillary electrophoresis (CE)-based STR profiling. However, there has been no available multiplex PCR system that is optimized for NGS analysis of forensic STR markers. Thus, we constructed a multiplex PCR system for the NGS analysis of 18 markers (13 CODIS STRs, D2S1338, D19S433, Penta D, Penta E and amelogenin) by designing amplicons in the size range of 77-210 base pairs. Then, PCR products were generated from two single-source samples, mixed samples, and artificially degraded DNA samples using a multiplex PCR system, and were prepared for sequencing on the MiSeq system through construction of a subsequent barcoded library. By performing NGS and analyzing the data, we confirmed that the resultant STR genotypes were consistent with those of CE-based typing. Moreover, sequence variations were detected in targeted STR regions. Through the use of small-sized amplicons, the developed multiplex PCR system enables researchers to obtain successful STR profiles even from artificially degraded DNA as well as STR loci which are analyzed with large-sized amplicons in the CE-based commercial kits. In addition, successful profiles can be obtained from mixtures up to a 1:19 ratio. Consequently, the developed multiplex PCR system, which produces small size amplicons, can be successfully applied to STR NGS analysis of forensic casework samples such as mixtures and degraded DNA samples. PMID:26799314

  3. From Benchtop to Desktop: Important Considerations when Designing Amplicon Sequencing Workflows

    OpenAIRE

    Murray, Dáithí C; Megan L Coghlan; Michael Bunce

    2015-01-01

    Amplicon sequencing has been the method of choice in many high-throughput DNA sequencing (HTS) applications. To date there has been a heavy focus on the means by which to analyse the burgeoning amount of data afforded by HTS. In contrast, there has been a distinct lack of attention paid to considerations surrounding the importance of sample preparation and the fidelity of library generation. No amount of high-end bioinformatics can compensate for poorly prepared samples and it is therefore im...

  4. Potential of pmoA Amplicon Pyrosequencing for Methanotroph Diversity Studies

    OpenAIRE

    Lüke, Claudia; Frenzel, Peter

    2011-01-01

    We analyzed the potential of pmoA amplicon pyrosequencing compared to that of Sanger sequencing with paddy soils as a model environment. We defined operational taxonomic unit (OTU) cutoff values of 7% and 18%, reflecting methanotrophic species and major phylogenetic pmoA lineages, respectively. Major lineages were already well covered by clone libraries; nevertheless, pyrosequencing provided a higher level of diversity at the species level.
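OTU clustering at a fixed dissimilarity cutoff, such as the 7% species-level cutoff defined above, can be sketched with a greedy centroid approach. This assumes pre-aligned, equal-length sequences with simple Hamming identity, a deliberate simplification of real OTU pickers:

```python
def identity(a, b):
    """Fraction identity for pre-aligned, equal-length sequences."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def greedy_otus(seqs, dissimilarity=0.07):
    """Greedy centroid clustering at a dissimilarity cutoff.

    Assign each sequence to the first centroid within the cutoff,
    otherwise start a new OTU. A 0.07 cutoff would mimic the
    species-level grouping described above; 0.18 the lineage level.
    """
    centroids = []
    clusters = []
    for seq in seqs:
        for i, c in enumerate(centroids):
            if 1.0 - identity(seq, c) <= dissimilarity:
                clusters[i].append(seq)
                break
        else:
            centroids.append(seq)
            clusters.append([seq])
    return clusters
```

In practice sequences are processed in order of decreasing abundance so that centroids represent the most common variants.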

  5. Fast, accurate and easy-to-pipeline methods for amplicon sequence processing

    Science.gov (United States)

    Antonielli, Livio; Sessitsch, Angela

    2016-04-01

    Next-generation sequencing (NGS) technologies have been established for years as an essential resource in microbiology. While metagenomic studies benefit on the one hand from the continuously increasing throughput of Illumina (Solexa) technology, on the other hand the spread of third-generation sequencing technologies (PacBio, Oxford Nanopore) is taking whole-genome sequencing beyond the assembly of fragmented draft genomes, making it now possible to finish bacterial genomes even without short-read correction. Besides (meta)genomic analysis, next-generation amplicon sequencing is still fundamental for microbial studies. Amplicon sequencing of the 16S rRNA gene and ITS (Internal Transcribed Spacer) remains a well-established, widespread method for a multitude of different purposes concerning the identification and comparison of archaeal/bacterial (16S rRNA gene) and fungal (ITS) communities occurring in diverse environments. Numerous different pipelines have been developed in order to process NGS-derived amplicon sequences, among which Mothur, QIIME and USEARCH are the most well-known and cited ones. The entire process from initial raw sequence data through read error correction, paired-end read assembly, primer stripping, quality filtering, clustering, OTU taxonomic classification and BIOM table rarefaction as well as alternative "normalization" methods will be addressed. An effective and accurate strategy will be presented using state-of-the-art bioinformatic tools, and the example of a straightforward one-script pipeline for 16S rRNA gene or ITS MiSeq amplicon sequencing will be provided. Finally, instructions on how to automatically retrieve nucleotide sequences from NCBI and therefore apply the pipeline to targets other than 16S rRNA gene (Greengenes, SILVA) and ITS (UNITE) will be discussed.
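Two of the early pipeline steps listed above, primer stripping and quality filtering, can be sketched in a few lines; established tools such as Mothur and QIIME implement these far more robustly (mismatch-tolerant primer matching, sliding-window quality trimming). The data layout and thresholds here are illustrative assumptions:

```python
def strip_primer(seq, qual, primer):
    """Remove a 5' primer; return None if the primer is absent."""
    if seq.startswith(primer):
        n = len(primer)
        return seq[n:], qual[n:]
    return None

def quality_filter(records, primer, min_len=100, min_mean_q=25):
    """Keep primer-bearing reads that are long enough and high quality.

    records: iterable of (sequence, phred_scores) tuples.
    """
    kept = []
    for seq, qual in records:
        stripped = strip_primer(seq, qual, primer)
        if stripped is None:
            continue  # no primer: likely off-target or chimeric read
        seq, qual = stripped
        if len(seq) >= min_len and sum(qual) / len(qual) >= min_mean_q:
            kept.append((seq, qual))
    return kept
```

Downstream steps (paired-end assembly, clustering, taxonomic classification) would consume the filtered records.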

  6. Single genome amplification and direct amplicon sequencing of Plasmodium spp. DNA from ape fecal specimens

    OpenAIRE

    Liu, Weimin; Li, Yingying; Peeters, Martine; Rayner, Julian; Sharp, Paul; Shaw, George; Hahn, Beatrice

    2010-01-01

    Conventional PCR followed by molecular cloning and sequencing of amplified products is commonly used to test clinical specimens for target sequences of interest, such as viral, bacterial or parasite nucleic acids. However, this approach has serious limitations when used to analyze mixtures of genetically divergent templates [1–9]. This is because Taq polymerase is prone to switch templates during the amplification process, thereby generating recombinants that do not exist in vivo [4]. When amplicon...

  7. Single-stranded positive-sense RNA viruses generated in days using infectious subgenomic amplicons

    OpenAIRE

    Aubry, Fabien; Nougairède, Antoine; de Fabritus, Lauriane; Querat, Gilles; Gould, Ernest A.; de Lamballerie, Xavier

    2014-01-01

    Reverse genetics is a key methodology for producing genetically modified RNA viruses and deciphering cellular and viral biological properties, but methods based on the preparation of plasmid-based complete viral genomes are laborious and unpredictable. Here, both wild-type and genetically modified infectious RNA viruses were generated in days using the newly described ISA (infectious-subgenomic-amplicons) method. This new versatile and simple procedure may enhance our capacity to obtain infec...

  8. From benchtop to desktop: important considerations when designing amplicon sequencing workflows.

    Directory of Open Access Journals (Sweden)

    Dáithí C Murray

    Full Text Available Amplicon sequencing has been the method of choice in many high-throughput DNA sequencing (HTS) applications. To date there has been a heavy focus on the means by which to analyse the burgeoning amount of data afforded by HTS. In contrast, there has been a distinct lack of attention paid to considerations surrounding the importance of sample preparation and the fidelity of library generation. No amount of high-end bioinformatics can compensate for poorly prepared samples and it is therefore imperative that careful attention is given to sample preparation and library generation within workflows, especially those involving multiple PCR steps. This paper redresses this imbalance by focusing on aspects pertaining to the benchtop within typical amplicon workflows: sample screening, the target region, and library generation. Empirical data is provided to illustrate the scope of the problem. Lastly, the impact of various data analysis parameters is also investigated in the context of how the data was initially generated. It is hoped this paper may serve to highlight the importance of pre-analysis workflows in achieving meaningful, future-proof data that can be analysed appropriately. As amplicon sequencing gains traction in a variety of diagnostic applications from forensics to environmental DNA (eDNA), it is paramount that workflows and analytics are both fit for purpose.

  9. Electrochemical DNA sensor for anthrax toxin activator gene atxA-detection of PCR amplicons.

    Science.gov (United States)

    Das, Ritu; Goel, Ajay K; Sharma, Mukesh K; Upadhyay, Sanjay

    2015-12-15

    We report a DNA-probe-functionalized electrochemical genosensor for the detection of Bacillus anthracis, specific to the regulatory gene atxA. The DNA sensor is fabricated from gold nanoparticles electrochemically deposited on a self-assembled layer of (3-mercaptopropyl)trimethoxysilane (MPTS) on a glassy carbon (GC) electrode. DNA hybridization is monitored by differential pulse voltammetry (DPV). The modified GC electrode is characterized by atomic force microscopy (AFM), cyclic voltammetry (CV), and electrochemical impedance spectroscopy (EIS). We also quantified the DNA probe density on the electrode surface by the chronocoulometric method. Detection by the DNA probe on the electrode surface is specific and selective for the atxA gene. No previous report describes the detection of B. anthracis using atxA, an anthrax toxin activator gene. To address real and complex samples, we studied PCR amplicons of 303, 361 and 568 base pairs using symmetric and asymmetric PCR approaches. The DNA probe for the atxA gene efficiently hybridizes with PCR amplicons of different lengths. The detection limit is found to be 1.0 pM (S/N ratio = 3). The results indicate that the DNA sensor is able to detect a synthetic target as well as PCR amplicons of different base pairs. PMID:26257186

  10. Impact of Mutation Type and Amplicon Characteristics on Genetic Diversity Measures Generated Using a High-Resolution Melting Diversity Assay

    OpenAIRE

    Cousins, Matthew M.; Donnell, Deborah; Eshleman, Susan H.

    2013-01-01

    We adapted high-resolution melting (HRM) technology to measure genetic diversity without sequencing. Diversity is measured as a single numeric HRM score. Herein, we determined the impact of mutation types and amplicon characteristics on HRM diversity scores. Plasmids were generated with single-base changes, insertions, and deletions. Different primer sets were used to vary the position of mutations within amplicons. Plasmids and plasmid mixtures were analyzed to determine the impact of mutati...

  11. Clinical impact of targeted amplicon sequencing for meningioma as a practical clinical-sequencing system.

    Science.gov (United States)

    Yuzawa, Sayaka; Nishihara, Hiroshi; Yamaguchi, Shigeru; Mohri, Hiromi; Wang, Lei; Kimura, Taichi; Tsuda, Masumi; Tanino, Mishie; Kobayashi, Hiroyuki; Terasaka, Shunsuke; Houkin, Kiyohiro; Sato, Norihiro; Tanaka, Shinya

    2016-07-01

    Recent genetic analyses using next-generation sequencers have revealed numerous genetic alterations in various tumors, including meningioma, the most common primary brain tumor. However, their use as routine laboratory examinations in clinical applications for tumor genotyping is not cost-effective. To establish a clinical sequencing system for meningioma and investigate the clinical significance of genotype, we retrospectively performed targeted amplicon sequencing on 103 meningiomas and evaluated the association with clinicopathological features. We designed amplicon-sequencing panels targeting eight genes, including NF2 (neurofibromin 2), TRAF7, KLF4, AKT1, and SMO. Libraries prepared with genomic DNA extracted from PAXgene-fixed paraffin-embedded tissues of 103 meningioma specimens were sequenced on the Illumina MiSeq. NF2 loss in some cases was also confirmed by interphase fluorescence in situ hybridization. We identified NF2 loss and/or at least one mutation in NF2, TRAF7, KLF4, AKT1, or SMO in 81 of 103 cases (79%) by targeted amplicon sequencing. On the basis of genetic status, we categorized meningiomas into three genotype groups: NF2 type; TRAKLS type, harboring mutation in TRAF7, AKT1, KLF4, and/or SMO; and 'not otherwise classified' type. Genotype significantly correlated with tumor volume, tumor location, and magnetic resonance imaging findings such as adjacent bone change and heterogeneous gadolinium enhancement, as well as with histopathological subtype. In addition, multivariate analysis revealed that genotype was independently associated with risk of recurrence. In conclusion, we established a rapid clinical sequencing system that enables final confirmation of meningioma genotype within a 7-day turnaround time. Our method will bring multiple benefits to neuropathologists and neurosurgeons for accurate diagnosis and appropriate postoperative management. PMID:27102344
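    The three-way genotype grouping described above (NF2 type, TRAKLS type, 'not otherwise classified') can be sketched as a simple rule; the gene sets come from the abstract, but the precedence given to NF2 evidence and the function shape are illustrative assumptions, not the authors' exact procedure:

    ```python
    # Sketch of the meningioma genotype grouping from the abstract:
    # NF2 type, TRAKLS type (TRAF7/AKT1/KLF4/SMO), or 'not otherwise classified'.
    # The precedence given to NF2 evidence is an assumption for illustration.

    TRAKLS_GENES = {"TRAF7", "AKT1", "KLF4", "SMO"}

    def classify_meningioma(mutated_genes, nf2_loss=False):
        """Return the genotype group for one tumor given its mutated genes
        (a set of gene symbols) and whether NF2 copy-number loss was seen."""
        if nf2_loss or "NF2" in mutated_genes:
            return "NF2"
        if mutated_genes & TRAKLS_GENES:
            return "TRAKLS"
        return "not otherwise classified"

    assert classify_meningioma({"TRAF7", "KLF4"}) == "TRAKLS"
    assert classify_meningioma(set(), nf2_loss=True) == "NF2"
    assert classify_meningioma(set()) == "not otherwise classified"
    ```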

  12. Shedding Light on the Microbial Community of the Macropod Foregut Using 454-Amplicon Pyrosequencing

    OpenAIRE

    Gulino, Lisa-Maree; Ouwerkerk, Diane; Alicia Y H Kang; Maguire, Anita J.; Kienzle, Marco; Klieve, Athol V.

    2013-01-01

    Twenty macropods from five locations in Queensland, Australia, grazing on a variety of native pastures were surveyed and the bacterial community of the foregut was examined using 454-amplicon pyrosequencing. Specifically, the V3/V4 region of 16S rRNA gene was examined. A total of 5040 OTUs were identified in the data set (post filtering). Thirty-two OTUs were identified as ‘shared’ OTUS (i.e. present in all samples) belonging to either Firmicutes or Bacteroidetes (Clostridiales/Bacteroidales)...

  13. Analysis and Visualization Tool for Targeted Amplicon Bisulfite Sequencing on Ion Torrent Sequencers.

    Science.gov (United States)

    Pabinger, Stephan; Ernst, Karina; Pulverer, Walter; Kallmeyer, Rainer; Valdes, Ana M; Metrustry, Sarah; Katic, Denis; Nuzzo, Angelo; Kriegner, Albert; Vierlinger, Klemens; Weinhaeusel, Andreas

    2016-01-01

    Targeted sequencing of PCR amplicons generated from bisulfite-deaminated DNA is a flexible, cost-effective way to study methylation of a sample at single-CpG resolution and perform subsequent multi-target, multi-sample comparisons. Currently, no platform-specific protocol, support, or analysis solution is provided to perform targeted bisulfite sequencing on a Personal Genome Machine (PGM). Here, we present a novel tool, called TABSAT, for analyzing targeted bisulfite sequencing data generated on Ion Torrent sequencers. The workflow starts with raw sequencing data, performs quality assessment, and uses a tailored version of Bismark to map the reads to a reference genome. The pipeline visualizes results as lollipop plots and is able to deduce specific methylation patterns present in a sample. The obtained profiles are then summarized and compared between samples. In order to assess the performance of the targeted bisulfite sequencing workflow, 48 samples were used to generate 53 different bisulfite-sequencing PCR amplicons from each sample, resulting in 2,544 amplicon targets. We obtained a mean coverage of 282X using 1,196,822 aligned reads. Next, we compared the sequencing results for these targets to the methylation level of the corresponding sites on an Illumina 450k methylation chip. The calculated average Pearson correlation coefficient of 0.91 confirms the sequencing results against one of the industry-leading CpG methylation platforms and shows that targeted amplicon bisulfite sequencing provides an accurate and cost-efficient method for DNA methylation studies, e.g., to provide platform-independent confirmation of Illumina Infinium 450k methylation data. TABSAT offers a novel way to analyze data generated by Ion Torrent instruments and can also be used with data from the Illumina MiSeq platform. It can be easily accessed via the Platomics platform, which offers a web-based graphical user interface along with sample and parameter storage. TABSAT is freely available.
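    The cross-platform validation above reduces to a Pearson correlation between per-CpG methylation levels measured by the two platforms. A minimal sketch (the per-site values below are illustrative, not data from the study):

    ```python
    def pearson(xs, ys):
        """Plain Pearson correlation coefficient between two equal-length
        vectors, e.g. per-CpG methylation levels from sequencing vs. a chip."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    seq_meth = [0.10, 0.45, 0.80, 0.95]   # amplicon bisulfite sequencing (made up)
    chip_meth = [0.12, 0.40, 0.85, 0.90]  # 450k chip values (made up)
    r = pearson(seq_meth, chip_meth)      # close to 1 for concordant platforms
    ```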

  14. Amplicon-based metagenomics identified candidate organisms in soils that caused yield decline in strawberry

    OpenAIRE

    Xiangming Xu; Thomas Passey; Feng Wei; Robert Saville; Harrison, Richard J.

    2015-01-01

    A phenomenon of yield decline due to weak plant growth in strawberry was recently observed in non-chemo-fumigated soils, which was not associated with the soil fungal pathogen Verticillium dahliae, the main target of fumigation. Amplicon-based metagenomics was used to profile soil microbiota in order to identify microbial organisms that may have caused the yield decline. A total of 36 soil samples were obtained in 2013 and 2014 from four sites for metagenomic studies; two of the four sites ha...

  15. Surface density dependence of PCR amplicon hybridization on PNA/DNA probe layers

    DEFF Research Database (Denmark)

    Yao, Danfeng; Kim, Junyoung; Yu, Fang;

    2005-01-01

    … intermediate sodium concentration (approximately 100 mM). These effects were mainly ascribed to the electrostatic cross talk among the hybridized DNA molecules and the secondary structure of PCR amplicons. For the negatively charged DNA probes, the hybridization reaction was additionally subject to the DNA/DNA electrostatic barrier, particularly in the lower ionic strength range (e.g., 10-150 mM Na(+)). The electrostatic cross talk was shown to be largely reduced if the PNA probe layer was sufficiently diluted by following a strategic templated immobilization method. As a consequence, a pseudo …

  16. Multiplex amplicon sequencing for microbe identification in community-based culture collections.

    Science.gov (United States)

    Armanhi, Jaderson Silveira Leite; de Souza, Rafael Soares Correa; de Araújo, Laura Migliorini; Okura, Vagner Katsumi; Mieczkowski, Piotr; Imperial, Juan; Arruda, Paulo

    2016-01-01

    Microbiome analysis using metagenomic sequencing has revealed a vast microbial diversity associated with plants. Identifying the molecular functions associated with microbiome-plant interaction is a significant challenge for the development of microbiome-derived technologies applied to agriculture. An alternative way to accelerate the discovery of microbiome benefits to plants is to construct microbial culture collections while also assessing microbial community structure and abundance. However, traditional methods of isolation, cultivation, and identification of microbes are time-consuming and expensive. Here we describe a method for identification of microbes in culture collections constructed by picking colonies from primary platings that may contain single or multiple microorganisms, which we named community-based culture collections (CBC). Multiplexed 16S rRNA gene amplicon sequencing, based on two-step PCR amplification with tagged primers for plates, rows, and columns, allowed identification of the microbial composition regardless of whether a well contained single or multiple microorganisms. The multiplexing system enables pooling of amplicons into a single tube. Sequencing on the PacBio platform recovered near-full-length 16S rRNA gene sequences, allowing accurate identification of the microorganism composition in each plate well. Cross-referencing with plant microbiome structure and abundance allowed estimation of the diversity and abundance of microorganisms represented in the CBC. PMID:27404280
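    The plate/row/column tagging scheme can be illustrated with a toy demultiplexer; the tag sequences, tag lengths, and read layout below are hypothetical placeholders rather than the primers used by the authors:

    ```python
    # Minimal sketch of combinatorial demultiplexing for a community-based
    # culture collection (CBC): each read carries a plate tag, a row tag and a
    # column tag, and the tag combination addresses one well. All tag sequences
    # here are hypothetical placeholders.

    PLATE_TAGS = {"ACGT": "P1", "TGCA": "P2"}
    ROW_TAGS   = {"AACC": "A", "GGTT": "B"}
    COL_TAGS   = {"ATAT": "1", "CGCG": "2"}

    def demultiplex(read):
        """Return (plate, row, column) for a read laid out as
        [4-bp plate tag][4-bp row tag][4-bp column tag][16S insert],
        or None if any tag is unrecognized."""
        plate = PLATE_TAGS.get(read[0:4])
        row = ROW_TAGS.get(read[4:8])
        col = COL_TAGS.get(read[8:12])
        if None in (plate, row, col):
            return None
        return (plate, row, col)

    read = "ACGT" + "GGTT" + "CGCG" + "AGAGTTTGATCCTGGCTCAG"
    assert demultiplex(read) == ("P1", "B", "2")
    ```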

  17. Amplicon-based metagenomics identified candidate organisms in soils that caused yield decline in strawberry.

    Science.gov (United States)

    Xu, Xiangming; Passey, Thomas; Wei, Feng; Saville, Robert; Harrison, Richard J

    2015-01-01

    A phenomenon of yield decline due to weak plant growth in strawberry was recently observed in non-chemo-fumigated soils, which was not associated with the soil fungal pathogen Verticillium dahliae, the main target of fumigation. Amplicon-based metagenomics was used to profile soil microbiota in order to identify microbial organisms that may have caused the yield decline. A total of 36 soil samples were obtained in 2013 and 2014 from four sites for metagenomic studies; two of the four sites had a yield-decline problem, the other two did not. More than 2000 fungal or bacterial operational taxonomic units (OTUs) were found in these samples. Relative abundance of individual OTUs was statistically compared for differences between samples from sites with or without yield decline. A total of 721 individual comparisons were statistically significant, involving 366 unique bacterial and 44 unique fungal OTUs. Based on further selection criteria, we focused on 34 bacterial and 17 fungal OTUs and found that yield decline probably resulted from one or more of the following four factors: (1) low abundance of Bacillus and Pseudomonas populations, which are well known for their ability to suppress pathogen development and/or promote plant growth; (2) lack of the nematophagous fungus (Paecilomyces species); (3) a high level of two non-specific fungal root rot pathogens; and (4) wet soil conditions. This study demonstrated the usefulness of an amplicon-based metagenomics approach to profile soil microbiota and to detect differential abundance in microbes. PMID:26504572

  18. AMPLISAS: a web server for multilocus genotyping using next-generation amplicon sequencing data.

    Science.gov (United States)

    Sebastian, Alvaro; Herdegen, Magdalena; Migalska, Magdalena; Radwan, Jacek

    2016-03-01

    Next-generation sequencing (NGS) technologies are revolutionizing the fields of biology and medicine as powerful tools for amplicon sequencing (AS). Using combinations of primers and barcodes, it is possible to sequence targeted genomic regions with deep coverage for hundreds, even thousands, of individuals in a single experiment. This is extremely valuable for the genotyping of gene families in which locus-specific primers are often difficult to design, such as the major histocompatibility complex (MHC). The utility of AS is, however, limited by the high intrinsic sequencing error rates of NGS technologies and other sources of error, such as polymerase amplification errors or chimera formation. Correcting these errors requires extensive bioinformatic post-processing of NGS data. Amplicon Sequence Assignment (AMPLISAS) is a tool that performs analysis of AS results in a simple and efficient way, while offering customization options for advanced users. AMPLISAS is designed as a three-step pipeline consisting of (i) read demultiplexing, (ii) unique sequence clustering and (iii) erroneous sequence filtering. Allele sequences and frequencies are retrieved in Excel spreadsheet format, making them easy to interpret. AMPLISAS performance has been successfully benchmarked against previously published MHC genotyping data sets obtained with various NGS technologies. PMID:26257385
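    Steps (ii) and (iii) of the pipeline (collapsing reads into unique sequences, then discarding low-frequency, presumably erroneous variants) can be sketched as follows; the 5% frequency threshold is an illustrative assumption, not AMPLISAS's actual default:

    ```python
    from collections import Counter

    def cluster_and_filter(reads, min_freq=0.05):
        """Collapse reads into unique sequences, then discard variants whose
        per-amplicon frequency falls below min_freq: a crude stand-in for the
        error-filtering step described above (threshold is illustrative)."""
        counts = Counter(reads)
        total = sum(counts.values())
        return {seq: n / total for seq, n in counts.most_common()
                if n / total >= min_freq}

    reads = ["ACGT"] * 90 + ["ACGA"] * 8 + ["TTTT"] * 2
    alleles = cluster_and_filter(reads)
    # "TTTT" (2% of reads) is dropped as a putative error; the 8% variant stays.
    ```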

  19. Analysis of the microbiome: Advantages of whole genome shotgun versus 16S amplicon sequencing.

    Science.gov (United States)

    Ranjan, Ravi; Rani, Asha; Metwally, Ahmed; McGee, Halvor S; Perkins, David L

    2016-01-22

    The human microbiome has emerged as a major player in regulating human health and disease. Translational studies of the microbiome have the potential to indicate clinical applications such as fecal transplants and probiotics. However, one major issue is accurate identification of the microbes constituting the microbiota. Studies of the microbiome have frequently utilized sequencing of the conserved 16S ribosomal RNA (rRNA) gene. We present a comparative study of an alternative approach using whole genome shotgun sequencing (WGS). In the present study, we analyzed the human fecal microbiome, compiling a total of 194.1 × 10^6 reads from a single sample using multiple sequencing methods and platforms. Specifically, after establishing the reproducibility of our methods with extensive multiplexing, we compared: 1) the 16S rRNA amplicon versus the WGS method, 2) the Illumina HiSeq versus MiSeq platforms, 3) the analysis of reads versus de novo assembled contigs, and 4) the effect of shorter versus longer reads. Our study demonstrates that whole genome shotgun sequencing has multiple advantages compared with the 16S amplicon method, including enhanced detection of bacterial species, increased detection of diversity and increased prediction of genes. In addition, increased length, whether due to longer reads or the assembly of contigs, improved the accuracy of species detection. PMID:26718401

  20. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    Science.gov (United States)

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories. PMID:24227591

  1. Parallel tagged amplicon sequencing of relatively long PCR products using the Illumina HiSeq platform and transcriptome assembly.

    Science.gov (United States)

    Feng, Yan-Jie; Liu, Qing-Feng; Chen, Meng-Yun; Liang, Dan; Zhang, Peng

    2016-01-01

    In phylogenetics and population genetics, a large number of loci are often needed to accurately resolve species relationships. Normally, loci are enriched by PCR and sequenced by Sanger sequencing, which is expensive when the number of amplicons is large. Next-generation sequencing (NGS) techniques are increasingly used for parallel amplicon sequencing, which reduces sequencing costs tremendously, but has not reduced preparation costs very much. Moreover, for most current NGS methods, amplicons need to be purified and quantified before sequencing, and their lengths are also restricted (normally …) … HiSeq paired-end 90-bp data. Overall, we validate a rapid, cost-effective and scalable approach to sequence a large number of targeted loci from a large number of samples that is particularly suitable for both phylogenetics and population genetics studies that require a modest scale of data. PMID:25959587

  2. Development of microsatellite and amplicon length polymorphism markers for Camellia japonica L. from tea plant (Camellia sinensis) expressed sequence tags.

    Science.gov (United States)

    Ueno, Saneyoshi; Tsumura, Yoshihiko

    2009-05-01

    Simple sequence repeats and amplicon length polymorphism markers for Camellia japonica were developed, based on Camellia sinensis sequences in the National Center for Biotechnology Information database. In total, 2495 gene sequences were used to design 216 primer pairs. To identify amplicon length polymorphism markers, 61 gene loci in 16 Camellia individuals were re-sequenced. In total, 10 markers (three expressed sequence tags-simple sequence repeats and seven amplicon length polymorphisms) yielded polymerase chain reaction products with clear polymorphic patterns and were used for genotyping 22 C. japonica individuals from a population. Numbers of alleles and expected heterozygosity ranged from two to 13 and from 0.28 to 0.90, respectively. PMID:21564753
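    The expected heterozygosity quoted above is conventionally computed from allele frequencies as He = 1 - sum(p_i^2); a minimal sketch using the simple, uncorrected estimator:

    ```python
    def expected_heterozygosity(allele_counts):
        """He = 1 - sum(p_i^2) from observed allele counts at one locus
        (the simple estimator, without small-sample correction)."""
        n = sum(allele_counts)
        return 1.0 - sum((c / n) ** 2 for c in allele_counts)

    # Two equally common alleles give He = 0.5; many, evenly spread alleles
    # push He toward the 0.90 end of the range reported in the abstract.
    assert abs(expected_heterozygosity([10, 10]) - 0.5) < 1e-9
    ```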

  3. Single-Step Conversion of Cells to Retrovirus Vector Producers with Herpes Simplex Virus–Epstein-Barr Virus Hybrid Amplicons

    OpenAIRE

    Sena-Esteves, Miguel; Saeki, Yoshinaga; Camp, Sara M.; Chiocca, E. Antonio; Breakefield, Xandra O.

    1999-01-01

    We report here on the development and characterization of a novel herpes simplex virus type 1 (HSV-1) amplicon-based vector system which takes advantage of the host range and retention properties of HSV–Epstein-Barr virus (EBV) hybrid amplicons to efficiently convert cells to retrovirus vector producer cells after single-step transduction. The retrovirus genes gag-pol and env (GPE) and retroviral vector sequences were modified to minimize sequence overlap and cloned into an HSV-EBV hybrid amp...

  4. Rare amplicons implicate frequent deregulation of cell fate specification pathways in oral squamous cell carcinoma.

    Science.gov (United States)

    Snijders, Antoine M; Schmidt, Brian L; Fridlyand, Jane; Dekker, Nusi; Pinkel, Daniel; Jordan, Richard C K; Albertson, Donna G

    2005-06-16

    Genomes of solid tumors are characterized by gains and losses of regions, which may contribute to tumorigenesis by altering gene expression. Often the aberrations are extensive, encompassing whole chromosome arms, which makes identification of candidate genes in these regions difficult. Here, we focused on narrow regions of gene amplification to facilitate identification of genetic pathways important in oral squamous cell carcinoma (SCC) development. We used array comparative genomic hybridization (array CGH) to define minimum common amplified regions and then used expression analysis to identify candidate driver genes in these amplicons. We found genes including LAMA3 and MMP7, as well as members of the hedgehog (GLI2) and notch (JAG1, RBPSUH, FJX1) pathways, to be amplified and overexpressed. Deregulation of these and other members of the hedgehog and notch pathways (HHIP, SMO, DLL1, NOTCH4) implicates deregulation of developmental and differentiation pathways, cell fate misspecification, in oral SCC development. PMID:15824737

  5. Amplicon restriction patterns associated with nitrogenase activity of root nodules for selection of superior Myrica seedlings

    Indian Academy of Sciences (India)

    Mhathung Yanthan; Arvind K Misra

    2013-11-01

    Trees of Myrica sp. grow abundantly in the forests of Meghalaya, India. These trees are actinorhizal and harbour nitrogen-fixing Frankia in their root nodules and contribute positively towards the enhancement of nitrogen status of forest areas. They can be used in rejuvenation of mine spoils and nitrogen-depleted fallow lands generated due to slash and burn agriculture practiced in the area. We have studied the association of amplicon restriction patterns (ARPs) of Myrica ribosomal RNA gene and internal transcribed spacer (ITS) region and nitrogenase activity of its root nodules. We found that ARPs thus obtained could be used as markers for early screening of seedlings that could support strains of Frankia that fix atmospheric nitrogen more efficiently.

  6. Nitrogenase gene amplicons from global marine surface waters are dominated by genes of non-cyanobacteria

    DEFF Research Database (Denmark)

    Farnelid, Hanna; Andersson, Anders F.; Bertilsson, Stefan;

    2011-01-01

    Cyanobacteria are thought to be the main N2-fixing organisms (diazotrophs) in marine pelagic waters, but recent molecular analyses indicate that non-cyanobacterial diazotrophs are also present and active. Existing data are, however, restricted geographically and by limited sequencing depths. Our … of the nifH gene pool in marine waters. Great divergence in nifH composition was observed between sites. Cyanobacteria-like genes were most frequent among amplicons from the warmest waters, but overall the data set was dominated by nifH sequences most closely related to non-cyanobacteria. Clusters related … by unicellular cyanobacteria, 42% of the identified non-cyanobacterial nifH clusters from the corresponding DNA samples were also detected in cDNA. The study indicates that non-cyanobacteria account for a substantial part of the nifH gene pool in marine surface waters and that these genes are at least …

  7. 16S rRNA amplicon sequencing dataset for conventionalized and conventionally raised zebrafish larvae.

    Science.gov (United States)

    Davis, Daniel J; Bryda, Elizabeth C; Gillespie, Catherine H; Ericsson, Aaron C

    2016-09-01

    Data presented here contains metagenomic analysis regarding the sequential conventionalization of germ-free zebrafish embryos. Zebrafish embryos that underwent a germ-free sterilization process immediately after fertilization were promptly exposed to and raised to larval stage in conventional fish water. At 6 days postfertilization (dpf), these "conventionalized" larvae were compared to zebrafish larvae that were raised in conventional fish water never undergoing the initial sterilization process. Bacterial 16S rRNA amplicon sequencing was performed on DNA isolated from homogenates of the larvae revealing distinct microbiota variations between the two groups. The dataset described here is also related to the research article entitled "Microbial modulation of behavior and stress responses in zebrafish larvae" (Davis et al., 2016) [1]. PMID:27508247

  8. Strong selective sweeps associated with ampliconic regions in great ape X chromosomes

    DEFF Research Database (Denmark)

    Nam, Kiwoong; Munch, Kasper; Hobolth, Asger;

    2014-01-01

    The unique inheritance pattern of X chromosomes makes them preferential targets of adaptive evolution. We here investigate natural selection on the X chromosome in all species of great apes. We find that diversity is more strongly reduced around genes on the X compared with autosomes, and that a higher proportion of substitutions results from positive selection. Strikingly, the X exhibits several megabase-long regions where diversity is reduced more than five-fold. These regions overlap significantly among species, and have a higher singleton proportion, population differentiation, and … ampliconic sequences; we propose that intra-genomic conflict between the X and the Y chromosomes is a major driver of X chromosome evolution.

  9. Amplicon restriction patterns associated with nitrogenase activity of root nodules for selection of superior Myrica seedlings.

    Science.gov (United States)

    Yanthan, Mhathung; Misra, Arvind K

    2013-11-01

    Trees of Myrica sp. grow abundantly in the forests of Meghalaya, India. These trees are actinorhizal and harbour nitrogen-fixing Frankia in their root nodules and contribute positively towards the enhancement of nitrogen status of forest areas. They can be used in rejuvenation of mine spoils and nitrogen-depleted fallow lands generated due to slash and burn agriculture practiced in the area. We have studied the association of amplicon restriction patterns (ARPs) of Myrica ribosomal RNA gene and internal transcribed spacer (ITS) region and nitrogenase activity of its root nodules. We found that ARPs thus obtained could be used as markers for early screening of seedlings that could support strains of Frankia that fix atmospheric nitrogen more efficiently. PMID:24287658

  10. Genotyping common FSHR polymorphisms based on competitive amplification of differentially melting amplicons (CADMA)

    DEFF Research Database (Denmark)

    Borgbo, Tanni; Sommer Kristensen, Lasse; Lindgren, Ida;

    2014-01-01

    … analytical sensitivity. Here, we present a novel version of "competitive amplification of differentially melting amplicons" (CADMA), providing an improved platform for simple, reliable, and cost-effective genotyping. METHODS: Two CADMA-based assays were designed for the two common polymorphisms of the FSHR gene: rs6165 (c.919A > G, p.Thr307Ala, FSHR 307) and rs6166 (c.2039A > G, p.Asn680Ser, FSHR 680). To evaluate the reliability of the new CADMA-based assays, the genotyping results were compared with two conventional PCR-based genotyping methods: allele-specific PCR (AS-PCR) and Sanger sequencing. RESULTS: Compared to AS-PCR and Sanger sequencing, the CADMA-based assays showed improved analytical sensitivity and wider applicability. CONCLUSIONS: The new assays provide a reliable, fast and user-friendly genotyping method facilitating a wider implication …

  11. Function of the EGR-1/TIS8 radiation inducible promoter in a minimal HSV-1 amplicon system

    International Nuclear Information System (INIS)

    Purpose: To evaluate function of the EGR-1/TIS8 promoter region in a minimal HSV-1 amplicon system in order to determine the feasibility of using the system to regulate vector replication with radiation. Materials and Methods: A 600-base pair 5' upstream region of the EGR-1 promoter linked to chloramphenicol acetyltransferase (CAT) was recombined into a minimal HSV-1 amplicon vector system (pONEC). pONEC or a control plasmid was transfected into U87 glioma cells using the Lipofectamine method. Thirty-six hours later one aliquot of cells from each transfection was irradiated to a dose of 20 Gy and another identical aliquot served as a control. CAT activity was assayed 8 hours after irradiation. Results: pONEC-transfected cells irradiated with 20 Gy demonstrated a 2.0-fold increase in CAT activity compared to non-irradiated cells. Cells transfected with the control plasmid showed no change in CAT activity. Unirradiated pONEC cells had CAT activity 1.3 times that of cells transfected with the control plasmid. Conclusion: We have previously created HSV-1 gene therapy amplicon vector systems which allow virus-amplicon interdependent replication, with the intent of regulating replication. These data demonstrate that a minimal amplicon system will allow radiation-dependent regulation by the EGR-1 promoter, thus indicating the possibility of using this system to achieve onsite, spatially and temporally regulated vector production. Baseline CAT activity was higher and relative induction lower than in other reported expression constructs, which raises concern about the ability of the system to produce a differential in transcription levels sufficient for this purpose. This is possibly the result of residual promoter/enhancer elements remaining in the HSV-1 sequences. We are attempting to create constructs lacking these elements. Addition of secondary promoter sequences may also be of use. We are also currently evaluating the efficacy of the putative IEX-1 radiation-inducible promoter region in …

  12. Extracellular DNA amplicon sequencing reveals high levels of benthic eukaryotic diversity in the central Red Sea

    KAUST Repository

    Pearman, John K.

    2015-11-01

    The present study aims to characterize the benthic eukaryotic biodiversity patterns at a coarse taxonomic level in three areas of the central Red Sea (a lagoon, an offshore area in Thuwal and a shallow coastal area near Jeddah) based on extracellular DNA. High-throughput amplicon sequencing targeting the V9 region of the 18S rRNA gene was undertaken for 32 sediment samples. High levels of alpha-diversity were detected with 16,089 operational taxonomic units (OTUs) being identified. The majority of the OTUs were assigned to Metazoa (29.2%), Alveolata (22.4%) and Stramenopiles (17.8%). Stramenopiles (Diatomea) and Alveolata (Ciliophora) were frequent in a lagoon and in shallower coastal stations, whereas metazoans (Arthropoda: Maxillopoda) were dominant in deeper offshore stations. Only 24.6% of total OTUs were shared among all areas. Beta-diversity was generally lower between the lagoon and Jeddah (nearshore) than between either of those and the offshore area, suggesting a nearshore–offshore biodiversity gradient. The current approach allowed for a broad-range of benthic eukaryotic biodiversity to be analysed with significantly less labour than would be required by other traditional taxonomic approaches. Our findings suggest that next generation sequencing techniques have the potential to provide a fast and standardised screening of benthic biodiversity at large spatial and temporal scales.
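    The 'shared OTUs' figure reported above is the fraction of all observed OTUs that occur in every area; a small sketch with made-up OTU sets:

    ```python
    def shared_otu_fraction(area_otus):
        """Fraction of all observed OTUs present in every area,
        as in the 'shared among all areas' figure above."""
        observed = set().union(*area_otus.values())
        shared = set.intersection(*area_otus.values())
        return len(shared) / len(observed)

    areas = {  # hypothetical OTU sets, not the study's data
        "lagoon":    {"otu1", "otu2", "otu3"},
        "nearshore": {"otu1", "otu2"},
        "offshore":  {"otu1", "otu4"},
    }
    assert shared_otu_fraction(areas) == 0.25  # only otu1 of 4 OTUs is shared
    ```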

  13. Flow cytometry community fingerprinting and amplicon sequencing for the assessment of landfill leachate cellulolytic bioaugmentation.

    Science.gov (United States)

    Kinet, R; Dzaomuho, P; Baert, J; Taminiau, B; Daube, G; Nezer, C; Brostaux, Y; Nguyen, F; Dumont, G; Thonart, P; Delvigne, F

    2016-08-01

    Flow cytometry (FCM) is a high-throughput single-cell technology that is becoming widely used for studying phenotypic and genotypic diversity among microbial communities. In this work, it is applied to assess a bioaugmentation treatment intended to enhance the cellulolytic potential of landfill leachate. The experimental results reveal a relevant increase in leachate cellulolytic potential due to bioaugmentation. Cytometric monitoring of microbial dynamics along these assays was then carried out. The flowFP package was used to establish microbial sample fingerprints from initial 2D cytometry histograms. This procedure highlights variation in microbial communities along the assays. Cytometric and 16S rRNA gene sequencing fingerprinting methods were then compared. The two approaches give the same evidence about microbial dynamics throughout the digestion assay. There is, however, a lack of significant correlation between cytometric and amplicon sequencing fingerprints at the genus or species level: the same phenotypic profiles of the microbiota during the assays matched several different 16S rRNA gene sequencing profiles. Flow cytometry fingerprinting can thus be considered a promising routine on-site method suitable for detecting stability, variation, and disturbance of complex microbial communities involved in bioprocesses. PMID:27160955

  14. Digital fragment analysis of short tandem repeats by high-throughput amplicon sequencing.

    Science.gov (United States)

    Darby, Brian J; Erickson, Shay F; Hervey, Samuel D; Ellis-Felege, Susan N

    2016-07-01

    High-throughput sequencing has been proposed as a method to genotype microsatellites and overcome the four main technical drawbacks of capillary electrophoresis: amplification artifacts, imprecise sizing, length homoplasy, and limited multiplex capability. The objective of this project was to test a high-throughput amplicon sequencing approach to fragment analysis of short tandem repeats and characterize its advantages and disadvantages against traditional capillary electrophoresis. We amplified and sequenced 12 muskrat microsatellite loci from 180 muskrat specimens and analyzed the sequencing data for precision of allele calling, propensity for amplification or sequencing artifacts, and evidence of length homoplasy. Of the 294 total alleles we detected by sequencing, only 164 would have been detected by capillary electrophoresis, as the remaining 130 alleles (44%) would have been hidden by length homoplasy. The ability to detect a greater number of unique alleles resulted in the ability to resolve greater population genetic structure. The primary advantages of fragment analysis by sequencing are the ability to precisely size fragments, resolve length homoplasy, multiplex many individuals and many loci into a single high-throughput run, and compare data across projects and across laboratories (present and future) with minimal technical calibration. A significant disadvantage of fragment analysis by sequencing is that the method is only practical and cost-effective when performed on batches of several hundred samples with multiple loci. Future work is needed to optimize throughput while minimizing costs, and to update existing microsatellite allele-calling and analysis programs to accommodate sequence-aware microsatellite data. PMID:27386092
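The length-homoplasy argument above can be illustrated with a toy example: group reads by exact sequence (what amplicon sequencing resolves) versus by fragment length alone (what capillary electrophoresis reports). The reads and allele structure below are invented for illustration, not taken from the muskrat data:

```python
from collections import Counter

# Hypothetical reads spanning one microsatellite locus: two alleles of
# identical length but different internal sequence (length homoplasy),
# plus one clearly longer allele.
reads = (
    ["ACACACAC" + "GT"] * 40 +    # allele A: 10 bp
    ["ACACACGC" + "GT"] * 35 +    # allele B: 10 bp, same length, different sequence
    ["ACACACACAC" + "GT"] * 25    # allele C: 12 bp
)

by_sequence = Counter(reads)                 # what amplicon sequencing resolves
by_length = Counter(len(r) for r in reads)   # what capillary electrophoresis reports

assert len(by_sequence) == 3  # three true alleles
assert len(by_length) == 2    # alleles A and B are collapsed by length homoplasy
```

Scaled up to real data, this is exactly why 130 of the 294 sequenced alleles would have been invisible to size-based genotyping.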

  15. Shedding light on the microbial community of the macropod foregut using 454-amplicon pyrosequencing.

    Directory of Open Access Journals (Sweden)

    Lisa-Maree Gulino

    Full Text Available Twenty macropods from five locations in Queensland, Australia, grazing on a variety of native pastures were surveyed, and the bacterial community of the foregut was examined using 454-amplicon pyrosequencing. Specifically, the V3/V4 region of the 16S rRNA gene was examined. A total of 5040 OTUs were identified in the data set (post filtering). Thirty-two OTUs were identified as 'shared' OTUs (i.e. present in all samples), belonging to either Firmicutes or Bacteroidetes (Clostridiales/Bacteroidales). These phyla predominated in the general microbial community in all macropods. Genera represented within the shared OTUs included: unclassified Ruminococcaceae, unclassified Lachnospiraceae, unclassified Clostridiales, Peptococcus sp., Coprococcus spp., Streptococcus spp., Blautia sp., Ruminococcus sp., Eubacterium sp., Dorea sp., Oscillospira sp. and Butyrivibrio sp. The composition of the bacterial community of the foregut samples of each of the host species (Macropus rufus, Macropus giganteus and Macropus robustus) was significantly different, allowing differentiation between the host species based on alpha and beta diversity measures. Specifically, eleven dominant OTUs that separated the three host species were identified and classified as: unclassified Ruminococcaceae, unclassified Bacteroidales, Prevotella spp. and Syntrophococcus sucromutans. Putative reductive acetogens and fibrolytic bacteria were also identified in samples. Future work will investigate the presence and role of fibrolytics and acetogens in these ecosystems. Ideally, the isolation and characterization of these organisms will be used for enhanced feed efficiency in cattle, methane mitigation and potentially for other industries such as the biofuel industry.

  16. Amplicon pyrosequencing reveals the soil microbial diversity associated with invasive Japanese barberry (Berberis thunbergii DC.).

    Science.gov (United States)

    Coats, V C; Pelletreau, K N; Rumpho, M E

    2014-03-01

    The soil microbial community acts as a reservoir of microbes that directly influences the structure and composition of the aboveground plant community, promotes plant growth, increases stress tolerance and mediates local patterns of nutrient cycling. Direct interactions between plants and rhizosphere-dwelling microorganisms occur at, or near, the surface of the root. Upon introduction and establishment, invasive plants modify the soil microbial communities and soil biochemistry, affecting bioremediation efforts and future plant communities. Here, we used tag-encoded FLX amplicon 454 pyrosequencing (TEFAP) to characterize the bacterial and fungal community diversity in the rhizosphere of Berberis thunbergii DC. (Japanese barberry) from invasive stands in coastal Maine, to investigate the effects of soil type, soil chemistry and surrounding plant cover on the soil microbial community structure. Acidobacteria, Actinobacteria, Proteobacteria and Verrucomicrobia were the dominant bacterial phyla, whereas the fungal communities were composed mostly of members of the phyla Ascomycota and Basidiomycota, including Agaricomycetes and Sordariomycetes. Bulk soil chemistry had a greater effect on the bacterial community structure than on the fungal community. An effect of geographic location was apparent in the rhizosphere microbial communities, yet it was less significant than the effect of surrounding plant cover. These data demonstrate a high degree of spatial variation in the rhizosphere microbial communities of Japanese barberry, with apparent effects of soil chemistry, location and canopy cover on the microbial community structure. PMID:24118303

  17. Functional Overexpression of Vomeronasal Receptors Using a Herpes Simplex Virus Type 1 (HSV-1)-Derived Amplicon.

    Science.gov (United States)

    Stein, Benjamin; Alonso, María Teresa; Zufall, Frank; Leinders-Zufall, Trese; Chamero, Pablo

    2016-01-01

    In mice, social behaviors such as mating and aggression are mediated by pheromones and related chemosignals. The vomeronasal organ (VNO) detects olfactory information from other individuals by sensory neurons tuned to respond to specific chemical cues. Receptors expressed by vomeronasal neurons are implicated in selective detection of these cues. Nearly 400 receptor genes have been identified in the mouse VNO, but the tuning properties of individual receptors remain poorly understood, in part due to the lack of a robust heterologous expression system. Here we develop a herpes virus-based amplicon delivery system to overexpress three types of vomeronasal receptor genes and to characterize cell responses to their proposed ligands. Through Ca2+ imaging in native VNO cells we show that virus-induced overexpression of V1rj2, V2r1b or Fpr3 caused a pronounced increase of responsivity to sulfated steroids, MHC-binding peptide or the synthetic hexapeptide W-peptide, respectively. Other related ligands were not recognized by infected individual neurons, indicating a high degree of selectivity by the overexpressed receptor. Removal of G-protein signaling eliminates Ca2+ responses, indicating that the endogenous second messenger system is essential for observing receptor activation. Our results provide a novel expression system for vomeronasal receptors that should be useful for understanding the molecular logic of VNO ligand detection. Functional expression of vomeronasal receptors and their deorphanization provides an essential requirement for deciphering the neural mechanisms controlling behavior. PMID:27195771

  18. Functional Overexpression of Vomeronasal Receptors Using a Herpes Simplex Virus Type 1 (HSV-1)-Derived Amplicon.

    Directory of Open Access Journals (Sweden)

    Benjamin Stein

    Full Text Available In mice, social behaviors such as mating and aggression are mediated by pheromones and related chemosignals. The vomeronasal organ (VNO) detects olfactory information from other individuals by sensory neurons tuned to respond to specific chemical cues. Receptors expressed by vomeronasal neurons are implicated in selective detection of these cues. Nearly 400 receptor genes have been identified in the mouse VNO, but the tuning properties of individual receptors remain poorly understood, in part due to the lack of a robust heterologous expression system. Here we develop a herpes virus-based amplicon delivery system to overexpress three types of vomeronasal receptor genes and to characterize cell responses to their proposed ligands. Through Ca2+ imaging in native VNO cells we show that virus-induced overexpression of V1rj2, V2r1b or Fpr3 caused a pronounced increase of responsivity to sulfated steroids, MHC-binding peptide or the synthetic hexapeptide W-peptide, respectively. Other related ligands were not recognized by infected individual neurons, indicating a high degree of selectivity by the overexpressed receptor. Removal of G-protein signaling eliminates Ca2+ responses, indicating that the endogenous second messenger system is essential for observing receptor activation. Our results provide a novel expression system for vomeronasal receptors that should be useful for understanding the molecular logic of VNO ligand detection. Functional expression of vomeronasal receptors and their deorphanization provides an essential requirement for deciphering the neural mechanisms controlling behavior.

  19. Extracellular DNA amplicon sequencing reveals high levels of benthic eukaryotic diversity in the central Red Sea.

    Science.gov (United States)

    Pearman, John K; Irigoien, Xabier; Carvalho, Susana

    2016-04-01

    The present study aims to characterize the benthic eukaryotic biodiversity patterns at a coarse taxonomic level in three areas of the central Red Sea (a lagoon, an offshore area in Thuwal and a shallow coastal area near Jeddah) based on extracellular DNA. High-throughput amplicon sequencing targeting the V9 region of the 18S rRNA gene was undertaken for 32 sediment samples. High levels of alpha-diversity were detected with 16,089 operational taxonomic units (OTUs) being identified. The majority of the OTUs were assigned to Metazoa (29.2%), Alveolata (22.4%) and Stramenopiles (17.8%). Stramenopiles (Diatomea) and Alveolata (Ciliophora) were frequent in a lagoon and in shallower coastal stations, whereas metazoans (Arthropoda: Maxillopoda) were dominant in deeper offshore stations. Only 24.6% of total OTUs were shared among all areas. Beta-diversity was generally lower between the lagoon and Jeddah (nearshore) than between either of those and the offshore area, suggesting a nearshore-offshore biodiversity gradient. The current approach allowed for a broad-range of benthic eukaryotic biodiversity to be analysed with significantly less labour than would be required by other traditional taxonomic approaches. Our findings suggest that next generation sequencing techniques have the potential to provide a fast and standardised screening of benthic biodiversity at large spatial and temporal scales. PMID:26525270
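The shared-OTU fraction and the nearshore-offshore beta-diversity contrast reported above reduce to simple set and abundance arithmetic on OTU tables. A sketch using Bray-Curtis dissimilarity on invented toy tables (the real study used far larger tables and dedicated community-ecology software):

```python
def shared_fraction(tables):
    """Fraction of all OTUs that occur in every area's OTU table."""
    all_otus = set().union(*tables)
    shared = set.intersection(*(set(t) for t in tables))
    return len(shared) / len(all_otus)

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two OTU count tables (dicts)."""
    otus = set(a) | set(b)
    shared_sum = sum(min(a.get(o, 0), b.get(o, 0)) for o in otus)
    total = sum(a.values()) + sum(b.values())
    return 1.0 - 2.0 * shared_sum / total

# Toy OTU count tables (hypothetical) for the three sampled areas
lagoon   = {"otu1": 50, "otu2": 30, "otu3": 5}
jeddah   = {"otu1": 45, "otu2": 25, "otu4": 10}
offshore = {"otu1": 5,  "otu5": 60, "otu6": 20}

# Nearshore areas resemble each other more than either resembles offshore
assert bray_curtis(lagoon, jeddah) < bray_curtis(lagoon, offshore)
assert bray_curtis(lagoon, jeddah) < bray_curtis(jeddah, offshore)
```

Lower pairwise dissimilarity between the lagoon and Jeddah than between either and the offshore area is exactly the pattern read as a nearshore-offshore gradient.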

  20. Comparative analyses of amplicon migration behavior in differing denaturing gradient gel electrophoresis (DGGE) systems

    Science.gov (United States)

    Thornhill, D. J.; Kemp, D. W.; Sampayo, E. M.; Schmidt, G. W.

    2010-03-01

    Denaturing gradient gel electrophoresis (DGGE) is commonly utilized to identify and quantify microbial diversity, but the conditions required for different electrophoretic systems to yield equivalent results and optimal resolution have not been assessed. Herein, the influence of different DGGE system configuration parameters on microbial diversity estimates was tested using Symbiodinium, a group of marine eukaryotic microbes that are important constituents of coral reef ecosystems. To accomplish this, bacterial clone libraries were constructed and sequenced from cultured isolates of Symbiodinium for the ribosomal DNA internal transcribed spacer 2 (ITS2) region. From these, 15 clones were subjected to PCR with a GC clamped primer set for DGGE analyses. Migration behaviors of the resulting amplicons were analyzed using a range of conditions, including variation in the composition of the denaturing gradient, electrophoresis time, and applied voltage. All tests were conducted in parallel on two commercial DGGE systems, a C.B.S. Scientific DGGE-2001, and the Bio-Rad DCode system. In this context, identical nucleotide fragments exhibited differing migration behaviors depending on the model of apparatus utilized, with fragments denaturing at a lower gradient concentration and applied voltage on the Bio-Rad DCode system than on the C.B.S. Scientific DGGE-2001 system. Although equivalent PCR-DGGE profiles could be achieved with both brands of DGGE system, the composition of the denaturing gradient and application of electrophoresis time × voltage must be appropriately optimized to achieve congruent results across platforms.

  1. Differential amplicons (ΔAmp)—a new molecular method to assess RNA integrity

    Directory of Open Access Journals (Sweden)

    J. Björkman

    2016-01-01

    Full Text Available Integrity of the mRNA in clinical samples has a major impact on the quality of measured expression levels, regardless of whether the measurement technique is next generation sequencing (NGS), quantitative real-time PCR (qPCR) or microarray profiling. If mRNA is highly degraded or damaged, the measured data will be very unreliable and the whole study is likely a waste of time and money. It is therefore common strategy to test the quality of RNA in samples before conducting large and costly studies. Most methods used today to assess the quality of RNA are ignorant of the nature of the RNA and therefore reflect the integrity of ribosomal RNA, which is the dominant species, rather than of mRNAs, microRNAs and long non-coding RNAs, which are usually the species of interest. Here, we present a novel molecular approach to assess the quality of the targeted RNA species by measuring the differential amplification (ΔAmp) of an Endogenous RNase Resistant (ERR) marker relative to a reference gene, optionally combined with the measurement of two amplicons of different lengths. The combination reveals any mRNA degradation caused by ribonucleases as well as physical, chemical or UV damage. ΔAmp has superior sensitivity to common microfluidic electrophoretic methods, senses the integrity of the actual targeted RNA species, and allows for a smoother and more cost-efficient workflow.
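The two-amplicon part of the approach can be mimicked with standard qPCR arithmetic: degradation penalizes a long amplicon more than a short one, so the Cq gap between the two, normalized against an intact control, estimates integrity. This is a hedged sketch of that idea, not the authors' exact protocol; the Cq values and the assumption of ideal amplification efficiency are illustrative:

```python
def integrity_score(cq_short, cq_long, cq_short_ref, cq_long_ref, efficiency=2.0):
    """Relative integrity from qPCR Cq values of a short and a long amplicon
    of the same target, normalized against an intact reference sample.
    1.0 ~ intact; values near 0 ~ heavily degraded. Assumes the given
    amplification efficiency (2.0 = perfect doubling per cycle)."""
    delta_sample = cq_long - cq_short       # amplicon-length penalty in the sample
    delta_ref = cq_long_ref - cq_short_ref  # same penalty in the intact control
    return efficiency ** -(delta_sample - delta_ref)

# Illustrative Cq values: the degraded sample's long amplicon lags far
# behind its short amplicon, unlike the intact control.
intact   = integrity_score(20.0, 20.4, 20.0, 20.4)
degraded = integrity_score(21.0, 24.5, 20.0, 20.4)
assert abs(intact - 1.0) < 1e-9
assert degraded < 0.2
```

Deviations of the efficiency from 2.0, and the ERR-versus-reference-gene comparison the abstract also describes, would slot into the same exponential form.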

  2. Network analysis of the microorganisms in 25 Danish wastewater treatment plants over 7 years using high-throughput amplicon sequencing

    DEFF Research Database (Denmark)

    Albertsen, Mads; Larsen, Poul; Saunders, Aaron Marc;

    the existing knowledge on core genera with high-throughput amplicon sequencing, plant design and process data in order to identify interactions and community shaping factors. We investigated 25 Danish full-scale wastewater treatment plants with nutrient removal over a period of 7 years with two to four samples...... database, which integrates the current knowledge of the core species in wastewater treatment plants (www.midasfieldguide.org). For each sample the wastewater treatment plants provided operational data, and selected samples were subjected to extensive physico-chemical analysis and shear tests in order...... to link sludge and floc properties to the microbial communities. All data were subjected to extensive network analysis and multivariate statistics through R. The 16S amplicon results confirmed the findings of relatively few core groups of organisms shared by all the wastewater treatment plants...

  3. Analysis of the mouse gut microbiome using full-length 16S rRNA amplicon sequencing.

    Science.gov (United States)

    Shin, Jongoh; Lee, Sooin; Go, Min-Jeong; Lee, Sang Yup; Kim, Sun Chang; Lee, Chul-Ho; Cho, Byung-Kwan

    2016-01-01

    Demands for faster and more accurate methods to analyze microbial communities from natural and clinical samples have been increasing in the medical and healthcare industry. Recent advances in next-generation sequencing technologies have facilitated the elucidation of microbial community composition with higher accuracy and greater throughput than was previously achievable; however, the short sequencing reads often limit microbial composition analysis at the species level due to the high similarity of 16S rRNA amplicon sequences. To overcome this limitation, we used the nanopore sequencing platform to sequence full-length 16S rRNA amplicon libraries prepared from the mouse gut microbiota. A comparison of the nanopore and short-read sequencing data showed no significant differences in major taxonomic units (89%), except for one phylotype and three taxonomic units. Moreover, the two data sets were highly similar at all taxonomic resolutions except the species level. At the species level, nanopore sequencing allowed identification of more species than short-read sequencing, facilitating more accurate classification of the bacterial community composition. Therefore, this method of full-length 16S rRNA amplicon sequencing will be useful for rapid, accurate and efficient detection of microbial diversity in various biological and clinical samples. PMID:27411898

  4. Analysis of a Drosophila amplicon in follicle cells highlights the diversity of metazoan replication origins.

    Science.gov (United States)

    Kim, Jane C; Orr-Weaver, Terry L

    2011-10-01

    To investigate the properties of metazoan replication origins, recent studies in cell culture have adopted the strategy of identifying origins using genome-wide approaches and assessing correlations with such features as transcription and histone modifications. Drosophila amplicon in follicle cells (DAFCs), genomic regions that undergo repeated rounds of DNA replication to increase DNA copy number, serve as powerful in vivo model replicons. Because there are six DAFCs, compared with thousands of origins activated in the typical S phase, close molecular characterization of all DAFCs is possible. To determine the extent to which the six DAFCs are different or similar, we investigated the developmental and replication properties of the newly identified DAFC-34B. DAFC-34B contains two genes expressed in follicle cells, although the timing and spatial patterns of expression suggest that amplification is not a strategy to promote high expression at this locus. Like the previously characterized DAFC-62D, DAFC-34B displays origin activation at two separate stages of development. However, unlike DAFC-62D, amplification at the later stage is not transcription-dependent. We mapped the DAFC-34B amplification origin to 1 kb by nascent strand analysis and delineated cis requirements for origin activation, finding that a 6-kb region, but not the 1-kb origin alone, is sufficient for amplification. We analyzed the developmental localization of the origin recognition complex (ORC) and the minichromosome maintenance (MCM)2-7 complex, the replicative helicase. Intriguingly, the final round of origin activation at DAFC-34B occurs in the absence of detectable ORC, although MCMs are present, suggesting a new amplification initiation mechanism. PMID:21933960

  5. Voltammetric detection of sequence-selective DNA hybridization related to Toxoplasma gondii in PCR amplicons.

    Science.gov (United States)

    Gokce, Gultekin; Erdem, Arzum; Ceylan, Cagdas; Akgöz, Muslum

    2016-03-01

    This work describes a single-use electrochemical DNA biosensor technology developed for voltammetric detection of sequence-selective DNA hybridization related to an important human and veterinary pathogen, Toxoplasma gondii. In this label-free electrochemical detection assay, duplex DNA formation was detected by measuring the guanine oxidation signal that arises upon DNA hybridization. The biosensor design consisted of the immobilization of an inosine-modified (guanine-free) probe onto the surface of a pencil graphite electrode (PGE), and detection of duplex formation by differential pulse voltammetry (DPV) through measurement of the guanine signal. The Toxoplasma gondii capture probe was first immobilized onto the surface of the activated PGE by wet adsorption. The extent of hybridization at the PGE surface between the probe and the target was then determined by measuring the guanine signal observed at +1.0 V. Optimum DNA hybridization was monitored electrochemically at a target concentration of 40 µg/mL with a hybridization time of 50 min. The specificity of the electrochemical biosensor was then tested using non-complementary or mismatched short DNA sequences. Under the optimum conditions, the guanine oxidation signal indicating full hybridization was measured at target concentrations from 0.5 to 25 µg/mL, and the detection limit was found to be 1.78 µg/mL. This single-use biosensor platform was successfully applied to the voltammetric detection of DNA hybridization related to Toxoplasma gondii in PCR amplicons. PMID:26717837
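A detection limit of this kind is commonly estimated from a calibration curve as LOD = 3.3·σ/slope, where σ is the standard deviation of blank responses. A sketch of that calculation with synthetic signal values (the currents and the blank σ below are assumptions, not the paper's data):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Synthetic calibration: guanine oxidation peak current vs. target concentration
conc = [0.5, 5.0, 10.0, 15.0, 20.0, 25.0]      # target concentration, ug/mL
signal = [0.11, 1.02, 2.05, 2.97, 4.01, 5.02]  # peak current, uA (invented)

slope, intercept = linear_fit(conc, signal)
sigma_blank = 0.12                  # sd of blank responses (assumed)
lod = 3.3 * sigma_blank / slope     # common 3.3*sigma/slope estimate
assert 1.0 < lod < 3.0              # same order as the reported 1.78 ug/mL
```

With a roughly linear response over the 0.5-25 µg/mL range, the estimate lands in the low-µg/mL region, consistent with the reported figure.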

  6. Amplicon-based semiconductor sequencing of human exomes: performance evaluation and optimization strategies.

    Science.gov (United States)

    Damiati, E; Borsani, G; Giacopuzzi, Edoardo

    2016-05-01

    The Ion Proton platform allows whole exome sequencing (WES) to be performed at low cost, providing rapid turnaround time and great flexibility. Products for WES on the Ion Proton system include the AmpliSeq Exome kit and the recently introduced HiQ sequencing chemistry. Here, we used gold standard variants from the GIAB consortium to assess performance in variant identification, characterize the erroneous calls and develop a filtering strategy to reduce false positives. The AmpliSeq Exome kit captures a large fraction of bases (>94 %) in human CDS, ClinVar genes and ACMG genes, but with 2,041 (7 %), 449 (13 %) and 11 (19 %) genes not fully represented, respectively. Overall, 515 protein-coding genes contain hard-to-sequence regions, including 90 genes from ClinVar. Performance in variant detection was maximal at mean coverage >120×, while at 90× and 70× we measured a loss of variants of 3.2 and 4.5 %, respectively. WES using HiQ chemistry showed ~71/97.5 % sensitivity, ~37/2 % FDR and ~0.66/0.98 F1 score for indels and SNPs, respectively. The proposed low-, medium- and high-stringency filters reduced the amount of false positives by 10.2, 21.2 and 40.4 % for indels and 21.2, 41.9 and 68.2 % for SNPs, respectively. Amplicon-based WES on the Ion Proton platform using HiQ chemistry emerged as a competitive approach, with improved accuracy in variant identification. False-positive variants remain an issue for the Ion Torrent technology, but our filtering strategy can be applied to reduce erroneous variants. PMID:27003585
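A tiered filtering strategy of this kind can be sketched as increasingly strict thresholds applied to per-variant annotations. The thresholds and annotation names below are invented for illustration; the paper's actual filters combine platform-specific quality metrics:

```python
# Hedged sketch of low/medium/high-stringency false-positive filters.
STRINGENCY = {
    "low":    {"min_depth": 20, "min_allele_fraction": 0.10, "min_qual": 20},
    "medium": {"min_depth": 40, "min_allele_fraction": 0.20, "min_qual": 40},
    "high":   {"min_depth": 60, "min_allele_fraction": 0.30, "min_qual": 60},
}

def keep(variant, level):
    """A call survives if it clears every threshold at the chosen level."""
    t = STRINGENCY[level]
    return (variant["depth"] >= t["min_depth"]
            and variant["allele_fraction"] >= t["min_allele_fraction"]
            and variant["qual"] >= t["min_qual"])

calls = [
    {"depth": 150, "allele_fraction": 0.48, "qual": 90},  # convincing heterozygous SNP
    {"depth": 25,  "allele_fraction": 0.12, "qual": 25},  # borderline call
    {"depth": 10,  "allele_fraction": 0.05, "qual": 10},  # likely artifact
]

assert [keep(v, "low") for v in calls] == [True, True, False]
assert [keep(v, "high") for v in calls] == [True, False, False]
```

Raising stringency trades sensitivity for specificity, which is why the paper reports separate false-positive reductions for each tier.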

  7. Competitive amplification of differentially melting amplicons (CADMA) improves KRAS hotspot mutation testing in colorectal cancer

    Directory of Open Access Journals (Sweden)

    Kristensen Lasse

    2012-11-01

    Full Text Available Background: Cancer is an extremely heterogeneous group of diseases traditionally categorized according to tissue of origin. However, even among patients with the same cancer subtype, the cellular alterations at the molecular level are often very different. Several new therapies targeting specific molecular changes found in individual patients have initiated the era of personalized therapy and significantly improved patient care. In metastatic colorectal cancer (mCRC), a selected group of patients with wild-type KRAS respond to antibodies against the epidermal growth factor receptor (EGFR). Testing for KRAS mutations is now required prior to anti-EGFR treatment; however, less sensitive methods based on conventional PCR regularly fail to detect KRAS mutations in clinical samples. Methods: We have developed sensitive and specific assays for detection of the seven most common KRAS mutations based on a novel methodology named Competitive Amplification of Differentially Melting Amplicons (CADMA). The clinical applicability of these assays was assessed by analyzing 100 colorectal cancer samples for which KRAS mutation status had been evaluated by the commercially available TheraScreen® KRAS mutation kit. Results: The CADMA assays were sensitive to at least 0.5% mutant alleles in a wild-type background when using 50 nanograms of DNA in the reactions. Consensus between CADMA and the TheraScreen kit was observed in 96% of the colorectal cancer samples. In cases where disagreement was observed, the CADMA result could be confirmed by a previously published assay based on TaqMan probes and by fast COLD-PCR followed by Sanger sequencing. Conclusions: The high analytical sensitivity and specificity of CADMA may increase the diagnostic sensitivity and specificity of KRAS mutation testing in mCRC patients.

  8. Prerequisites for Amplicon Pyrosequencing of Microbial Methanol Utilizers in the Environment

    Directory of Open Access Journals (Sweden)

    Steffen Kolb

    2013-09-01

    Full Text Available The commercial availability of next generation sequencing (NGS) technologies facilitated the assessment of functional groups of microorganisms in the environment with high coverage, resolution, and reproducibility. Soil methylotrophs were among the first microorganisms in the environment to be assessed with molecular tools, and nowadays also with NGS technologies. Studies in recent years have drawn renewed attention to the pivotal role of methylotrophs in global conversions of methanol, which mainly originates from plants and is involved in oxidative reactions and ozone formation in the atmosphere. Aerobic methanol utilizers belong to Bacteria, yeasts, Ascomycota, and molds. Numerous bacterial methylotrophs are facultatively aerobic and also contribute to anaerobic methanol oxidation in the environment, whereas strict anaerobic methanol utilizers belong to the methanogens and acetogens. The diversity of enzymes catalyzing the initial oxidation of methanol is considerable, comprising at least five different enzyme types in aerobes and one in strict anaerobes. Only the gene of the large subunit of the PQQ-dependent methanol dehydrogenase (mxaF) has been analyzed by environmental pyrosequencing. To enable a comprehensive assessment of methanol utilizers in the environment, new primers need to be developed targeting the genes of the PQQ MDH in Methylibium (mdh2), of the NAD-dependent MDH (mdh), of the methanol oxidoreductase of Actinobacteria (mdo), of the fungal FAD-dependent alcohol oxidase (mod1, mod2, and homologues), and of the large subunit of the methanol:corrinoid methyltransferases (mtaC) in methanogens and acetogens. Combined stable isotope probing of nucleic acids or proteins with amplicon-based NGS is a straightforward approach to reveal insights into the functions of particular methylotrophic taxa in the global methanol cycle.

  9. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable for self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...

  10. De novo origin of VCY2 from autosome to Y-transposed amplicon.

    Directory of Open Access Journals (Sweden)

    Peng-Rong Cao

    Full Text Available The formation of new genes is a primary driving force of evolution in all organisms. The de novo evolution of new genes from non-protein-coding genomic regions is emerging as an important additional mechanism for novel gene creation. Y chromosomes underlie sex determination in mammals and contain genes that are required for male-specific functions. In this study, a search was undertaken for Y chromosome de novo genes derived from non-protein-coding sequences. The Y chromosome orphan gene variable charge, Y-linked (VCY2), is an autosome-derived gene that has sequence similarity to large autosomal fragments but lacks an autosomal protein-coding homolog. VCY2 locates in the amplicon containing long DNA fragments that were transposed from autosomes to the Y chromosome before the ape-monkey split. We confirmed that VCY2 cannot be encoded by autosomes due to the presence of multiple disablers that disrupt the open reading frame, such as the absence of start or stop codons and the presence of premature stop codons. Similar observations have been made for homologs in the autosomes of the chimpanzee, gorilla, rhesus macaque, baboon and out-group marmoset, which suggests that there was a non-protein-coding ancestral VCY2 that was common to apes and monkeys that predated the transposition event. Furthermore, while protein-coding orthologs are absent, a putative non-protein-coding VCY2 with conserved disablers was identified in the rhesus macaque Y chromosome male-specific region. This finding implies that VCY2 might have not acquired its protein-coding ability before the ape-monkey split. VCY2 encodes a testis-specific expressed protein and is involved in the pathologic process of male infertility, and the acquisition of this gene might improve male fertility. 
This is the first evidence that de novo genes can be generated from transposed autosomal non-protein-coding segments, and this evidence provides novel insights into the evolutionary history of the Y

  11. De novo origin of VCY2 from autosome to Y-transposed amplicon.

    Science.gov (United States)

    Cao, Peng-Rong; Wang, Lei; Jiang, Yu-Chao; Yi, Yin-Sha; Qu, Fang; Liu, Tao-Cheng; Lv, Yuan

    2015-01-01

    The formation of new genes is a primary driving force of evolution in all organisms. The de novo evolution of new genes from non-protein-coding genomic regions is emerging as an important additional mechanism for novel gene creation. Y chromosomes underlie sex determination in mammals and contain genes that are required for male-specific functions. In this study, a search was undertaken for Y chromosome de novo genes derived from non-protein-coding sequences. The Y chromosome orphan gene variable charge, Y-linked (VCY2), is an autosome-derived gene that has sequence similarity to large autosomal fragments but lacks an autosomal protein-coding homolog. VCY2 locates in the amplicon containing long DNA fragments that were transposed from autosomes to the Y chromosome before the ape-monkey split. We confirmed that VCY2 cannot be encoded by autosomes due to the presence of multiple disablers that disrupt the open reading frame, such as the absence of start or stop codons and the presence of premature stop codons. Similar observations have been made for homologs in the autosomes of the chimpanzee, gorilla, rhesus macaque, baboon and out-group marmoset, which suggests that there was a non-protein-coding ancestral VCY2 that was common to apes and monkeys that predated the transposition event. Furthermore, while protein-coding orthologs are absent, a putative non-protein-coding VCY2 with conserved disablers was identified in the rhesus macaque Y chromosome male-specific region. This finding implies that VCY2 might have not acquired its protein-coding ability before the ape-monkey split. VCY2 encodes a testis-specific expressed protein and is involved in the pathologic process of male infertility, and the acquisition of this gene might improve male fertility. This is the first evidence that de novo genes can be generated from transposed autosomal non-protein-coding segments, and this evidence provides novel insights into the evolutionary history of the Y chromosome. PMID

  12. Exploring the Gastrointestinal "Nemabiome": Deep Amplicon Sequencing to Quantify the Species Composition of Parasitic Nematode Communities.

    Science.gov (United States)

    Avramenko, Russell W; Redman, Elizabeth M; Lewis, Roy; Yazwinski, Thomas A; Wasmuth, James D; Gilleard, John S

    2015-01-01

    Parasitic helminth infections have a considerable impact on global human health as well as animal welfare and production. Although co-infection with multiple parasite species within a host is common, there is a dearth of tools with which to study the composition of these complex parasite communities. Helminth species vary in their pathogenicity, epidemiology and drug sensitivity and the interactions that occur between co-infecting species and their hosts are poorly understood. We describe the first application of deep amplicon sequencing to study parasitic nematode communities as well as introduce the concept of the gastro-intestinal "nemabiome". The approach is analogous to 16S rDNA deep sequencing used to explore microbial communities, but utilizes the nematode ITS-2 rDNA locus instead. Gastro-intestinal parasites of cattle were used to develop the concept, as this host has many well-defined gastro-intestinal nematode species that commonly occur as complex co-infections. Further, the availability of pure mono-parasite populations from experimentally infected cattle allowed us to prepare mock parasite communities to determine, and correct for, species representation biases in the sequence data. We demonstrate that, once these biases have been corrected, accurate relative quantitation of gastro-intestinal parasitic nematode communities in cattle fecal samples can be achieved. We have validated the accuracy of the method applied to field-samples by comparing the results of detailed morphological examination of L3 larvae populations with those of the sequencing assay. The results illustrate the insights that can be gained into the species composition of parasite communities, using grazing cattle in the mid-west USA as an example. However, both the technical approach and the concept of the 'nemabiome' have a wide range of potential applications in human and veterinary medicine. 
These include investigations of host-parasite and parasite-parasite interactions during co

  13. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  14. Next-generation sequencing of multiple individuals per barcoded library by deconvolution of sequenced amplicons using endonuclease fragment analysis

    DEFF Research Database (Denmark)

    Andersen, Jeppe D; Pereira, Vania; Pietroni, Carlotta;

    2014-01-01

    The simultaneous sequencing of samples from multiple individuals increases the efficiency of next-generation sequencing (NGS) while also reducing costs. Here we describe a novel and simple approach for sequencing DNA from multiple individuals per barcode. Our strategy relies on the endonuclease...... digestion of PCR amplicons prior to library preparation, creating a specific fragment pattern for each individual that can be resolved after sequencing. By using both barcodes and restriction fragment patterns, we demonstrate the ability to sequence the human melanocortin 1 receptor (MC1R) genes from 72...... individuals using only 24 barcoded libraries....

  15. The cytological manifestation of gene amplification in multidrug-resistant mouse leukemia P388 sublines is correlated with amplicon content

    Energy Technology Data Exchange (ETDEWEB)

    Il`inskaya, G.V.; Kopnin, B.P. [Institute of Carcinogenesis, Moscow (Russian Federation); Demidova, N.S. [Institute of Chemical Physics, Moscow (Russian Federation)

    1995-10-01

    Previously, we showed that development of multidrug resistance (MDR) in mouse P388 leukemia cells is often associated with the appearance of newly-formed chromosomelike structures that contain amplified copies of the mdr1 gene. In the present study, we compared amplicon content in P388 sublines showing different types of these structures. A strong correlation between the formation of specific acentric markers consisting of two identical arms and the absence of the sorcin gene coamplification was found. In all the sublines containing other types of chromosomelike structures, the sorcin gene is coamplified. 9 refs., 2 figs., 1 tab.

  16. A Method for Amplicon Deep Sequencing of Drug Resistance Genes in Plasmodium falciparum Clinical Isolates from India.

    Science.gov (United States)

    Rao, Pavitra N; Uplekar, Swapna; Kayal, Sriti; Mallick, Prashant K; Bandyopadhyay, Nabamita; Kale, Sonal; Singh, Om P; Mohanty, Akshaya; Mohanty, Sanjib; Wassmer, Samuel C; Carlton, Jane M

    2016-06-01

    A major challenge to global malaria control and elimination is early detection and containment of emerging drug resistance. Next-generation sequencing (NGS) methods provide the resolution, scalability, and sensitivity required for high-throughput surveillance of molecular markers of drug resistance. We have developed an amplicon sequencing method on the Ion Torrent PGM platform for targeted resequencing of a panel of six Plasmodium falciparum genes implicated in resistance to first-line antimalarial therapy, including artemisinin combination therapy, chloroquine, and sulfadoxine-pyrimethamine. The protocol was optimized using 12 geographically diverse P. falciparum reference strains and successfully applied to multiplexed sequencing of 16 clinical isolates from India. The sequencing results from the reference strains showed 100% concordance with previously reported drug resistance-associated mutations. Single-nucleotide polymorphisms (SNPs) in clinical isolates revealed a number of known resistance-associated mutations and other nonsynonymous mutations that have not been implicated in drug resistance. SNP positions containing multiple allelic variants were used to identify three clinical samples containing mixed genotypes indicative of multiclonal infections. The amplicon sequencing protocol has been designed for the benchtop Ion Torrent PGM platform and can be operated with minimal bioinformatics infrastructure, making it ideal for use in countries that are endemic for the disease to facilitate routine large-scale surveillance of the emergence of drug resistance and to ensure continued success of the malaria treatment policy. PMID:27008882

  17. A Method for Amplicon Deep Sequencing of Drug Resistance Genes in Plasmodium falciparum Clinical Isolates from India

    Science.gov (United States)

    Rao, Pavitra N.; Uplekar, Swapna; Kayal, Sriti; Mallick, Prashant K.; Bandyopadhyay, Nabamita; Kale, Sonal; Singh, Om P.; Mohanty, Akshaya; Mohanty, Sanjib; Wassmer, Samuel C.

    2016-01-01

    A major challenge to global malaria control and elimination is early detection and containment of emerging drug resistance. Next-generation sequencing (NGS) methods provide the resolution, scalability, and sensitivity required for high-throughput surveillance of molecular markers of drug resistance. We have developed an amplicon sequencing method on the Ion Torrent PGM platform for targeted resequencing of a panel of six Plasmodium falciparum genes implicated in resistance to first-line antimalarial therapy, including artemisinin combination therapy, chloroquine, and sulfadoxine-pyrimethamine. The protocol was optimized using 12 geographically diverse P. falciparum reference strains and successfully applied to multiplexed sequencing of 16 clinical isolates from India. The sequencing results from the reference strains showed 100% concordance with previously reported drug resistance-associated mutations. Single-nucleotide polymorphisms (SNPs) in clinical isolates revealed a number of known resistance-associated mutations and other nonsynonymous mutations that have not been implicated in drug resistance. SNP positions containing multiple allelic variants were used to identify three clinical samples containing mixed genotypes indicative of multiclonal infections. The amplicon sequencing protocol has been designed for the benchtop Ion Torrent PGM platform and can be operated with minimal bioinformatics infrastructure, making it ideal for use in countries that are endemic for the disease to facilitate routine large-scale surveillance of the emergence of drug resistance and to ensure continued success of the malaria treatment policy. PMID:27008882

  18. RiboFR-Seq: a novel approach to linking 16S rRNA amplicon profiles to metagenomes

    Science.gov (United States)

    Zhang, Yanming; Ji, Peifeng; Wang, Jinfeng; Zhao, Fangqing

    2016-01-01

    16S rRNA amplicon analysis and shotgun metagenome sequencing are two main culture-independent strategies to explore the genetic landscape of various microbial communities. Recently, numerous studies have employed these two approaches together, but downstream data analyses were performed separately, which always generated incongruent or conflict signals on both taxonomic and functional classifications. Here we propose a novel approach, RiboFR-Seq (Ribosomal RNA gene flanking region sequencing), for capturing both ribosomal RNA variable regions and their flanking protein-coding genes simultaneously. Through extensive testing on clonal bacterial strain, salivary microbiome and bacterial epibionts of marine kelp, we demonstrated that RiboFR-Seq could detect the vast majority of bacteria not only in well-studied microbiomes but also in novel communities with limited reference genomes. Combined with classical amplicon sequencing and shotgun metagenome sequencing, RiboFR-Seq can link the annotations of 16S rRNA and metagenomic contigs to make a consensus classification. By recognizing almost all 16S rRNA copies, the RiboFR-seq approach can effectively reduce the taxonomic abundance bias resulted from 16S rRNA copy number variation. We believe that RiboFR-Seq, which provides an integrated view of 16S rRNA profiles and metagenomes, will help us better understand diverse microbial communities. PMID:26984526

  19. CLOTU: An online pipeline for processing and clustering of 454 amplicon reads into OTUs followed by taxonomic annotation

    Directory of Open Access Journals (Sweden)

    Shalchian-Tabrizi Kamran

    2011-05-01

    Full Text Available Abstract Background The implementation of high throughput sequencing for exploring biodiversity poses high demands on bioinformatics applications for automated data processing. Here we introduce CLOTU, an online and open access pipeline for processing 454 amplicon reads. CLOTU has been constructed to be highly user-friendly and flexible, since different types of analyses are needed for different datasets. Results In CLOTU, the user can filter out low quality sequences, trim tags, primers, adaptors, perform clustering of sequence reads, and run BLAST against NCBInr or a customized database in a high performance computing environment. The resulting data may be browsed in a user-friendly manner and easily forwarded to downstream analyses. Although CLOTU is specifically designed for analyzing 454 amplicon reads, other types of DNA sequence data can also be processed. A fungal ITS sequence dataset generated by 454 sequencing of environmental samples is used to demonstrate the utility of CLOTU. Conclusions CLOTU is a flexible and easy to use bioinformatics pipeline that includes different options for filtering, trimming, clustering and taxonomic annotation of high throughput sequence reads. Some of these options are not included in comparable pipelines. CLOTU is implemented in a Linux computer cluster and is freely accessible to academic users through the Bioportal web-based bioinformatics service (http://www.bioportal.uio.no.

  20. RiboFR-Seq: a novel approach to linking 16S rRNA amplicon profiles to metagenomes.

    Science.gov (United States)

    Zhang, Yanming; Ji, Peifeng; Wang, Jinfeng; Zhao, Fangqing

    2016-06-01

    16S rRNA amplicon analysis and shotgun metagenome sequencing are two main culture-independent strategies to explore the genetic landscape of various microbial communities. Recently, numerous studies have employed these two approaches together, but downstream data analyses were performed separately, which always generated incongruent or conflict signals on both taxonomic and functional classifications. Here we propose a novel approach, RiboFR-Seq (Ribosomal RNA gene flanking region sequencing), for capturing both ribosomal RNA variable regions and their flanking protein-coding genes simultaneously. Through extensive testing on clonal bacterial strain, salivary microbiome and bacterial epibionts of marine kelp, we demonstrated that RiboFR-Seq could detect the vast majority of bacteria not only in well-studied microbiomes but also in novel communities with limited reference genomes. Combined with classical amplicon sequencing and shotgun metagenome sequencing, RiboFR-Seq can link the annotations of 16S rRNA and metagenomic contigs to make a consensus classification. By recognizing almost all 16S rRNA copies, the RiboFR-seq approach can effectively reduce the taxonomic abundance bias resulted from 16S rRNA copy number variation. We believe that RiboFR-Seq, which provides an integrated view of 16S rRNA profiles and metagenomes, will help us better understand diverse microbial communities. PMID:26984526

  1. A Portable Automatic Endpoint Detection System for Amplicons of Loop Mediated Isothermal Amplification on Microfluidic Compact Disk Platform

    Directory of Open Access Journals (Sweden)

    Shah Mukim Uddin

    2015-03-01

    Full Text Available In recent years, many improvements have been made in foodborne pathogen detection methods to reduce the impact of food contamination. Several rapid methods have been developed with biosensor devices to improve the way of performing pathogen detection. This paper presents an automated endpoint detection system for amplicons generated by loop mediated isothermal amplification (LAMP on a microfluidic compact disk platform. The developed detection system utilizes a monochromatic ultraviolet (UV emitter for excitation of fluorescent labeled LAMP amplicons and a color sensor to detect the emitted florescence from target. Then it processes the sensor output and displays the detection results on liquid crystal display (LCD. The sensitivity test has been performed with detection limit up to 2.5 × 10−3 ng/µL with different DNA concentrations of Salmonella bacteria. This system allows a rapid and automatic endpoint detection which could lead to the development of a point-of-care diagnosis device for foodborne pathogens detection in a resource-limited environment.

  2. The HER2 amplicon includes several genes required for the growth and survival of HER2 positive breast cancer cells — A data description

    Directory of Open Access Journals (Sweden)

    Vesa Hongisto

    2014-12-01

    Full Text Available A large number of breast cancers are characterized by amplification and overexpression of the chromosome segment surrounding the HER2 (ERBB2 oncogene. As the HER2 amplicon at 17q12 contains multiple genes, we have systematically explored the role of the HER2 co-amplified genes in breast cancer cell growth and their relation to trastuzumab resistance. We integrated array comparative genomic hybridization (aCGH data of the HER2 amplicon from 71 HER2 positive breast tumors and 10 cell lines with systematic functional RNA interference analysis of 23 core amplicon genes with several phenotypic endpoints in a panel of trastuzumab responding and non-responding HER2 positive breast cancer cells. In this Data in Brief we give a detailed description of the experimental procedures and the data analysis methods used in the study (1.

  3. Shotgun metagenomes and multiple primer pair-barcode combinations of amplicons reveal biases in metabarcoding analyses of fungi

    Directory of Open Access Journals (Sweden)

    Leho Tedersoo

    2015-05-01

    Full Text Available Rapid development of high-throughput (HTS molecular identification methods has revolutionized our knowledge about taxonomic diversity and ecology of fungi. However, PCR-based methods exhibit multiple technical shortcomings that may bias our understanding of the fungal kingdom. This study was initiated to quantify potential biases in fungal community ecology by comparing the relative performance of amplicon-free shotgun metagenomics and amplicons of nine primer pairs over seven nuclear ribosomal DNA (rDNA regions often used in metabarcoding analyses. The internal transcribed spacer (ITS barcodes ITS1 and ITS2 provided greater taxonomic and functional resolution and richness of operational taxonomic units (OTUs at the 97% similarity threshold compared to barcodes located within the ribosomal small subunit (SSU and large subunit (LSU genes. All barcode-primer pair combinations provided consistent results in ranking taxonomic richness and recovering the importance of floristic variables in driving fungal community composition in soils of Papua New Guinea. The choice of forward primer explained up to 2.0% of the variation in OTU-level analysis of the ITS1 and ITS2 barcode data sets. Across the whole data set, barcode-primer pair combination explained 37.6–38.1% of the variation, which surpassed any environmental signal. Overall, the metagenomics data set recovered a similar taxonomic overview, but resulted in much lower fungal rDNA sequencing depth, inability to infer OTUs, and high uncertainty in identification. We recommend the use of ITS2 or the whole ITS region for metabarcoding and we advocate careful choice of primer pairs in consideration of the relative proportion of fungal DNA and expected dominant groups.

  4. Analysis of 16S rRNA amplicon sequencing options on the Roche/454 next-generation titanium sequencing platform.

    Directory of Open Access Journals (Sweden)

    Hideyuki Tamaki

    Full Text Available BACKGROUND: 16S rRNA gene pyrosequencing approach has revolutionized studies in microbial ecology. While primer selection and short read length can affect the resulting microbial community profile, little is known about the influence of pyrosequencing methods on the sequencing throughput and the outcome of microbial community analyses. The aim of this study is to compare differences in output, ease, and cost among three different amplicon pyrosequencing methods for the Roche/454 Titanium platform METHODOLOGY/PRINCIPAL FINDINGS: The following three pyrosequencing methods for 16S rRNA genes were selected in this study: Method-1 (standard method is the recommended method for bi-directional sequencing using the LIB-A kit; Method-2 is a new option designed in this study for unidirectional sequencing with the LIB-A kit; and Method-3 uses the LIB-L kit for unidirectional sequencing. In our comparison among these three methods using 10 different environmental samples, Method-2 and Method-3 produced 1.5-1.6 times more useable reads than the standard method (Method-1, after quality-based trimming, and did not compromise the outcome of microbial community analyses. Specifically, Method-3 is the most cost-effective unidirectional amplicon sequencing method as it provided the most reads and required the least effort in consumables management. CONCLUSIONS: Our findings clearly demonstrated that alternative pyrosequencing methods for 16S rRNA genes could drastically affect sequencing output (e.g. number of reads before and after trimming but have little effect on the outcomes of microbial community analysis. This finding is important for both researchers and sequencing facilities utilizing 16S rRNA gene pyrosequencing for microbial ecological studies.

  5. Short-read assembly of full-length 16S amplicons reveals bacterial diversity in subsurface sediments.

    Directory of Open Access Journals (Sweden)

    Christopher S Miller

    Full Text Available In microbial ecology, a fundamental question relates to how community diversity and composition change in response to perturbation. Most studies have had limited ability to deeply sample community structure (e.g. Sanger-sequenced 16S rRNA libraries, or have had limited taxonomic resolution (e.g. studies based on 16S rRNA hypervariable region sequencing. Here, we combine the higher taxonomic resolution of near-full-length 16S rRNA gene amplicons with the economics and sensitivity of short-read sequencing to assay the abundance and identity of organisms that represent as little as 0.01% of sediment bacterial communities. We used a new version of EMIRGE optimized for large data size to reconstruct near-full-length 16S rRNA genes from amplicons sheared and sequenced with Illumina technology. The approach allowed us to differentiate the community composition among samples acquired before perturbation, after acetate amendment shifted the predominant metabolism to iron reduction, and once sulfate reduction began. Results were highly reproducible across technical replicates, and identified specific taxa that responded to the perturbation. All samples contain very high alpha diversity and abundant organisms from phyla without cultivated representatives. Surprisingly, at the time points measured, there was no strong loss of evenness, despite the selective pressure of acetate amendment and change in the terminal electron accepting process. However, community membership was altered significantly. The method allows for sensitive, accurate profiling of the "long tail" of low abundance organisms that exist in many microbial communities, and can resolve population dynamics in response to environmental change.

  6. Investigation of Microbial Diversity in Geothermal Hot Springs in Unkeshwar, India, Based on 16S rRNA Amplicon Metagenome Sequencing

    OpenAIRE

    Mehetre, Gajanan T.; Paranjpe, Aditi; Dastager, Syed G.; Dharne, Mahesh S.

    2016-01-01

    Microbial diversity in geothermal waters of the Unkeshwar hot springs in Maharashtra, India, was studied using 16S rRNA amplicon metagenomic sequencing. Taxonomic analysis revealed the presence of Bacteroidetes, Proteobacteria, Cyanobacteria, Actinobacteria, Archeae, and OD1 phyla. Metabolic function prediction analysis indicated a battery of biological information systems indicating rich and novel microbial diversity, with potential biotechnological applications in this niche.

  7. Optimisation of 16S rDNA amplicon sequencing protocols for microbial community profiling of anaerobic digesters

    DEFF Research Database (Denmark)

    Kirkegaard, Rasmus Hansen; McIlroy, Simon Jon; Larsen, Poul;

    A reliable and reproducible method for identification and quantification of the microorganisms involved in biogas production is important for the study and understanding of the microbial communities responsible for the function of anaerobic digester systems. DNA based identification using 16S rRN...

  8. Obtaining representative community profiles of anaerobic digesters through optimisation of 16S rRNA amplicon sequencing protocols

    DEFF Research Database (Denmark)

    Kirkegaard, Rasmus Hansen; McIlroy, Simon Jon; Karst, Søren Michael;

    A reliable and reproducible method for identification and quantification of the microorganisms involved in biogas production is important for the study and understanding of the microbial communities responsible for the function of anaerobic digester systems. DNA based identification using 16S rRN...

  9. Training load quantification in triathlon

    OpenAIRE

    2011-01-01

    There are different Indices of Training Stress of varying complexity, to quantification Training load. Examples include the training impulse (TRIMP), the session (RPE), Lucia’s TRIMP or Summated Zone Score. But the triathlon, a sport to be combined where there are interactions between different segments, is a complication when it comes to quantify the training. The aim of this paper is to review current methods of quantification, and to propose a scale to quantify the training load in triathl...

  10. Quantification of human responses

    Science.gov (United States)

    Steinlage, R. C.; Gantner, T. E.; Lim, P. Y. W.

    1992-01-01

    Human perception is a complex phenomenon which is difficult to quantify with instruments. For this reason, large panels of people are often used to elicit and aggregate subjective judgments. Print quality, taste, smell, sound quality of a stereo system, softness, and grading Olympic divers and skaters are some examples of situations where subjective measurements or judgments are paramount. We usually express what is in our mind through language as a medium but languages are limited in available choices of vocabularies, and as a result, our verbalizations are only approximate expressions of what we really have in mind. For lack of better methods to quantify subjective judgments, it is customary to set up a numerical scale such as 1, 2, 3, 4, 5 or 1, 2, 3, ..., 9, 10 for characterizing human responses and subjective judgments with no valid justification except that these scales are easy to understand and convenient to use. But these numerical scales are arbitrary simplifications of the complex human mind; the human mind is not restricted to such simple numerical variations. In fact, human responses and subjective judgments are psychophysical phenomena that are fuzzy entities and therefore difficult to handle by conventional mathematics and probability theory. The fuzzy mathematical approach provides a more realistic insight into understanding and quantifying human responses. This paper presents a method for quantifying human responses and subjective judgments without assuming a pattern of linear or numerical variation for human responses. In particular, quantification and evaluation of linguistic judgments was investigated.

  11. Nitrogen quantification with SNMS

    Science.gov (United States)

    Goschnick, J.; Natzeck, C.; Sommer, M.

    1999-04-01

    Plasma-based secondary neutral mass spectrometry (plasma SNMS) is a powerful analytical method for determining the elemental concentrations of almost any kind of material at low cost by using a cheap quadrupole mass filter. However, a quadrupole-based mass spectrometer is limited to nominal mass resolution. Atomic signals are sometimes superimposed by molecular signals (2 or 3 atomic clusters such as CH +, CH 2+ or metal oxide clusters) and/or intensities of double-charged species. Especially in the case of nitrogen several interferences can impede the quantification. This article reports on methods to recognize and deconvolute superpositions of N + with CH 2+, Li 2+, and Si 2+ at mass 14 D (Debye) occurring during analysis of organic and inorganic substances. The recognition is based on the signal pattern of N +, Li +, CH +, and Si +. The latter serve as indicators for a probable interference of molecular or double-charged species with N on mass 14 D. The subsequent deconvolution use different shapes of atomic and cluster kinetic energy distributions (kEDs) to determine the quantities of the intensity components by a linear fit of N + and non-atomic kEDs obtained from several organic and inorganic standards into the measured kED. The atomic intensity fraction yields a much better nitrogen concentration than the total intensity of mass 14 D after correction.

  12. Amplicon-Based Pyrosequencing Reveals High Diversity of Protistan Parasites in Ships' Ballast Water: Implications for Biogeography and Infectious Diseases.

    Science.gov (United States)

    Pagenkopp Lohan, K M; Fleischer, R C; Carney, K J; Holzer, K K; Ruiz, G M

    2016-04-01

    Ships' ballast water (BW) commonly moves macroorganisms and microorganisms across the world's oceans and along coasts; however, the majority of these microbial transfers have gone undetected. We applied high-throughput sequencing methods to identify microbial eukaryotes, specifically emphasizing the protistan parasites, in ships' BW collected from vessels calling to the Chesapeake Bay (Virginia and Maryland, USA) from European and Eastern Canadian ports. We utilized tagged-amplicon 454 pyrosequencing with two general primer sets, amplifying either the V4 or V9 domain of the small subunit (SSU) of the ribosomal RNA (rRNA) gene complex, from total DNA extracted from water samples collected from the ballast tanks of bulk cargo vessels. We detected a diverse group of protistan taxa, with some known to contain important parasites in marine systems, including Apicomplexa (unidentified apicomplexans, unidentified gregarines, Cryptosporidium spp.), Dinophyta (Blastodinium spp., Euduboscquella sp., unidentified syndinids, Karlodinium spp., Syndinium spp.), Perkinsea (Parvilucifera sp.), Opisthokonta (Ichthyosporea sp., Pseudoperkinsidae, unidentified ichthyosporeans), and Stramenopiles (Labyrinthulomycetes). Further characterization of groups with parasitic taxa, consisting of phylogenetic analyses for four taxa (Cryptosporidium spp., Parvilucifera spp., Labyrinthulomycetes, and Ichthyosporea), revealed that sequences were obtained from both known and novel lineages. This study demonstrates that high-throughput sequencing is a viable and sensitive method for detecting parasitic protists when present and transported in the ballast water of ships. These data also underscore the potential importance of human-aided dispersal in the biogeography of these microbes and emerging diseases in the world's oceans. PMID:26476551

  13. Metagenomic Analysis of Slovak Bryndza Cheese Using Next-Generation 16S rDNA Amplicon Sequencing

    Directory of Open Access Journals (Sweden)

    Planý Matej

    2016-06-01

    Full Text Available Knowledge about diversity and taxonomic structure of the microbial population present in traditional fermented foods plays a key role in starter culture selection, safety improvement and quality enhancement of the end product. Aim of this study was to investigate microbial consortia composition in Slovak bryndza cheese. For this purpose, we used culture-independent approach based on 16S rDNA amplicon sequencing using next generation sequencing platform. Results obtained by the analysis of three commercial (produced on industrial scale in winter season and one traditional (artisanal, most valued, produced in May Slovak bryndza cheese sample were compared. A diverse prokaryotic microflora composed mostly of the genera Lactococcus, Streptococcus, Lactobacillus, and Enterococcus was identified. Lactococcus lactis subsp. lactis and Lactococcus lactis subsp. cremoris were the dominant taxons in all tested samples. Second most abundant species, detected in all bryndza cheeses, were Lactococcus fujiensis and Lactococcus taiwanensis, independently by two different approaches, using different reference 16S rRNA genes databases (Greengenes and NCBI respectively. They have been detected in bryndza cheese samples in substantial amount for the first time. The narrowest microbial diversity was observed in a sample made with a starter culture from pasteurised milk. Metagenomic analysis by high-throughput sequencing using 16S rRNA genes seems to be a powerful tool for studying the structure of the microbial population in cheeses.

  14. Fungi Sailing the Arctic Ocean: Speciose Communities in North Atlantic Driftwood as Revealed by High-Throughput Amplicon Sequencing.

    Science.gov (United States)

    Rämä, Teppo; Davey, Marie L; Nordén, Jenni; Halvorsen, Rune; Blaalid, Rakel; Mathiassen, Geir H; Alsos, Inger G; Kauserud, Håvard

    2016-08-01

    High amounts of driftwood sail across the oceans and provide habitat for organisms tolerating the rough and saline environment. Fungi have adapted to the extremely cold and saline conditions which driftwood faces in the high north. For the first time, we applied high-throughput sequencing to fungi residing in driftwood to reveal their taxonomic richness, community composition, and ecology in the North Atlantic. Using pyrosequencing of ITS2 amplicons obtained from 49 marine logs, we found 807 fungal operational taxonomic units (OTUs) based on clustering at 97 % sequence similarity cut-off level. The phylum Ascomycota comprised 74 % of the OTUs and 20 % belonged to Basidiomycota. The richness of basidiomycetes decreased with prolonged submersion in the sea, supporting the general view of ascomycetes being more extremotolerant. However, more than one fourth of the fungal OTUs remained unassigned to any fungal class, emphasising the need for better DNA reference data from the marine habitat. Different fungal communities were detected in coniferous and deciduous logs. Our results highlight that driftwood hosts a considerably higher fungal diversity than currently known. The driftwood fungal community is not a terrestrial relic but a speciose assemblage of fungi adapted to the stressful marine environment and different kinds of wooden substrates found in it. PMID:27147245

  15. Simultaneous Whole Mitochondrial Genome Sequencing with Short Overlapping Amplicons Suitable for Degraded DNA Using the Ion Torrent Personal Genome Machine.

    Science.gov (United States)

    Chaitanya, Lakshmi; Ralf, Arwin; van Oven, Mannis; Kupiec, Tomasz; Chang, Joseph; Lagacé, Robert; Kayser, Manfred

    2015-12-01

    Whole mitochondrial (mt) genome analysis enables a considerable increase in analysis throughput, and improves the discriminatory power to the maximum possible phylogenetic resolution. Most established protocols on the different massively parallel sequencing (MPS) platforms, however, invariably involve the PCR amplification of large fragments, typically several kilobases in size, which may fail due to mtDNA fragmentation in the available degraded materials. We introduce a MPS tiling approach for simultaneous whole human mt genome sequencing using 161 short overlapping amplicons (average 200 bp) with the Ion Torrent Personal Genome Machine. We illustrate the performance of this new method by sequencing 20 DNA samples belonging to different worldwide mtDNA haplogroups. Additional quality control, particularly regarding the potential detection of nuclear insertions of mtDNA (NUMTs), was performed by comparative MPS analysis using the conventional long-range amplification method. Preliminary sensitivity testing revealed that detailed haplogroup inference was feasible with 100 pg genomic input DNA. Complete mt genome coverage was achieved from DNA samples experimentally degraded down to genomic fragment sizes of about 220 bp, and up to 90% coverage from naturally degraded samples. Overall, we introduce a new approach for whole mt genome MPS analysis from degraded and nondegraded materials relevant to resolve and infer maternal genetic ancestry at complete resolution in anthropological, evolutionary, medical, and forensic applications. PMID:26387877

  16. FasL and FADD delivery by a glioma-specific and cell cycle-dependent HSV-1 amplicon virus enhanced apoptosis in primary human brain tumors

    Directory of Open Access Journals (Sweden)

    Lam Paula Y

    2010-10-01

    Full Text Available Abstract Background Glioblastoma multiforme is the most malignant cancer of the brain and is notoriously difficult to treat due to the highly proliferative and infiltrative nature of the cells. Herein, we explored the combination treatment of pre-established human glioma xenograft using multiple therapeutic genes whereby the gene expression is regulated by both cell-type and cell cycle-dependent transcriptional regulatory mechanism conferred by recombinant HSV-1 amplicon vectors. Results We demonstrated for the first time that Ki67-positive proliferating primary human glioma cells cultured from biopsy samples were effectively induced into cell death by the dual-specific function of the pG8-FasL amplicon vectors. These vectors were relatively stable and exhibited minimal cytotoxicity in vivo. Intracranial implantation of pre-transduced glioma cells resulted in better survival outcome when compared with viral vectors inoculated one week post-implantation of tumor cells, indicating that therapeutic efficacy is dependent on the viral spread and mode of viral vectors administration. We further showed that pG8-FasL amplicon vectors are functional in the presence of commonly used treatment regimens for human brain cancer. In fact, the combined therapies of pG8-FasL and pG8-FADD in the presence of temozolomide significantly improved the survival of mice bearing intracranial high-grade gliomas. Conclusion Taken together, our results showed that the glioma-specific and cell cycle-dependent HSV-1 amplicon vector is potentially useful as an adjuvant therapy to complement the current gene therapy strategy for gliomas.

  17. Development of a control region-based mtDNA SNaPshot™ selection tool, integrated into a mini amplicon sequencing method.

    Science.gov (United States)

    Weiler, Natalie E C; de Vries, Gerda; Sijen, Titia

    2016-03-01

    Mitochondrial DNA (mtDNA) analysis is regularly applied to forensic DNA samples with limited amounts of nuclear DNA (nDNA), such as hair shafts and bones. Generally, this mtDNA analysis involves examination of the hypervariable control region by Sanger sequencing of amplified products. When samples are severely degraded, small-sized amplicons can be applied and an earlier described mini-mtDNA method by Eichmann et al. [1] that accommodates ten mini amplicons in two multiplexes is found to be a very robust approach. However, in cases with large numbers of samples, like when searching for hairs with an mtDNA profile deviant from that of the victim, the method is time (and cost) consuming. Previously, Chemale et al. [2] described a SNaPshot™-based screening tool for a Brazilian population that uses standard-size amplicons for HVS-I and HVS-II. Here, we describe a similar tool adapted to the full control region and compatible with mini-mtDNA amplicons. Eighteen single nucleotide polymorphisms (SNPs) were selected based on their relative frequencies in a European population. They showed a high discriminatory power in a Dutch population (97.2%). The 18 SNPs are assessed in two SNaPshot™ multiplexes that pair to the two mini-mtDNA amplification multiplexes. Degenerate bases are included to limit allele dropout due to SNPs at primer binding site positions. Three SNPs provide haplogroup information. Reliability testing showed no differences with Sanger sequencing results. Since mini-mtSNaPshot screening uses only a small portion of the same PCR products used for Sanger sequencing, no additional DNA extract is consumed, which is forensically advantageous. PMID:26976467

  18. Investigation of Microbial Diversity in Geothermal Hot Springs in Unkeshwar, India, Based on 16S rRNA Amplicon Metagenome Sequencing.

    Science.gov (United States)

    Mehetre, Gajanan T; Paranjpe, Aditi; Dastager, Syed G; Dharne, Mahesh S

    2016-01-01

    Microbial diversity in geothermal waters of the Unkeshwar hot springs in Maharashtra, India, was studied using 16S rRNA amplicon metagenomic sequencing. Taxonomic analysis revealed the presence of Bacteroidetes, Proteobacteria, Cyanobacteria, Actinobacteria, Archeae, and OD1 phyla. Metabolic function prediction analysis indicated a battery of biological information systems indicating rich and novel microbial diversity, with potential biotechnological applications in this niche. PMID:26950332

  19. Very high resolution single pass HLA genotyping using amplicon sequencing on the 454 next generation DNA sequencers: Comparison with Sanger sequencing.

    Science.gov (United States)

    Yamamoto, F; Höglund, B; Fernandez-Vina, M; Tyan, D; Rastrou, M; Williams, T; Moonsamy, P; Goodridge, D; Anderson, M; Erlich, H A; Holcomb, C L

    2015-12-01

    Compared to Sanger sequencing, next-generation sequencing offers advantages for high resolution HLA genotyping including increased throughput, lower cost, and reduced genotype ambiguity. Here we describe an enhancement of the Roche 454 GS GType HLA genotyping assay to provide very high resolution (VHR) typing, by the addition of 8 primer pairs to the original 14, to genotype 11 HLA loci. These additional amplicons help resolve common and well-documented alleles and exclude commonly found null alleles in genotype ambiguity strings. Simplification of workflow to reduce the initial preparation effort using early pooling of amplicons or the Fluidigm Access Array™ is also described. Performance of the VHR assay was evaluated on 28 well characterized cell lines using Conexio Assign MPS software which uses genomic, rather than cDNA, reference sequence. Concordance was 98.4%; 1.6% had no genotype assignment. Of concordant calls, 53% were unambiguous. To further assess the assay, 59 clinical samples were genotyped and results compared to unambiguous allele assignments obtained by prior sequence-based typing supplemented with SSO and/or SSP. Concordance was 98.7% with 58.2% as unambiguous calls; 1.3% could not be assigned. Our results show that the amplicon-based VHR assay is robust and can replace current Sanger methodology. Together with software enhancements, it has the potential to provide even higher resolution HLA typing. PMID:26037172

  20. Strategy for the maximization of clinically relevant information from hepatitis C virus, RT-PCR quantification.

    LENUS (Irish Health Repository)

    Levis, J

    2012-02-03

    BACKGROUND: The increasing clinical application of viral load assays for monitoring viral infections has been an incentive for the development of standardized tests for the hepatitis C virus. OBJECTIVE: To develop a simple model for the prediction of baseline viral load in individuals infected with the hepatitis C virus. METHODOLOGY: Viral load quantification of each patient\\'s first sample was assessed by RT-PCR-ELISA using the Roche MONITOR assay in triplicate. Genotype of the infecting virus was identified by reverse line probe hybridization, using amplicons resulting from the qualitative HCV Roche AMPLICOR assay. RESULTS: Retrospective evaluation of first quantitative values suggested that 82.4% (n=168\\/204) of individuals had a viral load between 4.3 and 6.7 log(10) viral copies per ml. A few patients (3.4%; n=7\\/204) have a serum viremia less than the lower limit of the linear range of the RT-PCR assay. Subsequent, prospective evaluation of hepatitis C viral load of all new patients using a model based on the dynamic range of viral load in the retrospective group correctly predicted the dynamic range in 75.9% (n=33\\/54). CONCLUSION: The dynamic range of hepatitis C viremia extends beyond the linear range of the Roche MONITOR assay. Accurate determination of serum viremia is substantially improved by dilution of specimens prior to quantification.

  1. Classification of pmoA amplicon pyrosequences using BLAST and the lowest common ancestor method in MEGAN

    Directory of Open Access Journals (Sweden)

    Marc Gregory Dumont

    2014-02-01

    Full Text Available The classification of high-throughput sequencing data of protein-encoding genes is not as well established as for 16S rRNA. The objective of this work was to develop a simple and accurate method of classifying large datasets of pmoA sequences, a common marker for methanotrophic bacteria. A taxonomic system for pmoA was developed based on a phylogenetic analysis of available sequences. The taxonomy incorporates the known diversity of pmoA present in public databases, including both sequences from cultivated and uncultivated organisms. Representative sequences from closely related genes, such as those encoding the bacterial ammonia monooxygenase, were also included in the pmoA taxonomy. In total, 53 low-level taxa (genus-level are included. Using previously published datasets of high-throughput pmoA amplicon sequence data, we tested two approaches for classifying pmoA: a naïve Bayesian classifier and BLAST. Classification of pmoA sequences based on BLAST analyses was performed using the lowest common ancestor (LCA algorithm in MEGAN, a software program commonly used for the analysis of metagenomic data. Both the naïve Bayesian and BLAST methods were able to classify pmoA sequences and provided similar classifications; however, the naïve Bayesian classifier was prone to misclassifying contaminant sequences present in the datasets. Another advantage of the BLAST/LCA method was that it provided a user-interpretable output and enabled novelty detection at various levels, from highly divergent pmoA sequences to genus-level novelty.  

  2. Nuclear Species-Diagnostic SNP Markers Mined from 454 Amplicon Sequencing Reveal Admixture Genomic Structure of Modern Citrus Varieties

    Science.gov (United States)

    Curk, Franck; Ancillo, Gema; Ollitrault, Frédérique; Perrier, Xavier; Jacquemoud-Collet, Jean-Pierre; Garcia-Lor, Andres; Navarro, Luis; Ollitrault, Patrick

    2015-01-01

    Most cultivated Citrus species originated from interspecific hybridisation between four ancestral taxa (C. reticulata, C. maxima, C. medica, and C. micrantha) with limited further interspecific recombination due to vegetative propagation. This evolution resulted in admixture genomes with frequent interspecific heterozygosity. Moreover, a major part of the phenotypic diversity of edible citrus results from the initial differentiation between these taxa. Deciphering the phylogenomic structure of citrus germplasm is therefore essential for an efficient utilization of citrus biodiversity in breeding schemes. The objective of this work was to develop a set of species-diagnostic single nucleotide polymorphism (SNP) markers for the four Citrus ancestral taxa covering the nine chromosomes, and to use these markers to infer the phylogenomic structure of secondary species and modern cultivars. Species-diagnostic SNPs were mined from 454 amplicon sequencing of 57 gene fragments from 26 genotypes of the four basic taxa. Of the 1,053 SNPs mined from 28,507 kb sequence, 273 were found to be highly diagnostic for a single basic taxon. Species-diagnostic SNP markers (105) were used to analyse the admixture structure of varieties and rootstocks. This revealed C. maxima introgressions in most of the old and in all recent selections of mandarins, and suggested that C. reticulata × C. maxima reticulation and introgression processes were important in edible mandarin domestication. The large range of phylogenomic constitutions between C. reticulata and C. maxima revealed in mandarins, tangelos, tangors, sweet oranges, sour oranges, grapefruits, and orangelos is favourable for genetic association studies based on phylogenomic structures of the germplasm. Inferred admixture structures were in agreement with previous hypotheses regarding the origin of several secondary species and also revealed the probable origin of several acid citrus varieties. The developed species-diagnostic SNP

  3. The Histone Variant H3.3 Is Enriched at Drosophila Amplicon Origins but Does Not Mark Them for Activation.

    Science.gov (United States)

    Paranjape, Neha P; Calvi, Brian R

    2016-01-01

    Eukaryotic DNA replication begins from multiple origins. The origin recognition complex (ORC) binds origin DNA and scaffolds assembly of a prereplicative complex (pre-RC), which is subsequently activated to initiate DNA replication. In multicellular eukaryotes, origins do not share a strict DNA consensus sequence, and their activity changes in concert with chromatin status during development, but mechanisms are ill-defined. Previous genome-wide analyses in Drosophila and other organisms have revealed a correlation between ORC binding sites and the histone variant H3.3. This correlation suggests that H3.3 may designate origin sites, but this idea has remained untested. To address this question, we examined the enrichment and function of H3.3 at the origins responsible for developmental gene amplification in the somatic follicle cells of the Drosophila ovary. We found that H3.3 is abundant at these amplicon origins. H3.3 levels remained high when replication initiation was blocked, indicating that H3.3 is abundant at the origins before activation of the pre-RC. H3.3 was also enriched at the origins during early oogenesis, raising the possibility that H3.3 bookmarks sites for later amplification. However, flies null mutant for both of the H3.3 genes in Drosophila did not have overt defects in developmental gene amplification or genomic replication, suggesting that H3.3 is not essential for the assembly or activation of the pre-RC at origins. Instead, our results imply that the correlation between H3.3 and ORC sites reflects other chromatin attributes that are important for origin function. PMID:27172191

  4. Uncultured bacterial diversity in a seawater recirculating aquaculture system revealed by 16S rRNA gene amplicon sequencing.

    Science.gov (United States)

    Lee, Da-Eun; Lee, Jinhwan; Kim, Young-Mog; Myeong, Jeong-In; Kim, Kyoung-Ho

    2016-04-01

    Bacterial diversity in a seawater recirculating aquaculture system (RAS) was investigated using 16S rRNA amplicon sequencing to understand the roles of bacterial communities in the system. The RAS was operated at nine different combinations of temperature (15°C, 20°C, and 25°C) and salinity (20‰, 25‰, and 32.5‰). Samples were collected from five or six RAS tanks (biofilters) for each condition. Fifty samples were analyzed. Proteobacteria and Bacteroidetes were most common (sum of both phyla: 67.2% to 99.4%) and were inversely proportional to each other. Bacteria that were present at an average of ≥ 1% included Actinobacteria (2.9%) Planctomycetes (2.0%), Nitrospirae (1.5%), and Acidobacteria (1.0%); they were preferentially present in packed bed biofilters, mesh biofilters, and maturation biofilters. The three biofilters showed higher diversity than other RAS tanks (aerated biofilters, floating bed biofilters, and fish tanks) from phylum to operational taxonomic unit (OTU) level. Samples were clustered into several groups based on the bacterial communities. Major taxonomic groups related to family Rhodobacteraceae and Flavobacteriaceae were distributed widely in the samples. Several taxonomic groups like [Saprospiraceae], Cytophagaceae, Octadecabacter, and Marivita showed a cluster-oriented distribution. Phaeobacter and Sediminicola-related reads were detected frequently and abundantly at low temperature. Nitrifying bacteria were detected frequently and abundantly in the three biofilters. Phylogenetic analysis of the nitrifying bacteria showed several similar OTUs were observed widely through the biofilters. The diverse bacterial communities and the minor taxonomic groups, except for Proteobacteria and Bacteroidetes, seemed to play important roles and seemed necessary for nitrifying activity in the RAS, especially in packed bed biofilters, mesh biofilters, and maturation biofilters. PMID:27033205

  5. Nuclear species-diagnostic SNP markers mined from 454 amplicon sequencing reveal admixture genomic structure of modern citrus varieties.

    Science.gov (United States)

    Curk, Franck; Ancillo, Gema; Ollitrault, Frédérique; Perrier, Xavier; Jacquemoud-Collet, Jean-Pierre; Garcia-Lor, Andres; Navarro, Luis; Ollitrault, Patrick

    2015-01-01

    Most cultivated Citrus species originated from interspecific hybridisation between four ancestral taxa (C. reticulata, C. maxima, C. medica, and C. micrantha) with limited further interspecific recombination due to vegetative propagation. This evolution resulted in admixture genomes with frequent interspecific heterozygosity. Moreover, a major part of the phenotypic diversity of edible citrus results from the initial differentiation between these taxa. Deciphering the phylogenomic structure of citrus germplasm is therefore essential for an efficient utilization of citrus biodiversity in breeding schemes. The objective of this work was to develop a set of species-diagnostic single nucleotide polymorphism (SNP) markers for the four Citrus ancestral taxa covering the nine chromosomes, and to use these markers to infer the phylogenomic structure of secondary species and modern cultivars. Species-diagnostic SNPs were mined from 454 amplicon sequencing of 57 gene fragments from 26 genotypes of the four basic taxa. Of the 1,053 SNPs mined from 28,507 kb sequence, 273 were found to be highly diagnostic for a single basic taxon. Species-diagnostic SNP markers (105) were used to analyse the admixture structure of varieties and rootstocks. This revealed C. maxima introgressions in most of the old and in all recent selections of mandarins, and suggested that C. reticulata × C. maxima reticulation and introgression processes were important in edible mandarin domestication. The large range of phylogenomic constitutions between C. reticulata and C. maxima revealed in mandarins, tangelos, tangors, sweet oranges, sour oranges, grapefruits, and orangelos is favourable for genetic association studies based on phylogenomic structures of the germplasm. Inferred admixture structures were in agreement with previous hypotheses regarding the origin of several secondary species and also revealed the probable origin of several acid citrus varieties. The developed species-diagnostic SNP

  6. Haplotyping and copy number estimation of the highly polymorphic human beta-defensin locus on 8p23 by 454 amplicon sequencing

    Directory of Open Access Journals (Sweden)

    Rosenstiel Philip

    2010-04-01

    Full Text Available Abstract Background The beta-defensin gene cluster (DEFB at chromosome 8p23.1 is one of the most copy number (CN variable regions of the human genome. Whereas individual DEFB CNs have been suggested as independent genetic risk factors for several diseases (e.g. psoriasis and Crohn's disease, the role of multisite sequence variations (MSV is less well understood and to date has only been reported for prostate cancer. Simultaneous assessment of MSVs and CNs can be achieved by PCR, cloning and Sanger sequencing, however, these methods are labour and cost intensive as well as prone to methodological bias introduced by bacterial cloning. Here, we demonstrate that amplicon sequencing of pooled individual PCR products by the 454 technology allows in-depth determination of MSV haplotypes and estimation of DEFB CNs in parallel. Results Six PCR products spread over ~87 kb of DEFB and harbouring 24 known MSVs were amplified from 11 DNA samples, pooled and sequenced on a Roche 454 GS FLX sequencer. From ~142,000 reads, ~120,000 haplotype calls (HC were inferred that identified 22 haplotypes ranging from 2 to 7 per amplicon. In addition to the 24 known MSVs, two additional sequence variations were detected. Minimal CNs were estimated from the ratio of HCs and compared to absolute CNs determined by alternative methods. Concordance in CNs was found for 7 samples, the CNs differed by one in 2 samples and the estimated minimal CN was half of the absolute in one sample. For 7 samples and 2 amplicons, the 454 haplotyping results were compared to those by cloning/Sanger sequencing. Intrinsic problems related to chimera formation during PCR and differences between haplotyping by 454 and cloning/Sanger sequencing are discussed. Conclusion Deep amplicon sequencing using the 454 technology yield thousands of HCs per amplicon for an affordable price and may represent an effective method for parallel haplotyping and CN estimation in small to medium-sized cohorts. The

  7. MAMA Software Features: Quantification Verification Documentation-1

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  8. Polymicrobial nature of chronic diabetic foot ulcer biofilm infections determined using bacterial tag encoded FLX amplicon pyrosequencing (bTEFAP.

    Directory of Open Access Journals (Sweden)

    Scot E Dowd

    Full Text Available BACKGROUND: Diabetic extremity ulcers are associated with chronic infections. Such ulcer infections are too often followed by amputation because there is little or no understanding of the ecology of such infections or how to control or eliminate this type of chronic infection. A primary impediment to the healing of chronic wounds is biofilm phenotype infections. Diabetic foot ulcers are the most common, disabling, and costly complications of diabetes. Here we seek to derive a better understanding of the polymicrobial nature of chronic diabetic extremity ulcer infections. METHODS AND FINDINGS: Using a new bacterial tag encoded FLX amplicon pyrosequencing (bTEFAP approach we have evaluated the bacterial diversity of 40 chronic diabetic foot ulcers from different patients. The most prevalent bacterial genus associated with diabetic chronic wounds was Corynebacterium spp. Findings also show that obligate anaerobes including Bacteroides, Peptoniphilus, Fingoldia, Anaerococcus, and Peptostreptococcus spp. are ubiquitous in diabetic ulcers, comprising a significant portion of the wound biofilm communities. Other major components of the bacterial communities included commonly cultured genera such as Streptococcus, Serratia, Staphylococcus and Enterococcus spp. CONCLUSIONS: In this article, we highlight the patterns of population diversity observed in the samples and introduce preliminary evidence to support the concept of functional equivalent pathogroups (FEP. Here we introduce FEP as consortia of genotypically distinct bacteria that symbiotically produce a pathogenic community. According to this hypothesis, individual members of these communities when they occur alone may not cause disease but when they coaggregate or consort together into a FEP the synergistic effect provides the functional equivalence of well-known pathogens, such as Staphylococcus aureus, giving the biofilm community the factors necessary to maintain chronic biofilm infections

  9. Exploring the Gastrointestinal “Nemabiome”: Deep Amplicon Sequencing to Quantify the Species Composition of Parasitic Nematode Communities

    Science.gov (United States)

    Avramenko, Russell W.; Redman, Elizabeth M.; Lewis, Roy; Yazwinski, Thomas A.; Wasmuth, James D.; Gilleard, John S.

    2015-01-01

    Parasitic helminth infections have a considerable impact on global human health as well as animal welfare and production. Although co-infection with multiple parasite species within a host is common, there is a dearth of tools with which to study the composition of these complex parasite communities. Helminth species vary in their pathogenicity, epidemiology and drug sensitivity and the interactions that occur between co-infecting species and their hosts are poorly understood. We describe the first application of deep amplicon sequencing to study parasitic nematode communities as well as introduce the concept of the gastro-intestinal “nemabiome”. The approach is analogous to 16S rDNA deep sequencing used to explore microbial communities, but utilizes the nematode ITS-2 rDNA locus instead. Gastro-intestinal parasites of cattle were used to develop the concept, as this host has many well-defined gastro-intestinal nematode species that commonly occur as complex co-infections. Further, the availability of pure mono-parasite populations from experimentally infected cattle allowed us to prepare mock parasite communities to determine, and correct for, species representation biases in the sequence data. We demonstrate that, once these biases have been corrected, accurate relative quantitation of gastro-intestinal parasitic nematode communities in cattle fecal samples can be achieved. We have validated the accuracy of the method applied to field-samples by comparing the results of detailed morphological examination of L3 larvae populations with those of the sequencing assay. The results illustrate the insights that can be gained into the species composition of parasite communities, using grazing cattle in the mid-west USA as an example. However, both the technical approach and the concept of the ‘nemabiome’ have a wide range of potential applications in human and veterinary medicine. These include investigations of host-parasite and parasite-parasite interactions

  10. Gastrointestinal Bacterial and Methanogenic Archaea Diversity Dynamics Associated with Condensed Tannin-Containing Pine Bark Diet in Goats Using 16S rDNA Amplicon Pyrosequencing

    OpenAIRE

    Min, Byeng R.; Sandra Solaiman; Raymon Shange; Jong-Su Eun

    2014-01-01

    Eighteen Kiko-cross meat goats (n=6) were used to collect gastrointestinal (GI) bacteria and methanogenic archaea for diversity measures when fed condensed tannin-containing pine bark (PB). Three dietary treatments were tested: control diet (0% PB and 30% wheat straw (WS); 0.17% condensed tannins (CT) dry matter (DM)); 15% PB and 15% WS (1.6% CT DM), and 30% PB and 0% WS (3.2% CT DM). A 16S rDNA bacterial tag-encoded FLX amplicon pyrosequencing technique was used to characterize and elucida...

  11. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    Highlights: Black-Right-Pointing-Pointer Serial qPCR accurately determines fragmentation state of any given DNA sample. Black-Right-Pointing-Pointer Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. Black-Right-Pointing-Pointer Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. Black-Right-Pointing-Pointer Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitative abnormal mtDNA content is indicative of mitochondrial disorders and mostly confines in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-exicision time, autolytic processus, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparation of degraded DNA under controlled conditions using sonification and DNaseI digestion we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow to precise the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. 
By extrapolation of measured decay constants for nuclear DNA ({lambda}{sub nDNA}) and mtDNA ({lambda}{sub mtDNA}) we present an approach to possibly correct measurements in

  12. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Highlights: ► Serial qPCR accurately determines fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitative abnormal mtDNA content is indicative of mitochondrial disorders and mostly confines in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-exicision time, autolytic processus, freeze–thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparation of degraded DNA under controlled conditions using sonification and DNaseI digestion we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow to precise the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λnDNA) and mtDNA (λmtDNA) we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time different degradation impact of the two genomes is

  13. Entanglement quantification by neutron scattering

    Energy Technology Data Exchange (ETDEWEB)

    Marty, Oliver; Plenio, Martin; Cramer, Marcus [Institut fuer Theoretische Physik, Universtitaet Ulm, Ulm (Germany); Epping, Michael; Kampermann, Hermann; Bruss, Dagmar [Institut fuer Theoretische Physik III, Heinrich-Heine Universitaet Duesseldorf, Duesseldorf (Germany)

    2013-07-01

    We present studies about the quantification of the entanglement contained in large samples of magnetic materials by structure factor measurements - a standard tool in analyzing condensed matter systems. We discuss experimentally relevant models (such as Heisenberg, Majumdar-Ghosh and XY models) in different geometries and with different spin numbers. For those, lower bounds to entanglement measures can be read off directly from the cross section obtained in neutron-scattering experiments.

  14. Risk Quantification and Evaluation Modelling

    OpenAIRE

    Manmohan Singh; M.D. Jaybhaye; S. K. Basu

    2014-01-01

In this paper the authors discuss risk quantification methods, the evaluation of risks, and the decision parameters used to rank critical items for prioritization of condition monitoring based risk and reliability centered maintenance (CBRRCM). As time passes, any equipment or product degrades to lower effectiveness and the rate of failure or malfunctioning increases, thereby lowering the reliability. Thus, with the passage of time or a number of active tests or ...

  15. Conservative fragments in bacterial 16S rRNA genes and primer design for 16S ribosomal DNA amplicons in metagenomic studies

    KAUST Repository

    Wang, Yong

    2009-10-09

Bacterial 16S ribosomal DNA (rDNA) amplicons have been widely used in the classification of uncultured bacteria inhabiting environmental niches. Primers targeting conservative regions of the rDNAs are used to generate amplicons of variant regions that are informative in taxonomic assignment. One problem is that the percentage coverage and application scope of the primers used in previous studies are largely unknown. In this study, conservative fragments of available rDNA sequences were first mined and then used to search for candidate primers within the fragments by measuring the coverage rate, defined as the percentage of bacterial sequences containing the target. Thirty predicted primers with a high coverage rate (>90%) were identified, which were basically located in the same conservative regions as known primers in previous reports, whereas 30% of the known primers were associated with a coverage rate of <90%. The application scope of the primers was also examined by calculating the percentages of failed detections in bacterial phyla. Primers A519-539, E969-983, E1063-1081, U515 and E517 are highly recommended because of their high coverage in almost all phyla. As expected, the three predominant phyla, Firmicutes, Gemmatimonadetes and Proteobacteria, are best covered by the predicted primers. The primers recommended in this report shall facilitate a comprehensive and reliable survey of bacterial diversity in metagenomic studies. © 2009 Wang, Qian.
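The coverage-rate criterion (percentage of database sequences containing a primer target) can be sketched as follows; the primer, toy sequences, and one-mismatch tolerance are hypothetical choices for illustration, not the paper's exact matching rules:

```python
def matches(primer, window, max_mismatch=1):
    """True if the window matches the primer with at most
    max_mismatch substitutions (indels are not considered)."""
    return sum(a != b for a, b in zip(primer, window)) <= max_mismatch

def coverage_rate(primer, sequences, max_mismatch=1):
    """Fraction of sequences containing a primer binding site,
    scanning every window of primer length."""
    k = len(primer)
    hits = 0
    for seq in sequences:
        if any(matches(primer, seq[i:i + k], max_mismatch)
               for i in range(len(seq) - k + 1)):
            hits += 1
    return hits / len(sequences)

# Toy 16S fragments; a real screen would scan the full rDNA database
seqs = [
    "AAGTCGTAACAAGGTAACC",
    "AAGTCGTAACAAGGTTACC",
    "TTTTTTTTTTTTTTTTTTT",
]
print(coverage_rate("GTCGTAACAAGG", seqs))  # 2 of 3 sequences covered
```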

  16. Application of PCR amplicon sequencing using a single primer pair in PCR amplification to assess variations in Helicobacter pylori CagA EPIYA tyrosine phosphorylation motifs

    Directory of Open Access Journals (Sweden)

    Karlsson Anneli

    2010-02-01

Full Text Available Abstract Background The presence of various EPIYA tyrosine phosphorylation motifs in the CagA protein of Helicobacter pylori has been suggested to contribute to pathogenesis in adults. In this study, a unique PCR assay and sequencing strategy was developed to establish the number and variation of cagA EPIYA motifs. Findings MDA-DNA derived from gastric biopsy specimens from eleven subjects with gastritis was used with M13- and T7-sequence-tagged primers for amplification of the cagA EPIYA motif region. Automated capillary electrophoresis using a high-resolution kit and amplicon sequencing confirmed variations in the cagA EPIYA motif region. In nine cases, sequencing revealed the presence of AB, ABC, or ABCC (Western type) cagA EPIYA motifs, respectively. In two cases, double cagA EPIYA motifs were detected (ABC/ABCC or ABC/AB), indicating the presence of two H. pylori strains in the same biopsy. Conclusion Automated capillary electrophoresis and amplicon sequencing using a single, M13- and T7-sequence-tagged primer pair in PCR amplification enabled rapid molecular typing of cagA EPIYA motifs. Moreover, the techniques described allowed for rapid detection of mixed H. pylori strains present in the same biopsy specimen.

  17. Cowpea mosaic virus RNA-1 acts as an amplicon whose effects can be counteracted by a RNA-2-encoded suppressor of silencing

    International Nuclear Information System (INIS)

Lines of Nicotiana benthamiana transgenic for full-length copies of both Cowpea mosaic virus (CPMV) genomic RNAs, either singly or together, have been produced. Plants transgenic for both RNAs developed symptoms characteristic of a CPMV infection. When plants transgenic for RNA-1 were agro-inoculated with RNA-2, no infection developed and the plants were also resistant to challenge with CPMV. By contrast, plants transgenic for RNA-2 became infected when agro-inoculated with RNA-1 and were fully susceptible to CPMV infection. The resistance of RNA-1 transgenic plants was shown to be related to the ability of RNA-1 to self-replicate and act as an amplicon. The ability of transgenically expressed RNA-2 to counteract the amplicon effect suggested that it encodes a suppressor of posttranscriptional gene silencing (PTGS). By examining the ability of portions of RNA-2 to reverse PTGS in N. benthamiana, we have identified the small (S) coat protein as the CPMV RNA-2-encoded suppressor of PTGS.

  18. MODELS OF CAPITAL COSTS QUANTIFICATION

    OpenAIRE

    Tomáš KLIEŠTIK; Katarína VALÁŠKOVÁ

    2013-01-01

The present contribution deals with the quantification of capital costs and is written on a theoretical basis. Costs are quantified for financing by equity alone, by debt capital alone, and for so-called mixed financing, in which the weighted average cost of capital is quantified. The cost of capital can be seen from three different perspectives: the assets side of a company, the liabilities side of a company, and the part of potential ...

  19. Quantification of wastewater sludge dewatering.

    Science.gov (United States)

    Skinner, Samuel J; Studer, Lindsay J; Dixon, David R; Hillis, Peter; Rees, Catherine A; Wall, Rachael C; Cavalida, Raul G; Usher, Shane P; Stickland, Anthony D; Scales, Peter J

    2015-10-01

Quantification and comparison of the dewatering characteristics of fifteen sewage sludges from a range of digestion scenarios are described. The method proposed uses laboratory dewatering measurements and integrity analysis of the extracted material properties. These properties were used as inputs into a model of filtration, the output of which provides the dewatering comparison. This method is shown to be necessary for quantification and comparison of dewaterability, as the permeability and compressibility of the sludges vary by up to ten orders of magnitude over the range of solids concentrations of interest to industry. This causes a high sensitivity of the dewaterability comparison to the starting concentration of laboratory tests; simple dewaterability comparison based on parameters such as the specific resistance to filtration is therefore difficult. The new approach is demonstrated to be robust relative to traditional methods such as specific resistance to filtration analysis and has an in-built integrity check. Comparison of the quantified dewaterability of the fifteen sludges with their relative volatile solids content showed a very strong correlation in the volatile solids range from 40 to 80%. The data indicate that the volatile solids parameter is a strong indicator of the dewatering behaviour of sewage sludges. PMID:26003332
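For reference, the traditional specific resistance to filtration (SRF) that the authors compare against is commonly obtained from the slope of t/V versus V in a constant-pressure filtration test. A sketch with synthetic data (all values invented; the classical Ruth filtration relation is assumed):

```python
def srf_from_filtration(volumes, times, dP, A, mu, c):
    """Specific resistance to filtration alpha (m/kg) from a
    constant-pressure test, using the classical relation
    t/V = (mu*alpha*c)/(2*dP*A^2) * V + mu*Rm/(dP*A), so that
    alpha = 2*dP*A^2*slope / (mu*c) with slope from t/V vs V."""
    ys = [t / v for t, v in zip(times, volumes)]
    n = len(volumes)
    mx = sum(volumes) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(volumes, ys)) / \
            sum((x - mx) ** 2 for x in volumes)
    return 2.0 * dP * A ** 2 * slope / (mu * c)

dP, A, mu, c = 1.0e5, 0.01, 1.0e-3, 10.0  # Pa, m^2, Pa.s, kg/m^3
vols = [1.0e-4, 2.0e-4, 3.0e-4]           # filtrate volume, m^3
times = [60.0, 220.0, 480.0]              # s, synthetic data
alpha = srf_from_filtration(vols, times, dP, A, mu, c)
print(alpha)  # approximately 1e13 m/kg for this synthetic series
```

Because alpha depends strongly on the starting solids concentration, a single SRF number is a weak basis for comparing sludges, which is the difficulty the abstract points out.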

  20. Detection and Quantification of Neurotransmitters in Dialysates

    OpenAIRE

    Zapata, Agustin; Chefer, Vladimir I.; Shippenberg, Toni S.; Denoroy, Luc

    2009-01-01

Sensitive analytical methods are needed for the separation and quantification of neurotransmitters obtained in microdialysate studies. This unit describes methods that permit quantification of nanomolar concentrations of monoamines and their metabolites (high-pressure liquid chromatography with electrochemical detection), acetylcholine (HPLC coupled to an enzyme reactor), and amino acids (HPLC with fluorescence detection; capillary electrophoresis with laser-induced fluorescence detection).

  1. A new amplicon based approach of whole mitogenome sequencing for phylogenetic and phylogeographic analysis: An example of East African white-eyes (Aves, Zosteropidae).

    Science.gov (United States)

    Meimberg, Harald; Schachtler, Christina; Curto, Manuel; Husemann, Martin; Habel, Jan Christian

    2016-09-01

Classical Sanger sequencing is still frequently used to generate sequence data for phylogenetic and phylogeographic inference. In this contribution we present a novel approach to genotype whole mitogenomic haplotypes using Illumina MiSeq reads from indexed amplicons. Our new approach reduces preparation time by multiplexing loci within a single or few PCR reactions and by plate-format library construction. The use of paired-end reads allows covering amplicons of about 0.5 kb, so no nebulisation or assembly is necessary. We tested the power and effectiveness of this technique by analysing the mitogenomic diversity of East African white-eye bird species (Zosteropidae), a taxonomically highly diverse and complex species flock found in various ecosystems spread across major parts of Africa. We compare the newly generated mitogenomic data set with published data from three mitochondrial genes for a similar set of populations and taxa. The comparison demonstrates that our new procedure represents a cost-effective use of NGS for medium-throughput phylogenetic analyses. Using this method, we were able to increase the amount of phylogenetic information significantly, while reducing the costs and effort in the laboratory. The mitogenomic data show a higher resolution than previous studies, providing higher support and new insights into the relationships of Zosterops species. Our data suggest splitting Z. poliogaster into four distinct species, three of which had previously been proposed: Z. silvanus, Z. mbulensis, Z. kikyuensis and Z. kulalensis. Our approach allows the genotyping of whole mitogenomes for a large number of individuals and thus allows more reliable reconstruction of phylogenetic and phylogeographic relationships - also for non-model organisms. PMID:27233440

  2. Identification and Evaluation of Single-Nucleotide Polymorphisms in Allotetraploid Peanut (Arachis hypogaea L.) Based on Amplicon Sequencing Combined with High Resolution Melting (HRM) Analysis.

    Science.gov (United States)

    Hong, Yanbin; Pandey, Manish K; Liu, Ying; Chen, Xiaoping; Liu, Hong; Varshney, Rajeev K; Liang, Xuanqiang; Huang, Shangzhi

    2015-01-01

The cultivated peanut (Arachis hypogaea L.) is an allotetraploid (AABB) species derived from the A-genome (Arachis duranensis) and B-genome (Arachis ipaensis) progenitors. The presence of two versions of a DNA sequence, one from each progenitor genome, poses a serious technical and analytical problem during single nucleotide polymorphism (SNP) marker identification and analysis. In this context, we analyzed 200 amplicons derived from expressed sequence tags (ESTs) and genome survey sequences (GSS) to identify SNPs in a panel of genotypes consisting of 12 cultivated peanut varieties and two diploid progenitors representing the ancestral genomes. A total of 18 EST-SNPs and 44 genomic SNPs were identified in the 12 peanut varieties by aligning the sequence of A. hypogaea with the diploid progenitors. The average frequency of sequence polymorphism was higher for genomic SNPs than for EST-SNPs, with one genomic SNP every 1011 bp compared to one EST-SNP every 2557 bp. To estimate the potential and further applicability of the identified SNPs, 96 peanut varieties were genotyped using the high resolution melting (HRM) method. Polymorphism information content (PIC) values for EST-SNPs ranged between 0.021 and 0.413 with a mean of 0.172 in this set of varieties, while those for genomic SNPs ranged between 0.080 and 0.478 with a mean of 0.249. A total of 33 SNPs were used for polymorphism detection among the parents and 10 selected lines from the mapping population Y13Zh (Zhenzhuhei × Yueyou13). Of these 33 SNPs, nine showed polymorphism in the mapping population, and seven were successfully mapped into five linkage groups. Our results show that SNPs can be identified in allotetraploid peanut with high accuracy through amplicon sequencing and the HRM assay. The identified SNPs were highly informative and can be used for various genetic and breeding applications in peanut. PMID:26697032
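The PIC values reported above follow the standard polymorphism information content formula; a biallelic SNP therefore cannot exceed 0.375, which is consistent with the sub-0.5 means reported. A minimal sketch:

```python
from itertools import combinations

def pic(freqs):
    """Polymorphism information content for allele frequencies:
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    assert abs(sum(freqs) - 1.0) < 1e-9
    return (1.0
            - sum(p * p for p in freqs)
            - sum(2.0 * p * p * q * q for p, q in combinations(freqs, 2)))

print(pic([0.5, 0.5]))  # 0.375, the biallelic maximum
print(pic([0.9, 0.1]))  # a rare minor allele gives a much lower PIC
```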

  3. Visualization and quantification of dopamine

    International Nuclear Information System (INIS)

Spiperone is a potent antagonist of dopamine D2 receptors in the brain. 3-N-[11C]methylspiperone (11C-NMSP), a spiperone derivative, was synthesized by N-alkylation of spiperone with [11C]methyl iodide for the visualization and quantification of dopamine receptors in the brain using PET. The age-related decrease of 11C-NMSP binding to the striatum was studied in healthy normal volunteers, and the binding capacity was also examined in patients with Parkinson's disease and striatonigral degeneration (SND). After intravenous injection of 11C-NMSP, 5 sequential 2 min scans followed by 16 sequential 5 min scans were performed using a PET system. Regions of interest over the striatum, cerebral cortex and cerebellum were outlined in the brain image and the activities plotted against time. The radioactivity in the striatum, which contains abundant D2 binding sites, was the highest in the brain and gradually increased with time. On the other hand, that in the cerebellar cortex, which contains only non-specific binding sites, peaked within 10 min and then decreased rapidly. The ratio between striatum and cerebellum, which correlates with specific binding of 11C-NMSP, increased with time and had a linear relationship against time. Quantification of specific binding was evaluated by taking the receptor-ligand association rate constant K3 in a 3-compartment model. The constant K3 was evaluated from the slope of the striatum-to-cerebellum ratio versus an equivalent time calculated from the radioactivity in the cerebellum. A decrease of the K3 value with increasing age was observed. The values in Parkinson's disease were almost the same as or slightly higher than those in age-matched controls; however, a significant decrease compared with the normal value was observed in SND. The possible explanations for the unchanged or slightly increased values in Parkinson's disease and the decrease in SND are discussed. (author)
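The K3 estimate described above (slope of the striatum-to-cerebellum ratio versus an equivalent time computed from cerebellar activity) can be sketched as follows; the time-activity values are synthetic, and the trapezoidal integral is an assumed implementation detail rather than the paper's exact procedure:

```python
def k3_ratio_method(times, striatum, cerebellum):
    """Estimate the association rate constant K3 as the slope of the
    striatum/cerebellum ratio against 'equivalent time', taken here
    as the running integral of cerebellar activity divided by the
    instantaneous cerebellar activity."""
    integral, eq_time = 0.0, []
    for i, t in enumerate(times):
        if i > 0:
            dt = t - times[i - 1]
            # trapezoidal accumulation of cerebellar activity
            integral += 0.5 * (cerebellum[i] + cerebellum[i - 1]) * dt
        eq_time.append(integral / cerebellum[i])
    ratio = [s / c for s, c in zip(striatum, cerebellum)]
    # least-squares slope of ratio versus equivalent time
    n = len(times)
    mx = sum(eq_time) / n
    my = sum(ratio) / n
    return sum((x - mx) * (y - my) for x, y in zip(eq_time, ratio)) / \
           sum((x - mx) ** 2 for x in eq_time)

times = [0, 10, 20, 30, 40]                    # minutes (toy data)
cer = [100.0] * 5                              # flat reference region
stri = [100.0, 120.0, 140.0, 160.0, 180.0]     # linearly rising ratio
k3 = k3_ratio_method(times, stri, cer)
print(round(k3, 4))  # 0.02 for this synthetic example
```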

  4. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

Optical molecular luminescence imaging is widely used to detect and image bio-photons emitted upon luminescent luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light source: three bacterial light-emitting sources containing different numbers of bacteria, and three different non-bacterial sources emitting very weak light. Using the concepts of the candela and luminous flux, we derived a simplified linear quantification formula. After experimentally measuring light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time remained constant even when different light sources were applied. The quantification function for linear response could be applied in a standard quantification process. The proposed method can be used for exact quantitative analysis with various light imaging equipment, providing a linear response of constant light-emitting sources to measurement time.
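The proposed linearity check can be sketched simply: for a constant source, counts divided by exposure time should be invariant. The readings below are invented for illustration:

```python
def photon_flux(counts, exposure_s):
    """Linear-response quantification: a constant source gives
    counts proportional to exposure time, so counts / time is a
    time-invariant flux (photons per second)."""
    return counts / exposure_s

# Invented readings of one steady source at three exposure times
readings = [(10.0, 5200.0), (30.0, 15600.0), (60.0, 31200.0)]
fluxes = [photon_flux(c, t) for t, c in readings]
print(fluxes)  # a constant ratio confirms the linear-response regime
```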

  5. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  6. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  7. Uncertainty Quantification in Aerodynamics Simulations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  8. MAMA Software Features: Visual Examples of Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-20

    This document shows examples of the results from quantifying objects of certain sizes and types in the software. It is intended to give users a better feel for some of the quantification calculations, and, more importantly, to help users understand the challenges with using a small set of ‘shape’ quantification calculations for objects that can vary widely in shapes and features. We will add more examples to this in the coming year.

  9. Oncogenic Role of miR-15a-3p in 13q Amplicon-Driven Colorectal Adenoma-to-Carcinoma Progression.

    Science.gov (United States)

    de Groen, Florence L M; Timmer, Lisette M; Menezes, Renee X; Diosdado, Begona; Hooijberg, Erik; Meijer, Gerrit A; Steenbergen, Renske D M; Carvalho, Beatriz

    2015-01-01

Progression from colorectal adenoma to carcinoma is strongly associated with an accumulation of genomic alterations, including gain of chromosome 13. This gain affects the whole q arm and is present in 40%-60% of all colorectal cancers (CRCs). Several genes located at this amplicon are known to be overexpressed in carcinomas due to copy number dosage. A subset of these genes, including the mir-17~92 cluster, are functionally involved in CRC development. The present study set out to explore whether, apart from mir-17~92, other miRNAs located at the 13q amplicon show a copy-number-dependent dosage effect that may contribute to 13q-driven colorectal adenoma-to-carcinoma progression. Integration of publicly available miRNA expression, target mRNA expression and DNA copy number data from 125 CRCs yielded three miRNAs, miR-15a, -17, and -20a, whose high expression levels were significantly correlated with 13q gain and which influenced target mRNA expression. These results were confirmed by qRT-PCR in a series of 100 colon adenomas and carcinomas. Functional analysis of both mature miRNAs encoded by mir-15a, i.e. miR-15a-5p and miR-15a-3p, showed that silencing of miR-15a-3p significantly inhibited the viability of CRC cells. Integration of miR-15a expression levels with mRNA expression data of predicted target genes identified mitochondrial uncoupling protein 2 (UCP2) and COP9 signalosome subunit 2 (COPS2) as candidates with significantly decreased expression in CRCs with 13q gain. Upon silencing of miR-15a-3p, mRNA expression of both genes increased in CRC cells, supporting miR-15a-3p-mediated regulation of UCP2 and COPS2 expression. In conclusion, significant overexpression of miR-15a-3p due to gain of 13q is functionally relevant in CRC, with UCP2 and COPS2 as candidate target genes. Taken together, our findings suggest that miR-15a-3p may contribute to adenoma-to-carcinoma progression. PMID:26148070

  10. Risk Quantification and Evaluation Modelling

    Directory of Open Access Journals (Sweden)

    Manmohan Singh

    2014-07-01

Full Text Available In this paper the authors discuss risk quantification methods, the evaluation of risks, and the decision parameters used to rank critical items for prioritization of condition monitoring based risk and reliability centered maintenance (CBRRCM). As time passes, any equipment or product degrades to lower effectiveness and the rate of failure or malfunctioning increases, thereby lowering the reliability. Thus, with the passage of time or a number of active tests or periods of work, the reliability of the product or system may fall to a low value known as the threshold value, below which the reliability should not be allowed to dip. Hence it is necessary to establish a rational basis for determining the appropriate points in the product life cycle at which predictive preventive maintenance may be applied, so that the reliability (the probability of successful functioning) can be enhanced, preferably to its original value, by reducing the failure rate and increasing the mean time between failures. This is very important for defence applications, where reliability is a prime concern. An attempt is made to develop a mathematical model for assessing and ranking risks; based on the likeliness coefficient β1 and the risk coefficient β2, ranking of the sub-systems can be modelled and used for CBRRCM. Defence Science Journal, Vol. 64, No. 4, July 2014, pp. 378-384, DOI: http://dx.doi.org/10.14429/dsj.64.6366
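A sketch of threshold-based maintenance scheduling and coefficient-based ranking, assuming an exponential reliability model and a multiplicative combination of β1 and β2; the combination rule, item names, and all numbers are illustrative assumptions, not the paper's exact model:

```python
import math

def reliability(lam, t):
    """Exponential reliability model R(t) = exp(-lam * t)."""
    return math.exp(-lam * t)

def rank_items(items):
    """Rank items by a composite risk score beta1 * beta2
    (likeliness x consequence); highest score = highest priority.
    The multiplicative combination is an assumption for illustration."""
    return sorted(items, key=lambda it: it["beta1"] * it["beta2"],
                  reverse=True)

items = [
    {"name": "pump",    "beta1": 0.7, "beta2": 0.9},
    {"name": "valve",   "beta1": 0.9, "beta2": 0.4},
    {"name": "bearing", "beta1": 0.5, "beta2": 0.95},
]
print([it["name"] for it in rank_items(items)])  # pump ranks first

# Maintenance is due once R(t) falls below the threshold value:
lam, threshold = 0.01, 0.8           # per-hour failure rate, floor
t_due = -math.log(threshold) / lam   # solve exp(-lam * t) = threshold
print(round(t_due, 1))               # hours until the threshold is hit
```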

  11. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease that remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantifying chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing the evolution and treatment of TB and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantifying lung damage caused by TB from chest radiographs. An algorithm for computational processing of the exams was developed in Matlab; it creates a 3D representation of the lungs, with compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients using CT scans. The measurements from the two methods were compared and showed a strong correlation. Applying Bland and Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods, suggesting the effectiveness and applicability of the method developed, which provides a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
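The Bland and Altman agreement analysis used above computes the bias and the 95% limits of agreement from paired measurements; a sketch with invented radiograph/CT lesion volumes:

```python
import math

def bland_altman(method_a, method_b):
    """Bland-Altman agreement: mean difference (bias) and 95% limits
    of agreement (bias +/- 1.96 * sample SD of the differences)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical lesion volumes (mL) from radiographs vs CT
xr = [12.0, 30.5, 8.2, 45.0, 22.1]
ct = [11.0, 33.0, 7.5, 41.0, 24.0]
bias, (lo, hi) = bland_altman(xr, ct)
within = sum(lo <= a - b <= hi for a, b in zip(xr, ct))
print(within == len(xr))  # all samples inside the limits of agreement
```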

  12. Allele discovery of ten candidate drought-response genes in Austrian oak using a systematic informatics approach based on 454 amplicon sequencing

    Directory of Open Access Journals (Sweden)

    Homolka Andreas

    2012-04-01

Full Text Available Abstract Background Rising temperatures and reduced water availability resulting from predicted climate change will impose significant pressure on long-lived forest tree species. Discovering the allelic variation present in drought-related genes of two Austrian oak species can be the key to understanding mechanisms of natural selection and can provide forestry with key tools to cope with future challenges. Results In the present study we used Roche 454 sequencing and developed a bioinformatic pipeline to process multiplexed tagged amplicons in order to identify single nucleotide polymorphisms and allelic sequences of ten candidate genes related to drought/osmotic stress from pedunculate oak (Quercus robur) and sessile oak (Q. petraea) individuals. For eight of these genes, a total of 158 polymorphic sites were detected in 336 oak individuals growing in Austria. Allele numbers ranged from ten to 52, with observed heterozygosity ranging from 0.115 to 0.640. All loci deviated from Hardy-Weinberg equilibrium, and linkage disequilibrium was found among six combinations of loci. Conclusions We have characterized 183 alleles of drought-related genes from oak species and detected first evidence of natural selection. Besides the potential for marker development, we have created an expandable bioinformatic pipeline for the analysis of next generation sequencing data.

  13. Cross-comparison of exome analysis, next-generation sequencing of amplicons, and the iPLEX® ADME PGx Panel for pharmacogenomic profiling

    Directory of Open Access Journals (Sweden)

Eng Wee Chua

    2016-01-01

Full Text Available Whole-exome sequencing (WES) has been widely used for analysis of human genetic diseases, but its value for the pharmacogenomic profiling of individuals is not well studied. Initially, we performed an in-depth evaluation of the accuracy of WES variant calling in the pharmacogenes CYP2D6 and CYP2C19 by comparison with MiSeq® amplicon sequencing data (n = 36). This analysis revealed that the concordance rate between WES and MiSeq® was high, achieving 99.60% for variants that were called without exceeding the truth-sensitivity threshold (99%), defined during variant quality score recalibration. Beyond this threshold, the proportion of discordant calls increased markedly. Subsequently, we expanded our findings beyond CYP2D6 and CYP2C19 to include more genes genotyped by the iPLEX® ADME PGx Panel in the subset of twelve samples. WES performed well, agreeing with the genotyping panel in approximately 99% of the selected pass-filter variant calls. Overall, our results have demonstrated WES to be a promising approach for pharmacogenomic profiling, with an estimated error rate of lower than 1%. Quality filters, particularly variant quality score recalibration, are important for reducing the number of false variants. Future studies may benefit from examining the role of WES in the clinical setting for guiding drug therapy.
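The concordance rate compared above is simply the fraction of jointly called sites with matching genotypes; a toy sketch with hypothetical variant IDs and genotypes (real data would come from VCF files):

```python
def concordance(calls_a, calls_b):
    """Fraction of sites called in both sets where the genotypes
    agree; unshared sites are excluded from the denominator."""
    shared = set(calls_a) & set(calls_b)
    agree = sum(calls_a[s] == calls_b[s] for s in shared)
    return agree / len(shared)

# Hypothetical genotype calls keyed by variant ID
wes   = {"rs1": "C/T", "rs2": "G/A", "rs3": "G/G", "rs4": "A/A"}
miseq = {"rs1": "C/T", "rs2": "G/A", "rs3": "G/A", "rs4": "A/A"}
print(concordance(wes, miseq))  # 0.75: 3 of the 4 shared sites agree
```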

  14. Cross-Comparison of Exome Analysis, Next-Generation Sequencing of Amplicons, and the iPLEX® ADME PGx Panel for Pharmacogenomic Profiling

    Science.gov (United States)

    Chua, Eng Wee; Cree, Simone L.; Ton, Kim N. T.; Lehnert, Klaus; Shepherd, Phillip; Helsby, Nuala; Kennedy, Martin A.

    2016-01-01

    Whole-exome sequencing (WES) has been widely used for analysis of human genetic diseases, but its value for the pharmacogenomic profiling of individuals is not well studied. Initially, we performed an in-depth evaluation of the accuracy of WES variant calling in the pharmacogenes CYP2D6 and CYP2C19 by comparison with MiSeq® amplicon sequencing data (n = 36). This analysis revealed that the concordance rate between WES and MiSeq® was high, achieving 99.60% for variants that were called without exceeding the truth-sensitivity threshold (99%), defined during variant quality score recalibration (VQSR). Beyond this threshold, the proportion of discordant calls increased markedly. Subsequently, we expanded our findings beyond CYP2D6 and CYP2C19 to include more genes genotyped by the iPLEX® ADME PGx Panel in the subset of twelve samples. WES performed well, agreeing with the genotyping panel in approximately 99% of the selected pass-filter variant calls. Overall, our results have demonstrated WES to be a promising approach for pharmacogenomic profiling, with an estimated error rate of lower than 1%. Quality filters, particularly VQSR, are important for reducing the number of false variants. Future studies may benefit from examining the role of WES in the clinical setting for guiding drug therapy. PMID:26858644

  16. Gene-specific amplicons from metagenomes as an alternative to directed evolution for enzyme screening: a case study using phenylacetaldehyde reductases.

    Science.gov (United States)

    Itoh, Nobuya; Kazama, Miki; Takeuchi, Nami; Isotani, Kentaro; Kurokawa, Junji

    2016-06-01

    Screening gene-specific amplicons from metagenomes (S-GAM) is a highly promising technique for the isolation of genes encoding enzymes for biochemical and industrial applications. From metagenomes, we isolated phenylacetaldehyde reductase (par) genes, which code for an enzyme that catalyzes the production of various Prelog's chiral alcohols. Nearly full-length par genes were amplified by PCR from metagenomic DNA, the products of which were fused with engineered par sequences at both terminal regions of the expression vector to ensure proper expression and then used to construct Escherichia coli plasmid libraries. Sequence- and activity-based screening of these libraries identified different homologous par genes, Hpar-001 to -036, which shared more than 97% amino acid sequence identity with PAR. Comparative characterization of these active homologs revealed a wide variety of enzymatic properties including activity, substrate specificity, and thermal stability. Moreover, amino acid substitutions in these genes coincided with those of Sar268 and Har1 genes, which were independently engineered by error-prone PCR to exhibit increased activity in the presence of concentrated 2-propanol. The comparative data from both approaches suggest that sequence information from homologs isolated from metagenomes is quite useful for enzyme engineering. Furthermore, by examining the GAM-based sequence dataset derived from soil metagenomes, we easily found amino acid substitutions that increase the thermal stability of PAR/PAR homologs. Thus, GAM-based approaches can provide not only useful homologous enzymes but also an alternative to directed evolution methodologies. PMID:27419059

  17. The utility of diversity profiling using Illumina 18S rRNA gene amplicon deep sequencing to detect and discriminate Toxoplasma gondii among the cyst-forming coccidia.

    Science.gov (United States)

    Cooper, Madalyn K; Phalen, David N; Donahoe, Shannon L; Rose, Karrie; Šlapeta, Jan

    2016-01-30

    Next-generation sequencing (NGS) has the capacity to screen a single DNA sample and detect pathogen DNA from thousands of host DNA sequence reads, making it a versatile and informative tool for investigation of pathogens in diseased animals. The technique is effective and labor saving in the initial identification of pathogens, and will complement conventional diagnostic tests to associate the candidate pathogen with a disease process. In this report, we investigated the utility of the diversity profiling NGS approach using Illumina small subunit ribosomal RNA (18S rRNA) gene amplicon deep sequencing to detect Toxoplasma gondii in previously confirmed cases of toxoplasmosis. We then tested the diagnostic approach with species-specific PCR genotyping, histopathology and immunohistochemistry of toxoplasmosis in a Risso's dolphin (Grampus griseus) to systematically characterise the disease and associate causality. We show that the Euk7A/Euk570R primer set targeting the V1-V3 hypervariable region of the 18S rRNA gene can be used as a species-specific assay for cyst-forming coccidia and discriminate T. gondii. Overall, the approach is cost-effective and improves diagnostic decision support by narrowing the differential diagnosis list with more certainty than was previously possible. Furthermore, it compensates for the limitations of cryptic protozoan morphology and obviates the need for species-specific PCR primer combinations. PMID:26801593

  18. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
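The Polynomial Chaos surrogate approach described above can be illustrated with a toy example: fit a Hermite polynomial expansion to a handful of runs of a stand-in "expensive model" and then propagate input uncertainty through the cheap surrogate. The model function and all numbers below are illustrative assumptions, not the CLM:

```python
import numpy as np
from numpy.polynomial import hermite_e as H

# Stand-in for an expensive model output driven by one uncertain
# parameter xi ~ N(0, 1); purely illustrative, not the CLM.
def model(xi):
    return np.exp(0.3 * xi)

# Build a degree-4 polynomial chaos surrogate in the probabilists'
# Hermite basis from just nine model runs.
xi_train = np.linspace(-3.0, 3.0, 9)
coeffs = H.hermefit(xi_train, model(xi_train), deg=4)

# Propagate input uncertainty through the surrogate by Monte Carlo;
# each surrogate evaluation is cheap, unlike a real model run.
xi_mc = np.random.default_rng(1).standard_normal(50_000)
surrogate_mean = H.hermeval(xi_mc, coeffs).mean()

# For this toy model the exact mean is known: E[exp(0.3*xi)] = exp(0.045).
exact_mean = np.exp(0.3 ** 2 / 2)
print(f"surrogate mean = {surrogate_mean:.3f}, exact = {exact_mean:.3f}")
```

Once the coefficients are known, moments of the output come almost for free, which is the practical appeal of spectral expansions when each true model run is expensive.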

  19. Uncertainty Quantification for Safeguards Measurements

    International Nuclear Information System (INIS)

    Part of the scientific method requires all calculated and measured results to be accompanied by a description that meets user needs and provides an adequate statement of the confidence one can have in the results. The scientific art of generating quantitative uncertainty statements is closely related to the mathematical disciplines of applied statistics, sensitivity analysis, optimization, and inversion, but in the field of non-destructive assay, also often draws heavily on expert judgment based on experience. We call this process uncertainty quantification, (UQ). Philosophical approaches to UQ along with the formal tools available for UQ have advanced considerably over recent years and these advances, we feel, may be useful to include in the analysis of data gathered from safeguards instruments. This paper sets out what we hope to achieve during a three year US DOE NNSA research project recently launched to address the potential of advanced UQ to improve safeguards conclusions. By way of illustration we discuss measurement of uranium enrichment by the enrichment meter principle (also known as the infinite thickness technique), that relies on gamma counts near the 186 keV peak directly from 235U. This method has strong foundations in fundamental physics and so we have a basis for the choice of response model — although in some implementations, peak area extraction may result in a bias when applied over a wide dynamic range. It also allows us to describe a common but usually neglected aspect of applying a calibration curve, namely the error structure in the predictors. We illustrate this using a combination of measured data and simulation. (author)

  20. Separation and quantification of microalgal carbohydrates.

    Science.gov (United States)

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass and thus accurate identification and quantification is important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using anion exchange chromatography (HPAEC) as well as alditol acetate derivatization followed by gas chromatography (for the neutral- and amino-sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse. PMID:23177152

  1. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes;

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... expected based on the information by the manufacturers. UV spectrometry, SYBR-Green dye staining, slot blot and RB1 rt-PCR gave 39, 27, 11 and 12%, respectively, higher concentrations than expected based on the manufacturers' information. The DNA preparations were quantified using the Quantifiler Human DNA...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...

  2. Rapid Drug Susceptibility Testing of Drug-Resistant Mycobacterium tuberculosis Isolates Directly from Clinical Samples by Use of Amplicon Sequencing: a Proof-of-Concept Study.

    Science.gov (United States)

    Colman, Rebecca E; Anderson, Julia; Lemmer, Darrin; Lehmkuhl, Erik; Georghiou, Sophia B; Heaton, Hannah; Wiggins, Kristin; Gillece, John D; Schupp, James M; Catanzaro, Donald G; Crudu, Valeriu; Cohen, Ted; Rodwell, Timothy C; Engelthaler, David M

    2016-08-01

    Increasingly complex drug-resistant tuberculosis (DR-TB) is a major global health concern and one of the primary reasons why TB is now the leading infectious cause of death worldwide. Rapid characterization of a DR-TB patient's complete drug resistance profile would facilitate individualized treatment in place of empirical treatment, improve treatment outcomes, prevent amplification of resistance, and reduce the transmission of DR-TB. The use of targeted next-generation sequencing (NGS) to obtain drug resistance profiles directly from patient sputum samples has the potential to enable comprehensive evidence-based treatment plans to be implemented quickly, rather than in weeks to months, which is currently needed for phenotypic drug susceptibility testing (DST) results. In this pilot study, we evaluated the performance of amplicon sequencing of Mycobacterium tuberculosis DNA from patient sputum samples using a tabletop NGS technology and automated data analysis to provide a rapid DST solution (the Next Gen-RDST assay). One hundred sixty-six out of 176 (94.3%) sputum samples from the Republic of Moldova yielded complete Next Gen-RDST assay profiles for 7 drugs of interest. We found a high level of concordance of our Next Gen-RDST assay results with phenotypic DST (97.0%) and pyrosequencing (97.8%) results from the same clinical samples. Our Next Gen-RDST assay was also able to estimate the proportion of resistant-to-wild-type alleles down to mixtures of ≤1%, which demonstrates the ability to detect very low levels of resistant variants not detected by pyrosequencing and possibly below the threshold for phenotypic growth methods. The assay as described here could be used as a clinical or surveillance tool. PMID:27225403
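Estimating the proportion of resistant-to-wild-type alleles, which the assay reports down to mixtures of ≤1%, reduces to an allele fraction computed from read counts at the locus. A toy illustration with invented counts:

```python
def resistant_allele_fraction(resistant_reads, wild_type_reads):
    """Fraction of reads carrying the resistance-conferring allele."""
    total = resistant_reads + wild_type_reads
    if total == 0:
        raise ValueError("no reads at this locus")
    return resistant_reads / total

# 15 resistant reads among 1500 total: a 1% minority variant, which deep
# amplicon sequencing can resolve but phenotypic growth methods may miss.
frac = resistant_allele_fraction(15, 1485)
print(f"{frac:.2%}")
```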


  4. Development of an assay for rapid detection and quantification of Verticillium dahliae in soil.

    Science.gov (United States)

    Bilodeau, Guillaume J; Koike, Steven T; Uribe, Pedro; Martin, Frank N

    2012-03-01

    Verticillium dahliae is responsible for Verticillium wilt on a wide range of hosts, including strawberry, on which low soil inoculum densities can cause significant crop loss. Determination of inoculum density is currently done by soil plating but this can take 6 to 8 weeks to complete and delay the grower's ability to make planting decisions. To provide a faster means for estimating pathogen populations in the soil, a multiplexed TaqMan real-time polymerase chain reaction (PCR) assay based on the ribosomal DNA (rDNA) intergenic spacer (IGS) was developed for V. dahliae. The assay was specific for V. dahliae and included an internal control for evaluation of inhibition due to the presence of PCR inhibitors in DNA extracted from soil samples. An excellent correlation was observed in regression analysis (R(2) = 0.96) between real-time PCR results and inoculum densities determined by soil plating in a range of field soils with pathogen densities as low as 1 to 2 microsclerotia/g of soil. Variation in copy number of the rDNA was also evaluated among isolates by SYBR Green real-time PCR amplification of the V. dahliae-specific amplicon compared with amplification of several single-copy genes and was estimated to range from ≈24 to 73 copies per haploid genome, which translated into possible differences in results among isolates of ≈1.8 cycle thresholds. Analysis of the variation in results of V. dahliae quantification among extractions of the same soil sample indicated that assaying four replicate DNA extractions for each field sample would provide accurate results. A TaqMan assay also was developed to help identify colonies of V. tricorpus on soil plates. PMID:22066673
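The regression between Ct values and log inoculum density that underlies this kind of assay can be sketched as follows. The standard-curve numbers are synthetic, chosen only to mimic the typical near -3.3 slope; they are not the paper's data:

```python
import numpy as np

# Hypothetical qPCR standard curve: Ct values measured for soils spiked
# with known V. dahliae inoculum densities (all values are made up).
density = np.array([1.0, 5.0, 10.0, 50.0, 100.0])  # microsclerotia per g soil
ct = np.array([33.1, 30.8, 29.7, 27.4, 26.4])      # threshold cycles

# Ct is linear in log10(density); fit Ct = slope * log10(d) + intercept.
slope, intercept = np.polyfit(np.log10(density), ct, deg=1)

def density_from_ct(ct_value):
    """Invert the fitted standard curve to estimate microsclerotia/g."""
    return 10.0 ** ((ct_value - intercept) / slope)

# A field sample with Ct = 29.7 should map back near 10 microsclerotia/g.
print(round(density_from_ct(29.7), 1))
```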

  5. Comparison of different real-time PCR chemistries and their suitability for detection and quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2008-03-01

    Full Text Available Background: The real-time polymerase chain reaction is currently the method of choice for quantifying nucleic acids in different DNA based quantification applications. It is widely used also for detecting and quantifying genetically modified components in food and feed, predominantly employing TaqMan® and SYBR® Green real-time PCR chemistries. In our study, four alternative chemistries: Lux™, Plexor™, Cycling Probe Technology and LNA® were extensively evaluated and compared using TaqMan® chemistry as a reference system. Results: Amplicons were designed on the maize invertase gene and the 5'-junction of inserted transgene and plant genomic DNA in the MON 810 event. Real-time assays were subsequently compared for their efficiency in PCR amplification, limits of detection and quantification, repeatability and accuracy to test the performance of the assays. Additionally, the specificity of established assays was checked on various transgenic and non-transgenic plant species. The overall applicability of the designed assays was evaluated, adding practicability and cost issues to the performance characteristics. Conclusion: Although none of the chemistries significantly outperformed the others, there are certain characteristics that suggest that LNA® technology is an alternative to TaqMan® when designing assays for quantitative analysis. Because LNA® probes are much shorter they might be especially appropriate when high specificity is required and where the design of a common TaqMan® probe is difficult or even impossible due to sequence characteristics. Plexor™ on the other hand might be a method of choice for qualitative analysis when sensitivity, low cost and simplicity of use prevail.

  6. TaqMan probe real-time polymerase chain reaction assay for the quantification of canine DNA in chicken nugget.

    Science.gov (United States)

    Rahman, Md Mahfujur; Hamid, Sharifah Bee Abd; Basirun, Wan Jefrey; Bhassu, Subha; Rashid, Nur Raifana Abdul; Mustafa, Shuhaimi; Mohd Desa, Mohd Nasir; Ali, Md Eaqub

    2016-01-01

    This paper describes a short-amplicon-based TaqMan probe quantitative real-time PCR (qPCR) assay for the quantitative detection of canine meat in chicken nuggets, which are very popular across the world, including Malaysia. The assay targeted a 100-bp fragment of canine cytb gene using a canine-specific primer and TaqMan probe. Specificity against 10 different animals and plants species demonstrated threshold cycles (Ct) of 16.13 ± 0.12 to 16.25 ± 0.23 for canine DNA and negative results for the others in a 40-cycle reaction. The assay was tested for the quantification of up to 0.01% canine meat in deliberately spiked chicken nuggets with 99.7% PCR efficiency and 0.995 correlation coefficient. The analysis of the actual and qPCR predicted values showed a high recovery rate (from 87% ± 28% to 112% ± 19%) with a linear regression close to unity (R(2) = 0.999). Finally, samples of three halal-branded commercial chicken nuggets collected from different Malaysian outlets were screened for canine meat, but no contamination was demonstrated. PMID:26458055
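A PCR efficiency figure like the 99.7% reported above is conventionally derived from the slope of the standard curve (Ct regressed on log10 template amount). A minimal sketch of that relationship, with illustrative slope values:

```python
def pcr_efficiency(slope):
    """Amplification efficiency from a qPCR standard-curve slope
    (Ct regressed on log10 template amount). A slope of about
    -3.32 corresponds to perfect doubling each cycle (100%)."""
    return 10.0 ** (-1.0 / slope) - 1.0

# A shallower slope indicates sub-optimal amplification efficiency.
print(f"{pcr_efficiency(-3.60):.1%}")
```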

  7. Multiplex PCR for the detection and quantification of zoonotic taxa of Giardia, Cryptosporidium and Toxoplasma in wastewater and mussels.

    Science.gov (United States)

    Marangi, Marianna; Giangaspero, Annunziata; Lacasella, Vita; Lonigro, Antonio; Gasser, Robin B

    2015-04-01

    Giardia duodenalis, Cryptosporidium parvum and Toxoplasma gondii are important parasitic protists linked to water- and food-borne diseases. The accurate detection of these pathogens is central to the diagnosis, tracking, monitoring and surveillance of these protists in humans, animals and the environment. In this study, we established a multiplex real-time PCR (qPCR), coupled to high resolution melting (HRM) analysis, for the specific detection and quantification of each G. duodenalis (assemblage A), C. parvum and T. gondii (Type I). Once optimised, this assay was applied to the testing of samples (n = 232) of treated wastewater and mussels (Mytilus galloprovincialis). Of 119 water samples, 28.6% were test-positive for G. duodenalis, C. parvum and/or both pathogens; of 113 mussel samples, 66.6% were test-positive for G. duodenalis, C. parvum and/or both pathogens, and 13.2% were test-positive for only T. gondii. The specificity of all amplicons produced was verified by direct sequencing. The oo/cysts numbers (per 5 μl of DNA sample) ranged from 10 to 64. The present multiplex assay achieved an efficiency of 100% and a R(2) value of >0.99. Current evidence indicates that this assay provides a promising tool for the simultaneous detection and quantitation of three key protist taxa. PMID:25591902

  8. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V;

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...

  9. Advancing agricultural greenhouse gas quantification

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction

    Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011).

    2. Agriculture and climate change mitigation

    The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  10. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce data sets for generating benchmark data sets.

  11. Automated quantification and analysis of mandibular asymmetry

    DEFF Research Database (Denmark)

    Darvann, T. A.; Hermann, N. V.; Larsen, P.; Ólafsdóttir, Hildur; Hansen, I. V.; Hove, H. D.; Christensen, L.; Rueckert, D.; Kreiborg, S.

    We present an automated method of spatially detailed 3D asymmetry quantification in mandibles extracted from CT and apply it to a population of infants with unilateral coronal synostosis (UCS). An atlas-based method employing non-rigid registration of surfaces is used for determining deformation...

  12. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
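Selecting an optimal waveband by correlation analysis, as described, can be sketched with synthetic data in which one band carries the THC signal. The band positions, sample counts, and injected signal are invented; the 695 nm label is placed on the signal-bearing band purely to echo the study's finding:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 20 leaf samples, reflectance at 5 candidate bands.
thc = rng.uniform(0.1, 20.0, size=20)        # % THC, hypothetical values
spectra = rng.normal(0.4, 0.05, size=(20, 5))
spectra[:, 2] += 0.03 * thc                  # inject a THC signal at band 2

bands = np.array([550, 650, 695, 750, 850])  # candidate wavelengths, nm

# Pearson correlation of each band's reflectance with THC content; the
# band with the strongest |r| is the candidate for quantification.
corr = np.array([np.corrcoef(spectra[:, i], thc)[0, 1] for i in range(5)])
best = bands[np.argmax(np.abs(corr))]
print(best)
```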

  13. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward; Hansson, Lars; Marstrand, J. R.; Larsson, Henrik B.W.; Hansen, Lars Kai

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  18. Application of "Uncertainty Quantification" to railway vehicle dynamics problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele; Engsig-Karup, Allan Peter; True, Hans

    The paper describes the results of the application of "Uncertainty Quantification" methods in railway vehicle dynamics. The system parameters are given by probability distributions. The results of the application of the Monte-Carlo and generalized Polynomial Chaos methods to a simple bogie model...

  15. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor which exists when the control parameter is zero.
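The most basic recurrence quantification measure, the recurrence rate, can be sketched for a one-dimensional signal. The sine wave below is a stand-in for one coordinate of a trajectory, not Liu's system itself:

```python
import numpy as np

def recurrence_rate(x, eps):
    """Fraction of point pairs (i, j) whose states lie within eps of each
    other: the simplest recurrence quantification measure (1-D states)."""
    x = np.asarray(x, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distance matrix
    return float((dist < eps).mean())

# A periodic signal revisits its states, so it recurs far more often
# than white noise of comparable amplitude.
t = np.linspace(0.0, 8.0 * np.pi, 200)
rr_periodic = recurrence_rate(np.sin(t), eps=0.1)
rr_noise = recurrence_rate(np.random.default_rng(0).normal(0.0, 1.0, 200), eps=0.1)
print(rr_periodic > rr_noise)
```

Transitions such as chaos-to-periodic show up as changes in measures like this one computed over sliding windows of the trajectory.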

  16. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension. PMID:25987192

  17. Efficient uncertainty quantification in unsteady aeroelastic simulations

    NARCIS (Netherlands)

    Witteveen, J.A.S.; Bijl, H.

    2009-01-01

    An efficient uncertainty quantification method for unsteady problems is presented in order to achieve a constant accuracy in time for a constant number of samples. The approach is applied to the aeroelastic problems of a transonic airfoil flutter system and the AGARD 445.6 wing benchmark with uncert

  18. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures
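
The attenuation compensation mentioned above is often introduced via Chang's first-order method: each reconstructed pixel is divided by its mean photon transmission over all projection angles. The sketch below is a minimal, hypothetical illustration of that arithmetic; the attenuation coefficient and path lengths are illustrative values, not taken from this record.

```python
import math

def chang_correction_factor(mu, path_lengths_cm):
    """First-order Chang attenuation correction for one reconstructed
    pixel: multiply its counts by 1 / mean(exp(-mu * l)) over all
    projection angles.

    mu: linear attenuation coefficient in 1/cm (e.g. ~0.15 for Tc-99m
        in soft tissue, used here only as an illustrative value)
    path_lengths_cm: pixel-to-body-outline distance at each angle
    """
    m = len(path_lengths_cm)
    mean_transmission = sum(math.exp(-mu * l) for l in path_lengths_cm) / m
    return 1.0 / mean_transmission

# A pixel 5 cm deep at every angle is attenuated by exp(-0.15 * 5) ~ 0.47,
# so its reconstructed counts are boosted by roughly 2.1x.
factor = chang_correction_factor(0.15, [5.0] * 64)
```

In practice the path lengths differ per angle and the method is applied pixel-by-pixel over the reconstructed slice; this sketch shows only the per-pixel correction factor.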

  19. A Tableaux Calculus for Ambiguous Quantification

    OpenAIRE

    Monz, Christof; de Rijke, Maarten

    2000-01-01

    Coping with ambiguity has recently received a lot of attention in natural language processing. Most work focuses on the semantic representation of ambiguous expressions. In this paper we complement this work in two ways. First, we provide an entailment relation for a language with ambiguous expressions. Second, we give a sound and complete tableaux calculus for reasoning with statements involving ambiguous quantification. The calculus interleaves partial disambiguation steps with steps in a t...

  20. Uncertainty quantification of effective nuclear interactions

    CERN Document Server

    Perez, R Navarro; Arriola, E Ruiz

    2016-01-01

    We give a brief review of the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  1. Automated Quantification of Pneumothorax in CT

    OpenAIRE

    Synho Do; Kristen Salvaggio; Supriya Gupta; Mannudeep Kalra; Ali, Nabeel U.; Homer Pien

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothor...

  2. Optimization of a small-amplicon HRM genotyping system in soybean

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    SNP genotyping by high-resolution melting curve (HRM) analysis is a simple and effective method with high sensitivity and strong specificity. Optimization and establishment of a soybean HRM-based SNP genotyping system is the precondition and basis for future soybean SNP research. In this study, small amplicons with added internal calibrators were used to optimize and establish the SNP genotyping system. The results showed that, when designing SNP primers, it is best to keep the PCR product length at 50-80 bp, the product melting temperature at 72-82 °C, and the primer Tm between 55 and 62 °C. A 10 μL PCR reaction should contain 25 ng DNA template, 0.6 pmol SNP primers, and 1 μL LC Green dye. After the PCR reaction, 3 pmol of high- and low-temperature internal calibrators should be added to each well, followed by denaturation and HRM analysis. The optimized SNP genotyping system was then used to genotype a recombinant inbred line (RIL) population, and the results demonstrated that the RILs could be completely divided into the two parental genotypes, with improved accuracy and efficiency. The optimized SNP genotyping system established in this study provides a useful foundation for developing SNP markers, constructing high-density genetic maps, and identifying QTLs in the future.
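
The design ranges reported in this record are easy to encode as an automated check. The sketch below pairs them with the classical Wallace rule for a rough primer Tm; this is a hypothetical helper, not software from the study, and the Wallace rule is only a first approximation valid for short oligos.

```python
def wallace_tm(primer_seq):
    """Rough primer Tm by the Wallace rule: 2 C per A/T, 4 C per G/C.
    Only a first approximation for short oligonucleotides."""
    s = primer_seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def check_hrm_design(product_len_bp, product_melt_c, primer_tm_c):
    """Flag violations of the small-amplicon HRM design ranges reported
    in the record: product 50-80 bp, product melt 72-82 C, primer Tm
    55-62 C."""
    issues = []
    if not 50 <= product_len_bp <= 80:
        issues.append("product length outside 50-80 bp")
    if not 72 <= product_melt_c <= 82:
        issues.append("product melting temperature outside 72-82 C")
    if not 55 <= primer_tm_c <= 62:
        issues.append("primer Tm outside 55-62 C")
    return issues

# A hypothetical 20-mer with 50% GC content lands at Tm = 60 C by the
# Wallace rule, inside the 55-62 C window.
primer = "ATGCATGCATGCATGCATGC"
issues = check_hrm_design(65, 78.0, wallace_tm(primer))
```

A real pipeline would estimate the amplicon melting temperature from nearest-neighbor thermodynamics rather than take it as an input, but the range checks are the same.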

  3. Fluorescence-linked Antigen Quantification (FLAQ) Assay for Fast Quantification of HIV-1 p24Gag

    Science.gov (United States)

    Gesner, Marianne; Maiti, Mekhala; Grant, Robert; Cavrois, Marielle

    2016-01-01

    The fluorescence-linked antigen quantification (FLAQ) assay allows fast quantification of the HIV-1 p24Gag antigen. Viral supernatants are lysed and incubated with polystyrene microspheres coated with polyclonal antibodies against HIV-1 p24Gag and detector antibodies conjugated to fluorochromes (Figure 1). After washes, the fluorescence of the microspheres is measured by flow cytometry and reflects the abundance of the antigen in the lysate. The speed, simplicity, and wide dynamic range of the FLAQ assay are optimal for many applications performed in HIV-1 research laboratories.

  4. Influence of amplicon size on the polymerase chain reaction detection of the Parvovirus B19 genome in formalin-fixed specimens

    Directory of Open Access Journals (Sweden)

    Paulo Roberto Veiga Quemelo

    2009-04-01

    Full Text Available The polymerase chain reaction (PCR) has enabled diagnosis from archival material, but some fixation methods, such as formalin, damage DNA and subsequently affect PCR analysis, particularly of paraffin-embedded tissues. PCR is known for its high specificity and sensitivity, although difficulties arise when formalin-fixed, paraffin-embedded tissue is used. This occurs not only because of protein cross-linking, which increases with longer fixation time, but also because of the direct damage formalin causes to the DNA itself. PCR was used to analyze placenta and fetal organs from 34 samples with suspected Parvovirus B19 infection. It was not possible to amplify Parvovirus B19 DNA using nested PCR, probably because of the size of the amplicon generated with the first set of primers. We approached this problem by using only the second set of primers. Two out of 34 tissue samples (5.9%) were positive by PCR. However, PCR performed on corresponding fetal organs was negative in one of the two. We also observed a negative relation between the thickness of the tissue fragment and the positivity of the samples. In conclusion, although PCR is highly specific and sensitive in fresh or ideally fixed material, careful standardization of PCR assays is necessary when using formalin-fixed, paraffin-embedded tissues, by applying primers that amplify smaller DNA fragments.

  5. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  6. QconCAT: Internal Standard for Protein Quantification.

    Science.gov (United States)

    Scott, Kerry Bauer; Turko, Illarion V; Phinney, Karen W

    2016-01-01

    Protein quantification based on stable isotope labeling-mass spectrometry involves adding known quantities of stable isotope-labeled internal standards into biological samples. The internal standards are analogous to analyte molecules and quantification is achieved by comparing signals from isotope-labeled and analyte molecules. This methodology is broadly applicable to proteomics research, biomarker discovery and validation, and clinical studies, which require accurate and precise protein abundance measurements. One such internal standard platform for protein quantification is concatenated peptides (QconCAT). This chapter describes a protocol for the design, expression, characterization, and application of the QconCAT strategy for protein quantification. PMID:26791984
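
The comparison of isotope-labeled and analyte signals described above reduces to a simple ratio calculation. A minimal sketch of the isotope-dilution arithmetic, with hypothetical intensities and spike amount:

```python
def analyte_amount_fmol(light_intensity, heavy_intensity, spiked_heavy_fmol):
    """Isotope-dilution quantification: the light (analyte) to heavy
    (stable isotope-labeled internal standard) signal ratio scales the
    known amount of spiked standard."""
    return (light_intensity / heavy_intensity) * spiked_heavy_fmol

# Hypothetical example: the analyte peak is 2.5x the labeled-standard
# peak, and 100 fmol of QconCAT-derived standard peptide was spiked in.
amount = analyte_amount_fmol(2.5e6, 1.0e6, 100.0)  # 250.0 fmol of analyte
```

Because the labeled standard co-elutes and co-ionizes with the analyte, the ratio cancels most instrument variability; only the spiked amount must be known accurately, which is why careful characterization of the QconCAT standard matters.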

  7. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon;

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification of

  8. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  9. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  10. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  11. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
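
Sensitivity-based uncertainty estimates of the kind described above follow the first-order "sandwich rule": once adjoint methods provide the sensitivities, the output variance is a quadratic form in the parameter covariance. A generic sketch of that rule (the numbers are illustrative, not from the LIFE blanket study):

```python
def sandwich_variance(sensitivities, covariance):
    """First-order ('sandwich rule') propagation: var(R) = s^T C s, where
    s holds the relative sensitivities of response R to each parameter
    and C is the parameter covariance matrix."""
    n = len(sensitivities)
    return sum(
        sensitivities[i] * covariance[i][j] * sensitivities[j]
        for i in range(n)
        for j in range(n)
    )

# Two uncorrelated parameters, each with 1% relative uncertainty, and
# response sensitivities of 0.5: var = 2 * (0.5 * 0.01)**2 = 5e-5,
# i.e. a relative standard deviation of about 0.7%.
var = sandwich_variance([0.5, 0.5], [[1e-4, 0.0], [0.0, 1e-4]])
```

With percent-level parameter uncertainties and moderate sensitivities, the propagated relative standard deviation stays below 1%, the same order as the small (< 2%) uncertainties the record reports.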

  12. Tutorial examples for uncertainty quantification methods.

    Energy Technology Data Exchange (ETDEWEB)

    De Bord, Sarah [Univ. of California, Davis, CA (United States)

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.

  13. Simple quantification of in planta fungal biomass.

    Science.gov (United States)

    Ayliffe, Michael; Periyannan, Sambasivam K; Feechan, Angela; Dry, Ian; Schumann, Ulrike; Lagudah, Evans; Pryor, Anthony

    2014-01-01

    An accurate assessment of the disease resistance status of plants to fungal pathogens is an essential requirement for the development of resistant crop plants. Many disease resistance phenotypes are partial rather than obvious immunity and are frequently scored using subjective qualitative estimates of pathogen development or plant disease symptoms. Here we report a method for the accurate comparison of total fungal biomass in plant tissues. This method, called the WAC assay, is based upon the specific binding of the plant lectin wheat germ agglutinin to fungal chitin. The assay is simple, high-throughput, and sensitive enough to discriminate between single Puccinia graminis f.sp tritici infection sites on a wheat leaf segment. It greatly lends itself to replication as large volumes of tissue can be pooled from independent experiments and assayed to provide truly representative quantification, or, alternatively, fungal growth on a single, small leaf segment can be quantified. In addition, as the assay is based upon a microscopic technique, pathogen infection sites can also be examined at high magnification prior to quantification if desired and average infection site areas are determined. Previously, we have demonstrated the application of the WAC assay for quantifying the growth of several different pathogen species in both glasshouse grown material and large-scale field plots. Details of this method are provided within. PMID:24643560

  14. Cadmium purification and quantification using immunochromatography.

    Science.gov (United States)

    Sasaki, Kazuhiro; Yongvongsoontorn, Nunnarpas; Tawarada, Kei; Ohnishi, Yoshikazu; Arakane, Tamami; Kayama, Fujio; Abe, Kaoru; Oguma, Shinichi; Ohmura, Naoya

    2009-06-10

    One of the pathways by which cadmium enters human beings is through the consumption of agricultural products. The monitoring of cadmium has a significant role in the management of cadmium intake. Cadmium purification and quantification using immunochromatography were conducted in this study as an alternative means of cadmium analysis. The samples used in this study were rice, tomato, lettuce, garden pea, Arabidopsis thaliana (a widely used model organism for studying plants), soil, and fertilizer. The cadmium immunochromatography was produced from the monoclonal antibody Nx2C3, which recognizes the chelate form of cadmium, Cd·EDTA. The immunochromatography can be used for quantification of cadmium in a range from 0.01 to 0.1 mg/L at a 20% mean coefficient of variation. A chelate column employing quaternary ammonium salts was used for the purification of cadmium from HCl extracts of samples. Recoveries of cadmium were near 100%, and the lowest recovery was 76.6%, from rice leaves. The estimated cadmium concentrations from the immunochromatography procedure were evaluated by comparison with the results of instrumental analysis (ICP-AES or ICP-MS). Comparing HCl extracts analyzed by ICP-MS with column eluates analyzed by immunochromatography, the estimated cadmium concentrations were very similar, and their recoveries were from 98 to 116%. PMID:19489614

  15. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS, and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. PMID:26471520

  16. Comparative studies of modern methods for caries detection and quantification

    OpenAIRE

    Shi, Xie-Qi

    2001-01-01

    In clinical dentistry, proper treatment of caries lesions is highly dependent on diagnostic accuracy. Aim: The present dissertation aimed at the evaluation of, and comparison between, several modern methods for caries detection and quantification. Methods: The employed methods for detection and quantification of caries lesions may be divided into two categories, namely laser fluorescence-based methods and radiographic methods. Laser fluorescence: The performance of the

  17. Toolbox Approaches Using Molecular Markers and 16S rRNA Gene Amplicon Data Sets for Identification of Fecal Pollution in Surface Water.

    Science.gov (United States)

    Ahmed, W; Staley, C; Sadowsky, M J; Gyawali, P; Sidhu, J P S; Palmer, A; Beale, D J; Toze, S

    2015-10-01

    In this study, host-associated molecular markers and bacterial 16S rRNA gene community analysis using high-throughput sequencing were used to identify the sources of fecal pollution in environmental waters in Brisbane, Australia. A total of 92 fecal and composite wastewater samples were collected from different host groups (cat, cattle, dog, horse, human, and kangaroo), and 18 water samples were collected from six sites (BR1 to BR6) along the Brisbane River in Queensland, Australia. Bacterial communities in the fecal, wastewater, and river water samples were sequenced. Water samples were also tested for the presence of bird-associated (GFD), cattle-associated (CowM3), horse-associated, and human-associated (HF183) molecular markers, to provide multiple lines of evidence regarding the possible presence of fecal pollution associated with specific hosts. Among the 18 water samples tested, 83%, 33%, 17%, and 17% were real-time PCR positive for the GFD, HF183, CowM3, and horse markers, respectively. Among the potential sources of fecal pollution in water samples from the river, DNA sequencing tended to show relatively small contributions from wastewater treatment plants (up to 13% of sequence reads). Contributions from other animal sources were rarely detected and were very small. This study is a proof of concept, and based on the results, we recommend using bacterial community analysis (where possible) along with PCR detection or quantification of host-associated molecular markers to provide information on the sources of fecal pollution in waterways. PMID:26231650

  18. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    The in vivo quantification of metabolites' concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have been presented lately with good results, but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure
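
To make the "quantification as optimization" framing concrete, here is a toy genetic algorithm that recovers the amplitude of a single Lorentzian line from synthetic data. It is a deliberately simplified sketch (one free parameter, noiseless data, naive selection/crossover/mutation operators), not the configuration used in the paper:

```python
import random

def lorentzian(x, amp, center=0.0, width=1.0):
    """A Lorentzian lineshape, linear in its amplitude."""
    return amp * width**2 / ((x - center) ** 2 + width**2)

def ga_fit_amplitude(xs, ys, generations=60, pop_size=30, seed=1):
    """Toy GA: evolve a population of candidate amplitudes, keeping the
    fittest half (selection), blending random parent pairs (crossover),
    and adding small Gaussian noise (mutation)."""
    rng = random.Random(seed)

    def loss(amp):
        return sum((lorentzian(x, amp) - y) ** 2 for x, y in zip(xs, ys))

    pop = [rng.uniform(0.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        parents = pop[: pop_size // 2]       # selection: keep fittest half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)            # crossover: blend two parents
            child += rng.gauss(0.0, 0.1)     # mutation: small Gaussian step
            children.append(child)
        pop = parents + children
    return min(pop, key=loss)

# Synthetic peak of amplitude 4.2; the GA should recover a value near it.
xs = [i * 0.2 - 3.0 for i in range(31)]
ys = [lorentzian(x, 4.2) for x in xs]
est = ga_fit_amplitude(xs, ys)
```

A realistic MRS fit would evolve several parameters per peak (amplitude, frequency, damping) for multiple overlapping lines, but the evolutionary loop has the same shape.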

  19. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  20. Molecular identification of CTX-M and blaOXY/K1 β-lactamase genes in Enterobacteriaceae by sequencing of universal M13-sequence tagged PCR-amplicons

    Directory of Open Access Journals (Sweden)

    Tärnberg Maria

    2009-01-01

    Full Text Available Abstract Background Plasmid-encoded blaCTX-M enzymes represent an important sub-group of class A β-lactamases causing the ESBL phenotype, which is increasingly found in Enterobacteriaceae including Klebsiella spp. Molecular typing of clinical ESBL isolates has become increasingly important for preventing the dissemination of ESBL producers in the nosocomial environment. Methods Multiple displacement amplified DNA derived from 20 K. pneumoniae and 34 K. oxytoca clinical isolates with an ESBL phenotype was used in a universal CTX-M PCR amplification assay. Identification and differentiation of blaCTX-M and blaOXY/K1 sequences was obtained by DNA sequencing of M13-sequence-tagged CTX-M PCR amplicons using an M13-specific sequencing primer. Results Nine out of 20 K. pneumoniae clinical isolates had a blaCTX-M genotype. Interestingly, we found that the universal degenerate primers also amplified the chromosomally located K1 gene in all 34 K. oxytoca clinical isolates. Molecular identification and differentiation between blaCTX-M and blaOXY/K1 genes could only be achieved by sequencing of the PCR amplicons. In silico analysis revealed that the universal degenerate CTX-M primer pair used here might also amplify the chromosomally located blaOXY and K1 genes in Klebsiella spp. and K1-like genes in other Enterobacteriaceae. Conclusion The PCR-based molecular typing method described here enables rapid and reliable molecular identification of blaCTX-M and blaOXY/K1 genes. The principles used in this study could also be applied to any situation in which antimicrobial resistance genes need to be sequenced.

  1. Recurrence quantification analysis of global stock markets

    Science.gov (United States)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
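
The RQA measures referred to above are computed from a binary recurrence matrix. The sketch below computes the recurrence rate and the determinism (DET, the fraction of recurrence points lying on diagonal structures) for an unembedded scalar series; a real analysis of stock indices would first delay-embed the series, normalize it, and tune the threshold:

```python
def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i][j] = 1 when states i and j lie
    within eps of each other (scalar series, no embedding)."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of matrix entries that are recurrence points."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

def determinism(R, lmin=2):
    """DET: fraction of off-main-diagonal recurrence points that lie on
    diagonal lines of length >= lmin."""
    n = len(R)
    on_lines = 0
    for d in range(1, n):                    # scan each upper diagonal
        run = 0
        for i in range(n - d):
            if R[i][i + d]:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
        if run >= lmin:
            on_lines += run
    total = sum(R[i][j] for i in range(n) for j in range(n) if i != j)
    return 2 * on_lines / total if total else 0.0  # matrix is symmetric

# A strictly periodic series: every recurrence point sits on a long
# diagonal line, so DET = 1.
periodic = [0, 1, 0, 1, 0, 1, 0, 1]
R = recurrence_matrix(periodic, 0.1)
```

Higher DET indicates more deterministic structure, which is the kind of measure the study tracks in sliding windows across crisis periods.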

  2. Theoretical model and quantification of reflectance photometer

    Institute of Scientific and Technical Information of China (English)

    Lihua Huang; Youbao Zhang; Chengke Xie; Jianfeng Qu; Huijie Huang; Xiangzhao Wang

    2009-01-01

    The surface morphology of a lateral flow (LF) strip is examined by scanning electron microscopy (SEM), and the diffuse reflection of the porous strip with or without nanogold particles is investigated. Based on the scattering and absorption of nanogold particles, a reflectance photometer is developed for quantification of LF strips with nanogold particles as the reporter. The integrated reflection optical density indicates the signals of the test line and control line. As an example, serial dilutions of microalbuminuria (MAU) solution are used to calibrate the performance of the reflectance photometer. The dose-response curve is fitted with a four-parameter logistic mathematical model for the determination of an unknown MAU concentration. The response curve spans a dynamic range of 5 to 200 μg/ml. The developed reflectance photometer enables simple and quantitative detection of an analyte on a nanogold-labeled LF strip.
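
The four-parameter logistic calibration mentioned above, and its inversion to read an unknown concentration off the fitted curve, can be written down directly. The parameter values below are illustrative placeholders, not the calibration of the described instrument:

```python
def four_pl(x, a, b, c, d):
    """Four-parameter logistic curve: a = response at zero dose,
    d = response at infinite dose, c = dose giving the half-way
    response, b = slope (Hill) coefficient."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def invert_four_pl(y, a, b, c, d):
    """Read an unknown concentration off the fitted calibration curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Hypothetical calibration; in the instrument the fit would span the
# reported 5-200 ug/ml working range.
a, b, c, d = 0.05, 1.5, 50.0, 2.0
response = four_pl(50.0, a, b, c, d)        # response at the mid-point dose
dose = invert_four_pl(response, a, b, c, d) # recovers 50.0
```

In practice the four parameters come from a nonlinear least-squares fit of the measured standard curve; the inversion is only trusted inside the calibrated range.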

  3. On Uncertainty Quantification in Particle Accelerators Modelling

    CERN Document Server

    Adelmann, Andreas

    2015-01-01

    Using a cyclotron-based model problem, we demonstrate for the first time the applicability and usefulness of an uncertainty quantification (UQ) approach in order to construct surrogate models for quantities such as emittance and energy spread, but also the halo parameter, and construct a global sensitivity analysis together with error propagation and $L_{2}$ error analysis. The model problem is selected in a way that it represents a template for general high-intensity particle accelerator modelling tasks. The presented physics problem has to be seen as hypothetical, with the aim to demonstrate the usefulness and applicability of the presented UQ approach rather than to solve a particular problem. The proposed UQ approach is based on sparse polynomial chaos expansions and relies on a small number of high-fidelity particle accelerator simulations. Within this UQ framework, the identification of the most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobols' ...
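
The Sobol indices used for the global sensitivity analysis can be illustrated independently of the paper's sparse polynomial chaos surrogate. The sketch below uses a plain Monte Carlo pick-and-freeze estimator on a toy additive model whose first-order indices are known analytically (0.2 and 0.8); all names and sample sizes are illustrative:

```python
import random

def sobol_first_order(model, dim, n=20000, seed=7):
    """Monte Carlo pick-and-freeze estimate of first-order Sobol indices
    for a model of independent U(0,1) inputs. For each input i, the
    matrix B is re-evaluated with column i 'frozen' to A's values; the
    covariance of the paired outputs estimates Var(E[Y|X_i])."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(a) for a in A]
    yB = [model(b) for b in B]
    mean = sum(yA + yB) / (2 * n)
    var = sum((y - mean) ** 2 for y in yA + yB) / (2 * n)
    indices = []
    for i in range(dim):
        yABi = [model(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * (yabi - yb) for ya, yabi, yb in zip(yA, yABi, yB)) / n
        indices.append(cov / var)
    return indices

# Toy additive model Y = X1 + 2*X2: analytically S1 = 0.2, S2 = 0.8.
s1, s2 = sobol_first_order(lambda x: x[0] + 2 * x[1], dim=2)
```

The polynomial-chaos route of the paper obtains the same indices analytically from the expansion coefficients, needing far fewer expensive accelerator simulations than this brute-force sampler.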

  4. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development, and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using financial information on these companies during the years 2005-2010. To carry out this work, quantification methods of managerial skills are applied to CNFR NAVROM SA, Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.

  5. Quantification of adipose tissue insulin sensitivity

    DEFF Research Database (Denmark)

    Søndergaard, Esben; Jensen, Michael D

    2016-01-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that insulin adipose tissue resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible quantification of adipose tissue insulin sensitivity, in vivo, in humans, is an important measure. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantitate adipose tissue insulin sensitivity and will discuss their strengths and

  6. Quantification of adipose tissue insulin sensitivity.

    Science.gov (United States)

    Søndergaard, Esben; Jensen, Michael D

    2016-06-01

    In metabolically healthy humans, adipose tissue is exquisitely sensitive to insulin. Similar to muscle and liver, adipose tissue lipolysis is insulin resistant in adults with central obesity and type 2 diabetes. Perhaps uniquely, however, insulin resistance in adipose tissue may directly contribute to development of insulin resistance in muscle and liver because of the increased delivery of free fatty acids to those tissues. It has been hypothesized that insulin adipose tissue resistance may precede other metabolic defects in obesity and type 2 diabetes. Therefore, precise and reproducible quantification of adipose tissue insulin sensitivity, in vivo, in humans, is an important measure. Unfortunately, no consensus exists on how to determine adipose tissue insulin sensitivity. We review the methods available to quantitate adipose tissue insulin sensitivity and will discuss their strengths and weaknesses. PMID:27073214

  7. Recurrence quantification analysis: theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.
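    The core quantities of recurrence quantification analysis are easy to sketch: a recurrence plot thresholds the pairwise distance matrix of a time series, and measures such as determinism (DET) summarize its diagonal-line structure. A minimal illustration for a scalar series; function names are our own, and embedding is omitted:

```python
import numpy as np

def recurrence_plot(series, eps):
    """Binary recurrence plot: 1 where two states lie within eps of each other."""
    s = np.asarray(series, float)
    return (np.abs(s[:, None] - s[None, :]) <= eps).astype(int)

def determinism(rp, lmin=2):
    """DET: fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = rp.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(rp, k)) + [0]:  # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / rp.sum()
```

    For a strictly periodic signal every recurrent point lies on a long diagonal, so DET evaluates to 1, while uncorrelated noise yields values near 0.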

  8. Feature isolation and quantification of evolving datasets

    Science.gov (United States)

    1994-01-01

    Identifying and isolating features is an important part of visualization and a crucial step for the analysis and understanding of large time-dependent data sets (either from observation or simulation). In this proposal, we address these concerns, namely the investigation and implementation of basic 2D and 3D feature based methods to enhance current visualization techniques and provide the building blocks for automatic feature recognition, tracking, and correlation. These methods incorporate ideas from scientific visualization, computer vision, image processing, and mathematical morphology. Our focus is in the area of fluid dynamics, and we show the applicability of these methods to the quantification and tracking of three-dimensional vortex and turbulence bursts.

  9. Uncertainty quantification in DIC with Kriging regression

    Science.gov (United States)

    Wang, Dezhi; DiazDelaO, F. A.; Wang, Weizhuo; Lin, Xiaoshan; Patterson, Eann A.; Mottershead, John E.

    2016-03-01

    A Kriging regression model is developed as a post-processing technique for the treatment of measurement uncertainty in classical subset-based Digital Image Correlation (DIC). Regression is achieved by regularising the sample-point correlation matrix using a local, subset-based, assessment of the measurement error with assumed statistical normality and based on the Sum of Squared Differences (SSD) criterion. This leads to a Kriging-regression model in the form of a Gaussian process representing uncertainty on the Kriging estimate of the measured displacement field. The method is demonstrated using numerical and experimental examples. Kriging estimates of displacement fields are shown to be in excellent agreement with 'true' values for the numerical cases and in the experimental example uncertainty quantification is carried out using the Gaussian random process that forms part of the Kriging model. The root mean square error (RMSE) on the estimated displacements is produced and standard deviations on local strain estimates are determined.
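    The essential computation, a Gaussian-process (Kriging) estimate whose correlation matrix is regularised by a noise "nugget" standing in for the local measurement-error variance, can be sketched as follows. This is a generic one-dimensional illustration, not the authors' DIC implementation; names and defaults are ours:

```python
import numpy as np

def kriging_predict(x_train, y_train, x_query, length_scale=1.0, noise_var=1e-2):
    """Kriging (GP) regression with an RBF kernel.

    `noise_var` regularises the sample-point correlation matrix, and the
    returned standard deviation quantifies uncertainty on the estimate.
    """
    def k(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length_scale) ** 2)

    K = k(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = k(x_train, x_query)
    alpha = np.linalg.solve(K, y_train)       # weights for the posterior mean
    mean = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = 1.0 + noise_var - np.sum(Ks * v, axis=0)  # prior minus explained variance
    std = np.sqrt(np.maximum(var, 0.0))
    return mean, std
```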

  10. Quantification of Osteon Morphology Using Geometric Histomorphometrics.

    Science.gov (United States)

    Dillon, Scott; Cunningham, Craig; Felts, Paul

    2016-03-01

    Many histological methods in forensic anthropology utilize combinations of traditional histomorphometric parameters which may not accurately describe the morphology of microstructural features. Here, we report the novel application of a geometric morphometric method suitable when considering structures without anatomically homologous landmarks for the quantification of complete secondary osteon size and morphology. The method is tested for its suitability in the measurement of intact secondary osteons using osteons digitized from transverse femoral diaphyseal sections prepared from two human individuals. The results of methodological testing demonstrate the efficacy of the technique when applied to intact secondary osteons. In providing accurate characterization of micromorphology within the robust mathematical framework of geometric morphometrics, this method may surpass traditional histomorphometric variables currently employed in forensic research and practice. A preliminary study of the intersectional histomorphometric variation within the femoral diaphysis is made using this geometric histomorphometric method to demonstrate its potential. PMID:26478136

  11. Quantification of Structure from Medical Images

    DEFF Research Database (Denmark)

    Qazi, Arish Asif

    In this thesis, we present automated methods that quantify information from medical images; information that is intended to assist and enable clinicians gain a better understanding of the underlying pathology. The first part of the thesis presents methods that analyse the articular cartilage...... information beyond that of traditional morphometric measures. The thesis also proposes a fully automatic and generic statistical framework for identifying biologically interpretable regions of difference (ROD) between two groups of biological objects, attributed by anatomical differences or changes relating...... to pathology, without a priori knowledge about the location, extent, or topology of the ROD. Based on quantifications from both morphometric and textural based imaging markers, our method has identified the most pathological regions in the articular cartilage. The remaining part of the thesis...

  12. Quantification practices in the nuclear industry

    International Nuclear Information System (INIS)

    In this chapter the quantification of risk practices adopted by the nuclear industries in Germany, Britain and France are examined as representative of the practices adopted throughout Europe. From this examination a number of conclusions are drawn about the common features of the practices adopted. In making this survey, the views expressed in the report of the Task Force on Safety Goals/Objectives appointed by the Commission of the European Communities are taken into account. For each country considered, the legal requirements for presentation of quantified risk assessment as part of the licensing procedure are examined, and the way in which the requirements have been developed for practical application is then examined. (author)

  13. Quantification of uncertainties for application in detonation simulation

    Science.gov (United States)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying uncertainty, the key is to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and on verification and validation technology, a framework for UQ (uncertainty quantification) is put forward for cases in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.

  14. Quantification of low levels of fluorine content in thin films

    International Nuclear Information System (INIS)

    Fluorine quantification in thin film samples containing different amounts of fluorine atoms was accomplished by combining proton-Rutherford Backscattering Spectrometry (p-RBS) and proton induced gamma-ray emission (PIGE) using proton beams of 1550 and 2330 keV for p-RBS and PIGE measurements, respectively. The capabilities of the proposed quantification method are illustrated with examples of the analysis of a series of samples of fluorine-doped tin oxides, fluorinated silica, and fluorinated diamond-like carbon films. It is shown that this procedure allows the quantification of F contents as low as 1 at.% in thin films with thicknesses in the 100–400 nm range.

  15. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau

    2008-09-01

    This report presents the forward sensitivity analysis method as a means for quantification of uncertainty in system analysis. The traditional approach to uncertainty quantification is based on a “black box” approach. The simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. This approach requires a large number of simulation runs and therefore has a high computational cost. Contrary to the “black box” method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In this approach, equations for the propagation of uncertainty are constructed and the sensitivity is solved for as variables in the same simulation. This “glass box” method can generate similar sensitivity information as the above “black box” approach with only a few runs covering a large uncertainty region. Because only a small number of runs is required, those runs can be done with high accuracy in space and time, ensuring that the uncertainty of the physical model is being measured and not simply the numerical error caused by the coarse discretization. In the forward sensitivity method, the model is differentiated with respect to each parameter to yield an additional system of the same size as the original one, the result of which is the solution sensitivity. The sensitivity of any output variable can then be directly obtained from these sensitivities by applying the chain rule of differentiation. We extend the forward sensitivity method to include time and spatial steps as special parameters so that the numerical errors can be quantified against other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty analysis. By knowing the relative sensitivity of time and space steps with other
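    The "glass box" idea can be illustrated on a toy model: differentiating dy/dt = -k*y with respect to the parameter k yields a companion equation for the sensitivity s = dy/dk, integrated alongside the original. A minimal sketch using forward Euler; the example model and function name are our own, not from the report:

```python
def forward_sensitivity(k=0.5, y0=1.0, t_end=1.0, dt=1e-4):
    """Forward sensitivity for dy/dt = -k*y.

    Differentiating the model equation w.r.t. k gives an additional ODE
    of the same size, solved in the same time loop:
        ds/dt = -y - k*s,   s(0) = 0.
    Analytically, y = y0*exp(-k*t) and s = dy/dk = -t*y0*exp(-k*t).
    """
    y, s = y0, 0.0
    for _ in range(int(round(t_end / dt))):
        y_new = y + dt * (-k * y)
        s_new = s + dt * (-y - k * s)
        y, s = y_new, s_new
    return y, s
```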

  16. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  17. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  18. Uncertainty Quantification for Production Navier-Stokes Solvers Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The uncertainty quantification methods developed under this program are designed for use with current state-of-the-art flow solvers developed by and in use at NASA....

  19. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective for the Phase II effort will be to develop a comprehensive, efficient, and flexible uncertainty quantification (UQ) framework implemented within a...

  20. Droplet digital PCR, the new tool in HIV reservoir quantification?

    OpenAIRE

    De Spiegelaere, Ward; Kiselinova, Maja; Malatinkova, Eva; Pasternak, Alexander; Berkhout, Ben; Vandekerckhove, Linos

    2013-01-01

    Background: Digital PCR is a relatively old concept for absolute quantification of DNA using PCR, but recent technological developments have allowed its wide use. The current state-of-the-art technique for performing digital PCR is based on microdroplet technology. Direct absolute quantification removes the need for standard curves and increases assay accuracy. In addition, the end-point PCR set-up allows higher assay flexibility and decreases quantitative bias due to variations in PCR efficiency...
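    The absolute quantification step rests on Poisson statistics: from the fraction p of positive droplets, the mean template copies per droplet is λ = -ln(1 - p), with no standard curve required. A sketch; the droplet-volume default is a nominal illustrative value, not taken from this abstract:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_ul=0.00085):
    """Absolute target concentration from droplet counts via Poisson correction.

    lambda = -ln(1 - p) accounts for droplets that received more than one
    template copy. The ~0.85 nL droplet volume is only an assumed default.
    """
    if not 0 < positive < total:
        raise ValueError("need at least one positive and one negative droplet")
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per droplet
    return lam / droplet_volume_ul    # copies per microliter of reaction
```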

  1. Volumetric motion quantification by 3D tissue phase mapped CMR

    OpenAIRE

    Lutz Anja; Paul Jan; Bornstedt Axel; Nienhaus G; Etyngier Patrick; Bernhardt Peter; Rottbauer Wolfgang; Rasche Volker

    2012-01-01

    Abstract Background The objective of this study was the quantification of myocardial motion from 3D tissue phase mapped (TPM) CMR. Recent work on myocardial motion quantification by TPM has been focussed on multi-slice 2D acquisitions thus excluding motion information from large regions of the left ventricle. Volumetric motion assessment appears an important next step towards the understanding of the volumetric myocardial motion and hence may further improve diagnosis and treatments in patien...

  2. Neutron-encoded mass signatures for multiplexed proteome quantification.

    Science.gov (United States)

    Hebert, Alexander S; Merrill, Anna E; Bailey, Derek J; Still, Amelia J; Westphall, Michael S; Strieter, Eric R; Pagliarini, David J; Coon, Joshua J

    2013-04-01

    We describe a protein quantification method called neutron encoding that exploits the subtle mass differences caused by nuclear binding energy variation in stable isotopes. These mass differences are synthetically encoded into amino acids and incorporated into yeast and mouse proteins via metabolic labeling. Mass spectrometry analysis with high mass resolution (>200,000) reveals the isotopologue-embedded peptide signals, permitting quantification. Neutron encoding will enable highly multiplexed proteome analysis with excellent dynamic range and accuracy. PMID:23435260

  3. Quantification of global myocardial oxygenation in humans: initial experience

    OpenAIRE

    Gropler Robert J; Woodard Pamela K; Lyons Matt; Lesniak Donna; O'Connor Robert; McCommis Kyle S; Zheng Jie

    2010-01-01

    Abstract Purpose To assess the feasibility of our newly developed cardiovascular magnetic resonance (CMR) methods to quantify global and/or regional myocardial oxygen consumption rate (MVO2) at rest and during pharmacologically-induced vasodilation in normal volunteers. Methods A breath-hold T2 quantification method is developed to calculate oxygen extraction fraction (OEF) and MVO2 rate at rest and/or during hyperemia, using a two-compartment model. A previously reported T2 quantification me...

  4. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Min ZHANG; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    2010-01-01

    A quantification model of transient heat conduction was developed to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation at different points in the fruit. It took into account the heat exchange of a representative elemental volume, metabolic heat, and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature distribution in the cooling...

  5. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost that quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right-hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling at massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance of 73% of theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
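    The first ingredient, stochastic estimation of the diagonal of an inverse matrix, can be sketched with Rademacher probe vectors; a production version would replace the dense solve below with the iterative, mixed-precision solver the abstract describes. The function name and defaults are illustrative:

```python
import numpy as np

def diag_inverse_estimate(A, n_probes=200, seed=0):
    """Stochastic estimate of diag(A^-1) via probing.

    Uses random +/-1 probe vectors v and the identity
        diag(A^-1) ~ mean over probes of v * (A^-1 v),
    turning the cubic-cost factorization into a set of linear solves
    (multiple right-hand sides).
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    acc = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        x = np.linalg.solve(A, v)   # stand-in for an iterative solve
        acc += v * x                # elementwise: contributes A^-1's diagonal
    return acc / n_probes
```

    For a diagonal matrix the estimator is exact for every probe, which makes a convenient sanity check.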

  6. Uncertainty Quantification for Cargo Hold Fires

    CERN Document Server

    DeGennaro, Anthony M; Martinelli, Luigi; Rowley, Clarence W

    2015-01-01

    The purpose of this study is twofold -- first, to introduce the application of high-order discontinuous Galerkin methods to buoyancy-driven cargo hold fire simulations, second, to explore statistical variation in the fluid dynamics of a cargo hold fire given parameterized uncertainty in the fire source location and temperature. Cargo hold fires represent a class of problems that require highly-accurate computational methods to simulate faithfully. Hence, we use an in-house discontinuous Galerkin code to treat these flows. Cargo hold fires also exhibit a large amount of uncertainty with respect to the boundary conditions. Thus, the second aim of this paper is to quantify the resulting uncertainty in the flow, using tools from the uncertainty quantification community to ensure that our efforts require a minimal number of simulations. We expect that the results of this study will provide statistical insight into the effects of fire location and temperature on cargo fires, and also assist in the optimization of f...

  7. Quantification of food intake in Drosophila.

    Directory of Open Access Journals (Sweden)

    Richard Wong

    Full Text Available Measurement of food intake in the fruit fly Drosophila melanogaster is often necessary for studies of behaviour, nutrition and drug administration. There is no reliable and agreed method for measuring food intake of flies in undisturbed, steady state, and normal culture conditions. We report such a method, based on measurement of feeding frequency by proboscis-extension, validated by short-term measurements of food dye intake. We used the method to demonstrate that (a) female flies feed more frequently than males, (b) flies feed more often when housed in larger groups and (c) fly feeding varies at different times of the day. We also show that alterations in food intake are not induced by dietary restriction or by a null mutation of the fly insulin receptor substrate chico. In contrast, mutation of takeout increases food intake by increasing feeding frequency while mutation of ovo(D1) increases food intake by increasing the volume of food consumed per proboscis-extension. This approach provides a practical and reliable method for quantification of food intake in Drosophila under normal, undisturbed culture conditions.

  8. Legionella spp. isolation and quantification from greywater.

    Science.gov (United States)

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both producing aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of Legionella isolation from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:
    • To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.
    • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  9. Standardless quantification methods in electron probe microanalysis

    International Nuclear Information System (INIS)

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. The standardless algorithms are considerably faster than the methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to solve the problem of current variation, for example, in equipment with a cold field emission gun. Due to significant advances in the accuracy achieved during the last years, a product of the successive efforts made to improve the description of generation, absorption and detection of X-rays, the standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  10. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. The standardless algorithms are considerably faster than the methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to solve the problem of current variation, for example, in equipment with a cold field emission gun. Due to significant advances in the accuracy achieved during the last years, a product of the successive efforts made to improve the description of generation, absorption and detection of X-rays, the standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  11. Quantification of moving target cyber defenses

    Science.gov (United States)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

    Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness and the upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven of the 23 methods rank as the more dominant techniques, five of which involve either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the worst-performing techniques are those primarily aimed at network randomization.

  12. Uncertainty quantification in reacting flow modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Le Maître, Olivier P. (Université d'Evry Val d'Essonne, Evry, France); Reagan, Matthew T.; Knio, Omar M. (Johns Hopkins University, Baltimore, MD); Ghanem, Roger Georges (Johns Hopkins University, Baltimore, MD); Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
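    The non-intrusive construction can be sketched for a single standard-normal input: the deterministic model is evaluated at sample or quadrature points, and the results are projected onto Hermite polynomials to obtain the PC mode strengths. An illustrative one-dimensional version using Gauss-Hermite quadrature, not the authors' code:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def nipc_coefficients(model, order=4, quad_points=10):
    """Non-intrusive polynomial chaos by spectral projection.

    For a standard-normal input xi, the output is expanded in
    probabilists' Hermite polynomials He_k; each coefficient is
        c_k = E[model(xi) He_k(xi)] / E[He_k(xi)^2],
    with the expectations evaluated by Gauss-Hermite quadrature.
    """
    x, w = hermegauss(quad_points)   # nodes/weights for weight exp(-x^2/2)
    w = w / w.sum()                  # normalise to a probability measure
    coeffs = []
    for k in range(order + 1):
        he_k = hermeval(x, [0.0] * k + [1.0])   # evaluates He_k at the nodes
        norm = np.sum(w * he_k * he_k)          # E[He_k^2] = k!
        coeffs.append(np.sum(w * model(x) * he_k) / norm)
    return np.array(coeffs)
```

    For the model f(xi) = xi^2 the exact expansion is He_0 + He_2, i.e. coefficients [1, 0, 1, 0, ...], which the quadrature reproduces exactly.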

  13. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with higher accuracy than previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
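    A cross recurrence plot itself is straightforward to construct: it marks which states of one system fall within a threshold eps of the states of another, and simple quantifiers such as the recurrence rate follow directly. A minimal sketch for scalar series; the curved-trace tracking measure proposed above is not reproduced here:

```python
import numpy as np

def cross_recurrence_plot(x, y, eps):
    """Binary CRP: entry (i, j) is 1 when state x_i lies within eps of y_j."""
    x = np.asarray(x, float)[:, None]
    y = np.asarray(y, float)[None, :]
    return (np.abs(x - y) <= eps).astype(int)

def recurrence_rate(crp):
    """Fraction of recurrent points, the simplest CRP quantifier."""
    return crp.mean()
```

    Two identical series produce a CRP with a main diagonal of ones, the degenerate case of the traces that become curved or disrupted when one rendition is a time-warped version of the other.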

  14. Quantification of the vocal folds’ dynamic displacements

    Science.gov (United States)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  15. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

    The optimization of the risk-benefit ratio is a major concern in pediatric radiology, due to the greater vulnerability of children, compared to adults, to the late somatic and genetic effects of exposure to radiation. In Brazil, head trauma is estimated to account for 18% of deaths in the 1-5 year age group, and radiography is the primary diagnostic test for the detection of skull fracture. Knowing that image quality is essential to ensure the identification of anatomical structures and to minimize errors of diagnostic interpretation, this paper proposes the development and construction of homogeneous skull phantoms for the 1-5 year age group. The homogeneous phantoms were constructed based on the classification and quantification of the tissues present in the skulls of pediatric patients. In this procedure, computational algorithms written in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT images. Preliminary measurements show that, between the ages of 1 and 5 years, a pediatric skull region with an average anteroposterior diameter of 145.73 ± 2.97 mm can be represented by 92.34 ± 5.22 mm of Lucite and 1.75 ± 0.21 mm of aluminum plates in a patient-equivalent phantom (PEP) arrangement. After their construction, the phantoms will be used for image and dose optimization in pediatric computed radiography examination protocols.

  16. Legionella spp. isolation and quantification from greywater

    Science.gov (United States)

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of isolating Legionella from GW lies in the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made:•To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended.•Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample. PMID:26740925

  17. Volumetric motion quantification by 3D tissue phase mapped CMR

    Directory of Open Access Journals (Sweden)

    Lutz Anja

    2012-10-01

    Full Text Available Abstract Background The objective of this study was the quantification of myocardial motion from 3D tissue phase mapped (TPM) CMR. Recent work on myocardial motion quantification by TPM has focussed on multi-slice 2D acquisitions, thus excluding motion information from large regions of the left ventricle. Volumetric motion assessment appears to be an important next step towards understanding volumetric myocardial motion and hence may further improve diagnosis and treatment in patients with myocardial motion abnormalities. Methods Volumetric motion quantification of the complete left ventricle was performed in 12 healthy volunteers and two patients applying a black-blood 3D TPM sequence. The resulting motion field was analysed for motion pattern differences between apical and basal locations as well as for asynchronous motion patterns between different myocardial segments in one or more slices. Motion quantification included velocity, torsion, rotation angle and strain-derived parameters. Results All investigated motion quantification parameters could be calculated from the 3D-TPM data. Parameters quantifying hypokinetic or asynchronous motion demonstrated differences between motion-impaired and healthy myocardium. Conclusions 3D-TPM enables the gapless volumetric quantification of motion abnormalities of the left ventricle, which can be applied in future applications as additional information to provide a more detailed analysis of left ventricular function.

  18. Quantification of isotopic turnover in agricultural systems

    Science.gov (United States)

    Braun, A.; Auerswald, K.; Schnyder, H.

    2012-04-01

    The isotopic turnover, which is a proxy for the metabolic rate, is gaining scientific importance. It is quantified for an increasing range of organisms, from microorganisms over plants to animals including agricultural livestock. Additionally, the isotopic turnover is analyzed on different scales, from organs to organisms to ecosystems and even to the biosphere. In particular, the quantification of the isotopic turnover of specific tissues within the same organism, e.g. organs like liver and muscle and products like milk and faeces, has brought new insights that improve the understanding of nutrient cycles and fluxes. Thus, knowledge of the isotopic turnover is important in many areas, including physiology, e.g. milk synthesis, ecology, e.g. soil retention time of water, and medical science, e.g. cancer diagnosis. So far, the isotopic turnover is quantified by applying time-, cost- and expertise-intensive tracer experiments. Usually, this comprises two isotopic equilibration periods: a first equilibration period with a constant isotopic input signal is followed by a second equilibration period with a distinct constant isotopic input signal. This yields a smooth signal change from the first to the second signal in the object under consideration. This approach entails at least three major problems. (i) The input signals must be controlled isotopically, which is almost impossible in many realistic cases like free-ranging animals. (ii) Both equilibration periods may be very long, especially when the turnover rate of the object under consideration is very slow, which aggravates the first problem. (iii) The detection of small or slow pools is improved by large isotopic signal changes, but large isotopic changes also involve a considerable change in the input material; e.g. animal studies are usually carried out as diet-switch experiments, where the diet is switched between C3 and C4 plants, since C3 and C4 plants differ strongly in their isotopic signal. The
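The diet-switch design described above is conventionally analyzed with a first-order turnover model: after the input switches from one constant signal to another, the tissue signal relaxes exponentially towards the new equilibrium. A minimal sketch under that textbook assumption (all numbers illustrative, not from the study):

```python
import math

# First-order turnover after a diet switch from signal d_old to d_new:
#   d(t) = d_new + (d_old - d_new) * exp(-k * t)
# The turnover rate k is recovered by least squares on the
# log-transformed decay, ln((d(t) - d_new)/(d_old - d_new)) = -k * t.

d_old, d_new, k_true = -27.0, -13.0, 0.2    # per-mil signals, k in 1/day
times = [0, 2, 4, 8, 16]                     # sampling days
signal = [d_new + (d_old - d_new) * math.exp(-k_true * t) for t in times]

xs = times
ys = [math.log((d - d_new) / (d_old - d_new)) for d in signal]
# Zero-intercept least-squares slope gives -k
k_est = -sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
```

With noiseless data the fit recovers k exactly; the paper's point is that obtaining such data requires two long, isotopically controlled equilibration periods.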

  19. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and to study their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with a fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked for the respective time length. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage by cells at different time points. An uptake of 85% by HCT 116 cells is observed after 24 hours of incubation at a NW-to-cell ratio of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the utilization of a machine learning-based, time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and with another cell line derived from cervical carcinoma, HeLa. It thus has the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell type.

  20. GPU-accelerated voxelwise hepatic perfusion quantification.

    Science.gov (United States)

    Wang, H; Cao, Y

    2012-09-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation, while maintaining the same accuracy as the conventional method. Using compute unified device architecture (CUDA)-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations of different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using the simulated DCE data and the experimental DCE MR images from patients. The computation speed is improved by 30 times using a NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while both methods result in perfusion parameter differences less than 10^-6. The method will be useful for generating liver perfusion images in clinical settings. PMID:22892645
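The per-voxel problem being parallelized can be sketched with the dual-input single-compartment model dC/dt = ka·Ca(t) + kp·Cp(t) − k2·C(t), with arterial (Ca) and portal-venous (Cp) input functions. The inputs, rate constants, forward-Euler integration, and the brute-force one-parameter fit below are illustrative stand-ins for the paper's GPU-parallelized nonlinear least-squares fit:

```python
# Toy per-voxel dual-input single-compartment fit (illustrative values).

def simulate(ka, kp, k2, Ca, Cp, dt):
    """Forward-Euler solution of dC/dt = ka*Ca + kp*Cp - k2*C, C(0)=0."""
    C, out = 0.0, []
    for a, p in zip(Ca, Cp):
        out.append(C)
        C += dt * (ka * a + kp * p - k2 * C)
    return out

dt = 0.1
t = [i * dt for i in range(100)]
Ca = [1.0 if 1.0 <= ti < 2.0 else 0.0 for ti in t]  # toy arterial bolus
Cp = [1.0 if 2.0 <= ti < 4.0 else 0.0 for ti in t]  # delayed portal input
data = simulate(0.3, 0.6, 0.5, Ca, Cp, dt)          # "measured" voxel curve

def sse(k2):
    """Sum of squared residuals for a candidate clearance rate k2."""
    model = simulate(0.3, 0.6, k2, Ca, Cp, dt)
    return sum((m - d) ** 2 for m, d in zip(model, data))

# Grid search over k2 with ka, kp held fixed -- a stand-in for the
# nonlinear least-squares step that the paper distributes over threads.
k2_est = min((k / 100 for k in range(1, 101)), key=sse)
```

Repeating such a fit independently for every one of ~6×10^5 voxels is exactly the embarrassingly parallel workload that maps well onto GPU blocks.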

  1. GPU-accelerated voxelwise hepatic perfusion quantification

    International Nuclear Information System (INIS)

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation, while maintaining the same accuracy as the conventional method. Using compute unified device architecture (CUDA)-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations of different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using the simulated DCE data and the experimental DCE MR images from patients. The computation speed is improved by 30 times using a NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while both methods result in perfusion parameter differences less than 10^-6. The method will be useful for generating liver perfusion images in clinical settings. (paper)

  2. Extended quantification of the generalized recurrence plot

    Science.gov (United States)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2016-04-01

    The generalized recurrence plot is a modern tool for quantification of complex spatial patterns. Its applications span the analysis of trabecular bone structures, Turing structures, turbulent spatial plankton patterns, and fractals. But it is also successfully applied to the description of spatio-temporal dynamics and the detection of regime shifts, such as in the complex Ginzburg-Landau equation. The recurrence plot based determinism is a central measure in this framework, quantifying the level of regularity in temporal and spatial structures. We extend this measure for the generalized recurrence plot by considering symmetry operations beyond simple translation. It is tested not only on two-dimensional regular patterns and noise but also on complex spatial patterns, reconstructing the parameter space of the complex Ginzburg-Landau equation. The extended version of the determinism results in values which are consistent with the original recurrence plot approach. Furthermore, the proposed method allows a split of the determinism into parts which are based on laminar and non-laminar regions of the two-dimensional pattern of the complex Ginzburg-Landau equation. A comparison of these parts with a standard method of image classification, the co-occurrence matrix approach, shows differences especially in the description of patterns associated with turbulence. In that case, it seems that the extended version of the determinism allows a distinction of phase turbulence and defect turbulence by means of their spatial patterns. This ability of the proposed method promises new insights in other systems with turbulent dynamics coming from climatology, biology, ecology, and social sciences, for example.

  3. Quantification of asphaltene precipitation by scaling equation

    Science.gov (United States)

    Janier, Josefina Barnachea; Jalil, Mohamad Afzal B. Abd.; Samin, Mohamad Izhar B. Mohd; Karim, Samsul Ariffin B. A.

    2015-02-01

    Asphaltene precipitation from crude oil is one of the issues facing the oil industry. The deposition of asphaltene occurs during production, transportation and separation processes. The injection of carbon dioxide (CO2) during enhanced oil recovery (EOR) is believed to contribute much to the precipitation of asphaltene. Precipitation can be affected by changes in the temperature and pressure of the crude oil; however, a reduction in pressure contributes more to the instability of asphaltene than temperature does. This paper discusses the quantification of precipitated asphaltene in crude oil at different high pressures and at constant temperature. The derived scaling equation was based on reservoir conditions, with variation in the amount of carbon dioxide (CO2) mixed with Dulang, a light crude oil sample used in the experiment, to study the stability of asphaltene. A FluidEval PVT cell with a Solid Detection System (SDS) was the instrument used to gain experimental knowledge of the behavior of the fluid at reservoir conditions. Two conditions were followed in the conduct of the experiment: firstly, a 45 cc light crude oil sample was mixed with 18 cc (40%) of CO2, and secondly, the same amount of crude oil sample was mixed with 27 cc (60%) of CO2. Results showed that the 45 cc crude oil sample combined with 18 cc (40%) of CO2 gas had a saturation pressure of 1498.37 psi and an asphaltene onset point of 1620 psi. For the same amount of crude oil combined with 27 cc (60%) of CO2, the saturation pressure was 2046.502 psi and the asphaltene onset point was 2230 psi. The derivation of the scaling equation considered reservoir temperature, pressure, bubble point pressure, mole percent of the precipitant (the injected CO2 gas), and the gas molecular weight. The scaling equation resulted in a third-order polynomial that can be used to quantify the amount of asphaltene in crude oil.
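The end product of such a scaling analysis is a third-order polynomial in a scaled variable. As a sketch of how such a polynomial is obtained from measured points, the code below fits a cubic exactly through four (scaled-variable, precipitated-asphaltene) pairs by Gaussian elimination; the data points are illustrative, not the Dulang results:

```python
# Fit w = c0 + c1*x + c2*x^2 + c3*x^3 through four points (toy data).

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

xs = [0.5, 1.0, 1.5, 2.0]   # scaled pressure/dilution variable (illustrative)
ws = [0.2, 0.9, 2.6, 5.8]   # precipitated asphaltene, wt% (illustrative)
A = [[x ** p for p in range(4)] for x in xs]
c = solve(A, ws)            # cubic coefficients c0..c3

def predict(x):
    """Evaluate the fitted scaling polynomial."""
    return sum(ci * x ** i for i, ci in enumerate(c))
```

In practice one would least-squares fit many scaled data points rather than interpolate four, but the resulting cubic is used the same way: evaluate it at a new scaled condition to predict the precipitated amount.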

  4. Rapid digital quantification of microfracture populations

    Science.gov (United States)

    Gomez, Leonel A.; Laubach, Stephen E.

    2006-03-01

    Populations of microfractures are a structural fabric in many rocks deformed at upper crustal conditions. In some cases these fractures are visible in transmitted-light microscopy as fluid-inclusion planes or cement-filled microfractures, but because SEM-based cathodoluminescence (CL) reveals more fractures and delineates their shapes, sizes, and crosscutting relations, it is a more effective structural tool. Yet at magnifications of 150-300×, at which many microfractures are visible, SEM-CL detectors image only small sample areas (0.1-0.5 mm2) relative to fracture population patterns. The substantial effort required to image and measure centimeter-size areas at high magnification has impeded quantitative study of microfractures. We present a method for efficient collection of mosaics of high-resolution CL imagery, a preparation method that allows samples to be any size while retaining continuous imagery of rock (no gaps), and software that facilitates fracture mapping and data reduction. Although the method introduced here was developed for CL imagery, it can be used with any other kind of images, including mosaics from petrographic microscopes. Compared with manual measurements, the new method increases several-fold the number of microfractures imaged without a proportional increase in level of effort, increases the accuracy and repeatability of fracture measurements, and speeds quantification and display of fracture population attributes. We illustrate the method on microfracture arrays in dolostone from NE Mexico and sandstone from NW Scotland. We show that key aspects of microfracture population attributes are only fully manifest at scales larger than a single thin section.

  5. Uncertainty quantification of neutronics characteristics using Latin hypercube sampling method

    International Nuclear Information System (INIS)

    The Latin Hypercube Sampling (LHS) method is applied to the uncertainty quantification of neutronics characteristics within the Monte-Carlo based sampling framework. The Monte-Carlo based sampling method is one of the methods for quantifying the uncertainty of neutronics characteristics (e.g. the neutron multiplication factor) due to cross-section uncertainties. The sampling method has the advantage that burnup and thermal-hydraulics feedback effects are easily considered in complicated light water reactor core analysis. On the other hand, the sampling method has the disadvantage that statistical errors are involved in the estimated uncertainties of neutronics characteristics, since the sampling method is a probabilistic approach. Although the statistical errors can be reduced by increasing the number of samples (e.g. the number of perturbed cross-section libraries), the computational cost also increases. Therefore the development of an efficient uncertainty quantification method, which provides a smaller statistical error of the estimated uncertainty while reducing the number of samples, is highly desirable. In the present study, the adequacy and performance of the LHS method is investigated as an efficient Monte-Carlo based sampling method. Uncertainty quantification of the multiplication factor for a BWR fuel assembly is carried out with the LHS method. The results indicate that uncertainty quantification using the LHS method is more efficient than that using the conventional sampling method, which utilizes simple random sampling of cross sections. (author)
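The variance-reduction idea behind LHS is simple to sketch: for n samples in d dimensions, each dimension's range is split into n equal strata and every stratum is used exactly once, so the marginals are evenly covered with far fewer samples than simple random sampling. Mapping these unit-cube samples onto perturbed cross-section libraries is outside this sketch:

```python
import random

# Minimal Latin hypercube sampling on the unit hypercube [0, 1)^d.

def latin_hypercube(n, d, seed=42):
    """n samples in d dimensions; each 1/n stratum used once per dimension."""
    rng = random.Random(seed)
    samples = [[0.0] * d for _ in range(n)]
    for j in range(d):
        strata = list(range(n))
        rng.shuffle(strata)            # random pairing of strata across dimensions
        for i in range(n):
            # one point per stratum, jittered uniformly inside it
            samples[i][j] = (strata[i] + rng.random()) / n
    return samples

pts = latin_hypercube(8, 2)
```

Each of the 8 strata in each dimension contains exactly one point, which is what reduces the statistical error of the estimated output uncertainty for a fixed number of samples.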

  6. Automated lobar quantification of emphysema in patients with severe COPD

    International Nuclear Information System (INIS)

    Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p>0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC=0.94), left lower lobe (ICC=0.98), and right lower lobe (ICC=0.80). The agreement was good for the right upper lobe (ICC=0.68) and moderate for the middle lobe (ICC=0.53). Bland-Altman plots confirmed these results. A good agreement was observed between the software-assessed and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring. (orig.)

  7. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are also needed for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques. PMID:25182968
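The quantification step the abstract discusses is commonly expressed as the ratio of event-specific to taxon-specific DNA copies estimated from qPCR quantification cycles (Ct). A minimal sketch under exactly the textbook assumptions the abstract flags as uncertainty sources (a known, equal amplification efficiency E for both assays; values illustrative):

```python
# Relative GMO quantification from qPCR quantification cycles.
# Assumes copies ∝ E ** (-Ct) with the same efficiency E for the
# event-specific and taxon-specific assays, so
#   GMO% = 100 * E ** (Ct_taxon - Ct_event).

def gmo_percent(ct_event, ct_taxon, efficiency=2.0):
    """GMO content as a percentage of taxon-specific copies."""
    return 100.0 * efficiency ** (ct_taxon - ct_event)

# Event amplicon crosses threshold ~4.3 cycles later than the taxon
# gene, i.e. roughly 20x fewer event copies (illustrative Ct values).
pct = gmo_percent(ct_event=28.3, ct_taxon=24.0)
```

Deviations of the real efficiency from 2.0, sequence mismatches, and degraded DNA all bias this simple ratio, which is precisely why the abstract treats them as sources of measurement uncertainty.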

  8. Methane bubbling: from speculation to quantification

    Science.gov (United States)

    Grinham, A. R.; Dunbabin, M.; Yuan, Z.

    2013-12-01

    magnitude from 500 to 100 000 mg m-2 d-1 depending on time of day and water depth. Average storage bubble flux rates between reservoirs varied by two orders of magnitude from 1 200 to 15 000 mg m-2 d-1, with the primary driver likely to be catchment forest cover. The relative contribution of bubbling to total fluxes varied from 10% to more than 90% depending on the reservoir and time of sampling. This method was consistently shown to greatly improve the spatial mapping and quantification of methane bubbling rates from reservoir surfaces and reduces the uncertainty associated with determining the relative contribution of bubbling to total flux.

  9. Superlattice band structure: New and simple energy quantification condition

    International Nuclear Information System (INIS)

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires no programming effort or sophisticated machine to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga0.5Al0.5As heterostructures, and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results

  10. Superlattice band structure: New and simple energy quantification condition

    Science.gov (United States)

    Maiz, F.

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires no programming effort or sophisticated machine to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga0.5Al0.5As heterostructures, and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.

  11. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires no programming effort or sophisticated machine to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga0.5Al0.5As heterostructures, and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
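An EQC of the kind described (trigonometric and hyperbolic terms, scanned point by point) can be illustrated with the classic Kronig-Penney-type band condition for a periodic well/barrier structure: energies where |f(E)| ≤ 1 belong to an allowed miniband. The effective mass is taken equal and constant in well and barrier, and all material parameters are illustrative, not the GaAs/Ga0.5Al0.5As values:

```python
import math

# Kronig-Penney-type quantification condition, scanned numerically.
# f(E) = cos(alpha*a)cosh(beta*b)
#        + (beta^2 - alpha^2)/(2*alpha*beta) * sin(alpha*a)sinh(beta*b)

HBAR2_2M = 3.81           # hbar^2/2m in eV*Angstrom^2 (free-electron mass, approx.)
V0, a, b = 0.4, 50.0, 5.0 # barrier height (eV), well and barrier widths (Angstrom)

def band_function(E):
    """|f(E)| <= 1 marks an allowed miniband (0 < E < V0, in eV)."""
    alpha = math.sqrt(E / HBAR2_2M)          # wavenumber in the well
    beta = math.sqrt((V0 - E) / HBAR2_2M)    # decay constant in the barrier
    return (math.cos(alpha * a) * math.cosh(beta * b)
            + (beta ** 2 - alpha ** 2) / (2 * alpha * beta)
              * math.sin(alpha * a) * math.sinh(beta * b))

# Build the subband structure "point by point": each sampled energy
# with |f(E)| <= 1 is a point of an allowed band.
energies = [0.001 + i * 0.001 for i in range(399)]
allowed = [E for E in energies if abs(band_function(E)) <= 1.0]
```

Solving |f(E)| = 1 for the band edges needs only a root bracket on this real function, which matches the abstract's claim that no sophisticated machinery is required.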

  12. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T;

    1999-01-01

    Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis, a median of 3.6 mast cells/mm2 of synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cells.

  13. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and associated uncertainty quantification has become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  14. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  15. A quick colorimetric method for total lipid quantification in microalgae.

    Science.gov (United States)

    Byreddy, Avinesh R; Gupta, Adarsha; Barrow, Colin J; Puri, Munish

    2016-06-01

    Discovering microalgae with high lipid productivity is among the key milestones for achieving sustainable biodiesel production. Current methods of lipid quantification are time-intensive and costly. A rapid colorimetric method based on the sulfo-phospho-vanillin (SPV) reaction was developed for the quantification of microbial lipids to facilitate screening for lipid-producing microalgae. This method was successfully tested on marine thraustochytrid strains and vegetable oils. The colorimetric method results correlated well with gravimetric method estimates. The new method is less time consuming than gravimetric analysis and is quantitative for lipid determination, even in the presence of carbohydrates, proteins and glycerol. PMID:27050419

  16. Comparison of Quantifiler(®) Trio and InnoQuant™ human DNA quantification kits for detection of DNA degradation in developed and aged fingerprints.

    Science.gov (United States)

    Goecker, Zachary C; Swiontek, Stephen E; Lakhtakia, Akhlesh; Roy, Reena

    2016-06-01

    The development techniques employed to visualize fingerprints collected from crime scenes, as well as post-development ageing, may result in the degradation of the DNA present in low quantities in such evidence samples. Amplification of the DNA samples with short tandem repeat (STR) amplification kits may result in partial DNA profiles. A comparative study of two commercially available quantification kits, Quantifiler(®) Trio and InnoQuant™, was performed on latent fingerprint samples that were either (i) developed using one of three different techniques and then aged in ambient conditions or (ii) undeveloped and then aged in ambient conditions. The three fingerprint development techniques used were: cyanoacrylate fuming, dusting with black powder, and the columnar-thin-film (CTF) technique. In order to determine the differences between the expected quantities and actual quantities of DNA, manually degraded samples generated by controlled exposure of DNA standards to ultraviolet radiation were also analyzed. A total of 144 fingerprint and 42 manually degraded DNA samples were processed in this study. The results indicate that the InnoQuant™ kit is capable of producing higher degradation ratios compared to the Quantifiler(®) Trio kit. This was an expected result, since the degradation ratio is a relative value specific to each kit, based on the lengths and extents of amplification of the two amplicons, which vary from one kit to the other. Additionally, samples with lower concentrations of DNA yielded non-linear relationships of degradation ratio with the duration of aging, whereas samples with higher concentrations of DNA yielded quasi-linear relationships. None of the three development techniques produced a noticeably different degradation pattern when compared to undeveloped fingerprints, and therefore they do not impede downstream DNA analysis. PMID:27107968
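Both kits infer degradation by quantifying two amplicons of different lengths from the same sample: degraded templates still support the short target but increasingly fail on the long one. A minimal sketch of the usual index (the concentrations below are illustrative, not values from the study):

```python
# Degradation index from a two-amplicon quantification kit:
# ratio of the short-amplicon to the long-amplicon target concentration.
# Intact DNA gives a ratio near 1; fragmented DNA pushes it well above 1.

def degradation_ratio(short_ng_ul, long_ng_ul):
    """Short-target / long-target DNA concentration (ng/uL)."""
    return short_ng_ul / long_ng_ul

intact = degradation_ratio(0.50, 0.48)       # fresh control sample
uv_damaged = degradation_ratio(0.50, 0.05)   # UV-exposed sample
```

Because the two amplicon lengths differ between Quantifiler(®) Trio and InnoQuant™, the same physical sample yields kit-specific ratio values, which is why the ratios are comparable within a kit but not directly between kits.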

  17. MaqFACS: A Muscle-Based Facial Movement Coding System for the Rhesus Macaque

    OpenAIRE

    Parr, L. A.; Waller, B.M.; Burrows, A.M.; Gothard, K.M.; Vick, S.J.

    2010-01-01

    Over 125 years ago, Charles Darwin suggested that the only way to fully understand the form and function of human facial expression was to make comparisons to other species. Nevertheless, it has been only recently that facial expressions in humans and related primate species have been compared using systematic, anatomically-based techniques. Through this approach, large scale evolutionary and phylogenetic analyses of facial expressions, including their homology, can now be addressed. Here, th...

  18. The Maqāṣid approach and rethinking political rights in modern society

    Directory of Open Access Journals (Sweden)

    Louay Safi

    2010-12-01

Full Text Available This paper examines political rights in Islam by focusing on freedom of religion and the extent to which the state is empowered to enforce faith and religious law on society. It starts by comparing the notion of law in both Western and Islamic traditions, and then analyzes the difference between the ethical and legal within Sharī‘ah. The paper illustrates how Islamic law grew historically by working to limit the power of the state, and points out the need to maintain the distinction between the state and civil society for the proper implementation of Sharī‘ah. The paper also contends that those who call on the state to enforce all rules of Sharī‘ah on society rely on a faulty theory of right, and concludes that Islamic law fully recognizes the right of individuals to adopt and practice their faith freely. Freedom of religion, it stresses, is an intrinsic aspect of Islamic law, and all efforts to limit this freedom are bound to violate its purpose and dictates.

  19. Rapid Quantification of Myocardial Fibrosis: A New Macro-Based Automated Analysis

    Directory of Open Access Journals (Sweden)

    Awal M. Hadi

    2010-01-01

    Full Text Available Background: Fibrosis is associated with various cardiac pathologies and dysfunction. Current quantification methods are time-consuming and laborious. We describe a semi-automated quantification technique for myocardial fibrosis and validated this using traditional methods.

  20. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  1. The Dark Side of the Mushroom Spring Microbial Mat: Life in the Shadow of Chlorophototrophs. I. Microbial Diversity Based on 16S rRNA Gene Amplicons and Metagenomic Sequencing.

    Science.gov (United States)

    Thiel, Vera; Wood, Jason M; Olsen, Millie T; Tank, Marcus; Klatt, Christian G; Ward, David M; Bryant, Donald A

    2016-01-01

    Microbial-mat communities in the effluent channels of Octopus and Mushroom Springs within the Lower Geyser Basin at Yellowstone National Park have been studied for nearly 50 years. The emphasis has mostly focused on the chlorophototrophic bacterial organisms of the phyla Cyanobacteria and Chloroflexi. In contrast, the diversity and metabolic functions of the heterotrophic community in the microoxic/anoxic region of the mat are not well understood. In this study we analyzed the orange-colored undermat of the microbial community of Mushroom Spring using metagenomic and rRNA-amplicon (iTag) analyses. Our analyses disclosed a highly diverse community exhibiting a high degree of unevenness, strongly dominated by a single taxon, the filamentous anoxygenic phototroph, Roseiflexus spp. The second most abundant organisms belonged to the Thermotogae, which have been hypothesized to be a major source of H2 from fermentation that could enable photomixotrophic metabolism by Chloroflexus and Roseiflexus spp. Other abundant organisms include two members of the Armatimonadetes (OP10); Thermocrinis sp.; and phototrophic and heterotrophic members of the Chloroflexi. Further, an Atribacteria (OP9/JS1) member; a sulfate-reducing Thermodesulfovibrio sp.; a Planctomycetes member; a member of the EM3 group tentatively affiliated with the Thermotogae, as well as a putative member of the Arminicenantes (OP8) represented ≥1% of the reads. Archaea were not abundant in the iTag analysis, and no metagenomic bin representing an archaeon was identified. A high microdiversity of 16S rRNA gene sequences was identified for the dominant taxon, Roseiflexus spp. Previous studies demonstrated that highly similar Synechococcus variants in the upper layer of the mats represent ecological species populations with specific ecological adaptations. This study suggests that similar putative ecotypes specifically adapted to different niches occur within the undermat community, particularly for Roseiflexus

  3. DESCRIPTION AND QUANTIFICATION OF MULTIPLE FAMILY GROUP INTERACTION1

    OpenAIRE

    Bhatti, Ranbir S.; Janakiramaiah, N.; Channabasavanna, S.M.; Devi, Shoba

    1980-01-01

    SUMMARY Multiple Family Group Interaction as a method of Family Therapy is reported with reference to its development, technique and procedure. A rating system for the categorization and quantification of the therapeutic processes is discussed. The main findings of analysis of 85 sessions are presented.

  4. Striga hermonthica seed bank dynamics: process quantification and modelling

    NARCIS (Netherlands)

    Mourik, van T.A.

    2007-01-01

    Key words:    Weed control, integrated management, parasitic weed, population, sorghum, millet.   This thesis presents a study on the quantification of seed bank dynamics of the parasitic weed Striga hermonthica. The main objectives were to quantify transition rates between different stages of the l

  5. Methane emission quantification from landfills using a double tracer approach

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Samuelsson, J.; Fredenslund, Anders Michael;

    2007-01-01

A tracer method was successfully used for quantification of the total methane (CH4) emission from Fakse landfill. By using two different tracers, the emission from different sections of the landfill could be quantified. Furthermore, it was possible to determine the emissions from local on site...
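The tracer approach infers the CH4 emission rate from the ratio of plume-integrated CH4 enhancement to plume-integrated tracer mixing ratio, scaled by the known tracer release rate and the molar-mass ratio. A sketch assuming an SF6-like tracer and hypothetical plume integrals (not values from the study):

```python
def ch4_emission_kg_h(tracer_release_kg_h: float,
                      ch4_enhancement_ppb: float,
                      tracer_ppb: float,
                      m_ch4: float = 16.04,     # g/mol, methane
                      m_tracer: float = 146.06  # g/mol, SF6 (assumed tracer)
                      ) -> float:
    """Estimate CH4 emission from the downwind plume ratio.

    The ppb values are plume-integrated enhancements above background;
    the mole ratio is converted to a mass ratio via molar masses.
    """
    return (tracer_release_kg_h
            * (ch4_enhancement_ppb / tracer_ppb)
            * (m_ch4 / m_tracer))

# Hypothetical: 1 kg/h SF6 released, CH4 enhancement 10x the tracer signal
emission = ch4_emission_kg_h(1.0, 1000.0, 100.0)  # ~1.1 kg CH4/h
```

Using two tracers released at different sections lets the same ratio be formed section by section, which is how the study attributes emissions within the landfill.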

  6. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Injury assessment-quantification. 990... REGULATIONS NATURAL RESOURCE DAMAGE ASSESSMENTS Restoration Planning Phase § 990.52 Injury assessment... extent of injury to a natural resource, with subsequent translation of that adverse change to a...

  7. Identification and Quantification Soil Redoximorphic Features by Digital Image Processing

    Science.gov (United States)

    Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...

  8. Optimal purification and sensitive quantification of DNA from fecal samples

    DEFF Research Database (Denmark)

    Jensen, Annette Nygaard; Hoorfar, Jeffrey

    2002-01-01

    addition, the detection range of X-DNA of a spectrophotometric and a fluorometric (PicoGreen) method was compared. The PicoGreen showed a quantification limit of 1 ng/mL, consistent triplicate measurements, and finally a linear relationship between the concentrations of DNA standards and the fluorescence...
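The linear relationship between standard concentration and fluorescence reported above is what makes fluorometric quantification of unknowns possible: fit a line to the standards, then invert it. A stdlib-only sketch with hypothetical, idealized readings (not data from the study):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical PicoGreen readings for DNA standards (ng/mL -> signal units)
standards_ng_ml = [1.0, 10.0, 100.0, 1000.0]
fluorescence = [12.0, 120.0, 1200.0, 12000.0]  # idealized linear response

slope, intercept = fit_line(standards_ng_ml, fluorescence)
# Invert the curve: a sample reading of 600 units -> 50 ng/mL here
sample_conc = (600.0 - intercept) / slope
```

The quoted 1 ng/mL quantification limit corresponds to the lowest standard for which the fitted line still predicts reliably.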

  9. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  10. Damage Localization and Quantification of Earthquake Excited RC-Frames

    DEFF Research Database (Denmark)

    Skjærbæk, P.S.; Nielsen, Søren R.K.; Kirkegaard, Poul Henning;

    In the paper a recently proposed method for damage localization and quantification of RC-structures from response measurements is tested on experimental data. The method investigated requires at least one response measurement along the structure and the ground surface acceleration. Further, the t...

  11. Improved Method for PD-Quantification in Power Cables

    DEFF Research Database (Denmark)

    Holbøll, Joachim T.; Villefrance, Rasmus; Henriksen, Mogens

In this paper, a method is described for improved quantification of partial discharges (PD) in power cables. The method is suitable for PD-detection and location systems in the MHz-range, where pulse attenuation and distortion along the cable cannot be neglected. The system transfer function was...

  12. Study on Mathematical Mode of Quantification Performance-to-Price Ratio

    Institute of Scientific and Technical Information of China (English)

    裴兰华; 史德战; 王冠

    2007-01-01

Nowadays, when purchasing merchandise (including services), consumers often compare commodities of the same kind before deciding which to pick. The paper discusses a mathematical model for quantifying the performance-to-price ratio, according to which products can be designed to increase their competitiveness in the market.
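One common way to formalize such a quantity is a weighted performance score divided by price; the sketch below assumes attribute scores are pre-normalized to a comparable scale, and the weights and items are made up for illustration (the paper's actual model may differ):

```python
def performance_to_price(scores: dict, weights: dict, price: float) -> float:
    """Weighted mean of normalized attribute scores, divided by price."""
    total_weight = sum(weights.values())
    performance = sum(scores[k] * weights[k] for k in weights) / total_weight
    return performance / price

# Two hypothetical products scored 0-100 on two attributes
weights = {"quality": 0.7, "service": 0.3}
a = performance_to_price({"quality": 90, "service": 80}, weights, price=300)
b = performance_to_price({"quality": 70, "service": 95}, weights, price=250)
# b wins here: its lower price outweighs its lower quality score
```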

  13. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-01-01

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality. PMID:27048884
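Droplet digital PCR gets its absolute quantification without a standard curve by Poisson-correcting the fraction of positive droplets: the fraction of negatives estimates e^(-λ), where λ is the mean number of template copies per droplet. A sketch with hypothetical droplet counts and a nominal droplet volume (an assumption, not a value from the study):

```python
import math

def ddpcr_copies_per_ul(positive: int, total: int,
                        droplet_ul: float = 0.00085) -> float:
    """Poisson-corrected target concentration from droplet counts.

    droplet_ul is the per-droplet partition volume in microliters
    (~0.85 nL assumed here; instrument-specific in practice).
    """
    p = positive / total
    lam = -math.log(1.0 - p)   # mean copies per droplet
    return lam / droplet_ul    # copies per microliter of reaction

# Hypothetical run: 4,000 positive droplets out of 15,000
conc = ddpcr_copies_per_ul(4000, 15000)
```

Because the correction uses only the positive/negative call, the estimate is insensitive to amplification efficiency, which is one reason ddPCR-based titration avoids the distortions that extra PCR cycles introduce.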

  14. Quantification of Human Cytomegalovirus DNA by Real-Time PCR

    OpenAIRE

    Gault, Elyanne; Michel, Yanne; Dehée, Axelle; Belabani, Chahrazed; Nicolas, Jean-Claude; Garbarg-Chenon, Antoine

    2001-01-01

    A quantitative real-time PCR assay was developed to measure human cytomegalovirus (HCMV) DNA load in peripheral blood leukocytes (PBLs). The HCMV DNA load in PBLs was normalized by means of the quantification of a cellular gene (albumin). The results of the real-time PCR assay correlated with those of the HCMV pp65-antigenemia assay (P < 0.0001).
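Normalizing a viral load to a single-copy cellular gene converts raw copy counts into copies per cell (or per million cells), since each diploid cell carries two albumin alleles. A minimal sketch with hypothetical copy numbers (not data from the study):

```python
def hcmv_copies_per_million_cells(hcmv_copies: float,
                                  albumin_copies: float) -> float:
    """Normalize an HCMV DNA load by the albumin cell-equivalent count.

    Two albumin alleles per diploid cell, so albumin copies / 2 = cells.
    """
    cells = albumin_copies / 2.0
    return hcmv_copies / cells * 1e6

# Hypothetical: 500 HCMV copies against 2 million albumin copies
load = hcmv_copies_per_million_cells(500, 2_000_000)  # -> 500 per 10^6 cells
```

This normalization is what lets loads measured from PBL preparations of different sizes be compared on a common per-cell scale.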

  15. Real-Time PCR for Universal Phytoplasma Detection and Quantification

    DEFF Research Database (Denmark)

    Christensen, Nynne Meyn; Nyskjold, Henriette; Nicolaisen, Mogens

    2013-01-01

    Currently, the most efficient detection and precise quantification of phytoplasmas is by real-time PCR. Compared to nested PCR, this method is less sensitive to contamination and is less work intensive. Therefore, a universal real-time PCR method will be valuable in screening programs and in other...

  16. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, Karam; Andersen, Irene K.; Gesmar, Henrik;

    2001-01-01

    Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice selective (ss) inversion slab is usually three to four times thicker than the imaging...

  17. Mass spectrometry quantification of clusterin in the human brain

    Directory of Open Access Journals (Sweden)

    Chen Junjun

    2012-08-01

Full Text Available Abstract Background: The multifunctional glycoprotein clusterin has been associated with late-onset Alzheimer’s disease (AD). Further investigation to define the role of clusterin in AD phenotypes would be aided by the development of techniques to quantify the level, potential post-translational modifications, and isoforms of clusterin. We have developed a quantitative technique based on multiple reaction monitoring (MRM) mass spectrometry to measure clusterin in human postmortem brain tissues. Results: A stable isotope-labeled concatenated peptide (QconCAT) bearing selected peptides from clusterin was expressed with an in vitro translation system and purified. This clusterin QconCAT was validated for use as an internal standard for clusterin quantification using MRM mass spectrometry. Measurements were performed on the human postmortem frontal and temporal cortex from control and severe AD cases. During brain tissue processing, 1% SDS was used in the homogenization buffer to preserve potential post-translational modifications of clusterin. However, MRM quantifications in the brain did not suggest phosphorylation of Thr393, Ser394, and Ser396 residues reported for clusterin in serum. MRM quantifications in the frontal cortex demonstrated significantly higher (P ... Conclusions: The proposed protocol is a universal quantitative technique to assess the expression level of clusterin. It is expected that application of this protocol to quantification of various clusterin isoforms and potential post-translational modifications will be helpful in addressing the role of clusterin in AD.
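With a stable-isotope-labeled internal standard such as a QconCAT, the endogenous peptide amount follows from the light-to-heavy peak-area ratio multiplied by the known amount of spiked standard. A sketch with hypothetical peak areas and spike amount (illustrative, not values from the study):

```python
def endogenous_fmol(light_area: float, heavy_area: float,
                    heavy_spike_fmol: float) -> float:
    """Quantify an endogenous peptide against a heavy-labeled standard.

    light_area: MRM transition peak area of the endogenous (light) peptide.
    heavy_area: peak area of the co-eluting heavy standard peptide.
    heavy_spike_fmol: known amount of standard spiked into the sample.
    """
    if heavy_area <= 0:
        raise ValueError("heavy standard peak area must be positive")
    return (light_area / heavy_area) * heavy_spike_fmol

# Hypothetical: light peak twice the heavy peak, 50 fmol standard spiked
amount = endogenous_fmol(2.0e6, 1.0e6, 50.0)  # -> 100.0 fmol
```

Because light and heavy peptides co-elute and ionize identically, the ratio cancels run-to-run instrument variation, which is what makes the measurement quantitative.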

  18. Hepatocellular carcinoma: perfusion quantification with dynamic contrast-enhanced MRI

    NARCIS (Netherlands)

    Taouli, B.; Johnson, R.S.; Hajdu, C.H.; Oei, M.T.H.; Merad, M.; Yee, H.; Rusinek, H.

    2013-01-01

    The objective of our study was to report our initial experience with dynamic contrast-enhanced MRI (DCE-MRI) for perfusion quantification of hepatocellular carcinoma (HCC) and surrounding liver.DCE-MRI of the liver was prospectively performed on 31 patients with HCC (male-female ratio, 26:5; mean ag

  19. Quantification of Photoperiodic Effects on Growth of Phleum pratense

    OpenAIRE

    WU, ZUOLI; SKJELVÅG A.o.; BAADSHAUG, O. H.

    2004-01-01

    • Background and Aims Accurate quantifications of plant responses to photoperiod are useful for physiological studies, in growth modelling and in other studies of environmental effects. The objective of the current work was a mathematical description of photoperiodic influence on plant morphological traits, using functions with few and common parameters related to key plant characteristics and typical response patterns.

  20. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)
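The temperature correction described is linear per tissue and parameter: shift each measured value by a tissue-specific coefficient times the deviation of body core temperature from the 37 °C reference. A sketch with a made-up coefficient (the paper derives its own slope for each tissue and each of T1, T2, and PD):

```python
def correct_to_37c(measured: float, body_temp_c: float,
                   slope_per_deg_c: float) -> float:
    """Linearly shift a quantitative MR value to the 37 C reference.

    slope_per_deg_c is the tissue- and parameter-specific temperature
    coefficient (hypothetical here).
    """
    return measured + slope_per_deg_c * (37.0 - body_temp_c)

# Hypothetical: a T1 of 500 ms measured at a core temperature of 15 C,
# with an assumed coefficient of 6 ms per degree
t1_at_37 = correct_to_37c(500.0, 15.0, 6.0)  # 500 + 6*22 = 632.0 ms
```

Correcting all tissues to a common reference temperature is what restores the separation between tissue clusters in the combined T1/T2/PD space.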

  1. Current Issues in the Quantification of Federal Reserved Water Rights

    Science.gov (United States)

    Brookshire, David S.; Watts, Gary L.; Merrill, James L.

    1985-11-01

    This paper examines the quantification of federal reserved water rights from legal, institutional, and economic perspectives. Special attention is directed toward Indian reserved water rights and the concept of practicably irrigable acreage. We conclude by examining current trends and exploring alternative approaches to the dilemma of quantifying Indian reserved water rights.

  2. Quantification in MALDI-TOF mass spectrometry of modified polymers

    Czech Academy of Sciences Publication Activity Database

    Walterová, Zuzana; Horský, Jiří

    2011-01-01

    Roč. 693, 1/2 (2011), s. 82-88. ISSN 0003-2670 R&D Projects: GA ČR GA203/08/0543 Institutional research plan: CEZ:AV0Z40500505 Keywords : MALDI-TOF mass spectrometry * modified polymers * quantification Subject RIV: CC - Organic Chemistry Impact factor: 4.555, year: 2011

  3. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, A.M.; Zijlstra, I.A.; Gathier, C.S.; Berg, van den R.; Slump, C.H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for SA

  4. Literacy and Language Education: The Quantification of Learning

    Science.gov (United States)

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  5. Macroscopic inspection of ape feces: what's in a quantification method?

    Science.gov (United States)

    Phillips, Caroline A; McGrew, William C

    2014-06-01

    Macroscopic inspection of feces has been used to investigate primate diet. The limitations of this method to identify food-items to species level have long been recognized, but ascertaining aspects of diet (e.g., folivory) are achievable by quantifying food-items in feces. Quantification methods applied include rating food-items using a scale of abundance, estimating their percentage volume, and weighing food-items. However, verification as to whether or not composition data differ, depending on which quantification method is used during macroscopic inspection, has not been done. We analyzed feces collected from ten adult chimpanzees (Pan troglodytes schweinfurthii) of the Kanyawara community in Kibale National Park, Uganda. We compare dietary composition totals obtained from using different quantification methods and ascertain if sieve mesh size influences totals calculated. Finally, this study validates findings from direct observation of feeding by the same individuals from whom the fecal samples had been collected. Contrasting diet composition totals obtained by using different quantification methods and sieve mesh sizes can influence folivory and frugivory estimates. However, our findings were based on the assumption that fibrous matter contained pith and leaf fragments only, which remains to be verified. We advocate macroscopic inspection of feces can be a valuable tool to provide a generalized overview of dietary composition for primate populations. As most populations remain unhabituated, scrutinizing and validating indirect measures are important if they are to be applied to further understand inter- and intra-species dietary variation. PMID:24482001

  6. CT quantification of pleuropulmonary lesions in severe thoracic trauma

    International Nuclear Information System (INIS)

    Purpose: Computed quantification of the extent of pleuropulmonary trauma by CT and comparison with conventional chest X-ray - Impact on therapy and correlation with mechanical ventilation support and clinical outcome. Method: In a prospective trial, 50 patients with clinically suspicious blunt chest trauma were evaluated using CT and conventional chest X-ray. The computed quantification of ventilated lung provided by CT volumetry was correlated with the consecutive artificial respiration parameters and the clinical outcome. Results: We found a high correlation between CT volumetry and artificial ventilation concerning maximal pressures and inspiratory oxygen concentration (FiO2, Goris-Score) (r=0.89, Pearson). The graduation of thoracic trauma correlated highly with the duration of mechanical ventilation (r=0.98, Pearson). Especially with regard to atelectases and lung contusions CT is superior compared to conventional chest X-ray; only 32% and 43%, respectively, were identified by conventional chest X-ray. (orig./AJ)

  7. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

To determine the influence of soil organic matter on uranyl sorption onto solids, a reliable and sufficiently rapid technique for the detection and quantification of uranyl is needed. This work therefore proposes to quantify uranyl in the presence of citric acid by modifying the UV-Vis radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that highlights the fluorescence of the uranyl ion while excluding that produced by the organic acids. (Author)

  8. Modeling Heterogeneity in Networks using Uncertainty Quantification Tools

    CERN Document Server

    Rajendran, Karthikeyan; Siettos, Constantinos I; Laing, Carlo R; Kevrekidis, Ioannis G

    2015-01-01

    Using the dynamics of information propagation on a network as our illustrative example, we present and discuss a systematic approach to quantifying heterogeneity and its propagation that borrows established tools from Uncertainty Quantification. The crucial assumption underlying this mathematical and computational "technology transfer" is that the evolving states of the nodes in a network quickly become correlated with the corresponding node "identities": features of the nodes imparted by the network structure (e.g. the node degree, the node clustering coefficient). The node dynamics thus depend on heterogeneous (rather than uncertain) parameters, whose distribution over the network results from the network structure. Knowing these distributions allows us to obtain an efficient coarse-grained representation of the network state in terms of the expansion coefficients in suitable orthogonal polynomials. This representation is closely related to mathematical/computational tools for uncertainty quantification (th...
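The coarse-grained representation sketched above replaces one state value per node with a few expansion coefficients in polynomials of a node "identity" such as degree. A toy stand-in using an ordinary least-squares fit on a normalized degree (illustrative only; the paper uses orthogonal polynomials adapted to the degree distribution):

```python
import numpy as np

def coarse_coefficients(degrees, states, order=2):
    """Summarize per-node states by polynomial coefficients in node degree.

    A toy version of the coarse-graining idea: if states correlate with
    degree, a handful of coefficients captures the whole network state.
    """
    x = np.asarray(degrees, dtype=float)
    x = (x - x.mean()) / (x.std() or 1.0)              # normalize degree
    design = np.vander(x, order + 1, increasing=True)  # columns 1, x, x^2, ...
    coeffs, *_ = np.linalg.lstsq(design,
                                 np.asarray(states, dtype=float),
                                 rcond=None)
    return coeffs

# Toy network where each node's state is exactly twice its degree
c = coarse_coefficients([1, 2, 3, 4], [2.0, 4.0, 6.0, 8.0], order=2)
# c[0] is the mean state, c[1] the linear trend, c[2] ~ 0 here
```

When states depend smoothly on degree, the quadratic and higher coefficients stay near zero and the representation is genuinely low-dimensional.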

  9. Quantification of Si in silicone oils by ICP-OES

    DEFF Research Database (Denmark)

    Wang, Qian; Yang, Zhenyu

    2014-01-01

A new simple and low-cost method of quantification of trace amounts of silicon (Si) in silicone oils has been developed by combining silicone emulsion and inductively coupled plasma optical emission spectroscopy (ICP-OES) analysis. Silicone oils that contained phenyl groups in the viscosity range from 20 to 1000 mPa.s formed stable oil-in-water emulsions in the presence of Tween 80 surfactant and methylisobutylketone (MIBK). The Si in the emulsions was further quantified by ICP-OES. The calibration was performed using spiked inorganic silicon standard in the emulsions, and the method was... (LOD) in the emulsion as 0.5 ppm Si. Compared to the Si determination by direct organic solvent ICP-OES, this method is much more convenient: a regular ICP-OES instrument can be directly used for the quantification of Si in silicone oils obtained via extraction by organic solvents from...

  10. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible.

  11. Reliability of resistivity quantification for shallow subsurface water processes

    CERN Document Server

    Rings, Joerg; 10.1016/j.jappgeo.2009.03.008

    2009-01-01

    The reliability of surface-based electrical resistivity tomography (ERT) for quantifying resistivities of shallow subsurface water processes is analysed. A method comprising numerical simulations of water movement in soil and forward-inverse modeling of ERT surveys for two synthetic data sets is presented. Resistivity contrast, e.g. caused by changing water content, is shown to have a large influence on the resistivity quantification. An ensemble and clustering approach is introduced, in which ensembles of 50 different inversion models for one data set are created by randomly varying the parameters of a regularisation-based inversion routine. The ensemble members are sorted into five clusters of similar models and the mean model for each cluster is computed. Distinguishing persisting features in the mean models from singular artifacts in individual tomograms can improve the interpretation of inversion results. Especially in the presence of large resistivity contrasts in high-sensitivity areas, the quantification of r...
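
The ensemble-and-clustering idea in this record can be sketched in a few lines of Python. Everything below is an invented toy setup (synthetic 1-D resistivity models and a plain k-means with deterministic seeding), not the authors' code or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "ensemble": 50 inversion models, each a 1-D resistivity profile
# with 20 depth cells, scattered around two distinct candidate solutions
# (all values invented for illustration).
dry = np.full(20, 100.0)                                       # ~100 ohm-m throughout
wet = np.concatenate([np.full(10, 100.0), np.full(10, 30.0)])  # conductive layer at depth
ensemble = np.vstack(
    [dry + rng.normal(0, 5, 20) for _ in range(25)]
    + [wet + rng.normal(0, 5, 20) for _ in range(25)]
)

def kmeans(X, init_idx, iters=50):
    """Plain k-means with deterministic seeding; returns labels and cluster means."""
    centers = X[init_idx].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(init_idx)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

labels, mean_models = kmeans(ensemble, init_idx=[0, 25])

# Features that persist in a cluster-mean model (e.g. the conductive layer
# below cell 10) are more trustworthy than features of any single inversion.
deep_means = mean_models[:, 10:].mean(axis=1)
```

Features that survive cluster averaging, like the deep conductive layer in one cluster here, are the "persisting features" the abstract suggests trusting over single-inversion artifacts.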

  12. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  13. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.
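
Under a Laplace (Gaussian) approximation, the Hessian-based uncertainty quantification in item (3) reduces to "posterior covariance = inverse Hessian of the negative log-posterior at the MAP point", with linearized push-forward to a scalar quantity of interest. A minimal numerical sketch with an invented 3-parameter Hessian (not the ice sheet model itself):

```python
import numpy as np

# Toy quadratic negative log-posterior J(m) = 0.5 * m^T H m (MAP at m = 0)
# with an invented 3-parameter Hessian; under the Laplace (Gaussian)
# approximation, the posterior covariance is the inverse Hessian.
H = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 2.0]])
post_cov = np.linalg.inv(H)

# Linearized push-forward: a scalar quantity of interest q = g . m
# (standing in for, e.g., ice mass flux to the ocean) gets variance g^T C g.
g = np.array([1.0, 2.0, -1.0])
q_var = float(g @ post_cov @ g)
q_std = float(np.sqrt(q_var))
```

At continental scale the Hessian is never formed explicitly; Hessian-vector products and low-rank approximations stand in for the dense inverse used in this toy example.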

  14. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and are nowadays commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use remains rare, despite excellent analytical specificity and good sensitivity. Here, we give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach are discussed, with a focus on feasibility for routine diagnostic use. PMID:27061318

  15. The Challenges of Credible Thermal Protection System Reliability Quantification

    Science.gov (United States)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.

  16. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    Science.gov (United States)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
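
The (unscaled) Q-criterion behind this approach is Q = ½(‖Ω‖² − ‖S‖²), where S and Ω are the symmetric and antisymmetric parts of the velocity-gradient tensor and Q > 0 marks rotation-dominated flow. A minimal sketch of that computation (the paper's scaling and core-radius profiling steps are not reproduced):

```python
import numpy as np

def q_criterion(grad_u):
    """Q = 0.5 * (||Omega||_F^2 - ||S||_F^2) from a velocity-gradient tensor."""
    S = 0.5 * (grad_u + grad_u.T)      # strain-rate tensor (symmetric part)
    Omega = 0.5 * (grad_u - grad_u.T)  # rotation tensor (antisymmetric part)
    return 0.5 * (np.sum(Omega**2) - np.sum(S**2))

# Solid-body rotation u = (-y, x, 0): pure rotation, so Q > 0 (vortex core).
grad_rot = np.array([[0.0, -1.0, 0.0],
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 0.0]])

# Simple shear u = (y, 0, 0): rotation and strain balance, so Q = 0.
grad_shear = np.array([[0.0, 1.0, 0.0],
                       [0.0, 0.0, 0.0],
                       [0.0, 0.0, 0.0]])

q_rot, q_shear = q_criterion(grad_rot), q_criterion(grad_shear)
```

In a simulation, `grad_u` is evaluated per grid point (e.g. by finite differences of the velocity field), and connected regions with Q above a threshold are taken as vortex candidates for the subsequent core profiling.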

  17. Incorporating Functional Gene Quantification into Traditional Decomposition Models

    Science.gov (United States)

    Todd-Brown, K. E.; Zhou, J.; Yin, H.; Wu, L.; Tiedje, J. M.; Schuur, E. A. G.; Konstantinidis, K.; Luo, Y.

    2014-12-01

    Incorporating new genetic quantification measurements into traditional substrate-pool models represents a substantial challenge. These decomposition models are built around the idea that substrate availability, together with environmental drivers, limits carbon dioxide respiration rates. In this paradigm, microbial communities optimally adapt to a given substrate and environment on much shorter time scales than the carbon flux of interest. By characterizing the relative shift in biomass of these microbial communities, we informed previously poorly constrained parameters in traditional decomposition models. In this study we coupled a 9-month laboratory incubation study with quantitative gene measurements, traditional CO2 flux measurements, and initial soil organic carbon (SOC) quantification. GeoChip 5.0 was used to quantify the functional genes associated with carbon cycling at 2 weeks, 3 months and 9 months. We then combined the genes that 'collapsed' over the experiment and assumed that this tracked the relative change in the biomass associated with the 'fast' pool. We further assumed that this biomass was proportional to the 'fast' SOC pool and thus were able to constrain the relative change in the fast SOC pool in our 3-pool decomposition model. We found that the biomass quantification described above, combined with traditional CO2 flux and SOC measurements, improves the transfer coefficient estimation in traditional decomposition models. Transfer coefficients are very difficult to characterize using traditional CO2 flux measurements alone, so DNA quantification provides new and significant information about the system. Over a 100-year simulation, these new biologically informed parameters resulted in an additional 10% of SOC loss relative to the traditionally informed parameters.
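
A 3-pool first-order decomposition model of the kind referenced here can be illustrated with a minimal forward simulation; the decay rates, transfer coefficients, and initial pool sizes below are invented placeholders, not the study's fitted values:

```python
import numpy as np

# Minimal 3-pool first-order decomposition model (fast / slow / passive).
# All parameter values are invented for illustration.
k = np.array([2.0, 0.1, 0.002])   # decay rates per pool (1/yr)
a21, a32 = 0.3, 0.1               # transfer coefficients: fast->slow, slow->passive
C = np.array([10.0, 60.0, 30.0])  # initial SOC per pool (arbitrary units)
C0_total = C.sum()

dt, years = 0.1, 100
co2 = 0.0
for _ in range(int(years / dt)):
    decay = k * C * dt                 # first-order loss from each pool
    C = C - decay
    C[1] += a21 * decay[0]             # fraction of fast-pool loss enters slow pool
    C[2] += a32 * decay[1]             # fraction of slow-pool loss enters passive pool
    co2 += decay[0] * (1 - a21) + decay[1] * (1 - a32) + decay[2]  # rest respired

soc_loss_fraction = 1 - C.sum() / C0_total
```

In the study, the gene-derived biomass proxy constrains the otherwise weakly identified transfer coefficients (`a21`, `a32` here), which is what shifts the simulated century-scale SOC loss.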

  18. QUANTIFICATIONS OF THE DETRIMENTAL HEALTH EFFECTS OF IONISING RADIATION

    OpenAIRE

    Walsh, Linda

    2013-01-01

    This thesis, presented to The University of Manchester in 2012 by Dr. Linda Walsh, is in fulfilment of the requirements for the degree of Doctor of Science (DSc) and is entitled “Quantifications of the detrimental health effects of ionising radiation.” A body of work and ensuing publications covering 2000–2012 are presented, predominantly concerning studies of various cohorts of people exposed to ionising radiation. The major areas cover epidemiological and statistical studies on the Life spa...

  19. Color correction for automatic fibrosis quantification in liver biopsy specimens

    Directory of Open Access Journals (Sweden)

    Yuri Murakami

    2013-01-01

    Full Text Available Context: For a precise and objective quantification of liver fibrosis, quantitative evaluations through image analysis have been utilized. However, manual operations are required in most cases for extracting fiber areas because of the color variation included in digital pathology images. Aims: The purpose of this research is to propose a color correction method for whole slide images (WSIs) of Elastica van Gieson (EVG) stained liver biopsy tissue specimens and to realize automated operation of image analysis for fibrosis quantification. Materials and Methods: Our experimental dataset consisted of 38 WSIs of liver biopsy specimens collected from 38 chronic viral hepatitis patients from multiple medical facilities, stained with EVG and scanned at ×20 using a Nano Zoomer 2.0 HT (Hamamatsu Photonics K.K., Hamamatsu, Japan). Color correction was performed by modifying the color distribution of a target WSI so as to fit the reference, where the color distribution was modeled by a set of two triangle pyramids. Using color-corrected WSIs, fibrosis quantification was performed based on tissue classification analysis. Statistical Analysis Used: Spearman's rank correlation coefficients were calculated between liver stiffness measured by transient elastography and the median area ratio of collagen fibers calculated from tissue classification results. Results: Statistical analysis showed a significant correlation (r = 0.61-0.68) even when tissue classifiers were trained using a subset of WSIs, while the correlation coefficients were reduced to r = 0.40-0.50 without color correction. Conclusions: Fibrosis quantification accompanied by the proposed color correction method could provide an objective evaluation tool for liver fibrosis, which complements semi-quantitative histologic evaluation systems.

  20. Quantification of clay minerals by combined EWA/XRD method

    Institute of Scientific and Technical Information of China (English)

    XU Jianhong (徐建红); T. R. Astin; PAN Mao (潘懋)

    2001-01-01

    Illite has been considered the main constraint on permeability in the Morecambe Gas Field, East Irish Sea, UK. Previous research has emphasized the morphology rather than the amount of clay minerals. By applying a new method of clay mineral quantification, EWA/XRD, and applying statistical analysis methods, we are able to establish a quantitative model of illite distribution in the field. The result also leads to a better understanding of permeability distribution in reservoir sandstones.

  1. Color correction for automatic fibrosis quantification in liver biopsy specimens

    OpenAIRE

    Yuri Murakami; Tokiya Abe; Akinori Hashiguchi; Masahiro Yamaguchi; Akira Saito; Michiie Sakamoto

    2013-01-01

    Context: For a precise and objective quantification of liver fibrosis, quantitative evaluations through image analysis have been utilized. However, manual operations are required in most cases for extracting fiber areas because of color variation included in digital pathology images. Aims: The purpose of this research is to propose a color correction method for whole slide images (WSIs) of Elastica van Gieson (EVG) stained liver biopsy tissue specimens and to realize automated operation of im...

  2. Methodological Issues in the Quantification of Respiratory Sinus Arrhythmia

    OpenAIRE

    Denver, John W.; Reed, Shawn F.; Porges, Stephen W.

    2006-01-01

    Although respiratory sinus arrhythmia (RSA) is a commonly quantified physiological variable, the methods for quantification are not consistent. This manuscript questions the assumption that respiration frequency needs to be manipulated or monitored to generate an accurate measure of RSA amplitude. A review of recent papers is presented that contrast RSA amplitude with measures that use respiratory parameters to adjust RSA amplitude. In addition, data from two studies are presented to evaluate...

  3. Quantification of bacterial invasion into adherent cells by flow cytometry

    OpenAIRE

    Pils, Stefan; Schmitter, Tim; Neske, Florian; Hauck, Christof R.

    2006-01-01

    Quantification of invasive, intracellular bacteria is critical in many areas of cellular microbiology and immunology. We describe a novel and fast approach to determine invasion of bacterial pathogens in adherent cell types such as epithelial cells or fibroblasts based on flow cytometry. Using the CEACAM-mediated uptake of Opa-expressing Neisseria gonorrhoeae as a well-characterized model of bacterial invasion, we demonstrate that the flow cytometry-based method yields results comparable to a...

  4. A Novel Immunological Assay for Hepcidin Quantification in Human Serum

    OpenAIRE

    Koliaraki, V.; Marinou, M.; Vassilakopoulos, T P; Vavourakis, E.; Tsochatzis, E.; Pangalis, G. A.; Papatheodoridis, G; Stamoulakatou, A.; Swinkels, D. W.; Papanikolaou, G; A. Mamalaki

    2009-01-01

    BACKGROUND: Hepcidin is a 25-amino-acid, cysteine-rich, iron-regulating peptide. Increased hepcidin concentrations lead to iron sequestration in macrophages, contributing to the pathogenesis of anaemia of chronic disease, whereas decreased hepcidin is observed in iron deficiency and primary iron-overload diseases such as hereditary hemochromatosis. Hepcidin quantification in human blood or urine may provide further insights into the pathogenesis of disorders of iron homeostasis and might prove a v...

  5. Accurate Peptide Fragment Mass Analysis: Multiplexed Peptide Identification and Quantification

    OpenAIRE

    Weisbrod, Chad R.; Eng, Jimmy K.; Hoopmann, Michael R.; Baker, Tahmina; Bruce, James E.

    2012-01-01

    FT All Reaction Monitoring (FT-ARM) is a novel approach for the identification and quantification of peptides that relies upon the selectivity of high mass accuracy data and the specificity of peptide fragmentation patterns. An FT-ARM experiment involves continuous, data-independent, high mass accuracy MS/MS acquisition spanning a defined m/z range. Custom software was developed to search peptides against the multiplexed fragmentation spectra by comparing theoretical or empirical fragment ion...

  6. Quantification of myocardial perfusion by cardiovascular magnetic resonance

    OpenAIRE

    Jerosch-Herold Michael

    2010-01-01

    Abstract The potential of contrast-enhanced cardiovascular magnetic resonance (CMR) for a quantitative assessment of myocardial perfusion has been explored for more than a decade now, with encouraging results from comparisons with accepted "gold standards", such as microspheres used in the physiology laboratory. This has generated an increasing interest in the requirements and methodological approaches for the non-invasive quantification of myocardial blood flow by CMR. This review provides a...

  7. The necessity of operational risk management and quantification

    OpenAIRE

    Barbu Teodora Cristina; Olteanu (Puiu) Ana Cornelia; Radu Alina Nicoleta

    2008-01-01

    Starting from the fact that effective strategies of financial institutions include programmes and management procedures for banking risks, whose main objective is to minimize the probability of risk occurrence and the bank's potential exposure, this paper presents operational risk management and quantification methods. It also presents the methodology for the minimum capital requirement for operational risk. The first part presents the conceptual approach of the...

  8. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans B\\"uhlmann; Shevchenko, Pavel V.; Mario V. W\\"uthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, a bank's internal model should make use of internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is how to combine these data sources appropriately. In this paper we focus on quantification of the low-frequency, high-impact losses exceeding some high thr...

  9. Universal quantification for deterministic chaos in dynamical systems

    OpenAIRE

    Selvam, A. Mary

    2000-01-01

    A cell dynamical system model for deterministic chaos enables precise quantification of the round-off error growth,i.e., deterministic chaos in digital computer realizations of mathematical models of continuum dynamical systems. The model predicts the following: (a) The phase space trajectory (strange attractor) when resolved as a function of the computer accuracy has intrinsic logarithmic spiral curvature with the quasiperiodic Penrose tiling pattern for the internal structure. (b) The unive...

  10. Neutron-encoded protein quantification by peptide carbamylation.

    Science.gov (United States)

    Ulbrich, Arne; Merrill, Anna E; Hebert, Alexander S; Westphall, Michael S; Keller, Mark P; Attie, Alan D; Coon, Joshua J

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet. PMID:24178922

  11. Imaging and Quantification of Brain Serotonergic Activity using PET

    OpenAIRE

    Lundquist, Pinelopi

    2006-01-01

    This thesis investigates the potential of using positron emission tomography (PET) to study the biosynthesis and release of serotonin (5HT) at the brain serotonergic neuron. As PET requires probe compounds with specific attributes to enable imaging and quantification of biological processes, emphasis was placed on the evaluation of these attributes. The experiments established that the 5HT transporter radioligand [11C]-3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile, [11C]DASB, ...

  12. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: A method for standardless quantification in EPMA is presented. It gives better results than the commercial software GENESIS Spectrum. It gives better results than the software DTSA. It allows the determination of the conductive coating thickness. It gives an estimation of the concentration uncertainties.
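
The core idea, minimizing the quadratic difference between an experimental spectrum and an analytical model by adjusting its parameters, can be sketched as follows. To keep the example linear, only the peak areas are optimized at fixed, invented line positions; POEMA itself optimizes a much richer set of physical parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Energy axis and two fixed Gaussian line shapes (positions and widths
# assumed known). With only the peak areas free, the quadratic misfit is
# minimized exactly by ordinary linear least squares.
E = np.linspace(0.0, 10.0, 500)

def line(E, center, sigma=0.1):
    """Unit-amplitude Gaussian line shape."""
    return np.exp(-0.5 * ((E - center) / sigma) ** 2)

# Two illustrative line positions (e.g. in keV); values are invented.
A = np.column_stack([line(E, 1.74), line(E, 6.40)])
true_areas = np.array([120.0, 40.0])
spectrum = A @ true_areas + rng.normal(0, 1.0, E.size)  # synthetic noisy spectrum

# Minimize || spectrum - A @ areas ||^2 over the peak areas.
fitted_areas, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
residual = spectrum - A @ fitted_areas
```

When nonlinear parameters (peak positions, background, detector response) are also free, the same misfit is minimized iteratively, which is the situation POEMA handles.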

  13. Quantification of global myocardial oxygenation in humans: initial experience

    Directory of Open Access Journals (Sweden)

    Gropler Robert J

    2010-06-01

    Full Text Available Abstract Purpose To assess the feasibility of our newly developed cardiovascular magnetic resonance (CMR) methods to quantify global and/or regional myocardial oxygen consumption rate (MVO2) at rest and during pharmacologically induced vasodilation in normal volunteers. Methods A breath-hold T2 quantification method was developed to calculate the oxygen extraction fraction (OEF) and MVO2 at rest and/or during hyperemia, using a two-compartment model. A previously reported T2 quantification method using a turbo-spin-echo sequence was also applied for comparison. CMR scans were performed in 6 normal volunteers. Each imaging session consisted of imaging at rest and during adenosine-induced vasodilation. The new T2 quantification method was applied to calculate T2 in the coronary sinus (CS), as well as in myocardial tissue. Resting CS OEF, representing resting global myocardial OEF, and myocardial OEF during adenosine vasodilation were then calculated by the model. Myocardial blood flow (MBF) was also obtained to calculate MVO2, using a first-pass perfusion imaging approach. Results The T2 quantification method yielded a hyperemic OEF of 0.37 ± 0.05 and a hyperemic MVO2 of 9.2 ± 2.4 μmol/g/min. The corresponding resting values were 0.73 ± 0.05 and 5.2 ± 1.7 μmol/g/min respectively, which agreed well with published literature values. MVO2 rose proportionally with the rate-pressure product from the resting condition. The T2 sensitivity is approximately 95% higher with the new T2 method than with the turbo-spin-echo method. Conclusion The CMR oxygenation method demonstrates the potential for non-invasive estimation of myocardial oxygenation, and should be explored in patients with altered myocardial oxygenation.
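
The arithmetic linking OEF and MBF to MVO2 is Fick's principle, MVO2 = MBF × OEF × CaO2. The sketch below uses an assumed arterial oxygen content and assumed MBF values chosen so that the outputs land near the resting and hyperemic MVO2 reported above; none of these inputs are stated in the abstract:

```python
# Fick-principle estimate of MVO2 from OEF and myocardial blood flow (MBF):
#   MVO2 = MBF * OEF * CaO2
# CaO2 (arterial oxygen content, ~8.9 umol O2 per mL blood) and the MBF
# values are assumed typical numbers, not measurements from this study.
CaO2 = 8.9                       # umol O2 / mL blood (assumed)

mbf_rest, oef_rest = 0.8, 0.73   # mL/g/min (assumed), dimensionless (reported)
mbf_hyper, oef_hyper = 2.8, 0.37

mvo2_rest = mbf_rest * oef_rest * CaO2    # umol/g/min
mvo2_hyper = mbf_hyper * oef_hyper * CaO2
```

With these assumptions the estimates come out near the reported 5.2 and 9.2 μmol/g/min, illustrating how the two-compartment OEF and first-pass MBF combine into MVO2.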

  14. Reliability and discriminatory power of methods for dental plaque quantification

    OpenAIRE

    Daniela Prócida Raggio; Mariana Minatel Braga; Jonas Almeida Rodrigues; Patrícia Moreira de Freitas; José Carlos Pettorossi Imparato; Fausto Medeiros Mendes

    2010-01-01

    OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both cond...

  15. Mass spectrometry based targeted protein quantification: methods and applications

    OpenAIRE

    Pan, Sheng; Aebersold, Ruedi; Chen, Ru; Rush, John; Goodlett, David R.; McIntosh, Martin W.; Zhang, Jing; Brentnall, Teresa A.

    2009-01-01

    The recent advance in technology for mass spectrometry-based targeted protein quantification has opened new avenues for a broad range of proteomic applications in clinical research. The major breakthroughs are highlighted by the capability of using a “universal” approach to perform quantitative assays for a wide spectrum of proteins with minimum restrictions, and the ease of assembling multiplex detections in a single measurement. The quantitative approach relies on the use of synthetic stabl...

  16. Single-Molecule Fluorescence Quantification with a Photobleached Internal Standard

    OpenAIRE

    Gadd, Jennifer C.; Fujimoto, Bryant S.; Sandra M Bajjalieh; Chiu, Daniel T.

    2012-01-01

    In cellular and molecular biology, fluorophores are employed to aid in tracking and quantifying molecules involved in cellular function. We previously developed a sensitive single-molecule quantification technique to count the number of proteins and the variation of the protein number over the population of individual sub-cellular organelles. However, environmental effects on the fluorescent intensity of fluorophores can make it difficult to accurately quantify proteins using these sensitive ...

  17. Lessons in uncertainty quantification for turbulent dynamical systems

    OpenAIRE

    Majda, A. J.; M. Branicki

    2012-01-01

    The modus operandi of modern applied mathematics in developing very recent mathematical strategies for uncertainty quantification in partially observed high-dimensional turbulent dynamical systems is emphasized here. The approach involves the synergy of rigorous mathematical guidelines with a suite of physically relevant and progressively more complex test models which are mathematically tractable while possessing such important features as the two-way coupling between the resolved dynamics a...

  18. Leishmania parasite detection and quantification using PCR-ELISA

    Czech Academy of Sciences Publication Activity Database

    Kobets, Tetyana; Badalová, Jana; Grekov, Igor; Havelková, Helena; Lipoldová, Marie

    2010-01-01

    Roč. 5, č. 6 (2010), s. 1074-1080. ISSN 1754-2189 R&D Projects: GA ČR GA310/08/1697; GA MŠk(CZ) LC06009 Institutional research plan: CEZ:AV0Z50520514 Keywords : polymerase chain reaction * Leishmania major infection * parasite quantification Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 8.362, year: 2010

  19. Quantification of rice bran oil in oil blends

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, R.; Sharma, H. K.; Sengar, G.

    2012-11-01

    Blends consisting of physically refined rice bran oil (PRBO): sunflower oil (SnF) and PRBO: safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods including gas chromatographic, HPLC, ultrasonic velocity and methods based on physico-chemical parameters. The physicochemical parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content reflected significant changes with increased proportions of PRBO in the blended oils. These parameters were selected as dependent parameters and % PRBO proportion was selected as independent parameters. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance, and iodine value can be used for the quantification of rice bran oil in blended oils. The rice bran oil can easily be quantified in the blended oils based on the oryzanol content by HPLC even at a 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level in blended oils whereas the method based on ultrasonic velocity, acoustic impedance and relative association showed initial promise in the quantification of rice bran oil. (Author) 23 refs.
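
A regression of the kind described, predicting the % PRBO in a blend from its oryzanol content, can be sketched with invented calibration data (rice bran oil carries oryzanol, while sunflower and safflower oils carry essentially none, so the content rises linearly with the PRBO fraction):

```python
import numpy as np

# Hypothetical calibration set: oryzanol content (%) measured (e.g. by HPLC)
# in blends of known PRBO proportion. All numbers are illustrative, not the
# paper's data.
prbo_pct = np.array([0, 10, 20, 40, 60, 80, 100])
oryzanol = np.array([0.00, 0.16, 0.31, 0.63, 0.95, 1.27, 1.58])

# Fit the inverse relation: % PRBO as a linear function of oryzanol content.
slope, intercept = np.polyfit(oryzanol, prbo_pct, 1)

def estimate_prbo(oryzanol_measured):
    """Estimate % PRBO in an unknown blend from its oryzanol content."""
    return slope * oryzanol_measured + intercept

est = estimate_prbo(0.79)  # a blend measuring 0.79% oryzanol -> ~50% PRBO
```

The same regression-equation approach applies to the other dependent parameters the study lists (palmitic acid content, ultrasonic velocity, acoustic impedance, iodine value).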

  20. Assessment methods for angiogenesis and current approaches for its quantification.

    Science.gov (United States)

    AlMalki, Waleed Hassan; Shahid, Imran; Mehdi, Abeer Yousaf; Hafeez, Muhammad Hassan

    2014-01-01

    Angiogenesis is a physiological process describing the development of new blood vessels from existing vessels. It is a common and most important process in the formation and development of blood vessels, and thus supports the healing of wounds and the granulation of tissues. Different assays for the evaluation of angiogenesis have been described, with distinct advantages and some limitations. In order to develop angiogenic and antiangiogenic techniques, continuous efforts have been made to produce animal models for more quantitative analysis of angiogenesis. Most studies on angiogenic inducers and inhibitors rely on various models, in vitro, in vivo and in ovo, as indicators of efficacy. Angiogenesis assays are very helpful for testing the efficacy of both pro- and anti-angiogenic agents. The development of non-invasive procedures for quantification of angiogenesis will facilitate this process significantly. The main objective of this review article is to focus on the novel and existing methods of angiogenesis and their quantification techniques. These findings will be helpful in establishing the most convenient methods for the detection and quantification of angiogenesis and in developing a novel, well-tolerated and cost-effective anti-angiogenic treatment in the near future. PMID:24987169

  1. Gas chromatographic validated method for quantification of ayurvedic polyherbal formulation

    Directory of Open Access Journals (Sweden)

    Navdeep Saini

    2015-01-01

    Full Text Available A new gas chromatographic-flame ionization detection (GC-FID) method was developed for the quantification of an ayurvedic polyherbal formulation. The GC-FID method was found to be highly accurate, sensitive, simple and precise, and was validated as per International Conference on Harmonisation (ICH) guidelines. Experimental work was performed on a nonpolar capillary column (ZB-5, 5%-phenyl-95%-dimethylpolysiloxane) with a film thickness of 0.25 μm and dimensions of 30 m × 0.25 mm i.d. The temperatures of the oven, injector and detector were 200, 210 and 280°C, respectively. A data processing system was applied to obtain the data. The standards and test samples were prepared in absolute ethanol. The principal constituents t-anethole, d-limonene, cuminaldehyde and thymol were found in the ayurvedic polyherbal formulation. The ICH validation parameters for the proposed procedure, recovery (98.85-100.76%), precision (<1.00%), limits of detection, limits of quantification and linearity (r2 = 0.995 ± 0.002), were within acceptance limits. Validation results were statistically evaluated. The results show that the method is selective and reproducible for quantification of the ayurvedic polyherbal formulation, and the presented GC method can be applied for the routine analysis of the principal constituents as well as the polyherbal formulation itself.

  2. Automated Erythema Quantification in Radiation Therapy - a Java Based Tool

    Directory of Open Access Journals (Sweden)

    Paul Martin PUTORA

    2010-03-01

    Full Text Available Introduction: In radiotherapy, erythema is a common side effect, especially during radiotherapy treatment regimes that last several weeks. The measurement of erythema might be of clinical relevance, especially when standardized interpretation is possible. Aim: The aim of this article is to present a tool that can be implemented for automated and time-efficient quantification of erythema from digital images taken during radiotherapy treatment. Method: Instead of relying on commercially available graphic editors and performing manual operations on the images within these programs, we developed a Java-based tool that can automatically evaluate the “redness” of images. These erythema values receive a score number, are linked with the date and time the pictures were taken, and are exported into a comma-separated values (CSV) file. Results: The erythema values of images could be quickly evaluated with the developed tool. With spreadsheet software, the exported file could easily be manipulated to produce graphical representations of erythema progression. Conclusion: Erythema quantification from digital images can be easily performed by custom-developed Java tools. Automated quantification provides a method of detecting an increase in erythema that may not be visible to the naked eye.
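
A minimal "redness" metric in the spirit of this tool can be written in a few lines; the article's exact scoring formula is not given in the abstract, so the metric below (mean excess of the red channel over the green/blue average) and the synthetic images are assumptions, sketched in Python rather than Java:

```python
import numpy as np

def redness_score(rgb):
    """Assumed redness metric: mean of R - (G + B) / 2 over all pixels."""
    rgb = rgb.astype(float)  # avoid uint8 wrap-around in the subtraction
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(r - 0.5 * (g + b)))

# Two synthetic 4x4 "skin" images: baseline vs a redder, erythematous patch.
baseline = np.full((4, 4, 3), [200, 160, 150], dtype=np.uint8)
erythema = np.full((4, 4, 3), [220, 130, 120], dtype=np.uint8)

scores = {"baseline": redness_score(baseline), "week3": redness_score(erythema)}
```

Pairing each score with the image's timestamp and writing the rows out (Python's `csv` module, or a CSV writer in Java as the tool does) then yields the erythema-over-time series described in the Results.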

  3. Gas plume quantification in downlooking hyperspectral longwave infrared images

    Science.gov (United States)

    Turcotte, Caroline S.; Davenport, Michael R.

    2010-10-01

    Algorithms have been developed to support quantitative analysis of a gas plume using down-looking airborne hyperspectral long-wave infrared (LWIR) imagery. The resulting gas quantification "GQ" tool estimates the quantity of one or more gases at each pixel, and estimates uncertainty based on factors such as atmospheric transmittance, background clutter, and plume temperature contrast. GQ uses gas-insensitive segmentation algorithms to classify the background very precisely so that it can infer gas quantities from the differences between plume-bearing pixels and similar non-plume pixels. It also includes MODTRAN-based algorithms to iteratively assess various profiles of air temperature, water vapour, and ozone, and select the one that implies smooth emissivity curves for the (unknown) materials on the ground. GQ then uses a generalized least-squares (GLS) algorithm to simultaneously estimate the most likely mixture of background (terrain) material and foreground plume gases. Cross-linking of plume temperature to the estimated gas quantity is very non-linear, so the GLS solution was iteratively assessed over a range of plume temperatures to find the best fit to the observed spectrum. Quantification errors due to local variations in the camera-to-pixel distance were suppressed using a subspace projection operator. Lacking detailed depth-maps for real plumes, the GQ algorithm was tested on synthetic scenes generated by the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software. Initial results showed pixel-by-pixel gas quantification errors of less than 15% for a Freon 134a plume.
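
    The iteration over plume temperatures described above amounts to a one-dimensional search for the temperature whose modelled spectrum best matches the observation. The sketch below is a hypothetical stand-in: `forward_model`, the toy spectrum, and the temperature grid replace the MODTRAN/GLS machinery of the real tool.

```python
def best_plume_temperature(observed, forward_model, temps):
    """Return the candidate plume temperature whose modelled spectrum best
    matches the observed spectrum (sum-of-squares residual)."""
    def residual(temp):
        modelled = forward_model(temp)
        return sum((o - m) ** 2 for o, m in zip(observed, modelled))
    return min(temps, key=residual)

# Toy forward model (hypothetical): band radiance scales linearly with
# plume temperature. The real GQ tool would run MODTRAN + GLS here.
def toy_model(temp):
    return [temp * 0.01, temp * 0.02, temp * 0.03]

observed = [3.0, 6.0, 9.0]  # consistent with a plume temperature of 300
t_best = best_plume_temperature(observed, toy_model, range(250, 351))
```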

  4. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Objective: To study the variation of normal brain volume with sex and age, and to put forward an objective standard for computer-automated quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with a newly developed algorithm for automatic quantification of brain atrophy. Using polynomial curve fitting, the mathematical relationship of BPF with age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm3, (1 211 725 ± 122 077) mm3 and (95.3471 ± 2.3453)%, respectively, and those of atrophy subjects were (1 276 900 ± 125 180) mm3, (1 203 400 ± 117 760) mm3 and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was highly significant (P<0.001), whereas the difference in cranial volume was not (P>0.05). The expression P(x) = -0.0008x2 + 0.0193x + 96.9999 accurately described the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y = -0.0008x2 + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the BPF-age relationship can be used as an objective criterion for computer-automated quantification of brain atrophy. (authors)
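
    The reported fit and its lower confidence bound translate directly into a screening rule. The sketch below simply encodes the two published polynomials; the thresholding logic (flag atrophy when measured BPF falls below the age-matched lower limit) is the criterion the abstract proposes.

```python
def expected_bpf(age):
    """Fitted mean brain parenchymal fraction (%) for normal subjects."""
    return -0.0008 * age**2 + 0.0193 * age + 96.9999

def bpf_lower_limit(age):
    """Lower bound of the 95% CI of the normal fit, used as the atrophy cutoff."""
    return -0.0008 * age**2 + 0.0184 * age + 95.1090

def is_atrophic(bpf, age):
    """Flag brain atrophy when measured BPF falls below the age-matched limit."""
    return bpf < bpf_lower_limit(age)
```

    For a 60-year-old, the cutoff works out to about 93.3%, so the atrophy-group mean BPF of 91.81% would be flagged while the normal-group mean of 95.35% would not.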

  5. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    Science.gov (United States)

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance to the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool that calculates relative inclusion values of alternative splicing events by exploiting fast transcript quantification. SUPPA's accuracy is comparable to, and sometimes better than, that of standard methods on simulated as well as real RNA-sequencing data, benchmarked against experimentally validated events. We assess the variability with respect to the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene yields better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as with quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. PMID:26179515
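
    The core quantity computed per event, the inclusion level (PSI), can be illustrated from transcript abundances. This sketch assumes the standard definition of PSI as the TPM share of the inclusion isoforms among all transcripts defining the event; it is not SUPPA's actual code.

```python
def psi(inclusion_tpms, all_event_tpms):
    """Percent spliced-in for one event: abundance of the transcripts that
    include the alternative form, over all transcripts defining the event."""
    total = sum(all_event_tpms)
    if total == 0:
        return float("nan")  # no expression: report an undefined value
    return sum(inclusion_tpms) / total

# Event with two inclusion isoforms (TPM 6 and 4) out of 20 TPM total.
value = psi([6.0, 4.0], [6.0, 4.0, 10.0])
```

    Because this works on already-quantified transcript abundances rather than raw reads, the per-event cost is a handful of additions and one division, which is the source of the speed advantage described above.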

  6. In vivo behavior of NTBI revealed by automated quantification system.

    Science.gov (United States)

    Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro

    2016-08-01

    Non-transferrin-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; monitoring serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. The results indicate that NTBI behaves quite differently from other markers of iron metabolism, such as serum ferritin, making it a unique marker. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload. PMID:27086349

  7. Initial water quantification results using neutron computed tomography

    International Nuclear Information System (INIS)

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigating water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for testing the quantification technique. Transmission images of the test sample at different angles were acquired using a dedicated image-acquisition computer driving a rotary-table controller together with in-house synchronization software. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform the cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
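
    Water quantification from neutron transmission typically rests on inverting the Beer-Lambert attenuation law along each ray, I/I0 = exp(-μt). The sketch below assumes a nominal effective attenuation coefficient for water (`MU_WATER`); it is an illustrative value, not one taken from the study.

```python
import math

MU_WATER = 3.5  # effective attenuation coefficient of water, cm^-1 (assumed)

def water_thickness(transmission, mu=MU_WATER):
    """Invert Beer-Lambert law I/I0 = exp(-mu * t) to get the water path
    length t (cm) along one ray of the neutron transmission image."""
    if not 0 < transmission <= 1:
        raise ValueError("transmission must lie in (0, 1]")
    return -math.log(transmission) / mu

def water_volume(thickness_map, pixel_area_cm2):
    """Integrate per-pixel water thicknesses into a total volume (cm^3)."""
    return sum(thickness_map) * pixel_area_cm2
```

    In practice the effective μ must be calibrated against a sample of known water volume, such as the aluminum cylinder phantom described above, because beam hardening and scatter shift it from tabulated values.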

  8. Automatic quantification in lung scintigraphy: functional atlas; Quantification automatique en scintigraphie pulmonaire: elaboration d'atlas fonctionnels

    Energy Technology Data Exchange (ETDEWEB)

    Ledee, R.; Therain, F. [Orleans Univ., Lab. d' Electronique - Signaux - Images (LESI), 45 (France); Debrun, D. [Centre Hospitalier Regional de la Source (CHRO), Medecine Nucleaire, 45 - Orleans (France)

    2003-06-01

    In spite of the development of new techniques, ventilation-perfusion lung scintigraphy keeps an important place in the diagnosis of pulmonary embolism (PE). In the context of improving the reliability and reproducibility of the diagnosis, this study proposes an automatic quantification of the distribution of radioactive tracers by pulmonary segment. Measurements are made following a procedure of non-rigid matching of morphological 2-D charts of the lungs onto the scintigraphic images. The adaptation of these charts to the patient's morphology is carried out by exploiting iso-contour information in the images and using Fourier descriptors to determine the parameters of the transformation. The study was performed on a population of 30 patients with a nil probability of pulmonary embolism. After a study of the robustness of the quantification, 2-D segmental functional reference charts (according to the conditions of acquisition) were proposed. For perfusion and four views, the following lobar distribution, in relative value, was measured: Right Inferior Lobar = 23.39%, Medial Lobar = 10.41%, Right Superior Lobar = 20.37%, Left Inferior Lobar = 20.6% and Left Superior Lobar = 25.6%, with culmen = 18.8% and lingula = 6.8%; values comparable with published ones. The quantification process is adaptable to ventilation lung scans. The segmental quantifications of a patient, carried out under the same acquisition conditions as the functional reference charts, can be compared with the reference data and provide indicators for diagnosis as well as for patient follow-up and preoperative evaluation of lung cancers. (authors)

  9. Systematic development of a group quantification method using evaporative light scattering detector for relative quantification of ginsenosides in ginseng products.

    Science.gov (United States)

    Lee, Gwang Jin; Shin, Byong-Kyu; Yu, Yun-Hyun; Ahn, Jongsung; Kwon, Sung Won; Park, Jeong Hill

    2016-09-01

    The determination of multiple components in ginseng products has come to the fore with demands for in-depth information, but the associated industries face the high cost of securing pure standards for continuous quality evaluation of the products. This study aimed to develop a prospective high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method for relative quantification of ginsenosides in ginseng products without substantial change to the conventional gradient analysis. We investigated the effects of mobile-phase composition and elution bandwidth, two potential variables affecting the ELSD response in gradient analysis. Similar ELSD response curves of nine major ginsenosides were obtained under identical flow-injection conditions, and the response increased as the percentage of organic solvent increased. The nine ginsenosides were divided into three groups to examine the effect of elution bandwidth. The ELSD response decreased significantly for the later-eluted ginsenosides within each group under isocratic conditions. Taking these two effects into account, stepwise changes of the gradient condition were carried out to arrive at a group quantification method. The inconsistent responses of the nine ginsenosides were reconstituted into three normalized responses by the stepwise changes of the gradient condition, enabling relative quantification within the individual groups. The method's validity was confirmed by comparing the ginsenoside contents in a base material of ginseng products determined by the direct and the group quantification methods. The largest difference between the two methods was 8.26%, and the difference in total contents was only 0.91%. PMID:27262109
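
    Once the gradient has equalized the ELSD responses within a group, relative quantification reduces to normalizing peak areas, which is what removes the need for a pure standard of every component. The ginsenoside names and peak areas below are hypothetical.

```python
def relative_contents(peak_areas):
    """Relative content (%) of each component within one response group,
    assuming the gradient has equalized ELSD responses inside the group."""
    total = sum(peak_areas.values())
    return {name: 100.0 * area / total for name, area in peak_areas.items()}

# Hypothetical group of three ginsenosides with similar normalized responses.
group = {"Rb1": 450.0, "Rc": 300.0, "Rd": 250.0}
shares = relative_contents(group)
```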

  10. Evaluation of a new method for stenosis quantification from 3D x-ray angiography images

    Science.gov (United States)

    Betting, Fabienne; Moris, Gilles; Knoplioch, Jerome; Trousset, Yves L.; Sureda, Francisco; Launay, Laurent

    2001-05-01

    A new method for stenosis quantification from 3D X-ray angiography images has been evaluated on both phantom and clinical data. On phantoms, for parts larger than or equal to 3 mm, the standard deviation of the measurement error was always found to be less than or equal to 0.4 mm, and the maximum measurement error less than 0.17 mm. No clear relationship was observed between the performance of the quantification method and the acquisition FoV. On clinical data, the 3D quantification method proved more robust to vessel bifurcations than its 2D equivalent. In a total of 15 clinical cases, the differences between 2D and 3D quantification were always less than 0.7 mm. The conclusion is that stenosis quantification from 3D X-ray angiography images is an attractive alternative to quantification from 2D X-ray images.

  11. The sociology of quantification - perspectives on an emerging field in the social sciences

    OpenAIRE

    Diaz-Bone, Rainer; Didier, Emmanuel

    2016-01-01

    The introductory article to this HSR Special Issue presents the emerging field of the sociology of quantification, which can be regarded as a transdisciplinary approach to the analysis of processes of quantification. Processes of categorization and classification are included because they, too, can result in the generation of figures and numbers. The contribution sketches the historical development of this field within the sciences. It is argued that processes of quantification are related in many ways ...

  12. Learning-guided automatic three dimensional synapse quantification for drosophila neurons

    OpenAIRE

    Sanders, Jonathan; Singh, Anil; Sterne, Gabriella; Ye, Bing; Zhou, Jie

    2015-01-01

    Background The subcellular distribution of synapses is fundamentally important for the assembly, function, and plasticity of the nervous system. Automated and effective quantification tools are a prerequisite to large-scale studies of the molecular mechanisms of subcellular synapse distribution. Common practices for synapse quantification in neuroscience labs remain largely manual or semi-manual. This is mainly due to computational challenges in automatic quantification of synapses, including...

  13. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    OpenAIRE

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute qu...
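
    The conformation bias matters because absolute qPCR quantification reads copy numbers off a plasmid standard curve, Ct = slope · log10(copies) + intercept, so any shift the plasmid's topology induces in Ct propagates directly into the estimate. A minimal sketch of that calculation follows; the slope and intercept values are illustrative, not from the study.

```python
def copies_from_ct(ct, slope, intercept):
    """Absolute quantification from a plasmid standard curve
    Ct = slope * log10(copies) + intercept (slope ~ -3.32 at 100% efficiency)."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency implied by the standard-curve slope
    (1.0 corresponds to perfect doubling each cycle)."""
    return 10 ** (-1.0 / slope) - 1.0

# A hypothetical linearized-plasmid curve with slope -3.32, intercept 38.0:
copies = copies_from_ct(21.4, -3.32, 38.0)  # ~1e5 copies
```

    A supercoiled standard that amplifies, say, one cycle later than its linearized counterpart at the same copy number would shift every unknown estimated against it by a factor of roughly two, which is the over-estimation mechanism the report describes.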

  14. Quantification of Hepatic Steatosis With Dual-Energy Computed Tomography

    Science.gov (United States)

    Artz, Nathan S.; Hines, Catherine D.G.; Brunner, Stephen T.; Agni, Rashmi M.; Kühn, Jens-Peter; Roldan-Alzate, Alejandro; Chen, Guang-Hong; Reeder, Scott B.

    2012-01-01

    Objective The aim of this study was to compare dual-energy computed tomography (DECT) and magnetic resonance imaging (MRI) for fat quantification using tissue triglyceride concentration and histology as references in an animal model of hepatic steatosis. Materials and Methods This animal study was approved by our institution's Research Animal Resource Center. After validation of DECT and MRI using a phantom consisting of different triglyceride concentrations, a leptin-deficient obese mouse model (ob/ob) was used for this study. Twenty mice were divided into 3 groups based on expected levels of hepatic steatosis: low (n = 6), medium (n = 7), and high (n = 7) fat. After MRI at 3 T, a DECT scan was immediately performed. The caudate lobe of the liver was harvested and analyzed for triglyceride concentration using a colorimetric assay. The left lateral lobe was also extracted for histology. Magnetic resonance imaging fat-fraction (FF) and DECT measurements (attenuation, fat density, and effective atomic number) were compared with triglycerides and histology. Results Phantom results demonstrated excellent correlation between triglyceride content and each of the MRI and DECT measurements (r2 ≥ 0.96, P ≤ 0.003). In vivo, however, excellent triglyceride correlation was observed only with attenuation (r2 = 0.89, P < 0.001). Conclusions Fat density and effective atomic number measurements from the Gemstone Spectral Imaging analysis tool do not improve the accuracy of fat quantification in the liver beyond what CT attenuation can already provide. Furthermore, MRI may provide an excellent reference standard for liver fat quantification when validating new CT or DECT methods in human subjects. PMID:22836309

  15. QUANTIFICATION OF GENETICALLY MODIFIED MAIZE MON 810 IN PROCESSED FOODS

    Directory of Open Access Journals (Sweden)

    Peter Siekel

    2012-12-01

    Full Text Available Maize MON 810 (Zea mays L.) represents the majority of genetically modified food crops. It is the only transgenic cultivar grown in EU (European Union) countries, and food products containing more than 0.9 % of it must be labelled. This study examined the impact of food processing (temperature, pH and pressure) on DNA degradation and on quantification of the genetically modified maize MON 810. Transgenic DNA was quantified by the real-time polymerase chain reaction method. Processing conditions such as high temperature (121 °C), elevated pressure (0.1 MPa) and low pH (2.25) fragmented the DNA. Because the species-specific gene content of the plant materials used differs from the transgenic DNA content by two orders of magnitude, this led to false negative results in the quantification of transgenic DNA. Maize containing 4.2 % of the transgene appeared after processing to contain as little as 3.0 % (100 °C) and 1.9 % (121 °C, 0.1 MPa). A 2.1 % transgene content dropped to 1.0 % at 100 °C and to 0.6 % at 121 °C, 0.1 MPa. Under these conditions the apparent degradation of the transgenic content was two to three times greater, a consequence of the unequal abundance of the two targets: the disparity shows up as a considerable decrease in measured transgenic content, while the decrease in the species-specific gene content goes unnoticed. Based on our findings we conclude that a high degree of processing may lead to false negative results in quantification of the transgenic constituent. Determination of GMO content in processed foods may therefore lead to incorrect statements, and labelling based on such results could mislead consumers. doi:10.5219/212
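
    The false-negative mechanism can be made concrete with a simplified ΔΔCt-style calculation of transgene content relative to the species-specific reference gene. The Ct values below and the assumptions of 100% PCR efficiency and single-copy targets are illustrative; they are not data from the study.

```python
def gmo_percent(ct_transgene, ct_reference):
    """Rough transgene content (%) from the Ct difference between the
    event-specific assay and the maize reference-gene assay, assuming
    100% PCR efficiency and single-copy targets (simplified ddCt model)."""
    return 100.0 * 2.0 ** (ct_reference - ct_transgene)

# Hypothetical undegraded sample: transgene amplifies ~4.6 cycles after
# the reference gene.
fresh = gmo_percent(29.2, 24.6)      # ~4.1 %
# After harsh processing, preferential loss of detectable transgene
# template widens the Ct gap and the apparent GMO content drops.
processed = gmo_percent(30.3, 24.6)  # ~1.9 %
```

    Because the reference gene is present in vast excess, its signal survives fragmentation far better, so the ratio, and hence the reported GMO percentage, falls even though the sample composition is unchanged.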

  16. Quantification of Carnosine-Aldehyde Adducts in Human Urine.

    Science.gov (United States)

    da Silva Bispo, Vanderson; Di Mascio, Paolo; Medeiros, Marisa

    2014-10-01

    Lipid peroxidation generates several reactive carbonyl species, including 4-hydroxy-2-nonenal (HNE), acrolein (ACR), 4-hydroxy-2-hexenal (HHE) and malondialdehyde. One major pathway of aldehyde detoxification is conjugation with glutathione, catalyzed by glutathione-S-transferases, or, alternatively, conjugation with endogenous histidine-containing dipeptides such as carnosine (CAR). In this study, on-line reverse-phase high-performance liquid chromatography (HPLC) separation with tandem mass spectrometry detection was used for the accurate quantification of CAR-ACR, CAR-HHE and CAR-HNE adducts in urinary samples from young adult non-smokers. Standard adducts were prepared and isolated by HPLC. The results showed the presence of a new product from the reaction of CAR with ACR. This new adduct was fully characterized by HPLC/MS-MSn, 1H NMR, COSY and HSQC. A new HPLC/MS/MS methodology employing stable isotope-labeled internal standards (CAR-HHE-d5 and CAR-HNE-d11) was developed for adduct quantification, permitting quantification of 10 pmol of CAR-HHE and 1 pmol of CAR-ACR and CAR-HNE. Accurate determinations in human urine samples yielded 4.65 ± 1.71 for CAR-ACR, 5.13 ± 1.76 for CAR-HHE and 5.99 ± 3.19 nmol/mg creatinine for CAR-HNE. Our results indicate that the carnosine pathway can be an important detoxification route for α,β-unsaturated aldehydes. Moreover, carnosine adducts may be useful as redox stress indicators. PMID:26461323
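
    Quantification against a stable isotope-labeled internal standard reduces, in its simplest form, to a peak-area ratio times the known spiked amount. The sketch below assumes a response factor of 1 for the labeled/unlabeled pair; the peak areas and spike amount are hypothetical.

```python
def adduct_amount(area_analyte, area_internal_std, amount_internal_std):
    """Isotope-dilution quantification: the analyte/IS peak-area ratio
    times the known amount of labeled standard spiked into the sample
    (assumes a labeled/unlabeled response factor of 1)."""
    return amount_internal_std * area_analyte / area_internal_std

def per_mg_creatinine(amount_nmol, creatinine_mg):
    """Normalize a urinary adduct amount to creatinine, as in the study."""
    return amount_nmol / creatinine_mg

# Hypothetical run: 50 pmol of labeled standard spiked; the analyte peak
# area is 1.2x the internal-standard peak area.
hne_pmol = adduct_amount(1.2e6, 1.0e6, 50.0)  # 60 pmol
```

    Because analyte and labeled standard co-elute and ionize nearly identically, matrix effects cancel in the ratio, which is why this design gives accurate results in a matrix as variable as urine.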

  17. DNA imaging and quantification using chemi-luminescent probes

    International Nuclear Information System (INIS)

    During this interdisciplinary study we developed an ultrasensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm2 of labelled DNA over a surface area of 25 x 25 cm2 with sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for best signal-to-noise ratios. Double labelling was performed to verify quantification against radioactive probes. (authors)

  18. HPTLC densitometric quantification of stigmasterol and lupeol from Ficus religiosa

    OpenAIRE

    Deepti Rathee; Sushila Rathee; Permender Rathee; Aakash Deep; Sheetal Anandjiwala; Dharmender Rathee

    2015-01-01

    This study presents the first report of TLC densitometric method, which has been developed and validated for simultaneous quantification of the two marker compounds (stigmasterol and lupeol) from methanolic extract using the solvent system of toluene:methanol (9:1, v/v). The method employed TLC aluminum plates precoated with silica gel 60 F254 as the stationary phase. Densitometric analysis of stigmasterol and lupeol was carried out in the reflectance mode at 525 nm. The system was found to g...

  19. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P;

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the...... calculated from MR imaging of the left ventricular end-diastolic and end-systolic volumes in eight patients (Y = 0.89 x X + 11, r = 0.97, p < 0.001). This finding was confirmed by a good agreement between the net cardiac output (L/min) quantified with MR velocity mapping and simultaneous 125I...

  20. Review of some aspects of human reliability quantification

    International Nuclear Information System (INIS)

    An area of systems reliability considered to be weak is the characterization and quantification of the role of operations and maintenance staff in combatting accidents. Several R and D programs are underway to improve the modeling of human interactions, and some progress has been made. This paper describes a specific aspect of human reliability analysis referred to as the modeling of cognitive processes. In particular, the basis for the so-called Human Cognitive Reliability (HCR) model is described, with focus on its validation and on its benefits and limitations.

  1. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Stracuzzi, David John [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brost, Randolph [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chen, Maximillian Gene [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Malinas, Rebecca [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Peterson, Matthew Gregor [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Phillips, Cynthia A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Woodbridge, Diane [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  2. Method for quantification of aerobic anoxygenic phototrophic bacteria

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yao; JIAO Nianzhi

    2004-01-01

    Accurate quantification of aerobic anoxygenic phototrophic bacteria (AAPB) is of crucial importance for estimating the role of AAPB in carbon cycling in marine ecosystems. The commonly used method, epifluorescence microscopy with infrared photography (EFM-IRP), is, however, subject to positive errors introduced by mistaking cyanobacteria for AAPB, because cyanobacteria are also visible under the infrared photographic conditions used for AAPB. This error can be up to 30% in coastal waters of the East China Sea. Such bias should be avoided either by subtracting cyanobacteria from the total infrared counts or by using a flow cytometer equipped with specific detectors for discriminating between cyanobacteria and AAPB.

  3. MM98.57 Quantification of Combined Strain Paths

    DEFF Research Database (Denmark)

    Nielsen, Morten Sturgård; Wanheim, Tarras

    1998-01-01

    Hitherto the quantification of material properties of a material subjected to deformation has been treated as a function of one or more internal variables. When measuring yield properties as a function of a deformation, a convenient way of describing the deformation in an experiment is...... surface at a given point in a new strain history. A simple example of this concept is to take the length of the strain curve as the describing scalar: e.g., to use the equivalent strain as the parameter describing the yield stress. This paper focuses on the strain curve concept and the possibilities
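
    Taking the length of the strain curve as the describing scalar can be illustrated for a path given as principal strain increments; the accumulated equivalent strain is the sum of the von Mises equivalents of the increments. The increment values below are hypothetical, and the von Mises form is assumed for illustration.

```python
import math

def equivalent_strain(path):
    """Length of the strain curve: accumulate the von Mises equivalent
    of each strain increment (de1, de2, de3) along the path."""
    total = 0.0
    for de1, de2, de3 in path:
        total += math.sqrt(2.0 / 3.0 * (de1**2 + de2**2 + de3**2))
    return total

# Uniaxial stretching in ten volume-conserving increments of 0.03:
step = (0.03, -0.015, -0.015)
eps_eq = equivalent_strain([step] * 10)
```

    For a monotonic uniaxial path this reproduces the usual equivalent strain; the point of the curve-length view is that it also assigns a single scalar to combined, non-monotonic strain paths.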

  4. Nuclear Data Uncertainty Quantification: Past, Present and Future

    International Nuclear Information System (INIS)

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested

  5. A recipe for EFT uncertainty quantification in nuclear physics

    International Nuclear Information System (INIS)

    The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model. (paper)

  6. Predicting human age with bloodstains by sjTREC quantification.

    Directory of Open Access Journals (Sweden)

    Xue-ling Ou

    Full Text Available The age-related decline of signal joint T-cell receptor rearrangement excision circles (sjTRECs) in human peripheral blood has been demonstrated in our previous study and other reports. Until now, only a few studies on sjTREC detection in bloodstain samples have been reported, based on small numbers of subjects of a limited age range, although bloodstains are much more frequently encountered in forensic practice. In the present study, we adopted the sensitive Taqman real-time quantitative polymerase chain reaction (qPCR) method to perform sjTREC quantification in bloodstains from individuals ranging from 0-86 years old (n = 264). The results revealed that sjTREC contents in human bloodstains declined in an age-dependent manner (r = -0.8712). The age-estimation formula was Age = -7.1815Y - 42.458 ± 9.42 (Y = dCt(TBP-sjTREC); 9.42 = standard error). Furthermore, we tested for the influence of short- or long-term storage by analyzing fresh and stored bloodstains from the same individuals. Remarkably, no statistically significant difference in sjTREC contents was found between the fresh and old DNA samples over a 4-week storage time. However, a significant loss (0.16-1.93 dCt) in sjTREC contents was detected after 1.5 years of storage in 31 samples. Moreover, preliminary sjTREC quantification from bloodstains up to 20 years old showed that, although the sjTREC contents were detectable in all samples and highly correlated with donor age, a time-dependent decrease in the correlation coefficient r was found, suggesting that the predictive accuracy of the described assay deteriorates in aged samples. Our findings show that sjTREC quantification might also be suitable for age prediction in bloodstains, and future research into time-dependent or other potential impacts on sjTREC quantification might allow further improvement of the predictive accuracy.
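
    The published regression can be applied directly to a measured dCt value. The sketch below simply encodes the reported slope, intercept and standard error; bracketing the estimate by plus or minus one standard error follows the "± 9.42" stated in the abstract.

```python
SLOPE = -7.1815      # regression slope on Y = dCt(TBP - sjTREC)
INTERCEPT = -42.458  # regression intercept
SE = 9.42            # standard error of the age estimate, in years

def predict_age(dct_tbp_minus_sjtrec):
    """Point estimate of donor age from the bloodstain dCt value,
    using the published regression Age = -7.1815*Y - 42.458."""
    return SLOPE * dct_tbp_minus_sjtrec + INTERCEPT

def age_interval(dct):
    """Age estimate bracketed by +/- one standard error."""
    age = predict_age(dct)
    return (age - SE, age + SE)
```

    For example, a dCt of -10 maps to an estimated age of about 29 years, with an interval of roughly 20 to 39 years.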

  7. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Kelin [Department of Mathematics, Michigan State University, Michigan 48824 (United States); Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics, Michigan State University, Michigan 48824 (United States); Department of Electrical and Computer Engineering, Michigan State University, Michigan 48824 (United States); Department of Biochemistry and Molecular Biology, Michigan State University, Michigan 48824 (United States)

    2014-03-15

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction.

  8. Recognition and quantification of pain in horses: A tutorial review

    DEFF Research Database (Denmark)

    Gleerup, Karina Charlotte Bech; Lindegaard, Casper

    2015-01-01

    Pain management is dependent on the quality of the pain evaluation. Ideally, pain evaluation is objective, pain-specific and easily incorporated into a busy equine clinic. This paper reviews the existing knowledge base regarding the identification and quantification of pain in horses. Behavioural...... indicators of pain in horses in the context of normal equine behaviour, as well as various physiological parameters potentially useful for pain evaluation, are discussed. Areas where knowledge is sparse are identified and a new equine pain scale based on results from all reviewed papers is proposed. Finally...

  9. Recognition and quantification of pain in horses: A tutorial review

    DEFF Research Database (Denmark)

    Gleerup, Karina Charlotte Bech; Lindegaard, Casper

    2016-01-01

    Pain management is dependent on the quality of the pain evaluation. Ideally, pain evaluation is objective, pain-specific and easily incorporated into a busy equine clinic. This paper reviews the existing knowledge base regarding the identification and quantification of pain in horses. Behavioural...... indicators of pain in horses in the context of normal equine behaviour, as well as various physiological parameters potentially useful for pain evaluation, are discussed. Areas where knowledge is sparse are identified and a new equine pain scale based on results from all reviewed papers is proposed. Finally...

  10. Quantification in histopathology-Can magnetic particles help?

    International Nuclear Information System (INIS)

    Every year, more than 270,000 people are diagnosed with cancer in the UK alone; this means that one in three people worldwide contract cancer within their lifetime. Histopathology is the principal method for confirming cancer and directing treatment. In this paper, a novel application of magnetic particles is proposed to help address the problem of subjectivity in histopathology. Preliminary results indicate that magnetic nanoparticles can not only be used to assist diagnosis through improving quantification but also potentially increase throughput, hence offering a way of dramatically reducing costs within the routine histopathology laboratory

  11. Progressive damage state evolution and quantification in composites

    Science.gov (United States)

    Patra, Subir; Banerjee, Sourav

    2016-04-01

    Precursor damage state quantification can be helpful for the safety and operation of aircraft and defense equipment. Damage develops in composite materials in the form of matrix cracking, fiber breakage, debonding, etc. However, detection and quantification of the damage modes at their very early stage is not possible unless modifications of the existing indispensable techniques are conceived, particularly for the quantification of multiscale damage at its early stage. Here, we present a novel nonlocal-mechanics-based damage detection technique for precursor damage state quantification. Micro-continuum physics is used by modifying the Christoffel equation. American Society for Testing and Materials (ASTM) standard woven carbon fiber (CFRP) specimens were tested under tension-tension fatigue loading at intervals of 25,000 cycles up to 500,000 cycles. Scanning Acoustic Microscopy (SAM) and Optical Microscopy (OM) were used to examine damage development at the same intervals. Surface Acoustic Wave (SAW) velocity profiles on a representative volume element (RVE) of the specimen were calculated at regular intervals of 50,000 cycles. Nonlocal parameters were calculated from the micromorphic wave dispersion curve at a particular frequency of 50 MHz. We used a previously formulated parameter called "damage entropy" (DE), a measure of damage growth in the material calculated with the loading cycle. DE was calculated at every pixel on the RVE, and its mean was plotted at loading intervals of 25,000 cycles. Growth of DE with fatigue loading cycles was observed. Optical imaging was also performed at intervals of 25,000 cycles to investigate the development of damage inside the material. We also calculated the mean SAW velocity and plotted it against fatigue cycles, which was further correlated with DE. Statistical analysis of the SAW profiles obtained at different

  12. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    Science.gov (United States)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) Tailorable for low/high reliability missions; b) Tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) Need to define strategies combining both Theoretical Tools and Experimental Methods. The main reason for this lecture is to give a flavor of where UQ and SE could contribute and hope that the broader community will work with us to improve in these areas.

  13. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens

    2015-11-26

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data to improve the history matching process. The framework can consist of a geological model that is interfaced with a reservoir simulator. The reservoir simulator can interface with seismic, electromagnetic, gravimetric and surface deformation modules to predict the corresponding observations. The observations can then be incorporated into a recursive filter that subsequently updates the model state and parameter distributions, providing a general framework to quantify, and with the data eventually reduce, the uncertainty in the estimated reservoir state and parameters.
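
    The recursive filter itself is not specified in the abstract; as a minimal illustration of the update step such a filter performs, here is a scalar Kalman-style correction (all names and the scalar setting are hypothetical simplifications):

```python
def kalman_update(m, P, d_obs, R, Gm, G):
    """One recursive-filter update for a scalar reservoir parameter.

    m, P   : prior mean and variance of the parameter
    d_obs  : observed datum (e.g. a production measurement)
    R      : observation-noise variance
    Gm     : model-predicted datum for the prior mean
    G      : sensitivity of the predicted datum to the parameter
    """
    K = P * G / (G * P * G + R)        # Kalman gain
    m_post = m + K * (d_obs - Gm)      # innovation-weighted correction
    P_post = (1.0 - K * G) * P         # reduced posterior variance
    return m_post, P_post
```

    Repeating this update as each new data set (seismic, gravimetric, ...) arrives is what "subsequently updates the model state and parameter distributions" amounts to in the scalar case.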

  14. A critical view on microplastic quantification in aquatic organisms

    DEFF Research Database (Denmark)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.;

    2015-01-01

    to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature...... review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus...

  15. Development of magnetic resonance technology for noninvasive boron quantification

    International Nuclear Information System (INIS)

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa trademark MRI system, release 3.X and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs

  16. Accident sequence quantification of phased mission systems using Markov approach

    International Nuclear Information System (INIS)

    A Markov approach to incorporate a phased mission analysis into an accident sequence quantification is developed in this study. The Markov approach describes more accurately the dynamic characteristics of phased mission systems, dependency among various phases, and repair of failed components than a cut set approach. For comparison, an accurate cut set approach to quantify the accident sequences is also presented in this study. The results show that it is desirable to use the Markov approach in phased mission systems with large failure rates of components or long mission time intervals. Furthermore, the Markov approach is not constrained by the magnitudes of failure rates or mission time intervals. (author)
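
    A toy version of the Markov quantification can be sketched with a two-state, discrete-time chain (working/failed, no repair); the state and phase definitions here are illustrative assumptions, not the paper's model:

```python
import numpy as np

def mission_reliability(phases):
    """Probability of ending a phased mission in the working state.

    phases: list of (p_fail, n_steps) pairs, one per mission phase;
    state 0 = working, state 1 = failed (absorbing; repair omitted here).
    """
    pi = np.array([1.0, 0.0])                    # start in the working state
    for p_fail, n_steps in phases:
        P = np.array([[1.0 - p_fail, p_fail],    # per-step transition matrix
                      [0.0,          1.0]])
        pi = pi @ np.linalg.matrix_power(P, n_steps)
    return pi[0]
```

    Carrying the state distribution across phase boundaries is what captures the inter-phase dependency that a cut set approach can only approximate.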

  17. Recurrence plots and recurrence quantification analysis of human motion data

    Science.gov (United States)

    Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad

    2016-06-01

    The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
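
    For a scalar time series, a recurrence plot is just a thresholded pairwise-distance matrix; a minimal sketch (the phase-space embedding used in practice is omitted):

```python
import numpy as np

def recurrence_matrix(series, eps):
    """Binary recurrence matrix R[i, j] = 1 where |x_i - x_j| < eps."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist < eps).astype(int)

def recurrence_rate(R):
    """Simplest RQA measure: the fraction of recurrent points."""
    return R.mean()
```

    Other RQA measures (determinism, laminarity) are derived from diagonal and vertical line structures in the same matrix.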

  18. HPTLC densitometric quantification of stigmasterol and lupeol from Ficus religiosa

    Directory of Open Access Journals (Sweden)

    Deepti Rathee

    2015-05-01

    Full Text Available This study presents the first report of a TLC densitometric method developed and validated for simultaneous quantification of two marker compounds (stigmasterol and lupeol) from a methanolic extract, using a solvent system of toluene:methanol (9:1, v/v). The method employed TLC aluminum plates precoated with silica gel 60 F254 as the stationary phase. Densitometric analysis of stigmasterol and lupeol was carried out in the reflectance mode at 525 nm. The system was found to give compact spots for stigmasterol and lupeol (Rf values of 0.37 and 0.60, respectively). The method was validated according to ICH guidelines in terms of precision, repeatability and accuracy. Linearity ranges for stigmasterol and lupeol were 80–480 ng/spot and 150–900 ng/spot, and the contents were found to be 0.06 ± 0.005% w/w and 0.12 ± 0.02% w/w, respectively. The limit of detection (LOD) values for stigmasterol and lupeol were found to be 20 and 50 ng, and the limit of quantification (LOQ) values were 60 and 100 ng, respectively. This simple, precise and accurate method gave good resolution from other constituents present in the extract. The method has been successfully applied in the analysis and routine quality control of herbal material and formulations containing Ficus religiosa.
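
    Validation figures of this kind can be derived from a fitted calibration line; a sketch using the common ICH definitions LOD = 3.3σ/S and LOQ = 10σ/S (the paper's exact computation is not stated, and the demo data below are invented):

```python
import numpy as np

def validate_calibration(amount_ng, response):
    """Fit a linear calibration and derive ICH-style LOD/LOQ.

    LOD = 3.3 * sd(residuals) / slope, LOQ = 10 * sd(residuals) / slope.
    """
    amount = np.asarray(amount_ng, dtype=float)
    slope, intercept = np.polyfit(amount, response, 1)
    resid = np.asarray(response, dtype=float) - (slope * amount + intercept)
    sd = resid.std(ddof=2)            # two fitted parameters consume 2 dof
    return slope, 3.3 * sd / slope, 10.0 * sd / slope

# invented demo data spanning the stigmasterol linearity range (ng/spot)
x = [80, 160, 240, 320, 400, 480]
y = [161, 319, 481, 639, 801, 959]    # hypothetical densitometric response
slope, lod, loq = validate_calibration(x, y)
```
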

  19. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
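
    The wavenumber analysis rests on Fourier-transforming the wavefield over space; a minimal sketch that recovers the dominant spatial wavenumber of a synthetic traveling wave (grid and wave parameters are invented):

```python
import numpy as np

def dominant_wavenumber(wavefield, dx):
    """Dominant spatial wavenumber (cycles per unit length) of a wavefield.

    wavefield: 2D array indexed (time, space). The spatial FFT magnitude
    is averaged over time and the strongest non-DC bin is returned.
    """
    spec = np.abs(np.fft.rfft(wavefield, axis=1)).mean(axis=0)
    k_bin = int(spec[1:].argmax()) + 1          # skip the DC bin
    return k_bin / (wavefield.shape[1] * dx)

# synthetic traveling wave: 5 spatial cycles across 64 points
t = np.arange(64)[:, None]
x = np.arange(64)[None, :]
u = np.sin(2 * np.pi * (5 * x / 64 - 8 * t / 64))
```

    A windowed (short-space) version of the same transform yields wavenumber as a function of propagation distance, which is how new-wavenumber regions, and hence delamination location and extent, are identified.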

  20. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  1. Mixture quantification using PLS in plastic scintillation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bagan, H.; Tarancon, A.; Rauret, G. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Garcia, J.F., E-mail: jfgarcia@ub.ed [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain)

    2011-06-15

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares, PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made with this purpose in mind using liquid scintillation (LS), no attempt had been made using PS, which has the great advantage of not producing mixed waste after the measurements are performed. Following this objective, ternary mixtures of alpha and beta emitters (²⁴¹Am, ¹³⁷Cs and ⁹⁰Sr/⁹⁰Y) have been quantified. Procedure optimisation evaluated the use of the net spectra or the sample spectra, the inclusion of different spectra obtained at different values of the Pulse Shape Analysis parameter, and the application of the PLS1 or PLS2 algorithms. The conclusions show that the use of PS+PLS2 applied to the sample spectra, without any pulse shape discrimination, allows quantification of the activities with relative errors of less than 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and it does not require detectors that include the pulse shape analysis parameter.
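
    PLS itself requires a calibration set of measured standards; as a simpler stand-in that illustrates resolving a ternary mixture from one composite spectrum, here is ordinary least-squares unmixing against known single-emitter spectra (all shapes and activities are invented, and this is not the paper's PLS procedure):

```python
import numpy as np

channels = np.arange(100, dtype=float)

def peak(center, width=8.0):
    """Invented Gaussian stand-in for a single-emitter PS spectrum."""
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# reference spectra for the three emitters (columns), then a known mixture
S = np.column_stack([peak(20.0), peak(45.0), peak(70.0)])
true_activity = np.array([5.0, 2.0, 3.0])
mixture = S @ true_activity

# recover the individual activities by linear least squares
est, *_ = np.linalg.lstsq(S, mixture, rcond=None)
```

    PLS improves on this by building the spectral basis from measured calibration samples, which also handles correlated noise and overlapping, non-ideal peak shapes.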

  2. Evaluation of semi-automatic arterial stenosis quantification

    International Nuclear Information System (INIS)

    Object: To assess the accuracy and reproducibility of semi-automatic vessel axis extraction and stenosis quantification in 3D contrast-enhanced Magnetic Resonance Angiography (CE-MRA) of the carotid arteries (CA). Materials and methods: A total of 25 MRA datasets was used: 5 phantoms with known stenoses, and 20 patients (40 CAs) drawn from a multicenter trial database. Maracas software extracted vessel centerlines and quantified the stenoses, based on boundary detection in planes perpendicular to the centerline. Centerline accuracy was visually scored. Semi-automatic measurements were compared with: (1) theoretical phantom morphometric values, and (2) stenosis degrees evaluated by two independent radiologists. Results: Exploitable centerlines were obtained in 97% of CAs and in all phantoms. In phantoms, the software achieved a better agreement with theoretical stenosis degrees (weighted kappa κW = 0.91) than the radiologists (κW = 0.69). In patients, agreement between software and radiologists varied from κW = 0.67 to 0.90. In both, Maracas was substantially more reproducible than the readers. Mean operating time was within 1 min/CA. Conclusion: Maracas software generates accurate 3D centerlines of vascular segments with minimum user intervention. Semi-automatic quantification of CA stenosis is also accurate, except in very severe stenoses that cannot be segmented. It substantially reduces the inter-observer variability. (orig.)
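
    The weighted kappa used to compare software and readers can be computed directly from paired stenosis grades; a self-contained sketch with linear weights (the grade coding is hypothetical):

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_cat):
    """Linearly weighted Cohen's kappa for two raters' labels in 0..n_cat-1."""
    O = np.zeros((n_cat, n_cat))
    for i, j in zip(rater_a, rater_b):
        O[i, j] += 1.0
    O /= O.sum()                                   # observed proportions
    E = np.outer(O.sum(axis=1), O.sum(axis=0))     # chance-expected proportions
    idx = np.arange(n_cat)
    W = np.abs(idx[:, None] - idx[None, :])        # linear disagreement weights
    return 1.0 - (W * O).sum() / (W * E).sum()
```

    Linear weights penalize a two-grade disagreement twice as much as a one-grade disagreement, which matches ordinal stenosis grading better than unweighted kappa.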

  3. Evaluation of semi-automatic arterial stenosis quantification

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez Hoyos, M. [CREATIS Research Unit, CNRS, INSERM, INSA, Lyon (France); Universite Claude Bernard Lyon 1, 69 - Villeurbanne (France). INSA; Univ. de los Andes, Bogota (Colombia). Grupo de Ingenieria Biomedica; Serfaty, J.M.; Douek, P.C. [CREATIS Research Unit, CNRS, INSERM, INSA, Lyon (France); Universite Claude Bernard Lyon 1, 69 - Villeurbanne (France). INSA; Hopital Cardiovasculaire et Pneumologique L. Pradel, Bron (France). Dept. de Radiologie; Maghiar, A. [Hopital Cardiovasculaire et Pneumologique L. Pradel, Bron (France). Dept. de Radiologie; Mansard, C.; Orkisz, M.; Magnin, I. [CREATIS Research Unit, CNRS, INSERM, INSA, Lyon (France); Universite Claude Bernard Lyon 1, 69 - Villeurbanne (France). INSA

    2006-11-15

    Object: To assess the accuracy and reproducibility of semi-automatic vessel axis extraction and stenosis quantification in 3D contrast-enhanced Magnetic Resonance Angiography (CE-MRA) of the carotid arteries (CA). Materials and methods: A total of 25 MRA datasets was used: 5 phantoms with known stenoses, and 20 patients (40 CAs) drawn from a multicenter trial database. Maracas software extracted vessel centerlines and quantified the stenoses, based on boundary detection in planes perpendicular to the centerline. Centerline accuracy was visually scored. Semi-automatic measurements were compared with: (1) theoretical phantom morphometric values, and (2) stenosis degrees evaluated by two independent radiologists. Results: Exploitable centerlines were obtained in 97% of CAs and in all phantoms. In phantoms, the software achieved a better agreement with theoretical stenosis degrees (weighted kappa κW = 0.91) than the radiologists (κW = 0.69). In patients, agreement between software and radiologists varied from κW = 0.67 to 0.90. In both, Maracas was substantially more reproducible than the readers. Mean operating time was within 1 min/CA. Conclusion: Maracas software generates accurate 3D centerlines of vascular segments with minimum user intervention. Semi-automatic quantification of CA stenosis is also accurate, except in very severe stenoses that cannot be segmented. It substantially reduces the inter-observer variability. (orig.)

  4. A Spanish model for quantification and management of construction waste

    International Nuclear Information System (INIS)

    Currently, construction and demolition waste (C and D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C and D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C and D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C and D waste volume in both new construction and demolition projects.

  5. Biomass to energy : GHG reduction quantification protocols and case study

    International Nuclear Information System (INIS)

    With the growing concerns over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources are energy-from-waste facilities that use biomass that would otherwise be landfilled or stockpiled. Greenhouse gas reductions achieved through biomass-to-energy systems can be quantified using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of them. A summary and comparison of biomass-to-energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting or, in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass-to-energy facility offset projects. 10 refs., 2 tabs., 6 figs

  6. Damage Detection and Quantification Using Transmissibility Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Yun-Lai Zhou

    2015-01-01

    Full Text Available A new transmissibility-based damage detection and quantification approach is proposed. Based on operational modal analysis, the transmissibility is extracted from system responses, and transmissibility coherence is defined and analyzed. Afterwards, a damage-sensitive indicator is defined in order to detect and identify the severity of damage and compared with an indicator developed by other authors. The proposed approach is validated on data from a physics-based numerical model as well as experimental data from a three-story aluminum frame structure. For both the numerical simulation and the experiment, the results of the new indicator reveal a better performance than the coherence measure proposed in Rizos et al., 2008, Rizos et al., 2002, Fassois and Sakellariou, 2007, especially when nonlinearity occurs, which might be further used in real engineering. The main contribution of this study is the construction of the relation between transmissibility coherence and frequency response function coherence, and of an effective indicator based on the transmissibility modal assurance criterion for damage detection (especially for minor nonlinearity) as well as quantification.
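
    Transmissibility and its coherence can be estimated from two measured responses via segment-averaged cross-spectra; a minimal Welch-style sketch (segment count and windowing are implementation choices, not taken from the paper):

```python
import numpy as np

def transmissibility(x, y, nseg=8):
    """H1-type transmissibility Y/X and magnitude-squared coherence.

    x, y: equal-length response records; both are split into nseg
    Hann-windowed segments whose spectra are averaged.
    """
    n = len(x) // nseg
    win = np.hanning(n)
    Sxx = np.zeros(n // 2 + 1)
    Syy = np.zeros(n // 2 + 1)
    Sxy = np.zeros(n // 2 + 1, dtype=complex)
    for k in range(nseg):
        X = np.fft.rfft(x[k * n:(k + 1) * n] * win)
        Y = np.fft.rfft(y[k * n:(k + 1) * n] * win)
        Sxx += np.abs(X) ** 2
        Syy += np.abs(Y) ** 2
        Sxy += np.conj(X) * Y
    T = Sxy / Sxx                                # averaged transmissibility
    coh = np.abs(Sxy) ** 2 / (Sxx * Syy)         # coherence in [0, 1]
    return T, coh
```

    For a noise-free linear relation between the two responses, the coherence is 1 at every frequency; drops below 1 flag noise or nonlinearity, which is what a coherence-based damage indicator exploits.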

  7. An EPGPT-based approach for uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.; Abdel-Khalik, H. S. [Dept. of Nuclear Engineering, North Carolina State Univ., Raleigh, NC 27695 (United States)

    2012-07-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture via the adjoint solution the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what is referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)
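
    Sensitivity-based UQ of this kind ultimately feeds a variance-propagation step; as a generic first-order sketch of propagating a cross-section covariance matrix to a response (the "sandwich rule", not the EPGPT algorithm itself, which targets exactly the nonlinear, non-Gaussian cases this rule misses):

```python
import numpy as np

def response_variance(sensitivities, cov):
    """First-order propagation var(R) ~= s^T C s for small parameter variations.

    sensitivities: dR/dp for each parameter p (e.g. cross sections)
    cov          : parameter covariance matrix C
    """
    s = np.asarray(sensitivities, dtype=float)
    return float(s @ np.asarray(cov, dtype=float) @ s)
```
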

  8. Volumetric loss quantification using ultrasonic inductively coupled transducers

    Science.gov (United States)

    Gong, Peng; Hay, Thomas R.; Greve, David W.; Oppenheim, Irving J.

    2015-03-01

    The pulse-echo method is widely used for plate and pipe thickness measurement. However, the pulse-echo method does not work well for detecting localized volumetric loss in thick-wall tubes, as created by erosion damage, when the morphology of the volumetric loss is irregular and can reflect ultrasonic pulses away from the transducer, making it difficult to detect an echo. In this paper, we propose a novel method using an inductively coupled transducer to generate longitudinal waves propagating in a thick-wall aluminum tube for volumetric loss quantification. In the experiment, the longitudinal waves exhibit diffraction effects during propagation, which can be explained by the Huygens-Fresnel principle. The diffracted waves are also shown to be significantly delayed by the machined volumetric loss on the inside surface of the thick-wall aluminum tube. It is also shown that inductively coupled transducers can generate and receive ultrasonic waves similar to those from wired transducers, and that they perform as well as wired transducers in volumetric loss quantification when other conditions are the same.
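
    The pulse-echo measurement mentioned at the outset reduces to a time-of-flight conversion; a minimal sketch (6.32 mm/µs is a nominal handbook value for longitudinal waves in aluminum, assumed here):

```python
def wall_thickness_mm(echo_delay_us, velocity_mm_per_us=6.32):
    """Pulse-echo thickness: the pulse crosses the wall twice, so halve the path."""
    return velocity_mm_per_us * echo_delay_us / 2.0
```

    The delay of the diffracted wave past a volume-loss region plays an analogous role in the proposed method: extra travel time maps to material removed.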

  9. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.; Kretschmer, A.; Schaefer, M.; Jankowski, V.; van der Giet, M.; Schuchardt, M.; Toelle, M.; Tepel, Martin; Schlieper, G.; Zidek, W.; Jankowski, J.

    2014-01-01

    Iohexol (1-N,3-N-bis(2,3-dihydroxypropyl)-5-[N-(2,3-dihydroxypropyl)acetamide]-2,4,6-triiodobenzene-1,3-dicarboxamide) is used for accurate determination of the glomerular filtration rate (GFR) in chronic kidney disease (CKD) patients. However, high iohexol amounts might lead to adverse effects in...... organisms. In order to minimize the iohexol dosage required for GFR determination in humans, the development of a sensitive quantification method is essential. Therefore, the objective of our preclinical study was to establish and validate a simple and robust liquid...... of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min, after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device...... with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high-capacity trap mass spectrometer using positive-ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real

  10. A critical assessment of indentation-based ductile damage quantification

    International Nuclear Information System (INIS)

    This paper scrutinizes the reliability of indentation-based damage quantification, frequently used by many industrial and academic researchers. In this methodology, damage evolution parameters for continuum damage models are experimentally measured by probing the deformation-induced degradation of either hardness or indentation modulus. In this critical assessment the damage evolution in different sheet metals was investigated using this indentation approach, whereby the obtained results were verified by other experimental techniques (scanning electron microscopy, X-ray microtomography and highly sensitive density measurements), and by finite element simulations. This extensive experimental-numerical assessment reveals that the damage-induced degradation of both hardness and modulus is at least partially, but most likely completely, masked by other deformation-induced microstructural mechanisms (e.g. grain shape change, strain hardening, texture development, residual stresses and indentation pile-up). It is therefore concluded that hardness-based or modulus-based damage quantification methods are intrinsically flawed and should not be used for the determination of a damage parameter.

  11. Simple and inexpensive quantification of ammonia in whole blood.

    Science.gov (United States)

    Ayyub, Omar B; Behrens, Adam M; Heligman, Brian T; Natoli, Mary E; Ayoub, Joseph J; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μL of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows for accurate, reliable quantification of ammonia in whole human blood samples over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p = 0.0001. PMID:25936660
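
    Reading a concentration off such a colorimetric sensor is a standard inverse-calibration step; a sketch with invented standards and a hypothetical linear response (not the paper's calibration data):

```python
import numpy as np

# invented calibration standards spanning the clinical range (micromolar)
std_conc_uM = np.array([25.0, 100.0, 200.0, 300.0, 400.0, 500.0])
std_signal = 0.0016 * std_conc_uM + 0.05      # hypothetical absorbance response

slope, intercept = np.polyfit(std_conc_uM, std_signal, 1)

def ammonia_uM(signal):
    """Invert the fitted calibration line to read out blood ammonia."""
    return (signal - intercept) / slope
```
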

  12. Unilateral condylar hyperplasia: a 3-dimensional quantification of asymmetry.

    Directory of Open Access Journals (Sweden)

    Tim J Verhoeven

    Full Text Available PURPOSE: Objective quantifications of facial asymmetry in patients with Unilateral Condylar Hyperplasia (UCH) have not yet been described in the literature. The aim of this study was to objectively quantify soft-tissue asymmetry in patients with UCH and to compare the findings with a control group using a new method. MATERIAL AND METHODS: Thirty 3D photographs of patients diagnosed with UCH were compared with 30 3D photographs of healthy controls. As UCH presents particularly in the mandible, a new method was used to isolate the lower part of the face and evaluate the asymmetry of this part separately. The new method was validated by two observers using 3D photographs of five patients and five controls. RESULTS: A significant difference (0.79 mm) in whole-face asymmetry between patients and controls was found. Intra- and inter-observer differences of 0.011 mm (-0.034-0.011) and 0.017 mm (-0.007-0.042), respectively, were found. These differences are irrelevant in clinical practice. CONCLUSION: After objective quantification, a significant difference was identified in soft-tissue asymmetry between patients with UCH and controls. The method used to isolate mandibular asymmetry was found to be valid and a suitable tool for evaluating facial asymmetry.
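
    The mirroring idea behind such 3D asymmetry scores can be sketched with corresponding landmarks: reflect one side across the midsagittal plane and average the residual distances (landmark correspondence and plane alignment are assumed already done; in practice this requires surface registration):

```python
import numpy as np

def landmark_asymmetry_mm(left_pts, right_pts):
    """Mean distance between left landmarks and mirrored right landmarks.

    left_pts, right_pts: (n, 3) arrays of corresponding points, with the
    midsagittal plane assumed to be x = 0, so mirroring flips the x sign.
    """
    mirrored = np.asarray(right_pts, dtype=float) * np.array([-1.0, 1.0, 1.0])
    return float(np.linalg.norm(np.asarray(left_pts, dtype=float) - mirrored,
                                axis=1).mean())
```

    A perfectly symmetric face scores 0; the 0.79 mm group difference reported above is a shift in exactly this kind of distance-based score.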

  13. Pancreas++ : Automated Quantification of Pancreatic Islet Cells in Microscopy Images

    Directory of Open Access Journals (Sweden)

    Stuart Maudsley

    2013-01-01

    Full Text Available The microscopic image analysis of pancreatic Islet of Langerhans morphology is crucial for the investigation of diabetes and metabolic diseases. Besides the general size of the islet, the percentage and relative position of glucagon-containing alpha- and insulin-containing beta-cells are also important for pathophysiological analyses, especially in rodents. Hence, the ability to identify, quantify and spatially locate peripheral and ‘involuted’ alpha-cells in the islet core is an important analytical goal. There is a dearth of software available for the automated and sophisticated positional quantification of multiple cell types in the islet core. Manual analytical methods for these analyses, while relatively accurate, can suffer from a slow throughput rate as well as user-based biases. Here we describe a newly developed pancreatic islet analytical software program, Pancreas++, which facilitates the fully automated, non-biased, and highly reproducible investigation of islet area and alpha- and beta-cell quantity, as well as position within the islet, for either single images or large batches of fluorescent images. We demonstrate the utility and accuracy of Pancreas++ by comparing its performance to other pancreatic islet size and cell type (alpha, beta) quantification methods. Our Pancreas++ analysis was significantly faster than other methods, while still retaining low error rates and a high degree of result correlation with the manually generated reference standard.
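The core counting step in this kind of islet analysis can be illustrated in a few lines. The sketch below is not the Pancreas++ code; it is a minimal, assumption-laden stand-in that thresholds a synthetic fluorescence channel and counts 4-connected components as candidate cells.

```python
import numpy as np
from collections import deque

# Minimal cell-counting sketch (illustrative, not the Pancreas++ code):
# threshold a synthetic fluorescence image, then count 4-connected
# components of the binary mask as candidate cells via BFS flood fill.
def count_cells(mask):
    seen = np.zeros_like(mask, dtype=bool)
    n = 0
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        n += 1                                   # found a new component
        queue = deque([(i, j)])
        seen[i, j] = True
        while queue:
            y, x = queue.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
    return n

img = np.zeros((10, 10))
img[1:3, 1:3] = 1.0      # two bright blobs standing in for cells
img[6:8, 6:9] = 1.0
n_cells = count_cells(img > 0.5)   # -> 2
```

A production tool would add per-cell position (e.g. centroid relative to the islet boundary) and per-channel classification of alpha versus beta cells.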

  14. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and the selected sequencing platform had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets. PMID:27031878
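Because gene length and sequencing depth both bias raw read counts, AR gene abundances are usually length- and depth-normalized before metagenomes are compared. The sketch below shows one common convention (RPKM-style normalization) with hypothetical numbers; it is not the specific pipeline of this paper.

```python
# RPKM-style normalization of an AR gene's abundance: reads mapped to
# the gene, divided by gene length in kb and by total reads in millions.
# This is one common convention, not necessarily the paper's method.
def ar_gene_abundance(mapped_reads, gene_length_bp, total_reads):
    """Length- and depth-normalized abundance (reads per kb per million reads)."""
    return mapped_reads / (gene_length_bp / 1e3) / (total_reads / 1e6)

# Hypothetical example: 150 reads on a 1.2 kb gene in a 2M-read metagenome.
rpkm = ar_gene_abundance(mapped_reads=150, gene_length_bp=1200,
                         total_reads=2_000_000)   # ≈ 62.5
```

Normalizing this way makes abundances comparable across metagenomes of different sizes and across genes of different lengths, which is exactly the bias the simulation experiments above highlight.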

  15. Amperometric quantification based on serial dilution microfluidic systems.

    Science.gov (United States)

    Stephan, Khaled; Pittet, Patrick; Sigaud, Monique; Renaud, Louis; Vittori, Olivier; Morin, Pierre; Ouaini, Naim; Ferrigno, Rosaria

    2009-03-01

    This paper describes a microfluidic device fabricated in poly(dimethylsiloxane) that was employed to perform amperometric quantifications using on-chip calibration curves and on-chip standard addition methods. The device integrated a network of Au electrodes within a microfluidic structure designed for the automatic preparation of a series of solutions containing an electroactive molecule at linearly decreasing concentrations. The device was first characterized by fluorescence microscopy and then evaluated with a model electroactive species, Fe(CN)6(4-). Performing quantification in this parallel microfluidic format rather than in batch mode reduces analysis time. Moreover, the microfluidic approach is compatible with on-chip calibration of the sensors simultaneously with the analysis, preventing problems due to sensor response drift over time. Using the on-chip calibration and on-chip standard addition methods, we achieved concentration estimates accurate to better than 5%. We also demonstrated that, compared to the calibration curve approach, the standard addition mode is less complex to operate: it does not require flow rate discrepancies to be taken into account, as the calibration approach does. PMID:19238282
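The standard addition calculation underlying the second on-chip method is a short linear extrapolation. A minimal sketch with hypothetical currents and spike levels:

```python
import numpy as np

# Standard addition: measure the signal for the unknown spiked with
# increasing known amounts of standard, fit a line, and extrapolate to
# zero signal; the unknown concentration is the x-intercept magnitude.
# Values below are hypothetical, not from the paper.
added = np.array([0.0, 10.0, 20.0, 30.0])   # added standard, µM
signal = np.array([2.0, 3.0, 4.0, 5.0])     # amperometric current, µA

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope               # ≈ 20 µM here
```

Because the standards are added to the sample itself, matrix effects cancel, which is why the method is robust against the flow-rate discrepancies mentioned above.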

  16. A critical view on microplastic quantification in aquatic organisms.

    Science.gov (United States)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J J; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-11-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g(-1) w.w. for the Acid mix Method and 0.12±0.04 total microplastics g(-1) w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g(-1) w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring. PMID:26249746

  17. First approach to radionuclide mixtures quantification by using plastic scintillators

    International Nuclear Information System (INIS)

    Recent studies have evaluated the capability of plastic scintillation (PS) as an alternative to liquid scintillation (LS) in radionuclide activity determination without mixed waste production. In order to complete the comparison, we now assess the extent to which PS can be used to quantify mixtures of radionuclides, and the influence of the diameter of the plastic scintillation beads on detection efficiency. The results show that the detection efficiency decreases and the spectra shrink to lower energies as the size of the plastic scintillation beads increases, and that the lower the energy of the beta particle, the greater the variation. Similar behaviour has been observed for beta-gamma and alpha emitters. Two scenarios for the quantification of mixtures are considered, one including two radionuclides (14C and 60Co) whose spectra do not overlap significantly, and the other including two radionuclides (137Cs and 90Sr/90Y) where the spectrum of one of the isotopes is totally overlapped by that of the other. The calculation has been performed using the conventional window selection procedure and a new approach in which the selected windows are those with the lowest quantification errors. Relative errors obtained using the proposed approach (less than 10%) are lower than those of the conventional procedure, even when a radionuclide is completely overlapped, except for samples with extreme activity ratios that were not included in the window optimization process.

  18. First approach to radionuclide mixtures quantification by using plastic scintillators

    Energy Technology Data Exchange (ETDEWEB)

    Tarancon, A. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Garcia, J.F. [Departament de Pintura, Universitat de Barcelona, Pau Gargallo 4, E-08028 Barcelona (Spain)]. E-mail: jfgarcia@ub.edu; Rauret, G. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain)

    2007-05-08

    Recent studies have evaluated the capability of plastic scintillation (PS) as an alternative to liquid scintillation (LS) in radionuclide activity determination without mixed waste production. In order to complete the comparison, we now assess the extent to which PS can be used to quantify mixtures of radionuclides, and the influence of the diameter of the plastic scintillation beads on detection efficiency. The results show that the detection efficiency decreases and the spectra shrink to lower energies as the size of the plastic scintillation beads increases, and that the lower the energy of the beta particle, the greater the variation. Similar behaviour has been observed for beta-gamma and alpha emitters. Two scenarios for the quantification of mixtures are considered, one including two radionuclides ({sup 14}C and {sup 60}Co) whose spectra do not overlap significantly, and the other including two radionuclides ({sup 137}Cs and {sup 90}Sr/{sup 90}Y) where the spectrum of one of the isotopes is totally overlapped by that of the other. The calculation has been performed using the conventional window selection procedure and a new approach in which the selected windows are those with the lowest quantification errors. Relative errors obtained using the proposed approach (less than 10%) are lower than those of the conventional procedure, even when a radionuclide is completely overlapped, except for samples with extreme activity ratios that were not included in the window optimization process.
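Once counting windows are chosen, two-radionuclide quantification reduces to a small linear system: counts in each window are a mix of the two activities weighted by per-window detection efficiencies. The sketch below uses hypothetical efficiencies; the papers' contribution is precisely in how the windows are optimized, which is not reproduced here.

```python
import numpy as np

# Counts in two energy windows modeled as a linear mix of the two
# radionuclides' activities; the efficiency matrix (hypothetical values)
# would be calibrated from pure standards of each nuclide.
E = np.array([[0.60, 0.05],    # window 1 efficiencies for 14C, 60Co
              [0.02, 0.45]])   # window 2 efficiencies for 14C, 60Co
counts = np.array([125.0, 94.0])  # measured cps in windows 1 and 2

activities = np.linalg.solve(E, counts)  # activities of 14C and 60Co
```

When one nuclide's spectrum is fully overlapped (the 137Cs / 90Sr-90Y case), the matrix becomes nearly singular for poorly chosen windows, which is why window selection by minimum quantification error outperforms the conventional choice.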

  19. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture via the adjoint solution the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  20. Comparison of techniques for quantification of next-generation sequencing libraries

    DEFF Research Database (Denmark)

    Hussing, Christian; Kampmann, Marie-Louise; Mogensen, Helle Smidt;

    2015-01-01

    To ensure efficient sequencing, the DNA of next-generation sequencing (NGS) libraries must be quantified correctly. Therefore, an accurate, sensitive and stable method for DNA quantification is crucial. In this study, seven different methods for DNA quantification were compared to each other by q...

  1. Proposal for a method of identification, quantification and characterization of VLLW

    International Nuclear Information System (INIS)

    The following topics are treated: (i) identification and quantification of very low level radioactive waste in the Czech Republic (proposal for limiting levels for VLLW in the Czech Republic; quantification of VLLW in the Czech Republic); and (ii) proposal for the characterization of VLLW (difficult nature of radionuclide detection; description of suitable measuring methods; available measuring techniques). (P.A.)

  2. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Full Text Available Abstract Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. 
Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was
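The PCR efficiency that this record identifies as the crucial parameter is conventionally derived from the slope of a standard curve (Cq versus log10 of template amount). A minimal sketch with hypothetical dilution-series data:

```python
import numpy as np

# PCR efficiency from a standard curve: fit Cq against log10(template).
# Efficiency E = 10**(-1/slope) - 1; E = 1.0 (100 %) means the amplicon
# exactly doubles each cycle. Values below are hypothetical.
log10_amount = np.array([0.0, 1.0, 2.0, 3.0])   # log10(template copies)
cq = np.array([30.0, 26.68, 23.36, 20.04])      # measured Cq values

slope, _ = np.polyfit(log10_amount, cq, 1)      # ≈ -3.32 for ideal PCR
efficiency = 10 ** (-1.0 / slope) - 1           # ≈ 1.0 (100 %)
```

Exact GMO quantification requires this efficiency to match between the sample matrix and the certified reference material; inhibitors co-extracted from processed food shift the slope and hence bias the result.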

  3. Dielectrophoretic immobilization of proteins: Quantification by atomic force microscopy.

    Science.gov (United States)

    Laux, Eva-Maria; Knigge, Xenia; Bier, Frank F; Wenger, Christian; Hölzel, Ralph

    2015-09-01

    The combination of alternating electric fields with nanometer-sized electrodes allows the permanent immobilization of proteins by dielectrophoretic force. Here, atomic force microscopy is introduced as a quantification method, and results are compared with fluorescence microscopy. Experimental parameters, for example the applied voltage and duration of field application, are varied systematically, and the influence on the amount of immobilized proteins is investigated. A linear correlation to the duration of field application was found by atomic force microscopy, and both microscopical methods yield a square dependence of the amount of immobilized proteins on the applied voltage. While fluorescence microscopy allows real-time imaging, atomic force microscopy reveals immobilized proteins obscured in fluorescence images due to low S/N. Furthermore, the higher spatial resolution of the atomic force microscope enables the visualization of the protein distribution on single nanoelectrodes. The electric field distribution is calculated and compared to experimental results with very good agreement to atomic force microscopy measurements. PMID:26010162
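The reported square dependence of the immobilized protein amount on applied voltage can be checked with a log-log fit, whose slope estimates the power-law exponent. The data below are synthetic, generated to follow the reported trend, purely to illustrate the analysis.

```python
import numpy as np

# Power-law exponent from a log-log fit: for amount ~ V**2 the fitted
# slope should be ~2. Data are synthetic, not from the paper.
voltage = np.array([1.0, 2.0, 4.0, 8.0])        # applied voltage, a.u.
amount = 0.3 * voltage**2                       # synthetic immobilized amount

exponent, _ = np.polyfit(np.log(voltage), np.log(amount), 1)   # ≈ 2.0
```

The same fit applied to the AFM-derived amounts versus field-application duration would be expected to give an exponent near 1, matching the linear correlation reported above.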

  4. Iterative Methods for Scalable Uncertainty Quantification in Complex Networks

    CERN Document Server

    Surana, Amit; Banaszuk, Andrzej

    2011-01-01

    In this paper we address the problem of uncertainty management for the robust design and verification of large dynamic networks whose performance is affected by an equally large number of uncertain parameters. Many such networks (e.g. power, thermal and communication networks) are often composed of weakly interacting subnetworks. We propose intrusive and non-intrusive iterative schemes that exploit such weak interconnections to overcome the curse of dimensionality associated with traditional uncertainty quantification methods (e.g. generalized Polynomial Chaos, Probabilistic Collocation) and accelerate uncertainty propagation in systems with a large number of uncertain parameters. This approach relies on integrating graph theoretic methods and waveform relaxation with generalized Polynomial Chaos and Probabilistic Collocation, rendering these techniques scalable. We analyze convergence properties of this scheme and illustrate it on several examples.

  5. Quantification of Human Movement for Assessment in Automated Exercise Coaching

    CERN Document Server

    Hagler, Stuart; Bajczy, Ruzena; Pavel, Misha

    2016-01-01

    Quantification of human movement is a challenge in many areas, ranging from physical therapy to robotics. We quantify human movement for the purpose of providing automated exercise coaching in the home. We developed a model-based assessment and inference process that combines biomechanical constraints with movement assessment based on the Microsoft Kinect camera. To illustrate the approach, we quantify the performance of a simple squatting exercise using two model-based metrics that are related to strength and endurance, and provide an estimate of the strength and energy expenditure of each exercise session. We look at data for 5 subjects, and show that for some subjects the metrics indicate a trend consistent with improved exercise performance.

  6. Ideas underlying the Quantification of Margins and Uncertainties

    International Nuclear Information System (INIS)

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
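The separation of aleatory and epistemic uncertainty emphasized in point (ii) is often realized computationally as a double-loop ("nested") Monte Carlo analysis: the outer loop samples epistemic, fixed-but-unknown parameters, and the inner loop samples aleatory variability. The sketch below uses a made-up margin model purely to show the structure.

```python
import random

# Double-loop Monte Carlo sketch separating uncertainty types: the outer
# loop samples an epistemic parameter (a fixed-but-unknown threshold),
# the inner loop samples aleatory variability (a random load). The
# output is an epistemic band of margins, not one mixed distribution.
# The model and all numbers are hypothetical.
random.seed(1)

def performance(theta, x):
    return theta - x          # margin = capacity theta minus load x

outer = []
for _ in range(50):                               # epistemic loop
    theta = random.uniform(9.0, 11.0)             # unknown capacity
    inner = [performance(theta, random.gauss(5.0, 1.0))
             for _ in range(200)]                 # aleatory loop
    outer.append(min(inner))                      # worst-case margin per theta

margin_band = (min(outer), max(outer))            # epistemic band on margin
```

Reporting the band rather than a single pooled distribution is what lets a risk-informed decision maker see how much of the spread could, in principle, be reduced by better knowledge (epistemic) versus how much is irreducible (aleatory).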

  7. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Full Text Available Due to their chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (not chromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Although A. mearnsii presented the best results, the bark of M. tenuiflora also showed great potential due to its considerable amount of tannin and the availability of the species in the Caatinga biome.

  8. A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification

    Science.gov (United States)

    Wu, Keyi; Li, Jinglai

    2016-09-01

    In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithms, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitudes of speedup over the standard Monte Carlo methods.

  9. A Point-Wise Quantification of Asymmetry Using Deformation Fields

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Lanche, Stephanie; Darvann, Tron Andre;

    2007-01-01

    This paper introduces a novel approach to quantify asymmetry in each point of a surface. The measure is based on analysing displacement vectors resulting from nonrigid image registration. A symmetric atlas, generated from control subjects, is registered to a given subject image. A comparison of the resulting displacement vectors on the left and right side of the symmetry plane gives a point-wise measure of asymmetry. The asymmetry measure was applied to the study of Crouzon syndrome using Micro CT scans of genetically modified mice. Crouzon syndrome is characterised by the premature fusion of cranial sutures, which gives rise to highly asymmetric growth. Quantification and localisation of this asymmetry is of high value with respect to surgery planning and treatment evaluation. Using the proposed method, asymmetry was calculated in each point of the surface of Crouzon mice and wild-type mice...
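The point-wise asymmetry measure described in this record can be sketched as a comparison of registration displacement vectors at mirrored surface points. The mid-sagittal plane is taken as x = 0 here as an assumption, and the data are illustrative.

```python
import numpy as np

# Point-wise asymmetry sketch: each surface point carries a displacement
# vector from nonrigid registration of a symmetric atlas. Mirror the
# counterpart's displacement across the symmetry plane (x -> -x assumed)
# and take the norm of the difference as the local asymmetry.
def pointwise_asymmetry(disp_left, disp_right):
    """disp_left, disp_right: (N, 3) displacements at corresponding
    left/right points; returns per-point asymmetry (same units)."""
    mirrored = disp_right * np.array([-1.0, 1.0, 1.0])  # flip x component
    return np.linalg.norm(disp_left - mirrored, axis=1)

left = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
right = np.array([[-1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
asym = pointwise_asymmetry(left, right)   # -> [0.0, 1.0]
```

A perfectly symmetric deformation yields zero everywhere; localized sutural fusion shows up as a spatial cluster of high values, which is what makes the map useful for surgery planning.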

  10. An approximation approach for uncertainty quantification using evidence theory

    International Nuclear Information System (INIS)

    Over the last two decades, uncertainty quantification (UQ) in engineering systems has been performed by the popular framework of probability theory. However, many scientific and engineering communities realize that there are limitations in using only one framework for quantifying the uncertainty experienced in engineering applications. Recently evidence theory, also called Dempster-Shafer theory, was proposed to handle limited and imprecise data situations as an alternative to classical probability theory. Adaptation of this theory to large-scale engineering structures is a challenge due to the implicit nature of simulations and excessive computational costs. In this work, an approximation approach is developed to improve the practical utility of evidence theory in UQ analysis. The techniques are demonstrated on composite material structures and an airframe wing aeroelastic design problem

  11. Uncertainty quantification in computational fluid dynamics and aircraft engines

    CERN Document Server

    Montomoli, Francesco; D'Ammaro, Antonio; Massini, Michela; Salvadori, Simone

    2015-01-01

    This book introduces novel design techniques developed to increase the safety of aircraft engines. The authors demonstrate how the application of uncertainty methods can overcome problems in the accurate prediction of engine lift, caused by manufacturing error. This in turn ameliorates the difficulty of achieving required safety margins imposed by limits in current design and manufacturing methods. This text shows that even state-of-the-art computational fluid dynamics (CFD) are not able to predict the same performance measured in experiments; CFD methods assume idealised geometries but ideal geometries do not exist, cannot be manufactured and their performance differs from real-world ones. By applying geometrical variations of a few microns, the agreement with experiments improves dramatically, but unfortunately the manufacturing errors in engines or in experiments are unknown. In order to overcome this limitation, uncertainty quantification considers the probability density functions of manufacturing errors...

  12. Quantification of osteolytic bone lesions in a preclinical rat trial

    Science.gov (United States)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most patients who die have developed bone metastases as the disease progressed. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand pathogenesis and to analyse response to different treatments, animal models, in our case rats, are examined. For assessment of treatment response to bone remodelling therapies, exact segmentations of osteolytic lesions are needed. Manual segmentations are not only time-consuming but also lack reproducibility. Computerized segmentation tools are therefore essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The presented qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete missing bone in a reasonable way.
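The reference-model comparison reduces, at the voxel level, to counting bone that is present in the registered healthy reference but absent in the subject. The sketch below assumes binary masks already in a common space; the masks, names, and voxel size are illustrative.

```python
import numpy as np

# Lesion-volume sketch: voxels marked as bone in the registered healthy
# reference but missing in the subject are counted as osteolytic loss.
# Masks and voxel size are illustrative, not from the rat trial data.
def lesion_volume_mm3(ref_mask, subject_mask, voxel_mm3):
    missing = ref_mask & ~subject_mask      # bone expected but absent
    return int(missing.sum()) * voxel_mm3

ref = np.ones((4, 4, 4), dtype=bool)        # toy healthy reference
subj = ref.copy()
subj[0, 0, :2] = False                      # simulate a small lesion
vol = lesion_volume_mm3(ref, subj, voxel_mm3=0.1)   # -> 0.2
```

In practice the hard part is building and registering the healthy reference model; once that is done, the volume readout itself is this simple Boolean bookkeeping.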

  13. Sensitive ligand-based protein quantification using immuno-PCR

    DEFF Research Database (Denmark)

    Hansen, Marcus Celik; Nederby, Line; Henriksen, Mads Okkels-Birk; Hansen, Maria; Nyvold, Charlotte Guldborg

    2014-01-01

    Quantitative PCR (qPCR) of reverse-transcribed mRNA has revolutionized gene expression analyses. qPCR analysis is based on the prevalent assumption that mRNA transcript numbers provide an adequate measure of specific biomarker expression. However, taking the complexity of protein turnover into account, there is a need to correlate qPCR-derived transcriptional patterns with protein translational patterns so as to not leave behind important pathobiological details. One emerging approach in protein analysis is PCR-coupled protein quantification, often denoted immuno-PCR (iPCR), which targets soluble proteins. Here we review recent trends and applications in iPCR assays that may bridge the gap between classical enzyme-linked immunosorbent assays and mass spectrometry methodologies in terms of sensitivity and multiplexing.

  14. Quantification of asymmetric microtubule nucleation at sub-cellular structures

    Science.gov (United States)

    Zhu, Xiaodong; Kaverina, Irina

    2012-01-01

    Cell polarization is important for multiple physiological processes. In polarized cells, microtubules (MTs) are organized into a spatially polarized array. Generally, in non-differentiated cells, it is assumed that MTs are symmetrically nucleated exclusively from the centrosome (microtubule organizing center, MTOC) and then reorganized into the asymmetric array. We have recently identified the Golgi complex as an additional MTOC that asymmetrically nucleates MTs toward one side of the cell. Methods used for alternative MTOC identification include microtubule re-growth after complete drug-induced depolymerization and tracking of growing microtubules using fluorescently labeled MT +TIP-binding proteins in living cells. These approaches can be used for quantification of MT nucleation sites at diverse sub-cellular structures. PMID:21773933

  15. Uncertainty quantification for CO2 sequestration and enhanced oil recovery

    CERN Document Server

    Dai, Zhenxue; Fessenden-Rahn, Julianna; Middleton, Richard; Pan, Feng; Jia, Wei; Lee, Si-Yong; McPherson, Brian; Ampomah, William; Grigg, Reid

    2014-01-01

    This study develops a statistical method to perform uncertainty quantification for understanding CO2 storage potential within an enhanced oil recovery (EOR) environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil-water flow and reactive transport in the Morrow formation are conducted for global sensitivity and statistical analysis of the major uncertainty metrics: net CO2 injection, cumulative oil production, cumulative gas (CH4) production, and net water injection. A global sensitivity and response surface analysis indicates that reservoir permeability, porosity, and thickness are the major intrinsic reservoir parameters that control net CO2 injection/storage and oil/gas recovery rates. The well spacing and the initial water saturation also have large impact on the oil/gas recovery rates. Further, this study has revealed key insights into the potential behavior and the operational parameters of CO2 sequestration at CO2-EOR s...

  16. Quantification of hydroxyacetone and glycolaldehyde using chemical ionization mass spectrometry

    Directory of Open Access Journals (Sweden)

    K. M. Spencer

    2011-08-01

    Full Text Available Chemical ionization mass spectrometry (CIMS enables online, fast, in situ detection and quantification of hydroxyacetone and glycolaldehyde. Two different CIMS approaches are demonstrated employing the strengths of single quadrupole mass spectrometry and triple quadrupole (tandem mass spectrometry. Both methods are capable of the measurement of hydroxyacetone, an analyte with minimal isobaric interferences. Tandem mass spectrometry provides direct separation of the isobaric compounds glycolaldehyde and acetic acid using distinct, collision-induced dissociation daughter ions. Measurement of hydroxyacetone and glycolaldehyde by these methods was demonstrated during the ARCTAS-CARB 2008 campaign and the BEARPEX 2009 campaign. Enhancement ratios of these compounds in ambient biomass burning plumes are reported for the ARCTAS-CARB campaign. BEARPEX observations are compared to simple photochemical box model predictions of biogenic volatile organic compound oxidation at the site.
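Plume enhancement ratios like those reported for ARCTAS-CARB are conventionally computed as the slope of the analyte's excess mixing ratio against a co-emitted tracer's excess (CO is a common choice). The sketch below uses hypothetical values purely to show the calculation.

```python
import numpy as np

# Enhancement ratio sketch: regress analyte excess (above background)
# against tracer excess within a plume; the slope is the enhancement
# ratio. CO as tracer and all values are hypothetical.
co_excess = np.array([10.0, 40.0, 80.0, 120.0])             # ppbv
glycolaldehyde_excess = np.array([0.05, 0.20, 0.40, 0.60])  # ppbv

er, _ = np.polyfit(co_excess, glycolaldehyde_excess, 1)     # ppbv/ppbv
```

Using the regression slope rather than a simple ratio of means makes the result robust to uncertainty in the chosen background values, since any constant offset moves the intercept, not the slope.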

  17. Quantification of interfacial segregation by analytical electron microscopy

    CERN Document Server

    Muellejans, H

    2003-01-01

    The quantification of interfacial segregation by spatial difference and one-dimensional profiling is presented in general where special attention is given to the random and systematic uncertainties. The method is demonstrated for an example of Al-Al sub 2 O sub 3 interfaces in a metal-ceramic composite material investigated by energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy in a dedicated scanning transmission electron microscope. The variation of segregation measured at different interfaces by both methods is within the uncertainties, indicating a constant segregation level and interfacial phase formation. The most important random uncertainty is the counting statistics of the impurity signal whereas the specimen thickness introduces systematic uncertainties (via k factor and effective scan width). The latter could be significantly reduced when the specimen thickness is determined explicitly. (orig.)

  18. Uncertainty quantification of an inflatable/rigidizable torus

    Science.gov (United States)

    Lew, Jiann-Shiun; Horta, Lucas G.; Reaves, Mercedes C.

    2006-06-01

    There is increasing interest in lightweight inflatable structures for space missions. The dynamic testing and model updating of these types of structures present many challenges in terms of model uncertainty and structural nonlinearity. This paper presents an experimental study of uncertainty quantification of a 3m-diameter inflatable torus. Model uncertainty can be thought of as coming from two different sources: uncertainty due to changes in controlled conditions, such as temperature and input force level, and uncertainty associated with other random factors, such as measurement noise. To precisely investigate and quantify model uncertainty from different sources, experiments using sine-sweep excitation in specified narrow frequency bands are conducted to collect frequency response functions (FRFs) under various test conditions. To model the variation of the identified parameters, a singular value decomposition technique is applied to extract the principal components of the parameter change.
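
The principal-component step can be sketched with a plain SVD of the deviations of identified parameters across repeated tests; the modal frequencies and test counts below are hypothetical stand-ins, not the torus data.

```python
import numpy as np

# Hypothetical data: rows are identified modal frequencies (Hz), columns are
# repeated tests under varying conditions. Names and values are illustrative.
rng = np.random.default_rng(0)
nominal = np.array([10.0, 22.5, 37.1])
tests = nominal[:, None] + 0.1 * rng.standard_normal((3, 8))

# Centre across tests, then extract the principal components of the
# parameter variation with a singular value decomposition.
deviations = tests - tests.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(deviations, full_matrices=False)

# Columns of U are the principal directions of parameter change; the squared
# singular values give each component's share of the total variation.
explained = s**2 / np.sum(s**2)
print(np.round(explained, 3))
```

A dominant first component would indicate that most of the parameter variation is driven by a single underlying factor (e.g. temperature).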

  19. Automated quantification of one-dimensional nanostructure alignment on surfaces

    CERN Document Server

    Dong, Jianjin; Abukhdeir, Nasser Mohieddin

    2016-01-01

    A method for automated quantification of the alignment of one-dimensional nanostructures from microscopy imaging is presented. Nanostructure alignment metrics are formulated and shown to be able to rigorously quantify the orientational order of nanostructures within a two-dimensional domain (surface). A complementary image processing method is also presented which enables robust processing of microscopy images where overlapping nanostructures might be present. Scanning electron microscopy (SEM) images of nanowire-covered surfaces are analyzed using the presented methods and it is shown that past single-parameter alignment metrics are insufficient for highly aligned domains. Through the use of multiple-parameter alignment metrics, automated quantitative analysis of SEM images is shown to be possible and the alignment characteristics of different samples can be rigorously compared using a similarity metric. The results of this work provide researchers in nanoscience and nanotechnology with a rigorous metho...
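
A single-parameter alignment metric of the kind the authors critique can be sketched as the magnitude of the mean second-order orientation vector; this is an illustrative formulation, not the paper's exact metrics.

```python
import numpy as np

def alignment_order(angles_rad):
    """Second-order orientational order parameter for line-like objects.

    Uses |mean(exp(2i*theta))|, which is 1 for perfectly aligned
    nanostructures and ~0 for uniformly random orientations. The factor of
    2 makes the metric insensitive to the head/tail ambiguity of a nanowire.
    """
    z = np.exp(2j * np.asarray(angles_rad))
    return float(np.abs(z.mean()))

aligned = alignment_order(np.full(100, 0.3))                       # identical angles
random_ = alignment_order(np.linspace(0, np.pi, 100, endpoint=False))  # uniform spread
print(aligned, random_)
```

For a highly aligned domain this scalar saturates near 1 regardless of small orientation differences, which is why a single parameter cannot discriminate between such samples.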

  20. Capacitive immunosensor for C-reactive protein quantification

    KAUST Repository

    Sapsanis, Christos

    2015-08-02

    We report an agglutination-based immunosensor for the quantification of C-reactive protein (CRP). The developed immunoassay sensor requires approximately 15 minutes of assay time per sample and provides a sensitivity of 0.5 mg/L. We have measured the capacitance of interdigitated electrodes (IDEs) and quantified the concentration of added analyte. The proposed method is label-free and hence provides the rapid measurement preferred in diagnostics. We have so far been able to quantify concentrations as low as 0.5 mg/L and as high as 10 mg/L. By quantifying CRP in serum, we can assess whether patients are prone to cardiac diseases and monitor the risk associated with such diseases. The sensor is a simple, low-cost structure and can be a promising device for rapid and sensitive detection of disease markers at the point-of-care stage.

  1. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    Full Text Available OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) plaque detection. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.
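
Inter-examiner reproducibility of categorical plaque scores is commonly summarized with Cohen's kappa; a minimal implementation, with made-up scores for two examiners, might look like this.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two examiners scoring the same
    samples on a categorical scale (e.g. Turesky index values 0-5)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Agreement expected by chance from each examiner's marginal frequencies.
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

# Illustrative scores from two examiners on eight enamel blocks.
a = [0, 1, 2, 2, 3, 1, 0, 2]
b = [0, 1, 2, 3, 3, 1, 0, 1]
print(round(cohens_kappa(a, b), 3))
```

Values above roughly 0.6 are conventionally read as "substantial" agreement, which is the kind of threshold behind the paper's "adequate reproducibility" statement.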

  2. Quantification of tidal parameters from Solar System data

    Science.gov (United States)

    Lainey, Valéry

    2016-05-01

    Tidal dissipation is the main driver of orbital evolution of natural satellites and a key point to understand the exoplanetary system configurations. Despite its importance, its quantification from observations still remains difficult for most objects of our own Solar System. In this work, we overview the method that has been used to determine, directly from observations, the tidal parameters, with emphasis on the Love number k_2 and the tidal quality factor Q. Up-to-date values of these tidal parameters are summarized. Last, an assessment on the possible determination of the tidal ratio k_2/Q of Uranus and Neptune is done. This may be particularly relevant for coming astrometric campaigns and future space missions focused on these systems.

  3. NeuCode Labels for Relative Protein Quantification *

    Science.gov (United States)

    Merrill, Anna E.; Hebert, Alexander S.; MacGilvray, Matthew E.; Rose, Christopher M.; Bailey, Derek J.; Bradley, Joel C.; Wood, William W.; El Masri, Marwan; Westphall, Michael S.; Gasch, Audrey P.; Coon, Joshua J.

    2014-01-01

    We describe a synthesis strategy for the preparation of lysine isotopologues that differ in mass by as little as 6 mDa. We demonstrate that incorporation of these molecules into the proteomes of actively growing cells does not affect cellular proliferation, and we discuss how to use the embedded mass signatures (neutron encoding (NeuCode)) for multiplexed proteome quantification by means of high-resolution mass spectrometry. NeuCode SILAC amalgamates the quantitative accuracy of SILAC with the multiplexing of isobaric tags and, in doing so, offers up new opportunities for biological investigation. We applied NeuCode SILAC to examine the relationship between transcript and protein levels in yeast cells responding to environmental stress. Finally, we monitored the time-resolved responses of five signaling mutants in a single 18-plex experiment. PMID:24938287

  4. Enhanced techniques for asymmetry quantification in brain imagery

    Science.gov (United States)

    Liu, Xin; Imielinska, Celina; Rosiene, Joel; Connolly, E. S.; D'Ambrosio, Anthony L.

    2006-03-01

    We present an automated generic methodology for symmetry identification and asymmetry quantification, a novel method of identifying and delineating brain pathology by analyzing the opposing sides of the brain, utilizing the brain's inherent left-right symmetry. After the symmetry axis has been detected, we apply non-parametric statistical tests operating on pairs of samples to identify initial seed points, defined as the pixels where the most statistically significant difference appears. Local region growing is then performed on the difference map, with the seeds aggregating neighbouring pixels until all 8-way connected high signals in the difference map are captured. We illustrate the capability of our method with examples ranging from tumors in patient MR data to animal stroke data. The validation results on rat stroke data show that this approach has promise to achieve high precision and full automation in segmenting lesions in reflectionally symmetric objects.
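
The seed-based aggregation on the difference map can be sketched as breadth-first region growing over 8-connected pixels; the map, seed, and threshold below are hypothetical.

```python
from collections import deque

def region_grow(diff_map, seeds, threshold):
    """Aggregate 8-connected pixels of a statistical-difference map around
    seed points. diff_map is a 2D list of difference values; seeds are
    (row, col) pixels already flagged as significant."""
    rows, cols = len(diff_map), len(diff_map[0])
    region, queue = set(seeds), deque(seeds)
    while queue:
        r, c = queue.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and (nr, nc) not in region
                        and diff_map[nr][nc] >= threshold):
                    region.add((nr, nc))
                    queue.append((nr, nc))
    return region

diff = [[0, 0, 0, 0],
        [0, 5, 6, 0],
        [0, 7, 0, 0],
        [0, 0, 0, 9]]   # the 9 is high but not 8-connected to the seed
print(sorted(region_grow(diff, [(1, 1)], threshold=5)))
```

Note that the disconnected high pixel is excluded: growth only follows the 8-way connectivity from the statistically significant seeds.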

  5. Polynomial regression with derivative information in nuclear reactor uncertainty quantification

    International Nuclear Information System (INIS)

    We introduce a novel technique of uncertainty quantification using polynomial regression with derivative information and apply it to analyze the performance of a model of a sodium-cooled fast reactor. We construct a surrogate model as a goal-oriented projection onto an incomplete space of polynomials, find coordinates of projection by collocation, and use derivative information to reduce the number of sample points required by the collocation procedure. This surrogate model can be used to estimate range, sensitivities and the statistical distribution of the output. Numerical experiments show that the suggested approach is significantly more computationally efficient than random sampling, or approaches that do not use derivative information, and that it has greater precision than linear models. (author)
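
The idea of using derivative information to reduce the number of sample points can be illustrated with a small gradient-enhanced least-squares fit; this generic sketch is not the paper's goal-oriented collocation procedure.

```python
import numpy as np

# Fit c0 + c1*x + c2*x**2 using both sampled values and derivatives, so
# each sample point contributes two equations instead of one.
x = np.array([0.0, 1.0, 2.0])
f = 1 + 2 * x + 3 * x**2          # "model" outputs at the sample points
df = 2 + 6 * x                    # derivatives (e.g. from an adjoint solve)

# Value rows use the basis [1, x, x^2]; derivative rows use its x-derivative.
A_val = np.vander(x, 3, increasing=True)
A_der = np.column_stack([np.zeros_like(x), np.ones_like(x), 2 * x])
A = np.vstack([A_val, A_der])
b = np.concatenate([f, df])

coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(coeffs, 6))
```

With derivatives available at each point, three samples determine six equations for three coefficients, halving the number of (expensive) model evaluations needed.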

  6. Expert judgement and uncertainty quantification for climate change

    Science.gov (United States)

    Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.

    2016-05-01

    Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.

  7. Experimental investigations for uncertainty quantification in brake squeal analysis

    Science.gov (United States)

    Renault, A.; Massa, F.; Lallemand, B.; Tison, T.

    2016-04-01

    The aim of this paper is to improve the correlation between experimental and numerical predictions of unstable frequencies for automotive brake systems, considering uncertainty. First, an experimental quantification of uncertainty and a discussion analysing the contributions of uncertainty to a numerical squeal simulation are proposed. Frequency and transient simulations are performed considering nominal values of model parameters, determined experimentally. The obtained results are compared with those derived from experimental tests to highlight the limitations of deterministic simulations. The effects of the different kinds of uncertainty detected in the working conditions of the brake system (the pad boundary condition, the brake system material properties and the pad surface topography) are discussed by defining different unstable mode classes. Finally, a correlation between experimental and numerical results considering uncertainty is successfully proposed for an industrial brake system. The comparisons also reveal a major influence of the pad topography and consequently of the contact distribution.

  8. Quantification of Diffuse Hydrothermal Flows Using Multibeam Sonar

    Science.gov (United States)

    Ivakin, A. N.; Jackson, D. R.; Bemis, K. G.; Xu, G.

    2014-12-01

    The Cabled Observatory Vent Imaging Sonar (COVIS) deployed at the Main Endeavour node of the NEPTUNE Canada observatory has provided acoustic time series extending over 2 years. This includes 3D images of plume scattering strength and Doppler velocity measurements as well as 2D images showing regions of diffuse flow. The diffuse-flow images display the level of decorrelation between sonar echoes with transmissions separated by 0.2 s. The present work aims to provide further information on the strength of diffuse flows. Two approaches are used: measurement of the dependence of decorrelation on lag, and measurement of the phase shift of sonar echoes, with lags in 3-hour increments up to several days. The phase shifts and decorrelation are linked to variations of temperature above the seabed, which allows quantification of those variations, their magnitudes, spatial and temporal scales, and energy spectra. These techniques are illustrated using COVIS data obtained near the Grotto vent complex.
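
The lagged-decorrelation measurement can be sketched as a normalized correlation between echo pairs; the synthetic complex echoes below merely stand in for COVIS returns.

```python
import numpy as np

def echo_correlation(echo1, echo2):
    """Normalized complex correlation between two sonar echoes. Diffuse flow
    shows up as a drop in |rho| with increasing transmission lag; the phase
    of rho carries the sound-speed (temperature) induced path change."""
    num = np.vdot(echo1, echo2)
    den = np.sqrt(np.vdot(echo1, echo1).real * np.vdot(echo2, echo2).real)
    return num / den

rng = np.random.default_rng(1)
echo = rng.standard_normal(512) + 1j * rng.standard_normal(512)
noise = rng.standard_normal(512) + 1j * rng.standard_normal(512)

same = echo_correlation(echo, echo)                  # zero lag: |rho| = 1
perturbed = echo_correlation(echo, echo + 0.5 * noise)  # later transmission
print(abs(same), abs(perturbed))
```

Fitting the decay of |rho| versus lag then gives a timescale that can be related to the turbulent temperature fluctuations above the vent field.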

  9. Quantification of precipitate fraction in Al-Si-Cu alloys

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Z. [Sente Software Ltd., Surrey Technology Centre, Guildford GU2 7YG (United Kingdom); Sha, W. [Metals Research Group, School of Civil Engineering, Queen's University of Belfast (United Kingdom)]. E-mail: w.sha@qub.ac.uk

    2005-02-15

    Quantification of precipitate fraction is difficult when the precipitates formed are of low volume fraction. A simple method is proposed in the present work to estimate the precipitate fraction of Al₂Cu phase in Al-Si-Cu alloys based on X-ray diffraction analysis. The change in the lattice parameter of the matrix due to ageing, measured from X-ray diffraction profiles, is correlated to the fraction of Al₂Cu phase formed during ageing. JMatPro, a software package for calculating the properties of metallic systems, is used to calculate the phase constitution and composition in the Al-Si-Cu alloys studied after different heat treatments. Factors that affect the lattice parameter of the matrix have been discussed and considered in the calculations.
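
The lattice-parameter route to the precipitate fraction can be sketched as a Vegard-type back-calculation; all coefficients below are assumed round numbers for illustration, not the paper's JMatPro-based calibration.

```python
# Idea: the Al matrix lattice parameter varies roughly linearly with the Cu
# held in solid solution, so the Cu that has precipitated (and hence the
# Al2Cu fraction) can be back-calculated from the measured lattice parameter
# after ageing. All numeric coefficients here are illustrative assumptions.
A_PURE_AL = 4.0496          # lattice parameter of pure Al, angstrom
K_CU = -0.0045              # assumed shift per wt.% Cu in solution, angstrom
CU_TOTAL = 4.0              # total Cu content of the alloy, wt.%
MASS_RATIO = (2 * 26.98 + 63.55) / 63.55   # g of Al2Cu formed per g of Cu

def al2cu_weight_fraction(a_measured):
    """Weight fraction of Al2Cu, from the measured matrix lattice parameter."""
    cu_in_solution = (a_measured - A_PURE_AL) / K_CU
    precipitated_cu = CU_TOTAL - cu_in_solution     # wt.% Cu now in Al2Cu
    return precipitated_cu * MASS_RATIO / 100.0

a_aged = A_PURE_AL + K_CU * 1.0   # matrix retains 1 wt.% Cu after ageing
print(round(al2cu_weight_fraction(a_aged), 4))
```

The sensitivity of this estimate to the assumed slope is exactly why the paper stresses the factors affecting the matrix lattice parameter.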

  10. Quantification of tidal parameters from Solar system data

    CERN Document Server

    Lainey, Valéry

    2016-01-01

    Tidal dissipation is the main driver of orbital evolution of natural satellites and a key point to understand the exoplanetary system configurations. Despite its importance, its quantification from observations still remains difficult for most objects of our own Solar system. In this work, we overview the method that has been used to determine, directly from observations, the tidal parameters, with emphasis on the Love number k2 and the tidal quality factor Q. Up-to-date values of these tidal parameters are summarized. Last, an assessment on the possible determination of the tidal ratio k2/Q of Uranus and Neptune is done. This may be particularly relevant for coming astrometric campaigns and future space missions focused on these systems.

  11. Quantification of projection angle in fragment generator warhead

    Directory of Open Access Journals (Sweden)

    K.D. Dhote

    2014-06-01

    Full Text Available Tactical Ballistic Missile (TBM) class target neutralization by the fragment spray of a Fragment Generator Warhead (FGW) calls for quantification of the fragment projection angle scatter to finalize the end-game engagement logic. For a conventional axi-symmetric warhead, dispersion is assumed to be normal with a standard deviation of 3°. However, such information is not available for FGW. Hence, a set of experiments was conducted to determine the dispersion of fragments. The experiments were conducted with a specific configuration of FGW in an identical arena to quantify the scatter, and the applicability of the result was then verified for other configurations having a range of L/D and C/M ratios and contoured fragmenting discs. From the experimental study, it is concluded that the scatter in projection angle follows a normal distribution with a standard deviation of 0.75° at a Chi-square significance level of 0.01 (χ²0.99).
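
The Chi-square check of normality can be sketched by binning the measured projection-angle deviations and comparing the counts with a normal model; the counts below are invented for illustration.

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Hypothetical deviations of fragment projection angles from their mean,
# binned in 0.75-degree steps (i.e. one assumed sigma per bin, covering
# +/- 3 sigma; the ~0.3% tail mass outside is neglected in this sketch).
sigma = 0.75
edges = [-2.25, -1.5, -0.75, 0.0, 0.75, 1.5, 2.25]   # degrees
observed = [6, 21, 70, 68, 24, 5]                     # illustrative counts
n = sum(observed)

chi2 = 0.0
for lo, hi, obs in zip(edges, edges[1:], observed):
    p = normal_cdf(hi, 0.0, sigma) - normal_cdf(lo, 0.0, sigma)
    expected = n * p
    chi2 += (obs - expected) ** 2 / expected
print(round(chi2, 2))
```

The statistic is then compared with the Chi-square critical value at the chosen significance level; a value below it means the normal model with the fitted sigma cannot be rejected.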

  12. Extracellular polymeric substances: quantification and use in erosion experiments

    Science.gov (United States)

    Perkins, R. G.; Paterson, D. M.; Sun, H.; Watson, J.; Player, M. A.

    2004-10-01

    Extracellular polymeric substances (EPS) is a generic term often applied to high molecular weight polymers implicated in the biostabilisation of natural sediments. Quantitative analysis of in situ EPS production rates and sediment contents has usually involved extraction of EPS in saline media prior to precipitation in alcohol and quantification against a glucose standard (phenol-sulphuric acid assay). Extracted and synthetic EPS has also been used to create engineered sediments for erosion experiments. This study investigated two steps in the EPS extraction procedure, saline extraction and alcohol precipitation. Comparisons of the effects of different extracted polymers were made in sediment erosion experiments using engineered sediments. Sediment EPS content decreased as the salinity of the extractant increased, with highest values obtained for extraction in fresh water. Potential errors were observed in the quantification of the soluble colloidal polymer fraction when divided into EPS and lower molecular weight polymers (LMW) as used in many studies. In erosion studies, 15 mg kg-1 of alcohol (IMS) extracted EPS polymer (in 5 g kg-1 IMS precipitate, equivalent to approximately 5 g salt kg-1 sediment dry weight) decreased the erosion threshold of cohesive sediments whereas 30 mg kg-1 (in 10 g kg-1 IMS precipitate, approximately 10 g salt kg-1 sediment dry weight) had no effect compared to controls. This could be due to the influence of EPS on water content: low levels of EPS did not bind but prevented desiccation, lowering sediment stability against controls. At higher EPS content, binding effects balanced water content effects. Salt alone (at 10 g kg-1) slightly increased the erosion threshold after a 6-h desiccation period. In comparison, carbohydrates produced without alcohol precipitation (rotary evaporation) increased the erosion threshold at both 0.5 and 1.0 g EPS kg-1 dry weight of sediment. It was concluded that the role of microphytobenthic polymers in
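
Quantification against a glucose standard (phenol-sulphuric acid assay) reduces to inverting a linear calibration curve; the absorbance values below are illustrative, not data from this study.

```python
# Fit absorbance vs. concentration for the glucose standards, then invert
# the line to read off sample EPS as glucose equivalents. Standards are
# (concentration in ug/mL, absorbance at 490 nm); values are made up.
standards = [(0, 0.02), (25, 0.21), (50, 0.41), (100, 0.80)]

# Ordinary least-squares line through the standards.
n = len(standards)
sx = sum(c for c, _ in standards)
sy = sum(a for _, a in standards)
sxx = sum(c * c for c, _ in standards)
sxy = sum(c * a for c, a in standards)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def eps_concentration(absorbance):
    """Glucose-equivalent EPS concentration (ug/mL) for a sample reading."""
    return (absorbance - intercept) / slope

print(round(eps_concentration(0.50), 1))
```

The salinity effects described above enter precisely here: if the extractant suppresses EPS recovery, the same calibration line reports a lower apparent sediment EPS content.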

  13. Hyphenated techniques for the characterization and quantification of metallothionein isoforms

    Energy Technology Data Exchange (ETDEWEB)

    Prange, Andreas; Schaumloeffel, Dirk [GKSS Research Center, Institute for Coastal Research/Physical and Chemical Analysis, Max-Planck-Strasse, 21502 Geesthacht (Germany)

    2002-07-01

    Recent developments in the coupling of highly selective separation techniques such as capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) to element-specific and molecule-specific detectors, such as inductively-coupled plasma mass spectrometry (ICP-MS) and electrospray ionization-tandem mass spectrometry (ESI-MS/MS), for the characterization and quantification of metallothioneins (MTs) are critically reviewed and discussed. This review gives an update based on the literature over the last five years. The coupling of CE to ICP-MS is especially highlighted. As a result of progress in new interface technologies for CE-ICP-MS, research topics presented in the literature are changing from "the characterization of interfaces by metallothioneins" to the "characterization of metallothioneins by CE-ICP-MS". New applications of CE-ICP-MS to the analysis of MTs in real samples are summarized. The potential of the on-line isotope dilution technique for the quantification of MTs and for the determination of the stoichiometric composition of metalloprotein complexes is discussed. Furthermore, a selection of relevant papers dealing with HPLC-ICP-MS for MT analysis are summarized and compared to those dealing with CE-ICP-MS. In particular, the use of size-exclusion (SE)-HPLC as a preliminary separation step for metallothioneins in real samples prior to further chromatographic or electrophoretic separations is considered. Additionally, the application of electrospray ionization-tandem mass spectrometry (ESI-MS/MS) for the identification of metallothionein isoforms following electrophoretic or chromatographic separation is discussed. (orig.)

  14. A novel immunological assay for hepcidin quantification in human serum.

    Directory of Open Access Journals (Sweden)

    Vasiliki Koliaraki

    Full Text Available BACKGROUND: Hepcidin is a 25-amino-acid cysteine-rich iron-regulating peptide. Increased hepcidin concentrations lead to iron sequestration in macrophages, contributing to the pathogenesis of anaemia of chronic disease, whereas decreased hepcidin is observed in iron deficiency and primary iron overload diseases such as hereditary hemochromatosis. Hepcidin quantification in human blood or urine may provide further insights into the pathogenesis of disorders of iron homeostasis and might prove a valuable tool for clinicians in the differential diagnosis of anaemia. This study describes a specific and non-operator-demanding immunoassay for hepcidin quantification in human sera. METHODS AND FINDINGS: An ELISA assay was developed for measuring hepcidin serum concentration using a recombinant hepcidin25-His peptide and a polyclonal antibody against this peptide, which was able to identify native hepcidin. The ELISA assay had a detection range of 10-1500 μg/L and a detection limit of 5.4 μg/L. The intra- and interassay coefficients of variance ranged from 8-15% and 5-16%, respectively. Mean linearity and recovery were 101% and 107%, respectively. Mean hepcidin levels were significantly lower in 7 patients with juvenile hemochromatosis (12.8 μg/L) and 10 patients with iron deficiency anemia (15.7 μg/L), and higher in 7 patients with Hodgkin lymphoma (116.7 μg/L), compared to 32 age-matched healthy controls (42.7 μg/L). CONCLUSIONS: We describe a new simple ELISA assay for measuring hepcidin in human serum with sufficient accuracy and reproducibility.

  15. Quantification of rutile in anatase by X-ray diffraction

    International Nuclear Information System (INIS)

    Nowadays, the discovery of new and better materials required in all areas of industry has led researchers into this small yet vast world. Crystalline materials have markedly directional properties, and quantitative analysis of such materials is not an easy task. The main objective of this work is the study of a real problem, its solution, and the refinement of a technique combining the theoretical and experimental principles that allow the quantification of crystalline phases. Chapter 1 deals with the study of the crystalline state over the last century by means of the X-ray diffraction technique. Chapter 2 covers the nature and production of X-rays, and chapter 3 presents the principles of the diffraction technique, which apply when the Bragg law is satisfied, studying the powder diffraction method and its applications. Chapter 4 explains how the intensities of the diffracted beams are determined by the positions of the atoms within the unit cell of the crystal. The properties of the crystalline samples of anatase and rutile are described in chapter 5. The results of this analysis are the data processed by the auxiliary software Diffrac AT, Axum and Peakfit, as well as the TAFOR and CUANTI programs; this part is described in more detail in chapters 6 and 7, where the function of each program is explained step by step, up to the quantification of the crystalline phases, the objective of this work. Finally, chapter 8 presents the analysis of results and conclusions. The contribution of this work is intended for institutions of limited resources, which can thus undertake the characterization of materials in this way. (Author)

  16. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  17. Refined Stratified Sampling for efficient Monte Carlo based uncertainty quantification

    International Nuclear Information System (INIS)

    A general adaptive approach rooted in stratified sampling (SS) is proposed for sample-based uncertainty quantification (UQ). To motivate its use in this context the space-filling, orthogonality, and projective properties of SS are compared with simple random sampling and Latin hypercube sampling (LHS). SS is demonstrated to provide attractive properties for certain classes of problems. The proposed approach, Refined Stratified Sampling (RSS), capitalizes on these properties through an adaptive process that adds samples sequentially by dividing the existing subspaces of a stratified design. RSS is proven to reduce variance compared to traditional stratified sample extension methods while providing comparable or enhanced variance reduction when compared to sample size extension methods for LHS – which do not afford the same degree of flexibility to facilitate a truly adaptive UQ process. An initial investigation of optimal stratification is presented and motivates the potential for major advances in variance reduction through optimally designed RSS. Potential paths for extension of the method to high dimension are discussed. Two examples are provided. The first involves UQ for a low dimensional function where convergence is evaluated analytically. The second presents a study to assess the response variability of a floating structure subjected to an underwater shock. - Highlights: • An adaptive process, rooted in stratified sampling, is proposed for Monte Carlo-based uncertainty quantification. • Space-filling, orthogonality, and projective properties of stratified sampling are investigated. • Stratified sampling is shown to possess attractive properties for certain classes of problems. • Refined Stratified Sampling, a new sampling method, is proposed that enables the adaptive UQ process. • Optimality of RSS stratum division is explored
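
The variance-reduction premise behind (refined) stratified sampling can be demonstrated on a toy one-dimensional expectation; this shows plain SS against simple random sampling, not the adaptive stratum division of RSS.

```python
import random
from statistics import pvariance

# Estimate E[f(U)] for U ~ Uniform(0,1); the true mean of f(u)=u^2 is 1/3.
f = lambda u: u ** 2
N = 64
rng = random.Random(42)

def mc_estimate():
    # Simple random sampling: N independent uniform draws.
    return sum(f(rng.random()) for _ in range(N)) / N

def stratified_estimate():
    # One uniform draw inside each of N equal strata [i/N, (i+1)/N).
    return sum(f((i + rng.random()) / N) for i in range(N)) / N

reps = 200
mc_var = pvariance([mc_estimate() for _ in range(reps)])
st_var = pvariance([stratified_estimate() for _ in range(reps)])
print(mc_var, st_var)
```

For a smooth integrand, the stratified estimator's variance falls from O(1/N) to O(1/N³); RSS pushes further by splitting only the strata that contribute most to the remaining variance.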

  18. Stochastic methods for uncertainty quantification in radiation transport

    Energy Technology Data Exchange (ETDEWEB)

    Fichtl, Erin D [Los Alamos National Laboratory; Prinja, Anil K [Los Alamos National Laboratory; Warsa, James S [Los Alamos National Laboratory

    2009-01-01

    The use of generalized polynomial chaos (gPC) expansions is investigated for uncertainty quantification in radiation transport. The gPC represents second-order random processes in terms of an expansion of orthogonal polynomials of random variables and is used to represent the uncertain input(s) and unknown(s). We assume a single uncertain input, the total macroscopic cross section, although this does not represent a limitation of the approaches considered here. Two solution methods are examined: the Stochastic Finite Element Method (SFEM) and the Stochastic Collocation Method (SCM). The SFEM entails taking Galerkin projections onto the orthogonal basis, which, for fixed source problems, yields a linear system of fully-coupled equations for the PC coefficients of the unknown. For k-eigenvalue calculations, the SFEM system is non-linear and a Newton-Krylov method is employed to solve it. The SCM utilizes a suitable quadrature rule to compute the moments or PC coefficients of the unknown(s), thus the SCM solution involves a series of independent deterministic transport solutions. The accuracy and efficiency of the two methods are compared and contrasted. The PC coefficients are used to compute the moments and probability density functions of the unknown(s), which are shown to be accurate by comparing with Monte Carlo results. Our work demonstrates that stochastic spectral expansions are a viable alternative to sampling-based uncertainty quantification techniques since both provide a complete characterization of the distribution of the flux and the k-eigenvalue. Furthermore, it is demonstrated that, unlike perturbation methods, SFEM and SCM can handle large parameter uncertainty.
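
The SCM idea, in which each quadrature node is an independent deterministic solve, can be sketched on a one-dimensional toy problem; the exponential "solution" below stands in for a transport solve with an uncertain cross section, and all numbers are illustrative.

```python
import numpy as np
from math import factorial

# Stochastic collocation with a Gaussian input xi ~ N(0,1): evaluate the
# deterministic model once per Gauss-Hermite node, then project onto the
# probabilists' Hermite polynomials He_k to recover the PC coefficients.
nodes, weights = np.polynomial.hermite_e.hermegauss(8)   # probabilists' rule
weights = weights / np.sqrt(2.0 * np.pi)                 # normalise to a pdf

# One independent deterministic "transport" run per node.
solves = np.exp(0.1 * nodes)

# PC coefficients c_k = E[u He_k] / E[He_k^2], with E[He_k^2] = k!.
coeffs = []
for k in range(4):
    He_k = np.polynomial.hermite_e.HermiteE.basis(k)(nodes)
    coeffs.append(float(np.sum(weights * solves * He_k)) / factorial(k))

mean = coeffs[0]   # analytic value for this toy model is exp(0.005)
variance = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, 4))
print(mean, variance)
```

The same coefficients define the full PC surrogate, from which probability density functions of the output can be sampled cheaply, matching the Monte Carlo comparison described above.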

  19. Respiratory Mucosal Proteome Quantification in Human Influenza Infections

    Science.gov (United States)

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G.; DeVincenzo, John P.; Webby, Richard; Schughart, Klaus

    2016-01-01

    Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included but was not limited to quantification of cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 28 and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially-expressed proteins between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most important, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report for the use of SOMAscan to screen nasal secretions. It establishes a precedent for micro-proteomic quantification of proteins that reflect ongoing response to respiratory infection. PMID:27088501

  20. Uncertainty Quantification of Physical Model for Best Estimate Safety Analysis

    International Nuclear Information System (INIS)

    In this work, model calibration was performed to reduce the uncertainties of the simulation code's input parameters and, subsequently, the code's prediction uncertainties for design-constraining responses. The fidelity of each parameter and physical model was also identified to determine the major sources of modeling uncertainty; this analysis is important in deciding where additional effort should be directed to improve the simulation model. The goal of this work is to develop a higher-fidelity model by completing experiments and performing uncertainty quantification. Thermal-hydraulic parameters were adjusted for both mildly nonlinear and highly nonlinear systems, and their a posteriori parameter uncertainties were propagated through the simulation model to predict a posteriori uncertainties of the key system attributes. To address both the highly nonlinear and the mildly nonlinear problem, deterministic and probabilistic methods were used to complete the uncertainty quantification. For the mildly nonlinear problem, a Bayesian approach modified by regularization was used to incorporate available information in quantifying uncertainties; the a priori information considered comprises the parameters and the experimental data, together with their uncertainties. The results indicate that substantial reductions in the uncertainties of the system responses can be achieved by using experimental data to obtain a posteriori input-parameter uncertainty distributions. For the highly nonlinear transient, the MCMC method was used. Owing to the computational burden, this method would not be applicable with many parameters, but it can provide the best solution, since the algorithm does not approximate the responses, whereas the deterministic approach assumes the responses depend linearly on the parameters. Using MCMC, non-Gaussian a posteriori parameter distributions with reduced uncertainties were obtained, owing to the nonlinearity of the system sensitivity.

  1. Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis

    Science.gov (United States)

    Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.

    2016-01-01

    Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118
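    Limits of detection like those quoted above are commonly derived from the calibration line itself, using the convention LOD = 3.3·σ/slope (e.g. per ICH Q2), where σ is the standard deviation of the regression residuals. A minimal sketch of that calculation; the calibration concentrations and responses below are invented placeholders, not data from the study:

```python
# Least-squares calibration line and LOD estimate (LOD = 3.3 * sigma / slope).
# Calibration points are illustrative placeholders, not data from the paper.

def fit_line(x, y):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod(x, y):
    """LOD = 3.3 * s_residual / slope, with s_residual on n - 2 dof."""
    slope, intercept = fit_line(x, y)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    s = (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5
    return 3.3 * s / slope

conc = [0.0, 2.0, 4.0, 8.0, 16.0]      # ng/uL (placeholder)
area = [0.1, 4.2, 8.1, 16.3, 31.9]     # detector response (placeholder)
print(round(lod(conc, area), 2))
```

The same fit also yields the method's sensitivity (the slope), so linearity and LOD come from one set of standards.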

  2. Rapid quantification method for Legionella pneumophila in surface water.

    Science.gov (United States)

    Wunderlich, Anika; Torggler, Carmen; Elsässer, Dennis; Lück, Christian; Niessner, Reinhard; Seidel, Michael

    2016-03-01

    World-wide legionellosis outbreaks caused by evaporative cooling systems have shown that there is a need for rapid screening methods for Legionella pneumophila in water. Antibody-based methods for the quantification of L. pneumophila are rapid, non-laborious, and relatively cheap, but not sensitive enough to be established as a screening method for surface and drinking water. Therefore, preconcentration methods have to be applied in advance to reach the needed sensitivity. Monolithic adsorption filtration (MAF) was used as the primary preconcentration method, adsorbing L. pneumophila with high efficiency. Ten-liter water samples were concentrated in 10 min and further reduced to 1 mL by centrifugal ultrafiltration (CeUF). The quantification of L. pneumophila strains belonging to the monoclonal subtype Bellingham was performed via flow-based chemiluminescence sandwich microarray immunoassays (CL-SMIA) in 36 min; the whole analysis process takes 90 min. A polyclonal antibody (pAb) against L. pneumophila serogroups 1-12 and a monoclonal antibody (mAb) against the L. pneumophila SG 1 strain Bellingham were immobilized on a microarray chip. Without preconcentration, the detection limits were 4.0 × 10³ and 2.8 × 10³ CFU/mL as determined with the pAb and mAb 10/6, respectively. For samples processed by MAF-CeUF prior to SMIA detection, the limit of detection (LOD) was decreased to 8.7 CFU/mL and 0.39 CFU/mL, respectively. A recovery of 99.8 ± 15.9% was achieved for concentrations between 1 and 1000 CFU/mL. The established combined analytical method is sufficiently sensitive for rapid screening of surface and drinking water, allowing fast hygiene control of L. pneumophila. PMID:26873217
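    The gain from the preconcentration step can be sanity-checked with simple arithmetic: reducing a 10 L sample to 1 mL gives a nominal 10,000-fold volumetric concentration factor, and the reported LOD improvement for the mAb (2.8 × 10³ down to 0.39 CFU/mL) corresponds to roughly a 7,000-fold effective gain, consistent with that factor times a sub-100% overall recovery. A sketch, using the figures from the abstract purely as an illustration:

```python
# Volumetric concentration factor of the MAF + CeUF step and the implied
# LOD gain (figures taken from the abstract; the comparison is illustrative).

def concentration_factor(v_in_ml, v_out_ml):
    """Nominal fold-concentration from input to output volume."""
    return v_in_ml / v_out_ml

cf = concentration_factor(10_000.0, 1.0)   # 10 L reduced to 1 mL
lod_direct = 2.8e3                          # CFU/mL, mAb without preconcentration
lod_preconc = 0.39                          # CFU/mL, mAb with MAF-CeUF
gain = lod_direct / lod_preconc             # observed LOD improvement

print(cf)
print(round(gain))
```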

  3. Comparison of colorimetric methods for the quantification of model proteins in aqueous two-phase systems.

    Science.gov (United States)

    Glyk, Anna; Heinisch, Sandra L; Scheper, Thomas; Beutel, Sascha

    2015-05-15

    In the current study, the quantification of different model proteins in the presence of typical aqueous two-phase system components was investigated using the Bradford and bicinchoninic acid (BCA) assays. Each phase-forming component above 1 and 5 wt%, respectively, had considerable effects on protein quantification in the two assays, with protein recoveries/absorption values diminishing with increasing poly(ethylene glycol) (PEG)/salt concentration and PEG molecular weight. Therefore, dilution of both components (to 1 and 5 wt%, respectively) before protein quantification is recommended for both assays, with the BCA assay being favored over the Bradford assay. PMID:25684109

  4. Quantification of rice bran oil in oil blends

    Directory of Open Access Journals (Sweden)

    Mishra, R.

    2012-03-01

    Blends of physically refined rice bran oil (PRBO) with sunflower oil (SnF) and with safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods, including gas chromatography, HPLC, ultrasonic velocity, and methods based on physico-chemical parameters. Parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content, and oryzanol content showed significant changes with increasing proportions of PRBO in the blended oils. These parameters were selected as dependent variables, with the % PRBO proportion as the independent variable. The study revealed that regression equations based on oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance, and iodine value can be used for the quantification of rice bran oil in blended oils. Rice bran oil can easily be quantified in the blended oils from the oryzanol content by HPLC, even at the 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level, whereas the methods based on ultrasonic velocity, acoustic impedance, and relative association showed initial promise for the quantification of rice bran oil.

    Various physicochemical parameters were analyzed to evaluate oil blends in different proportions: physically refined rice bran oil (PRBO) with sunflower oil (SnF), and PRBO with safflower oil (SAF). The quantification of rice bran oil in the blends was carried out by different methods, such as gas chromatography (GC), liquid chromatography (HPLC), ultrasound, and methods based on other physicochemical parameters.
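    The regression-based quantification described above amounts to inverting a calibration line: fit the marker response (e.g. oryzanol content) against known %PRBO, then solve for the unknown proportion. A minimal sketch; the calibration values below are invented placeholders (oryzanol is assumed roughly proportional to the rice bran oil fraction), not the paper's data:

```python
# Invert a linear calibration (marker response vs. % rice bran oil) to
# estimate the PRBO proportion of an unknown blend. Data are invented.

def calibrate(pct, resp):
    """Least-squares fit resp = a * pct + b; returns (a, b)."""
    n = len(pct)
    mx, my = sum(pct) / n, sum(resp) / n
    a = sum((x - mx) * (y - my) for x, y in zip(pct, resp)) / \
        sum((x - mx) ** 2 for x in pct)
    return a, my - a * mx

def predict_pct(a, b, response):
    """Solve response = a * pct + b for pct."""
    return (response - b) / a

pct_prbo = [0, 20, 40, 60, 80, 100]              # % PRBO in blend
oryzanol = [0.0, 0.32, 0.64, 0.96, 1.28, 1.60]   # wt% (placeholder values)

a, b = calibrate(pct_prbo, oryzanol)
print(round(predict_pct(a, b, 0.80)))  # -> 50
```

The same inversion applies to any of the markers the study lists (palmitic acid, ultrasonic velocity, iodine value), each with its own fitted line.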

  5. Computer-assisted radiological quantification of rheumatoid arthritis

    International Nuclear Information System (INIS)

    The specific objective was to develop the layout and structure of a platform for effective quantification of rheumatoid arthritis (RA). A fully operative Java stand-alone application (RheumaCoach) was developed to support the efficacy of the scoring process in RA (web address: http://www.univie.ac.at/radio/radio.htm). The potential users of such a program are physicians enrolled in clinical trials to evaluate the course of RA and its modulation by drug therapies, and scientists developing new scoring modalities. The software RheumaCoach consists of three major modules. The Tutorial starts with 'Rheumatoid Arthritis', which teaches the basic pathology of the disease; the section 'Imaging Standards' then explains how to produce proper radiographs; 'Principles - How to Use the Larsen Score', 'Radiographic Findings' and 'Quantification by Scoring' explain the requirements for unbiased scoring of RA. The Data Input Sheet was designed to follow the radiologist's approach to analysing films, as published previously. At the Compute Sheet, the calculated Larsen score may be compared with former scores, and the further options (calculate, export, print, send) are easily accessible. In a first pre-clinical study, the system was tested in an unstructured evaluation; two structured evaluations (30 fully documented and blinded cases of RA, with four radiologists scoring hands and feet with or without the RheumaCoach) followed, and between the evaluations the software was continually improved. For all readers, use of the RheumaCoach accelerated the procedure: scoring without computer assistance needed about 20% more time. Availability of the programme via the internet provides common access for potential quality control in multi-center studies. Documentation of results in a specifically designed printout improves communication between radiologists and rheumatologists. The possibilities of direct export to other programmes and electronic

  6. Quantification of clarithromycin polymorphs in presence of tablet excipients.

    Directory of Open Access Journals (Sweden)

    Swathi Kuncham

    2014-03-01

    Excipients can pose a considerable challenge when developing a solid form of an active pharmaceutical ingredient (API). The aim of the present study was to analyze the polymorphs of clarithromycin (CAM) mixed with excipients using powder X-ray diffraction (PXRD). Polymorphic Form I (CAM-1), Form II (CAM-2) and an amorphous phase of CAM were characterized using thermal and crystallographic methods. CAM-1 and CAM-2 were monotropically related, with CAM-2 being the stable form. PXRD instrument parameters were optimized for the characterization of CAM polymorphic forms in the presence of a variety of excipients, and calibration curves for CAM-1 and CAM-2 mixed with excipients were prepared. Analytical methods based on the differences in the diffraction patterns of CAM-1, CAM-2 and the excipients were developed. Sodium methyl paraben, sodium propyl paraben, microcrystalline cellulose and magnesium stearate were crystalline, showing characteristic diffraction patterns; starch, croscarmellose sodium, talc and sodium starch glycolate were semicrystalline; colloidal silicon dioxide was amorphous. A diffraction peak at 8.7° 2θ allowed quantification of CAM-2 when mixed with excipients. The analytical method was evaluated and validated for accuracy, precision, inter- and intra-day variation, variability due to sample repacking, and instrument reproducibility. The method for quantification of CAM-2 in the range of 80 to 100% w/w was linear, with R² = 0.998. The relative standard deviation (RSD) due to sample repacking was 2.77%, indicating good homogeneity of mixing of the samples; the RSD due to assay errors was 1.66%. PXRD analysis of a commercial tablet showed CAM-2 as the major polymorph, at 98% of the overall API content, with CAM-1 present as a trace-level impurity indicated by peaks at 2θ values of 5.2° and 6.7°. This method allows characterization of the polymorphic forms of CAM in the presence of
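    Two of the validation figures reported in such studies are easy to reproduce: linearity is assessed by the R² of the intensity-vs-composition calibration line, and repacking variability by the relative standard deviation (RSD = 100·s/mean) of replicates. A sketch with invented peak-intensity values, not the study's raw data:

```python
# R^2 of a calibration line and RSD of replicate measurements, as used to
# validate a PXRD quantification method. All numbers below are invented.
import statistics

def r_squared(x, y):
    """Coefficient of determination of the least-squares line through (x, y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def rsd_percent(values):
    """Relative standard deviation (sample SD / mean) in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

wt_cam2 = [80, 85, 90, 95, 100]               # % w/w in calibration mixes
peak_area = [40.1, 42.4, 45.2, 47.3, 50.0]    # 8.7 deg 2-theta peak (invented)
repacks = [44.9, 45.6, 44.7, 45.8, 45.1]      # replicate packings (invented)

print(round(r_squared(wt_cam2, peak_area), 3))
print(round(rsd_percent(repacks), 2))
```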

  7. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    Science.gov (United States)

    White, Jeremy T.

    The subject of this dissertation is use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for

  8. Development of hydrate risk quantification in oil and gas production

    Science.gov (United States)

    Chaudhari, Piyush N.

    order to reduce the parametric study that may require a long duration of time using The Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady state and transient operations considering the effect of several critical parameters such as oil-hydrate slip, duration of shut-in, and water droplet size on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  9. Coral Pigments: Quantification Using HPLC and Detection by Remote Sensing

    Science.gov (United States)

    Cottone, Mary C.

    1995-01-01

    Widespread coral bleaching (loss of pigments of symbiotic dinoflagellates), and the corresponding decline in coral reef health worldwide, mandates the monitoring of coral pigmentation. Samples of the corals Porites compressa and P. lobata were collected from a healthy reef at Puako, Hawaii, and chlorophyll (chl) a, peridinin, and β-carotene (β-car) were quantified using reverse-phase high-performance liquid chromatography (HPLC). Detailed procedures are presented for the extraction of the coral pigments in 90% acetone, and the separation, identification, and quantification of the major zooxanthellar pigments using spectrophotometry and a modification of the HPLC system described by Mantoura and Llewellyn (1983). β-apo-8'-carotenal was found to be inadequate as an internal standard, due to coelution with chl b and/or chl a allomer in the sample extracts. Improvements are suggested which may result in better resolution of the major pigments and greater accuracy in quantification. Average concentrations of peridinin, chl a, and β-car in corals on the reef were 5.01, 8.59, and 0.29 μg/cm², respectively. Average concentrations of peridinin and β-car did not differ significantly between the two coral species sampled; however, the mean chl a concentration in P. compressa specimens (7.81 μg/cm²) was significantly lower than that in P. lobata specimens (9.96 μg/cm²). Chl a concentrations determined spectrophotometrically were significantly higher than those generated through HPLC, suggesting that spectrophotometry overestimates chl a concentrations. The average ratio of chl a to peridinin concentrations was 1.90, with a large (53%) coefficient of variation and a significant difference between the two species sampled. Additional data are needed before conclusions can be drawn regarding average pigment concentrations in healthy corals and the consistency of the chl a/peridinin ratio. The HPLC pigment concentration values
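    A reported average pigment ratio of this kind is the mean of per-sample ratios, so it need not equal the ratio of the average concentrations, and its spread is summarized by the coefficient of variation (CV = 100·s/mean). A sketch with invented per-sample pigment values, not the study's measurements:

```python
# Per-sample chl a / peridinin ratios, their mean, and the coefficient of
# variation of the ratio. Sample values are invented for illustration.
import statistics

def ratios(chl_a, peridinin):
    """Element-wise chl a / peridinin ratio for matched samples."""
    return [a / p for a, p in zip(chl_a, peridinin)]

def cv_percent(values):
    """Coefficient of variation (sample SD / mean) in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

chl_a = [7.0, 12.0, 6.5, 10.0]       # ug/cm^2 (invented)
peridinin = [5.0, 4.0, 6.0, 5.5]     # ug/cm^2 (invented)

r = ratios(chl_a, peridinin)
print(round(statistics.mean(r), 2))  # mean of per-sample ratios
print(round(cv_percent(r), 1))       # spread of the ratio, in percent
```

Note that mean(r) here differs from mean(chl_a)/mean(peridinin), which is why a large CV (53% in the abstract) makes the single ratio figure hard to interpret.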

  10. Collaborative framework for PIV uncertainty quantification: the experimental database

    International Nuclear Information System (INIS)

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for
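    With a higher-accuracy reference available, the instantaneous error of the measurement system is simply the discrepancy u_MS − u_ref, usually summarized by its bias (mean) and random part (standard deviation). A sketch of that bookkeeping with invented velocity samples; the variable names and values are placeholders, not data from the experiment:

```python
# Bias and random error of a measurement system against a trusted reference,
# as in the PIV-MS vs. PIV-HDR/HWA comparison. All data below are invented.
import statistics

def error_stats(measured, reference):
    """Return (bias, random) of the discrepancy measured - reference."""
    d = [m - r for m, r in zip(measured, reference)]
    return statistics.mean(d), statistics.stdev(d)

u_ms  = [10.2, 9.8, 10.5, 10.1, 9.9]   # m/s, measurement system (invented)
u_ref = [10.0, 9.9, 10.3, 10.0, 10.0]  # m/s, reference (invented)

bias, random_err = error_stats(u_ms, u_ref)
print(round(bias, 2), round(random_err, 2))
```

An uncertainty quantification method is then judged by how well its predicted uncertainty band covers discrepancies of this kind.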

  11. Best practices for metabolite quantification in drug development: updated recommendation from the European Bioanalysis Forum.

    Science.gov (United States)

    Timmerman, Philip; Blech, Stefan; White, Stephen; Green, Martha; Delatour, Claude; McDougall, Stuart; Mannens, Geert; Smeraglia, John; Williams, Stephen; Young, Graeme

    2016-06-01

    Metabolite quantification and profiling continues to grow in importance in today's drug development. The guidance provided by the 2008 FDA Metabolites in Safety Testing Guidance and the subsequent ICH M3(R2) Guidance (2009) has led to a more streamlined process to assess metabolite exposures in preclinical and clinical studies in industry. In addition, the European Bioanalysis Forum (EBF) identified an opportunity to refine the strategies on metabolite quantification considering the experience to date with their recommendation paper on the subject dating from 2010 and integrating the recent discussions on the tiered approach to bioanalytical method validation with focus on metabolite quantification. The current manuscript summarizes the discussion and recommendations from a recent EBF Focus Workshop into an updated recommendation for metabolite quantification in drug development. PMID:27217058

  12. The Effect of Human Genome Annotation Complexity on RNA-Seq Gene Expression Quantification

    Science.gov (United States)

    Wu, Po-Yen; Phan, John H.; Wang, May D.

    2016-01-01

    Next-generation sequencing (NGS) has brought human genomic research to an unprecedented era. RNA-Seq is a branch of NGS that can be used to quantify gene expression and depends on accurate annotation of the human genome (i.e., the definition of genes and all of their variants or isoforms). Multiple annotations of the human genome exist with varying complexity. However, it is not clear how the choice of genome annotation influences RNA-Seq gene expression quantification. We assess the effect of different genome annotations in terms of (1) mapping quality, (2) quantification variation, (3) quantification accuracy (i.e., by comparing to qRT-PCR data), and (4) the concordance of detecting differentially expressed genes. External validation with qRT-PCR suggests that more complex genome annotations result in higher quantification variation.

  13. Dispersoid quantification and size distribution in hot and cold processed AA3103

    International Nuclear Information System (INIS)

    A quantification procedure coupling a Field Emission Gun Scanning Electron Microscope (FEGSEM) with image analysis software has been developed to obtain size distributions of secondary-phase dispersoids. The procedure is described in detail for aluminium alloys, and each parameter involved is evaluated separately. The first part of the procedure is the acquisition of the micrograph; both microscope and software parameters are important at this stage. The second part, in which the micrograph is transformed into a binary image, is the most critical step: the choice of the threshold value is of great importance because this step entails the greatest loss of information from the original image. The quantification procedure is then presented with considerations and examples of its statistical meaning. The quantification method is applied to a series of AA3103 samples taken from different industrial processing steps. The results obtained are consistent with the expected trends and prove the quantification procedure to be valid
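    The critical binarization step reduces each grayscale pixel to particle or background according to a threshold; the dispersoid area fraction then follows directly, which is why the threshold choice dominates the quantification. A minimal sketch of global thresholding on a toy grayscale array; the image values and threshold are invented:

```python
# Global thresholding of a toy grayscale micrograph (0-255) and the
# resulting area fraction of "particle" pixels. Values are invented.

def binarize(image, threshold):
    """Pixels brighter than `threshold` become 1 (particle), else 0."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

def area_fraction(binary):
    """Fraction of pixels classified as particle."""
    total = sum(len(row) for row in binary)
    return sum(sum(row) for row in binary) / total

micrograph = [
    [ 12,  15, 200, 210,  14],
    [ 11, 190, 220,  13,  12],
    [ 10,  12,  14, 205, 198],
    [ 13,  11,  12,  10,  11],
]
binary = binarize(micrograph, threshold=128)
print(area_fraction(binary))  # -> 0.3
```

Shifting the threshold by a few gray levels moves borderline pixels between classes, directly changing the measured size distribution, which is the sensitivity the procedure has to control.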

  14. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    Science.gov (United States)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  15. Quantification of synovial and erosive changes in rheumatoid arthritis with ultrasound-Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Schueller-Weidekamm, Claudia [Department of Radiology, Medical University of Vienna, Vienna General Hospital, Waehringer Guertel 18-20, A-1090 Vienna (Austria)], E-mail: claudia.schueller-weidekamm@meduniwien.ac.at

    2009-08-15

    Synovitis is a predictive factor of irreversible changes in the joints, tendons, and ligaments in patients with rheumatoid arthritis (RA). Therefore, the early demonstration of reversible, pre-erosive inflammatory features to diagnose RA, the monitoring of disease activity, and the response to therapy are of great importance. Technical developments in ultrasound now allow the quantification of synovitis and erosions, and enable the assessment and follow-up of disease activity. However, both the subjective and objective quantification techniques are associated with different problems. This review article highlights the advantages and disadvantages of sonographic quantification, and revisits the somewhat controversial positions apparent in the current literature. Familiarity with the imaging findings and the scoring systems used to characterize erosive changes are prerequisites for considerably improving the detection and monitoring of synovitis and erosions. The role of ultrasound in the diagnostic approach to RA, particularly in the quantification of synovial and erosive changes, will be explored and the current literature will be reviewed.

  16. Quantification of Abscisic Acid, Cytokinin, and Auxin Content in Salt-Stressed Plant Tissues

    OpenAIRE

    Dobrev, P.; Vaňková, R.

    2012-01-01

    Plant hormones cytokinins, auxin (indole-3-acetic acid), and abscisic acid are central to regulation of plant growth and defence to abiotic stresses such as salinity. Quantification of the hormone levels and determination of their ratios can reveal different plant strategies to cope with the stress, e.g., suppression of growth or mobilization of plant metabolism. This chapter describes a procedure enabling such quantification. Due to the high variability of these hormones in plant tissues, it...

  17. Mathematical model for biomolecular quantification using surface-enhanced Raman spectroscopy based signal intensity distributions

    DEFF Research Database (Denmark)

    Palla, Mirko; Bosco, Filippo Giacomo; Yang, Jaeyoung; Rindzevicius, Tomas; Alstrøm, Tommy Sonne; Schmidt, Michael Stenbak; Lin, Qiao; Ju, Jingyue; Boisen, Anja

    This paper presents the development of a novel statistical method for quantifying trace amounts of biomolecules by surface-enhanced Raman spectroscopy (SERS) using a rigorous, single molecule (SM) theory based mathematical derivation. Our quantification framework could be generalized for planar SERS substrates, in which the nanostructured features can be approximated as a closely spaced electromagnetic dimer problem. The potential for SM detection was also shown, which opens up an exciting opportunity in the field of SERS quantification.

  18. Enhanced Information Output From Shotgun Proteomics Data by Protein Quantification and Peptide Quality Control (PQPQ)

    OpenAIRE

    Forshed, J.; Johansson, H. J.; Pernemalm, M; Branca, R. M. M.; Sandberg, A.; Lehtio, J.

    2011-01-01

    We present a tool to improve quantitative accuracy and precision in mass spectrometry-based shotgun proteomics: protein quantification by peptide quality control, PQPQ. The method is based on the assumption that the quantitative pattern of peptides derived from one protein will correlate over several samples. Dissonant patterns arise either from outlier peptides or because of the presence of different protein species. By correlation analysis, protein quantification by peptide quality control

  19. Evaluation of chromium concentration in cattle feces using different acid digestion and spectrophotometric quantification techniques

    OpenAIRE

    N.K.P. Souza; E. Detmann; D. S. Pina; Valadares Filho, S. C.; C.B. Sampaio; A.C. Queiroz; C.M. Veloso

    2013-01-01

    The objective of this work was to evaluate combinations between acid digestion techniques and spectrophotometric quantification to measure chromium concentration in cattle feces. Digestion techniques were evaluated based on the use of nitric and perchloric acids, sulfuric and perchloric acids, and phosphoric acid. The chromium quantification in the solutions was performed by colorimetry and by atomic absorption spectrophotometry (AAS). When AAS was used, the addition of calcium chloride to th...

  20. On the quantification of joint and muscle efforts in the human body during motion

    OpenAIRE

    Raison, Maxime

    2009-01-01

    The accurate quantification of internal forces in the human body in motion is still a challenge and a necessity to better understand motion-linked pathologies. In this context, the objective of this research is to develop a method to quantify the joint efforts and the corresponding muscle forces in the human body in motion. The main originality of the proposed method for the muscle force quantification is the cautious use of electromyographic (EMG) data information, known to...

  1. MR Spectroscopy: Real-Time Quantification of in-vivo MR Spectroscopic data

    OpenAIRE

    Massé, Kunal

    2009-01-01

    In the last two decades, magnetic resonance spectroscopy (MRS) has had increasing success in biomedical research. This technique can discern several metabolites in human tissue non-invasively and thus offers a multitude of medical applications. In clinical routine, quantification plays a key role in the evaluation of the different chemical elements. The quantification of metabolites characterizing specific pathologies helps physicians establish the patient's diagnosis. E...

  2. Absolute quantification in multi-pinhole micro-SPECT for different isotopes

    OpenAIRE

    Vandeghinste, Bert; Vanhove, Christian; De Beenhouwer, Jan; Van Holen, Roel; Vandenberghe, Stefaan; Staelens, Steven

    2011-01-01

    In preclinical Single-Photon Emission Computed Tomography (SPECT), absolute quantification, expressed as percentage of injected radioactive dose per gram of tissue, is of interest. This allows for accurate evaluation of disease progression and precise follow-up studies without the need for sacrificing animals. Accurate modeling of image degrading effects is currently under development for isotopes other than 99mTc. The aim of this work is to develop absolute micro-SPECT quantification for three dif...

  3. Nanoparticle-based detection and quantification of DNA with single nucleotide polymorphism (SNP) discrimination selectivity

    OpenAIRE

    Qin, Wei Jie; Yung, Lin Yue Lanry

    2007-01-01

    Sequence-specific DNA detection is important in various biomedical applications such as gene expression profiling, disease diagnosis and treatment, drug discovery and forensic analysis. Here we report a gold nanoparticle-based method that allows DNA detection and quantification and is capable of single nucleotide polymorphism (SNP) discrimination. The precise quantification of single-stranded DNA is due to the formation of defined nanoparticle-DNA conjugate groupings in the presence of target...

  4. Quantification of Acidic Compounds in Complex Biomass-Derived Streams

    Energy Technology Data Exchange (ETDEWEB)

    Karp, Eric M.; Nimlos, Claire T.; Deutch, Steve; Salvachua, Davinia; Cywar, Robin M.; Beckham, Gregg T.

    2016-09-07

    Biomass-derived streams that contain acidic compounds from the degradation of lignin and polysaccharides (e.g. black liquor, pyrolysis oil, pyrolytic lignin, etc.) are chemically complex solutions prone to instability and degradation during analysis, making quantification of compounds within them challenging. Here we present a robust analytical method to quantify acidic compounds in complex biomass-derived mixtures using ion exchange, sample reconstitution in pyridine and derivatization with BSTFA. The procedure is based on an earlier method originally reported for kraft black liquors and, in this work, is applied to identify and quantify a large slate of acidic compounds in corn stover derived alkaline pretreatment liquor (APL) as a function of pretreatment severity. Analysis of the samples is conducted with GCxGC-TOFMS to achieve good resolution of the components within the complex mixture. The results reveal the dominant low molecular weight components and their concentrations as a function of pretreatment severity. Application of this method is also demonstrated in the context of lignin conversion technologies by applying it to track the microbial conversion of an APL substrate. Here too excellent results are achieved, and the appearance and disappearance of compounds is observed in agreement with the known metabolic pathways of two bacteria, indicating the sample integrity was maintained throughout analysis. Finally, it is shown that this method applies more generally to lignin-rich materials by demonstrating its usefulness in analysis of pyrolysis oil and pyrolytic lignin.

  5. XPS quantification of the hetero-junction interface energy

    Science.gov (United States)

    Ma, Z. S.; Wang, Yan; Huang, Y. L.; Zhou, Z. F.; Zhou, Y. C.; Zheng, Weitao; Sun, Chang Q.

    2013-01-01

    We present an approach for quantifying the heterogeneous interface bond energy using X-ray photoelectron spectroscopy (XPS). Firstly, from analyzing the XPS core-level shift of the elemental surfaces we obtained the energy levels of an isolated atom and their bulk shifts of the constituent elements for reference; then we measured the energy shifts of the specific energy levels upon interface alloy formation. Subtracting the referential spectrum from that collected from the alloy, we can distil the interface effect on the binding energy. Calibrated based on the energy levels and their bulk shifts derived from elemental surfaces, we can derive the bond energy, energy density, atomic cohesive energy, and free energy at the interface region. This approach has enabled us to clarify the dominance of quantum entrapment at CuPd interface and the dominance of polarization at AgPd and BeW interfaces, as the origin of interface energy change. Developed approach not only enhances the power of XPS but also enables the quantification of the interface energy at the atomic scale that has been an issue of long challenge.

  6. Accurate quantification of cells recovered by bronchoalveolar lavage.

    Science.gov (United States)

    Saltini, C; Hance, A J; Ferrans, V J; Basset, F; Bitterman, P B; Crystal, R G

    1984-10-01

    Quantification of the differential cell count and total number of cells recovered from the lower respiratory tract by bronchoalveolar lavage is a valuable technique for evaluating the alveolitis of patients with inflammatory disorders of the lower respiratory tract. The most commonly used technique for the evaluation of cells recovered by lavage has been to concentrate cells by centrifugation and then to determine total cell number using a hemocytometer and differential cell count from a Wright-Glemsa-stained cytocentrifuge preparation. However, we have noted that the percentage of small cells present in the original cell suspension recovered by lavage is greater than the percentage of lymphocytes identified on cytocentrifuge preparations. Therefore, we developed procedures for determining differential cell counts on lavage cells collected on Millipore filters and stained with hematoxylin-eosin (filter preparations) and compared the results of differential cell counts performed on filter preparations with those obtained using cytocentrifuge preparations. When cells recovered by lavage were collected on filter preparations, accurate differential cell counts were obtained, as confirmed by performing differential cell counts on cell mixtures of known composition, and by comparing differential cell counts obtained using filter preparations stained with hematoxylin-eosin with those obtained using filter preparations stained with a peroxidase cytochemical stain. The morphology of cells displayed on filter preparations was excellent, and interobserver variability in quantitating cell types recovered by lavage was less than 3%.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:6385789

  7. Significance of DNA quantification in testicular germ cell tumors.

    Science.gov (United States)

    Codesal, J; Paniagua, R; Regadera, J; Fachal, C; Nistal, M

    1991-01-01

    A cytophotometric quantification of DNA in tumor cells was performed in histological sections of orchidectomy specimens from 36 men with testicular germ cell tumors (TGCT), 7 of them showing more than one tumor type. Among the variants of seminoma (classic and spermatocytic) the lowest DNA content were in spermatocytic seminoma. With respect to non-seminomatous tumors (yolk sac tumor, embryonal carcinoma, teratoma, and choriocarcinoma), choriocarcinomas showed the highest DNA content, and the lowest value was found in teratomas. No significant differences were found between the average DNA content of seminomas (all types) and non-seminomatous tumors (all types). Both embryonal carcinoma and yolk sac tumor showed similar DNA content when they were the sole tumor and when they were found associated with other tumors. In this study, except for the 4 cases of teratoma and the case of spermatocytic seminoma, all TGCT examined did not show modal values of DNA content in the diploid range. Such an elevated frequency of aneuploidism in these tumors may be helpful for their diagnosis. PMID:1666273

  8. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.

  9. Pinpower uncertainty quantification of LWR-PROTEUS Phase III experiments

    International Nuclear Information System (INIS)

    CASMO-5MX is a set of Perl-based tools built around the lattice code CASMO-5 and developed at the Paul Scherrer Institut, used to perform sensitivity analysis (SA) and uncertainty quantification (UQ). The paper describes the recent developments to the CASMO-5MX methodology to perform pinpower UQ with the Stochastic Sampling methodology while perturbing nuclear data and technological parameters. The CASMO-5MX SS pinpower UQ is applied to the re-analysis of the LWR-PROTEUS Phase III campaign which featured SVEA-96 Optima2 fresh BWR fuel assemblies in an attempt to explain some discrepancy between the fission rate measurements (E) and their associated predictions with CASMO-5M (C). The pinpower UQ results have shown that the pinpower uncertainty due to nuclear data and technological parameters is in the range of 0.4% to 0.8%. This additional source of uncertainty, added to the existing experimental uncertainty, do not fully explain the remaining few C/E discrepancies in the pinpower map. The magnitude of nuclear data and technological parameters uncertainty is either underestimated or an additional source of uncertainty has not been taken into account. (author)

  10. In vivo cell tracking and quantification method in adult zebrafish

    Science.gov (United States)

    Zhang, Li; Alt, Clemens; Li, Pulin; White, Richard M.; Zon, Leonard I.; Wei, Xunbin; Lin, Charles P.

    2012-03-01

    Zebrafish have become a powerful vertebrate model organism for drug discovery, cancer and stem cell research. A recently developed transparent adult zebrafish using double pigmentation mutant, called casper, provide unparalleled imaging power in in vivo longitudinal analysis of biological processes at an anatomic resolution not readily achievable in murine or other systems. In this paper we introduce an optical method for simultaneous visualization and cell quantification, which combines the laser scanning confocal microscopy (LSCM) and the in vivo flow cytometry (IVFC). The system is designed specifically for non-invasive tracking of both stationary and circulating cells in adult zebrafish casper, under physiological conditions in the same fish over time. The confocal imaging part in this system serves the dual purposes of imaging fish tissue microstructure and a 3D navigation tool to locate a suitable vessel for circulating cell counting. The multi-color, multi-channel instrument allows the detection of multiple cell populations or different tissues or organs simultaneously. We demonstrate initial testing of this novel instrument by imaging vasculature and tracking circulating cells in CD41: GFP/Gata1: DsRed transgenic casper fish whose thrombocytes/erythrocytes express the green and red fluorescent proteins. Circulating fluorescent cell incidents were recorded and counted repeatedly over time and in different types of vessels. Great application opportunities in cancer and stem cell researches are discussed.

  11. Crystal Violet and XTT Assays on Staphylococcus aureus Biofilm Quantification.

    Science.gov (United States)

    Xu, Zhenbo; Liang, Yanrui; Lin, Shiqi; Chen, Dingqiang; Li, Bing; Li, Lin; Deng, Yang

    2016-10-01

    Staphylococcus aureus (S. Aureus) is a common food-borne pathogenic microorganism. Biofilm formation remains the major obstruction for bacterial elimination. The study aims at providing a basis for determining S. aureus biofilm formation. 257 clinical samples of S. aureus isolates were identified by routine analysis and multiplex PCR detection and found to contain 227 MRSA, 16 MSSA, 11 MRCNS, and 3 MSCNS strains. Two assays for quantification of S. aureus biofilm formation, the crystal violet (CV) assay and the XTT (tetrazolium salt reduction) assay, were optimized, evaluated, and further compared. In CV assay, most isolates formed weak biofilm 74.3 %), while the rest formed moderate biofilm (23.3 %) or strong biofilm (2.3 %). However, most isolates in XTT assay showed weak metabolic activity (77.0 %), while the rest showed moderate metabolic activity (17.9 %) or high metabolic activity (5.1 %). In this study, we found a distinct strain-to-strain dissimilarity in terms of both biomass formation and metabolic activity, and it was concluded from this study that two assays were mutual complementation rather than being comparison. PMID:27324342

  12. Automated quantification of one-dimensional nanostructure alignment on surfaces

    Science.gov (United States)

    Dong, Jianjin; Goldthorpe, Irene A.; Mohieddin Abukhdeir, Nasser

    2016-06-01

    A method for automated quantification of the alignment of one-dimensional (1D) nanostructures from microscopy imaging is presented. Nanostructure alignment metrics are formulated and shown to be able to rigorously quantify the orientational order of nanostructures within a two-dimensional domain (surface). A complementary image processing method is also presented which enables robust processing of microscopy images where overlapping nanostructures might be present. Scanning electron microscopy (SEM) images of nanowire-covered surfaces are analyzed using the presented methods and it is shown that past single parameter alignment metrics are insufficient for highly aligned domains. Through the use of multiple parameter alignment metrics, automated quantitative analysis of SEM images is shown to be possible and the alignment characteristics of different samples are able to be quantitatively compared using a similarity metric. The results of this work provide researchers in nanoscience and nanotechnology with a rigorous method for the determination of structure/property relationships, where alignment of 1D nanostructures is significant.

  13. Accurate 3D quantification of the bronchial parameters in MDCT

    Science.gov (United States)

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of bronchial lumen and central axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchi contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically-modeled phantom of a pair bronchus-vessel which mimics the characteristics of real data in terms of gray-level distribution, caliber and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error less than 5.1%.

  14. Uncertainty Quantification for Airfoil Icing using Polynomial Chaos Expansions

    CERN Document Server

    DeGennaro, Anthony M; Martinelli, Luigi

    2014-01-01

    The formation and accretion of ice on the leading edge of a wing can be detrimental to airplane performance. Complicating this reality is the fact that even a small amount of uncertainty in the shape of the accreted ice may result in a large amount of uncertainty in aerodynamic performance metrics (e.g., stall angle of attack). The main focus of this work concerns using the techniques of Polynomial Chaos Expansions (PCE) to quantify icing uncertainty much more quickly than traditional methods (e.g., Monte Carlo). First, we present a brief survey of the literature concerning the physics of wing icing, with the intention of giving a certain amount of intuition for the physical process. Next, we give a brief overview of the background theory of PCE. Finally, we compare the results of Monte Carlo simulations to PCE-based uncertainty quantification for several different airfoil icing scenarios. The results are in good agreement and confirm that PCE methods are much more efficient for the canonical airfoil icing un...

  15. Automated angiogenesis quantification through advanced image processing techniques.

    Science.gov (United States)

    Doukas, Charlampos N; Maglogiannis, Ilias; Chatziioannou, Aristotle; Papapetropoulos, Andreas

    2006-01-01

    Angiogenesis, the formation of blood vessels in tumors, is an interactive process between tumor, endothelial and stromal cells in order to create a network for oxygen and nutrients supply, necessary for tumor growth. According to this, angiogenic activity is considered a suitable method for both tumor growth or inhibition detection. The angiogenic potential is usually estimated by counting the number of blood vessels in particular sections. One of the most popular assay tissues to study the angiogenesis phenomenon is the developing chick embryo and its chorioallantoic membrane (CAM), which is a highly vascular structure lining the inner surface of the egg shell. The aim of this study was to develop and validate an automated image analysis method that would give an unbiased quantification of the micro-vessel density and growth in angiogenic CAM images. The presented method has been validated by comparing automated results to manual counts over a series of digital chick embryo photos. The results indicate the high accuracy of the tool, which has been thus extensively used for tumor growth detection at different stages of embryonic development. PMID:17946107

  16. Comprehensive quantification of ceramide species in human stratum corneum.

    Science.gov (United States)

    Masukawa, Yoshinori; Narita, Hirofumi; Sato, Hirayuki; Naoe, Ayano; Kondo, Naoki; Sugai, Yoshiya; Oba, Tsuyoshi; Homma, Rika; Ishikawa, Junko; Takagi, Yutaka; Kitahara, Takashi

    2009-08-01

    One of the key challenges in lipidomics is to quantify lipidomes of interest, as it is practically impossible to collect all authentic materials covering the targeted lipidomes. For diverse ceramides (CER) in human stratum corneum (SC) that play important physicochemical roles in the skin, we developed a novel method for quantification of the overall CER species by improving our previously reported profiling technique using normal-phase liquid chromatography-electrospray ionization-mass spectrometry (NPLC-ESI-MS). The use of simultaneous selected ion monitoring measurement of as many as 182 kinds of molecular-related ions enables the highly sensitive detection of the overall CER species, as they can be analyzed in only one SC-stripped tape as small as 5 mm x 10 mm. To comprehensively quantify CERs, including those not available as authentic species, we designed a procedure to estimate their levels using relative responses of representative authentic species covering the species targeted, considering the systematic error based on intra-/inter-day analyses. The CER levels obtained by this method were comparable to those determined by conventional thin-layer chromatography (TLC), which guarantees the validity of this method. This method opens lipidomics approaches for CERs in the SC. PMID:19349641

  17. A surrogate-based uncertainty quantification with quantifiable errors

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Y.; Abdel-Khalik, H. S. [North Carolina State Univ., Raleigh, NC 27695 (United States)

    2012-07-01

    Surrogate models are often employed to reduce the computational cost required to complete uncertainty quantification, where one is interested in propagating input parameters uncertainties throughout a complex engineering model to estimate responses uncertainties. An improved surrogate construction approach is introduced here which places a premium on reducing the associated computational cost. Unlike existing methods where the surrogate is constructed first, then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to render further reduction in the computational cost. Mathematically, the reduction is described by a range finding algorithm that identifies a subspace in the parameters space, whereby parameters uncertainties orthogonal to the subspace contribute negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of uncertainties. Although we believe the algorithm is general, it will be applied here for linear-based surrogates and Gaussian parameters uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)

  18. Quantification of the degree of reaction of fly ash

    International Nuclear Information System (INIS)

    The quantification of the fly ash (FA) in FA blended cements is an important parameter to understand the effect of the fly ash on the hydration of OPC and on the microstructural development. The FA reaction in two different blended OPC-FA systems was studied using a selective dissolution technique based on EDTA/NaOH, diluted NaOH solution, the portlandite content and by backscattered electron image analysis. The amount of FA determined by selective dissolution using EDTA/NaOH is found to be associated with a significant possible error as different assumptions lead to large differences in the estimate of FA reacted. In addition, at longer hydration times, the reaction of the FA is underestimated by this method due to the presence of non-dissolved hydrates and MgO rich particles. The dissolution of FA in diluted NaOH solution agreed during the first days well with the dissolution as observed by image analysis. At 28 days and longer, the formation of hydrates in the diluted solutions leads to an underestimation. Image analysis appears to give consistent results and to be most reliable technique studied.

  19. Detection and quantification of MS lesions using fuzzy topological principles

    Science.gov (United States)

    Udupa, Jayaram K.; Wei, Luogang; Samarasekera, Supun; Miki, Yukio; van Buchem, M. A.; Grossman, Robert I.

    1996-04-01

    Quantification of the severity of the multiple sclerosis (MS) disease through estimation of lesion volume via MR imaging is vital for understanding and monitoring the disease and its treatment. This paper presents a novel methodology and a system that can be routinely used for segmenting and estimating the volume of MS lesions via dual-echo spin-echo MR imagery. An operator indicates a few points in the images by pointing to the white matter, the gray matter, and the CSF. Each of these objects is then detected as a fuzzy connected set. The holes in the union of these objects correspond to potential lesion sites which are utilized to detect each potential lesion as a fuzzy connected object. These 3D objects are presented to the operator who indicates acceptance/rejection through the click of a mouse button. The volume of accepted lesions is then computed and output. Based on several evaluation studies and over 300 3D data sets that were processed, we conclude that the methodology is highly reliable and consistent, with a coefficient of variation (due to subjective operator actions) of less than 1.0% for volume.

  20. Accurate quantification of astaxanthin from Haematococcus crude extract spectrophotometrically

    Science.gov (United States)

    Li, Yeguang; Miao, Fengping; Geng, Yahong; Lu, Dayan; Zhang, Chengwu; Zeng, Mingtao

    2012-07-01

    The influence of alkali on astaxanthin and the optimal working wave length for measurement of astaxanthin from Haematococcus crude extract were investigated, and a spectrophotometric method for precise quantification of the astaxanthin based on the method of Boussiba et al. was established. According to Boussiba's method, alkali treatment destroys chlorophyll. However, we found that: 1) carotenoid content declined for about 25% in Haematococcus fresh cysts and up to 30% in dry powder of Haematococcus broken cysts after alkali treatment; and 2) dimethyl sulfoxide (DMSO)-extracted chlorophyll of green Haematococcus bares little absorption at 520-550 nm. Interestingly, a good linear relationship existed between absorbance at 530 nm and astaxanthin content, while an unknown interference at 540-550 nm was detected in our study. Therefore, with 530 nm as working wavelength, the alkali treatment to destroy chlorophyll was not necessary and the influence of chlorophyll, other carotenoids, and the unknown interference could be avoided. The astaxanthin contents of two samples were measured at 492 nm and 530 nm; the measured values at 530 nm were 2.617 g/100 g and 1.811 g/100 g. When compared with the measured values at 492 nm, the measured values at 530 nm decreased by 6.93% and 11.96%, respectively. The measured values at 530 nm are closer to the true astaxanthin contents in the samples. The data show that 530 nm is the most suitable wave length for spectrophotometric determination to the astaxanthin in Haematococcus crude extract.

  1. Evaluation of SPECT quantification of radiopharmaceutical distribution in canine myocardium

    International Nuclear Information System (INIS)

    This study evaluates the quantitative accuracy of SPECT for in vivo distributions of 99mTc radiopharmaceuticals using fanbeam (FB) and parallel-beam (PB) collimators and compares uniform and nouniform attenuation correction methods in terms of quantitative accuracy. SPECT quantification of canine myocardial radioactivity was performed followed by well counter measurements of extracted myocardial tissue samples. Transmission scans using a line source and an FB collimator were performed to generate nonuniform attenuation maps of the canine thorax. Emission scans with two energy windows were acquired. Images were reconstructed using a filtered backprojection algorithm, with a dual-window scatter subtraction combined with either no attenuation compensation or single iteration Chang attenuation compensation based on a uniform attenuation map μ=0.152 cm-1 or the nonuniform transmission map. The measured mean counts from the SPECT images were converted using the well counter. The experimental results demonstrate that, compared with well counter values, the in vivo distributions of 99mTc were most accurately determined in FB and PB SPECT reconstructions with nonuniform attenuation compensation, under-estimated without attenuation compensation and overestimated with uniform attenuation compensation. 37 refs., 9 figs., 10 tabs

  2. VESGEN Software for Mapping and Quantification of Vascular Regulators

    Science.gov (United States)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2012-01-01

    VESsel GENeration (VESGEN) Analysis is an automated software that maps and quantifies effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN maps typically 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching tested in human clinical or laboratory animal experimental studies are quantified by comparing vascular parameters with control groups. VESGEN provides a user interface to both guide and allow control over the users vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, network or tree-network composites, which determines the general collections of algorithms, intermediate images, and output images and measurements that will be produced.

  3. A Project-based Quantification of BIM Benefits

    Directory of Open Access Journals (Sweden)

    Jian Li

    2014-08-01

    Full Text Available In the construction industry, research is being carried out to look for feasible methods and technologies to cut down project costs and waste. Building Information Modelling (BIM is certainly currently a promising technology/method that can achieve this. The output of the construction industry has a considerable scale; however, the concentration of the industry and the level of informatization are still not high. There is still a large gap in terms of productivity between the construction industry and other industries. Due to the lack of first-hand data regarding how much of an effect can be genuinely had by BIM in real cases, it is unrealistic for construction stakeholders to take the risk of widely adopting BIM. This paper focuses on the methodological quantification (through a case study approach of BIM’s benefits in building construction resource management and real-time costs control, in contrast to traditional non-BIM technologies. Through the use of BIM technology for the dynamic querying and statistical analysis of construction schedules, engineering, resources and costs, the three implementations considered demonstrate how BIM can facilitate the comprehensive grasp of a project’s implementation and progress, identify and solve the contradictions and conflicts between construction resources and costs controls, reduce project over-spends and protect the supply of resources.

  4. A surrogate-based uncertainty quantification with quantifiable errors

    International Nuclear Information System (INIS)

    Surrogate models are often employed to reduce the computational cost required to complete uncertainty quantification, where one is interested in propagating input parameters uncertainties throughout a complex engineering model to estimate responses uncertainties. An improved surrogate construction approach is introduced here which places a premium on reducing the associated computational cost. Unlike existing methods where the surrogate is constructed first, then employed to propagate uncertainties, the new approach combines both sensitivity and uncertainty information to render further reduction in the computational cost. Mathematically, the reduction is described by a range finding algorithm that identifies a subspace in the parameters space, whereby parameters uncertainties orthogonal to the subspace contribute negligible amount to the propagated uncertainties. Moreover, the error resulting from the reduction can be upper-bounded. The new approach is demonstrated using a realistic nuclear assembly model and compared to existing methods in terms of computational cost and accuracy of uncertainties. Although we believe the algorithm is general, it will be applied here for linear-based surrogates and Gaussian parameters uncertainties. The generalization to nonlinear models will be detailed in a separate article. (authors)

  5. Quantification of motility of carabid beetles in farmland.

    Science.gov (United States)

    Allema, A B; van der Werf, W; Groot, J C J; Hemerik, L; Gort, G; Rossing, W A H; van Lenteren, J C

    2015-04-01

    Quantification of the movement of insects at field and landscape levels helps us to understand their ecology and ecological functions. We conducted a meta-analysis on movement of carabid beetles (Coleoptera: Carabidae) to identify key factors affecting movement and population redistribution. We characterize the rate of redistribution using motility μ (L2 T-1), a measure of population diffusion in space and time that is consistent with ecological diffusion theory and that can be used for upscaling short-term data to longer time frames. Formulas are provided to calculate motility from literature data on movement distances. A field experiment was conducted to measure the redistribution of the mass-released carabid Pterostichus melanarius in a crop field and to derive motility by fitting a Fokker-Planck diffusion model using inverse modelling. Bias in estimates of motility from literature data is elucidated using the data from the field experiment as a case study. The meta-analysis showed that motility is 5.6 times as high in farmland as in woody habitat. Species associated with forested habitats had greater motility than species associated with open field habitats, both in arable land and woody habitat. The meta-analysis did not identify consistent differences in motility at the species level, or between clusters of larger and smaller beetles. The results presented here provide a basis for calculating time-varying distribution patterns of carabids in farmland and woody habitat. The formulas for calculating motility can be used for other taxa. PMID:25673121
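
    For a population diffusing in two dimensions, motility plays the role of a diffusion coefficient, and the mean squared displacement grows as MSD = 4μt. A sketch of recovering μ from release-recapture displacements; the motility value, time scale, and simulated walk are illustrative assumptions, not the paper's field data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a 2-D diffusing population with a known motility.
mu_true = 50.0      # assumed motility, m^2 per day
t = 7.0             # days between release and recapture
n = 50000           # recaptured individuals

# For 2-D diffusion each coordinate displacement is ~ N(0, 2*mu*t).
xy = rng.normal(0.0, np.sqrt(2 * mu_true * t), size=(n, 2))
squared_disp = (xy ** 2).sum(axis=1)

# Recover motility from the mean squared displacement: MSD = 4*mu*t.
mu_hat = squared_disp.mean() / (4 * t)
print(mu_hat)
```

The same algebra, run in reverse, upscales observed displacement distances into a motility estimate comparable across studies.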

  6. Uncertainty quantification for large-scale ocean circulation predictions.

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2010-09-01

    Uncertainty quantification in climate models is challenged by the sparsity of available climate data, due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior of the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean: the maximum overturning stream function exhibits a discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have a discontinuous character. Our approach is two-fold. First, we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity-curve location for arbitrarily distributed input parameter values. Second, we develop a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.
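
    A one-dimensional caricature shows why side-by-side expansions help: a single smooth surrogate smears a jump, while separate fits on each side of a (here, assumed known) discontinuity stay accurate. The paper works in a two-parameter space with a Bayesian-inferred discontinuity curve and PC expansions; plain polynomial fits stand in below:

```python
import numpy as np
from numpy.polynomial import Polynomial

# Illustrative 1-D forward model with a jump at x = 0.3.
disc = 0.3
f = lambda x: np.where(x < disc, np.sin(2 * x), np.sin(2 * x) + 2.0)

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 200)
y = f(x)

# A single global polynomial surrogate smears the jump ...
fit_global = Polynomial.fit(x, y, deg=3)

# ... while separate expansions on each side of the discontinuity
# stay accurate, mimicking the side-by-side PC idea.
left = x < disc
fit_l = Polynomial.fit(x[left], y[left], deg=3)
fit_r = Polynomial.fit(x[~left], y[~left], deg=3)

xt = np.linspace(0.01, 0.99, 500)
piecewise = np.where(xt < disc, fit_l(xt), fit_r(xt))
err_global = np.max(np.abs(fit_global(xt) - f(xt)))
err_piecewise = np.max(np.abs(piecewise - f(xt)))
print(err_global, err_piecewise)   # piecewise error is far smaller
```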

  7. Quantification of Covariance in Tropical Cyclone Activity across Teleconnected Basins

    Science.gov (United States)

    Tolwinski-Ward, S. E.; Wang, D.

    2015-12-01

    Rigorous statistical quantification of natural hazard covariance across regions has important implications for risk management, and is also of fundamental scientific interest. We present a multivariate Bayesian Poisson regression model for inferring the covariance in tropical cyclone (TC) counts across multiple ocean basins and across Saffir-Simpson intensity categories. Such covariability results from the influence of large-scale modes of climate variability on local environments that can alternately suppress or enhance TC genesis and intensification, and our model also simultaneously quantifies the covariance of TC counts with various climatic modes in order to deduce the source of inter-basin TC covariability. The model explicitly treats the time-dependent uncertainty in observed maximum sustained wind data, and hence the nominal intensity category of each TC. Differences in annual TC counts as measured by different agencies are also formally addressed. The model's probabilistic output can be probed for answers to questions such as: - Does the relationship between different categories of TCs differ statistically by basin? - Which climatic predictors have significant relationships with TC activity in each basin? - Are the relationships between counts in different basins conditionally independent given the climatic predictors, or are there other factors at play affecting inter-basin covariability? - How can a portfolio of insured property be optimized across space to minimize risk? Although we present results of our model applied to TCs, the framework is generalizable to covariance estimation between multivariate counts of natural hazards across regions and/or across peril types.
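
    The Poisson-regression core of such a model can be sketched with a maximum-likelihood fit; the paper's model is Bayesian and multivariate, whereas the single synthetic "climate index" covariate and coefficients below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: annual TC counts driven by one climate index,
# with rate lambda = exp(b0 + b1 * index).
n_years = 500
index = rng.standard_normal(n_years)
b_true = np.array([1.0, 0.4])
X = np.column_stack([np.ones(n_years), index])
counts = rng.poisson(np.exp(X @ b_true))

# Fit the Poisson regression by Newton-Raphson on the log-likelihood.
b = np.zeros(2)
for _ in range(25):
    lam = np.exp(X @ b)
    grad = X.T @ (counts - lam)          # score vector
    hess = X.T @ (X * lam[:, None])      # observed information
    b = b + np.linalg.solve(hess, grad)

print(b)   # close to the generating coefficients [1.0, 0.4]
```

A Bayesian treatment replaces this point estimate with a posterior over the coefficients, which is what lets the paper answer its questions probabilistically.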

  8. On Uncertainty Quantification of Lithium-ion Batteries

    CERN Document Server

    Hadigol, Mohammad; Doostan, Alireza

    2015-01-01

    In this work, a stochastic, physics-based model for Lithium-ion batteries (LIBs) is presented in order to study the effects of model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may result in reducing the LIB production cost. An LiC$_6$/LiCoO$_2$ cell with 19 uncertain parameters discharged at 0.25C, 1C and 4C rates is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the ce...
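
    First-order Sobol' indices of the kind computed here can be estimated by plain Monte Carlo with Saltelli-style sampling; the paper obtains them from sparse polynomial chaos expansions instead. The two-parameter toy model below is an assumption with known analytic indices (0.8 and 0.2):

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy model y = x1 + 0.5*x2 with x ~ U(0,1)^2; analytically S1=0.8, S2=0.2.
model = lambda x: x[:, 0] + 0.5 * x[:, 1]

N, d = 200000, 2
A = rng.uniform(size=(N, d))
B = rng.uniform(size=(N, d))
yA, yB = model(A), model(B)
var_y = np.concatenate([yA, yB]).var()

# Saltelli estimator for first-order Sobol' indices:
# S_i = E[ yB * (y_ABi - yA) ] / Var(y), where ABi is A with column i from B.
S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    S[i] = np.mean(yB * (model(ABi) - yA)) / var_y

print(S)   # approximately [0.8, 0.2]
```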

  9. Quantification of brodifacoum in plasma and liver tissue by HPLC.

    Science.gov (United States)

    O'Bryan, S M; Constable, D J

    1991-01-01

    A simple high-performance liquid chromatographic method has been developed for the detection and quantification of brodifacoum in plasma and liver tissue. After adding difenacoum as the internal standard, brodifacoum and difenacoum are extracted from 2 mL of plasma with two sequential 10-mL volumes of acetonitrile-ethyl ether (9:1) and from 2 g of liver tissue by grinding the tissue with 10 mL acetonitrile. The extracts are evaporated to dryness under nitrogen, 2 mL of acetonitrile is added to reconstitute the residues, and the resulting solution is analyzed using reversed-phase chromatography and fluorescence detection. The limits of detection for plasma and tissue are 2 micrograms/L and 5 ng/g, respectively. Using internal standardization, the mean intra-assay recovery from plasma is 92% and the mean inter-assay recovery is 109%. The mean intra-assay and inter-assay recoveries from tissue are 96%. No interferences were observed with any of the following related compounds: brodifacoum, bromadiolone, coumarin, difenacoum, diphacinone, warfarin, and vitamin K1. PMID:1943058
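
    The internal-standard quantification behind this kind of assay can be sketched as a linear calibration of analyte/IS peak-area ratios against standard concentrations; all calibration points and the unknown's peak-area ratio below are hypothetical values, not data from the paper:

```python
import numpy as np

# Hypothetical calibration: brodifacoum standards spiked into plasma,
# each with a fixed amount of difenacoum internal standard (IS).
conc = np.array([2.0, 5.0, 10.0, 25.0, 50.0])               # micrograms/L
area_ratio = np.array([0.041, 0.101, 0.199, 0.502, 1.003])  # analyte/IS peak areas

# Linear calibration: area_ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area_ratio, 1)

# Quantify an unknown sample from its measured peak-area ratio.
unknown_ratio = 0.35
unknown_conc = (unknown_ratio - intercept) / slope
print(unknown_conc)   # micrograms/L
```

Ratioing against the internal standard cancels losses during extraction and evaporation, which is why recoveries near 100% can be achieved despite a multi-step workup.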

  10. Quantification of the chemical exergy of the acai pit

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, Manoel do Espirito Santo dos Santos [Universidade Federal do Para (UFPA), Belem, PA (Brazil). Lab. de Engenharia Mecanica. LABGAS]. E-mail: manoelssmendes@hotmail.com; Nogueira, Manoel Fernandes Martins [Universidade Federal do Para (UFPA), Belem, PA (Brazil). Lab. de Engenharia Mecanica]. E-mail: mfmn@ufpa.br

    2008-07-01

    Exergy has become an increasingly important tool for the design and analysis of thermal systems because it provides the basis for thermoeconomic analysis. For systems that burn biomass, calculating the chemical exergy of the combustion process is difficult owing to the uncertainties in quantifying the chemical exergy of the biomass, so approximate expressions are usually relied upon. This work presents a methodology to quantify the chemical exergy of biomass and exemplifies the calculation of the chemical exergy of the acai pit. The calculation starts from the results of the ultimate and proximate analyses of the acai pit. The acai pit's chemical exergy is then quantified and compared with values from another method. The proposed methodology can be applied to any biomass: it determines the chemical exergy of the dry, ash-free biomass, that of the ash alone, and finally the contribution of the water contained in the biomass, thereby assisting the exergetic analysis of thermal systems that use Amazonian biomass as fuel for energy generation. (author)
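
    One commonly cited route to the chemical exergy of dry biomass, which may differ from this paper's methodology, is the Szargut-Styrylska correlation: an exergy-to-LHV ratio β computed from mass ratios of the ultimate analysis. The composition and heating value below are illustrative placeholders, not measured acai-pit data:

```python
# Szargut-Styrylska correlation for solid biofuels (mass-ratio form
# commonly quoted for wood/biomass, valid for 0.667 < O/C <= 2.67).

def beta_szargut(H, C, O, N):
    """Exergy-to-LHV ratio from ultimate-analysis mass fractions."""
    h_c, o_c, n_c = H / C, O / C, N / C
    return (1.0412 + 0.2160 * h_c - 0.2499 * o_c * (1 + 0.7884 * h_c)
            + 0.0450 * n_c) / (1 - 0.3035 * o_c)

# Illustrative dry, ash-free composition (mass fractions) and assumed LHV.
C, H, O, N = 0.50, 0.06, 0.43, 0.01
lhv_dry = 18.0   # MJ/kg, assumed

beta = beta_szargut(H, C, O, N)
exergy_chem = beta * lhv_dry   # MJ/kg of dry, ash-free biomass
print(beta, exergy_chem)
```

β values around 1.1-1.2 are typical for lignocellulosic biomass, which is why chemical exergy modestly exceeds the lower heating value.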

  11. Quantification of mouse pulmonary cancer models by microcomputed tomography imaging

    International Nuclear Information System (INIS)

    Advances in preclinical cancer models, including orthotopic implantation models and genetically engineered mouse models of cancer, enable investigation of the molecular mechanisms of disease that may mimic genetic and biological processes in humans. Lung cancer is the major cause of cancer deaths; therefore, the treatment and prevention of lung cancer are expected to be improved by a better understanding of the complex mechanism of the disease. In this study, we have examined the quantification of two distinct mouse lung cancer models by utilizing imaging modalities for monitoring tumor progression and drug efficacy evaluation. The utility of microcomputed tomography (micro-CT) for real-time/non-invasive monitoring of lung cancer progression has been confirmed by combining bioluminescent imaging and histopathological analyses. Further, we have developed a more clinically relevant lung cancer model by utilizing K-rasLSL-G12D/p53LSL-R270H mutant mice. Using micro-CT imaging, we monitored the development and progression of solitary lung tumors in K-rasLSL-G12D/p53LSL-R270H mutant mice, and further demonstrated tumor growth inhibition by anticancer drug treatment. These results clearly indicate that imaging-guided evaluation of more clinically relevant tumor models would improve the process of new drug discovery and increase the probability of success in subsequent clinical studies. (author)

  12. Relative Quantification of Several Plasma Proteins during Liver Transplantation Surgery

    Science.gov (United States)

    Parviainen, Ville; Joenväärä, Sakari; Tukiainen, Eija; Ilmakunnas, Minna; Isoniemi, Helena; Renkonen, Risto

    2011-01-01

    Plasma proteome is widely used in studying changes occurring in human body during disease or other disturbances. Immunological methods are commonly used in such studies. In recent years, mass spectrometry has gained popularity in high-throughput analysis of plasma proteins. In this study, we tested whether mass spectrometry and iTRAQ-based protein quantification might be used in proteomic analysis of human plasma during liver transplantation surgery to characterize changes in protein abundances occurring during early graft reperfusion. We sampled blood from systemic circulation as well as blood entering and exiting the liver. After immunodepletion of six high-abundant plasma proteins, trypsin digestion, iTRAQ labeling, and cation-exchange fractionation, the peptides were analyzed by reverse phase nano-LC-MS/MS. In total, 72 proteins were identified of which 31 could be quantified in all patient specimens collected. Of these 31 proteins, ten, mostly medium-to-high abundance plasma proteins with a concentration range of 50–2000 mg/L, displayed relative abundance change of more than 10%. The changes in protein abundance observed in this study allow further research on the role of several proteins in ischemia-reperfusion injury during liver transplantation and possibly in other surgery. PMID:22187521

  13. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Udo [Yale University

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces that include species-selective imaging and an ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3D-AFM), a new measurement mode that allows the mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectron volts in force/energy. From this experimental platform, we further expanded by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  14. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    The method conservatively assumes that the failure probability of software is 1 for untested inputs and becomes 0 once all test cases pass successfully. In reality, however, a chance of failure remains because of test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality; Cao discussed testing effort, testing coverage, and the testing environment, and the management of test uncertainties has also been discussed. In this study, the test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimate of software is very important for the probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on estimating the probability of a software failure in a way that accounts for the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation.

  15. Quantification of Safety-Critical Software Test Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    The method conservatively assumes that the failure probability of software is 1 for untested inputs and becomes 0 once all test cases pass successfully. In reality, however, a chance of failure remains because of test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality; Cao discussed testing effort, testing coverage, and the testing environment, and the management of test uncertainties has also been discussed. In this study, the test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimate of software is very important for the probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on estimating the probability of a software failure in a way that accounts for the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation.
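
    A minimal discrete Bayesian network of this flavor can be evaluated by exact enumeration; the node structure, priors, and conditional probability table below are illustrative assumptions, not values elicited in the study:

```python
# Two parent nodes (test coverage, testing effort) influence the residual
# software failure probability. All numbers are illustrative assumptions.

p_coverage_high = 0.7
p_effort_high = 0.6

# P(failure on demand | coverage_high, effort_high) -- assumed CPT.
p_fail = {
    (True, True): 1e-4,
    (True, False): 1e-3,
    (False, True): 5e-3,
    (False, False): 1e-2,
}

# Marginal failure probability by enumerating the parent states.
marginal = 0.0
for cov in (True, False):
    for eff in (True, False):
        p_parents = ((p_coverage_high if cov else 1 - p_coverage_high)
                     * (p_effort_high if eff else 1 - p_effort_high))
        marginal += p_parents * p_fail[(cov, eff)]

print(marginal)
```

The payoff of the BBN structure is that evidence (e.g. observing that coverage was high) updates the parent probabilities, and the same enumeration then yields a revised failure estimate.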

  16. Absolute Quantification of Selected Proteins in the Human Osteoarthritic Secretome

    Directory of Open Access Journals (Sweden)

    Mandy J. Peffers

    2013-10-01

    Full Text Available Osteoarthritis (OA) is characterized by a loss of extracellular matrix which is driven by catabolic cytokines. Proteomic analysis of the OA cartilage secretome enables the global study of secreted proteins. These are an important class of molecules with roles in numerous pathological mechanisms. Although cartilage studies have identified profiles of secreted proteins, quantitative proteomics techniques that would enable further biological questions to be addressed have not yet been implemented. To overcome this limitation, we used the secretome from human OA cartilage explants stimulated with IL-1β and compared proteins released into the media using a label-free LC-MS/MS-based strategy. We employed QconCAT technology to quantify specific proteins using selected reaction monitoring. A total of 252 proteins were identified, nine of which were differentially expressed upon IL-1β stimulation. Selected protein candidates were quantified in absolute amounts using QconCAT. These findings confirmed a significant reduction in TIMP-1 in the secretome following IL-1β stimulation. Label-free and QconCAT analyses produced equivalent results, indicating no effect of cytokine stimulation on aggrecan, cartilage oligomeric matrix protein, fibromodulin, matrix metalloproteinases 1 and 3, or plasminogen release. This study enabled comparative protein profiling and absolute quantification of proteins involved in molecular pathways pertinent to understanding the pathogenesis of OA.

  17. Interactive image quantification tools in nuclear material forensics

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Ruggiero, Christy [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory; Harvey, Neal [Los Alamos National Laboratory; Kelly, Pat [Los Alamos National Laboratory; Scoggins, Wayne [Los Alamos National Laboratory; Tandon, Lav [Los Alamos National Laboratory

    2011-01-03

    Morphological and microstructural features visible in microscopy images of nuclear materials can give information about the processing history of a nuclear material. Extraction of these attributes currently requires a subject matter expert in both microscopy and nuclear material production processes, and is a time consuming, and at least partially manual task, often involving multiple software applications. One of the primary goals of computer vision is to find ways to extract and encode domain knowledge associated with imagery so that parts of this process can be automated. In this paper we describe a user-in-the-loop approach to the problem which attempts to both improve the efficiency of domain experts during image quantification as well as capture their domain knowledge over time. This is accomplished through a sophisticated user-monitoring system that accumulates user-computer interactions as users exploit their imagery. We provide a detailed discussion of the interactive feature extraction and segmentation tools we have developed and describe our initial results in exploiting the recorded user-computer interactions to improve user productivity over time.

  18. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease that remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantifying chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments; however, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantifying the lung damage caused by TB from chest radiographs. An algorithm for the computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The lung lesions of the same patients were also quantified from CT scans. The measurements from the two methods were compared and showed strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods, suggesting the effectiveness and applicability of the method developed, and providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)
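
    The Bland-Altman agreement analysis used above (bias and 95% limits of agreement between two measurement methods) can be sketched as follows, with synthetic paired measurements standing in for the radiograph/CT data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic paired measurements: lesion volume by radiograph vs. CT,
# with an assumed ~5% proportional bias plus noise (illustrative only).
ct = rng.uniform(50, 500, 40)
radiograph = ct * 1.05 + rng.normal(0, 15, 40)

# Bland-Altman analysis: mean bias and 95% limits of agreement.
diff = radiograph - ct
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low = bias - 1.96 * sd
loa_high = bias + 1.96 * sd

# Fraction of paired differences falling within the limits.
within = np.mean((diff >= loa_low) & (diff <= loa_high))
print(bias, loa_low, loa_high, within)
```

In the paper's terms, "all samples within the limits of agreement" corresponds to `within` equal to 1 for their 95% limits.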

  19. Hepatic Iron Quantification on 3 Tesla (3 T) Magnetic Resonance (MR): Technical Challenges and Solutions

    International Nuclear Information System (INIS)

    MR has become a reliable and noninvasive method of hepatic iron quantification. Currently, most of the hepatic iron quantification is performed on 1.5 T MR, and the biopsy measurements have been paired with R2 and R2* values for 1.5 T MR. As the use of 3 T MR scanners is steadily increasing in clinical practice, it has become important to evaluate the practicality of calculating iron burden at 3 T MR. Hepatic iron quantification on 3 T MR requires a better understanding of the process and more stringent technical considerations. The purpose of this work is to focus on the technical challenges in establishing a relationship between T2* values at 1.5 T MR and 3 T MR for hepatic iron concentration (HIC) and to develop an appropriately optimized MR protocol for the evaluation of T2* values in the liver at 3 T magnetic field strength. We studied 22 sickle cell patients using multiecho fast gradient-echo sequence (MFGRE) 3 T MR and compared the results with serum ferritin and liver biopsy results. Our study showed that the quantification of hepatic iron on 3 T MRI in sickle cell disease patients correlates well with clinical blood test results and biopsy results. 3 T MR liver iron quantification based on MFGRE can be used for hepatic iron quantification in transfused patients

  20. Hepatic Iron Quantification on 3 Tesla (3 T) Magnetic Resonance (MR): Technical Challenges and Solutions

    International Nuclear Information System (INIS)

    MR has become a reliable and noninvasive method of hepatic iron quantification. Currently, most of the hepatic iron quantification is performed on 1.5 T MR, and the biopsy measurements have been paired with R2 and R2* values for 1.5 T MR. As the use of 3 T MR scanners is steadily increasing in clinical practice, it has become important to evaluate the practicality of calculating iron burden at 3 T MR. Hepatic iron quantification on 3 T MR requires a better understanding of the process and more stringent technical considerations. The purpose of this work is to focus on the technical challenges in establishing a relationship between T2* values at 1.5 T MR and 3 T MR for hepatic iron concentration (HIC) and to develop an appropriately optimized MR protocol for the evaluation of T2* values in the liver at 3 T magnetic field strength. We studied 22 sickle cell patients using multiecho fast gradient-echo sequence (MFGRE) 3 T MR and compared the results with serum ferritin and liver biopsy results. Our study showed that the quantification of hepatic iron on 3 T MRI in sickle cell disease patients correlates well with clinical blood test results and biopsy results. 3 T MR liver iron quantification based on MFGRE can be used for hepatic iron quantification in transfused patients.
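
    The quantity extracted from a multiecho gradient-echo acquisition is R2* (= 1/T2*), obtained by fitting the signal decay S(TE) = S0·exp(-R2*·TE) across echo times. A noiseless log-linear sketch; the echo times and the heavy-iron-overload T2* value are illustrative, and real magnitude data additionally require noise-floor handling:

```python
import numpy as np

# Synthetic multiecho gradient-echo signal: S(TE) = S0 * exp(-R2star * TE).
te = np.array([1.0, 2.0, 3.5, 5.0, 7.0, 9.0, 12.0])  # echo times, ms
r2star_true = 0.25       # 1/ms  ->  T2* = 4 ms (assumed, iron-overload regime)
s0 = 1000.0
signal = s0 * np.exp(-r2star_true * te)

# Log-linear least-squares fit recovers R2*, and hence T2* = 1/R2*.
slope, intercept = np.polyfit(te, np.log(signal), 1)
r2star = -slope
t2star = 1.0 / r2star
print(r2star, t2star)    # 0.25 /ms, 4.0 ms
```

Calibration curves such as those discussed in the abstract then map the fitted R2* (at a given field strength) to hepatic iron concentration.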

  1. Method for Indirect Quantification of CH4 Production via H2O Production Using Hydrogenotrophic Methanogens

    Science.gov (United States)

    Taubner, Ruth-Sophie; Rittmann, Simon K.-M. R.

    2016-01-01

    Hydrogenotrophic methanogens are an intriguing group of microorganisms from the domain Archaea. Methanogens exhibit extraordinary ecological, biochemical, and physiological characteristics and possess huge biotechnological potential. Yet, until now the only way to assess the methane (CH4) production potential of hydrogenotrophic methanogens has been gas chromatographic quantification of CH4. To enable effective screening of pure cultures of hydrogenotrophic methanogens for their CH4 production potential, we developed a novel method for indirect quantification of the volumetric CH4 production rate by measuring the volumetric water production rate. This method was established in serum bottles for cultivation of methanogens in closed batch cultivation mode. Water production was estimated by determining the difference in mass increase in a quasi-isobaric setting. This novel CH4 quantification method is an accurate and precise analytical technique that can be used to rapidly screen pure cultures of methanogens for their volumetric CH4 evolution rate. It is a cost-effective alternative to gas chromatographic CH4 quantification, especially when applied as a high-throughput method. Eventually, the method can be universally applied for quantification of CH4 production by psychrophilic, thermophilic and hyperthermophilic hydrogenotrophic methanogens. PMID:27199898
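
    The stoichiometry underpinning the indirect quantification is CO2 + 4 H2 → CH4 + 2 H2O: every mole of CH4 produced is accompanied by two moles of water, so the measured water-mass gain fixes the CH4 amount. A sketch with illustrative numbers (the mass gain, incubation time, and culture volume below are assumptions, not the paper's measurements):

```python
# CO2 + 4 H2 -> CH4 + 2 H2O, so n(CH4) = n(H2O) / 2.

M_H2O = 18.015          # g/mol
mass_gain_g = 0.090     # water-mass increase from quasi-isobaric weighing (assumed)
hours = 24.0            # incubation time (assumed)
liquid_volume_l = 0.05  # culture volume in the serum bottle (assumed)

n_h2o = mass_gain_g / M_H2O
n_ch4 = n_h2o / 2.0

# Volumetric CH4 evolution rate in mmol per litre per hour.
rate_mmol_l_h = n_ch4 * 1000 / (liquid_volume_l * hours)
print(rate_mmol_l_h)
```

Because only a balance is needed, the same arithmetic scales trivially to many bottles in parallel, which is what makes the approach attractive for high-throughput screening.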

  2. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis

    Directory of Open Access Journals (Sweden)

    Ning Deng

    2015-01-01

    Full Text Available The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification, and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with better dynamic range.
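
    A standard length-normalized spectral-count measure, the normalized spectral abundance factor (NSAF_i = (SpC_i/L_i) / Σ_j(SpC_j/L_j)), illustrates the spectral-counting idea freeQuant builds on; this is the generic formula, not freeQuant's exact algorithm, and the counts below are invented (protein names are real mitochondrial proteins used only as labels):

```python
# NSAF: spectral counts normalized by protein length, then by the
# whole-sample total, so values sum to 1 and are comparable across runs.

spectral_counts = {"ATP5A1": 120, "NDUFS1": 45, "CS": 30}   # invented counts
lengths = {"ATP5A1": 553, "NDUFS1": 727, "CS": 466}         # approx. lengths (aa)

saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
total = sum(saf.values())
nsaf = {p: saf[p] / total for p in saf}

print(nsaf)
```

Length normalization matters because longer proteins yield more tryptic peptides and thus more spectra at equal abundance; note how CS edges past NDUFS1 once lengths are accounted for.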

  3. MaqFACS (Macaque Facial Action Coding System) can be used to document facial movements in Barbary macaques (Macaca sylvanus)

    OpenAIRE

    Julle-Danière, Églantine; Micheletta, Jérôme; Whitehouse, Jamie; Joly, Marine; Gass, Carolin; Burrows, Anne M.; Waller, Bridget M.

    2015-01-01

    Human and non-human primates exhibit facial movements or displays to communicate with one another. The evolution of form and function of those displays could be better understood through multispecies comparisons. Anatomically based coding systems (Facial Action Coding Systems: FACS) are developed to enable such comparisons because they are standardized and systematic and aid identification of homologous expressions underpinned by similar muscle contractions. To date, FACS has been developed f...

  4. Bacterial adhesion force quantification by fluidic force microscopy

    Science.gov (United States)

    Potthoff, Eva; Ossola, Dario; Zambelli, Tomaso; Vorholt, Julia A.

    2015-02-01

    Quantification of detachment forces between bacteria and substrates facilitates the understanding of the bacterial adhesion process that affects cell physiology and survival. Here, we present a method that allows for serial, single bacterial cell force spectroscopy by combining the force control of atomic force microscopy with microfluidics. Reversible bacterial cell immobilization under physiological conditions on the pyramidal tip of a microchanneled cantilever is achieved by underpressure. Using the fluidic force microscopy technology (FluidFM), we achieve immobilization forces greater than those of state-of-the-art cell-cantilever binding as demonstrated by the detachment of Escherichia coli from polydopamine with recorded forces between 4 and 8 nN for many cells. The contact time and setpoint dependence of the adhesion forces of E. coli and Streptococcus pyogenes, as well as the sequential detachment of bacteria out of a chain, are shown, revealing distinct force patterns in the detachment curves. This study demonstrates the potential of the FluidFM technology for quantitative bacterial adhesion measurements of cell-substrate and cell-cell interactions that are relevant in biofilms and infection biology.

  5. Christiansen Revisited: Rethinking Quantification of Uniformity in Rainfall Simulator Studies

    Science.gov (United States)

    Green, Daniel; Pattison, Ian

    2016-04-01

    Rainfall simulators, whether based in a laboratory or field setting, are used extensively across a number of research fields, including plot-scale runoff, infiltration and erosion studies, irrigation and crop management, and scaled investigations into urban flooding. Rainfall simulators offer a number of benefits, including the ability to create regulated and repeatable rainfall characteristics (e.g. intensity, duration, drop size distribution and kinetic energy) without relying on unpredictable natural precipitation regimes. Ensuring and quantifying spatially uniform simulated rainfall across the entirety of the plot area is of particular importance to researchers undertaking rainfall simulation. As a result, numerous studies have focused on the quantification and improvement of uniformity values. Several statistical methods for the assessment of rainfall simulator uniformity have been developed; however, the Christiansen Uniformity Coefficient (CUC) suggested by Christiansen (1942) is most frequently used. Despite this, there is no set methodology, and researchers can adapt or alter factors such as the quantity, spacing, distance and location of the measuring beakers used to derive CUC values. Because CUC values are highly sensitive to the resolution of the data, i.e. the number of observations taken, many densely distributed measuring containers subjected to the same experimental conditions may generate a significantly lower CUC value than fewer, more sparsely distributed measuring containers. Thus, the simulated rainfall under a higher resolution sampling method could appear less uniform than when using a coarser resolution sampling method, despite being derived from the same initial rainfall conditions. Expressing entire plot uniformity as a single, simplified percentage value disregards valuable qualitative information about plot uniformity, such as the small-scale spatial distribution of rainfall over the plot surface and whether these
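
    The CUC itself is simple to compute from catch-can depths: CUC = 100·(1 − Σ|x_i − x̄| / (n·x̄)), i.e. one minus the mean absolute deviation relative to the mean, expressed as a percentage. A sketch over an illustrative (invented) 3x3 grid of measured depths:

```python
import numpy as np

# Christiansen Uniformity Coefficient over a grid of catch-can depths (mm).
# CUC = 100 * (1 - sum|x_i - mean| / (n * mean)). Depths are illustrative.
depths = np.array([
    [12.1, 11.8, 12.4],
    [11.5, 12.9, 12.2],
    [12.6, 11.9, 12.0],
])

x = depths.ravel()
cuc = 100.0 * (1.0 - np.abs(x - x.mean()).sum() / (x.size * x.mean()))
print(round(cuc, 1))
```

As the abstract argues, this single percentage is blind to spatial pattern: rearranging the same nine depths into a strong corner-to-corner gradient would yield an identical CUC.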

  6. Streptavidin conjugation and quantification-a method evaluation for nanoparticles.

    Science.gov (United States)

    Quevedo, Pablo Darío; Behnke, Thomas; Resch-Genger, Ute

    2016-06-01

    Aiming at the development of validated protocols for protein conjugation of nanomaterials and the determination of protein labeling densities, we systematically assessed the conjugation of the model protein streptavidin (SAv) to 100-, 500-, and 1000-nm-sized polystyrene and silica nanoparticles and dye-encoded polymer particles with two established conjugation chemistries, comparing achievable coupling efficiencies and labeling densities. Bioconjugation reactions compared included EDC/sulfo NHS ester chemistry for direct binding of the SAv to carboxyl groups at the particle surface and maleimide-thiol chemistry in conjunction with heterobifunctional PEG linkers and aminated nanoparticles (NPs). Quantification of the total and functional amounts of SAv on these nanomaterials and unreacted SAv in solution was performed with the BCA assay and the biotin-FITC (BF) titration, relying on different signal generation principles, which are thus prone to different interferences. Our results revealed a clear influence of the conjugation chemistry on the amount of NP crosslinking, yet under optimized reaction conditions, EDC/sulfo NHS ester chemistry and attachment via heterobifunctional PEG linkers led to comparably efficient SAv coupling and good labeling densities. Particle size can also affect protein labeling densities and, particularly for larger particles, protein functionality. For unstained nanoparticles, direct bioconjugation seems to be the most efficient strategy, whereas for dye-encoded nanoparticles, PEG linkers are to be favored for the prevention of dye-protein interactions, which can affect protein functionality specifically in the case of direct SAv binding. PMID:27038055

  7. Quantification of ocular inflammation with technetium-99m glucoheptonate

    International Nuclear Information System (INIS)

    Histological and morphometric evaluation of ocular inflammation is difficult, particularly when there is extensive ocular involvement with abscess formation and necrosis. A quantitative imaging procedure applicable to humans would be clinically important. To establish such a procedure, turpentine-induced ocular inflammation was produced by subconjunctival injection in the right eye of 55 rabbits. The left eye was used as control and injected with a volume of saline equal to the volume of turpentine in the right eye. Volumes of turpentine or saline were 0.02, 0.04, 0.06, 0.2 and 0.6 ml, and the rabbits were divided into groups 1-5 according to these volumes. Imaging was performed 48 h after turpentine injection and 6 h after intravenous injection of 10 mCi of technetium-99m glucoheptonate (99mTc-GH). An inflammatory reaction index (IRI), defined as the counts of the right eye divided by the counts of the left eye, was used. IRIs were proportional to the degree of inflammation and allowed the distinction of three subgroups: one represented by group 4, one by group 5, and one by groups 1, 2 and 3. This method of quantification of ocular inflammatory processes using 99mTc-GH is original, rapid, non-invasive, reproducible and safe, although it cannot differentiate inflammatory processes caused by doses of turpentine that are very small and close to each other. It is conceivable that its application to humans will bring new insight into the ocular inflammatory process and response to therapy. (orig.)

  8. Quantification of the adrenal cortex hormones with radioimmunoassay

    Energy Technology Data Exchange (ETDEWEB)

    Badillo A, V.; Carrera D, A. A.; Ibarra M, C. M., E-mail: vbadillocren@hotmail.co [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Calle Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas (Mexico)

    2010-10-15

    The pathologies of the adrenal cortex -adrenal insufficiency and Cushing syndrome- originate in the deficit or hypersecretion of some of the hormones secreted by the adrenal cortex, which is divided into three anatomically defined zones: the outer zone, or zona glomerulosa, which is the main production site of aldosterone and other mineralocorticoids; the inner zone, or zona reticularis, which produces androgens; and the middle zone, or zona fasciculata, which is responsible for producing glucocorticoids. In this work, a quantitative analysis of those hormones and their pathologic triggers was made; the quantification was carried out in the laboratory by means of highly sensitive and specific techniques, in this case radioimmunoassay, in which the radioisotope I-125 is used. This technique is based on a biochemical bond-type reaction, because it requires a substance called the binder, which bonds to another called the ligand. This reaction is also known as the antigen-antibody (Ag-Ab) reaction, where the result depends on the quantity of antigen in the sample and on its affinity for the antibody. A study of 56 patients (13 men and 43 women) was made. The cortisol, ACTH, androsterone and DHEA values were very elevated in the majority of the cases corresponding to women, with cortisol predominating, while in men a notable elevation of 17 {alpha}-OH-PRG and DHEA-SO{sub 4} was observed. Based on that, we can conclude that 51 of the patients did not have major complications, because they went to the laboratory only once, while the remaining 5 had medical monitoring and visited the laboratory on more than one occasion, suggesting difficulty in their improvement. According to the results, an approximate women-to-men ratio of 8:2 becomes clear for the hormonal pathologies of the adrenal cortex. (Author)

  9. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto

    2012-02-01

    Surface charge plays an important role in membrane-based separations of particulates, macromolecules, and dissolved ionic species. In this study, we present two experimental methods to determine the concentration of negatively charged functional groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact with the membrane functional groups via complexation and electrostatic interaction, respectively, were used as probes. The amount of associated probes was quantified using liquid scintillation counting for uranium atoms and visible light spectroscopy for the toluidine blue dye. The techniques were validated using self-assembled monolayers of alkanethiols with known amounts of charged moieties. The surface density of negatively charged functional groups of hand-cast thin-film composite polyamide membranes, as well as commercial cellulose triacetate and polyamide membranes, was quantified under various conditions. Using both techniques, we measured a negatively charged functional group density of 20-30 nm⁻² for the hand-cast thin-film composite membranes. The ionization behavior of the membrane functional groups, determined from measurements with toluidine blue at varying pH, was consistent with published data for thin-film composite polyamide membranes. Similarly, the measured charge densities on commercial membranes were in general agreement with previous investigations. The relative simplicity of the two methods makes them a useful tool for quantifying the surface charge concentration of a variety of surfaces, including separation membranes. © 2011 Elsevier B.V.
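The conversion behind a groups-per-nm² figure is a simple unit calculation: moles of bound probe times Avogadro's number, divided by the membrane area in nm². A minimal sketch, assuming 1:1 probe-to-group stoichiometry (the 4 nmol/cm² input below is a hypothetical value, not a figure from the study):

```python
AVOGADRO = 6.022e23  # molecules per mole

def groups_per_nm2(probe_nmol, area_cm2):
    """Convert a measured amount of bound probe (nmol) on a membrane
    coupon of known area (cm^2) to a surface density in groups/nm^2,
    assuming each bound probe marks exactly one functional group."""
    molecules = probe_nmol * 1e-9 * AVOGADRO
    area_nm2 = area_cm2 * 1e14  # 1 cm^2 = 1e14 nm^2
    return molecules / area_nm2

# e.g. a hypothetical 4 nmol of toluidine blue bound per cm^2:
print(round(groups_per_nm2(4.0, 1.0), 2))  # 24.09 groups/nm^2
```

Note that 1 nmol/cm² works out to about 6.0 groups/nm², so densities of 20-30 nm⁻² correspond to only a few nmol of probe per cm² of membrane.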

  10. Quantification of glacial till chemical composition by reflectance spectroscopy

    International Nuclear Information System (INIS)

    Chemometric modelling of soil element concentrations from diffuse visible and near-infrared (VSWIR, 350–2500 nm) reflectance spectroscopic measurements holds potential for soil element analyses. Research has demonstrated it particularly for organic agricultural soils, yet little is known about the VSWIR response of glacial tills. Soils with low organic matter content developed on unstratified glacial materials were studied at two geologically similar sites on the mafic metavolcanic rocks of the Lapland Greenstone belt in northern Finland. The till samples (n = 217) were composed primarily of quartz, plagioclase and amphibole, with less than 3% clinochlore, talc and illite. VSWIR spectra of mineral powders, modelled with partial least squares regression (PLSR), enabled good prediction (R² = 0.80–0.89) of several soil chemical elements such as Al (validation RMSE 1802 mg kg−1), Ba (5.85 mg kg−1), Co (0.86 mg kg−1), Cr (6.94 mg kg−1), Cu (2.54 mg kg−1), Fe (2088 mg kg−1), Mg (449.6 mg kg−1), Mn (0.82 mg kg−1), Ni (3.24 mg kg−1), V (4.88 mg kg−1), and Zn (0.80 mg kg−1). The electronic and vibrational molecular processes causing absorption might be responsible for the accurate predictions of major elements such as Al, Fe and Mg. However, the concentrations of other major and trace elements could be predicted by the PLSR because they were cross-correlated with spectrally active soil elements or extraneous soil properties. Therefore, the applicability of the results is highly sample-set specific. Further, the results show that in local-scale studies of geologically fairly homogeneous areas, the limited spread of the data may restrict the use of the spectroscopic–chemometric approach. This paper demonstrates the capability of laboratory VSWIR spectroscopy for determining element concentrations of glacial tills. Further work should focus on overcoming the issues of sampling scale and understanding the causality of cross-correlation in quantification of the elements.

  11. Automated quantification reveals hyperglycemia inhibits endothelial angiogenic function.

    Directory of Open Access Journals (Sweden)

    Anthony R Prisco

    Full Text Available Diabetes Mellitus (DM) has reached epidemic levels globally. A contributing factor to the development of DM is high blood glucose (hyperglycemia). One complication associated with DM is decreased angiogenesis. The Matrigel tube formation assay (TFA) is the most widely utilized in vitro assay designed to assess angiogenic factors and conditions. In spite of the widespread use of Matrigel TFAs, quantification is labor-intensive and subjective, often limiting experimental design and interpretation of results. This study describes the development and validation of an open-source software tool for high-throughput, morphometric analysis of TFA images and the validation of an in vitro hyperglycemic model of DM. Endothelial cells mimic angiogenesis when placed onto a Matrigel-coated surface by forming tube-like structures. The goal of this study was to develop an open-source software algorithm requiring minimal user input (Pipeline v1.3) to automatically quantify tubular metrics from TFA images. Using Pipeline, the ability of endothelial cells to form tubes was assessed after culture in normal or high glucose for 1 or 2 weeks. A significant decrease in the total tube length and number of branch points was found when comparing groups treated with high glucose for 2 weeks versus normal glucose or 1 week of high glucose. Using Pipeline, it was determined that hyperglycemia inhibits formation of endothelial tubes in vitro. Analysis using Pipeline was more accurate and significantly faster than manual analysis. The Pipeline algorithm was shown to have additional applications, such as detection of retinal vasculature.
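The two metrics the abstract reports, total tube length and branch-point count, are typically read off a skeletonized binary image: length is the number of skeleton pixels, and a branch point is a skeleton pixel with three or more 8-connected skeleton neighbours. A minimal, dependency-free sketch of that counting step (not the Pipeline implementation itself, whose internals are not described in the abstract):

```python
def tube_metrics(skeleton):
    """Given a binary skeleton image (list of lists of 0/1), return
    (total_length_px, n_branch_points). A branch point is a skeleton
    pixel with three or more 8-connected skeleton neighbours."""
    rows, cols = len(skeleton), len(skeleton[0])
    length = sum(sum(row) for row in skeleton)
    branches = 0
    for r in range(rows):
        for c in range(cols):
            if not skeleton[r][c]:
                continue
            nbrs = sum(
                skeleton[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c)
            )
            if nbrs >= 3:
                branches += 1
    return length, branches

# A 'Y'-shaped tube: one branch point where the three arms meet
y = [
    [1, 0, 0, 0, 1],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]
print(tube_metrics(y))  # (7, 1)
```

In practice the skeleton would come from thresholding and thinning the TFA micrograph (e.g. with scikit-image's `skeletonize`); the counting logic above stays the same.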

  12. Robust quantification of polymerase chain reactions using global fitting.

    Directory of Open Access Journals (Sweden)

    Ana C Carr

    Full Text Available BACKGROUND: Quantitative polymerase chain reactions (qPCR) are used to monitor relative changes in very small amounts of DNA. One drawback to qPCR is reproducibility: measuring the same sample multiple times can yield data that is so noisy that important differences can be dismissed. Numerous analytical methods have been employed that can extract the relative template abundance between samples. However, each method is sensitive to baseline assignment and to the unique shape profiles of individual reactions, which gives rise to increased variance stemming from the analytical procedure itself. PRINCIPAL FINDINGS: We developed a simple mathematical model that accurately describes the entire PCR reaction profile using only two reaction variables that depict the maximum capacity of the reaction and feedback inhibition. This model allows quantification that is more accurate than existing methods and takes advantage of the brighter fluorescence signals from later cycles. Because the model describes the entire reaction, the influences of baseline adjustment errors, reaction efficiencies, template abundance, and signal loss per cycle could be formalized. We determined that the common cycle-threshold method of data analysis introduces unnecessary variance because of inappropriate baseline adjustments, a dynamic reaction efficiency, and also a reliance on data with a low signal-to-noise ratio. SIGNIFICANCE: Using our model, fits to raw data can be used to determine template abundance with high precision, even when the data contains baseline and signal loss defects. This improvement reduces the time and cost associated with qPCR and should be applicable in a variety of academic, clinical, and biotechnological settings.
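The idea of fitting the whole reaction profile rather than a cycle threshold can be illustrated with a simple logistic-style recursion, where a capacity term and a feedback-inhibition term play the roles of the paper's two reaction variables (this is a generic stand-in, not the authors' published model, and the grid-search fit below is a deliberately crude substitute for their global fitting):

```python
def pcr_profile(f0, fmax, eff, cycles):
    """Simulate a qPCR fluorescence profile with two reaction variables:
    maximum capacity (fmax) and feedback inhibition, via the logistic
    recursion F_{c+1} = F_c + eff * F_c * (1 - F_c / fmax)."""
    f, out = f0, []
    for _ in range(cycles):
        out.append(f)
        f = f + eff * f * (1.0 - f / fmax)
    return out

def fit_template(data, fmax, eff, cycles):
    """Recover the initial template signal f0 by grid search over
    candidate values, minimising squared error against the whole
    curve rather than a single threshold crossing."""
    best_f0, best_sse = None, float("inf")
    f0 = 1e-8
    while f0 < 1e-2:
        model = pcr_profile(f0, fmax, eff, cycles)
        sse = sum((m - d) ** 2 for m, d in zip(model, data))
        if sse < best_sse:
            best_f0, best_sse = f0, sse
        f0 *= 1.1
    return best_f0

true_f0 = 1e-6
data = pcr_profile(true_f0, 1.0, 0.9, 40)
est = fit_template(data, 1.0, 0.9, 40)
print(abs(est - true_f0) / true_f0 < 0.1)  # True: recovered within 10%
```

Because the fit uses every cycle, the bright, high signal-to-noise plateau region contributes to the estimate, which is the advantage the abstract claims over cycle-threshold analysis.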

  13. Uncertainty Quantification for CO2-Enhanced Oil Recovery

    Science.gov (United States)

    Dai, Z.; Middleton, R.; Bauman, J.; Viswanathan, H.; Fessenden-Rahn, J.; Pawar, R.; Lee, S.

    2013-12-01

    CO2-Enhanced Oil Recovery (EOR) is currently an option for permanently sequestering CO2 in oil reservoirs while increasing oil/gas production economically. In this study we have developed a framework for understanding CO2 storage potential within an EOR-sequestration environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. By coupling an EOR tool, SENSOR (CEI, 2011), with an uncertainty quantification tool, PSUADE (Tong, 2011), we conduct an integrated Monte Carlo simulation of water, oil/gas components and CO2 flow and reactive transport in the heterogeneous Morrow formation to identify the key controlling processes and optimal parameters for CO2 sequestration and EOR. A global sensitivity and response surface analysis is conducted with PSUADE to build numerically the relationship among CO2 injectivity, oil/gas production, reservoir parameters and distance between injection and production wells. The results indicate that the reservoir permeability and porosity are the key parameters controlling the CO2 injection and the oil and gas (CH4) recovery rates. The distance between the injection and production wells has a large impact on oil and gas recovery and net CO2 injection rates. The CO2 injectivity increases with increasing reservoir permeability and porosity. The distance between injection and production wells is the key parameter for designing an EOR pattern (such as a five- or nine-spot pattern). The optimal distance for a five-spot-pattern EOR at this site is estimated from the response surface analysis to be around 400 meters. Next, we are building the machinery into our risk assessment framework CO2-PENS to utilize these response surfaces and evaluate the operation risk for CO2 sequestration and EOR at this site.
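Reading an optimal well spacing off a response surface amounts to fitting a curve to sampled (spacing, recovery) points and locating its maximum. A minimal one-dimensional sketch with a quadratic fit through three samples (the recovery values below are hypothetical, not data from the Farnsworth study):

```python
def quad_vertex(points):
    """Fit y = a*d^2 + b*d + c exactly through three (distance,
    recovery) samples and return the vertex d* = -b / (2a): the well
    spacing that maximises recovery on the fitted response surface."""
    (x1, y1), (x2, y2), (x3, y3) = points
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    return -b / (2 * a)

# Hypothetical oil-recovery fractions at three well spacings (m)
samples = [(200.0, 0.30), (400.0, 0.45), (600.0, 0.35)]
print(round(quad_vertex(samples)))  # 420 (m), near the ~400 m estimate
```

PSUADE's actual response surfaces are multi-dimensional and built from many Monte Carlo runs; the one-variable quadratic here only shows the final optimisation step in miniature.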

  14. Quantification analysis in Tc-99m MIBI myocardial perfusion scintigraphy

    International Nuclear Information System (INIS)

    Aims: Technetium-99m MIBI myocardial perfusion scintigraphy is a routinely employed nuclear medicine procedure. This study was carried out to obtain additional information in terms of the lung-heart ratio (LHR) and right ventricular index (RVI) by computer-assisted quantification analysis of this procedure. Material and Methods: Fifty diagnosed cases of coronary artery disease (CAD) underwent 99mTc-MIBI planar studies at stress and rest. A group of 15 subjects with low pre-test likelihood of CAD and normal exercise and rest 99mTc-MIBI images was used as control. LHR was calculated from the static images in the anterior view. A circular region of interest (ROI) of about 8 pixels in diameter was selected in the left lung area at maximal count density, as assessed visually. A similar ROI was drawn on the left ventricular wall at the area of maximal count density, as assessed visually. The ratio of the counts in the lung ROI to the counts in the myocardial ROI was expressed as the lung-heart ratio, or 'lung index': LHR = average counts in lung ROI / average counts in left myocardial ROI. The right ventricular index (RVI) was determined from the static images in LAO 45° views. ROIs were drawn on the right ventricle (RV) and on the left ventricle (LV) at the areas of maximal counts, as assessed visually. The ratio of the counts in the two ROIs gave the right ventricular index: RVI = average counts in RV ROI / average counts in LV ROI. Results: A close correlation was noted in the findings of three independent observers. In the coronary artery disease group (CAD group), stress was induced by treadmill exercise or dipyridamole infusion. The CAD group showed a higher LHR at stress and at rest than controls (Student's t-test comparison of patients versus controls). Quantification of 99mTc-MIBI myocardial perfusion images provides reproducible and clinically useful information regarding left ventricular function in CAD patients
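The LHR and RVI definitions above are both ratios of mean counts in two regions of interest. A minimal sketch of that computation on a toy planar image (the pixel values and rectangular ROIs are hypothetical; the study used circular ROIs placed visually):

```python
def roi_mean(image, roi):
    """Average counts within a rectangular ROI given as
    (row0, row1, col0, col1), end-exclusive."""
    r0, r1, c0, c1 = roi
    vals = [image[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    return sum(vals) / len(vals)

def lung_heart_ratio(image, lung_roi, myo_roi):
    """LHR = mean counts in lung ROI / mean counts in LV myocardial ROI."""
    return roi_mean(image, lung_roi) / roi_mean(image, myo_roi)

# Toy 4x4 anterior-view image: left columns 'lung', right 'myocardium'
img = [
    [20, 20, 100, 100],
    [20, 20, 100, 100],
    [25, 25, 110, 110],
    [25, 25, 110, 110],
]
lhr = lung_heart_ratio(img, (0, 4, 0, 2), (0, 4, 2, 4))
print(round(lhr, 3))  # 0.214
```

RVI is computed identically, with the two ROIs placed on the RV and LV in the LAO 45° view instead.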

  15. Gas Dynamics during Thermal Remediation: Visualization, Quantification and Enhancement

    Science.gov (United States)

    Mumford, K. G.; Hegele, P. R.

    2014-12-01

    In situ thermal treatment (ISTT) technologies, such as electrical resistance heating (ERH) and thermal conductive heating (TCH), rely on the in situ production of a gas phase composed of steam and vaporized volatile organic compounds (VOCs). This gas phase must be captured, extracted, and processed in an aboveground treatment system to meet remediation objectives. When used to treat volatile non-aqueous phase liquids (NAPLs), gases can be created at temperatures below the boiling points of both the groundwater and the NAPL, in a process commonly referred to as co-boiling, and vaporized VOCs can condense if gases are transported to colder regions or are not captured before thermal treatment has stopped. As such, an understanding of gas formation, connection, and flow is important for the design and operation of ISTT technologies. A recent series of laboratory experiments focused on the visualization and quantification of gas dynamics during water boiling and NAPL-water co-boiling, and the investigation of potential NAPL redistribution. Experiments were conducted in a sand-packed glass-walled chamber (40 cm tall × 20 cm wide × 1 cm thick) heated by electrical resistance. Temperatures and electric currents were measured, and digital images were captured throughout the experiments to quantify gas saturations using light transmission techniques. Additional experiments also investigated the exsolution of dissolved gas as a technique to enhance gas production at lower temperatures. Results showed the development of disconnected and connected gas flow regimes, with disconnected flow occurring at early times and during co-boiling. Results also showed the potential for NAPL redistribution due to displacement by gas formed within pools, and due to condensation in colder regions. These results highlight the need to carefully consider gases in the design of ISTT heating and gas extraction systems to ensure remediation performance.

  16. Dating and quantification of erosion processes based on exposed roots

    Science.gov (United States)

    Stoffel, Markus; Corona, Christophe; Ballesteros-Cánovas, Juan Antonio; Bodoque, José Maria

    2013-08-01

    Soil erosion is a key driver of land degradation and heavily affects sustainable land management in various environments worldwide. An appropriate quantification of rates of soil erosion and a localization of hotspots are therefore critical, as sediment loss has been demonstrated to have drastic consequences on soil productivity and fertility. A consistent body of evidence also exists for a causal linkage between global changes and the temporal frequency and magnitude of erosion, and thus calls for an improved understanding of dynamics and rates of soil erosion for an appropriate management of landscapes and for the planning of preventive or countermeasures. Conventional measurement techniques to infer erosion rates are limited in their temporal resolution or extent. Long-term erosion rates in larger basins have been analyzed with cosmogenic nuclides, but with lower spatial and limited temporal resolutions, thus limiting the possibility to infer micro-geomorphic and climatic controls on the timing, amount and localization of erosion. If based on exposed tree roots, rates of erosion can be inferred with up to seasonal resolution, over decades to centuries of the past and for larger surfaces with homogenous hydrological response units. Root-based erosion rates, thus, constitute a valuable alternative to empirical or physically-based approaches, especially in ungauged basins, but will be controlled by individual or a few extreme events, so that average annual rates of erosion might be highly skewed. In this contribution, we review the contribution made by this biomarker to the understanding of erosion processes and related landform evolution. We report on recent progress in root-based erosion research, illustrate possibilities, caveats and limitations of reconstructed rates, and conclude with a call for further research on various aspects of root-erosion research and for work in new geographic regions.

  17. Simultaneous quantification and qualification of synacthen in plasma.

    Science.gov (United States)

    Chaabo, Ayman; de Ceaurriz, Jacques; Buisson, Corinne; Tabet, Jean-Claude; Lasne, Françoise

    2011-02-01

    Tetracosactide (Synacthen), a synthetic analogue of adrenocorticotropic hormone (ACTH), can be used as a doping agent to increase the secretion of glucocorticoids by the adrenal glands. The only published method for anti-doping control of this drug in plasma relies on purification by immunoaffinity chromatography and LC/MS/MS analysis. Its limit of detection is 300 pg/mL, which corresponds to the peak value observed 12 h after 1 mg Synacthen IM administration. We report here a more sensitive method based on preparation of plasma by cation exchange chromatography and solid-phase extraction, and analysis by LC/MS/MS with positive-mode electrospray ionization using 7-38 ACTH as internal standard. Identification of Synacthen was performed using two product ions, m/z 671.5 and m/z 223.0, from the parent [M + 5H](5+) ion, m/z 587.4. The recovery was estimated at 70%. A linear calibration curve was obtained from 25 to 600 pg/mL (R² > 0.99). The lower limit of detection was 8 pg/mL (S/N > 3). The lower limit of quantification was 15 pg/mL (S/N > 10; CV% < 20%). The performance of the method was illustrated by an 8-h kinetic analysis of plasma samples from nine subjects who received IM injections of either Synacthen® (five subjects) or Synacthen® Depot, the slow-release form of the drug (four subjects). Concentrations of Synacthen between 16 and 310 pg/mL were observed. A sensitive method for quantitation of Synacthen in plasma is proposed for anti-doping control analyses. PMID:21170520

  18. Quantification of plant chlorophyll content using Google Glass.

    Science.gov (United States)

    Cortazar, Bingen; Koydemir, Hatice Ceylan; Tseng, Derek; Feng, Steve; Ozcan, Aydogan

    2015-04-01

    Measuring plant chlorophyll concentration is a well-known and commonly used method in agriculture and environmental applications for monitoring plant health, and it also correlates with many other plant parameters, including carotenoids, nitrogen and maximum green fluorescence. Direct chlorophyll measurement using chemical extraction is destructive, complex and time-consuming, which has led to the development of mobile optical readers, providing non-destructive but relatively expensive tools for evaluation of plant chlorophyll levels. Here we demonstrate accurate measurement of chlorophyll concentration in plant leaves using Google Glass and a custom-developed software application together with a cost-effective leaf holder and multi-spectral illuminator device. Two images, taken using Google Glass, of a leaf placed in our portable illuminator device under red and white (i.e., broadband) light-emitting-diode (LED) illumination are uploaded to our servers for remote digital processing and chlorophyll quantification, with results returned to the user in less than 10 seconds. Intensity measurements extracted from the uploaded images are mapped against gold-standard colorimetric measurements made with a commercially available reader to generate calibration curves for plant leaf chlorophyll concentration. Using five plant species to calibrate our system, we demonstrate that our approach can accurately and rapidly estimate the chlorophyll concentration of fifteen different plant species under both indoor and outdoor lighting conditions. This Google Glass-based chlorophyll measurement platform can display the results in spatiotemporal and tabular forms and would be highly useful for monitoring of plant health in environmental and agriculture-related applications, including urban plant monitoring, indirect measurement of the effects of climate change, and early indication of water, soil, and air quality degradation. PMID:25669673
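The calibration step described above, mapping image intensity measurements against gold-standard readings, is at its simplest an ordinary least-squares line fit. A minimal sketch (the intensity ratios and chlorophyll readings below are hypothetical illustration values, not the paper's calibration data):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration: red/white image-intensity ratios vs
# gold-standard chlorophyll readings from a commercial reader
ratios = [0.20, 0.35, 0.50, 0.65, 0.80]
chloro = [45.0, 37.0, 30.0, 22.0, 15.0]
m, b = linear_fit(ratios, chloro)

def estimate_chlorophyll(ratio):
    """Map a new leaf's intensity ratio to chlorophyll via the curve."""
    return m * ratio + b

print(round(estimate_chlorophyll(0.50), 1))  # 29.8
```

In the actual platform a separate curve would be generated per illumination condition and validated across species, but the mapping applied to each new leaf image is this same inversion of a fitted calibration line.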

  19. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified
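Propagating failure mode probabilities to the system level through a fault tree reduces, for independent events, to combining probabilities through AND and OR gates. A minimal sketch (the architecture and probabilities below are hypothetical, not taken from FASRE):

```python
def and_gate(probs):
    """System fails only if all inputs fail (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """System fails if any input fails: 1 - prod(1 - p_i)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical architecture: each function fails if either of its two
# failure modes occurs (OR); the system fails only if both redundant
# functions fail (AND).
func_a = or_gate([0.01, 0.02])    # failure modes of function A
func_b = or_gate([0.015, 0.005])  # failure modes of function B
system = and_gate([func_a, func_b])
print(round(system, 6))  # 0.000594
```

In a Bayesian framework like the one described, the inputs would be posterior failure mode probabilities updated from testing and historical data rather than fixed point values, but the gate algebra is unchanged.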

  20. An anatomically realistic brain phantom for quantification with positron tomography

    International Nuclear Information System (INIS)

    Phantom studies are useful in assessing and maximizing the accuracy and precision of quantification of absolute activity, assessing errors associated with patient positioning, and dosimetry. Most phantoms are limited by the use of simple shapes, which do not adequately reflect real anatomy. The authors have constructed an anatomically realistic, life-size brain phantom for positron tomography studies. The phantom consists of separately fillable R + L caudates, R + L putamens, R + L globus pallidus and cerebellum. These structures are contained in proper anatomic orientation within a fillable cerebrum. Solid ventricles are also present. The entire clear vinyl cerebrum is placed in a human skull. The internal brain structures were fabricated from polyester resin, with dimensions, shapes and sizes of the structures obtained from digitized contours of brain slices in the U.C.S.D. computerized brain atlas. The structures were filled with known concentrations of Ga-68 in water and scanned with our NeuroECAT. The phantom was aligned in the scanner for each structure such that the tomographic slice passed through that structure's center. After calibration of the scanner with a standard phantom for counts/pixel to uCi/cc conversion, the measured activity concentrations were compared with the actual concentrations. The ratio of measured to actual activity concentration (''recovery coefficient'') for the caudate was 0.33; for the putamen, 0.42. For comparison, the ratio for spheres of diameters 9.5, 16, 19 and 25.4 mm was 0.23, 0.54, 0.81, and 0.93, respectively. This phantom provides a more realistic assessment of performance and allows calculation of correction factors.
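The correction factors mentioned at the end follow directly from the recovery coefficients: dividing a measured concentration by the phantom-derived RC gives a first-order partial-volume-corrected estimate. A minimal sketch using the caudate RC of 0.33 reported above (the 2.1 uCi/cc measurement is a hypothetical example value):

```python
def recovery_coefficient(measured, actual):
    """RC = measured / true activity concentration for a structure."""
    return measured / actual

def partial_volume_correct(measured, rc):
    """First-order partial-volume correction using a phantom-derived RC."""
    return measured / rc

# Phantom-derived RC for the caudate from the study
rc_caudate = recovery_coefficient(0.33, 1.0)

# A hypothetical caudate measuring 2.1 uCi/cc would be corrected to:
print(round(partial_volume_correct(2.1, rc_caudate), 2))  # 6.36
```

The sphere data in the abstract (RC rising from 0.23 at 9.5 mm to 0.93 at 25.4 mm) shows why structure-specific RCs from a realistic phantom are preferable to sphere-based ones for small, irregular structures like the caudate.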