WorldWideScience

Sample records for amplicon quantification maq

  1. Integrated Microfluidic Nucleic Acid Isolation, Isothermal Amplification, and Amplicon Quantification

    Directory of Open Access Journals (Sweden)

    Michael G. Mauk

    2015-10-01

    Full Text Available Microfluidic components and systems for rapid (<60 min), low-cost, convenient, field-deployable sequence-specific nucleic acid-based amplification tests (NAATs) are described. A microfluidic point-of-care (POC) diagnostic test to quantify HIV viral load from blood samples serves as a representative and instructive example for discussing the technical issues and capabilities of “lab on a chip” NAAT devices. A portable, miniaturized POC NAAT with performance comparable to conventional PCR (polymerase chain reaction)-based tests in clinical laboratories can be realized with a disposable, palm-sized, plastic microfluidic chip in which: (1) nucleic acids (NAs) are extracted from relatively large (~mL) volumes of sample lysate using an embedded porous silica glass fiber or cellulose binding phase (“membrane”) that captures sample NAs in a flow-through, filtration mode; (2) NAs captured on the membrane are isothermally (~65 °C) amplified; (3) amplicon production is monitored by real-time fluorescence detection, such as with a smartphone CCD camera serving as a low-cost detector; and (4) paraffin-encapsulated, lyophilized reagents for temperature-activated release are pre-stored in the chip. Limits of detection (LOD) better than 10³ virions/sample can be achieved. A modified chip with conduits hosting a diffusion-mode amplification process provides a simple visual indicator for readily quantifying sample NA template. In addition, a companion microfluidic device for extracting plasma from whole blood without a centrifuge, generating cell-free plasma for chip-based molecular diagnostics, is described. Extensions to a myriad of related applications, including food testing, cancer screening, and insect genotyping, are briefly surveyed.
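    The real-time fluorescence read-out in step (3) ultimately reduces to measuring a time-to-threshold: the earlier the background-subtracted signal crosses a preset level, the more template was present. A minimal sketch of that calculation, using an invented trace and threshold rather than data from the device described above:

```python
# Sketch of threshold-time (Tt) read-out for a real-time isothermal
# amplification trace, e.g. as captured frame-by-frame by a phone camera.
# All trace values and the threshold are illustrative.

def time_to_threshold(times, fluorescence, threshold):
    """Return the first time at which the background-subtracted signal
    crosses the threshold, linearly interpolated; None if it never does."""
    baseline = sum(fluorescence[:3]) / 3.0          # early-time background
    signal = [f - baseline for f in fluorescence]
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i]:
            # linear interpolation between the two sampled points
            frac = (threshold - signal[i - 1]) / (signal[i] - signal[i - 1])
            return times[i - 1] + frac * (times[i] - times[i - 1])
    return None

# Toy trace: flat background followed by a sigmoidal rise
times = list(range(0, 60, 5))                        # minutes
fluo = [10, 10, 11, 12, 20, 60, 150, 230, 260, 270, 272, 273]
tt = time_to_threshold(times, fluo, threshold=100.0)
print(round(tt, 2))   # prints 27.8
```

    In practice the threshold time would be mapped back to template quantity through a calibration series, analogous to Ct in conventional qPCR.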

  2. Amplicon sequencing for the quantification of spoilage microbiota in complex foods including bacterial spores

    NARCIS (Netherlands)

    Boer, de P.; Caspers, M.; Sanders, J.W.; Kemperman, R.; Wijman, J.; Lommerse, G.; Roeselers, G.; Montijn, R.; Abee, T.; Kort, R.

    2015-01-01

    Background
    Spoilage of food products is frequently caused by bacterial spores and lactic acid bacteria. Identification of these organisms by classic cultivation methods is limited by their ability to form colonies on nutrient agar plates. In this study, we adapted and optimized 16S rRNA amplicon

  3. Mobile Air Quality Studies (MAQS) - an international project

    Directory of Open Access Journals (Sweden)

    Sudik Claudia

    2010-04-01

    Full Text Available Abstract Due to increasing awareness of the potential hazardousness of air pollutants, new laws, rules and guidelines have recently been implemented globally. In this respect, numerous studies have so far addressed traffic-related exposure to particulate matter using stationary technology. By contrast, only a few studies have used the advanced technology of mobile exposure analysis. The Mobile Air Quality Study (MAQS) addresses the issue of air pollutant exposure by combining advanced high-granularity spatio-temporal analysis with vehicle-mounted, person-mounted and roadside sensors. The MAQS platform will be used by international collaborators (1) to assess air pollutant exposure in relation to road structure, (2) to assess air pollutant exposure in relation to traffic density, (3) to assess air pollutant exposure in relation to weather conditions, (4) to compare exposure within vehicles between front and back seat (children) positions, and (5) to evaluate "traffic zone" exposure in relation to non-"traffic zone" exposure. Primarily, the MAQS platform will focus on particulate matter. With the establishment of advanced mobile analysis tools, it is planned to extend the analysis to other pollutants including NO2, SO2, nanoparticles and ozone.

  4. The Maqȃmȃt of Sufism and Anti-Corruption Therapy

    Directory of Open Access Journals (Sweden)

    Supian Ramli

    2017-07-01

    Full Text Available The goal of this paper is to find a comprehensive solution to corrupt behavior through the approach of Sufi teachings, especially maqȃmȃt. In this context, the author uses the thinking, understanding, and practice of Sufi values as an alternative means of eradicating corruption, through a critical analysis of all the problems that foster corrupt behavior, and by revitalizing the role and function of Sufism in this era. The paper concludes that maqȃmȃt tasawuf in modern life can be an alternative for eradicating corruption. Therefore, every element of society and government needs to promote the importance of the spirit of Sufism, upholding high moral and legal awareness in managing various endeavors, government and all aspects of this life, and setting aside the materialistic and hedonistic life that is a major obstacle to the absorption of Sufi values in our lives. Keywords: Sufism, Eradicating Corruption, Maqȃmȃt. Abstract: This paper aims to find a comprehensive solution to corrupt behavior through the teachings of Sufism, particularly maqȃmȃt. In this context, the author adopts an approach based on the thinking, understanding and practice of Sufi values as an alternative for eradicating corruption, through a critical examination of all the problems that allow corrupt behavior to flourish, and by revitalizing the role and function of Sufi thought in present-day life. The paper concludes that maqȃmȃt tasawuf in modern life can be an alternative in the eradication of corruption. Every element of society and government therefore needs to socialize the importance of the spirit of Sufism, uphold high morals and legal awareness in managing various endeavors, government and all aspects of this life, and set aside the materialistic and hedonistic life that constitutes one factor

  5. Measuring the performance of Islamic banks using a maqāṣid-based model

    Directory of Open Access Journals (Sweden)

    Mustafa Omar Mohammed

    2015-12-01

    Full Text Available The vision and mission of Islamic banks are supposed to reflect the adherence of their activities and aspirations to Maqāṣid al-Sharī‘ah. However, there are contentions that Islamic banks have been converging towards the conventional banking system. Efforts have been expended to reverse the tide and harmonise Islamic banking with its Sharī‘ah objectives. Hitherto, the existing conventional yardsticks have failed to measure the impact of the harmonisation exercise on Islamic banks’ performance. Therefore, using a maqāṣid-based yardstick to measure the performance of Islamic banks becomes imperative. This study makes use of al-Imām al-Ghazālī’s theory of Maqāṣid al-Sharī‘ah and Ibn ‘Āshūr’s reinterpretation, adopting content analysis and Sekaran’s (2000) behavioral science methods to develop a Maqāṣid Based Performance Evaluation Model (MPEM) to measure the performance of Islamic banks. Experts’ opinions have validated the model and its acceptability. Suggestions are provided to policy makers and future research.
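    The abstract does not disclose the model's actual dimensions or weights, but Sekaran-style operationalization generally decomposes each objective into weighted elements, each measured by a performance ratio. A generic sketch of such a weighted index; all names and numbers below are illustrative placeholders, not the published MPEM values:

```python
# Generic sketch of a maqāṣid-style weighted performance index:
# objectives -> weighted elements -> measured ratios.
# Weights and ratios are invented placeholders.

def maqasid_index(objectives):
    """objectives: {name: (obj_weight, [(elem_weight, ratio), ...])}
    Returns the overall weighted score."""
    total = 0.0
    for obj_weight, elements in objectives.values():
        elem_score = sum(w * r for w, r in elements)   # weighted element ratios
        total += obj_weight * elem_score               # weighted objective score
    return total

sample = {
    "education":      (0.30, [(0.5, 0.20), (0.5, 0.10)]),
    "justice":        (0.40, [(1.0, 0.25)]),
    "public_welfare": (0.30, [(0.6, 0.15), (0.4, 0.05)]),
}
print(round(maqasid_index(sample), 4))   # prints 0.178
```

    Banks can then be ranked on such a score, with each ratio computed from published financial statements.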

  6. Application of Sharī‘ah contracts in contemporary Islamic finance: A maqāṣid perspective

    Directory of Open Access Journals (Sweden)

    Younes Soualhi

    2015-12-01

    Full Text Available This research exposes the underlying maqāṣid embedded in Sharī‘ah contracts as applied in Islamic banking and finance. It addresses the problem of maqāṣid not being observed in nominated and combined Sharī‘ah contracts, as well as the problem of maqāṣid not being sufficiently imbued in products developed by Islamic financial institutions. As a benchmark for the maqāṣid of wealth, the research adopts Ibn ‘Āshūr’s classification of maqāṣid to evaluate the conformity of Sharī‘ah contracts to Maqāṣid al-Sharī‘ah, namely justice, circulation, transparency, and firmness. The study focuses on three markets related to the application of Sharī‘ah contracts: banking, the Islamic capital market, and takāful. The study concludes that, by and large, the application of Sharī‘ah contracts observed Maqāṣid al-Sharī‘ah during the development and initial application stages of Islamic finance products; however, offering such products in the market has raised economic questions as to their viability and economic value. In addition, the malpractice of some Sharī‘ah contracts has long raised concerns as to the maqāṣid compliance of such products. The research recommends a de-sophistication of Islamic financial engineering to minimise the possibility of convergence with conventional finance. It also emphasises product differentiation based on less complicated combined Sharī‘ah contracts.

  7. Maqâsid al-Qur’ân and the Deradicalization of Interpretation in the Indonesian Context

    Directory of Open Access Journals (Sweden)

    Ulya Fikriyati

    2015-09-01

    Full Text Available This article deals with the viewpoint that the reading of the Qur’ân by one generation is subject to criticism by the following generation. As an example, it seeks to offer the deradicalization of interpretations of the so-called ‘radical’ verses of the Qur’ân in the Indonesian context. Islam in Indonesia constantly interacts with various races, ethnicities, religions and beliefs, and therefore requires a type of exegesis different from that of other regions such as the Middle East. For the deradicalization of interpretation, this article offers what is called the maqâsid al-Qur’ân as its parameter. The maqâsid al-Qur’ân consists of seven points: (1) Hifz al-dîn wa tatwîr wasâilih, (2) Hifz al-nafs wa tatwîruhâ, (3) Hifz al-‘aql wa tatwîruh, (4) Hifz al-mâl wa tanmîyat wasâilih, (5) Hifz al-‘ird wa tatwîr al-wasâil li al-husûl ‘alayh, (6) Tahqîq al-huqûq al-insânîyah wa mâ yandarij tahtahâ, (7) Hifz al-‘âlam wa ‘imâratuhâ. As spirit and parameter, the maqâsid al-Qur’ân necessitates a dialectic of dynamic interpretation without any judgment of infidelity or heresy. If a certain reading of the Qur’anic verses deviates from these seven maqâsid al-Qur’ân, it deserves to be examined further, but not to be immediately suppressed.

  8. Removing Noise From Pyrosequenced Amplicons

    Directory of Open Access Journals (Sweden)

    Davenport Russell J

    2011-01-01

    Full Text Available Abstract Background In many environmental genomics applications a homologous region of DNA from a diverse sample is first amplified by PCR and then sequenced. The next-generation sequencing technology, 454 pyrosequencing, has allowed much larger read numbers from PCR amplicons than ever before. This has revolutionised the study of microbial diversity, as it is now possible to sequence a substantial fraction of the 16S rRNA genes in a community. However, there is a growing realisation that, because of the large read numbers and the lack of consensus sequences, it is vital to distinguish noise from true sequence diversity in these data; otherwise, estimates of the number of types or operational taxonomic units (OTUs) present become inflated. Three sources of error are important: sequencing error, PCR single-base substitutions and PCR chimeras. We present AmpliconNoise, a development of the PyroNoise algorithm that is capable of separately removing 454 sequencing errors and PCR single-base errors. We also introduce a novel chimera removal program, Perseus, that exploits the sequence abundances associated with pyrosequencing data. We use data sets where samples of known diversity have been amplified and sequenced to quantify the effect of each of the sources of error on OTU inflation and to validate these algorithms. Results AmpliconNoise outperforms alternative algorithms, substantially reducing per-base error rates for both the GS FLX and the latest Titanium protocol. All three sources of error lead to inflation of diversity estimates. In particular, chimera formation has a hitherto unrealised importance which varies according to amplification protocol. We show that AmpliconNoise allows accurate estimates of OTU number. Just as importantly, AmpliconNoise generates the right OTUs even at low sequence differences. We demonstrate that Perseus has very high sensitivity, able to find 99% of chimeras, which is critical when these are present at high
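    Perseus is described as exploiting sequence abundances: a chimera must be assembled from parent sequences that were more abundant than itself earlier in the PCR. A deliberately simplified sketch of that abundance test, using exact prefix/suffix matching where a real implementation would use alignment and statistical scoring; all sequences and abundances are invented:

```python
# Abundance-aware chimera check in the spirit of Perseus (simplified):
# a read is flagged if some split point makes its prefix match one
# more-abundant sequence and its suffix match another.

def is_chimera(query, q_abund, references):
    """references: {sequence: abundance}. Candidate parents must be more
    abundant than the query, the same length, and distinct from it."""
    parents = [s for s, a in references.items()
               if a > q_abund and len(s) == len(query) and s != query]
    for split in range(1, len(query)):
        left_ok = any(p[:split] == query[:split] for p in parents)
        right_ok = any(p[split:] == query[split:] for p in parents)
        if left_ok and right_ok:
            return True   # prefix and suffix each explained by a parent
    return False

refs = {"AAAATTTT": 500, "CCCCGGGG": 400}
print(is_chimera("AAAAGGGG", 10, refs))   # True: half from each parent
print(is_chimera("AAAATTTT", 10, refs))   # False: identical to one parent
```

    Because a single parent matching both halves at the same split would have to equal the query, reads identical to an abundant sequence are never flagged.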

  9. Investigating Meat Milling Business in Yogyakarta: A Maqâshid al-Syarî‘ah Perspective

    Directory of Open Access Journals (Sweden)

    Zein Muttaqin

    2017-12-01

    Full Text Available Investigating Meat Milling Business in Yogyakarta: A Maqâshid al-Syarî‘ah Perspective. This article aims to investigate the meat milling business process in light of maqâshid al-syarî‘ah. The topic arises from past cases in which lawful and unlawful meat were mixed during milling in the industry, while demand for meat milling services is quite high according to BAPPENAS Yogyakarta data. The lack of supervision has threatened consumer protection and deviates from the sharia injunctions. Maqâshid al-syarî‘ah is considered the basis of sharia compliance, serving to explain the purposes of sharia law in daily life in the form of prohibitions and injunctions. A qualitative method and content analysis were used to interpret the responses of seven local meat milling shop owners regarding the manifestation of maqâshid al-syarî‘ah in their businesses. It can be concluded that maqâshid al-syarî‘ah has been implemented accordingly in the meat milling business in Yogyakarta at the levels of input, process and output. Although several respondents did not abide by Islamic law, most of the businessmen do. Halal logistics and supply chains are needed to assure the legal aspects of the business and as a milestone for expanding local meat milling to compete with bigger competitors such as supermarkets.

  10. Problems of Islamic Education from the Perspective of Maqâṣid Sharîʻah

    Directory of Open Access Journals (Sweden)

    Rosidin Rosidin

    2016-11-01

    Full Text Available The article seeks to reveal problems faced by Islamic education from the perspective of Maqâṣid al-Sharîʻah, namely the ideological (ḥifẓ al-dîn), practical (ḥifẓ al-nafs), academic (ḥifẓ al-‘aql), relational or networking (ḥifẓ al-nasl), vocational (ḥifẓ al-mâl) and quality (ḥifẓ al-‘irḍ) aspects. In order to identify the crucial problems of Islamic education along with their alternative solutions, this paper suggests that Islamic education should consider the insider’s and outsider’s perspectives, which comprise four types of roles: (a) complete observer; (b) observer as participant; (c) participant as observer; (d) complete participant. This article proposes three important implications. First, the problems of Islamic education can be categorized and mapped based on Maqâṣid al-Sharîʻah, in order to ease their analysis. Second, the analysis of the problems should be based on insiders’ and outsiders’ points of view, so that it can comprehensively detect the problems of Islamic education and show its inclusiveness. Third, problem-solving alternatives should be oriented at the theological (faith and religion [îmân]), theoretical (philosophical and empirical [‘ilm]), practical (learning and teaching [‘amal]) and moral (ethics and aesthetics [akhlâq]) levels.

  11. Al-‘Alāqah baina Ushūl al-Fiqh wa Maqāshidi al-Sharīah wa al-Da’wah ilā Ta’sīsi ‘Ilmi al-Maqāshid (The Relationship between Ushūl al-Fiqh and Maqāṣid al-Sharī‘ah, and the Call to Establish a Science of Maqāṣid)

    Directory of Open Access Journals (Sweden)

    Anggraini Binti Ramli

    2016-12-01

    Full Text Available The study of maqāshid sharīa is an important point in the discussion of Islamic legal theory (ushūl al-fiqh). Serious debates began to emerge in the 19th century among Islamic jurists concerning the position of maqāshid sharīa. There are at least three important debates in this history: first, whether maqāshid is part of the discussion of ushūl al-fiqh; second, whether maqāshid sharīa is built upon the foundation of classical Islamic jurisprudence (fiqh); and third, whether the study of maqāshid sharīa can become an independent science separate from the study of classical Islamic jurisprudence. This article presents a discussion of these three paradigms by employing a descriptive-analytic method. The results of this study show that the study of maqāshid sharīa is like two sides of one coin: theoretically it is distinct from ushūl al-fiqh, but the two cannot be separated from one another. Ushūl al-fiqh has become the foundation for learning more about the study of maqāshid sharīa. The separation between classical Islamic jurisprudence (fiqh) and the study of maqāshid sharīa made by Islamic jurists is a relative separation.

  12. Freedom, Sufism, maqâm hurrîyah, hâl hurrîyah.

    Directory of Open Access Journals (Sweden)

    Ah. Haris Fakhrudi

    2015-09-01

    Full Text Available This paper discusses the meaning of freedom in the discourse of Sufi thought, especially that of Ibn ‘Arabî. This is based on the consideration that Sufism before Ibn ‘Arabî was more focused on a ritualistic orientation for students and only revealed variants of Sufi expression, both maqâmât and ahwâl. The arrival of Ibn ‘Arabî therefore became a turning point in the discourse of Sufism, as he expressed his beliefs in theoretical formulations. The doctrine of Sufism, previously only implicit in the words of the Sufi shaykhs, became in the hands of Ibn ‘Arabî open, theoretical, and obvious, and thus opened the door for anyone with the intelligence to reflect upon and realize metaphysical theories in operational forms. This article therefore discusses some of the key concepts in the thought of Ibn ‘Arabî, including the meaning of freedom (al-hurrîyah) in Sufism, maqâm hurrîyah, and hâl hurrîyah as attained by the Sufis during their spiritual journey.

  13. Execution of the Death Penalty: A Review of Maqāṣid al-Sharī’ah and Justice

    Directory of Open Access Journals (Sweden)

    Imam Yahya

    2013-04-01

    Full Text Available The debate about the death penalty still attracts public attention. There are at least two mainstream positions: those who agree with and those who reject the imposition of the death penalty. Those who agree reason that severe violations of the right to life should be punished by death so as to provide a deterrent effect, while those who object argue that the death penalty is a denial of human rights, especially the right to life. In essence, the death penalty is not a violation of the law, because it is actually enforced in order to protect human rights itself. In the view of Islamic law, the death penalty can be imposed in four cases, namely adultery, intentional killing, hirabah and apostasy. Furthermore, the death penalty should be carried out in accordance with maqāṣid al-sharī'ah and justice. In the maqāṣid al-sharī'ah perspective, the purpose of the death penalty should refer to maintaining religion (ḥifẓ al-dīn), life or survival (ḥifẓ al-nafs), mind (ḥifẓ al-'aql), descent (ḥifẓ al-nasl), and property (ḥifẓ al-māl). From the perspective of justice, the State, on behalf of the law, must protect its citizens from legal events that harm society.

  14. Proposed Methodology for Application of Human-like gradual Multi-Agent Q-Learning (HuMAQ) for Multi-robot Exploration

    International Nuclear Information System (INIS)

    Ray, Dip Narayan; Majumder, Somajyoti

    2014-01-01

    Several attempts have been made by researchers around the world to develop autonomous exploration techniques for robots, but developing algorithms for unstructured and unknown environments has always been an important issue. Human-like gradual Multi-agent Q-learning (HuMAQ) is a technique developed for autonomous robotic exploration in unknown (and even unimaginable) environments. It has been successfully implemented in a multi-agent single-robot system. HuMAQ uses the concept of Subsumption architecture, a well-known behaviour-based architecture, for prioritizing the agents of the multi-agent system, and executes only the most common action out of all the different actions recommended by the different agents. Instead of using a new state-action table (Q-table) each time, HuMAQ reuses the immediate past table for efficient and faster exploration. The proof of learning has been established both theoretically and practically. HuMAQ has the potential to be used in different and difficult situations as well as applications. The same architecture has been modified for multi-robot exploration of an environment. Apart from the agents used in the single-robot system, agents for inter-robot communication and coordination/cooperation with other similar robots have been introduced in the present research. The current work uses a series of indigenously developed, identical autonomous robotic systems communicating with each other through the ZigBee protocol
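    The core of any Q-learning agent, including each agent in a HuMAQ-style system, is the tabular update rule, and the passage's point about reusing the immediate past table corresponds to warm-starting each episode from the previous Q-table rather than a fresh one. A minimal sketch on an invented four-cell corridor world; the environment, rewards and parameters are illustrative, not those of HuMAQ:

```python
# Minimal tabular Q-learning sketch: the Q-table persists across
# episodes ("immediate past table" reuse) instead of being reset.
import random

random.seed(0)
ACTIONS = ["L", "R"]
GOAL = 3   # rightmost cell of a 4-cell corridor

def step(s, a):
    """Deterministic corridor dynamics; reward 1 on reaching the goal."""
    s2 = max(0, s - 1) if a == "L" else min(GOAL, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0)

def q_update(Q, s, a, r, s2, alpha=0.5, gamma=0.9):
    old = Q.get((s, a), 0.0)
    best = max(Q.get((s2, b), 0.0) for b in ACTIONS)
    Q[(s, a)] = old + alpha * (r + gamma * best - old)

Q = {}   # reused across all episodes (warm start)
for episode in range(200):
    s = 0
    while s != GOAL:
        if random.random() < 0.2:
            a = random.choice(ACTIONS)                           # explore
        else:
            a = max(ACTIONS, key=lambda b: Q.get((s, b), 0.0))   # exploit
        s2, r = step(s, a)
        q_update(Q, s, a, r, s2)
        s = s2

print(Q[(2, "R")] > Q.get((2, "L"), 0.0))   # greedy policy heads to the goal
```

    In a HuMAQ-style system each agent would recommend an action from its own table, with a subsumption-style priority scheme arbitrating between them; that arbitration layer is omitted here.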

  15. Maqâshid al-Qur’ân in the Verses on Religious Freedom According to Thahâ Jâbir al-‘Alwânî

    Directory of Open Access Journals (Sweden)

    Ah. Fawaid

    2017-12-01

    Full Text Available Maqâshid al-Qur’ân in the Verses on Religious Freedom in the Interpretation of Thahâ Jâbir al-’Alwânî. This article describes Thahâ Jâbir al-‘Alwânî’s interpretation of various verses dealing with the issue of religious freedom. Adopting a maqâshid al-Qur’ân approach, the paper answers two main questions: what is ‘Alwânî’s conception of maqâshid al-Qur’ân, and how is it applied theoretically in interpreting the verses on religious freedom? ‘Alwânî identifies three main objectives, which he calls al-maqâshid al-Qur’âniyyah al-hâkimah: (1) al-tawhîd, (2) al-tazkiyah, and (3) al-‘umrân. ‘Alwânî states that interfaith religious freedom is an important goal of sharia. Freedom of religion is, moreover, one of the important embodiments of belief in God and tauhid. In this pattern, the next purpose of the Qur’an is tazkiyah, a value that enables people to apply the message, fulfill the promise, and perform the tasks of the caliphate. When these principles are implemented, what ‘Alwânî calls ‘umrân, the further purpose of the Qur’an, can be manifested well. ‘Umrân, or prosperity, in which human beings perform as khalifah, can truly create baldatun thayyibatun wa rabbun ghafûr, genuine welfare.

  16. Viability-qPCR for detecting Legionella: Comparison of two assays based on different amplicon lengths.

    Science.gov (United States)

    Ditommaso, Savina; Giacomuzzi, Monica; Ricciardi, Elisa; Zotti, Carla M

    2015-08-01

    Two different real-time quantitative PCR (PMA-qPCR) assays were applied for quantification of Legionella spp., targeting a long amplicon (approx. 400 bp) of the 16S rRNA gene and a short amplicon (approx. 100 bp) of the 5S rRNA gene. Purified DNA extracts from pure cultures of Legionella spp. and from environmental water samples were quantified. Application of the two assays to quantify Legionella in artificially contaminated water showed that both assays were able to detect Legionella over a linear range of 10 to 10⁵ cells ml⁻¹. A statistical analysis of the standard curves showed that both assays were linear, with a good correlation coefficient (R² = 0.99) between the Ct and the copy number. Amplification with the reference assay was the most effective for detecting low copy numbers (1 bacterium per PCR mixture). Using selective quantification of viable Legionella by the PMA-qPCR method, we obtained a greater inhibition of the amplification of the 400-bp 16S gene fragment (Δlog₁₀ = 3.74 ± 0.39 log₁₀ GU ml⁻¹). A complete inhibition of the PCR signal was obtained when heat-killed cells at a concentration below 1 × 10⁵ cells ml⁻¹ were pretreated with PMA. Analysing the short amplicon led to only a 2.08 log reduction in the Legionella dead-cell signal. When we tested environmental water samples, the two qPCR assays were in good agreement according to the kappa index (0.741). Applying qPCR combined with PMA treatment, we also obtained good agreement (kappa index 0.615). The comparison of quantitative results shows that both assays yielded the same quantification sensitivity (mean log = 4.59 vs mean log = 4.31). Copyright © 2015 Elsevier Ltd. All rights reserved.
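    The standard curves behind these measurements relate Ct linearly to the log of the starting copy number, and quantification simply inverts that line. A self-contained sketch using an invented ideal dilution series (slope −3.3, i.e. ~100% efficiency), not the paper's data:

```python
# qPCR standard-curve quantification sketch: fit Ct = m*log10(N0) + b
# on calibrators, then invert to estimate copies in an unknown.
import math

def fit_standard_curve(copies, cts):
    """Least-squares line through (log10 copies, Ct); returns (slope, intercept)."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# Invented 10-fold dilution series with an exactly -3.3 slope
standards = [10, 100, 1_000, 10_000, 100_000]
cts = [33.2, 29.9, 26.6, 23.3, 20.0]
m, b = fit_standard_curve(standards, cts)
est = copies_from_ct(25.0, m, b)          # unknown sample at Ct 25.0
print(round(m, 2), round(est))            # prints -3.3 3054
```

    A log reduction such as the Δlog₁₀ values quoted above is then just the difference between two such log-scale estimates (e.g. untreated vs PMA-treated).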

  17. Optimal Activation of Isopsoralen To Prevent Amplicon Carryover

    OpenAIRE

    Fahle, Gary A.; Gill, Vee J.; Fischer, Steven H.

    1999-01-01

    We compared the efficiency of activation of the photochemical isopsoralen compound 10, and the resulting amplicon neutralization, under three conditions: a UV transilluminator box at room temperature (RT), and an HRI-300 UV photothermal reaction chamber at RT and at 5°C. Our data suggest that use of the HRI-300 reaction chamber at 5°C results in a statistically significantly higher degree of amplicon neutralization.

  18. The Right to Freedom of Expression in Relation to Defamation under the Criminal Code (KUHP): A Maqâṣid Sharî’ah Perspective

    Directory of Open Access Journals (Sweden)

    Moh. Faizur Rohman

    2017-11-01

    Full Text Available Human rights, as often invoked by various parties, are sometimes opposed to one another. Many journalists exercising the right to freedom of expression, a right protected by Article 28E(3) of the 1945 Constitution, have in fact committed the criminal offense of defamation against those who do not like their statements. The defamation articles in the Criminal Code (KUHP) are often regarded as a powerful weapon against criticism, a form of antipathy toward criticism. Therefore, there should be an effort to distinguish the legitimate use of rights from criminal acts, viewed here from the standpoint of maqâṣid sharî’ah. Through a statute approach and a case approach, it is found that the law protects everyone’s right to express opinions. Every person has the right and freedom to express opinions, but such rights must not infringe the rights of others. In maqâṣid, the self-esteem and soul of every human being are guaranteed and protected, as contained in the principle of ḥifḍu al-nafs wa al-‘ird. Therefore, everyone who expresses opinions, communicates and socializes should pay attention to the rights of others, namely their dignity, in order to create a balanced and harmonious life.

  19. Harmonising legality with morality in Islamic banking and finance: A quest for Maqāṣid al-Sharī‘ah paradigm

    Directory of Open Access Journals (Sweden)

    Luqman Zakariyah

    2015-12-01

    Full Text Available Scholars in the Islamic finance industry (IFI) have been calling for the integration of Islamic morality with legal theories in the industry. Among the reasons for this call is an unethical trend in product innovation. Implementing Islamic banking and financial practices requires adopting their undergirding Islamic legal and moral frameworks; departing from these foundations of Islamic law could render the activities conducted under its name religiously unacceptable. Many approaches have been put forward to achieve this. One of the most complex yet subjective is the quest for Maqāṣid al-Sharī‘ah. This paper critically examines the feasibility of harmonising morality with legality in Islamic finance. In doing so, it reveals what constitutes morality and legality in Islamic legal theory, and critically examines the approaches of classical Muslim scholars in fusing the two elements together for the realisation and actualisation of the very objectives of the Sharī‘ah. Questions about the relationship between morality and legality are raised, and samples of Islamic finance products are evaluated to expose their moral and legal dimensions. Lastly, the role of Maqāṣid al-Sharī‘ah in the process of harmonisation is discussed, with some observations and reservations on the practicality of its implementation.

  20. BioMaS: a modular pipeline for Bioinformatic analysis of Metagenomic AmpliconS.

    Science.gov (United States)

    Fosso, Bruno; Santamaria, Monica; Marzano, Marinella; Alonso-Alemany, Daniel; Valiente, Gabriel; Donvito, Giacinto; Monaco, Alfonso; Notarangelo, Pasquale; Pesole, Graziano

    2015-07-01

    Substantial advances in microbiology, molecular evolution and biodiversity have been achieved in recent years thanks to metagenomics, which makes it possible to unveil the composition and functions of mixed microbial communities in any environmental niche. If the investigation is aimed only at the taxonomic structure of the microbiome, a target-based metagenomic approach, here also referred to as meta-barcoding, is generally applied. This approach commonly involves the selective amplification of a species-specific genetic marker (DNA meta-barcode) across the whole taxonomic range of interest and the exploration of its taxon-related variants through High-Throughput Sequencing (HTS) technologies. Access to suitable computational systems for large-scale bioinformatic analysis of HTS data currently represents one of the major challenges in advanced meta-barcoding projects. BioMaS (Bioinformatic analysis of Metagenomic AmpliconS) is a new bioinformatic pipeline designed to support biomolecular researchers involved in taxonomic studies of environmental microbial communities with a completely automated workflow, comprising all the fundamental steps required in an appropriately designed meta-barcoding HTS-based experiment, from raw sequence data upload and cleaning to final taxonomic identification. In its current version, BioMaS allows the analysis of both bacterial and fungal environments starting directly from the raw sequencing data of either Roche 454 or Illumina HTS platforms, following two alternative paths, respectively. BioMaS is implemented as a public web service available at https://recasgateway.ba.infn.it/ and is also available in Galaxy at http://galaxy.cloud.ba.infn.it:8080 (only for Illumina data). BioMaS is a friendly pipeline for meta-barcoding HTS data analysis, specifically designed for users without particular computing skills. A comparative benchmark, carried out using a simulated dataset suitably designed to broadly represent
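    The early pipeline stages such a workflow automates, cleaning raw reads and collapsing identical sequences before handing them to a taxonomic classifier, can be sketched in miniature; the quality cutoff and toy reads below are illustrative, not BioMaS's actual parameters:

```python
# Miniature sketch of two early meta-barcoding pipeline steps:
# quality trimming and dereplication. Cutoffs and reads are invented.

def quality_trim(seq, quals, min_q=20):
    """Truncate a read at the first base whose quality falls below min_q."""
    for i, q in enumerate(quals):
        if q < min_q:
            return seq[:i]
    return seq

def dereplicate(reads):
    """Collapse identical reads into (sequence, abundance) pairs,
    most abundant first; empty reads are dropped."""
    counts = {}
    for r in reads:
        if r:
            counts[r] = counts.get(r, 0) + 1
    return sorted(counts.items(), key=lambda kv: -kv[1])

reads = [
    quality_trim("ACGTACGT", [30, 30, 30, 30, 30, 30, 30, 30]),
    quality_trim("ACGTACGT", [30, 30, 30, 30, 10, 10, 10, 10]),  # trimmed to ACGT
    quality_trim("ACGTACGT", [30] * 8),
]
print(dereplicate(reads))   # prints [('ACGTACGT', 2), ('ACGT', 1)]
```

    The dereplicated (sequence, abundance) pairs are what downstream steps, such as chimera checks and taxonomic assignment, typically consume.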

  1. High throughput 16S rRNA gene amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Larsen, Poul; Jørgensen, Mads Koustrup

    16S rRNA gene amplicon sequencing has been developed over the past few years and is now ready to use for more comprehensive studies related to plant operation and optimization, thanks to short analysis time, low cost, high throughput, and high taxonomic resolution. In this study we show how 16S rRNA gene amplicon sequencing can be used to reveal factors of importance for the operation of full-scale nutrient removal plants related to settling problems and floc properties. Using optimized DNA extraction protocols, indexed primers and our in-house Illumina platform, we prepared multiple samples … be correlated to the presence of the species that are regarded as “strong” and “weak” floc formers. In conclusion, 16S rRNA gene amplicon sequencing provides a high-throughput approach for rapid and cheap community profiling of activated sludge that, in combination with multivariate statistics, can be used…

  2. Ultra-deep pyrosequencing (UDPS) data treatment to study amplicon HCV minor variants.

    Directory of Open Access Journals (Sweden)

    Josep Gregori

    Full Text Available We have investigated the reliability and reproducibility of HCV viral quasispecies quantification by ultra-deep pyrosequencing (UDPS) methods. Our study was divided into two parts. First, by UDPS sequencing of clone-mix samples we established the global noise level of UDPS and fine-tuned a data treatment workflow previously optimized for HBV sequence analysis. Second, we studied the reproducibility of the methodology by comparing 5 amplicons from two patient samples on three massive sequencing platforms (FLX+, FLX and Junior) after applying the error filters developed from the clonal/control study. After noise filtering of the UDPS results, the three replicates showed the same 12 polymorphic sites above 0.7%, with a mean CV of 4.86%. Two polymorphic sites below 0.6% were identified by two replicates and one replicate, respectively. A total of 25, 23 and 26 haplotypes were detected by GS-Junior, GS-FLX and GS-FLX+, respectively. The observed CVs for the normalized Shannon entropy (Sn), the mutation frequency (Mf), and the nucleotide diversity (Pi) were 1.46%, 3.96% and 3.78%. The mean absolute differences in the two patients (5 amplicons each), between the GS-FLX and GS-FLX+, were 1.46%, 3.96% and 3.78% for Sn, Mf and Pi, respectively. No false polymorphic site was observed above 0.5%. Our results indicate that UDPS is an optimal alternative to molecular cloning for the quantitative study of HCV viral quasispecies populations, both in complexity and in composition. We propose a UDPS data treatment workflow for amplicons from RNA viral quasispecies which, at a sequencing depth of at least 10,000 reads per strand, yields sequences and frequencies of consensus haplotypes above 0.5% abundance with no erroneous mutations, detects resistance mutations present as minor variants at the 1% level with high confidence that variants are not missed, and provides highly confident measures of quasispecies complexity.
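The complexity measures reported in this record (Sn, Mf, Pi) have standard definitions in the quasispecies literature; a minimal sketch of two of them, assuming haplotypes of equal length and frequencies that sum to 1 (function names are ours, not from the paper):

```python
from math import log

def normalized_shannon_entropy(freqs):
    """Sn = -sum(p * ln p) / ln(N) over N distinct haplotypes:
    0 for a clonal population, 1 when all haplotypes are equally frequent."""
    n = len(freqs)
    if n < 2:
        return 0.0
    return -sum(p * log(p) for p in freqs if p > 0) / log(n)

def nucleotide_diversity(haplotypes, freqs):
    """Pi: expected per-site differences between two reads drawn at random
    from the quasispecies (sum over pairs of 2 * p_i * p_j * d_ij / L)."""
    length = len(haplotypes[0])
    pi = 0.0
    for i in range(len(haplotypes)):
        for j in range(i + 1, len(haplotypes)):
            diff = sum(a != b for a, b in zip(haplotypes[i], haplotypes[j]))
            pi += 2 * freqs[i] * freqs[j] * diff / length
    return pi
```

For a two-haplotype population at equal frequency differing at one of four sites, Sn is 1 and Pi is 2 · 0.5 · 0.5 · 1/4 = 0.125.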

  3. Innovación y ruptura en la métrica de la poesía en al-Maqāmāt al-Luzūmiyya de al-Saraqusṭī (s. VI/XII)

    Directory of Open Access Journals (Sweden)

    Ferrando, Ignacio

    2016-06-01

    Full Text Available Al-Maqāmāt al-Luzūmiyya, a collection of 50 picaresque-like stories couched in rhymed prose by Abū l-Ṭāhir al-Saraqusṭī in al-Andalus in the VI/XIIth century, was considered by Western scholars, up until the 1980s, as a set of highly rhetorical pieces with negligible and irrelevant content. However, some contemporary scholars have argued that the stories contain a degree of social and political criticism, as they constitute a fictional domain in which subversion and irony play a fundamental role. In this paper, we cast some light on the particular use of metrical patterns in al-Maqāmāt al-Luzūmiyya, which should be considered another field of subversion: a poetical rhythm that avoids the most usual Arabic metrical patterns, contributing to give the maqāma the flavor of a countergenre.

  4. Herpes simplex virus type 1-derived recombinant and amplicon vectors.

    Science.gov (United States)

    Fraefel, Cornel; Marconi, Peggy; Epstein, Alberto L

    2011-01-01

    Herpes simplex virus type 1 (HSV-1) is a human pathogen whose lifestyle is based on a long-term dual interaction with the infected host, being able to establish both lytic and latent infections. The virus genome is a 153 kbp double-stranded DNA molecule encoding more than 80 genes. The interest in HSV-1 as a gene transfer vector stems from its ability to infect many different cell types, both quiescent and proliferating, the very high packaging capacity of the virus capsid, the outstanding neurotropic adaptations that this virus has evolved, and the fact that it never integrates into the cellular chromosomes, thus avoiding the risk of insertional mutagenesis. Two types of vectors can be derived from HSV-1, recombinant vectors and amplicon vectors, and different methodologies have been developed to prepare large stocks of each type of vector. This chapter summarizes (1) the two approaches most commonly used to prepare recombinant vectors through homologous recombination, either in eukaryotic cells or in bacteria, and (2) the two methodologies currently used to generate helper-free amplicon vectors, either using a bacterial artificial chromosome (BAC)-based approach or a Cre/loxP site-specific recombination strategy.

  5. Cytomegalovirus sequence variability, amplicon length, and DNase-sensitive non-encapsidated genomes are obstacles to standardization and commutability of plasma viral load results.

    Science.gov (United States)

    Naegele, Klaudia; Lautenschlager, Irmeli; Gosert, Rainer; Loginov, Raisa; Bir, Katia; Helanterä, Ilkka; Schaub, Stefan; Khanna, Nina; Hirsch, Hans H

    2018-04-22

    Cytomegalovirus (CMV) management post-transplantation relies on quantification in blood, but inter-laboratory and inter-assay variability impairs commutability. An international multicenter study demonstrated that variability is mitigated by standardizing plasma volumes, automating DNA extraction and amplification, and calibrating to the 1st-CMV-WHO-International-Standard, as in the FDA-approved Roche-CAP/CTM-CMV. However, Roche-CAP/CTM-CMV showed under-quantification and false-negative results in a quality assurance program (UK-NEQAS-2014). The aims of this study were to evaluate factors contributing to the quantification variability of CMV viral load and to develop optimized CMV-UL54-QNAT assays. The UL54 target of the UK-NEQAS-2014 variant was sequenced and compared to 329 available CMV GenBank sequences. Four Basel-CMV-UL54-QNAT assays of 361 bp, 254 bp, 151 bp, and 95 bp amplicons were developed that differed only in reverse primer positions. The assays were validated using plasmid dilutions, the UK-NEQAS-2014 sample, and 107 frozen and 69 prospectively collected plasma samples from transplant patients submitted for CMV QNAT, with and without DNase digestion prior to nucleic acid extraction. Eight of 43 mutations were identified as relevant in the UK-NEQAS-2014 target. All Basel-CMV-UL54 QNATs quantified the UK-NEQAS-2014 sample but revealed 10-fold increases in measured CMV loads as amplicon size decreased. The inverse correlation of amplicon size and viral load was confirmed using the 1st-WHO-International-Standard and patient samples. DNase pre-treatment reduced plasma CMV loads by >90%, indicating the presence of unprotected CMV genomic DNA. Sequence variability, amplicon length, and non-encapsidated genomes obstruct the standardization and commutability of CMV loads needed to develop thresholds for clinical research and management. Besides regular sequence surveys and matrix and extraction standardization, we propose developing reference calibrators using 100 bp amplicons.

  6. Swarm: robust and fast clustering method for amplicon-based studies

    Science.gov (United States)

    Rognes, Torbjørn; Quince, Christopher; de Vargas, Colomban; Dunthorn, Micah

    2014-01-01

    Popular de novo amplicon clustering methods suffer from two fundamental flaws: arbitrary global clustering thresholds, and input-order dependency induced by centroid selection. Swarm was developed to address these issues by first clustering nearly identical amplicons iteratively using a local threshold, and then by using clusters’ internal structure and amplicon abundances to refine its results. This fast, scalable, and input-order independent approach reduces the influence of clustering parameters and produces robust operational taxonomic units. PMID:25276506
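The local-threshold idea can be illustrated with a toy clusterer: grow each cluster from its most abundant unassigned seed, then keep adding amplicons within distance d of any current member. This is a sketch only; the real Swarm uses edit distance (so it handles indels) and adds the abundance-based refinement step described above.

```python
def hamming(a, b):
    """Number of mismatches between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def swarm_like_clusters(abundances, d=1):
    """Grow each cluster from its most abundant unassigned seed, adding any
    amplicon within distance d of *any* member (a local threshold), so a
    cluster can extend beyond a fixed global radius around its centroid."""
    pool = sorted(abundances, key=lambda s: -abundances[s])
    assigned, clusters = set(), []
    for seed in pool:
        if seed in assigned:
            continue
        cluster, frontier = [seed], [seed]
        assigned.add(seed)
        while frontier:
            frontier = [c for c in pool
                        if c not in assigned
                        and any(len(c) == len(m) and hamming(m, c) <= d
                                for m in frontier)]
            assigned.update(frontier)
            cluster.extend(frontier)
        clusters.append(cluster)
    return clusters
```

With d = 1, "AATT" (two mismatches from the seed "AAAA") still joins the seed's cluster via the intermediate "AAAT", which is exactly what a global centroid radius of 1 would miss.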

  7. Swarm: robust and fast clustering method for amplicon-based studies

    Directory of Open Access Journals (Sweden)

    Frédéric Mahé

    2014-09-01

    Full Text Available Popular de novo amplicon clustering methods suffer from two fundamental flaws: arbitrary global clustering thresholds, and input-order dependency induced by centroid selection. Swarm was developed to address these issues by first clustering nearly identical amplicons iteratively using a local threshold, and then by using clusters’ internal structure and amplicon abundances to refine its results. This fast, scalable, and input-order independent approach reduces the influence of clustering parameters and produces robust operational taxonomic units.

  8. Mobile air quality studies (MAQS) in inner cities: particulate matter PM10 levels related to different vehicle driving modes and integration of data into a geographical information program

    Directory of Open Access Journals (Sweden)

    Uibel Stefanie

    2012-10-01

    Full Text Available Abstract Background Particulate matter (PM) is assumed to exert a major burden on public health. Most studies that address levels of PM use stationary measurement systems. By contrast, only few studies measure PM concentrations under mobile conditions to analyze individual exposure situations. Methods By combining spatial-temporal analysis with a novel vehicle-mounted sensor system, the present Mobile Air Quality Study (MAQS) aimed to analyse the effects of different driving conditions in a convertible vehicle. PM10 was continuously monitored in a convertible car driven with the roof open, the roof closed but windows open, or windows closed. Results PM10 values inside the car were nearly always higher with the roof open than with roof and windows closed, whereas no difference was seen between open and closed windows. During the day, PM10 values varied, with high values before noon and occasional high median or standard deviation values due to individual factors. Vehicle speed in itself did not influence the mean PM10 value; however, at traffic speed (10–50 km/h) the standard deviation was large. No systematic difference was seen between PM10 values in stationary and mobile cars, nor was any PM10 difference observed between driving within or outside an environmental (low-emission) zone. Conclusions The present study has shown the feasibility of mobile PM analysis in vehicles. Individual exposure of the occupants varies depending on factors like time of day and ventilation of the car; other specific factors are clearly identifiable and may relate to specific PM10 sources. This system may be used to monitor individual exposure ranges and provide recommendations for preventive measures. Although differences in PM10 levels were found under certain ventilation conditions, these differences are likely not of concern for the safety and health of passengers.

  9. Deep amplicon sequencing reveals mixed phytoplasma infection within single grapevine plants

    DEFF Research Database (Denmark)

    Nicolaisen, Mogens; Contaldo, Nicoletta; Makarova, Olga

    2011-01-01

    The diversity of phytoplasmas within single plants has not yet been fully investigated. In this project, deep amplicon sequencing was used to generate 50,926 phytoplasma sequences from 11 phytoplasma-infected grapevine samples from a PCR amplicon in the 5' end of the 16S region. After clustering ...

  10. Amplicon sequencing of bacterial microbiota in abortion material from cattle.

    Science.gov (United States)

    Vidal, Sara; Kegler, Kristel; Posthaus, Horst; Perreten, Vincent; Rodriguez-Campos, Sabrina

    2017-10-10

    Abortions in cattle have a significant economic impact on animal husbandry and require prompt diagnosis for surveillance of epizootic infectious agents. Since most abortions are not epizootic but sporadic, with often undetected etiologies, this study examined the bacterial community present in the placenta (PL, n = 32) and fetal abomasal content (AC, n = 49) in 64 cases of bovine abortion by next generation sequencing (NGS) of the 16S rRNA gene. The PL and AC from three fetuses of dams that died of non-infectious causes were included as controls. All samples were analyzed by bacterial culture, and 17 were examined by histopathology. We observed 922 OTUs overall and 267 taxa at the genus level. No detectable bacterial DNA was present in the control samples. The microbial profiles of the PL and AC differed significantly, both in composition (PERMANOVA) and in species richness and Chao-1 (Mann-Whitney test). In both organs, Pseudomonas was the most abundant genus. The combination of NGS and culture identified opportunistic pathogens of interest in placentas with lesions, such as Vibrio metschnikovii, Streptococcus uberis, Lactococcus lactis and Escherichia coli. In placentas with lesions where culturing was unsuccessful, Pseudomonas and unidentified Aeromonadaceae were identified by NGS with high numbers of reads. Three cases with multiple possible etiologies and placental lesions were detected by NGS. Amplicon sequencing has the potential to uncover unknown etiological agents. These new insights into cattle abortion extend our focus to previously understudied opportunistic abortive bacteria.
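Chao-1, the richness estimator compared between the PL and AC communities above, has a simple closed form based on singleton and doubleton OTU counts; a sketch of the standard estimator (our illustration, not code from the study):

```python
def chao1(otu_counts):
    """Chao1 richness estimate from an OTU abundance vector:
    S_chao1 = S_obs + F1^2 / (2 * F2), where F1/F2 are the numbers of
    OTUs observed exactly once/twice; falls back to F1*(F1-1)/2 when
    no doubletons are present."""
    counts = [c for c in otu_counts if c > 0]
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)  # singleton OTUs
    f2 = sum(1 for c in counts if c == 2)  # doubleton OTUs
    if f2 > 0:
        return s_obs + f1 * f1 / (2 * f2)
    return s_obs + f1 * (f1 - 1) / 2
```

The intuition: many singletons relative to doubletons implies many species remain unseen, so the estimate rises above the observed OTU count.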

  11. Dynamic Copy Number Evolution of X- and Y-Linked Ampliconic Genes in Human Populations

    DEFF Research Database (Denmark)

    Lucotte, Elise A; Skov, Laurits; Jensen, Jacob Malte

    2018-01-01

    We explore the evolution of human X- and Y-linked ampliconic genes by investigating copy number variation (CNV) and coding variation between populations using the Simons Genome Diversity Project. We develop a method to assess CNVs using the read-depth on modified X and Y chromosome targets containing...... related Y haplogroups, that diversified less than 50,000 years ago. Moreover, X- and Y-linked ampliconic genes seem to have a faster amplification dynamic than autosomal multicopy genes. Looking at expression data from another study, we also find that XY-linked ampliconic genes with extensive copy number...

  12. Canary: an atomic pipeline for clinical amplicon assays.

    Science.gov (United States)

    Doig, Kenneth D; Ellul, Jason; Fellowes, Andrew; Thompson, Ella R; Ryland, Georgina; Blombery, Piers; Papenfuss, Anthony T; Fox, Stephen B

    2017-12-15

    High throughput sequencing requires bioinformatics pipelines to process large volumes of data into meaningful variants that can be translated into a clinical report. These pipelines often suffer from a number of shortcomings: they lack robustness and have many components written in multiple languages, each with a variety of resource requirements. Pipeline components must be linked together with a workflow system to achieve the processing of FASTQ files through to a VCF file of variants. Crafting these pipelines requires considerable bioinformatics and IT skills beyond the reach of many clinical laboratories. Here we present Canary, a single program that can be run on a laptop, which takes FASTQ files from amplicon assays through to an annotated VCF file ready for clinical analysis. Canary can be installed and run with a single command using Docker containerization or run as a single JAR file on a wide range of platforms. Although it is a single utility, Canary performs all the functions present in more complex and unwieldy pipelines. All variants identified by Canary are 3' shifted and represented in their most parsimonious form to provide a consistent nomenclature, irrespective of sequencing variation. Further, proximate in-phase variants are represented as a single HGVS 'delins' variant. This allows for correct nomenclature and consequences to be ascribed to complex multi-nucleotide polymorphisms (MNPs), which are otherwise difficult to represent and interpret. Variants can also be annotated with hundreds of attributes sourced from MyVariant.info to give up to date details on pathogenicity, population statistics and in-silico predictors. Canary has been used at the Peter MacCallum Cancer Centre in Melbourne for the last 2 years for the processing of clinical sequencing data. By encapsulating clinical features in a single, easily installed executable, Canary makes sequencing more accessible to all pathology laboratories. 
Canary is available for download as source

  13. Efficient error correction for next-generation sequencing of viral amplicons.

    Science.gov (United States)

    Skums, Pavel; Dimitrova, Zoya; Campo, David S; Vaughan, Gilberto; Rossi, Livia; Forbi, Joseph C; Yokosawa, Jonny; Zelikovsky, Alex; Khudyakov, Yury

    2012-06-25

    Next-generation sequencing allows the analysis of an unprecedented number of viral sequence variants from infected patients, presenting a novel opportunity for understanding virus evolution, drug resistance and immune escape. However, sequencing in bulk is error prone. Thus, the generated data require error identification and correction. Most error-correction methods to date are not optimized for amplicon analysis and assume that the error rate is randomly distributed. Recent quality assessment of amplicon sequences obtained using 454-sequencing showed that the error rate is strongly linked to the presence and size of homopolymers, position in the sequence and length of the amplicon. All these parameters are strongly sequence specific and should be incorporated into the calibration of error-correction algorithms designed for amplicon sequencing. In this paper, we present two new efficient error correction algorithms optimized for viral amplicons: (i) k-mer-based error correction (KEC) and (ii) empirical frequency threshold (ET). Both were compared to a previously published clustering algorithm (SHORAH), in order to evaluate their relative performance on 24 experimental datasets obtained by 454-sequencing of amplicons with known sequences. All three algorithms show similar accuracy in finding true haplotypes. However, KEC and ET were significantly more efficient than SHORAH in removing false haplotypes and estimating the frequency of true ones. Both algorithms, KEC and ET, are highly suitable for rapid recovery of error-free haplotypes obtained by 454-sequencing of amplicons from heterogeneous viruses. The implementations of the algorithms and data sets used for their testing are available at: http://alan.cs.gsu.edu/NGS/?q=content/pyrosequencing-error-correction-algorithm.
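The k-mer intuition behind a method like KEC can be sketched simply: errors create rare k-mers, while the true sequence is covered by frequent ones. The following is a simplified illustration of that idea only, not the published algorithm (which also calibrates thresholds for homopolymers and amplicon position):

```python
from collections import Counter

def kmer_spectrum(reads, k):
    """Count every k-mer across all reads; sequencing errors produce
    rare k-mers, while the true sequence yields frequent ones."""
    spec = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            spec[read[i:i + k]] += 1
    return spec

def flag_likely_errors(read, spec, k, min_count=2):
    """Flag positions of a read covered exclusively by low-frequency
    ('weak') k-mers: candidate error positions for correction."""
    weak = [spec[read[i:i + k]] < min_count
            for i in range(len(read) - k + 1)]
    flagged = []
    for pos in range(len(read)):
        covering = range(max(0, pos - k + 1),
                         min(pos + 1, len(read) - k + 1))
        if len(covering) and all(weak[i] for i in covering):
            flagged.append(pos)
    return flagged
```

With ten copies of a correct read and one read carrying a substitution, the positions near the substitution are the only ones covered solely by singleton k-mers.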

  14. JRC GMO-Amplicons: a collection of nucleic acid sequences related to genetically modified organisms.

    Science.gov (United States)

    Petrillo, Mauro; Angers-Loustau, Alexandre; Henriksson, Peter; Bonfini, Laura; Patak, Alex; Kreysa, Joachim

    2015-01-01

    The DNA target sequence is the key element in designing detection methods for genetically modified organisms (GMOs). Unfortunately, this information is frequently lacking, especially for unauthorized GMOs. In addition, patent sequences are generally poorly annotated, buried in complex and extensive documentation, and hard to link to the corresponding GM event. Here, we present the JRC GMO-Amplicons, a database of amplicons collected by screening public nucleotide sequence databanks through in silico determination of PCR amplification with reference methods for GMO analysis. The European Union Reference Laboratory for Genetically Modified Food and Feed (EU-RL GMFF) provides these methods in the GMOMETHODS database to support enforcement of EU legislation and GM food/feed control. The JRC GMO-Amplicons database is composed of more than 240 000 amplicons, which can be easily accessed and screened through a web interface. To our knowledge, this is the first attempt at pooling and collecting publicly available sequences related to GMOs in food and feed. The JRC GMO-Amplicons supports control laboratories in the design and assessment of GMO methods, providing, inter alia, in silico prediction of primer specificity and GM target coverage. The new tool can assist laboratories in the analysis of complex issues, such as the detection and identification of unauthorized GMOs. Notably, the JRC GMO-Amplicons database allows the retrieval and characterization of GMO-related sequences included in patent documentation. Finally, it can help annotate poorly described GM sequences and identify new relevant GMO-related sequences in public databases. The JRC GMO-Amplicons is freely accessible through a web-based portal hosted on the EU-RL GMFF website. Database URL: http://gmo-crl.jrc.ec.europa.eu/jrcgmoamplicons/. © The Author(s) 2015. Published by Oxford University Press.
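The core of such in silico PCR screening is matching a forward primer and the reverse complement of a reverse primer against each databank sequence and reporting the enclosed product. A minimal exact-match sketch (real tools additionally allow mismatches and IUPAC degenerate bases; function names are ours):

```python
def revcomp(seq):
    """Reverse complement of an uppercase ACGT sequence."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def in_silico_pcr(template, fwd, rev, max_len=3000):
    """Predict PCR products: the forward primer must match the plus
    strand and the reverse primer's reverse complement must occur
    downstream, within max_len of the forward site."""
    rev_rc = revcomp(rev)
    products = []
    start = template.find(fwd)
    while start != -1:
        end = template.find(rev_rc, start + len(fwd))
        if end != -1 and end + len(rev_rc) - start <= max_len:
            products.append(template[start:end + len(rev_rc)])
        start = template.find(fwd, start + 1)
    return products
```

Run against a whole databank, every non-empty result links a reference detection method to a public sequence, which is essentially how amplicon collections of this kind can be assembled.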

  15. Generic Amplicon Deep Sequencing to Determine Ilarvirus Species Diversity in Australian Prunus.

    Science.gov (United States)

    Kinoti, Wycliff M; Constable, Fiona E; Nancarrow, Narelle; Plummer, Kim M; Rodoni, Brendan

    2017-01-01

    The distribution of Ilarvirus species populations amongst 61 Australian Prunus trees was determined by next generation sequencing (NGS) of amplicons generated using a genus-based generic RT-PCR targeting a conserved region of the Ilarvirus RNA2 component that encodes the RNA dependent RNA polymerase (RdRp) gene. Presence of Ilarvirus sequences in each positive sample was further validated by Sanger sequencing of cloned amplicons of regions of each of RNA1, RNA2 and/or RNA3 that were generated by species specific PCRs and by metagenomic NGS. Prunus necrotic ringspot virus (PNRSV) was the most frequently detected Ilarvirus, occurring in 48 of the 61 Ilarvirus-positive trees, and Prune dwarf virus (PDV) and Apple mosaic virus (ApMV) were detected in three trees and one tree, respectively. American plum line pattern virus (APLPV) was detected in three trees and represents the first report of APLPV detection in Australia. Two novel and distinct groups of Ilarvirus-like RNA2 amplicon sequences were also identified in several trees by the generic amplicon NGS approach. The high read depth from the amplicon NGS of the generic PCR products allowed the detection of distinct RNA2 RdRp sequence variant populations of PNRSV, PDV, ApMV, APLPV and the two novel Ilarvirus-like sequences. Mixed infections of ilarviruses were also detected in seven Prunus trees. Sanger sequencing of specific RNA1, RNA2, and/or RNA3 genome segments of each virus and total nucleic acid metagenomic NGS confirmed the presence of PNRSV, PDV, ApMV and APLPV detected by RNA2 generic amplicon NGS. However, the two novel groups of Ilarvirus-like RNA2 amplicon sequences detected by the generic amplicon NGS could not be associated with the presence of sequence from RNA1 or RNA3 genome segments or full Ilarvirus genomes, and their origin is unclear.
This work highlights the sensitivity of genus-specific amplicon NGS in detection of virus sequences and their distinct populations in multiple samples, and the

  16. Generic Amplicon Deep Sequencing to Determine Ilarvirus Species Diversity in Australian Prunus

    Directory of Open Access Journals (Sweden)

    Wycliff M. Kinoti

    2017-06-01

    Full Text Available The distribution of Ilarvirus species populations amongst 61 Australian Prunus trees was determined by next generation sequencing (NGS) of amplicons generated using a genus-based generic RT-PCR targeting a conserved region of the Ilarvirus RNA2 component that encodes the RNA dependent RNA polymerase (RdRp) gene. Presence of Ilarvirus sequences in each positive sample was further validated by Sanger sequencing of cloned amplicons of regions of each of RNA1, RNA2 and/or RNA3 that were generated by species specific PCRs and by metagenomic NGS. Prunus necrotic ringspot virus (PNRSV) was the most frequently detected Ilarvirus, occurring in 48 of the 61 Ilarvirus-positive trees, and Prune dwarf virus (PDV) and Apple mosaic virus (ApMV) were detected in three trees and one tree, respectively. American plum line pattern virus (APLPV) was detected in three trees and represents the first report of APLPV detection in Australia. Two novel and distinct groups of Ilarvirus-like RNA2 amplicon sequences were also identified in several trees by the generic amplicon NGS approach. The high read depth from the amplicon NGS of the generic PCR products allowed the detection of distinct RNA2 RdRp sequence variant populations of PNRSV, PDV, ApMV, APLPV and the two novel Ilarvirus-like sequences. Mixed infections of ilarviruses were also detected in seven Prunus trees. Sanger sequencing of specific RNA1, RNA2, and/or RNA3 genome segments of each virus and total nucleic acid metagenomics NGS confirmed the presence of PNRSV, PDV, ApMV and APLPV detected by RNA2 generic amplicon NGS. However, the two novel groups of Ilarvirus-like RNA2 amplicon sequences detected by the generic amplicon NGS could not be associated with the presence of sequence from RNA1 or RNA3 genome segments or full Ilarvirus genomes, and their origin is unclear. 
This work highlights the sensitivity of genus-specific amplicon NGS in detection of virus sequences and their distinct populations in multiple samples

  17. Overexpressed Genes/ESTs and Characterization of Distinct Amplicons on 17q23 in Breast Cancer Cells

    Directory of Open Access Journals (Sweden)

    Ayse E. Erson

    2001-01-01

    Full Text Available 17q23 is a frequent site of gene amplification in breast cancer. Several lines of evidence suggest the presence of multiple amplicons on 17q23. To characterize distinct amplicons on 17q23 and localize putative oncogenes, we screened genes and expressed sequence tags (ESTs) in existing physical and radiation hybrid maps for amplification and overexpression in breast cancer cell lines by semiquantitative duplex PCR, semiquantitative duplex RT-PCR, and Southern and Northern blot analyses. We identified two distinct amplicons on 17q23, one including TBX2 and another, more proximal region including RPS6KB1 (PS6K) and MUL. In addition to these previously reported overexpressed genes, we also identified amplification and overexpression of additional uncharacterized genes and ESTs, some of which suggest potential oncogenic activity. In conclusion, we have further defined two distinct regions of gene amplification and overexpression on 17q23, with identification of new potential oncogene candidates. Based on the amplification and overexpression patterns of known and as-yet-unrecognized genes on 17q23, it is likely that some of the genes mapping to the discrete amplicons function as oncogenes and contribute to tumor progression in breast cancer cells.

  18. Improved sensitivity of circulating tumor DNA measurement using short PCR amplicons

    DEFF Research Database (Denmark)

    Andersen, Rikke Fredslund; Spindler, Karen-Lise Garm; Brandslund, Ivan

    2015-01-01

    , however, presents a number of challenges that require attention. The amount of DNA is low and highly fragmented and analyses need to be optimized accordingly. KRAS ARMS-qPCR assays with amplicon lengths of 120 and 85 base pairs, respectively, were compared using positive control material (PCR fragments...

  19. Polyadenylated Sequencing Primers Enable Complete Readability of PCR Amplicons Analyzed by Dideoxynucleotide Sequencing

    Directory of Open Access Journals (Sweden)

    Martin Beránek

    2012-01-01

    Full Text Available Dideoxynucleotide DNA sequencing is one of the principal procedures in molecular biology. Loss of the initial nucleotides just beyond the 3' end of the sequencing primer limits the readability of sequenced amplicons. We present a method which extends the readability by using sequencing primers modified with polyadenylated tails attached to their 5' ends. Performing a polymerase chain reaction, we amplified eight amplicons of six human genes (AMELX, APOE, HFE, MBL2, SERPINA1 and TGFB1) ranging from 106 bp to 680 bp. Polyadenylation of the sequencing primers minimized the loss of bases in all amplicons. Complete sequences of the shorter products (AMELX 106 bp, SERPINA1 121 bp, HFE 208 bp, APOE 244 bp, MBL2 317 bp) were obtained. In addition, in the case of the TGFB1 products (366 bp, 432 bp, and 680 bp, respectively), the sequencing reads were significantly longer when adenylated primers were used. Thus, single-strand dideoxynucleotide sequencing with adenylated primers enables complete or near-complete readability of short PCR amplicons.

  20. Multiplex PCR, amplicon size and hybridization efficiency on the NanoChip electronic microarray

    DEFF Research Database (Denmark)

    Børsting, Claus; Sanchez, Juan J; Morling, Niels

    2004-01-01

    We tested the SNP typing protocol developed for the NanoChip electronic microarray by analyzing the four Y chromosome loci SRY1532, SRY8299, TAT, and 92R7. Amplicons of different lengths containing the same locus were purified and addressed to the NanoChip array and fluorescently labelled reporte...

  1. Absolute estimation of initial concentrations of amplicon in a real-time RT-PCR process

    Directory of Open Access Journals (Sweden)

    Kohn Michael

    2007-10-01

    Full Text Available Abstract Background Since real-time PCR was first developed, several approaches to estimating the initial quantity of template in an RT-PCR reaction have been tried. While initially only the early thermal cycles corresponding to exponential duplication were used, lately there has been an effort to use all of the cycles in a PCR. The efforts have included both fitting empirical sigmoid curves and more elaborate mechanistic models that explore the chemical reactions taking place during each cycle. The more elaborate mechanistic models require many more parameters than can be fit from a single amplification, while the empirical models provide little insight and are difficult to tailor to specific reactants. Results We directly estimate the initial amount of amplicon using a simplified mechanistic model based on chemical reactions in the annealing step of the PCR. The basic model includes the duplication of DNA with the digestion of TaqMan probe and the re-annealing between previously synthesized DNA strands of opposite orientation. By modelling the amount of TaqMan probe digested and matching that with the observed fluorescence, the conversion factor between the number of fluorescing dye molecules and observed fluorescent emission can be estimated, along with the absolute initial amount of amplicon and the rate parameter for re-annealing. The model is applied to several PCR reactions with known amounts of amplicon and is shown to work reasonably well. An expanded version of the model allows duplication of amplicon without release of fluorescent dye, by adding one more parameter to the model. The additional process is helpful in most cases where the initial primer concentration exceeds the initial probe concentration.
Software for applying the algorithm to data may be downloaded at http://www.niehs.nih.gov/research/resources/software/pcranalyzer/. Conclusion We present proof of the principle that a mechanistically based model can be fit to observations
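For contrast with the mechanistic model described in this record, the classical exponential-phase approach can be made concrete: fit a line to log fluorescence over the early cycles, where F_c ≈ F0 · E^c, and back-extrapolate the initial signal F0. This is our illustration of the simpler alternative, not the authors' model, and it assumes a clean exponential window between 5% and 30% of the plateau:

```python
from math import exp, log

def fit_exponential_phase(fluor, lo_frac=0.05, hi_frac=0.3):
    """Least-squares fit of log(F) vs cycle over the window where F lies
    between lo_frac and hi_frac of the plateau; returns (E, F0) for the
    model F_c = F0 * E**c. Assumes the window contains >= 2 cycles."""
    plateau = max(fluor)
    pts = [(c, log(f)) for c, f in enumerate(fluor)
           if lo_frac * plateau <= f <= hi_frac * plateau]
    n = len(pts)
    sx = sum(c for c, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(c * c for c, _ in pts)
    sxy = sum(c * y for c, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return exp(slope), exp(intercept)  # efficiency E, initial signal F0
```

On simulated data that doubles each cycle from 1e-6 and saturates at 1.0, the fit recovers E = 2 and F0 = 1e-6; on real curves the plateau "leaks" into the window, which is one reason whole-curve models like the one above were developed.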

  2. Does universal 16S rRNA gene amplicon sequencing of environmental communities provide an accurate description of nitrifying guilds?

    DEFF Research Database (Denmark)

    Diwan, Vaibhav; Albrechtsen, Hans-Jørgen; Smets, Barth F.

    2018-01-01

    amplicon sequencing and from guild targeted approaches. The universal amplicon sequencing provided 1) accurate estimates of nitrifier composition, 2) clustering of the samples based on these compositions consistent with sample origin, 3) estimates of the relative abundance of the guilds correlated...

  3. Conservative fragments in bacterial 16S rRNA genes and primer design for 16S ribosomal DNA amplicons in metagenomic studies

    KAUST Repository

    Wang, Yong; Qian, Pei-Yuan

    2009-01-01

    Bacterial 16S ribosomal DNA (rDNA) amplicons have been widely used in the classification of uncultured bacteria inhabiting environmental niches. Primers targeting conservative regions of the rDNAs are used to generate amplicons of variant regions
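Finding such conserved regions amounts to scanning an alignment for windows of near-total column identity. A toy sketch (hypothetical mini-alignment; real primer design additionally weighs melting temperature, degeneracy, and coverage across reference databases):

```python
def conserved_windows(aligned, window=8, min_identity=1.0):
    """Return (start, consensus) for each window in which every column's
    most common base reaches min_identity across the aligned sequences."""
    n = len(aligned[0])
    hits = []
    for start in range(n - window + 1):
        consensus, ok = [], True
        for i in range(start, start + window):
            col = [s[i] for s in aligned]
            base, count = max(((b, col.count(b)) for b in set(col)),
                              key=lambda t: t[1])
            if count / len(col) < min_identity:
                ok = False
                break
            consensus.append(base)
        if ok:
            hits.append((start, "".join(consensus)))
    return hits

# toy alignment: position 8 is variable, the rest are conserved
seqs = ["ACGTACGTAAGG", "ACGTACGTTAGG", "ACGTACGTAAGG"]
sites = conserved_windows(seqs, window=4)
```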

  4. pyAmpli: an amplicon-based variant filter pipeline for targeted resequencing data.

    Science.gov (United States)

    Beyens, Matthias; Boeckx, Nele; Van Camp, Guy; Op de Beeck, Ken; Vandeweyer, Geert

    2017-12-14

    Haloplex targeted resequencing is a popular method to analyze both germline and somatic variants in gene panels. However, involved wet-lab procedures may introduce false positives that need to be considered in subsequent data-analysis. No variant filtering rationale addressing amplicon enrichment related systematic errors, in the form of an all-in-one package, exists to our knowledge. We present pyAmpli, a platform independent parallelized Python package that implements an amplicon-based germline and somatic variant filtering strategy for Haloplex data. pyAmpli can filter variants for systematic errors by user pre-defined criteria. We show that pyAmpli significantly increases specificity, without reducing sensitivity, essential for reporting true positive clinical relevant mutations in gene panel data. pyAmpli is an easy-to-use software tool which increases the true positive variant call rate in targeted resequencing data. It specifically reduces errors related to PCR-based enrichment of targeted regions.
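The amplicon-aware filtering idea can be caricatured in a few lines. The criterion below (requiring alt-read support from at least two overlapping amplicons) is an assumed simplification for illustration, not pyAmpli's actual rule set:

```python
def flag_amplicon_artifacts(variant_support, min_amplicons=2):
    """variant_support: {variant_id: {amplicon_id: alt_read_count}}.

    A variant covered by several overlapping amplicons but supported by
    alt reads in fewer than min_amplicons of them is flagged as a likely
    enrichment artifact (simplified, assumed criterion)."""
    flags = {}
    for var, per_amp in variant_support.items():
        supporting = sum(1 for c in per_amp.values() if c > 0)
        if supporting >= min_amplicons or len(per_amp) < min_amplicons:
            flags[var] = "pass"
        else:
            flags[var] = "artifact"
    return flags

calls = {  # hypothetical variants and amplicon names
    "chr17:g.7577120C>T": {"amp_031": 42, "amp_032": 37},  # both amplicons
    "chr17:g.7578406G>A": {"amp_044": 55, "amp_045": 0},   # one amplicon only
}
result = flag_amplicon_artifacts(calls)
```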

  5. Using expected sequence features to improve basecalling accuracy of amplicon pyrosequencing data

    DEFF Research Database (Denmark)

    Rask, Thomas Salhøj; Petersen, Bent; Chen, Donald S.

    2016-01-01

    . The new basecalling method described here, named Multipass, implements a probabilistic framework for working with the raw flowgrams obtained by pyrosequencing. For each sequence variant Multipass calculates the likelihood and nucleotide sequence of several most likely sequences given the flowgram data....... This probabilistic approach enables integration of basecalling into a larger model where other parameters can be incorporated, such as the likelihood for observing a full-length open reading frame at the targeted region. We apply the method to 454 amplicon pyrosequencing data obtained from a malaria virulence gene...... family, where Multipass generates 20 % more error-free sequences than current state of the art methods, and provides sequence characteristics that allow generation of a set of high confidence error-free sequences. This novel method can be used to increase accuracy of existing and future amplicon...
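The likelihood computation at the core of such a method can be sketched under a simple assumption of independent Gaussian flow noise (an illustrative simplification; the actual Multipass model is more elaborate):

```python
import math

def flow_loglik(flow_signals, hp_lengths, sigma=0.2):
    """Log-likelihood of a candidate sequence, represented as its
    homopolymer length per flow, given raw flowgram signals, under an
    assumed per-flow Gaussian noise model."""
    ll = 0.0
    for s, h in zip(flow_signals, hp_lengths):
        ll += -0.5 * ((s - h) / sigma) ** 2 \
              - math.log(sigma * math.sqrt(2 * math.pi))
    return ll

signals = [1.02, 0.08, 2.11, 0.95]   # one noisy cycle of flow values
candidate_a = [1, 0, 2, 1]           # homopolymer lengths per flow
candidate_b = [1, 0, 2, 2]
better = max((candidate_a, candidate_b),
             key=lambda c: flow_loglik(signals, c))
```

In a fuller model this likelihood can be combined with priors such as the probability of an intact open reading frame at the targeted region, which is the kind of integration the abstract describes.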

  6. Strong selective sweeps associated with ampliconic regions in great ape X chromosomes

    DEFF Research Database (Denmark)

    Nam, Kiwoong; Munch, Kasper; Hobolth, Asger

    2014-01-01

    The unique inheritance pattern of X chromosomes makes them preferential targets of adaptive evolution. We here investigate natural selection on the X chromosome in all species of great apes. We find that diversity is more strongly reduced around genes on the X compared with autosomes...... with ampliconic sequences we propose that intra-genomic conflict between the X and the Y chromosomes is a major driver of X chromosome evolution....

  7. Imaging Herpes Simplex Virus Type 1 Amplicon Vector–Mediated Gene Expression in Human Glioma Spheroids

    OpenAIRE

    Christine Kaestle; Alexandra Winkeler; Raphaela Richter; Heinrich Sauer; Jürgen Hescheler; Cornel Fraefel; Maria Wartenberg; Andreas H. Jacobs

    2011-01-01

    Vectors derived from herpes simplex virus type 1 (HSV-1) have great potential for transducing therapeutic genes into the central nervous system; however, inefficient distribution of vector particles in vivo may limit their therapeutic potential in patients with gliomas. This study was performed to investigate the extent of HSV-1 amplicon vector–mediated gene expression in a three-dimensional glioma model of multicellular spheroids by imaging highly infectious HSV-1 virions expressing green fl...

  8. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182
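For orientation, one standard quantifier from the closely related coherence literature (the l1-norm measure of Baumgratz, Cramer and Plenio, shown here only as an example of such a figure of merit, not necessarily the one introduced in this paper) is, relative to a fixed reference basis {|i⟩}:

```latex
C_{\ell_1}(\rho) \;=\; \sum_{i \neq j} \lvert \rho_{ij} \rvert ,
\qquad
C_{\ell_1}\big(\lvert\psi\rangle\big)
  = \Big(\sum_i \lvert c_i \rvert\Big)^{2} - 1
\quad \text{for } \lvert\psi\rangle = \sum_i c_i \lvert i \rangle .
```

It vanishes exactly when the state is diagonal in the reference basis, i.e., when there is no superposition relative to that basis, and grows with the spread of amplitude over basis states.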

  9. Ultra-high resolution HLA genotyping and allele discovery by highly multiplexed cDNA amplicon pyrosequencing

    Directory of Open Access Journals (Sweden)

    Lank Simon M

    2012-08-01

Full Text Available Abstract Background High-resolution HLA genotyping is a critical diagnostic and research assay. Current methods rarely achieve unambiguous high-resolution typing without making population-specific frequency inferences due to a lack of locus coverage and difficulty in exon-phase matching. Achieving high-resolution typing is also becoming more challenging with traditional methods as the database of known HLA alleles increases. Results We designed a cDNA amplicon-based pyrosequencing method to capture 94% of the HLA class I open-reading-frame with only two amplicons per sample, and an analogous method for class II HLA genes, with a primary focus on sequencing the DRB loci. We present a novel Galaxy server-based analysis workflow for determining genotype. During assay validation, we performed two GS Junior sequencing runs to determine the accuracy of the HLA class I amplicons and DRB amplicon at different levels of multiplexing. When 116 amplicons were multiplexed, we unambiguously resolved 99% of class I alleles to four- or six-digit resolution, as well as 100% unambiguous DRB calls. The second experiment, with 271 multiplexed amplicons, missed some alleles, but generated high-resolution, concordant typing for 93% of class I alleles, and 96% for DRB1 alleles. In a third, preliminary experiment we attempted to sequence novel amplicons for other class II loci with mixed success. Conclusions The presented assay is higher-throughput and higher-resolution than existing HLA genotyping methods, and suitable for allele discovery or large cohort sampling. The validated class I and DRB primers successfully generated unambiguously high-resolution genotypes, while further work is needed to validate additional class II genotyping amplicons.

  10. Mining environmental high-throughput sequence data sets to identify divergent amplicon clusters for phylogenetic reconstruction and morphotype visualization.

    Science.gov (United States)

    Gimmler, Anna; Stoeck, Thorsten

    2015-08-01

Environmental high-throughput sequencing (envHTS) is a very powerful tool, which in protistan ecology is predominantly used for the exploration of diversity and its geographic and local patterns. We here used a pyrosequenced V4-SSU rDNA data set from a solar saltern pond as a test case to exploit such massive protistan amplicon data sets beyond this descriptive purpose. Therefore, we combined a Swarm-based blastn network including 11 579 ciliate V4 amplicons to identify divergent amplicon clusters with targeted polymerase chain reaction (PCR) primer design for full-length small-subunit ribosomal DNA retrieval and probe design for fluorescence in situ hybridization (FISH). This powerful strategy allows researchers to benefit from envHTS data sets to (i) reveal the phylogenetic position of the taxon behind divergent amplicons; (ii) improve phylogenetic resolution and evolutionary history of specific taxon groups; (iii) solidly assess an amplicon's (species') degree of similarity to its closest described relative; (iv) visualize the morphotype behind a divergent amplicon cluster; (v) rapidly FISH screen many environmental samples for geographic/habitat distribution and abundances of the respective organism; and (vi) monitor the success of enrichment strategies in live samples for cultivation and isolation of the respective organisms. © 2015 Society for Applied Microbiology and John Wiley & Sons Ltd.

  11. The bias associated with amplicon sequencing does not affect the quantitative assessment of bacterial community dynamics.

    Directory of Open Access Journals (Sweden)

    Federico M Ibarbalz

Full Text Available The performance of two sets of primers targeting variable regions of the 16S rRNA gene V1-V3 and V4 was compared in their ability to describe changes of bacterial diversity and temporal turnover in full-scale activated sludge. Duplicate sets of high-throughput amplicon sequencing data of the two 16S rRNA regions shared a collection of core taxa that were observed across a series of twelve monthly samples, although the relative abundance of each taxon was substantially different between regions. A case in point was the changes in the relative abundance of the filamentous bacterium Thiothrix, which caused a large effect on diversity indices, but only in the V1-V3 data set. Yet the relative abundance of Thiothrix in the amplicon sequencing data from both regions correlated with the estimation of its abundance determined using fluorescence in situ hybridization. In nonmetric multidimensional scaling analysis, samples were distributed along the first ordination axis according to the sequenced region rather than according to sample identities. The dynamics of microbial communities indicated that the V1-V3 and V4 regions of the 16S rRNA gene yielded comparable patterns of: (1) the changes occurring within the communities along fixed time intervals; (2) the slow turnover of activated sludge communities; and (3) the rate of species replacement calculated from the taxa-time relationships. Temperature was the only operational variable that showed significant correlation with the composition of bacterial communities over time for the sets of data obtained with both pairs of primers. In conclusion, we show that despite the bias introduced by amplicon sequencing, the variable regions V1-V3 and V4 can be confidently used for the quantitative assessment of bacterial community dynamics, and provide a proper qualitative account of general taxa in the community, especially when the data are obtained over a convenient time window rather than at a single time point.
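The "rate of species replacement calculated from the taxa-time relationships" is conventionally the exponent w of the power law S = c·T^w, fitted on log-log axes. A least-squares sketch on toy counts (illustrative numbers, not the study's data):

```python
import math

def taxa_time_exponent(cumulative_taxa):
    """Fit the taxa-time relationship S = c * T^w by least-squares
    regression of log S on log T. The exponent w is the temporal
    analogue of the species-area exponent and measures the rate of
    taxon accumulation (species replacement) over time."""
    xs = [math.log(t) for t in range(1, len(cumulative_taxa) + 1)]
    ys = [math.log(s) for s in cumulative_taxa]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# cumulative number of distinct OTUs after each monthly sample
# (toy numbers generated to follow S = 100 * T^0.3)
observed = [round(100 * t ** 0.3) for t in range(1, 13)]
w = taxa_time_exponent(observed)
```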

  12. Massively parallel amplicon sequencing reveals isotype-specific variability of antimicrobial peptide transcripts in Mytilus galloprovincialis.

    Directory of Open Access Journals (Sweden)

    Umberto Rosani

Full Text Available BACKGROUND: Effective innate responses against potential pathogens are essential in the living world and possibly contributed to the evolutionary success of invertebrates. Taken together, antimicrobial peptide (AMP) precursors of defensin, mytilin, myticin and mytimycin can represent about 40% of the hemocyte transcriptome in mussels injected with viral-like and bacterial preparations, and unique profiles of myticin C variants are expressed in single mussels. Based on amplicon pyrosequencing, we have ascertained and compared the natural and Vibrio-induced diversity of AMP transcripts in mussel hemocytes from three European regions. METHODOLOGY/PRINCIPAL FINDINGS: Hemolymph was collected from mussels farmed in the coastal regions of Palavas (France), Vigo (Spain) and Venice (Italy). To represent the AMP families known in M. galloprovincialis, nine transcript sequences were selected, amplified from hemocyte RNA and subjected to pyrosequencing. Hemolymph from farmed (offshore) and wild (lagoon) Venice mussels, both injected with 10^7 Vibrio cells, was similarly processed. Amplicon pyrosequencing emphasized the AMP transcript diversity, with Single Nucleotide Changes (SNC) minimal for mytilin B/C and maximal for arthropod-like defensin and myticin C. The ratio of non-synonymous vs. synonymous changes also greatly differed between AMP isotypes. Overall, each amplicon revealed similar levels of nucleotidic variation across geographical regions, with two main sequence patterns confirmed for mytimycin and no substantial changes after immunostimulation. CONCLUSIONS/SIGNIFICANCE: Barcoding and bidirectional pyrosequencing allowed us to map and compare the transcript diversity of known mussel AMPs. Though most of the genuine cds variation was common to the analyzed samples, we could estimate from 9 to 106 peptide variants in hemolymph pools representing 100 mussels, depending on the AMP isoform and sampling site. In this study, no prevailing SNC patterns related
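The non-synonymous vs. synonymous comparison mentioned above reduces to codon-wise translation and counting. A minimal sketch using the standard genetic code and toy two-codon sequences:

```python
# Standard genetic code, encoded compactly: the amino-acid string is
# ordered by first/second/third codon base over the alphabet TCAG.
BASES = "TCAG"
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON = {a + b + c: AMINO[16 * i + 4 * j + k]
         for i, a in enumerate(BASES)
         for j, b in enumerate(BASES)
         for k, c in enumerate(BASES)}

def syn_nonsyn_changes(ref_cds, alt_cds):
    """Count synonymous vs. non-synonymous single-codon differences
    between two equal-length, in-frame coding sequences."""
    syn = nonsyn = 0
    for i in range(0, len(ref_cds) - 2, 3):
        r, a = ref_cds[i:i + 3], alt_cds[i:i + 3]
        if r == a:
            continue
        if CODON[r] == CODON[a]:
            syn += 1
        else:
            nonsyn += 1
    return syn, nonsyn

# TTT->TTC is synonymous (F->F); ATG->ACG is non-synonymous (M->T)
counts = syn_nonsyn_changes("TTTATG", "TTCACG")
```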

  13. Gene Fusions Associated with Recurrent Amplicons Represent a Class of Passenger Aberrations in Breast Cancer

    Directory of Open Access Journals (Sweden)

    Shanker Kalyana-Sundaram

    2012-08-01

    Full Text Available Application of high-throughput transcriptome sequencing has spurred highly sensitive detection and discovery of gene fusions in cancer, but distinguishing potentially oncogenic fusions from random, “passenger” aberrations has proven challenging. Here we examine a distinctive group of gene fusions that involve genes present in the loci of chromosomal amplifications—a class of oncogenic aberrations that are widely prevalent in breast cancers. Integrative analysis of a panel of 14 breast cancer cell lines comparing gene fusions discovered by high-throughput transcriptome sequencing and genome-wide copy number aberrations assessed by array comparative genomic hybridization, led to the identification of 77 gene fusions, of which more than 60% were localized to amplicons including 17q12, 17q23, 20q13, chr8q, and others. Many of these fusions appeared to be recurrent or involved highly expressed oncogenic drivers, frequently fused with multiple different partners, but sometimes displaying loss of functional domains. As illustrative examples of the “amplicon-associated” gene fusions, we examined here a recurrent gene fusion involving the mediator of mammalian target of rapamycin signaling, RPS6KB1 kinase in BT-474, and the therapeutically important receptor tyrosine kinase EGFR in MDA-MB-468 breast cancer cell line. These gene fusions comprise a minor allelic fraction relative to the highly expressed full-length transcripts and encode chimera lacking the kinase domains, which do not impart dependence on the respective cells. Our study suggests that amplicon-associated gene fusions in breast cancer primarily represent a by-product of chromosomal amplifications, which constitutes a subset of passenger aberrations and should be factored accordingly during prioritization of gene fusion candidates.

  14. SEED 2: a user-friendly platform for amplicon high-throughput sequencing data analyses.

    Science.gov (United States)

    Vetrovský, Tomáš; Baldrian, Petr; Morais, Daniel; Berger, Bonnie

    2018-02-14

Modern molecular methods have increased our ability to describe microbial communities. Along with the advances brought by new sequencing technologies, we now require intensive computational resources to make sense of the large numbers of sequences continuously produced. The software tools developed by the scientific community to address this demand, although very useful, require experience of the command-line environment and extensive training, and have steep learning curves, limiting their use. We created SEED 2, a graphical user interface for handling high-throughput amplicon-sequencing data under Windows operating systems. SEED 2 is the only sequence visualizer that empowers users with tools to handle amplicon-sequencing data of microbial community markers. It is suitable for any marker gene sequences obtained through Illumina, IonTorrent or Sanger sequencing. SEED 2 allows the user to process raw sequencing data, identify specific taxa, produce OTU tables, create sequence alignments and construct phylogenetic trees. Standard dual-core laptops with 8 GB of RAM can handle ca. 8 million Illumina PE 300 bp sequences, ca. 4 GB of data. SEED 2 was implemented in Object Pascal and uses internal functions and external software for amplicon data processing. SEED 2 is freeware, available at http://www.biomed.cas.cz/mbu/lbwrf/seed/ as a self-contained file, including all the dependencies, and does not require installation. Supplementary data contain a comprehensive list of supported functions. daniel.morais@biomed.cas.cz. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.
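OTU-table construction of the kind SEED 2 automates amounts, in its simplest form, to dereplicating reads per sample. A toy sketch in which exact-match dereplication stands in for the usual percent-identity clustering:

```python
from collections import Counter, defaultdict

def build_otu_table(samples):
    """samples: {sample_id: [read_sequence, ...]}.

    Returns {sequence: {sample_id: count}} by exact dereplication.
    Real pipelines first cluster reads at e.g. 97% identity; exact
    matching is the simplest stand-in for illustration."""
    table = defaultdict(dict)
    for sample_id, reads in samples.items():
        for seq, n in Counter(reads).items():
            table[seq][sample_id] = n
    return dict(table)

reads = {  # toy samples and 4-mer "reads"
    "soil_A": ["ACGT", "ACGT", "GGCC"],
    "soil_B": ["ACGT", "TTAA"],
}
otus = build_otu_table(reads)
```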

  15. Surface density dependence of PCR amplicon hybridization on PNA/DNA probe layers

    DEFF Research Database (Denmark)

    Yao, Danfeng; Kim, Junyoung; Yu, Fang

    2005-01-01

at an intermediate sodium concentration (approximately 100 mM). These effects were mainly ascribed to the electrostatic cross talk among the hybridized DNA molecules and the secondary structure of PCR amplicons. For the negatively charged DNA probes, the hybridization reaction was subjected additionally to the DNA/DNA electrostatic barrier, particularly in the lower ionic strength range (e.g., 10-150 mM Na+). The electrostatic cross talk was shown to be largely reduced if the PNA probe layer was sufficiently diluted by following a strategic templated immobilization method. As a consequence, a pseudo
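The ionic-strength dependence described above is consistent with simple Debye screening. The standard electrolyte-theory expression (textbook physics, not derived in this record) is:

```latex
\kappa^{-1} \;=\; \sqrt{\frac{\varepsilon_r \varepsilon_0 k_B T}{2 N_A e^2 I}}
\;\approx\; \frac{0.304\ \text{nm}}{\sqrt{I\,[\text{M}]}}
\qquad \text{(1:1 salt in water at 25 °C)} .
```

So the screening length shrinks from about 3 nm at 10 mM to below 1 nm at 150 mM Na+, i.e., at the higher salt concentrations the DNA/DNA repulsion is screened over distances smaller than typical probe spacings, consistent with the reduced electrostatic barrier reported above.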

  16. Nitrogenase gene amplicons from global marine surface waters are dominated by genes of non-cyanobacteria

    DEFF Research Database (Denmark)

    Farnelid, Hanna; Andersson, Anders F.; Bertilsson, Stefan

    2011-01-01

analysis of 79,090 nitrogenase (nifH) PCR amplicons encoding 7,468 unique proteins from surface samples (ten DNA samples and two RNA samples) collected at ten marine locations world-wide provides the first in-depth survey of a functional bacterial gene and yields insights into the composition and diversity...... by unicellular cyanobacteria, 42% of the identified non-cyanobacterial nifH clusters from the corresponding DNA samples were also detected in cDNA. The study indicates that non-cyanobacteria account for a substantial part of the nifH gene pool in marine surface waters and that these genes are at least

  17. Global Perspectives on Activated Sludge Community Composition analyzed using 16S rRNA amplicon sequencing

    DEFF Research Database (Denmark)

    Nierychlo, Marta; Saunders, Aaron Marc; Albertsen, Mads

communities, and in this study activated sludge sampled from 32 Wastewater Treatment Plants (WWTPs) around the world was described and compared. The most abundant bacteria in the global activated sludge ecosystem were found and the core population shared by multiple samples was investigated. The results......Activated sludge is the most commonly applied bioprocess throughout the world for wastewater treatment. Microorganisms are key to the process, yet our knowledge of their identity and function is still limited. High-throughput 16S rRNA amplicon sequencing can reliably characterize microbial

  18. Analysis and Visualization Tool for Targeted Amplicon Bisulfite Sequencing on Ion Torrent Sequencers.

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

Full Text Available Targeted sequencing of PCR amplicons generated from bisulfite-deaminated DNA is a flexible, cost-effective way to study methylation of a sample at single CpG resolution and perform subsequent multi-target, multi-sample comparisons. Currently, no platform-specific protocol, support, or analysis solution is provided to perform targeted bisulfite sequencing on a Personal Genome Machine (PGM). Here, we present a novel tool, called TABSAT, for analyzing targeted bisulfite sequencing data generated on Ion Torrent sequencers. The workflow starts with raw sequencing data, performs quality assessment, and uses a tailored version of Bismark to map the reads to a reference genome. The pipeline visualizes results as lollipop plots and is able to deduce specific methylation patterns present in a sample. The obtained profiles are then summarized and compared between samples. In order to assess the performance of the targeted bisulfite sequencing workflow, 48 samples were used to generate 53 different Bisulfite-Sequencing PCR amplicons from each sample, resulting in 2,544 amplicon targets. We obtained a mean coverage of 282X using 1,196,822 aligned reads. Next, we compared the sequencing results of these targets to the methylation level of the corresponding sites on an Illumina 450k methylation chip. The calculated average Pearson correlation coefficient of 0.91 confirms the sequencing results with one of the industry-leading CpG methylation platforms and shows that targeted amplicon bisulfite sequencing provides an accurate and cost-efficient method for DNA methylation studies, e.g., to provide platform-independent confirmation of Illumina Infinium 450k methylation data. TABSAT offers a novel way to analyze data generated by Ion Torrent instruments and can also be used with data from the Illumina MiSeq platform. It can be easily accessed via the Platomics platform, which offers a web-based graphical user interface along with sample and parameter storage
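The platform comparison reported above rests on a per-CpG Pearson correlation between targeted bisulfite methylation levels and array beta values. The computation itself is elementary (toy values below, not the study's data):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# per-CpG methylation fraction: targeted bisulfite vs. array beta values
bisulfite = [0.10, 0.45, 0.80, 0.95, 0.20]
array_450k = [0.12, 0.40, 0.85, 0.90, 0.25]
r = pearson(bisulfite, array_450k)
```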

  19. A priori Considerations When Conducting High-Throughput Amplicon-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Aditi Sengupta

    2016-03-01

Full Text Available Amplicon-based sequencing strategies that include 16S rRNA and functional genes, alongside “meta-omics” analyses of communities of microorganisms, have allowed researchers to pose questions and find answers to “who” is present in the environment and “what” they are doing. Next-generation sequencing approaches that aid microbial ecology studies of agricultural systems are fast gaining popularity among agronomy, crop, soil, and environmental science researchers. Given the rapid development of these high-throughput sequencing techniques, researchers with no prior experience will desire information about the best practices that can be used before actually starting high-throughput amplicon-based sequence analyses. We have outlined items that need to be carefully considered in experimental design, sampling, basic bioinformatics, sequencing of mock communities and negative controls, acquisition of metadata, and standardization of reaction conditions as per experimental requirements. Not all considerations mentioned here may pertain to a particular study. The overall goal is to inform researchers about considerations that must be taken into account when conducting high-throughput microbial DNA sequencing and sequence analysis.

  20. Insight into biases and sequencing errors for amplicon sequencing with the Illumina MiSeq platform.

    Science.gov (United States)

    Schirmer, Melanie; Ijaz, Umer Z; D'Amore, Rosalinda; Hall, Neil; Sloan, William T; Quince, Christopher

    2015-03-31

With read lengths currently up to 2 × 300 bp, high throughput and low sequencing costs, Illumina's MiSeq is becoming one of the most utilized sequencing platforms worldwide. The platform is manageable and affordable even for smaller labs. This enables quick turnaround on a broad range of applications such as targeted gene sequencing, metagenomics, small genome sequencing and clinical molecular diagnostics. However, Illumina error profiles are still poorly understood and programs are therefore not designed for the idiosyncrasies of Illumina data. A better knowledge of the error patterns is essential for sequence analysis and vital if we are to draw valid conclusions. Studying true genetic variation in a population sample is fundamental for understanding diseases, evolution and origin. We conducted a large study on the error patterns for the MiSeq based on 16S rRNA amplicon sequencing data. We tested state-of-the-art library preparation methods for amplicon sequencing and showed that the library preparation method and the choice of primers are the most significant sources of bias and cause distinct error patterns. Furthermore we tested the efficiency of various error correction strategies and identified quality trimming (Sickle) combined with error correction (BayesHammer) followed by read overlapping (PANDAseq) as the most successful approach, reducing substitution error rates on average by 93%. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
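Substitution error rates of the kind reduced by the Sickle/BayesHammer/PANDAseq combination are measured against a known reference, such as a mock community. A minimal sketch with toy gap-free reads:

```python
def substitution_rate(reads, reference):
    """Fraction of mismatching positions across reads aligned gap-free
    to a known reference of the same length, as used when profiling
    substitution errors on mock-community amplicons."""
    mismatches = total = 0
    for read in reads:
        mismatches += sum(1 for a, b in zip(read, reference) if a != b)
        total += len(reference)
    return mismatches / total

ref = "ACGTACGTAC"
raw = ["ACGTACGTAC", "ACCTACGTAC", "ACGTACGAAC"]        # 2 substitutions
corrected = ["ACGTACGTAC", "ACGTACGTAC", "ACGTACGAAC"]  # 1 remains
reduction = 1 - substitution_rate(corrected, ref) / substitution_rate(raw, ref)
```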

  1. Direct recovery of infectious Pestivirus from a full-length RT-PCR amplicon

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Bruun; Reimann, Ilona; Hoffmann, Bernd

    2008-01-01

    This study describes the use of a novel and rapid long reverse transcription (RT)-PCR for the generation of infectious full-length cDNA of pestiviruses. To produce rescued viruses, full-length RT-PCR amplicons of 12.3 kb, including a T7-promotor, were transcribed directly in vitro, and the result......This study describes the use of a novel and rapid long reverse transcription (RT)-PCR for the generation of infectious full-length cDNA of pestiviruses. To produce rescued viruses, full-length RT-PCR amplicons of 12.3 kb, including a T7-promotor, were transcribed directly in vitro......, and the resulting RNA transcripts were electroporated into ovine cells. Infectious virus was obtained after one cell culture passage. The rescued viruses had a phenotype similar to the parental Border Disease virus strain. Therefore, direct generation of infectious pestiviruses from full-length RT-PCR cDNA products...

  2. Evaluation of the reproducibility of amplicon sequencing with Illumina MiSeq platform.

    Science.gov (United States)

    Wen, Chongqing; Wu, Liyou; Qin, Yujia; Van Nostrand, Joy D; Ning, Daliang; Sun, Bo; Xue, Kai; Liu, Feifei; Deng, Ye; Liang, Yuting; Zhou, Jizhong

    2017-01-01

Illumina's MiSeq has become the dominant platform for gene amplicon sequencing in microbial ecology studies; however, various technical concerns, such as reproducibility, still exist. To assess reproducibility, 16S rRNA gene amplicons from 18 soil samples of a reciprocal transplantation experiment were sequenced on an Illumina MiSeq. The V4 region of the 16S rRNA gene from each sample was sequenced in triplicate with each replicate having a unique barcode. The average OTU overlap, without considering sequence abundance, at a rarefaction level of 10,323 sequences was 33.4±2.1% and 20.2±1.7% between two and among three technical replicates, respectively. When OTU sequence abundance was considered, the average sequence abundance weighted OTU overlap was 85.6±1.6% and 81.2±2.1% for two and three replicates, respectively. Removing singletons significantly increased the overlap for both (~1-3%, p < 0.05). Deep sequencing increased OTU overlap both when sequence abundance was considered (95%) and when not (44%). However, if singletons were not removed, the overlap between two technical replicates (not considering sequence abundance) plateaus at 39% with 30,000 sequences. Diversity measures were not affected by the low overlap as α-diversities were similar among technical replicates while β-diversities (Bray-Curtis) were much smaller among technical replicates than among treatment replicates (e.g., 0.269 vs. 0.374). Higher diversity coverage, but lower OTU overlap, was observed when replicates were sequenced in separate runs. Detrended correspondence analysis indicated that while there was considerable variation among technical replicates, the reproducibility was sufficient for detecting treatment effects for the samples examined. These results suggest that although there is variation among technical replicates, amplicon sequencing on MiSeq is useful for analyzing microbial community structure if used appropriately and with caution. For example, including technical replicates
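The gap between unweighted and abundance-weighted OTU overlap drives the headline numbers above (roughly 33% vs. 86%). A toy computation shows how rare OTUs such as singletons depress the unweighted figure while barely affecting the weighted one (hypothetical counts):

```python
def otu_overlap(rep1, rep2):
    """Unweighted and abundance-weighted OTU overlap between two
    technical replicates given as {otu: read_count} dicts."""
    shared = set(rep1) & set(rep2)
    union = set(rep1) | set(rep2)
    unweighted = len(shared) / len(union)
    total = sum(rep1.values()) + sum(rep2.values())
    in_shared = sum(rep1[o] for o in shared) + sum(rep2[o] for o in shared)
    weighted = in_shared / total
    return unweighted, weighted

rep_a = {"otu1": 5000, "otu2": 3000, "otu3": 1, "otu4": 1}  # singletons
rep_b = {"otu1": 5200, "otu2": 2800, "otu5": 1, "otu6": 1}
u, w = otu_overlap(rep_a, rep_b)
```

Here only 2 of 6 OTUs are shared (u ≈ 0.33), yet the shared OTUs carry nearly all reads (w > 0.99), mirroring the pattern reported in the abstract.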

  3. Swarm v2: highly-scalable and high-resolution amplicon clustering.

    Science.gov (United States)

    Mahé, Frédéric; Rognes, Torbjørn; Quince, Christopher; de Vargas, Colomban; Dunthorn, Micah

    2015-01-01

    Previously we presented Swarm v1, a novel and open source amplicon clustering program that produced fine-scale molecular operational taxonomic units (OTUs), free of arbitrary global clustering thresholds and input-order dependency. Swarm v1 worked with an initial phase that used iterative single-linkage with a local clustering threshold (d), followed by a phase that used the internal abundance structures of clusters to break chained OTUs. Here we present Swarm v2, which has two important novel features: (1) a new algorithm for d = 1 that allows the computation time of the program to scale linearly with increasing amounts of data; and (2) the new fastidious option that reduces under-grouping by grafting low abundant OTUs (e.g., singletons and doubletons) onto larger ones. Swarm v2 also directly integrates the clustering and breaking phases, dereplicates sequencing reads with d = 0, outputs OTU representatives in fasta format, and plots individual OTUs as two-dimensional networks.
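Swarm's iterative single-linkage growth at a local threshold d = 1 can be caricatured with Hamming distance on equal-length toy reads (real Swarm uses edit distance, scales by hashing all single-mutation neighbors, and applies abundance-based chain breaking):

```python
def hamming1(a, b):
    """True if equal-length strings differ at exactly one position."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def swarm_like(seqs_by_abundance):
    """Greedy single-linkage clustering at local threshold d = 1,
    seeding each cluster from the most abundant unassigned sequence.
    A toy version of Swarm's iterative growth, for illustration only."""
    ordered = sorted(seqs_by_abundance, key=seqs_by_abundance.get,
                     reverse=True)
    unassigned, clusters = set(ordered), []
    for seed in ordered:
        if seed not in unassigned:
            continue
        cluster, frontier = {seed}, [seed]
        unassigned.discard(seed)
        while frontier:                      # grow outwards, link by link
            cur = frontier.pop()
            for s in [s for s in unassigned if hamming1(cur, s)]:
                unassigned.discard(s)
                cluster.add(s)
                frontier.append(s)
        clusters.append(cluster)
    return clusters

counts = {"ACGT": 90, "ACGA": 5, "ACTA": 2, "GGGG": 40}
otus = swarm_like(counts)
```

Note that ACTA joins the first cluster through ACGA even though it is two mutations from the seed ACGT; this chaining behavior is exactly what Swarm's abundance-based breaking phase then inspects.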

  4. Photochemical immobilization of anthraquinone conjugated oligonucleotides and PCR amplicons on solid surfaces

    DEFF Research Database (Denmark)

    Koch, T.; Jacobsen, N.; Fensholdt, J.

    2000-01-01

Ligand immobilization on solid surfaces is an essential step in fields such as diagnostics, biosensor manufacturing, and new material sciences in general. In this paper a photochemical approach based on anthraquinone as the chromophore is presented. Photochemical procedures offer special...... advantages as they are able to generate highly reactive species in an orientation-specific manner. As presented here, anthraquinone (AQ) mediated covalent DNA immobilization appears to be superior to currently known procedures. A synthetic procedure providing AQ-phosphoramidites is presented. These reagents...... facilitate AQ conjugation during routine DNA synthesis, thus enabling the AQ-oligonucleotides to be immobilized in a very convenient and efficient manner. AQ-conjugated PCR primers can be used directly in PCR. When the PCR is performed in solution, the amplicons can be immobilized after the PCR. Moreover...

  5. Developing de novo human artificial chromosomes in embryonic stem cells using HSV-1 amplicon technology.

    Science.gov (United States)

    Moralli, Daniela; Monaco, Zoia L

    2015-02-01

    De novo artificial chromosomes expressing genes have been generated in human embryonic stem cells (hESc) and are maintained following differentiation into other cell types. Human artificial chromosomes (HAC) are small, functional, extrachromosomal elements, which behave as normal chromosomes in human cells. De novo HAC are generated following delivery of alpha satellite DNA into target cells. HAC are characterized by high levels of mitotic stability and are used as models to study centromere formation and chromosome organisation. They are successful and effective as gene expression vectors since they remain autonomous and can accommodate larger genes and regulatory regions for long-term expression studies in cells unlike other viral gene delivery vectors currently used. Transferring the essential DNA sequences for HAC formation intact across the cell membrane has been challenging for a number of years. A highly efficient delivery system based on HSV-1 amplicons has been used to target DNA directly to the ES cell nucleus and HAC stably generated in human embryonic stem cells (hESc) at high frequency. HAC were detected using an improved protocol for hESc chromosome harvesting, which consistently produced high-quality metaphase spreads that could routinely detect HAC in hESc. In tumour cells, the input DNA often integrated in the host chromosomes, but in the host ES genome, it remained intact. The hESc containing the HAC formed embryoid bodies, generated teratoma in mice, and differentiated into neuronal cells where the HAC were maintained. The HAC structure and chromatin composition was similar to the endogenous hESc chromosomes. This review will discuss the technological advances in HAC vector delivery using HSV-1 amplicons and the improvements in the identification of de novo HAC in hESc.

  6. Implementation of Amplicon Parallel Sequencing Leads to Improvement of Diagnosis and Therapy of Lung Cancer Patients.

    Science.gov (United States)

    König, Katharina; Peifer, Martin; Fassunke, Jana; Ihle, Michaela A; Künstlinger, Helen; Heydt, Carina; Stamm, Katrin; Ueckeroth, Frank; Vollbrecht, Claudia; Bos, Marc; Gardizi, Masyar; Scheffler, Matthias; Nogova, Lucia; Leenders, Frauke; Albus, Kerstin; Meder, Lydia; Becker, Kerstin; Florin, Alexandra; Rommerscheidt-Fuss, Ursula; Altmüller, Janine; Kloth, Michael; Nürnberg, Peter; Henkel, Thomas; Bikár, Sven-Ernö; Sos, Martin L; Geese, William J; Strauss, Lewis; Ko, Yon-Dschun; Gerigk, Ulrich; Odenthal, Margarete; Zander, Thomas; Wolf, Jürgen; Merkelbach-Bruse, Sabine; Buettner, Reinhard; Heukamp, Lukas C

    2015-07-01

    The Network Genomic Medicine Lung Cancer was set up to rapidly translate scientific advances into early clinical trials of targeted therapies in lung cancer, performing molecular analyses of more than 3500 patients annually. Because sequential analysis of the relevant driver mutations on fixed samples is challenging in terms of workload, tissue availability, and cost, we established multiplex parallel sequencing in routine diagnostics. The aim was to analyze all therapeutically relevant mutations in lung cancer samples in a high-throughput fashion while significantly reducing turnaround time and the amount of input DNA compared with conventional dideoxy sequencing of single polymerase chain reaction amplicons. In this study, we demonstrate the feasibility of a 102-amplicon multiplex polymerase chain reaction followed by sequencing on an Illumina sequencer on formalin-fixed paraffin-embedded tissue in routine diagnostics. Analysis of a validation cohort of 180 samples showed this approach to require significantly less input material and to be more reliable, robust, and cost-effective than conventional dideoxy sequencing. Subsequently, 2657 lung cancer patients were analyzed. We observed that comprehensive biomarker testing provided novel information in addition to histological diagnosis and clinical staging. In the 2657 consecutively analyzed lung cancer samples, we identified driver mutations at the expected prevalence. Furthermore, we found potentially targetable DDR2 mutations at a frequency of 3% in both adenocarcinomas and squamous cell carcinomas. Overall, our data demonstrate the utility of systematic sequencing analysis in a clinical routine setting and highlight the dramatic impact of such an approach on the availability of therapeutic strategies for the targeted treatment of individual cancer patients.

  7. Nitrogenase gene amplicons from global marine surface waters are dominated by genes of non-cyanobacteria.

    Directory of Open Access Journals (Sweden)

    Hanna Farnelid

    Full Text Available Cyanobacteria are thought to be the main N2-fixing organisms (diazotrophs) in marine pelagic waters, but recent molecular analyses indicate that non-cyanobacterial diazotrophs are also present and active. Existing data are, however, restricted geographically and by limited sequencing depths. Our analysis of 79,090 nitrogenase (nifH) PCR amplicons encoding 7,468 unique proteins from surface samples (ten DNA samples and two RNA samples) collected at ten marine locations world-wide provides the first in-depth survey of a functional bacterial gene and yields insights into the composition and diversity of the nifH gene pool in marine waters. Great divergence in nifH composition was observed between sites. Cyanobacteria-like genes were most frequent among amplicons from the warmest waters, but overall the data set was dominated by nifH sequences most closely related to those of non-cyanobacteria. Clusters related to Alpha-, Beta-, Gamma-, and Delta-Proteobacteria were most common and showed distinct geographic distributions. Sequences related to anaerobic bacteria (nifH Cluster III) were generally rare, but preponderant in cold waters, especially in the Arctic. Although the two transcript samples were dominated by unicellular cyanobacteria, 42% of the identified non-cyanobacterial nifH clusters from the corresponding DNA samples were also detected in cDNA. The study indicates that non-cyanobacteria account for a substantial part of the nifH gene pool in marine surface waters and that these genes are at least occasionally expressed. The contribution of non-cyanobacterial diazotrophs to the global N2 fixation budget cannot be inferred from sequence data alone, but the prevalence of non-cyanobacterial nifH genes and transcripts suggests that these bacteria are ecologically significant.

  8. Similar diversity of Alphaproteobacteria and nitrogenase gene amplicons on two related Sphagnum mosses

    Directory of Open Access Journals (Sweden)

    Anastasia eBragina

    2012-01-01

    Full Text Available Sphagnum mosses represent a main component of ombrotrophic wetlands. They harbor a specific and diverse microbial community with essential functions for the host. To understand the extent and degree of host specificity, Sphagnum fallax and S. angustifolium, two phylogenetically closely related species that show distinct habitat preferences with respect to nutrient level, were analyzed by a multifaceted approach. Microbial fingerprints obtained by PCR-SSCP (single-strand conformation polymorphism) using universal, group-specific and functional primers were highly similar. Similarity was confirmed for colonization patterns obtained by fluorescence in situ hybridization (FISH) coupled with confocal laser scanning microscopy (CLSM): Alphaproteobacteria were the main colonizers inside the hyaline cells of Sphagnum leaves. A deeper survey of Alphaproteobacteria by 16S rRNA gene amplicon sequencing revealed a high diversity, with Acidocella, Acidisphaera, Rhodopila and Phenylobacterium as major genera for both mosses. Pathogen defense and nitrogen fixation are important functions of Sphagnum-associated bacteria, which are fulfilled by the microbial communities of both Sphagna in a similar way. NifH libraries of Sphagnum-associated microbial communities were characterized by high diversity and abundance of Alphaproteobacteria but also contained diverse amplicons of other taxa, e.g. Cyanobacteria, Geobacter and Spirochaeta. Statistically significant differences between the microbial communities of the two Sphagnum species could not be detected in any of the experimental approaches. Our results show that the same close relationship which exists between the physical, morphological and chemical characteristics of Sphagnum mosses and the ecology and function of bog ecosystems also connects moss plantlets with their associated bacterial communities.

  9. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    Science.gov (United States)

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
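
The droplet-counting side of such an assay rests on standard ddPCR Poisson statistics: the fraction of negative droplets yields the mean number of template copies per droplet, and dividing by the droplet volume gives the concentration. A minimal sketch of that calculation (the ~0.85 nL droplet volume is an assumed, typical value, not a figure from this paper):

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
    """Estimate template concentration (copies/uL) from droplet counts
    using the Poisson correction: lambda = -ln(fraction of negative droplets)."""
    if n_positive >= n_total:
        raise ValueError("all droplets positive: above the assay's dynamic range")
    frac_negative = (n_total - n_positive) / n_total
    lam = -math.log(frac_negative)           # mean copies per droplet
    return lam / droplet_volume_nl * 1000.0  # copies per microliter

# Example: 4,000 positive droplets out of 15,000 accepted droplets
print(round(ddpcr_concentration(4000, 15000), 1))  # -> 364.9
```

The size-determination half of the method (inferring amplicon length from droplet fluorescence) relies on an empirical calibration reported by the authors and is not reproduced here.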

  10. Function of the EGR-1/TIS8 radiation inducible promoter in a minimal HSV-1 amplicon system

    International Nuclear Information System (INIS)

    Spear, M.A.; Sakamoto, K.M.; Herrlinger, U.; Pechan, P.; Breakefield, X.O.

    1997-01-01

    Purpose: To evaluate the function of the EGR-1/TIS8 promoter region in a minimal HSV-1 amplicon system in order to determine the feasibility of using the system to regulate vector replication with radiation. Materials and Methods: A 600-base pair 5' upstream region of the EGR-1 promoter linked to chloramphenicol acetyltransferase (CAT) was recombined into a minimal HSV-1 amplicon vector system (pONEC). pONEC or a control plasmid was transfected into U87 glioma cells using the Lipofectamine method. Thirty-six hours later, one aliquot of cells from each transfection was irradiated to a dose of 20 Gy and another identical aliquot served as a control. CAT activity was assayed 8 hours after irradiation. Results: pONEC-transfected cells irradiated with 20 Gy demonstrated a 2.0-fold increase in CAT activity compared with non-irradiated cells. Cells transfected with the control plasmid showed no change in CAT activity. Unirradiated pONEC cells had CAT activity 1.3 times that of cells transfected with the control plasmid. Conclusion: We have previously created HSV-1 gene therapy amplicon vector systems which allow virus-amplicon interdependent replication, with the intent of regulating replication. The above data demonstrate that a minimal amplicon system allows radiation-dependent regulation by the EGR-1 promoter, indicating the possibility of using this system to achieve on-site, spatially and temporally regulated vector production. Baseline CAT activity was higher, and relative induction lower, than in other reported expression constructs, which raises concern about the ability of the system to produce a differential in transcription levels sufficient for this purpose. This is possibly the result of residual promoter/enhancer elements remaining in the HSV-1 sequences. We are attempting to create constructs lacking these elements. Addition of secondary promoter sequences may also be of use. 
We are also currently evaluating the efficacy of the putative IEX-1 radiation inducible promoter region in

  11. Hi-Plex for Simple, Accurate, and Cost-Effective Amplicon-based Targeted DNA Sequencing.

    Science.gov (United States)

    Pope, Bernard J; Hammet, Fleur; Nguyen-Dumont, Tu; Park, Daniel J

    2018-01-01

    Hi-Plex is a suite of methods to enable simple, accurate, and cost-effective highly multiplex PCR-based targeted sequencing (Nguyen-Dumont et al., Biotechniques 58:33-36, 2015). At its core is the principle of using gene-specific primers (GSPs) to "seed" (or target) the reaction and universal primers to "drive" the majority of the reaction. In this manner, effects on amplification efficiencies across the target amplicons can, to a large extent, be restricted to early seeding cycles. Product sizes are defined within a relatively narrow range to enable high-specificity size selection, replication uniformity across target sites (including in the context of fragmented input DNA such as that derived from fixed tumor specimens (Nguyen-Dumont et al., Biotechniques 55:69-74, 2013; Nguyen-Dumont et al., Anal Biochem 470:48-51, 2015), and application of high-specificity genetic variant calling algorithms (Pope et al., Source Code Biol Med 9:3, 2014; Park et al., BMC Bioinformatics 17:165, 2016). Hi-Plex offers a streamlined workflow that is suitable for testing large numbers of specimens without the need for automation.

  12. Differential amplicons (ΔAmp)-a new molecular method to assess RNA integrity.

    Science.gov (United States)

    Björkman, J; Švec, D; Lott, E; Kubista, M; Sjöback, R

    2016-01-01

    The integrity of mRNA in clinical samples has a major impact on the quality of measured expression levels, regardless of whether the measurement technique is next-generation sequencing (NGS), quantitative real-time PCR (qPCR) or microarray profiling. If mRNA is highly degraded or damaged, measured data will be very unreliable and the whole study is likely a waste of time and money. It is therefore a common strategy to test the quality of RNA in samples before conducting large and costly studies. Most methods available today to assess the quality of RNA are agnostic to the nature of the RNA and therefore reflect the integrity of ribosomal RNA, which is the dominant species, rather than of the mRNAs, microRNAs and long non-coding RNAs that usually are the species of interest. Here, we present a novel molecular approach to assess the quality of the targeted RNA species by measuring the differential amplification (ΔAmp) of an Endogenous RNase Resistant (ERR) marker relative to a reference gene, optionally combined with the measurement of two amplicons of different lengths. The combination reveals any mRNA degradation caused by ribonucleases as well as physical, chemical or UV damage. ΔAmp has superior sensitivity to common microfluidic electrophoretic methods, senses the integrity of the actual targeted RNA species, and allows for a smoother and more cost-efficient workflow.
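
The two-length amplicon comparison can be illustrated with quantification cycle (Cq) values: on intact template the short and long amplicons of the same target amplify at nearly the same cycle, whereas fragmentation delays the long amplicon disproportionately. A hedged sketch assuming an amplification efficiency of 2 (one doubling per cycle); the function and scoring are illustrative, not the authors' published metric:

```python
def amplicon_ratio(cq_short: float, cq_long: float) -> float:
    """Relative yield of the long vs. the short amplicon, assuming an
    amplification efficiency of 2 (one doubling per cycle).

    A ratio near 1.0 suggests intact template; values well below 1.0
    indicate fragmentation, since few degraded templates span the long amplicon."""
    return 2.0 ** (cq_short - cq_long)

# Intact sample: both amplicons detected at nearly the same cycle
print(round(amplicon_ratio(22.0, 22.3), 2))  # -> 0.81
# Degraded sample: long amplicon delayed by ~4 cycles
print(round(amplicon_ratio(22.0, 26.0), 2))  # -> 0.06
```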

  13. Differential amplicons (ΔAmp)—a new molecular method to assess RNA integrity

    Directory of Open Access Journals (Sweden)

    J. Björkman

    2016-01-01

    Full Text Available The integrity of mRNA in clinical samples has a major impact on the quality of measured expression levels, regardless of whether the measurement technique is next-generation sequencing (NGS), quantitative real-time PCR (qPCR) or microarray profiling. If mRNA is highly degraded or damaged, measured data will be very unreliable and the whole study is likely a waste of time and money. It is therefore a common strategy to test the quality of RNA in samples before conducting large and costly studies. Most methods available today to assess the quality of RNA are agnostic to the nature of the RNA and therefore reflect the integrity of ribosomal RNA, which is the dominant species, rather than of the mRNAs, microRNAs and long non-coding RNAs that usually are the species of interest. Here, we present a novel molecular approach to assess the quality of the targeted RNA species by measuring the differential amplification (ΔAmp) of an Endogenous RNase Resistant (ERR) marker relative to a reference gene, optionally combined with the measurement of two amplicons of different lengths. The combination reveals any mRNA degradation caused by ribonucleases as well as physical, chemical or UV damage. ΔAmp has superior sensitivity to common microfluidic electrophoretic methods, senses the integrity of the actual targeted RNA species, and allows for a smoother and more cost-efficient workflow.

  14. Extracellular DNA amplicon sequencing reveals high levels of benthic eukaryotic diversity in the central Red Sea

    KAUST Repository

    Pearman, John K.

    2015-11-01

    The present study aims to characterize the benthic eukaryotic biodiversity patterns at a coarse taxonomic level in three areas of the central Red Sea (a lagoon, an offshore area in Thuwal and a shallow coastal area near Jeddah) based on extracellular DNA. High-throughput amplicon sequencing targeting the V9 region of the 18S rRNA gene was undertaken for 32 sediment samples. High levels of alpha-diversity were detected, with 16,089 operational taxonomic units (OTUs) being identified. The majority of the OTUs were assigned to Metazoa (29.2%), Alveolata (22.4%) and Stramenopiles (17.8%). Stramenopiles (Diatomea) and Alveolata (Ciliophora) were frequent in the lagoon and in the shallower coastal stations, whereas metazoans (Arthropoda: Maxillopoda) were dominant in the deeper offshore stations. Only 24.6% of total OTUs were shared among all areas. Beta-diversity was generally lower between the lagoon and Jeddah (nearshore) than between either of those and the offshore area, suggesting a nearshore–offshore biodiversity gradient. The current approach allowed a broad range of benthic eukaryotic biodiversity to be analysed with significantly less labour than would be required by traditional taxonomic approaches. Our findings suggest that next generation sequencing techniques have the potential to provide a fast and standardised screening of benthic biodiversity at large spatial and temporal scales.

  15. Shedding light on the microbial community of the macropod foregut using 454-amplicon pyrosequencing.

    Directory of Open Access Journals (Sweden)

    Lisa-Maree Gulino

    Full Text Available Twenty macropods from five locations in Queensland, Australia, grazing on a variety of native pastures were surveyed and the bacterial community of the foregut was examined using 454-amplicon pyrosequencing. Specifically, the V3/V4 region of the 16S rRNA gene was examined. A total of 5040 OTUs were identified in the data set (post filtering). Thirty-two OTUs were identified as 'shared' OTUs (i.e. present in all samples) belonging to either Firmicutes or Bacteroidetes (Clostridiales/Bacteroidales). These phyla predominated in the general microbial community of all macropods. Genera represented within the shared OTUs included: unclassified Ruminococcaceae, unclassified Lachnospiraceae, unclassified Clostridiales, Peptococcus sp., Coprococcus spp., Streptococcus spp., Blautia sp., Ruminococcus sp., Eubacterium sp., Dorea sp., Oscillospira sp. and Butyrivibrio sp. The composition of the bacterial community of the foregut samples of each of the host species (Macropus rufus, Macropus giganteus and Macropus robustus) was significantly different, allowing differentiation between the host species based on alpha and beta diversity measures. Specifically, eleven dominant OTUs that separated the three host species were identified and classified as: unclassified Ruminococcaceae, unclassified Bacteroidales, Prevotella spp. and Syntrophococcus sucromutans. Putative reductive acetogens and fibrolytic bacteria were also identified in samples. Future work will investigate the presence and role of fibrolytics and acetogens in these ecosystems. Ideally, the isolation and characterization of these organisms will be used for enhanced feed efficiency in cattle, methane mitigation and potentially for other industries such as the biofuel industry.
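
The 'shared OTU' tally described above is, computationally, a set intersection across per-sample OTU lists. A minimal sketch with hypothetical sample data (the sample and OTU identifiers are invented for illustration):

```python
# Hypothetical OTU tables: sample id -> set of OTU ids observed in that sample
samples = {
    "roo_01": {"OTU1", "OTU2", "OTU3", "OTU7"},
    "roo_02": {"OTU1", "OTU3", "OTU5"},
    "roo_03": {"OTU1", "OTU3", "OTU6", "OTU7"},
}

# OTUs present in every sample ("shared" OTUs in the study's sense)
shared = set.intersection(*samples.values())
print(sorted(shared))  # -> ['OTU1', 'OTU3']
```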

  16. Comparative analyses of amplicon migration behavior in differing denaturing gradient gel electrophoresis (DGGE) systems

    Science.gov (United States)

    Thornhill, D. J.; Kemp, D. W.; Sampayo, E. M.; Schmidt, G. W.

    2010-03-01

    Denaturing gradient gel electrophoresis (DGGE) is commonly utilized to identify and quantify microbial diversity, but the conditions required for different electrophoretic systems to yield equivalent results and optimal resolution have not been assessed. Herein, the influence of different DGGE system configuration parameters on microbial diversity estimates was tested using Symbiodinium, a group of marine eukaryotic microbes that are important constituents of coral reef ecosystems. To accomplish this, bacterial clone libraries were constructed and sequenced from cultured isolates of Symbiodinium for the ribosomal DNA internal transcribed spacer 2 (ITS2) region. From these, 15 clones were subjected to PCR with a GC clamped primer set for DGGE analyses. Migration behaviors of the resulting amplicons were analyzed using a range of conditions, including variation in the composition of the denaturing gradient, electrophoresis time, and applied voltage. All tests were conducted in parallel on two commercial DGGE systems, a C.B.S. Scientific DGGE-2001, and the Bio-Rad DCode system. In this context, identical nucleotide fragments exhibited differing migration behaviors depending on the model of apparatus utilized, with fragments denaturing at a lower gradient concentration and applied voltage on the Bio-Rad DCode system than on the C.B.S. Scientific DGGE-2001 system. Although equivalent PCR-DGGE profiles could be achieved with both brands of DGGE system, the composition of the denaturing gradient and application of electrophoresis time × voltage must be appropriately optimized to achieve congruent results across platforms.

  17. Swarm v2: highly-scalable and high-resolution amplicon clustering

    Directory of Open Access Journals (Sweden)

    Frédéric Mahé

    2015-12-01

    Full Text Available Previously we presented Swarm v1, a novel and open-source amplicon clustering program that produced fine-scale molecular operational taxonomic units (OTUs), free of arbitrary global clustering thresholds and input-order dependency. Swarm v1 worked with an initial phase that used iterative single-linkage with a local clustering threshold (d), followed by a phase that used the internal abundance structures of clusters to break chained OTUs. Here we present Swarm v2, which has two important novel features: (1) a new algorithm for d = 1 that allows the computation time of the program to scale linearly with increasing amounts of data; and (2) the new fastidious option that reduces under-grouping by grafting low-abundance OTUs (e.g., singletons and doubletons) onto larger ones. Swarm v2 also directly integrates the clustering and breaking phases, dereplicates sequencing reads with d = 0, outputs OTU representatives in fasta format, and plots individual OTUs as two-dimensional networks.
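
The d = 1 idea behind Swarm v2's linear scaling can be sketched as follows: instead of comparing all sequence pairs, generate every sequence one edit away from a read and look it up in a hash set. This toy version clusters by single linkage at edit distance 1; abundance-based chain breaking and the fastidious option are omitted:

```python
from collections import deque

def neighbors(seq, alphabet="ACGT"):
    """All sequences within one substitution, insertion, or deletion of seq."""
    out = set()
    for i in range(len(seq)):
        out.add(seq[:i] + seq[i + 1:])            # deletion
        for b in alphabet:
            out.add(seq[:i] + b + seq[i + 1:])    # substitution
    for i in range(len(seq) + 1):
        for b in alphabet:
            out.add(seq[:i] + b + seq[i:])        # insertion
    out.discard(seq)
    return out

def swarm_d1(sequences):
    """Single-linkage clusters at edit distance 1, found by hash lookups of
    generated neighbors rather than all-vs-all comparison, so the work grows
    with the number of reads rather than with its square."""
    pool = set(sequences)
    clusters = []
    while pool:
        seed = pool.pop()
        cluster, queue = [seed], deque([seed])
        while queue:
            for n in neighbors(queue.popleft()):
                if n in pool:
                    pool.remove(n)
                    cluster.append(n)
                    queue.append(n)
        clusters.append(cluster)
    return clusters

reads = ["ACGT", "ACGA", "ACG", "TTTT"]
print(len(swarm_d1(reads)))  # -> 2 clusters: {ACGT, ACGA, ACG} and {TTTT}
```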

  18. Imaging Herpes Simplex Virus Type 1 Amplicon Vector–Mediated Gene Expression in Human Glioma Spheroids

    Directory of Open Access Journals (Sweden)

    Christine Kaestle

    2011-05-01

    Full Text Available Vectors derived from herpes simplex virus type 1 (HSV-1) have great potential for transducing therapeutic genes into the central nervous system; however, inefficient distribution of vector particles in vivo may limit their therapeutic potential in patients with gliomas. This study was performed to investigate the extent of HSV-1 amplicon vector–mediated gene expression in a three-dimensional glioma model of multicellular spheroids by imaging highly infectious HSV-1 virions expressing green fluorescent protein (HSV-GFP). After infection or microscopy-guided vector injection of glioma spheroids at various spheroid sizes, injection pressures and injection times, the extent of HSV-1 vector–mediated gene expression was investigated via laser scanning microscopy. Infection of spheroids with HSV-GFP demonstrated a maximal depth of vector-mediated GFP expression at 70 to 80 μm. A >80% transduction efficiency was reached by infection only in small spheroids, whereas microscopy-guided vector injection achieved transduction efficiencies of >90%. The results demonstrated that vector-mediated gene expression in glioma spheroids was strongly dependent on the mode of vector application—injection pressure and injection time being the most important parameters. The assessment of these vector application parameters in tissue models will contribute to the development of safe and efficient gene therapy protocols for clinical application.

  19. Targeted amplicon sequencing (TAS): a scalable next-gen approach to multilocus, multitaxa phylogenetics.

    Science.gov (United States)

    Bybee, Seth M; Bracken-Grissom, Heather; Haynes, Benjamin D; Hermansen, Russell A; Byers, Robert L; Clement, Mark J; Udall, Joshua A; Wilcox, Edward R; Crandall, Keith A

    2011-01-01

    Next-gen sequencing technologies have revolutionized data collection in genetic studies and advanced genome biology to novel frontiers. However, to date, next-gen technologies have been used principally for whole genome sequencing and transcriptome sequencing. Yet many questions in population genetics and systematics rely on sequencing specific genes of known function or diversity levels. Here, we describe a targeted amplicon sequencing (TAS) approach capitalizing on next-gen capacity to sequence large numbers of targeted gene regions from a large number of samples. Our TAS approach is easily scalable, simple in execution, neither time- nor labor-intensive, relatively inexpensive, and can be applied to a broad diversity of organisms and/or genes. Our TAS approach includes a bioinformatic application, BarcodeCrucher, to take raw next-gen sequence reads and perform quality control checks and convert the data into FASTA format organized by gene and sample, ready for phylogenetic analyses. We demonstrate our approach by sequencing targeted genes of known phylogenetic utility to estimate a phylogeny for the Pancrustacea. We generated data from 44 taxa using 68 different 10-bp multiplexing identifiers. The overall quality of data produced was robust and was informative for phylogeny estimation. The potential for this method to produce copious amounts of data from a single 454 plate (e.g., 325 taxa for 24 loci) significantly reduces sequencing expenses incurred from traditional Sanger sequencing. We further discuss the advantages and disadvantages of this method, while offering suggestions to enhance the approach.
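
The demultiplexing step that BarcodeCrucher automates (assigning raw reads to samples by their leading 10-bp multiplexing identifier, then trimming it) can be sketched as below. The function, MIDs and read strings are hypothetical; the real tool additionally performs quality-control checks and per-gene sorting:

```python
def demultiplex(reads, mid_to_sample, mid_length=10):
    """Assign each read to a sample by its leading MID and trim the tag.
    Reads whose prefix matches no known MID are collected separately."""
    by_sample = {sample: [] for sample in mid_to_sample.values()}
    unassigned = []
    for read in reads:
        sample = mid_to_sample.get(read[:mid_length])
        if sample is None:
            unassigned.append(read)
        else:
            by_sample[sample].append(read[mid_length:])
    return by_sample, unassigned

mids = {"ACGTACGTAC": "taxonA", "TGCATGCATG": "taxonB"}
reads = ["ACGTACGTAC" + "GGGTTT", "TGCATGCATG" + "CCCAAA", "NNNNNNNNNN" + "GGGCCC"]
by_sample, unassigned = demultiplex(reads, mids)
print(by_sample["taxonA"], len(unassigned))  # -> ['GGGTTT'] 1
```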

  20. Shedding light on the microbial community of the macropod foregut using 454-amplicon pyrosequencing.

    Science.gov (United States)

    Gulino, Lisa-Maree; Ouwerkerk, Diane; Kang, Alicia Y H; Maguire, Anita J; Kienzle, Marco; Klieve, Athol V

    2013-01-01

    Twenty macropods from five locations in Queensland, Australia, grazing on a variety of native pastures were surveyed and the bacterial community of the foregut was examined using 454-amplicon pyrosequencing. Specifically, the V3/V4 region of the 16S rRNA gene was examined. A total of 5040 OTUs were identified in the data set (post filtering). Thirty-two OTUs were identified as 'shared' OTUs (i.e. present in all samples) belonging to either Firmicutes or Bacteroidetes (Clostridiales/Bacteroidales). These phyla predominated in the general microbial community of all macropods. Genera represented within the shared OTUs included: unclassified Ruminococcaceae, unclassified Lachnospiraceae, unclassified Clostridiales, Peptococcus sp., Coprococcus spp., Streptococcus spp., Blautia sp., Ruminococcus sp., Eubacterium sp., Dorea sp., Oscillospira sp. and Butyrivibrio sp. The composition of the bacterial community of the foregut samples of each of the host species (Macropus rufus, Macropus giganteus and Macropus robustus) was significantly different, allowing differentiation between the host species based on alpha and beta diversity measures. Specifically, eleven dominant OTUs that separated the three host species were identified and classified as: unclassified Ruminococcaceae, unclassified Bacteroidales, Prevotella spp. and Syntrophococcus sucromutans. Putative reductive acetogens and fibrolytic bacteria were also identified in samples. Future work will investigate the presence and role of fibrolytics and acetogens in these ecosystems. Ideally, the isolation and characterization of these organisms will be used for enhanced feed efficiency in cattle, methane mitigation and potentially for other industries such as the biofuel industry.

  1. Network analysis of the microorganisms in 25 Danish wastewater treatment plants over 7 years using high-throughput amplicon sequencing

    DEFF Research Database (Denmark)

    Albertsen, Mads; Larsen, Poul; Saunders, Aaron Marc

    to link sludge and floc properties to the microbial communities. All data were subjected to extensive network analysis and multivariate statistics through R. The 16S amplicon results confirmed the findings of relatively few core groups of organisms shared by all the wastewater treatment plants......Wastewater treatment is the world's largest biotechnological process and a perfect model system for microbial ecology, as the habitat is well defined and replicated all over the world. Extensive investigations on Danish wastewater treatment plants using fluorescent in situ hybridization have...... a year, totaling over 400 samples. All samples were subjected to 16S rDNA amplicon sequencing using V13 primers on the Illumina MiSeq platform (2x300bp) to a depth of at least 20,000 quality-filtered reads per sample. The OTUs were assigned taxonomy based on a manually curated version of the greengenes

  2. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; phenomena biasing quantification; 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, results (attenuation, scattering, partial volume effect, motion, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET); 3 - Synthesis: achievable performance, know-how, precautions, beyond the activity measurement
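
Attenuation, the first correction listed, can be stated compactly. The count rate detected along a ray is reduced by the cumulative attenuation of the tissue traversed; in PET, because both annihilation photons must escape, the attenuation factor for a line of response L does not depend on where along L the emission occurred. A sketch of the standard model (notation is generic, not specific to this lecture):

```latex
% Attenuated detection along a line of response L:
N = N_0 \exp\!\left(-\int_L \mu(x)\,\mathrm{d}x\right)
```

where N_0 is the unattenuated count rate and μ(x) the linear attenuation coefficient along the ray. This position independence is what makes attenuation correction exact in PET given a transmission or CT-derived attenuation map.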

  3. Competitive amplification of differentially melting amplicons (CADMA) improves KRAS hotspot mutation testing in colorectal cancer

    Directory of Open Access Journals (Sweden)

    Kristensen Lasse

    2012-11-01

    Full Text Available Abstract Background Cancer is an extremely heterogeneous group of diseases traditionally categorized according to tissue of origin. However, even among patients with the same cancer subtype, the cellular alterations at the molecular level are often very different. Several new therapies targeting specific molecular changes found in individual patients have initiated the era of personalized therapy and significantly improved patient care. In metastatic colorectal cancer (mCRC), a selected group of patients with wild-type KRAS respond to antibodies against the epidermal growth factor receptor (EGFR). Testing for KRAS mutations is now required prior to anti-EGFR treatment; however, less sensitive methods based on conventional PCR regularly fail to detect KRAS mutations in clinical samples. Methods We have developed sensitive and specific assays for detection of the seven most common KRAS mutations based on a novel methodology named Competitive Amplification of Differentially Melting Amplicons (CADMA). The clinical applicability of these assays was assessed by analyzing 100 colorectal cancer samples, for which KRAS mutation status had been evaluated by the commercially available TheraScreen® KRAS mutation kit. Results The CADMA assays were sensitive to at least 0.5% mutant alleles in a wild-type background when using 50 nanograms of DNA in the reactions. Consensus between CADMA and the TheraScreen kit was observed in 96% of the colorectal cancer samples. In cases where disagreement was observed, the CADMA result could be confirmed by a previously published assay based on TaqMan probes and by fast COLD-PCR followed by Sanger sequencing. Conclusions The high analytical sensitivity and specificity of CADMA may increase the diagnostic sensitivity and specificity of KRAS mutation testing in mCRC patients.

  4. Amplicon-based semiconductor sequencing of human exomes: performance evaluation and optimization strategies.

    Science.gov (United States)

    Damiati, E; Borsani, G; Giacopuzzi, Edoardo

    2016-05-01

    The Ion Proton platform allows whole exome sequencing (WES) to be performed at low cost, providing rapid turnaround time and great flexibility. Products for WES on the Ion Proton system include the AmpliSeq Exome kit and the recently introduced HiQ sequencing chemistry. Here, we used gold-standard variants from the GIAB consortium to assess performance in variant identification, characterize the erroneous calls and develop a filtering strategy to reduce false positives. The AmpliSeq Exome kit captures a large fraction of bases (>94 %) in human CDS, ClinVar genes and ACMG genes, but with 2,041 (7 %), 449 (13 %) and 11 (19 %) genes not fully represented, respectively. Overall, 515 protein-coding genes contain hard-to-sequence regions, including 90 genes from ClinVar. Performance in variant detection was maximal at mean coverage >120×, while at 90× and 70× we measured a loss of variants of 3.2 and 4.5 %, respectively. WES using HiQ chemistry showed ~71/97.5 % sensitivity, ~37/2 % FDR and ~0.66/0.98 F1 score for indels and SNPs, respectively. The proposed low-, medium- or high-stringency filters reduced the amount of false positives by 10.2, 21.2 and 40.4 % for indels and 21.2, 41.9 and 68.2 % for SNPs, respectively. Amplicon-based WES on the Ion Proton platform using HiQ chemistry emerged as a competitive approach, with improved accuracy in variant identification. False-positive variants remain an issue for the Ion Torrent technology, but our filtering strategy can be applied to reduce erroneous variants.
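
A stringency filter of the kind proposed (discarding candidate calls below coverage, allele-fraction, and quality cutoffs) can be sketched as follows. The thresholds and field names are hypothetical placeholders, not the cutoffs used in the study:

```python
from typing import NamedTuple

class Variant(NamedTuple):
    chrom: str
    pos: int
    depth: int        # total reads covering the site
    alt_frac: float   # fraction of reads supporting the alternate allele
    qual: float       # call quality score

def stringency_filter(variants, min_depth=70, min_alt_frac=0.05, min_qual=30.0):
    """Keep variants passing all thresholds (hypothetical medium-stringency cutoffs)."""
    return [v for v in variants
            if v.depth >= min_depth
            and v.alt_frac >= min_alt_frac
            and v.qual >= min_qual]

calls = [
    Variant("chr7", 55249071, depth=180, alt_frac=0.32, qual=210.0),  # likely real
    Variant("chr12", 25398284, depth=45, alt_frac=0.04, qual=12.0),   # likely noise
]
print(len(stringency_filter(calls)))  # -> 1
```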

  5. Prerequisites for Amplicon Pyrosequencing of Microbial Methanol Utilizers in the Environment

    Directory of Open Access Journals (Sweden)

    Steffen eKolb

    2013-09-01

    Full Text Available The commercial availability of next generation sequencing (NGS) technologies facilitated the assessment of functional groups of microorganisms in the environment with high coverage, resolution, and reproducibility. Soil methylotrophs were among the first microorganisms in the environment to be assessed with molecular tools and, nowadays, with NGS technologies as well. Studies in the past years re-attracted notice to the pivotal role of methylotrophs in global conversions of methanol, which mainly originates from plants and is involved in oxidative reactions and ozone formation in the atmosphere. Aerobic methanol utilizers belong to Bacteria, yeasts, Ascomycota, and molds. Numerous bacterial methylotrophs are facultatively aerobic and also contribute to anaerobic methanol oxidation in the environment, whereas strictly anaerobic methanol utilizers belong to the methanogens and acetogens. The diversity of enzymes catalyzing the initial oxidation of methanol is considerable, and comprises at least five different enzyme types in aerobes and one in strict anaerobes. Only the gene of the large subunit of the PQQ-dependent methanol dehydrogenase (mxaF) has been analyzed by environmental pyrosequencing. To enable a comprehensive assessment of methanol utilizers in the environment, new primers need to be developed targeting the genes of the PQQ MDH in Methylibium (mdh2), of the NAD-dependent MDH (mdh), of the methanol oxidoreductase of Actinobacteria (mdo), of the fungal FAD-dependent alcohol oxidase (mod1, mod2, and homologues), and of the gene of the large subunit of the methanol:corrinoid methyltransferases (mtaC) in methanogens and acetogens. Combined stable isotope probing of nucleic acids or proteins with amplicon-based NGS is a straightforward approach to reveal insights into the functions of certain methylotrophic taxa in the global methanol cycle.

  6. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  7. De novo origin of VCY2 from autosome to Y-transposed amplicon.

    Directory of Open Access Journals (Sweden)

    Peng-Rong Cao

    Full Text Available The formation of new genes is a primary driving force of evolution in all organisms. The de novo evolution of new genes from non-protein-coding genomic regions is emerging as an important additional mechanism for novel gene creation. Y chromosomes underlie sex determination in mammals and contain genes that are required for male-specific functions. In this study, a search was undertaken for Y chromosome de novo genes derived from non-protein-coding sequences. The Y chromosome orphan gene variable charge, Y-linked (VCY2), is an autosome-derived gene that has sequence similarity to large autosomal fragments but lacks an autosomal protein-coding homolog. VCY2 is located in the amplicon containing long DNA fragments that were transposed from autosomes to the Y chromosome before the ape-monkey split. We confirmed that VCY2 cannot be encoded by autosomes due to the presence of multiple disablers that disrupt the open reading frame, such as the absence of start or stop codons and the presence of premature stop codons. Similar observations have been made for homologs in the autosomes of the chimpanzee, gorilla, rhesus macaque, baboon and out-group marmoset, which suggests that there was a non-protein-coding ancestral VCY2, common to apes and monkeys, that predated the transposition event. Furthermore, while protein-coding orthologs are absent, a putative non-protein-coding VCY2 with conserved disablers was identified in the rhesus macaque Y chromosome male-specific region. This finding implies that VCY2 might not have acquired its protein-coding ability before the ape-monkey split. VCY2 encodes a testis-specific expressed protein and is involved in the pathologic process of male infertility, and the acquisition of this gene might improve male fertility. 
This is the first evidence that de novo genes can be generated from transposed autosomal non-protein-coding segments, and this evidence provides novel insights into the evolutionary history of the Y

  8. Real-Time PCR Quantification of Chloroplast DNA Supports DNA Barcoding of Plant Species.

    Science.gov (United States)

    Kikkawa, Hitomi S; Tsuge, Kouichiro; Sugita, Ritsuko

    2016-03-01

    Species identification from extracted DNA is sometimes needed for botanical samples. DNA quantification is required for an accurate and effective examination. If a quantitative assay provides unreliable estimates, a higher quantity of DNA than the estimated amount may be used in additional analyses to avoid failure to analyze samples from which extracting DNA is difficult. Compared with conventional methods, real-time quantitative PCR (qPCR) requires a low amount of DNA and enables accurate quantification of dilute DNA solutions. The aim of this study was to develop a qPCR assay for quantification of chloroplast DNA from taxonomically diverse plant species. An absolute quantification method was developed using primers targeting the ribulose-1,5-bisphosphate carboxylase/oxygenase large subunit (rbcL) gene with SYBR Green I-based qPCR. The calibration curve was generated using the PCR amplicon as the template, and DNA extracts from representatives of 13 plant families common in Japan were quantified. This demonstrates that qPCR analysis is an effective method for quantification of DNA from plant samples. The results of qPCR assist in the decision-making that will determine the success or failure of DNA analysis, indicating the possibility of optimizing the procedure for downstream reactions.
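
    Absolute quantification against an amplicon standard curve of this kind is typically a linear fit of Ct versus log10 copy number; a minimal sketch with hypothetical dilution-series values (the efficiency relation E = 10^(-1/slope) - 1 is standard qPCR practice, not a figure from this record):

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
             / sum((x - mx) ** 2 for x in log10_copies))
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate template copy number."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency implied by the slope: E = 10^(-1/slope) - 1."""
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical 10-fold dilution series of an rbcL amplicon standard:
# (log10 copies, measured Ct)
standards = [(7, 15.0), (6, 18.32), (5, 21.64), (4, 24.96), (3, 28.28)]
logs, cts = zip(*standards)
slope, intercept = fit_standard_curve(logs, cts)
# slope = -3.32 corresponds to ~100 % amplification efficiency
```

    An unknown sample's Ct is then converted to copies with `copies_from_ct`; a Ct of 21.64 on this curve maps back to 10^5 copies.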

  9. Unraveling Core Functional Microbiota in Traditional Solid-State Fermentation by High-Throughput Amplicons and Metatranscriptomics Sequencing.

    Science.gov (United States)

    Song, Zhewei; Du, Hai; Zhang, Yan; Xu, Yan

    2017-01-01

    Fermentation microbiota are the specific microorganisms that generate different types of metabolites in many productions. In traditional solid-state fermentation, the structural composition and functional capacity of the core microbiota determine the quality and quantity of products. As a typical example of food fermentation, Chinese Maotai-flavor liquor production involves a complex of various microorganisms and a wide variety of metabolites. However, the microbial succession and functional shift of the core microbiota in this traditional food fermentation remain unclear. Here, high-throughput amplicon (16S rRNA gene amplicon sequencing and internal transcribed spacer amplicon sequencing) and metatranscriptomics sequencing technologies were combined to reveal the structure and function of the core microbiota in Chinese soy sauce aroma type liquor production. In addition, ultra-performance liquid chromatography and headspace solid-phase microextraction-gas chromatography-mass spectrometry were employed to provide qualitative and quantitative analysis of the major flavor metabolites. A total of 10 fungal and 11 bacterial genera were identified as the core microbiota. In addition, metatranscriptomic analysis revealed that pyruvate metabolism in yeasts (genera Pichia, Schizosaccharomyces, Saccharomyces, and Zygosaccharomyces) and lactic acid bacteria (genus Lactobacillus) could be classified into two stages in the production of flavor components. Stage I involved high-level alcohol (ethanol) production, with the genus Schizosaccharomyces serving as the core functional microorganism. Stage II involved high-level acid (lactic acid and acetic acid) production, with the genus Lactobacillus serving as the core functional microorganism. The functional shift from the genus Schizosaccharomyces to the genus Lactobacillus drives flavor component conversion from alcohol (ethanol) to acid (lactic acid and acetic acid) in Chinese Maotai-flavor liquor production. 
Our findings provide insight into

  10. Unraveling Core Functional Microbiota in Traditional Solid-State Fermentation by High-Throughput Amplicons and Metatranscriptomics Sequencing

    Directory of Open Access Journals (Sweden)

    Zhewei Song

    2017-07-01

    Full Text Available Fermentation microbiota are the specific microorganisms that generate different types of metabolites in many productions. In traditional solid-state fermentation, the structural composition and functional capacity of the core microbiota determine the quality and quantity of products. As a typical example of food fermentation, Chinese Maotai-flavor liquor production involves a complex of various microorganisms and a wide variety of metabolites. However, the microbial succession and functional shift of the core microbiota in this traditional food fermentation remain unclear. Here, high-throughput amplicon (16S rRNA gene amplicon sequencing and internal transcribed spacer amplicon sequencing) and metatranscriptomics sequencing technologies were combined to reveal the structure and function of the core microbiota in Chinese soy sauce aroma type liquor production. In addition, ultra-performance liquid chromatography and headspace solid-phase microextraction-gas chromatography-mass spectrometry were employed to provide qualitative and quantitative analysis of the major flavor metabolites. A total of 10 fungal and 11 bacterial genera were identified as the core microbiota. In addition, metatranscriptomic analysis revealed that pyruvate metabolism in yeasts (genera Pichia, Schizosaccharomyces, Saccharomyces, and Zygosaccharomyces) and lactic acid bacteria (genus Lactobacillus) could be classified into two stages in the production of flavor components. Stage I involved high-level alcohol (ethanol) production, with the genus Schizosaccharomyces serving as the core functional microorganism. Stage II involved high-level acid (lactic acid and acetic acid) production, with the genus Lactobacillus serving as the core functional microorganism. The functional shift from the genus Schizosaccharomyces to the genus Lactobacillus drives flavor component conversion from alcohol (ethanol) to acid (lactic acid and acetic acid) in Chinese Maotai-flavor liquor production. Our findings provide

  11. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Next-generation sequencing of multiple individuals per barcoded library by deconvolution of sequenced amplicons using endonuclease fragment analysis

    DEFF Research Database (Denmark)

    Andersen, Jeppe D; Pereira, Vania; Pietroni, Carlotta

    2014-01-01

    The simultaneous sequencing of samples from multiple individuals increases the efficiency of next-generation sequencing (NGS) while also reducing costs. Here we describe a novel and simple approach for sequencing DNA from multiple individuals per barcode. Our strategy relies on the endonuclease digestion of PCR amplicons prior to library preparation, creating a specific fragment pattern for each individual that can be resolved after sequencing. By using both barcodes and restriction fragment patterns, we demonstrate the ability to sequence the human melanocortin 1 receptor (MC1R) genes from 72 individuals using only 24 barcoded libraries.
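
    The arithmetic behind the approach (24 barcoded libraries resolving 72 individuals) works out if each barcode is shared by three individuals whose amplicons yield distinct digest patterns; a hypothetical sketch, with made-up barcode names and fragment sizes:

```python
from itertools import product

# Hypothetical setup: 24 barcoded libraries, each shared by 3 individuals
# who are distinguished by the restriction fragment pattern of their amplicon.
barcodes = [f"BC{i:02d}" for i in range(1, 25)]      # 24 barcodes
patterns = [(120, 310), (180, 250), (430,)]          # 3 digest patterns (bp)

# 24 barcodes x 3 patterns -> 72 resolvable individuals
individuals = {combo: f"ind{n:02d}"
               for n, combo in enumerate(product(barcodes, patterns), start=1)}

def assign_read(barcode, fragment_lengths):
    """Resolve a sequenced amplicon back to an individual, or None."""
    return individuals.get((barcode, tuple(fragment_lengths)))
```

    The deconvolution step after sequencing is then a simple lookup on the observed (barcode, fragment-pattern) pair.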

  13. Analysis of intra-host genetic diversity of Prunus necrotic ringspot virus (PNRSV) using amplicon next generation sequencing.

    Science.gov (United States)

    Kinoti, Wycliff M; Constable, Fiona E; Nancarrow, Narelle; Plummer, Kim M; Rodoni, Brendan

    2017-01-01

    PCR amplicon next generation sequencing (NGS) analysis offers a broadly applicable and targeted approach to detect populations of both high- and low-frequency virus variants in one or more plant samples. In this study, amplicon NGS was used to explore the diversity of the tripartite genome virus Prunus necrotic ringspot virus (PNRSV) from 53 PNRSV-infected trees using amplicons from conserved gene regions of each of PNRSV RNA1, RNA2 and RNA3. Sequencing of the amplicons from the 53 PNRSV-infected trees revealed differing levels of polymorphism across the three components of the PNRSV genome, with a total of 5040, 2083 and 5486 sequence variants observed for RNA1, RNA2 and RNA3, respectively. RNA2 had the lowest sequence diversity compared to RNA1 and RNA3, reflecting the lack of flexibility tolerated by the replicase gene encoded by this RNA component. Distinct PNRSV phylo-groups, consisting of closely related clusters of sequence variants, were observed in each of PNRSV RNA1, RNA2 and RNA3. Most plant samples had a single phylo-group for each RNA component. Haplotype network analysis showed that smaller clusters of PNRSV sequence variants were genetically connected to the largest sequence variant cluster within a phylo-group of each RNA component. Some plant samples had sequence variants occurring in multiple PNRSV phylo-groups in at least one RNA component, and these phylo-groups formed distinct clades that represent PNRSV genetic strains. Variants within the same phylo-group of each Prunus plant sample had ≥97% similarity, and phylo-groups within a Prunus plant sample and between samples had less than 97% similarity. Based on the analysis of diversity, a definition of a PNRSV genetic strain was proposed. The proposed definition was applied to determine the number of PNRSV genetic strains in each of the plant samples, and the complexity of defining genetic strains in multipartite genome viruses was explored.
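
    The ≥97% / <97% similarity rule described above amounts to similarity-threshold clustering of variants; a toy sketch of one possible (greedy) interpretation, not the authors' actual pipeline:

```python
def identity(a, b):
    """Fraction of matching positions between two equal-length aligned sequences."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def greedy_phylo_groups(seqs, threshold=0.97):
    """Greedily cluster variants: a sequence joins the first group whose
    founder it matches at >= threshold identity, otherwise seeds a new group."""
    groups = []  # list of (founder_sequence, member_list)
    for s in seqs:
        for founder, members in groups:
            if identity(s, founder) >= threshold:
                members.append(s)
                break
        else:
            groups.append((s, [s]))
    return groups
```

    With a 0.97 threshold, a variant two mismatches away from a 100-bp founder (98% identity) joins that founder's group, while one ten mismatches away (90%) seeds a new phylo-group.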

  14. Analysis of intra-host genetic diversity of Prunus necrotic ringspot virus (PNRSV using amplicon next generation sequencing.

    Directory of Open Access Journals (Sweden)

    Wycliff M Kinoti

    Full Text Available PCR amplicon next generation sequencing (NGS) analysis offers a broadly applicable and targeted approach to detect populations of both high- and low-frequency virus variants in one or more plant samples. In this study, amplicon NGS was used to explore the diversity of the tripartite genome virus Prunus necrotic ringspot virus (PNRSV) from 53 PNRSV-infected trees using amplicons from conserved gene regions of each of PNRSV RNA1, RNA2 and RNA3. Sequencing of the amplicons from the 53 PNRSV-infected trees revealed differing levels of polymorphism across the three components of the PNRSV genome, with a total of 5040, 2083 and 5486 sequence variants observed for RNA1, RNA2 and RNA3, respectively. RNA2 had the lowest sequence diversity compared to RNA1 and RNA3, reflecting the lack of flexibility tolerated by the replicase gene encoded by this RNA component. Distinct PNRSV phylo-groups, consisting of closely related clusters of sequence variants, were observed in each of PNRSV RNA1, RNA2 and RNA3. Most plant samples had a single phylo-group for each RNA component. Haplotype network analysis showed that smaller clusters of PNRSV sequence variants were genetically connected to the largest sequence variant cluster within a phylo-group of each RNA component. Some plant samples had sequence variants occurring in multiple PNRSV phylo-groups in at least one RNA component, and these phylo-groups formed distinct clades that represent PNRSV genetic strains. Variants within the same phylo-group of each Prunus plant sample had ≥97% similarity, and phylo-groups within a Prunus plant sample and between samples had less than 97% similarity. Based on the analysis of diversity, a definition of a PNRSV genetic strain was proposed. The proposed definition was applied to determine the number of PNRSV genetic strains in each of the plant samples, and the complexity of defining genetic strains in multipartite genome viruses was explored.

  15. Sequence-specific validation of LAMP amplicons in real-time optomagnetic detection of Dengue serotype 2 synthetic DNA

    DEFF Research Database (Denmark)

    Minero, Gabriel Khose Antonio; Nogueira, Catarina; Rizzi, Giovanni

    2017-01-01

    We report on an optomagnetic technique optimised for real-time molecular detection of Dengue fever virus under ideal as well as non-ideal laboratory conditions using two different detection approaches. The first approach is based on detection of the hydrodynamic volume of streptavidin-coated magnetic nanoparticles attached to biotinylated LAMP amplicons. We demonstrate detection of sub-femtomolar Dengue DNA target concentrations in the ideal contamination-free lab environment within 20 min. The second detection approach is based on sequence-specific binding of functionalised magnetic ... claim detection of down to 100 fM of Dengue target after 20 min of LAMP with a contamination background.

  16. Assessment of DNA degradation induced by thermal and UV radiation processing: implications for quantification of genetically modified organisms.

    Science.gov (United States)

    Ballari, Rajashekhar V; Martin, Asha

    2013-12-01

    DNA quality is an important parameter for the detection and quantification of genetically modified organisms (GMOs) using the polymerase chain reaction (PCR). Food processing leads to degradation of DNA, which may impair GMO detection and quantification. This study evaluated the effect of various processing treatments such as heating, baking, microwaving, autoclaving and ultraviolet (UV) irradiation on the relative transgenic content of MON 810 maize using pRSETMON-02, a dual-target plasmid, as a model system. Amongst all the processing treatments examined, autoclaving and UV irradiation resulted in the least recovery of the transgenic (CaMV 35S promoter) and taxon-specific (zein) target DNA sequences. Although a profound impact on DNA degradation was seen during processing, DNA could still be reliably quantified by real-time PCR. The measured mean DNA copy number ratios of the processed samples were in agreement with the expected values. Our study confirms the premise that the final analytical value assigned to a particular sample is independent of the degree of DNA degradation, since transgenic and taxon-specific target sequences of approximately similar lengths degrade in parallel. The results of our study demonstrate that food processing does not alter the relative quantification of the transgenic content, provided the quantitative assays target shorter amplicons and the difference in amplicon size between the transgenic and taxon-specific genes is minimal. Copyright © 2013 Elsevier Ltd. All rights reserved.
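
    The conclusion rests on the ratio of transgene to taxon-specific copy numbers being invariant when both targets degrade in parallel; a minimal illustration with hypothetical copy numbers:

```python
def gmo_percent(transgene_copies, taxon_copies):
    """Relative transgenic content expressed as a copy-number ratio (%)."""
    return 100.0 * transgene_copies / taxon_copies

# Hypothetical example: processing destroys 90 % of BOTH targets
# (similar amplicon lengths -> parallel degradation), so the ratio holds.
unprocessed = gmo_percent(5_000, 100_000)   # 5.0 %
autoclaved = gmo_percent(500, 10_000)       # still 5.0 %
```

    If the two amplicons differed greatly in length, the longer one would degrade faster and the ratio, and hence the reported GMO content, would drift.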

  17. Improved Efficiency and Reliability of NGS Amplicon Sequencing Data Analysis for Genetic Diagnostic Procedures Using AGSA Software

    Directory of Open Access Journals (Sweden)

    Axel Poulet

    2016-01-01

    Full Text Available Screening for BRCA mutations in women with familial risk of breast or ovarian cancer is an ideal situation for high-throughput sequencing, providing large amounts of low-cost data. However, 454 (Roche) and Ion Torrent (Thermo Fisher) technologies produce homopolymer-associated indel errors, complicating their use in routine diagnostics. We developed software, named AGSA, which helps to detect false positive mutations in homopolymeric sequences. Seventy-two familial breast cancer cases were analysed in parallel by amplicon 454 pyrosequencing and Sanger dideoxy sequencing for genetic variations of the BRCA genes. All 565 variants detected by dideoxy sequencing were also detected by pyrosequencing. Furthermore, pyrosequencing detected 42 variants that were missed with the Sanger technique. Six amplicons contained homopolymer tracts in the coding sequence that were systematically misread by the software supplied by Roche. Read data plotted as histograms by the AGSA software aided the analysis considerably and allowed validation of the majority of homopolymers. As an optimisation, an additional 250 patients were analysed using microfluidic amplification of regions of interest (Access Array, Fluidigm) of the BRCA genes, followed by 454 sequencing and AGSA analysis. AGSA complements a complete line of high-throughput diagnostic sequence analysis, reducing time and costs while increasing reliability, notably for homopolymer tracts.

  18. A Portable Automatic Endpoint Detection System for Amplicons of Loop Mediated Isothermal Amplification on Microfluidic Compact Disk Platform

    Directory of Open Access Journals (Sweden)

    Shah Mukim Uddin

    2015-03-01

    Full Text Available In recent years, many improvements have been made in foodborne pathogen detection methods to reduce the impact of food contamination. Several rapid methods have been developed with biosensor devices to improve the way of performing pathogen detection. This paper presents an automated endpoint detection system for amplicons generated by loop mediated isothermal amplification (LAMP) on a microfluidic compact disk platform. The developed detection system utilizes a monochromatic ultraviolet (UV) emitter for excitation of fluorescent-labeled LAMP amplicons and a color sensor to detect the emitted fluorescence from the target. It then processes the sensor output and displays the detection results on a liquid crystal display (LCD). A sensitivity test performed with different DNA concentrations of Salmonella bacteria showed a detection limit down to 2.5 × 10⁻³ ng/µL. This system allows rapid and automatic endpoint detection, which could lead to the development of a point-of-care diagnosis device for foodborne pathogen detection in a resource-limited environment.
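
    An endpoint call of this kind is commonly a threshold test against the blank signal; a hypothetical sketch (the blank statistics, sensor readings and 3-sigma rule are illustrative assumptions; only the 2.5 × 10⁻³ ng/µL detection limit comes from the abstract):

```python
def endpoint_positive(reading, blank_mean, blank_sd, k=3.0):
    """Hypothetical decision rule: call a LAMP reaction positive when the
    endpoint colour-sensor reading exceeds blank mean + k standard deviations."""
    return reading > blank_mean + k * blank_sd

# Hypothetical endpoint readings for a 10-fold dilution series (ng/uL -> a.u.)
blank_mean, blank_sd = 100.0, 5.0            # threshold = 115.0 a.u.
readings = {2.5: 420.0, 0.25: 380.0, 0.025: 300.0, 0.0025: 130.0, 0.00025: 104.0}
calls = {conc: endpoint_positive(r, blank_mean, blank_sd)
         for conc, r in readings.items()}
# Only the sample below 2.5e-3 ng/uL fails the threshold, matching the
# reported detection limit in this made-up series.
```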

  19. CLOTU: An online pipeline for processing and clustering of 454 amplicon reads into OTUs followed by taxonomic annotation

    Directory of Open Access Journals (Sweden)

    Shalchian-Tabrizi Kamran

    2011-05-01

    Full Text Available Abstract Background The implementation of high-throughput sequencing for exploring biodiversity poses high demands on bioinformatics applications for automated data processing. Here we introduce CLOTU, an online and open-access pipeline for processing 454 amplicon reads. CLOTU has been constructed to be highly user-friendly and flexible, since different types of analyses are needed for different datasets. Results In CLOTU, the user can filter out low-quality sequences; trim tags, primers, and adaptors; perform clustering of sequence reads; and run BLAST against NCBInr or a customized database in a high-performance computing environment. The resulting data may be browsed in a user-friendly manner and easily forwarded to downstream analyses. Although CLOTU is specifically designed for analyzing 454 amplicon reads, other types of DNA sequence data can also be processed. A fungal ITS sequence dataset generated by 454 sequencing of environmental samples is used to demonstrate the utility of CLOTU. Conclusions CLOTU is a flexible and easy-to-use bioinformatics pipeline that includes different options for filtering, trimming, clustering and taxonomic annotation of high-throughput sequence reads. Some of these options are not included in comparable pipelines. CLOTU is implemented in a Linux computer cluster and is freely accessible to academic users through the Bioportal web-based bioinformatics service (http://www.bioportal.uio.no).
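
    The filtering and trimming steps that open such a pipeline can be sketched as follows; a simplified stand-in for CLOTU's actual options, with hypothetical quality and length defaults:

```python
def trim_and_filter(read, quals, primer, tag_len=0, min_q=25.0, min_len=150):
    """Remove a leading tag and primer, then discard the read if it is too
    short or its mean Phred quality falls below min_q (hypothetical defaults)."""
    read, quals = read[tag_len:], quals[tag_len:]
    if not read.startswith(primer):
        return None                                  # primer mismatch -> discard
    read, quals = read[len(primer):], quals[len(primer):]
    if len(read) < min_len or sum(quals) / len(quals) < min_q:
        return None                                  # too short or low quality
    return read
```

    Reads surviving this step would then go on to clustering and BLAST-based annotation.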

  20. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  1. Identification of novel candidate target genes in amplicons of Glioblastoma multiforme tumors detected by expression and CGH microarray profiling

    Directory of Open Access Journals (Sweden)

    Hernández-Moneo Jose-Luis

    2006-09-01

    Full Text Available Abstract Background Conventional cytogenetic and comparative genomic hybridization (CGH) studies in brain malignancies have shown that glioblastoma multiforme (GBM) is characterized by complex structural and numerical alterations. However, the limited resolution of these techniques has precluded the precise identification of detailed specific gene copy number alterations. Results We performed a genome-wide survey of gene copy number changes in 20 primary GBMs by CGH on cDNA microarrays. A novel amplicon at 4p15, and previously uncharacterized amplicons at 13q32-34 and 1q32, were detected and are analyzed here. These amplicons contained amplified genes not previously reported. Other amplified regions containing well-known oncogenes in GBMs were also detected at 7p12 (EGFR), 7q21 (CDK6), 4q12 (PDGFRA), and 12q13-15 (MDM2 and CDK4). In order to identify the putative target genes of the amplifications, and to determine the changes in gene expression levels associated with copy number change events, we carried out parallel gene expression profiling analyses using the same cDNA microarrays. We detected overexpression of the novel amplified genes SLA/LP and STIM2 (4p15), and TNFSF13B and COL4A2 (13q32-34). Some of the candidate target genes of amplification (EGFR, CDK6, MDM2, CDK4, and TNFSF13B) were tested in an independent set of 111 primary GBMs by using FISH and immunohistological assays. The novel candidate 13q-amplification target TNFSF13B was amplified in 8% of the tumors, and showed protein expression in 20% of the GBMs. Conclusion This high-resolution analysis allowed us to propose novel candidate target genes, such as STIM2 at 4p15, and TNFSF13B or COL4A2 at 13q32-34, that could potentially contribute to the pathogenesis of these tumors and which would require further investigation. 
We showed that overexpression of the amplified genes could be attributable to gene dosage and speculate that deregulation of those genes could be important in the development

  2. Analysis of 16S rRNA amplicon sequencing options on the Roche/454 next-generation titanium sequencing platform.

    Directory of Open Access Journals (Sweden)

    Hideyuki Tamaki

    Full Text Available BACKGROUND: The 16S rRNA gene pyrosequencing approach has revolutionized studies in microbial ecology. While primer selection and short read length can affect the resulting microbial community profile, little is known about the influence of pyrosequencing methods on sequencing throughput and the outcome of microbial community analyses. The aim of this study is to compare differences in output, ease, and cost among three different amplicon pyrosequencing methods for the Roche/454 Titanium platform. METHODOLOGY/PRINCIPAL FINDINGS: The following three pyrosequencing methods for 16S rRNA genes were selected in this study: Method-1 (the standard method) is the recommended method for bi-directional sequencing using the LIB-A kit; Method-2 is a new option designed in this study for unidirectional sequencing with the LIB-A kit; and Method-3 uses the LIB-L kit for unidirectional sequencing. In our comparison among these three methods using 10 different environmental samples, Method-2 and Method-3 produced 1.5-1.6 times more useable reads than the standard method (Method-1) after quality-based trimming, and did not compromise the outcome of microbial community analyses. Specifically, Method-3 is the most cost-effective unidirectional amplicon sequencing method, as it provided the most reads and required the least effort in consumables management. CONCLUSIONS: Our findings clearly demonstrated that alternative pyrosequencing methods for 16S rRNA genes can drastically affect sequencing output (e.g. number of reads before and after trimming) but have little effect on the outcomes of microbial community analysis. This finding is important for both researchers and sequencing facilities utilizing 16S rRNA gene pyrosequencing for microbial ecological studies.

  3. Large-scale benchmarking reveals false discoveries and count transformation sensitivity in 16S rRNA gene amplicon data analysis methods used in microbiome studies

    DEFF Research Database (Denmark)

    Thorsen, Jonathan; Brejnrod, Asker Daniel; Mortensen, Martin Steen

    2016-01-01

    BACKGROUND: There is an immense scientific interest in the human microbiome and its effects on human physiology, health, and disease. A common approach for examining bacterial communities is high-throughput sequencing of 16S rRNA gene hypervariable regions, aggregating sequence-similar amplicons...

  4. Prostate-Specific and Tumor-Specific Targeting of an Oncolytic HSV-1 Amplicon/Helper Virus for Prostate Cancer Treatment

    Science.gov (United States)

    2009-11-01

    regulation; HSV-1 amplicon and recombinant viruses; molecular cloning; cell culture/gene transfection; xenograft mouse model; histology ... no herpetic lesions were seen in CMV-ICP4-143T–treated and CMV-ICP4-145T–treated animals, although some gastritis developed 28 days after the viral

  5. A need for standardization in drinking water analysis – an investigation of DNA extraction procedure, primer choice and detection limit of 16S rRNA amplicon sequencing

    DEFF Research Database (Denmark)

    Brandt, Jakob; Nielsen, Per Halkjær; Albertsen, Mads

    have been made to illuminate the effects specifically related to bacterial communities in drinking water. In this study, we investigated the impact of the DNA extraction and primer choice on the observed community structure, and we also estimated the detection limit of the 16S rRNA amplicon sequencing...

  6. Optimisation of 16S rDNA amplicon sequencing protocols for microbial community profiling of anaerobic digesters

    DEFF Research Database (Denmark)

    Kirkegaard, Rasmus Hansen; McIlroy, Simon Jon; Larsen, Poul

    A reliable and reproducible method for identification and quantification of the microorganisms involved in biogas production is important for the study and understanding of the microbial communities responsible for the function of anaerobic digester systems. DNA based identification using 16S rRN...

  7. Obtaining representative community profiles of anaerobic digesters through optimisation of 16S rRNA amplicon sequencing protocols

    DEFF Research Database (Denmark)

    Kirkegaard, Rasmus Hansen; McIlroy, Simon Jon; Karst, Søren Michael

    A reliable and reproducible method for identification and quantification of the microorganisms involved in biogas production is important for the study and understanding of the microbial communities responsible for the function of anaerobic digester systems. DNA based identification using 16S r...

  8. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and provides importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because it combines all event tree and fault tree models, and it requires an efficient computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the use of KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs
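
The quantification step described above can be sketched with a toy minimal-cut-set calculation. This is not KIRAP's algorithm; all event names, probabilities and the rare-event-approximation setup below are invented for illustration.

```python
# Hypothetical sketch of accident-sequence quantification from minimal cut
# sets, in the spirit of PSA codes such as KIRAP (names and numbers invented).
# Each cut set is a set of basic-event names; each basic event has a failure
# probability. The rare-event approximation sums the cut-set products.

def cut_set_probability(cut_set, basic_event_probs):
    """Probability of one minimal cut set: product of its basic events."""
    p = 1.0
    for event in cut_set:
        p *= basic_event_probs[event]
    return p

def sequence_frequency(initiating_event_freq, cut_sets, basic_event_probs):
    """Rare-event approximation: initiator frequency times sum over cut sets."""
    total = sum(cut_set_probability(cs, basic_event_probs) for cs in cut_sets)
    return initiating_event_freq * total

# Illustrative (invented) data: two minimal cut sets for one accident sequence.
probs = {"PUMP_A_FAILS": 1e-3, "PUMP_B_FAILS": 1e-3, "VALVE_STUCK": 5e-4}
cut_sets = [{"PUMP_A_FAILS", "PUMP_B_FAILS"}, {"VALVE_STUCK"}]
cdf_contribution = sequence_frequency(0.1, cut_sets, probs)  # per year
```

A production PSA code would additionally truncate low-probability cut sets and propagate data uncertainties, which this sketch omits.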

  9. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and provides importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because it combines all event tree and fault tree models, and it requires an efficient computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the use of KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs.

  10. Direct quantification of fungal DNA from soil substrate using real-time PCR.

    Science.gov (United States)

    Filion, Martin; St-Arnaud, Marc; Jabaji-Hare, Suha H

    2003-04-01

    Detection and quantification of genomic DNA from two ecologically different fungi, the plant pathogen Fusarium solani f. sp. phaseoli and the arbuscular mycorrhizal fungus Glomus intraradices, were achieved from soil substrate. Specific primers targeting a 362-bp fragment from the SSU rRNA gene region of G. intraradices and a 562-bp fragment from the F. solani f. sp. phaseoli translation elongation factor 1 alpha gene were used in real-time polymerase chain reaction (PCR) assays in conjunction with the fluorescent SYBR® Green I dye. Standard curves showed a linear relation (r² = 0.999) between log values of fungal genomic DNA of each species and real-time PCR threshold cycles, and were quantitative over 4-5 orders of magnitude. Real-time PCR assays were applied to in vitro-produced fungal structures and to sterile and non-sterile soil substrate seeded with known propagule numbers of either fungus. Detection and genomic DNA quantification were obtained from the different treatments, while no amplicon was detected from non-seeded non-sterile soil samples, confirming the absence of cross-reactivity with the soil microflora DNA. A significant correlation was found between the amount of genomic DNA of F. solani f. sp. phaseoli or G. intraradices detected and the number of fungal propagules present in seeded soil substrate. The DNA extraction protocol and real-time PCR quantification assay can be performed in less than 2 h and are adaptable to the detection and quantification of genomic DNA from other soilborne fungi.
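
The standard-curve quantification this abstract relies on (a linear relation between log template amount and threshold cycle) can be sketched as follows; the dilution series and Ct values below are idealised, invented numbers, not the paper's data.

```python
# Minimal sketch of real-time PCR standard-curve quantification (illustrative
# numbers only). A dilution series of known template amounts gives a line
# Ct = slope * log10(amount) + intercept; unknowns are read off by inverting it.
import math

def fit_standard_curve(amounts, cts):
    """Least-squares fit of Ct against log10(template amount)."""
    xs = [math.log10(a) for a in amounts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def quantify(ct, slope, intercept):
    """Invert the standard curve: Ct -> template amount (same units)."""
    return 10 ** ((ct - intercept) / slope)

# Ideal 10-fold dilution series (100% efficiency gives slope ~ -3.32).
amounts = [1e1, 1e2, 1e3, 1e4, 1e5]          # pg of genomic DNA (invented)
cts = [30.0, 26.68, 23.36, 20.04, 16.72]
slope, intercept = fit_standard_curve(amounts, cts)
unknown = quantify(25.0, slope, intercept)    # ~3.2e2 pg
```

The slope of such a curve also reports amplification efficiency (a slope of about -3.32 corresponds to doubling each cycle).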

  11. The practical analysis of food: the development of Sakalar quantification table of DNA (SQT-DNA).

    Science.gov (United States)

    Sakalar, Ergün

    2013-11-15

    A practical and highly sensitive Sakalar quantification table of DNA (SQT-DNA) has been developed for determining the percentage of species-specific DNA in food products. Cycle threshold (Ct) data were obtained from multiple real-time qPCR curves. Statistical analysis was performed to estimate the concentrations of the standard dilutions. Amplicon concentrations at each Ct value were assessed against predictions for targets at known concentrations. The SQT-DNA was prepared from the percentages corresponding to each Ct value. The applicability of SQT-DNA to commercial foods was demonstrated using sausages containing varying ratios of beef, chicken, and soybean. The results showed that SQT-DNA can be used to quantify food DNA directly by a single PCR, without the need to construct a standard curve in parallel with the samples every time the experiment is performed, and that quantification by SQT-DNA is as reliable as standard-curve quantification over a wide range of DNA concentrations. Copyright © 2013 Elsevier Ltd. All rights reserved.
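
The table-lookup idea behind SQT-DNA (reading a DNA percentage directly from a precomputed Ct table instead of running a standard curve with every experiment) might be sketched like this; the table entries and the interpolation scheme are assumptions for illustration, not the published SQT-DNA values.

```python
# Hedged sketch of a Ct -> DNA-percentage lookup table (invented values).
# A precomputed table replaces the per-run standard curve: an observed Ct is
# converted to a percentage by log-linear interpolation between table rows.
import math

# (Ct, percent of target-species DNA) pairs, sorted by increasing Ct.
SQT_TABLE = [(20.0, 100.0), (23.3, 10.0), (26.6, 1.0), (29.9, 0.1)]

def percent_from_ct(ct, table=SQT_TABLE):
    """Piecewise log-linear interpolation of the percentage between rows."""
    if ct <= table[0][0]:
        return table[0][1]
    if ct >= table[-1][0]:
        return table[-1][1]
    for (c1, p1), (c2, p2) in zip(table, table[1:]):
        if c1 <= ct <= c2:
            frac = (ct - c1) / (c2 - c1)
            return 10 ** (math.log10(p1) + frac * (math.log10(p2) - math.log10(p1)))

beef_percent = percent_from_ct(24.95)   # halfway between the 10% and 1% rows
```

Because the table is log-linear in percentage, a Ct halfway between two rows maps to the geometric mean of their percentages.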

  12. Metagenomic Analysis of Slovak Bryndza Cheese Using Next-Generation 16S rDNA Amplicon Sequencing

    Directory of Open Access Journals (Sweden)

    Planý Matej

    2016-06-01

    Knowledge about the diversity and taxonomic structure of the microbial population present in traditional fermented foods plays a key role in starter culture selection, safety improvement and quality enhancement of the end product. The aim of this study was to investigate the composition of the microbial consortia in Slovak bryndza cheese. For this purpose, we used a culture-independent approach based on 16S rDNA amplicon sequencing on a next-generation sequencing platform. Results obtained by the analysis of three commercial (produced on an industrial scale in the winter season) and one traditional (artisanal, most valued, produced in May) Slovak bryndza cheese samples were compared. A diverse prokaryotic microflora composed mostly of the genera Lactococcus, Streptococcus, Lactobacillus, and Enterococcus was identified. Lactococcus lactis subsp. lactis and Lactococcus lactis subsp. cremoris were the dominant taxa in all tested samples. The second most abundant species, detected in all bryndza cheeses, were Lactococcus fujiensis and Lactococcus taiwanensis, identified independently by two different approaches using different reference 16S rRNA gene databases (Greengenes and NCBI, respectively). They have been detected in bryndza cheese samples in substantial amounts for the first time. The narrowest microbial diversity was observed in a sample made with a starter culture from pasteurised milk. Metagenomic analysis by high-throughput sequencing of 16S rRNA genes seems to be a powerful tool for studying the structure of the microbial population in cheeses.

  13. Fungi Sailing the Arctic Ocean: Speciose Communities in North Atlantic Driftwood as Revealed by High-Throughput Amplicon Sequencing.

    Science.gov (United States)

    Rämä, Teppo; Davey, Marie L; Nordén, Jenni; Halvorsen, Rune; Blaalid, Rakel; Mathiassen, Geir H; Alsos, Inger G; Kauserud, Håvard

    2016-08-01

    High amounts of driftwood sail across the oceans and provide habitat for organisms tolerating the rough and saline environment. Fungi have adapted to the extremely cold and saline conditions which driftwood faces in the high north. For the first time, we applied high-throughput sequencing to fungi residing in driftwood to reveal their taxonomic richness, community composition, and ecology in the North Atlantic. Using pyrosequencing of ITS2 amplicons obtained from 49 marine logs, we found 807 fungal operational taxonomic units (OTUs) based on clustering at 97 % sequence similarity cut-off level. The phylum Ascomycota comprised 74 % of the OTUs and 20 % belonged to Basidiomycota. The richness of basidiomycetes decreased with prolonged submersion in the sea, supporting the general view of ascomycetes being more extremotolerant. However, more than one fourth of the fungal OTUs remained unassigned to any fungal class, emphasising the need for better DNA reference data from the marine habitat. Different fungal communities were detected in coniferous and deciduous logs. Our results highlight that driftwood hosts a considerably higher fungal diversity than currently known. The driftwood fungal community is not a terrestrial relic but a speciose assemblage of fungi adapted to the stressful marine environment and different kinds of wooden substrates found in it.
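
Clustering reads into OTUs at a 97% sequence similarity cut-off, as done here for the ITS2 amplicons, can be illustrated with a toy greedy clusterer. Real tools (USEARCH/VSEARCH-style clustering) use optimised alignments and abundance-sorted input; this sketch only compares equal-length invented sequences.

```python
# Toy greedy OTU clustering at a 97% identity threshold (illustrative only).
# Each read joins the first existing centroid it matches at >= threshold;
# otherwise it founds a new OTU.

def identity(a, b):
    """Fraction of matching positions between two equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def greedy_otu_cluster(seqs, threshold=0.97):
    """Return (centroids, per-read OTU assignments)."""
    centroids, assignments = [], []
    for s in seqs:
        for i, c in enumerate(centroids):
            if identity(s, c) >= threshold:
                assignments.append(i)
                break
        else:                                  # no centroid matched
            centroids.append(s)
            assignments.append(len(centroids) - 1)
    return centroids, assignments

reads = ["ACGT" * 25,                # 100-bp read, becomes centroid of OTU 0
         "ACGT" * 24 + "ACGA",       # one mismatch -> 99% identity -> OTU 0
         "TTTT" * 25]                # unrelated -> founds OTU 1
centroids, otus = greedy_otu_cluster(reads)
```

The 97% cut-off is a convention for approximating species-level groups; the OTU count reported by such a procedure depends on read order and threshold choice.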

  14. A molecular-beacon-based asymmetric PCR assay for easy visualization of amplicons in the diagnosis of trichomoniasis.

    Science.gov (United States)

    Sonkar, Subash C; Sachdev, Divya; Mishra, Prashant K; Kumar, Anita; Mittal, Pratima; Saluja, Daman

    2016-12-15

    The currently available nucleic acid amplification tests (NAATs) for trichomoniasis are accurate, quick and confirmatory, with sensitivity superior to traditional culture-based microbiology assays. However, these assays are associated with carry-over contamination, false-positive results, and the need for technical expertise in performing the assay and detecting the end product. Hence, a diagnostic assay allowing easy visualization of the amplified product would be advantageous. An in-house, rapid, sensitive and specific molecular-beacon-based PCR assay, using primers against the pfoB gene of Trichomonas vaginalis, was developed and evaluated using dry ectocervical swabs (n=392) from symptomatic females with vaginal discharge. Total DNA was isolated and used as template for the PCR assays. The performance and reproducibility of the PCR assay were evaluated against a composite reference standard (CRS). For easy visualization of the amplified product, a molecular beacon was designed, and amplicons were visualized directly using a handheld fluorescent dark reader or a microplate reader. Molecular beacons are single-stranded, hairpin-shaped nucleic acid probes composed of a stem with a fluorophore/quencher pair and a loop region complementary to the target DNA. The beacon-based PCR assay designed in the present study is highly specific, as confirmed by competition experiments, and extremely sensitive, with a detection limit of 20 fg of genomic DNA (3-4 pathogens). The minimal infrastructure requirements and the ease of performing the assay make this method highly useful for resource-poor countries for better disease management. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Amplicon-Based Pyrosequencing Reveals High Diversity of Protistan Parasites in Ships' Ballast Water: Implications for Biogeography and Infectious Diseases.

    Science.gov (United States)

    Pagenkopp Lohan, K M; Fleischer, R C; Carney, K J; Holzer, K K; Ruiz, G M

    2016-04-01

    Ships' ballast water (BW) commonly moves macroorganisms and microorganisms across the world's oceans and along coasts; however, the majority of these microbial transfers have gone undetected. We applied high-throughput sequencing methods to identify microbial eukaryotes, specifically emphasizing the protistan parasites, in ships' BW collected from vessels calling to the Chesapeake Bay (Virginia and Maryland, USA) from European and Eastern Canadian ports. We utilized tagged-amplicon 454 pyrosequencing with two general primer sets, amplifying either the V4 or V9 domain of the small subunit (SSU) of the ribosomal RNA (rRNA) gene complex, from total DNA extracted from water samples collected from the ballast tanks of bulk cargo vessels. We detected a diverse group of protistan taxa, with some known to contain important parasites in marine systems, including Apicomplexa (unidentified apicomplexans, unidentified gregarines, Cryptosporidium spp.), Dinophyta (Blastodinium spp., Euduboscquella sp., unidentified syndinids, Karlodinium spp., Syndinium spp.), Perkinsea (Parvilucifera sp.), Opisthokonta (Ichthyosporea sp., Pseudoperkinsidae, unidentified ichthyosporeans), and Stramenopiles (Labyrinthulomycetes). Further characterization of groups with parasitic taxa, consisting of phylogenetic analyses for four taxa (Cryptosporidium spp., Parvilucifera spp., Labyrinthulomycetes, and Ichthyosporea), revealed that sequences were obtained from both known and novel lineages. This study demonstrates that high-throughput sequencing is a viable and sensitive method for detecting parasitic protists when present and transported in the ballast water of ships. These data also underscore the potential importance of human-aided dispersal in the biogeography of these microbes and emerging diseases in the world's oceans.

  16. Evolution of MHC class I genes in the endangered loggerhead sea turtle (Caretta caretta) revealed by 454 amplicon sequencing.

    Science.gov (United States)

    Stiebens, Victor A; Merino, Sonia E; Chain, Frédéric J J; Eizaguirre, Christophe

    2013-04-30

    In evolutionary and conservation biology, parasitism is often highlighted as a major selective pressure. To fight against parasites and pathogens, genetic diversity of the immune genes of the major histocompatibility complex (MHC) are particularly important. However, the extensive degree of polymorphism observed in these genes makes it difficult to conduct thorough population screenings. We utilized a genotyping protocol that uses 454 amplicon sequencing to characterize the MHC class I in the endangered loggerhead sea turtle (Caretta caretta) and to investigate their evolution at multiple relevant levels of organization. MHC class I genes revealed signatures of trans-species polymorphism across several reptile species. In the studied loggerhead turtle individuals, it results in the maintenance of two ancient allelic lineages. We also found that individuals carrying an intermediate number of MHC class I alleles are larger than those with either a low or high number of alleles. Multiple modes of evolution seem to maintain MHC diversity in the loggerhead turtles, with relatively high polymorphism for an endangered species.

  17. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    Science.gov (United States)

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in the environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics can be carried out with several public tools, many analytical pipelines offer too few options for the optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from the Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We describe the design and options of PipeCraft and evaluate its performance by analysing data sets from three different sequencing platforms. We demonstrate that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between the various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  18. Rhea: a transparent and modular R pipeline for microbial profiling based on 16S rRNA gene amplicons.

    Science.gov (United States)

    Lagkouvardos, Ilias; Fischer, Sandra; Kumar, Neeraj; Clavel, Thomas

    2017-01-01

    The importance of 16S rRNA gene amplicon profiles for understanding the influence of microbes in a variety of environments coupled with the steep reduction in sequencing costs led to a surge of microbial sequencing projects. The expanding crowd of scientists and clinicians wanting to make use of sequencing datasets can choose among a range of multipurpose software platforms, the use of which can be intimidating for non-expert users. Among available pipeline options for high-throughput 16S rRNA gene analysis, the R programming language and software environment for statistical computing stands out for its power and increased flexibility, and the possibility to adhere to most recent best practices and to adjust to individual project needs. Here we present the Rhea pipeline, a set of R scripts that encode a series of well-documented choices for the downstream analysis of Operational Taxonomic Units (OTUs) tables, including normalization steps, alpha- and beta-diversity analysis, taxonomic composition, statistical comparisons, and calculation of correlations. Rhea is primarily a straightforward starting point for beginners, but can also be a framework for advanced users who can modify and expand the tool. As the community standards evolve, Rhea will adapt to always represent the current state-of-the-art in microbial profiles analysis in the clear and comprehensive way allowed by the R language. Rhea scripts and documentation are freely available at https://lagkouvardos.github.io/Rhea.
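
Two of the downstream steps Rhea encodes, normalization and alpha-diversity, are easy to illustrate. Rhea itself is a set of R scripts; the Python sketch below only mirrors the idea, with invented OTU counts.

```python
# Illustrative sketch (not Rhea's R code): normalise one sample's raw OTU
# counts to relative abundances, then compute the Shannon alpha-diversity
# index H' = -sum(p_i * ln p_i) over the non-zero OTUs.
import math

def relative_abundance(counts):
    """Normalise raw OTU counts of one sample to proportions summing to 1."""
    total = sum(counts)
    return [c / total for c in counts]

def shannon_index(counts):
    """Shannon alpha diversity of one sample from raw OTU counts."""
    props = relative_abundance(counts)
    return -sum(p * math.log(p) for p in props if p > 0)

sample = [50, 30, 20, 0]        # raw counts for four OTUs (invented)
h = shannon_index(sample)       # higher H' = more even, more diverse sample
```

Normalisation of this kind is what makes samples with different sequencing depths comparable before diversity statistics are computed.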

  19. Rhea: a transparent and modular R pipeline for microbial profiling based on 16S rRNA gene amplicons

    Directory of Open Access Journals (Sweden)

    Ilias Lagkouvardos

    2017-01-01

    The importance of 16S rRNA gene amplicon profiles for understanding the influence of microbes in a variety of environments coupled with the steep reduction in sequencing costs led to a surge of microbial sequencing projects. The expanding crowd of scientists and clinicians wanting to make use of sequencing datasets can choose among a range of multipurpose software platforms, the use of which can be intimidating for non-expert users. Among available pipeline options for high-throughput 16S rRNA gene analysis, the R programming language and software environment for statistical computing stands out for its power and increased flexibility, and the possibility to adhere to most recent best practices and to adjust to individual project needs. Here we present the Rhea pipeline, a set of R scripts that encode a series of well-documented choices for the downstream analysis of Operational Taxonomic Units (OTUs) tables, including normalization steps, alpha- and beta-diversity analysis, taxonomic composition, statistical comparisons, and calculation of correlations. Rhea is primarily a straightforward starting point for beginners, but can also be a framework for advanced users who can modify and expand the tool. As the community standards evolve, Rhea will adapt to always represent the current state-of-the-art in microbial profiles analysis in the clear and comprehensive way allowed by the R language. Rhea scripts and documentation are freely available at https://lagkouvardos.github.io/Rhea.

  20. A quantitative and qualitative comparison of Illumina MiSeq and 454 amplicon sequencing for genotyping the highly polymorphic major histocompatibility complex (MHC) in a non-model species.

    Science.gov (United States)

    Razali, Haslina; O'Connor, Emily; Drews, Anna; Burke, Terry; Westerdahl, Helena

    2017-07-28

    High-throughput sequencing enables high-resolution genotyping of extremely duplicated genes. 454 amplicon sequencing (454) has become the standard technique for genotyping the major histocompatibility complex (MHC) genes in non-model organisms. However, Illumina MiSeq amplicon sequencing (MiSeq), which offers a much higher read depth, is now superseding 454. The aim of this study was to quantitatively and qualitatively evaluate the performance of MiSeq in relation to 454 for genotyping MHC class I alleles using a house sparrow (Passer domesticus) dataset with pedigree information. House sparrows provide a good study system for this comparison as their MHC class I genes have been studied previously and, consequently, we had prior expectations concerning the number of alleles per individual. We found that 454 and MiSeq performed equally well in genotyping amplicons with low diversity, i.e. amplicons from individuals that had fewer than 6 alleles. Although there was a higher rate of failure in the 454 dataset in resolving amplicons with higher diversity (6-9 alleles), the same genotypes were identified by both 454 and MiSeq in 98% of cases. We conclude that low diversity amplicons are equally well genotyped using either 454 or MiSeq, but the higher coverage afforded by MiSeq can lead to this approach outperforming 454 in amplicons with higher diversity.

  1. Genes encoded within 8q24 on the amplicon of a large extrachromosomal element are selectively repressed during the terminal differentiation of HL-60 cells.

    Science.gov (United States)

    Hirano, Tetsuo; Ike, Fumio; Murata, Takehide; Obata, Yuichi; Utiyama, Hiroyasu; Yokoyama, Kazunari K

    2008-04-02

    Human acute myeloblastic leukemia HL-60 cells become resistant to differentiation during long-term cultivation. After 150 passages, double minute chromosomes (dmins) found in early-passaged cells are replaced by large extrachromosomal elements (LEEs). In a DNA library derived from a purified fraction of LEEs, 12.6% (23/183) of clones were assigned to 8q24 and 9.2% (17/183) were assigned to 14q11 in the human genome. Fluorescence in situ hybridization (FISH) revealed a small aberrant chromosome, which had not been found in early-passaged cells, in addition to the purified LEEs. We determined that each LEE consisted of six discontinuous segments in a region that extended for 4.4 Mb over the 8q24 locus. Five genes, namely, Myc (a proto-oncogene), NSMCE2 (a SUMO ligase), CCDC26 (a retinoic acid-dependent modulator of myeloid differentiation), TRIB1 (a regulator of MAPK kinase) and LOC389637 (a protein of unknown function), were encoded by the amplicon. Breaks in the chromosomal DNA within the amplicon were found in the NSMCE2 and CCDC26 genes. The discontinuous structure of the amplicon unit of the LEEs was identical with that of the dmins in HL-60 early-passaged cells. The difference between them seemed, predominantly, to be the number of constituent units (10-15 copies per LEE versus 2 or 3 copies per dmin). Expression of the Myc, NSMCE2, CCDC26, LOC389637 and TRIB1 genes was constitutive in all lines of HL-60 cells, and that of the first four genes was repressed during the terminal differentiation of early-passaged HL-60 cells. We also detected abnormal transcripts of CCDC26. Our results suggest that these genes were selected during the development of the amplicons. They might be amplified and, sometimes, truncated to contribute to the maintenance of HL-60 cells in an undifferentiated state.

  2. FasL and FADD delivery by a glioma-specific and cell cycle-dependent HSV-1 amplicon virus enhanced apoptosis in primary human brain tumors

    Directory of Open Access Journals (Sweden)

    Lam Paula Y

    2010-10-01

    Background: Glioblastoma multiforme is the most malignant cancer of the brain and is notoriously difficult to treat due to the highly proliferative and infiltrative nature of the cells. Herein, we explored combination treatment of pre-established human glioma xenografts using multiple therapeutic genes whose expression is regulated by both a cell-type-specific and a cell cycle-dependent transcriptional regulatory mechanism conferred by recombinant HSV-1 amplicon vectors. Results: We demonstrated for the first time that Ki67-positive proliferating primary human glioma cells cultured from biopsy samples were effectively induced into cell death by the dual-specific function of the pG8-FasL amplicon vectors. These vectors were relatively stable and exhibited minimal cytotoxicity in vivo. Intracranial implantation of pre-transduced glioma cells resulted in a better survival outcome than viral vectors inoculated one week after implantation of the tumor cells, indicating that therapeutic efficacy depends on viral spread and the mode of viral vector administration. We further showed that pG8-FasL amplicon vectors remain functional in the presence of commonly used treatment regimens for human brain cancer. In fact, the combined therapies of pG8-FasL and pG8-FADD in the presence of temozolomide significantly improved the survival of mice bearing intracranial high-grade gliomas. Conclusion: Taken together, our results show that the glioma-specific and cell cycle-dependent HSV-1 amplicon vector is potentially useful as an adjuvant therapy to complement the current gene therapy strategy for gliomas.

  3. Using surface-enhanced Raman spectroscopy and electrochemically driven melting to discriminate Yersinia pestis from Y. pseudotuberculosis based on single nucleotide polymorphisms within unpurified polymerase chain reaction amplicons.

    Science.gov (United States)

    Papadopoulou, Evanthia; Goodchild, Sarah A; Cleary, David W; Weller, Simon A; Gale, Nittaya; Stubberfield, Michael R; Brown, Tom; Bartlett, Philip N

    2015-02-03

    The development of sensors for the detection of pathogen-specific DNA, including relevant species/strain level discrimination, is critical in molecular diagnostics with major impacts in areas such as bioterrorism and food safety. Herein, we use electrochemically driven denaturation assays monitored by surface-enhanced Raman spectroscopy (SERS) to target single nucleotide polymorphisms (SNPs) that distinguish DNA amplicons generated from Yersinia pestis, the causative agent of plague, from the closely related species Y. pseudotuberculosis. Two assays targeting SNPs within the groEL and metH genes of these two species have been successfully designed. Polymerase chain reaction (PCR) was used to produce Texas Red labeled single-stranded DNA (ssDNA) amplicons of 262 and 251 bases for the groEL and metH targets, respectively. These amplicons were used in an unpurified form to hybridize to immobilized probes then subjected to electrochemically driven melting. In all cases electrochemically driven melting was able to discriminate between fully homologous DNA and that containing SNPs. The metH assay was particularly challenging due to the presence of only a single base mismatch in the middle of the 251 base long PCR amplicon. However, manipulation of assay conditions (conducting the electrochemical experiments at 10 °C) resulted in greater discrimination between the complementary and mismatched DNA. Replicate data were collected and analyzed for each duplex on different days, using different batches of PCR product and different sphere segment void (SSV) substrates. Despite the variability introduced by these differences, the assays are shown to be reliable and robust providing a new platform for strain discrimination using unpurified PCR samples.

  4. Ct shift: A novel and accurate real-time PCR quantification model for direct comparison of different nucleic acid sequences and its application for transposon quantifications.

    Science.gov (United States)

    Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I

    2017-01-20

    There are numerous applications of quantitative PCR in both diagnostics and basic research. As in many other techniques, the basis of quantification is that comparisons are made between different (unknown and known, or reference) specimens of the same entity. When the aim is to compare the real quantities of different species in samples, their separate, precise absolute quantification is unavoidable. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approaches. It requires a plasmid standard containing the sequences of both amplicons to be compared (e.g. the target of interest and the endogenous control), which can serve as a reference sample with equal template copies for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied to transposon gene copy measurements, as well as to the comparison of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
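
The ΔΔCt arithmetic behind the Ct shift method can be written down directly. The formula follows the description above (a plasmid standard carrying equal copies of both templates serves as the reference); the Ct values and the assumption of ~100% amplification efficiency for both assays are illustrative.

```python
# Sketch of the Ct shift idea (numbers invented). A plasmid with one copy of
# each amplicon gives the reference deltaCt between the two assays; in an
# unknown sample the target/control copy ratio is then 2**(-ddCt), assuming
# both amplicons amplify with ~100% efficiency (doubling per cycle).

def template_ratio(ct_target_sample, ct_control_sample,
                   ct_target_plasmid, ct_control_plasmid):
    """Target/control copy ratio in a sample via the delta-delta-Ct formula."""
    d_ct_sample = ct_target_sample - ct_control_sample
    d_ct_plasmid = ct_target_plasmid - ct_control_plasmid   # equal copies
    dd_ct = d_ct_sample - d_ct_plasmid
    return 2.0 ** (-dd_ct)

# Plasmid standard: both templates at equal copies, but differing assay
# efficiencies shift the plasmid Ct values 1.5 cycles apart.
ratio = template_ratio(ct_target_sample=24.5, ct_control_sample=25.0,
                       ct_target_plasmid=26.5, ct_control_plasmid=25.0)
# d_ct_sample = -0.5, d_ct_plasmid = 1.5, dd_ct = -2.0 -> ratio = 4.0
```

Subtracting the plasmid ΔCt is what removes the assay-specific Ct offset, which is why the two targets become directly comparable in absolute terms.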

  5. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structure (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. The Bulgarian verb aspect is thus related, in different ways, both to the potential for generating alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while a detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  6. Very high resolution single pass HLA genotyping using amplicon sequencing on the 454 next generation DNA sequencers: Comparison with Sanger sequencing.

    Science.gov (United States)

    Yamamoto, F; Höglund, B; Fernandez-Vina, M; Tyan, D; Rastrou, M; Williams, T; Moonsamy, P; Goodridge, D; Anderson, M; Erlich, H A; Holcomb, C L

    2015-12-01

    Compared to Sanger sequencing, next-generation sequencing offers advantages for high resolution HLA genotyping including increased throughput, lower cost, and reduced genotype ambiguity. Here we describe an enhancement of the Roche 454 GS GType HLA genotyping assay to provide very high resolution (VHR) typing, by the addition of 8 primer pairs to the original 14, to genotype 11 HLA loci. These additional amplicons help resolve common and well-documented alleles and exclude commonly found null alleles in genotype ambiguity strings. Simplification of workflow to reduce the initial preparation effort using early pooling of amplicons or the Fluidigm Access Array™ is also described. Performance of the VHR assay was evaluated on 28 well characterized cell lines using Conexio Assign MPS software which uses genomic, rather than cDNA, reference sequence. Concordance was 98.4%; 1.6% had no genotype assignment. Of concordant calls, 53% were unambiguous. To further assess the assay, 59 clinical samples were genotyped and results compared to unambiguous allele assignments obtained by prior sequence-based typing supplemented with SSO and/or SSP. Concordance was 98.7% with 58.2% as unambiguous calls; 1.3% could not be assigned. Our results show that the amplicon-based VHR assay is robust and can replace current Sanger methodology. Together with software enhancements, it has the potential to provide even higher resolution HLA typing. Copyright © 2015. Published by Elsevier Inc.

  7. HeurAA: accurate and fast detection of genetic variations with a novel heuristic amplicon aligner program for next generation sequencing.

    Directory of Open Access Journals (Sweden)

    Lőrinc S Pongor

    Full Text Available Next generation sequencing (NGS) of PCR amplicons is a standard approach to detect genetic variations in personalized medicine such as cancer diagnostics. Computer programs used in the NGS community often miss insertions and deletions (indels) that constitute a large part of known human mutations. We have developed HeurAA, an open source, heuristic amplicon aligner program. We tested the program on simulated datasets as well as experimental data from multiplex sequencing of 40 amplicons in 12 oncogenes collected on a 454 Genome Sequencer from lung cancer cell lines. We found that HeurAA can accurately detect all indels, and is more than an order of magnitude faster than previous programs. HeurAA can compare reads and reference sequences up to several thousand base pairs in length, and it can evaluate data from complex mixtures containing reads of different gene-segments from different samples. HeurAA is written in C and Perl for Linux operating systems; the code and the documentation are available for research applications at http://sourceforge.net/projects/heuraa/
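A common way a heuristic amplicon aligner finds indels quickly is k-mer diagonal seeding: exact k-mer matches between read and reference vote for alignment diagonals, and a split between two dominant diagonals reveals a net insertion or deletion without a full dynamic-programming alignment. The sketch below illustrates that idea only; the function names, k-mer size and voting threshold are illustrative assumptions, not the published HeurAA algorithm (which is implemented in C and Perl).

```python
from collections import defaultdict

def kmer_index(ref, k=8):
    """Map every k-mer in the reference to its start positions."""
    idx = defaultdict(list)
    for i in range(len(ref) - k + 1):
        idx[ref[i:i + k]].append(i)
    return idx

def net_indel_size(read, ref, k=8, min_support=3):
    """Infer the net indel size between read and reference from the two
    dominant k-mer match diagonals. Returns 0 when the read maps on a
    single well-supported diagonal (no indel detected)."""
    idx = kmer_index(ref, k)
    diagonals = defaultdict(int)          # (ref_pos - read_pos) -> votes
    for j in range(len(read) - k + 1):
        for i in idx.get(read[j:j + k], []):
            diagonals[i - j] += 1
    # Keep only diagonals with enough votes to rule out chance k-mer hits.
    supported = sorted((d for d, n in diagonals.items() if n >= min_support),
                       key=lambda d: -diagonals[d])[:2]
    if len(supported) < 2:
        return 0
    return abs(supported[0] - supported[1])
```

A deletion in the read shifts all downstream k-mer matches onto a second diagonal, so the gap size falls out of the diagonal difference in linear time; this is what makes seeding-based aligners an order of magnitude faster than exhaustive alignment.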

  8. Nuclear Species-Diagnostic SNP Markers Mined from 454 Amplicon Sequencing Reveal Admixture Genomic Structure of Modern Citrus Varieties

    Science.gov (United States)

    Curk, Franck; Ancillo, Gema; Ollitrault, Frédérique; Perrier, Xavier; Jacquemoud-Collet, Jean-Pierre; Garcia-Lor, Andres; Navarro, Luis; Ollitrault, Patrick

    2015-01-01

    Most cultivated Citrus species originated from interspecific hybridisation between four ancestral taxa (C. reticulata, C. maxima, C. medica, and C. micrantha) with limited further interspecific recombination due to vegetative propagation. This evolution resulted in admixture genomes with frequent interspecific heterozygosity. Moreover, a major part of the phenotypic diversity of edible citrus results from the initial differentiation between these taxa. Deciphering the phylogenomic structure of citrus germplasm is therefore essential for an efficient utilization of citrus biodiversity in breeding schemes. The objective of this work was to develop a set of species-diagnostic single nucleotide polymorphism (SNP) markers for the four Citrus ancestral taxa covering the nine chromosomes, and to use these markers to infer the phylogenomic structure of secondary species and modern cultivars. Species-diagnostic SNPs were mined from 454 amplicon sequencing of 57 gene fragments from 26 genotypes of the four basic taxa. Of the 1,053 SNPs mined from 28,507 kb sequence, 273 were found to be highly diagnostic for a single basic taxon. Species-diagnostic SNP markers (105) were used to analyse the admixture structure of varieties and rootstocks. This revealed C. maxima introgressions in most of the old and in all recent selections of mandarins, and suggested that C. reticulata × C. maxima reticulation and introgression processes were important in edible mandarin domestication. The large range of phylogenomic constitutions between C. reticulata and C. maxima revealed in mandarins, tangelos, tangors, sweet oranges, sour oranges, grapefruits, and orangelos is favourable for genetic association studies based on phylogenomic structures of the germplasm. Inferred admixture structures were in agreement with previous hypotheses regarding the origin of several secondary species and also revealed the probable origin of several acid citrus varieties. The developed species-diagnostic SNP

  9. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman, who makes predictions concerning relatively high probability events with a large database, to the phenomenological expert, who must use his intuition tempered with basic knowledge and little or no measured data to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper provides some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work.

  10. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    Full Text Available There has been a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record history, document signs, establish diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnosis for many neurological disorders. Introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification has become essential, both in neurological practice and in research methodology. While this aspect is widely acknowledged, there is limited access to a comprehensive document pertaining to measurements in neurology. The following description is a critical appraisal of various measurements and also provides certain commonly used rating scales/scores in neurological practice.

  11. Recombinant adeno-associated virus type 2 replication and packaging is entirely supported by a herpes simplex virus type 1 amplicon expressing Rep and Cap.

    Science.gov (United States)

    Conway, J E; Zolotukhin, S; Muzyczka, N; Hayward, G S; Byrne, B J

    1997-11-01

    Recombinant adeno-associated virus (AAV) type 2 (rAAV) vectors have recently been shown to have great utility as gene transfer agents both in vitro and in vivo. One of the problems associated with the use of rAAV vectors has been the difficulty of large-scale vector production. Low-efficiency plasmid transfection of the rAAV vector and complementing AAV type 2 (AAV-2) functions (rep and cap) followed by superinfection with adenovirus has been the standard approach to rAAV production. The objectives of this study were to demonstrate the ability of a recombinant herpes simplex virus type 1 (HSV-1) amplicon expressing AAV-2 Rep and Cap to support replication and packaging of rAAV vectors. HSV-1 amplicon vectors were constructed which contain the AAV-2 rep and cap genes under control of their native promoters (p5, p19, and p40). An HSV-1 amplicon vector, HSV-RC/KOS or HSV-RC/d27, was generated by supplying helper functions with either wild-type HSV-1 (KOS strain) or the ICP27-deleted mutant of HSV-1, d27-1, respectively. Replication of the amplicon stocks is not inhibited by the presence of AAV-2 Rep proteins, which highlights important differences between HSV-1 and adenovirus replication and the mechanism of providing helper function for productive AAV infection. Coinfection of rAAV and HSV-RC/KOS resulted in the replication and amplification of rAAV genomes. Similarly, rescue and replication of rAAV genomes occurred when rAAV vector plasmids were transfected into cells followed by HSV-RC/KOS infection and when two rAAV proviral cell lines were infected with HSV-RC/KOS or HSV-RC/d27. Production of infectious rAAV by rescue from two rAAV proviral cell lines has also been achieved with HSV-RC/KOS and HSV-RC/d27. The particle titer of rAAV produced with HSV-RC/d27 is equal to that achieved by supplying rep and cap by transfection followed by adenovirus superinfection. Importantly, no detectable wild-type AAV-2 is generated with this approach. These results demonstrate

  12. RNA-Based Amplicon Sequencing Reveals Microbiota Development during Ripening of Artisanal versus Industrial Lard d'Arnad.

    Science.gov (United States)

    Ferrocino, Ilario; Bellio, Alberto; Romano, Angelo; Macori, Guerrino; Rantsiou, Kalliopi; Decastelli, Lucia; Cocolin, Luca

    2017-08-15

    Valle d'Aosta Lard d'Arnad is a protected designation of origin (PDO) product produced from fat of the shoulder and back of heavy pigs. Its manufacturing process can be very diverse, especially regarding the maturation temperature and the NaCl concentration used for the brine; therefore, the main goal of this study was to investigate the impact of those parameters on the microbiota that develops during curing and ripening. Three farms producing Lard d'Arnad were selected. Two plants, reflecting the industrial process characterized either by low maturation temperature (plant A [10% NaCl, 2°C]) or by a low NaCl concentration (plant B [2.5% NaCl, 4°C]), were selected, while the third was characterized by an artisanal process (plant C [30% NaCl, 8°C]). Lard samples were obtained at time 0 and after 7, 15, 30, 60, and 90 days of maturation. From each plant, 3 independent lots were analyzed. The diversity of the live microbiota was evaluated by using classical plate counts and amplicon target sequencing of small subunit (SSU) rRNA. The main taxa identified by sequencing were Acinetobacter johnsonii, Psychrobacter, Staphylococcus equorum, Staphylococcus sciuri, Pseudomonas fragi, Brochothrix, Halomonas, and Vibrio, and differences in their relative abundances distinguished samples from the individual plants. The composition of the microbiota was more similar between plants A and B, and it was characterized by a higher presence of taxa recognized as undesired bacteria in food-processing environments. Oligotype analysis of Halomonas and Acinetobacter revealed the presence of several characteristic oligotypes associated with A and B samples. IMPORTANCE Changes in the food production process can drastically affect the microbial community structure, with a possible impact on the final characteristics of the products. The industrial processes of Lard d'Arnad production are characterized by a reduction in the salt concentration in the brines to address a consumer demand

  13. Polymicrobial nature of chronic diabetic foot ulcer biofilm infections determined using bacterial tag encoded FLX amplicon pyrosequencing (bTEFAP).

    Directory of Open Access Journals (Sweden)

    Scot E Dowd

    Full Text Available BACKGROUND: Diabetic extremity ulcers are associated with chronic infections. Such ulcer infections are too often followed by amputation because there is little or no understanding of the ecology of such infections or how to control or eliminate this type of chronic infection. A primary impediment to the healing of chronic wounds is biofilm phenotype infections. Diabetic foot ulcers are the most common, disabling, and costly complications of diabetes. Here we seek to derive a better understanding of the polymicrobial nature of chronic diabetic extremity ulcer infections. METHODS AND FINDINGS: Using a new bacterial tag encoded FLX amplicon pyrosequencing (bTEFAP) approach we have evaluated the bacterial diversity of 40 chronic diabetic foot ulcers from different patients. The most prevalent bacterial genus associated with diabetic chronic wounds was Corynebacterium spp. Findings also show that obligate anaerobes including Bacteroides, Peptoniphilus, Finegoldia, Anaerococcus, and Peptostreptococcus spp. are ubiquitous in diabetic ulcers, comprising a significant portion of the wound biofilm communities. Other major components of the bacterial communities included commonly cultured genera such as Streptococcus, Serratia, Staphylococcus and Enterococcus spp. CONCLUSIONS: In this article, we highlight the patterns of population diversity observed in the samples and introduce preliminary evidence to support the concept of functional equivalent pathogroups (FEP). Here we introduce FEP as consortia of genotypically distinct bacteria that symbiotically produce a pathogenic community. According to this hypothesis, individual members of these communities, when they occur alone, may not cause disease, but when they coaggregate or consort together into a FEP the synergistic effect provides the functional equivalence of well-known pathogens, such as Staphylococcus aureus, giving the biofilm community the factors necessary to maintain chronic biofilm infections.

  14. Strategy for the maximization of clinically relevant information from hepatitis C virus, RT-PCR quantification.

    LENUS (Irish Health Repository)

    Levis, J

    2012-02-03

    BACKGROUND: The increasing clinical application of viral load assays for monitoring viral infections has been an incentive for the development of standardized tests for the hepatitis C virus. OBJECTIVE: To develop a simple model for the prediction of baseline viral load in individuals infected with the hepatitis C virus. METHODOLOGY: Viral load quantification of each patient's first sample was assessed by RT-PCR-ELISA using the Roche MONITOR assay in triplicate. The genotype of the infecting virus was identified by reverse line probe hybridization, using amplicons resulting from the qualitative HCV Roche AMPLICOR assay. RESULTS: Retrospective evaluation of first quantitative values suggested that 82.4% (n=168/204) of individuals had a viral load between 4.3 and 6.7 log(10) viral copies per ml. A few patients (3.4%; n=7/204) had a serum viremia less than the lower limit of the linear range of the RT-PCR assay. Subsequent prospective evaluation of hepatitis C viral load of all new patients, using a model based on the dynamic range of viral load in the retrospective group, correctly predicted the dynamic range in 75.9% (n=33/54). CONCLUSION: The dynamic range of hepatitis C viremia extends beyond the linear range of the Roche MONITOR assay. Accurate determination of serum viremia is substantially improved by dilution of specimens prior to quantification.
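The arithmetic behind the conclusion above (specimens whose viremia exceeds the assay's linear range should be diluted and the titre back-calculated from the dilution factor) can be sketched in a few lines. The linear-range bounds and function names below are illustrative assumptions, not the Roche MONITOR specification.

```python
import math

# Assay linear range in log10 copies/ml -- illustrative values only,
# not the Roche MONITOR specification.
LINEAR_LO, LINEAR_HI = 3.0, 5.0

def log10_viral_load(measured_copies_per_ml, dilution_factor=1):
    """Back-calculate the specimen titre from a (possibly diluted)
    measurement: titre = measurement * dilution factor."""
    return math.log10(measured_copies_per_ml * dilution_factor)

def needs_dilution(measured_copies_per_ml):
    """True when a raw reading sits above the linear range, i.e. the
    specimen should be diluted and re-quantified for accuracy."""
    return math.log10(measured_copies_per_ml) > LINEAR_HI
```

With the reported dynamic range of 4.3 to 6.7 log10 copies/ml, many specimens fall above a narrower assay linear range, which is why pre-dilution improves accuracy.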

  15. Digital quantification of gene methylation in stool DNA by emulsion-PCR coupled with hydrogel immobilized bead-array.

    Science.gov (United States)

    Liu, Yunlong; Wu, Haiping; Zhou, Qiang; Song, Qinxin; Rui, Jianzhong; Zou, Bingjie; Zhou, Guohua

    2017-06-15

    Aberrant gene methylation in stool DNA (sDNA) is an effective biomarker for non-invasive colorectal cancer diagnosis. However, it is challenging to accurately quantify gene methylation levels in sDNA due to the low abundance and degradation of sDNA. In this study, a digital quantification strategy was proposed by combining emulsion PCR (emPCR) with a hydrogel-immobilized bead-array. The assay includes the following steps: bisulfite conversion of sDNA; pre-amplification by PCR with specific primers containing 5' universal sequences; emPCR of pre-amplicons with beaded primers to achieve single-molecule amplification; and identification of hydrogel-embedded beads coated with amplicons. The sensitivity and specificity of the method are high enough to pick up 0.05% methylated targets from an unmethylated DNA background. The successful detection of the hypermethylated vimentin gene in clinical stool samples suggests that the proposed method should be a potential tool for non-invasive colorectal cancer screening. Copyright © 2016 Elsevier B.V. All rights reserved.
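Digital quantification works by partitioning single template molecules (here, onto beads in emulsion droplets) and counting positive partitions rather than measuring analog signal; because a partition can capture more than one molecule, counts are usually Poisson-corrected. The sketch below shows that generic digital-counting arithmetic; it is not the authors' bead-array protocol, and the function names are illustrative.

```python
import math

def poisson_lambda(positive, total):
    """Mean template molecules per partition inferred from the fraction
    of positive partitions (standard digital-PCR Poisson correction).
    Requires at least one negative partition (positive < total)."""
    p = positive / total
    return -math.log(1.0 - p)

def methylation_level(meth_positive, unmeth_positive, total):
    """Fraction of methylated template molecules, with each channel
    (methylated vs. unmethylated beads) Poisson-corrected separately."""
    m = poisson_lambda(meth_positive, total)
    u = poisson_lambda(unmeth_positive, total)
    return m / (m + u)
```

At low positive fractions the correction is nearly linear, which is what lets a bead-array assay resolve methylated targets down to the 0.05% level reported above, provided enough beads are counted.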

  16. Development of a quantitative competitive reverse transcriptase polymerase chain reaction for the quantification of growth hormone gene expression in pigs

    Directory of Open Access Journals (Sweden)

    Maurício Machaim Franco

    2003-01-01

    Full Text Available After the advent of the genome projects, followed by the discovery of DNA polymorphisms, basic understanding of gene expression is the next focus to explain the association between polymorphisms and the level of gene expression, as well as to demonstrate the interaction among genes. Among the various techniques for the investigation of transcriptional profiling involving patterns of gene expression, quantitative PCR is the simplest analytical laboratory technique. The objective of this work was to analyze two strategies of a competitive PCR technique for the quantification of the pig growth hormone (GH) gene expression. A pair of primers was designed targeting exons 3 and 5, and two competitive PCR strategies were performed, one utilizing a specific amplicon as a competitor, and the other utilizing a low-stringency PCR amplicon as a competitor. The latter strategy proved to be easier and more efficient, offering an accessible tool that can be used in any kind of competitive reaction, facilitating the study of gene expression patterns for both genetics and diagnostics of infectious diseases.

  17. Biphasic Study to Characterize Agricultural Biogas Plants by High-Throughput 16S rRNA Gene Amplicon Sequencing and Microscopic Analysis.

    Science.gov (United States)

    Maus, Irena; Kim, Yong Sung; Wibberg, Daniel; Stolze, Yvonne; Off, Sandra; Antonczyk, Sebastian; Pühler, Alfred; Scherer, Paul; Schlüter, Andreas

    2017-02-28

    Process surveillance within agricultural biogas plants (BGPs) was concurrently studied by high-throughput 16S rRNA gene amplicon sequencing and an optimized quantitative microscopic fingerprinting (QMF) technique. In contrast to 16S rRNA gene amplicon sequencing, digitalized microscopy is a rapid and cost-effective method that facilitates enumeration and morphological differentiation of the most significant groups of methanogens with regard to their shape and characteristic autofluorescent cofactor F420. Moreover, the fluorescence signal mirrors cell vitality. In this study, four different BGPs were investigated. The results indicated stable process performance in the mesophilic BGPs and in the thermophilic reactor. Bacterial subcommunity characterization revealed significant differences between the four BGPs. Most remarkably, the genera Defluviitoga and Halocella dominated the thermophilic bacterial subcommunity, whereas members of another taxon, Syntrophaceticus, were found to be abundant in the mesophilic BGP. The domain Archaea was dominated by the genus Methanoculleus in all four BGPs, followed by Methanosaeta in BGP1 and BGP3. In contrast, Methanothermobacter members were highly abundant in the thermophilic BGP4. Furthermore, a high consistency between the sequencing approach and the QMF method was shown, especially for the thermophilic BGP. The differences elucidated that using this biphasic approach for mesophilic BGPs provided novel insights regarding disaggregated single cells of Methanosarcina and Methanosaeta species. Both dominated the archaeal subcommunity and replaced coccoid Methanoculleus members belonging to the same group of Methanomicrobiales that have been frequently observed in similar BGPs. This work demonstrates that combining QMF and 16S rRNA gene amplicon sequencing is a complementary strategy to describe archaeal community structures within biogas processes.

  18. The application of amplicon length heterogeneity PCR (LH-PCR) for monitoring the dynamics of soil microbial communities associated with cadaver decomposition.

    Science.gov (United States)

    Moreno, Lilliana I; Mills, DeEtta; Fetscher, Jill; John-Williams, Krista; Meadows-Jantz, Lee; McCord, Bruce

    2011-03-01

    The placement of cadavers in shallow, clandestine graves may alter the microbial and geochemical composition of the underlying and adjacent soils. Using amplicon length heterogeneity PCR (LH-PCR), the microbial community changes in these soils can be assessed. In this investigation, nine different grave sites were examined over a period of 16 weeks. The results indicated that measurable changes occurred in the soil bacterial community during the decomposition process. In this study, amplicons corresponding to anaerobic bacteria, not indigenous to the soil, were shown to produce differences between grave sites and control soils. Among the bacteria linked to these amplicons are those that are most often part of the commensal flora of the intestines, mouth and skin. In addition, over the 16-week sampling interval, the level of indicator organisms (i.e., nitrogen-fixing bacteria) dropped as the body decomposed, and after four weeks of environmental exposure they began to increase again; thus differences in the abundance of nitrogen-fixing bacteria were also found to contribute to the variation between controls and grave soils. These results were verified using primers that specifically targeted the nifH gene coding for nitrogenase reductase. LH-PCR provides a fast, robust and reproducible method to measure microbial changes in soil and could be used to determine potential cadaveric contact in a given area. The results obtained with this method could ultimately provide leads to investigators in criminal or missing person scenarios and allow for further analysis using human-specific DNA assays to establish the identity of the buried body. Copyright © 2010 Elsevier B.V. All rights reserved.
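LH-PCR fingerprints a community by the distribution of naturally occurring amplicon lengths: each length peak is treated as a taxon proxy, and its peak area as relative abundance, so two soils can be compared as length-abundance profiles. A minimal sketch of that bookkeeping (the function names and the Bray-Curtis-style dissimilarity are illustrative choices, not the authors' analysis pipeline):

```python
from collections import Counter

def lh_profile(fragment_lengths):
    """Normalise observed amplicon lengths (in bp) into a
    relative-abundance profile, as in LH-PCR fingerprinting."""
    counts = Counter(fragment_lengths)
    total = sum(counts.values())
    return {length: n / total for length, n in counts.items()}

def profile_distance(p, q):
    """Bray-Curtis-style dissimilarity between two length profiles:
    0 for identical profiles, 1 for profiles sharing no lengths."""
    lengths = set(p) | set(q)
    return 0.5 * sum(abs(p.get(l, 0.0) - q.get(l, 0.0)) for l in lengths)
```

Tracking `profile_distance(grave, control)` over the sampling interval is one way to express the grave-versus-control divergence the study reports.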

  19. Microbial community profiling of fresh basil and pitfalls in taxonomic assignment of enterobacterial pathogenic species based upon 16S rRNA amplicon sequencing.

    Science.gov (United States)

    Ceuppens, Siele; De Coninck, Dieter; Bottledoorn, Nadine; Van Nieuwerburgh, Filip; Uyttendaele, Mieke

    2017-09-18

    Application of 16S rRNA (gene) amplicon sequencing to food samples is increasingly used for assessing microbial diversity, but may, as an unintended advantage, also enable simultaneous detection of any human pathogens without a priori definition. In the present study, high-throughput next-generation sequencing (NGS) of the V1-V2-V3 regions of the 16S rRNA gene was applied to identify the bacteria present on fresh basil leaves. However, results were strongly impacted by variations in the bioinformatics analysis pipelines (MEGAN, SILVAngs, QIIME and MG-RAST), including the database choice (Greengenes, RDP and M5RNA) and the annotation algorithm (best hit, representative hit and lowest common ancestor). The use of pipelines with default parameters will lead to discrepancies. The estimate of microbial diversity of fresh basil using 16S rRNA (gene) amplicon sequencing is thus indicative but subject to biases. Salmonella enterica was detected at low frequencies, between 0.1% and 0.4% of bacterial sequences, corresponding to 37 to 166 reads. However, this result was dependent upon the pipeline used: Salmonella was detected by MEGAN, SILVAngs and MG-RAST, but not by QIIME. Confirmation of Salmonella sequences by real-time PCR was unsuccessful. It was shown that the taxonomic resolution obtained from the short (500 bp) sequence reads of the 16S rRNA gene containing the hypervariable regions V1-V3 does not allow Salmonella to be distinguished from closely related enterobacterial species. In conclusion, 16S amplicon sequencing, which is attaining the status of a standard method in microbial ecology studies of foods, requires expertise in both bioinformatics and microbiology for the analysis of results. It is a powerful tool to estimate bacterial diversity but is susceptible to biases. Limitations concerning taxonomic resolution for some bacterial species and its inability to detect sub-dominant (pathogenic) species should be acknowledged in order to avoid overinterpretation of results. Copyright © 2017 Elsevier B

  20. A novel ultra high-throughput 16S rRNA gene amplicon sequencing library preparation method for the Illumina HiSeq platform.

    Science.gov (United States)

    de Muinck, Eric J; Trosvik, Pål; Gilfillan, Gregor D; Hov, Johannes R; Sundaram, Arvind Y M

    2017-07-06

    Advances in sequencing technologies and bioinformatics have made the analysis of microbial communities almost routine. Nonetheless, the need remains to improve the techniques used for gathering such data, including increasing throughput while lowering cost, and benchmarking the techniques so that potential sources of bias can be better characterized. We present a triple-index amplicon sequencing strategy to sequence large numbers of samples at significantly lower cost and in a shorter timeframe compared to existing methods. The design employs a two-stage PCR protocol, incorporating three barcodes into each sample, with the possibility to add a fourth index. It also includes heterogeneity spacers to overcome the low-complexity issues faced when sequencing amplicons on Illumina platforms. The library preparation method was extensively benchmarked through analysis of a mock community in order to assess biases introduced by sample indexing, number of PCR cycles, and template concentration. We further evaluated the method through re-sequencing of a standardized environmental sample. Finally, we evaluated our protocol on a set of fecal samples from a small cohort of healthy adults, demonstrating good performance in a realistic experimental setting. Between-sample variation was mainly related to batch effects, such as DNA extraction, while sample indexing was also a significant source of bias. PCR cycle number strongly influenced chimera formation and affected relative abundance estimates of species with high GC content. Libraries were sequenced using the Illumina HiSeq and MiSeq platforms to demonstrate that this protocol is highly scalable to sequence thousands of samples at a very low cost. Here, we provide the most comprehensive study of performance and bias inherent to a 16S rRNA gene amplicon sequencing method to date. Triple-indexing greatly reduces the number of long custom DNA oligos required for library preparation, while the inclusion of variable length
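Two ideas in the abstract above are easy to make concrete. First, combinatorial triple indexing: with n forward, m reverse and k plate tags, n + m + k oligos address n * m * k samples, which is why the oligo count drops so sharply. Second, heterogeneity spacers: prepending 0 to 3 extra bases phase-shifts otherwise identical amplicon starts, restoring per-cycle base diversity on Illumina instruments. The sketch below illustrates both; the tag sequences, spacer set and function names are illustrative assumptions, not the published primer designs.

```python
from itertools import product

def build_index_table(fwd_tags, rev_tags, plate_tags):
    """Triple indexing: every (plate, fwd, rev) tag combination uniquely
    identifies one sample, so len(fwd)+len(rev)+len(plate) oligos can
    address len(fwd)*len(rev)*len(plate) samples."""
    return {tags: i
            for i, tags in enumerate(product(plate_tags, fwd_tags, rev_tags))}

# Heterogeneity spacers of length 0-3 nt (illustrative sequences).
SPACERS = ["", "A", "CT", "GAT"]

def spaced_primer(sample_index, primer):
    """Prepend a sample-dependent spacer so that clusters on the flow
    cell are phase-shifted, raising base diversity in early cycles."""
    return SPACERS[sample_index % len(SPACERS)] + primer
```

For example, 8 forward, 12 reverse and 10 plate tags (30 oligos) would address 960 samples.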

  1. Illumina MiSeq 16S amplicon sequence analysis of bovine respiratory disease associated bacteria in lung and mediastinal lymph node tissue.

    Science.gov (United States)

    Johnston, Dayle; Earley, Bernadette; Cormican, Paul; Murray, Gerard; Kenny, David Anthony; Waters, Sinead Mary; McGee, Mark; Kelly, Alan Kieran; McCabe, Matthew Sean

    2017-05-02

    Bovine respiratory disease (BRD) is caused by growth of single or multiple species of pathogenic bacteria in lung tissue following stress and/or viral infection. Next generation sequencing of 16S ribosomal RNA gene PCR amplicons (NGS 16S amplicon analysis) is a powerful culture-independent open reference method that has recently been used to increase understanding of BRD-associated bacteria in the upper respiratory tract of BRD cattle. However, it has not yet been used to examine the microbiome of the bovine lower respiratory tract. The objective of this study was to use NGS 16S amplicon analysis to identify bacteria in post-mortem lung and lymph node tissue samples harvested from fatal BRD cases and clinically healthy animals. Cranial lobe and corresponding mediastinal lymph node post-mortem tissue samples were collected from calves diagnosed as BRD cases by veterinary laboratory pathologists and from clinically healthy calves. NGS 16S amplicon libraries, targeting the V3-V4 region of the bacterial 16S rRNA gene were prepared and sequenced on an Illumina MiSeq. Quantitative insights into microbial ecology (QIIME) was used to determine operational taxonomic units (OTUs) which corresponded to the 16S rRNA gene sequences. Leptotrichiaceae, Mycoplasma, Pasteurellaceae, and Fusobacterium were the most abundant OTUs identified in the lungs and lymph nodes of the calves which died from BRD. Leptotrichiaceae, Fusobacterium, Mycoplasma, Trueperella and Bacteroides had greater relative abundances in post-mortem lung samples collected from fatal cases of BRD in dairy calves, compared with clinically healthy calves without lung lesions. Leptotrichiaceae, Mycoplasma and Pasteurellaceae showed higher relative abundances in post-mortem lymph node samples collected from fatal cases of BRD in dairy calves, compared with clinically healthy calves without lung lesions. Two Leptotrichiaceae sequence contigs were subsequently assembled from bacterial DNA-enriched shotgun sequences

  2. High-throughput sequencing of 16S rRNA gene amplicons: effects of extraction procedure, primer length and annealing temperature.

    Science.gov (United States)

    Sergeant, Martin J; Constantinidou, Chrystala; Cogan, Tristan; Penn, Charles W; Pallen, Mark J

    2012-01-01

    The analysis of 16S-rDNA sequences to assess the bacterial community composition of a sample is a widely used technique that has increased with the advent of high throughput sequencing. Although considerable effort has been devoted to identifying the most informative region of the 16S gene and the optimal informatics procedures to process the data, little attention has been paid to the PCR step, in particular annealing temperature and primer length. To address this, amplicons derived from 16S-rDNA were generated from chicken caecal content DNA using different annealing temperatures, primers and different DNA extraction procedures. The amplicons were pyrosequenced to determine the optimal protocols for capture of maximum bacterial diversity from a chicken caecal sample. Even at very low annealing temperatures there was little effect on the community structure, although the abundance of some OTUs such as Bifidobacterium increased. Using shorter primers did not reveal any novel OTUs but did change the community profile obtained. Mechanical disruption of the sample by bead beating had a significant effect on the results obtained, as did repeated freezing and thawing. In conclusion, existing primers and standard annealing temperatures captured as much diversity as lower annealing temperatures and shorter primers.

  3. Two-stage clustering (TSC): a pipeline for selecting operational taxonomic units for the high-throughput sequencing of PCR amplicons.

    Directory of Open Access Journals (Sweden)

    Xiao-Tao Jiang

    Full Text Available Clustering 16S/18S rRNA amplicon sequences into operational taxonomic units (OTUs) is a critical step in the bioinformatic analysis of microbial diversity. Here, we report a pipeline for selecting OTUs with a relatively low computational demand and a high degree of accuracy. This pipeline is referred to as two-stage clustering (TSC) because it divides tags into two groups according to their abundance and clusters them sequentially. The more abundant group is clustered using a hierarchical algorithm similar to that in ESPRIT, which has a high degree of accuracy but is computationally costly for large datasets. The rarer group, which includes the majority of tags, is then heuristically clustered to improve efficiency. To further improve the computational efficiency and accuracy, two preclustering steps are implemented. To maintain clustering accuracy, all tags are assigned to an OTU according to their pairwise Needleman-Wunsch distance. This method not only improved the computational efficiency but also mitigated spurious OTU estimation from 'noise' sequences. In addition, OTUs clustered using TSC showed comparable or improved performance in beta-diversity comparisons compared to existing OTU selection methods. This study suggests that the abundance distribution of sequencing datasets is a useful property for improving the computational efficiency and increasing the clustering accuracy of the high-throughput sequencing of PCR amplicons. The software and user guide are freely available at http://hwzhoulab.smu.edu.cn/paperdata/.
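The abundance split described above can be sketched in miniature: cluster the abundant tags carefully into OTU seeds, then attach each rare tag to the nearest seed within a distance radius, opening a new OTU only when no seed is close enough. This toy version uses a plain Levenshtein/Needleman-Wunsch-style edit distance and greedy seeding; it illustrates the two-stage idea only and is not the published TSC implementation (thresholds and function names are illustrative).

```python
from collections import Counter

def edit_dist(a, b):
    """Unit-cost edit distance via the standard dynamic program."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,           # deletion
                           cur[-1] + 1,           # insertion
                           prev[j - 1] + (ca != cb)))  # (mis)match
        prev = cur
    return prev[-1]

def two_stage_cluster(tags, abundance_cutoff=2, radius=1):
    """Stage 1: greedily cluster abundant tags (>= cutoff copies) into
    OTU seeds. Stage 2: attach each rare tag to the nearest seed within
    `radius`, or open a new OTU. Returns {seed: [member tags]}."""
    counts = Counter(tags)
    abundant = [t for t, n in counts.most_common() if n >= abundance_cutoff]
    rare = [t for t, n in counts.items() if n < abundance_cutoff]
    seeds = []
    for t in abundant:                      # stage 1: accurate seeding
        if all(edit_dist(t, s) > radius for s in seeds):
            seeds.append(t)
    otus = {s: [s] for s in seeds}
    for t in rare:                          # stage 2: cheap attachment
        best = min(seeds, key=lambda s: edit_dist(t, s), default=None)
        if best is not None and edit_dist(t, best) <= radius:
            otus[best].append(t)
        else:
            otus[t] = [t]
            seeds.append(t)
    return otus
```

Because rare tags (the majority, and where sequencing noise concentrates) are only compared against the few abundant seeds, the expensive all-against-all work is confined to stage 1, which is the efficiency argument the abstract makes.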

  4. Conservative fragments in bacterial 16S rRNA genes and primer design for 16S ribosomal DNA amplicons in metagenomic studies

    KAUST Repository

    Wang, Yong

    2009-10-09

    Bacterial 16S ribosomal DNA (rDNA) amplicons have been widely used in the classification of uncultured bacteria inhabiting environmental niches. Primers targeting conservative regions of the rDNAs are used to generate amplicons of variant regions that are informative in taxonomic assignment. One problem is that the percentage coverage and application scope of the primers used in previous studies are largely unknown. In this study, conservative fragments of available rDNA sequences were first mined and then used to search for candidate primers within the fragments by measuring the coverage rate, defined as the percentage of bacterial sequences containing the target. Thirty predicted primers with a high coverage rate (>90%) were identified, which were basically located in the same conservative regions as known primers in previous reports, whereas 30% of the known primers were associated with a coverage rate of <90%. The application scope of the primers was also examined by calculating the percentages of failed detections in bacterial phyla. Primers A519-539, E969-983, E1063-1081, U515 and E517 are highly recommended because of their high coverage in almost all phyla. As expected, the three predominant phyla, Firmicutes, Gemmatimonadetes and Proteobacteria, are best covered by the predicted primers. The primers recommended in this report shall facilitate a comprehensive and reliable survey of bacterial diversity in metagenomic studies. © 2009 Wang, Qian.
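The coverage-rate measure described above (percentage of reference sequences containing a primer's target, allowing IUPAC degenerate bases) can be sketched as follows; the sequences and primer are toy examples, not the published data:

```python
# IUPAC nucleotide ambiguity codes mapped to the bases they match.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "AG", "Y": "CT",
         "S": "GC", "W": "AT", "K": "GT", "M": "AC", "N": "ACGT"}

def matches_at(seq, primer, pos):
    """Does the primer match the sequence starting at pos?"""
    return all(seq[pos + i] in IUPAC[b] for i, b in enumerate(primer))

def contains_primer(seq, primer):
    return any(matches_at(seq, primer, p)
               for p in range(len(seq) - len(primer) + 1))

def coverage_rate(sequences, primer):
    """Percentage of reference sequences containing the primer target."""
    hits = sum(contains_primer(s, primer) for s in sequences)
    return 100.0 * hits / len(sequences)
```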

  5. Cowpea mosaic virus RNA-1 acts as an amplicon whose effects can be counteracted by a RNA-2-encoded suppressor of silencing

    International Nuclear Information System (INIS)

    Liu Li; Grainger, Jef; Canizares, M. Carmen; Angell, Susan M.; Lomonossoff, George P.

    2004-01-01

    Lines of Nicotiana benthamiana transgenic for full-length copies of both Cowpea mosaic virus (CPMV) genomic RNAs, either singly or together, have been produced. Plants transgenic for both RNAs developed symptoms characteristic of a CPMV infection. When plants transgenic for RNA-1 were agro-inoculated with RNA-2, no infection developed and the plants were also resistant to challenge with CPMV. By contrast, plants transgenic for RNA-2 became infected when agro-inoculated with RNA-1 and were fully susceptible to CPMV infection. The resistance of RNA-1 transgenic plants was shown to be related to the ability of RNA-1 to self-replicate and act as an amplicon. The ability of transgenically expressed RNA-2 to counteract the amplicon effect suggested that it encodes a suppressor of posttranscriptional gene silencing (PTGS). By examining the ability of portions of RNA-2 to reverse PTGS in N. benthamiana, we have identified the small (S) coat protein as the CPMV RNA-2-encoded suppressor of PTGS.

  6. 'Mitominis': multiplex PCR analysis of reduced size amplicons for compound sequence analysis of the entire mtDNA control region in highly degraded samples.

    Science.gov (United States)

    Eichmann, Cordula; Parson, Walther

    2008-09-01

    The traditional protocol for forensic mitochondrial DNA (mtDNA) analyses involves the amplification and sequencing of the two hypervariable segments HVS-I and HVS-II of the mtDNA control region. The primers usually span fragment sizes of 300-400 bp for each region, which may result in weak or failed amplification in highly degraded samples. Here we introduce an improved and more stable approach using shortened amplicons in the fragment range between 144 and 237 bp. Ten such amplicons were required to produce overlapping fragments that cover the entire human mtDNA control region. These were co-amplified in two multiplex polymerase chain reactions and sequenced with the individual amplification primers. The primers were carefully selected to minimize binding on homoplasic and haplogroup-specific sites that would otherwise result in loss of amplification due to mis-priming. The multiplexes have successfully been applied to ancient and forensic samples such as bones and teeth that showed a high degree of degradation.
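The tiling requirement, that the ten short amplicons overlap so the whole control region is covered, can be checked mechanically; the interval coordinates below are hypothetical, not the actual 'mitominis' primer positions:

```python
def covers(amplicons, start, end):
    """amplicons: list of (start, end) intervals on the reference.
    Returns True if the sorted intervals tile [start, end] without gaps."""
    covered_to = start
    for a_start, a_end in sorted(amplicons):
        if a_start > covered_to:          # gap before this amplicon
            return False
        covered_to = max(covered_to, a_end)
    return covered_to >= end

# e.g. a ~1.1 kb control region tiled by short overlapping fragments
region = (0, 1100)
tiles = [(0, 200), (150, 380), (350, 560), (520, 740),
         (700, 910), (880, 1100)]
```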

  7. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  8. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato.

    Science.gov (United States)

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-08-05

    Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10^9 to 2 × 10^3 copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10^8 to 2 × 10^3 copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. To detect and quantify a wide range of begomoviruses, five duplex
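Absolute quantification over ranges like these is read off a serial-dilution standard curve; a generic sketch (assumed workflow and toy values, not the authors' protocol):

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept
    from a serial dilution of the quantification standard."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to get copies for an unknown's Ct."""
    return 10 ** ((ct - intercept) / slope)
```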

  9. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Directory of Open Access Journals (Sweden)

    Lett Jean-Michel

    2011-08-01

    Full Text Available Abstract Background Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Results Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10^9 to 2 × 10^3 copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10^8 to 2 × 10^3 copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. Conclusions To detect and

  10. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Jackson, Christopher B.; Gallati, Sabina; Schaller, André

    2012-01-01

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Abnormal mtDNA content is indicative of mitochondrial disorders and mostly manifests in a tissue-specific manner, so handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze–thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonification and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA) we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time different degradation impact of the two
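The correction idea sketched in the last sentence, using the decay constants λ_nDNA and λ_mtDNA to back-extrapolate a ratio, can be illustrated as follows (hypothetical constants and times, not the study's measurements):

```python
import math

def corrected_ratio(measured_ratio, lam_mt, lam_n, t):
    """If both targets decay exponentially,
        measured_ratio = (mt0 * exp(-lam_mt * t)) / (n0 * exp(-lam_n * t)),
    then the undegraded ratio mt0/n0 is recovered by multiplying the
    measured ratio with exp((lam_mt - lam_n) * t)."""
    return measured_ratio * math.exp((lam_mt - lam_n) * t)
```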

  11. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Abnormal mtDNA content is indicative of mitochondrial disorders and mostly manifests in a tissue-specific manner, so handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze–thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonification and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA) we present an approach to possibly correct measurements in

  12. Competitive Reporter Monitored Amplification (CMA) - Quantification of Molecular Targets by Real Time Monitoring of Competitive Reporter Hybridization

    Science.gov (United States)

    Ullrich, Thomas; Ermantraut, Eugen; Schulz, Torsten; Steinmetzer, Katrin

    2012-01-01

    Background State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. Methodology and Principal Findings The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. Conclusions and Significance The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. The ability to accommodate multiple controls and targets into a
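The inverse relationship between amplicon concentration and spot signal can be captured by a deliberately crude mass-balance toy model (not from the paper); it assumes 1:1 sequestration of reporter probes by amplicon strands in solution:

```python
def array_signal(reporter_total, amplicon_copies):
    """Toy model: reporters hybridized to amplicon in solution are no
    longer available to the capture probe on the array, so normalized
    spot fluorescence falls as amplicon accumulates."""
    free = max(reporter_total - amplicon_copies, 0.0)
    return free / reporter_total   # normalized fluorescence at the spot
```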

  13. Competitive reporter monitored amplification (CMA) - quantification of molecular targets by real time monitoring of competitive reporter hybridization.

    Directory of Open Access Journals (Sweden)

    Thomas Ullrich

    Full Text Available BACKGROUND: State of the art molecular diagnostic tests are based on the sensitive detection and quantification of nucleic acids. However, currently established diagnostic tests are characterized by elaborate and expensive technical solutions hindering the development of simple, affordable and compact point-of-care molecular tests. METHODOLOGY AND PRINCIPAL FINDINGS: The described competitive reporter monitored amplification allows the simultaneous amplification and quantification of multiple nucleic acid targets by polymerase chain reaction. Target quantification is accomplished by real-time detection of amplified nucleic acids utilizing a capture probe array and specific reporter probes. The reporter probes are fluorescently labeled oligonucleotides that are complementary to the respective capture probes on the array and to the respective sites of the target nucleic acids in solution. Capture probes and amplified target compete for reporter probes. Increasing amplicon concentration leads to decreased fluorescence signal at the respective capture probe position on the array which is measured after each cycle of amplification. In order to observe reporter probe hybridization in real-time without any additional washing steps, we have developed a mechanical fluorescence background displacement technique. CONCLUSIONS AND SIGNIFICANCE: The system presented in this paper enables simultaneous detection and quantification of multiple targets. Moreover, the presented fluorescence background displacement technique provides a generic solution for real time monitoring of binding events of fluorescently labelled ligands to surface immobilized probes. With the model assay for the detection of human immunodeficiency virus type 1 and 2 (HIV 1/2), we have been able to observe the amplification kinetics of five targets simultaneously and accommodate two additional hybridization controls with a simple instrument set-up. 
The ability to accommodate multiple controls

  14. Bacterial tag encoded FLX titanium amplicon pyrosequencing (bTEFAP) based assessment of prokaryotic diversity in metagenome of Lonar soda lake, India

    Directory of Open Access Journals (Sweden)

    Pravin Dudhagara

    2015-06-01

    Full Text Available Bacterial diversity and archaeal diversity in the metagenome of the Lonar soda lake sediment were assessed by bacterial tag-encoded FLX amplicon pyrosequencing (bTEFAP). The metagenome comprised 5093 sequences with 2,531,282 bp and 53 ± 2% G + C content. Metagenome sequence data are available at NCBI under the Bioproject database with accession no. PRJNA218849. The metagenome sequences represented 83.1% bacterial and 10.5% archaeal origin. A total of 14 different bacterial phyla representing 57 species were recorded, with dominating species such as Coxiella burnetii (17%), Fibrobacter intestinalis (12%) and Candidatus Cloacamonas acidaminovorans (11%). Two archaeal phyla representing 24 species were present, among which Methanosaeta harundinacea (35%), Methanoculleus chikugoensis (12%) and Methanolinea tarda (11%) were the dominating species. The significant presence of unclassified sequences (11%) indicated the possibility of unknown novel prokaryotes in the metagenome.

  15. 16S rRNA Amplicon Sequencing Demonstrates that Indoor-Reared Bumblebees (Bombus terrestris) Harbor a Core Subset of Bacteria Normally Associated with the Wild Host.

    Directory of Open Access Journals (Sweden)

    Ivan Meeus

    Full Text Available A MiSeq multiplexed 16S rRNA amplicon sequencing of the gut microbiota of wild and indoor-reared Bombus terrestris (bumblebees) confirmed the presence of a core set of bacteria, which consisted of Neisseriaceae (Snodgrassella), Orbaceae (Gilliamella), Lactobacillaceae (Lactobacillus), and Bifidobacteriaceae (Bifidobacterium). In wild B. terrestris we detected several non-core bacteria with a more variable prevalence. Although Enterobacteriaceae go unreported in studies not based on next-generation sequencing, they can become dominant gut residents. Furthermore, the presence of some non-core lactobacilli was associated with the relative abundance of bifidobacteria. This association was not observed in indoor-reared bumblebees, which lack the non-core bacteria but have a more standardized microbiota than their wild counterparts. The impact of this bottlenecked microbiota of indoor-reared bumblebees when they are used in the field for pollination purposes is discussed.

  16. Identification and Evaluation of Single-Nucleotide Polymorphisms in Allotetraploid Peanut (Arachis hypogaea L.) Based on Amplicon Sequencing Combined with High Resolution Melting (HRM) Analysis.

    Science.gov (United States)

    Hong, Yanbin; Pandey, Manish K; Liu, Ying; Chen, Xiaoping; Liu, Hong; Varshney, Rajeev K; Liang, Xuanqiang; Huang, Shangzhi

    2015-01-01

    The cultivated peanut (Arachis hypogaea L.) is an allotetraploid (AABB) species derived from the A-genome (Arachis duranensis) and B-genome (Arachis ipaensis) progenitors. The presence of two versions of a DNA sequence based on the two progenitor genomes poses a serious technical and analytical problem during single nucleotide polymorphism (SNP) marker identification and analysis. In this context, we have analyzed 200 amplicons derived from expressed sequence tags (ESTs) and genome survey sequences (GSS) to identify SNPs in a panel of genotypes consisting of 12 cultivated peanut varieties and two diploid progenitors representing the ancestral genomes. A total of 18 EST-SNPs and 44 genomic-SNPs were identified in 12 peanut varieties by aligning the sequence of A. hypogaea with the diploid progenitors. The average frequency of sequence polymorphism was higher for genomic-SNPs than for EST-SNPs, with one genomic-SNP every 1011 bp compared to one EST-SNP every 2557 bp. In order to estimate the potential and further applicability of these identified SNPs, 96 peanut varieties were genotyped using the high resolution melting (HRM) method. Polymorphism information content (PIC) values for EST-SNPs ranged between 0.021 and 0.413 with a mean of 0.172 in the set of peanut varieties, while genomic-SNPs ranged between 0.080 and 0.478 with a mean of 0.249. A total of 33 SNPs were used for polymorphism detection among the parents and 10 selected lines from the mapping population Y13Zh (Zhenzhuhei × Yueyou13). Of the total 33 SNPs, nine SNPs showed polymorphism in the mapping population Y13Zh, and seven SNPs were successfully mapped into five linkage groups. Our results showed that SNPs can be identified in allotetraploid peanut with high accuracy through amplicon sequencing and HRM assay. The identified SNPs were very informative and can be used for different genetic and breeding applications in peanut.
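The PIC values reported above follow the standard polymorphism-information-content formula; a sketch with illustrative allele frequencies (the formula is standard, the data are not from the study):

```python
def pic(freqs):
    """Polymorphism information content for a marker with the given
    allele frequencies: PIC = 1 - sum(p_i^2) - sum_{i<j} 2 p_i^2 p_j^2."""
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1.0 - s2 - cross
```

For a biallelic SNP with equal allele frequencies (p = q = 0.5), PIC reaches its biallelic maximum of 0.375; a monomorphic marker gives 0.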

  17. Quantification of 16S rRNAs in complex bacterial communities by multiple competitive reverse transcription-PCR in temperature gradient gel electrophoresis fingerprints.

    Science.gov (United States)

    Felske, A; Akkermans, A D; De Vos, W M

    1998-11-01

    A novel approach was developed to quantify rRNA sequences in complex bacterial communities. The main bacterial 16S rRNAs in Drentse A grassland soils (The Netherlands) were amplified by reverse transcription (RT)-PCR with bacterium-specific primers and were separated by temperature gradient gel electrophoresis (TGGE). The primer pair used (primers U968-GC and L1401) was found to amplify with the same efficiency 16S rRNAs from bacterial cultures containing different taxa and cloned 16S ribosomal DNA amplicons from uncultured soil bacteria. The sequence-specific efficiency of amplification was determined by monitoring the amplification kinetics by kinetic PCR. The primer-specific amplification efficiency was assessed by competitive PCR and RT-PCR, and identical input amounts of different 16S rRNAs resulted in identical amplicon yields. The sequence-specific detection system used for competitive amplifications was TGGE, which also has been found to be suitable for simultaneous quantification of more than one sequence. We demonstrate that this approach can be applied to TGGE fingerprints of soil bacteria to estimate the ratios of the bacterial 16S rRNAs.
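Competitive quantification of this kind reads the unknown off the competitor dilution at which target and competitor yields are equal; a generic sketch with toy numbers (not the TGGE data):

```python
import math

def equivalence_point(competitor_inputs, log10_ratios):
    """Given a competitor dilution series co-amplified with a fixed
    unknown, linearly interpolate log10(target/competitor product)
    against log10(competitor input) to find the ratio == 1 crossing,
    which estimates the unknown's input amount."""
    pts = sorted(zip([math.log10(c) for c in competitor_inputs],
                     log10_ratios))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if y0 >= 0 >= y1:
            x = x0 + (x1 - x0) * (0 - y0) / (y1 - y0)
            return 10 ** x
    raise ValueError("ratio never crosses 1 in the series")
```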

  18. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  19. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for detection and imaging of bio-photons emitted by luminescent luciferase activation. The photons measured in this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal using a lab-made animal light imaging system (ALIS) and two kinds of light sources: three bacterial light-emitting sources containing different numbers of bacteria, and three different non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presented a constant value even when different light sources were applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment, given the linear response behavior of constant light-emitting sources to measurement time.
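The proposed linearity, accumulated counts proportional to measurement time so that counts per unit time is constant, can be expressed minimally (toy numbers, not the ALIS measurements):

```python
def photon_flux(counts, seconds):
    """For a constant source, accumulated photon counts grow linearly
    with exposure time, so counts/time is the time-independent quantity
    to report (the flux-like value in the quantification above)."""
    return counts / seconds

# hypothetical (exposure seconds, accumulated counts) pairs for one source
measurements = [(1.0, 480.0), (2.0, 960.0), (5.0, 2400.0)]
fluxes = [photon_flux(c, t) for t, c in measurements]
```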

  20. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  1. The use of genus-specific amplicon pyrosequencing to assess Phytophthora species diversity using eDNA from soil and water in Northern Spain.

    Science.gov (United States)

    Català, Santiago; Pérez-Sierra, Ana; Abad-Campos, Paloma

    2015-01-01

    Phytophthora is one of the most important and aggressive plant pathogenic genera in agriculture and forestry. Early detection and identification of its pathways of infection and spread are of high importance to minimize the threat these pathogens pose to natural ecosystems. eDNA was extracted from soil and water from forests and plantations in the north of Spain. Phytophthora-specific primers were adapted for use in high-throughput sequencing (HTS). Primers were tested in a control reaction containing eight Phytophthora species and applied to water and soil eDNA samples from northern Spain. Different score coverage threshold values were tested for optimal Phytophthora species separation in a custom-curated database and in the control reaction. Clustering at 99% was the optimal criterion to separate most of the Phytophthora species. Multiple Molecular Operational Taxonomic Units (MOTUs) corresponding to 36 distinct Phytophthora species were amplified in the environmental samples. Pyrosequencing of amplicons from soil samples revealed low Phytophthora diversity (13 species) in comparison with the 35 species detected in water samples. Thirteen of the MOTUs detected in rivers and streams showed no close match to sequences in international sequence databases, revealing that eDNA pyrosequencing is a useful strategy to assess Phytophthora species diversity in natural ecosystems.

  2. A performance evaluation of Nextera XT and KAPA HyperPlus for rapid Illumina library preparation of long-range mitogenome amplicons.

    Science.gov (United States)

    Ring, Joseph D; Sturk-Andreaggi, Kimberly; Peck, Michelle A; Marshall, Charla

    2017-07-01

    Next-generation sequencing (NGS) facilitates the rapid and high-throughput generation of human mitochondrial genome (mitogenome) data to build population and reference databases for forensic comparisons. To this end, long-range amplification provides an effective method of target enrichment that is amenable to library preparation assays employing DNA fragmentation. This study compared the Nextera XT DNA Library Preparation Kit (Illumina, San Diego, CA) and the KAPA HyperPlus Library Preparation Kit (Kapa Biosystems, Wilmington, MA) for enzymatic fragmentation and indexing of ∼8500 bp mitogenome amplicons for Illumina sequencing. The Nextera XT libraries produced low-coverage regions that were consistent across all samples, while the HyperPlus libraries resulted in uniformly high coverage across the mitogenome, even with reduced-volume reaction conditions. The balanced coverage observed from KAPA HyperPlus libraries enables not only low-level variant calling across the mitogenome but also increased sample multiplexing for greater processing efficiency. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Microfluidic PCR Amplification and MiSeq Amplicon Sequencing Techniques for High-Throughput Detection and Genotyping of Human Pathogenic RNA Viruses in Human Feces, Sewage, and Oysters

    Directory of Open Access Journals (Sweden)

    Mamoru Oshiki

    2018-04-01

    Detection and genotyping of pathogenic RNA viruses in human and environmental samples are useful for monitoring the circulation and prevalence of these pathogens, whereas a conventional PCR assay followed by Sanger sequencing is time-consuming and laborious. The present study aimed to develop a high-throughput detection-and-genotyping tool for 11 human RNA viruses [Aichi virus; astrovirus; enterovirus; norovirus genogroups GI, GII, and GIV; hepatitis A virus; hepatitis E virus; rotavirus; sapovirus; and human parechovirus] using a microfluidic device and a next-generation sequencer. Microfluidic nested PCR was carried out on a 48.48 Access Array chip, and the amplicons were recovered and used for MiSeq sequencing (Illumina, Tokyo, Japan); genotyping was conducted by homology searching and phylogenetic analysis of the obtained sequence reads. The detection limit of the 11 tested viruses ranged from 10⁰ to 10³ copies/μL in cDNA samples, corresponding to 10¹–10⁴ copies/mL of sewage, 10⁵–10⁸ copies/g of human feces, and 10²–10⁵ copies/g of oyster digestive tissue. The developed assay was successfully applied for simultaneous detection and genotyping of RNA viruses in samples of human feces, sewage, and artificially contaminated oysters. Microfluidic nested PCR followed by MiSeq sequencing enables efficient tracking of the fate of multiple RNA viruses in various environments, which is essential for a better understanding of the circulation of human pathogenic RNA viruses in the human population.

  4. Size Matters: Assessing Optimum Soil Sample Size for Fungal and Bacterial Community Structure Analyses Using High Throughput Sequencing of rRNA Gene Amplicons

    Directory of Open Access Journals (Sweden)

    Christopher Ryan Penton

    2016-06-01

    We examined the effect of different soil sample sizes obtained from an agricultural field, under a single cropping system uniform in soil properties and aboveground crop responses, on bacterial and fungal community structure and microbial diversity indices. DNA extracted from soil sample sizes of 0.25, 1, 5 and 10 g using MoBIO kits, and from 10 and 100 g sizes using a bead-beating method (SARDI), was used as template for high-throughput sequencing of 16S and 28S rRNA gene amplicons for bacteria and fungi, respectively, on the Illumina MiSeq and Roche 454 platforms. Sample size significantly affected overall bacterial and fungal community structure, replicate dispersion and the number of operational taxonomic units (OTUs) retrieved. Richness, evenness and diversity were also significantly affected. The largest diversity estimates were always associated with the 10 g MoBIO extractions, with a corresponding reduction in replicate dispersion. For the fungal data, smaller MoBIO extractions identified more unclassified Eukaryota incertae sedis and unclassified Glomeromycota, while the SARDI method retrieved more abundant OTUs containing unclassified Pleosporales and the fungal genera Alternaria and Cercophora. Overall, these findings indicate that a 10 g soil DNA extraction is most suitable for both soil bacterial and fungal communities, retrieving optimal diversity while still capturing rarer taxa and decreasing replicate variation.
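    The richness, evenness and diversity measures compared above can be computed directly from an OTU abundance table. A minimal sketch using the Shannon index and Pielou's evenness; the counts are hypothetical.

```python
import math

def shannon(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over non-zero OTUs."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

def richness(counts):
    """Observed richness S: number of OTUs with non-zero counts."""
    return sum(1 for c in counts if c > 0)

def evenness(counts):
    """Pielou's evenness J' = H' / ln(S)."""
    s = richness(counts)
    return shannon(counts) / math.log(s) if s > 1 else 0.0

otu_counts = [50, 30, 15, 5]     # hypothetical OTU abundances for one sample
print(richness(otu_counts))      # 4
print(round(shannon(otu_counts), 3))
print(round(evenness(otu_counts), 3))
```

    Comparing these indices across sample sizes, as the study does, then reduces to computing them per replicate and testing for differences.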

  5. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  6. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  7. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  8. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease' syndromes in regards to this crop. The result of these ..... parison of treatments such as cultivars or control measures and ..... Vascular discoloration and stem necrosis. 2.

  9. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  10. Direct qPCR quantification using the Quantifiler® Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
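    Using a quantification result to choose the template input for STR amplification reduces to a simple dilution calculation. A sketch; the 500 pg target and 15 μL volume cap are assumed example values for illustration, not the kit protocol.

```python
def str_input_volume(conc_pg_per_ul, target_pg=500.0, max_vol_ul=15.0):
    """Volume of extract (in microlitres) needed to reach a target template
    amount for STR amplification, capped at the maximum reaction volume.
    Target and cap are hypothetical example values."""
    if conc_pg_per_ul <= 0:
        return max_vol_ul              # low-level sample: add the maximum volume
    return min(target_pg / conc_pg_per_ul, max_vol_ul)

print(str_input_volume(100.0))   # 5.0 uL of a 100 pg/uL extract
print(str_input_volume(10.0))    # capped at 15.0 uL for a dilute extract
```

    The point made in the abstract is that this calculation is only as reliable as the quantification feeding it, which is why assay sensitivity below 10 pg matters.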

  11. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier–Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.

  12. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard in order to quantify the activity of nuclear materials by determining the calibration coefficient are useless on non-reproducible, complex and singular nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them are strongly dependent on package data knowledge and operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. This method combines a global stochastic approach, which uses, among others, surrogate models to simulate the gamma attenuation behaviour; a Bayesian approach, which considers conditional probability densities of problem inputs; and Markov chain Monte Carlo (MCMC) algorithms, which solve inverse problems, with the gamma-ray emission radionuclide spectrum and the outside dimensions of the objects of interest. The methodology is being tested to quantify actinide activity in standards of different matrices, compositions, and source configurations in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
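    A toy one-dimensional analogue of the MCMC-based inverse approach described above: infer a source activity from a single attenuated count measurement with a random-walk Metropolis sampler. The Gaussian likelihood, flat prior range and detection efficiency are illustrative assumptions, not the CEA model.

```python
import math
import random

random.seed(0)
EFF, SIGMA, Y_OBS = 0.05, 2.0, 25.0          # assumed efficiency, noise sd, measurement

def log_post(a):
    """Log-posterior of activity a under a flat prior and Gaussian likelihood
    for the forward model y = EFF * a + noise."""
    if not (0.0 < a < 2000.0):               # flat prior on a plausible range
        return -math.inf
    return -((Y_OBS - EFF * a) ** 2) / (2 * SIGMA ** 2)

a, samples = 400.0, []
lp = log_post(a)
for _ in range(20000):
    prop = a + random.gauss(0.0, 30.0)       # random-walk proposal
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        a, lp = prop, lp_prop                # accept
    samples.append(a)

post_mean = sum(samples[5000:]) / len(samples[5000:])   # discard burn-in
print(round(post_mean))                      # close to Y_OBS / EFF = 500
```

    The real problem replaces the scalar forward model with surrogate models of gamma attenuation and adds many more unknowns, but the accept/reject mechanics are the same.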

  13. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  14. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  15. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
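    In the scalar Gaussian case, the linear Bayesian update discussed in these abstracts reduces to the familiar Kalman form, where the posterior mean is the conditional expectation E[x | y]. A minimal sketch with assumed example numbers:

```python
def linear_bayes_update(x_prior, var_prior, y, var_obs):
    """Scalar linear (Kalman-type) Bayesian update for the observation
    model y = x + noise, with Gaussian prior and noise."""
    k = var_prior / (var_prior + var_obs)     # gain: how much to trust the data
    x_post = x_prior + k * (y - x_prior)      # conditional mean E[x | y]
    var_post = (1 - k) * var_prior            # reduced posterior variance
    return x_post, var_post

# prior N(1, 4), observation y = 3 with noise variance 1
x, v = linear_bayes_update(x_prior=1.0, var_prior=4.0, y=3.0, var_obs=1.0)
print(x, v)   # posterior mean 2.6, posterior variance 0.8
```

    The quadratic update compared in the abstract extends this linear map of the data to a second-order polynomial; the sampling-free character is what both share.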

  16. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Quantification of chest abnormalities is usually performed on computed tomography (CT) scans; this quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm for computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs, with compromised dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland–Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)
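    The Bland–Altman agreement analysis used above can be sketched in a few lines: compute the bias (mean difference) between the two methods and the 95% limits of agreement (bias ± 1.96·SD of the differences). The lesion-measurement values are hypothetical.

```python
import statistics

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)              # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

radiograph = [10.2, 8.5, 12.1, 9.8, 11.0]    # hypothetical lesion quantities
ct         = [10.0, 9.0, 11.5, 10.1, 10.6]
bias, lo, hi = bland_altman(radiograph, ct)

# methods agree if every per-patient difference falls inside [lo, hi]
diffs = [x - y for x, y in zip(radiograph, ct)]
print(all(lo <= d <= hi for d in diffs))
```

    "All samples within the limits of agreement", as reported in the abstract, corresponds to the final check returning true for every patient.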

  17. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  18. Influence of heavy metals on rhizosphere microbial communities of Siam weed (Chromolaena odorata (L.)) using a 16S rRNA gene amplicon sequencing approach

    Directory of Open Access Journals (Sweden)

    Thanyaporn Ruangdech

    2017-06-01

    A 16S rRNA amplicon sequencing approach was used to assess the impacts of cadmium (Cd) and zinc (Zn) contamination on populations of rhizobacteria on Siam weed (Chromolaena odorata (L.)). Bacterial communities were characterized using the Illumina MiSeq platform and the V6 hypervariable region of the 16S rRNA gene. Among the 54,026 unique operational taxonomic units (OTUs) identified, 99.7% were classified as bacteria and the rest were classified as archaea. Several dominant bacterial phyla were observed in all samples—Proteobacteria, Actinobacteria, Acidobacteria, Firmicutes and Bacteroidetes. These five phyla accounted for 89.2% of all OTUs identified among all sites, and only two OTUs could not be classified to a phylum. Comparison among samples containing low and high levels of Cd contamination using nonparametric Shannon and Shannon diversity indices showed that soils with low levels of diversity had a higher level of Cd (p < 0.05). These results indicated that levels of Cd may significantly alter bacterial species selection. The Cd- and Zn-resistant bacteria from each sample were subjected to heavy-metal minimum inhibitory concentration (MIC) analyses. The MIC values obtained from 1152 isolates were used to individually analyze the pattern of gene function using the BioNumerics software. The results of this analysis showed that 26.7% of the bacteria were resistant to Cd concentrations up to 320 mg/L and only 2.3% of bacteria were resistant to Zn at concentrations up to 3200 mg/L. The MIC analyses indicated that the number of resistant bacteria decreased with increasing metal concentrations and that bacteria resistant to Cd and Zn may contain more than one group of metal-resistance genes.

  19. Lactococcus lactis Diversity in Undefined Mixed Dairy Starter Cultures as Revealed by Comparative Genome Analyses and Targeted Amplicon Sequencing of epsD.

    Science.gov (United States)

    Frantzen, Cyril A; Kleppen, Hans Petter; Holo, Helge

    2018-02-01

    Undefined mesophilic mixed (DL) starter cultures are used in the production of continental cheeses and contain unknown strain mixtures of Lactococcus lactis and leuconostocs. The choice of starter culture affects the taste, aroma, and quality of the final product. To gain insight into the diversity of Lactococcus lactis strains in starter cultures, we whole-genome sequenced 95 isolates from three different starter cultures. Pan-genomic analyses, which included 30 publicly available complete genomes, grouped the strains into 21 L. lactis subsp. lactis and 28 L. lactis subsp. cremoris lineages. Only one of the 95 isolates grouped with previously sequenced strains, and the three starter cultures showed no overlap in lineage distributions. The culture diversity was assessed by targeted amplicon sequencing using purR, a core gene, and epsD, present in 93 of the 95 starter culture isolates but absent in most of the reference strains. This enabled an unprecedented discrimination of starter culture Lactococcus lactis and revealed substantial differences between the three starter cultures and compositional shifts during the cultivation of cultures in milk. IMPORTANCE In contemporary cheese production, standardized frozen seed stock starter cultures are used to ensure production stability, reproducibility, and quality control of the product. The dairy industry experiences significant disruptions of cheese production due to phage attacks, and one commonly used countermeasure is to employ a starter rotation strategy, in which two or more starters with minimal overlap in phage sensitivity are used alternately. A culture-independent analysis of the lactococcal diversity in complex undefined starter cultures revealed large differences between the three starter cultures and temporal shifts in lactococcal composition during the production of bulk starters. A better understanding of the lactococcal diversity in starter cultures will enable the development of

  20. (SELAX strain) B2 amplicon

    African Journals Online (AJOL)

    USER

    2010-07-12

    Jul 12, 2010 ... enzyme was inactivated by heating at 75°C for 10 min after the addition of 2 µl of ... material of the vector DNA and the partial genomic digests were packaged in ... tubes. The tubes were incubated for 15 min at 37°C. Molten LB ... Plaque transfer was carried out for 2 min with ink markings made on the agar ...

  1. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolative forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification is briefly reviewed and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience - often personal - of the researcher and this scholarly endeavour is, therefore, not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourism industry.

  2. Simultaneous DNA-RNA Extraction from Coastal Sediments and Quantification of 16S rRNA Genes and Transcripts by Real-time PCR.

    Science.gov (United States)

    Tatti, Enrico; McKew, Boyd A; Whitby, Corrine; Smith, Cindy J

    2016-06-11

    Real-time polymerase chain reaction, also known as quantitative PCR (q-PCR), is a widely used tool in microbial ecology to quantify gene abundances of taxonomic and functional groups in environmental samples. Used in combination with a reverse transcriptase reaction (RT-q-PCR), it can also be employed to quantify gene transcripts. q-PCR makes use of highly sensitive fluorescent detection chemistries that allow quantification of PCR amplicons during the exponential phase of the reaction. Therefore, the biases associated with 'end-point' PCR, detected in the plateau phase of the PCR reaction, are avoided. A protocol to quantify bacterial 16S rRNA genes and transcripts from coastal sediments via real-time PCR is provided. First, a method for the co-extraction of DNA and RNA from coastal sediments, including the additional steps required for the preparation of DNA-free RNA, is outlined. Second, a step-by-step guide for the quantification of 16S rRNA genes and transcripts from the extracted nucleic acids via q-PCR and RT-q-PCR is outlined. This includes details for the construction of DNA and RNA standard curves. Key considerations for the use of RT-q-PCR assays in microbial ecology are included.
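    Quantification by q-PCR rests on a standard curve relating the quantification cycle (Cq) to the log of template amount. A minimal sketch that fits the curve for a dilution series, derives amplification efficiency from the slope (efficiency = 10^(-1/slope) - 1), and quantifies an unknown; all Cq values are illustrative, not from the protocol above.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

# dilution series of a 16S rRNA gene standard: 1e3 .. 1e7 copies per reaction
log_copies = [3, 4, 5, 6, 7]
cq         = [30.1, 26.8, 23.4, 20.1, 16.7]   # hypothetical measured Cq values

m, b = fit_line(log_copies, cq)               # slope near -3.32 means ~100% efficiency
efficiency = 10 ** (-1 / m) - 1               # 1.0 would be perfect doubling per cycle
unknown = 10 ** ((22.0 - b) / m)              # copies for an unknown sample with Cq = 22.0
```

    The same fit applied to an RNA standard curve, after reverse transcription, yields transcript abundances.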

  3. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low-frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of penicillamine enantiomer mixtures was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)

  4. Characteristics of MHC class I genes in house sparrows Passer domesticus as revealed by long cDNA transcripts and amplicon sequencing.

    Science.gov (United States)

    Karlsson, Maria; Westerdahl, Helena

    2013-08-01

    In birds the major histocompatibility complex (MHC) organization differs both among and within orders; chickens Gallus gallus of the order Galliformes have a simple arrangement, while many songbirds of the order Passeriformes have a more complex arrangement with larger numbers of MHC class I and II genes. Chicken MHC genes are found at two independent loci, classical MHC-B and non-classical MHC-Y, whereas non-classical MHC genes are yet to be verified in passerines. Here we characterize MHC class I transcripts (α1 to α3 domain) and perform amplicon sequencing using a next-generation sequencing technique on exon 3 from house sparrow Passer domesticus (a passerine) families. Then we use phylogenetic, selection, and segregation analyses to gain a better understanding of the MHC class I organization. Trees based on the α1 and α2 domains revealed a distinct cluster with short terminal branches for transcripts with a 6-bp deletion. Interestingly, this cluster was not seen in the tree based on the α3 domain. Twenty-one exon 3 sequences were verified in a single individual, and the average numbers within an individual were nine and five for sequences with and without a 6-bp deletion, respectively. All individuals had exon 3 sequences with and without a 6-bp deletion. The sequences with a 6-bp deletion have many characteristics in common with non-classical MHC, e.g., highly conserved amino acid positions were substituted compared with the other alleles, nucleotide diversity was low, and just a single site was subject to positive selection. However, these alleles also have characteristics that suggest they could be classical, e.g., complete linkage and absence of a distinct cluster in a tree based on the α3 domain. Thus, we cannot determine for certain whether or not the alleles with a 6-bp deletion are non-classical based on our present data. Further analyses on segregation patterns of these alleles in combination with dating the 6-bp deletion through MHC characterization across the

  5. Illumina MiSeq Phylogenetic Amplicon Sequencing Shows a Large Reduction of an Uncharacterised Succinivibrionaceae and an Increase of the Methanobrevibacter gottschalkii Clade in Feed Restricted Cattle.

    Directory of Open Access Journals (Sweden)

    Matthew Sean McCabe

    Periodic feed restriction is used in cattle production to reduce feed costs. When normal feed levels are resumed, cattle catch up to a normal weight through an acceleration of the normal growth rate, known as compensatory growth, which is not yet fully understood. Illumina MiSeq phylogenetic marker amplicon sequencing of DNA extracted from the rumen contents of 55 bulls showed that restriction of feed (70% concentrate, 30% grass silage) for 125 days, to levels that caused a 60% reduction of growth rate, resulted in a large increase in the relative abundance of the Methanobrevibacter gottschalkii clade (designated as OTU-M7), and a large reduction of an uncharacterised Succinivibrionaceae species (designated as OTU-S3004). There was a strong negative Spearman correlation (ρ = -0.72, P < 1×10⁻²⁰) between the relative abundances of OTU-S3004 and OTU-M7 in the liquid rumen fraction. There was also a significant increase in the acetate:propionate (A:P) ratio in feed-restricted animals that showed a negative Spearman correlation (ρ = -0.69, P < 1×10⁻²⁰) with the relative abundance of OTU-S3004 in the rumen liquid fraction but not the solid fraction, and a strong positive Spearman correlation with OTU-M7 in the rumen liquid (ρ = 0.74, P < 1×10⁻²⁰) and solid (ρ = 0.69, P < 1×10⁻²⁰) fractions. Reduced A:P ratios in the rumen are associated with increased feed efficiency and reduced production of methane, which has a global warming potential (GWP, 100 years) of 28. Succinivibrionaceae growth in the rumen was previously suggested to reduce methane emissions, as some members of this family utilise hydrogen, which is also utilised by methanogens for methanogenesis, to generate succinate which is converted to propionate. Relative abundance of OTU-S3004 showed a positive Spearman correlation with propionate (ρ = 0.41, P < 0.01) but not acetate in the liquid rumen fraction.
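    The Spearman correlations reported above are simply Pearson correlations computed on ranks. A small self-contained sketch; the abundance vectors are made up.

```python
def rank(xs):
    """1-based ranks, with ties given their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend over a tie group
        avg = (i + j) / 2 + 1           # average rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(a, b):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    ra, rb = rank(a), rank(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    sa = sum((x - ma) ** 2 for x in ra) ** 0.5
    sb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (sa * sb)

# perfectly opposite monotone abundances give rho ≈ -1, as for OTU-S3004 vs OTU-M7
rho = spearman([1, 5, 3, 9], [8, 2, 4, 1])
```

    Because it works on ranks, rho is robust to the skewed, non-normal distributions typical of relative abundance data, which is why it is favoured in such studies.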

  6. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  7. Distortion of genetically modified organism quantification in processed foods: influence of particle size compositions and heat-induced DNA degradation.

    Science.gov (United States)

    Moreano, Francisco; Busch, Ulrich; Engel, Karl-Heinz

    2005-12-28

    Milling fractions from conventional and transgenic corn were prepared at laboratory scale and used to study the influence of sample composition and heat-induced DNA degradation on the relative quantification of genetically modified organisms (GMO) in food products. Particle size distributions of the obtained fractions (coarse grits, regular grits, meal, and flour) were characterized using a laser diffraction system. The application of two DNA isolation protocols revealed a strong correlation between the degree of comminution of the milling fractions and the DNA yield in the extracts. Mixtures of milling fractions from conventional and transgenic material (1%) were prepared and analyzed via real-time polymerase chain reaction. Accurate quantification of the adjusted GMO content was only possible in mixtures containing conventional and transgenic material in the form of analogous milling fractions, whereas mixtures of fractions exhibiting different particle size distributions delivered significantly over- and underestimated GMO contents depending on their compositions. The process of heat-induced nucleic acid degradation was followed by applying two established quantitative assays showing differences between the lengths of the recombinant and reference target sequences (A, Δl(A) = -25 bp; B, Δl(B) = +16 bp; values relative to the amplicon length of the reference gene). Data obtained by the application of method A resulted in underestimated recoveries of GMO contents in the samples of heat-treated products, reflecting the favored degradation of the longer target sequence used for the detection of the transgene. In contrast, data yielded by the application of method B resulted in increasingly overestimated recoveries of GMO contents. The results show how commonly used food technological processes may lead to distortions in the results of quantitative GMO analyses.
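
    Relative GMO quantification of this kind expresses the transgene signal as a ratio to a reference-gene signal, usually via the delta-delta-Ct calculation. A hedged sketch, assuming ~100% amplification efficiency (a doubling per cycle) for both assays; the paper does not state its exact calculation:

```python
# Relative GMO quantification from real-time PCR threshold cycles (Ct)
# via the delta-delta-Ct method, assuming ~100% amplification efficiency
# for both the transgene and the reference-gene assays.

def gmo_percent(ct_transgene, ct_reference, ct_transgene_cal, ct_reference_cal):
    """GMO content (%) of a sample relative to a 100% GMO calibrator."""
    delta_ct_sample = ct_transgene - ct_reference       # sample
    delta_ct_cal = ct_transgene_cal - ct_reference_cal  # 100% calibrator
    ddct = delta_ct_sample - delta_ct_cal
    return 100.0 * 2.0 ** (-ddct)

# A sample whose transgene lags the calibrator by ~3.32 cycles
# (relative to the reference gene) contains ~10% GMO material.
pct = gmo_percent(ct_transgene=28.32, ct_reference=22.0,
                  ct_transgene_cal=25.0, ct_reference_cal=22.0)
```

    The length-difference effect described above enters exactly here: preferential degradation of the longer amplicon shifts one Ct relative to the other, biasing the ratio.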

  8. Comparison of different real-time PCR chemistries and their suitability for detection and quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2008-03-01

    Full Text Available Abstract Background The real-time polymerase chain reaction is currently the method of choice for quantifying nucleic acids in different DNA-based quantification applications. It is also widely used for detecting and quantifying genetically modified components in food and feed, predominantly employing TaqMan® and SYBR® Green real-time PCR chemistries. In our study four alternative chemistries, Lux™, Plexor™, Cycling Probe Technology and LNA®, were extensively evaluated and compared using TaqMan® chemistry as a reference system. Results Amplicons were designed on the maize invertase gene and the 5'-junction of the inserted transgene and plant genomic DNA in the MON 810 event. Real-time assays were subsequently compared for their efficiency in PCR amplification, limits of detection and quantification, repeatability and accuracy to test the performance of the assays. Additionally, the specificity of the established assays was checked on various transgenic and non-transgenic plant species. The overall applicability of the designed assays was evaluated, adding practicability and cost issues to the performance characteristics. Conclusion Although none of the chemistries significantly outperformed the others, certain characteristics suggest that LNA® technology is an alternative to TaqMan® when designing assays for quantitative analysis. Because LNA® probes are much shorter, they might be especially appropriate when high specificity is required and where the design of a common TaqMan® probe is difficult or even impossible due to sequence characteristics. Plexor™, on the other hand, might be the method of choice for qualitative analysis when sensitivity, low cost and simplicity of use prevail.

  9. TaqMan probe real-time polymerase chain reaction assay for the quantification of canine DNA in chicken nugget.

    Science.gov (United States)

    Rahman, Md Mahfujur; Hamid, Sharifah Bee Abd; Basirun, Wan Jefrey; Bhassu, Subha; Rashid, Nur Raifana Abdul; Mustafa, Shuhaimi; Mohd Desa, Mohd Nasir; Ali, Md Eaqub

    2016-01-01

    This paper describes a short-amplicon-based TaqMan probe quantitative real-time PCR (qPCR) assay for the quantitative detection of canine meat in chicken nuggets, which are very popular across the world, including Malaysia. The assay targeted a 100-bp fragment of the canine cytb gene using a canine-specific primer and TaqMan probe. Specificity testing against 10 different animal and plant species demonstrated threshold cycles (Ct) of 16.13 ± 0.12 to 16.25 ± 0.23 for canine DNA and negative results for the others in a 40-cycle reaction. The assay was tested for the quantification of as little as 0.01% canine meat in deliberately spiked chicken nuggets with 99.7% PCR efficiency and a 0.995 correlation coefficient. The analysis of the actual and qPCR-predicted values showed a high recovery rate (from 87% ± 28% to 112% ± 19%) with a linear regression close to unity (R² = 0.999). Finally, samples of three halal-branded commercial chicken nuggets collected from different Malaysian outlets were screened for canine meat, but no contamination was demonstrated.
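
    Figures such as the reported 99.7% PCR efficiency come from a dilution standard curve: Ct is regressed on log10 template amount, and the slope is converted to efficiency via E = 10^(-1/slope) - 1. A minimal sketch with synthetic dilution data:

```python
# qPCR standard curve: fit Ct against log10(template amount) by least
# squares; the slope gives amplification efficiency E = 10**(-1/slope) - 1.
# An ideal assay doubles product each cycle (slope ~ -3.32, E ~ 100%).

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def pcr_efficiency(log10_amounts, cts):
    slope, _ = fit_line(log10_amounts, cts)
    return 10.0 ** (-1.0 / slope) - 1.0

# Ten-fold dilution series behaving ideally (Ct rises ~3.3219 per decade).
logs = [0.0, -1.0, -2.0, -3.0, -4.0]
cts = [20.0 + (-3.321928) * L for L in logs]
eff = pcr_efficiency(logs, cts)  # -> ~1.0, i.e. ~100% efficiency
```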

  10. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
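
    Colour thresholding of the kind described above amounts to classifying pixels by hue and quantifying the fraction falling inside a window, rather than by grey intensity alone. A minimal sketch using Python's standard colorsys module; the hue window and pixel values are illustrative, not calibrated to any stain:

```python
import colorsys

# Colour thresholding: classify each RGB pixel by its hue and report the
# stained fraction, the colour analogue of monochrome densitometry.

def hue_of(rgb):
    """Hue in [0, 1) of an (r, g, b) pixel with channels in [0, 1]."""
    return colorsys.rgb_to_hsv(*rgb)[0]

def fraction_in_hue_range(pixels, h_lo, h_hi):
    """Fraction of pixels whose hue falls within [h_lo, h_hi]."""
    hits = sum(1 for p in pixels if h_lo <= hue_of(p) <= h_hi)
    return hits / len(pixels)

# Four pixels: two reddish (a reddish stain, hypothetically), one green,
# one blue. A narrow window around hue 0 isolates the reddish pixels.
pixels = [(0.8, 0.1, 0.1), (0.7, 0.2, 0.15), (0.1, 0.8, 0.1), (0.1, 0.1, 0.8)]
red_fraction = fraction_in_hue_range(pixels, 0.0, 0.05)
```

    A real pipeline would also handle the hue wrap-around near 0/1 and threshold on saturation and value, but the principle, separating closely related hues that monochrome densitometry conflates, is the same.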

  11. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor, which exists when the control parameter is zero.
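
    Recurrence Quantification Analysis works from a recurrence matrix of the trajectory. A toy sketch of two common RQA measures, recurrence rate and determinism, for a scalar series; real analyses use delay embedding and a tuned threshold, neither of which is attempted here:

```python
# Minimal recurrence quantification: build a recurrence matrix for a scalar
# series (recurrent when |x_i - x_j| <= eps) and report the recurrence rate
# (RR, fraction of recurrent pairs) and determinism (DET, fraction of
# recurrent points lying on diagonal lines of length >= lmin).

def recurrence_matrix(series, eps):
    n = len(series)
    return [[1 if abs(series[i] - series[j]) <= eps else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(R):
    n = len(R)
    return sum(sum(row) for row in R) / float(n * n)

def determinism(R, lmin=2):
    n = len(R)
    total = sum(sum(row) for row in R)
    in_lines = 0
    # Walk every diagonal and count runs of recurrent points >= lmin.
    for d in range(-(n - 1), n):
        run = 0
        for i in range(n):
            j = i + d
            if 0 <= j < n and R[i][j]:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
        if run >= lmin:
            in_lines += run
    return in_lines / float(total) if total else 0.0

constant = [0.5] * 8  # a fixed point: every state recurs with every other
R = recurrence_matrix(constant, eps=0.01)
rr, det = recurrence_rate(R), determinism(R)
```

    Transitions of the kind studied in Liu's attractor show up as abrupt changes in measures like DET as the control parameter is swept.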

  12. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  13. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  14. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).

  15. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
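
    The band-selection step described above, correlating constituent content with reflectance band by band, can be sketched as follows. The data are synthetic and the winning band index is purely illustrative (the study's result was the 695 nm band):

```python
# Select the waveband whose reflectance correlates most strongly with a
# constituent (here, hypothetical THC contents), the single-band analogue
# of the correlation-analysis step used before stepwise regression.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def best_band(spectra, contents):
    """spectra: one reflectance vector per sample (one value per band)."""
    n_bands = len(spectra[0])
    scores = [abs(pearson_r([s[b] for s in spectra], contents))
              for b in range(n_bands)]
    return max(range(n_bands), key=scores.__getitem__)

# Synthetic samples: band 2 tracks THC content; the others are nearly flat.
thc = [1.0, 2.0, 3.0, 4.0]
spectra = [
    [0.50, 0.31, 0.10, 0.40],
    [0.51, 0.30, 0.21, 0.41],
    [0.49, 0.32, 0.33, 0.39],
    [0.50, 0.30, 0.44, 0.40],
]
band = best_band(spectra, thc)  # -> 2, the band that varies with THC
```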

  16. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  17. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  18. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  19. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  20. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  1. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
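
    The ensemble error measure advocated here reduces, in its simplest form, to a mean and standard error over independent replicas. A minimal sketch with hypothetical free-energy values (not taken from the paper):

```python
# Ensemble-based error estimate: run N independent replicas of the same
# free-energy calculation and report the ensemble mean with its standard
# error, instead of trusting a single long simulation.

def ensemble_estimate(values):
    """Return (mean, standard error) over independent replica results."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # unbiased variance
    return mean, (var / n) ** 0.5

# Hypothetical binding free energies (kcal/mol) from five MD replicas.
dg_replicas = [-7.1, -6.8, -7.4, -7.0, -6.7]
mean_dg, stderr_dg = ensemble_estimate(dg_replicas)
```

    The point of the protocol is that replica-to-replica scatter, not the apparent convergence of any one trajectory, is what bounds the reliability of the prediction.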

  2. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  3. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with an assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and by a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, is considered. The headings in the DET, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed

  4. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  5. Two populations of double minute chromosomes harbor distinct amplicons, the MYC locus at 8q24.2 and a 0.43-Mb region at 14q24.1, in the SW613-S human carcinoma cell line.

    Science.gov (United States)

    Guillaud-Bataille, M; Brison, O; Danglot, G; Lavialle, C; Raynal, B; Lazar, V; Dessen, P; Bernheim, A

    2009-01-01

    High-level amplifications observed in tumor cells are usually indicative of genes involved in oncogenesis. We report here a high resolution characterization of a new amplified region in the SW613-S carcinoma cell line. This cell line contains tumorigenic cells displaying high-level MYC amplification in the form of double minutes (DM(+) cells) and non tumorigenic cells exhibiting low-level MYC amplification in the form of homogeneously staining regions (DM(-) cells). Both cell types were studied at genomic and functional levels. The DM(+) cells display a second amplification, corresponding to the 14q24.1 region, in a distinct population of DMs. The 0.43-Mb amplified and overexpressed region contains the PLEK2, PIGH, ARG2, VTI1B, RDH11, and ZFYVE26 genes. Both amplicons were stably maintained upon in vitro and in vivo propagation. However, the 14q24.1 amplicon was not found in cells with high-level MYC amplification in the form of HSRs, either obtained after spontaneous integration of endogenous DM MYC copies or after transfection of DM(-) cells with a MYC gene expression vector. These HSR-bearing cells are highly tumorigenic. The 14q24.1 amplification may not play a role in malignancy per se but might contribute to maintaining the amplification in the form of DMs. Copyright 2009 S. Karger AG, Basel.

  6. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  7. Quantification of heterogeneity observed in medical images

    OpenAIRE

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    Background There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities ...

  8. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross-entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
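
    The pipeline described, gradient computation followed by thresholding and an area-percentage readout, can be sketched as below. For simplicity a fixed cutoff stands in for the automated cross-entropy threshold selection used in the study:

```python
# Gradient-based artifact extent: flag pixels where the image gradient
# magnitude exceeds a cutoff, and report the flagged area as a percentage
# of the image.

def gradient_magnitude(img):
    """Central-difference gradient magnitude of a 2D list-of-lists image,
    with edge replication at the borders."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            gy = (img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x]) / 2.0
            gx = (img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)]) / 2.0
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

def artifact_area_percent(img, cutoff):
    grad = gradient_magnitude(img)
    flagged = sum(1 for row in grad for g in row if g > cutoff)
    return 100.0 * flagged / (len(img) * len(img[0]))

# Synthetic "signal void": a dark square inside a bright background.
img = [[1.0] * 7 for _ in range(7)]
for y in range(2, 5):
    for x in range(2, 5):
        img[y][x] = 0.0
pct = artifact_area_percent(img, cutoff=0.25)  # nonzero: the void's rim
```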

  9. Toolbox Approaches Using Molecular Markers and 16S rRNA Gene Amplicon Data Sets for Identification of Fecal Pollution in Surface Water.

    Science.gov (United States)

    Ahmed, W; Staley, C; Sadowsky, M J; Gyawali, P; Sidhu, J P S; Palmer, A; Beale, D J; Toze, S

    2015-10-01

    In this study, host-associated molecular markers and bacterial 16S rRNA gene community analysis using high-throughput sequencing were used to identify the sources of fecal pollution in environmental waters in Brisbane, Australia. A total of 92 fecal and composite wastewater samples were collected from different host groups (cat, cattle, dog, horse, human, and kangaroo), and 18 water samples were collected from six sites (BR1 to BR6) along the Brisbane River in Queensland, Australia. Bacterial communities in the fecal, wastewater, and river water samples were sequenced. Water samples were also tested for the presence of bird-associated (GFD), cattle-associated (CowM3), horse-associated, and human-associated (HF183) molecular markers, to provide multiple lines of evidence regarding the possible presence of fecal pollution associated with specific hosts. Among the 18 water samples tested, 83%, 33%, 17%, and 17% were real-time PCR positive for the GFD, HF183, CowM3, and horse markers, respectively. Among the potential sources of fecal pollution in water samples from the river, DNA sequencing tended to show relatively small contributions from wastewater treatment plants (up to 13% of sequence reads). Contributions from other animal sources were rarely detected and were very small; results from sequencing and the host-associated molecular markers showed variable agreement. A lack of relationships among fecal indicator bacteria, host-associated molecular markers, and 16S rRNA gene community analysis data was also observed. Nonetheless, we show that bacterial community and host-associated molecular marker analyses can be combined to identify potential sources of fecal pollution in an urban river. This study is a proof of concept, and based on the results, we recommend using bacterial community analysis (where possible) along with PCR detection or quantification of host-associated molecular markers to provide information on the sources of fecal pollution in waterways.
Copyright © 2015, American Society for Microbiology

  10. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images. It is also to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: Quantification in emission tomography - definition and challenges; quantification biasing phenomena; 2 - quantification in SPECT, problems and correction methods: Attenuation, scattering, un-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement

  11. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
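
    Digital PCR converts a count of negative partitions into an absolute concentration through Poisson statistics, with no calibration material, which is why it tolerates the variable yields discussed above. A sketch of that inversion; the partition volume and counts are illustrative, not taken from the study:

```python
import math

# Digital PCR quantification: partition the reaction, count the negative
# partitions, and invert the Poisson distribution. The mean copies per
# partition is lam = -ln(fraction negative); no standard curve is needed.

def dpcr_concentration(n_negative, n_total, partition_volume_ul):
    """Copies per microlitre from a digital PCR partition count."""
    if n_negative == 0 or n_negative == n_total:
        raise ValueError("all-positive or all-negative runs are unquantifiable")
    lam = -math.log(n_negative / n_total)  # mean copies per partition
    return lam / partition_volume_ul

# 10,000 partitions of 0.85 nl (0.00085 ul) each, half of them negative:
# lam = ln 2 copies per partition.
conc = dpcr_concentration(5000, 10000, 0.00085)  # copies per microlitre
```

    The saturation guard reflects a real limitation: when every partition is positive (or negative), the Poisson inversion is undefined and the sample must be diluted (or concentrated) and rerun.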

  12. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  13. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are now available and of the experimental situations and analytical problems they address. The last point is extended by a description of our own development of the fundamental parameter method, which makes the inclusion of non-parallel beam geometries possible. Finally, open problems for the quantification procedures are discussed

  14. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users use the global network to search for information using fulltext search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of various optimization techniques, to reach the top places in the results of fulltext search engines. Herein lies the great importance of Search Engine Optimization and Search Engine Marketing, because typical users usually try only the links on the first few result pages for given keywords, and in catalogs they primarily use the hierarchically higher-placed links in each category. Key to success is the application of optimization methods dealing with keywords, the structure and quality of content, domain names, individual pages, and the quantity and reliability of backlinks. The process is demanding, long-lasting and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire website consists. If web presentation operators want an overview of their documents and of the website as a whole, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This is the purpose of the quantification of the competitive value of documents, which in turn yields the global competitive value of a website. Quantification of competitive values is performed on a specific fulltext search engine, and each fulltext search engine can, and often does, give different results. According to published reports by the ClickZ agency and Market Share, Google is the most widely used search engine by number of searches among English-speaking users, with a market share of more than 80%. The overall procedure for quantifying competitive values is common to all engines; however, the initial step, the analysis of keywords, depends on the choice of fulltext search engine.

  15. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, which provide potential explanations for poor genotyping results and may indicate which methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
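    When the calibration and planar uncertainty components can be treated as uncorrelated, the propagation step reduces to the familiar root-sum-of-squares combination of standard uncertainties. A minimal generic sketch (the numeric values are illustrative, not from the paper, and the full framework also handles correlated terms):

```python
import math

def combine_uncertainties(components):
    """Combine independent, uncorrelated standard uncertainties in
    quadrature (root-sum-of-squares), the usual first-order
    propagation rule when cross-covariances are negligible."""
    return math.sqrt(sum(u * u for u in components))

# e.g. a planar correlation uncertainty and a stereo calibration
# uncertainty, both expressed in pixel units (hypothetical values)
u_total = combine_uncertainties([0.08, 0.03])
```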

  17. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage, or skin burns, is the most commonly encountered type of trauma in civilian and military communities. Moreover, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures without affecting the surrounding healthy tissue. Further, extended pain sensation induced by thermal damage has also posed great problems for burn patients. Thus, it is of great importance to quantify thermal damage in skin tissue. In this paper, the available models and experimental methods for quantification of thermal damage in skin tissue are discussed.

  18. Automated Quantification of Pneumothorax in CT

    Science.gov (United States)

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091

  19. Uncertainty quantification for PZT bimorph actuators

    Science.gov (United States)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, including the Robobee. We developed the model using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem based on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  20. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)

  1. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) combines microscopy, digital imaging and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscopy calibration, standard density filters, a digital CCD camera, and image analysis software for quantitative applications. Beyond system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. The protocols shown here for human DNA ploidy determination and quantification of nuclear and chromosomal DNA content in plants can be used as described, or adapted for other studies.

  2. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
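    Propagating nuclear-data covariances through computed sensitivities is conventionally done with the first-order "sandwich rule", var(R) = s^T C s. A generic sketch of that rule (not MCNP6-specific; the vectors and matrix here are illustrative):

```python
def sandwich_variance(sensitivities, covariance):
    """First-order 'sandwich rule': var(R) = s^T C s, where s holds
    the relative sensitivities of response R to each parameter and
    C is the parameter covariance matrix."""
    n = len(sensitivities)
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += sensitivities[i] * covariance[i][j] * sensitivities[j]
    return total

# Two parameters, unit variances, no cross-correlation (hypothetical)
var_r = sandwich_variance([1.0, 2.0], [[1.0, 0.0], [0.0, 1.0]])
```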

  3. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  4. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
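    Once a thallus is reduced to nodes and edges with coordinates, the width, length and angle measurements come down to simple coordinate geometry. A hedged sketch of the two basic measurements (hypothetical helper names, not MorphoSnake's actual API):

```python
import math

def edge_length(a, b):
    """Euclidean length of an edge between two node coordinates (x, y)."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

def branch_angle(parent, node, child):
    """Angle in degrees at `node` between the incoming edge from
    `parent` and the outgoing edge to `child`."""
    v1 = (parent[0] - node[0], parent[1] - node[1])
    v2 = (child[0] - node[0], child[1] - node[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# A right-angled branching point (illustrative coordinates)
angle = branch_angle((1.0, 0.0), (0.0, 0.0), (0.0, 1.0))
```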

  5. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formula and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Full Text Available Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is performed based on the comparison of seed images with geometric figures. Quantification of seed shape is a useful tool in plant description for phenotypic characterization and taxonomic analysis. The J index gives the percent similarity between the image of a seed and a geometric figure, and it is useful in taxonomy for the study of relationships between plant groups. The geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios, and the outline of the Fibonacci spiral. The images of seeds were compared with these figures and values of the J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures like the ovoid, ellipse or the Fibonacci spiral, may be a feature of the basal clades of taxonomic groups.
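    One illustrative reading of a percent-similarity index between a seed image and a model figure, assuming both are available as aligned binary silhouettes, is area of overlap over combined area (this is an assumption for illustration, not necessarily the authors' exact definition of the J index):

```python
def j_index(seed_mask, figure_mask):
    """Percent similarity between two aligned binary silhouettes,
    computed as shared area over combined area x 100 (a Jaccard-style
    overlap; an illustrative reading, not a verified definition)."""
    inter = union = 0
    for row_s, row_f in zip(seed_mask, figure_mask):
        for s, f in zip(row_s, row_f):
            if s and f:
                inter += 1
            if s or f:
                union += 1
    return 100.0 * inter / union if union else 0.0
```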

  7. Quantification of abdominal aortic deformation after EVAR

    Science.gov (United States)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR), the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to elicit stent graft failure and cause the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to include only local non-rigid deformation and thereby eliminate all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. To evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from EVAR treatment. A visual assessment of the registration results and an evaluation of the usage of deformation quantification were performed by two vascular surgeons and one interventional radiologist, all experts in EVAR procedures.

  8. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.

  9. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As case-matching, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway on expiration, as measured by CT quantification, could be new diagnostic indicators of TBM.
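    The group comparison above uses the Student t-test; a minimal pooled-variance sketch of that statistic (generic stdlib implementation, not the study's software, with illustrative sample values):

```python
import math

def student_t(sample_a, sample_b):
    """Two-sample Student t statistic with pooled variance
    (equal-variance assumption), as used to compare an airway
    parameter between two groups."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1.0 / na + 1.0 / nb))
```

    The p-values reported in the abstract would then come from the t distribution with na + nb - 2 degrees of freedom.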

  10. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  11. La quantification en Kabiye: une approche linguistique | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  12. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have lately been presented with good results, but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with the overlapping of metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.
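    Casting quantification as an optimization problem means searching lineshape parameters that minimize the misfit to the measured spectrum. A toy illustration of that idea (a single Lorentzian peak and a deliberately simple GA; the paper's actual lineshape models and GA configurations are not reproduced here):

```python
import random

def lorentzian(x, amp, width, center):
    """Simple Lorentzian lineshape used as the peak model."""
    return amp * width ** 2 / ((x - center) ** 2 + width ** 2)

def fitness(params, xs, data, center):
    """Negative sum of squared residuals: higher is a better fit."""
    amp, width = params
    return -sum((lorentzian(x, amp, width, center) - d) ** 2
                for x, d in zip(xs, data))

def ga_fit(xs, data, center, pop_size=40, generations=60, seed=1):
    """Tiny GA: elitist selection, averaging crossover, multiplicative
    mutation, searching (amplitude, width) of one peak."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.1, 5.0), rng.uniform(0.1, 3.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, xs, data, center), reverse=True)
        survivors = pop[:pop_size // 2]          # elitism keeps the best
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            child = [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2]  # crossover
            if rng.random() < 0.3:                           # mutation
                i = rng.randrange(2)
                child[i] *= rng.uniform(0.8, 1.25)
            children.append(tuple(child))
        pop = survivors + children
    return max(pop, key=lambda p: fitness(p, xs, data, center))

# Synthetic "spectrum": one noiseless peak, amp 2.0, width 0.5
xs = [i * 0.05 - 2.0 for i in range(81)]
true = [lorentzian(x, 2.0, 0.5, 0.0) for x in xs]
amp, width = ga_fit(xs, true, 0.0)
```

    Overlapping peaks would be handled by widening the parameter vector to several (amplitude, width, center) triplets and fitting them jointly, which is exactly where stochastic search helps over gradient methods.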

  13. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolite concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have lately been presented with good results, but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with the overlapping of metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.

  14. A newly developed real-time PCR assay for detection and quantification of Fusarium oxysporum and its use in compatible and incompatible interactions with grafted melon genotypes.

    Science.gov (United States)

    Haegi, Anita; Catalano, Valentina; Luongo, Laura; Vitale, Salvatore; Scotton, Michele; Ficcadenti, Nadia; Belisario, Alessandra

    2013-08-01

    A reliable and species-specific real-time quantitative polymerase chain reaction (qPCR) assay was developed for detection of the complex soilborne anamorphic fungus Fusarium oxysporum. The new primer pair, designed on the translation elongation factor 1-α gene with an amplicon of 142 bp, was highly specific to F. oxysporum without cross reactions with other Fusarium spp. The protocol was applied to grafted melon plants for the detection and quantification of F. oxysporum f. sp. melonis, a devastating pathogen of this cucurbit. Grafting technologies are widely used in melon to confer resistance against new virulent races of F. oxysporum f. sp. melonis, while maintaining the properties of valuable commercial varieties. However, the effects on the vascular pathogen colonization have not been fully investigated. Analyses were performed on 'Charentais-T' (susceptible) and 'Nad-1' (resistant) melon cultivars, both used either as rootstock and scion, and inoculated with F. oxysporum f. sp. melonis race 1 and race 1,2. Pathogen development was compared using qPCR and isolations from stem tissues. Early asymptomatic melon infections were detected with a quantification limit of 1 pg of fungal DNA. The qPCR protocol clearly showed that fungal development was highly affected by host-pathogen interaction (compatible or incompatible) and time (days postinoculation). The principal significant effect (P ≤ 0.01) on fungal development was due to the melon genotype used as rootstock, and this effect had a significant interaction with time and F. oxysporum f. sp. melonis race. In particular, the amount of race 1,2 DNA was significantly higher compared with that estimated for race 1 in the incompatible interaction at 18 days postinoculation. The two fungal races were always present in both the rootstock and scion of grafted plants in either the compatible or incompatible interaction.

  15. Quantitative PCR for the specific quantification of Lactococcus lactis and Lactobacillus paracasei and its interest for Lactococcus lactis in cheese samples.

    Science.gov (United States)

    Achilleos, Christine; Berthier, Françoise

    2013-12-01

    The first objective of this work was to develop real-time quantitative PCR (qPCR) assays to quantify two species of mesophilic lactic acid bacteria technologically active in food fermentation, including cheese making: Lactococcus lactis and Lactobacillus paracasei. The second objective was to compare qPCR and plate counts of these two species in cheese samples. Newly designed primers efficiently amplified a region of the tuf gene from the target species. Sixty-three DNA samples from twenty different bacterial species, phylogenetically related or commonly found in raw milk and dairy products, were selected as positive and negative controls. Target DNA was successfully amplified showing a single peak on the amplicon melting curve; non-target DNA was not amplified. Quantification was linear over 5 log units (R(2) > 0.990), down to 22 gene copies/μL per well for Lc. lactis and 73 gene copies/μL per well for Lb. paracasei. qPCR efficiency ranged from 82.9% to 93.7% for Lc. lactis and from 81.1% to 99.5% for Lb. paracasei. At two stages of growth, Lc. lactis was quantified in 12 soft cheeses and Lb. paracasei in 24 hard cooked cheeses. qPCR proved to be useful for quantifying Lc. lactis, but not Lb. paracasei. Copyright © 2013 Elsevier Ltd. All rights reserved.
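    The efficiencies reported above (e.g. 82.9% to 93.7%) are conventionally derived from the slope of a dilution-series standard curve, E = 10^(-1/slope) - 1, where a slope of about -3.32 corresponds to 100% efficiency (perfect doubling each cycle). A sketch of that standard calculation with illustrative numbers:

```python
import math

def qpcr_efficiency(log10_copies, cq_values):
    """Amplification efficiency from a qPCR standard curve: fit
    Cq = slope * log10(copies) + intercept by least squares, then
    E = 10**(-1/slope) - 1 (E = 1.0 means 100% efficiency)."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
    slope = sxy / sxx
    return 10 ** (-1.0 / slope) - 1.0

# Perfect doubling: Cq drops ~3.322 cycles per 10-fold dilution
eff = qpcr_efficiency([2, 3, 4, 5, 6],
                      [35.0, 31.678, 28.356, 25.034, 21.712])
```

    The linear range over 5 log units mentioned in the abstract corresponds to the span of the dilution series over which this fit stays linear (R^2 > 0.990).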

  16. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  17. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using the financial information of these companies during the years 2005-2010. To carry out this work, methods for quantifying managerial skills are applied to CNFR NAVROM SA, Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.

  18. Recurrence quantification analysis of global stock markets

    Science.gov (United States)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
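The RQA measures referred to above can be sketched compactly. The following is a minimal, illustrative implementation of a recurrence matrix and the determinism measure (DET, the fraction of recurrence points on diagonal lines) for a scalar series; the study itself works on embedded stock-index data in sliding windows, which this sketch omits:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix of a scalar series: R[i, j] = 1 if |x_i - x_j| <= eps."""
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def determinism(R, lmin=2):
    """DET: fraction of recurrent points lying on diagonal lines of length >= lmin,
    excluding the trivial main diagonal."""
    n = R.shape[0]
    recurrent = R.sum() - n            # the main diagonal is always recurrent
    on_lines = 0
    for k in list(range(-(n - 1), 0)) + list(range(1, n)):
        run = 0
        for v in np.append(np.diagonal(R, k), 0):   # sentinel 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / recurrent if recurrent else 0.0

rng = np.random.default_rng(0)
periodic = np.sin(np.linspace(0, 8 * np.pi, 200))
noise = rng.standard_normal(200)
det_periodic = determinism(recurrence_matrix(periodic, 0.1))
det_noise = determinism(recurrence_matrix(noise, 0.1))
# Deterministic dynamics produce long diagonal lines, hence higher DET.
```

The "temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures" reported during crises correspond to drops in measures of exactly this kind.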

  19. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  20. Quantification practices in the nuclear industry

    International Nuclear Information System (INIS)

    1986-01-01

    In this chapter the quantification of risk practices adopted by the nuclear industries in Germany, Britain and France are examined as representative of the practices adopted throughout Europe. From this examination a number of conclusions are drawn about the common features of the practices adopted. In making this survey, the views expressed in the report of the Task Force on Safety Goals/Objectives appointed by the Commission of the European Communities, are taken into account. For each country considered, the legal requirements for presentation of quantified risk assessment as part of the licensing procedure are examined, and the way in which the requirements have been developed for practical application are then examined. (author)

  1. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the \

  2. Kinetic quantification of plyometric exercise intensity.

    Science.gov (United States)

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
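Several of the outcome variables above follow from the force-time record in standard ways. The sketch below uses the common flight-time and impulse-momentum estimates for jump height and takeoff velocity; these are textbook force-platform formulas, not necessarily the exact routines used in this study, and the sample values are arbitrary:

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(t_flight):
    """Flight-time estimate of jump height, assuming takeoff and landing
    postures match: h = g * t^2 / 8."""
    return G * t_flight ** 2 / 8.0

def takeoff_velocity(force_samples, dt, mass):
    """Impulse-momentum estimate of takeoff velocity from the vertical GRF:
    v = (integral of (F - m*g) dt) / m."""
    net_impulse = sum((f - mass * G) * dt for f in force_samples)
    return net_impulse / mass

h = jump_height_from_flight_time(0.5)                 # 0.5 s of flight -> ~0.31 m
v0 = takeoff_velocity([1500.0] * 50, 0.01, mass=70.0) # toy constant-force record
```

Peak power and rate of force development are likewise derived from the same GRF record (force times velocity, and the slope of the landing force curve, respectively).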

  3. Quantification of heterogeneity observed in medical images

    International Nuclear Information System (INIS)

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity

  4. Quantification of heterogeneity observed in medical images.

    Science.gov (United States)

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  5. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.
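The core idea of propagating uncertainty information at the lowest level of the simulation can be illustrated by embedded ensemble propagation: each elementary operation acts on a whole vector of samples at once, so memory access is coalesced across the ensemble. This toy example (an assumption-laden sketch, not the actual CCSI implementation) propagates 512 samples of an uncertain decay rate through an explicit-Euler solver in a single pass:

```python
import numpy as np

def decay_step(y, k, dt):
    # One explicit-Euler step of y' = -k*y, vectorised over the whole ensemble.
    return y - dt * k * y

rng = np.random.default_rng(2)
k_samples = rng.uniform(0.5, 1.5, 512)     # uncertain decay rate, one value per sample
y = np.ones_like(k_samples)                # identical initial condition for all samples
for _ in range(100):
    y = decay_step(y, k_samples, 0.01)     # all samples advanced together

mean, std = y.mean(), y.std()              # moments of the propagated uncertainty
```

Because the per-step arithmetic is identical across samples, the inner loop vectorises naturally, which is the kind of fine-grained parallelism the abstract refers to.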

  6. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role for the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. By considering the fuzziness and uncertainty of trust, in this paper, we propose a fuzzy comprehensive evaluation method to quantify trust along with a trust quantification algorithm. Simulation results show that the trust quantification algorithm that we propose can effectively quantify trust and the quantified value of an entity's trust is consistent with the behavior of the entity.
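A fuzzy comprehensive evaluation combines a factor-grade membership matrix with a vector of factor weights. The sketch below shows the generic method only; the factors, weights, membership degrees, and grade values are made up for illustration and are not the paper's actual model:

```python
import numpy as np

# Rows: evaluation factors (e.g. service quality, feedback honesty, timeliness);
# columns: membership degrees in the grades (high, medium, low trust).
R = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.7, 0.2, 0.1],
])
w = np.array([0.5, 0.3, 0.2])          # factor weights, summing to 1

b = w @ R                              # weighted-average composition operator
b = b / b.sum()                        # normalise to a membership vector

grades = np.array([0.9, 0.5, 0.1])     # numeric value attached to each grade
trust = float(b @ grades)              # defuzzified trust value in [0, 1]
```

The resulting scalar can then be compared against an entity's observed behavior, which is how the simulation in the paper checks consistency.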

  7. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  8. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  9. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto; Elimelech, Menachem

    2012-01-01

    groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact

  10. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  11. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  12. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim; Sun, Shuyu

    2015-01-01

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving

  13. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  14. The value of serum Hepatitis B surface antigen quantification in ...

    African Journals Online (AJOL)

    The value of serum Hepatitis B surface antigen quantification in determining viral activity in chronic Hepatitis B virus infection. ... of CHB and also higher in hepatitis e antigen positive patients compared to hepatitis e antigen negative patients.

  15. An expansion of the aboveground biomass quantification model for ...

    African Journals Online (AJOL)

    Research Note BECVOL 3: an expansion of the aboveground biomass quantification model for ... African Journal of Range and Forage Science ... encroachment and estimation of food to browser herbivore species, was proposed during 1989.

  16. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1) H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1) H-MRS metabolite quantification. We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own … + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1) H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results …

  17. The Maqāṣid approach and rethinking political rights in modern society

    Directory of Open Access Journals (Sweden)

    Louay Safi

    2010-12-01

    Full Text Available This paper examines political rights in Islam by focusing on freedom of religion and the extent to which the state is empowered to enforce faith and religious law on society. It starts by comparing the notion of law in both Western and Islamic traditions, and then analyzes the difference between the ethical and legal within Sharī‘ah. The paper illustrates how Islamic law grew historically by working to limit the power of the state, and points out the need to maintain the distinction between the state and civil society for the proper implementation of Sharī‘ah. The paper also contends that those who call on the state to enforce all rules of Sharī‘ah on society rely on a faulty theory of right, and concludes that Islamic law fully recognizes the right of individuals to adopt and practice their faith freely. Freedom of religion, it stresses, is an intrinsic aspect of Islamic law, and all efforts to limit this freedom are bound to violate its purpose and dictates.

  18. Delignification of high kappa number soda-ODiMAQ pulps with a polyoxometalate

    Science.gov (United States)

    Edward L. Springer; Aziz Ahmed

    2001-01-01

    Increased use of pressure sensitive adhesives for labels and stamps has introduced another contaminant into the office paper stream: silicone- coated release liners. This study examines methods and conditions for removal of contaminants, including these liners, from a typical batch of discarded office papers. Removal of contaminants contained in the furnish were...

  19. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang , Min; Yang , Le; Zhao , Huizhong; Zhang , Leijie; Zhong , Zhiyou; Liu , Yanling; Chen , Jianhua

    2009-01-01

    A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of apple fruit at different points. It took into account heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  20. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the regurgit…

  1. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating the fire APS (it also covers floods and earthquakes). With this application, fire scenarios at the plant are quantified, integrating the tasks performed during the fire APS. This paper describes the main features of the program that allow quantification of a fire APS. (Author)

  2. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

    Alves, A.F.F.; Miranda, J.R.A.; Pina, D.R.

    2013-01-01

    The optimization of the risk-benefit ratio is a major concern in pediatric radiology, due to the greater vulnerability of children, compared to adults, to the late somatic and genetic effects of radiation exposure. In Brazil, it is estimated that head trauma accounts for 18% of deaths in the age group of 1-5 years, and radiography is the primary diagnostic test for the detection of skull fracture. Knowing that image quality is essential to ensure the identification of anatomical structures and to minimize diagnostic interpretation errors, this paper proposes the development and construction of homogeneous skull phantoms for the age group of 1-5 years. The construction of the homogeneous phantoms was performed using the classification and quantification of tissues present in the skulls of pediatric patients. In this procedure, computational algorithms implemented in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT images. Preliminary data obtained from measurements show that, between the ages of 1-5 years, assuming an average anteroposterior diameter of the pediatric skull region of 145.73 ± 2.97 mm, the skull can be represented by 92.34 ± 5.22 mm of Lucite and 1.75 ± 0.21 mm of aluminum plates in a PEP (patient-equivalent phantom) arrangement. After construction, the phantoms will be used for image and dose optimization in pediatric computed radiography protocols.

  3. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
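The bootstrap procedure described, resampling concentration-response points and refitting to obtain confidence intervals, can be sketched as follows. The `fit_potency` function here is a deliberately crude stand-in (concentration at half-maximal response) for the actual ToxCast curve fits, and the data are invented:

```python
import random

def fit_potency(concs, responses):
    """Stand-in for a dose-response fit: the lowest concentration at which
    the response reaches half its maximum (illustrative only)."""
    half = max(responses) / 2
    for c, r in sorted(zip(concs, responses)):
        if r >= half:
            return c
    return max(concs)

def bootstrap_ci(concs, responses, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI: resample (conc, response) pairs with
    replacement and refit the parameter on each replicate."""
    rng = random.Random(seed)
    pairs = list(zip(concs, responses))
    estimates = []
    for _ in range(n_boot):
        sample = [rng.choice(pairs) for _ in pairs]
        cs, rs = zip(*sample)
        estimates.append(fit_potency(cs, rs))
    estimates.sort()
    lo = estimates[int(alpha / 2 * n_boot)]
    hi = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

concs = [0.1, 0.3, 1, 3, 10, 30]
responses = [2, 5, 20, 55, 90, 100]
lo, hi = bootstrap_ci(concs, responses)
```

The spread between `lo` and `hi` is the per-curve uncertainty that, left unquantified, would silently propagate into every model built on the fitted parameters.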

  4. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  5. Quantification of the vocal folds’ dynamic displacements

    International Nuclear Information System (INIS)

    Hernández-Montes, María del Socorro; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-01-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ∼100–1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues. (paper)
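The precision figure quoted above (a variation coefficient of 1.91%) is a standard dispersion ratio. As a minimal reminder of the computation (the repeated-measurement values below are hypothetical, chosen only to fall in the reported displacement range):

```python
import statistics

def coefficient_of_variation(samples):
    """CV = sample standard deviation / mean, reported as a percentage."""
    return 100.0 * statistics.stdev(samples) / statistics.mean(samples)

# Hypothetical repeated displacement measurements (nm) at one VF location.
cv = coefficient_of_variation([1700.0, 1720.0, 1740.0, 1760.0, 1780.0])
```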

  6. Tentacle: distributed quantification of genes in metagenomes.

    Science.gov (United States)

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.
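Tentacle's master-worker pattern can be illustrated in miniature: a master distributes DNA fragments, workers "align" them against reference genes, and the master aggregates the counts. The toy substring aligner and two-gene reference set below are purely illustrative; Tentacle itself wraps real aligners and streams fragments over a network:

```python
from collections import Counter
from multiprocessing.pool import ThreadPool

# Toy "alignment": a fragment hits a reference gene if the gene's sequence
# contains it (real pipelines use aligners such as Bowtie or BLAST).
GENES = {"geneA": "ACGTACGTGG", "geneB": "TTGGCCAATT"}

def align_fragment(fragment):
    return [g for g, seq in GENES.items() if fragment in seq]

def quantify(fragments, workers=4):
    """Master-worker gene quantification: fragments are farmed out to
    workers, each returns its gene hits, and the master aggregates counts."""
    with ThreadPool(workers) as pool:
        hit_lists = pool.map(align_fragment, fragments)
    return Counter(g for hits in hit_lists for g in hits)

frags = ["ACGT", "GGCC", "ACGT", "AATT", "CCCC"]
counts = quantify(frags)
```

Because the per-fragment work is independent, this pattern scales by simply adding workers, which matches the scaling behavior reported in the evaluations.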

  7. Cross recurrence quantification for cover song identification

    Energy Technology Data Exchange (ETDEWEB)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G [Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, 08018 Barcelona (Spain)], E-mail: joan.serraj@upf.edu

    2009-09-15

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  8. Verification Validation and Uncertainty Quantification for CGS

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The overall conduct of verification, validation and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling including the turbulence problem in the coarse grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration and UQ, as well as the difference between UQ and sensitivity analysis. The discussion in this chapter is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.

  9. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
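
The Bayesian update and the entropy-based view of uncertainty that the essay contrasts can be written in a few lines. The numbers below are illustrative, not from the paper:

```python
import math

def post_test_probability(pre, sens, spec, positive=True):
    """Bayes' theorem for a dichotomous test result."""
    if positive:
        return sens * pre / (sens * pre + (1 - spec) * (1 - pre))
    return (1 - sens) * pre / ((1 - sens) * pre + spec * (1 - pre))

def binary_entropy(p):
    """Diagnostic uncertainty in bits: 1 at p = 0.5, 0 at certainty."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

pre = 0.30                              # illustrative pre-test probability
post = post_test_probability(pre, sens=0.90, spec=0.80)
# A result that moves the probability toward 0.5 *increases* entropy:
# in information-theoretic terms the test can leave the clinician more
# uncertain than before, which Bayes' rule alone does not make obvious.
change = binary_entropy(post) - binary_entropy(pre)
```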

  10. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. A practical Autolisp program on PC AutoCAD has been developed by us to quantify the acetabular coverage through numerical expression of the images of computed tomography. Thirty adults (60 hips) with normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial computed tomography sections. These slices were acquired in a fixed coordinate system, in continuous sections of 5 mm thickness. The contours of the cartilage of each section were digitized into a PC computer and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio of greater than 80%, an anterior coverage ratio of greater than 75% and a posterior coverage ratio of greater than 80% can be categorized as normal. Polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  11. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  12. Quantification of the vocal folds’ dynamic displacements

    Science.gov (United States)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  13. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost that quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turn to stochastic estimation of the diagonal. This allows us to cast the problem as a linear system with a relatively small number of multiple right-hand sides. Second, for this linear system we develop a novel, mixed-precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much-needed quadratic cost but also offers excellent opportunities for scaling in massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance of 73% of the theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
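
The first ingredient, stochastic estimation of the diagonal, can be sketched as a generic Hutchinson-type estimator (not the authors' BLAS-3 implementation). Here `solve` stands for the iterative solver applied to each right-hand side:

```python
import numpy as np

def stochastic_diagonal(solve, n, n_samples=200, seed=0):
    """Hutchinson-type estimator of diag(A^-1).

    `solve(v)` must return A^-1 v, e.g. via an iterative solver with
    mixed-precision refinement as in the paper; no factorization needed.
    Estimator: diag(A^-1) ~ mean of v * (A^-1 v) over Rademacher v.
    """
    rng = np.random.default_rng(seed)
    acc = np.zeros(n)
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)
        acc += v * solve(v)
    return acc / n_samples

# For a diagonal matrix the estimate is exact; in general it converges
# as O(1/sqrt(n_samples)), and each sample costs one linear solve, which
# is what brings the overall complexity down from cubic to quadratic.
d = np.array([1.0, 2.0, 4.0])
est = stochastic_diagonal(lambda v: v / d, n=3, n_samples=10)
```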

  14. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. Standardless algorithms are considerably faster than methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to solve the problem of current variation, for example, in equipment with a cold field-emission gun. Owing to significant advances in accuracy in recent years, the product of successive efforts to improve the description of the generation, absorption and detection of X-rays, standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  15. Quantification of variability in trichome patterns

    Directory of Open Access Journals (Sweden)

    Bettina Greese

    2014-11-01

    Full Text Available While pattern formation is studied in various areas of biology, little is known about the intrinsic noise leading to variations between individual realizations of the pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To advance the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches towards the characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability.

  16. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G

    2009-01-01

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  17. Quality Quantification of Evaluated Cross Section Covariances

    International Nuclear Information System (INIS)

    Varet, S.; Dossantos-Uzarralde, P.; Vayatis, N.

    2015-01-01

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion; the second step is its computation. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on evaluations of the 85Rb nucleus, and the results are then used for a discussion of scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
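
For zero-mean Gaussians, the Kullback-Leibler distance between two covariance matrices has a closed form; the minimal sketch below shows that form only (the paper's bootstrap estimation of the criterion, which avoids needing the true covariance, is omitted):

```python
import numpy as np

def kl_gaussian(cov0, cov1):
    """Kullback-Leibler distance KL(N(0, cov0) || N(0, cov1)).

    Zero iff the two covariance matrices coincide, so it quantifies how
    far an estimated covariance sits from a reference covariance.
    """
    k = cov0.shape[0]
    inv1 = np.linalg.inv(cov1)
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(inv1 @ cov0) - k + logdet1 - logdet0)
```

Note the asymmetry: KL(P||Q) differs from KL(Q||P) in general, which is one reason an objective criterion must fix the direction of comparison.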

  18. Bacterial diversity of the Colombian fermented milk "Suero Costeño" assessed by culturing and high-throughput sequencing and DGGE analysis of 16S rRNA gene amplicons.

    Science.gov (United States)

    Motato, Karina Edith; Milani, Christian; Ventura, Marco; Valencia, Francia Elena; Ruas-Madiedo, Patricia; Delgado, Susana

    2017-12-01

    "Suero Costeño" (SC) is a traditional soured cream elaborated from raw milk in the Northern-Caribbean coast of Colombia. The natural microbiota that characterizes this popular Colombian fermented milk is unknown, although several culturing studies have previously been attempted. In this work, the microbiota associated with SC from three manufacturers in two regions, "Planeta Rica" (Córdoba) and "Caucasia" (Antioquia), was analysed by means of culturing methods in combination with high-throughput sequencing and DGGE analysis of 16S rRNA gene amplicons. The bacterial ecosystem of SC samples was revealed to be composed of lactic acid bacteria belonging to the Streptococcaceae and Lactobacillaceae families, the proportions and genera varying among manufacturers and regions of elaboration. Members of the Lactobacillus acidophilus group, Lactococcus lactis, Streptococcus infantarius and Streptococcus salivarius characterized this artisanal product. In comparison with culturing, the use of in-depth, culture-independent molecular techniques provides a more realistic picture of the overall bacterial communities residing in SC. Beyond this descriptive purpose, these approaches will facilitate a rational strategy (culture media and growing conditions) for the isolation of indigenous strains that allow standardization of the manufacture of SC. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures "behave" and to study their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with a fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked over that period. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage at different time points. An uptake of 85% by HCT 116 cells is observed after 24 hours of incubation at a NW-to-cell ratio of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the use of a machine learning-based, time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and with another cell line derived from cervix carcinoma, HeLa. It thus has the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell type.

  20. Quantification of water in hydrous ringwoodite

    Directory of Open Access Journals (Sweden)

    Sylvia-Monique Thomas

    2015-01-01

    Full Text Available Ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth's mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78,180 to 158,880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimations of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum of a natural hydrous ringwoodite inclusion in diamond from the study of Pearson et al. (2014) indicates that the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming near-maximum amounts of H2O for this sample from the transition zone.
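
Converting an integrated IR absorbance into a water content is a Beer-Lambert calculation. In the sketch below, the density is an assumed round value and the coefficient merely lies in the reported range; both are placeholders, not the paper's calibrated numbers:

```python
def water_wt_percent(a_int, thickness_cm, eps_int, density_g_per_l=3650.0):
    """Beer-Lambert: c [mol/L] = A_int / (eps_int * t), then to wt% H2O.

    eps_int: integrated absorption coefficient in L mol^-1 cm^-2 (the
    abstract reports 78,180-158,880 depending on the OH band frequency).
    density_g_per_l: assumed bulk density of ringwoodite (placeholder).
    """
    c_mol_per_l = a_int / (eps_int * thickness_cm)   # molar concentration
    return 100.0 * c_mol_per_l * 18.015 / density_g_per_l  # g H2O per 100 g
```

With illustrative inputs, `water_wt_percent(1000.0, 0.01, 1.0e5)` gives roughly 0.49 wt% H2O, showing the order of magnitude involved.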

  1. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, as in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environmental risks.

  2. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
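
The externally standardized kinetic analysis mentioned above amounts to a log-linear standard curve: the threshold cycle (Ct) is linear in log10 of the starting copy number. A minimal sketch with made-up standards (not the assay's actual calibrators):

```python
import math

def fit_standard_curve(copies, cts):
    """Least-squares line Ct = slope * log10(copies) + intercept,
    fitted to external standards of known copy number."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the curve: copy number implied by an observed Ct."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical standards lying on an ideal curve (slope -3.32, i.e.
# ~100% amplification efficiency); real runs fit noisy replicates.
slope, intercept = fit_standard_curve([1e2, 1e4, 1e6],
                                      [31.36, 24.72, 18.08])
```

An unknown sample's Ct is then plugged into `quantify`; a sample crossing threshold at Ct = 31.36 on this curve corresponds to about 100 copies.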

  3. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
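
Digital PCR quantification rests on a Poisson correction of the positive-partition fraction; the sketch below, with illustrative numbers, also forms the transgene-to-reference ratio commonly used to express GM content (a generic textbook calculation, not a specific assay from the review):

```python
import math

def mean_copies_per_partition(n_positive, n_total):
    """Poisson correction for digital PCR: lambda = -ln(1 - p),
    where p is the observed fraction of positive partitions."""
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive: sample too concentrated")
    return -math.log(1.0 - p)

def gm_content_percent(pos_transgene, pos_reference, n_total):
    """GM content in % as transgene copies over taxon-specific
    reference-gene copies, both Poisson-corrected."""
    return (100.0 * mean_copies_per_partition(pos_transgene, n_total)
            / mean_copies_per_partition(pos_reference, n_total))
```

Because the correction is nonlinear in p, simply dividing positive-partition counts would bias the ratio at high occupancies; the Poisson step removes that bias.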

  4. Quantification model for energy consumption in edification

    Directory of Open Access Journals (Sweden)

    Mercader, Mª P.

    2012-12-01

    Full Text Available The research conducted in this paper focuses on the generation of a model for the quantification of energy consumption in building construction. This is done through one of the most relevant environmental impact indicators, the energy consumed in manufacturing the construction materials used, expressed relative to the weight per m2 of construction. The practical application of the proposed model to different building typologies in Seville will provide information regarding the building materials, subsystems and construction elements with the greatest impact, making it possible to observe the influence of the built surface on the environmental impact generated. The results obtained aim to serve as a reference for the scientific community, providing quantitative data comparable to other building types and geographical areas. Furthermore, they may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the building types defined in this study.

  5. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases of negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same calculation method on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  6. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    A liquid chromatography-electrospray-mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, an LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high-capacity trap mass spectrometer using positive-ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  7. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the feasibility of hepatic iron quantification with magnetic resonance. However, no consensus has yet been reached regarding the technique or the reproducibility of the same calculation method across different machines. This article reviews the state of the art and outlines possible future directions for standardising this non-invasive method of hepatic iron quantification.

  8. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human...... synovium, using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of......, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles...

  9. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximate effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists in deriving and solving the energy quantification condition (EQC): a simple real equation, composed of trigonometric and hyperbolic functions, which requires neither programming effort nor a sophisticated machine to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructures, and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
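    The abstract's claim that the EQC is a single real transcendental equation solvable without sophisticated tools can be illustrated with a sketch. The condition below is a hypothetical stand-in (the even-parity bound-state equation of a single finite well, in dimensionless units), not the paper's multi-well EQC; the solver itself is plain bisection.

```python
import math

def eqc(E, V0=20.0):
    # Hypothetical energy quantification condition f(E) = 0: even-parity
    # bound states of a single finite well, dimensionless units with
    # hbar^2 / (2 m (L/2)^2) = 1.  The paper's multi-well EQC is likewise
    # a single real equation of trigonometric/hyperbolic functions.
    return math.sqrt(E) * math.tan(math.sqrt(E)) - math.sqrt(V0 - E)

def bisect(f, lo, hi, tol=1e-10):
    # Plain bisection: sufficient for a scalar root of a real equation.
    flo = f(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        fm = f(mid)
        if flo * fm <= 0.0:
            hi = mid
        else:
            lo, flo = mid, fm
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# For V0 = 20 the lowest even-parity level lies between E = 1 and E = 2.
E0 = bisect(eqc, 1.0, 2.0)
```

    Each energy level found this way becomes one point of the subband; sweeping a structural parameter and re-solving builds the band point by point.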

  10. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In a control experiment using λ DNA and humic acids, underestimation of λ DNA to 1/4400 of the theoretical value was observed with 6.58 ng μL(-1) humic acids. In contrast, digital PCR provided accurate quantification data at humic acid concentrations up to 9.34 ng μL(-1). The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, copy numbers quantified by real-time PCR and digital PCR became similar, indicating that dilution is a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested, digital PCR yielded 1.04×10(3) copies of archaeal 16S rRNA genes per gram of sediment, whereas real-time PCR yielded only 4.64×10(2) copies/g-sediment, most likely due to an inhibitory effect. The data from this study demonstrate that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and show the great advantage of digital PCR for accurate quantification of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
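    The absolute quantification underlying digital PCR can be sketched as follows: count positive partitions, apply a Poisson correction so that partitions holding more than one copy are accounted for, and scale by partition volume. The partition counts and the 0.85 nL partition volume below are illustrative assumptions, not values from the study.

```python
import math

def dpcr_copies_per_ul(positive, total, partition_volume_nl=0.85):
    # Fraction of positive partitions -> mean copies per partition via
    # Poisson statistics: lambda = -ln(1 - p).
    p = positive / total
    lam = -math.log(1.0 - p)
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000.0  # copies per microliter

# e.g. 4000 of 20000 partitions positive
conc = dpcr_copies_per_ul(4000, 20000)
```

    Because the readout is a digital count rather than an amplification curve, partial inhibition that merely delays amplification does not bias the result, which is the advantage the abstract reports.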

  11. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  12. 2D histomorphometric quantification from 3D computerized tomography

    International Nuclear Information System (INIS)

    Lima, Inaya; Oliveira, Luis Fernando de; Lopes, Ricardo T.; Jesus, Edgar Francisco O. de; Alves, Jose Marcos

    2002-01-01

    In the present article, preliminary results are presented showing the application of the three-dimensional computerized microtomography technique (3D-μCT) to bone tissue characterization, through histomorphometric quantification based on stereologic concepts. Two samples of human bone were prepared and submitted to the tomographic system, a radiographic system with a microfocus X-ray tube. Through the three processing steps (acquisition, reconstruction and quantification) it was possible to obtain good results, coherent with the literature data. From this point, it is intended to compare these results with the information obtained by the conventional method, that is, conventional histomorphometry. (author)

  13. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  14. HPLC for simultaneous quantification of total ceramide, glucosylceramide, and ceramide trihexoside concentrations in plasma

    NARCIS (Netherlands)

    Groener, Johanna E. M.; Poorthuis, Ben J. H. M.; Kuiper, Sijmen; Helmond, Mariette T. J.; Hollak, Carla E. M.; Aerts, Johannes M. F. G.

    2007-01-01

    BACKGROUND: Simple, reproducible assays are needed for the quantification of sphingolipids, ceramide (Cer), and sphingoid bases. We developed an HPLC method for simultaneous quantification of total plasma concentrations of Cer, glucosylceramide (GlcCer), and ceramide trihexoside (CTH). METHODS:

  15. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
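    The standard-curve approach described above can be sketched as a linear fit of quantification cycle (Cq) against log10 of standard concentration, inverted to estimate unknown samples; the dilution series and Cq values below are synthetic.

```python
import math

def fit_standard_curve(concs, cqs):
    # Least-squares line Cq = slope * log10(conc) + intercept,
    # the usual form of a real-time PCR standard curve.
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cqs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, cqs))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(cq, slope, intercept):
    # Invert the curve to estimate an unknown sample's concentration.
    return 10 ** ((cq - intercept) / slope)

def efficiency(slope):
    # Amplification efficiency; slope = -3.32 corresponds to ~100 %.
    return 10 ** (-1.0 / slope) - 1.0

# Ten-fold dilution series with ideal chemistry (slope ~ -3.32):
concs = [1e6, 1e5, 1e4, 1e3, 1e2]
cqs = [15.0, 18.32, 21.64, 24.96, 28.28]
slope, intercept = fit_standard_curve(concs, cqs)
```

    The debate the abstract alludes to is whether such a curve must be re-run alongside every experiment or can be reused once its slope and intercept are validated.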

  16. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two

  17. Comparison of Quantifiler(®) Trio and InnoQuant™ human DNA quantification kits for detection of DNA degradation in developed and aged fingerprints.

    Science.gov (United States)

    Goecker, Zachary C; Swiontek, Stephen E; Lakhtakia, Akhlesh; Roy, Reena

    2016-06-01

    The development techniques employed to visualize fingerprints collected from crime scenes as well as post-development ageing may result in the degradation of the DNA present in low quantities in such evidence samples. Amplification of the DNA samples with short tandem repeat (STR) amplification kits may result in partial DNA profiles. A comparative study of two commercially available quantification kits, Quantifiler(®) Trio and InnoQuant™, was performed on latent fingerprint samples that were either (i) developed using one of three different techniques and then aged in ambient conditions or (ii) undeveloped and then aged in ambient conditions. The three fingerprint development techniques used were: cyanoacrylate fuming, dusting with black powder, and the columnar-thin-film (CTF) technique. In order to determine the differences between the expected quantities and actual quantities of DNA, manually degraded samples generated by controlled exposure of DNA standards to ultraviolet radiation were also analyzed. A total of 144 fingerprint and 42 manually degraded DNA samples were processed in this study. The results indicate that the InnoQuant™ kit is capable of producing higher degradation ratios compared to the Quantifiler(®) Trio kit. This was an expected result since the degradation ratio is a relative value specific for a kit based on the length and extent of amplification of the two amplicons that vary from one kit to the other. Additionally, samples with lower concentrations of DNA yielded non-linear relationships of degradation ratio with the duration of aging, whereas samples with higher concentrations of DNA yielded quasi-linear relationships. None of the three development techniques produced a noticeably different degradation pattern when compared to undeveloped fingerprints, and therefore do not impede downstream DNA analysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
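    The degradation ratio the comparison hinges on, i.e. the concentration reported by the kit's short amplicon divided by that of its long amplicon, can be sketched as follows. The concentrations used are illustrative, not the study's data, and kit-specific amplicon lengths and thresholds are abstracted away, which is also why the two kits report different absolute ratios.

```python
def degradation_index(small_target_conc, large_target_conc):
    # Intact DNA amplifies both targets similarly (ratio near 1);
    # degraded DNA amplifies the long target poorly, so the ratio rises.
    if large_target_conc <= 0:
        return float("inf")
    return small_target_conc / large_target_conc

# Illustrative UV-exposed sample: ng/uL from the short vs long amplicon
ratio = degradation_index(0.40, 0.05)
```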

  18. Multiplex quantification of 16S rDNA of predominant bacteria group within human fecal samples by polymerase chain reaction--ligase detection reaction (PCR-LDR).

    Science.gov (United States)

    Li, Kai; Chen, Bei; Zhou, Yuxun; Huang, Rui; Liang, Yinming; Wang, Qinxi; Xiao, Zhenxian; Xiao, Junhua

    2009-03-01

    A new method, based on ligase detection reaction (LDR), was developed for quantitative detection of multiplex PCR amplicons of 16S rRNA genes present in complex mixtures (specifically feces). LDR has been widely used in single nucleotide polymorphism (SNP) assays but had never been applied to quantification of multiplex PCR products. The method employs one pair of DNA probes complementary to the target sequence, one of which is labeled with a fluorophore for signal capture. For multiple-target analysis, probes were modified with different lengths of polyT at the 5' and 3' ends. Using a DNA sequencer, the ligated probes were separated and identified by size and dye color. The relative abundance of target DNA was then normalized and quantified based on the fluorescence intensities and external size standards. The 16S rRNA genes of three predominant bacterial groups in human feces, Clostridium coccoides, Bacteroides and related genera, and the Clostridium leptum group, were amplified and cloned into plasmid DNA to generate standard curves. After PCR-LDR analysis, a strong linear relationship was found between the fluorescence intensity and the diluted plasmid DNA concentrations. Furthermore, based on this method, 100 human fecal samples were quantified for the relative abundance of the three bacterial groups. The relative abundance of C. coccoides was significantly higher in elderly people than in young adults, without gender differences. The relative abundance of Bacteroides and related genera and of the C. leptum group was significantly higher in young and middle-aged adults than in the elderly. Across the whole sample set, C. coccoides showed the highest relative abundance, followed, in decreasing order, by Bacteroides and related genera and C. leptum. These results imply that PCR-LDR can be feasibly and flexibly applied to large-scale epidemiological studies.

  19. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jü rgen; Ravasi, Timothy

    2016-01-01

    novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types

  20. Quantification of the sequestration of indium 111 labelled platelets

    International Nuclear Information System (INIS)

    Najean, Y.; Picard, N.; Dufour, V.; Rain, J.D.

    1988-01-01

    A simple method is proposed for accurate quantification of the splenic and/or hepatic sequestration of 111In-labelled platelets. It could allow a better prediction of the efficacy of splenectomy in idiopathic thrombocytopenic purpura [fr]

  1. Data-driven Demand Response Characterization and Quantification

    DEFF Research Database (Denmark)

    Le Ray, Guillaume; Pinson, Pierre; Larsen, Emil Mahler

    2017-01-01

    Analysis of load behavior in demand response (DR) schemes is important to evaluate the performance of participants. Very few real-world experiments have been carried out and quantification and characterization of the response is a difficult task. Nevertheless it will be a necessary tool for portf...

  2. MRI-based quantification of brain damage in cerebrovascular disorders

    NARCIS (Netherlands)

    de Bresser, J.H.J.M.

    2011-01-01

    Brain diseases can lead to diverse structural abnormalities that can be assessed on magnetic resonance imaging (MRI) scans. These abnormalities can be quantified by (semi-)automated techniques. The studies described in this thesis aimed to optimize and apply cerebral quantification techniques in

  3. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  4. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation of the concentration uncertainties.
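    The parameter-optimization idea, minimizing quadratic differences between the measured spectrum and an analytical model, can be sketched in a simplified linear special case: fitting only a peak amplitude and a constant background to a synthetic spectrum. POEMA itself optimizes many more, partly non-linear, parameters; peak position and width are held fixed here.

```python
import math

def gaussian(e, e0, sigma):
    return math.exp(-0.5 * ((e - e0) / sigma) ** 2)

def fit_peak_amplitude(energies, counts, e0, sigma):
    # Minimize sum_i (counts_i - (A * g_i + b))^2 over amplitude A and
    # background b by solving the 2x2 normal equations in closed form.
    g = [gaussian(e, e0, sigma) for e in energies]
    n = len(g)
    sg, sgg = sum(g), sum(x * x for x in g)
    sy, sgy = sum(counts), sum(x * y for x, y in zip(g, counts))
    det = n * sgg - sg * sg
    a = (n * sgy - sg * sy) / det
    b = (sgg * sy - sg * sgy) / det
    return a, b

# Synthetic noise-free spectrum: amplitude 100 on a background of 5
energies = [i * 0.01 for i in range(400)]
counts = [100.0 * gaussian(e, 2.0, 0.05) + 5.0 for e in energies]
amp, bkg = fit_peak_amplitude(energies, counts, 2.0, 0.05)
```

    In the full problem the model is non-linear in most parameters, so an iterative optimizer replaces the closed-form solution, but the objective, the quadratic misfit between spectrum and model, is the same.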

  5. Damage Localization and Quantification of Earthquake Excited RC-Frames

    DEFF Research Database (Denmark)

    Skjærbæk, P.S.; Nielsen, Søren R.K.; Kirkegaard, Poul Henning

    In the paper a recently proposed method for damage localization and quantification of RC-structures from response measurements is tested on experimental data. The method investigated requires at least one response measurement along the structure and the ground surface acceleration. Further, the t...

  6. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, Anna Maria Merel; Zijlstra, I.A.; Gathier, C.S.; van den Berg, R.; Slump, Cornelis H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for

  7. Enhancement of Electroluminescence (EL) image measurements for failure quantification methods

    DEFF Research Database (Denmark)

    Parikh, Harsh; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    Enhanced quality images are necessary for EL image analysis and failure quantification. A method is proposed which determines image quality in terms of more accurate failure detection of solar panels through electroluminescence (EL) imaging technique. The goal of the paper is to determine the most...

  8. Investigation on feasibility of recurrence quantification analysis for ...

    African Journals Online (AJOL)

    The RQA parameters such as percent recurrence (REC), trapping time (TT), percent laminarity (LAM) and entropy (ENT), and also the recurrence plots color patterns for different flank wear, can be used in detecting insert wear in face milling. Keywords: milling, flank wear, recurrence plot, recurrence quantification analysis.
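    Of the RQA measures listed, percent recurrence (REC) is the simplest: the density of recurrent points in the recurrence plot. A minimal sketch for a one-dimensional series (no phase-space embedding), with illustrative signals; LAM, TT and ENT additionally require analysing the plot's diagonal and vertical line structures.

```python
def recurrence_rate(series, radius):
    # Fraction of point pairs (i, j) whose distance is within `radius`,
    # i.e. the density of dots in the recurrence plot.
    n = len(series)
    hits = sum(
        1
        for i in range(n)
        for j in range(n)
        if abs(series[i] - series[j]) <= radius
    )
    return hits / (n * n)

# A periodic signal recurs heavily; a monotone ramp barely recurs.
rec_periodic = recurrence_rate([0, 1, 0, 1, 0, 1], 0.1)
rec_ramp = recurrence_rate([0, 1, 2, 3, 4, 5], 0.1)
```

    Tool-wear detection as described above rests on such measures shifting as the cutting dynamics change with flank wear.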

  9. Quantification and presence of human ancient DNA in burial place ...

    African Journals Online (AJOL)

    Quantification and presence of human ancient DNA in burial place remains of Turkey using real time polymerase chain reaction. ... A published real-time PCR assay, which allows for the combined analysis of nuclear or ancient DNA and mitochondrial DNA, was modified. This approach can be used for recovering DNA from ...

  10. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limit of detection (LOD) and limit of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
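    LOD and LOQ values such as those reported above are commonly derived from a calibration line via the ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, with σ the standard deviation of the response and S the calibration slope; whether this paper used exactly these formulas is an assumption. A minimal sketch with illustrative numbers:

```python
def lod_loq(sigma, slope):
    # ICH-style detection and quantification limits from a calibration line.
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative values: response SD 0.01, calibration slope 0.9 (per mg/mL)
lod, loq = lod_loq(sigma=0.01, slope=0.9)
```

    Note that LOQ is always about three times LOD under these formulas; the cholesterol figures above (0.02 vs 0.80 mg/mL) deviate strongly from that ratio, suggesting they were determined differently or empirically.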

  11. Real-Time PCR for Universal Phytoplasma Detection and Quantification

    DEFF Research Database (Denmark)

    Christensen, Nynne Meyn; Nyskjold, Henriette; Nicolaisen, Mogens

    2013-01-01

    Currently, the most efficient detection and precise quantification of phytoplasmas is by real-time PCR. Compared to nested PCR, this method is less sensitive to contamination and is less work intensive. Therefore, a universal real-time PCR method will be valuable in screening programs and in other...

  12. Double-layer Tablets of Lornoxicam: Validation of Quantification ...

    African Journals Online (AJOL)

    Double-layer Tablets of Lornoxicam: Validation of Quantification Method, In vitro Dissolution and Kinetic Modelling. ... Satisfactory results were obtained; all the tablet formulations met compendial requirements. The slowest drug release rate was obtained with tablet cores based on PVP K90 (1.21 mg%.h-1).

  13. Methane emission quantification from landfills using a double tracer approach

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Samuelsson, J.; Fredenslund, Anders Michael

    2007-01-01

    A tracer method was successfully used for quantification of the total methane (CH4) emission from Fakse landfill. By using two different tracers, the emission from different sections of the landfill could be quantified. Furthermore, it was possible to determine the emissions from local on site...
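    The tracer principle, releasing a tracer gas at a known rate from the source area and scaling that rate by the ratio of the two above-background plume enhancements, can be sketched as follows. The SF6 molar-mass default and the plume values are illustrative assumptions; the study's actual tracer gases and release rates are not restated here.

```python
def ch4_emission_kg_per_h(tracer_release_kg_per_h,
                          d_ch4_ppb, d_tracer_ppb,
                          m_ch4=16.04, m_tracer=146.05):
    # Mole-fraction enhancements (ppb above background) measured in the
    # mixed plume share the same dilution, so their ratio, converted from
    # molar to mass units, scales the known tracer mass flow to CH4.
    return (tracer_release_kg_per_h
            * (d_ch4_ppb / d_tracer_ppb)
            * (m_ch4 / m_tracer))

# e.g. 0.5 kg/h tracer; plume enhancements 120 ppb CH4 vs 2 ppb tracer
emission = ch4_emission_kg_per_h(0.5, 120.0, 2.0)
```

    With two tracers released over different landfill sections, the same ratio is formed per tracer, attributing a share of the CH4 plume to each section.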

  14. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based...

  15. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  16. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short-axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short-axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. In a 3D plot comprising the combined T1, T2 and PD data, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by temperature; T1 in particular exhibited strong temperature dependence. Correcting the quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification; equations to correct for the temperature dependence are provided. (orig.)
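    The paper's correction equations are not reproduced in the abstract. A hypothetical linear correction of the form value_37 = value + k * (37 - T) illustrates the idea, with the slope k an assumed, tissue-specific constant rather than a published value:

```python
def correct_to_37c(value, body_temp_c, slope_per_degc):
    # Hypothetical linear temperature correction of a quantitative MR
    # value (e.g. T1 in ms) to the reference temperature of 37 deg C.
    # The linear form and the slope are illustrative assumptions; the
    # paper derives tissue-specific equations from measured data.
    return value + slope_per_degc * (37.0 - body_temp_c)

# e.g. a T1 of 900 ms measured at a core temperature of 15 deg C,
# with a hypothetical slope of 5 ms per deg C:
t1_corrected = correct_to_37c(900.0, 15.0, 5.0)
```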

  17. Quantification of microbial quality and safety in minimally processed foods

    NARCIS (Netherlands)

    Zwietering, M.H.

    2002-01-01

    To find a good equilibrium between quality and margin of safety of minimally processed foods, often various hurdles are used. Quantification of the kinetics should be used to approach an optimum processing and to select the main aspects. Due to many factors of which the exact quantitative effect is

  18. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  19. Machine Learning for Quantification of Small Vessel Disease Imaging Biomarkers

    NARCIS (Netherlands)

    Ghafoorian, M.

    2018-01-01

    This thesis is devoted to developing fully automated methods for quantification of small vessel disease imaging bio-markers, namely WMHs and lacunes, using vari- ous machine learning/deep learning and computer vision techniques. The rest of the thesis is organized as follows: Chapter 2 describes

  20. Direct quantification of nickel in stainless steels by spectrophotometry

    International Nuclear Information System (INIS)

    Singh, Ritu; Raut, Vaibhavi V.; Jeyakumar, S.; Ramakumar, K.L.

    2007-01-01

    A spectrophotometric method based on the Ni-DMG complex for the quantification of nickel in steel samples without employing any prior separation is reported in the present study. The interfering ions are masked by suitable complexing agents and the method was extended to real samples after validating with BCS and Euro steel standards. (author)

  1. Development of a competitive PCR assay for the quantification of ...

    African Journals Online (AJOL)

    2010-01-25

    Jan 25, 2010 ... quantification of total Escherichia coli DNA in water. Omar Kousar Banu, Barnard .... Thereafter the product was ligated into the pGEM®T-easy cloning ... agarose gel using the high pure PCR product purification kit. (Roche® ...

  2. Recognition and quantification of pain in horses: A tutorial review

    DEFF Research Database (Denmark)

    Gleerup, Karina Charlotte Bech; Lindegaard, Casper

    2016-01-01

    Pain management is dependent on the quality of the pain evaluation. Ideally, pain evaluation is objective, pain-specific and easily incorporated into a busy equine clinic. This paper reviews the existing knowledge base regarding the identification and quantification of pain in horses. Behavioural...

  3. Quantification in dynamic and small-animal positron emission tomography

    NARCIS (Netherlands)

    Disselhorst, Johannes Antonius

    2011-01-01

    This thesis covers two aspects of positron emission tomography (PET) quantification. The first section addresses the characterization and optimization of a small-animal PET/CT scanner. The sensitivity and resolution as well as various parameters affecting image quality (reconstruction settings, type

  4. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT..., trustees must quantify the degree, and spatial and temporal extent of such injuries relative to baseline. (b) Quantification approaches. Trustees may quantify injuries in terms of: (1) The degree, and...

  5. A posteriori uncertainty quantification of PIV-based pressure data

    NARCIS (Netherlands)

    Azijli, I.; Sciacchitano, A.; Ragni, D.; Palha Da Silva Clérigo, A.; Dwight, R.P.

    2016-01-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from

  6. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)
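    The temperature-correction idea above can be sketched as a simple linear adjustment toward the 37 °C reference. The slope below is an illustrative assumption, not a coefficient from the study, which derives organ- and parameter-specific equations:

```python
# Linear correction of a postmortem MR T1 value to the 37 °C reference.
# SLOPE_MS_PER_C is a hypothetical coefficient for illustration only.

REFERENCE_TEMP_C = 37.0
SLOPE_MS_PER_C = 10.0

def correct_t1(t1_measured_ms, body_temp_c, slope=SLOPE_MS_PER_C):
    """Project a T1 measured at body_temp_c onto the 37 °C reference."""
    return t1_measured_ms + slope * (REFERENCE_TEMP_C - body_temp_c)

# A liver T1 of 500 ms measured at a 17 °C core temperature maps to
# 500 + 10 * (37 - 17) = 700 ms at the reference temperature.
print(correct_t1(500.0, 17.0))  # 700.0
```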

  7. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.
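    The copies-to-IU conversion and the inter-assay offset reported above are simple arithmetic; a minimal sketch (the specimen load is made up):

```python
import math

CONVERSION_FACTOR = 0.62  # IU per genomic copy, as reported for the Real-Q assay

def copies_to_iu(copies_per_ml):
    """Convert an EBV load in genomic copies/mL to WHO IU/mL."""
    return copies_per_ml * CONVERSION_FACTOR

def log10_difference(load_a, load_b):
    """Offset between two viral loads on the log10 scale."""
    return math.log10(load_a) - math.log10(load_b)

print(round(copies_to_iu(10_000.0), 2))  # 6200.0 IU/mL for a hypothetical specimen
# The reported 0.54 log10 offset between assays is about a 3.5-fold difference:
print(round(10 ** 0.54, 2))  # 3.47
```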

  8. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
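    A toy analogue of the approach, assuming nothing about the authors' implementation: classify sub-boxes of a bounded domain with interval bounds first, then spend samples only on the undecided boundary boxes. Here the constraint is x² + y² ≤ 1 on [0,1]², whose true solution fraction is π/4:

```python
import math
import random

def estimate_quarter_disk(n_boxes=10, samples_per_box=1000, seed=0):
    """Estimate the fraction of [0,1]^2 satisfying x^2 + y^2 <= 1.
    Sub-boxes are classified with interval bounds; sampling is spent
    only on the undecided boundary boxes. A toy analogue of
    interval-guided statistical quantification."""
    rng = random.Random(seed)
    h = 1.0 / n_boxes
    total = 0.0
    for i in range(n_boxes):
        for j in range(n_boxes):
            x0, y0 = i * h, j * h
            lo = x0 * x0 + y0 * y0              # interval lower bound of x^2+y^2
            hi = (x0 + h) ** 2 + (y0 + h) ** 2  # interval upper bound
            if hi <= 1.0:                       # box fully inside the solution set
                total += h * h
            elif lo <= 1.0:                     # undecided: fall back to sampling
                hits = sum(
                    1 for _ in range(samples_per_box)
                    if (x0 + rng.random() * h) ** 2
                    + (y0 + rng.random() * h) ** 2 <= 1.0)
                total += h * h * hits / samples_per_box
            # boxes with lo > 1.0 lie fully outside and contribute nothing
    return total

# The true fraction is pi/4 ~= 0.7854; the estimate lands close to it.
print(round(estimate_quarter_disk(), 3))
```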

  9. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  10. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, Karam; Andersen, Irene Klærke; Gesmar, Henrik

    2001-01-01

    Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice selective (ss) inversion slab is usually three to four times thicker than the imaging slice. However, this reduces perfusion sensitivity due to the increased transit delay of the incoming blood with unperturbed spins. In the present article, the dependence of the magnetization on the RF pulse slice profiles is inspected both theoretically and experimentally. A perfusion quantification model is presented that allows the use of thinner ss inversion slabs by taking into account the offset of RF slice profiles between ss and nonselective inversion slabs. This model was tested in both phantom and human studies. Magn Reson Med 46:193-197, 2001

  11. Good quantification practices of flavours and fragrances by mass spectrometry.

    Science.gov (United States)

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  12. Techniques of biomolecular quantification through AMS detection of radiocarbon

    International Nuclear Information System (INIS)

    Vogel, S.J.; Turteltaub, K.W.; Frantz, C.; Felton, J.S.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry offers a large gain over scintillation counting in sensitivity for detecting radiocarbon in biomolecular tracing. Application of this sensitivity requires new considerations of procedures to extract or isolate the carbon fraction to be quantified, to inventory all carbon in the sample, to prepare graphite from the sample for use in the spectrometer, and to derive a meaningful quantification from the measured isotope ratio. These procedures need to be accomplished without contaminating the sample with radiocarbon, which may be ubiquitous in laboratories and on equipment previously used for higher-dose scintillation experiments. Disposable equipment, materials and surfaces are used to control this contamination. Quantification of attomole amounts of labeled substances is possible through these techniques.

  13. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible
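    The conditioning on the immediately preceding task can be illustrated with the standard THERP dependence equations from the Swain and Guttmann handbook (the equations are the handbook's; the basic HEP of 0.01 is an arbitrary example):

```python
# Conditional human error probability (HEP) given failure on the
# immediately preceding task, using the standard THERP dependence
# equations; n is the basic (unconditioned) HEP.

def conditional_hep(basic_hep, dependence):
    n = basic_hep
    equations = {
        "zero":     n,
        "low":      (1 + 19 * n) / 20,
        "moderate": (1 + 6 * n) / 7,
        "high":     (1 + n) / 2,
        "complete": 1.0,
    }
    return equations[dependence]

# A basic HEP of 0.01 rises sharply as dependence on the preceding
# task increases:
for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, round(conditional_hep(0.01, level), 4))
# zero 0.01, low 0.0595, moderate 0.1514, high 0.505, complete 1.0
```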

  14. Metering error quantification under voltage and current waveform distortion

    Science.gov (United States)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of its negative effects on metering accuracy and fairness, the combined energy metering error is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a method for quantifying the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a method for quantifying the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error together, a comprehensive error analysis method is presented that is suitable for new energy sources and nonlinear loads. The proposed method has been verified by simulation.
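    One way to picture the mode error is to compare wide-band metering against a fundamental-only metering mode on a distorted waveform. The harmonic amplitudes below are made up for illustration, and in-phase components are assumed so that active power per harmonic is simply 0.5·Vk·Ik:

```python
import math

def sample_waveform(harmonics, n=2000):
    """Sample one full period of a waveform given {harmonic order: amplitude}
    (all components in phase, for simplicity)."""
    return [sum(a * math.sin(2 * math.pi * k * i / n)
                for k, a in harmonics.items())
            for i in range(n)]

def active_power(v, i):
    """Mean of the instantaneous power over one period."""
    return sum(vk * ik for vk, ik in zip(v, i)) / len(v)

# Made-up distorted condition: 3rd-harmonic components in voltage and current.
v = sample_waveform({1: 325.0, 3: 10.0})
i = sample_waveform({1: 14.0, 3: 3.0})

p_total = active_power(v, i)     # wide-band metering sees all harmonics
p_fund = 0.5 * 325.0 * 14.0      # a fundamental-only mode misses harmonic power
print(round(p_total, 1))                             # 2290.0 W
print(round(100 * (p_total - p_fund) / p_total, 2))  # harmonic share, 0.66 %
```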

  15. CT quantification of pleuropulmonary lesions in severe thoracic trauma

    International Nuclear Information System (INIS)

    Kunisch-Hoppe, M.; Bachmann, G.; Weimar, B.; Bauer, T.; Rau, W.S.; Hoppe, M.; Zickmann, B.

    1997-01-01

    Purpose: Computed quantification of the extent of pleuropulmonary trauma by CT and comparison with conventional chest X-ray - impact on therapy and correlation with mechanical ventilation support and clinical outcome. Method: In a prospective trial, 50 patients with clinically suspected blunt chest trauma were evaluated using CT and conventional chest X-ray. The computed quantification of ventilated lung provided by CT volumetry was correlated with the subsequent artificial respiration parameters and the clinical outcome. Results: We found a high correlation between CT volumetry and artificial ventilation with respect to maximal pressures and inspiratory oxygen concentration (FiO2, Goris score) (r=0.89, Pearson). The graduation of thoracic trauma correlated highly with the duration of mechanical ventilation (r=0.98, Pearson). CT is superior to conventional chest X-ray particularly with regard to atelectases and lung contusions; only 32% and 43%, respectively, were identified by conventional chest X-ray. (orig./AJ)

  16. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence of soil organic matter on uranyl sorption onto certain solids, a reliable and sufficiently rapid technique for the detection and quantification of uranyl is needed. In this work, we therefore propose to quantify uranyl in the presence of citric acid by modifying the UV-Vis-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that brings out the fluorescence of the uranyl ion while avoiding that produced by the organic acids. (Author)

  17. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging exam most commonly used to evaluate the sequelae of paracoccidioidomycosis (PCM). Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. This research therefore proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists in selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining the ratio of the injured area to the area of a healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)
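    The ROI/threshold/ratio pipeline described can be sketched as follows (in Python rather than MATLAB; the -950 HU emphysema cutoff is a common literature value, not necessarily the authors' threshold):

```python
import numpy as np

def lesion_fraction(hu_image, roi_mask, threshold_hu=-950.0):
    """Fraction of ROI pixels below an emphysema-like density threshold.
    Illustrates only the mask-and-threshold step; -950 HU is a common
    emphysema cutoff from the literature, not necessarily the authors'."""
    roi = hu_image[roi_mask]
    return float(np.count_nonzero(roi < threshold_hu)) / roi.size

# Toy 4x4 "HU" image with a 2x2 ROI containing one emphysematous pixel.
img = np.full((4, 4), -700.0)
img[1, 1] = -980.0
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
print(lesion_fraction(img, mask))  # 0.25
```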

  18. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  19. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with a focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    Science.gov (United States)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from the Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
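    The vortex-region step can be illustrated with the standard (unscaled) Q-criterion, Q = ½(‖Ω‖² − ‖S‖²), computed from the velocity-gradient tensor; the paper's scaled variant and the core-profiling step are not reproduced here:

```python
import numpy as np

def q_criterion(grad_u):
    """Q-criterion from a 3x3 velocity-gradient tensor (du_i/dx_j):
    Q = 0.5 * (||Omega||_F^2 - ||S||_F^2), where S and Omega are the
    symmetric (strain-rate) and antisymmetric (rotation) parts.
    Q > 0 marks regions where rotation dominates strain, i.e. vortices."""
    s = 0.5 * (grad_u + grad_u.T)       # strain-rate tensor
    omega = 0.5 * (grad_u - grad_u.T)   # rotation tensor
    return 0.5 * (np.sum(omega ** 2) - np.sum(s ** 2))

# Rigid-body rotation about the z axis: pure rotation, Q > 0.
rotation = np.array([[0.0, -1.0, 0.0],
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 0.0]])
# Simple shear: rotation and strain balance exactly, Q = 0.
shear = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0]])
print(q_criterion(rotation))  # 1.0
print(q_criterion(shear))     # 0.0
```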

  1. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans Bühlmann; Pavel V. Shevchenko; Mario V. Wüthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, the bank's internal model should make use of the internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is combining these data sources appropriately. In this paper we focus on quantification of the low frequency high impact losses exceeding some high thr...
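    Although the abstract is truncated, the classical Bühlmann credibility blend behind such models is simple: the internal estimate receives weight Z = n/(n + k). The numbers below are illustrative only, not from the paper:

```python
def credibility_estimate(internal_mean, external_mean, n_obs, k):
    """Buhlmann-style credibility blend of internal and external loss
    estimates: Z = n / (n + k). The credibility coefficient k is the key
    modelling choice; the value used below is purely illustrative."""
    z = n_obs / (n_obs + k)
    return z * internal_mean + (1 - z) * external_mean, z

# 30 internal observations, hypothetical credibility coefficient k = 10:
blended, z = credibility_estimate(internal_mean=2.0e6, external_mean=3.0e6,
                                  n_obs=30, k=10)
print(z)        # 0.75
print(blended)  # 2250000.0
```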

  2. Leishmania parasite detection and quantification using PCR-ELISA

    Czech Academy of Sciences Publication Activity Database

    Kobets, Tetyana; Badalová, Jana; Grekov, Igor; Havelková, Helena; Lipoldová, Marie

    2010-01-01

    Roč. 5, č. 6 (2010), s. 1074-1080 ISSN 1754-2189 R&D Projects: GA ČR GA310/08/1697; GA MŠk(CZ) LC06009 Institutional research plan: CEZ:AV0Z50520514 Keywords : polymerase chain reaction * Leishmania major infection * parasite quantification Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 8.362, year: 2010

  3. Digital PCR for direct quantification of viruses without DNA extraction

    OpenAIRE

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2015-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration mat...

  4. Detection and quantification of Leveillula taurica growth in pepper leaves.

    Science.gov (United States)

    Zheng, Zheng; Nonomura, Teruo; Bóka, Károly; Matsuda, Yoshinori; Visser, Richard G F; Toyoda, Hideyoshi; Kiss, Levente; Bai, Yuling

    2013-06-01

    Leveillula taurica is an obligate fungal pathogen that causes powdery mildew disease on a broad range of plants, including important crops such as pepper, tomato, eggplant, onion, cotton, and so on. The early stage of this disease is difficult to diagnose and the disease can easily spread unobserved; for example, in pepper and tomato production fields and greenhouses. The objective of this study was to develop a detection and quantification method of L. taurica biomass in pepper leaves with special regard to the early stages of infection. We monitored the development of the disease to time the infection process on the leaf surface as well as inside the pepper leaves. The initial and final steps of the infection taking place on the leaf surface were consecutively observed using a dissecting microscope and a scanning electron microscope. The development of the intercellular mycelium in the mesophyll was followed by light and transmission electron microscopy. A pair of L. taurica-specific primers was designed based on the internal transcribed spacer sequence of L. taurica and used in real-time polymerase chain reaction (PCR) assay to quantify the fungal DNA during infection. The specificity of this assay was confirmed by testing the primer pair with DNA from host plants and also from another powdery mildew species, Oidium neolycopersici, infecting tomato. A standard curve was obtained for absolute quantification of L. taurica biomass. In addition, we tested a relative quantification method by using a plant gene as reference and the obtained results were compared with the visual disease index scoring. The real-time PCR assay for L. taurica provides a valuable tool for detection and quantification of this pathogen in breeding activities as well in plant-microbe interaction studies.
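    Absolute quantification against a standard curve, as used here, maps a measured Ct onto a fitted line of Ct versus log₁₀ quantity. A minimal sketch with an idealized dilution series (a slope of −3.32 corresponds to 100% PCR efficiency; the values are not the paper's data):

```python
def fit_standard_curve(log10_quantities, cts):
    """Least-squares fit of Ct = slope * log10(quantity) + intercept."""
    n = len(cts)
    mx = sum(log10_quantities) / n
    my = sum(cts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_quantities, cts))
    sxx = sum((x - mx) ** 2 for x in log10_quantities)
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve: Ct back to absolute quantity."""
    return 10 ** ((ct - intercept) / slope)

# Idealized 10-fold dilution series of fungal DNA (arbitrary units).
log_quantities = [5, 4, 3, 2, 1]
cts = [15.0, 18.32, 21.64, 24.96, 28.28]
slope, intercept = fit_standard_curve(log_quantities, cts)
print(round(quantify(26.62, slope, intercept), 2))  # ~31.62, i.e. 10**1.5
```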

  5. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  6. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the method proposed is compared with the first principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation of the concentration uncertainties.

  7. Rapid quantification of biomarkers during kerogen microscale pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Stott, A.W.; Abbott, G.D. [Fossil Fuels and Environmental Geochemistry NRG, The University, Newcastle-upon-Tyne (United Kingdom)

    1995-02-01

    A rapid, reproducible method incorporating closed system microscale pyrolysis and thermal desorption-gas chromatography/mass spectrometry has been developed and applied to the quantification of sterane biomarkers released during pyrolysis of the Messel oil shale kerogen under confined conditions. This method allows a substantial experimental concentration-time data set to be collected at accurately controlled temperatures, due to the low thermal inertia of the microscale borosilicate glass reaction vessels, which facilitates kinetic studies of biomarker reactions during kerogen microscale pyrolysis

  8. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    Science.gov (United States)

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose: Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods: In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ the SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures via simulating known 3D surgical changes within CMFapp. Results: Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Visualization of color maps of virtually simulated surgical displacements describes corresponding surface distances that precisely describe the location of changes, and difference vectors indicate the directionality and magnitude of changes. Conclusions: SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control, and enhance the follow-up and documentation of clinical cases. PMID:21161693

  9. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
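    Underlying such water quantification is the Beer-Lambert attenuation law, I = I₀·exp(−μt); inverting it gives the water thickness along a ray. The attenuation coefficient below is an assumed round number, not a value from the paper:

```python
import math

# Water thickness from neutron transmission via the Beer-Lambert law,
# I = I0 * exp(-mu * t). MU_WATER is an assumed round number for thermal
# neutrons in water, used for illustration only.

MU_WATER = 3.5  # 1/cm

def water_thickness_cm(i_transmitted, i_incident, mu=MU_WATER):
    """Invert Beer-Lambert attenuation to recover the water path length."""
    return -math.log(i_transmitted / i_incident) / mu

# A pixel transmitting 50% of the incident beam:
t = water_thickness_cm(0.5, 1.0)
print(round(t, 3))  # ln(2)/3.5 ~ 0.198 cm
```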

  10. Quantification of rice bran oil in oil blends

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, R.; Sharma, H. K.; Sengar, G.

    2012-11-01

    Blends consisting of physically refined rice bran oil (PRBO): sunflower oil (SnF) and PRBO: safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods including gas chromatographic, HPLC, ultrasonic velocity and methods based on physico-chemical parameters. The physicochemical parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content reflected significant changes with increased proportions of PRBO in the blended oils. These parameters were selected as dependent parameters and % PRBO proportion was selected as independent parameters. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance, and iodine value can be used for the quantification of rice bran oil in blended oils. The rice bran oil can easily be quantified in the blended oils based on the oryzanol content by HPLC even at a 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level in blended oils whereas the method based on ultrasonic velocity, acoustic impedance and relative association showed initial promise in the quantification of rice bran oil. (Author) 23 refs.
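    The regression-based quantification can be sketched by fitting a calibration line and inverting it for an unknown blend. The oryzanol values below are hypothetical calibration points, not the paper's data; pure PRBO is oryzanol-rich while the blending oils contain essentially none:

```python
# Calibration-line quantification of rice bran oil from oryzanol content.
# The oryzanol values are made-up calibration points for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares line ys = slope * xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

oryzanol_pct = [0.0, 0.35, 0.70, 1.05, 1.40]   # hypothetical linear response
percent_prbo = [0, 25, 50, 75, 100]

slope, intercept = fit_line(oryzanol_pct, percent_prbo)
# An unknown blend measuring 0.56% oryzanol:
print(round(slope * 0.56 + intercept, 2))  # 40.0 (% PRBO)
```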

  11. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

The RISP (Rare Isotope Science Project) plans to provide neutron-rich rare isotopes (RIs) and stable heavy-ion beams. The accelerator is defined as a radiation production system under the Nuclear Safety Law; it therefore requires strict operating procedures and safety assurance to prevent radiation exposure, which in turn calls for evaluating the potential risk of the accelerator from the design stage onward. Although some PSA research has been conducted for accelerators, most of it offers simple descriptions of accidents rather than general accident sequences. In this paper, general accident scenarios are developed with event trees, and a new quantification methodology for the event trees is deduced. Initiating events that may occur in the accelerator are selected, and from them the accident scenarios of the accelerator facility are developed with event trees. These results can be used as basic data for future risk assessments of the accelerator. After the probability of each heading is analyzed, quantification can be carried out and the significance of each accident outcome evaluated. Once accident scenarios for external events are also developed, the risk assessment of the entire accelerator facility will be complete. The presented quantification techniques can produce reliable data that reduce the uncertainty of the event tree.
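Event-tree quantification of the kind described here reduces to multiplying the initiating-event frequency by the branch probabilities along each accident path. A minimal sketch, with purely illustrative headings and numbers (the record does not publish its values):

```python
# Minimal event-tree quantification: an accident sequence's frequency is the
# initiating-event frequency times the product of the branch (heading)
# probabilities along its path. All numbers are illustrative assumptions.
def sequence_frequency(ie_freq, branch_probs):
    f = ie_freq
    for p in branch_probs:
        f *= p
    return f

ie = 0.1  # assumed initiating events per year
# Hypothetical headings: beam shutoff fails (1e-2), shielding fails (1e-3).
seq = sequence_frequency(ie, [1e-2, 1e-3])  # both mitigations fail
```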

  12. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

Background: Quantification of different types of cells is often needed for the analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes to evaluate the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach to hepatocyte (HC) quantification is suitable for analyzing an entire digitized histological section given as a series of images. It is the main part of an automatic HC quantification tool that computes, in one processing run, the ratio between the number of proliferating HC nuclei and the total number of HC nuclei across a series of images. The processing pipeline yields valuable results for a wide range of images with different properties without additional parameter adjustment. Compared with a manually derived segmentation mask taken as the ground truth, our segmentations achieve a sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and a low false positive fraction and can be applied to entire stained sections.
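The two figures of merit quoted here are standard confusion-matrix quantities. A small sketch with made-up pixel tallies (the record reports only the thresholds, not the counts):

```python
# Sensitivity and false-positive fraction against a ground-truth segmentation
# mask, from pixel-level confusion counts. The tallies are hypothetical.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def false_positive_fraction(fp, tn):
    return fp / (fp + tn)

tp, fn, fp, tn = 920, 80, 120, 880  # assumed pixel counts
sens = sensitivity(tp, fn)             # 0.92 -> above the 90% reported
fpf = false_positive_fraction(fp, tn)  # 0.12 -> below the 15% reported
```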

  13. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
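Because neutrons are strongly attenuated by hydrogen, the water along a ray can be estimated from the measured transmission via the Beer-Lambert law. A sketch under stated assumptions: the attenuation coefficient for water (~3.5 cm⁻¹ for thermal neutrons) is a typical textbook-style value, not one reported in the record.

```python
import math

# Beer-Lambert sketch: recover water thickness t along a ray from neutron
# transmission I/I0 = exp(-mu * t). mu is an assumed illustrative value.
def water_thickness(transmission, mu=3.5):
    """Return water path length in cm for a given transmission fraction."""
    return -math.log(transmission) / mu

t = water_thickness(0.5)  # 50% transmission through the water-filled sample
```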

  14. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. Manual densitometric analysis is time-consuming, subjective and error-prone; automated quantification is faster and more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated against an established manual method. First, regions of interest (membrane fragments) are identified in confocal microscopy images. Next, densitometric intensity profiles are extracted orthogonally to the membrane fragments, following the direction from the plasma membrane to the cytoplasm. Finally, several quantitative descriptors are derived from the densitometric profiles and compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested on several independent experimental datasets. A fully automated workflow for information extraction and statistical evaluation has been developed and produces robust results. The new descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for translocation quantification. The slow manual calculation can thus be substituted by the fast and unbiased automated method.

  15. [DNA quantification of blood samples pre-treated with pyramidon].

    Science.gov (United States)

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

To study DNA quantification and STR typing of samples pre-treated with pyramidon, blood samples from ten unrelated individuals were anticoagulated with EDTA and blood stains were made on filter paper. The experimental groups were divided into six groups according to storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA quantities obtained with the different extraction methods differed significantly. Sixteen-locus DNA typing was detected in 90.56% of samples. Pyramidon pre-treatment causes DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for DNA extraction and STR profiling.

  16. Voltammetric Quantification of Paraquat and Glyphosate in Surface Waters

    Directory of Open Access Journals (Sweden)

    William Roberto Alza-Camacho

    2016-09-01

The indiscriminate use of pesticides on crops has a negative environmental impact that affects organisms, soil and water resources essential for life. It is therefore necessary to evaluate the residual presence of these substances in water sources. A simple, affordable and accessible electrochemical method for paraquat and glyphosate quantification in water was developed. The study was conducted using Britton-Robinson buffer solution as the supporting electrolyte, a glassy carbon working electrode, an Ag/AgCl reference electrode and a platinum auxiliary electrode. Differential pulse voltammetry (DPV) methods for both compounds were validated. The methods were linear, with correlation coefficients of 0.9949 and 0.9919, and the limits of detection and quantification were 130 and 190 mg/L for paraquat and 40 and 50 mg/L for glyphosate. Comparison with the reference method showed that the electrochemical method provides superior results in the quantification of these analytes. In the samples tested, paraquat ranged from 0.011 to 1.572 mg/L and glyphosate from 0.201 to 2.777 mg/L, indicating that these compounds are present in water sources and may be causing serious problems for human health.
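Detection and quantification limits of the kind reported here are commonly estimated from the calibration line. A sketch using the ICH-style formulas LOD = 3.3σ/slope and LOQ = 10σ/slope; the record does not state which formula it used, and the σ and slope values below are assumptions.

```python
# ICH-style limit estimates from a calibration line; sigma (residual SD of
# the blank/low-level response) and slope are hypothetical values.
def lod(sigma, slope):
    return 3.3 * sigma / slope

def loq(sigma, slope):
    return 10.0 * sigma / slope

sigma, slope = 0.6, 0.05  # assumed residual SD and calibration slope
detection_limit = lod(sigma, slope)
quantification_limit = loq(sigma, slope)
```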

  17. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    Science.gov (United States)

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the combination of selective extraction of the xanthophylls with analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Li Chuanfu; Zhou Kangyuan

    2006-01-01

Objective: To study the variability of normal brain volume with sex and age, and to put forward an objective standard for computer-automated quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with a newly developed algorithm for automated quantification of brain atrophy. Using polynomial curve fitting, the mathematical relationship of BPF with age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm³, (1 211 725 ± 122 077) mm³ and (95.3471 ± 2.3453)%, respectively; those of atrophy subjects were (1 276 900 ± 125 180) mm³, (1 203 400 ± 117 760) mm³ and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was statistically significant (P < 0.05). The expression P(x) = -0.0008x² + 0.0193x + 96.9999 accurately describes the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y = -0.0008x² + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the BPF-age relationship can be used as an objective criterion for computer-automated quantification of brain atrophy. (authors)
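The proposed criterion can be written directly from the fitted curves: a subject is flagged as atrophic when the measured BPF falls below the lower 95% CI curve for normals at that age. A sketch using the coefficients quoted in the record:

```python
# Atrophy criterion sketch using the record's fitted lower 95% CI curve
# for brain parenchymal fraction (BPF, %) as a function of age (years).
def bpf_lower_limit(age):
    return -0.0008 * age**2 + 0.0184 * age + 95.1090

def is_atrophic(bpf, age):
    """Flag atrophy when BPF falls below the normal lower limit for that age."""
    return bpf < bpf_lower_limit(age)

limit_60 = bpf_lower_limit(60)   # lower limit at age 60
flag = is_atrophic(91.8, 60)     # mean atrophy-group BPF at an assumed age 60
```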

  19. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials.

  20. Efficient Quantification of Uncertainties in Complex Computer Code Results, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  1. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation, mainly by using approximations. The conservatism of these approximations is itself a source of quantification uncertainty. In this paper, exact MCS quantification methods based on the 'sum of disjoint products' (SDP) logic and the inclusion-exclusion formula are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated.
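The conservatism at issue can be shown on a toy fault tree: the common rare-event approximation sums the cutset probabilities, while the exact union probability follows from inclusion-exclusion. A minimal sketch with two illustrative cutsets over independent basic events (not the Shin-Kori model):

```python
from itertools import combinations

# Exact top-event probability of a union of minimal cut sets via
# inclusion-exclusion, versus the rare-event (sum) approximation.
# Basic events are assumed independent; all numbers are illustrative.
def cutset_prob(cutset, p):
    prob = 1.0
    for e in set(cutset):
        prob *= p[e]
    return prob

def union_prob(cutsets, p):
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            merged = set().union(*combo)   # events in the intersection term
            term = 1.0
            for e in merged:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

p = {"A": 0.1, "B": 0.2, "C": 0.1}
cutsets = [{"A", "B"}, {"B", "C"}]
exact = union_prob(cutsets, p)                        # 0.02 + 0.02 - 0.002
rare_event = sum(cutset_prob(c, p) for c in cutsets)  # conservative estimate
```

The rare-event sum (0.04) overestimates the exact value (0.038); the gap grows as basic-event probabilities grow, which is the conservatism the paper evaluates.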

  2. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

Correct DNA quantification is essential for obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time PCR-based kits are the most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experimental results. In this study, we examined how much DNA quantification results are altered in samples containing a PCR inhibitor when using the Quantifiler® Human DNA Quantification kit. We prepared DNA samples of approximately 0.25 ng/μl containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was 4.8 ng/μl or higher. The threshold cycle (Ct) values of the internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantities may be obtained for DNA samples with high IPC Ct values; researchers should therefore interpret DNA quantification results carefully. We additionally examined the effects of HA on STR amplification using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, a better understanding of the various effects of HA should help researchers recognize and handle samples containing HA.
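Real-time PCR kits of this kind quantify DNA by reading the threshold cycle off a standard curve, Ct = slope · log10(quantity) + intercept; inhibition shifts Ct upward (as seen for the IPC here) and so depresses the apparent quantity. A sketch with typical assumed curve parameters, not the Quantifiler kit's actual calibration:

```python
# qPCR standard-curve sketch: invert Ct = slope*log10(q) + intercept to get
# the DNA quantity q. Slope/intercept are typical assumed values only.
def quantity_from_ct(ct, slope=-3.32, intercept=29.0):
    return 10 ** ((ct - intercept) / slope)

q_clean = quantity_from_ct(29.0)      # Ct at the intercept -> 1.0 (ng/ul)
q_inhibited = quantity_from_ct(32.32) # inhibitor delays Ct by ~1 log
```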

  3. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    Science.gov (United States)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

Methane, a key component of natural gas, has a 25x higher global warming potential than carbon dioxide on a 100-year basis. Accurately monitoring and mitigating methane emissions requires cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies and the one adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow for rapid and automatic leak quantification. In particular, we use two-stream deep convolutional networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation by collecting about 20 videos (397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10-60 ft. Leak sources included natural gas wellheads, separators, and tanks. All frames were labeled with a true leak size on one of eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that two-stream ConvNets provide a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, reaches an accuracy of 58.3%. Integrating the two streams gives a combined accuracy of 77.6%. In future work, we will split the training and testing datasets in distinct ways to test the generalization of the algorithm to different leak sources. Several analytic metrics, including the confusion matrix and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help find and fix super-emitters and improve the cost-effectiveness of leak detection and repair.
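One common way to integrate two such streams is late fusion: average the per-class scores of the spatial and temporal networks, then take the arg-max class. The record does not specify its fusion scheme, so this is a generic sketch with made-up stand-ins for the two softmax outputs:

```python
# Late-fusion sketch of the two-stream idea: average class scores from a
# spatial and a temporal stream, then pick the leak-size class.
# The score vectors below are hypothetical softmax outputs.
def fuse(spatial_scores, temporal_scores):
    return [(s + t) / 2.0 for s, t in zip(spatial_scores, temporal_scores)]

def predict(scores):
    return max(range(len(scores)), key=scores.__getitem__)

spatial = [0.10, 0.60, 0.30]   # hypothetical softmax over 3 size bins
temporal = [0.20, 0.30, 0.50]
fused = fuse(spatial, temporal)  # [0.15, 0.45, 0.40]
leak_class = predict(fused)      # the streams disagree; fusion picks bin 1
```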

  4. Quantification of breast arterial calcification using full field digital mammography

    International Nuclear Information System (INIS)

    Molloi, Sabee; Xu Tong; Ducote, Justin; Iribarren, Carlos

    2008-01-01

Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and may be a useful marker of coronary artery disease. It can potentially be a useful tool for the assessment of coronary artery disease in women, since mammography is widely used as a screening tool for early detection of breast cancer. However, no techniques are currently available for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.
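The reported calibration lines can be inverted to recover the true calcium mass from a densitometric reading. A sketch using the 5 cm phantom relation M = 0.964K - 0.288 mg quoted in the record (the measured value below is a made-up example):

```python
# Invert the record's 5 cm calibration M = 0.964*K - 0.288 (mg) to estimate
# the true calcium mass K from a densitometric measurement M.
def true_mass_5cm(measured_mg, a=0.964, b=-0.288):
    return (measured_mg - b) / a

k = true_mass_5cm(9.352)  # hypothetical reading: (9.352 + 0.288) / 0.964
```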

  5. Development of the quantification procedures for in situ XRF analysis

    International Nuclear Information System (INIS)

    Kump, P.; Necemer, M.; Rupnik, P.

    2005-01-01

For in situ XRF applications, two excitation systems (radioisotope- and tube-excited) and an X-ray spectrometer based on an Si-PIN detector were assembled and used. The radioisotope excitation system with an Am-241 source was assembled into a prototype of a compact XRF analyser, PEDUZO-01, which is also applicable to field work. The existing quantification software QAES (quantitative analysis of environmental samples) was assessed to be adequate for field work as well. The QAES software was also integrated into new software attached to the developed XRF analyser PEDUZO-01, which includes spectrum acquisition, spectrum analysis and quantification, and runs in the LabVIEW environment. In assessing the Si-PIN-based X-ray spectrometers and the QAES quantification software for field work, a comparison was made with results obtained by a standard Si(Li)-based spectrometer; this study proves that the use of the Si-PIN spectrometer is adequate for field work, and the work was accepted for publication in X-Ray Spectrometry. The effect of simple preparation of solid samples on the analytical results was also studied. It was established that, under definite conditions, the results do not differ greatly from those obtained with a homogenized sample pressed into a pellet. The influence of particle size and mineralogical effects on quantitative results was studied, and a simple sample preparation kit was proposed. Sample preparation for the analysis of water samples by precipitation with APDC, and for aerosol analysis using a dichotomous sampler, was also adapted and used in the field work, and an adequate sample preparation kit was proposed. (author)

  6. QUANTIFICATION OF GENETICALLY MODIFIED MAIZE MON 810 IN PROCESSED FOODS

    Directory of Open Access Journals (Sweden)

    Peter Siekel

    2012-12-01

Maize MON 810 (Zea mays L.) represents the majority of genetically modified food crops. It is the only transgenic cultivar grown in EU (European Union) countries, and food products containing more than 0.9% of it must be labelled. This study examined the impact of food processing (temperature, pH and pressure) on DNA degradation and on the quantification of the genetically modified maize MON 810. The transgenic DNA was quantified by the real-time polymerase chain reaction method. Processing at high temperature (121 °C), elevated pressure (0.1 MPa) and low pH (2.25) fragmented the DNA. A two-order-of-magnitude difference between the species-specific gene content and the transgenic DNA content of the plant materials led to false negative results in the quantification of transgenic DNA. Maize containing 4.2% of the transgene appeared after processing to contain as little as 3.0% (100 °C) or 1.9% (121 °C, 0.1 MPa); a 2.1% transgene content dropped to 1.0% at 100 °C and to 0.6% at 121 °C and 0.1 MPa. The degradation of the transgenic content thus showed a two- to three-fold apparent decrease as a consequence of the unequal gene presence: the disparity appears as a considerable decrease in transgenic content, while the decrease in species-specific gene content remains unnoticed. Based on our findings, we conclude that a high degree of processing might lead to false negative results in the quantification of the transgenic constituent. Determination of GMO content in processed foods may therefore lead to incorrect statements, and labelling based on such results could mislead consumers. doi:10.5219/212
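The quantification at issue is a ratio measurement: GMO content is the transgene copy number relative to the species-specific reference gene, so degradation that removes transgene targets while the abundant reference signal persists pulls the measured percentage down. A sketch with illustrative copy numbers (not the paper's data):

```python
# GMO content as the ratio of transgene copies to species-specific reference
# gene copies, as in real-time PCR quantification. Copy numbers are
# illustrative assumptions.
def gmo_percent(transgene_copies, reference_copies):
    return 100.0 * transgene_copies / reference_copies

before = gmo_percent(42, 1000)  # 4.2 % before processing
after = gmo_percent(19, 1000)   # apparent content after harsh processing
```

With the reference signal unchanged, the loss of transgene targets alone drives the apparent content from 4.2% to 1.9%, mirroring the drop the paper reports.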

  7. Simultaneous quantification of flavonoids and triterpenoids in licorice using HPLC.

    Science.gov (United States)

    Wang, Yuan-Chuen; Yang, Yi-Shan

    2007-05-01

Numerous bioactive compounds are present in licorice (Glycyrrhizae Radix), including flavonoids and triterpenoids. In this study, a reversed-phase high-performance liquid chromatography (HPLC) method was developed for the simultaneous quantification of three flavonoids (liquiritin, liquiritigenin and isoliquiritigenin) and four triterpenoids (glycyrrhizin, 18alpha-glycyrrhetinic acid, 18beta-glycyrrhetinic acid and 18beta-glycyrrhetinic acid methyl ester) from licorice, and used to quantify these seven compounds in 20 different licorice samples. Specifically, the reversed-phase HPLC was performed with a gradient mobile phase of 25 mM phosphate buffer (pH 2.5)-acetonitrile with the following gradient elution steps: 0 min, 100:0; 10 min, 80:20; 50 min, 70:30; 73 min, 50:50; 110 min, 50:50; 125 min, 20:80; 140 min, 20:80; peaks were detected at 254 nm. The technique achieved rather good specificity with respect to the separation of these seven compounds. The regression coefficients of the linear equations for the seven compounds lay between 0.9978 and 0.9992. The limits of detection and quantification lay in the ranges 0.044-0.084 and 0.13-0.25 microg/ml, respectively. The relative recovery rates for the seven compounds lay between 96.63±2.43% and 103.55±2.77%. Coefficients of variation for intra-day and inter-day precision lay in the ranges 0.20-1.84% and 0.28-1.86%, respectively. Based on our validation results, this analytical technique is a convenient method for the simultaneous quantification of numerous bioactive compounds derived from licorice, featuring good quantification parameters, accuracy and precision.

  8. The role of PET quantification in cardiovascular imaging.

    Science.gov (United States)

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

Positron emission tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently routine practice. Combining ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18F-FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in clinical assessment, owing to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between the software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparison to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers such as 18F-flurpiridaz may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, quantification of atherosclerotic plaque uptake of 18F-FDG and 18F-sodium fluoride tracers in the carotids, aorta and coronary arteries offers the potential to identify vulnerable plaques.

  9. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

QUEST (www.quest-scidac.org) is a SciDAC Institute focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing: the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of the requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied mathematics, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT
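Forward uncertainty propagation, one of the activities listed above, can be illustrated in its simplest Monte Carlo form: sample the uncertain input, push each sample through the model, and summarize the output distribution. The model y = x² and the input distribution below are illustrative choices, not a QUEST deliverable.

```python
import random

# Minimal Monte Carlo forward uncertainty propagation: push input samples
# through a model and summarize the output. Model and distribution are
# illustrative assumptions only.
def propagate(model, sampler, n=10000, seed=42):
    rng = random.Random(seed)          # seeded for reproducibility
    ys = [model(sampler(rng)) for _ in range(n)]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return mean, var

# For x ~ N(0,1) and y = x**2, theory gives E[y] = 1 and Var[y] = 2.
mean, var = propagate(lambda x: x * x, lambda r: r.gauss(0.0, 1.0))
```

Surrogate and reduced-order models, as developed by the MIT team, exist precisely to cut the cost of the many model evaluations this loop requires.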

  10. Dermatologic radiotherapy and thyroid cancer. Dose measurements and risk quantification

    International Nuclear Information System (INIS)

    Goldschmidt, H.; Gorson, R.O.; Lassen, M.

    1983-01-01

    Thyroid doses for various dermatologic radiation techniques were measured with thermoluminescent dosimeters and ionization rate meters in an Alderson-Rando anthropomorphic phantom. The effects of changes in radiation quality and of the use or nonuse of treatment cones and thyroid shields were evaluated in detail. The results indicate that the potential risk of radiogenic thyroid cancer is very small when proper radiation protection measures are used. The probability of radiogenic thyroid cancer developing and the potential mortality risk were assessed quantitatively for each measurement. The quantification of radiation risks allows comparisons with risks of other therapeutic modalities and the common hazards of daily life

  11. HPLC Quantification of Cytotoxic Compounds from Aspergillus niger

    Directory of Open Access Journals (Sweden)

    Paula Karina S. Uchoa

    2017-01-01

    Full Text Available A high-performance liquid chromatography method was developed and validated for the quantification of the cytotoxic compounds produced by a marine strain of Aspergillus niger. The fungus was grown in malt peptone dextrose (MPD), potato dextrose yeast (PDY), and mannitol peptone yeast (MnPY) media during 7, 14, 21, and 28 days, and the natural products were identified using standard compounds. The validation parameters obtained were selectivity, linearity (coefficient of correlation > 0.99), precision (relative standard deviation below 5%), and accuracy (recovery > 96.

  12. Improved Method for PD-Quantification in Power Cables

    DEFF Research Database (Denmark)

    Holbøll, Joachim T.; Villefrance, Rasmus; Henriksen, Mogens

    1999-01-01

    In this paper, a method is described for improved quantification of partial discharges (PD) in power cables. The method is suitable for PD detection and location systems in the MHz range, where pulse attenuation and distortion along the cable cannot be neglected. The system transfer function...... was calculated and measured in order to form the basis for magnitude calculation after each measurement. --- Limitations and capabilities of the method will be discussed and related to relevant field applications of high-frequency PD measurements. --- Methods for increased signal/noise ratio are easily implemented...

  13. 'Motion frozen' quantification and display of myocardial perfusion gated SPECT

    International Nuclear Information System (INIS)

    Slomka, P.J.; Hurwitz, G.A.; Baddredine, M.; Baranowski, J.; Aladl, U.E.

    2002-01-01

    Aim: Gated SPECT imaging incorporates both functional and perfusion information of the left ventricle (LV). However, perfusion data are confounded by the effect of ventricular motion. Most existing quantification paradigms simply add all gated frames and then proceed to extract the perfusion information from static images, discarding the effects of cardiac motion. In an attempt to improve the reliability and accuracy of cardiac SPECT quantification, we propose to eliminate the LV motion prior to the perfusion quantification via an automated image warping algorithm. Methods: A pilot series of 14 male and 11 female gated stress SPECT images acquired with 8 time bins was co-registered to the coordinates of the 3D normal templates. Subsequently, the LV endocardial and epicardial 3D points (300-500) were identified on the end-systolic (ES) and end-diastolic (ED) frames, defining the ES-ED motion vectors. A nonlinear image warping algorithm (thin-plate spline) was then applied to warp the end-systolic frame onto the end-diastolic frame using the corresponding ES-ED motion vectors. The remaining 6 intermediate frames were also transformed to the ED coordinates using fractions of the motion vectors. The warped images were then summed to provide the LV perfusion image in the ED phase but with counts from the full cycle. Results: The identification of the ED/ES corresponding points was successful in all cases. The corrected displacement between ED and ES images was up to 25 mm. The summed images had the appearance of the ED frames but were much less noisy since all the counts were used. The spatial resolution of such images appeared higher than that of summed gated images, especially in the female scans. These 'motion frozen' images could be displayed and quantified as regular non-gated tomograms including the polar map paradigm. Conclusions: This image processing technique may improve the effective image resolution of summed gated myocardial perfusion images used for
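The warping step described above can be sketched with a thin-plate-spline interpolator. This is a minimal illustration only: the landmark coordinates, motion vectors, and point count below are synthetic stand-ins for the ES/ED surface points the abstract describes, not the authors' data or code.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic corresponding landmarks: end-systolic (ES) points and their
# ES->ED motion vectors (the abstract identifies 300-500 such points).
rng = np.random.default_rng(0)
es_pts = rng.uniform(0.0, 50.0, size=(300, 3))   # mm, hypothetical LV surface
motion = rng.normal(0.0, 2.0, size=(300, 3))     # hypothetical ES->ED motion
ed_pts = es_pts + motion

# Thin-plate-spline warp fitted on the corresponding points: it maps any
# ES-frame coordinate into ED coordinates (exact at the landmarks).
tps = RBFInterpolator(es_pts, ed_pts, kernel='thin_plate_spline')

def warp_toward_ed(points, fraction):
    """Warp points by a fraction of the full ES->ED motion, as done for the
    intermediate gated frames."""
    displacement = tps(points) - points
    return points + fraction * displacement

mid_frame_pts = warp_toward_ed(es_pts, 0.5)   # e.g. a frame halfway in the cycle
full_warp_pts = warp_toward_ed(es_pts, 1.0)   # ES frame mapped onto ED frame
```

In the procedure the abstract outlines, each of the 8 gated frames would be warped with its own fraction of the motion vectors and the warped frames summed, keeping all counts while freezing the motion.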

  14. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    Science.gov (United States)

    Xia, Kelin; Wei, Guo-Wei

    2014-01-01

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365

  15. Temporal and spatial quantification of farm and landscape functions

    DEFF Research Database (Denmark)

    Andersen, Peter Stubkjær

    , residence, habitat, and recreation; development of a method for quantifying farm functionality and assessing multifunctionality; and definition of a farm typology based on multifunctionality strategies. Empirical data from farm interviews were used in the study to test the developed methods. The results...... generally decrease, and a tendency toward increased segregation of the rural landscape is observed. In perspective, further studies on quantification in tangible units, synergies and trade-offs between functions at different scales, and correlations between structures and functions are needed....

  16. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens

    2015-11-26

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving the history matching process. The framework can consist of a geological model that is interfaced with a reservoir simulator. The reservoir simulator can interface with seismic, electromagnetic, gravimetric and surface deformation modules to predict the corresponding observations. The observations can then be incorporated into a recursive filter that subsequently updates the model state and parameter distributions, providing a general framework to quantify, and eventually reduce with the data, the uncertainty in the estimated reservoir state and parameters.

  17. Review of some aspects of human reliability quantification

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.; Spurgin, A.J.; Hannaman, G.W.; Lukic, Y.D.

    1986-01-01

    An area of systems reliability considered to be weak is the characterization and quantification of the role of the operations and maintenance staff in combatting accidents. Several R and D programs are underway to improve the modeling of human interactions, and some progress has been made. This paper describes a specific aspect of human reliability analysis which is referred to as modeling of cognitive processes. In particular, the basis for the so-called Human Cognitive Reliability (HCR) model is described, and the focus is on its validation and on its benefits and limitations

  18. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Stracuzzi, David John [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Brost, Randolph [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Chen, Maximillian Gene [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Malinas, Rebecca [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Peterson, Matthew Gregor [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Phillips, Cynthia A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Woodbridge, Diane [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  19. NEW MODEL FOR QUANTIFICATION OF ICT DEPENDABLE ORGANIZATIONS RESILIENCE

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2011-03-01

    Full Text Available The business environment today demands highly reliable organizations in every segment to be competitive on the global market. Besides that, the ICT sector is becoming irreplaceable in many fields of business, from communication to complex systems for process control and production. To fulfill those requirements and to develop further, many organizations worldwide are implementing a business paradigm called organizational resilience. Although resilience is a well-known term in many fields of science, it is not well studied due to its complex nature. This paper deals with developing a new model for the assessment and quantification of the resilience of ICT-dependable organizations.

  20. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    Science.gov (United States)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) Tailorable for low/high reliability missions; b) Tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) Need to define strategies combining both Theoretical Tools and Experimental Methods. The main reason for this lecture is to give a flavor of where UQ and SE could contribute and hope that the broader community will work with us to improve in these areas.

  1. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
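As a minimal illustration of the non-intrusive spectral approach the abstract describes, the sketch below projects a cheap 1D placeholder model (standing in for an expensive SDTRIM run) onto probabilists' Hermite polynomials via Gauss-Hermite quadrature. The model function, expansion order, and node count are illustrative assumptions, not taken from the paper.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def model(x):
    # Cheap placeholder for an expensive simulation response to a single
    # standard-normal input (not the SDTRIM code itself).
    return np.exp(0.3 * x) + 0.1 * x**2

order = 8
nodes, weights = hermegauss(32)        # quadrature for the weight exp(-x^2/2)
weights = weights / weights.sum()      # normalize to a probability measure

# Non-intrusive projection: c_n = E[f(X) He_n(X)] / n!, using <He_n, He_n> = n!
fvals = model(nodes)
coeffs = np.array([
    np.sum(weights * fvals * hermeval(nodes, [0.0] * n + [1.0]))
    / math.factorial(n)
    for n in range(order + 1)
])

# Mean and variance follow directly from the expansion coefficients,
# with no further model evaluations.
mean = coeffs[0]
variance = sum(coeffs[n] ** 2 * math.factorial(n) for n in range(1, order + 1))
```

The same few model evaluations yield the full coefficient set, which is what makes sensitivity information (which inputs drive the output variance) essentially free in the multi-dimensional case.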

  2. DNA imaging and quantification using chemi-luminescent probes

    International Nuclear Information System (INIS)

    Dorner, G.; Redjdal, N.; Laniece, P.; Siebert, R.; Tricoire, H.; Valentin, L.

    1999-01-01

    During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm² of labelled DNA over a surface area of 25 x 25 cm² with sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for the best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors)

  3. Nuclear Data Uncertainty Quantification: Past, Present and Future

    International Nuclear Information System (INIS)

    Smith, D.L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested

  4. Nuclear Data Uncertainty Quantification: Past, Present and Future

    Science.gov (United States)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  5. Quantification in histopathology-Can magnetic particles help?

    International Nuclear Information System (INIS)

    Mitchels, John; Hawkins, Peter; Luxton, Richard; Rhodes, Anthony

    2007-01-01

    Every year, more than 270,000 people are diagnosed with cancer in the UK alone; one in three people worldwide contract cancer within their lifetime. Histopathology is the principal method for confirming cancer and directing treatment. In this paper, a novel application of magnetic particles is proposed to help address the problem of subjectivity in histopathology. Preliminary results indicate that magnetic nanoparticles can not only be used to assist diagnosis through improving quantification but also potentially increase throughput, hence offering a way of dramatically reducing costs within the routine histopathology laboratory

  6. Quantification of protein concentration using UV absorbance and Coomassie dyes.

    Science.gov (United States)

    Noble, James E

    2014-01-01

    The measurement of a solubilized protein concentration in solution is an important assay in biochemistry research and development labs for applications ranging from enzymatic studies to providing data for biopharmaceutical lot release. Spectrophotometric protein quantification assays are methods that use UV and visible spectroscopy to rapidly determine the concentration of protein, relative to a standard, or using an assigned extinction coefficient. Where multiple samples need measurement, and/or the sample volume and concentration is limited, preparations of the Coomassie dye commonly known as the Bradford assay can be used. © 2014 Elsevier Inc. All rights reserved.
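The direct UV-absorbance route described above reduces to the Beer-Lambert law, A = ε·c·l. A sketch of that calculation follows; the extinction coefficient used is an illustrative, roughly BSA-like value, an assumption rather than a figure from the chapter.

```python
def concentration_from_absorbance(a280, eps_m_cm, path_cm=1.0):
    """Molar concentration from UV absorbance via Beer-Lambert: A = eps * c * l."""
    return a280 / (eps_m_cm * path_cm)

# Example: A280 = 0.55 in a 1 cm cuvette, with an assumed molar extinction
# coefficient of 44,000 1/(M*cm) (illustrative, BSA-like; not from the chapter).
c_molar = concentration_from_absorbance(0.55, 44000.0)   # = 1.25e-5 M
```

For the Coomassie (Bradford) route mentioned in the abstract, the same arithmetic is replaced by interpolation against a standard curve, since dye binding is not strictly linear in concentration.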

  7. Development of magnetic resonance technology for noninvasive boron quantification

    International Nuclear Information System (INIS)

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa trademark MRI system, release 3.X and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs

  8. QUANTIFICATION OF ANGIOGENESIS IN THE CHICKEN CHORIOALLANTOIC MEMBRANE (CAM)

    Directory of Open Access Journals (Sweden)

    Silvia Blacher

    2011-05-01

    Full Text Available The chick chorioallantoic membrane (CAM) provides a suitable in vivo model to study angiogenesis and to evaluate several pro- and anti-angiogenic factors and compounds. In the present work, new developments in image analysis are used to quantify the CAM angiogenic response from optical microscopic observations, covering all vascular components, from the large supplying and feeding vessels down to the capillary plexus. To validate our methodology, angiogenesis is quantified during two phases of CAM development (days 7 and 13) and after treatment with an antiangiogenic modulator. Our morphometric analysis emphasizes that an accurate quantification of the CAM vasculature needs to be performed at various scales.

  9. Prospective comparison of liver stiffness measurements between two point shear wave elastography methods: Virtual touch quantification and elastography point quantification

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun Suk; Lee, Jeong Min; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo [Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-09-15

    To prospectively compare technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ{sup 2} analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). The two pSWE techniques showed similar technical success rate (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Virtual touch quantification and ElastPQ showed similar technical success rate and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement.
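The agreement statistics reported above (bias, 95% Bland-Altman limits of agreement as a percentage of the mean) can be computed as sketched below. The paired readings are hypothetical illustrative values, not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired liver-stiffness readings (m/s); not the study's data.
vtq     = np.array([1.2, 1.5, 1.8, 2.1, 1.4, 1.6, 1.9, 2.3])
elastpq = np.array([1.1, 1.5, 1.7, 2.2, 1.3, 1.7, 1.8, 2.2])

bias, low, high = bland_altman(vtq, elastpq)
# The abstract reports agreement as a percentage of the overall mean:
span_pct_of_mean = (high - low) / ((vtq + elastpq) / 2.0).mean() * 100.0
```

A small bias with a wide limit-of-agreement span (here expressed relative to the mean, as in the abstract's 27.5% figure) is exactly the pattern that makes two strongly correlated methods non-interchangeable.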

  10. DNA imaging and quantification using chemi-luminescent probes; Imagerie et quantification d`ADN par chimiluminescence

    Energy Technology Data Exchange (ETDEWEB)

    Dorner, G; Redjdal, N; Laniece, P; Siebert, R; Tricoire, H; Valentin, L [Groupe I.P.B., Experimental Research Division, Inst. de Physique Nucleaire, Paris-11 Univ., 91 - Orsay (France)

    1999-11-01

    During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm{sup 2} of labelled DNA over a surface area of 25 x 25 cm{sup 2} with sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for the best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors) 1 fig.

  11. A Spanish model for quantification and management of construction waste

    International Nuclear Information System (INIS)

    Solis-Guzman, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramirez-de-Arellano, Antonio

    2009-01-01

    Currently, construction and demolition waste (C and D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C and D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C and D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C and D waste volume in both new construction and demolition projects.
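The three coefficients act as simple per-area multipliers on the projected building. The function below is a sketch of that use; the coefficient values and built area are placeholders for illustration (the paper derives its own coefficients from the bills of quantities of 100 dwelling projects).

```python
def cd_waste_volumes(built_area_m2, ct, cr, ce):
    """Estimated C&D waste volumes (m3) from built area (m2) and the three
    model coefficients: CT (demolished), CR (wreckage), CE (packaging)."""
    return {
        'demolished_m3': ct * built_area_m2,
        'wreckage_m3':   cr * built_area_m2,
        'packaging_m3':  ce * built_area_m2,
    }

# Placeholder coefficients for a hypothetical 1000 m2 project (illustrative only).
vols = cd_waste_volumes(1000.0, ct=0.05, cr=0.09, ce=0.02)
total_m3 = sum(vols.values())
```

Estimating these volumes at the project stage, as the abstract stresses, is what lets the builder plan container sizes and disposal logistics before work begins.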

  12. Biomass to energy : GHG reduction quantification protocols and case study

    International Nuclear Information System (INIS)

    Reusing, G.; Taylor, C.; Nolan, W.; Kerr, G.

    2009-01-01

    With the growing concerns over greenhouses gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources is energy from waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions through the use of biomass to energy systems can be calculated using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass to energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory, environmental pressures, and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting, or in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass to energy facility offset projects. 10 refs., 2 tabs., 6 figs

  13. Overview of hybrid subspace methods for uncertainty quantification, sensitivity analysis

    International Nuclear Information System (INIS)

    Abdel-Khalik, Hany S.; Bang, Youngsuk; Wang, Congjian

    2013-01-01

    Highlights: ► We overview the state-of-the-art in uncertainty quantification and sensitivity analysis. ► We overview new developments in above areas using hybrid methods. ► We give a tutorial introduction to above areas and the new developments. ► Hybrid methods address the explosion in dimensionality in nonlinear models. ► Representative numerical experiments are given. -- Abstract: The role of modeling and simulation has been heavily promoted in recent years to improve understanding of complex engineering systems. To realize the benefits of modeling and simulation, concerted efforts in the areas of uncertainty quantification and sensitivity analysis are required. The manuscript intends to serve as a pedagogical presentation of the material to young researchers and practitioners with little background on the subjects. We believe this is important as the role of these subjects is expected to be integral to the design, safety, and operation of existing as well as next generation reactors. In addition to covering the basics, an overview of the current state-of-the-art will be given with particular emphasis on the challenges pertaining to nuclear reactor modeling. The second objective will focus on presenting our own development of hybrid subspace methods intended to address the explosion in the computational overhead required when handling real-world complex engineering systems.

  14. A critical view on microplastic quantification in aquatic organisms

    International Nuclear Information System (INIS)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J.J.; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  15. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
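The frequency-wavenumber step described above can be sketched on a synthetic 1D wavefield: one incident mode everywhere plus a second, larger-wavenumber "trapped" mode confined to a damage-like region. All frequencies, wavenumbers, and geometry below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Synthetic wavefield u(x, t): an incident guided-wave mode over the whole
# domain, plus a trapped mode with larger wavenumber confined to x > 0.3 m,
# mimicking a delamination region.
nx, nt = 256, 512
dx, dt = 2e-3, 1e-6                      # 2 mm spatial, 1 us temporal sampling
x = np.arange(nx) * dx
t = np.arange(nt) * dt
f0, k1, k2 = 200e3, 400.0, 900.0         # Hz, rad/m, rad/m (illustrative)
X, T = np.meshgrid(x, t, indexing='ij')
u = np.sin(k1 * X - 2 * np.pi * f0 * T)
damaged = x > 0.3
u[damaged, :] += 0.5 * np.sin(k2 * X[damaged, :] - 2 * np.pi * f0 * T[damaged, :])

# Frequency-wavenumber representation: 2D FFT over space and time.
U = np.fft.fftshift(np.fft.fft2(u))
k_axis = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(nx, dx))   # rad/m
f_axis = np.fft.fftshift(np.fft.fftfreq(nt, dt))               # Hz

# For sin(kx - wt), the +k component appears at temporal frequency -f0.
row = np.argmin(np.abs(f_axis + f0))
spectrum = np.abs(U[:, row])

# Two wavenumber peaks: the incident k1 and the new, damage-related k2.
pos = k_axis > 0
k_incident = k_axis[pos][np.argmax(spectrum[pos])]
far = k_axis > 650.0
k_trapped = k_axis[far][np.argmax(spectrum[far])]
```

Localizing where along x the second wavenumber appears (e.g. with a space-windowed transform rather than a single global FFT) is what gives the delamination's position and extent in the paper's approach.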

  16. Biomass to energy : GHG reduction quantification protocols and case study

    Energy Technology Data Exchange (ETDEWEB)

    Reusing, G.; Taylor, C. [Conestoga - Rovers and Associates, Waterloo, ON (Canada); Nolan, W. [Liberty Energy, Hamilton, ON (Canada); Kerr, G. [Index Energy, Ajax, ON (Canada)

    2009-07-01

    With the growing concerns over greenhouses gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil fuel energy sources. Among the fastest growing green energy sources is energy from waste facilities that use biomass that would otherwise be landfilled or stockpiled. The quantification of greenhouse gas reductions through the use of biomass to energy systems can be calculated using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of these quantification methodologies. A summary and comparison of biomass to energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory, environmental pressures, and public policy will continue to impact the practices associated with biomass processing or landfill operations, such as composting, or in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass to energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  17. Evaluation of semi-automatic arterial stenosis quantification

    International Nuclear Information System (INIS)

    Hernandez Hoyos, M.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Univ. de los Andes, Bogota; Serfaty, J.M.; Douek, P.C.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Hopital Cardiovasculaire et Pneumologique L. Pradel, Bron; Maghiar, A.; Mansard, C.; Orkisz, M.; Magnin, I.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne

    2006-01-01

    Object: To assess the accuracy and reproducibility of semi-automatic vessel axis extraction and stenosis quantification in 3D contrast-enhanced Magnetic Resonance Angiography (CE-MRA) of the carotid arteries (CA). Materials and methods: A total of 25 MRA datasets was used: 5 phantoms with known stenoses, and 20 patients (40 CAs) drawn from a multicenter trial database. Maracas software extracted vessel centerlines and quantified the stenoses, based on boundary detection in planes perpendicular to the centerline. Centerline accuracy was visually scored. Semi-automatic measurements were compared with: (1) theoretical phantom morphometric values, and (2) stenosis degrees evaluated by two independent radiologists. Results: Exploitable centerlines were obtained in 97% of CAs and in all phantoms. In phantoms, the software achieved better agreement with theoretical stenosis degrees (weighted kappa κ_W = 0.91) than the radiologists (κ_W = 0.69). In patients, agreement between software and radiologists varied from κ_W = 0.67 to 0.90. In both, Maracas was substantially more reproducible than the readers. Mean operating time was within 1 min/CA. Conclusion: Maracas software generates accurate 3D centerlines of vascular segments with minimum user intervention. Semi-automatic quantification of CA stenosis is also accurate, except in very severe stenoses that cannot be segmented. It substantially reduces the inter-observer variability. (orig.)
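The weighted-kappa agreement statistic used above can be computed as sketched below. The stenosis grades are hypothetical, and the quadratic weighting scheme is one common choice, not necessarily the one used in the study.

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_categories, scheme='quadratic'):
    """Cohen's weighted kappa for two raters over ordinal categories 0..n-1."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    observed = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        observed[i, j] += 1.0
    observed /= observed.sum()
    # Expected agreement under independence of the two raters' marginals.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    idx = np.arange(n_categories)
    w = np.abs(idx[:, None] - idx[None, :]).astype(float)
    if scheme == 'quadratic':
        w = w ** 2
    return 1.0 - (w * observed).sum() / (w * expected).sum()

# Hypothetical stenosis grades (0-3) for the software vs. one reader.
software = [0, 1, 2, 3, 2, 1, 0, 3, 2, 1]
reader   = [0, 1, 2, 3, 1, 1, 0, 2, 2, 1]
kw = weighted_kappa(software, reader, 4)
```

Unlike unweighted kappa, this statistic penalizes a grade-3 vs. grade-1 disagreement more than a grade-3 vs. grade-2 one, which suits ordinal stenosis categories.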

  18. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
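
As a minimal illustration of QMU bookkeeping when the estimate is interval-valued (a simple form of epistemic uncertainty), one can compute a margin-to-uncertainty ratio. The midpoint/half-width convention below is an assumption made for this sketch, not one of the presentation's notional examples:

```python
def qmu_ratio(threshold, estimate_interval):
    """Margin-over-uncertainty ratio with an interval-valued estimate.
    Hypothetical convention: margin is measured from the interval midpoint,
    uncertainty is taken as the interval half-width."""
    lo, hi = estimate_interval
    mid = 0.5 * (lo + hi)
    half_width = 0.5 * (hi - lo)
    margin = threshold - mid
    return margin / half_width

# threshold 10, quantity known only to lie in [6, 8]:
# margin = 10 - 7 = 3, uncertainty = 1, so M/U = 3
```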

  19. 3D histomorphometric quantification from 3D computed tomography

    International Nuclear Information System (INIS)

    Oliveira, L.F. de; Lopes, R.T.

    2004-01-01

    The histomorphometric analysis is based on stereologic concepts and was originally applied to biologic samples. This technique has been used to evaluate different complex structures such as ceramic filters, net structures and cancellous objects, that is, objects with inner connected structures. The measured histomorphometric parameters of structure are: sample volume to total reconstructed volume (BV/TV), sample surface to sample volume (BS/BV), connection thickness (Tb.Th), connection number (Tb.N) and connection separation (Tb.Sp). The anisotropy was evaluated as well. These parameters constitute the base of histomorphometric analysis. The quantification is realized over cross-sections recovered by cone beam reconstruction, where a real-time microfocus radiographic system is used as the tomographic system. The three-dimensional (3D) histomorphometry obtained from tomography corresponds to an evolution of the conventional method, which is based on 2D analysis, and is more coherent with the morphologic and topologic context of the sample. This work shows results from 3D histomorphometric quantification to characterize objects examined by 3D computed tomography. The results, which characterize the internal structures of ceramic foams with different porous densities, are compared to results from conventional methods.
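
The first two parameters listed above can be sketched for a binary voxel volume. The face-counting surface estimate and unit voxel size below are simplifying assumptions; practical tools use marching-cubes-style surface estimators:

```python
import numpy as np

def histomorphometry(vol):
    """BV/TV and BS/BV from a binary 3D array, counting exposed voxel faces
    as a crude surface estimate (unit voxel size assumed; a sketch only)."""
    vol = vol.astype(bool)
    bv = vol.sum()           # "bone" (sample) volume in voxels
    tv = vol.size            # total reconstructed volume in voxels
    padded = np.pad(vol, 1, constant_values=False)
    faces = 0
    for axis in range(3):
        a = np.swapaxes(padded, 0, axis)
        # count foreground/background transitions along this axis
        faces += np.sum(a[1:] & ~a[:-1]) + np.sum(~a[1:] & a[:-1])
    return {"BV/TV": bv / tv, "BS/BV": faces / bv}
```

A single voxel in a 3x3x3 volume has BV/TV = 1/27 and six exposed faces, so BS/BV = 6.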

  20. Mixture quantification using PLS in plastic scintillation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bagan, H.; Tarancon, A.; Rauret, G. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Garcia, J.F., E-mail: jfgarcia@ub.ed [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain)

    2011-06-15

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares; PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made with this purpose in mind using liquid scintillation (LS), none had used PS, which has the great advantage of not producing mixed waste after the measurements are performed. Following this objective, ternary mixtures of alpha and beta emitters (241Am, 137Cs and 90Sr/90Y) were quantified. The procedure optimisation evaluated the use of net spectra versus sample spectra, the inclusion of spectra obtained at different values of the Pulse Shape Analysis parameter, and the application of the PLS1 or PLS2 algorithms. The conclusions show that PS+PLS2 applied to the sample spectra, without any pulse shape discrimination, allows quantification of the activities with relative errors of less than 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and its application does not require detectors that include the pulse shape analysis parameter.

  1. A Spanish model for quantification and management of construction waste.

    Science.gov (United States)

    Solís-Guzmán, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramírez-de-Arellano, Antonio

    2009-09-01

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects.
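
The estimate described can be sketched as a coefficient-scaled sum over the built area. The coefficient values below are placeholders labeled as such, not the values fitted from the study's 100 dwelling projects:

```python
# Hypothetical per-m2 coefficients (m3 of waste per m2 built); the paper's
# fitted values are not reproduced here.
COEFFS = {"CT": 0.02,   # demolished volume coefficient
          "CR": 0.08,   # wreckage volume coefficient
          "CE": 0.04}   # packaging volume coefficient

def estimated_waste_volume(built_area_m2, coeffs=COEFFS):
    """Expected C&D waste volume (m3) for a new-construction project, as the
    sum of the three coefficient-scaled contributions."""
    return built_area_m2 * sum(coeffs.values())

print(estimated_waste_volume(1000))   # 140 m3 under the assumed coefficients
```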

  2. Sludge quantification at water treatment plant and its management scenario.

    Science.gov (United States)

    Ahmad, Tarique; Ahmad, Kafeel; Alam, Mehtab

    2017-08-15

    Large volumes of sludge are generated at water treatment plants during the purification of surface water for potable supplies. Handling and disposal of sludge require careful attention from civic bodies, plant operators, and environmentalists. Quantification of the sludge produced at the treatment plants is important to develop suitable management strategies for its economical and environmentally friendly disposal. The present study deals with the quantification of sludge using an empirical relation between turbidity, suspended solids, and coagulant dosing. Seasonal variation has a significant effect on the raw water quality received at the water treatment plants, and consequently sludge generation also varies. Yearly production of the sludge in a water treatment plant at Ghaziabad, India, is estimated to be 29,700 ton. Sustainable disposal of such a quantity of sludge is a challenging task under stringent environmental legislation. Several beneficial reuses of sludge in civil engineering and constructional work have been identified globally, such as raw material in manufacturing cement, bricks, and artificial aggregates, as cementitious material, and as sand substitute in preparing concrete and mortar. About 54 to 60% sand, 24 to 28% silt, and 16% clay constitute the sludge generated at the water treatment plant under investigation. Characteristics of the sludge are found suitable for its potential utilization as locally available construction material for safe disposal. An overview of the sustainable management scenario involving beneficial reuses of the sludge has also been presented.
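
An empirical relation of the kind described (linear in turbidity and coagulant dose) can be sketched as below. The coefficients are illustrative placeholders, not the ones fitted in the study:

```python
def daily_sludge_kg(flow_m3_per_day, turbidity_ntu, alum_dose_mg_per_l,
                    k_turbidity=1.5, k_coagulant=0.26):
    """Dry sludge solids (kg/day) from an assumed empirical linear relation
    between raw-water turbidity and coagulant dosing. k_turbidity converts
    NTU to mg/L of solids; both coefficients are hypothetical."""
    solids_mg_per_l = k_turbidity * turbidity_ntu + k_coagulant * alum_dose_mg_per_l
    # mg/L times m3/day gives g/day; divide by 1000 for kg/day
    return flow_m3_per_day * solids_mg_per_l / 1000.0
```

For a 100,000 m3/day plant at 50 NTU and 20 mg/L alum, this gives about 8 t/day, on the order of the thousands of tons per year cited above.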

  3. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
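
Since the abstract identifies gene length and total read count as factors that skew detected AR levels, a length- and depth-normalized abundance metric is the natural correction. The sketch below shows one common normalization ("coverage per million reads"), not necessarily the exact metric used in the paper:

```python
def normalized_abundance(mapped_reads, gene_length_bp, total_reads,
                         read_length_bp=100):
    """Length- and depth-normalized AR gene abundance: gene coverage divided
    by sequencing depth, scaled to 'per million reads'. read_length_bp=100
    is an assumed platform read length."""
    coverage = mapped_reads * read_length_bp / gene_length_bp
    return coverage / total_reads * 1e6
```

Without the gene-length term, a long AR gene would appear more abundant than a short one at equal copy number, which is one of the biases the paper flags.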

  4. Spectroscopic quantification of 5-hydroxymethylcytosine in genomic DNA.

    Science.gov (United States)

    Shahal, Tamar; Gilat, Noa; Michaeli, Yael; Redy-Keisar, Orit; Shabat, Doron; Ebenstein, Yuval

    2014-08-19

    5-Hydroxymethylcytosine (5hmC), a modified form of the DNA base cytosine, is an important epigenetic mark linked to regulation of gene expression in development and tumorigenesis. We have developed a spectroscopic method for a global quantification of 5hmC in genomic DNA. The assay is performed within a multiwell plate, which allows simultaneous recording of up to 350 samples. Our quantification procedure of 5hmC is direct, simple, and rapid. It relies on a two-step protocol that consists of enzymatic glucosylation of 5hmC with an azide-modified glucose, followed by a "click reaction" with an alkyne-fluorescent tag. The fluorescence intensity recorded from the DNA sample is proportional to its 5hmC content and can be quantified by a simple plate reader measurement. This labeling technique is specific and highly sensitive, allowing detection of 5hmC down to 0.002% of the total nucleotides. Our results reveal significant variations in the 5hmC content obtained from different mouse tissues, in agreement with previously reported data.
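
Because fluorescence is proportional to 5hmC content, sample readout reduces to inverting a linear calibration built from standards. The standard values below are invented for illustration, not data from the paper:

```python
import numpy as np

# Hypothetical plate-reader calibration: fluorescence of labeled standards
# with known 5hmC content (% of total nucleotides). Made-up counts.
std_pct = np.array([0.0, 0.005, 0.01, 0.02, 0.05])
std_fluor = np.array([120, 410, 705, 1290, 3010])

slope, intercept = np.polyfit(std_pct, std_fluor, 1)   # fluor = a*pct + b

def pct_5hmC(sample_fluor):
    """Invert the linear calibration to estimate a sample's 5hmC content."""
    return (sample_fluor - intercept) / slope
```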

  5. Quantification of 5-methyl-2'-deoxycytidine in the DNA.

    Science.gov (United States)

    Giel-Pietraszuk, Małgorzata; Insińska-Rak, Małgorzata; Golczak, Anna; Sikorski, Marek; Barciszewska, Mirosława; Barciszewski, Jan

    2015-01-01

    Methylation at position 5 of cytosine (Cyt) at CpG sequences, leading to formation of 5-methyl-cytosine (m(5)Cyt), is an important element of epigenetic regulation of gene expression. Modification of the normal methylation pattern, unique to each organism, leads to the development of pathological processes and diseases, including cancer. Therefore, quantification of DNA methylation and analysis of changes in the methylation pattern is very important from a practical point of view and can be used for diagnostic purposes, as well as for monitoring of treatment progress. In this paper we present a new method for quantification of 5-methyl-2'-deoxycytidine (m(5)C) in DNA. The technique is based on conversion of m(5)C into fluorescent 3,N(4)-etheno-5-methyl-2'-deoxycytidine (εm(5)C) and its identification by reversed-phase high-performance liquid chromatography (RP-HPLC). The assay was used to evaluate m(5)C concentration in DNA of calf thymus and peripheral blood of cows bred under different conditions. This approach can be applied to measure 5-methylcytosine in cellular DNA from different cells and tissues.

  6. Accurate quantification of supercoiled DNA by digital PCR

    Science.gov (United States)

    Dong, Lianhua; Yoo, Hee-Bong; Wang, Jing; Park, Sang-Ryoul

    2016-01-01

    Digital PCR (dPCR) as an enumeration-based quantification method is capable of quantifying the DNA copy number without the help of standards. However, it can generate false results when the PCR conditions are not optimized. A recent international comparison (CCQM P154) showed that most laboratories significantly underestimated the concentration of supercoiled plasmid DNA by dPCR. Mostly, supercoiled DNAs are linearized before dPCR to avoid such underestimations. The present study was conducted to overcome this problem. In the bilateral comparison, the National Institute of Metrology, China (NIM) optimized and applied dPCR for supercoiled DNA determination, whereas Korea Research Institute of Standards and Science (KRISS) prepared the unknown samples and quantified them by flow cytometry. In this study, several factors like selection of the PCR master mix, the fluorescent label, and the position of the primers were evaluated for quantifying supercoiled DNA by dPCR. This work confirmed that a 16S PCR master mix avoided poor amplification of the supercoiled DNA, whereas HEX labels on dPCR probe resulted in robust amplification curves. Optimizing the dPCR assay based on these two observations resulted in accurate quantification of supercoiled DNA without preanalytical linearization. This result was validated in close agreement (101~113%) with the result from flow cytometry. PMID:27063649
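
dPCR's enumeration-based quantification rests on the Poisson correction for partitions that received more than one template copy. A sketch of that calculation (the 0.85 nL partition volume is an assumption, instrument-specific, and not from the paper):

```python
import math

def copies_per_partition(negative_partitions, total_partitions):
    """Mean template copies per partition from the fraction of negative
    partitions, via the Poisson correction lambda = -ln(n_neg / n_total)."""
    return -math.log(negative_partitions / total_partitions)

def concentration_copies_per_uL(negative_partitions, total_partitions,
                                partition_volume_nL=0.85):
    """Concentration in copies/uL; the partition volume is hypothetical."""
    lam = copies_per_partition(negative_partitions, total_partitions)
    return lam / (partition_volume_nL * 1e-3)   # nL -> uL
```

With half the partitions negative, lambda = ln 2 ≈ 0.69 copies per partition, noticeably above the naive count of positives because of multiply-occupied partitions.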

  7. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    Science.gov (United States)

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.
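
Step (i) of the pipeline above can be illustrated in 2D: enhance bright curvilinear structures via the smaller eigenvalue of the Gaussian-smoothed Hessian. This is a simplified sketch; the paper works on 3D tomograms and follows with tensor voting, neither of which is shown here:

```python
import numpy as np
from scipy import ndimage

def ridge_enhance(img, sigma=2.0):
    """Hessian-based enhancement of bright ridges in a 2D image (a sketch
    of Hessian filtering, not the paper's full 3D implementation)."""
    Hxx = ndimage.gaussian_filter(img, sigma, order=(0, 2))
    Hyy = ndimage.gaussian_filter(img, sigma, order=(2, 0))
    Hxy = ndimage.gaussian_filter(img, sigma, order=(1, 1))
    # Eigenvalues of [[Hxx, Hxy], [Hxy, Hyy]] at each pixel.
    tmp = np.sqrt((Hxx - Hyy) ** 2 + 4 * Hxy ** 2)
    lam2 = 0.5 * (Hxx + Hyy - tmp)   # strongly negative across a bright ridge
    return np.maximum(-lam2, 0)      # large response on filaments

img = np.zeros((64, 64))
img[32, :] = 1.0                     # a bright horizontal filament
resp = ridge_enhance(img)            # peaks along row 32
```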

  8. A practical method for accurate quantification of large fault trees

    International Nuclear Information System (INIS)

    Choi, Jong Soo; Cho, Nam Zin

    2007-01-01

    This paper describes a practical method to accurately quantify top event probability and importance measures from incomplete minimal cut sets (MCS) of a large fault tree. The MCS-based fault tree method is extensively used in probabilistic safety assessments. Several sources of uncertainties exist in MCS-based fault tree analysis. The paper is focused on quantification of the following two sources of uncertainties: (1) the truncation neglecting low-probability cut sets and (2) the approximation in quantifying MCSs. The method proposed in this paper is based on a Monte Carlo simulation technique to estimate probability of the discarded MCSs and the sum of disjoint products (SDP) approach complemented by the correction factor approach (CFA). The method provides capability to accurately quantify the two uncertainties and estimate the top event probability and importance measures of large coherent fault trees. The proposed fault tree quantification method has been implemented in the CUTREE code package and is tested on the two example fault trees
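
The Monte Carlo idea underlying the method can be sketched for a coherent fault tree given its minimal cut sets: sample basic-event states and count trials in which at least one cut set is fully failed. This illustrates the principle only, not the paper's SDP/CFA machinery or its treatment of truncated cut sets:

```python
import random

def top_event_probability(cut_sets, basic_probs, trials=200_000, seed=1):
    """Monte Carlo estimate of P(top) for a coherent fault tree from its
    minimal cut sets (a sketch; hypothetical event names)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        state = {e: rng.random() < p for e, p in basic_probs.items()}
        if any(all(state[e] for e in cs) for cs in cut_sets):
            hits += 1
    return hits / trials

# Two MCS over three basic events; exact answer:
# P(AB or C) = 0.1*0.2 + 0.05 - 0.1*0.2*0.05 = 0.069
probs = {"A": 0.1, "B": 0.2, "C": 0.05}
est = top_event_probability([{"A", "B"}, {"C"}], probs)
```

Unlike the rare-event (sum-of-MCS-probabilities) approximation, the sampled estimate does not double-count overlapping cut sets.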

  9. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameters distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture via the adjoint solution the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what is referred to as the 'active' responses which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-sections variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameters uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  10. Unilateral condylar hyperplasia: a 3-dimensional quantification of asymmetry.

    Directory of Open Access Journals (Sweden)

    Tim J Verhoeven

    Full Text Available PURPOSE: Objective quantifications of facial asymmetry in patients with Unilateral Condylar Hyperplasia (UCH) have not yet been described in the literature. The aim of this study was to objectively quantify soft-tissue asymmetry in patients with UCH and to compare the findings with a control group using a new method. MATERIAL AND METHODS: Thirty 3D photographs of patients diagnosed with UCH were compared with 30 3D photographs of healthy controls. As UCH presents particularly in the mandible, a new method was used to isolate the lower part of the face to evaluate asymmetry of this part separately. The new method was validated by two observers using 3D photographs of five patients and five controls. RESULTS: A significant difference (0.79 mm) in whole-face asymmetry between patients and controls was found. Intra- and inter-observer differences of 0.011 mm (-0.034 to 0.011) and 0.017 mm (-0.007 to 0.042), respectively, were found. These differences are irrelevant in clinical practice. CONCLUSION: After objective quantification, a significant difference was identified in soft-tissue asymmetry between patients with UCH and controls. The method used to isolate mandibular asymmetry was found to be valid and a suitable tool to evaluate facial asymmetry.
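
One plausible way to quantify soft-tissue asymmetry from a 3D photograph is the mean mirror distance: reflect the point cloud across the midsagittal plane and average nearest-neighbour distances. This is a sketch of that generic metric under the assumption that the face is already aligned to x = 0; the paper's registration and mandible-isolation pipeline is more involved:

```python
import numpy as np
from scipy.spatial import cKDTree

def asymmetry_mm(points):
    """Mean mirror distance of a 3D facial point cloud (N x 3, mm): reflect
    across the assumed midsagittal plane x = 0 and average nearest-neighbour
    distances. Zero for a perfectly symmetric cloud."""
    mirrored = points * np.array([-1.0, 1.0, 1.0])
    d, _ = cKDTree(mirrored).query(points)
    return d.mean()
```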

  11. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRA lacks formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  12. Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions

    Science.gov (United States)

    Brumble, K. C.

    2014-12-01

    Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods like reproducibility, independence, and straightforward observation are complicated by representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed for the correlation of diverse, large, and messy data sets necessitates the explicit quantification of the uncertainties stemming from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets, and as intermediary steps for statistical experimentation.

  13. Within-day repeatability for absolute quantification of Lawsonia intracellularis bacteria in feces from growing pigs

    DEFF Research Database (Denmark)

    Pedersen, Ken Steen; Pedersen, Klaus H.; Hjulsager, Charlotte Kristiane

    2012-01-01

    Absolute quantification of Lawsonia intracellularis by real-time polymerase chain reaction (PCR) is now possible on a routine basis. Poor repeatability of quantification can result in disease status misclassification of individual pigs when a single fecal sample is obtained. The objective...

  14. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  15. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    Science.gov (United States)

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variability due to upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
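
The isotope-dilution arithmetic behind PSAQ-SRM is a simple ratio: the endogenous ("light") peptide is read off the known amount of spiked labeled ("heavy") standard via the SRM peak-area ratio. A generic sketch (peak areas and spike amount are hypothetical values):

```python
def quantify_by_psaq(area_light, area_heavy, spiked_fmol):
    """Isotope-dilution quantification: endogenous amount equals the spiked
    heavy-standard amount scaled by the light/heavy SRM peak-area ratio."""
    return spiked_fmol * area_light / area_heavy

# e.g. 100 fmol heavy standard and a light/heavy area ratio of 0.42
# gives 42 fmol of the endogenous protein
```

Because the full-length labeled protein is spiked before digestion, losses during sample handling and incomplete trypsin digestion affect light and heavy forms alike and cancel in the ratio, which is the method's central design choice.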

  16. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations

    Energy Technology Data Exchange (ETDEWEB)

    Schryvers, D., E-mail: nick.schryvers@uantwerpen.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Salje, E.K.H. [Department of Earth Sciences, University of Cambridge, Cambridge CB2 3EQ (United Kingdom); Nishida, M. [Department of Engineering Sciences for Electronics and Materials, Faculty of Engineering Sciences, Kyushu University, Kasuga, Fukuoka 816-8580 (Japan); De Backer, A. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Idrissi, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université Catholique de Louvain, Place Sainte Barbe, 2, B-1348, Louvain-la-Neuve (Belgium); Van Aert, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2017-05-15

    The present contribution gives a review of recent quantification work of atom displacements, atom site occupations and level of crystallinity in various systems and based on aberration corrected HR(S)TEM images. Depending on the case studied, picometer range precisions for individual distances can be obtained, boundary widths at the unit cell level determined or statistical evolutions of fractions of the ordered areas calculated. In all of these cases, these quantitative measures imply new routes for the applications of the respective materials. - Highlights: • Quantification of picometer displacements at ferroelastic twin boundary in CaTiO3. • Quantification of kinks in meandering ferroelectric domain wall in LiNbO3. • Quantification of column occupation in anti-phase boundary in Co-Pt. • Quantification of atom displacements at twin boundary in Ni-Ti B19′ martensite.

  17. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Abstract Both relative and absolute quantifications are possible in species quantification when single copy genomic DNA is used. However, amplification of single copy genomic DNA does not allow a limit of detection as low as one obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification as the number of repetitive sequences is unknown. A promising approach was developed where data from amplification of repetitive sequences were used in relative quantification of species ... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.

  18. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology described would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae in the amount of 2.5 fg/ng of lettuce leaf DNA at 21 days following plant inoculation.
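
The quantification described (a fungal target read off a standard curve, normalized against a plant reference gene) can be sketched as below. The calibration points and Ct values are invented for illustration; the paper's actual calibration data are not shown in the abstract:

```python
import numpy as np

# Hypothetical standard curve: Ct values for known amounts of V. dahliae
# DNA (fg per ng of plant DNA), exactly one Ct step of 3.4 per decade.
std_fg = np.array([2.5e4, 2.5e3, 2.5e2, 2.5e1, 2.5])
std_ct = np.array([18.1, 21.5, 24.9, 28.3, 31.7])

slope, intercept = np.polyfit(np.log10(std_fg), std_ct, 1)

def fungal_dna_fg(sample_ct, ref_ct, ref_ct_calibrator):
    """Interpolate a sample Ct on the standard curve after shifting it by
    the actin-reference offset to normalize for template input (an assumed
    normalization scheme)."""
    ct = sample_ct - (ref_ct - ref_ct_calibrator)
    return 10 ** ((ct - intercept) / slope)
```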

  19. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid base extraction. The extract is analysed by GC-MS, without the need of derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollar a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
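
Quantitative 1H NMR against an internal standard rests on per-proton integral ratios scaled by molar masses. The generic relation is sketched below; it is not the paper's exact protocol, and all numeric inputs are hypothetical:

```python
def qhnmr_mg_per_ml(integral_analyte, integral_std, n_h_analyte, n_h_std,
                    conc_std_mg_per_ml, mw_analyte, mw_std):
    """qHNMR concentration of an analyte from the per-proton integral ratio
    against an internal standard of known concentration, converted back to
    mass units via the molar-mass ratio."""
    molar_ratio = (integral_analyte / n_h_analyte) / (integral_std / n_h_std)
    return molar_ratio * (conc_std_mg_per_ml / mw_std) * mw_analyte
```

Equal per-proton integrals and equal molar masses recover the standard's concentration unchanged, which is a quick sanity check on the formula.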

  20. Quantification of anti-Leishmania antibodies in saliva of dogs.

    Science.gov (United States)

    Cantos-Barreda, Ana; Escribano, Damián; Bernal, Luis J; Cerón, José J; Martínez-Subiela, Silvia

    2017-08-15

    Detection of serum anti-Leishmania antibodies by quantitative or qualitative techniques has been the most used method to diagnose Canine Leishmaniosis (CanL). Nevertheless, saliva may represent an alternative to blood because it is easy to collect, painless and non-invasive in comparison with serum. In this study, two time-resolved immunofluorometric assays (TR-IFMAs) for quantification of anti-Leishmania IgG2 and IgA antibodies in saliva were developed and validated, and their ability to distinguish Leishmania-seronegative from seropositive dogs was evaluated. The analytical study was performed by evaluation of assay precision, sensitivity and accuracy. In addition, serum from 48 dogs (21 Leishmania-seropositive and 27 Leishmania-seronegative) was analyzed by TR-IFMAs. The assays were precise, with intra- and inter-assay coefficients of variation lower than 11%, and showed a high level of accuracy, as determined by linearity under dilution (R2 = 0.99) and recovery tests (>88.60%). Anti-Leishmania IgG2 antibodies in saliva were significantly higher in the seropositive group than in the seronegative group, whereas no differences in anti-Leishmania IgA antibodies were observed between the groups. Furthermore, TR-IFMA for quantification of anti-Leishmania IgG2 antibodies in saliva showed larger differences between seropositive and seronegative dogs than the commercial assay used in serum. In conclusion, the TR-IFMAs developed may be used to quantify anti-Leishmania IgG2 and IgA antibodies in canine saliva with adequate precision, analytical sensitivity and accuracy. Quantification of anti-Leishmania IgG2 antibodies in saliva could potentially be used to evaluate the humoral response in CanL. However, IgA in saliva seemed not to have diagnostic value for this disease. For future studies, it would be desirable to evaluate the ability of the IgG2 assay to detect dogs with subclinical disease or with low antibody titers in serum and also to study the antibodies behaviour in saliva during the

  1. EKSEKUSI HUKUMAN MATI (The Execution of the Death Penalty): A Review from the Perspectives of Maqāṣid al-Sharī'ah and Justice

    OpenAIRE

    Imam Yahya

    2013-01-01

    The debate about the death penalty still attracts public attention. There are at least two mainstream positions: those who agree with the death penalty and those who reject it. Those who agree reason that severe violations of the right to life should be punished by death, so as to provide a deterrent effect, while those who reject it argue that the death penalty is itself a denial of human rights, especially the right to life. The essence of the death penalty is not a violation ...

  2. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Full Text Available Abstract Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was
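The standard-curve approach underlying this abstract can be illustrated with a short sketch: amplification efficiency is derived from the slope of a Cq versus log10(template copies) regression as E = 10^(-1/slope) - 1, where E = 1.0 corresponds to perfect doubling each cycle. This is a minimal illustration, not the authors' code; the function name and example values are hypothetical.

```python
import numpy as np

def pcr_efficiency(log10_copies, cq):
    """Fit a qPCR standard curve (Cq vs. log10 template copies) and return
    the slope, R^2, and amplification efficiency E = 10^(-1/slope) - 1."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    predicted = slope * np.asarray(log10_copies) + intercept
    ss_res = np.sum((np.asarray(cq) - predicted) ** 2)
    ss_tot = np.sum((np.asarray(cq) - np.mean(cq)) ** 2)
    r_squared = 1 - ss_res / ss_tot
    efficiency = 10 ** (-1 / slope) - 1
    return slope, r_squared, efficiency

# Hypothetical ten-fold dilution series; a slope near -3.32 indicates ~100%
# efficiency, which is what "similar PCR efficiency" comparisons check for.
logs = [6, 5, 4, 3, 2]
cqs = [15.1, 18.4, 21.7, 25.0, 28.3]
slope, r2, eff = pcr_efficiency(logs, cqs)
```

Comparing the efficiency of a sample dilution series against the reference-material curve in this way is one common check that sample and standard amplify comparably.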

  3. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to

  4. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary

  5. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Full Text Available Due to their chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (non-chromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Although A. mearnsii presented the best results, the bark of M. tenuiflora also showed great potential, due to its considerable tannin content and the availability of the species in the Caatinga biome.

  6. Quantification of interfacial segregation by analytical electron microscopy

    CERN Document Server

    Muellejans, H

    2003-01-01

    The quantification of interfacial segregation by spatial difference and one-dimensional profiling is presented in general, with special attention given to the random and systematic uncertainties. The method is demonstrated for the example of Al-Al2O3 interfaces in a metal-ceramic composite material, investigated by energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy in a dedicated scanning transmission electron microscope. The variation of segregation measured at different interfaces by both methods is within the uncertainties, indicating a constant segregation level and interfacial phase formation. The most important random uncertainty is the counting statistics of the impurity signal, whereas the specimen thickness introduces systematic uncertainties (via the k factor and the effective scan width). The latter could be significantly reduced if the specimen thickness were determined explicitly. (orig.)

  7. Imaging and Quantification of Extracellular Vesicles by Transmission Electron Microscopy.

    Science.gov (United States)

    Linares, Romain; Tan, Sisareuth; Gounou, Céline; Brisson, Alain R

    2017-01-01

    Extracellular vesicles (EVs) are cell-derived vesicles that are present in blood and other body fluids. EVs raise major interest for their diverse physiopathological roles and their potential biomedical applications. However, the characterization and quantification of EVs constitute major challenges, mainly due to their small size and the lack of methods adapted for their study. Electron microscopy has made significant contributions to the EV field since their initial discovery. Here, we describe the use of two transmission electron microscopy (TEM) techniques for imaging and quantifying EVs. Cryo-TEM combined with receptor-specific gold labeling is applied to reveal the morphology, size, and phenotype of EVs, while their enumeration is achieved after high-speed sedimentation on EM grids.

  8. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load-carrying structures are subjected to dynamic, time-varying loading conditions. A robust structural dynamics identification procedure imposes tight constraints on the quality of modal models. This paper aims at a systematic approach for uncertainty quantification of the parameters of modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the uncertainty measure of the entire modal model. The investigated structures represent different complexity levels, ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.

  9. Quantification methodology for the French 900 MW PWR PRA

    International Nuclear Information System (INIS)

    Ducamp, F.; Lanore, J.M.; Duchemin, B.; De Villeneuve, M.J.

    1985-02-01

    This paper develops some improvements to the classical approach to risk assessment. The calculation of the contribution to risk of one particular sequence of an event tree is composed of four stages: creation of a fault tree, in terms of component faults, for each system which appears in the event trees; simplification of these fault trees into smaller ones, in terms of macrocomponents; creation of one ''super-tree'' by regrouping the fault trees of the down systems (systems which fail in the sequence) under an AND gate, and calculation of the minimal cut sets of this super-tree, taking into account the up systems (systems that do not fail in the sequence) and any peculiarities related to the initiating event; and quantification of the minimal cut sets so obtained, taking into account the duration of the scenario depicted by the sequence and the possibilities of repair. Each of these steps is developed in this article.
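The super-tree step described in this abstract, combining the fault trees of the down systems under an AND gate and extracting minimal cut sets, can be sketched as follows. This is a toy illustration with hypothetical event names, not the PRA code the paper describes; real tools use far more efficient algorithms.

```python
from itertools import product

# A fault tree node is either a basic event (a string) or a gate:
# a ("AND" | "OR", [children]) pair.
def cut_sets(node):
    """Expand a fault tree into (possibly non-minimal) cut sets."""
    if isinstance(node, str):
        return [{node}]
    op, children = node
    child_sets = [cut_sets(c) for c in children]
    if op == "OR":
        # Any child's cut set fails the gate.
        return [s for cs in child_sets for s in cs]
    # AND: one cut set from each child must occur; take unions of combinations.
    return [set().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    """Keep only minimal cut sets (discard any superset of another set)."""
    sets = sorted(sets, key=len)
    minimal = []
    for s in sets:
        if not any(m <= s for m in minimal):
            minimal.append(s)
    return minimal

# Hypothetical sequence with two down systems, joined under an AND gate.
sys_a = ("OR", ["pump1", ("AND", ["valve1", "valve2"])])
sys_b = ("OR", ["pump1", "power"])
super_tree = ("AND", [sys_a, sys_b])
mcs = minimize(cut_sets(super_tree))
```

Note how the shared basic event `pump1` makes {pump1} a single-event minimal cut set of the super-tree, exactly the kind of common-cause contribution the quantification stage must then weight by scenario duration and repair possibilities.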

  10. Quantification of fluorescence angiography in a porcine model

    DEFF Research Database (Denmark)

    Nerup, Nikolaj; Andersen, Helene Schou; Ambrus, Rikard

    2017-01-01

    PURPOSE: There is no consensus on how to quantify indocyanine green (ICG) fluorescence angiography. The aim of the present study was to establish and gather validity evidence for a method of quantifying fluorescence angiography to assess organ perfusion. METHODS: Laparotomy was performed on seven pigs, with two regions of interest (ROIs) marked. ICG and neutron-activated microspheres were administered, and the stomach was illuminated in the near-infrared range, in parallel with continuous recording of the fluorescence signal. Tissue samples from the ROIs were sent for quantification of microspheres to calculate the regional blood flow. A software system was developed to assess the fluorescence recordings quantitatively, and each quantitative parameter was compared with the regional blood flow. The parameter with the strongest correlation was then compared with results from an independently developed...

  11. A critical view on microplastic quantification in aquatic organisms

    DEFF Research Database (Denmark)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species, ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques has been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus...

  12. Raman spectroscopy for DNA quantification in cell nucleus.

    Science.gov (United States)

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantifying DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering and avoids the problem of non-stoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of a phosphate mode at 1096 cm⁻¹. When compared to the known DNA standards from cells of different animals, our results matched those values with an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, and to measure DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate. © 2014 International Society for Advancement of Cytometry.

  13. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger [Univ. of Southern California, Los Angeles, CA (United States)

    2017-04-18

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST was to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  14. Automated quantification of emphysema in CT studies of the lung

    International Nuclear Information System (INIS)

    Archer, D.C.; deKemp, R.A.; Coblentz, C.L.; Nahmias, C.

    1991-01-01

    Emphysema is, by definition, a pathologic diagnosis. Recently, in vivo quantification of emphysema from CT, with point counting and with a GE 9800 CT scanner program called Density Mask, has been described. These methods are laborious and time-consuming, making them unsuitable for screening. The purpose of this paper is to create a screening test for emphysema. The authors developed a computer program that quantifies the amount of emphysema from standard CT scans. The computer was programmed to: recognize the lung edges on each section by identifying abrupt changes in CT numbers; grow regions within each lung to identify and separate the lungs from other structures; count regions of lung containing CT numbers measuring <-900 HU, corresponding to areas of emphysema; and calculate the percentage of emphysema present from the volumes of normal and emphysematous lung. The programs were written in C and run on a Sun 4/100 workstation.
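The core of the thresholding step described above, counting lung voxels below -900 HU and expressing them as a percentage of the segmented lung, can be sketched as follows. The original program was written in C; this NumPy version with toy data is only an illustration, and the lung mask here stands in for the edge-detection and region-growing steps.

```python
import numpy as np

def emphysema_percentage(ct_hu, lung_mask, threshold=-900):
    """Percentage of lung voxels below the emphysema threshold (in HU)."""
    lung_voxels = ct_hu[lung_mask]
    return 100.0 * np.count_nonzero(lung_voxels < threshold) / lung_voxels.size

# Toy 4x4 "slice" in Hounsfield units; the left two columns play the role
# of the segmented lung (8 voxels, 2 of which fall below -900 HU).
hu = np.array([[-950, -800, 40, 50],
               [-920, -850, 30, 20],
               [-700, -600, 10,  0],
               [-500, -400, -10, -20]])
mask = np.zeros_like(hu, dtype=bool)
mask[:, :2] = True
pct = emphysema_percentage(hu, mask)
```

On the toy data the result is 25%, i.e. 2 of the 8 masked lung voxels lie below the -900 HU cutoff.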

  15. Capacitive immunosensor for C-reactive protein quantification

    KAUST Repository

    Sapsanis, Christos

    2015-08-02

    We report an agglutination-based immunosensor for the quantification of C-reactive protein (CRP). The developed immunoassay sensor requires approximately 15 minutes of assay time per sample and provides a sensitivity of 0.5 mg/L. We measured the capacitance of interdigitated electrodes (IDEs) and quantified the concentration of added analyte. The proposed method is label-free and hence provides the rapid measurement preferable in diagnostics. We have so far been able to quantify concentrations as low as 0.5 mg/L and as high as 10 mg/L. By quantifying CRP in serum, we can assess whether patients are prone to cardiac diseases and monitor the risk associated with such diseases. The sensor is a simple, low-cost structure, and it can be a promising device for rapid and sensitive detection of disease markers at the point of care.

  16. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    Science.gov (United States)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, through real-time image processing that monitors the Rayleigh scattering from the beads. A significant difference in the velocity of the beads was observed in the presence of as few as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, which can be useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies for the target biomarkers.

  17. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  18. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    Science.gov (United States)

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are recognized as a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber counts have been found to be underestimated with this approach. We propose modifications to these methods that allow microplastics, including small fibers, to be analyzed in bottom sediments. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported by neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
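The quality-control arithmetic implied by this abstract, extraction efficiency from a spiked internal standard and concentrations expressed as items per kg dry weight, can be sketched as follows. The numbers are hypothetical, and the efficiency-correction step is an assumption for illustration, not a stated part of the described protocol.

```python
def recovery_efficiency(spiked, recovered):
    """Extraction efficiency (%) from internal-standard particle counts."""
    return 100.0 * recovered / spiked

def items_per_kg_dw(items_counted, sample_mass_g_dw, efficiency_pct):
    """Microplastic concentration in items/kg dry weight, corrected for
    the measured extraction efficiency (illustrative assumption)."""
    raw = items_counted / (sample_mass_g_dw / 1000.0)  # items per kg DW
    return raw / (efficiency_pct / 100.0)

# Hypothetical run: 46 of 50 spiked standard particles recovered (92%),
# 17 microplastic items counted in 500 g dry sediment.
eff = recovery_efficiency(spiked=50, recovered=46)
conc = items_per_kg_dw(17, 500.0, eff)
```

With these numbers the raw count of 34 items/kg DW becomes roughly 37 items/kg DW after efficiency correction, illustrating why reporting the recovery of the internal standard alongside each batch matters.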

  19. Quantification of osteolytic bone lesions in a preclinical rat trial

    Science.gov (United States)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most of the patients who die have developed bone metastases during disease progression. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand their pathogenesis and to analyse the response to different treatments, animal models, in our case rats, are examined. For assessment of treatment response to bone remodelling therapies, exact segmentations of osteolytic lesions are needed. Manual segmentations are not only time-consuming but also lack reproducibility. Computerized segmentation tools are therefore essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The presented qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete the missing bone in a reasonable way.

  20. Thermostability of biological systems: fundamentals, challenges, and quantification.

    Science.gov (United States)

    He, Xiaoming

    2011-01-01

    This review examines the fundamentals of, and challenges in, engineering and understanding the thermostability of biological systems over a wide temperature range (from the cryogenic to the hyperthermic regime). Applications of bio-thermostability engineering to either destroy unwanted or stabilize useful biologicals for the treatment of diseases in modern medicine are first introduced. Studies on the biological responses to cryogenic and hyperthermic temperatures for the various applications are reviewed to understand the mechanisms of thermal (both cryogenic and hyperthermic) injury and its quantification at the molecular, cellular and tissue/organ levels. Methods for quantifying the thermophysical processes of the various applications are then summarized, accounting for the effects of blood perfusion, metabolism, water transport across the cell plasma membrane, and phase transitions (both equilibrium and non-equilibrium, such as ice formation and glass transition) of water. The review concludes with a summary of the status quo and future perspectives in engineering the thermostability of biological systems.

  1. Aspect-Oriented Programming is Quantification and Obliviousness

    Science.gov (United States)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.

  2. Uncertainty quantification in computational fluid dynamics and aircraft engines

    CERN Document Server

    Montomoli, Francesco; D'Ammaro, Antonio; Massini, Michela; Salvadori, Simone

    2015-01-01

    This book introduces novel design techniques developed to increase the safety of aircraft engines. The authors demonstrate how the application of uncertainty methods can overcome problems in the accurate prediction of engine lift, caused by manufacturing error. This in turn ameliorates the difficulty of achieving required safety margins imposed by limits in current design and manufacturing methods. This text shows that even state-of-the-art computational fluid dynamics (CFD) are not able to predict the same performance measured in experiments; CFD methods assume idealised geometries but ideal geometries do not exist, cannot be manufactured and their performance differs from real-world ones. By applying geometrical variations of a few microns, the agreement with experiments improves dramatically, but unfortunately the manufacturing errors in engines or in experiments are unknown. In order to overcome this limitation, uncertainty quantification considers the probability density functions of manufacturing errors...

  3. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  4. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    Full Text Available OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification, and the relationship between visual indices (VI) and a fluorescence camera (FC) in detecting plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque, with and without disclosing, was assessed using VI. Images were obtained with the FC and a digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare the different conditions of the samples and to assess inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of the area covered by disclosed plaque in the FC images presented the highest discriminatory power. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.
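Inter-examiner reproducibility of categorical plaque scores, as assessed with Kappa tests in this study, can be computed with Cohen's kappa; below is a minimal sketch with hypothetical scores, not the study's data.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same items on a
    categorical scale: (observed - chance agreement) / (1 - chance)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two hypothetical examiners scoring the same 10 blocks on a 0-3 plaque index.
examiner1 = [0, 1, 1, 2, 2, 3, 3, 0, 1, 2]
examiner2 = [0, 1, 2, 2, 2, 3, 3, 0, 1, 1]
kappa = cohens_kappa(examiner1, examiner2)
```

Here the raters agree on 8 of 10 blocks (observed agreement 0.80, chance agreement 0.26), giving kappa ≈ 0.73, which is conventionally read as substantial agreement.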

  5. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.

  6. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments, using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Finally, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and show how these UQ techniques can be used in model selection and in assessing the quality of calibrated physical parameters.
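One simple way to realize such a model (a sketch of the general idea, not the authors' formulation) is to treat the plasticity parameters as random variables and propagate them through a deterministic constitutive law, yielding an ensemble of responses whose mean recovers the traditional model. All distributions and values below are hypothetical:

```python
import random

def stress_at(eps, sy, hardening, youngs=200_000.0):
    """Elastic/linear-hardening response (MPa); plastic branch beyond yield."""
    eps_y = sy / youngs
    return youngs * eps if eps <= eps_y else sy + hardening * (eps - eps_y)

rng = random.Random(0)
# Hypothetical parameter scatter standing in for material variability
samples = [(rng.gauss(350.0, 25.0), rng.gauss(1500.0, 200.0)) for _ in range(500)]
ensemble = [stress_at(0.05, sy, h) for sy, h in samples]
mean_response = sum(ensemble) / len(ensemble)
spread = max(ensemble) - min(ensemble)
```

The ensemble spread, not just the mean curve, is what a variability-aware design methodology would carry forward into performance predictions.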

  7. Study on quantification of HBs-antibody by immunoradiometric assay

    International Nuclear Information System (INIS)

    Kondo, Yuichi; Itoi, Yoshihiro; Kajiyama, Shizuo

    1989-01-01

    Quantification of HBs-antibody was carried out using a commercialized assay kit and standard solutions of HBs-antibody recognised as the 1st WHO reference preparation of hepatitis B immunoglobulin. The standard curve of HBs-antibody was drawn with a 3D-spline function, and the correlation coefficient obtained was r = 0.999. The intra-assay coefficient of variation was 3.8% and the inter-assay coefficient was 7.8%. Dilution tests showed satisfactory results over the 2-16-fold range. The correlation between cut-off index values and HBs-antibody concentration was described by the formula y = 2.599x - 3.894 (r = 0.992), and a cut-off index of 2.1 corresponded to about 5 mIU/ml of HBs-antibody. (author)

  8. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    Science.gov (United States)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
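The Monte Carlo idea described above can be sketched with a toy scalar retrieval: simulate noisy radiances from a known state, apply a retrieval that carries a deliberate model error, and inspect the resulting sampling distribution of the estimates. The forward model, gain error, and state value are all hypothetical:

```python
import random
import statistics

def forward(x):
    """Toy 'true physics': radiance as a linear function of state x."""
    return 2.0 * x + 1.0

def retrieve(y):
    """Toy retrieval with a slightly wrong gain, mimicking model error."""
    return (y - 1.0) / 1.9

rng = random.Random(7)
x_true = 400.0  # hypothetical geophysical state
estimates = [retrieve(forward(x_true) + rng.gauss(0.0, 2.0)) for _ in range(5000)]
bias = statistics.mean(estimates) - x_true  # systematic error from model mismatch
spread = statistics.stdev(estimates)        # noise-driven scatter
```

Separating the bias (from imperfect physics) and the spread (from measurement noise) is exactly the kind of information the sampling distribution provides that a single retrieval cannot.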

  9. Nanodiamond arrays on glass for quantification and fluorescence characterisation.

    Science.gov (United States)

    Heffernan, Ashleigh H; Greentree, Andrew D; Gibson, Brant C

    2017-08-23

    Quantifying the variation in emission properties of fluorescent nanodiamonds is important for developing their wide-ranging applicability. Directed self-assembly techniques show promise for positioning nanodiamonds precisely enabling such quantification. Here we show an approach for depositing nanodiamonds in pre-determined arrays which are used to gather statistical information about fluorescent lifetimes. The arrays were created via a layer of photoresist patterned with grids of apertures using electron beam lithography and then drop-cast with nanodiamonds. Electron microscopy revealed a 90% average deposition yield across 3,376 populated array sites, with an average of 20 nanodiamonds per site. Confocal microscopy, optimised for nitrogen vacancy fluorescence collection, revealed a broad distribution of fluorescent lifetimes in agreement with literature. This method for statistically quantifying fluorescent nanoparticles provides a step towards fabrication of hybrid photonic devices for applications from quantum cryptography to sensing.

  10. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    Science.gov (United States)

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits set by the national authority for the waste disposal facility. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository, but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or by statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
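A minimal sketch of the idea of propagating a random chemical composition through an activity calculation. The activation coefficients and lognormal parameters below are hypothetical placeholders, not CERN's values:

```python
import random

# Hypothetical activation coefficients: Bq per ppm of each trace element
COEFF = {"Co": 50.0, "Ni": 5.0, "Fe": 0.1}
MU = {"Co": 0.0, "Ni": 1.0, "Fe": 4.0}  # assumed lognormal location parameters
rng = random.Random(1)

def sample_activity():
    """One Monte Carlo draw: random composition -> total specific activity."""
    return sum(COEFF[el] * rng.lognormvariate(MU[el], 0.5) for el in COEFF)

draws = sorted(sample_activity() for _ in range(10_000))
lo, hi = draws[250], draws[9750]  # approximate 95% uncertainty interval
```

The resulting interval, rather than a point estimate, is what gets compared against the repository acceptance limits.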

  11. Measuring Identification and Quantification Errors in Spectral CT Material Decomposition

    Directory of Open Access Journals (Sweden)

    Aamir Younis Raja

    2018-03-01

    Full Text Available Material decomposition methods are used to identify and quantify multiple tissue components in spectral CT, but there is no published method to quantify the misidentification of materials. This paper describes a new method for assessing misidentification and mis-quantification in spectral CT. We scanned a phantom containing gadolinium (1, 2, 4, 8 mg/mL), hydroxyapatite (54.3, 211.7, 808.5 mg/mL), water and vegetable oil using a MARS spectral scanner equipped with a poly-energetic X-ray source operated at 118 kVp and a CdTe Medipix3RX camera. Two imaging protocols were used: one with and one without a 0.375 mm external brass filter. A proprietary material decomposition method identified voxels as gadolinium, hydroxyapatite, lipid or water. Sensitivity and specificity information was used to evaluate material misidentification. Biological samples were also scanned. There were marked differences in identification and quantification between the two protocols, even though the spectral and linear correlation of gadolinium and hydroxyapatite in the reconstructed images was high and no qualitative segmentation differences in the material decomposed images were observed. At 8 mg/mL, gadolinium was correctly identified for both protocols, but its concentration was underestimated by over half for the unfiltered protocol. At 1 mg/mL, gadolinium was misidentified in 38% of voxels for the filtered protocol and 58% of voxels for the unfiltered protocol. Hydroxyapatite was correctly identified at the two higher concentrations for both protocols, but mis-quantified for the unfiltered protocol. Gadolinium concentration as measured in the biological specimen showed a two-fold difference between protocols. In future, this methodology could be used to compare and optimize scanning protocols, image reconstruction methods, and methods for material differentiation in spectral CT.
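Per-material sensitivity and specificity can be computed directly from voxel labels. A small self-contained example with made-up truth/prediction labels (not the paper's data):

```python
def sensitivity_specificity(truth, predicted, material):
    """Per-material confusion counts from paired voxel labels."""
    tp = sum(t == material and p == material for t, p in zip(truth, predicted))
    fn = sum(t == material and p != material for t, p in zip(truth, predicted))
    tn = sum(t != material and p != material for t, p in zip(truth, predicted))
    fp = sum(t != material and p == material for t, p in zip(truth, predicted))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical voxel labels from a decomposed image
truth     = ["Gd", "Gd", "Gd", "HA", "HA", "water", "water", "lipid"]
predicted = ["Gd", "HA", "Gd", "HA", "HA", "water", "Gd",    "lipid"]
sens, spec = sensitivity_specificity(truth, predicted, "Gd")  # 2/3 and 4/5
```

Running this per material and per protocol yields exactly the kind of misidentification comparison the paper reports (e.g., the fraction of low-concentration gadolinium voxels misassigned).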

  12. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.
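A drastically simplified illustration of the underlying idea (not the BP-Quant Bayesian model itself): represent each peptide by a statistical signature across conditions, find the dominant signature for a protein, and flag peptides that fall outside it as possible distinct proteoforms:

```python
from collections import Counter

# Hypothetical peptide signatures across three condition contrasts:
# +1 = significantly up, 0 = flat, -1 = significantly down
peptide_patterns = {
    "pep1": (1, 1, 0),
    "pep2": (1, 1, 0),
    "pep3": (1, 1, 0),
    "pep4": (-1, 0, 1),  # deviates -> candidate distinct proteoform
}
dominant, _ = Counter(peptide_patterns.values()).most_common(1)[0]
outliers = [p for p, sig in peptide_patterns.items() if sig != dominant]
```

In the real method the signatures come from formal hypothesis tests and the grouping is done within a Bayesian model; the sketch only conveys why separating discordant peptides improves the protein-level abundance estimate.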

  13. A critical view on microplastic quantification in aquatic organisms

    Energy Technology Data Exchange (ETDEWEB)

    Vandermeersch, Griet, E-mail: griet.vandermeersch@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Van Cauwenberghe, Lisbeth; Janssen, Colin R. [Ghent University, Laboratory of Environmental Toxicology and Aquatic Ecology, Environmental Toxicology Unit (GhEnToxLab), Jozef Plateaustraat 22, 9000 Ghent (Belgium); Marques, Antonio [Division of Aquaculture and Upgrading (DivAV), Portuguese Institute for the Sea and Atmosphere (IPMA), Avenida de Brasília s/n, 1449-006 Lisboa (Portugal); Granby, Kit [Technical University of Denmark, National Food Institute, Mørkhøj Bygade 19, 2860 Søborg (Denmark); Fait, Gabriella [Aeiforia Srl, 29027 Gariga di Podenzano (PC) (Italy); Kotterman, Michiel J.J. [Institute for Marine Resources and Ecosystem Studies (IMARES), Wageningen University and Research Center, Ijmuiden (Netherlands); Diogène, Jorge [Institut de la Recerca i Tecnologia Agroalimentàries (IRTA), Ctra. Poble Nou km 5,5, Sant Carles de la Ràpita E-43540 (Spain); Bekaert, Karen; Robbens, Johan [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Devriese, Lisa, E-mail: lisa.devriese@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium)

    2015-11-15

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  14. Quantification of rutile in anatase by X-ray diffraction

    International Nuclear Information System (INIS)

    Chavez R, A.

    2001-01-01

    Nowadays, the search for new and better materials required in all areas of industry has led researchers into this small yet vast world. Crystalline materials have markedly directional properties, and quantitative analysis of such materials is not an easy task. The main objective of this work is the investigation of a real problem, its solution, and the refinement of a technique involving the theoretical and experimental principles that allow the quantification of crystalline phases. Chapter 1 reviews the study of the crystalline state over the last century by means of the X-ray diffraction technique. Chapter 2 covers the nature and production of X-rays, and chapter 3 expounds the principles of the diffraction technique, which applies when the Bragg law is satisfied, studying the powder diffraction method and its applications. Chapter 4 explains how the intensities of the diffracted beams are determined by the atomic positions within the unit cell of the crystal. The properties of the crystalline samples of anatase and rutile are described in chapter 5. The results of this analysis are processed by means of the auxiliary software Diffrac AT, Axum and Peakfit, as well as the TAFOR and CUANTI programs; this part is described in more detail in chapters 6 and 7, where the function of each program is explained step by step up to the quantification of crystalline phases, the objective of this work. Finally, chapter 8 presents the analysis of results and the conclusions. This work is intended as a contribution for educational institutions with limited resources, which can tackle the characterization of materials in this way. (Author)
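For the anatase/rutile system specifically, the XRD literature commonly uses the empirical Spurr-Myers relation between the integrated intensities of the strongest anatase (101) and rutile (110) reflections. Whether this thesis uses exactly this relation is not stated, so treat the following as an illustrative sketch:

```python
def rutile_weight_fraction(i_anatase, i_rutile):
    """Spurr-Myers relation: W_R = 1 / (1 + 0.8 * I_A / I_R), where I_A and
    I_R are integrated intensities of the anatase (101) and rutile (110)
    peaks; the constant 0.8 is empirical."""
    return 1.0 / (1.0 + 0.8 * i_anatase / i_rutile)

# Equal peak intensities do NOT mean a 50/50 mixture by weight:
w_r = rutile_weight_fraction(1000.0, 1000.0)  # ~0.556 rutile by weight
```

The example shows why raw intensity ratios must be calibrated: the two phases scatter with different efficiencies, which is what the empirical constant absorbs.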

  15. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways. For example, an error may be made when subtracting two numbers that are very close to each other, or during the summation of many numbers of very different magnitude, etc. The basic objective of this paper is to find a procedure that eliminates errors made by a PC when calculations close to the error limit are executed. A highly reliable system is represented by a directed acyclic graph composed of terminal nodes, i.e. highly reliable input elements, internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, including both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm for the exact summation of such numbers is designed in the paper. The summation procedure uses a special number system with base 2³². The computational efficiency of the new computing methodology is compared with advanced simulation software, and various calculations on systems from the references are performed to emphasize the merits of the methodology.
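The failure mode targeted here is easy to reproduce: naive floating-point summation silently loses small addends next to large ones. Python's stdlib `math.fsum` (an error-compensated summation, not the paper's base-2³² scheme) illustrates one remedy; a signed example is used because it makes the loss visible in a single line:

```python
import math

# Summing numbers of very different magnitude: naive float addition absorbs
# the 1.0 into 1e16 and then cancels it away entirely.
values = [1e16, 1.0, -1e16]
naive = sum(values)        # the small term is lost
exact = math.fsum(values)  # tracks all partial rounding errors
```

In the reliability context the analogous loss happens when tiny unavailability contributions are added to much larger combinatorial terms, which is why an exact-summation scheme is needed at all.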

  16. A novel immunological assay for hepcidin quantification in human serum.

    Directory of Open Access Journals (Sweden)

    Vasiliki Koliaraki

    Full Text Available BACKGROUND: Hepcidin is a 25-amino-acid cysteine-rich iron-regulating peptide. Increased hepcidin concentrations lead to iron sequestration in macrophages, contributing to the pathogenesis of anaemia of chronic disease, whereas decreased hepcidin is observed in iron deficiency and primary iron overload diseases such as hereditary hemochromatosis. Hepcidin quantification in human blood or urine may provide further insights into the pathogenesis of disorders of iron homeostasis and might prove a valuable tool for clinicians for the differential diagnosis of anaemia. This study describes a specific and non-operator-demanding immunoassay for hepcidin quantification in human sera. METHODS AND FINDINGS: An ELISA assay was developed for measuring hepcidin serum concentration using a recombinant hepcidin25-His peptide and a polyclonal antibody against this peptide, which was able to identify native hepcidin. The ELISA assay had a detection range of 10-1500 microg/L and a detection limit of 5.4 microg/L. The intra- and interassay coefficients of variation ranged from 8-15% and 5-16%, respectively. Mean linearity and recovery were 101% and 107%, respectively. Mean hepcidin levels were significantly lower in 7 patients with juvenile hemochromatosis (12.8 microg/L) and 10 patients with iron deficiency anemia (15.7 microg/L), and higher in 7 patients with Hodgkin lymphoma (116.7 microg/L), compared to 32 age-matched healthy controls (42.7 microg/L). CONCLUSIONS: We describe a new simple ELISA assay for measuring hepcidin in human serum with sufficient accuracy and reproducibility.

  17. Quantification of the genetic risk of environmental mutagens

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1988-01-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit dose. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimation depends on the assumption of persistence of the induced mutations and the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of the current incidence of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method, the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For the verification of these quantifications one can use the data of Hiroshima and Nagasaki. According to the estimation with the direct method, one would expect less than 1 radiation-induced dominant cataract in 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.
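In its simplest equilibrium form, the indirect (doubling dose) estimate scales the current incidence of genetic disorders by the ratio of dose to doubling dose. A sketch with entirely hypothetical numbers, ignoring the refinements (mutational component, persistence) the abstract discusses:

```python
def indirect_risk(current_incidence, dose_sv, doubling_dose_sv):
    """Doubling-dose (indirect) method, equilibrium form: extra cases in
    proportion to dose / doubling dose, scaled by the current incidence."""
    return current_incidence * dose_sv / doubling_dose_sv

# Hypothetical: incidence 10,000 per 10^6 births, dose 0.25 Sv, DD = 1 Sv
extra_cases = indirect_risk(10_000, 0.25, 1.0)  # per 10^6 births
```

The direct method replaces this population-incidence scaling with measured induction rates per locus in mice, which is why the two approaches can be used to cross-check one another.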

  18. Quantification and localization of mast cells in periapical lesions.

    Science.gov (United States)

    Mahita, V N; Manjunatha, B S; Shah, R; Astekar, M; Purohit, S; Kovvuru, S

    2015-01-01

    Periapical lesions occur in response to chronic irritation of periapical tissue, generally resulting from an infected root canal. The specific etiological agents of induction, the participating cell populations and the growth factors associated with maintenance and resolution of periapical lesions are incompletely understood. Among the cells found in periapical lesions, mast cells have been implicated in the inflammatory mechanism. Hence, this study emphasizes the localization and quantification of mast cells in periapical granuloma and radicular cyst, and their possible role in these lesions. A total of 30 previously diagnosed cases from the department of oral pathology (15 periapical granuloma and 15 radicular cyst), along with their case details, were selected for the study. The gender distribution was 8 males (53.3%) and 7 females (46.7%) in the periapical granuloma cases, and 10 males (66.7%) and 5 females (33.3%) in the radicular cyst cases. The statistical analysis used was the unpaired t-test. The mean mast cell counts in periapical granuloma were 12.40 (0.99%) in subepithelial and 7.13 (0.83%) in deeper connective tissue, respectively. The mean mast cell counts in subepithelial and deeper connective tissue of radicular cyst were 17.64 (1.59%) and 12.06 (1.33%), respectively, which was statistically significant. No statistically significant difference was noted between males and females. Mast cells were more numerous in radicular cysts. Based on the concept that mast cells play a critical role in the induction of inflammation, it is logical to use therapeutic agents that alter mast cell function and secretion to thwart inflammation at its earliest phases. These findings may suggest a possible role of mast cells in the pathogenesis of periapical lesions.
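The unpaired t-test used for such count comparisons can be sketched with Welch's statistic on hypothetical mast-cell counts (the study's raw per-case counts are not given in the abstract):

```python
import statistics

def welch_t(a, b):
    """Welch's unpaired t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical per-case mast cell counts
granuloma = [12.1, 12.4, 12.8, 12.3, 12.5]
cyst      = [17.2, 17.9, 17.5, 17.8, 17.6]
t = welch_t(cyst, granuloma)  # large positive t: cysts have more mast cells
```

A t value this far from zero, with the associated degrees of freedom, corresponds to a very small p-value, matching the reported statistical significance.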

  19. Quantification of regional fat volume in rat MRI

    Science.gov (United States)

    Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren

    2003-05-01

    Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch-up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. 
The quality of automatic segmentation has been
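The interactive region growing mentioned among the semi-automatic tools can be sketched as a breadth-first flood fill with an intensity tolerance. The toy 3x3 image and tolerance below are hypothetical:

```python
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from seed, adding 4-neighbours whose intensity lies
    within tol of the seed intensity."""
    h, w = len(img), len(img[0])
    ref = img[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(img[nr][nc] - ref) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

img = [[10, 11, 50],
       [12, 10, 55],
       [51, 52, 53]]
fat = region_grow(img, (0, 0), tol=5)  # connected low-intensity region only
```

In the interactive setting, the operator supplies the seed and tolerance and the algorithm does the tedious voxel-by-voxel delineation, which is what improves throughput over fully manual outlining.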

  20. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
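The stoichiometric conversion at the heart of ICPMS-based absolute quantification is simple: convert the measured element concentration to molar units, then divide by the number of atoms of that element per protein molecule. The sulfur numbers below are hypothetical:

```python
def protein_conc_uM(element_ug_per_L, atomic_mass_g_per_mol, atoms_per_protein):
    """Absolute protein concentration from a hetero-element measured by ICPMS,
    assuming a fixed, known stoichiometry of the element in the protein."""
    element_uM = element_ug_per_L / atomic_mass_g_per_mol  # (ug/L)/(g/mol) = umol/L
    return element_uM / atoms_per_protein

# Hypothetical: 6.4 ug/L sulfur (~32 g/mol), 4 S atoms per protein molecule
c = protein_conc_uM(6.4, 32.0, 4)  # 0.2 uM sulfur -> 0.05 uM protein
```

The same arithmetic underlies element-tagging strategies: the tag fixes the stoichiometry when the protein itself carries no convenient hetero-element.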

  1. Biventricular MR volumetric analysis and MR flow quantification in the ascending aorta and pulmonary trunk for quantification of valvular regurgitation

    International Nuclear Information System (INIS)

    Rominger, M.B.

    2004-01-01

    Purpose: To test the value of biventricular volumetric analysis and the combination of biventricular volumetric analysis with flow quantification in the ascending aorta (Ao) and pulmonary trunk (Pu) for quantification of regurgitation volume and cardiac function in valvular regurgitation (VR) according to location and presence of single or multivalvular disease. Materials and Methods: In 106 patients, the stroke volumes were assessed by measuring the biventricular volumes and the forward-stroke volumes in the great and small circulation by measuring the flow in the Ao and Pu. Valve regurgitation volumes and quotients were calculated for single and multivalvular disease and correlated with semiquantitative 2D-echocardiography (grade I-IV). For the assessment of the cardiac function in VR, the volumetric parameters of ejection fraction and end-diastolic (EDV) and end-systolic (ESV) volumes were determined. Results: The detection rate was 49% for left ventricular (LV) VR and 42% for right ventricular (RV) VR. Low LV VR and RV VR usually could not be detected quantitatively, with the detection rate improving with echocardiographically higher insufficiency grades. Quantitative MRI could detect a higher grade solitary aortic valve insufficiency (≥2) in 11 of 12 patients and higher grade mitral valve insufficiency in 4 of 10 patients. A significant increase in RV and LV ventricular EDV and ESV was seen more often with increased MR regurgitation volumes. Aortic stenosis did not interfere with flow measurements in the Ao. Conclusions: Biventricular volumetry combined with flow measurements in Ao and Pu is a robust, applicable and simple method to assess higher grade regurgitation volumes and the cardiac function in single and multivalvular regurgitation at different locations. It is an important application for the diagnosis of VR by MRI. [de]
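The core arithmetic combining volumetry and flow can be sketched as follows, with hypothetical volumes; a real analysis must additionally apportion the difference among valves in multivalvular disease:

```python
def regurgitation(lv_edv, lv_esv, aortic_forward_ml):
    """Regurgitant volume/fraction from ventricular volumetry (EDV, ESV)
    plus phase-contrast forward flow in the ascending aorta, all in mL."""
    stroke = lv_edv - lv_esv             # total LV stroke volume
    regurg = stroke - aortic_forward_ml  # volume that leaked backwards
    return regurg, regurg / stroke

# Hypothetical patient: EDV 180 mL, ESV 80 mL, forward aortic flow 70 mL
vol, frac = regurgitation(lv_edv=180.0, lv_esv=80.0, aortic_forward_ml=70.0)
# stroke volume 100 mL, regurgitant volume 30 mL, regurgitant fraction 0.30
```

This also shows why low-grade regurgitation is hard to detect quantitatively: the regurgitant volume is a small difference between two large measured quantities, each with its own error.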

  2. Improved quantification of farnesene during microbial production from Saccharomyces cerevisiae in two-liquid-phase fermentations

    DEFF Research Database (Denmark)

    Tippmann, Stefan; Nielsen, Jens; Khoomrung, Sakda

    2016-01-01

    Organic solvents are widely used in microbial fermentations to reduce gas stripping effects and capture hydrophobic or toxic compounds. Reliable quantification of biochemical products in these overlays is highly challenging and practically difficult. Here, we present a significant improvement...... carryover could be minimized. Direct quantification of farnesene in dodecane was achieved by GC-FID, whereas GC-MS proved to be an excellent technique for identification of known and unknown metabolites. GC-FID is a suitable technique for direct quantification of farnesene in complex matrices...

  3. Quantification of rice bran oil in oil blends

    Directory of Open Access Journals (Sweden)

    Mishra, R.

    2012-03-01

    Full Text Available Blends consisting of physically refined rice bran oil (PRBO): sunflower oil (SnF) and PRBO: safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods including gas chromatography, HPLC, ultrasonic velocity and methods based on physico-chemical parameters. The physicochemical parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content reflected significant changes with increased proportions of PRBO in the blended oils. These parameters were selected as dependent parameters and the % PRBO proportion was selected as the independent parameter. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance and iodine value can be used for the quantification of rice bran oil in blended oils. Rice bran oil can easily be quantified in blended oils based on the oryzanol content by HPLC, even at the 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level, whereas the methods based on ultrasonic velocity, acoustic impedance and relative association showed initial promise for the quantification of rice bran oil.


  4. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is a major trend in many countries to reduce excessive conservatism. A key feature of this BE evaluation is that the licensee must quantify the uncertainty of the calculations, so it is very important how the uncertainty distributions are determined before conducting the uncertainty evaluation. Uncertainties include those of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or taken from reference documents of the computer code. In this respect, more mathematical methods are needed to determine the uncertainty ranges reasonably. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, lie inside the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experiments, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  5. Development of hydrate risk quantification in oil and gas production

    Science.gov (United States)

    Chaudhari, Piyush N.

    … in order to reduce the time required for parametric studies using The Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady-state and transient operations, considering the effect of several critical parameters, such as oil-hydrate slip, duration of shut-in, and water droplet size, on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  6. Aerosol-type retrieval and uncertainty quantification from OMI data

    Science.gov (United States)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model
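    The Bayesian model averaging step described above can be illustrated with a toy calculation: Gaussian AOD posteriors from competing aerosol models are mixed with weights proportional to their model evidences. All numbers below are made up for illustration and are not OMI retrieval output:

```python
import numpy as np

# Made-up marginal likelihoods (model evidences) for three candidate LUT models
evidence = np.array([0.8, 0.5, 0.1])
weights = evidence / evidence.sum()      # posterior model probabilities (equal priors)

# Made-up per-model Gaussian AOD posteriors
post_mean = np.array([0.42, 0.50, 0.65])
post_var = np.array([0.01, 0.02, 0.05])

# BMA mixture: averaged mean, and variance via the law of total variance
aod_bma = np.sum(weights * post_mean)
var_bma = np.sum(weights * (post_var + post_mean**2)) - aod_bma**2

print(f"BMA AOD = {aod_bma:.3f} +/- {np.sqrt(var_bma):.3f}")
```

Note that the mixture variance includes both the within-model posterior variances and the spread between the model means, which is how model-selection uncertainty enters the final AOD error bar.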

  7. Aerosol-type retrieval and uncertainty quantification from OMI data

    Directory of Open Access Journals (Sweden)

    A. Kauppi

    2017-11-01

    Full Text Available We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the

  8. Computer-assisted radiological quantification of rheumatoid arthritis

    International Nuclear Information System (INIS)

    Peloschek, P.L.

    2000-03-01

    The specific objective was to develop the layout and structure of a platform for effective quantification of rheumatoid arthritis (RA). A fully operational stand-alone Java application (RheumaCoach) was developed to support the efficiency of the scoring process in RA (Web address: http://www.univie.ac.at/radio/radio.htm). The potential users of such a program are physicians enrolled in clinical trials to evaluate the course of RA and its modulation with drug therapies, and scientists developing new scoring modalities. The software RheumaCoach consists of three major modules. The Tutorial starts with 'Rheumatoid Arthritis', to teach the basic pathology of the disease; the section 'Imaging Standards' then explains how to produce proper radiographs; 'Principles - How to Use the Larsen Score', 'Radiographic Findings' and 'Quantification by Scoring' explain the requirements for unbiased scoring of RA. The Data Input Sheet was designed to follow the radiologist's approach to analysing films, as published previously. At the Compute Sheet the calculated Larsen score may be compared with former scores, and the further options (calculate, export, print, send) are easily accessible. In a first pre-clinical study the system was tested in an unstructured evaluation. Two structured evaluations followed (30 fully documented and blinded cases of RA; four radiologists scored hands and feet with or without the RheumaCoach), and between the evaluations the software was continuously improved. For all readers, use of the RheumaCoach accelerated the procedure; altogether, scoring without computer assistance needed about 20% more time. Availability of the programme via the internet provides common access for potential quality control in multi-centre studies. Documentation of results in a specifically designed printout improves communication between radiologists and rheumatologists. The possibilities of direct export to other programmes and electronic

  9. Quantification of liver fat in the presence of iron overload.

    Science.gov (United States)

    Horng, Debra E; Hernando, Diego; Reeder, Scott B

    2017-02-01

    To evaluate the accuracy of R2* models (R2* = 1/T2*) for chemical shift-encoded magnetic resonance imaging (CSE-MRI)-based proton density fat-fraction (PDFF) quantification in patients with fatty liver and iron overload, using MR spectroscopy (MRS) as the reference standard. Two Monte Carlo simulations were implemented to compare the root-mean-squared-error (RMSE) performance of single-R2* and dual-R2* correction in a theoretical liver environment with high iron. Fatty liver was defined as hepatic PDFF >5.6% based on MRS; only subjects with fatty liver were considered for analyses involving fat. From a group of 40 patients with known or suspected iron overload, nine patients with fatty liver were identified at 1.5T and 13 at 3.0T. MRS linewidth measurements were used to estimate R2* values for the water and fat peaks. PDFF was measured from CSE-MRI data using single-R2* and dual-R2* correction with magnitude and complex fitting. Spectroscopy-based R2* analysis demonstrated that the R2* values of water (R2*W) and fat (R2*F) remain close, both increasing as iron overload increases: linear regression between R2*W and R2*F resulted in slope = 0.95 [0.79-1.12] (95% limits of agreement) at 1.5T and slope = 0.76 [0.49-1.03] at 3.0T. MRI-PDFF using dual-R2* correction had severe artifacts. MRI-PDFF using single-R2* correction had good agreement with MRS-PDFF: Bland-Altman analysis resulted in -0.7% (bias) ± 2.9% (95% limits of agreement) for magnitude fitting and -1.3% ± 4.3% for complex fitting at 1.5T, and -1.5% ± 8.4% for magnitude fitting and -2.2% ± 9.6% for complex fitting at 3.0T. Single-R2* modeling enables accurate PDFF quantification, even in patients with iron overload. J. Magn. Reson. Imaging 2017;45:428-439. © 2016 International Society for Magnetic Resonance in Medicine.
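    The Bland-Altman statistics quoted above (bias plus 95% limits of agreement, i.e. bias ± 1.96 SD of the paired differences) take only a few lines to compute. The paired PDFF values below are synthetic, chosen only to show the arithmetic:

```python
import numpy as np

# Synthetic paired PDFF measurements (%), MRI vs. the MRS reference
mri_pdff = np.array([6.1, 8.4, 12.0, 15.2, 20.5, 25.1])
mrs_pdff = np.array([6.8, 9.0, 12.9, 16.0, 21.8, 26.0])

diff = mri_pdff - mrs_pdff
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits of agreement

print(f"bias = {bias:.2f}%, limits of agreement = "
      f"[{bias - loa:.2f}%, {bias + loa:.2f}%]")
```

Bland-Altman plots additionally chart `diff` against the pairwise means `(mri_pdff + mrs_pdff) / 2` to reveal any magnitude-dependent disagreement.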

  10. Quantification of glucosylceramide in plasma of Gaucher disease patients

    Directory of Open Access Journals (Sweden)

    Maria Viviane Gomes Muller

    2010-12-01

    Full Text Available Gaucher disease is a sphingolipidosis that leads to an accumulation of glucosylceramide. The objective of this study was to develop a methodology, based on the extraction, purification and quantification of glucosylceramide from blood plasma, for use in clinical research laboratories. The glucosylceramide content in plasma from Gaucher disease patients, with or without enzyme replacement therapy, was also compared against that from normal individuals. The glucosylceramide, separated from other glycosphingolipids by high-performance thin-layer chromatography (HPTLC), was chemically stained (CuSO4/H3PO4) and the respective band confirmed by immunostaining (human anti-glucosylceramide antibody / peroxidase-conjugated secondary antibody). Chromatogram quantification by densitometry demonstrated that the glucosylceramide content in Gaucher disease patients was seventeen times higher than that in normal individuals, and seven times higher than that in patients on enzyme replacement therapy. The results indicate that the established methodology can be used in complementary diagnosis and for treatment monitoring of Gaucher disease patients.

  11. Preclinical imaging characteristics and quantification of Platinum-195m SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Aalbersberg, E.A.; Wit-van der Veen, B.J. de; Vegt, E.; Vogel, Wouter V. [The Netherlands Cancer Institute (NKI-AVL), Department of Nuclear Medicine, Amsterdam (Netherlands); Zwaagstra, O.; Codee-van der Schilden, K. [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands)

    2017-08-15

    In vivo biodistribution imaging of platinum-based compounds may allow better patient selection for treatment with chemo(radio)therapy. Radiolabeling with platinum-195m (195mPt) allows SPECT imaging without altering the chemical structure or biological activity of the compound. We have assessed the feasibility of 195mPt SPECT imaging in mice, with the aim of determining the image quality and accuracy of quantification for current preclinical imaging equipment. Enriched (>96%) 194Pt was irradiated in the High Flux Reactor (HFR) in Petten, The Netherlands (NRG). A 0.05 M HCl 195mPt solution with a specific activity of 33 MBq/mg was obtained. Image quality was assessed for the NanoSPECT/CT (Bioscan Inc., Washington DC, USA) and U-SPECT+/CT (MILabs BV, Utrecht, the Netherlands) scanners. A radioactivity-filled rod phantom (rod diameter 0.85-1.7 mm) filled with 1 MBq 195mPt was scanned with different acquisition durations (10-120 min). Four healthy mice were injected intravenously with 3-4 MBq 195mPt. Mouse images were acquired with the NanoSPECT for 120 min at 0, 2, 4, or 24 h after injection. Organs were delineated to quantify 195mPt concentrations. Immediately after scanning, the mice were sacrificed, and the platinum concentration was determined in organs using a gamma counter and graphite furnace atomic absorption spectroscopy (GF-AAS) as reference standards. A 30-min acquisition of the phantom provided visually adequate image quality for both scanners. The smallest visible rods were 0.95 mm in diameter on the NanoSPECT and 0.85 mm in diameter on the U-SPECT+. The image quality in mice was visually adequate. Uptake was seen in the kidneys with excretion to the bladder, and in the liver, blood, and intestine. No uptake was seen in the brain. The Spearman correlation between SPECT and gamma counter was 0.92, between SPECT and GF-AAS it was 0.84, and between GF-AAS and gamma counter it was 0.97 (all p < 0

  12. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large, commercially written finite-difference simulators solving conservation equations that describe the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and updating of the prior beliefs about the most likely model definitions. Optimization problems for highly parametric physical models usually have multiple solutions, which affect the uncertainty of the predictions made. A stochastic search algorithm (e.g. a genetic algorithm) allows multiple "good enough" models to be identified in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. A machine learning algorithm, artificial neural networks (ANNs), is used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework: as direct time

  13. Collaborative framework for PIV uncertainty quantification: the experimental database

    International Nuclear Information System (INIS)

    Neal, Douglas R; Sciacchitano, Andrea; Scarano, Fulvio; Smith, Barton L

    2015-01-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  14. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    Science.gov (United States)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  15. Optofluidic interferometry chip designs of differential NIR absorbance based sensors for identification and quantification of electrolytes

    NARCIS (Netherlands)

    Steen, Gerrit W.; Wexler, Adam D.; Offerhaus, Herman L.

    2014-01-01

    Design and optimization of integrated photonic NIR absorbance based sensors for identification and quantification of aqueous electrolytes was performed by simulation in MATLAB and Optodesigner. Ten designs are presented and compared for suitability.

  16. Quantification of allochthonous nutrient input into freshwater bodies by herbivorous waterbirds

    NARCIS (Netherlands)

    Hahn, S.M.; Bauer, S.; Klaassen, M.R.J.

    2008-01-01

    1. Waterbirds are considered to import large quantities of nutrients to freshwater bodies but quantification of these loadings remains problematic. We developed two general models to calculate such allochthonous nutrient inputs considering food intake, foraging behaviour and digestive performance of

  17. A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.

    Science.gov (United States)

    John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192). We have developed a simple, mild extraction procedure using methanol which, when...

  18. Quantification of collateral flow in humans: a comparison of angiographic, electrocardiographic and hemodynamic variables

    NARCIS (Netherlands)

    van Liebergen, R. A.; Piek, J. J.; Koch, K. T.; de Winter, R. J.; Schotborgh, C. E.; Lie, K. I.

    1999-01-01

    Evaluation of collateral vascular circulation according to hemodynamic variables and its relation to myocardial ischemia. There is limited information regarding the hemodynamic quantification of recruitable collateral vessels. Angiography of the donor coronary artery was performed before and during

  19. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Greg [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Wohlwend, Jen [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  20. Quantification of the potential for biogas and biogas manure from the ...

    African Journals Online (AJOL)

    Thomas

    2013-09-04

    Sep 4, 2013 ... This wasted energy material is equivalent to 9000 L of diesel fuel that currently would cost 9389 ... Key words: Biogas potential, fruit waste, quantification, prediction, biogas manure. ... For example, consumption of fruits and.

  1. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.; Schulz, Karl W.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently

  2. Activity quantification of phantom using dual-head SPECT with two-view planar image

    International Nuclear Information System (INIS)

    Guo Leiming; Chen Tao; Sun Xiaoguang; Huang Gang

    2005-01-01

    The absorbed radiation dose from internally deposited radionuclides is a major factor in assessing risk and therapeutic utility in nuclear medicine diagnosis and treatment. Quantification of absolute activity in vivo is a necessary step in estimating the absorbed dose of an organ or tissue. To assess accuracy in the determination of organ activity, experiments on 99mTc activity quantification were performed on a body phantom using dual-head SPECT with the two-view counting technique. The activity quantification is accurate and is not affected by the depth of the source organ in vivo. When the diameter of the radiation source is ≤2 cm, the most accurate activity quantification is obtained once the system calibration factor and transmission factor have been established. The use of Buijs's method is preferable, especially at very low source-to-background activity concentration ratios. (authors)
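    The two-view counting technique referred to above is commonly implemented as the conjugate-view (geometric mean) method: activity is the geometric mean of the opposed anterior/posterior count rates, corrected by a measured transmission factor and divided by a system calibration factor. The sketch below uses this standard formula with invented numbers (function name and values are illustrative, not from the study):

```python
import math

def conjugate_view_activity(i_ant, i_post, transmission, calib):
    """Conjugate-view estimate: A = sqrt(I_ant * I_post / T) / C.

    i_ant, i_post -- background-corrected count rates (counts/s) from the
                     anterior and posterior views
    transmission  -- measured transmission factor T = exp(-mu * L) through
                     the body at the source location
    calib         -- system calibration factor C (counts/s per MBq in air)
    """
    return math.sqrt(i_ant * i_post / transmission) / calib

# Illustrative numbers only
A = conjugate_view_activity(i_ant=1200.0, i_post=900.0, transmission=0.25, calib=50.0)
print(f"estimated activity = {A:.1f} MBq")
```

The geometric mean makes the estimate approximately independent of source depth, which is why depth of the source organ does not degrade the quantification.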

  3. An overview of quantification methods in energy-dispersive X-ray ...

    Indian Academy of Sciences (India)

    methods for thin samples, samples with intermediate thickness and thick ... algorithms and quantification methods based on scattered primary radiation. ... technique for in situ characterization of materials such as contaminated soil, archaeo-.

  4. Evaluation of two autoinducer-2 quantification methods for application in marine environments

    KAUST Repository

    Wang, Tian-Nyu; Kaksonen, Anna H.; Hong, Pei-Ying

    2018-01-01

    This study evaluated two methods, namely high performance liquid chromatography with fluorescence detection (HPLC-FLD) and Vibrio harveyi BB170 bioassay, for autoinducer-2 (AI-2) quantification in marine samples. Using both methods, the study also

  5. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved with a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
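    qPCR-based library quantification of this kind typically back-calculates the library concentration from a dilution series of standards (Cq is linear in log10 of concentration). The sketch below shows that standard-curve arithmetic; the Cq values, standard concentrations, and function name are invented for illustration, not taken from the protocol:

```python
import numpy as np

# Invented dilution series of quantification standards
std_conc_pm = np.array([20.0, 2.0, 0.2, 0.02])   # standard concentrations (pM)
std_cq = np.array([12.1, 15.4, 18.7, 22.0])      # measured Cq per standard

# Standard curve: Cq = slope * log10(C) + intercept
slope, intercept = np.polyfit(np.log10(std_conc_pm), std_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency

def library_conc(cq, dilution=1.0):
    """Back-calculate library concentration (pM) from its Cq,
    multiplied by any dilution applied before the qPCR run."""
    return 10 ** ((cq - intercept) / slope) * dilution

print(f"efficiency = {efficiency:.2%}, "
      f"library = {library_conc(16.5, dilution=1000):.0f} pM")
```

A slope near -3.32 corresponds to ~100% amplification efficiency (a perfect doubling per cycle), which is a common acceptance criterion for the standard curve.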

  6. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    In clinical practice, PET/CT images are often analyzed qualitatively, by visual comparison of uptake in tumor lesions and normal tissues, and semi-quantitatively, by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantitative information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy to harmonize the quantification of PET/CT images performed with scanners of different models and manufacturers. For this purpose, a survey of the technical characteristics of the equipment and the acquisition protocols for clinical images was conducted across different PET/CT services in the state of Rio Grande do Sul. For each scanner, the accuracy of SUV quantification and the recovery coefficient (RC) curves were determined using the clinically relevant reconstruction parameters available. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces the most accurate quantification for each one. Finally, the reconstruction parameters most appropriate for harmonizing SUV quantification on each scanner, regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization, through modification of the reconstruction parameters and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies. Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
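    The body-weight SUV referred to above is defined as tissue activity concentration divided by injected dose per body weight. A minimal sketch, assuming F-18 decay correction of the dose to scan time and the usual 1 g ≈ 1 mL tissue approximation (the function name and all numbers are illustrative):

```python
import math

def suv_bw(act_conc_bq_ml, injected_mbq, weight_kg,
           uptake_min=0.0, half_life_min=109.8):
    """Body-weight SUV = C_tissue / (decayed dose / body weight).

    act_conc_bq_ml -- measured activity concentration in the ROI (Bq/mL)
    injected_mbq   -- injected dose at injection time (MBq)
    weight_kg      -- patient body weight (kg)
    uptake_min     -- minutes between injection and scan (dose is decay-
                      corrected over this interval; default half-life: F-18)
    """
    decayed_dose_bq = injected_mbq * 1e6 * math.exp(
        -math.log(2) * uptake_min / half_life_min)
    return act_conc_bq_ml / (decayed_dose_bq / (weight_kg * 1000.0))  # 1 g ~ 1 mL

suv = suv_bw(act_conc_bq_ml=12000.0, injected_mbq=350.0,
             weight_kg=75.0, uptake_min=60.0)
print(f"SUV = {suv:.2f}")
```

Harmonization efforts such as the one above target the reconstruction-dependent bias in `act_conc_bq_ml` (captured by the recovery coefficient curves), since the rest of the formula is scanner-independent bookkeeping.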

  7. Quantification of miRNAs by a simple and specific qPCR method

    DEFF Research Database (Denmark)

    Cirera Salicio, Susanna; Busk, Peter K.

    2014-01-01

    MicroRNAs (miRNAs) are powerful regulators of gene expression at the posttranscriptional level and play important roles in many biological processes and in disease. The rapid pace of the emerging field of miRNAs has opened new avenues for development of techniques to quantitatively determine mi… …in miRNA quantification. Furthermore, the method is easy to perform with common laboratory reagents, which allows miRNA quantification at low cost…

  8. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR

    OpenAIRE

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

    Meat products often consist of meat from multiple animal species, and adulteration and inaccurate labeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos t...

  9. Current position of high-resolution MS for drug quantification in clinical & forensic toxicology.

    Science.gov (United States)

    Meyer, Markus R; Helfer, Andreas G; Maurer, Hans H

    2014-08-01

    This paper reviews high-resolution MS approaches published from January 2011 until March 2014 for the quantification of drugs (of abuse) and/or their metabolites in biosamples using LC-MS with time-of-flight or Orbitrap™ mass analyzers. Corresponding approaches are discussed, including sample preparation and mass spectral settings. The advantages and limitations of high-resolution MS for drug quantification, as well as the demand for a certain resolution or a specific mass accuracy, are also explored.
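The "specific mass accuracy" demanded of such instruments is conventionally reported in parts per million of the theoretical m/z. A minimal example; the measured value below is hypothetical:

```python
def mass_error_ppm(measured_mz: float, theoretical_mz: float) -> float:
    """Mass accuracy in parts per million, the figure of merit
    used to assess high-resolution MS quantification methods."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Cocaine [M+H]+ has a theoretical monoisotopic m/z of 304.1543;
# a hypothetical measurement of 304.1549 gives the error in ppm:
print(round(mass_error_ppm(304.1549, 304.1543), 2))
```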

  10. MR Spectroscopy: Real-Time Quantification of in-vivo MR Spectroscopic data

    OpenAIRE

    Massé, Kunal

    2009-01-01

    In the last two decades, magnetic resonance spectroscopy (MRS) has had increasing success in biomedical research. This technique can discern several metabolites in human tissue non-invasively and thus offers a multitude of medical applications. In clinical routine, quantification plays a key role in the evaluation of the different chemical elements. The quantification of metabolites characterizing specific pathologies helps physicians establish the patient's diagnosis. E...

  11. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

    Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. The quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragment analysis. Differences between groups were tested using a paired t-test. Methods that measure the total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragment analysis, which also allows quantification of DNA of the desired length and is not affected by PCR inhibitors. Methods based on quantification of the total amount of DNA in samples are unsuitable for ancient samples, as they overestimate the amount of DNA, presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also fragment length.
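The paired t-test used in the study compares two quantification methods on the same set of extracts; with standard-library Python it reduces to a few lines. The readings below are hypothetical, not data from the paper:

```python
import math
from statistics import mean, stdev

def paired_t(a: list, b: list) -> float:
    """Paired t statistic: mean of the per-sample differences divided
    by the standard error of those differences. Suitable when the same
    extracts are measured by two methods, as in the comparison above."""
    diffs = [x - y for x, y in zip(a, b)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Hypothetical ng/uL readings for five extracts on two methods
# (a total-DNA fluorometer vs. a qPCR assay):
fluorometer = [12.1, 8.4, 15.0, 9.9, 11.2]
qpcr        = [3.2, 2.1, 4.4, 2.5, 3.0]
print(paired_t(fluorometer, qpcr))
```

A large positive t here would reflect the paper's finding that total-DNA methods read systematically higher than qPCR on ancient samples.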

  12. Probabilistic risk assessment course documentation. Volume 5. System reliability and analysis techniques Session D - quantification

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the probabilistic quantification of accident sequences and the link between accident sequences and consequences. Other sessions in this series focus on the quantification of system reliability and the development of event trees and fault trees. This course takes the viewpoint that event tree sequences, or combinations of system failures and successes, are available, and that Boolean equations for the system fault trees have been developed and are available. 93 figs., 11 tabs
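Given the Boolean fault tree equations the course presupposes, sequence quantification typically reduces each tree to minimal cut sets and applies the rare-event approximation, an upper bound that sums the cut-set probabilities. A minimal sketch with illustrative probabilities:

```python
def top_event_probability(cut_sets: list) -> float:
    """Rare-event approximation: the top-event probability is bounded
    above by the sum, over minimal cut sets, of the product of the
    basic-event probabilities within each cut set (events assumed
    independent)."""
    total = 0.0
    for cut_set in cut_sets:
        p = 1.0
        for basic_event_p in cut_set:
            p *= basic_event_p
        total += p
    return total

# Two cut sets: {pump fails AND valve fails} and {power bus fails}:
print(top_event_probability([[1e-3, 5e-3], [2e-6]]))
```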

  13. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.

    Science.gov (United States)

    Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H

    2016-12-01

    Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach, recurrence quantification analysis (RQA), via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing in length only or in both length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). %DET decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess the effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.
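Two of the four indices, %REC and %DET, can be computed directly from a recurrence matrix. The sketch below operates on a raw 1-D series for brevity, whereas the study analyzed lip-aperture trajectories (typically after phase-space embedding); the series and radius are illustrative:

```python
def rqa_rec_det(series, radius, lmin=2):
    """Percent recurrence (%REC) and percent determinism (%DET) for a
    1-D series, excluding the line of identity. A point (i, j) recurs
    when |x_i - x_j| <= radius; %DET is the share of recurrent points
    lying on diagonal lines of length >= lmin."""
    n = len(series)
    rec = [[i != j and abs(series[i] - series[j]) <= radius
            for j in range(n)] for i in range(n)]
    n_rec = sum(row.count(True) for row in rec)
    on_lines = 0
    for offset in range(-(n - 1), n):
        if offset == 0:            # skip the line of identity
            continue
        diag = [(i, i + offset) for i in range(n) if 0 <= i + offset < n]
        run = 0
        for point in diag + [None]:      # None flushes the final run
            if point is not None and rec[point[0]][point[1]]:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    pct_rec = 100.0 * n_rec / (n * n - n)
    pct_det = 100.0 * on_lines / n_rec if n_rec else 0.0
    return pct_rec, pct_det

# A strictly periodic series is fully deterministic (%DET = 100):
print(rqa_rec_det([0, 1, 0, 1, 0, 1], radius=0.1))
```

MAXLINE and TREND extend the same diagonal-line bookkeeping (longest line length, and the slope of recurrence rate away from the main diagonal, respectively).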

  14. Specificity and affinity quantification of protein-protein interactions.

    Science.gov (United States)

    Yan, Zhiqiang; Guo, Liyong; Hu, Liang; Wang, Jin

    2013-05-01

    Most biological processes are mediated by protein-protein interactions. Determination of protein-protein structures and insight into their interactions are vital to understanding the mechanisms of protein functions. Currently, compared with isolated protein structures, only a small fraction of protein-protein structures have been experimentally solved. Therefore, computational docking methods play an increasing role in predicting the structures and interactions of protein-protein complexes. The scoring function for protein-protein interactions is chiefly responsible for the accuracy of computational docking. Previous scoring functions were mostly developed by optimizing the binding affinity, which determines the stability of the protein-protein complex, but they often lack consideration of specificity, which determines the discrimination of the native protein-protein complex against competitive ones. We developed a scoring function (named SPA-PP, for specificity and affinity of protein-protein interactions) by incorporating both specificity and affinity into the optimization strategy. Testing results and comparisons with other scoring functions show that SPA-PP performs remarkably well on predictions of both binding pose and binding affinity. Thus, SPA-PP is a promising quantification of protein-protein interactions that can be implemented in protein docking tools and applied to predictions of protein-protein structure and affinity. The algorithm is implemented in C, and the code can be downloaded from http://dl.dropbox.com/u/1865642/Optimization.cpp.

  15. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can subsequently be directed to further reduce the uncertainty associated with these sources. In this work, a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainty in support of target-accuracy-assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available to the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracy of the multiplication factor and the fission reaction rate are used as the reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work focuses on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulic and depletion effects.
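The subspace construction itself is beyond a short sketch, but the forward uncertainty propagation it accelerates is the first-order "sandwich" rule, var(R) = s^T C s, where s holds the sensitivities of a response such as k-eff to each nuclear data parameter and C is their covariance. A minimal illustration with invented numbers:

```python
def response_variance(s, cov):
    """First-order (sandwich rule) uncertainty propagation:
    var(R) = s^T C s for sensitivity vector s and parameter
    covariance matrix C (both in relative units here)."""
    n = len(s)
    return sum(s[i] * cov[i][j] * s[j]
               for i in range(n) for j in range(n))

# Two correlated cross sections with 1% and 2% relative standard
# deviation and a small positive correlation (values are illustrative):
s = [0.8, -0.3]                      # relative sensitivities of k-eff
cov = [[1e-4, 4e-5], [4e-5, 4e-4]]   # relative covariance matrix
print(response_variance(s, cov))
```

Inverse UQ, as in the record above, asks the reverse question: which adjustments to C (i.e., which experiments) shrink var(R) the most.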

  16. Quantification of the degree of reaction of fly ash

    International Nuclear Information System (INIS)

    Ben Haha, M.; De Weerdt, K.; Lothenbach, B.

    2010-01-01

    The quantification of fly ash (FA) in FA blended cements is an important parameter for understanding the effect of the fly ash on the hydration of OPC and on microstructural development. The FA reaction in two different blended OPC-FA systems was studied using a selective dissolution technique based on EDTA/NaOH, dissolution in diluted NaOH solution, the portlandite content, and backscattered electron image analysis. The amount of reacted FA determined by selective dissolution using EDTA/NaOH is associated with significant potential error, as different assumptions lead to large differences in the estimate of FA reacted. In addition, at longer hydration times, the reaction of the FA is underestimated by this method due to the presence of non-dissolved hydrates and MgO-rich particles. The dissolution of FA in diluted NaOH solution agreed well during the first days with the dissolution observed by image analysis. At 28 days and longer, the formation of hydrates in the diluted solutions leads to an underestimation. Image analysis appears to give consistent results and to be the most reliable technique studied.
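In a selective-dissolution test, the degree of FA reaction follows from the residue left after dissolving the hydrates and clinker phases. A simplified sketch of that bookkeeping; it ignores the correction subtleties the abstract warns about (non-dissolved hydrates, MgO-rich particles), and the numbers are illustrative:

```python
def fa_degree_of_reaction(residue_g: float, sample_g: float,
                          fa_fraction: float,
                          insoluble_opc_fraction: float = 0.0) -> float:
    """Degree of fly ash reaction from a selective-dissolution test:
    the residue is attributed to unreacted FA, optionally corrected
    for the insoluble fraction of the OPC component. Inputs are
    masses / mass fractions of the dry blended paste."""
    opc_fraction = 1.0 - fa_fraction
    fa_residue = residue_g - sample_g * opc_fraction * insoluble_opc_fraction
    return 1.0 - fa_residue / (sample_g * fa_fraction)

# 1 g of a 30% FA blend leaving 0.24 g of residue, no OPC correction:
print(fa_degree_of_reaction(0.24, 1.0, 0.30))
```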

  17. Absolute Quantification of Toxicological Biomarkers via Mass Spectrometry.

    Science.gov (United States)

    Lau, Thomas Y K; Collins, Ben C; Stone, Peter; Tang, Ning; Gallagher, William M; Pennington, Stephen R

    2017-01-01

    With the advent of "-omics" technologies there has been an explosion of data generation in the field of toxicology, as well as many others. As new candidate biomarkers of toxicity are regularly discovered, the next challenge is to validate these observations in a targeted manner. Traditionally, these validation experiments have been conducted using antibody-based technologies such as Western blotting, ELISA, and immunohistochemistry. However, this often produces a significant bottleneck, as the time and cost of developing successful antibodies are often far outpaced by the generation of targets of interest. In response, there have recently been several developments in the use of triple quadrupole (QQQ) mass spectrometry (MS) as a platform for protein quantification. This technology does not require antibodies; assays are typically less expensive and quicker to develop, and multiplexing is more accessible. The speed of these experiments, combined with their flexibility and capacity for multiplexed assays, makes the technique a valuable strategy for validating biomarker discovery.
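QQQ-based quantification typically measures each target peptide against a calibration series; the simplest external-calibration computation is a least-squares line inverted for the unknown's peak area. Isotope-labeled internal standards are more common in rigorous practice; the values below are invented:

```python
from statistics import mean

def quantify(peak_areas_std, concentrations_std, peak_area_unknown):
    """External-calibration quantification: fit
    area = slope * concentration + intercept by least squares on the
    standards, then invert the line for the unknown's peak area."""
    mx, my = mean(concentrations_std), mean(peak_areas_std)
    sxx = sum((x - mx) ** 2 for x in concentrations_std)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(concentrations_std, peak_areas_std))
    slope = sxy / sxx
    intercept = my - slope * mx
    return (peak_area_unknown - intercept) / slope

# Hypothetical standards at 1, 5, 10, 50 ng/mL and their peak areas:
areas = [1000.0, 5000.0, 10000.0, 50000.0]
concs = [1.0, 5.0, 10.0, 50.0]
print(quantify(areas, concs, 22000.0))
```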

  18. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Udo D. [Yale Univ., New Haven, CT (United States). Dept. of Mechanical Engineering and Materials Science; Altman, Eric I. [Yale Univ., New Haven, CT (United States). Dept. of Chemical and Environmental Engineering

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces, including species-selective imaging and the ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3D-AFM), a new measurement mode that allows mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectronvolt resolution in force/energy. We then expanded this experimental platform by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  19. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea

    2015-01-01

    The method conservatively assumes that the failure probability of the software for untested inputs is 1, and that the failure probability becomes 0 when all test cases are passed successfully. In reality, however, a chance of failure remains due to test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality; Cao discussed testing effort, testing coverage, and testing environment. Management of test uncertainties has also been discussed in the literature. In this study, test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimate of software is very important for the probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on estimating the probability of software failure in a way that accounts for the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation.
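The paper's BBN is more elaborate, but the core idea, that N successful tests do not drive the failure probability to 0, already appears in the simplest Bayesian treatment: a Beta prior on the per-demand failure probability updated with binomial test evidence. A hedged sketch, not the paper's model:

```python
def posterior_failure_probability(successes: int, failures: int,
                                  alpha: float = 1.0,
                                  beta: float = 1.0) -> float:
    """Bayesian point estimate of per-demand failure probability:
    with a Beta(alpha, beta) prior and n = successes + failures
    binomial trials, the posterior mean is
    (alpha + failures) / (alpha + beta + n). It stays above 0 even
    when every test passes, capturing residual test uncertainty."""
    n = successes + failures
    return (alpha + failures) / (alpha + beta + n)

# 5,000 test cases, no failures, uniform Beta(1, 1) prior:
print(posterior_failure_probability(5000, 0))
```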

  20. Eosin fluorescence: A diagnostic tool for quantification of liver injury.

    Science.gov (United States)

    Ali, Hamid; Ali, Safdar; Mazhar, Maryam; Ali, Amjad; Jahan, Azra; Ali, Abid

    2017-09-01

    Hepatitis is one of the most common life-threatening diseases. Diagnosis is mainly based on biochemical analyses such as liver function tests; however, histopathological evaluation of the liver serves far better for an accurate final diagnosis. The goal of our study was to evaluate the eosin fluorescence pattern in a CCl4-induced liver injury model compared with normal and different treatment groups. For this purpose, liver tissues were stained with H/E and examined under a bright-field microscope, but fluorescence microscopy of the H/E-stained slides revealed an interesting fluorescence pattern that was quite helpful in identifying different structures. Distinct fluorescence patterns were obtained with FITC, Texas Red, and dual-channel filter cubes, which helped identify different morphological features of the liver. During the course of hepatic injury, liver cells undergo necrosis and apoptosis, and the overall cellular microenvironment is altered due to the modification of proteins and other intracellular molecules. Intensified eosin fluorescence was observed around the central vein of the injured liver compared to normal liver, indicating enhanced binding of eosin to more exposed amino acid residues. In conclusion, the eosin fluorescence pattern varies with the health status of a tissue and can be used for the diagnosis and quantification of the severity of various liver diseases.