WorldWideScience

Sample records for text quantification approaches

  1. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

Abstract Both relative and absolute quantification are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as that obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification because the number of repetitive sequences is unknown. A promising approach was developed in which data from amplification of repetitive sequences were used to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.
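The relative-quantification arithmetic this record alludes to can be illustrated with the standard comparative-Ct calculation; this is a generic qPCR formula, not necessarily the exact model used in the paper, and the unknown copy number of repetitive targets is precisely what complicates it:

```python
def relative_quantity(ct_target, ct_reference, efficiency=2.0):
    """Relative amount of target vs. reference template inferred from
    qPCR threshold cycles (Ct), assuming equal amplification efficiency."""
    return efficiency ** (ct_reference - ct_target)

# A chicken amplicon crossing the threshold 2 cycles before the pig
# amplicon implies roughly 4x more starting template.
ratio = relative_quantity(ct_target=22.0, ct_reference=24.0)
```

With repetitive sequences, the unknown repeat counts of the two species scale this ratio by an unknown factor, which is the relative-quantification problem the abstract describes.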

  2. Original Approach for Automated Quantification of Antinuclear Autoantibodies by Indirect Immunofluorescence

    Directory of Open Access Journals (Sweden)

    Daniel Bertin

    2013-01-01

Full Text Available Introduction. Indirect immunofluorescence (IIF) is the gold standard method for the detection of antinuclear antibodies (ANA), which are essential markers for the diagnosis of systemic autoimmune rheumatic diseases. For the discrimination of positive and negative samples, we propose here an original approach named Immunofluorescence for Computed Antinuclear antibody Rational Evaluation (ICARE), based on the calculation of a fluorescence index (FI). Methods. We compared FI with visual evaluations on 237 consecutive samples and on a cohort of 25 patients with SLE. Results. We obtained very good technical performance for FI (95% sensitivity, 98% specificity, and a kappa of 0.92), even in a subgroup of weakly positive samples. A significant correlation between quantification of FI and IIF ANA titers was found (Spearman's ρ = 0.80, P < 0.0001). Clinical performance of ICARE was validated on a cohort of patients with SLE, corroborating the fact that FI could represent an attractive alternative for the evaluation of antibody titer. Conclusion. Our results represent a major step toward automated quantification of IIF ANA, opening attractive perspectives such as rapid sample screening and laboratory standardization.
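The abstract does not give the FI formula. As a loose illustration of index-based discrimination only, a cutoff on a background-normalised intensity might look like the following; the function names, the normalisation, and the cutoff value are all hypothetical, not the published definition:

```python
def fluorescence_index(sample_intensity, background_intensity):
    """Hypothetical index: well fluorescence normalised to the
    negative-control background (the published FI may differ)."""
    return sample_intensity / background_intensity

def classify(fi, cutoff=1.5):
    # samples at or above the cutoff are flagged positive for review
    return "positive" if fi >= cutoff else "negative"
```

The appeal of such an index is that the positive/negative decision becomes a single reproducible threshold rather than a visual judgment.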

  3. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration against a standard to quantify the activity of nuclear materials by determining a calibration coefficient are useless on non-reproducible, complex, one-of-a-kind nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another, giving a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition: density, material, screens, geometric shape, matrix composition, and matrix and source distribution. Some of these depend strongly on knowledge of the package data and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and without knowledge of the internal package configuration. The method combines a global stochastic approach, which uses, among others, surrogate models to simulate the gamma attenuation behaviour; a Bayesian approach, which considers conditional probability densities of the problem inputs; and Markov chain Monte Carlo (MCMC) algorithms, which solve the inverse problem from the gamma-ray emission spectrum of the radionuclides and the outside dimensions of the objects of interest. The methodology is being tested on the quantification of actinide activity in standards of different matrices, compositions, and source configurations, in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
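The Bayesian/MCMC ingredients can be miniaturized to a one-parameter toy inverse problem: a Metropolis random-walk sampler recovering a source activity from a single attenuated count rate. The forward model, noise level, and prior below are illustrative stand-ins, not the CEA's surrogate models:

```python
import math
import random

random.seed(0)

def forward(activity, mu=0.5, thickness=2.0):
    # stand-in surrogate for gamma attenuation through a matrix
    return activity * math.exp(-mu * thickness)

def log_post(activity, observed, sigma=0.5):
    # flat prior on activity > 0, Gaussian measurement noise
    if activity <= 0.0:
        return -math.inf
    r = observed - forward(activity)
    return -0.5 * (r / sigma) ** 2

observed = forward(10.0) + 0.1          # synthetic noisy measurement
chain, a = [], 5.0
for _ in range(20000):
    prop = a + random.gauss(0.0, 0.5)   # random-walk proposal
    if math.log(random.random()) < log_post(prop, observed) - log_post(a, observed):
        a = prop
    chain.append(a)
post_mean = sum(chain[5000:]) / len(chain[5000:])   # burn-in discarded
```

The posterior mean lands near the true activity of 10, and the spread of the chain is the activity uncertainty the abstract says the methodology propagates automatically.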

  4. Molecular Approaches for High Throughput Detection and Quantification of Genetically Modified Crops: A Review

    Directory of Open Access Journals (Sweden)

    Ibrahim B. Salisu

    2017-10-01

Full Text Available As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for the transgenic content. These diagnostic techniques fall into two major groups: identification of (1) DNA and (2) proteins from GMOs and their products. Conventional methods such as the polymerase chain reaction (PCR) and the enzyme-linked immunosorbent assay (ELISA) are routinely employed for DNA- and protein-based quantification, respectively. Although these techniques are convenient and productive, more advanced technologies that allow high-throughput detection and quantification of GM events are needed as increasingly complex GMOs are produced. Recent approaches such as microarrays, capillary gel electrophoresis, digital PCR and next-generation sequencing are therefore promising because of their accuracy and precise detection of transgenic content. The present article is a brief comparative study of these detection techniques on the basis of their advent, feasibility, accuracy, and cost effectiveness. Detection of a specific event, contamination by different events, and determination of fusion as well as stacked-gene proteins remain critical issues that these emerging technologies must address in the future.
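Of the techniques surveyed, digital PCR has a particularly simple quantification step: absolute copy number follows from the fraction of negative partitions via Poisson statistics. This is the textbook dPCR formula, independent of this review:

```python
import math

def dpcr_copies(n_total, n_negative):
    """Digital PCR absolute quantification: mean copies per partition
    lam = -ln(fraction of negative partitions); total = lam * partitions."""
    lam = -math.log(n_negative / n_total)
    return lam * n_total

copies = dpcr_copies(1000, 368)   # roughly 1000 template copies
```

Because the count of negative partitions is a direct observable, no standard curve or reference material is needed, which is one reason the review groups dPCR with the more precise emerging methods.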

  5. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    Directory of Open Access Journals (Sweden)

    Jongbin Ko

    2014-01-01

Full Text Available A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this network connectivity, a smart grid system faces potential security threats. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme that uses a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results for the proposed approach and the existing Common Vulnerability Scoring System clearly show that network connectivity must be considered for better-optimized vulnerability quantification.
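The abstract does not reproduce the AVQS formula, so the following is a purely hypothetical illustration of the route-based idea: per-node scores along one attack route are aggregated and then discounted by how hardened the route's links are end to end. Every name and number here is made up for illustration:

```python
def route_score(node_cvss, link_security):
    """Hypothetical aggregation (not the published AVQS formula):
    average CVSS-like node scores along one attack route, discounted
    by end-to-end link security (0 = open link, 1 = fully hardened)."""
    avg_node = sum(node_cvss) / len(node_cvss)
    reachability = 1.0
    for s in link_security:
        reachability *= (1.0 - s)  # every link on the route must be traversable
    return avg_node * reachability

# a two-hop route through an AMI gateway and a meter, both links half-hardened
score = route_score([8.0, 6.0], [0.5, 0.5])
```

The point of any such scheme is that identical node scores yield different route scores depending on network connectivity, which is exactly the gap the paper identifies in host-only scoring systems like CVSS.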

  6. Working with text tools, techniques and approaches for text mining

    CERN Document Server

    Tourte, Gregory J L

    2016-01-01

    Text mining tools and technologies have long been a part of the repository world, where they have been applied to a variety of purposes, from pragmatic aims to support tools. Research areas as diverse as biology, chemistry, sociology and criminology have seen effective use made of text mining technologies. Working With Text collects a subset of the best contributions from the 'Working with text: Tools, techniques and approaches for text mining' workshop, alongside contributions from experts in the area. Text mining tools and technologies in support of academic research include supporting research on the basis of a large body of documents, facilitating access to and reuse of extant work, and bridging between the formal academic world and areas such as traditional and social media. Jisc have funded a number of projects, including NaCTem (the National Centre for Text Mining) and the ResDis programme. Contents are developed from workshop submissions and invited contributions, including: Legal considerations in te...

  7. Quantification of uncertainties in turbulence modeling: A comparison of physics-based and random matrix theoretic approaches

    International Nuclear Information System (INIS)

    Wang, Jian-Xun; Sun, Rui; Xiao, Heng

    2016-01-01

Highlights: • Compared physics-based and random matrix methods to quantify RANS model uncertainty. • Demonstrated applications of both methods in channel flow over periodic hills. • Examined the amount of information introduced in the physics-based approach. • Discussed implications to modeling turbulence in both near-wall and separated regions. - Abstract: Numerical models based on Reynolds-Averaged Navier-Stokes (RANS) equations are widely used in engineering turbulence modeling. However, the RANS predictions have large model-form uncertainties for many complex flows, e.g., those with non-parallel shear layers or strong mean flow curvature. Quantification of these large uncertainties originating from the modeled Reynolds stresses has attracted attention in the turbulence modeling community. Recently, a physics-based Bayesian framework for quantifying model-form uncertainties has been proposed with successful applications to several flows. Nonetheless, how to specify proper priors without introducing unwarranted, artificial information remains challenging to the current form of the physics-based approach. Another recently proposed method based on random matrix theory provides the prior distributions with maximum entropy, which is an alternative for model-form uncertainty quantification in RANS simulations. This method has better mathematical rigor and provides the most non-committal prior distributions without introducing artificial constraints. On the other hand, the physics-based approach has the advantage of being more flexible in incorporating available physical insights. In this work, we compare and discuss the advantages and disadvantages of the two approaches to model-form uncertainty quantification. In addition, we utilize the random matrix theoretic approach to assess and possibly improve the specification of priors used in the physics-based approach. The comparison is conducted through a test case using a canonical flow, the flow over periodic hills.

  8. Spot quantification in two dimensional gel electrophoresis image analysis: comparison of different approaches and presentation of a novel compound fitting algorithm

    Science.gov (United States)

    2014-01-01

Background Various computer-based methods exist for the detection and quantification of protein spots in two-dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two-dimensional Gaussian function curves for the extraction of data from two-dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our method showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and can be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
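The connection the abstract draws between area-based volumes (pixel sums) and fitted-function volumes can be checked numerically for an isolated spot: the integral of a 2-D Gaussian has the closed form 2*pi*amp*sx*sy, and the pixel sum converges to it. This is a generic sketch, not the authors' MATLAB implementation:

```python
import math

def gaussian2d(x, y, amp, x0, y0, sx, sy):
    # model of one protein spot's intensity profile
    return amp * math.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                            + (y - y0) ** 2 / (2 * sy ** 2)))

def fitted_volume(amp, sx, sy):
    # closed-form "volume" of a fitted 2-D Gaussian
    return 2.0 * math.pi * amp * sx * sy

# area-based volume: sum of pixel intensities over a window around the spot
pixel_sum = sum(gaussian2d(x, y, 100.0, 0.0, 0.0, 3.0, 3.0)
                for x in range(-30, 31) for y in range(-30, 31))
```

The two measures only diverge when spots overlap, because the pixel sum then mixes signal from neighbouring spots while the fitted volume attributes each Gaussian separately, which is the situation where the paper finds fitting superior.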

  9. Unconventional barometry and rheometry: new quantification approaches for mechanically-controlled microstructures

    Science.gov (United States)

    Tajcmanova, L.; Moulas, E.; Vrijmoed, J.; Podladchikov, Y.

    2016-12-01

Estimation of pressure-temperature (P-T) conditions from petrographic observations in metamorphic rocks has become common practice in petrology studies during the last 50 years. These data often serve as a key input in geodynamic reconstructions and thus directly influence our understanding of lithospheric processes. Such an approach might have led the metamorphic geology field to a certain level of quiescence. Classical metamorphic quantification approaches assume fast viscous relaxation (and therefore constant pressure across the rock microstructure), with chemical diffusion being the limiting factor in equilibration. Recently, we have focused on the other possible scenario, fast chemical diffusion and slow viscous relaxation, which brings an alternative interpretation of the chemical zoning found in high-grade rocks. The aim has been to provide insight into the role of mechanically maintained pressure variations on multi-component chemical zoning in minerals. Furthermore, we used the pressure information from mechanically-controlled microstructures for rheological constraints. We show an unconventional way of relating direct microstructural observations in rocks to the nonlinearity of rheology at time scales unattainable by laboratory measurements. Our analysis documents that mechanically controlled microstructures preserved over geological times can be used to deduce flow-law parameters and in turn estimate stress levels of minerals in their natural environment. The development of these new quantification approaches has opened new horizons in understanding phase transformations in the Earth's lithosphere. Furthermore, the new data generated can serve as food for thought for the next generation of fully coupled numerical codes that involve reacting materials while respecting conservation of mass, momentum and energy.

  10. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  11. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    Science.gov (United States)

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest for reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy to use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. 
In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system, because a second independent real-time PCR-based measurement…
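The plasmid-based standard-curve idea described in this record reduces to fitting Ct against log10 copy number over the dilution series and inverting the line for unknowns. A minimal sketch, using a synthetic ideal-efficiency dilution series rather than the paper's data:

```python
import math

def fit_standard_curve(copies, cts):
    # least-squares line: Ct = slope * log10(copies) + intercept
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    # invert the calibration line for an unknown sample
    return 10.0 ** ((ct - intercept) / slope)

# synthetic plasmid dilution series with ideal efficiency (slope ~ -3.32)
standards = [1e2, 1e3, 1e4, 1e5]
cts = [38.0 - 3.32 * math.log10(c) for c in standards]
slope, intercept = fit_standard_curve(standards, cts)
```

Reading copy numbers directly off the calibration line, with no reference-gene normalisation, is the "absolute instead of relative" measurement design the abstract describes.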

  12. Uncertainty Quantification for Complex RF-structures Using the State-space Concatenation Approach

    CERN Document Server

    Heller, Johann; Schmidt, Christian; Van Rienen, Ursula

    2015-01-01

… as well as to employ robust optimizations, a so-called uncertainty quantification (UQ) is applied. For large and complex structures such computations are heavily demanding and cannot be carried out using standard brute-force approaches. In this paper, we propose a combination of established techniques to perform UQ for long and complex structures, where the uncertainty is located only in parts of the structure. As an exemplary structure, we investigate the third-harmonic cavity, which is being used at the FLASH accelerator at DESY, assuming an uncertain…

  13. Cognition-Based Approaches for High-Precision Text Mining

    Science.gov (United States)

    Shannon, George John

    2017-01-01

    This research improves the precision of information extraction from free-form text via the use of cognitive-based approaches to natural language processing (NLP). Cognitive-based approaches are an important, and relatively new, area of research in NLP and search, as well as linguistics. Cognitive approaches enable significant improvements in both…

  14. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load-carrying structures are subjected to dynamic time-varying loading conditions. Robust structural dynamics identification procedures impose tight constraints on the quality of modal models… This paper aims at a systematic approach for uncertainty quantification of the parameters of modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures represent different complexity levels ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.

  15. Monte Carlo approaches for uncertainty quantification of criticality for system dimensions

    International Nuclear Information System (INIS)

    Kiedrowski, B.C.; Brown, F.B.

    2013-01-01

One of the current challenges in nuclear engineering computations is performing uncertainty analysis for either calculations or experimental measurements. This paper focuses specifically on estimating the uncertainties arising from geometric tolerances, and studies two techniques for uncertainty quantification. The first is the forward propagation technique, which can be thought of as a 'brute force' approach: uncertain system parameters are randomly sampled, the calculation is run, and uncertainties are found from the empirically obtained distribution of results. This approach makes no approximations in principle, but it is very computationally expensive. The other approach investigated is the adjoint-based approach: system sensitivities are computed via a single Monte Carlo calculation and combined with a covariance matrix to provide a linear estimate of the uncertainty. Demonstration calculations are performed with the MCNP6 code for both techniques on two cases: the first is a solid, bare cylinder of Pu metal, while the second is a can of plutonium nitrate solution. The results show that the forward and adjoint approaches agree in cases where the responses are not non-linearly correlated. In other cases, the uncertainties in the effective multiplication factor k disagree, for reasons not yet known.
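The contrast between the two techniques can be miniaturized to a one-parameter toy model: sample the uncertain dimension and rerun the "calculation" (forward propagation), or multiply a sensitivity by the tolerance (the adjoint-style linear estimate). The response function below is an illustrative stand-in, not an MCNP6 result:

```python
import math
import random

random.seed(1)

def k_eff(radius):
    # toy smooth response standing in for a Monte Carlo criticality run
    return 0.90 + 0.05 * math.log(radius)

# forward propagation: sample the uncertain radius, rerun the calculation
r0, sigma_r = 10.0, 0.2
samples = [k_eff(random.gauss(r0, sigma_r)) for _ in range(20000)]
mean = sum(samples) / len(samples)
sigma_forward = (sum((k - mean) ** 2 for k in samples)
                 / (len(samples) - 1)) ** 0.5

# adjoint-style linear estimate: sigma_k ~ |dk/dr| * sigma_r
sigma_adjoint = abs(0.05 / r0) * sigma_r
```

For this nearly linear response the two estimates coincide; the paper's interesting cases are precisely those where nonlinear correlation of the responses makes them diverge.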

  16. A machine learning approach for efficient uncertainty quantification using multiscale methods

    Science.gov (United States)

    Chan, Shing; Elsheikh, Ahmed H.

    2018-02-01

    Several multiscale methods account for sub-grid scale features using coarse scale basis functions. For example, in the Multiscale Finite Volume method the coarse scale basis functions are obtained by solving a set of local problems over dual-grid cells. We introduce a data-driven approach for the estimation of these coarse scale basis functions. Specifically, we employ a neural network predictor fitted using a set of solution samples from which it learns to generate subsequent basis functions at a lower computational cost than solving the local problems. The computational advantage of this approach is realized for uncertainty quantification tasks where a large number of realizations has to be evaluated. We attribute the ability to learn these basis functions to the modularity of the local problems and the redundancy of the permeability patches between samples. The proposed method is evaluated on elliptic problems yielding very promising results.

  17. Novel isotopic N, N-Dimethyl Leucine (iDiLeu) Reagents Enable Absolute Quantification of Peptides and Proteins Using a Standard Curve Approach

    Science.gov (United States)

    Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun

    2015-01-01

    Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N, N-dimethyl leucine (iDiLeu). These labels contain an amine reactive group, triazine ester, are cost effective because of their synthetic simplicity, and have increased throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking in an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).

  18. Multi-tissue partial volume quantification in multi-contrast MRI using an optimised spectral unmixing approach.

    Science.gov (United States)

    Collewet, Guylaine; Moussaoui, Saïd; Deligny, Cécile; Lucas, Tiphaine; Idier, Jérôme

    2018-06-01

Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The main contribution of this paper is twofold. It firstly proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which in the context of multi-contrast MRI data acquisition allows the imaging sequence parameters to be set appropriately. Secondly, an efficient proportion quantification algorithm is proposed, based on the minimisation of a penalised least-squares criterion incorporating a regularity constraint on the spatial distribution of the proportions. Furthermore, the resulting developments are discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis, on the proving of a Danish pastry. Copyright © 2018 Elsevier Inc. All rights reserved.
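The penalised least-squares step can be sketched for two tissues observed at three contrasts, where the normal equations solve in closed form. The tissue signatures and penalty weight below are made up, and the paper's spatial regularity term is omitted; only the basic criterion is shown:

```python
def ridge_proportions(A, b, lam=1e-6):
    """Minimise ||A p - b||^2 + lam * ||p||^2 for two tissue
    proportions p, via the 2x2 normal equations in closed form."""
    a11 = sum(row[0] * row[0] for row in A) + lam
    a12 = sum(row[0] * row[1] for row in A)
    a22 = sum(row[1] * row[1] for row in A) + lam
    b1 = sum(row[0] * y for row, y in zip(A, b))
    b2 = sum(row[1] * y for row, y in zip(A, b))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# rows = per-contrast signals of each pure tissue; b = one mixed voxel
A = [[1.0, 0.3], [0.2, 0.9], [0.5, 0.4]]
b = [0.7 * r[0] + 0.3 * r[1] for r in A]   # a 70/30 mixture
p = ridge_proportions(A, b)
```

The theoretical analysis in the paper concerns how to choose the sequence parameters so that the columns of A (the tissue signatures) are well separated, which is what makes this inversion well conditioned.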

  19. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography

    Science.gov (United States)

    Venhuizen, Freerk G.; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2018-01-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies. PMID:29675301
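The headline Dice coefficient is a standard overlap measure between predicted and reference segmentation masks:

```python
def dice(pred, truth):
    """Dice overlap between two binary masks given as flat 0/1 lists."""
    inter = sum(p & t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0

# half the predicted IRC pixels overlap the reference annotation
score = dice([1, 1, 0, 0], [1, 0, 1, 0])  # → 0.5
```

A Dice of 0.754 against expert annotations, as reported above, therefore means roughly three quarters overlap between automatic and manual IRC delineations, a level the authors describe as approaching human performance.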

  20. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Directory of Open Access Journals (Sweden)

    Sabine Henders

    2012-01-01

Full Text Available This paper assesses quantification methods for carbon leakage from forestry activities for their suitability in leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD) mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM), the Verified Carbon Standard (VCS), the Climate Action Reserve (CAR), the CarbonFix Standard (CFS), and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed address either primary or secondary leakage; the former mostly on a local or regional scale and the latter on a national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  1. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: Validation of the quantitative multimolecular approach by radiocarbon analysis

    International Nuclear Information System (INIS)

    Jeanneau, Laurent; Faure, Pierre

    2010-01-01

The quantitative multimolecular approach (QMA) based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM) has been recently developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary ¹⁴C) and fossil organic matter (¹⁴C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence between the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. It highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments.

  2. Associated diacritical watermarking approach to protect sensitive arabic digital texts

    Science.gov (United States)

    Kamaruddin, Nurul Shamimi; Kamsin, Amirrudin; Hakak, Saqib

    2017-10-01

    Among multimedia content, text is one of the most predominant media. There have been many efforts to protect and secure text information over the Internet. The limitations of existing works have been identified in terms of watermark capacity, time complexity and memory complexity. In this work, an invisible digital watermarking approach is proposed to protect and secure the most sensitive of texts, the Digital Holy Quran. The proposed approach works by XOR-ing only those Quranic letters that have certain diacritics associated with them. Due to the sensitive nature of the Holy Quran, diacritics play a vital role in the meaning of a particular verse. Hence, securing letters with certain diacritics will preserve the original meaning of Quranic verses in case of an alteration attempt. Initial results have shown that the proposed approach is promising, with lower memory and time complexity compared to existing approaches.
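
    As a rough illustration of the general idea only (the diacritic range, key, and XOR folding below are our assumptions, not the paper's exact scheme), one can XOR-fold the code points of diacritized letters and their marks into a fragile check value that changes if any such letter or mark is altered:

```python
# Illustrative sketch: the marker set and folding are assumed, not the paper's.
ARABIC_DIACRITICS = {chr(c) for c in range(0x064B, 0x0653)}  # fathatan..sukun

def diacritic_pairs(text):
    """Return (letter, mark) pairs for letters that carry a diacritic."""
    return [(text[i], text[i + 1])
            for i in range(len(text) - 1) if text[i + 1] in ARABIC_DIACRITICS]

def fragile_mark(text, key=0x5A):
    """XOR-fold letter and mark code points into one byte; altering any
    diacritized letter or its mark changes the value."""
    acc = key
    for letter, mark in diacritic_pairs(text):
        acc ^= (ord(letter) ^ ord(mark)) & 0xFF
    return acc

verse = "\u0628\u0650\u0633\u0652\u0645\u0650"   # a diacritized fragment
tampered = verse.replace("\u0650", "\u064E", 1)  # swap one kasra for a fatha
print(fragile_mark(verse) != fragile_mark(tampered))  # -> True
```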

  3. Towards Technological Approaches for Concept Maps Mining from Text

    Directory of Open Access Journals (Sweden)

    Camila Zacche Aguiar

    2018-04-01

    Full Text Available Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of concept maps, to facilitate and provide the benefits of that resource more broadly. Due to the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed study on technological approaches for the automatic construction of concept maps published between 1994 and 2016 in the IEEE Xplore, ACM and Elsevier Science Direct databases. From this study, we elaborated a categorization defined on two perspectives, Data Source and Graphic Representation, and fourteen categories. The study collected 30 relevant articles, which were applied to the proposed categorization to identify the main features and limitations of each approach. A detailed view of these approaches, their characteristics and techniques is presented, enabling a quantitative analysis. In addition, the categorization has given us objective conditions to establish new specification requirements for a new technological approach aiming at concept map mining from texts.

  4. Optical technologies applied alongside on-site and remote approaches for climate gas emission quantification at a wastewater treatment plant

    DEFF Research Database (Denmark)

    Samuelsson, Jerker; Delre, Antonio; Tumlin, Susanne

    2018-01-01

    Plant-integrated and on-site gas emissions were quantified from a Swedish wastewater treatment plant by applying several optical analytical techniques and measurement methods. Plant-integrated CH4 emission rates, measured using mobile ground-based remote sensing methods, varied between 28.5 and 33.5 kg CH4 h−1, corresponding to an average emission factor of 5.9% as kg CH4 (kg CH4 production)−1, whereas N2O emissions varied between 4.0 and 6.4 kg h−1, corresponding to an average emission factor of 1.5% as kg N2O-N (kg TN influent)−1. Plant-integrated NH3 emissions were around 0.4 kg h−1 … quantifications were approximately two-thirds of the plant-integrated emission quantifications, which may be explained by the different timeframes of the approaches and that not all emission sources were identified during on-site investigation. Off-site gas emission quantifications, using ground-based remote …

  5. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    International Nuclear Information System (INIS)

    Ibanez-Llano, Cristina; Rauzy, Antoine; Melendez, Enrique; Nieto, Francisco

    2010-01-01

    Over the last two decades binary decision diagrams have been applied successfully to improve Boolean reliability models. Conversely to the classical approach based on the computation of the MCS, the BDD approach involves no approximation in the quantification of the model and is able to handle correctly negative logic. However, when models are sufficiently large and complex, as for example the ones coming from the PSA studies of the nuclear industry, it begins to be unfeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of the BDD technology to the assessment of such models in practice. This paper proposes a reduction process based on using information provided by the set of the most relevant minimal cutsets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. This reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of the PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.
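
    The gap between the classical MCS computation and an exact evaluation can be seen on a toy model; the cut sets and probabilities below are invented for illustration, and inclusion-exclusion stands in for the exact value a BDD encodes implicitly (feasible only for very small models):

```python
from itertools import combinations
from math import prod

def mcs_approx(cutsets, p):
    """Rare-event approximation used with minimal cut sets: sum of products."""
    return sum(prod(p[e] for e in cs) for cs in cutsets)

def exact_top(cutsets, p):
    """Inclusion-exclusion over the cut sets: the exact top-event probability,
    i.e. what a BDD evaluation yields without approximation."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            events = set().union(*combo)
            term = prod(p[e] for e in events)
            total += term if k % 2 == 1 else -term
    return total

# Toy fault tree with two overlapping minimal cut sets (numbers illustrative).
cutsets = [{"A", "B"}, {"B", "C"}]
p = {"A": 0.1, "B": 0.2, "C": 0.1}
print(round(mcs_approx(cutsets, p), 3), round(exact_top(cutsets, p), 3))  # -> 0.04 0.038
```

The approximation overestimates because the overlap of the two cut sets (both containing B) is double-counted; on large PSA models the BDD removes exactly this kind of error, at the computational cost the abstract describes.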

  6. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Llano, Cristina, E-mail: cristina.ibanez@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain); Rauzy, Antoine, E-mail: Antoine.RAUZY@3ds.co [Dassault Systemes, 10 rue Marcel Dassault CS 40501, 78946 Velizy Villacoublay, Cedex (France); Melendez, Enrique, E-mail: ema@csn.e [Consejo de Seguridad Nuclear (CSN), C/Justo Dorado 11, 28040 Madrid (Spain); Nieto, Francisco, E-mail: nieto@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain)

    2010-12-15

    Over the last two decades binary decision diagrams have been applied successfully to improve Boolean reliability models. Conversely to the classical approach based on the computation of the MCS, the BDD approach involves no approximation in the quantification of the model and is able to handle correctly negative logic. However, when models are sufficiently large and complex, as for example the ones coming from the PSA studies of the nuclear industry, it begins to be unfeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of the BDD technology to the assessment of such models in practice. This paper proposes a reduction process based on using information provided by the set of the most relevant minimal cutsets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. This reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of the PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.

  7. Normalized Tritium Quantification Approach (NoTQA) a Method for Quantifying Tritium Contaminated Trash and Debris at LLNL

    International Nuclear Information System (INIS)

    Dominick, J.L.; Rasmussen, C.L.

    2008-01-01

    Several facilities and many projects at LLNL work exclusively with tritium. These operations have the potential to generate large quantities of Low-Level Radioactive Waste (LLW) with the same or similar radiological characteristics. A standardized, documented approach to characterizing these waste materials for disposal as radioactive waste will enhance the ability of the Laboratory to manage them in an efficient and timely manner while ensuring compliance with all applicable regulatory requirements. This standardized characterization approach couples documented process knowledge with analytical verification and is very conservative, overestimating the radioactivity concentration of the waste. The characterization approach documented here is the Normalized Tritium Quantification Approach (NoTQA). This document will serve as a Technical Basis Document which can be referenced in radioactive waste characterization documentation packages such as the Information Gathering Document. In general, radiological characterization of waste consists of both developing an isotopic breakdown (distribution) of radionuclides contaminating the waste and using an appropriate method to quantify the radionuclides in the waste. Characterization approaches require varying degrees of rigor depending upon the radionuclides contaminating the waste and the concentration of the radionuclide contaminants as related to regulatory thresholds. Generally, as activity levels in the waste approach a regulatory or disposal facility threshold, the degree of required precision and accuracy, and therefore the level of rigor, increases. In the case of tritium, thresholds of concern for control, contamination, transportation, and waste acceptance are relatively high. Due to the benign nature of tritium and the resulting higher regulatory thresholds, this less rigorous yet conservative characterization approach is appropriate. The scope of this document is to define an appropriate and acceptable …

  8. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran Babar

    Full Text Available Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure, in contrast to the other methods. The proposed framework addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgment error.

  9. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    International Nuclear Information System (INIS)

    Ahmadkhaniha, Reza; Shafiee, Abbas; Rastkari, Noushin; Kobarfard, Farzad

    2009-01-01

    Determination of endogenous steroids in complex matrices such as cattle's meat is a challenging task. Since endogenous steroids always exist in animal tissues, no analyte-free matrices for constructing the standard calibration line will be available, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy was developed in this study, which is named the 'surrogate analyte approach' and is based on using isotope-labeled standards instead of the natural form of endogenous steroids for preparing the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. The accuracy of this method is better than that of other methods at low concentrations and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of compound residue analysis.
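
    The surrogate analyte idea can be sketched in a few lines: the calibration line is built from the isotope-labelled surrogate spiked into real matrix, and the natural analyte is read off that line after correcting by a response factor measured in neat solution. All numbers below are invented for illustration:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Calibration from the isotope-labelled surrogate spiked into real matrix.
conc = [0.5, 1.0, 2.0, 4.0]   # ng/g, surrogate standard (illustrative)
area = [0.9, 2.1, 4.0, 8.1]   # detector response (illustrative)
slope, intercept = fit_line(conc, area)

RF = 1.05          # response factor natural/labelled, measured in neat solution
sample_area = 3.2  # response of the endogenous (natural) analyte in a sample
endogenous = (sample_area / RF - intercept) / slope
print(round(endogenous, 2))  # -> 1.52 (ng/g)
```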

  10. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest; it tries to capture, via the adjoint solution, the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and when nonlinear behavior must be considered. To allow this demonstration, exaggerated variations will be employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  11. A Novel Text Clustering Approach Using Deep-Learning Vocabulary Network

    Directory of Open Access Journals (Sweden)

    Junkai Yi

    2017-01-01

    Full Text Available Text clustering is an effective approach to collect and organize text documents into meaningful groups for mining valuable information on the Internet. However, there exist some issues to tackle, such as feature extraction and data dimension reduction. To overcome these problems, we present a novel approach named the deep-learning vocabulary network. The vocabulary network is constructed based on a related-word set, which contains the "co-occurrence" relations of words or terms. We replace term frequency in feature vectors with the "importance" of words in terms of the vocabulary network and PageRank, which can generate more precise feature vectors to represent the meaning of the text. Furthermore, a sparse-group deep belief network is proposed to reduce the dimensionality of feature vectors, and we introduce coverage rate as a similarity measure in Single-Pass clustering. To verify the effectiveness of our work, we compare the approach to representative algorithms, and experimental results show that feature vectors based on the deep-learning vocabulary network have better clustering performance.
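
    The term-weighting step can be sketched on a toy co-occurrence graph: compute PageRank over the vocabulary network and use those scores, instead of term frequency, as feature weights. The graph, damping factor, and terms below are illustrative, not the paper's data:

```python
def pagerank(graph, d=0.85, iters=50):
    """Power iteration on a co-occurrence graph {node: set of neighbours}."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            # Mass flowing into n from every node that links to it.
            incoming = sum(rank[m] / len(graph[m]) for m in graph if n in graph[m])
            new[n] = (1 - d) / len(nodes) + d * incoming
        rank = new
    return rank

cooc = {  # toy "vocabulary network" built from word co-occurrence
    "neural": {"network", "deep"},
    "network": {"neural", "deep", "cluster"},
    "deep": {"neural", "network"},
    "cluster": {"network"},
}
rank = pagerank(cooc)

# Replace term frequency by PageRank "importance" in the feature vector.
doc_terms = ["network", "cluster", "network"]
features = {t: rank[t] for t in set(doc_terms)}
print(max(rank, key=rank.get))  # -> network  (the hub of the toy graph)
```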

  12. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that the scope remains constant in all derived alternations. The paper concerns the basic issues of these complex problems, while a detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  13. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  14. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    Science.gov (United States)

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has proven to be a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of Isotope Dilution (ID) for quantification by LA-ICP-MS has also been described, being mainly useful for bulk analysis but not feasible for spatial measurements so far. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs, using a commercial ink-jet device, in order to perform elemental quantification in different areas of bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by depositing droplets of a natural Pt standard with a known amount of Pt onto the surface of a control tissue, where even 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
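
    The underlying isotope-dilution arithmetic can be sketched as follows. The measured isotope ratio of the spiked sample is a mass balance between natural and enriched Pt, which can be inverted for the unknown natural amount. All abundances and amounts below are assumed illustrative values, not data from the study:

```python
def idms_amount(n_spike, a_spike, b_spike, a_nat, b_nat, r_meas):
    """Solve the isotope-dilution mass balance for the natural amount n_x:
    r_meas = (n_x*a_nat + n_spike*a_spike) / (n_x*b_nat + n_spike*b_spike),
    where a = abundance of the enriched isotope, b = reference isotope."""
    return n_spike * (a_spike - r_meas * b_spike) / (r_meas * b_nat - a_nat)

# Hypothetical numbers: a 194Pt-enriched spike measured against natural Pt,
# with 194Pt as the enriched isotope and 195Pt as the reference isotope.
a_nat, b_nat = 0.329, 0.338    # approximate natural 194Pt / 195Pt abundances
a_sp, b_sp = 0.95, 0.03        # assumed spike abundances
n_spike, n_true = 10.0, 20.0   # pmol; n_true only simulates the measurement
r = (n_true * a_nat + n_spike * a_sp) / (n_true * b_nat + n_spike * b_sp)
print(round(idms_amount(n_spike, a_sp, b_sp, a_nat, b_nat, r), 6))  # -> 20.0
```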

  15. A Novel Approach for Arabic Text Steganography Based on the “BloodGroup” Text Hiding Method

    Directory of Open Access Journals (Sweden)

    S. Malalla,

    2017-04-01

    Full Text Available Steganography is the science of hiding certain messages (data) in groups of irrelevant data, possibly of another form. The purpose of steganography is covert communication: hiding the existence of a message from an intermediary. Text steganography is the process of embedding a secret message (text) in another text (the cover text) so that the existence of the secret message cannot be detected by a third party. This paper presents a novel approach for text steganography using the Blood Group (BG) method, based on the behavior of blood groups. Experimentally, the proposed method achieved good results in capacity, hiding capacity, time complexity, robustness, visibility, and similarity, which shows its superiority compared to several existing methods.

  16. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

    Full Text Available Background: Quantification of different types of cells is often needed for the analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach for hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given in the form of a series of images. It is the main part of an automatic hepatocyte quantification tool that allows for the computation of the ratio between the number of proliferating HC nuclei and the total number of all HC nuclei for a series of images in one processing run. The processing pipeline allows us to obtain desired and valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask, which is considered to be the ground truth, we achieve results with sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and a low false positive fraction and can be applied to process entire stained sections.

  17. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
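
    The focusing idea can be mimicked in a few lines: sample only inside a box tightened by interval reasoning, then rescale the hit rate by the volume ratio to the full domain. The constraint, boxes, and sample count below are illustrative, not the tool's actual machinery:

```python
import random
from math import prod

def estimate_fraction(constraint, pruned, full, n=200_000, seed=7):
    """Sample uniformly inside the interval-pruned box, then rescale the hit
    rate by the volume ratio pruned/full (a focused-sampling sketch)."""
    rng = random.Random(seed)
    hits = sum(
        constraint([rng.uniform(lo, hi) for lo, hi in pruned])
        for _ in range(n)
    )
    volume = lambda box: prod(hi - lo for lo, hi in box)
    return (hits / n) * volume(pruned) / volume(full)

# Target: x^2 + y^2 <= 1 over [-2, 2]^2. Interval propagation shrinks the
# search box to [-1, 1]^2, since each squared term can contribute at most 1.
inside = lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0
est = estimate_fraction(inside, [(-1, 1), (-1, 1)], [(-2, 2), (-2, 2)])
# True fraction of the full domain is pi/16 ~= 0.1963
```

Sampling in the pruned box spends every sample where solutions can exist, so the estimate converges with a quarter of the variance that naive sampling over the full box would incur here.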

  18. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    …liquid chromatography-electrospray-mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, a LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application of different amounts of iohexol (15 mg to 150 μg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase…

  19. Intertextual Content Analysis: An Approach for Analysing Text-Related Discussions with Regard to Movability in Reading and How Text Content Is Handled

    Science.gov (United States)

    Hallesson, Yvonne; Visén, Pia

    2018-01-01

    Reading and discussing texts as a means for learning subject content are regular features within educational contexts. This paper presents an approach for intertextual content analysis (ICA) of such text-related discussions revealing what the participants make of the text. Thus, in contrast to many other approaches for analysing conversation that…

  20. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. It has been critically assessed, and requirements for its performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  1. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader, Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
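
    The arithmetic behind qPCR-based library quantification is a standard curve: Cq is linear in log10(concentration), the slope gives the amplification efficiency, and an unknown library is read off the fitted line. The standards and Cq values below are invented for illustration, not part of the cited protocol:

```python
from math import log10

def ols(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    s = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return s, my - s * mx

# Illustrative dilution series of a quantification standard.
std_conc_pm = [10.0, 1.0, 0.1, 0.01]   # pM
std_cq = [12.0, 15.3, 18.6, 21.9]      # measured Cq (invented)
slope, intercept = ols([log10(c) for c in std_conc_pm], std_cq)

# Slope of -3.32 corresponds to 100% efficiency (doubling per cycle).
efficiency = 10 ** (-1 / slope) - 1

unknown_cq = 14.0
library_pm = 10 ** ((unknown_cq - intercept) / slope)
print(round(library_pm, 2))  # -> 2.48 (pM)
```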

  2. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadkhaniha, Reza; Shafiee, Abbas [Department of Medicinal Chemistry, Faculty of Pharmacy and Pharmaceutical Sciences Research Center, Tehran University of Medical Sciences, Tehran 14174 (Iran, Islamic Republic of); Rastkari, Noushin [Center for Environmental Research, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kobarfard, Farzad [Department of Medicinal Chemistry, School of Pharmacy, Shaheed Beheshti University of Medical Sciences, Tavaneer Ave., Valieasr St., Tehran (Iran, Islamic Republic of)], E-mail: farzadkf@yahoo.com

    2009-01-05

    Determination of endogenous steroids in complex matrices such as cattle's meat is a challenging task. Since endogenous steroids always exist in animal tissues, no analyte-free matrices for constructing the standard calibration line will be available, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy was developed in this study, which is named the 'surrogate analyte approach' and is based on using isotope-labeled standards instead of the natural form of endogenous steroids for preparing the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. The accuracy of this method is better than that of other methods at low concentrations and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of compound residue analysis.

  3. A statistical approach to quantification of genetically modified organisms (GMO) using frequency distributions.

    Science.gov (United States)

    Gerdes, Lars; Busch, Ulrich; Pecoraro, Sven

    2014-12-14

    According to Regulation (EU) No 619/2011, trace amounts of non-authorised genetically modified organisms (GMO) in feed are tolerated within the EU if certain prerequisites are met. Tolerable traces must not exceed the so-called 'minimum required performance limit' (MRPL), which was defined according to the mentioned regulation to correspond to a 0.1% mass fraction per ingredient. Therefore, not yet authorised GMO (and some GMO whose approvals have expired) have to be quantified at very low levels following qualitative detection in genomic DNA extracted from feed samples. As the results of quantitative analysis can imply severe legal and financial consequences for producers or distributors of feed, the quantification results need to be utterly reliable. We developed a statistical approach to investigate the experimental measurement variability within one 96-well PCR plate. This approach visualises the frequency distribution of the zygosity-corrected relative content of genetically modified material resulting from different combinations of transgene and reference gene Cq values. One application is the simulation of the consequences of varying parameters on measurement results. Parameters could be, for example, replicate numbers or baseline and threshold settings; measurement results could be, for example, the median (class) and relative standard deviation (RSD). All calculations can be done using the built-in functions of Excel without any need for programming. The developed Excel spreadsheets are available (see section 'Availability of supporting data' for details). In most cases, the combination of four PCR replicates for each of the two DNA isolations already resulted in a relative standard deviation of 15% or less. The aims of the study are scientifically based suggestions for minimisation of measurement uncertainty, especially in, but not limited to, the field of GMO quantification at low concentration levels.
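The Cq-combination idea described above can be sketched in a few lines of Python (a minimal simulation with invented Cq means, standard deviations, zygosity factor and PCR efficiency; the study itself implements this with Excel built-in functions):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-replicate Cq values: 4 PCR replicates for each of
# 2 DNA isolations (8 values per target), as in the study's setup.
cq_transgene = rng.normal(31.0, 0.15, size=8)   # assumed mean/SD
cq_reference = rng.normal(24.0, 0.10, size=8)
zygosity_factor = 1.0   # assumed zygosity correction for the GM event
efficiency = 2.0        # assumed perfect doubling per PCR cycle

# All transgene/reference Cq combinations (8 x 8 = 64 delta-Cq values)
dcq = cq_transgene[:, None] - cq_reference[None, :]
gm_content = zygosity_factor * efficiency ** (-dcq) * 100  # % GM material

median = np.median(gm_content)
rsd = 100 * gm_content.std(ddof=1) / gm_content.mean()
print(f"median GM content: {median:.3f} %  RSD: {rsd:.1f} %")
```

Varying the replicate numbers or the assumed Cq scatter in this sketch reproduces the kind of parameter study the spreadsheets perform.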

  4. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  5. Direct infusion-SIM as fast and robust method for absolute protein quantification in complex samples

    Directory of Open Access Journals (Sweden)

    Christina Looße

    2015-06-01

    Full Text Available Relative and absolute quantification of proteins in biological and clinical samples are common approaches in proteomics. Until now, targeted protein quantification has mainly been performed using a combination of HPLC-based peptide separation and selected reaction monitoring on triple quadrupole mass spectrometers. Here, we show for the first time the potential of absolute quantification using a direct infusion strategy combined with single ion monitoring (SIM) on a Q Exactive mass spectrometer. Using complex membrane fractions of Escherichia coli, we absolutely quantified the recombinantly expressed heterologous human cytochrome P450 monooxygenase 3A4 (CYP3A4), comparing direct infusion-SIM with conventional HPLC-SIM. Direct infusion-SIM deviated by only 14.7% (±4.1, s.e.m.) on average from HPLC-SIM, and reduced processing and analysis time to 4.5 min per sample (further reducible to 30 s), in contrast to 65 min for the LC-MS method. In summary, our simplified workflow using direct infusion-SIM provides a fast and robust method for quantification of proteins in complex protein mixtures.

  6. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    Science.gov (United States)

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  7. Methane emission quantification from landfills using a double tracer approach

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Samuelsson, J.; Fredenslund, Anders Michael

    2007-01-01

    A tracer method was successfully used for quantification of the total methane (CH4) emission from Fakse landfill. By using two different tracers, the emission from different sections of the landfill could be quantified. Furthermore, it was possible to determine the emissions from local on site...

  8. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    Science.gov (United States)

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. This network connectivity exposes a smart grid system to potential security threats. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme that uses a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results comparing the proposed approach with the existing common vulnerability scoring system clearly show that network connectivity must be considered for better-optimized vulnerability quantification.
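The general idea of aggregating per-hop scores along an attack route can be sketched as follows (the weights, score ranges and aggregation rule below are illustrative assumptions, not the AVQS formulas from the paper):

```python
# Hypothetical route-based vulnerability aggregation: each hop on the
# attack route contributes a network vulnerability score (0-10), and
# the route as a whole has an end-to-end security score (0-10, higher
# means more secure). Weights w_net/w_e2e are invented for this sketch.

def route_score(hop_scores, e2e_security, w_net=0.7, w_e2e=0.3):
    """Combine mean per-hop vulnerability with end-to-end insecurity."""
    nvs = sum(hop_scores) / len(hop_scores)      # mean hop vulnerability
    return w_net * nvs + w_e2e * (10 - e2e_security)

# Example: a 3-hop attack route in an advanced metering infrastructure
print(round(route_score([7.8, 5.4, 6.3], e2e_security=4.0), 2))
```

Routes can then be ranked by this score to prioritize the most vulnerable paths.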

  9. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role for the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. By considering the fuzziness and uncertainty of trust, in this paper, we propose a fuzzy comprehensive evaluation method to quantify trust along with a trust quantification algorithm. Simulation results show that the trust quantification algorithm that we propose can effectively quantify trust and the quantified value of an entity's trust is consistent with the behavior of the entity.
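The core of a fuzzy comprehensive evaluation can be sketched in a few lines (the factor weights, grade set and membership degrees below are invented for illustration; the paper's actual factor set and algorithm may differ):

```python
import numpy as np

# Weights of 3 trust factors (e.g. past behavior, recommendations,
# context) and the membership of each factor in the grades
# (high, medium, low). All numbers are assumptions for this sketch.
weights = np.array([0.5, 0.3, 0.2])
membership = np.array([
    [0.6, 0.3, 0.1],
    [0.4, 0.4, 0.2],
    [0.7, 0.2, 0.1],
])

# Weighted-average composition M(·,+): B = W · R
grade_vector = weights @ membership
grade_vector /= grade_vector.sum()                       # normalise
trust_value = grade_vector @ np.array([0.9, 0.5, 0.1])   # defuzzify
print(grade_vector.round(3), round(trust_value, 3))
```

The defuzzified `trust_value` is a single number in [0, 1] that can be compared against the entity's observed behavior, as in the paper's simulations.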

  10. Building a glaucoma interaction network using a text mining approach.

    Science.gov (United States)

    Soliman, Maha; Nasraoui, Olfa; Cooper, Nigel G F

    2016-01-01

    The volume of biomedical literature and its underlying knowledge base is expanding rapidly, far beyond what any single person can read. Several automated methods have been developed to help address this problem. The present study reports the results of a text mining approach that extracts gene interactions from the warehouse of published experimental results, which are then used to build and benchmark an interaction network associated with glaucoma. To the best of our knowledge, there is as yet no glaucoma interaction network derived solely from text mining approaches. Such a network could provide a useful summative knowledge base to complement other forms of clinical information related to this disease. A glaucoma corpus was constructed from PubMed Central and a text mining approach was applied to extract genes and their relations from this corpus. The extracted relations between genes were checked against reference interaction databases and classified generally as known or new relations. The extracted genes and relations were then used to construct a glaucoma interaction network. Analysis of the resulting network indicated that it bears the characteristics of a small-world interaction network. Our analysis showed the presence of seven glaucoma-linked genes that defined the network modularity. A web-based system for browsing and visualizing the extracted glaucoma-related interaction networks is available at http://neurogene.spd.louisville.edu/GlaucomaINViewer/Form1.aspx. This study has reported the first version of a glaucoma interaction network using a text mining approach. The power of such an approach is its ability to cover a wide range of glaucoma-related studies published over many years, so that a bigger picture of the disease can be established. To the best of our knowledge, this is the first glaucoma interaction network to summarize the known literature.

  11. A Short Review of FDTD-Based Methods for Uncertainty Quantification in Computational Electromagnetics

    Directory of Open Access Journals (Sweden)

    Theodoros T. Zygiridis

    2017-01-01

    Full Text Available We provide a review of selected computational methodologies that are based on the deterministic finite-difference time-domain algorithm and are suitable for the investigation of electromagnetic problems involving uncertainties. As will become apparent, several alternatives exist that are capable of performing uncertainty quantification in a variety of cases, each exhibiting different qualities and ranges of applicability, which we intend to point out here. Given the numerous available approaches, the purpose of this paper is to clarify the main strengths and weaknesses of the described methodologies and to help readers select the most suitable approach for the problem under consideration.

  12. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae at levels of 2.5 fg of fungal DNA per ng of lettuce leaf DNA at 21 days after plant inoculation.
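The standard-curve arithmetic behind this kind of qPCR assay can be sketched as follows (all Cq values, standard amounts and the normalisation figure are invented for illustration; the chapter's actual primer sets and protocol should be consulted for real assays):

```python
import numpy as np

# Fit a standard curve (Cq vs log10 DNA amount) for the fungal target,
# interpolate an unknown sample, and normalise to total plant DNA.
std_amount_fg = np.array([1e5, 1e4, 1e3, 1e2, 1e1])   # standards (fg)
std_cq = np.array([18.1, 21.4, 24.8, 28.2, 31.5])     # assumed Cq values

slope, intercept = np.polyfit(np.log10(std_amount_fg), std_cq, 1)
efficiency = 10 ** (-1 / slope) - 1   # PCR efficiency from the slope

sample_cq = 26.5                      # fungal target Cq in infected leaf
fungal_fg = 10 ** ((sample_cq - intercept) / slope)
leaf_dna_ng = 40.0                    # total leaf DNA in the reaction
print(f"eff={efficiency:.2f}, {fungal_fg / leaf_dna_ng:.1f} fg/ng leaf DNA")
```

A slope near -3.3 corresponds to roughly 100% amplification efficiency, which is the usual acceptance check before using the curve.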

  13. Genre based Approach to Teach Writing Descriptive Text

    Directory of Open Access Journals (Sweden)

    Putu Ngurah Rusmawan

    2017-10-01

    Full Text Available This study aims to discuss how teaching and learning activities were carried out using a Genre-based Approach to teaching the writing of descriptive text at a junior high school. The study was conducted in classroom VII-1, so the appropriate design was a qualitative research design. The subject of the study was the English teacher. To collect data, the researcher used observations and interviews. The findings describe how the teaching and learning activities carried out by the teacher fulfilled the basic competencies. The teacher carried out the opening activities by greeting, asking about the students' readiness for the lesson, checking the students' attendance, and stating the learning objective. The teacher carried out the main activities by explaining how to write a descriptive text, giving and asking for opinions, eliciting the students' understanding, and prompting and directing them to do exercises. The teacher carried out the closing activities by directing the students to continue at home and eliciting the students' reflections on what they had learned that day.

  14. Techniques for quantification of liver fat in risk stratification of diabetics

    International Nuclear Information System (INIS)

    Kuehn, J.P.; Spoerl, M.C.; Mahlke, C.; Hegenscheid, K.

    2015-01-01

    Fatty liver disease plays an important role in the development of type 2 diabetes. Accurate techniques for detection and quantification of liver fat are essential for clinical diagnostics. Chemical shift-encoded magnetic resonance imaging (MRI) is a simple approach to quantify liver fat content. Liver fat quantification using chemical shift-encoded MRI is influenced by several bias factors, such as T2* decay, T1 recovery and the multispectral complexity of fat. The confounder-corrected proton density fat fraction is a simple approach to quantify liver fat with comparable results independent of the software and hardware used, and it is an accurate biomarker for the assessment of liver fat. Accurate and reproducible quantification of liver fat using chemical shift-encoded MRI requires calculation of the proton density fat fraction. (orig.)

  15. Network-Based Isoform Quantification with RNA-Seq Data for Cancer Transcriptome Analysis.

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2015-12-01

    Full Text Available High-throughput mRNA sequencing (RNA-Seq) is widely used for transcript quantification of gene isoforms. Since RNA-Seq data alone are often not sufficient to accurately identify the read origins among the isoforms for quantification, we propose to explore protein domain-domain interactions as prior knowledge for integrative analysis with RNA-Seq data. We introduce a Network-based method for RNA-Seq-based Transcript Quantification (Net-RSTQ) to integrate the protein domain-domain interaction network with short read alignments for transcript abundance estimation. Based on our observation that the abundances of isoforms that neighbor each other through domain-domain interactions in the network are positively correlated, Net-RSTQ models the expression of the neighboring transcripts as Dirichlet priors on the likelihood of the observed read alignments against the transcripts in one gene. The transcript abundances of all the genes are then jointly estimated with alternating optimization of multiple EM problems. In simulations, Net-RSTQ effectively improved isoform transcript quantification when isoform co-expression correlates with the interactions. qRT-PCR results on 25 multi-isoform genes in a stem cell line, an ovarian cancer cell line, and a breast cancer cell line also showed that Net-RSTQ estimated isoform proportions more consistent with the RNA-Seq data. In experiments on the RNA-Seq data in The Cancer Genome Atlas (TCGA), the transcript abundances estimated by Net-RSTQ were more informative for patient sample classification of ovarian, breast and lung cancer. All experimental results collectively support that Net-RSTQ is a promising approach for isoform quantification. The Net-RSTQ toolbox is available at http://compbio.cs.umn.edu/Net-RSTQ/.

  16. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  17. Modelling text as process a dynamic approach to EFL classroom discourse

    CERN Document Server

    Yang, Xueyan

    2010-01-01

    A discourse analysis that is not based on grammar is likely to end up as a running commentary on a text, whereas a grammar-based one tends to treat text as a finished product rather than an on-going process. This book offers an approach to discourse analysis that is both grammar-based and oriented towards text as process. It proposes a model called TEXT TYPE within the framework of Hallidayan systemic-functional linguistics, which views grammatical choices in a text not as elements that combine to form a clause structure, but as semantic features that link successive clauses into an unfolding

  18. GHG emission quantification for pavement construction projects using a process-based approach

    Directory of Open Access Journals (Sweden)

    Charinee Limsawasd

    2017-03-01

    Full Text Available Climate change and greenhouse gas (GHG) emissions have attracted much attention for their impacts on the global environment. New legislation and regulations to control GHG emissions from industrial sectors have been introduced to address this problem. The transportation sector, which includes road pavement operation and pavement construction equipment, is among the highest GHG-emitting sectors. This study presents a novel quantification model of GHG emissions of pavement construction using process-based analysis. The model is composed of five modules that evaluate GHG emissions: (1) material production and acquisition, (2) material transport to a project site, (3) heavy equipment use, (4) on-site machinery use, and (5) on-site electricity use. The model was applied to a hypothetical pavement project to compare the environmental impacts of flexible and rigid pavement types during construction. The resulting model can be used to evaluate environmental impacts, as well as to design and plan highway pavement construction.
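The process-based structure, summing per-module emission estimates into a project total, can be sketched as (all module values are invented placeholders, not results from the study):

```python
# Process-based sketch: construction-phase GHG total as the sum of the
# five module estimates. Activity data and emission factors behind
# these numbers are assumptions for illustration only.
modules = {
    "material_production_and_acquisition": 120.0,  # t CO2e
    "material_transport_to_site": 15.0,
    "heavy_equipment_use": 30.0,
    "on_site_machinery_use": 8.0,
    "on_site_electricity_use": 5.0,
}
total = sum(modules.values())
print(f"total construction-phase emissions: {total} t CO2e")
```

Comparing two such module dictionaries (e.g. flexible vs rigid pavement) reproduces the kind of design comparison the model supports.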

  19. A data fusion approach for progressive damage quantification in reinforced concrete masonry walls

    International Nuclear Information System (INIS)

    Vanniamparambil, Prashanth Abraham; Carmi, Rami; Kontsos, Antonios; Bolhassani, Mohammad; Khan, Fuad; Bartoli, Ivan; Moon, Franklin L; Hamid, Ahmad

    2014-01-01

    This paper presents a data fusion approach based on digital image correlation (DIC) and acoustic emission (AE) to detect, monitor and quantify progressive damage development in reinforced concrete masonry walls (CMW) with varying types of reinforcement. CMW were tested to evaluate their structural behavior under cyclic loading. The combination of DIC with AE provided a framework for cross-correlating full-field strain maps on the surface of the CMW with volume-inspecting acoustic activity. AE allowed in situ monitoring of damage progression, which was correlated with the DIC results through quantification of strain concentrations and by tracking visually verified crack evolution. The presented results further demonstrate the relationships between the onset and development of cracking and the changes in energy dissipation at each loading cycle, measured principal strains and computed AE energy, providing a promising paradigm for structural health monitoring applications on full-scale concrete masonry buildings. (paper)

  20. Can the CFO Trust the FX Exposure Quantification from a Stock Market Approach?

    DEFF Research Database (Denmark)

    Aabo, Tom; Brodin, Danielle

    This study examines the sensitivity of detected exchange rate exposures at the firm specific level to changes in methodological choices using a traditional two factor stock market approach for exposure quantification. We primarily focus on two methodological choices: the choice of market index...... and the choice of observation frequency. We investigate to which extent the detected exchange rate exposures for a given firm can be confirmed when the choice of market index and/or the choice of observation frequency are changed. Applying our sensitivity analysis to Scandinavian non-financial firms, we...... thirds of the number of detected exposures using weekly data and 2) there is no economic rationale that the detected exposures at the firm-specific level should change when going from the use of weekly data to the use of monthly data. In relation to a change in the choice of market index, we find...
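The two-factor stock market approach underlying this kind of exposure detection can be sketched as an OLS regression of firm returns on a market index return and an exchange-rate return (the data below are simulated with assumed "true" sensitivities; the study's actual firms and indices differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 104                                   # two years of weekly returns
r_market = rng.normal(0.002, 0.02, n)     # market index return
r_fx = rng.normal(0.0, 0.01, n)           # exchange-rate change
beta_m, beta_fx = 0.9, -0.6               # assumed true sensitivities
r_firm = beta_m * r_market + beta_fx * r_fx + rng.normal(0, 0.01, n)

# Two-factor regression: r_firm = a + b_m * r_market + b_fx * r_fx
X = np.column_stack([np.ones(n), r_market, r_fx])
coef, *_ = np.linalg.lstsq(X, r_firm, rcond=None)
print(f"market beta={coef[1]:.2f}, FX exposure={coef[2]:.2f}")
```

Re-running the same regression with a different index series or with monthly instead of weekly returns is exactly the sensitivity experiment the study performs.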

  1. Simultaneous quantification of protein phosphorylation sites using liquid chromatography-tandem mass spectrometry-based targeted proteomics: a linear algebra approach for isobaric phosphopeptides.

    Science.gov (United States)

    Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun

    2014-12-05

    As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
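The linear-algebra step can be sketched directly: each transition's signal in the mixture is modelled as the sum of the individual phosphopeptides' calibration responses, giving a solvable linear system (slopes, intercepts and observed signals below are invented; the study calibrates each transition experimentally):

```python
import numpy as np

# Two isobaric phosphopeptides (e.g. the pSer78 and pSer82 forms of an
# HSP27 peptide) share transitions, so for transition i:
#   signal_i = sum_j slope[i, j] * conc[j] + intercept[i]
slopes = np.array([[2.1, 0.4],    # transition 1 response per peptide
                   [0.6, 1.8]])   # transition 2 response per peptide
intercepts = np.array([0.05, 0.02])
observed = np.array([9.65, 7.82])  # measured signals in the mixture

conc = np.linalg.solve(slopes, observed - intercepts)
print(conc)  # concentrations of the two isobaric phosphopeptides
```

With more phosphorylation sites the system simply grows, which is why the approach scales to peptides with ≥2 sites as long as enough transitions are measured.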

  2. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT..., trustees must quantify the degree, and spatial and temporal extent of such injuries relative to baseline. (b) Quantification approaches. Trustees may quantify injuries in terms of: (1) The degree, and...

  4. Current position of high-resolution MS for drug quantification in clinical & forensic toxicology.

    Science.gov (United States)

    Meyer, Markus R; Helfer, Andreas G; Maurer, Hans H

    2014-08-01

    This paper reviews high-resolution MS approaches published from January 2011 until March 2014 for the quantification of drugs (of abuse) and/or their metabolites in biosamples using LC-MS with time-of-flight or Orbitrap™ mass analyzers. Corresponding approaches are discussed including sample preparation and mass spectral settings. The advantages and limitations of high-resolution MS for drug quantification, as well as the demand for a certain resolution or a specific mass accuracy are also explored.

  5. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  6. Opinion Mining in Latvian Text Using Semantic Polarity Analysis and Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Gatis Špats

    2016-07-01

    Full Text Available In this paper we demonstrate approaches for opinion mining in Latvian text. The authors have applied, combined and extended the results of several previous studies and public resources to perform opinion mining in Latvian text using two approaches, namely semantic polarity analysis and machine learning. One of the most significant constraints that makes opinion mining for written content classification in Latvian challenging is the limited amount of publicly available text corpora for classifier training. We have joined several sources and created a publicly available extended lexicon. Our results are comparable to or outperform current achievements in opinion mining in Latvian. Experiments show that lexicon-based methods provide more accurate opinion mining than the application of a Naive Bayes machine learning classifier on Latvian tweets. The methods used in this study could be further extended using human annotators, unsupervised machine learning and bootstrapping to create larger corpora of classified text.
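The lexicon-based semantic polarity idea can be sketched in a few lines (the tiny lexicon below is invented for illustration; the study builds a much larger Latvian lexicon from several public resources):

```python
# Minimal lexicon-based polarity scoring: sum the polarity weights of
# the tokens found in the lexicon and map the total to a class.
LEXICON = {"labs": 1, "lielisks": 2, "slikts": -1, "briesmigs": -2}

def polarity(tokens):
    score = sum(LEXICON.get(t.lower(), 0) for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("serviss bija lielisks".split()))   # → positive
```

A machine learning alternative would replace the fixed lexicon with a classifier trained on labelled tweets, which is the comparison the experiments report.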

  7. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Full Text Available Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolation forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience, often personal, of the researcher, and this scholarly endeavour is therefore not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourism industry.

  8. Pore REconstruction and Segmentation (PORES) method for improved porosity quantification of nanoporous materials

    Energy Technology Data Exchange (ETDEWEB)

    Van Eyndhoven, G., E-mail: geert.vaneyndhoven@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Kurttepeli, M. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van Oers, C.J.; Cool, P. [Laboratory of Adsorption and Catalysis, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, NL-1090 GB Amsterdam (Netherlands); Mathematical Institute, Universiteit Leiden, Niels Bohrweg 1, NL-2333 CA Leiden (Netherlands); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-01-15

    Electron tomography is currently a versatile tool to investigate the connection between the structure and properties of nanomaterials. However, a quantitative interpretation of electron tomography results is still far from straightforward. Especially accurate quantification of pore-space is hampered by artifacts introduced in all steps of the processing chain, i.e., acquisition, reconstruction, segmentation and quantification. Furthermore, most common approaches require subjective manual user input. In this paper, the PORES algorithm “POre REconstruction and Segmentation” is introduced; it is a tailor-made, integral approach, for the reconstruction, segmentation, and quantification of porous nanomaterials. The PORES processing chain starts by calculating a reconstruction with a nanoporous-specific reconstruction algorithm: the Simultaneous Update of Pore Pixels by iterative REconstruction and Simple Segmentation algorithm (SUPPRESS). It classifies the interior region to the pores during reconstruction, while reconstructing the remaining region by reducing the error with respect to the acquired electron microscopy data. The SUPPRESS reconstruction can be directly plugged into the remaining processing chain of the PORES algorithm, resulting in accurate individual pore quantification and full sample pore statistics. The proposed approach was extensively validated on both simulated and experimental data, indicating its ability to generate accurate statistics of nanoporous materials. - Highlights: • An electron tomography reconstruction/segmentation method for nanoporous materials. • The method exploits the porous nature of the scanned material. • Validated extensively on both simulation and real data experiments. • Results in increased image resolution and improved porosity quantification.

  9. The semiotics of typography in literary texts. A multimodal approach

    DEFF Research Database (Denmark)

    Nørgaard, Nina

    2009-01-01

    to multimodal discourse proposed, for instance, by Kress & Van Leeuwen (2001) and Baldry & Thibault (2006), and, more specifically, the multimodal approach to typography suggested by Van Leeuwen (2005b; 2006), in order to sketch out a methodological framework applicable to the description and analysis...... of the semiotic potential of typography in literary texts....

  10. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  11. Quantification of variability in trichome patterns

    Directory of Open Access Journals (Sweden)

    Bettina eGreese

    2014-11-01

    Full Text Available While pattern formation is studied in various areas of biology, little is known about the intrinsic noise leading to variations between individual realizations of the pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To deepen the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches to characterizing noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability.
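The kind of spatial-pattern quantification reviewed here can be illustrated with a minimal nearest-neighbour statistic. The sketch below is not from the paper; the point coordinates are invented. It contrasts a regular lattice with a clustered pattern via the coefficient of variation of nearest-neighbour distances, one common way to quantify variability in point patterns:

```python
import math
import statistics

def nn_distances(points):
    """Nearest-neighbour distance for each point (e.g. trichome position)."""
    out = []
    for i, p in enumerate(points):
        out.append(min(math.dist(p, q) for j, q in enumerate(points) if j != i))
    return out

def cv(xs):
    """Coefficient of variation: dispersion relative to the mean."""
    return statistics.stdev(xs) / statistics.mean(xs)

# A perfectly regular 3x3 lattice (spacing 1.0) versus an irregular, clustered pattern.
grid = [(x, y) for x in range(3) for y in range(3)]
clustered = [(0, 0), (0.1, 0), (0.4, 0.4), (5, 5), (5.2, 5), (6, 6)]

print(cv(nn_distances(grid)))       # 0.0 for a regular lattice
print(cv(nn_distances(clustered)))  # substantially larger for the clustered pattern
```

A regular pattern drives the coefficient of variation toward zero, while clustering inflates it, giving a single scalar summary of pattern variability.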

  12. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two-degree-of-freedom nonlinear spring-mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems.

  13. Comparison of methods for quantification of global DNA methylation in human cells and tissues.

    Directory of Open Access Journals (Sweden)

    Sofia Lisanti

    Full Text Available DNA methylation is a key epigenetic modification which, in mammals, occurs mainly at CpG dinucleotides. Most of the CpG methylation in the genome is found in repetitive regions, rich in dormant transposons and endogenous retroviruses. Global DNA hypomethylation, which is a common feature of several conditions such as ageing and cancer, can cause the undesirable activation of dormant repeat elements and lead to altered expression of associated genes. DNA hypomethylation can cause genomic instability and may contribute to mutations and chromosomal recombinations. Various approaches for quantification of global DNA methylation are widely used. Several of these approaches measure a surrogate for total genomic methyl cytosine, and there is uncertainty about the comparability of these methods. Here we have applied three different approaches (luminometric methylation assay, and pyrosequencing of the methylation status of the Alu and LINE1 repeat elements) for estimating global DNA methylation in the same human cell and tissue samples, and have compared these estimates with the "gold standard" of methyl cytosine quantification by HPLC. After HPLC, the LINE1 approach shows the smallest variation between samples, followed by Alu. Pearson correlations and Bland-Altman analyses confirmed that global DNA methylation estimates obtained via the LINE1 approach corresponded best with HPLC-based measurements. Although we did not find compelling evidence that the gold standard measurement by HPLC could be substituted with confidence by any of the surrogate assays for detecting global DNA methylation investigated here, the LINE1 assay seems likely to be an acceptable surrogate in many cases.
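The Bland-Altman comparison used in this study reduces to a few lines of arithmetic. The methylation values below are invented placeholders, not data from the paper; only the bias and 95% limits-of-agreement computation is the point:

```python
import statistics

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements from two methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical % methyl cytosine values: HPLC "gold standard" vs a surrogate assay.
hplc  = [4.2, 4.8, 3.9, 5.1, 4.5, 4.0]
line1 = [4.0, 4.9, 3.7, 5.0, 4.6, 3.8]

bias, (lo, hi) = bland_altman(line1, hplc)
print(round(bias, 3), round(lo, 3), round(hi, 3))
```

A bias near zero with narrow limits of agreement is what "corresponded best with HPLC-based measurements" means quantitatively.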

  14. Methodological Demonstration of a Text Analytics Approach to Country Logistics System Assessments

    DEFF Research Database (Denmark)

    Kinra, Aseem; Mukkamala, Raghava Rao; Vatrapu, Ravi

    2017-01-01

    The purpose of this study is to develop and demonstrate a semi-automated text analytics approach for the identification and categorization of information that can be used for country logistics assessments. In this paper, we develop the methodology on a set of documents for 21 countries using...... and the text analyst. Implications are discussed and future work is outlined....

  15. The Quantification Process for the PRiME-U34i

    International Nuclear Information System (INIS)

    Hwang, Mee-Jeong; Han, Sang-Hoon; Yang, Joon-Eon

    2006-01-01

    In this paper, we introduce the quantification process for PRiME-U34i, the merged model of ETs (Event Trees) and FTs (Fault Trees) for the level 1 internal PSA of UCN 3 and 4. PRiME-U34i has one top event, so its quantification process is simpler than the previous one. In the past, we used a text file called a user file to control the quantification process. However, this user file is so complicated that it is difficult for a non-expert to understand. Moreover, in the past PSA the ET and FT were separate, whereas in PRiME-U34i they are merged, so the quantification process is different. This paper is composed of five sections. In section 2, we introduce the construction of the one-top model. Section 3 shows the quantification process used in PRiME-U34i. Section 4 describes the post-processing. The last section presents the conclusions.

  16. The Application of Machine Learning Algorithms for Text Mining based on Sentiment Analysis Approach

    Directory of Open Access Journals (Sweden)

    Reza Samizade

    2018-06-01

    Full Text Available Classification of cyber texts and comments into positive and negative sentiment categories among social media users is of high importance in the research area of text mining. In this research, we applied supervised classification methods to classify Persian texts based on sentiment in cyberspace. The result of this research is a system that can decide whether a comment published in cyberspace, such as on social networks, is positive or negative. The comments published on Persian movie and movie review websites from 1392 to 1395 constitute the data set for this research. Part of these data was used for training and the rest for testing. Prior to implementing the algorithms, pre-processing steps such as tokenizing, removing stop words, and n-gram extraction were applied to the texts. Naïve Bayes, neural networks (NN), and support vector machines (SVM) were used for text classification in this study. Out-of-sample tests showed no evidence that the accuracy of the SVM approach is statistically higher than that of Naïve Bayes, nor that the accuracy of Naïve Bayes is statistically higher than that of the NN approach. However, the accuracy of classification using the SVM approach is statistically higher than that of the NN approach at the 5% significance level.
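As a hedged illustration of the classification step: the study trained SVM, Naïve Bayes, and neural network models on Persian text, while the sketch below uses invented English tokens and a minimal add-one-smoothed multinomial Naïve Bayes as a stand-in for the general approach:

```python
from collections import Counter

# Toy labelled corpus standing in for the movie-review data (hypothetical tokens).
train = [("great acting wonderful story", "pos"),
         ("terrible plot boring acting", "neg"),
         ("wonderful direction great film", "pos"),
         ("boring terrible waste", "neg")]

# Per-class token counts for multinomial Naive Bayes.
counts = {"pos": Counter(), "neg": Counter()}
for text, label in train:
    counts[label].update(text.split())
vocab = {w for c in counts.values() for w in c}

def classify(text):
    """Pick the class maximizing the add-one-smoothed likelihood (uniform priors)."""
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        p = 1.0
        for w in text.split():
            p *= (c[w] + 1) / (total + len(vocab))
        scores[label] = p
    return max(scores, key=scores.get)

print(classify("great story"))   # → pos
print(classify("boring plot"))   # → neg
```

In practice one would compare such a baseline against SVM and NN models on held-out data, exactly the out-of-sample comparison the abstract describes.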

  17. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  18. An impedance-based approach for detection and quantification of damage in cracked plates and loose bolts in bridge structures

    Science.gov (United States)

    Rabiei, Masoud; Sheldon, Jeremy; Palmer, Carl

    2012-04-01

    The applicability of the Electro-Mechanical Impedance (EMI) approach to damage detection, localization and quantification in a mobile bridge structure is investigated in this paper. The developments in this paper focus on assessing the health of Armored Vehicle Launched Bridges (AVLBs). Specifically, two key failure mechanisms of the AVLB to be monitored were identified: fatigue crack growth and damaged (loose) rivets (bolts). It was shown through experiment that bolt damage (defined here as different torque levels applied to bolts) can be detected, quantified and located using a network of lead zirconate titanate (PZT) transducers distributed on the structure. It was also shown that cracks of various sizes can be detected and quantified using the EMI approach. The experiments were performed on smaller laboratory specimens as well as full-size bridge-like components that were built as part of this research. The effects of various parameters such as transducer type and size on the performance of the proposed health assessment approach were also investigated.
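The abstract does not name its damage metric; a common choice in EMI monitoring is the root-mean-square deviation (RMSD) between a baseline impedance signature and a new measurement. The sketch below uses invented spectra and is only an illustration of that generic metric, not the authors' implementation:

```python
import math

def rmsd(baseline, measured):
    """RMSD damage metric (%) between two impedance signatures over a frequency sweep."""
    num = sum((m - b) ** 2 for b, m in zip(baseline, measured))
    den = sum(b ** 2 for b in baseline)
    return 100 * math.sqrt(num / den)

# Hypothetical real-part impedance values measured by a PZT transducer.
healthy    = [1.00, 1.10, 1.25, 1.40, 1.30]
loose_bolt = [1.05, 1.22, 1.40, 1.52, 1.45]

print(round(rmsd(healthy, loose_bolt), 2))
```

A rising RMSD relative to the healthy baseline flags damage; comparing metrics across a transducer network supports localization.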

  19. Building a protein name dictionary from full text: a machine learning term extraction approach

    Directory of Open Access Journals (Sweden)

    Campagne Fabien

    2005-04-01

    Full Text Available Background The majority of information in the biological literature resides in full text articles, instead of abstracts. Yet, abstracts remain the focus of many publicly available literature data mining tools. Most literature mining tools rely on pre-existing lexicons of biological names, often extracted from curated gene or protein databases. This is a limitation, because such databases have low coverage of the many name variants which are used to refer to biological entities in the literature. Results We present an approach to recognize named entities in full text. The approach collects high frequency terms in an article, and uses support vector machines (SVM) to identify biological entity names. It is also computationally efficient and robust to noise commonly found in full text material. We use the method to create a protein name dictionary from a set of 80,528 full text articles. Only 8.3% of the names in this dictionary match SwissProt description lines. We assess the quality of the dictionary by studying its protein name recognition performance in full text. Conclusion This dictionary term lookup method compares favourably to other published methods, supporting the significance of our direct extraction approach. The method is strong in recognizing name variants not found in SwissProt.
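The first step of the described pipeline, collecting high-frequency terms within a single article before SVM filtering, can be sketched as follows. The tokenizer, frequency threshold, and example sentence are illustrative assumptions, not the paper's implementation:

```python
import re
from collections import Counter

def high_frequency_terms(text, min_count=3):
    """Candidate entity names: tokens that recur often within one article."""
    tokens = re.findall(r"[A-Za-z][A-Za-z0-9-]+", text)
    counts = Counter(tokens)
    return [t for t, c in counts.most_common() if c >= min_count]

# A toy article fragment; repeated protein names surface as candidates.
article = ("BRCA1 interacts with BARD1. The BRCA1-BARD1 complex ... "
           "BRCA1 is phosphorylated ... BARD1 stabilises BRCA1 ...")
print(high_frequency_terms(article))  # → ['BRCA1']
```

An SVM would then score such candidates on additional features to separate genuine entity names from frequent ordinary words.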

  20. Visualized attribute analysis approach for characterization and quantification of rice taste flavor using electronic tongue

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Lin; Hu, Xianqiao [Rice Product Quality Supervision and Inspection Center, Ministry of Agriculture, China National Rice Research Institute, Hangzhou 310006 (China); Tian, Shiyi; Deng, Shaoping [College of Food Science and Biotechnology, Zhejiang Gongshang University, Hangzhou 310035 (China); Zhu, Zhiwei, E-mail: 615834652@qq.com [Rice Product Quality Supervision and Inspection Center, Ministry of Agriculture, China National Rice Research Institute, Hangzhou 310006 (China)

    2016-05-05

    This paper deals with a novel visualized attributive analysis approach for characterization and quantification of rice taste flavor attributes (softness, stickiness, sweetness and aroma) employing a multifrequency large-amplitude pulse voltammetric electronic tongue. Data preprocessing methods including Principal Component Analysis (PCA) and Fast Fourier Transform (FFT) were provided. An attribute characterization graph was represented for visualization of the interactive response in which each attribute responded by specific electrodes and frequencies. The model was trained using signal data from electronic tongue and attribute scores from artificial evaluation. The correlation coefficients for all attributes were over 0.9, resulting in good predictive ability of attributive analysis model preprocessed by FFT. This approach extracted more effective information about linear relationship between electronic tongue and taste flavor attribute. Results indicated that this approach can accurately quantify taste flavor attributes, and can be an efficient tool for data processing in a voltammetric electronic tongue system. - Graphical abstract: Schematic process for visualized attributive analysis approach using multifrequency large-amplitude pulse voltammetric electronic tongue for determination of rice taste flavor attribute. (a) sample; (b) sensors in electronic tongue; (c) excitation voltage program and response current signal from MLAPS; (d) similarity data matrix by data preprocessing and similarity extraction; (e) feature data matrix of attribute; (f) attribute characterization graph; (g) attribute scores predicted by the model. - Highlights: • Multifrequency large-amplitude pulse voltammetric electronic tongue was used. • A visualized attributive analysis approach was created as an efficient tool for data processing. • Rice taste flavor attribute was determined and predicted. • The attribute characterization graph was represented for visualization of the

  1. Task-based Language Teaching and Text Types in Teaching Writing Using Communicative Approach

    Directory of Open Access Journals (Sweden)

    Riyana Sari Ni Nyoman

    2018-01-01

    Full Text Available One of the most important language competencies in the teaching-learning process is writing. The present study investigated the effect of the communicative approach with task-based language teaching versus the communicative approach alone on students' writing competency at SMP N 2 Kediri, viewed from text types (i.e., descriptive, recount, and narrative). The design of the experimental study was posttest-only comparison groups, involving 60 students selected as the sample through cluster random sampling. The samples' post-tests were assessed using an analytical scoring rubric. The data were then analyzed using one-way ANOVA, and the post hoc test was done by computing multiple comparisons using the Tukey HSD test. The result showed a significant difference between the effects of the communicative approach with task-based language teaching and the communicative approach alone on the students' writing competency. These findings are expected to contribute to the teaching of English, particularly writing.
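The one-way ANOVA used here reduces to a short computation: the F statistic is the between-group mean square over the within-group mean square. The writing scores below are invented placeholders, not the study's data:

```python
import statistics

def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = statistics.mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical writing scores for the three text types.
descriptive = [78, 82, 75, 80]
recount     = [70, 72, 68, 74]
narrative   = [85, 88, 83, 86]
print(round(one_way_anova_F(descriptive, recount, narrative), 2))
```

A large F is then followed up, as in the study, with a post hoc test such as Tukey HSD to locate which pairs of groups differ.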

  2. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview over current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Word-level recognition of multifont Arabic text using a feature vector matching approach

    Science.gov (United States)

    Erlandson, Erik J.; Trenkle, John M.; Vogt, Robert C., III

    1996-03-01

    Many text recognition systems recognize text imagery at the character level and assemble words from the recognized characters. An alternative approach is to recognize text imagery at the word level, without analyzing individual characters. This approach avoids the problem of individual character segmentation, and can overcome local errors in character recognition. A word-level recognition system for machine-printed Arabic text has been implemented. Arabic is a script language, and is therefore difficult to segment at the character level. Character segmentation has been avoided by recognizing text imagery of complete words. The Arabic recognition system computes a vector of image-morphological features on a query word image. This vector is matched against a precomputed database of vectors from a lexicon of Arabic words. Vectors from the database with the highest match score are returned as hypotheses for the unknown image. Several feature vectors may be stored for each word in the database. Database feature vectors generated using multiple fonts and noise models allow the system to be tuned to its input stream. Used in conjunction with database pruning techniques, this Arabic recognition system has obtained promising word recognition rates on low-quality multifont text imagery.
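The described matching of a query feature vector against a precomputed lexicon database, with several stored vectors per word (one per font or noise model), might be sketched as follows. The feature values, vector length, and use of a cosine score are illustrative assumptions, not the system's actual features:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Hypothetical image-morphological feature vectors for a tiny lexicon;
# each word may store multiple vectors (multiple fonts / noise models).
lexicon = {
    "kitab": [[0.9, 0.1, 0.4], [0.8, 0.2, 0.5]],
    "qalam": [[0.1, 0.9, 0.3]],
}

def recognise(query):
    """Return lexicon words ranked by their best-matching stored vector."""
    scored = [(max(cosine(query, v) for v in vecs), word)
              for word, vecs in lexicon.items()]
    return [w for _, w in sorted(scored, reverse=True)]

print(recognise([0.85, 0.15, 0.45]))  # best hypothesis first
```

Returning a ranked list of hypotheses, rather than a single answer, is what lets downstream processing recover from local matching errors.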

  4. The use of self-quantification systems for personal health information: big data management activities and prospects.

    Science.gov (United States)

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance

  5. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jü rgen; Ravasi, Timothy

    2016-01-01

    novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types

  6. A multifractal approach to space-filling recovery for PET quantification

    Energy Technology Data Exchange (ETDEWEB)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O. [Comprehensive Cancer Imaging Centre, Imperial College London, Hammersmith Hospital, London W12 0NN (United Kingdom); Tsoumpas, Charalampos [Division of Medical Physics, University of Leeds, LS2 9JT (United Kingdom); Turkheimer, Federico E. [Department of Neuroimaging, Institute of Psychiatry, King’s College London, London SE5 8AF (United Kingdom)

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUVmean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic ¹⁸F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical ¹⁸F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUVmean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
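The fractal dimension underlying a space-filling index can be estimated by box counting: count occupied boxes N(s) at several box sizes s and take the slope of log N(s) against log(1/s). The sketch below is a generic 2-D box-counting estimator on invented points, not the paper's multifractal implementation; a filled square should yield a dimension near 2:

```python
import math

def box_count_dimension(points, sizes):
    """Estimate fractal dimension of a 2-D point set by box counting."""
    logs = []
    for s in sizes:
        # Set of grid boxes of side s occupied by at least one point.
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append((math.log(1 / s), math.log(len(boxes))))
    # Least-squares slope of log N(s) against log(1/s).
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return (sum((x - mx) * (y - my) for x, y in logs)
            / sum((x - mx) ** 2 for x, _ in logs))

# A densely sampled unit square is space-filling: dimension ≈ 2.
square = [(i / 100, j / 100) for i in range(100) for j in range(100)]
print(round(box_count_dimension(square, [1/4, 1/8, 1/16]), 2))  # → 2.0
```

A lesion whose uptake only partially fills its bounding volume yields a dimension below the embedding dimension, which is the kind of apparent space-filling index used to correct TLA estimates.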

  7. Estimation of immune cell densities in immune cell conglomerates: an approach for high-throughput quantification.

    Directory of Open Access Journals (Sweden)

    Niels Halama

    2009-11-01

    Full Text Available Determining the correct number of positive immune cells in immunohistological sections of colorectal cancer and other tumor entities is emerging as an important clinical predictor and therapy selector for an individual patient. This task is usually obstructed by cell conglomerates of various sizes. We here show that, at least in colorectal cancer, the inclusion of immune cell conglomerates is indispensable for estimating reliable patient cell counts. Integrating virtual microscopy and image processing principally allows the high-throughput evaluation of complete tissue slides. For such large-scale systems we demonstrate a robust quantitative image processing algorithm for the reproducible quantification of cell conglomerates on CD3 positive T cells in colorectal cancer. While isolated cells (28 to 80 µm²) are counted directly, the number of cells contained in a conglomerate is estimated by dividing the area of the conglomerate in thin tissue sections (≤6 µm) by the median area covered by an isolated T cell, which we determined as 58 µm². We applied our algorithm to large numbers of CD3 positive T cell conglomerates and compared the results to cell counts obtained manually by two independent observers. While, especially for high cell counts, the manual counting showed a deviation of up to 400 cells/mm² (41% variation), algorithm-determined T cell numbers generally lay in between the manually observed cell numbers but with perfect reproducibility. In summary, we recommend our approach as an objective and robust strategy for quantifying immune cell densities in immunohistological sections which can be directly implemented into automated full slide image processing systems.
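The counting rule stated in the abstract (count isolated cells of 28-80 µm² directly; divide a conglomerate's area by the 58 µm² median T-cell area) is simple to express. The segmented object areas below are invented, and treating sub-28 µm² objects as ignorable debris is an assumption:

```python
MEDIAN_T_CELL_AREA = 58.0  # µm², median area of an isolated T cell (from the paper)

def estimate_cell_count(areas_um2, single_min=28.0, single_max=80.0):
    """Isolated cells count as 1; conglomerate counts = area / median single-cell area."""
    count = 0.0
    for a in areas_um2:
        if single_min <= a <= single_max:
            count += 1                          # isolated cell
        elif a > single_max:
            count += a / MEDIAN_T_CELL_AREA     # conglomerate
        # objects below 28 µm² are treated here as debris and ignored (assumption)
    return count

# Hypothetical segmented object areas (µm²) from one image tile.
print(estimate_cell_count([45.0, 60.0, 580.0, 20.0]))  # → 12.0
```

The 580 µm² conglomerate contributes 10 estimated cells, which is exactly the kind of contribution manual counting handles inconsistently.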

  8. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    Science.gov (United States)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  9. Quantification of trace-level DNA by real-time whole genome amplification.

    Science.gov (United States)

    Kang, Min-Jung; Yu, Hannah; Kim, Sook-Kyung; Park, Sang-Ryoul; Yang, Inchul

    2011-01-01

    Quantification of trace amounts of DNA is a challenge in analytical applications where the concentration of a target DNA is very low or only limited amounts of samples are available for analysis. PCR-based methods including real-time PCR are highly sensitive and widely used for quantification of low-level DNA samples. However, ordinary PCR methods require at least one copy of a specific gene sequence for amplification and may not work for a sub-genomic amount of DNA. We suggest a real-time whole genome amplification method adopting the degenerate oligonucleotide primed PCR (DOP-PCR) for quantification of sub-genomic amounts of DNA. This approach enabled quantification of sub-picogram amounts of DNA independently of their sequences. When the method was applied to the human placental DNA of which amount was accurately determined by inductively coupled plasma-optical emission spectroscopy (ICP-OES), an accurate and stable quantification capability for DNA samples ranging from 80 fg to 8 ng was obtained. In blind tests of laboratory-prepared DNA samples, measurement accuracies of 7.4%, -2.1%, and -13.9% with analytical precisions around 15% were achieved for 400-pg, 4-pg, and 400-fg DNA samples, respectively. A similar quantification capability was also observed for other DNA species from calf, E. coli, and lambda phage. Therefore, when provided with an appropriate standard DNA, the suggested real-time DOP-PCR method can be used as a universal method for quantification of trace amounts of DNA.

  10. A visual approach to efficient analysis and quantification of ductile iron and reinforced sprayed concrete.

    Science.gov (United States)

    Fritz, Laura; Hadwiger, Markus; Geier, Georg; Pittino, Gerhard; Gröller, M Eduard

    2009-01-01

    This paper describes advanced volume visualization and quantification for applications in non-destructive testing (NDT), which results in novel and highly effective interactive workflows for NDT practitioners. We employ a visual approach to explore and quantify the features of interest, based on transfer functions in the parameter spaces of specific application scenarios. Examples are the orientations of fibres or the roundness of particles. The applicability and effectiveness of our approach is illustrated using two specific scenarios of high practical relevance. First, we discuss the analysis of Steel Fibre Reinforced Sprayed Concrete (SFRSpC). We investigate the orientations of the enclosed steel fibres and their distribution, depending on the concrete's application direction. This is a crucial step in assessing the material's behavior under mechanical stress, which is still in its infancy and therefore a hot topic in the building industry. The second application scenario is the designation of the microstructure of ductile cast irons with respect to the contained graphite. This corresponds to the requirements of the ISO standard 945-1, which deals with 2D metallographic samples. We illustrate how the necessary analysis steps can be carried out much more efficiently using our system for 3D volumes. Overall, we show that a visual approach with custom transfer functions in specific application domains offers significant benefits and has the potential of greatly improving and optimizing the workflows of domain scientists and engineers.

  11. Simultaneous quantification of 21 water soluble vitamin circulating forms in human plasma by liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Meisser Redeuil, Karine; Longet, Karin; Bénet, Sylvie; Munari, Caroline; Campos-Giménez, Esther

    2015-11-27

    This manuscript reports a validated analytical approach for the quantification of 21 water soluble vitamins and their main circulating forms in human plasma. Isotope dilution-based sample preparation consisted of protein precipitation using acidic methanol enriched with stable isotope labelled internal standards. Separation was achieved by reversed-phase liquid chromatography, and detection was performed by tandem mass spectrometry in positive electrospray ionization mode. Instrumental lower limits of detection and quantification were established, and the method was applied to the quantification of water soluble vitamins in human plasma single-donor samples. The present report provides a sensitive and reliable approach for the quantification of water soluble vitamins and their main circulating forms in human plasma. In the future, this analytical approach will support a more comprehensive assessment of water soluble vitamin nutritional status and bioavailability studies in humans. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Interdisciplinary Approach to the Mental Lexicon: Neural Network and Text Extraction From Long-term Memory

    Directory of Open Access Journals (Sweden)

    Vardan G. Arutyunyan

    2013-01-01

    Full Text Available The paper touches upon the principles of mental lexicon organization in the light of recent research in psycho- and neurolinguistics. As a focal point of discussion, two main approaches to mental lexicon functioning are considered: the modular or dual-system approach, developed within generativism, and the opposing single-system approach, represented by the connectionists and supporters of network models. The paper advocates the viewpoint that the mental lexicon is a complex psychological organization based upon a specific composition of neural networks. In this regard, the paper further elaborates on the storage of text in human mental space and introduces a model of text extraction from long-term memory. Based upon the data available, the author develops a methodology for modeling structures of knowledge representation in artificial intelligence systems.

  13. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust, and, more generally, how accurate fault parameterization and solution predictions are. These issues are not addressed by "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based on (a) a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and (b) the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e., earthquakes for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) no longer yields a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e., teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by the data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
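
    The Metropolis-style sampling idea behind such an MCMC-based Bayesian inversion can be sketched on a toy one-parameter problem; the "forward model", prior bounds, and data below are invented, and a real inversion replaces them with a waveform simulation over many rupture parameters.

```python
import math
import random

# Toy illustration of random-walk Metropolis sampling from a posterior
# p(m|d) proportional to likelihood(d|m) * prior(m). The "rupture model" here
# is a single hypothetical parameter (say, average slip in meters).

random.seed(0)

def log_posterior(m, observed, sigma=0.2):
    """Gaussian likelihood around a trivial 'forward model', flat prior on [0, 10]."""
    if not 0.0 <= m <= 10.0:
        return -math.inf
    predicted = m  # stand-in for an expensive waveform simulation
    return -0.5 * ((observed - predicted) / sigma) ** 2

observed_slip = 2.5  # synthetic "data" with a known answer
samples, m = [], 5.0
for _ in range(20000):
    proposal = m + random.gauss(0.0, 0.5)
    # Metropolis acceptance test on the log scale.
    if math.log(random.random()) < log_posterior(proposal, observed_slip) - log_posterior(m, observed_slip):
        m = proposal
    samples.append(m)

burned = samples[5000:]                 # discard burn-in
mean = sum(burned) / len(burned)        # posterior mean, close to 2.5
```

    Because the synthetic data were generated with a known answer, the posterior mean recovered by the chain can be checked against it, which mirrors the paper's use of synthetic earthquakes to probe resolution.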

  14. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users use the global network to search for information using full-text search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in the results of full-text search engines. Herein lies the great importance of Search Engine Optimization and Search Engine Marketing, because typical users usually try only the links on the first few pages of full-text search engine results for certain keywords, and in catalogs they primarily use hierarchically higher placed links in each category. Key to success is the application of optimization methods which deal with the issues of keywords, structure and quality of content, domain names, individual sites, and the quantity and reliability of backward links. The process is demanding, long-lasting, and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire web site consists. If web presentation operators want an overview of their documents and of the web site globally, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This purpose is served by the quantification of the competitive value of documents, which in turn establishes the global competitive value of a web site. Quantification of competitive values is performed on a specific full-text search engine; the results for each full-text search engine can be, and often are, different. According to published reports of the ClickZ agency and Market Share, Google is the most widely used search engine among English-speaking users, with a market share of more than 80%. The whole procedure of quantification of competitive values is common across engines; however, the initial step, the analysis of keywords, depends on the choice of the full-text search engine.

  15. Correlation Coefficients Between Different Methods of Expressing Bacterial Quantification Using Real Time PCR

    Directory of Open Access Journals (Sweden)

    Bahman Navidshad

    2012-02-01

    Full Text Available The applications of conventional culture-dependent assays to quantify bacterial populations are limited by their dependence on the inconsistent success of the different culture steps involved. In addition, some bacteria can be pathogenic or a source of endotoxins and pose a health risk to researchers. Bacterial quantification based on the real-time PCR method can overcome the above-mentioned problems. However, quantification of bacteria using this approach is commonly expressed in absolute quantities even though the composition of samples (like those of digesta) can vary widely; thus, the final results may be affected if the samples are not properly homogenized, especially when multiple samples are to be pooled together before DNA extraction. The objective of this study was to determine the correlation coefficients between four different methods of expressing the output data of real-time PCR-based bacterial quantification. The four methods were: (i) the common absolute method, expressed as the cell number of specific bacteria per gram of digesta; (ii) the Livak and Schmittgen ΔΔCt method; (iii) the Pfaffl equation; and (iv) a simple relative method based on the ratio of the cell number of specific bacteria to the total bacterial cells. Because of the effect of the total bacterial population on the results obtained using ΔCt-based methods (ΔΔCt and Pfaffl), these methods lack the consistency required of valid and reliable methods in real-time PCR-based bacterial quantification studies. On the other hand, because of the variable compositions of digesta samples, a simple ratio of the cell number of specific bacteria to the corresponding total bacterial cells of the same sample can be a more accurate quantification method.
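
    The ΔΔCt and simple-ratio calculations compared above can be sketched as follows, with hypothetical Ct values and cell counts (not taken from the study).

```python
# Sketch of two of the compared quantification methods. The Livak/Schmittgen
# ΔΔCt method expresses the target's fold change between a treatment and a
# control sample, normalized to a reference (here, total bacteria); the simple
# relative method is the ratio of specific to total bacterial cells.

def fold_change_ddct(ct_target_trt, ct_ref_trt, ct_target_ctl, ct_ref_ctl):
    """Livak ΔΔCt: 2**-(ΔCt_treatment - ΔCt_control), assuming ~100% efficiency."""
    ddct = (ct_target_trt - ct_ref_trt) - (ct_target_ctl - ct_ref_ctl)
    return 2.0 ** -ddct

def relative_abundance(copies_specific, copies_total):
    """Simple relative method: specific bacterial cells / total bacterial cells."""
    return copies_specific / copies_total

# One extra target cycle at an unchanged reference Ct means half the abundance.
fc = fold_change_ddct(22.0, 18.0, 21.0, 18.0)   # 2**-1 = 0.5
ratio = relative_abundance(2.0e7, 1.0e9)        # 0.02
```

    The sketch also makes the paper's point concrete: the ΔΔCt result depends on the reference (total bacteria) Ct values, whereas the simple ratio depends only on the two copy numbers from the same sample.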

  16. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    Science.gov (United States)

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolution. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage caused either by the electron beam or by sample preparation. We have developed a framework for the automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for the quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.
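
    Step (i) of the pipeline can be illustrated on a toy 2D image; the code below is a minimal sketch of Hessian-based ridge enhancement (the real system operates on noisy 3D tomograms with more elaborate filters), and the image data are invented.

```python
import math

# Toy 2D sketch of Hessian-based filament enhancement: a bright vertical line
# on a dark background. Across a bright line, the second derivative is strongly
# negative, so the magnitude of the most negative Hessian eigenvalue marks
# ridge (filament) pixels.

H, W = 21, 21
img = [[1.0 if x == 10 else 0.0 for x in range(W)] for y in range(H)]

def ridge_response(img, y, x):
    """Magnitude of the most negative eigenvalue of the 2x2 image Hessian."""
    ixx = img[y][x + 1] - 2.0 * img[y][x] + img[y][x - 1]
    iyy = img[y + 1][x] - 2.0 * img[y][x] + img[y - 1][x]
    ixy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    # Closed-form eigenvalues of the symmetric matrix [[ixx, ixy], [ixy, iyy]].
    lam_min = 0.5 * ((ixx + iyy) - math.sqrt((ixx - iyy) ** 2 + 4.0 * ixy ** 2))
    return max(0.0, -lam_min)

# Response over interior pixels; it peaks on the line and vanishes elsewhere.
response = [[ridge_response(img, y, x) for x in range(1, W - 1)]
            for y in range(1, H - 1)]
```

    In 3D the same idea uses the eigenvalues of a 3x3 Hessian (one near-zero eigenvalue along the filament, two strongly negative across it), typically after Gaussian smoothing.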

  17. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    Science.gov (United States)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert who generated them. This practice must be replaced with a formal uncertainty quantification process if computational fluid dynamics is to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, some in the hypersonics community believe this expanded role to be a requirement if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
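
    The nonintrusive, black-box sampling loop described above can be sketched as follows; the "solver", its inputs, and their distributions are hypothetical stand-ins, not the paper's models.

```python
import random
import statistics

# Hedged sketch of nonintrusive uncertainty propagation: the flow solver is a
# black box mapping uncertain inputs to a quantity of interest (QoI). The
# "solver" here is a cheap algebraic stand-in and the input distributions are
# invented for illustration.

random.seed(1)

def solver(inflow_mach, wall_temp_k):
    """Stand-in for an expensive CFD run returning a scalar QoI."""
    return 0.8 * inflow_mach ** 2 + 1.0e-4 * wall_temp_k

# Aleatoric (random) inputs: sample from assumed probability distributions.
qois = [solver(random.gauss(2.5, 0.05), random.uniform(290.0, 310.0))
        for _ in range(5000)]

mean_qoi = statistics.fmean(qois)
std_qoi = statistics.stdev(qois)

# Epistemic sources (e.g. turbulence model form) are typically handled
# separately, e.g. by repeating this loop per candidate closure model and
# reporting intervals rather than a single distribution.
```

    Because each sample is an independent solver call, the loop parallelizes trivially and can be automated into a design workflow, which is the practical appeal of nonintrusive methods.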

  18. Quantification of Airfoil Geometry-Induced Aerodynamic Uncertainties---Comparison of Approaches

    KAUST Repository

    Liu, Dishi; Litvinenko, Alexander; Schillings, Claudia; Schulz, Volker

    2015-04-14

    Uncertainty quantification in aerodynamic simulations calls for efficient numerical methods to reduce computational cost, especially for uncertainties caused by random geometry variations which involve a large number of variables. This paper compares five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature and by point collocation, radial basis function and a gradient-enhanced version of kriging, and examines their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry which is parameterized by independent Gaussian variables. The results show that gradient-enhanced surrogate methods achieve better accuracy than direct integration methods with the same computational cost.
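
    The quasi-Monte Carlo ingredient can be sketched with a pure-Python Halton sequence; the response function and its coefficients below are invented, and the actual study propagates samples through a flow solver rather than a formula.

```python
import statistics

# Hedged sketch of quasi-Monte Carlo sampling: a low-discrepancy Halton point
# set mapped through the inverse normal CDF yields independent Gaussian
# geometry perturbations, which are averaged through a toy response function.

def halton(index, base):
    """Van der Corput radical-inverse of `index` in the given prime base."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += f * (index % base)
        index //= base
        f /= base
    return result

norm = statistics.NormalDist()

def toy_lift_coefficient(xi1, xi2):
    """Stand-in for a flow solve: response to two Gaussian shape parameters."""
    return 0.9 + 0.05 * xi1 - 0.02 * xi2 + 0.01 * xi1 * xi2

n = 4096
samples = []
for i in range(1, n + 1):
    u1, u2 = halton(i, 2), halton(i, 3)         # 2D low-discrepancy point in (0,1)^2
    xi1, xi2 = norm.inv_cdf(u1), norm.inv_cdf(u2)
    samples.append(toy_lift_coefficient(xi1, xi2))

mean_cl = statistics.fmean(samples)             # converges to 0.9 here
```

    For smooth integrands, such low-discrepancy sampling converges faster than plain Monte Carlo; the paper's comparison asks whether surrogate-based methods beat this kind of direct integration at equal cost.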


  20. Approach to Mathematics in Textbooks at Tertiary Level--Exploring Authors' Views about Their Texts

    Science.gov (United States)

    Randahl, Mira

    2012-01-01

    The aim of this article is to present and discuss some results from an inquiry into mathematics textbooks authors' visions about their texts and approaches they choose when new concepts are introduced. Authors' responses are discussed in relation to results about students' difficulties with approaching calculus reported by previous research. A…

  1. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    Energy Technology Data Exchange (ETDEWEB)

    Flach, Greg [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Wohlwend, Jen [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  2. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    Science.gov (United States)

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure, unlike the other methods. The proposed framework addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.


  4. Offshore wind turbine risk quantification/evaluation under extreme environmental conditions

    International Nuclear Information System (INIS)

    Taflanidis, Alexandros A.; Loukogeorgaki, Eva; Angelides, Demos C.

    2013-01-01

    A simulation-based framework is discussed in this paper for the quantification/evaluation of risk and the development of automated risk assessment tools, focusing on applications to offshore wind turbines under extreme environmental conditions. The framework is founded on a probabilistic characterization of the uncertainty in the models for the excitation, the turbine, and its performance. Risk is then quantified as the expected value of some risk consequence measure over the probability distributions considered for the uncertain model parameters. Stochastic simulation is proposed for the risk assessment, corresponding to the evaluation of an associated probabilistic integral quantifying risk, as it allows for the adoption of comprehensive computational models describing the dynamic turbine behavior. To improve computational efficiency, a surrogate modeling approach is introduced based on moving least squares response surface approximations. The assessment is also extended to a probabilistic sensitivity analysis that identifies the importance of each of the uncertain model parameters, i.e. risk factors, towards the total risk as well as towards each of the failure modes contributing to this risk. The versatility and computational efficiency of the advocated approaches are finally exploited to support the development of standalone risk assessment applets for automated implementation of the probabilistic risk quantification/assessment. Highlights: a simulation-based risk quantification/assessment framework is discussed; the focus is on offshore wind turbines under extreme environmental conditions; the approach is founded on a probabilistic description of excitation/system model parameters; surrogate modeling is adopted for improved computational efficiency; standalone risk assessment applets for automated implementation are supported.

  5. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R2 > 0.993) for the four lipids tested. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.

  6. Rapid capillary electrophoresis approach for the quantification of ewe milk adulteration with cow milk.

    Science.gov (United States)

    Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico

    2017-10-13

    The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis (CE) method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which accomplished the separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision, and accuracy, and a minimum detectable amount of fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change in the virus suspension's dopant concentration relative to the mock suspension over the change in the virus suspension's Debye volume relative to the mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship, which is unique for each kind of virus, allowing for fast (within minutes) label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique, and the results obtained by the two approaches agreed well. We further demonstrate that the electrical technique can be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical identification and quantification of an unlimited number of viruses and other nano-sized particles.
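
    The count relation stated in words above can be written compactly; the symbols are ours, not the authors': n for dopant concentration, V_D for Debye volume, with "virus" and "mock" subscripts denoting the virus and mock suspensions.

```latex
N_{\mathrm{virus}} \;\approx\;
\left|\frac{\Delta n}{\Delta V_{D}}\right|
\;=\;
\left|\frac{n_{\mathrm{virus}} - n_{\mathrm{mock}}}
           {V_{D,\mathrm{virus}} - V_{D,\mathrm{mock}}}\right|
```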

  8. A practical method for accurate quantification of large fault trees

    International Nuclear Information System (INIS)

    Choi, Jong Soo; Cho, Nam Zin

    2007-01-01

    This paper describes a practical method to accurately quantify the top event probability and importance measures from incomplete minimal cut sets (MCS) of a large fault tree. The MCS-based fault tree method is extensively used in probabilistic safety assessments. Several sources of uncertainty exist in MCS-based fault tree analysis. The paper focuses on the quantification of the following two sources: (1) the truncation that neglects low-probability cut sets and (2) the approximation made in quantifying MCSs. The method proposed in this paper is based on a Monte Carlo simulation technique to estimate the probability of the discarded MCSs, and on the sum of disjoint products (SDP) approach complemented by the correction factor approach (CFA). The method provides the capability to accurately quantify the two uncertainties and to estimate the top event probability and importance measures of large coherent fault trees. The proposed fault tree quantification method has been implemented in the CUTREE code package and is tested on two example fault trees
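
    The contrast between the common sum-of-cut-sets approximation and direct Monte Carlo estimation, which the paper's SDP/CFA method refines, can be sketched as follows; the fault tree, basic events, and probabilities below are invented.

```python
import math
import random

# Hedged sketch of quantifying a top event from minimal cut sets (MCS).
# The rare-event approximation simply sums cut-set probabilities; Monte Carlo
# samples basic-event states directly. A sum-of-disjoint-products treatment
# removes the double counting that makes the simple sum an upper bound.

random.seed(2)

prob = {"A": 1e-2, "B": 5e-3, "C": 2e-2, "D": 1e-3}   # basic event probabilities
cut_sets = [{"A", "B"}, {"C", "D"}, {"A", "C"}]        # minimal cut sets

# Rare-event approximation: P(top) ~= sum over MCS of the product of event probs.
approx = sum(math.prod(prob[e] for e in cs) for cs in cut_sets)

def top_event(state):
    """Top event occurs when every event in at least one cut set has failed."""
    return any(all(state[e] for e in cs) for cs in cut_sets)

n = 200000
hits = 0
for _ in range(n):
    state = {e: random.random() < p for e, p in prob.items()}
    hits += top_event(state)
mc_estimate = hits / n
```

    For these invented numbers the rare-event sum is 2.7e-4, slightly above the exact inclusion-exclusion value, illustrating why coherent fault trees treat it as an upper bound that SDP-style methods tighten.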

  9. Online updating and uncertainty quantification using nonstationary output-only measurement

    Science.gov (United States)

    Yuen, Ka-Veng; Kuok, Sin-Chi

    2016-01-01

    The extended Kalman filter (EKF) is widely adopted for state estimation and parametric identification of dynamical systems. In this algorithm, the covariance matrices of the process noise and measurement noise must be specified based on prior knowledge. However, improper assignment of these noise covariance matrices leads to unreliable estimation and misleading uncertainty estimates for the system state and model parameters; it may even cause the estimation to diverge. To resolve these problems, we propose a Bayesian probabilistic algorithm for online estimation of the noise parameters that characterize the noise covariance matrices. There are three major appealing features of the proposed approach. First, it resolves the divergence problem in the conventional usage of the EKF due to improper choice of the noise covariance matrices. Second, the proposed approach ensures the reliability of the uncertainty quantification. Finally, since the noise parameters are allowed to be time-varying, nonstationary process noise and/or measurement noise is explicitly taken into account. Examples using stationary/nonstationary responses of linear/nonlinear time-varying dynamical systems are presented to demonstrate the efficacy of the proposed approach. Furthermore, a comparison with the conventional usage of the EKF is provided to reveal the necessity of the proposed approach for reliable model updating and uncertainty quantification.
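
    The role the noise covariances play can be illustrated with a minimal 1D linear Kalman filter; note that this sketch fixes q and r a priori, which is precisely the assumption the paper removes by estimating them online, and all values are invented.

```python
import random

# Minimal 1D Kalman filter on a random-walk state. An EKF linearizes a
# nonlinear model around the current estimate but uses the same
# predict/update structure; q and r below are the (assumed known) process
# and measurement noise variances.

random.seed(3)

q, r = 0.01, 0.25
x_true, x_est, p = 0.0, 0.0, 1.0   # true state, estimate, error variance

for _ in range(500):
    x_true += random.gauss(0.0, q ** 0.5)        # true state evolves
    z = x_true + random.gauss(0.0, r ** 0.5)     # noisy measurement
    # Predict
    p = p + q
    # Update
    k = p / (p + r)                              # Kalman gain
    x_est = x_est + k * (z - x_est)
    p = (1.0 - k) * p

# The error variance settles at the Riccati fixed point p = (p+q)r/(p+q+r),
# about 0.045 for these q and r.
```

    Misstating q or r shifts the gain k, which is why a wrong covariance pair yields overconfident or diverging estimates, the failure mode the paper's online noise-parameter estimation targets.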

  10. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    Science.gov (United States)

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could be used as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
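
    The underlying calibration arithmetic, a Beer-Lambert linear fit of absorbance against known lignin concentrations, can be sketched as follows; all standard values and the choice of 280 nm are invented for illustration.

```python
# In the Beer-Lambert regime, absorbance A = epsilon * l * c is linear in
# concentration, so a line fitted to lignin standards at a fixed wavelength
# (say 280 nm) converts measured absorbance into concentration.

standards = [(0.05, 0.11), (0.10, 0.22), (0.20, 0.44), (0.40, 0.88)]  # (g/L, A)

# Ordinary least-squares fit of A = slope * c + intercept.
n = len(standards)
mx = sum(c for c, _ in standards) / n
my = sum(a for _, a in standards) / n
slope = (sum((c - mx) * (a - my) for c, a in standards)
         / sum((c - mx) ** 2 for c, _ in standards))
intercept = my - slope * mx

def lignin_concentration(absorbance):
    """Invert the calibration line to estimate lignin concentration (g/L)."""
    return (absorbance - intercept) / slope
```

    A near-zero intercept and high R-squared on the standards are the practical checks that the chosen wavelength stays within the linear Beer-Lambert range.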

  11. Quantification of microbial quality and safety in minimally processed foods

    NARCIS (Netherlands)

    Zwietering, M.H.

    2002-01-01

    To find a good equilibrium between quality and margin of safety of minimally processed foods, often various hurdles are used. Quantification of the kinetics should be used to approach an optimum processing and to select the main aspects. Due to many factors of which the exact quantitative effect is

  12. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography, definition and challenges; quantification-biasing phenomena. 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, and results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random (accidental) coincidences in PET, standardisation in PET). 3 - Synthesis: achievable performance, know-how, precautions, beyond the activity measurement

  13. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data, and finally the accident sequence quantification. In PSA, the accident sequence quantification calculates the core damage frequency and performs importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because it must combine all event tree and fault tree models, and it requires an efficient computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform accident sequence quantification with KIRAP. (author). 6 refs.

  14. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance and uncertainty analyses. Accident sequence quantification requires an understanding of the whole PSA model, because all event tree and fault tree models have to be combined, and it requires an efficient computer code because of the long computation times involved. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the use of KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs

  15. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to underestimation of target gene copy numbers. Digital PCR using microfluidics is a newer approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA to 1/4400 of the theoretical value was observed with 6.58 ng μL(-1) humic acids. In contrast, digital PCR provided accurate quantification data at humic acid concentrations up to 9.34 ng μL(-1). The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. When the DNA extract was diluted, the copy numbers quantified by real-time PCR and digital PCR became similar, indicating that dilution is a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested, digital PCR gave 1.04×10(3) copies of archaeal 16S rRNA genes per gram of sediment, whereas real-time PCR gave only 4.64×10(2) copies/g-sediment, most likely because of an inhibitory effect. The data from this study demonstrate that inhibitory substances have little effect on DNA quantification using microfluidics and digital PCR, and show the great advantage of digital PCR for accurate quantification of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
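
    The absolute quantification that digital PCR provides rests on Poisson statistics over the partitioned reaction: the fraction of positive partitions gives the mean number of template molecules per partition, with no standard curve needed. A minimal sketch of that calculation (the partition volume and counts below are illustrative assumptions, not values from the study):

```python
import math

def dpcr_copies(positive, total, partition_volume_nl=0.85):
    """Copies per microlitre from digital PCR partition counts.

    Applies the Poisson correction lambda = -ln(1 - p), where p is the
    fraction of positive partitions. The partition volume is an
    instrument-specific assumption, not a value from the study.
    """
    p = positive / total
    lam = -math.log(1.0 - p)  # mean template molecules per partition
    return lam / partition_volume_nl * 1000.0

# Even a partially saturated chip yields an absolute estimate:
print(round(dpcr_copies(500, 765), 1))
```

    Because the count depends only on which partitions amplified, not on how efficiently they amplified, partial inhibition shifts the estimate far less than it shifts a real-time PCR standard curve.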

  16. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but simultaneously provides a quantitative measure of which combinations of inputs have the most important impact on the result. It is applied to SDTRIM simulations (Möller et al., 1988) with several uncertain, Gaussian-distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition), and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
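
    The non-intrusive idea can be sketched compactly: the simulator is sampled as a black box at random inputs, a polynomial chaos surrogate is fitted by least squares, and output moments then follow directly from the coefficients. The toy model below stands in for SDTRIM, which is not available here; everything about it is an assumption for illustration:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)

def simulator(x):
    # Stand-in for an expensive code such as SDTRIM (illustrative only).
    return np.sin(x) + 0.1 * x**2

def hermite_design(x, degree=3):
    # Probabilists' Hermite polynomials He_0..He_3 for a standard normal input.
    cols = [np.ones_like(x), x, x**2 - 1, x**3 - 3 * x]
    return np.column_stack(cols[: degree + 1])

# Non-intrusive step: sample the input, run the model, fit coefficients
# by least squares instead of modifying the simulator.
x = rng.standard_normal(200)
A = hermite_design(x)
coeffs, *_ = np.linalg.lstsq(A, simulator(x), rcond=None)

# By Hermite orthogonality, E[y] is the 0th coefficient and the variance
# is sum_k c_k^2 * k! over the higher-order terms.
mean_pce = coeffs[0]
var_pce = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
print(mean_pce, var_pce)
```

    In the multi-parameter case the squared coefficients also rank which input combinations matter most, which is the sensitivity measure the abstract refers to.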

  17. Good quantification practices of flavours and fragrances by mass spectrometry.

    Science.gov (United States)

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  18. Application of Quality by Design Approach to Bioanalysis: Development of a Method for Elvitegravir Quantification in Human Plasma.

    Science.gov (United States)

    Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo

    2017-10-01

    The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.

  19. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    Directory of Open Access Journals (Sweden)

    Fuqiang Sun

    2017-06-01

    Full Text Available Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, owing to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. In line with commonly used data-driven methods, a GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate crack size, with the GA adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
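
    In LS-SVM regression the quadratic program of a standard SVM collapses to a single linear system, which is what makes repeated GA-driven hyperparameter evaluation affordable. A minimal sketch with synthetic stand-ins for the three damage-sensitive features (the data, kernel width and regularization value are assumptions; in the paper the hyperparameters are tuned by the GA):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    # LS-SVM regression: the dual problem is one linear system,
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(Xnew, X, b, alpha, sigma=1.0):
    return rbf_kernel(Xnew, X, sigma) @ alpha + b

# Synthetic stand-ins for [normalized amplitude, phase change,
# correlation coefficient] -> crack size; not the paper's data.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (40, 3))
y = 2.0 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2]
b, alpha = lssvm_fit(X, y)
print(float(np.abs(lssvm_predict(X, X, b, alpha) - y).max()))
```

    A GA wrapper would search over (gamma, sigma) using cross-validated error as the fitness; they are fixed here for brevity.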

  20. Text mining with R a tidy approach

    CERN Document Server

    Silge, Julia

    2017-01-01

    Much of the data available today is unstructured and text-heavy, making it challenging for analysts to apply their usual data wrangling and visualization tools. With this practical book, you'll explore text-mining techniques with tidytext, a package that authors Julia Silge and David Robinson developed using the tidy principles behind R packages like ggraph and dplyr. You'll learn how tidytext and other tidy tools in R can make text analysis easier and more effective. The authors demonstrate how treating text as data frames enables you to manipulate, summarize, and visualize characteristics of text. You'll also learn how to integrate natural language processing (NLP) into effective workflows. Practical code examples and data explorations will help you generate real insights from literature, news, and social media. Learn how to apply the tidy text format to NLP Use sentiment analysis to mine the emotional content of text Identify a document's most important terms with frequency measurements E...
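
    The tidy text format is simply one token per row, which turns term counting into an ordinary table operation. The book works in R; a rough Python analogue of the same idea (the mini-corpus and tokenizer are invented for illustration):

```python
import re
from collections import Counter

# Invented mini-corpus for illustration.
docs = {
    "doc1": "Text mining turns unstructured text into tidy tables of tokens.",
    "doc2": "Tidy tools make text analysis easier and more effective.",
}

# The tidy text format: one token per row, as (document, word) pairs.
rows = [
    (name, word)
    for name, text in docs.items()
    for word in re.findall(r"[a-z']+", text.lower())
]

# Term frequencies per document, the basis for tf-idf style importance scores.
tf = Counter(rows)
print(tf[("doc1", "text")], tf[("doc2", "tidy")])
```

    Keeping tokens in a flat table like this is what lets the usual grouping, joining and plotting tools apply to text unchanged.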

  1. Evaluation of two main RNA-seq approaches for gene quantification in clinical RNA sequencing: polyA+ selection versus rRNA depletion.

    Science.gov (United States)

    Zhao, Shanrong; Zhang, Ying; Gamini, Ramya; Zhang, Baohong; von Schack, David

    2018-03-19

    To allow efficient transcript/gene detection, highly abundant ribosomal RNAs (rRNA) are generally removed from total RNA either by positive polyA+ selection or by rRNA depletion (negative selection) before sequencing. Comparisons between the two methods have been carried out by various groups, but the assessments have relied largely on non-clinical samples. In this study, we evaluated these two RNA sequencing approaches using human blood and colon tissue samples. Our analyses showed that rRNA depletion captured more unique transcriptome features, whereas polyA+ selection outperformed rRNA depletion with higher exonic coverage and better accuracy of gene quantification. For blood- and colon-derived RNAs, we found that 220% and 50% more reads, respectively, would have to be sequenced to achieve the same level of exonic coverage in the rRNA depletion method compared with the polyA+ selection method. Therefore, in most cases we strongly recommend polyA+ selection over rRNA depletion for gene quantification in clinical RNA sequencing. Our evaluation revealed that a small number of lncRNAs and small RNAs made up a large fraction of the reads in the rRNA depletion RNA sequencing data. Thus, we recommend that these RNAs be specifically depleted to improve the sequencing depth of the remaining RNAs.

  2. Person-generated Data in Self-quantification. A Health Informatics Research Program.

    Science.gov (United States)

    Gray, Kathleen; Martin-Sanchez, Fernando J; Lopez-Campos, Guillermo H; Almalki, Manal; Merolli, Mark

    2017-01-09

    The availability of internet-connected mobile, wearable and ambient consumer technologies, direct-to-consumer e-services and peer-to-peer social media sites far outstrips evidence about the efficiency, effectiveness and efficacy of using them in healthcare applications. The aim of this paper is to describe one approach to building a program of health informatics research, so as to generate rich and robust evidence about health data and information processing in self-quantification and the associated healthcare and health outcomes. The paper summarises relevant health informatics research approaches in the literature and presents an example of developing a program of research in the Health and Biomedical Informatics Centre (HaBIC) at the University of Melbourne. The paper describes this program in terms of research infrastructure, conceptual models, research design, research reporting and knowledge sharing. The paper identifies key outcomes from integrative and multiple-angle approaches to investigating the management of information and data generated by the use of this Centre's collection of wearable, mobile and other devices in health self-monitoring experiments. These research results offer lessons for consumers, developers, clinical practitioners and biomedical and health informatics researchers. Health informatics is increasingly called upon to make sense of emerging self-quantification and other digital health phenomena that are well beyond the conventions of healthcare in which the field of informatics originated and consolidated. Making a substantial contribution to optimising the aims, processes and outcomes of health self-quantification will require further work at scale, in multi-centre collaborations, both for this Centre and for health informatics researchers generally.

  3. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Full Text Available Abstract Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess the variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was
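
    Standard-curve quantification, and the PCR efficiency the abstract identifies as the crucial parameter, can both be computed from a dilution series: efficiency follows from the slope of Cq against log10 copy number as E = 10^(-1/slope) - 1. A sketch with invented calibration values:

```python
import numpy as np

# Illustrative dilution series (invented values, not from the study):
# log10(target copies) in each standard vs the measured Cq.
log10_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
cq = np.array([31.1, 27.8, 24.4, 21.0, 17.7])

slope, intercept = np.polyfit(log10_copies, cq, 1)

# Amplification efficiency from the slope; slope -3.32 corresponds to 100 %.
efficiency = 10.0 ** (-1.0 / slope) - 1.0

def quantify(sample_cq):
    """Copy number of an unknown sample read off the standard curve."""
    return 10.0 ** ((sample_cq - intercept) / slope)

print(round(efficiency * 100, 1), round(quantify(25.0)))
```

    The prerequisite stated above is visible here: if a sample's true efficiency differs from the standards', the same Cq maps to the wrong copy number.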

  4. Quantification and presence of human ancient DNA in burial place ...

    African Journals Online (AJOL)

    Quantification and presence of human ancient DNA in burial place remains of Turkey using real time polymerase chain reaction. ... A published real-time PCR assay, which allows for the combined analysis of nuclear or ancient DNA and mitochondrial DNA, was modified. This approach can be used for recovering DNA from ...

  5. Swift Quantification of Fenofibrate and Tiemonium methylsulfate Active Ingredients in Solid Drugs Using Particle Induced X-Ray Emission

    International Nuclear Information System (INIS)

    Bejjani, A.; Nsouli, B.; Zahraman, K.; Assi, S.; Younes, Gh.; Yazbi, F.

    2011-01-01

    The quantification of active ingredients (AI) in drugs is a crucial and important step in the drug quality control process. It is usually performed using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. However, if the active ingredient contains specific heteroatoms (F, S, Cl), elemental ion beam analysis (IBA) techniques like PIXE and PIGE, using a small tandem accelerator of 1-2 MV, can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. In this work, we demonstrate the ability of the thick-target PIXE technique to quantify, rapidly and accurately, both low and high concentrations of active ingredients in different commercial drugs. Fenofibrate, a chlorinated active ingredient, is present in high amounts in two different commercial drugs; its quantification was done using the relative approach with an external standard. On the other hand, tiemonium methylsulfate exists in relatively low amounts in commercial drugs; its quantification was done using the GUPIX simulation code (absolute quantification). The experimental aspects related to the validity of the quantification (use of external standards, absolute quantification, matrix effects, ...) are presented and discussed. (author)

  6. Quantification of silver nanoparticle uptake and distribution within individual human macrophages by FIB/SEM slice and view.

    Science.gov (United States)

    Guehrs, Erik; Schneider, Michael; Günther, Christian M; Hessing, Piet; Heitz, Karen; Wittke, Doreen; López-Serrano Oliver, Ana; Jakubowski, Norbert; Plendl, Johanna; Eisebitt, Stefan; Haase, Andrea

    2017-03-21

    Quantification of nanoparticle (NP) uptake in cells or tissues is very important for safety assessment. Often, electron microscopy based approaches are used for this purpose, which allow imaging at very high resolution. However, precise quantification of NP numbers in cells and tissues remains challenging. The aim of this study was to present a novel approach that combines precise quantification of NPs in individual cells with high resolution imaging of their intracellular distribution, based on focused ion beam/scanning electron microscopy (FIB/SEM) slice and view approaches. We quantified cellular uptake of 75 nm diameter citrate-stabilized silver NPs (Ag 75 Cit) into an individual human macrophage derived from monocytic THP-1 cells using a FIB/SEM slice and view approach. Cells were treated with 10 μg/ml for 24 h. We investigated a single cell and found in total 3138 ± 722 silver NPs inside this cell. Most of the silver NPs were located in large agglomerates; only a few were found in clusters of fewer than five NPs. Furthermore, we cross-checked our results using inductively coupled plasma mass spectrometry and could confirm the FIB/SEM results. Our approach based on FIB/SEM slice and view is currently the only one that allows quantification of the absolute dose of silver NPs in individual cells and, at the same time, assessment of their intracellular distribution at high resolution. We therefore propose using FIB/SEM slice and view to systematically analyse the cellular uptake of various NPs as a function of size, concentration and incubation time.

  7. A time-series approach to random number generation: Using recurrence quantification analysis to capture executive behavior

    Directory of Open Access Journals (Sweden)

    Wouter eOomens

    2015-06-01

    Full Text Available The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA, a nonlinear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation.
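
    Recurrence quantification analysis starts from a binary recurrence matrix over the response series; measures such as the recurrence rate and determinism (the share of recurrences falling on diagonal line structures) then summarize repetitive behaviour while keeping the sequence order intact. A simplified scalar sketch, without the delay embedding a full RQA would use (the threshold and example sequence are illustrative):

```python
import numpy as np

def recurrence_matrix(series, radius=0.5):
    """Binary recurrence plot: 1 where two time points are close."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= radius).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points, excluding the main diagonal."""
    n = len(R)
    off = R.sum() - np.trace(R)
    return off / (n * n - n)

def determinism(R, lmin=2):
    """Share of recurrent points lying on diagonal lines of length >= lmin."""
    n = len(R)
    det_points = 0
    for k in range(1, n):            # diagonals above the main one
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:  # sentinel closes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    det_points += run
                run = 0
    off = (R.sum() - np.trace(R)) / 2  # upper triangle only
    return det_points / off if off else 0.0

seq = [1, 5, 2, 8, 3, 1, 5, 2, 8, 3]   # a "random" sequence that repeats
R = recurrence_matrix(seq)
print(recurrence_rate(R), determinism(R))
```

    A perfectly repeating sequence like the one above yields determinism 1.0, whereas a genuinely random sequence scatters its few recurrences off the diagonals; that contrast is what makes RQA sensitive to failures of inhibition in RNG data.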

  8. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    International Nuclear Information System (INIS)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    Highlights: ► Survey of bio-analytical approaches utilizing biomolecule labelling. ► Detailed discussion of the methodology and chemistry of elemental labelling. ► Biomedical and bio-analytical applications of elemental labelling. ► FI-ICP-MS and LC–ICP-MS for quantification of elementally labelled biomolecules. ► Review of selected applications. - Abstract: This article reviews novel quantification concepts in which elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections, an overview of general aspects of biomolecule quantification, as well as of labelling, is presented, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling is highlighted, and analytical as well as biomedical applications are presented. A special focus lies on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field is summarized and a perspective for future developments, including sophisticated and innovative applications, is given.

  9. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials.

  10. Quantification of visual soil erosion indicators in Gikuuri catchment in the central Highlands of Kenya.

    NARCIS (Netherlands)

    Sterk, G.; Okoba, B.O.

    2006-01-01

    Quantification of soil erosion using conventional approaches is hampered by the lack of extensive spatial coverage and long-duration data. Therefore, the use of these approaches for land management advice has tended to result in unsatisfactory land-use plans that are in great disparity to on-site

  11. Systematic text condensation

    DEFF Research Database (Denmark)

    Malterud, Kirsti

    2012-01-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies.

  12. Text Analysis: Critical Component of Planning for Text-Based Discussion Focused on Comprehension of Informational Texts

    Science.gov (United States)

    Kucan, Linda; Palincsar, Annemarie Sullivan

    2018-01-01

    This investigation focuses on a tool used in a reading methods course to introduce reading specialist candidates to text analysis as a critical component of planning for text-based discussions. Unlike planning that focuses mainly on important text content or information, a text analysis approach focuses both on content and how that content is…

  13. Text Maps: Helping Students Navigate Informational Texts.

    Science.gov (United States)

    Spencer, Brenda H.

    2003-01-01

    Notes that a text map is an instructional approach designed to help students gain fluency in reading content area materials. Discusses how the goal is to teach students about the important features of the material and how the maps can be used to build new understandings. Presents the procedures for preparing and using a text map. (SG)

  14. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Full Text Available Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is done based on the comparison of seed images with geometric figures. Quantification of seed shape is a useful tool in plant description for phenotypic characterization and taxonomic analysis. The J index gives the percentage similarity between the image of a seed and a geometric figure, and is useful in taxonomy for the study of relationships between plant groups. The geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios and the outline of the Fibonacci spiral. The images of seeds were compared with these figures and values of the J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures like the ovoid, an ellipse or the Fibonacci spiral, may be a feature of the basal clades of taxonomic groups.
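
    A percentage similarity between a seed silhouette and a geometric model can be computed from binary masks as shared area over combined area; note that this overlap formulation is an assumption for illustration, and the authors' exact definition of J may differ in detail:

```python
import numpy as np

def j_index(seed_mask, model_mask):
    """Percent similarity between a seed silhouette and a geometric model,
    computed here as shared area over combined area (an assumption; the
    authors' exact definition of J may differ in detail)."""
    seed = np.asarray(seed_mask, dtype=bool)
    model = np.asarray(model_mask, dtype=bool)
    inter = np.logical_and(seed, model).sum()
    union = np.logical_or(seed, model).sum()
    return 100.0 * inter / union

# Toy silhouettes: an elliptical "seed" against a circular model figure.
yy, xx = np.mgrid[-50:50, -50:50]
seed = (xx / 40.0) ** 2 + (yy / 30.0) ** 2 <= 1.0
model = xx**2 + yy**2 <= 35**2
print(round(j_index(seed, model), 1))
```

    An identical seed and model give J = 100, and J falls as the silhouette departs from the figure, which is what makes the index usable for comparing species against the ovoid, ellipse and Fibonacci-spiral models.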

  15. Quantification of regional fat volume in rat MRI

    Science.gov (United States)

    Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren

    2003-05-01

    Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch-up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. The quality of automatic segmentation has been

  16. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  17. Quantification of methionine and selenomethionine in biological samples using multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS).

    Science.gov (United States)

    Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel

    2018-05-01

    Quantification of selenated amino-acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino-acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of selenium/sulfur substitution rate in Met. Moreover the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, low limit of quantification and wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.

  18. A Network of Themes: A Qualitative Approach to Gerhard Richter's Text

    Directory of Open Access Journals (Sweden)

    Narvika Bovcon

    2017-07-01

    Full Text Available Gerhard Richter's books Text – a collection of painter's verbal statements about his artistic method – and Atlas – 783 sheets with images, mainly photographs and visual notations – are two archives that complement the understanding of his diverse artistic practice. The paper presents a textual model that experimentally simulates a possible ordering principle for archives. Richter's statements in the book Text are cut up and used as short quotations. Those that relate to multiple aspects of the painter's oeuvre are identified as hubs in the semantic network. The hubs are organized paratactically, as an array of different themes. The paper presents a methodological hypothesis and an experimental model that aim to connect the research of real networks with the paradigms of humanistic interpretation. We have to bear in mind that the network is a result of the researcher's interpretative approach, which is added to the initial archive included in the book Text. The breaking up of Richter's poetics into atoms of quotations is an experimental proposal of a new textuality in art history and humanities, which has its own history. In comparison to digital archives with complex interfaces that often tend to obscure the content, the elements in our experiment appear as specific configurations of the semantic network and are presented in a limited number of linear texts. The method of listing of quotations gathers the fragments into a potential “whole”, i.e. a narrativized gateway to an archive according to the researcher's interpretation.

  19. Aerosol-type retrieval and uncertainty quantification from OMI data

    Directory of Open Access Journals (Sweden)

    A. Kauppi

    2017-11-01

Full Text Available We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by the posterior probability distribution reflects the
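The Bayesian model averaging step described above can be sketched in a few lines: per-model log-evidences are normalized into posterior model weights (assuming equal model priors), and the per-model AOD posteriors, here taken as Gaussian approximations with hypothetical means and variances, are combined into a mixture mean and variance:

```python
import math

def model_weights(log_evidences):
    """Posterior model probabilities from log-evidences (equal model priors)."""
    m = max(log_evidences)
    unnorm = [math.exp(le - m) for le in log_evidences]  # stable exponentiation
    total = sum(unnorm)
    return [u / total for u in unnorm]

def bma_mean_var(means, variances, weights):
    """Mean and variance of the model-averaged (mixture) posterior."""
    mu = sum(w * m for w, m in zip(weights, means))
    var = sum(w * (v + (m - mu) ** 2)
              for w, m, v in zip(weights, means, variances))
    return mu, var

# Hypothetical per-model AOD posteriors for three candidate aerosol models
log_ev = [-10.2, -10.9, -13.5]   # log model evidences
aod    = [0.31, 0.35, 0.52]      # per-model AOD estimates
avar   = [0.002, 0.003, 0.010]   # per-model posterior variances

w = model_weights(log_ev)
mu, var = bma_mean_var(aod, avar, w)  # averaged AOD and its spread
```

The mixture variance automatically grows when the well-supported models disagree, which is one way the method's "more realistic uncertainty estimates" arise.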

  20. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    Science.gov (United States)

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    performance was lower for the local database than the research database for both semi-quantitative and machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms as compared to semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.

  1. A Method for Quantification of Epithelium Colonization Capacity by Pathogenic Bacteria

    Directory of Open Access Journals (Sweden)

    Rune M. Pedersen

    2018-02-01

Full Text Available Most bacterial infections initiate at the mucosal epithelium lining the gastrointestinal, respiratory, and urogenital tracts. At these sites, bacterial pathogens must adhere and increase in numbers to effectively breach the outer barrier and invade the host. If the bacterium succeeds in reaching the bloodstream, effective dissemination again requires that bacteria in the blood re-establish contact with distant endothelial sites and form secondary-site foci. The infectious potential of bacteria is therefore closely linked to their ability to adhere to, colonize, and invade epithelial and endothelial surfaces. Measurement of bacterial adhesion to epithelial cells is therefore standard procedure in studies of bacterial virulence. Traditionally, such measurements have been conducted with microtiter plate cell cultures to which bacteria are added, followed by washing procedures and final quantification of retained bacteria by agar plating. This approach is fast and straightforward, but yields only a rough estimate of the adhesive properties of the bacteria upon contact, and little information on the ability of the bacterium to colonize these surfaces under relevant physiological conditions. Here, we present a method in which epithelia/endothelia are simulated by flow chamber-grown human cell layers, and infection is induced by seeding of pathogenic bacteria on these surfaces under conditions that simulate the physiological microenvironment. Quantification of bacterial adhesion and colonization of the cell layers is then performed by in situ time-lapse fluorescence microscopy and automatic detection of bacterial surface coverage. The method is demonstrated in three different infection models, simulating Staphylococcus aureus endothelial infection and Escherichia coli intestinal and uroepithelial infection.
The approach yields valuable information on the fitness of the bacterium to successfully adhere to and colonize epithelial surfaces and can be used
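Automatic detection of bacterial surface coverage from time-lapse frames reduces, at its simplest, to thresholding each frame and counting the positive pixels. A toy sketch on hypothetical 4×4 intensity grids (real pipelines would add background correction and segmentation):

```python
def surface_coverage(frame, threshold):
    """Fraction of pixels at or above the intensity threshold (bacteria-positive)."""
    total = sum(len(row) for row in frame)
    covered = sum(1 for row in frame for px in row if px >= threshold)
    return covered / total

# Toy 4x4 fluorescence frames from successive time-lapse points
frames = [
    [[0, 0, 0, 0], [0, 9, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]],
    [[9, 9, 0, 0], [0, 9, 0, 0], [0, 0, 9, 0], [0, 0, 0, 0]],
]
coverage = [surface_coverage(f, threshold=5) for f in frames]  # increases over time
```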

  2. Assessment of current mass spectrometric workflows for the quantification of low abundant proteins and phosphorylation sites

    Directory of Open Access Journals (Sweden)

    Manuel Bauer

    2015-12-01

Full Text Available The data described here provide a systematic performance evaluation of popular data-dependent (DDA) and data-independent (DIA) mass spectrometric (MS) workflows currently used in quantitative proteomics. We assessed the limits of identification, quantification and detection for each method by analyzing a dilution series of 20 unmodified and 10 phosphorylated synthetic heavy labeled reference peptides, respectively, covering six orders of magnitude in peptide concentration with and without a complex human cell digest background. We found that all methods performed very similarly in the absence of background proteins; however, when analyzing whole cell lysates, targeted methods were at least 5–10 times more sensitive than directed or DDA methods. In particular, higher stage fragmentation (MS3) of the neutral loss peak using a linear ion trap increased the dynamic quantification range of some phosphopeptides up to 100-fold. We illustrate the power of this targeted MS3 approach for phosphopeptide monitoring by successfully quantifying 9 phosphorylation sites of the kinetochore and spindle assembly checkpoint component Mad1 over different cell cycle states from non-enriched pull-down samples. The data are associated with the research article ‘Evaluation of data-dependent and data-independent mass spectrometric workflows for sensitive quantification of proteins and phosphorylation sites’ (Bauer et al., 2014 [1]). The mass spectrometry and the analysis dataset have been deposited to the ProteomeXchange Consortium (http://proteomecentral.proteomexchange.org) via the PRIDE partner repository with the dataset identifier PXD000964.

  3. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    Science.gov (United States)

    Manzanares-Palenzuela, C Lorena; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
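The quantification scheme above expresses GMO content as event-specific copies per taxon-specific copies. A minimal sketch of that final arithmetic, with invented linear calibration parameters and currents purely for illustration:

```python
def conc_from_current(i_na, slope, intercept):
    """Invert a linear calibration (peak current vs. concentration) for one target."""
    return (i_na - intercept) / slope

def gmo_percent(event_pm, taxon_pm):
    """GMO content as event-specific per taxon-specific copies, in percent."""
    return 100.0 * event_pm / taxon_pm

# Hypothetical calibrations and measured currents for the two targets
event_pm = conc_from_current(i_na=42.0, slope=0.80, intercept=2.0)   # RR target
taxon_pm = conc_from_current(i_na=82.0, slope=0.40, intercept=2.0)   # lectin target
pct = gmo_percent(event_pm, taxon_pm)  # well above the 0.9% labeling threshold
```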

  4. Joint optimization of collimator and reconstruction parameters in SPECT imaging for lesion quantification

    International Nuclear Information System (INIS)

    McQuaid, Sarah J; Southekal, Sudeepti; Kijewski, Marie Foley; Moore, Stephen C

    2011-01-01

    Obtaining the best possible task performance using reconstructed SPECT images requires optimization of both the collimator and reconstruction parameters. The goal of this study is to determine how to perform this optimization, namely whether the collimator parameters can be optimized solely from projection data, or whether reconstruction parameters should also be considered. In order to answer this question, and to determine the optimal collimation, a digital phantom representing a human torso with 16 mm diameter hot lesions (activity ratio 8:1) was generated and used to simulate clinical SPECT studies with parallel-hole collimation. Two approaches to optimizing the SPECT system were then compared in a lesion quantification task: sequential optimization, where collimation was optimized on projection data using the Cramer–Rao bound, and joint optimization, which simultaneously optimized collimator and reconstruction parameters. For every condition, quantification performance in reconstructed images was evaluated using the root-mean-squared-error of 400 estimates of lesion activity. Compared to the joint-optimization approach, the sequential-optimization approach favoured a poorer resolution collimator, which, under some conditions, resulted in sub-optimal estimation performance. This implies that inclusion of the reconstruction parameters in the optimization procedure is important in obtaining the best possible task performance; in this study, this was achieved with a collimator resolution similar to that of a general-purpose (LEGP) collimator. This collimator was found to outperform the more commonly used high-resolution (LEHR) collimator, in agreement with other task-based studies, using both quantification and detection tasks.

  5. A multivariate shape quantification approach for sickle red blood cell in patient-specific microscopy image data

    Science.gov (United States)

    Xu, Mengjia; Yang, Jinzhu; Zhao, Hong

    2017-07-01

The morphological change of red blood cells (RBCs) plays an important role in revealing the biomechanical and biorheological characteristics of RBCs. Aiming to extract shape indices for sickle RBCs, an automated ex-vivo RBC shape quantification method is proposed. First, single-RBC regions of interest (ROIs) are extracted from the raw microscopy image via an automatic hierarchical ROI extraction method. Second, an improved random walk method is used to detect the RBC outline. Finally, three types of RBC shape factors are calculated based on the elliptically fitted RBC contour. Experiments indicate that the proposed method can accurately segment RBCs from microscopy images with low contrast and prevent the disturbance of artifacts. Moreover, it can provide an efficient shape quantification means for diverse RBC shapes in a batch manner.
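The abstract does not state which three shape factors are computed; a plausible sketch derives aspect ratio, eccentricity, and circularity from the fitted-ellipse semi-axes, using Ramanujan's approximation for the ellipse perimeter:

```python
import math

def shape_factors(axis_a, axis_b):
    """Aspect ratio, eccentricity and circularity from fitted-ellipse semi-axes."""
    a, b = max(axis_a, axis_b), min(axis_a, axis_b)
    aspect = a / b
    eccentricity = math.sqrt(1.0 - (b / a) ** 2)
    # Ramanujan's approximation of the ellipse perimeter
    h = ((a - b) / (a + b)) ** 2
    perimeter = math.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + math.sqrt(4.0 - 3.0 * h)))
    circularity = 4.0 * math.pi * (math.pi * a * b) / perimeter ** 2
    return aspect, eccentricity, circularity

round_rbc = shape_factors(4.0, 4.0)    # discocyte seen face-on: circle-like
sickle_rbc = shape_factors(8.0, 1.5)   # elongated sickle form
```

For a circle the aspect ratio and circularity are both 1 and the eccentricity 0; the sickle form scores high aspect ratio and eccentricity and low circularity, which is what makes these indices discriminative.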

  6. Voltammetric Quantification of Paraquat and Glyphosate in Surface Waters

    Directory of Open Access Journals (Sweden)

    William Roberto Alza-Camacho

    2016-09-01

Full Text Available The indiscriminate use of pesticides on crops has a negative environmental impact that affects organisms, soil and water resources essential for life. Therefore, it is necessary to evaluate the residual effect of these substances in water sources. A simple, affordable and accessible electrochemical method for Paraquat and Glyphosate quantification in water was developed. The study was conducted using Britton-Robinson buffer solution as the supporting electrolyte, a glassy carbon working electrode, Ag/AgCl as the reference electrode, and platinum as the auxiliary electrode. Differential pulse voltammetry (DPV) methods for both compounds were validated. The methods showed linearity with correlation coefficients of 0.9949 and 0.9919, and the limits of detection and quantification were 130 and 190 mg/L for Paraquat and 40 and 50 mg/L for glyphosate. Comparison with the reference method showed that the electrochemical method provides superior results in quantification of the analytes. In the samples tested, Paraquat concentrations ranged from 0.011 to 1.572 mg/L and glyphosate from 0.201 to 2.777 mg/L, indicating that these compounds are present in water sources and may be causing serious problems for human health.
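Validation figures like these come from a least-squares calibration line; detection and quantification limits are then commonly taken as 3.3σ/slope and 10σ/slope of the blank. A sketch with hypothetical DPV currents (the conventions are standard practice, not taken from this paper):

```python
import math

def linreg(xs, ys):
    """Least-squares slope, intercept and correlation coefficient r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

# Hypothetical DPV peak currents (uA) vs. pesticide concentration (mg/L)
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
curr = [1.1, 2.0, 4.1, 8.0, 16.2]
slope, intercept, r = linreg(conc, curr)

sigma_blank = 0.05                 # assumed std. dev. of blank responses
lod = 3.3 * sigma_blank / slope    # limit of detection
loq = 10.0 * sigma_blank / slope   # limit of quantification
```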

  7. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    International Nuclear Information System (INIS)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip; Meinel, Felix G.; Geyer, Lucas L.; Schoepf, U.J.; Apfaltrer, Paul; Canstein, Christian; De Cecco, Carlo Nicola

    2014-01-01

This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)
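The Bland-Altman analysis used here computes the bias between paired measurements and the 95% limits of agreement (bias ± 1.96 SD of the differences). A self-contained sketch with hypothetical paired EFV values:

```python
def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired EFV measurements (mL): automated vs. manual
auto_ml   = [55.0, 62.1, 48.3, 70.2, 66.0]
manual_ml = [58.2, 66.5, 51.0, 75.1, 70.9]
bias, lo, hi = bland_altman(auto_ml, manual_ml)
inside = sum(1 for a, m in zip(auto_ml, manual_ml) if lo <= a - m <= hi)
```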

  8. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    Science.gov (United States)

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variabilities because of upstream sample handling or incomplete trypsin digestion still need to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
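At its core, stable isotope dilution quantification scales the known spiked amount of labeled standard by the light/heavy peak-area ratio from the SRM traces. A minimal sketch; the peak areas and spike amount below are invented for illustration:

```python
def isotope_dilution_amount(light_area, heavy_area, spiked_fmol):
    """Endogenous analyte amount from the light/heavy SRM peak-area ratio.

    A known amount of isotope-labeled standard (e.g. a PSAQ-style full-length
    protein, digested alongside the sample) is spiked in; the unlabeled
    analyte then follows by proportionality.
    """
    return (light_area / heavy_area) * spiked_fmol

# Hypothetical SRM peak areas for one proteotypic peptide
amount = isotope_dilution_amount(light_area=8.4e5, heavy_area=2.1e5,
                                 spiked_fmol=25.0)  # 100.0 fmol
```

Because the full-length labeled protein is spiked before digestion, losses during sample handling and incomplete trypsin digestion affect light and heavy species alike and cancel in the ratio.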

  9. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  10. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
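Spearman's coefficient used in the sensitivity analysis is simply the Pearson correlation of the ranks, which captures monotone (not just linear) input-output relationships. A dependency-free sketch with hypothetical inlet-temperature/MDNBR samples:

```python
def _ranks(xs):
    """Ranks starting at 1; ties receive their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2.0 + 1.0
        i = j + 1
    return r

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation applied to the ranks."""
    return pearson(_ranks(xs), _ranks(ys))

# Hypothetical samples: coolant inlet temperature (K) vs. resulting MDNBR
t_inlet = [560.0, 562.5, 565.0, 567.5, 570.0, 572.5]
mdnbr   = [2.10, 2.02, 1.95, 1.90, 1.82, 1.76]
rho = spearman(t_inlet, mdnbr)  # perfectly monotone decreasing
```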

  11. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.
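Gene length is one reason absolute and relative quantification can diverge. As a generic illustration of a standard within-sample normalization (TPM, not this paper's benchmarking pipeline), with hypothetical counts and effective lengths:

```python
def tpm(counts, lengths_kb):
    """Transcripts per million: normalize counts by length, then scale to 1e6."""
    rates = [c / l for c, l in zip(counts, lengths_kb)]
    total = sum(rates)
    return [r / total * 1e6 for r in rates]

# Hypothetical counts and effective lengths (kb) for three genes in one sample
vals = tpm(counts=[100, 500, 400], lengths_kb=[1.0, 2.5, 4.0])
# Within-sample TPM values sum to one million by construction
```

Note how the longest gene, despite having four times the counts of the shortest, receives the same TPM: a length effect that raw counts would hide.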

  12. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created a R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  13. Quantifications and Modeling of Human Failure Events in a Fire PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol

    2014-01-01

USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI) In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, based on the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the preexisting internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified as risk-significant, detailed approaches (modeling and quantification) were used to incorporate fire situations into it. Multiple HFEs were defined for a single human action and quantified separately to reflect the specific fire situations. From this study, we can confirm that the modeling as well as the quantification of human actions is very important for treating them appropriately in PSA logic structures.
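The screening-style adjustments described above (a ×10 multiplier on internal-events HEPs, ex-MCR HEPs set to one) can be sketched directly; capping at 1.0 reflects that an HEP is a probability. The function name and base values are illustrative:

```python
def fire_adjusted_hep(base_hep, multiplier=10.0):
    """Screening-style HEP under fire conditions: scale the internal-events
    value, capping at 1.0 since an HEP is a probability."""
    return min(1.0, base_hep * multiplier)

internal_heps = [1.0e-3, 5.0e-2, 3.0e-1]
fire_heps = [fire_adjusted_hep(p) for p in internal_heps]
ex_mcr_hep = 1.0  # ex-MCR actions conservatively assumed to fail
```

Actions whose screened HEP dominates the risk would then be revisited with the detailed modeling the paper describes, rather than left at the conservative screening value.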

  14. Quantifications and Modeling of Human Failure Events in a Fire PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Kim, Kilyoo; Jang, Seung-Cheol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

USNRC and EPRI developed guidance, 'Fire Human Reliability Analysis Guidelines, NUREG-1921', for estimating human error probabilities (HEPs) for HFEs under fire conditions. NUREG-1921 classifies HFEs into four types associated with the following human actions: - Type 1: New and existing Main Control Room (MCR) actions - Type 2: New and existing ex-MCR actions - Type 3: Actions associated with using alternate shutdown means (ASD) - Type 4: Actions relating to errors of commission (EOCs) or errors of omission (EOOs) as a result of incorrect indications (SPI) In this paper, approaches for the quantification and modeling of HFEs related to Type 1, 2 and 3 human actions are introduced, based on the human reliability analysis process for a fire PSA of Hanul Unit 3. A multiplier of 10 was used to re-estimate the HEPs for the preexisting internal human actions. The HEPs for all ex-MCR actions were assumed to be one. New MCR human actions were quantified using the scoping analysis method of NUREG-1921. If a quantified human action was identified as risk-significant, detailed approaches (modeling and quantification) were used to incorporate fire situations into it. Multiple HFEs were defined for a single human action and quantified separately to reflect the specific fire situations. From this study, we can confirm that the modeling as well as the quantification of human actions is very important for treating them appropriately in PSA logic structures.

  15. Rapid quantification of vesicle concentration for DOPG/DOPC and Cardiolipin/DOPC mixed lipid systems of variable composition.

    Science.gov (United States)

    Elmer-Dixon, Margaret M; Bowler, Bruce E

    2018-05-19

A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time-consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input to a Matlab program that calculates the total vesicle concentration and the concentrations of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.

  16. Approach for discrimination and quantification of electroactive species: kinetics difference revealed by higher harmonics of Fourier transformed sinusoidal voltammetry.

    Science.gov (United States)

    Fang, Yishan; Huang, Xinjian; Wang, Lishi

    2015-01-06

Discrimination and quantification of electroactive species are traditionally realized by a potential difference, which is mainly determined by thermodynamics. However, the resolution of this approach is limited to tens of millivolts. In this paper, we describe an application of Fourier transformed sinusoidal voltammetry (FT-SV) that provides a new approach for discrimination and quantitative evaluation of electroactive species, especially thermodynamically similar ones. Numerical simulation indicates that differences in electron transfer kinetics between electroactive species can be revealed by the phase angle of higher-order harmonics of FT-SV, and that the difference is amplified order by order. Thus, even a very subtle kinetic difference can be amplified until it becomes distinguishable at a certain order of harmonics. The method was verified with structurally similar ferrocene derivatives chosen as model systems. Although these molecules have very close redox potentials, they could be distinguished at higher harmonics. The results demonstrate the feasibility and reliability of the method. They also imply that combining the traditional thermodynamic method with this kinetic method can form a two-dimensionally resolved detection method, with the potential to extend the resolution of voltammetric techniques to a new level.
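Extracting the phase angle of a given harmonic from a periodic current record amounts to evaluating a single bin of the discrete Fourier transform. A self-contained sketch on a synthetic signal (not real FT-SV data): a fundamental plus a third harmonic with a 0.5 rad phase shift, which the single-bin sum recovers:

```python
import cmath
import math

def harmonic_phase(signal, n_harmonic, periods):
    """Phase angle of the n-th harmonic via a single-bin discrete Fourier sum.

    `periods` is the number of fundamental periods contained in the record,
    so bin n_harmonic * periods holds the n-th harmonic.
    """
    N = len(signal)
    k = n_harmonic * periods  # DFT bin index of the n-th harmonic
    z = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / N) for t in range(N))
    return cmath.phase(z)

# Synthetic current: fundamental plus a phase-shifted third harmonic
N, periods = 1000, 10
sig = [math.cos(2.0 * math.pi * periods * t / N)
       + 0.2 * math.cos(3.0 * 2.0 * math.pi * periods * t / N + 0.5)
       for t in range(N)]
phi3 = harmonic_phase(sig, 3, periods)  # recovers the 0.5 rad phase shift
```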

  17. Towards a new method for the quantification of metabolites in the biological sample

    International Nuclear Information System (INIS)

    Neugnot, B.

    2005-03-01

The quantification of metabolites is a key step in drug development. The aim of this Ph.D. work was to study the feasibility of a new method for this quantification, in the biological sample, without the drawbacks (cost, time, ethics) of the classical quantification methods based on metabolite synthesis or administration of the radiolabelled drug to man. Our strategy consists in determining the response factor, in mass spectrometry, of the metabolites. This approach is based on tritium labelling of the metabolites, ex vivo, by isotopic exchange. The labelling step was studied with deuterium. Metabolites of a model drug, recovered from in vitro or urinary samples, were labelled by three routes (Crabtree's catalyst in D2, deuterated trifluoroacetic acid, or rhodium chloride in D2O). Then, the transposition to tritium labelling was studied, and the first results are very promising for the ultimate validation of the method. (author)

  18. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantification of chest abnormalities are usually based on computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is often not feasible because of the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm was developed for computational processing of the exams in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. In a Bland-Altman analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)

  19. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    Science.gov (United States)

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low abundance transcripts. The critical step for quantification is accurate estimation of efficiency needed for computing a relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling dynamics of polymerase chain reaction amplification. In contrast, only models for fluorescence intensity as a function of polymerase chain reaction cycle have been used so far for quantification. The dynamics of qRT-PCR efficiency is modeled using an ordinary differential equation model, and the fitted ordinary differential equation model is used to obtain effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for association between time to recurrence and longitudinal trends in GUCY2C expression.
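The efficiency estimates described here feed an efficiency-adjusted relative quantification. As a hedged sketch (this is the standard Pfaffl-style ratio that such estimates plug into, not the paper's ODE model), the adjustment looks like:

```python
def expression_ratio(e_target, e_ref, dct_target, dct_ref):
    """Efficiency-adjusted relative expression (Pfaffl-style):
    ratio = E_target**dCt_target / E_ref**dCt_ref,
    where dCt = Ct(control) - Ct(sample) and E is the per-cycle
    amplification efficiency (2.0 = perfect doubling)."""
    return (e_target ** dct_target) / (e_ref ** dct_ref)

# With perfect efficiency (E = 2) and the target amplifying 3 cycles
# earlier in the sample than in the control, expression is 8-fold:
r = expression_ratio(2.0, 2.0, 3.0, 0.0)  # -> 8.0
```

Accurate E matters because the ratio depends on it exponentially: a small error in E compounds over every cycle of difference.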

  1. Deep-Dive Targeted Quantification for Ultrasensitive Analysis of Proteins in Nondepleted Human Blood Plasma/Serum and Tissues

    Energy Technology Data Exchange (ETDEWEB)

    Nie, Song [Biological Sciences Division; Shi, Tujin [Biological Sciences Division; Fillmore, Thomas L. [Biological Sciences Division; Schepmoes, Athena A. [Biological Sciences Division; Brewer, Heather [Biological Sciences Division; Gao, Yuqian [Biological Sciences Division; Song, Ehwang [Biological Sciences Division; Wang, Hui [Biological Sciences Division; Rodland, Karin D. [Biological Sciences Division; Qian, Wei-Jun [Biological Sciences Division; Smith, Richard D. [Biological Sciences Division; Liu, Tao [Biological Sciences Division

    2017-08-11

    Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundant but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at ~10 pg/mL level in nondepleted serum and at <10 copies per cell level in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibody is not available.

  2. Cytochrome c oxidase subunit 1-based human RNA quantification to enhance mRNA profiling in forensic biology

    Directory of Open Access Journals (Sweden)

    Dong Zhao

    2017-01-01

    Full Text Available RNA analysis offers many potential applications in forensic science, and molecular identification of body fluids by analysis of cell-specific RNA markers represents a new technique for use in forensic cases. However, because forensic materials are often admixed with nonhuman cellular components, human-specific RNA quantification is required for forensic RNA assays. In the present study, a quantification assay for human RNA was developed for body fluid samples in forensic biology. The quantitative assay is based on real-time reverse transcription-polymerase chain reaction of the mitochondrial RNA cytochrome c oxidase subunit I and is capable of RNA quantification with high reproducibility and a wide dynamic range. The human RNA quantification improves the quality of mRNA profiling in the identification of body fluids such as saliva and semen, because the quantification assay can exclude the influence of nonhuman components and reduce the adverse effects of degraded RNA fragments.
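For orientation, a typical real-time assay converts measured Ct values to input quantities through a standard curve; the slope and intercept below are illustrative assumptions, not this assay's calibration:

```python
def quantity_from_ct(ct, slope, intercept):
    """Absolute quantification from a standard curve fitted on dilution
    series: Ct = slope * log10(Q) + intercept, solved for Q."""
    return 10 ** ((ct - intercept) / slope)

def efficiency_from_slope(slope):
    """Per-cycle amplification efficiency implied by the standard-curve
    slope; a slope near -3.32 corresponds to ~100% efficiency."""
    return 10 ** (-1.0 / slope) - 1.0

# Illustrative calibration (hypothetical values):
eff = efficiency_from_slope(-3.3219)   # ~1.0, i.e. ~100% efficiency
q = quantity_from_ct(25.0, -3.3219, 38.0)
```

The wide dynamic range claimed by such assays corresponds to the Ct-to-quantity relation staying log-linear over several orders of magnitude of input.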

  3. Adjustable typography: an approach to enhancing low vision text accessibility.

    Science.gov (United States)

    Arditi, Aries

    2004-04-15

    Millions of people have low vision, a disability condition caused by uncorrectable or partially correctable disorders of the eye. The primary goal of low vision rehabilitation is increasing access to printed material. This paper describes how adjustable typography, a computer graphic approach to enhancing text accessibility, can play a role in this process by allowing visually impaired users to customize fonts to maximize legibility according to their own visual needs. Prototype software and initial testing of the concept are described. The results show that visually impaired users tend to produce a variety of very distinct fonts, and that the adjustment process results in greatly enhanced legibility. However, this initial testing has not yet demonstrated increases in legibility over and above that of highly legible standard fonts such as Times New Roman.

  4. Pedoinformatics Approach to Soil Text Analytics

    Science.gov (United States)

    Furey, J.; Seiter, J.; Davis, A.

    2017-12-01

    The several extant schema for the classification of soils rely on differing criteria, but the major soil science taxonomies, including the United States Department of Agriculture (USDA) and the international harmonized World Reference Base for Soil Resources systems, are based principally on inferred pedogenic properties. These taxonomies largely result from compiled individual observations of soil morphologies within soil profiles, and the vast majority of this pedologic information is contained in qualitative text descriptions. We present text mining analyses of hundreds of gigabytes of parsed text and other data in the digitally available USDA soil taxonomy documentation, the Soil Survey Geographic (SSURGO) database, and the National Cooperative Soil Survey (NCSS) soil characterization database. These analyses implemented iPython calls to Gensim modules for topic modelling, with latent semantic indexing completed down to the lowest taxon level (soil series) paragraphs. Via a custom extension of the Natural Language Toolkit (NLTK), approximately one percent of the USDA soil series descriptions were used to train a classifier for the remainder of the documents, essentially by treating soil science words as comprising a novel language. While location-specific descriptors at the soil series level are amenable to geomatics methods, unsupervised clustering of the occurrence of other soil science words did not closely follow the usual hierarchy of soil taxa. We present preliminary phrasal analyses that may account for some of these effects.
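Latent semantic indexing of the kind the Gensim modules provide reduces a term-document matrix by truncated SVD; a minimal NumPy sketch on a toy corpus (not the soil data) is:

```python
import numpy as np

# Toy corpus: two "soil" documents share vocabulary; the third is unrelated.
docs = ["loam sand clay", "clay loam silt", "topic model corpus"]
vocab = sorted({w for d in docs for w in d.split()})
# Term-document count matrix (rows = terms, columns = documents).
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# LSI: truncated SVD keeping k latent topics, then project each document.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim vector per document

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The two soil documents land close together in the latent space,
# while the unrelated one stays orthogonal:
sim_soil = cos(doc_vecs[0], doc_vecs[1])
sim_cross = cos(doc_vecs[0], doc_vecs[2])
```

On a real corpus the matrix would be weighted (e.g. tf-idf) before the SVD, and similarity in this reduced space is what lets the classifier treat soil descriptions as a "novel language."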

  5. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Jackson, Christopher B.; Gallati, Sabina; Schaller, André

    2012-01-01

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be specified, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA) we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time different degradation impact of the two
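The extrapolation of decay constants can be illustrated with a toy calculation: assuming detectable copies fall off exponentially with amplicon length, N(L) = N0·exp(−λL), a log-linear fit per genome yields λ and the intact-template copy number. All numbers below are synthetic:

```python
import math

def decay_constant(lengths, copies):
    """Least-squares fit of ln(copies) = ln(N0) - lam * length.
    Returns (lam, N0): the per-base decay constant and the extrapolated
    copy number of intact (length -> 0) template."""
    n = len(lengths)
    xs, ys = lengths, [math.log(c) for c in copies]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return -slope, math.exp(ybar - slope * xbar)

# Synthetic serial-amplicon data: here mtDNA degrades more slowly
# (smaller lambda) than nuclear DNA, as in the paper's observation
# of disparate preservation.
lengths = [100, 200, 400, 800]                       # amplicon sizes (bp)
mt  = [1000 * math.exp(-0.001 * L) for L in lengths]  # detected mtDNA copies
nuc = [10   * math.exp(-0.004 * L) for L in lengths]  # detected nDNA copies
lam_mt, n0_mt = decay_constant(lengths, mt)
lam_n,  n0_n  = decay_constant(lengths, nuc)
# Degradation-corrected mtDNA copies per nuclear genome equivalent:
ratio = n0_mt / n0_n
```

Comparing the raw ratio at any single amplicon size against the extrapolated `ratio` shows how disparate λ values inflate or deflate apparent mtDNA copy number in degraded samples.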

  6. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be specified, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA) we present an approach to possibly correct measurements in

  7. Texts, Transmissions, Receptions. Modern Approaches to Narratives

    NARCIS (Netherlands)

    Lardinois, A.P.M.H.; Levie, S.A.; Hoeken, H.; Lüthy, C.H.

    2015-01-01

    The papers collected in this volume study the function and meaning of narrative texts from a variety of perspectives. The word 'text' is used here in the broadest sense of the term: it denotes literary books, but also oral tales, speeches, newspaper articles and comics. One of the purposes of this

  8. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

    Full Text Available For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
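The stochastic-sampling step common to both methods can be sketched with a one-group toy model; the cross sections and their uncertainties below are illustrative assumptions, not benchmark data:

```python
import random
import statistics

def k_inf(nu_sigma_f, sigma_a):
    """Infinite-medium multiplication factor for a one-group toy model."""
    return nu_sigma_f / sigma_a

random.seed(1)
samples = []
for _ in range(2000):
    # Sample cross sections from their (assumed) uncertainty distributions,
    # then run the "core simulator" (here, a trivial formula) on each draw:
    nsf = random.gauss(0.070, 0.0014)   # nu*Sigma_f (1/cm), 2% rel. std. dev.
    sa  = random.gauss(0.060, 0.0018)   # Sigma_a (1/cm), 3% rel. std. dev.
    samples.append(k_inf(nsf, sa))

k_mean = statistics.mean(samples)
k_std = statistics.stdev(samples)
rel_unc = k_std / k_mean   # roughly sqrt(0.02**2 + 0.03**2) ~ 3.6%
```

A real XSUSA-style study replaces the formula with a full lattice/core calculation per sample; the sampling and output statistics are otherwise the same pattern.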

  9. Improving the reliability of POD curves in NDI methods using a Bayesian inversion approach for uncertainty quantification

    Science.gov (United States)

    Ben Abdessalem, A.; Jenson, F.; Calmon, P.

    2016-02-01

    This contribution provides an example of the possible advantages of adopting a Bayesian inversion approach to uncertainty quantification in nondestructive inspection methods. In such problem, the uncertainty associated to the random parameters is not always known and needs to be characterised from scattering signal measurements. The uncertainties may then correctly propagated in order to determine a reliable probability of detection curve. To this end, we establish a general Bayesian framework based on a non-parametric maximum likelihood function formulation and some priors from expert knowledge. However, the presented inverse problem is time-consuming and computationally intensive. To cope with this difficulty, we replace the real model by a surrogate one in order to speed-up the model evaluation and to make the problem to be computationally feasible for implementation. The least squares support vector regression is adopted as metamodelling technique due to its robustness to deal with non-linear problems. We illustrate the usefulness of this methodology through the control of tube with enclosed defect using ultrasonic inspection method.
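For context, a common signal-response POD model (the "â versus a" formulation, not the paper's Bayesian procedure) expresses detection probability through a Gaussian CDF; the parameters below are illustrative:

```python
import math

def pod(a, beta0, beta1, sigma, threshold):
    """Probability of detection for defect size a under the 'a-hat vs a'
    model: ln(signal) = beta0 + beta1*ln(a) + N(0, sigma); a defect is
    detected when its signal exceeds the decision threshold."""
    z = (beta0 + beta1 * math.log(a) - threshold) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative parameters (hypothetical, not from the paper):
b0, b1, sg, thr = 0.0, 1.0, 0.5, 1.0
small = pod(0.5, b0, b1, sg, thr)
mid   = pod(math.e, b0, b1, sg, thr)   # size where mean signal = threshold
large = pod(20.0, b0, b1, sg, thr)
```

The paper's point is that `sigma` and the other parameters are themselves uncertain; the Bayesian inversion characterizes them from measurements before the POD curve is deemed reliable.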

  10. Interdisciplinarity in translation teaching: competence-based education, translation task-based approach, context-based text typology

    Directory of Open Access Journals (Sweden)

    Edelweiss Vitol Gysel

    2017-05-01

    Full Text Available In the context of competence-based teaching, this paper draws upon the model of Translation Competence (TC) put forward by the PACTE group (2003) to establish a dialogue between cognitive-constructivist paradigms for translation teaching and the model of the Context-based Text Typology (MATTHIESSEN et al., 2007). In this theoretical environment, it proposes a model for the design of a Teaching Unit (TU) for the development of bilingual competence in would-be translators. To this end, it explores translation as a cognitive, communicative and textual activity (HURTADO ALBIR, 2011) and considers its teaching from the translation task-based approach (HURTADO ALBIR, 1999). This approach is illustrated through the practical example of the design of a TU elaborated for the subject ‘Introduction to Specialized Translation’, part of the curricular grid of the program ‘Secretariado Executivo’ at Universidade Federal de Santa Catarina. Aspects such as the establishment of learning objectives and their alignment with the translation tasks composing the TU are addressed for this specific pedagogical situation. We argue for the development of textual competences by means of the acquisition of strategies derived from the Context-based Text Typology to solve problems arising from the translation of different text types and contextual configurations.

  11. Construction and Quantification of the One Top model of the Fire Events PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Lee, Yoon Hwan; Han, Sang Hoon

    2008-01-01

    KAERI constructed the one top model of the fire events PSA for Ulchin Units 3 and 4 by using the 'mapping technique'. The mapping technique was developed for the construction and quantification of external events PSA models with a one top model for an internal events PSA. With 'AIMS', the mapping technique can be implemented through the construction of mapping tables. The mapping tables include fire rooms, fire ignition frequencies, related initiating events, fire transfer events, and the internal PSA basic events affected by a fire. The constructed one top fire PSA model is based on previously conducted fire PSA results for Ulchin Units 3 and 4. In this paper, we introduce the construction procedure and quantification results of the one top model of the fire events PSA using the mapping technique. As the one top model of the fire events PSA developed in this study is based on the previous study, we also introduce the previous fire PSA approach, focusing on quantification.

  12. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  13. An Approach to Retrieval of OCR Degraded Text

    Directory of Open Access Journals (Sweden)

    Yuen-Hsien Tseng

    1998-12-01

    Full Text Available The major problem with retrieval of OCR text is the unpredictable distortion of characters due to recognition errors. Because users have no idea of such distortion, the terms they query can hardly match the terms stored in the OCR text exactly. Thus retrieval effectiveness is significantly reduced, especially for low-quality input. To reduce the losses from retrieving such noisy OCR text, a fault-tolerant retrieval strategy based on automatic keyword extraction and fuzzy matching is proposed. In this strategy, terms, correct or not, and their term frequencies are extracted from the noisy text and presented for browsing and selection in response to users' initial queries. With an understanding of the real terms stored in the noisy text and of their estimated frequency distributions, users may then choose appropriate terms for a more effective search. A text retrieval system based on this strategy has been built, and examples are given to demonstrate its effectiveness. Finally, some OCR issues for further enhancing retrieval effectiveness are discussed.
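The fuzzy-matching step can be approximated with the standard library; the OCR distortions below are invented examples:

```python
import difflib

# Terms as they might survive noisy OCR (illustrative distortions):
ocr_terms = ["recognitlon", "retrieva1", "keyword", "fuzzv", "matching"]

def fuzzy_lookup(query, terms, cutoff=0.6):
    """Return OCR terms similar to the query, best match first.
    difflib scores string similarity in [0, 1]; terms below the
    cutoff are discarded."""
    return difflib.get_close_matches(query, terms, n=3, cutoff=cutoff)

hits = fuzzy_lookup("retrieval", ocr_terms)
```

Presenting such candidate terms (with their frequencies) back to the user, rather than silently failing the exact match, is the core of the fault-tolerant strategy described.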

  14. Computer-assisted imaging algorithms facilitate histomorphometric quantification of kidney damage in rodent renal failure models

    Directory of Open Access Journals (Sweden)

    Marcin Klapczynski

    2012-01-01

    Full Text Available Introduction: Surgical 5/6 nephrectomy and adenine-induced kidney failure in rats are frequently used models of progressive renal failure. In both models, rats develop significant morphological changes in the kidneys, and quantification of these changes can be used to measure the efficacy of prophylactic or therapeutic approaches. In this study, the Aperio Genie Pattern Recognition technology, along with the Positive Pixel Count, Nuclear, and Rare Event algorithms, was used to quantify histological changes in both rat renal failure models. Methods: Analysis was performed on digitized slides of whole kidney sagittal sections stained with either hematoxylin and eosin or immunohistochemistry with an anti-nestin antibody to identify glomeruli, regenerating tubular epithelium, and tubulointerstitial myofibroblasts. An anti-polymorphonuclear neutrophil (PMN) antibody was also used to investigate neutrophil tissue infiltration. Results: Image analysis allowed for rapid and accurate quantification of relevant histopathologic changes such as increased cellularity and expansion of glomeruli, renal tubular dilatation and degeneration, tissue inflammation, and mineral aggregation. The algorithms provided reliable and consistent results in both control and experimental groups and presented a quantifiable degree of damage associated with each model. Conclusion: These algorithms represent useful tools for the uniform and reproducible characterization of common histomorphologic features of renal injury in rats.

  15. Assessment of probiotic viability during Cheddar cheese manufacture and ripening using propidium monoazide-PCR quantification

    Directory of Open Access Journals (Sweden)

    Emilie eDesfossés-Foucault

    2012-10-01

    Full Text Available The use of a suitable food carrier such as cheese could significantly enhance probiotic viability during storage. The main goal of this study was to assess the viability of commercial probiotic strains during Cheddar cheesemaking and ripening (four to six months) by comparing the efficiency of microbiological and molecular approaches. Molecular methods such as quantitative PCR (qPCR) allow bacterial quantification, and DNA-blocking molecules such as propidium monoazide (PMA) select only the living cells' DNA. Cheese samples were manufactured with a lactococci starter and with one of three probiotic strains (Bifidobacterium animalis subsp. lactis BB-12, Lactobacillus rhamnosus RO011 or Lactobacillus helveticus RO052) or a mixed culture containing B. animalis subsp. lactis BB-12 and L. helveticus RO052 (MC1), both lactobacilli strains (MC2), or all three strains (MC3). DNA extractions were then carried out on PMA-treated and non-treated cell pellets in order to assess PMA treatment efficiency, followed by quantification using the 16S rRNA gene, the elongation factor Tu gene (tuf), or the transaldolase gene (tal). Results with intact/dead ratios of bacteria showed that PMA-treated cheese samples had significantly lower bacterial counts than non-treated DNA samples (P<0.005), confirming that PMA did eliminate dead bacteria from PCR quantification. For both quantification methods, the addition of probiotic strains seemed to accelerate the loss of lactococci viability in comparison to control cheese samples, especially when L. helveticus RO052 was added. Viability of all three probiotic strains was also significantly reduced in mixed culture cheese samples (P<0.0001), B. animalis subsp. lactis BB-12 being the most sensitive to the presence of other strains. However, all probiotic strains did retain their viability (9 log CFU/g of cheese) throughout ripening. This study was successful in monitoring living probiotic species in Cheddar cheese samples through PMA-qPCR.
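As a hedged illustration of how PMA-qPCR separates intact from dead cells: assuming PMA fully blocks amplification from membrane-compromised cells and a per-cycle efficiency E, the intact fraction follows from the Ct shift between treated and untreated aliquots:

```python
def intact_fraction(ct_pma, ct_total, efficiency=2.0):
    """Fraction of cells with intact membranes, estimated from the Ct
    shift between a PMA-treated aliquot and an untreated one. PMA blocks
    DNA from dead cells, so a later (higher) Ct in the treated aliquot
    means fewer intact cells contributed template."""
    return efficiency ** -(ct_pma - ct_total)

# If PMA treatment delays the signal by exactly 1 cycle at perfect
# doubling, about half the cells were intact:
f = intact_fraction(ct_pma=21.0, ct_total=20.0)
```

In practice E is calibrated per target (e.g. from a standard curve on tuf or tal), and the assumption that PMA blocking is complete is itself validated with heat-killed controls.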

  16. Large differences in land use emission quantifications implied by definition discrepancies

    Science.gov (United States)

    Stocker, B. D.; Joos, F.

    2015-03-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling) but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review the conceptual differences of eLUC quantification methods and apply an Earth System Model to demonstrate that what is claimed to represent total eLUC differs by up to ~20% when quantified from ESM vs. offline vegetation models. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate component fluxes of eLUC arising from the replacement of potential C sinks/sources and the land use feedback and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively not identical to either, nor their sum. Therefore, we argue that synthesis studies and global carbon budget accountings should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.

  17. Predicting Text Comprehension, Processing, and Familiarity in Adult Readers: New Approaches to Readability Formulas

    Science.gov (United States)

    Crossley, Scott A.; Skalicky, Stephen; Dascalu, Mihai; McNamara, Danielle S.; Kyle, Kristopher

    2017-01-01

    Research has identified a number of linguistic features that influence the reading comprehension of young readers; yet, less is known about whether and how these findings extend to adult readers. This study examines text comprehension, processing, and familiarity judgment provided by adult readers using a number of different approaches (i.e.,…

  18. Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions

    International Nuclear Information System (INIS)

    Shah, Harsheel; Hosder, Serhat; Winter, Tyler

    2015-01-01

    The objective of this paper is to implement Dempster–Shafer Theory of Evidence (DSTE) in the presence of mixed (aleatory and multiple sources of epistemic) uncertainty to the reliability and performance assessment of complex engineering systems through the use of quantification of margins and uncertainties (QMU) methodology. This study focuses on quantifying the simulation uncertainties, both in the design condition and the performance boundaries along with the determination of margins. To address the possibility of multiple sources and intervals for epistemic uncertainty characterization, DSTE is used for uncertainty quantification. An approach to incorporate aleatory uncertainty in Dempster–Shafer structures is presented by discretizing the aleatory variable distributions into sets of intervals. In view of excessive computational costs for large scale applications and repetitive simulations needed for DSTE analysis, a stochastic response surface based on point-collocation non-intrusive polynomial chaos (NIPC) has been implemented as the surrogate for the model response. The technique is demonstrated on a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems. Finally, the QMU approach is demonstrated on a multi-disciplinary analysis of a high speed civil transport (HSCT). - Highlights: • Quantification of margins and uncertainties (QMU) methodology with evidence theory. • Treatment of both inherent and epistemic uncertainties within evidence theory. • Stochastic expansions for representation of performance metrics and boundaries. • Demonstration of QMU on an analytical problem. • QMU analysis applied to an aerospace system (high speed civil transport)
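The Dempster-Shafer structures referred to here assign basic probability masses to focal intervals; belief and plausibility then bracket the probability that a response lies in a performance region. A minimal sketch with invented evidence:

```python
def belief_plausibility(focal_elements, lo, hi):
    """Belief and plausibility that a quantity lies in [lo, hi], given
    focal intervals (a, b) with basic probability assignments m.
    Belief sums m over intervals fully inside [lo, hi]; plausibility
    sums m over intervals that merely intersect it."""
    bel = sum(m for (a, b), m in focal_elements if lo <= a and b <= hi)
    pl = sum(m for (a, b), m in focal_elements if a <= hi and b >= lo)
    return bel, pl

# Illustrative evidence from two hypothetical sources about a normalized
# performance metric (intervals and masses are invented):
evidence = [((0.0, 0.4), 0.5), ((0.3, 0.7), 0.3), ((0.6, 1.0), 0.2)]
bel, pl = belief_plausibility(evidence, 0.0, 0.5)
```

The gap between `bel` and `pl` is the epistemic uncertainty; in the QMU setting, margins are assessed against both bounds, and the expensive model response inside each interval is what the NIPC surrogate replaces.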

  19. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2004-03-01

    The paper describes a technique denoted Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  20. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    International Nuclear Information System (INIS)

    Reer, B.

    2004-01-01

    The paper describes a technique denoted Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  1. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is essential for obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time PCR-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experimental results. In this study, we examined the degree to which DNA quantification results are altered in DNA samples containing a PCR inhibitor, using a Quantifiler® Human DNA Quantification kit. For the experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. DNA quantity was mostly undetermined when the HA concentration was higher than 4.8 ng/μl. The C(T) values of an internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained in DNA samples with high IPC C(T) values. Thus, researchers should interpret DNA quantification results carefully. We additionally examined the effects of HA on STR amplification by using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, a better understanding of the various effects of HA should help researchers recognize and handle samples containing HA.
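
The arithmetic underlying real-time quantification can be sketched with generic standard-curve code. This is an illustration of the general Ct-versus-log-concentration relation, not the Quantifiler kit's internal algorithm; the dilution series and Ct values are invented and describe an ideal 100%-efficiency assay.

```python
# Fit Ct against log10(concentration) for a dilution series, then invert
# the line for unknowns.

def fit_standard_curve(log10_conc, ct):
    """Least-squares line ct = slope * log10(conc) + intercept."""
    n = len(ct)
    mx = sum(log10_conc) / n
    my = sum(ct) / n
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, ct))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    return 10.0 ** ((ct - intercept) / slope)

log_conc = [1.0, 0.0, -1.0, -2.0]        # log10(ng/ul), 10-fold dilutions
cts = [24.0, 27.32, 30.64, 33.96]        # ideal slope of -3.32 cycles/decade
slope, intercept = fit_standard_curve(log_conc, cts)
conc = quantify(28.98, slope, intercept)  # unknown sample, ~0.32 ng/ul
```

A PCR inhibitor such as HA shifts Ct upward, which this inversion then misreads as a lower concentration, consistent with the underestimation the study reports.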

  2. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    Science.gov (United States)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    This article reviews novel quantification concepts in which elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections, an overview of general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling will be highlighted, and analytical as well as biomedical applications will be presented. A special focus will lie on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field will be summarized, and a perspective on future developments, including sophisticated and innovative applications, will be given. PMID:23062431

  3. Label-free DNA quantification via a 'pipette, aggregate and blot' (PAB) approach with magnetic silica particles on filter paper.

    Science.gov (United States)

    Li, Jingyi; Liu, Qian; Alsamarri, Hussein; Lounsbury, Jenny A; Haversitick, Doris M; Landers, James P

    2013-03-07

    Reliable measurement of DNA concentration is essential for a broad range of applications in biology and molecular biology, and for many of these, quantifying the nucleic acid content is inextricably linked to obtaining optimal results. In its simplest form, quantitative analysis of nucleic acids can be accomplished by UV-Vis absorbance and, in a more sophisticated format, by fluorimetry. A recently reported concept, the 'pinwheel assay', involves a label-free approach for quantifying DNA through aggregation of paramagnetic beads in a rotating magnetic field. Here, we describe a simplified version of that assay adapted for execution using only a pipette and filter paper. The 'pipette, aggregate, and blot' (PAB) approach allows DNA to induce bead aggregation in a pipette tip through exposure to a magnetic field, followed by dispensing (blotting) onto filter paper. The filter paper immortalizes the extent of aggregation, and digital images of the immortalized bead conformation, acquired with either a document scanner or a cell phone camera, allow for DNA quantification using a noncomplex algorithm. Human genomic DNA samples extracted from blood are quantified with the PAB approach and the results used to define the volume of sample added to a PCR that is sensitive to the input mass of template DNA. Integrating the PAB assay with paper-based DNA extraction and detection modalities has the potential to yield 'DNA quant-on-paper' devices that may be useful for point-of-care testing.
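
A hedged sketch of the kind of "noncomplex algorithm" the abstract alludes to (the actual algorithm is not specified here): threshold the grayscale blot image and use the dark-pixel area fraction as the aggregation readout, to be mapped to DNA mass with a calibration curve. The image and threshold are synthetic.

```python
import numpy as np

def aggregation_fraction(image, threshold=0.5):
    """Fraction of pixels darker than threshold (0 = black, 1 = white)."""
    return float((image < threshold).mean())

img = np.ones((64, 64))            # white filter-paper background
img[16:32, 16:48] = 0.1            # dark aggregated-bead cluster
frac = aggregation_fraction(img)   # 512 dark pixels / 4096 = 0.125
```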

  4. Quantification of arbuscular mycorrhizal fungal DNA in roots: how important is material preservation?

    Science.gov (United States)

    Janoušková, Martina; Püschel, David; Hujslová, Martina; Slavíková, Renata; Jansa, Jan

    2015-04-01

    Monitoring populations of arbuscular mycorrhizal fungi (AMF) in roots is a prerequisite for improving our understanding of AMF ecology and the functioning of the symbiosis under natural conditions. Among other approaches, quantification of fungal DNA in plant tissues by quantitative real-time PCR is one of the advanced techniques with great potential to process large numbers of samples and to deliver truly quantitative information. Its application potential would greatly increase if the samples could be preserved by drying, but little is currently known about the feasibility and reliability of fungal DNA quantification from dry plant material. We addressed this question by comparing quantification results based on dry root material to those obtained from deep-frozen roots of Medicago truncatula colonized with Rhizophagus sp. The fungal DNA was well conserved in the dry root samples, with overall fungal DNA levels in the extracts comparable to those determined in extracts of frozen roots. There was, however, no correlation between the quantitative data sets obtained from the two types of material, and data from dry roots were more variable. Based on these results, we recommend dry material for qualitative screenings but advocate using frozen root material if precise quantification of fungal DNA is required.

  5. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...
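
The quantity defined in the abstract can be written down directly: mobility is migration velocity divided by the local driving force (pressure). A minimal sketch, with invented segment values for illustration:

```python
# For each characterized boundary segment, mobility M = v / P, with
# migration velocity v and local driving pressure P determined from the
# microstructure.

def segment_mobility(velocity_m_per_s, driving_pressure_pa):
    return velocity_m_per_s / driving_pressure_pa   # m s^-1 Pa^-1

segments = [
    {"v": 2.0e-8, "P": 4.0e5},   # fast-moving segment
    {"v": 5.0e-9, "P": 4.0e5},   # slow segment, same driving pressure
]
mobilities = [segment_mobility(s["v"], s["P"]) for s in segments]
# unequal mobilities at equal driving pressure indicate genuinely different
# local boundary behaviour, as the abstract reports
```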

  6. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for detection and imaging of bio-photons emitted upon luminescent luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method that yields a linear response of the measured light signal with measurement time. We detected the luminescence signal by using a lab-made animal light imaging system (ALIS) and two different kinds of light sources: three bacterial light-emitting sources containing different numbers of bacteria, and three non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we derived a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to the measurement time presents a constant value even when different light sources are applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment that presents linear response of constant light-emitting sources to measurement time.
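
The linearity check at the heart of the method can be sketched simply: for a constant source, accumulated counts grow linearly with measurement time, so counts divided by time is a source-specific constant (the flux). The count data below are synthetic and ideal, not from the ALIS measurements.

```python
def flux_estimates(times_s, counts):
    """Accumulated counts divided by measurement time, per measurement."""
    return [c / t for t, c in zip(times_s, counts)]

times = [10.0, 20.0, 40.0, 80.0]               # measurement times (s)
counts = [1500.0, 3000.0, 6000.0, 12000.0]     # ideal linear response
rates = flux_estimates(times, counts)          # constant 150 counts/s
```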

  7. A Project-Based Quantification of BIM Benefits

    Directory of Open Access Journals (Sweden)

    Jian Li

    2014-08-01

    Full Text Available In the construction industry, research is being carried out to look for feasible methods and technologies to cut down project costs and waste. Building Information Modelling (BIM) is currently a promising technology/method that can achieve this. The output of the construction industry has a considerable scale; however, the concentration of the industry and the level of informatization are still not high. There is still a large gap in productivity between the construction industry and other industries. Due to the lack of first-hand data on how much effect BIM genuinely has in real cases, it is unrealistic for construction stakeholders to take the risk of widely adopting BIM. This paper focuses on the methodological quantification (through a case-study approach) of BIM's benefits in building construction resource management and real-time cost control, in contrast to traditional non-BIM technologies. Through the use of BIM technology for the dynamic querying and statistical analysis of construction schedules, engineering, resources and costs, the three implementations considered demonstrate how BIM can facilitate a comprehensive grasp of a project's implementation and progress, identify and resolve contradictions and conflicts between construction resources and cost controls, reduce project overspend and protect the supply of resources.

  8. Text and ideology: text-oriented discourse analysis

    Directory of Open Access Journals (Sweden)

    Maria Eduarda Gonçalves Peixoto

    2018-04-01

    Full Text Available The article aims to contribute to the understanding of the connection between text and ideology articulated by the text-oriented analysis of discourse (ADTO. Based on the reflections of Fairclough (1989, 2001, 2003 and Fairclough and Chouliaraki (1999, the debate presents the social ontology that ADTO uses to base its conception of social life as an open system and textually mediated; the article then explains the chronological-narrative development of the main critical theories of ideology, by virtue of which ADTO organizes the assumptions that underpin the particular use it makes of the term. Finally, the discussion presents the main aspects of the connection between text and ideology, offering a conceptual framework that can contribute to the domain of the theme according to a critical discourse analysis approach.

  9. Generation of structural MR images from amyloid PET: Application to MR-less quantification.

    Science.gov (United States)

    Choi, Hongyoon; Lee, Dong Soo

    2017-12-07

    Structural magnetic resonance (MR) images concomitantly acquired with PET images can provide crucial anatomical information for precise quantitative analysis. However, in the clinical setting, not all subjects have a corresponding MR. Here, we developed a model to generate structural MR images from amyloid PET using deep generative networks, and we applied it to the quantification of cortical amyloid load without structural MR. Methods: We used florbetapir PET and structural MR data from the Alzheimer's Disease Neuroimaging Initiative database. The generative network was trained to generate realistic structural MR images from florbetapir PET images. After training, the model was applied to the quantification of cortical amyloid load. PET images were spatially normalized to the template space using the generated MR, and the standardized uptake value ratio (SUVR) of the target regions was then measured using predefined regions of interest. Real MR-based quantification was used as the gold standard to measure the accuracy of our approach. Other MR-less methods, namely a normal PET template-based, a multi-atlas PET template-based and a PET segmentation-based normalization/quantification method, were also tested. We compared the performance of quantification using the generated MR with that of MR-based and MR-less quantification methods. Results: Generated MR images from florbetapir PET showed visually similar signal patterns to the real MR. The structural similarity index between real and generated MR was 0.91 ± 0.04. The mean absolute error of the SUVR of cortical composite regions estimated by the generated MR-based method was 0.04 ± 0.03, significantly smaller than for the other MR-less methods (0.29 ± 0.12 for the normal PET template-based, 0.12 ± 0.07 for the multi-atlas PET template-based and 0.08 ± 0.06 for the PET segmentation-based method). Bland-Altman plots revealed that the generated MR-based SUVR quantification was the closest to the SUVR values estimated by the real MR-based method. Conclusion
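
A hedged sketch of the SUVR endpoint used throughout the abstract: mean uptake in target ROIs divided by mean uptake in a reference region. The voxel array and ROI masks below are synthetic; in the paper, PET is first spatially normalized (with real or generated MR) so that predefined ROIs apply.

```python
import numpy as np

def suvr(pet, target_mask, reference_mask):
    return pet[target_mask].mean() / pet[reference_mask].mean()

pet = np.zeros((4, 4, 4))
target = np.zeros(pet.shape, dtype=bool)
reference = np.zeros(pet.shape, dtype=bool)
target[:2] = True          # "cortical composite" stand-in
reference[2:] = True       # "reference region" stand-in
pet[target] = 1.4          # amyloid-positive-like uptake
pet[reference] = 1.0       # reference uptake
s = suvr(pet, target, reference)   # ratio of the two ROI means
```

Errors in spatial normalization shift voxels between the masks, which is why the quality of the (generated) MR directly affects the SUVR error reported above.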

  10. Quantification of viral DNA during HIV-1 infection: A review of relevant clinical uses and laboratory methods.

    Science.gov (United States)

    Alidjinou, E K; Bocket, L; Hober, D

    2015-02-01

    Effective antiretroviral therapy usually leads to undetectable HIV-1 RNA in the plasma. However, the virus persists in some cells of infected patients as various DNA forms, both integrated and unintegrated. This reservoir represents the greatest challenge to the complete cure of HIV-1 infection, and its characteristics strongly influence the course of the disease. The quantification of HIV-1 DNA in blood samples currently constitutes the most practical approach to measuring this residual infection. Real-time quantitative PCR (qPCR) is the most common method used for HIV-DNA quantification, and many strategies have been developed to measure the different forms of HIV-1 DNA. In the literature, several "in-house" PCR methods have been used, and standardization is needed to obtain comparable results. In addition, qPCR is limited by background noise in the precise quantification of low levels. Among new assays in development, digital PCR has been shown to allow accurate quantification of HIV-1 DNA. Total HIV-1 DNA is most commonly measured in clinical routine. The absolute quantification of proviruses and unintegrated forms is more often used for research purposes.
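
The Poisson statistics behind digital PCR, which the review mentions as an accurate alternative to qPCR, can be sketched in a few lines: with p the fraction of positive partitions, the mean number of copies per partition is lambda = -ln(1 - p), and concentration is lambda divided by partition volume. The partition count and volume below are illustrative, not instrument specifications.

```python
import math

def dpcr_copies_per_ul(positive, total, partition_volume_nl):
    p = positive / total
    lam = -math.log(1.0 - p)                    # copies per partition
    return lam / (partition_volume_nl * 1e-3)   # nl -> ul

conc = dpcr_copies_per_ul(positive=5000, total=20000, partition_volume_nl=0.85)
# p = 0.25 -> lambda = -ln(0.75) ≈ 0.288 copies/partition ≈ 338 copies/ul
```

Because the readout is an end-point count of positive partitions rather than a threshold cycle, no standard curve is needed, which is one reason dPCR suits absolute quantification of low-level HIV-1 DNA.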

  11. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement
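
A minimal sketch of the calibration step listed at the end of the outline: after the corrections enumerated (attenuation, scatter, resolution, partial volume, motion, reconstruction), voxel counts are converted to activity concentration with a system calibration factor measured on a phantom of known activity. All numbers below are illustrative assumptions.

```python
def activity_concentration_bq_per_ml(voxel_counts, acq_time_s,
                                     cal_cps_per_bq, voxel_volume_ml):
    cps = voxel_counts / acq_time_s            # count rate in the voxel
    return cps / cal_cps_per_bq / voxel_volume_ml

c = activity_concentration_bq_per_ml(voxel_counts=1200.0, acq_time_s=600.0,
                                     cal_cps_per_bq=1.0e-4,
                                     voxel_volume_ml=0.064)
# 2 cps / 1e-4 cps/Bq = 20000 Bq in a 0.064 ml voxel
```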

  12. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...
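
A hedged sketch of how a label-free biosensor quantification assay of this kind can be used in practice: record a binding signal for each standard, build a monotonic calibration curve, and interpolate unknowns. The rate and concentration values below are invented, not from the assay described.

```python
def interpolate(x, xs, ys):
    """Piecewise-linear interpolation on a monotonic standard curve."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("signal outside calibrated range")

rates = [0.02, 0.05, 0.11, 0.20]        # binding signal (nm shift/s), standards
concs = [5.0, 12.5, 25.0, 50.0]         # EPO concentration (ug/ml), standards
epo = interpolate(0.08, rates, concs)   # unknown supernatant -> 18.75 ug/ml
```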

  13. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For the quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are currently available and of the experimental situations and analytical problems they address. The last point is extended by the description of our own development of the fundamental parameter method, which makes the inclusion of non-parallel beam geometries possible. Finally, open problems for the quantification procedures are discussed

  14. Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors: Final Scientific/Technical Report

    International Nuclear Information System (INIS)

    Vierow, Karen; Aldemir, Tunc

    2009-01-01

    The project entitled, 'Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors', was conducted as a DOE NERI project collaboration between Texas A and M University and The Ohio State University between March 2006 and June 2009. The overall goal of the proposed project was to develop practical approaches and tools by which dynamic reliability and risk assessment techniques can be used to augment the uncertainty quantification process in probabilistic risk assessment (PRA) methods and PRA applications for Generation IV reactors. This report is the Final Scientific/Technical Report summarizing the project.

  15. Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors: Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, Karen; Aldemir, Tunc

    2009-09-10

    The project entitled, “Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors”, was conducted as a DOE NERI project collaboration between Texas A&M University and The Ohio State University between March 2006 and June 2009. The overall goal of the proposed project was to develop practical approaches and tools by which dynamic reliability and risk assessment techniques can be used to augment the uncertainty quantification process in probabilistic risk assessment (PRA) methods and PRA applications for Generation IV reactors. This report is the Final Scientific/Technical Report summarizing the project.

  16. Demonstration of uncertainty quantification and sensitivity analysis for PWR fuel performance with BISON

    International Nuclear Information System (INIS)

    Zhang, Hongbin; Zhao, Haihua; Zou, Ling; Burns, Douglas; Ladd, Jacob

    2017-01-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis. (author)
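
The correlation-based sensitivity measures named in the abstract (Pearson, Spearman, and partial correlation coefficients) can be sketched for sampled inputs X and a scalar figure of merit y. The linear test model below is a stand-in for illustration, not a BISON fuel-rod simulation.

```python
import numpy as np

def ranks(a):
    return np.argsort(np.argsort(a)).astype(float)

def sensitivity(X, y):
    n, k = X.shape
    pearson = [np.corrcoef(X[:, j], y)[0, 1] for j in range(k)]
    spearman = [np.corrcoef(ranks(X[:, j]), ranks(y))[0, 1] for j in range(k)]
    partial = []
    for j in range(k):
        # Partial correlation: correlate the residuals of x_j and y after
        # regressing both on all the other inputs.
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
        ry = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
        partial.append(np.corrcoef(rx, ry)[0, 1])
    return pearson, spearman, partial

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # three uncertain inputs
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)
p, s, pc = sensitivity(X, y)
# input 0 dominates every measure; input 2 is near zero in all of them
```

Partial correlation is useful here because it isolates each parameter's influence on the figure of merit while controlling for the other sampled inputs.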

  17. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.

  18. QUANTIFICATION AND BIOREMEDIATION OF ENVIRONMENTAL SAMPLES BY DEVELOPING A NOVEL AND EFFICIENT METHOD

    Directory of Open Access Journals (Sweden)

    Mohammad Osama

    2014-06-01

    Full Text Available Pleurotus ostreatus, a white rot fungus, is capable of bioremediating a wide range of organic contaminants, including Polycyclic Aromatic Hydrocarbons (PAHs). Ergosterol is produced by living fungal biomass and used as a measure of fungal biomass. The first part of this work deals with the extraction and quantification of PAHs from contaminated sediments by the Lipid Extraction Method (LEM). The second part consists of the development of a novel extraction method, the Ergosterol Extraction Method (EEM), followed by quantification and bioremediation. The novelty of this method is the simultaneous extraction and quantification of two different types of compounds, a sterol (ergosterol) and PAHs, and it is more efficient than LEM. EEM successfully extracted ergosterol from the fungus grown on barley at concentrations of 17.5-39.94 µg g-1, and quantified many more PAHs, in both number and amount, than LEM. In addition, cholesterol, usually found in animals, was also detected in the fungus P. ostreatus at easily detectable levels.

  19. Development of a Novel Reference Plasmid for Accurate Quantification of Genetically Modified Kefeng6 Rice DNA in Food and Feed Samples

    Directory of Open Access Journals (Sweden)

    Liang Li

    2013-01-01

    Full Text Available Reference plasmids are an essential tool for the quantification of genetically modified (GM) events. Quantitative real-time PCR (qPCR) is the most commonly used method to characterize and quantify reference plasmids. However, the precision of this method is often limited by calibration curves, and qPCR data can be affected by matrix differences between the standards and samples. Here, we describe a digital PCR (dPCR) approach that can be used to accurately measure the novel reference plasmid pKefeng6 and quantify the unauthorized GM rice variety Kefeng6, eliminating the issues associated with matrix effects in calibration curves. The pKefeng6 plasmid was used as a calibrant for the quantification of Kefeng6 rice by determining the copy numbers of event-specific (77 bp) and taxon-specific (68 bp) fragments, their ratios, and their concentrations. The plasmid was diluted to five different concentrations. The third sample (S3) was optimized for the quantification range of dPCR according to previous reports. The ratio between the two fragments was 1.005, which closely approximated the value certified by sequencing, and the concentration was found to be 792 copies/μL. This method was precise, with an RSD of ~3%. These findings demonstrate the advantages of using the dPCR method to characterize reference materials.

  20. Combination radioimmunotherapy approaches and quantification of immuno-PET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Su [Molecular Imaging Research Center, Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-06-15

    Monoclonal antibodies (mAbs), which play a prominent role in cancer therapy, can interact with specific antigens on cancer cells, thereby enhancing the patient's immune response via various mechanisms, or they can act against cell growth factors and thereby arrest the proliferation of tumor cells. Radionuclide-labeled mAbs, which are used in radioimmunotherapy (RIT), are effective for cancer treatment because tumor-associated mAbs linked to cytotoxic radionuclides can selectively bind to tumor antigens and release targeted cytotoxic radiation. Immunological positron emission tomography (immuno-PET), the combination of PET with mAbs, is an attractive option for improving tumor detection and mAb quantification. However, RIT remains a challenge because of the limited delivery of mAbs into tumors: their transport and uptake are slow and heterogeneous, and the tumor microenvironment contributes to this limitation. During delivery of a mAb to a tumor, mechanical barriers such as collagen distribution, and physiological barriers such as high interstitial pressure or the absence of lymphatic vessels, can prevent the mAb from reaching the tumor at a potentially lethal concentration. When α-emitter-labeled mAbs are used, deeper penetration of the mAb inside the tumor is even more important because of the short range of the α emitter. Therefore, combination therapy strategies aimed at improving mAb tumor penetration and accumulation would be beneficial for maximizing therapeutic efficacy against solid tumors.

  1. Application of adaptive hierarchical sparse grid collocation to the uncertainty quantification of nuclear reactor simulators

    Energy Technology Data Exchange (ETDEWEB)

    Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)

    2013-07-01

    Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity, using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant converges to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to current methods used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)

  2. Method for indirect quantification of CH4 production via H2O production using hydrogenotrophic methanogens

    Directory of Open Access Journals (Sweden)

    Ruth-Sophie eTaubner

    2016-04-01

    Full Text Available Hydrogenotrophic methanogens are an intriguing group of microorganisms from the domain Archaea. They exhibit extraordinary ecological, biochemical, and physiological characteristics and have a huge biotechnological potential. Yet, until now the only way to assess the methane (CH4) production potential of hydrogenotrophic methanogens has been gas chromatographic quantification of CH4. In order to effectively screen pure cultures of hydrogenotrophic methanogens for their CH4 production potential, we developed a novel method for indirect quantification of the volumetric CH4 production rate by measuring the volumetric water production rate. The method was established in serum bottles for cultivation of methanogens in closed batch mode. Water production was estimated by determining the difference in mass increase in an isobaric setting. This novel CH4 quantification method is an accurate and precise analytical technique that can be used to rapidly screen pure cultures of methanogens for their volumetric CH4 evolution rate. It is a cost-effective alternative to CH4 quantification by gas chromatography, especially if applied as a high-throughput method. Eventually, the method can be universally applied for quantification of CH4 production by psychrophilic, thermophilic, and hyperthermophilic hydrogenotrophic methanogens.
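
The stoichiometry behind the indirect method can be sketched directly: hydrogenotrophic methanogenesis (CO2 + 4 H2 -> CH4 + 2 H2O) produces two moles of water per mole of CH4, so a measured water production rate fixes the CH4 evolution rate. The culture volume, incubation time, and mass gain below are invented, and ideal-gas molar volume at STP is assumed for the volumetric conversion.

```python
M_H2O = 18.015     # g/mol
V_M = 22.414       # l/mol, ideal-gas molar volume at STP (assumption)

def ch4_rate_l_per_l_h(d_mass_h2o_g, hours, culture_volume_l):
    mol_h2o_per_h = d_mass_h2o_g / M_H2O / hours
    mol_ch4_per_h = mol_h2o_per_h / 2.0            # 2 H2O per CH4
    return mol_ch4_per_h * V_M / culture_volume_l  # volumetric CH4 rate

rate = ch4_rate_l_per_l_h(d_mass_h2o_g=0.36, hours=24.0, culture_volume_l=0.05)
# ~0.19 litres CH4 per litre culture per hour for this hypothetical bottle
```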

  3. Raman spectroscopy for DNA quantification in cell nucleus.

    Science.gov (United States)

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum can be reliably performed using the intensity of a phosphate mode at 1096 cm(-1). When compared to the known DNA standards from cells of different animals, our results matched those values within an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, and to measure DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate.
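
A sketch of the band-intensity readout behind this kind of approach: integrate the phosphate band near 1096 cm(-1) after subtracting a baseline; the area is then proportional to DNA content via a calibration factor. The spectrum, band limits, baseline model, and peak shape below are synthetic, not the paper's actual processing pipeline.

```python
import numpy as np

def band_area(wavenumbers, intensity, lo, hi):
    m = (wavenumbers >= lo) & (wavenumbers <= hi)
    x, y = wavenumbers[m], intensity[m]
    baseline = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])  # linear baseline
    d = y - baseline
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(x)))  # trapezoid rule

wn = np.linspace(1000.0, 1200.0, 401)
spec = 0.01 * wn + 50.0 * np.exp(-(((wn - 1096.0) / 8.0) ** 2))  # slope + peak
area = band_area(wn, spec, 1060.0, 1130.0)
# area ≈ 50 * 8 * sqrt(pi) ≈ 709 for this synthetic Gaussian peak
```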

  4. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. A general framework that casts the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Experiments on artificial MRS data have demonstrated the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.
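
A toy illustration of the abstract's central idea: treat quantification as an optimization problem and solve it with a genetic algorithm. Here a minimal elitist GA fits the amplitude and position of a single Lorentzian line to a synthetic "spectrum"; real MRS fitting involves many overlapping resonances and richer lineshape models, and all numbers here are invented.

```python
import random

random.seed(1)
xs = [i * 0.1 for i in range(200)]

def lorentz(x, a, x0, w=0.5):
    return a * w ** 2 / ((x - x0) ** 2 + w ** 2)

target = [lorentz(x, 4.0, 9.3) for x in xs]   # "true" metabolite peak

def fitness(ind):
    a, x0 = ind
    return -sum((lorentz(x, a, x0) - t) ** 2 for x, t in zip(xs, target))

# Elitist GA: keep the 10 best, refill with Gaussian mutations of parents.
pop = [(random.uniform(0.0, 10.0), random.uniform(5.0, 15.0)) for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [
        (random.choice(parents)[0] + random.gauss(0.0, 0.3),
         random.choice(parents)[1] + random.gauss(0.0, 0.3))
        for _ in range(30)
    ]
best = max(pop, key=fitness)   # approaches (a, x0) = (4.0, 9.3)
```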

  5. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have recently been presented with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. This paper proposes a general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA). Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation that occurs under real conditions. Experiments on artificial MRS data demonstrate the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.

  6. Standard Error Computations for Uncertainty Quantification in Inverse Problems: Asymptotic Theory vs. Bootstrapping.

    Science.gov (United States)

    Banks, H T; Holm, Kathleen; Robbins, Danielle

    2010-11-01

    We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear, parameter-dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches on problems involving data with several noise forms and levels. In our parameter estimation formulations we consider both absolute error data, which has constant variance, and relative error data, which produces non-constant variance. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for the bootstrapping and asymptotic theory methods.
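The two approaches can be contrasted on a toy linear model, a stand-in for the nonlinear dynamical systems in the record above; the model, noise level, and bootstrap settings here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from a linear model y = 2.0 + 0.5 x + noise.
n = 200
x = np.linspace(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)
X = np.column_stack([np.ones(n), x])

# Ordinary least-squares estimate of (intercept, slope).
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Asymptotic-theory standard errors: sqrt of diag of sigma^2 (X^T X)^{-1}.
sigma2 = resid @ resid / (n - 2)
se_asymptotic = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

# Residual-bootstrap standard errors: re-fit on resampled residuals.
boot = np.empty((1000, 2))
for b in range(1000):
    y_star = X @ beta + rng.choice(resid, size=n, replace=True)
    boot[b], *_ = np.linalg.lstsq(X, y_star, rcond=None)
se_bootstrap = boot.std(axis=0, ddof=1)
```

For a well-specified constant-variance model like this one the two sets of standard errors agree closely; the paper's point is to probe what happens under non-constant variance and nonlinear models, where the choice matters more.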

  7. Axiomatic Ontology Learning Approaches for English Translation of the Meaning of Quranic Texts

    Directory of Open Access Journals (Sweden)

    Saad Saidah

    2017-01-01

    Ontology learning (OL) is the computational task of generating a knowledge base in the form of an ontology, given an unstructured corpus in natural language (NL). While most work in the field of ontology learning has been based primarily on statistical approaches that extract lightweight ontologies, very few attempts have been made to extract axiomatic ontologies (heavyweight OL) from NL text documents. Axiomatic OL supports more precise formal logic-based reasoning than lightweight OL. Lexico-syntactic pattern matching and statistical approaches alone cannot lead to very accurate learning, mostly because of the many linguistic nuances of NL. Axiomatic OL is an alternative methodology that has not been explored much, in which deep linguistic analysis is used to generate formal axioms and definitions instead of simply inducing a taxonomy. The resulting ontology not only stores information about the application domain as explicit knowledge, but can also be used to deduce implicit knowledge. This research explores these approaches for the English translation of the meaning of Quranic texts.

  8. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans Bühlmann; Pavel V. Shevchenko; Mario V. Wüthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, a bank's internal model should make use of internal data, relevant external data, scenario analysis, and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is how to combine these data sources appropriately. In this paper we focus on quantification of the low frequency high impact losses exceeding some high thr...

  9. Rapid and serial quantification of adhesion forces of yeast and mammalian cells.

    Directory of Open Access Journals (Sweden)

    Eva Potthoff

    Cell adhesion to surfaces represents the basis for niche colonization and survival. Here we establish serial quantification of adhesion forces of different cell types using a single probe. The pace of single-cell force spectroscopy was accelerated to up to 200 yeast and 20 mammalian cells per probe by replacing the conventional chemical cell trapping on atomic force microscopy cantilevers with underpressure immobilization using fluidic force microscopy (FluidFM). In consequence, statistically relevant data could be recorded rapidly, the spectrum of examinable cells was enlarged, and cell physiology was preserved until the cells were approached for force spectroscopy. Adhesion forces of Candida albicans increased from below 4 nN up to 16 nN at 37°C on hydrophobic surfaces, whereas a Δhgc1 mutant showed forces consistently below 4 nN. Monitoring adhesion of mammalian cells revealed mean adhesion forces of 600 nN for HeLa cells on fibronectin, one order of magnitude higher than those observed for HEK cells.

  10. Region of interest-based versus whole-lung segmentation-based approach for MR lung perfusion quantification in 2-year-old children after congenital diaphragmatic hernia repair

    International Nuclear Information System (INIS)

    Weis, M.; Sommer, V.; Hagelstein, C.; Schoenberg, S.O.; Neff, K.W.; Zoellner, F.G.; Zahn, K.; Schaible, T.

    2016-01-01

    With a region of interest (ROI)-based approach 2-year-old children after congenital diaphragmatic hernia (CDH) show reduced MR lung perfusion values on the ipsilateral side compared to the contralateral. This study evaluates whether results can be reproduced by segmentation of whole-lung and whether there are differences between the ROI-based and whole-lung measurements. Using dynamic contrast-enhanced (DCE) MRI, pulmonary blood flow (PBF), pulmonary blood volume (PBV) and mean transit time (MTT) were quantified in 30 children after CDH repair. Quantification results of an ROI-based (six cylindrical ROIs generated of five adjacent slices per lung-side) and a whole-lung segmentation approach were compared. In both approaches PBF and PBV were significantly reduced on the ipsilateral side (p always <0.0001). In ipsilateral lungs, PBF of the ROI-based and the whole-lung segmentation-based approach was equal (p=0.50). In contralateral lungs, the ROI-based approach significantly overestimated PBF in comparison to the whole-lung segmentation approach by approximately 9.5 % (p=0.0013). MR lung perfusion in 2-year-old children after CDH is significantly reduced ipsilaterally. In the contralateral lung, the ROI-based approach significantly overestimates perfusion, which can be explained by exclusion of the most ventral parts of the lung. Therefore whole-lung segmentation should be preferred. (orig.)

  11. Region of interest-based versus whole-lung segmentation-based approach for MR lung perfusion quantification in 2-year-old children after congenital diaphragmatic hernia repair

    Energy Technology Data Exchange (ETDEWEB)

    Weis, M.; Sommer, V.; Hagelstein, C.; Schoenberg, S.O.; Neff, K.W. [Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany); Zoellner, F.G. [Heidelberg University, Computer Assisted Clinical Medicine, Medical Faculty Mannheim, Mannheim (Germany); Zahn, K. [University of Heidelberg, Department of Paediatric Surgery, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany); Schaible, T. [Heidelberg University, Department of Paediatrics, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany)

    2016-12-15

    With a region of interest (ROI)-based approach 2-year-old children after congenital diaphragmatic hernia (CDH) show reduced MR lung perfusion values on the ipsilateral side compared to the contralateral. This study evaluates whether results can be reproduced by segmentation of whole-lung and whether there are differences between the ROI-based and whole-lung measurements. Using dynamic contrast-enhanced (DCE) MRI, pulmonary blood flow (PBF), pulmonary blood volume (PBV) and mean transit time (MTT) were quantified in 30 children after CDH repair. Quantification results of an ROI-based (six cylindrical ROIs generated of five adjacent slices per lung-side) and a whole-lung segmentation approach were compared. In both approaches PBF and PBV were significantly reduced on the ipsilateral side (p always <0.0001). In ipsilateral lungs, PBF of the ROI-based and the whole-lung segmentation-based approach was equal (p=0.50). In contralateral lungs, the ROI-based approach significantly overestimated PBF in comparison to the whole-lung segmentation approach by approximately 9.5 % (p=0.0013). MR lung perfusion in 2-year-old children after CDH is significantly reduced ipsilaterally. In the contralateral lung, the ROI-based approach significantly overestimates perfusion, which can be explained by exclusion of the most ventral parts of the lung. Therefore whole-lung segmentation should be preferred. (orig.)

  12. Estimating oil price 'Value at Risk' using the historical simulation approach

    International Nuclear Information System (INIS)

    David Cabedo, J.; Moya, Ismael

    2003-01-01

    In this paper we propose using Value at Risk (VaR) for oil price risk quantification. VaR provides an estimation for the maximum oil price change associated with a likelihood level, and can be used for designing risk management strategies. We analyse three VaR calculation methods: the historical simulation standard approach, the historical simulation with ARMA forecasts (HSAF) approach, developed in this paper, and the variance-covariance method based on autoregressive conditional heteroskedasticity models forecasts. The results obtained indicate that HSAF methodology provides a flexible VaR quantification, which fits the continuous oil price movements well and provides an efficient risk quantification
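The standard historical simulation approach in the record above reduces to taking an empirical quantile of past price changes. A minimal sketch, using synthetic returns in place of a real oil price series; the HSAF variant would first fit an ARMA model and apply the same quantile step to its forecast residuals.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily oil price changes (%); real use would load a price series.
returns = rng.normal(0, 2.0, 1500)

def var_historical(returns, confidence=0.95):
    """Historical-simulation VaR: the loss quantile of past price changes."""
    return -np.quantile(returns, 1 - confidence)

var_95 = var_historical(returns, 0.95)
var_99 = var_historical(returns, 0.99)
```

Because the quantile is taken directly from the empirical distribution, no distributional assumption is needed, which is the main appeal of the historical simulation family of methods.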

  13. Estimating oil price 'Value at Risk' using the historical simulation approach

    International Nuclear Information System (INIS)

    Cabedo, J.D.; Moya, I.

    2003-01-01

    In this paper we propose using Value at Risk (VaR) for oil price risk quantification. VaR provides an estimation for the maximum oil price change associated with a likelihood level, and can be used for designing risk management strategies. We analyse three VaR calculation methods: the historical simulation standard approach, the historical simulation with ARMA forecasts (HSAF) approach, developed in this paper, and the variance-covariance method based on autoregressive conditional heteroskedasticity models forecasts. The results obtained indicate that HSAF methodology provides a flexible VaR quantification, which fits the continuous oil price movements well and provides an efficient risk quantification. (author)

  14. Proteomic Identification and Quantification of S-glutathionylation in Mouse Macrophages Using Resin-Assisted Enrichment and Isobaric Labeling

    Energy Technology Data Exchange (ETDEWEB)

    Su, Dian; Gaffrey, Matthew J.; Guo, Jia; Hatchell, Kayla E.; Chu, Rosalie K.; Clauss, Therese RW; Aldrich, Joshua T.; Wu, Si; Purvine, Samuel O.; Camp, David G.; Smith, Richard D.; Thrall, Brian D.; Qian, Weijun

    2014-02-11

    Protein S-glutathionylation (SSG) is an important regulatory posttranslational modification of protein cysteine (Cys) thiol redox switches, yet the role of specific cysteine residues as targets of modification is poorly understood. We report a novel quantitative mass spectrometry (MS)-based proteomic method for site-specific identification and quantification of S-glutathionylation across different conditions. Briefly, this approach consists of initial blocking of free thiols by alkylation, selective reduction of glutathionylated thiols and enrichment using thiol affinity resins, followed by on-resin tryptic digestion and isobaric labeling with iTRAQ (isobaric tags for relative and absolute quantitation) for MS-based identification and quantification. The overall approach was validated by application to RAW 264.7 mouse macrophages treated with different doses of diamide to induce glutathionylation. A total of 1071 Cys-sites from 690 proteins were identified in response to diamide treatment, with ~90% of the sites displaying >2-fold increases in SSG modification compared to controls. This approach was extended to identify potential SSG-modified Cys-sites in response to H2O2, an endogenous oxidant produced by activated macrophages and many pathophysiological stimuli. The results revealed 364 Cys-sites from 265 proteins that were sensitive to S-glutathionylation in response to H2O2 treatment. These proteins covered a range of molecular types and functions, with free radical scavenging and cell death and survival among the most significantly enriched functional categories. Overall, the results demonstrate that our approach is effective for site-specific identification and quantification of S-glutathionylated proteins. The analytical strategy also provides a unique approach to determining the major pathways and cell processes most susceptible to glutathionylation at a proteome-wide scale.

  15. HPLC Quantification of Cytotoxic Compounds from Aspergillus niger

    Directory of Open Access Journals (Sweden)

    Paula Karina S. Uchoa

    2017-01-01

    A high-performance liquid chromatography method was developed and validated for the quantification of the cytotoxic compounds produced by a marine strain of Aspergillus niger. The fungus was grown in malt peptone dextrose (MPD), potato dextrose yeast (PDY), and mannitol peptone yeast (MnPY) media for 7, 14, 21, and 28 days, and the natural products were identified by comparison with standard compounds. The validation parameters obtained were selectivity, linearity (coefficient of correlation > 0.99), precision (relative standard deviation below 5%), and accuracy (recovery > 96).

  16. A Relational Reasoning Approach to Text-Graphic Processing

    Science.gov (United States)

    Danielson, Robert W.; Sinatra, Gale M.

    2017-01-01

    We propose that research on text-graphic processing could be strengthened by the inclusion of relational reasoning perspectives. We briefly outline four aspects of relational reasoning: "analogies," "anomalies," "antinomies," and "antitheses." Next, we illustrate how text-graphic researchers have been…

  17. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.

  18. Quality Quantification of Evaluated Cross Section Covariances

    International Nuclear Information System (INIS)

    Varet, S.; Dossantos-Uzarralde, P.; Vayatis, N.

    2015-01-01

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and its assumptions, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists of defining an objective criterion; the second step is computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on the 85Rb nucleus evaluations, and the results are then used for a discussion on scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.

  19. Selective detection and quantification of modified DNA with solid-state nanopores.

    Science.gov (United States)

    Carlsen, Autumn T; Zahid, Osama K; Ruzicka, Jan A; Taylor, Ethan W; Hall, Adam R

    2014-10-08

    We demonstrate a solid-state nanopore assay for the unambiguous discrimination and quantification of modified DNA. Individual streptavidin proteins are employed as high-affinity tags for DNA containing a single biotin moiety. We establish that the rate of translocation events corresponds directly to relative concentration of protein-DNA complexes and use the selectivity of our approach to quantify modified oligonucleotides from among a background of unmodified DNA in solution.

  20. Hepatic Iron Quantification on 3 Tesla (3 T) Magnetic Resonance (MR): Technical Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Muhammad Anwar

    2013-01-01

    MR has become a reliable and noninvasive method of hepatic iron quantification. Currently, most hepatic iron quantification is performed on 1.5 T MR, and the biopsy measurements have been paired with R2 and R2* values for 1.5 T MR. As the use of 3 T MR scanners is steadily increasing in clinical practice, it has become important to evaluate the practicality of calculating iron burden at 3 T MR. Hepatic iron quantification on 3 T MR requires a better understanding of the process and more stringent technical considerations. The purpose of this work is to focus on the technical challenges in establishing a relationship between T2* values at 1.5 T MR and 3 T MR for hepatic iron concentration (HIC) and to develop an appropriately optimized MR protocol for the evaluation of T2* values in the liver at 3 T magnetic field strength. We studied 22 sickle cell patients using a multiecho fast gradient-echo sequence (MFGRE) on 3 T MR and compared the results with serum ferritin and liver biopsy results. Our study showed that quantification of hepatic iron on 3 T MRI in sickle cell disease patients correlates well with clinical blood test results and biopsy results. 3 T MR liver iron quantification based on MFGRE can be used for hepatic iron quantification in transfused patients.
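The core computation behind T2*-based iron quantification is an exponential fit of signal against echo time, S(TE) = S0·exp(-TE/T2*). A minimal sketch with hypothetical echo times and a synthetic signal; the numbers are illustrative, not calibrated to 1.5 T or 3 T.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical multiecho gradient-echo signal: S(TE) = S0 * exp(-TE / T2*).
te = np.array([1.0, 2.0, 3.5, 5.0, 7.0, 9.0, 12.0])  # echo times, ms
t2star_true = 4.0                                     # ms (iron loading shortens T2*)
signal = 1000.0 * np.exp(-te / t2star_true) * (1 + rng.normal(0, 0.01, te.size))

# Log-linear least squares: ln S = ln S0 - TE / T2*.
slope, intercept = np.polyfit(te, np.log(signal), 1)
t2star = -1.0 / slope      # ms
r2star = 1000.0 / t2star   # R2* in s^-1 when TE is in ms
```

At 3 T the decay is roughly twice as fast as at 1.5 T for the same iron burden, so the echo-time grid must be shortened accordingly, which is one of the technical considerations the record discusses.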

  1. Robust approaches to quantification of margin and uncertainty for sparse data

    Energy Technology Data Exchange (ETDEWEB)

    Hund, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schroeder, Benjamin B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rumsey, Kelin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Murchison, Nicole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.

  2. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    Science.gov (United States)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from the Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.

  3. Synthesis of nanodiamond derivatives carrying amino functions and quantification by a modified Kaiser test

    Directory of Open Access Journals (Sweden)

    Gerald Jarre

    2014-11-01

    Nanodiamonds functionalized with different organic moieties carrying terminal amino groups have been synthesized. These include conjugates generated by Diels–Alder reactions of ortho-quinodimethanes formed in situ from pyrazine and 5,6-dihydrocyclobuta[d]pyrimidine derivatives. For the quantification of primary amino groups a modified photometric assay based on the Kaiser test has been developed and validated for different types of aminated nanodiamond. The results correspond well to values obtained by thermogravimetry. The method represents an alternative wet-chemical quantification method in cases where other techniques like elemental analysis fail due to unfavourable combustion behaviour of the analyte or other impediments.
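The photometric step of a Kaiser-type assay is a Beer-Lambert calculation from the absorbance of the ninhydrin chromophore. A minimal sketch with hypothetical numbers; the absorbance, molar absorptivity, assay volume, and sample mass below are assumptions for illustration, not values from the record, and the absorptivity should be calibrated per protocol.

```python
# Hypothetical assay readout and constants (all values illustrative).
absorbance = 0.42      # blank-corrected absorbance at the assay wavelength
epsilon = 15000.0      # molar absorptivity, L mol^-1 cm^-1 (assumed)
path_cm = 1.0          # cuvette path length, cm
volume_l = 0.003       # assay volume, L
mass_g = 0.010         # nanodiamond sample mass, g

# Beer-Lambert: concentration of chromophore in the assay solution.
conc_mol_per_l = absorbance / (epsilon * path_cm)

# Scale to amino-group loading per gram of nanodiamond.
loading_umol_per_g = conc_mol_per_l * volume_l / mass_g * 1e6
```

Loadings in the low micromole-per-gram range are typical for surface-functionalized nanoparticles, which is why the thermogravimetry cross-check mentioned in the record is a useful validation.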

  4. Error correction in multi-fidelity molecular dynamics simulations using functional uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu

    2017-04-01

    We use functional, Fréchet, derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions as opposed to its parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard–Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.

  5. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    Science.gov (United States)

    Xia, Kelin; Wei, Guo-Wei

    2014-01-01

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365

  6. Validation of methods for the detection and quantification of engineered nanoparticles in food

    DEFF Research Database (Denmark)

    Linsinger, T.P.J.; Chaudhry, Q.; Dehalu, V.

    2013-01-01

    the methods apply equally well to particles from different suppliers. In trueness testing, information on whether the particle size distribution has changed during analysis is required. Results are largely expected to follow normal distributions due to the expected high number of particles. An approach...... approach for the validation of methods for detection and quantification of nanoparticles in food samples. It proposes validation of identity, selectivity, precision, working range, limit of detection and robustness, bearing in mind that each “result” must include information about the chemical identity...

  7. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Adami, Susan R. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Sinkov, Sergey I. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Lumetta, Gregg J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Bryan, Samuel A. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States

    2017-08-09

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer’s Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
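The simplest multivariate step beyond single-variate Beer's law is classical least squares: stack the pure-component spectra as columns of a matrix and solve for all concentrations at once, so overlapping bands no longer require a known acid strength per calibration curve. A minimal sketch with synthetic Gaussian bands as stand-ins; the chemometric models used in practice (e.g., PLS) are more elaborate than this.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical pure-component spectra on a 100-point wavelength grid
# (stand-ins for an analyte band and an overlapping background band).
wl = np.linspace(400, 900, 100)
eps_a = np.exp(-((wl - 600) / 40) ** 2)      # component A band
eps_b = np.exp(-((wl - 700) / 60) ** 2)      # component B band (overlapping)
E = np.column_stack([eps_a, eps_b])          # 100 x 2 matrix of unit spectra

# Mixture spectrum with unknown concentrations plus measurement noise.
c_true = np.array([0.8, 0.3])
mixture = E @ c_true + rng.normal(0, 0.005, wl.size)

# Classical least squares: c = argmin ||E c - mixture||, solved in one step.
c_est, *_ = np.linalg.lstsq(E, mixture, rcond=None)
```

Using the full spectrum rather than a single peak is what lets the regression disentangle overlapped, condition-dependent bands.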

  8. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
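Absolute quantification in dPCR rests on Poisson statistics over partitions: if a fraction p of partitions is positive, the mean number of copies per partition is λ = −ln(1 − p), with no calibration material needed. A minimal sketch with hypothetical partition counts and partition volume.

```python
import math

def dpcr_concentration(positive, total, partition_volume_ul):
    """Copies per microliter from dPCR partition counts via Poisson statistics."""
    p = positive / total
    lam = -math.log(1.0 - p)       # mean copies per partition
    return lam / partition_volume_ul

# Hypothetical run: 12,000 of 20,000 partitions positive, 0.00085 uL partitions.
conc = dpcr_concentration(12000, 20000, 0.00085)
```

The logarithmic correction accounts for partitions that received more than one copy, which is why the simple positive fraction alone would underestimate the concentration.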

  9. Mass spectrometry–based relative quantification of proteins in precatalytic and catalytically active spliceosomes by metabolic labeling (SILAC), chemical labeling (iTRAQ), and label-free spectral count

    Science.gov (United States)

    Schmidt, Carla; Grønborg, Mads; Deckert, Jochen; Bessonov, Sergey; Conrad, Thomas; Lührmann, Reinhard; Urlaub, Henning

    2014-01-01

    The spliceosome undergoes major changes in protein and RNA composition during pre-mRNA splicing. Knowing the proteins—and their respective quantities—at each spliceosomal assembly stage is critical for understanding the molecular mechanisms and regulation of splicing. Here, we applied three independent mass spectrometry (MS)–based approaches for quantification of these proteins: (1) metabolic labeling by SILAC, (2) chemical labeling by iTRAQ, and (3) label-free spectral count for quantification of the protein composition of the human spliceosomal precatalytic B and catalytic C complexes. In total we were able to quantify 157 proteins by at least two of the three approaches. Our quantification shows that only a very small subset of spliceosomal proteins (the U5 and U2 Sm proteins, a subset of U5 snRNP-specific proteins, and the U2 snRNP-specific proteins U2A′ and U2B′′) remains unaltered upon transition from the B to the C complex. The MS-based quantification approaches classify the majority of proteins as dynamically associated specifically with the B or the C complex. In terms of experimental procedure and the methodical aspect of this work, we show that metabolically labeled spliceosomes are functionally active in terms of their assembly and splicing kinetics and can be utilized for quantitative studies. Moreover, we obtain consistent quantification results from all three methods, including the relatively straightforward and inexpensive label-free spectral count technique. PMID:24448447
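Of the three methods compared, label-free spectral counting is the most straightforward. One widely used variant is the normalized spectral abundance factor (NSAF), where each protein's count is divided by its length and then normalized across proteins; this particular formula is an illustrative choice, as the record does not specify which spectral-count scheme was applied.

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factors for label-free relative quantification."""
    # Length-normalize each count, then normalize so the factors sum to 1.
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Hypothetical three-protein example: spectral counts and protein lengths (residues).
values = nsaf([120, 40, 40], [600, 200, 400])
```

Length normalization matters because longer proteins yield more tryptic peptides and therefore more spectra at equal abundance.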

  10. Approach to mathematics in textbooks at tertiary level - exploring authors' views about their texts

    Science.gov (United States)

    Randahl, Mira

    2012-10-01

    The aim of this article is to present and discuss some results from an inquiry into mathematics textbook authors' visions about their texts and the approaches they choose when new concepts are introduced. The authors' responses are discussed in relation to findings of previous research on students' difficulties with approaching calculus. A questionnaire was designed and sent to seven authors of the most widely used calculus textbooks in Norway, and four authors responded. The responses show that the authors mainly view teaching in terms of transmission, so they focus chiefly on getting the mathematical content correct and 'clear'. The dominant view is that the textbook is intended to help students learn by explaining and clarifying. The authors prefer to introduce new concepts in the traditional way of perceiving mathematics as a system of definitions, examples and exercises. The results of this study may enhance our understanding of the role of the textbook at tertiary level. They may also form a foundation for further research.

  11. Absolute quantification of superoxide dismutase in cytosol and mitochondria of mice hepatic cells exposed to mercury by a novel metallomic approach

    Energy Technology Data Exchange (ETDEWEB)

    García-Sevillano, M.A.; García-Barrera, T. [Department of Chemistry and Materials Science, Faculty of Experimental Sciences, University of Huelva, Campus de El Carmen, Huelva 21007 (Spain); Research Center on Health and Environment (CYSMA), University of Huelva (Spain); International Campus of Excellence on Agrofood (ceiA3), University of Huelva (Spain); Navarro, F. [International Campus of Excellence on Agrofood (ceiA3), University of Huelva (Spain); Department of Environmental Biology and Public Health, Cell Biology, Faculty of Experimental Sciences, University of Huelva, Campus El Carmen, Huelva 21007 (Spain); Gómez-Ariza, J.L., E-mail: ariza@uhu.es [Department of Chemistry and Materials Science, Faculty of Experimental Sciences, University of Huelva, Campus de El Carmen, Huelva 21007 (Spain); Research Center on Health and Environment (CYSMA), University of Huelva (Spain); International Campus of Excellence on Agrofood (ceiA3), University of Huelva (Spain)

    2014-09-09

    Highlights: • Identification and quantification of Cu,Zn-superoxide dismutase in mice hepatic cells. • IDA-ICP-MS is applied to obtain a high degree of accuracy, precision and sensitivity. • This methodology reduces the time of analysis and avoids clean-up procedures. • The application of this method to Hg-exposed mice reveals perturbations in Cu,Zn-SOD. - Abstract: In recent years, the development of accurate and precise methods for analyzing individual metalloproteins has become increasingly important, since numerous metalloproteins are excellent biomarkers of oxidative stress and disease. Methods based on post-column isotope dilution analysis (IDA) or enriched protein standards are required to obtain a sufficient degree of accuracy and precision together with low limits of detection. This paper reports the identification and absolute quantification of Cu,Zn-superoxide dismutase (Cu,Zn-SOD) in cytosol and mitochondria from mice hepatic cells using an innovative column-switching analytical approach. The method consisted of orthogonal chromatographic systems coupled to inductively coupled plasma mass spectrometry equipped with an octopole reaction system (ICP-ORS-MS) and UV detectors: size exclusion chromatography (SEC) fractionation of the cytosolic and mitochondrial extracts followed by online anion exchange chromatography (AEC) separation of the Cu/Zn-containing species. After purification, Cu,Zn-SOD was identified by molecular mass spectrometry (MS) after tryptic digestion. The MS/MS spectrum of a doubly charged peptide was used to obtain the sequence of the protein using the MASCOT search engine. This optimized methodology reduces the time of analysis and avoids sample preconcentration and clean-up procedures such as cut-off centrifugal filters, solid phase extraction (SPE), precipitation, and off-line fraction isolation. In this sense, the method is robust, reliable and fast, with a typical chromatographic run time of less than 20 min.

  12. A risk-informed approach of quantification of epistemic uncertainty for the long-term radioactive waste disposal. Improving reliability of expert judgements with an advanced elicitation procedure

    International Nuclear Information System (INIS)

    Sugiyama, Daisuke; Chida, Taiji; Fujita, Tomonari; Tsukamoto, Masaki

    2011-01-01

    A methodology for quantifying epistemic uncertainty by expert judgement, based on a risk-informed approach, is developed to assess the inevitable uncertainty in the long-term safety assessment of radioactive waste disposal. The proposed method employs a logic tree, by which options of models and/or scenarios are identified, and Evidential Support Logic (ESL), by which the possibility of each option is quantified. In this report, the effect of a feedback process of discussion between experts and input of state-of-the-art knowledge is discussed, in order to estimate changes in the distribution of expert judgements, which is one of the factors causing uncertainty. In a preliminary experiment quantifying the uncertainty of degradation of the engineered barrier materials in a tentative sub-surface disposal facility using the proposed methodology, the experts themselves modified questions appropriately to facilitate sound judgements and to correlate them clearly with scientific evidence. The result suggests that the method effectively improves confidence in expert judgement. The degree of consensus among expert judgements also improved in some cases, as scientific knowledge and information from expert judgement in other fields became common understanding. It is suggested that the proposed method could facilitate consensus on uncertainty between interested parties. (author)

  13. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without paying for a development license. The quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as a gold standard was obtained (R(2)>0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
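
DSC perfusion pipelines of this kind typically fit the first-pass contrast bolus with a gamma-variate model (the "gamma-fitting" mentioned above) and integrate it for relative blood volume. A minimal sketch under that common convention; the function names and parameter values are hypothetical, not taken from the plugin:

```python
import numpy as np

def gamma_variate(t, K, t0, alpha, beta):
    """Gamma-variate bolus model often used for DSC first-pass
    fitting: C(t) = K * (t - t0)^alpha * exp(-(t - t0)/beta)."""
    dt = np.clip(t - t0, 0.0, None)
    return K * dt ** alpha * np.exp(-dt / beta)

def relative_cbv(t, c):
    """Relative cerebral blood volume: area under the
    concentration-time curve (trapezoidal rule)."""
    return float(((c[1:] + c[:-1]) / 2.0 * np.diff(t)).sum())

# Synthetic first-pass curve (parameters hypothetical)
t = np.linspace(0.0, 60.0, 121)
c = gamma_variate(t, K=1.0, t0=10.0, alpha=3.0, beta=1.5)
cbv = relative_cbv(t, c)
```

Skipping the gamma fit and integrating the measured curve directly corresponds to the "no gamma-fitting" configuration for which agreement with the reference package was reported.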

  14. Accurate and precise DNA quantification in the presence of different amplification efficiencies using an improved Cy0 method.

    Science.gov (United States)

    Guescini, Michele; Sisti, Davide; Rocchi, Marco B L; Panebianco, Renato; Tibollo, Pasquale; Stocchi, Vilberto

    2013-01-01

    Quantitative real-time PCR represents a highly sensitive and powerful technology for the quantification of DNA. Although real-time PCR is well accepted as the gold standard in nucleic acid quantification, there is a largely unexplored area of experimental conditions that limit the application of the Ct method. As an alternative, our research team has recently proposed the Cy0 method, which can compensate for small amplification variations among the samples being compared. However, when there is a marked decrease in amplification efficiency, the Cy0 is impaired, hence determining reaction efficiency is essential to achieve a reliable quantification. The proposed improvement in Cy0 is based on the use of the kinetic parameters calculated in the curve inflection point to compensate for efficiency variations. Three experimental models were used: inhibition of primer extension, non-optimal primer annealing and a very small biological sample. In all these models, the improved Cy0 method increased quantification accuracy up to about 500% without affecting precision. Furthermore, the stability of this procedure was enhanced integrating it with the SOD method. In short, the improved Cy0 method represents a simple yet powerful approach for reliable DNA quantification even in the presence of marked efficiency variations.
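
The Cy0 value is defined geometrically: the tangent to the amplification curve at its inflection point is extrapolated down to the baseline, and its x-intercept is Cy0. A purely numerical sketch of that geometry on a synthetic logistic curve; the published method fits a five-parameter Richards function, so the discrete-derivative shortcut here is an illustrative stand-in:

```python
import numpy as np

def cy0_numeric(cycles, fluor, baseline=0.0):
    """Cy0 geometry: find the inflection point (cycle of maximum
    first derivative), then extrapolate the tangent at that point
    down to the baseline fluorescence; its intercept is Cy0."""
    d = np.gradient(fluor, cycles)
    i = int(np.argmax(d))
    # tangent: f(x) = fluor[i] + d[i] * (x - cycles[i]); solve f(x) = baseline
    return cycles[i] - (fluor[i] - baseline) / d[i]

# Synthetic logistic amplification curve with inflection at cycle 24
x = np.arange(1, 41, dtype=float)
y = 100.0 / (1.0 + np.exp(-0.6 * (x - 24.0)))
est = cy0_numeric(x, y)
```

Because both the position and the slope of the inflection point enter the calculation, the kinetic parameters at that point can compensate for efficiency variations, which is the idea behind the improved method.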

  15. Validation of an HPLC-UV method for the identification and quantification of bioactive amines in chicken meat

    Directory of Open Access Journals (Sweden)

    D.C.S. Assis

    2016-06-01

    Full Text Available ABSTRACT A high-performance liquid chromatography with ultraviolet detection (HPLC-UV) method was validated for the study of bioactive amines in chicken meat. A gradient elution system with an ultraviolet detector was used after extraction with trichloroacetic acid and pre-column derivatization with dansyl chloride. Putrescine, cadaverine, histamine, tyramine, spermidine, and spermine standards were used for the evaluation of the following performance parameters: selectivity, linearity, precision, recovery, limit of detection, limit of quantification and ruggedness. The results indicated excellent selectivity, separation of all amines, a coefficient of determination greater than 0.99, and recovery from 92.25 to 102.25% at a concentration of 47.2 mg.kg-1, with a limit of detection of 0.3 mg.kg-1 and a limit of quantification of 0.9 mg.kg-1 for all amines, with the exception of histamine, which exhibited a limit of quantification of 1 mg.kg-1. In conclusion, the performance parameters demonstrated the adequacy of the method for the detection and quantification of bioactive amines in chicken meat.
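
Limits of detection and quantification of the kind reported above are commonly derived from a calibration curve as 3.3σ/S and 10σ/S (ICH convention), with S the slope and σ the residual standard deviation. A sketch under that convention; the calibration values are hypothetical, not from the study:

```python
import numpy as np

def lod_loq(conc, response):
    """ICH-style limits from a calibration line: LOD = 3.3*sigma/S,
    LOQ = 10*sigma/S, with S the slope and sigma the standard
    deviation of the calibration residuals."""
    S, intercept = np.polyfit(conc, response, 1)
    residuals = response - (S * conc + intercept)
    sigma = residuals.std(ddof=2)      # two fitted parameters
    return 3.3 * sigma / S, 10.0 * sigma / S

# Hypothetical amine calibration: concentration (mg/kg) vs peak area
c = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
a = np.array([10.2, 19.8, 50.5, 99.0, 201.0])
lod, loq = lod_loq(c, a)
```

The fixed 3.3:10 ratio is why a method's LOQ is always roughly three times its LOD, as in the 0.3 vs 0.9 mg.kg-1 figures above.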

  16. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). The insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of the available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of life cycle phases can thereby be considered as defining multi-dimensional probability fields admitting various families of uncertainty quantification metrics, with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  17. A general approach to quantification of hydroxycinnamic acid derivatives and flavones, flavonols, and their glycosides by UV spectrophotometry

    Science.gov (United States)

    A general method was developed for the quantification of hydroxycinnamic acid derivatives and flavones, flavonols, and their glycosides based on the UV molar relative response factors (MRRF) of the standards. Each of these phenolic compounds contains a cinnamoyl structure and has a maximum absorban...
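
Quantification via molar relative response factors amounts to scaling a reference standard's calibration by each analyte's MRRF, so one standard can serve a whole compound class. A minimal sketch of the arithmetic; the compound, slope, and MRRF value are hypothetical illustrations, not taken from the truncated abstract:

```python
def conc_from_mrrf(peak_area, ref_slope, mrrf):
    """Quantify an analyte lacking its own standard: divide its UV
    peak area by the reference compound's calibration slope scaled
    by the analyte's molar relative response factor (MRRF)."""
    return peak_area / (ref_slope * mrrf)

# Hypothetical flavonol glycoside: area 1500, reference slope 120
# area units per (umol/L), MRRF 1.25 relative to the reference
c_umol_per_l = conc_from_mrrf(1500.0, 120.0, 1.25)
```

This works because the shared cinnamoyl chromophore makes the UV response of each compound a predictable multiple of the reference's response.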

  18. Spinning reserve quantification by a stochastic–probabilistic scheme for smart power systems with high wind penetration

    International Nuclear Information System (INIS)

    Khazali, Amirhossein; Kalantar, Mohsen

    2015-01-01

    Highlights: • A stochastic–probabilistic approach is proposed for spinning reserve quantification. • A new linearized formulation integrating reliability metrics is presented. • The framework manages the reserve provided by responsive loads and storage systems. • The proposed method is capable of detaching the spinning reserve for different uses. - Abstract: This paper introduces a novel spinning reserve quantification scheme based on a hybrid stochastic–probabilistic approach for smart power systems with high penetration of wind generation. In this research the required spinning reserve is split into two main parts. The first part of the reserve is procured to overcome imbalances between load and generation in the system. The second part is scheduled according to the probability of unit outages. To address the uncertainties caused by wind generation and load forecasting errors, different scenarios of wind generation and load uncertainty are generated. For each scenario, the reserve deployed by the different components is taken into account as the first part of the required reserve, which is used to overcome imbalances. The second part of the required reserve is based on reliability constraints: the total expected energy not supplied (TEENS) is the reliability criterion that determines the spinning reserve needed to cover possible unit outages. This formulation permits the independent system operator to purchase the two different types of reserve at different prices. The introduced formulation for reserve quantification is also capable of managing and detaching the reserve provided by responsive loads and energy storage devices. The problem is formulated as a mixed integer linear programming (MILP) problem including linearized formulations of the reliability metrics. The obtained results show the efficiency of the proposed approach compared with conventional stochastic and deterministic approaches.

  19. An Approach to a Comprehensive Test Framework for Analysis and Evaluation of Text Line Segmentation Algorithms

    Directory of Open Access Journals (Sweden)

    Zoran N. Milivojevic

    2011-09-01

    Full Text Available The paper introduces a testing framework for the evaluation and validation of text line segmentation algorithms. Text line segmentation is the key step for correct optical character recognition. Many tests for the evaluation of text line segmentation algorithms use text databases as reference templates. Because of the mismatch between such templates and real documents, a reliable testing framework is required. Hence, a new approach to a comprehensive experimental framework for the evaluation of text line segmentation algorithms is proposed. It consists of synthetic multi-line text samples as well as real handwritten text. Although the tests are mutually independent, the results are cross-linked. The proposed method can be used for different types of scripts and languages. Furthermore, two different procedures for the evaluation of algorithm efficiency, based on the obtained error type classification, are proposed. The first is based on the segmentation line error description, while the second incorporates well-known signal detection theory. Each has different capabilities and conveniences, but they can be used as complements to make the evaluation process efficient. Overall, the proposed procedure based on the segmentation line error description has some advantages, being characterized by five measures that describe the measurement procedure.

  20. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Directory of Open Access Journals (Sweden)

    Robert G Rutledge

    Full Text Available BACKGROUND: Linear regression of efficiency (LRE introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. FINDINGS: Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. CONCLUSIONS: The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
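
LRE rests on the observation that per-cycle amplification efficiency declines approximately linearly with fluorescence, so a regression over the central region of the profile recovers the maximal efficiency without a standard curve. A toy sketch on a simulated profile; for exact linearity in this discrete simulation, each transition's efficiency is paired with its starting fluorescence, a simplification of the published convention:

```python
import numpy as np

def lre_emax(fluor):
    """LRE sketch: per-cycle efficiency E = F_next/F - 1 falls
    linearly with fluorescence, so a linear regression over the
    central region of the profile yields the maximal efficiency
    Emax as the intercept -- no standard curve needed."""
    F = np.asarray(fluor, dtype=float)
    E = F[1:] / F[:-1] - 1.0           # per-cycle efficiencies
    Fc = F[:-1]
    # central region: efficiencies between ~20% and ~80% of the maximum
    mask = (E > 0.2 * E.max()) & (E < 0.8 * E.max())
    slope, intercept = np.polyfit(Fc[mask], E[mask], 1)
    return intercept

# Simulated profile: Emax = 0.9, plateau at 100 fluorescence units
F = [0.01]
for _ in range(60):
    F.append(F[-1] * (1.0 + 0.9 * (1.0 - F[-1] / 100.0)))
emax = lre_emax(F)
```

With the efficiency curve in hand, the initial target quantity follows by back-extrapolating the fluorescence to cycle zero, which is the automated calculation the LRE Analyzer performs.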

  1. USLE systematization of the factors in GIS for the quantification of laminar erosion in the Jirau River watershed

    Directory of Open Access Journals (Sweden)

    Elisete Guimarães

    2005-12-01

    Full Text Available The present paper demonstrates the use of the USLE (Universal Soil Loss Equation) in a GIS (Geographic Information System) as a tool for the quantification of soil losses by laminar erosion. The study area is the Jirau River watershed, located in the district of Dois Vizinhos, southwestern Parana. Our results contribute to the development and implementation of automated methodologies for the characterization, quantification, and control of the laminar erosion process.
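
The USLE itself is a simple product of factors, A = R · K · LS · C · P. A minimal sketch of the per-cell computation a GIS implementation would repeat over the watershed raster; the factor values below are hypothetical, not the study's:

```python
def usle_soil_loss(R, K, LS, C, P):
    """USLE: A = R * K * LS * C * P (t/ha/yr), with R rainfall
    erosivity, K soil erodibility, LS the combined slope
    length/steepness factor, C cover-management and P the
    support-practice factor."""
    return R * K * LS * C * P

# One hypothetical raster cell of the watershed
A = usle_soil_loss(R=6000.0, K=0.025, LS=1.8, C=0.2, P=1.0)
```

In a GIS workflow each factor is a raster layer, so the equation is evaluated cell by cell and the results are mapped to locate the erosion hotspots.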

  2. Diffuse optical microscopy for quantification of depth-dependent epithelial backscattering in the cervix

    Science.gov (United States)

    Bodenschatz, Nico; Lam, Sylvia; Carraro, Anita; Korbelik, Jagoda; Miller, Dianne M.; McAlpine, Jessica N.; Lee, Marette; Kienle, Alwin; MacAulay, Calum

    2016-06-01

    A fiber optic imaging approach using structured illumination is presented for the quantification of almost pure epithelial backscattering. We employ multiple spatially modulated projection patterns and camera-based reflectance capture to image depth-dependent epithelial scattering. The potential diagnostic value of our approach is investigated on ex vivo cervical tissue specimens. Our study indicates a strong backscattering increase in the upper part of the cervical epithelium caused by dysplastic microstructural changes. Quantification of relative depth-dependent backscattering is confirmed as a potentially useful diagnostic feature for the detection of precancerous lesions in cervical squamous epithelium.

  3. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  4. The role of PET quantification in cardiovascular imaging.

    Science.gov (United States)

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently routine practice. Combining ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose ((18)FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for (13)N-ammonia, (15)O-water and (82)Rb radiotracers. The agreement between the software methods available for such analysis is excellent. Relative quantification of (82)Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as (18)F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of (18)FDG and (18)F-sodium fluoride tracers in carotids, aorta and coronary arteries is also being investigated.

  5. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1)H-MRS metabolite quantification. We ... investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own ... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1)H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results ...

  6. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high-fidelity multiscale models of nuclear fuel performance. We will (1) propose a novel approach for coupling mesoscale and macroscale models, (2) devise efficient numerical methods for simulating the coupled system, and (3) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis of computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratory on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  7. Quantification of the genetic risk of environmental mutagens

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1988-01-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit dose. The other, referred to as the doubling dose or indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimate depends on the assumption of persistence of the induced mutations and the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of current incidences of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method, the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For the verification of these quantifications one can use the data from Hiroshima and Nagasaki. According to the estimate obtained with the direct method, one would expect less than 1 radiation-induced dominant cataract in 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens.
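
The indirect method's arithmetic can be made concrete: the doubling dose is the spontaneous mutation rate divided by the induced rate per unit dose, and the expected first-generation increase scales the current incidence by dose/doubling dose. A sketch with deliberately hypothetical numbers, not the values from the mouse data or the Hiroshima/Nagasaki studies:

```python
def doubling_dose(spontaneous_rate, induced_rate_per_unit_dose):
    """Indirect method: the dose that induces as many mutations as
    arise spontaneously per generation."""
    return spontaneous_rate / induced_rate_per_unit_dose

def first_generation_increase(current_incidence, dose, dd):
    """Expected extra incidence of genetic disorders in the first
    generation: current incidence scaled by dose / doubling dose
    (assuming full persistence of induced mutations)."""
    return current_incidence * dose / dd

# Deliberately hypothetical numbers, for the arithmetic only
dd = doubling_dose(spontaneous_rate=1e-5, induced_rate_per_unit_dose=1e-7)
extra = first_generation_increase(current_incidence=0.03, dose=1.0, dd=dd)
```

The direct method bypasses the incidence and persistence assumptions entirely by multiplying the experimentally induced rate of dominant mutations by the exposed population size.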

  8. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  9. A method to quantify infectious airborne pathogens at concentrations below the threshold of quantification by culture

    Science.gov (United States)

    Cutler, Timothy D.; Wang, Chong; Hoff, Steven J.; Zimmerman, Jeffrey J.

    2013-01-01

    In aerobiology, dose-response studies are used to estimate the risk of infection to a susceptible host presented by exposure to a specific dose of an airborne pathogen. In the research setting, host- and pathogen-specific factors that affect the dose-response continuum can be accounted for by experimental design, but the requirement to precisely determine the dose of infectious pathogen to which the host was exposed is often challenging. By definition, quantification of viable airborne pathogens is based on the culture of micro-organisms, but some airborne pathogens are transmissible at concentrations below the threshold of quantification by culture. In this paper we present an approach to the calculation of exposure dose at microbiologically unquantifiable levels using an application of the “continuous-stirred tank reactor (CSTR) model” and the validation of this approach using rhodamine B dye as a surrogate for aerosolized microbial pathogens in a dynamic aerosol toroid (DAT). PMID:24082399
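
The CSTR model treats the aerosol chamber as well mixed, so the airborne concentration relaxes exponentially toward the inlet concentration, and the exposure dose is the breathing rate times the time-integral of concentration. A minimal sketch; the chamber and breathing parameters are hypothetical, not those of the dynamic aerosol toroid:

```python
import math

def cstr_concentration(t, c_in, Q, V, c0=0.0):
    """Well-mixed chamber: dC/dt = (Q/V) * (C_in - C), so
    C(t) = C_in + (C0 - C_in) * exp(-Q * t / V)."""
    return c_in + (c0 - c_in) * math.exp(-Q * t / V)

def inhaled_dose(conc, breathing_rate, dt):
    """Exposure dose = breathing rate x time-integral of
    concentration (trapezoidal rule)."""
    integral = sum((a + b) / 2.0 * dt for a, b in zip(conc, conc[1:]))
    return breathing_rate * integral

# Hypothetical chamber: 10 L volume, 1 L/min airflow, 500 infectious
# units per litre at the inlet; subject breathes 0.2 L/min for 60 min
times = [0.5 * i for i in range(121)]
conc = [cstr_concentration(t, 500.0, 1.0, 10.0) for t in times]
dose = inhaled_dose(conc, breathing_rate=0.2, dt=0.5)
```

Because the model predicts the concentration analytically from the flow and volume, the dose can be calculated even when the concentration is below the threshold of quantification by culture.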

  10. Quantification of 5-methyl-2'-deoxycytidine in the DNA.

    Science.gov (United States)

    Giel-Pietraszuk, Małgorzata; Insińska-Rak, Małgorzata; Golczak, Anna; Sikorski, Marek; Barciszewska, Mirosława; Barciszewski, Jan

    2015-01-01

    Methylation at position 5 of cytosine (Cyt) in CpG sequences, leading to the formation of 5-methylcytosine (m(5)Cyt), is an important element of the epigenetic regulation of gene expression. Modification of the normal methylation pattern, unique to each organism, leads to the development of pathological processes and diseases, including cancer. Therefore, quantification of DNA methylation and analysis of changes in the methylation pattern are very important from a practical point of view and can be used for diagnostic purposes, as well as for monitoring treatment progress. In this paper we present a new method for the quantification of 5-methyl-2'-deoxycytidine (m(5)C) in DNA. The technique is based on the conversion of m(5)C into fluorescent 3,N(4)-etheno-5-methyl-2'-deoxycytidine (εm(5)C) and its identification by reversed-phase high-performance liquid chromatography (RP-HPLC). The assay was used to evaluate the m(5)C concentration in DNA from calf thymus and from the peripheral blood of cows bred under different conditions. This approach can be applied to the measurement of 5-methylcytosine in cellular DNA from different cells and tissues.
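
Once the m(5)C and unmodified dC peaks are resolved chromatographically, global methylation is typically reported as the m(5)C fraction of total cytosine. A sketch of that final calculation; the peak areas and response factors are hypothetical, and the response-factor correction is a generic convention rather than the paper's specific calibration:

```python
def percent_m5c(area_m5c, area_dc, rf_m5c=1.0, rf_dc=1.0):
    """Global DNA methylation as m5C / (m5C + dC) * 100, from
    chromatographic peak areas corrected by per-analyte response
    factors (hypothetical values)."""
    m5c = area_m5c / rf_m5c
    dc = area_dc / rf_dc
    return 100.0 * m5c / (m5c + dc)

# Hypothetical peak areas for one DNA hydrolysate
level = percent_m5c(area_m5c=4.2, area_dc=95.8)
```

Comparing this percentage across tissues or treatment groups is what allows the assay to track changes in the methylation pattern.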

  11. Bone histomorphometric quantification by X-ray phase contrast and transmission 3D SR microcomputed tomography

    International Nuclear Information System (INIS)

    Nogueira, L.P.; Pinheiro, C.J.G.; Braz, D.; Oliveira, L.F.; Barroso, R.C.

    2008-01-01

    Full text: Conventional histomorphometry is an important method for quantitative evaluation of bone microstructure. X-ray computed tomography is a noninvasive technique that can be used to evaluate histomorphometric indices. In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional approach, in which quantification is performed on 2D slices and extrapolated to the 3D case. Seeking better resolution and visualization of soft tissues, the X-ray phase contrast imaging technique was developed. The objective of this work was to perform histomorphometric quantification of human cancellous bone using 3D synchrotron X-ray computed microtomography with two distinct techniques, transmission and phase contrast, in order to compare the results and evaluate the viability of applying the same quantification methodology to both techniques. All experiments were performed at the ELETTRA Synchrotron Light Laboratory in Trieste (Italy). MicroCT data sets were collected using the CT set-up on the SYRMEP (Synchrotron Radiation for Medical Physics) beamline. Results showed a better correlation between histomorphometric parameters of the two techniques when morphological filters were used. However, with these filters some important information given by phase contrast is lost, and this should be explored by new quantification techniques.

  12. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
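
The band-selection step described above can be sketched as a simple computation: correlate reflectance at each candidate waveband with measured THC content and keep the band with the strongest absolute correlation. The wavebands, reflectance values, and THC contents below are invented for illustration, not the study's data.

```python
# Correlation-based selection of the most THC-informative waveband.
# All numbers here are hypothetical, chosen only to exercise the logic.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def best_band(spectra, thc, bands):
    """spectra[i][j] is the reflectance of sample i at band j."""
    corrs = {band: pearson([s[j] for s in spectra], thc)
             for j, band in enumerate(bands)}
    return max(corrs, key=lambda b: abs(corrs[b])), corrs

bands = [550, 695, 800]  # nm; 695 nm mirrors the band reported above
spectra = [[0.30, 0.10, 0.55],
           [0.32, 0.20, 0.54],
           [0.29, 0.30, 0.56],
           [0.31, 0.40, 0.55]]
thc = [0.5, 1.0, 1.5, 2.0]  # hypothetical THC contents (%)
band, corrs = best_band(spectra, thc, bands)
```

In the study itself this screening is followed by stepwise multivariate regression; the sketch covers only the single-band correlation stage.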

  13. Interpretation of biological and mechanical variations between the Lowry versus Bradford method for protein quantification.

    Science.gov (United States)

    Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li

    2010-07-01

    The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expression is a result of true biological or methodical variation. Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on western blot. We show for the first time that methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider the methodical approach to protein quantification in techniques that report quantitative differences.

  14. Writing Treatment for Aphasia: A Texting Approach

    Science.gov (United States)

    Beeson, Pelagie M.; Higginson, Kristina; Rising, Kindle

    2013-01-01

    Purpose: Treatment studies have documented the therapeutic and functional value of lexical writing treatment for individuals with severe aphasia. The purpose of this study was to determine whether such retraining could be accomplished using the typing feature of a cellular telephone, with the ultimate goal of using text messaging for…

  15. Stable-isotope dilution GC-MS approach for nitrite quantification in human whole blood, erythrocytes, and plasma using pentafluorobenzyl bromide derivatization: nitrite distribution in human blood.

    Science.gov (United States)

    Schwarz, Alexandra; Modun, Darko; Heusser, Karsten; Tank, Jens; Gutzki, Frank-Mathias; Mitschke, Anja; Jordan, Jens; Tsikas, Dimitrios

    2011-05-15

    Previously, we reported on the usefulness of pentafluorobenzyl bromide (PFB-Br) for the simultaneous derivatization and quantitative determination of nitrite and nitrate in various biological fluids by GC-MS using their (15)N-labelled analogues as internal standards. As nitrite may be distributed unevenly in plasma and blood cells, its quantification in whole blood rather than in plasma or serum may be the most appropriate approach to determine nitrite concentration in the circulation. So far, GC-MS methods based on PFB-Br derivatization failed to measure nitrite in whole blood and erythrocytes because of rapid nitrite loss by oxidation and other unknown reactions during derivatization. The present article reports optimized and validated procedures for sample preparation and nitrite derivatization which allow for reliable quantification of nitrite in human whole blood and erythrocytes. Essential measures for stabilizing nitrite in these samples include sample cooling (0-4°C), hemoglobin (Hb) removal by precipitation with acetone and short derivatization of the Hb-free supernatant (5 min, 50°C). Potassium ferricyanide (K(3)Fe(CN)(6)) is useful in preventing Hb-caused nitrite loss, however, this chemical is not absolutely required in the present method. Our results show that accurate GC-MS quantification of nitrite as PFB derivative is feasible virtually in every biological matrix with similar accuracy and precision. In EDTA-anticoagulated venous blood of 10 healthy young volunteers, endogenous nitrite concentration was measured to be 486±280 nM in whole blood, 672±496 nM in plasma (C(P)), and 620±350 nM in erythrocytes (C(E)). The C(E)-to-C(P) ratio was 0.993±0.188 indicating almost even distribution of endogenous nitrite between plasma and erythrocytes. By contrast, the major fraction of nitrite added to whole blood remained in plasma. The present GC-MS method is useful to investigate distribution and metabolism of endogenous and exogenous nitrite in blood

  16. ℓ2 optimized predictive image coding with ℓ∞ bound.

    Science.gov (United States)

    Chuah, Sceuchin; Dumitrescu, Sorina; Wu, Xiaolin

    2013-12-01

    In many scientific, medical, and defense applications of image/video compression, an ℓ∞ error bound is required. However, pure ℓ∞-optimized image coding, colloquially known as near-lossless image coding, is prone to structured errors such as contours and speckles if the bit rate is not sufficiently high; moreover, most of the previous ℓ∞-based image coding methods suffer from poor rate control. In contrast, the ℓ2 error metric aims for average fidelity and hence preserves the subtlety of smooth waveforms better than the ℓ∞ error metric, and it offers fine granularity in rate control, but pure ℓ2-based image coding methods (e.g., JPEG 2000) cannot bound individual errors as the ℓ∞-based methods can. This paper presents a new compression approach to retain the benefits and circumvent the pitfalls of the two error metrics. A common approach of near-lossless image coding is to embed into a DPCM prediction loop a uniform scalar quantizer of residual errors. In the proposed new approach, this uniform scalar quantizer is replaced by a set of context-based ℓ2-optimized quantizers. The optimization criterion is to minimize a weighted sum of the ℓ2 distortion and the entropy while maintaining a strict ℓ∞ error bound. The resulting method obtains good rate-distortion performance in both the ℓ2 and ℓ∞ metrics and also increases the rate granularity. Compared with JPEG 2000, the new method not only guarantees lower ℓ∞ error for all bit rates, but also achieves higher PSNR for relatively high bit rates.
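
The embedded-quantizer baseline mentioned above can be illustrated with a minimal sketch: a DPCM prediction loop whose residuals pass through a uniform scalar quantizer of step 2T+1 guarantees a per-sample ℓ∞ error of at most T. The 1-D previous-sample predictor and the toy signal are simplifying assumptions, not the paper's context-based method.

```python
# Minimal near-lossless DPCM sketch: quantizing prediction residuals with
# step 2*T + 1 bounds every reconstruction error by T (the l-infinity bound).

def dpcm_near_lossless(samples, T):
    """Reconstruct a 1-D integer signal with max absolute error <= T."""
    step = 2 * T + 1
    recon = []
    prev = 0  # predictor state, shared by encoder and decoder
    for x in samples:
        residual = x - prev
        q = (residual + T) // step  # uniform quantizer index (floor division)
        prev = prev + q * step      # dequantized value updates the predictor
        recon.append(prev)
    return recon

signal = [10, 12, 15, 40, 38, 37, 90]
decoded = dpcm_near_lossless(signal, T=2)
max_err = max(abs(a - b) for a, b in zip(signal, decoded))
```

The paper replaces this single uniform quantizer with context-based ℓ2-optimized quantizers; the sketch only shows why the ℓ∞ bound holds for the embedded-quantizer baseline.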

  17. Place as Text: Approaches to Active Learning. 2nd Edition. National Collegiate Honors Council Monograph Series

    Science.gov (United States)

    Braid, Bernice, Ed.; Long, Ada, Ed.

    2010-01-01

    The decade since publication of "Place as Text: Approaches to Active Learning" has seen an explosion of interest and productivity in the field of experiential education. This monograph presents a story of an experiment and a blueprint of sorts for anyone interested in enriching an existing program or willing to experiment with pedagogy…

  18. Rapid Quantification of Low-Viscosity Acetyl-Triacylglycerols Using Electrospray Ionization Mass Spectrometry.

    Science.gov (United States)

    Bansal, Sunil; Durrett, Timothy P

    2016-09-01

    Acetyl-triacylglycerols (acetyl-TAG) possess an sn-3 acetate group, which confers useful chemical and physical properties on these unusual triacylglycerols (TAG). Current methods for quantification of acetyl-TAG are time consuming and do not provide any information on the molecular species profile. Electrospray ionization mass spectrometry (ESI-MS)-based methods can overcome these drawbacks. However, the ESI-MS signal intensity for TAG depends on the aliphatic chain length and unsaturation index of the molecule, so response factors for the different molecular species need to be determined before any quantification. The effects of the chain length and the number of double bonds of the sn-1/2 acyl groups on the signal intensity for the neutral loss of short chain length sn-3 groups were quantified using a series of synthesized sn-3-specific structured TAG. The signal intensity for the neutral loss of the sn-3 acyl group was found to be negatively correlated with the aliphatic chain length and unsaturation index of the sn-1/2 acyl groups. The signal intensity of the neutral loss of the sn-3 acyl group was also negatively correlated with the size of that chain. Further, the position of the group undergoing neutral loss was also important, with the signal from an sn-2 acyl group much lower than that from one located at sn-3. Response factors obtained from these analyses were used to develop a method for the absolute quantification of acetyl-TAG. The increased sensitivity of this ESI-MS-based approach allowed successful quantification of acetyl-TAG in various biological settings, including the products of in vitro enzyme activity assays.
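
The response-factor correction described above can be sketched as follows: the neutral-loss signal of each molecular species is divided by its response factor before being scaled against an internal standard. The response-factor values, species names, and intensities below are invented for illustration, not the published ones.

```python
# Hedged sketch of response-factor-based absolute quantification.

def absolute_amount(intensity, response_factor, istd_intensity, istd_amount):
    """Amount of analyte, after correcting its signal by a response factor."""
    corrected = intensity / response_factor
    return istd_amount * corrected / istd_intensity

# Illustrative factors only: longer, more unsaturated sn-1/2 chains suppress
# the sn-3 neutral-loss signal, so their factors are below 1.
response_factors = {"16:0/16:0/2:0": 1.00, "18:1/18:1/2:0": 0.72}

amount = absolute_amount(intensity=3600.0,
                         response_factor=response_factors["18:1/18:1/2:0"],
                         istd_intensity=5000.0, istd_amount=10.0)
```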

  19. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    Energy Technology Data Exchange (ETDEWEB)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Meinel, Felix G.; Geyer, Lucas L. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Apfaltrer, Paul [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Center Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany); Canstein, Christian [Siemens Medical Solutions USA, Inc., Malvern, PA (United States); De Cecco, Carlo Nicola [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome 'Sapienza' - Polo Pontino, Department of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2014-02-15

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV(M) segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV(A) calculation (automated approach). EFV and the time required for measuring EFV (including software processing time and manual optimization time) were recorded for each method. Intraobserver and interobserver reliability was assessed on the prototype software measurements. The t test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV(A) (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). A mean of 3.9 ± 1.9 manual border edits were performed to optimize the automated process. The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV(A) quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)
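
The agreement analysis reported above (Bland-Altman plots alongside correlation) reduces to a short computation: the mean difference (bias) between the manual and automated volumes and the 95% limits of agreement. The paired EFV values below are invented, not the study data.

```python
# Hedged sketch of a Bland-Altman agreement calculation on paired readings.

def bland_altman(a, b):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

manual    = [65.0, 80.0, 50.0, 90.0, 70.0]  # hypothetical EFV(M), mL
automated = [60.0, 76.0, 47.0, 85.0, 66.0]  # hypothetical EFV(A), mL
bias, lower, upper = bland_altman(manual, automated)
```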

  20. System resiliency quantification using non-state-space and state-space analytic models

    International Nuclear Information System (INIS)

    Ghosh, Rahul; Kim, DongSeong; Trivedi, Kishor S.

    2013-01-01

    Resiliency is becoming an important service attribute for large-scale distributed systems and networks. Key problems in resiliency quantification are the lack of consensus on the definition of resiliency and of a systematic approach to quantifying system resiliency. In general, resiliency is defined as the ability of a system/person/organization to recover from, defy, or resist any shock, insult, or disturbance [1]. Many researchers interpret resiliency as a synonym for fault tolerance and reliability/availability. However, the effect of failure/repair on systems is already covered by reliability/availability measures, and the effect on individual jobs is well covered under the umbrella of performability [2] and task completion time analysis [3]. We use the definition of Laprie [4] and Simoncini [5], in which resiliency is the persistence of service delivery that can justifiably be trusted when facing changes. The changes we refer to here are beyond the envelope of system configurations already considered during system design, that is, beyond fault tolerance. In this paper, we outline a general approach for system resiliency quantification. Using examples of non-state-space and state-space stochastic models, we analytically and numerically quantify the resiliency of system performance, reliability, availability and performability measures with respect to structural and parametric changes.

  1. Simplified quantification of nicotinic receptors with 2[18F]F-A-85380 PET

    International Nuclear Information System (INIS)

    Mitkovski, Sascha; Villemagne, Victor L.; Novakovic, Kathy E.; O'Keefe, Graeme; Tochon-Danguy, Henri; Mulligan, Rachel S.; Dickinson, Kerryn L.; Saunder, Tim; Gregoire, Marie-Claude; Bottlaender, Michel; Dolle, Frederic; Rowe, Christopher C.

    2005-01-01

    Introduction: Neuronal nicotinic acetylcholine receptors (nAChRs), widely distributed in the human brain, are implicated in various neurophysiological processes as well as being particularly affected in neurodegenerative conditions such as Alzheimer's disease. We sought to evaluate a minimally invasive method for quantification of nAChR distribution in the normal human brain, suitable for routine clinical application, using 2[18F]F-A-85380 and positron emission tomography (PET). Methods: Ten normal volunteers (four females and six males, aged 63.40±9.22 years) underwent a dynamic 120-min PET scan after injection of 226 MBq 2[18F]F-A-85380, along with arterial blood sampling. Regional binding was assessed through standardized uptake value (SUV) and distribution volumes (DV) obtained using both compartmental (DV(2CM)) and graphical analysis (DV(Logan)). A simplified approach to the estimation of DV (DV(simplified)), defined as the region-to-plasma ratio at apparent steady state (90-120 min post injection), was compared with the other quantification approaches. Results: DV(Logan) values were higher than DV(2CM). A strong correlation was observed between DV(simplified), DV(Logan) (r=.94) and DV(2CM) (r=.90) in cortical regions, with lower correlations in the thalamus (r=.71 and .82, respectively). Standardized uptake value showed low correlation against DV(Logan) and DV(2CM). Conclusion: DV(simplified), determined by the ratio of tissue to metabolite-corrected plasma using a single 90- to 120-min PET acquisition, appears acceptable for quantification of cortical nAChR binding with 2[18F]F-A-85380 and suitable for clinical application.
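
The DV(simplified) estimate described above reduces to a simple computation: average the region-to-plasma activity ratio over the frames falling in the 90-120 min window. The frame times and activity values below are invented for illustration; in practice the plasma input is metabolite-corrected.

```python
# Hedged sketch of the region-to-plasma ratio at apparent steady state.

def dv_simplified(region_activity, plasma_activity, frame_minutes,
                  t_start=90, t_end=120):
    """Mean region/plasma ratio over frames within [t_start, t_end] min."""
    ratios = [r / p
              for r, p, t in zip(region_activity, plasma_activity, frame_minutes)
              if t_start <= t <= t_end]
    return sum(ratios) / len(ratios)

region = [12.0, 11.5, 11.0, 10.8, 10.5]  # hypothetical tissue activities
plasma = [3.0, 2.5, 2.2, 2.16, 2.1]      # hypothetical metabolite-corrected plasma
times  = [60, 75, 90, 105, 120]          # frame mid-times, minutes
dv = dv_simplified(region, plasma, times)
```

Only the last three frames (90, 105, 120 min) contribute, mirroring the single late acquisition proposed in the abstract.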

  2. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.

  3. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  4. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  5. Multivariate Quantification of the Solid State Phase Composition of Co-Amorphous Naproxen-Indomethacin

    Directory of Open Access Journals (Sweden)

    Andreas Beyer

    2015-10-01

    Full Text Available To benefit from the optimized dissolution properties of active pharmaceutical ingredients in their amorphous forms, co-amorphisation as a viable tool to stabilize these amorphous phases is of both academic and industrial interest. Reports dealing with the physical stability and recrystallization behavior of co-amorphous systems are however limited to qualitative evaluations based on the corresponding X-ray powder diffractograms. Therefore, the objective of the study was to develop a quantification model based on X-ray powder diffractometry (XRPD, followed by a multivariate partial least squares regression approach that enables the simultaneous determination of up to four solid state fractions: crystalline naproxen, γ-indomethacin, α-indomethacin as well as co-amorphous naproxen-indomethacin. For this purpose, a calibration set that covers the whole range of possible combinations of the four components was prepared and analyzed by XRPD. In order to test the model performances, leave-one-out cross validation was performed and revealed root mean square errors of validation between 3.11% and 3.45% for the crystalline molar fractions and 5.57% for the co-amorphous molar fraction. In summary, even four solid state phases, involving one co-amorphous phase, can be quantified with this XRPD data-based approach.

  6. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.
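
Two quantitative details above lend themselves to a short sketch: converting genomic copies/mL to IU/mL with the reported conversion factor (0.62 IU per copy for this assay), and expressing the difference between two assays on the log10 scale. The viral-load numbers are invented for illustration.

```python
import math

# Unit conversion and log10 comparison for viral-load measurements.

def copies_to_iu(copies_per_ml, factor=0.62):
    """Convert a copies/mL viral load to IU/mL using an assay's factor."""
    return copies_per_ml * factor

def log10_difference(load_a, load_b):
    """Difference between two viral loads on the log10 scale."""
    return math.log10(load_a) - math.log10(load_b)

iu = copies_to_iu(10_000)                # 1e4 copies/mL at factor 0.62
diff = log10_difference(34_674, 10_000)  # about a 0.54 log10 difference
```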

  7. An international comparability study on quantification of mRNA gene expression ratios: CCQM-P103.1

    Directory of Open Access Journals (Sweden)

    Alison S. Devonshire

    2016-06-01

    Full Text Available Measurement of RNA can be used to study and monitor a range of infectious and non-communicable diseases, with profiling of multiple gene expression mRNA transcripts being increasingly applied to cancer stratification and prognosis. An international comparison study (Consultative Committee for Amount of Substance (CCQM) study P103.1) was performed in order to evaluate the comparability of measurements of RNA copy number ratio for multiple gene targets between two samples. Six exogenous synthetic targets comprising External RNA Control Consortium (ERCC) standards were measured alongside transcripts for three endogenous gene targets present in a background of human cell line RNA. The study was carried out under the auspices of the Nucleic Acids (formerly Bioanalysis) Working Group of the CCQM. It was coordinated by LGC (United Kingdom) with the support of the National Institute of Standards and Technology (USA), and results were submitted by thirteen National Metrology Institutes and Designated Institutes. The majority of laboratories performed RNA measurements using RT-qPCR, with datasets also being submitted by two laboratories based on reverse transcription digital polymerase chain reaction and one laboratory using a next-generation sequencing method. In RT-qPCR analysis, the RNA copy number ratios between the two samples were quantified using either a standard curve or a relative quantification approach. In general, good agreement was observed between the reported results of ERCC RNA copy number ratio measurements. Measurements of the RNA copy number ratios for endogenous genes between the two samples were also consistent between the majority of laboratories. Some differences in the reported values and confidence intervals ('measurement uncertainties') were noted, which may be attributable to the choice of measurement method or quantification approach. This highlights the need for standardised practices for the calculation of fold change ratios and
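
The "relative quantification approach" mentioned above is commonly the ΔΔCq calculation; a minimal sketch, assuming an amplification efficiency of 2 (perfect doubling per cycle) and invented Cq values, is:

```python
# Classic 2^-ddCq relative quantification of an mRNA ratio between samples.

def fold_change(cq_target_a, cq_ref_a, cq_target_b, cq_ref_b, efficiency=2.0):
    """Copy-number ratio of a target between samples A and B,
    normalized to a reference gene."""
    delta_a = cq_target_a - cq_ref_a  # normalize sample A to its reference
    delta_b = cq_target_b - cq_ref_b  # normalize sample B to its reference
    return efficiency ** -(delta_a - delta_b)

# Illustrative Cq values: after normalization the target crosses threshold
# two cycles earlier in sample A, i.e. a four-fold higher copy number.
ratio = fold_change(cq_target_a=20.0, cq_ref_a=18.0,
                    cq_target_b=24.0, cq_ref_b=20.0)
```

The standard-curve alternative instead interpolates each Cq against a dilution series of known copy number before taking the ratio.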

  8. Quantification of in vivo oxidative damage in Caenorhabditis elegans during aging by endogenous F3-isoprostane measurement

    NARCIS (Netherlands)

    Labuschagne, C.F.; Stigter, E.C.; Hendriks, M.M.; Berger, R.; Rokach, J.; Korswagen, H.C.; Brenkman, A.B.

    2013-01-01

    Oxidative damage is thought to be a major cause in development of pathologies and aging. However, quantification of oxidative damage is methodologically difficult. Here, we present a robust liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach for accurate, sensitive, and linear in vivo

  9. Application of laboratory and portable attenuated total reflectance infrared spectroscopic approaches for rapid quantification of alpaca serum immunoglobulin G.

    Directory of Open Access Journals (Sweden)

    Ibrahim Elsohaby

    Full Text Available The objective of this study was to develop and compare the performance of laboratory-grade and portable attenuated total reflectance infrared (ATR-IR) spectroscopic approaches, in combination with partial least squares regression (PLSR), for the rapid quantification of alpaca serum IgG concentration and the identification of low IgG (<1000 mg/dL), which is consistent with the diagnosis of failure of transfer of passive immunity (FTPI) in neonates. Serum samples (n = 175) collected from privately owned, healthy alpacas were tested by the reference method of radial immunodiffusion (RID) assay and by laboratory-grade and portable ATR-IR spectrometers. Various pre-processing strategies were applied to the ATR-IR spectra, which were linked to corresponding RID-IgG concentrations and then randomly split into two sets: a calibration (training) set and a test set. PLSR was applied to the calibration set to develop calibration models, and the test set was used to assess the accuracy of the analytical method. For the test set, the Pearson correlation coefficient between the IgG measured by RID and that predicted by both laboratory-grade and portable ATR-IR spectrometers was 0.91. The average differences between reference serum IgG concentrations and the two IR-based methods were 120.5 mg/dL and 71 mg/dL for the laboratory and portable ATR-IR-based assays, respectively. Adopting an IgG concentration <1000 mg/dL as the cut-point for FTPI cases, the sensitivity, specificity, and accuracy for identifying serum samples below this cut-point by the laboratory ATR-IR assay were 86, 100 and 98%, respectively (within the entire data set). Corresponding values for the portable ATR-IR assay were 95, 99 and 99%, respectively. These results suggest that the two ATR-IR assays performed similarly for rapid qualitative evaluation of alpaca serum IgG. For diagnosis of IgG <1000 mg/dL, the portable ATR-IR spectrometer performed slightly better, and provides more flexibility for
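
The diagnostic-performance figures above follow from a standard contingency-table calculation at the 1000 mg/dL cut-point. A sketch with invented reference (RID) and IR-predicted IgG values:

```python
# Sensitivity, specificity, and accuracy at a fixed diagnostic cut-point.
# "Positive" means IgG below the cut-point (a suspected FTPI case).

def diagnostic_performance(reference, predicted, cutoff=1000.0):
    pairs = list(zip(reference, predicted))
    tp = sum(1 for r, p in pairs if r < cutoff and p < cutoff)
    tn = sum(1 for r, p in pairs if r >= cutoff and p >= cutoff)
    fn = sum(1 for r, p in pairs if r < cutoff and p >= cutoff)
    fp = sum(1 for r, p in pairs if r >= cutoff and p < cutoff)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(pairs)
    return sensitivity, specificity, accuracy

rid = [800, 950, 1200, 2500, 3000, 400]   # hypothetical RID IgG, mg/dL
ir  = [850, 1100, 1300, 2400, 2900, 500]  # hypothetical IR-predicted IgG
sens, spec, acc = diagnostic_performance(rid, ir)
```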

  10. Quantification of furanoheliangolides by HPLC and GC

    Directory of Open Access Journals (Sweden)

    Pierre Alexandre dos Santos

    2003-09-01

    Full Text Available The development and comparison of two analytical methods (HPLC and GC) for the quantification of the most common furanoheliangolides from Lychnophora is reported in this paper. Both methods are sensitive and suitable for quantification of these metabolites.

  11. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  12. Improving Students' Ability in Writing Hortatory Exposition Texts by Using Process-Genre Based Approach with YouTube Videos as the Media

    Directory of Open Access Journals (Sweden)

    fifin naili rizkiyah

    2017-06-01

    Full Text Available Abstract: This research is aimed at finding out how the Process-Genre Based Approach strategy, with YouTube videos as the media, is employed to improve the students' ability in writing hortatory exposition texts. This study uses a collaborative classroom action research design following the procedures of planning, implementing, observing, and reflecting. The procedures for carrying out the strategy are: (1) relating several issues/cases to the students' background knowledge and introducing the generic structures and linguistic features of hortatory exposition text as the BKoF stage, (2) analyzing the generic structure and the language features used in the text and getting a model of how to write a hortatory exposition text by using the YouTube video as the MoT stage, (3) writing a hortatory exposition text collaboratively in a small group and in pairs through process writing as the JCoT stage, and (4) writing a hortatory exposition text individually as the ICoT stage. The result shows that the use of the Process-Genre Based Approach and YouTube videos can improve the students' ability in writing hortatory exposition texts. The percentage of students achieving a score above the minimum passing grade (70) improved from only 15.8% (3 out of 19 students) in the preliminary study to 100% (22 students) in Cycle 1. Besides, the score for each aspect (content, organization, vocabulary, grammar, and mechanics) also improved. Key Words: writing ability, hortatory exposition text, process-genre based approach, youtube video

  13. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development, and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using financial information on these companies for the years 2005-2010. To carry out this work, methods for quantifying managerial skills are applied to CNFR NAVROM SA Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.
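As a toy illustration of the ratio-based indicators named above, here is how a net profit margin and a return on equity (the latter added purely as another common profitability ratio) would be computed; all figures are invented, not NAVROM's.

```python
# All figures hypothetical (thousand RON); not taken from the paper.
net_profit = 1_250.0
turnover = 18_400.0
equity = 9_800.0

net_profit_margin = net_profit / turnover * 100   # % of sales kept as profit
return_on_equity = net_profit / equity * 100      # % return on shareholders' funds

print(f"net profit margin: {net_profit_margin:.1f}%")
print(f"return on equity:  {return_on_equity:.1f}%")
```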

  14. Real-time PCR based on SYBR-Green I fluorescence: An alternative to the TaqMan assay for a relative quantification of gene rearrangements, gene amplifications and micro gene deletions

    Directory of Open Access Journals (Sweden)

    Puisieux Alain

    2003-10-01

    Full Text Available Abstract Background Real-time PCR is increasingly being adopted for RNA quantification and genetic analysis. At present the most popular real-time PCR assay is based on the hybridisation of a dual-labelled probe to the PCR product, and the development of a signal by loss of fluorescence quenching as PCR degrades the probe. Though this so-called 'TaqMan' approach has proved easy to optimise in practice, the dual-labelled probes are relatively expensive. Results We have designed a new assay based on SYBR-Green I binding that is quick, reliable, easily optimised and compares well with the published assay. Here we demonstrate its general applicability by measuring copy number in three different genetic contexts: the quantification of a gene rearrangement (T-cell receptor excision circles, TREC) in peripheral blood mononuclear cells; the detection and quantification of GLI, MYC-C and MYC-N gene amplification in cell lines and cancer biopsies; and the detection of deletions in the OPA1 gene in dominant optic atrophy. Conclusion Our assay has important clinical applications, providing accurate diagnostic results in less time, from less biopsy material and at less cost than assays currently employed such as FISH or Southern blotting.
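The relative copy-number readout such an assay yields is conventionally computed with the comparative (delta-delta Ct) method; the sketch below uses invented Ct values and assumes equal, perfect amplification efficiency for target and reference.

```python
def relative_copy_number(ct_target, ct_ref, ct_target_cal, ct_ref_cal,
                         efficiency=2.0):
    """Comparative (delta-delta Ct) estimate of target copy number in a
    sample relative to a calibrator, normalized to a reference locus.
    Assumes equal amplification efficiency (2.0 = perfect doubling)."""
    ddct = (ct_target - ct_ref) - (ct_target_cal - ct_ref_cal)
    return efficiency ** (-ddct)

# Invented Ct values: the normalized target crosses threshold two cycles
# earlier than in the calibrator, i.e. ~4-fold amplification.
fold = relative_copy_number(ct_target=24.0, ct_ref=22.0,
                            ct_target_cal=26.0, ct_ref_cal=22.0)
print(fold)  # 4.0
```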

  15. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to
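PCR efficiency, the criterion stressed above, is conventionally derived from the slope of a standard curve of Ct versus log10 copy number; the dilution series below is invented.

```python
# Invented serial 10-fold dilution series (log10 copies vs. Ct).
log10_copies = [5.0, 4.0, 3.0, 2.0]
ct = [18.1, 21.5, 24.9, 28.3]

# Least-squares slope of Ct vs. log10(copies).
n = len(ct)
mx = sum(log10_copies) / n
my = sum(ct) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct))
         / sum((x - mx) ** 2 for x in log10_copies))

# An efficiency of 1.0 means perfect doubling each cycle (slope ~ -3.32).
efficiency = 10 ** (-1 / slope) - 1
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
```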

  16. Towards Technological Approaches for Concept Maps Mining from Text

    OpenAIRE

    Camila Zacche Aguiar; Davidson Cury; Amal Zouaq

    2018-01-01

    Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of a concept map, to facilitate and provide the benefits of that resource more broadly. Due to the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed...

  17. RSEM: accurate transcript quantification from RNA-Seq data with or without a reference genome

    Directory of Open Access Journals (Sweden)

    Dewey Colin N

    2011-08-01

    Full Text Available Abstract Background RNA-Seq is revolutionizing the way transcript abundances are measured. A key challenge in transcript quantification from RNA-Seq data is the handling of reads that map to multiple genes or isoforms. This issue is particularly important for quantification with de novo transcriptome assemblies in the absence of sequenced genomes, as it is difficult to determine which transcripts are isoforms of the same gene. A second significant issue is the design of RNA-Seq experiments, in terms of the number of reads, read length, and whether reads come from one or both ends of cDNA fragments. Results We present RSEM, a user-friendly software package for quantifying gene and isoform abundances from single-end or paired-end RNA-Seq data. RSEM outputs abundance estimates, 95% credibility intervals, and visualization files and can also simulate RNA-Seq data. In contrast to other existing tools, the software does not require a reference genome. Thus, in combination with a de novo transcriptome assembler, RSEM enables accurate transcript quantification for species without sequenced genomes. On simulated and real data sets, RSEM has superior or comparable performance to quantification methods that rely on a reference genome. Taking advantage of RSEM's ability to effectively use ambiguously-mapping reads, we show that accurate gene-level abundance estimates are best obtained with large numbers of short single-end reads. On the other hand, estimates of the relative frequencies of isoforms within single genes may be improved through the use of paired-end reads, depending on the number of possible splice forms for each gene. Conclusions RSEM is an accurate and user-friendly software tool for quantifying transcript abundances from RNA-Seq data. As it does not rely on the existence of a reference genome, it is particularly useful for quantification with de novo transcriptome assemblies. In addition, RSEM has enabled valuable guidance for cost
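The core idea behind RSEM's handling of multi-mapping reads is an expectation-maximization loop; this toy version (made-up compatibility sets, no read or fragment model) shows just that fractional-assignment step, not RSEM's full statistical model.

```python
# Made-up read-compatibility sets: each read lists the transcripts it maps to.
reads = [{"t1"}, {"t1"}, {"t1", "t2"}, {"t2"}, {"t1", "t2"}]
transcripts = ["t1", "t2"]

theta = {t: 1.0 / len(transcripts) for t in transcripts}   # uniform start
for _ in range(50):
    counts = {t: 0.0 for t in transcripts}
    for compat in reads:                 # E-step: split each read by abundance
        z = sum(theta[t] for t in compat)
        for t in compat:
            counts[t] += theta[t] / z
    theta = {t: c / len(reads) for t, c in counts.items()}  # M-step

print({t: round(v, 3) for t, v in theta.items()})  # {'t1': 0.667, 't2': 0.333}
```

The ambiguous reads end up split in proportion to the abundances implied by the unambiguous ones, which is exactly why EM outperforms discarding or uniformly splitting multi-mapping reads.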

  18. La quantification en Kabiye: une approche linguistique | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  19. Bayesian Proteoform Modeling Improves Protein Quantification of Global Proteomic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Datta, Susmita; Payne, Samuel H.; Kang, Jiyun; Bramer, Lisa M.; Nicora, Carrie D.; Shukla, Anil K.; Metz, Thomas O.; Rodland, Karin D.; Smith, Richard D.; Tardiff, Mark F.; McDermott, Jason E.; Pounds, Joel G.; Waters, Katrina M.

    2014-12-01

    As the capability of mass spectrometry-based proteomics has matured, tens of thousands of peptides can be measured simultaneously, which has the benefit of offering a systems view of protein expression. However, a major challenge is that with an increase in throughput, protein quantification estimation from the native measured peptides has become a computational task. A limitation of existing computationally-driven protein quantification methods is that most ignore protein variation, such as alternate splicing of the RNA transcript and post-translational modifications or other possible proteoforms, which will affect a significant fraction of the proteome. The consequence of this assumption is that statistical inference at the protein level, and consequently downstream analyses, such as network and pathway modeling, have only limited power for biomarker discovery. Here, we describe a Bayesian model (BP-Quant) that uses statistically derived peptide signatures to identify peptides that are outside the dominant pattern, or the existence of multiple over-expressed patterns, to improve relative protein abundance estimates. It is a research-driven approach that utilizes the objectives of the experiment, defined in the context of a standard statistical hypothesis, to identify a set of peptides exhibiting similar statistical behavior relating to a protein. This approach infers that changes in relative protein abundance can be used as a surrogate for changes in function, without necessarily taking into account the effect of differential post-translational modifications, processing, or splicing in altering protein function. We verify the approach using a dilution study from mouse plasma samples and demonstrate that BP-Quant achieves similar accuracy as the current state-of-the-art methods at proteoform identification with significantly better specificity. BP-Quant is available as MATLAB® and R packages at https://github.com/PNNL-Comp-Mass-Spec/BP-Quant.

  20. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Directory of Open Access Journals (Sweden)

    Lett Jean-Michel

    2011-08-01

    Full Text Available Abstract Background Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Results Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10^9 to 2 × 10^3 copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B, and 2 × 10^8 to 2 × 10^3 copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi, contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. Conclusions To detect and

  1. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    Full Text Available There is a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record history, document signs, establish diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnosis for many neurological disorders. Introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification in neurology has become essential, both in practice and in research methodology. While this aspect is widely acknowledged, there is limited access to a comprehensive document pertaining to measurements in neurology. The following description is a critical appraisal of various measurements and also provides certain commonly used rating scales/scores in neurological practice.

  2. Text mining approach to predict hospital admissions using early medical records from the emergency department.

    Science.gov (United States)

    Lucini, Filipe R; S Fogliatto, Flavio; C da Silveira, Giovani J; L Neyeloff, Jeruza; Anzanello, Michel J; de S Kuchenbecker, Ricardo; D Schaan, Beatriz

    2017-04-01

    Emergency department (ED) overcrowding is a serious issue for hospitals. Early information on short-term inward bed demand from patients receiving care at the ED may reduce the overcrowding problem, and optimize the use of hospital resources. In this study, we use text mining methods to process data from early ED patient records using the SOAP framework, and predict future hospitalizations and discharges. We try different approaches for pre-processing of text records and to predict hospitalization. Sets-of-words are obtained via binary representation, term frequency, and term frequency-inverse document frequency. Unigrams, bigrams and trigrams are tested for feature formation. Feature selection is based on χ² and F-score metrics. In the prediction module, eight text mining methods are tested: Decision Tree, Random Forest, Extremely Randomized Tree, AdaBoost, Logistic Regression, Multinomial Naïve Bayes, Support Vector Machine (Kernel linear) and Nu-Support Vector Machine (Kernel linear). Prediction performance is evaluated by F1-scores. Precision and Recall values are also reported for all text mining methods tested. Nu-Support Vector Machine was the text mining method with the best overall performance. Its average F1-score in predicting hospitalization was 77.70%, with a standard deviation (SD) of 0.66%. The method could be used to manage daily routines in EDs such as capacity planning and resource allocation. Text mining could provide valuable information and facilitate decision-making by inward bed management teams. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
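Term frequency-inverse document frequency, one of the feature-formation schemes tested above, can be computed directly; the "SOAP-style" notes below are invented for illustration.

```python
import math
from collections import Counter

# Invented toy corpus of ED note fragments.
docs = [
    "chest pain shortness of breath",
    "abdominal pain nausea",
    "chest pain radiating to arm",
]

tokenized = [d.split() for d in docs]
df = Counter(t for doc in tokenized for t in set(doc))   # document frequency
n_docs = len(docs)

def tfidf(doc):
    """tf-idf weights for one tokenized document."""
    tf = Counter(doc)
    return {t: (tf[t] / len(doc)) * math.log(n_docs / df[t]) for t in tf}

vec = tfidf(tokenized[0])
# "pain" occurs in every note, so its idf (and weight) is zero, while a
# term unique to this note, like "shortness", is weighted highest.
print(vec["shortness"] > vec["chest"] > vec["pain"])  # True
```

In the study these weighted sets-of-words are then fed to the classifiers listed (e.g. Nu-Support Vector Machine) rather than used directly.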

  3. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).

  4. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.

  5. Quantification of the Keto-Hydroperoxide (HOOCH2OCHO) and Other Elusive Intermediates during Low-Temperature Oxidation of Dimethyl Ether

    KAUST Repository

    Moshammer, Kai; Jasper, Ahren W.; Popolan-Vaida, Denisia M.; Wang, Zhandong; Bhavani Shankar, Vijai Shankar; Ruwe, Lena; Taatjes, Craig A.; Dagaut, Philippe; Hansen, Nils

    2016-01-01

    photoionization cross sections that are hard to obtain experimentally but essential to turn mass spectral data into mole fraction profiles. The presented approach enabled the quantification of the hydroperoxymethyl formate (HOOCH2OCHO), which is a key

  6. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    Science.gov (United States)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
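The Monte Carlo idea can be shown end-to-end with a toy linear forward model: perturb synthetic radiances with instrument noise, rerun the retrieval on each draw, and summarize the sampling distribution of the retrieved state. The forward model, noise level, and state value below are all assumptions, not mission parameters.

```python
import random
import statistics

random.seed(1)

true_state = 400.0                  # hypothetical XCO2-like quantity, ppm

def forward(state):                 # toy linear forward model (assumed)
    return 0.8 * state + 5.0

def retrieve(radiance):             # exact inverse of the toy forward model
    return (radiance - 5.0) / 0.8

noise_sd = 2.0                      # instrument noise, radiance units
estimates = [retrieve(forward(true_state) + random.gauss(0.0, noise_sd))
             for _ in range(5000)]

bias = statistics.mean(estimates) - true_state
spread = statistics.stdev(estimates)     # ~ noise_sd / 0.8 = 2.5 ppm
print(f"bias ≈ {bias:.2f} ppm, spread ≈ {spread:.2f} ppm")
```

A real application would also perturb imperfectly known physics and ancillary inputs, which is where this framework goes beyond the retrieval's own error bars.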

  7. Chip-Oriented Fluorimeter Design and Detection System Development for DNA Quantification in Nano-Liter Volumes

    Directory of Open Access Journals (Sweden)

    Da-Sheng Lee

    2009-12-01

    Full Text Available The chip-based polymerase chain reaction (PCR) system has been developed in recent years to achieve DNA quantification. Using a microstructure and miniature chip, the volume consumption for a PCR can be reduced to a nano-liter. With high speed cycling and a low reaction volume, the time consumption of one PCR cycle performed on a chip can be reduced. However, most of the presented prototypes employ commercial fluorimeters which are not optimized for fluorescence detection of such a small quantity of sample. This limits the performance of DNA quantification, in particular experiment reproducibility. This study discusses the concept of a chip-oriented fluorimeter design. Using the analytical model, the current study analyzes the sensitivity and dynamic range of the fluorimeter to fit the requirements for detecting fluorescence in nano-liter volumes. Through the optimized processes, a real-time PCR on a chip system with only a one nano-liter volume test sample is as sensitive as the commercial real-time PCR machine using a sample with a twenty micro-liter volume. The signal to noise (S/N) ratio of the chip system for DNA quantification with hepatitis B virus (HBV) plasmid samples is 3 dB higher. DNA quantification by the miniature chip shows higher reproducibility compared to the commercial machine with respect to samples of initial concentrations from 10^3 to 10^5 copies per reaction.

  8. Quantification of Representative Ciguatoxins in the Pacific Using Quantitative Nuclear Magnetic Resonance Spectroscopy

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Kato

    2017-10-01

    Full Text Available The absolute quantification of five toxins involved in ciguatera fish poisoning (CFP) in the Pacific was carried out by quantitative 1H-NMR. The targeted toxins were ciguatoxin-1B (CTX1B), 52-epi-54-deoxyciguatoxin-1B (epideoxyCTX1B), ciguatoxin-3C (CTX3C), 51-hydroxyciguatoxin-3C (51OHCTX3C), and ciguatoxin-4A (CTX4A). We first calibrated the residual protons of pyridine-d5 using a certified reference material, 1,4-BTMSB-d4, prepared the toxin solutions with the calibrated pyridine-d5, measured the 1H-NMR spectra, and quantified the toxins using the calibrated residual protons as the internal standard. The absolute quantification was carried out by comparing the signal intensities between the selected protons of the target toxin and the residual protons of the calibrated pyridine-d5. The proton signals residing on the ciguatoxins (CTXs) to be used for quantification were carefully selected to be well separated from adjacent signals, including impurities, and to exhibit an effective intensity. To quantify CTX1B and its congeners, the olefin protons in the side chain were judged appropriate for use. The quantification was achievable with nano-molar solutions. The probable errors for uncertainty, calculated for the respective toxins, ranged between 3% and 16%. Contamination of the precious toxins with nonvolatile internal standards was thus avoided. After evaporation of the pyridine-d5, the calibrated CTXs were ready for use as the reference standard in the quantitative analysis of ciguatoxins by LC/MS.
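The qHNMR comparison described reduces to a ratio of integrated intensities scaled by the number of protons each signal represents and the standard's known concentration; the numbers below are illustrative, not the paper's.

```python
def qnmr_conc_mM(conc_std_mM, area_analyte, area_std, n_h_analyte, n_h_std):
    """Analyte concentration from the ratio of integrated signal intensities
    against an internal standard, scaled by the protons per signal."""
    return conc_std_mM * (area_analyte / area_std) * (n_h_std / n_h_analyte)

# Invented example: one olefinic side-chain proton of a CTX congener
# integrated against a 2-proton calibrated residual-solvent signal.
c = qnmr_conc_mM(conc_std_mM=0.10, area_analyte=0.45, area_std=1.0,
                 n_h_analyte=1, n_h_std=2)
print(f"{c * 1000:.0f} µM")
```

Because the calibrated residual solvent protons serve as the internal standard, no nonvolatile standard ever touches the toxin, which is the point the abstract makes.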

  9. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    Science.gov (United States)

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may present a lack of reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than with other lab-standard cell disruption methodologies, such as bead milling or cell permeabilization. This approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, non-complex approach developed to evaluate the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.

  10. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Then, two quantification indices, the variation rate and the progress rate, are defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution-set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  11. Direct quantification of fatty acids in wet microalgal and yeast biomass via a rapid in situ fatty acid methyl ester derivatization approach.

    Science.gov (United States)

    Dong, Tao; Yu, Liang; Gao, Difeng; Yu, Xiaochen; Miao, Chao; Zheng, Yubin; Lian, Jieni; Li, Tingting; Chen, Shulin

    2015-12-01

    Accurate determination of fatty acid contents is routinely required in microalgal and yeast biofuel studies. A method of rapid in situ fatty acid methyl ester (FAME) derivatization directly from wet fresh microalgal and yeast biomass was developed in this study. This method does not require prior solvent extraction or dehydration. FAMEs were prepared with a sequential alkaline hydrolysis (15 min at 85 °C) and acidic esterification (15 min at 85 °C) process. The resulting FAMEs were extracted into n-hexane and analyzed using gas chromatography. The effects of each processing parameter (temperature, reaction time, and water content) upon the lipids quantification in the alkaline hydrolysis step were evaluated with a full factorial design. This method could tolerate water content up to 20% (v/v) in total reaction volume, which equaled up to 1.2 mL of water in biomass slurry (with 0.05-25 mg of fatty acid). There were no significant differences in FAME quantification (p>0.05) between the standard AOAC 991.39 method and the proposed wet in situ FAME preparation method. This fatty acid quantification method is applicable to fresh wet biomass of a wide range of microalgae and yeast species.

  12. A probabilistic generative model for quantification of DNA modifications enables analysis of demethylation pathways.

    Science.gov (United States)

    Äijö, Tarmo; Huang, Yun; Mannerström, Henrik; Chavez, Lukas; Tsagaratou, Ageliki; Rao, Anjana; Lähdesmäki, Harri

    2016-03-14

    We present a generative model, Lux, to quantify DNA methylation modifications from any combination of bisulfite sequencing approaches, including reduced, oxidative, TET-assisted, chemical-modification assisted, and methylase-assisted bisulfite sequencing data. Lux models all cytosine modifications (C, 5mC, 5hmC, 5fC, and 5caC) simultaneously together with experimental parameters, including bisulfite conversion and oxidation efficiencies, as well as various chemical labeling and protection steps. We show that Lux improves the quantification and comparison of cytosine modification levels and that Lux can process any oxidized methylcytosine sequencing data sets to quantify all cytosine modifications. Analysis of targeted data from Tet2-knockdown embryonic stem cells and T cells during development demonstrates DNA modification quantification in unprecedented detail, quantifies active demethylation pathways, and reveals 5hmC localization in putative regulatory regions.

  13. Planar imaging quantification using 3D attenuation correction data and Monte Carlo simulated buildup factors

    International Nuclear Information System (INIS)

    Miller, C.; Filipow, L.; Jackson, S.; Riauka, T.

    1996-01-01

    A new method to correct for attenuation and the buildup of scatter in planar imaging quantification is presented. The method is based on the combined use of 3D density information provided by computed tomography to correct for attenuation and the application of Monte Carlo simulated buildup factors to correct for buildup in the projection pixels. CT and nuclear medicine images were obtained for a purpose-built nonhomogeneous phantom that models the human anatomy in the thoracic and abdominal regions. The CT transverse slices of the phantom were converted to a set of consecutive density maps. An algorithm was developed that projects the 3D information contained in the set of density maps to create opposing pairs of accurate 2D correction maps that were subsequently applied to planar images acquired from a dual-head gamma camera. A comparison of results obtained by the new method and the geometric mean approach based on published techniques is presented for some of the source arrangements used. Excellent results were obtained for various source-phantom configurations used to evaluate the method. Activity quantification of a line source at most locations in the nonhomogeneous phantom produced errors of less than 2%. Additionally, knowledge of the actual source depth is not required for accurate activity quantification. Quantification of volume sources placed in foam, Perspex and aluminium produced errors of less than 7% for the abdominal and thoracic configurations of the phantom. (author)
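For reference, the geometric-mean (conjugate-view) calculation that the new method is compared against can be sketched as follows; parameter names and numbers are illustrative, not taken from the study:

```python
import math

def conjugate_view_activity(counts_ant, counts_post, mu, thickness, sensitivity):
    """Conjugate-view (geometric mean) activity estimate from two opposing
    planar views. mu is an effective linear attenuation coefficient (1/cm),
    thickness the body thickness along the view axis (cm), and sensitivity
    a hypothetical camera calibration factor (counts per MBq)."""
    transmission = math.exp(-mu * thickness)            # whole-body transmission
    geometric_mean = math.sqrt(counts_ant * counts_post)
    return geometric_mean / (sensitivity * math.sqrt(transmission))

# a point source at any depth d gives the same estimate: the depth terms cancel
A_true, S, mu, L = 10.0, 1000.0, 0.15, 20.0
estimates = []
for d in (2.0, 10.0, 17.0):
    ia = A_true * S * math.exp(-mu * d)          # anterior view counts
    ip = A_true * S * math.exp(-mu * (L - d))    # posterior view counts
    estimates.append(conjugate_view_activity(ia, ip, mu, L, S))
```

For an ideal point source without scatter the geometric mean is depth independent; the paper's contribution is handling nonhomogeneous attenuation and scatter buildup, which this simple formula ignores.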

  14. Uncertainty quantification in Eulerian-Lagrangian models for particle-laden flows

    Science.gov (United States)

    Fountoulakis, Vasileios; Jacobs, Gustaaf; Udaykumar, Hs

    2017-11-01

    A common approach to ameliorate the computational burden in simulations of particle-laden flows is to use a point-particle based Eulerian-Lagrangian model, which traces individual particles in their Lagrangian frame and models particles as mathematical points. The particle motion is determined by the Stokes drag law, which is empirically corrected for Reynolds number, Mach number, and other parameters. The empirical corrections are subject to uncertainty. Treating them as random variables renders the coupled system of PDEs and ODEs stochastic. An approach to quantify the propagation of this parametric uncertainty to the particle solution variables is proposed. The approach is based on averaging of the governing equations and allows for estimation of the first moments of the quantities of interest. We demonstrate the feasibility of the proposed methodology for uncertainty quantification of particle-laden flows on one-dimensional linear and nonlinear Eulerian-Lagrangian systems. This research is supported by AFOSR under Grant FA9550-16-1-0008.
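The idea of propagating an uncertain drag correction to the particle solution can be mimicked with brute-force Monte Carlo on a single-particle model; the correction distribution and all numbers below are illustrative assumptions, not the paper's averaging method:

```python
import random
import statistics

def particle_velocity(u, tau, drag_corr, dt=1e-3, n_steps=1000):
    """Forward-Euler integration of the point-particle momentum equation
    dv/dt = drag_corr * (u - v) / tau, i.e. Stokes drag with an empirical
    correction factor treated here as uncertain."""
    v = 0.0
    for _ in range(n_steps):
        v += dt * drag_corr * (u - v) / tau
    return v

# sample the uncertain correction and propagate it through the solver
rng = random.Random(0)
u, tau = 1.0, 0.5
samples = [particle_velocity(u, tau, drag_corr=rng.gauss(1.0, 0.1))
           for _ in range(2000)]
mean_v = statistics.mean(samples)  # first moment of particle velocity at t = 1
std_v = statistics.stdev(samples)  # spread induced by the uncertain correction
```

The paper's averaged-equation approach targets exactly these first moments, but without the cost of thousands of forward solves.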

  15. Quantification of transformation products of rocket fuel unsymmetrical dimethylhydrazine in soils using SPME and GC-MS.

    Science.gov (United States)

    Bakaikina, Nadezhda V; Kenessov, Bulat; Ul'yanovskii, Nikolay V; Kosyakov, Dmitry S

    2018-07-01

    Determination of transformation products (TPs) of the rocket fuel unsymmetrical dimethylhydrazine (UDMH) in soil is highly important for the environmental impact assessment of launches of heavy space rockets from Kazakhstan, Russia, China and India. The method based on headspace solid-phase microextraction (HS SPME) and gas chromatography-mass spectrometry is advantageous over other known methods due to its greater simplicity and cost efficiency. However, accurate quantification of these analytes using HS SPME is limited by the matrix effect. In this research, we proposed using internal standard and standard addition calibrations to achieve a proper combination of accuracy and cost efficiency in the quantification of key TPs of UDMH. 1-Trideuteromethyl-1H-1,2,4-triazole (MTA-d3) was used as the internal standard. Internal standard calibration allowed controlling matrix effects during quantification of 1-methyl-1H-1,2,4-triazole (MTA), N,N-dimethylformamide (DMF), and N-nitrosodimethylamine (NDMA) in soils with humus content < 1%. Using SPME at 60 °C for 15 min with a 65 µm Carboxen/polydimethylsiloxane fiber, recoveries of MTA, DMF and NDMA for sandy and loamy soil samples were 91-117, 85-123 and 64-132%, respectively. For improving the method's accuracy and widening the range of analytes, standard addition and its combination with internal standard calibration were tested and compared on real soil samples. The combined calibration approach provided the greatest accuracy for NDMA, DMF, N-methylformamide, formamide, 1H-pyrazole and 3-methyl-1H-pyrazole. For determination of 1-formyl-2,2-dimethylhydrazine, 3,5-dimethylpyrazole, 2-ethyl-1H-imidazole, 1H-imidazole, 1H-1,2,4-triazole, pyrazines and pyridines, standard addition calibration is more suitable. The proposed approach and collected data allow using both calibrations simultaneously. Copyright © 2018 Elsevier B.V. All rights reserved.
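Standard addition calibration, one of the two calibration strategies compared in the record, can be sketched as follows (synthetic numbers, not data from the paper):

```python
def standard_addition(added, signals):
    """Ordinary least-squares line through (spiked amount, instrument signal);
    with a proportional response, the magnitude of the x-intercept estimates
    the analyte amount originally present in the sample."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signals) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(added, signals))
             / sum((x - mx) ** 2 for x in added))
    intercept = my - slope * mx
    return intercept / slope   # = |x-intercept| for a positive response

# synthetic sample containing 2.0 units of analyte, response factor 3.0
added = [0.0, 1.0, 2.0, 4.0]
signals = [3.0 * (2.0 + a) for a in added]
estimated = standard_addition(added, signals)
```

Because the calibration line is built inside the sample matrix itself, the matrix effect that limits plain external calibration largely cancels, which is why the abstract recommends it for the harder analytes.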

  16. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    Science.gov (United States)

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential not only to assess the acceptability of the waste in the final repository but also to control the sorting, volume-reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of the produced radionuclides either by experimental methods or by statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful for generating random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach to the uncertainty quantification (UQ) of the characterization process for waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. NEW MODEL FOR QUANTIFICATION OF ICT DEPENDABLE ORGANIZATIONS RESILIENCE

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2011-03-01

    Full Text Available Today's business environment demands highly reliable organizations in every segment to remain competitive in the global market. Besides that, the ICT sector is becoming irreplaceable in many fields of business, from communication to complex systems for process control and production. To fulfill those requirements and to develop further, many organizations worldwide are implementing a business paradigm called organizational resilience. Although resilience is a well-known term in many fields of science, it is not well studied due to its complex nature. This paper deals with developing a new model for the assessment and quantification of the resilience of ICT-dependable organizations.

  18. Real-time PCR Machine System Modeling and a Systematic Approach for the Robust Design of a Real-time PCR-on-a-Chip System

    Directory of Open Access Journals (Sweden)

    Da-Sheng Lee

    2010-01-01

    Full Text Available Chip-based DNA quantification systems are widespread and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, the proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.

  19. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, and a measure of k-coherence.

  20. A multiresolutional approach to fuzzy text meaning: A first attempt

    Energy Technology Data Exchange (ETDEWEB)

    Mehler, A.

    1996-12-31

    The present paper focuses on the connotative meaning aspect of language signs, especially above the level of words. In this context the view is taken that texts can be defined as a kind of supersign to which, in the same way as to other signs, a meaning can be assigned. A text can therefore be described as the result of a sign articulation which connects the material text sign with a corresponding meaning. The structural text meaning is constituted by a kind of semiotic composition principle, which leads to the emergence of interlocked levels of language units exhibiting different grades of resolution. Starting at the level of words and passing through the level of sentences, this principle finally reaches the level of texts by aggregating, step by step, the meaning of a unit at a higher level out of the meanings of all components one level below which occur within this unit. This article also elaborates the hypothesis that meaning constitution as a two-stage process, corresponding to the syntagmatic and paradigmatic restrictions of language elements among each other, holds equally at the level of texts. At the text level, this two-stage structure leads to the constitution of the connotative text meaning, whose constituents are determined at the word level by the syntagmatic and paradigmatic relations of the words. The text meaning representation is formalized with the help of fuzzy set theory.

  1. Rigid 3D-3D registration of TOF MRA integrating vessel segmentation for quantification of recurrence volumes after coiling cerebral aneurysm

    International Nuclear Information System (INIS)

    Saering, Dennis; Forkert, Nils Daniel; Fiehler, Jens; Ries, Thorsten

    2012-01-01

    A fast and reproducible quantification of the recurrence volume of coiled aneurysms is required to enable a more timely evaluation of new coils. This paper presents two registration schemes for the semi-automatic quantification of aneurysm recurrence volumes based on baseline and follow-up 3D TOF MRA datasets. The quantification of shape changes requires a prior definition of corresponding structures in both datasets. For this, two different rigid registration methods were developed and evaluated: besides a state-of-the-art rigid registration method, a second approach integrating vessel segmentations is presented. After registration, the aneurysm recurrence volume is calculated from the difference image. The computed volumes were compared to manually extracted volumes. An evaluation based on 20 TOF MRA datasets (baseline and follow-up) of ten patients showed that both registration schemes are generally capable of providing sufficient registration results. Regarding the quantification of aneurysm recurrence volumes, the results suggest that the second, segmentation-based registration method yields better results while also reducing computation and interaction time. The proposed registration scheme incorporating vessel segmentation thus enables an improved quantification of recurrence volumes of coiled aneurysms with reduced computation and interaction time. (orig.)
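Once the baseline and follow-up volumes are rigidly registered to a common grid, the recurrence-volume step reduces to thresholding the difference image and scaling by the voxel volume. This sketch assumes pre-registered intensity arrays and an illustrative noise threshold:

```python
def recurrence_volume(baseline, followup, voxel_volume_mm3, threshold):
    """Recurrence volume in mm^3: count voxels whose intensity difference
    (follow-up minus registered baseline) exceeds a threshold, then scale
    by the voxel volume. Assumes both volumes share one voxel grid."""
    grown = sum(1 for b, f in zip(baseline, followup) if f - b > threshold)
    return grown * voxel_volume_mm3

# toy 1D "volumes": three voxels brightened well above the noise threshold
baseline = [0.0] * 8
followup = [0.0, 0.0, 120.0, 140.0, 130.0, 0.0, 0.0, 5.0]
volume = recurrence_volume(baseline, followup,
                           voxel_volume_mm3=0.5, threshold=50.0)
```

The quality of this estimate hinges entirely on the registration: any residual misalignment shows up as spurious difference voxels, which is why the vessel-segmentation-guided registration matters.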

  2. Urinary Cell-Free DNA Quantification as Non-Invasive Biomarker in Patients with Bladder Cancer.

    Science.gov (United States)

    Brisuda, Antonin; Pazourkova, Eva; Soukup, Viktor; Horinek, Ales; Hrbáček, Jan; Capoun, Otakar; Svobodova, Iveta; Pospisilova, Sarka; Korabecna, Marie; Mares, Jaroslav; Hanuš, Tomáš; Babjuk, Marek

    2016-01-01

    The concentration of urinary cell-free DNA (ucfDNA) is a potential bladder cancer marker, but reported results are inconsistent due to the use of various non-standardised methodologies. The aim of the study was to standardise the methodology for ucfDNA quantification as a potential non-invasive tumour biomarker. In total, 66 patients and 34 controls were enrolled into the study. Volumes of each urine portion (V) were recorded and ucfDNA concentrations (c) were measured using real-time PCR. Total amounts (TA) of ucfDNA were calculated and compared between patients and controls, and the diagnostic accuracy of the TA of ucfDNA was determined. Calculation of the TA of ucfDNA in the second urine portion was the most appropriate approach to ucfDNA quantification, as there was a logarithmic dependence between the volume and the concentration of a urine portion (p = 0.0001). Using this methodology, we were able to discriminate between bladder cancer patients and subjects without bladder tumours (p = 0.0002) with an area under the ROC curve of 0.725. The positive and negative predictive values of the test were 90 and 45%, respectively. Quantification of ucfDNA according to our modified method could provide a potential non-invasive biomarker for the diagnosis of patients with bladder cancer. © 2015 S. Karger AG, Basel.
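The total-amount computation and the ROC-based discrimination can be sketched with synthetic numbers (the concentrations and volumes below are made up; only the formula TA = c × V and the standard ROC construction follow the abstract):

```python
def total_amount(concentration_ng_per_ml, volume_ml):
    """Total amount (TA) of urinary cell-free DNA in one urine portion (ng)."""
    return concentration_ng_per_ml * volume_ml

def roc_auc(positives, negatives):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random patient scores above a random control."""
    wins = sum((p > n) + 0.5 * (p == n) for p in positives for n in negatives)
    return wins / (len(positives) * len(negatives))

# made-up (c, V) pairs for the second urine portion
patients = [total_amount(c, v) for c, v in [(12.0, 40.0), (8.5, 55.0), (20.0, 30.0)]]
controls = [total_amount(c, v) for c, v in [(3.0, 50.0), (5.0, 45.0), (2.5, 60.0)]]
auc = roc_auc(patients, controls)
```

On these perfectly separated toy values the AUC is 1.0; the study's reported 0.725 reflects the overlap present in real patient data.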

  3. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chen, Xiao [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Iaccarino, Gianluca [Stanford Univ., CA (United States); Mittal, Akshay [Stanford Univ., CA (United States)

    2013-10-08

    In this project we proposed to develop an innovative uncertainty quantification methodology that captures the best of the two competing approaches in UQ, namely, the intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics, multi-module simulation model, in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.

  4. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    In clinical practice, PET/CT images are often analyzed qualitatively by visual comparison of tumor lesion and normal tissue uptake, and semi-quantitatively by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantification information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy to harmonize the quantification of PET/CT images performed with different scanner models and manufacturers. For this purpose, a survey of the technical characteristics of the equipment and the acquisition protocols of clinical images of different PET/CT services in the state of Rio Grande do Sul was conducted. For each scanner, the accuracy of SUV quantification and the Recovery Coefficient (RC) curves were determined, using the clinically relevant and available reconstruction parameters. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the most appropriate reconstruction parameters to harmonize SUV quantification in each scanner, either regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization through modification of the reconstruction parameters, and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies. Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
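The two quantities at the heart of such harmonization studies, the body-weight SUV and the recovery coefficient, are standard definitions and can be computed as follows (the example values are illustrative, not from the survey):

```python
def suv_bw(tissue_kbq_per_ml, injected_mbq, weight_kg):
    """Body-weight SUV = tissue activity concentration / (injected activity
    per gram of body), assuming 1 g/mL tissue density. With kBq/mL, MBq and
    kg, the unit factors cancel, leaving concentration * weight / injected."""
    return tissue_kbq_per_ml * weight_kg / injected_mbq

def recovery_coefficient(measured_kbq_per_ml, true_kbq_per_ml):
    """RC = measured / true activity concentration for a given object size;
    an RC of 1.38 corresponds to the 38% overestimation reported above."""
    return measured_kbq_per_ml / true_kbq_per_ml

# illustrative numbers: a 5 kBq/mL lesion, 370 MBq injected, 70 kg patient
suv = suv_bw(5.0, 370.0, 70.0)
rc = recovery_coefficient(6.9, 5.0)
```

Harmonization then amounts to choosing, per scanner, reconstruction parameters whose RC curves fall inside a common tolerance band, so that SUVs from different sites can be compared directly.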

  5. Histogram-Based Thresholding for Detection and Quantification of Hemorrhages in Retinal Images

    Directory of Open Access Journals (Sweden)

    Hussain Fadhel Hamdan Jaafar

    2016-12-01

    Full Text Available Retinal image analysis is commonly used for the detection and quantification of diabetic retinopathy. In retinal images, dark lesions, including hemorrhages and microaneurysms, are the earliest warnings of vision loss. In this paper, a new algorithm for the extraction and quantification of hemorrhages in fundus images is presented. Hemorrhage candidates are extracted in a preliminary coarse segmentation step, followed by a fine segmentation step. Local variation processes are applied in the coarse segmentation step to determine the boundaries of all candidates with distinct edges. The fine segmentation processes are based on histogram thresholding to extract real hemorrhages from the segmented candidates locally. The proposed method was trained and tested using an image dataset of 153 manually labeled retinal images. At the pixel level, the proposed method could identify abnormal retinal images with 90.7% sensitivity and 85.1% predictive value. These performance measurements suggest that the technique could be used for computer-aided mass screening of retinal diseases.
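As a concrete example of histogram-based thresholding, here is Otsu's classic method, which picks the gray-level cutoff maximizing the between-class variance. The paper's local thresholding procedure is not specified in the abstract, so this is a generic stand-in:

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: choose the gray level that maximizes the between-class
    variance of the two classes induced by the threshold."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    sum_b = 0.0
    w_b = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]                 # weight of the class at or below t
        if w_b == 0:
            continue
        w_f = total - w_b              # weight of the class above t
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        between = w_b * w_f * (m_b - m_f) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# bimodal toy patch: dark hemorrhage candidates at 10, retinal background at 200
pixels = [10] * 100 + [200] * 100
cutoff = otsu_threshold(pixels)
dark = [p for p in pixels if p <= cutoff]   # candidate hemorrhage pixels
```

Applying such a threshold per candidate region, rather than globally, matches the abstract's emphasis on extracting real hemorrhages "locally".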

  6. HPCE quantification of 5-methyl-2'-deoxycytidine in genomic DNA: methodological optimization for chestnut and other woody species.

    Science.gov (United States)

    Hasbún, Rodrigo; Valledor, Luís; Rodríguez, José L; Santamaria, Estrella; Ríos, Darcy; Sanchez, Manuel; Cañal, María J; Rodríguez, Roberto

    2008-01-01

    Quantification of deoxynucleosides using micellar high-performance capillary electrophoresis (HPCE) is an efficient, fast and inexpensive method of evaluating genomic DNA methylation. This approach has been demonstrated to be more sensitive and specific than other methods for the quantification of DNA methylation content. However, effective detection and quantification of 5-methyl-2'-deoxycytidine depend on the sample characteristics. Previous works have revealed that in most woody species, the quality and quantity of extracted RNA-free DNA suitable for HPCE analysis vary among species of the same genus, among tissues taken from the same tree, and within the same tissue depending on the season of the year. The aim of this work is to establish a quantification method for genomic DNA methylation that lends itself to use in different Castanea sativa Mill. materials, and in other angiosperm and gymnosperm woody species. Using a DNA extraction kit based on a silica membrane increased the resolutive capacity of the method. Under these conditions, different organs or tissues of angiosperms and gymnosperms can be analyzed, regardless of their state of development. We emphasize the importance of samples free of nucleosides, although, when they are present, the method still ensures the effective separation of deoxynucleosides and identification of 5-methyl-2'-deoxycytidine.
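The quantity ultimately reported by such HPCE assays is the global methylation level, a simple molar fraction computed from the (response-corrected) peak areas of the two deoxynucleosides:

```python
def methylation_percent(area_5mdC, area_dC):
    """Global DNA methylation as a molar percentage of total cytosines:
    %5mdC = 100 * 5mdC / (5mdC + dC), from corrected HPCE peak areas."""
    return 100.0 * area_5mdC / (area_5mdC + area_dC)

# example: methylated cytosine peak one third the size of the unmethylated one
level = methylation_percent(1.0, 3.0)
```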

  7. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with magnetic resonance; however, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same method of calculation on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  8. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, which is traditionally performed by hepatic biopsy. Many studies have demonstrated the possibility of performing hepatic iron quantification with magnetic resonance; however, a consensus has not yet been reached regarding the technique or the possibility of reproducing the same method of calculation on different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  9. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  10. Uncovering the underlying physical mechanisms of biological systems via quantification of landscape and flux

    International Nuclear Information System (INIS)

    Xu Li; Chu Xiakun; Yan Zhiqiang; Zheng Xiliang; Zhang Kun; Zhang Feng; Yan Han; Wu Wei; Wang Jin

    2016-01-01

    In this review, we explore the physical mechanisms of biological processes such as protein folding and recognition, ligand binding, and systems biology, including cell cycle, stem cell, cancer, evolution, ecology, and neural networks. Our approach is based on the landscape and flux theory for nonequilibrium dynamical systems. This theory provides a unifying principle and foundation for investigating the underlying mechanisms and physical quantification of biological systems. (topical review)

  11. Chemical Topic Modeling: Exploring Molecular Data Sets Using a Common Text-Mining Approach.

    Science.gov (United States)

    Schneider, Nadine; Fechner, Nikolas; Landrum, Gregory A; Stiefl, Nikolaus

    2017-08-28

    Big data is one of the key transformative factors that increasingly influence all aspects of modern life. Although this transformation brings vast opportunities, it also generates novel challenges, not the least of which is organizing and searching this data deluge. The field of medicinal chemistry is no different: more and more data are being generated, for instance, by technologies such as DNA-encoded libraries, peptide libraries, text mining of large literature corpora, and new in silico enumeration methods. Handling such huge sets of molecules effectively is quite challenging and requires compromises that often come at the expense of the interpretability of the results. In order to find an intuitive and meaningful approach to organizing large molecular data sets, we adopted a probabilistic framework called "topic modeling" from the text-mining field. Here we present the first chemistry-related implementation of this method, which allows large molecule sets to be assigned to "chemical topics" and the relationships between those to be investigated. In this first study, we thoroughly evaluate the novel method in different experiments and discuss both its disadvantages and advantages. We show very promising results in reproducing human-assigned concepts using the approach to identify and retrieve chemical series from sets of molecules. We have also created an intuitive visualization of the chemical topics output by the algorithm. This is a huge benefit compared to other unsupervised machine-learning methods, like clustering, which are commonly used to group sets of molecules. Finally, we applied the new method to the 1.6 million molecules of the ChEMBL22 data set to test its robustness and efficiency. In about 1 h we built a 100-topic model of this large data set in which we could identify interesting topics like "proteins", "DNA", or "steroids". Along with this publication we provide our data sets and an open-source implementation of the new method (CheTo) which
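Topic modeling is most commonly implemented as latent Dirichlet allocation (LDA); a minimal collapsed Gibbs sampler shows the mechanics, with molecules as "documents" of substructure-fragment "words". The fragment names are invented for illustration, and this is not the CheTo implementation:

```python
import random

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for latent Dirichlet allocation.
    docs: list of token lists; here a 'document' is a molecule and a
    'word' is a substructure fragment identifier."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)
    wid = {w: i for i, w in enumerate(vocab)}
    z = []                                      # topic of every token
    ndk = [[0] * n_topics for _ in docs]        # document-topic counts
    nkw = [[0] * V for _ in range(n_topics)]    # topic-word counts
    nk = [0] * n_topics                         # tokens per topic
    for d, doc in enumerate(docs):              # random initialization
        zs = []
        for w in doc:
            t = rng.randrange(n_topics)
            zs.append(t)
            ndk[d][t] += 1
            nkw[t][wid[w]] += 1
            nk[t] += 1
        z.append(zs)
    for _ in range(n_iter):                     # resample every token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                ndk[d][t] -= 1; nkw[t][wid[w]] -= 1; nk[t] -= 1
                weights = [(ndk[d][k] + alpha) * (nkw[k][wid[w]] + beta)
                           / (nk[k] + V * beta) for k in range(n_topics)]
                t = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][wid[w]] += 1; nk[t] += 1
    return ndk

# toy corpus: two "aromatic" molecules and two "sugar-like" molecules
docs = [["phenyl", "amide", "phenyl"], ["phenyl", "amide"],
        ["pyranose", "hydroxyl", "pyranose"], ["pyranose", "hydroxyl"]]
doc_topics = lda_gibbs(docs, n_topics=2)
```

Each row of `doc_topics` gives a molecule's soft assignment over topics, which is exactly the interpretable "molecule belongs to chemical topic" output the abstract highlights over hard clustering.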

  12. Electrochemical Quantification of the Antioxidant Capacity of Medicinal Plants Using Biosensors

    Directory of Open Access Journals (Sweden)

    Erika Rodríguez-Sevilla

    2014-08-01

    Full Text Available The working area of a screen-printed electrode (SPE) was modified with the enzyme tyrosinase (Tyr) using different immobilization methods, namely entrapment with water-soluble polyvinyl alcohol (PVA), cross-linking using glutaraldehyde (GA), and cross-linking using GA and human serum albumin (HSA); the resulting electrodes were termed SPE/Tyr/PVA, SPE/Tyr/GA and SPE/Tyr/HSA/GA, respectively. These biosensors were characterized by means of amperometry and EIS techniques. From the amperometric measurements, the apparent Michaelis-Menten constant, Km′, of each biosensor was determined, while the respective charge transfer resistance, Rct, was assessed from impedance measurements. It was found that the SPE/Tyr/GA had the smallest Km′ (57 ± 7 µM) and Rct values. This electrode also displayed the lowest detection and quantification limits for catechol quantification. Using the SPE/Tyr/GA, the Trolox Equivalent Antioxidant Capacity (TEAC) was determined from infusions prepared with "mirto" (Salvia microphylla), "hierba dulce" (Lippia dulcis) and "salve real" (Lippia alba), medicinal plants commonly used in Mexico.
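The apparent Michaelis-Menten constant is typically extracted from amperometric calibration data; a double-reciprocal (Lineweaver-Burk) fit is one common way, sketched here with noise-free synthetic data generated to echo the reported Km′ of 57 µM:

```python
def michaelis_menten_fit(substrate, current):
    """Apparent Km and Imax from a Lineweaver-Burk (double-reciprocal) fit:
    1/I = (Km/Imax)*(1/S) + 1/Imax, solved by ordinary least squares."""
    xs = [1.0 / s for s in substrate]
    ys = [1.0 / i for i in current]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    i_max = 1.0 / intercept
    km = slope * i_max
    return km, i_max

# synthetic calibration: Km = 57 µM, Imax = 1.0 (arbitrary current units)
S = [10.0, 25.0, 50.0, 100.0, 200.0, 400.0]
I = [s / (57.0 + s) for s in S]
km_app, i_max = michaelis_menten_fit(S, I)
```

With real, noisy currents a direct nonlinear fit of I = Imax·S/(Km + S) is usually preferred, since the reciprocal transform amplifies error at low substrate concentrations.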

  13. Zum Bildungspotenzial biblischer Texte [On the Educational Potential of Biblical Texts]

    Directory of Open Access Journals (Sweden)

    Theis, Joachim

    2017-11-01

    Full Text Available Biblical education as a holistic process goes far beyond biblical learning. It must be understood as a lifelong process in which both biblical texts and their readers operate, each appropriating its counterpart in a dialogical way. Neither is the recipient's horizon of understanding an empty room that merely has to be filled with the text, nor is the text dead material that can only be examined cognitively. The recipient discovers the meaning of the biblical text by recomposing it through existential appropriation; the text is thus brought to life in each individual reality. Scientific insights, subjective structures and the community of readers must all be included to avoid potential one-sidedness. Unfortunately, a particular negative association often obscures the approach to the Bible: biblical work in religious education still appears in a cognitively oriented guise that respects neither the vitality and sovereignty of the biblical texts nor the students' desire for meaning. Moreover, the Bible is misused for teaching moral terms or pontifications. Such pitfalls can be avoided by a biblical didactics of empowerment. Respecting the sovereignty of biblical texts, such didactics assist the reader's individuation by opening the texts with a focus on the reader's otherness. Thus both the text and the recipient become subjects in a dialogue. This Biblical-Enabling-Didactics lets the Bible become, ever anew, a book of life. Understood from within this hermeneutics, empowerment didactics could be raised to the principle of biblical didactics in general and become an essential element of holistic education.

  14. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. Quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragmentary analysis. Differences between groups were tested using a paired t-test. Methods that measure total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragmentary analysis, which also allows DNA quantification of the desired length and is not affected by PCR inhibitors. Methods based on the quantification of the total amount of DNA in samples are unsuitable for ancient samples as they overestimate the amount of DNA, presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also fragment length.
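The paired t-test used above to compare quantification methods can be sketched in a few lines: the statistic and degrees of freedom come from per-sample differences. The concentration values below are hypothetical, not the study's data.

```python
import math

def paired_t_test(a, b):
    """Paired t statistic and degrees of freedom.

    Minimal stdlib sketch; scipy.stats.ttest_rel would also return the p-value.
    """
    assert len(a) == len(b) and len(a) > 1
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Hypothetical concentrations (ng/uL) of the same extracts measured by a
# total-DNA method (overestimates; microbial DNA included) and by qPCR
# (underestimates damaged aDNA).
total_dna = [12.1, 8.4, 15.3, 9.9, 11.0, 7.6]
qpcr = [2.3, 1.1, 3.0, 1.8, 2.2, 0.9]
t, df = paired_t_test(total_dna, qpcr)
```

A large positive t with df = n − 1 pairs indicates a systematic difference between the two measurement methods.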

  15. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
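Quantification against a standard curve, as described above, amounts to fitting Ct against log10 copy number for a dilution series, checking the amplification efficiency via the slope, and inverting the fit for unknowns. A minimal sketch with hypothetical standards (a slope of −3.32 corresponds to perfect doubling per cycle):

```python
import math

def fit_standard_curve(log10_copies, ct_values):
    """Least-squares fit Ct = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    return slope, my - slope * mx

def pcr_efficiency(slope):
    """Amplification efficiency; 1.0 means perfect doubling per cycle."""
    return 10 ** (-1.0 / slope) - 1

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate copy number in an unknown."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series of a reference standard (10^2 to 10^6 copies).
log10_copies = [2, 3, 4, 5, 6]
ct_values = [31.4, 28.1, 24.8, 21.5, 18.2]
slope, intercept = fit_standard_curve(log10_copies, ct_values)
efficiency = pcr_efficiency(slope)
unknown_copies = quantify(24.8, slope, intercept)
```

Comparing the slope (and thus efficiency) fitted for a sample matrix with that of the reference material is exactly the similarity check the abstract calls a prerequisite for exact quantification.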

  16. DataToText: A Consumer-Oriented Approach to Data Analysis

    Science.gov (United States)

    Kenny, David A.

    2010-01-01

DataToText is a project in which the user communicates the relevant information for an analysis and a DataToText computer routine produces text output that describes, in words, tables, and figures, the results of the analyses. Two extended examples are given: one of a moderator analysis and the other of a dyadic data…

  17. Polynomial Chaos Expansion Approach to Interest Rate Models

    Directory of Open Access Journals (Sweden)

    Luca Di Persio

    2015-01-01

    Full Text Available The Polynomial Chaos Expansion (PCE technique allows us to recover a finite second-order random variable exploiting suitable linear combinations of orthogonal polynomials which are functions of a given stochastic quantity ξ, hence acting as a kind of random basis. The PCE methodology has been developed as a mathematically rigorous Uncertainty Quantification (UQ method which aims at providing reliable numerical estimates for some uncertain physical quantities defining the dynamic of certain engineering models and their related simulations. In the present paper, we use the PCE approach in order to analyze some equity and interest rate models. In particular, we take into consideration those models which are based on, for example, the Geometric Brownian Motion, the Vasicek model, and the CIR model. We present theoretical as well as related concrete numerical approximation results considering, without loss of generality, the one-dimensional case. We also provide both an efficiency study and an accuracy study of our approach by comparing its outputs with the ones obtained adopting the Monte Carlo approach, both in its standard and its enhanced version.
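The core PCE idea for a Gaussian input can be illustrated with the exponential of a normal variable, the building block of Geometric Brownian Motion, whose Hermite coefficients are known in closed form; the surrogate's zeroth coefficient gives the mean directly, which can then be checked against Monte Carlo. A stdlib sketch, not the paper's implementation:

```python
import math
import random

def hermite(n, x):
    """Probabilists' Hermite polynomial He_n via the standard recurrence
    He_{k+1}(x) = x He_k(x) - k He_{k-1}(x)."""
    h0, h1 = 1.0, x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def pce_coeffs_exp(a, order):
    """Exact PCE coefficients of exp(a * xi), xi ~ N(0, 1):
    c_n = exp(a**2 / 2) * a**n / n!"""
    return [math.exp(a * a / 2) * a ** n / math.factorial(n)
            for n in range(order + 1)]

def pce_eval(coeffs, x):
    """Evaluate the truncated expansion at a realization x of xi."""
    return sum(c * hermite(n, x) for n, c in enumerate(coeffs))

a = 0.3
coeffs = pce_coeffs_exp(a, order=6)
pce_mean = coeffs[0]          # mean = zeroth coefficient, no sampling needed
exact_mean = math.exp(a * a / 2)

# Monte Carlo estimate of the same mean, for comparison.
random.seed(1)
mc_mean = sum(math.exp(a * random.gauss(0, 1)) for _ in range(100_000)) / 100_000
```

The PCE mean is exact by construction, while the Monte Carlo estimate converges slowly (error of order one over the square root of the sample size), which is the efficiency contrast the abstract studies.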

  18. Unilateral condylar hyperplasia: a 3-dimensional quantification of asymmetry.

    Directory of Open Access Journals (Sweden)

    Tim J Verhoeven

Full Text Available PURPOSE: Objective quantifications of facial asymmetry in patients with Unilateral Condylar Hyperplasia (UCH) have not yet been described in the literature. The aim of this study was to objectively quantify soft-tissue asymmetry in patients with UCH and to compare the findings with a control group using a new method. MATERIAL AND METHODS: Thirty 3D photographs of patients diagnosed with UCH were compared with 30 3D photographs of healthy controls. As UCH presents particularly in the mandible, a new method was used to isolate the lower part of the face to evaluate asymmetry of this part separately. The new method was validated by two observers using 3D photographs of five patients and five controls. RESULTS: A significant difference (0.79 mm) in whole-face asymmetry was found between patients and controls. Intra- and inter-observer differences of 0.011 mm (-0.034 to 0.011) and 0.017 mm (-0.007 to 0.042), respectively, were found. These differences are irrelevant in clinical practice. CONCLUSION: After objective quantification, a significant difference was identified in soft-tissue asymmetry between patients with UCH and controls. The method used to isolate mandibular asymmetry was found to be valid and a suitable tool to evaluate facial asymmetry.

  19. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.

  20. Effects of pyruvate dose on in vivo metabolism and quantification of hyperpolarized 13C spectra

    DEFF Research Database (Denmark)

    Janich, M. A.; Menzel, M. I.; Wiesinger, F.

    2012-01-01

Real‐time in vivo measurements of metabolites are performed by signal enhancement of [1‐13C]pyruvate using dynamic nuclear polarization, rapid dissolution and intravenous injection, acquisition of free induction decay signals and subsequent quantification of spectra. The commonly injected dose... uptake and metabolic conversion. The goal of this study was to examine the effects of a [1‐13C]pyruvate bolus on metabolic conversion in vivo. Spectra were quantified by three different methods: frequency‐domain fitting with LCModel, time‐domain fitting with AMARES and simple linear least‐squares fitting... in the time domain. Since the simple linear least‐squares approach showed bleeding artifacts and LCModel produced noisier time signals, AMARES performed best in the quantification of in vivo hyperpolarized pyruvate spectra. We examined pyruvate doses of 0.1–0.4 mmol/kg (body mass) in male Wistar rats...

  1. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  2. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    Science.gov (United States)

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample, reagent, and sequencing costs can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677
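Turning smMIP molecule counts into differential-expression estimates can be sketched as library-size normalization followed by log2 fold changes; the gene names and counts below are hypothetical, and the paper's actual statistical model may differ.

```python
import math

def log2_fold_changes(counts_a, counts_b, pseudocount=0.5):
    """Library-size-normalized log2 fold changes from molecule counts.

    counts_a / counts_b map gene -> unique molecule count in each
    condition; a pseudocount stabilizes genes with few molecules.
    """
    size_a = sum(counts_a.values())
    size_b = sum(counts_b.values())
    return {
        gene: math.log2(((counts_a[gene] + pseudocount) / size_a)
                        / ((counts_b[gene] + pseudocount) / size_b))
        for gene in counts_a
    }

# Hypothetical molecule counts for a small smMIP panel in two conditions.
condition_a = {"GAPDH": 5000, "TP53": 800, "MYC": 100}
condition_b = {"GAPDH": 2500, "TP53": 400, "MYC": 200}
lfc = log2_fold_changes(condition_a, condition_b)
```

Here the stable genes show near-zero log2 fold change despite the different library sizes, while the genuinely down-regulated gene stands out.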

  3. A simple method of digitizing analog scintigrams for quantification and digital archiving

    International Nuclear Information System (INIS)

    Schramm, M.; Kaempfer, B.; Wolf, H.; Clausen, M.; Wendhausen, H.; Henze, E.

    1993-01-01

This study was undertaken to evaluate a quick, reliable and cheap method of digitizing analog scintigrams. 40 whole-body bone scintigrams were obtained simultaneously in analog and genuine digital format. The analog scans on X-ray film were then digitized secondarily by three different methods: 300 dpi flatbed scanning, high-resolution camera scanning and camcorder recording. A simple exposure approach using a light box, a cheap camcorder, a PC and image grabber hard- and software proved to be optimal. Visual interpretation showed no differences in clinical findings when comparing the analog images with their secondarily digitized counterparts. To test the possibility of quantification, 126 equivalent ROIs were drawn both in the genuine digital and the secondarily digitized images. Comparing the ROI count to whole-body count percentage of the corresponding ROIs showed the correlation to be linear. The evaluation of phantom studies showed the linear correlation to be true within a wide activity range. Thus, secondary digitalization of analog scintigrams is an easy, cheap and reliable method of archiving images and allows secondary digital quantification. (orig.) [de]

  4. [A simple method of digitizing analog scintigrams for quantification and digital archiving].

    Science.gov (United States)

    Schramm, M; Kämpfer, B; Wolf, H; Clausen, M; Wendhausen, H; Henze, E

    1993-02-01

    This study was undertaken to evaluate a quick, reliable and cheap method of digitizing analog scintigrams. 40 whole-body bone scintigrams were obtained simultaneously in analog and genuine digital format. The analog scans on x-ray film were then digitized secondarily by three different methods: 300 dpi flat-bed scanning, high-resolution camera scanning and camcorder recording. A simple exposure approach using a light box, a cheap camcorder, a PC and image grabber hard- and software proved to be optimal. Visual interpretation showed no differences in clinical findings when comparing the analog images with their secondarily digitized counterparts. To test the possibility of quantification, 126 equivalent ROIs were drawn both in the genuine digital and the secondarily digitized images. Comparing the ROI count to whole-body count percentage of the corresponding ROIs showed the correlation to be linear. The evaluation of phantom studies showed the linear correlation to be true within a wide activity range. Thus, secondary digitalization of analog scintigrams is an easy, cheap and reliable method of archiving images and allows secondary digital quantification.
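The linearity check described above, comparing ROI count as a percentage of whole-body count between the genuine digital image and its secondarily digitized counterpart, can be sketched with a Pearson correlation; the counts below are invented for illustration.

```python
import math

def roi_percentages(roi_counts, whole_body_count):
    """ROI counts expressed as percentages of the whole-body count."""
    return [100.0 * c / whole_body_count for c in roi_counts]

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Invented counts for the same ROIs in the genuine digital image and in
# its secondarily digitized counterpart.
digital = roi_percentages([1200, 450, 3300, 800, 150], 60000)
digitized = roi_percentages([1180, 470, 3250, 820, 160], 59000)
r = pearson_r(digital, digitized)
```

An r close to 1 over many ROIs and a wide activity range is what justifies quantifying on the digitized copies.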

  5. Label-free quantification of Tacrolimus in biological samples by atomic force microscopy

    International Nuclear Information System (INIS)

    Menotta, Michele; Biagiotti, Sara; Streppa, Laura; Rossi, Luigia; Magnani, Mauro

    2015-01-01

Highlights: • Tacrolimus is a potent immunosuppressant drug that has to be continually monitored. • We present an atomic force microscope approach for quantification of Tacrolimus in blood samples. • Detection and quantification have been successfully achieved. - Abstract: In the present paper we describe an atomic force microscopy (AFM)-based method for the quantitative analysis of FK506 (Tacrolimus) in whole blood (WB) samples. Current reference methods used to quantify this immunosuppressive drug are based on mass spectrometry. In addition, an immunoenzymatic assay (ELISA) has been developed and is widely used in the clinic, even though it shows a small but consistent overestimation of the actual drug concentration when compared with the mass spectrometry method. The AFM biosensor presented herein utilises the endogenous drug receptor, FKBP12, to quantify Tacrolimus levels. The biosensor was first assayed to detect the free drug in solution, and subsequently used for the detection of Tacrolimus in blood samples. The sensor was suitable to generate a dose–response curve in the full range of clinical drug monitoring. A comparison with the clinically tested ELISA assay is also reported.

  6. Label-free quantification of Tacrolimus in biological samples by atomic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Menotta, Michele, E-mail: michele.menotta@uniurb.it [Department of Biomolecular Sciences, University of Urbino “Carlo Bo” via Saffi 2, Urbino (Italy); Biagiotti, Sara [Department of Biomolecular Sciences, University of Urbino “Carlo Bo” via Saffi 2, Urbino (Italy); Streppa, Laura [Physics Laboratory, CNRS-ENS, UMR 5672, Lyon (France); Cell and Molecular Biology Laboratory, CNRS-ENS Lyon, UMR 5239, IFR128, Lyon (France); Rossi, Luigia; Magnani, Mauro [Department of Biomolecular Sciences, University of Urbino “Carlo Bo” via Saffi 2, Urbino (Italy)

    2015-07-16

Highlights: • Tacrolimus is a potent immunosuppressant drug that has to be continually monitored. • We present an atomic force microscope approach for quantification of Tacrolimus in blood samples. • Detection and quantification have been successfully achieved. - Abstract: In the present paper we describe an atomic force microscopy (AFM)-based method for the quantitative analysis of FK506 (Tacrolimus) in whole blood (WB) samples. Current reference methods used to quantify this immunosuppressive drug are based on mass spectrometry. In addition, an immunoenzymatic assay (ELISA) has been developed and is widely used in the clinic, even though it shows a small but consistent overestimation of the actual drug concentration when compared with the mass spectrometry method. The AFM biosensor presented herein utilises the endogenous drug receptor, FKBP12, to quantify Tacrolimus levels. The biosensor was first assayed to detect the free drug in solution, and subsequently used for the detection of Tacrolimus in blood samples. The sensor was suitable to generate a dose–response curve in the full range of clinical drug monitoring. A comparison with the clinically tested ELISA assay is also reported.

  7. Quantification of genetically modified soya using strong anion exchange chromatography and time-of-flight mass spectrometry.

    Science.gov (United States)

    Chang, Po-Chih; Reddy, P Muralidhar; Ho, Yen-Peng

    2014-09-01

Stable-isotope dimethyl labeling was applied to the quantification of genetically modified (GM) soya. The herbicide-resistant gene-related protein 5-enolpyruvylshikimate-3-phosphate synthase (CP4 EPSPS) was labeled using a dimethyl labeling reagent, formaldehyde-H2 or -D2. The identification and quantification of CP4 EPSPS was performed using matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). The CP4 EPSPS protein was separated from high abundance proteins using strong anion exchange chromatography and sodium dodecyl sulfate-polyacrylamide gel electrophoresis. Then, the tryptic peptides from the samples and reference were labeled with formaldehyde-H2 and formaldehyde-D2, respectively. The two labeled pools were mixed and analyzed using MALDI-MS. The data showed a good correlation between the peak ratio of the H- and D-labeled peptides and the GM soya percentages at 0.5, 1, 3, and 5 %, with R² of 0.99. The labeling reagents are readily available. The labeling experiments and the detection procedures are simple. The approach is useful for the quantification of GM soya at a level as low as 0.5 %.

  8. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied to assess the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of model prediction.
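The adjustment-factor idea can be sketched as posterior model weights computed from data-model discrepancies, followed by a mixture ("adjusted") prediction whose spread reflects model uncertainty. A toy scalar example, not the paper's vibration system:

```python
import math

def posterior_model_weights(predictions, observation, sigma, priors):
    """Posterior model probabilities from one measured response,
    assuming a Gaussian likelihood around each model's prediction."""
    likes = [p * math.exp(-0.5 * ((observation - m) / sigma) ** 2)
             for m, p in zip(predictions, priors)]
    z = sum(likes)
    return [l / z for l in likes]

def adjusted_prediction(new_preds, weights):
    """Mixture mean and variance over the competing models for a new
    response quantity; the variance carries the model uncertainty."""
    mean = sum(w * p for w, p in zip(weights, new_preds))
    var = sum(w * (p - mean) ** 2 for w, p in zip(weights, new_preds))
    return mean, var

# Three competing models: their predictions of a measured calibration
# response, then their predictions of a new quantity (numbers invented).
weights = posterior_model_weights([9.0, 10.2, 12.5], observation=10.0,
                                  sigma=0.5, priors=[1 / 3, 1 / 3, 1 / 3])
mean, var = adjusted_prediction([100.0, 103.0, 110.0], weights)
```

The mixture variance is what widens into a confidence band when propagated over a range of response quantities.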

  9. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  10. Quantification of Sunscreen Ethylhexyl Triazone in Topical Skin-Care Products by Normal-Phase TLC/Densitometry

    Directory of Open Access Journals (Sweden)

    Anna W. Sobanska

    2012-01-01

Full Text Available Ethylhexyl triazone (ET) was separated from other sunscreens such as avobenzone, octocrylene, octyl methoxycinnamate, and diethylamino hydroxybenzoyl hexyl benzoate and from parabens by normal-phase HPTLC on silica gel 60 as stationary phase. Two mobile phases were particularly effective: (A) cyclohexane-diethyl ether 1:1 (v/v) and (B) cyclohexane-diethyl ether-acetone 15:1:2 (v/v/v), since apart from ET analysis they facilitated separation and quantification of other sunscreens present in the formulations. Densitometric scanning was performed at 300 nm. Calibration curves for ET were nonlinear (second-degree polynomials), with R > 0.998. For both mobile phases limits of detection (LOD) were 0.03 and limits of quantification (LOQ) 0.1 μg spot−1. Both methods were validated.
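A second-degree polynomial calibration like the one reported can be fitted by quadratic least squares via the normal equations; the peak-area data below are synthetic and chosen to lie exactly on a quadratic so the fit is easy to check.

```python
def quadratic_fit(x, y):
    """Least-squares fit y = a*x**2 + b*x + c via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    n = len(x)
    sx = sum(x)
    sx2 = sum(v ** 2 for v in x)
    sx3 = sum(v ** 3 for v in x)
    sx4 = sum(v ** 4 for v in x)
    sy = sum(y)
    sxy = sum(u * v for u, v in zip(x, y))
    sx2y = sum(u * u * v for u, v in zip(x, y))
    m = [[sx4, sx3, sx2, sx2y],   # augmented normal-equation matrix
         [sx3, sx2, sx, sxy],
         [sx2, sx, float(n), sy]]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    coeffs = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):   # back substitution
        coeffs[r] = (m[r][3] - sum(m[r][c] * coeffs[c]
                                   for c in range(r + 1, 3))) / m[r][r]
    return coeffs  # a, b, c

# Synthetic calibration: amounts (ug per spot) and peak areas lying
# exactly on y = -100 x**2 + 2000 x + 50.
amounts = [0.1, 0.2, 0.4, 0.8, 1.6]
areas = [-100 * x ** 2 + 2000 * x + 50 for x in amounts]
a, b, c = quadratic_fit(amounts, areas)
```

With real densitometric data the fit is not exact, and the quality of the calibration is judged by the correlation coefficient, as in the abstract.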

  11. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

descriptive trends are sufficient or an understanding of drivers and causes is needed. While there are certainly similar needs across uses and users, the necessary methods, data, and models for quantifying GHGs may vary. Common challenges for quantification noted in an informal survey of users of GHG information by Olander et al (2013) include the following. 3.1. Need for user-friendly methods that work across scales, regions, and systems Much of the data gathered and models developed by the research community provide high confidence in data or indicators computed at one place or for one issue, thus they are relevant for only specific uses, not transparent, or not comparable. These research approaches need to be translated to practitioners through the development of farmer-friendly, transparent, comparable, and broadly applicable methods. Many users noted the need for quantification data and methods that work and are accurate across regions and scales. One of the interviewed users, Charlotte Streck, summed it up nicely: 'A priority would be to produce comparable datasets for agricultural GHG emissions of particular agricultural practices for a broad set of countries ... with a gradual increase in accuracy'. 3.2. Need for lower cost, feasible approaches Concerns about cost and complexity of existing quantification methods were raised by a number of users interviewed in the survey. In the field it is difficult to measure changes in GHGs from agricultural management due to spatial and temporal variability, and the scale of the management-induced changes relative to background pools and fluxes. Many users noted data gaps and inconsistencies and insufficient technical capacity and infrastructure to generate necessary information, particularly in developing countries. The need for creative approaches for data collection and analysis, such as crowd sourcing and mobile technology, was noted. 3.3. Need for methods that can crosswalk between emission-reduction strategy and inventories

  12. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
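The linear Bayesian update compared above has the familiar conditional-expectation (Kalman-gain) form x_post = x_prior + K(y_obs − y_pred) with K = cov(x, y)/var(y). The paper computes this sampling-free via spectral representations; the sketch below instead estimates the covariances from a prior ensemble, purely for illustration.

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    """Sample covariance (variance when u is v)."""
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

random.seed(0)
N = 50_000

# Prior ensemble for an uncertain parameter x, pushed through a noisy
# linear forward model y = 3x + e (all numbers invented).
x_prior = [random.gauss(2.0, 1.0) for _ in range(N)]
y_pred = [3.0 * x + random.gauss(0.0, 0.5) for x in x_prior]

y_obs = 7.5
K = cov(x_prior, y_pred) / cov(y_pred, y_pred)  # linear "Kalman" gain
x_post = [x + K * (y_obs - y) for x, y in zip(x_prior, y_pred)]
```

The update shifts the ensemble mean toward the value consistent with the observation and shrinks its variance; the quadratic update adds a second-order term in the innovation.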

  13. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  14. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  15. Assessment of Credit Risk Approaches in Relation with Competitiveness Increase of the Banking Sector

    Directory of Open Access Journals (Sweden)

    Cipovová Eva

    2012-06-01

Full Text Available The article is focused on the presentation and analysis of selected methods of credit risk management in relation to increasing the competitiveness of the banking sector. It successively defines the credit risk approaches available under Basel III. The aim of this contribution is to present various methods of credit risk management and the effects of their usage on the amount of regulatory capital for corporate exposures. An optimal amount of equity in relation to the risk of the portfolio is an essential prerequisite for the performance and competitiveness growth of commercial banks. Capital requirements were quantified step by step under the Standardized Approach and the Internal Ratings-Based Approach, both with and without credit risk mitigation techniques. We presume that the more sophisticated approach yields significant capital savings for the bank, which also increases the competitiveness of the banking sector. Within the article, capital savings are quantified for a selected credit portfolio under the Standardized Approach (with and without assigned external ratings) and the Foundation Internal Ratings-Based Approach.
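Under the Standardized Approach, the capital requirement is exposure × risk weight × 8 %, with corporate risk weights taken from the external-rating table. The sketch below uses the Basel II corporate risk-weight buckets and ignores credit risk mitigation, so the numbers only illustrate the rated-versus-unrated saving discussed above.

```python
# Simplified Basel II standardized-approach risk weights for corporate
# exposures by external rating; credit risk mitigation is ignored.
CORPORATE_RISK_WEIGHTS = {
    "AAA to AA-": 0.20,
    "A+ to A-": 0.50,
    "BBB+ to BB-": 1.00,
    "Below BB-": 1.50,
    "Unrated": 1.00,
}
CAPITAL_RATIO = 0.08  # minimum capital as a share of risk-weighted assets

def capital_requirement(exposure, rating):
    """Regulatory capital for a corporate exposure under the
    Standardized Approach."""
    rwa = exposure * CORPORATE_RISK_WEIGHTS[rating]  # risk-weighted assets
    return rwa * CAPITAL_RATIO

# A highly rated exposure ties up far less capital than an unrated one.
rated = capital_requirement(1_000_000, "AAA to AA-")
unrated = capital_requirement(1_000_000, "Unrated")
saving = unrated - rated
```

Under the IRB approaches the risk weight is instead derived from internal estimates of default probability, which is where the additional savings the article quantifies come from.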

16. QUANTIFICATION OF ANGIOGENESIS IN THE CHICKEN CHORIOALLANTOIC MEMBRANE (CAM)

    Directory of Open Access Journals (Sweden)

    Silvia Blacher

    2011-05-01

Full Text Available The chick chorioallantoic membrane (CAM) provides a suitable in vivo model to study angiogenesis and evaluate several pro- and anti-angiogenic factors and compounds. In the present work, new developments in image analysis are used to quantify the CAM angiogenic response from optical microscopic observations, covering all vascular components, from the large supplying and feeding vessels down to the capillary plexus. To validate our methodology, angiogenesis is quantified during two phases of CAM development (days 7 and 13) and after treatment with an antiangiogenic modulator. Our morphometric analysis emphasizes that an accurate quantification of the CAM vasculature needs to be performed at various scales.

17. Interword and intraword pause threshold in the writing of texts by children and adolescents: a methodological approach

    Directory of Open Access Journals (Sweden)

Florence Chenu

    2014-03-01

Full Text Available Writing words in real life involves setting objectives, imagining a recipient, translating ideas into linguistic forms, managing grapho-motor gestures, etc. Understanding writing requires observation of the processes as they occur in real time. Analysis of pauses is one of the preferred methods for accessing the dynamics of writing and is based on the idea that pauses are behavioral correlates of cognitive processes. However, there is a need to clarify what we are observing when studying pause phenomena, as we will argue in the first section. This taken into account, the study of pause phenomena can be considered following two approaches. A first approach, driven by temporality, would define a threshold and observe where pauses, i.e. episodes of scriptural inactivity, occur. A second approach, linguistically driven, would define structural units and look for scriptural inactivity at the boundaries of these units or within these units. Taking a temporally driven approach, we present two methods which aim at the automatic identification of scriptural inactivity which is most likely not attributable to grapho-motor management in texts written by children and adolescents using digitizing tablets in association with Eye and Pen© (Chesnet & Alamargot, 2005). The first method is purely statistical and is based on the idea that the distribution of pauses exhibits different Gaussian components, each of them corresponding to a different type of pause. After having reviewed the limits of this statistical method, we present a second method based on writing dynamics which attempts to identify breaking points in the writing dynamics rather than relying only on pause duration. This second method needs to be refined to overcome the fact that calculation is impossible when there is insufficient data, which is often the case when working with young scriptors.
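The first, purely statistical method assumes the pause-duration distribution mixes several Gaussian components. A minimal two-component EM fit on log-durations shows the idea; the data are synthetic (short grapho-motor pauses versus a smaller group of longer pauses), not the children's writing data.

```python
import math
import random

def em_two_gaussians(data, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.

    Returns (weights, means, stds); initialized from the data deciles.
    """
    data = sorted(data)
    n = len(data)
    mu = [data[n // 10], data[9 * n // 10]]
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: per-point responsibilities of the two components.
        resp = []
        for x in data:
            p = [w[k] / (sd[k] * math.sqrt(2 * math.pi))
                 * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: re-estimate weights, means and standard deviations.
        for k in (0, 1):
            rk = [r[k] for r in resp]
            nk = sum(rk)
            w[k] = nk / n
            mu[k] = sum(r * x for r, x in zip(rk, data)) / nk
            sd[k] = max(math.sqrt(sum(r * (x - mu[k]) ** 2
                                      for r, x in zip(rk, data)) / nk), 1e-6)
    return w, mu, sd

# Synthetic log10 pause durations (ms): short grapho-motor pauses around
# 150 ms and a smaller group of longer pauses around 1.5 s.
random.seed(42)
pauses = ([random.gauss(2.18, 0.15) for _ in range(400)]
          + [random.gauss(3.18, 0.25) for _ in range(100)])
w, mu, sd = em_two_gaussians(pauses)
```

The crossing point of the two fitted components is one way to place the pause threshold the abstract discusses.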

  18. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, locally distorts the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculating the image gradient; the artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross-entropy thresholding method. The proposed quantification method was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s ρ = 0.62 and 0.80 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising for MRI acquisition parameter optimization.
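The pipeline described, gradient computation followed by automated thresholding and an area percentage, can be sketched in a few functions. Note that Otsu's method stands in here for the paper's cross-entropy thresholding, and the synthetic test image is an assumption:

```python
import math

def gradient_magnitude(img):
    """Finite-difference gradient magnitude of a 2-D list-of-lists image."""
    h, w = len(img), len(img[0])
    g = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]
            gy = img[y + 1][x] - img[y][x]
            g[y][x] = math.hypot(gx, gy)
    return g

def otsu_threshold(values, bins=64):
    """Otsu's threshold (stand-in for the paper's cross-entropy method)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return lo
    hist = [0] * bins
    for v in values:
        hist[min(bins - 1, int((v - lo) / (hi - lo) * bins))] += 1
    total = len(values)
    sum_all = sum(i * hist[i] for i in range(bins))
    best_t, best_var, w0, sum0 = lo, -1.0, 0, 0
    for i in range(bins):
        w0 += hist[i]
        sum0 += i * hist[i]
        if w0 == 0 or w0 == total:
            continue
        m0 = sum0 / w0
        m1 = (sum_all - sum0) / (total - w0)
        var = w0 * (total - w0) * (m0 - m1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, lo + (i + 1) * (hi - lo) / bins
    return best_t

def artifact_area_percent(img):
    """Artifact extent as the percentage of high-gradient pixels."""
    flat = [v for row in gradient_magnitude(img) for v in row]
    t = otsu_threshold(flat)
    return 100.0 * sum(1 for v in flat if v > t) / len(flat)

# Synthetic phantom: flat background with a sharp-edged bright region
img = [[100.0] * 32 for _ in range(32)]
for y in range(8, 24):
    for x in range(8, 24):
        img[y][x] = 200.0
pct = artifact_area_percent(img)
```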

  19. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR

    Directory of Open Access Journals (Sweden)

    Devin Daems

    2017-07-01

    Full Text Available Accurate identification and quantification of allergens is key in healthcare, biotechnology, and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the gold standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM–0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  20. Novel SPECT Technologies and Approaches in Cardiac Imaging

    Directory of Open Access Journals (Sweden)

    Piotr Slomka

    2016-12-01

    Full Text Available Recent novel approaches in myocardial perfusion single photon emission CT (SPECT) have been facilitated by new dedicated high-efficiency hardware with solid-state detectors and optimized collimators. New protocols include very low-dose (1 mSv) stress-only imaging, two-position imaging to mitigate attenuation artifacts, and simultaneous dual-isotope imaging. Attenuation correction can be performed by specialized low-dose systems or by previously obtained CT coronary calcium scans. Hybrid protocols using CT angiography have been proposed. Image quality improvements have been demonstrated by novel reconstructions and motion correction. Fast SPECT acquisition facilitates dynamic flow and early function measurements. Image processing algorithms have become automated with virtually unsupervised extraction of quantitative imaging variables. This automation facilitates integration with clinical variables derived by machine learning to predict patient outcome or diagnosis. In this review, we describe new imaging protocols made possible by the new hardware developments. We also discuss several novel software approaches for the quantification and interpretation of myocardial perfusion SPECT scans.

  1. LIBERAL THOUGHT IN QUR’ANIC STUDIES: Tracing Humanistic Approach to Sacred Text in Islamic Scholarship

    Directory of Open Access Journals (Sweden)

    M. Nur Kholis Setiawan

    2007-03-01

    Full Text Available The literary approach to the Qur’an developed by al-Khuli drew deep criticism from its opponents, in whose opinion the use of a literary paradigm in the study of the Qur’an implied treating the Qur’an as a human text, which in turn indicates a strong influence of a liberal mode of thinking that departs from the spirit of the Qur’an. This article shows a fact diametrically opposed to what they have claimed. The data prove that the linguistic aspects of the Qur’an have succeeded in making an intellectual connection among progressive and liberal scholars in the classical and modern eras. This supports the assumption that progressive and liberal thought, one of whose indicators is freedom of thought, to use Charles Kurzman’s term, is a “child” of Islamic civilization. Freedom of thought in classical Islamic scholarship should be the élan of intellectualism, including in the field of Qur’anic studies.

  2. Accurate quantification of microorganisms in PCR-inhibiting environmental DNA extracts by a novel Internal Amplification Control approach using Biotrove OpenArrays

    NARCIS (Netherlands)

    Van Doorn, R.; Klerks, M.; van Gent-Pelzer, M.; Speksnijder, A.G.C.L.; Kowalchuk, G.A.; Schoen, C.D.

    2009-01-01

    PCR-based detection assays are prone to inhibition by substances present in environmental samples, thereby potentially leading to inaccurate target quantification or false-negative results. Internal amplification controls (IACs) have been developed to help alleviate this problem but are generally

  3. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  4. 3D automatic quantification applied to optically sectioned images to improve microscopy analysis

    Directory of Open Access Journals (Sweden)

    JE Diaz-Zamboni

    2009-08-01

    Full Text Available New fluorescence microscopy techniques, such as confocal or digital deconvolution microscopy, make it easy to obtain three-dimensional (3D) information from specimens. However, there are few 3D quantification tools that allow extracting information from these volumes, so the amount of information acquired by these techniques is difficult to manipulate and analyze manually. The present study describes a model-based method which, for the first time, shows 3D visualization and quantification of fluorescent apoptotic body signals from optical serial sections of porcine hepatocyte spheroids, correlating them to their morphological structures. The method consists of an algorithm that counts apoptotic bodies in a spheroid structure and extracts information from them, such as their centroids in Cartesian and radial coordinates relative to the spheroid centre, and their integrated intensity. 3D visualization of the extracted information allowed us to quantify the distribution of apoptotic bodies in three different zones of the spheroid.
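The per-body measurements listed (centroid, radial coordinate relative to the spheroid centre, integrated intensity, zone assignment) reduce to a few lines once the voxels of each apoptotic body have been segmented. The data layout and the equal-width three-zone rule below are illustrative assumptions, not the authors' algorithm:

```python
import math

def body_stats(voxels, centre):
    """voxels: list of (x, y, z, intensity) tuples for one apoptotic body."""
    total = sum(v[3] for v in voxels)                  # integrated intensity
    cx = sum(v[0] * v[3] for v in voxels) / total      # intensity-weighted
    cy = sum(v[1] * v[3] for v in voxels) / total      # centroid coordinates
    cz = sum(v[2] * v[3] for v in voxels) / total
    r = math.dist((cx, cy, cz), centre)                # radial coordinate
    return (cx, cy, cz), r, total

def radial_zone(r, spheroid_radius, n_zones=3):
    """Assign a body to one of n equal-width concentric zones (0 = core)."""
    return min(n_zones - 1, int(r / spheroid_radius * n_zones))

# Toy body: two voxels of equal intensity on the x axis
centroid, r, total = body_stats([(1, 0, 0, 2.0), (3, 0, 0, 2.0)], (0, 0, 0))
```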

  5. Quantification of phosphorus in single cells using synchrotron X-ray fluorescence

    Energy Technology Data Exchange (ETDEWEB)

    Núñez-Milland, Daliángelis R. [Department of Chemistry and Biochemistry, University of South Carolina, Columbia, SC 29208 (United States); Baines, Stephen B. [Department of Ecology and Evolution, Stony Brook University, Stony Brook, NY 11755 (United States); Vogt, Stefan [Experimental Facilities Division, Advanced Photon Source, Argonne National Laboratory, Argonne, IL (United States); Twining, Benjamin S., E-mail: btwining@bigelow.org [Department of Chemistry and Biochemistry, University of South Carolina, Columbia, SC 29208 (United States)

    2010-07-01

    Phosphorus abundance was quantified in individual phytoplankton cells by synchrotron X-ray fluorescence and compared with bulk spectrophotometric measurements to confirm accuracy of quantification. Figures of merit for P quantification on three different types of transmission electron microscopy grids are compared to assess possible interferences. Phosphorus is required for numerous cellular compounds and as a result can serve as a useful proxy for total cell biomass in studies of cell elemental composition. Single-cell analysis by synchrotron X-ray fluorescence (SXRF) enables quantitative and qualitative analyses of cell elemental composition with high elemental sensitivity. Element standards are required to convert measured X-ray fluorescence intensities into element concentrations, but few appropriate standards are available, particularly for the biologically important element P. Empirical P conversion factors derived from other elements contained in certified thin-film standards were used to quantify P in the model diatom Thalassiosira pseudonana, and the measured cell quotas were compared with those measured in bulk by spectrophotometry. The mean cellular P quotas quantified with SXRF for cells on Au, Ni and nylon grids using this approach were not significantly different from each other or from those measured spectrophotometrically. Inter-cell variability typical of cell populations was observed. Additionally, the grid substrates were compared for their suitability to P quantification based on the potential for spectral interferences with P. Nylon grids were found to have the lowest background concentrations and limits of detection for P, while background concentrations in Ni and Au grids were 1.8- and 6.3-fold higher. The advantages and disadvantages of each grid type for elemental analysis of individual phytoplankton cells are discussed.

  6. Rapid dual-injection single-scan 13N-ammonia PET for quantification of rest and stress myocardial blood flows

    International Nuclear Information System (INIS)

    Rust, T C; DiBella, E V R; McGann, C J; Christian, P E; Hoffman, J M; Kadrmas, D J

    2006-01-01

    Quantification of myocardial blood flows at rest and stress using ¹³N-ammonia PET is an established method; however, current techniques require a waiting period of about 1 h between scans. The objective of this study was to test a rapid dual-injection single-scan approach, where ¹³N-ammonia injections are administered 10 min apart during rest and adenosine stress. Dynamic PET data were acquired in six human subjects using imaging protocols that provided separate single-injection scans as gold standards. Rest and stress data were combined to emulate rapid dual-injection data so that the underlying activity from each injection was known exactly. Regional blood flow estimates were computed from the dual-injection data using two methods: background subtraction and combined modelling. The rapid dual-injection approach provided blood flow estimates very similar to the conventional single-injection standards. Rest blood flow estimates were affected very little by the dual-injection approach, and stress estimates correlated strongly with separate single-injection values (r = 0.998, mean absolute difference = 0.06 ml min⁻¹ g⁻¹). An actual rapid dual-injection scan was successfully acquired in one subject and further demonstrates the feasibility of the method. This study with a limited dataset demonstrates that blood flow quantification can be obtained in only 20 min by the rapid dual-injection approach with accuracy similar to that of conventional separate rest and stress scans. The rapid dual-injection approach merits further development and additional evaluation for potential clinical use.

  7. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

    Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation, mainly by using approximations, and the conservatism of these approximations is a source of quantification uncertainty. In this paper, exact MCS quantification methods based on the 'sum of disjoint products (SDP)' logic and the inclusion-exclusion formula are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated.
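The conservatism being evaluated is easy to see on a toy example: with independent basic events, the commonly used rare-event approximation simply sums the cut-set probabilities, while the inclusion-exclusion formula gives the exact top-event probability. The cut sets and probabilities below are made up for illustration, not taken from the Shin-Kori model:

```python
from itertools import combinations

def cutset_prob(cutset, p):
    """Probability that every basic event in one minimal cut set occurs."""
    prod = 1.0
    for e in cutset:
        prod *= p[e]
    return prod

def rare_event_approx(cutsets, p):
    """Conservative first-order approximation: sum of cut-set probabilities."""
    return sum(cutset_prob(c, p) for c in cutsets)

def exact_top_event_prob(cutsets, p):
    """Exact P(union of cut sets) via the inclusion-exclusion formula."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * cutset_prob(union, p)
    return total

p = {"A": 0.1, "B": 0.1, "C": 0.1}
cutsets = [{"A", "B"}, {"B", "C"}]
approx = rare_event_approx(cutsets, p)    # 0.02
exact = exact_top_event_prob(cutsets, p)  # 0.01 + 0.01 - 0.001 = 0.019
```

For the two overlapping cut sets above, the approximation (0.02) exceeds the exact value (0.019), illustrating the conservative bias; inclusion-exclusion is exponential in the number of cut sets, which is why SDP-style methods exist for real models.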

  8. Operational risk quantification and modelling within Romanian insurance industry

    Directory of Open Access Journals (Sweden)

    Tudor Răzvan

    2017-07-01

    Full Text Available This paper aims at covering and describing the shortcomings of various models used to quantify and model operational risk within the insurance industry, with a particular focus on a Romania-specific regulation: Norm 6/2015 concerning the operational risk issued by IT systems. While most local insurers are focusing on implementing the standard model to compute the operational risk solvency capital required, the local regulator has issued a norm that requires identifying and assessing IT-based operational risks from an ISO 27001 perspective. The challenges raised by the correlations assumed in the standard model are substantially increased by this new regulation, which requires only the identification and quantification of IT operational risks. Solvency II does not prescribe a model or formula for integrating the newly identified risks into the operational risk capital requirements. In this context we assess the academic and practitioners' understanding of the frequency-severity approach, Bayesian estimation techniques, scenario analysis, and risk accounting based on risk units, and how they could support the modelling of IT-based operational risks. Developing an internal model only for the operational risk capital requirement has so far proved costly and not necessarily beneficial for local insurers. As the IT component will play a key role in the future of the insurance industry, the result of this analysis provides a specific approach to operational risk modelling that can be implemented in the context of Solvency II, in the particular situation where (internal or external) operational risk databases are scarce or not available.

  9. Quantification of fossil fuel CO2 at the building/street level for large US cities

    Science.gov (United States)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans for a global, integrated carbon monitoring system (CMS). A space/time-explicit emissions data product can act as both a verification and a planning system: it can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high-resolution (e.g. building- and road-link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort for a CMS. A complete data product has been built for the city of Indianapolis, and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start with the sector-specific Vulcan Project estimate at the mix of geocoded and county-wide levels; the Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high-resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into both improving our emissions data product and measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel and process-level detail (e.g. combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban

  10. Rapid quantification of plant-powdery mildew interactions by qPCR and conidiospore counts.

    Science.gov (United States)

    Weßling, Ralf; Panstruga, Ralph

    2012-08-31

    The powdery mildew disease represents a valuable patho-system to study the interaction between plant hosts and obligate biotrophic fungal pathogens. Numerous discoveries have been made on the basis of the quantitative evaluation of plant-powdery mildew interactions, especially in the context of hyper-susceptible and/or resistant plant mutants. However, the presently available methods to score the pathogenic success of powdery mildew fungi are laborious and thus not well suited for medium- to high-throughput analysis. Here we present two new protocols that allow the rapid quantitative assessment of powdery mildew disease development. One procedure depends on quantitative polymerase chain reaction (qPCR)-based evaluation of fungal biomass, while the other relies on the quantification of fungal conidiospores. We validated both techniques using the powdery mildew pathogen Golovinomyces orontii on a set of hyper-susceptible and resistant Arabidopsis thaliana mutants and found that both cover a wide dynamic range of one to two (qPCR) and four to five (quantification of conidia) orders of magnitude, respectively. The two approaches yield reproducible results and are easy to perform without specialized equipment. The qPCR and spore count assays rapidly and reproducibly quantify powdery mildew pathogenesis. Our methods are performed at later stages of infection and discern mutant phenotypes accurately. The assays therefore complement currently used procedures of powdery mildew quantification and can overcome some of their limitations. In addition, they can easily be adapted to other plant-powdery mildew patho-systems.
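A common way to turn such qPCR readings into a relative fungal-biomass score is the 2^-ΔΔCt method, normalizing the fungal target against a plant reference gene and comparing each mutant to the wild type. The abstract does not state that this exact formula was used, so the sketch and the Ct values below are purely illustrative:

```python
def relative_fungal_biomass(ct_fungal_mut, ct_plant_mut,
                            ct_fungal_wt, ct_plant_wt):
    """Fold change in fungal biomass on a mutant vs. wild type via 2^-ΔΔCt."""
    d_mut = ct_fungal_mut - ct_plant_mut   # ΔCt on the mutant
    d_wt = ct_fungal_wt - ct_plant_wt      # ΔCt on the wild type
    return 2.0 ** -(d_mut - d_wt)

# Hyper-susceptible mutant: the fungal target crosses threshold 3 cycles
# earlier relative to the plant reference than on the wild type
fold = relative_fungal_biomass(22.0, 18.0, 25.0, 18.0)  # → 8.0
```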

  11. A text-mining system for extracting metabolic reactions from full-text articles.

    Science.gov (United States)

    Czarnecki, Jan; Nobeli, Irene; Smith, Adrian M; Shepherd, Adrian J

    2012-07-23

    Increasingly, biological text-mining research is focusing on the extraction of complex relationships relevant to the construction and curation of biological networks and pathways. However, one important category of pathway - metabolic pathways - has been largely neglected. Here we present a relatively simple method for extracting metabolic reaction information from free text that scores different permutations of assigned entities (enzymes and metabolites) within a given sentence based on the presence and location of stemmed keywords. This method extends an approach that has proved effective in the context of the extraction of protein-protein interactions. When evaluated on a set of manually-curated metabolic pathways using standard performance criteria, our method performs surprisingly well. Precision and recall rates are comparable to those previously achieved for the well-known protein-protein interaction extraction task. We conclude that automated metabolic pathway construction is more tractable than has often been assumed, and that (as in the case of protein-protein interaction extraction) relatively simple text-mining approaches can prove surprisingly effective. It is hoped that these results will provide an impetus to further research and act as a useful benchmark for judging the performance of more sophisticated methods that are yet to be developed.
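The core idea, scoring entity permutations by the presence and location of stemmed keywords, can be illustrated in a few lines. The keyword stems and scoring weights below are invented for illustration and are not the authors' values:

```python
# Illustrative reaction-keyword stems (not the paper's keyword list)
KEYWORD_STEMS = ["catalys", "conver", "produc", "oxidis", "reduc"]

def score_assignment(tokens, enzyme_idx, substrate_idx, product_idx):
    """Score one (enzyme, substrate, product) assignment for a sentence:
    +2 per keyword lying between enzyme and substrate, +1 elsewhere,
    +1 if substrate precedes product (conventional ordering)."""
    score = 0
    lo, hi = sorted((enzyme_idx, substrate_idx))
    for i, tok in enumerate(tokens):
        if any(tok.lower().startswith(stem) for stem in KEYWORD_STEMS):
            score += 2 if lo < i < hi else 1
    if substrate_idx < product_idx:
        score += 1
    return score

tokens = ("Hexokinase catalyses the conversion of glucose "
          "to glucose-6-phosphate").split()
# Candidate (enzyme, substrate, product) token-index permutations
candidates = [(0, 5, 7), (0, 7, 5)]
best = max(candidates, key=lambda a: score_assignment(tokens, *a))
```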

  12. Droplet digital PCR (ddPCR) vs quantitative real-time PCR (qPCR) approach for detection and quantification of Merkel cell polyomavirus (MCPyV) DNA in formalin fixed paraffin embedded (FFPE) cutaneous biopsies.

    Science.gov (United States)

    Arvia, Rosaria; Sollai, Mauro; Pierucci, Federica; Urso, Carmelo; Massi, Daniela; Zakrzewska, Krystyna

    2017-08-01

    Merkel cell polyomavirus (MCPyV) is associated with Merkel cell carcinoma, and a high viral load in the skin has been proposed as a risk factor for the occurrence of this tumour. MCPyV DNA has been detected, with lower frequency, in different skin cancers, but since the viral load is usually low, the real prevalence of viral DNA could be underestimated. The aim was to evaluate the performance of two assays (qPCR and ddPCR) for MCPyV detection and quantification in formalin-fixed paraffin-embedded (FFPE) tissue samples. Both assays were designed for simultaneous detection and quantification of both MCPyV and house-keeping DNA in clinical samples. The performance of MCPyV quantification was investigated using serial dilutions of cloned target DNA. We also evaluated the applicability of both tests for the analysis of 76 FFPE cutaneous biopsies. The two approaches were equivalent with regard to reproducibility and repeatability and showed a high degree of linearity in the dynamic range tested in the present study. Moreover, qPCR was able to quantify ≥10⁵ copies per reaction, while the upper limit of ddPCR was 10⁴ copies. There was no significant difference between the viral loads measured by the two methods. The detection limit of both tests was 0.15 copies per reaction; however, the number of positive samples obtained by ddPCR was higher than that obtained by qPCR (45% and 37%, respectively). ddPCR thus represents a better method for detecting MCPyV in FFPE biopsies, especially those containing low copy numbers of the viral genome. Copyright © 2017 Elsevier B.V. All rights reserved.
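ddPCR's absolute quantification rests on Poisson statistics over the droplet counts: the mean number of copies per droplet is recovered from the fraction of negative droplets. A minimal sketch (the droplet volume is an assumed QX100-like value, not taken from the paper):

```python
import math

def ddpcr_copies(n_total, n_negative, droplet_volume_ul=0.00085):
    """Estimate target copies from droplet counts via Poisson statistics."""
    if n_negative == 0:
        raise ValueError("all droplets positive: above the dynamic range")
    lam = -math.log(n_negative / n_total)   # mean copies per droplet
    copies_total = lam * n_total            # copies in the partitioned volume
    conc_per_ul = lam / droplet_volume_ul   # concentration, copies/µL
    return copies_total, conc_per_ul

# 10 000 droplets, half negative: lambda = ln 2 ≈ 0.693 copies per droplet
copies, conc = ddpcr_copies(10000, 5000)
```

The saturation of this estimator as the negative fraction approaches zero is one reason the upper quantification limit of ddPCR sits below that of qPCR, as the abstract reports.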

  13. A flexible and accurate quantification algorithm for electron probe X-ray microanalysis based on thin-film element yields

    International Nuclear Information System (INIS)

    Schalm, O.; Janssens, K.

    2003-01-01

    Quantitative analysis by means of electron probe X-ray microanalysis (EPXMA) of low-Z materials such as silicate glasses can be hampered by the fact that ice or other contaminants build up on the beryllium window of the Si(Li) detector or (in the case of a windowless detector) on the Si(Li) crystal itself. These layers act as an additional absorber in front of the detector crystal, decreasing the detection efficiency at low energies (<5 keV). Since the layer thickness gradually changes with time, the detector efficiency in the low-energy region is not constant either. Using the normal ZAF approach to quantification of EPXMA data is cumbersome under these conditions, because spectra from reference materials and from unknown samples must be acquired within a fairly short period of time to avoid the effect of the change in efficiency. To avoid this problem, an alternative approach to quantification of EPXMA data is proposed, following a philosophy often employed in quantitative analysis of X-ray fluorescence (XRF) and proton-induced X-ray emission (PIXE) data. This approach is based on the (experimental) determination of thin-film element yields, rather than starting from infinitely thick, single-element calibration standards. These thin-film sensitivity coefficients can also be interpolated to allow quantification of elements for which no suitable standards are available. The change in detector efficiency can be monitored by collecting an X-ray spectrum of one multi-element glass standard; this information is used to adapt the previously determined thin-film sensitivity coefficients to the actual detector efficiency conditions valid on the day the experiments were carried out. The main advantage of this method is that spectra from the standards and from the unknown samples need not be acquired within a short period of time. The new approach is evaluated for glass and metal matrices and compared with a standard ZAF method.

  14. Drug quantification in turbid media by fluorescence imaging combined with light-absorption correction using white Monte Carlo simulations

    DEFF Research Database (Denmark)

    Xie, Haiyan; Liu, Haichun; Svenmarker, Pontus

    2011-01-01

    Accurate quantification of photosensitizers is in many cases a critical issue in photodynamic therapy. As a noninvasive and sensitive tool, fluorescence imaging has attracted particular interest for quantification in pre-clinical research. However, due to the absorption of excitation and emission...... in vivo by the fluorescence imaging technique. In this paper we present a novel approach to compensate for the light absorption in homogeneous turbid media both for the excitation and emission light, utilizing time-resolved fluorescence white Monte Carlo simulations combined with the Beer-Lambert law......-absorption correction and absolute fluorophore concentrations. These results suggest that the technique potentially provides the means to quantify the fluorophore concentration from fluorescence images. © 2011 Society of Photo-Optical Instrumentation Engineers (SPIE)....

  15. Uncertainty Quantification given Discontinuous Climate Model Response and a Limited Number of Model Runs

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Debusschere, B.; Najm, H.

    2010-12-01

    Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. The methodology is tested on synthetic examples of

  16. Adaptive Quantification and Longitudinal Analysis of Pulmonary Emphysema with a Hidden Markov Measure Field Model

    Science.gov (United States)

    Häme, Yrjö; Angelini, Elsa D.; Hoffman, Eric A.; Barr, R. Graham; Laine, Andrew F.

    2014-01-01

    The extent of pulmonary emphysema is commonly estimated from CT images by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions in the lung and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the present model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was used to quantify emphysema on a cohort of 87 subjects, with repeated CT scans acquired over a time period of 8 years using different imaging protocols. The scans were acquired approximately annually, and the data set included a total of 365 scans. The results show that the emphysema estimates produced by the proposed method have very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. In addition, the generated emphysema delineations promise great advantages for regional analysis of emphysema extent and progression, possibly advancing disease subtyping. PMID:24759984
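The standard approach the paper improves upon, the proportional area of voxels below a fixed attenuation threshold, is essentially a one-liner. The commonly used -950 HU cutoff below is an assumption for illustration, not a value stated in the abstract:

```python
def emphysema_percent(lung_hu, threshold_hu=-950):
    """Percentage of lung voxels below the attenuation threshold (%LAA).

    lung_hu: flat iterable of Hounsfield-unit values inside the lung mask.
    """
    voxels = list(lung_hu)
    below = sum(1 for v in voxels if v < threshold_hu)
    return 100.0 * below / len(voxels)

# Toy lung mask: two of four voxels fall below -950 HU
pct = emphysema_percent([-960, -900, -980, -700])  # → 50.0
```

The sensitivity of this index to inspiration level and reconstruction kernel is exactly the limitation that motivates the adaptive hidden Markov measure field model described above.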

  17. Decision-Making Approach to Selecting Optimal Platform of Service Variants

    Directory of Open Access Journals (Sweden)

    Vladimir Modrak

    2016-01-01

    Full Text Available Nowadays, it is anticipated that service sector companies will be inspired to follow the mass customization trends of the industrial sector. However, services are more abstract than products, and therefore concepts for mass customization in the manufacturing domain cannot be transferred without methodological change. This paper is focused on the development of a methodological framework to support decisions in the selection of an optimal platform of service variants when compatibility problems between service options occur. The approach is based on mutual relations between waste and constrained design space entropy. For this purpose, software for quantification of constrained and waste design space is developed. The practicability of the methodology is presented on a realistic case.

  18. SPE/TLC/Densitometric Quantification of Selected Synthetic Food Dyes in Liquid Foodstuffs and Pharmaceutical Preparations

    Directory of Open Access Journals (Sweden)

    Anna W. Sobańska

    2017-01-01

    Full Text Available Selected synthetic food dyes (tartrazine, Ponceau 4R, Brilliant Blue, orange yellow, and azorubine) were isolated from liquid preparations (mouthwashes and beverages) by solid-phase extraction on aminopropyl-bonded silica with diluted aqueous sodium hydroxide as an eluent. The extraction step was followed by thin layer chromatography on silica gel 60 with chloroform-isopropanol-25% aq. ammonia 1:3:1 (v/v/v) as mobile phase, and the densitometric quantification of the dyes was achieved using quadratic calibration plots (R² > 0.997; LOQ = 0.04–0.09 μg spot⁻¹). The overall recoveries for all studied dyes were at an average level of over 90%, and the repeatability of the proposed procedure (CV ≤ 4.1%) was sufficient to recommend it for the routine quantification of the aforementioned dyes in liquid matrices.
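Quadratic calibration of the kind reported (densitometric response vs. amount per spot) can be reproduced with a plain least-squares fit via the normal equations. The calibration points below are synthetic, generated from a known quadratic so the recovered coefficients can be checked:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x**2 + b*x + c via the normal equations."""
    n = len(xs)
    s = lambda k: sum(x ** k for x in xs)
    sy = lambda k: sum((x ** k) * y for x, y in zip(xs, ys))
    # Normal-equation system M @ [a, b, c] = v
    M = [[s(4), s(3), s(2)],
         [s(3), s(2), s(1)],
         [s(2), s(1), n]]
    v = [sy(2), sy(1), sy(0)]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c2 in range(col, 3):
                M[r][c2] -= f * M[col][c2]
            v[r] -= f * v[col]
    # Back substitution
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        coef[r] = (v[r] - sum(M[r][c2] * coef[c2]
                              for c2 in range(r + 1, 3))) / M[r][r]
    return coef  # a, b, c

# Synthetic calibration: amount per spot (µg) vs. densitometric peak area
xs = [0.1, 0.2, 0.4, 0.8, 1.6]
ys = [2 * x * x + 3 * x + 1 for x in xs]
a, b, c = fit_quadratic(xs, ys)
```

In practice one inverts the fitted curve (via the quadratic formula) to read an unknown amount off a measured response, and derives the LOQ from the residual standard deviation of the calibration.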

  19. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human...... synovium, using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of......, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles...

  20. Assessment of the real-time PCR and different digital PCR platforms for DNA quantification.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    Digital PCR (dPCR) is beginning to supersede real-time PCR (qPCR) for quantification of nucleic acids in many different applications. Several analytical properties of the two most commonly used dPCR platforms, namely the QX100 system (Bio-Rad) and the 12.765 array of the Biomark system (Fluidigm), have already been evaluated and compared with those of qPCR. However, to the best of our knowledge, direct comparison between the three of these platforms using the same DNA material has not been done, and the 37 K array on the Biomark system has also not been evaluated in terms of linearity, analytical sensitivity and limit of quantification. Here, a first assessment of qPCR, the QX100 system and both arrays of the Biomark system was performed with plasmid and genomic DNA from human cytomegalovirus. With use of PCR components that alter the efficiency of qPCR, each dPCR platform demonstrated consistent copy-number estimations, which indicates the high resilience of dPCR. Two approaches, one considering the total reaction volume and the other considering the effective reaction size, were used to assess linearity, analytical sensitivity and variability. When the total reaction volume was considered, the best performance was observed with qPCR, followed by the QX100 system and the Biomark system. In contrast, when the effective reaction size was considered, all three platforms showed almost equal limits of detection and variability. Although dPCR might not always be more appropriate than qPCR for quantification of low copy numbers, dPCR is a suitable method for robust and reproducible quantification of viral DNA, and a promising technology for the higher-order reference measurement method.
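The copy-number estimation underlying dPCR rests on Poisson partitioning: the mean number of template copies per partition is recovered from the fraction of negative partitions. A minimal sketch of that calculation (illustrative only, not the study's exact pipeline; the droplet volume below is an assumed QX100-like value):

```python
import math

def dpcr_copies_per_partition(n_positive, n_total):
    """Mean template copies per partition, estimated from the fraction
    of negative partitions, assuming Poisson-distributed template."""
    if n_positive >= n_total:
        raise ValueError("all partitions positive: sample too concentrated")
    fraction_negative = (n_total - n_positive) / n_total
    return -math.log(fraction_negative)

def dpcr_concentration(n_positive, n_total, partition_volume_ul):
    """Target concentration in copies per microlitre of reaction mix."""
    return dpcr_copies_per_partition(n_positive, n_total) / partition_volume_ul

# Example: 4000 of 20000 droplets positive, assumed ~0.85 nL droplet volume
lam = dpcr_copies_per_partition(4000, 20000)     # ~0.223 copies/droplet
conc = dpcr_concentration(4000, 20000, 0.85e-3)  # copies per uL
```

Because the estimate depends only on the positive/negative partition counts, it is insensitive to amplification efficiency, which is the resilience to altered PCR components that the abstract describes.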

  1. SU-D-218-05: Material Quantification in Spectral X-Ray Imaging: Optimization and Validation.

    Science.gov (United States)

    Nik, S J; Thing, R S; Watts, R; Meyer, J

    2012-06-01

    To develop and validate a multivariate statistical method to optimize scanning parameters for material quantification in spectral x-ray imaging. An optimization metric was constructed by extensively sampling the thickness space for the expected number of counts for m (two or three) materials. This resulted in an m-dimensional confidence region of material quantities, e.g. thicknesses. Minimization of the ellipsoidal confidence region leads to the optimization of energy bins. For the given spectrum, the minimum counts required for effective material separation can be determined by predicting the signal-to-noise ratio (SNR) of the quantification. A Monte Carlo (MC) simulation framework using BEAM was developed to validate the metric. Projection data of the m materials was generated and material decomposition was performed for combinations of iodine, calcium and water by minimizing the z-score between the expected spectrum and binned measurements. The mean square error (MSE) and variance were calculated to measure the accuracy and precision of this approach, respectively. The minimum MSE corresponds to the optimal energy bins in the BEAM simulations. In the optimization metric, this is equivalent to the smallest confidence region. The SNR of the simulated images was also compared to the predictions from the metric. The MSE was dominated by the variance for the given material combinations, which demonstrates accurate material quantification. The BEAM simulations revealed that the optimization of energy bins was accurate to within 1 keV. The SNRs predicted by the optimization metric yielded satisfactory agreement but were expectedly higher for the BEAM simulations due to the inclusion of scattered radiation. The validation showed that the multivariate statistical method provides accurate material quantification, correct location of optimal energy bins and adequate prediction of image SNR. The BEAM code system is suitable for generating spectral x-ray imaging simulations.
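The material-decomposition step can be illustrated with a noise-free two-bin, two-material Beer-Lambert model: the log-attenuation in each energy bin is linear in the material thicknesses, so two bins suffice to solve for two materials. The attenuation coefficients below are made-up placeholders, not values from the study:

```python
import numpy as np

# Hypothetical linear attenuation coefficients (1/cm) of two basis
# materials (say, water and iodine) in two energy bins -- placeholders only.
MU = np.array([[0.25, 4.0],   # bin 1: [mu_water, mu_iodine]
               [0.20, 1.5]])  # bin 2

def decompose(n0, n):
    """Recover material thicknesses t (cm) from transmitted counts by
    solving the log-attenuation system log(n0/n) = MU @ t."""
    b = np.log(np.asarray(n0, float) / np.asarray(n, float))
    return np.linalg.solve(MU, b)

# Forward-simulate noise-free counts for known thicknesses, then invert.
t_true = np.array([10.0, 0.05])   # 10 cm water, 0.5 mm iodine
n0 = np.array([1e6, 1e6])         # incident counts per bin
n = n0 * np.exp(-MU @ t_true)     # transmitted counts per bin
t_est = decompose(n0, n)
```

With Poisson noise added to the counts, repeated solutions of this system scatter into exactly the kind of confidence ellipsoid the abstract's metric minimizes.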

  2. Computer-assisted radiological quantification of rheumatoid arthritis

    International Nuclear Information System (INIS)

    Peloschek, P.L.

    2000-03-01

    The specific objective was to develop the layout and structure of a platform for effective quantification of rheumatoid arthritis (RA). A fully operative Java stand-alone application software (RheumaCoach) was developed to support the efficacy of the scoring process in RA (Web address: http://www.univie.ac.at/radio/radio.htm). Addressed as potential users of such a program are physicians enrolled in clinical trials to evaluate the course of RA and its modulation with drug therapies, and scientists developing new scoring modalities. The software 'RheumaCoach' consists of three major modules: The Tutorial starts with 'Rheumatoid Arthritis', to teach the basic pathology of the disease. Afterwards the section 'Imaging Standards' explains how to produce proper radiographs. 'Principles - How to use the Larsen Score', 'Radiographic Findings' and 'Quantification by Scoring' explain the requirements for unbiased scoring of RA. At the Data Input Sheet care was taken to follow the radiologist's approach in analysing films, as published previously. At the Compute Sheet the calculated Larsen score may be compared with former scores, and the further possibilities (calculate, export, print, send) are easily accessible. In a first pre-clinical study the system was tested in an unstructured evaluation. Two structured evaluations (30 fully documented and blinded cases of RA, four radiologists scoring hands and feet with or without the RheumaCoach) followed. Between the evaluations we continually improved the software. For all readers the use of the RheumaCoach sped up the procedure; altogether, scoring without computer assistance needed about 20 % more time. Availability of the programme via the internet provides common access for potential quality control in multi-center studies. Documentation of results in a specifically designed printout improves communication between radiologists and rheumatologists. The possibilities of direct export to other programmes and electronic

  3. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    Science.gov (United States)

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168

  4. QUANTIFICATION OF GENETICALLY MODIFIED MAIZE MON 810 IN PROCESSED FOODS

    Directory of Open Access Journals (Sweden)

    Peter Siekel

    2012-12-01

    Full Text Available Maize MON 810 (Zea mays L.) represents the majority of genetically modified food crops. It is the only transgenic cultivar grown in the EU (European Union) countries, and food products containing more than 0.9 % of it must be labelled. This study examined the impact of food processing (temperature, pH and pressure) on DNA degradation and on the quantification of the genetically modified maize MON 810. The transgenic DNA was quantified by the real-time polymerase chain reaction method. Processing at high temperature (121 °C), elevated pressure (0.1 MPa) and low pH (2.25) fragmented DNA. The roughly two-orders-of-magnitude difference between the species-specific gene content and the transgenic DNA content of the plant materials used led to false negative results in the quantification of transgenic DNA. Maize containing 4.2 % of the transgene appeared after processing to contain as little as 3.0 % (100 °C) and 1.9 % (121 °C, 0.1 MPa). A 2.1 % transgene content dropped at 100 °C to 1.0 % and at 121 °C, 0.1 MPa to 0.6 %. Under these conditions the transgenic target showed a two- to threefold greater apparent decrease, a consequence of the unequal abundance of the two targets: the disparity appears as a considerable decrease of the measured transgenic content, while the decrease of the species-specific gene content remains unnoticed. Based on our findings we conclude that a high degree of processing may lead to false negative results in the quantification of the transgenic constituent. Determination of GMO content in processed foods may therefore lead to incorrect statements, and labelling in these cases could mislead consumers. doi:10.5219/212
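The bias described above follows directly from how GM content is computed: it is the ratio of event-specific (transgene) copies to taxon-specific reference gene copies. A sketch with illustrative copy numbers (not data from the study) shows how unequal degradation of the two targets depresses the measured percentage:

```python
def gmo_percent(transgene_copies, reference_copies):
    """GM content as the ratio of event-specific target copies to
    taxon-specific reference gene copies, expressed in percent."""
    return 100.0 * transgene_copies / reference_copies

# Illustrative copy numbers only: the sample starts at 4.2 %.
# If processing degrades the transgenic target more strongly than the
# far more abundant reference target, the measured ratio drops even
# though the true GM fraction of the ingredient is unchanged.
before = gmo_percent(4_200, 100_000)               # 4.2 %
after = gmo_percent(4_200 * 0.45, 100_000 * 0.95)  # ~2.0 %
```

The reference gene's decrease (95 % surviving here) is barely visible, while the same absolute fragmentation rate cuts the minority transgenic signal roughly in half, mirroring the 4.2 % to 1.9 % drop reported in the abstract.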

  5. Spatial gene expression quantification: a tool for analysis of in situ hybridizations in sea anemone Nematostella vectensis

    Directory of Open Access Journals (Sweden)

    Botman Daniel

    2012-10-01

    Full Text Available Abstract Background Spatial gene expression quantification is required for modeling gene regulation in developing organisms. The fruit fly Drosophila melanogaster is the model system most widely applied for spatial gene expression analysis due to its unique embryonic properties: the shape does not change significantly during its early cleavage cycles and most genes are differentially expressed along a straight axis. This system of development is quite exceptional in the animal kingdom. In the sea anemone Nematostella vectensis the embryo changes its shape during early development; there are cell divisions and cell movement, like in most other metazoans. Nematostella is an attractive case study for spatial gene expression since its transparent body wall makes it accessible to various imaging techniques. Findings Our new quantification method produces standardized gene expression profiles from raw or annotated Nematostella in situ hybridizations by measuring the expression intensity along its cell layer. The procedure is based on digital morphologies derived from high-resolution fluorescence pictures. Additionally, complete descriptions of nonsymmetric expression patterns have been constructed by transforming the gene expression images into a three-dimensional representation. Conclusions We created a standard format for gene expression data, which enables quantitative analysis of in situ hybridizations from embryos with various shapes in different developmental stages. The obtained expression profiles are suitable as input for optimization of gene regulatory network models, and for correlation analysis of genes from dissimilar Nematostella morphologies. This approach is potentially applicable to many other metazoan model organisms and may also be suitable for processing data from three-dimensional imaging techniques.

  6. Review of advances in human reliability analysis of errors of commission-Part 2: EOC quantification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 2 is presented in this article. Emerging HRA methods in this field are: ATHEANA, MERMOS, the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the MDTA method and CREAM. The essential advanced features are on the conceptual side, especially to envisage the modeling of multiple contexts for an EOC to be quantified (ATHEANA, MERMOS and MDTA), in order to explicitly address adverse conditions. There is promising progress in providing systematic guidance to better account for cognitive demands and tendencies (GRS, CREAM), and EOC recovery (MDTA). Problematic issues are associated with the implementation of multiple context modeling and the assessment of context-specific error probabilities. Approaches for task or error opportunity scaling (CREAM, GRS) and the concept of reference cases (ATHEANA outlook) provide promising orientations for achieving progress towards data-based quantification. Further development work is needed and should be carried out in close connection with large-scale applications of existing approaches

  7. StatSTEM: An efficient approach for accurate and precise model-based quantification of atomic resolution electron microscopy images

    Energy Technology Data Exchange (ETDEWEB)

    De Backer, A.; Bos, K.H.W. van den [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Van den Broek, W. [AG Strukturforschung/Elektronenmikroskopie, Institut für Physik, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin (Germany); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Van Aert, S., E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2016-12-15

    An efficient model-based estimation algorithm is introduced to quantify the atomic column positions and intensities from atomic resolution (scanning) transmission electron microscopy ((S)TEM) images. This algorithm uses the least squares estimator on image segments containing individual columns fully accounting for overlap between neighbouring columns, enabling the analysis of a large field of view. For this algorithm, the accuracy and precision with which measurements for the atomic column positions and scattering cross-sections from annular dark field (ADF) STEM images can be estimated, has been investigated. The highest attainable precision is reached even for low dose images. Furthermore, the advantages of the model-based approach taking into account overlap between neighbouring columns are highlighted. This is done for the estimation of the distance between two neighbouring columns as a function of their distance and for the estimation of the scattering cross-section which is compared to the integrated intensity from a Voronoi cell. To provide end-users with this well-established quantification method, a user friendly program, StatSTEM, is developed which is freely available under a GNU public license. - Highlights: • An efficient model-based method for quantitative electron microscopy is introduced. • Images are modelled as a superposition of 2D Gaussian peaks. • Overlap between neighbouring columns is taken into account. • Structure parameters can be obtained with the highest precision and accuracy. • StatSTEM, a user friendly program (GNU public license), is developed.
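The core of such a model-based approach, fitting a 2D Gaussian peak to an atomic column by least squares and deriving position and an integrated intensity, can be sketched for a single synthetic column (a simplified illustration, not StatSTEM's multi-column implementation; all parameter values are made up):

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sigma, offset):
    """Single 2D Gaussian peak, the building block of the image model."""
    x, y = xy
    return offset + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2)
                                 / (2.0 * sigma ** 2))

# Synthetic 32x32 "atomic column" patch with Poisson counting noise
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:32, 0:32]
truth = (120.0, 15.3, 16.7, 2.5, 5.0)  # amp, x0, y0, sigma, offset
image = rng.poisson(gauss2d((xx, yy), *truth)).astype(float)

# Least-squares fit of the peak model to the noisy patch
popt, _ = curve_fit(gauss2d, (xx.ravel(), yy.ravel()), image.ravel(),
                    p0=(100.0, 16.0, 16.0, 3.0, 0.0))
x0_fit, y0_fit = popt[1], popt[2]
# Integrated peak volume above background, a proxy for the
# scattering cross-section discussed in the abstract
cross_section = 2.0 * np.pi * popt[0] * popt[3] ** 2
```

Even with Poisson noise, the fitted column position lands within a small fraction of a pixel of the true value, which is the sub-pixel precision argument made for model-based quantification over Voronoi-cell integration.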

  8. Semi-supervised probabilistics approach for normalising informal short text messages

    CSIR Research Space (South Africa)

    Modupe, A

    2017-03-01

    Full Text Available The growing use of informal social text messages on Twitter is one of the known sources of big data. These type of messages are noisy and frequently rife with acronyms, slangs, grammatical errors and non-standard words causing grief for natural...

  9. Parameter identification and global sensitivity analysis of Xin'anjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng Song

    2013-01-01

    Full Text Available Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification, which can identify the dominant parameters, reduce the model calibration uncertainty, and enhance the model optimization efficiency. There are, however, some shortcomings in classical approaches, including the long runtimes and high computational cost required to quantitatively assess the sensitivity of a multiple-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and then ten parameters were selected to quantify the sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity, but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
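The first, qualitative step can be illustrated with a simplified one-at-a-time variant of Morris screening: perturb each factor in turn, record the elementary effect, and rank factors by the mean absolute effect (mu*). This is a sketch of the general technique, not the paper's RSMSobol implementation, and the toy model below is invented:

```python
import numpy as np

def morris_mu_star(model, n_params, n_repeats=50, delta=0.1, seed=1):
    """Simplified one-at-a-time Morris screening: mean absolute
    elementary effect (mu*) of each factor in the unit hypercube."""
    rng = np.random.default_rng(seed)
    effects = np.zeros((n_repeats, n_params))
    for r in range(n_repeats):
        x = rng.uniform(0.0, 1.0 - delta, size=n_params)
        y0 = model(x)
        for i in range(n_params):      # perturb one factor at a time
            x_step = x.copy()
            x_step[i] += delta
            effects[r, i] = (model(x_step) - y0) / delta
    return np.abs(effects).mean(axis=0)

# Toy "hydrological model": x0 dominant, x1 weak, x2 inert
mu_star = morris_mu_star(lambda x: 10.0 * x[0] + x[1] ** 2, 3)
```

In a real workflow, only the factors with large mu* (here x0, and to a lesser degree x1) would be carried forward to the expensive variance-based Sobol stage, which is exactly the cost-saving logic of the two-step framework.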

  10. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  11. Sensitive quantification of the HIV-1 reservoir in gut-associated lymphoid tissue.

    Directory of Open Access Journals (Sweden)

    Sara Morón-López

    Full Text Available The implementation of successful strategies to achieve an HIV cure has become a priority in HIV research. However, the current location and size of HIV reservoirs are still unknown, since there are limited tools to evaluate HIV latency in viral sanctuaries such as gut-associated lymphoid tissue (GALT). As reported in the so-called "Boston patients", despite undetectable levels of proviral HIV-1 DNA in blood and GALT, viral rebound happened just a few months after ART interruption. This might imply that current methods are not sensitive enough to detect residual reservoirs. It is therefore imperative to improve the detection and quantification of the HIV-1 reservoir in tissue samples. Herein, we propose a novel non-enzymatic protocol for purification of lamina propria leukocytes (LPLs) from gut biopsies, combined with viral HIV DNA (vDNA) quantification by droplet digital PCR (ddPCR), to improve the sensitivity and accuracy of viral reservoir measurements (LPL-vDNA assay). Endoscopic ileum biopsies were sampled from 12 HIV-1-infected cART-suppressed subjects. We performed a DTT/EDTA-based treatment for epithelial layer removal, followed by non-enzymatic disruption of the tissue to obtain a lamina propria cell suspension (LP). CD45+ cells were subsequently purified by flow sorting and vDNA was determined by ddPCR. vDNA quantification levels were significantly higher in purified LPLs (CD45+) than in bulk LPs (p<0.01). The levels of vDNA were higher in ileum samples than in concurrent PBMCs from the same individuals (p = 0.002). As a result of the increased sensitivity of this purification method, the Poisson 95% confidence intervals of the vDNA quantification data from LPLs were narrower than those from bulk LPs. Of note, vDNA was unambiguously quantified above the detection limit in 100% of LPL samples, but in only 58% of bulk LPs. We propose this innovative combined protocol for more sensitive detection of the HIV reservoir in gut-associated viral sanctuaries.
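The Poisson confidence intervals mentioned above can be computed exactly from the observed counts via the chi-square distribution (the Garwood interval). A sketch with invented counts shows why enriching the target population narrows the relative interval:

```python
from scipy.stats import chi2

def poisson_ci_95(k):
    """Exact (Garwood) 95% confidence interval for a Poisson count k,
    using the chi-square/Poisson relationship."""
    lower = 0.0 if k == 0 else chi2.ppf(0.025, 2 * k) / 2.0
    upper = chi2.ppf(0.975, 2 * (k + 1)) / 2.0
    return lower, upper

# Relative interval width shrinks as more target copies are recovered,
# which is the statistical rationale for purifying CD45+ LPLs.
# Counts below are illustrative, not data from the study.
lo_few, hi_few = poisson_ci_95(5)       # sparse signal (bulk-LP-like)
lo_many, hi_many = poisson_ci_95(500)   # enriched signal (LPL-like)
rel_width_few = (hi_few - lo_few) / 5.0
rel_width_many = (hi_many - lo_many) / 500.0
```

The relative width falls roughly as 1/sqrt(k), so a hundredfold gain in recovered copies tightens the interval by about an order of magnitude.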

  12. An Alternative to the Carlson-Parkin Method for the Quantification of Qualitative Inflation Expectations: Evidence from the Ifo World Economic Survey

    OpenAIRE

    Henzel, Steffen; Wollmershäuser, Timo

    2005-01-01

    This paper presents a new methodology for the quantification of qualitative survey data. Traditional conversion methods, such as the probability approach of Carlson and Parkin (1975) or the time-varying parameters model of Seitz (1988), require very restrictive assumptions concerning the expectations formation process of survey respondents. Above all, the unbiasedness of expectations, which is a necessary condition for rationality, is imposed. Our approach avoids these assumptions. The novelt...
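The Carlson-Parkin probability approach that this paper takes as its point of departure can be sketched compactly: given the survey shares expecting prices to rise or fall, and assuming normally distributed expectations with a symmetric indifference interval, the implied mean and dispersion follow from inverting the normal CDF. A minimal sketch under those textbook assumptions (the indifference half-width of 0.5 percentage points is an arbitrary illustration):

```python
from scipy.stats import norm

def carlson_parkin(share_up, share_down, indifference=0.5):
    """Carlson-Parkin quantification of qualitative survey answers.
    share_up / share_down are the fractions expecting a rise / fall;
    `indifference` is the assumed half-width (in percentage points) of
    the interval within which respondents answer "no change"."""
    f = norm.ppf(share_down)       # z-score of the lower threshold
    r = norm.ppf(1.0 - share_up)   # z-score of the upper threshold
    sigma = 2.0 * indifference / (r - f)
    mu = -indifference * (f + r) / (r - f)
    return mu, sigma

# 40% expect a rise, 10% a fall -> positive mean expected change
mu, sigma = carlson_parkin(0.40, 0.10)
```

The restrictiveness criticized in the abstract is visible here: the result hinges on the normality assumption and on pinning down the indifference parameter, which is usually calibrated by imposing unbiasedness of expectations.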

  13. Computing symmetrical strength of N-grams: a two pass filtering approach in automatic classification of text documents.

    Science.gov (United States)

    Agnihotri, Deepak; Verma, Kesari; Tripathi, Priyanka

    2016-01-01

    The contiguous sequences of terms (N-grams) in documents are symmetrically distributed among different classes. The symmetrical distribution of the N-grams raises uncertainty about which class an N-gram belongs to. In this paper, we focus on the selection of the most discriminating N-grams by reducing the effects of symmetrical distribution. In this context, a new text feature selection method, named the symmetrical strength of the N-grams (SSNG), is proposed using a two pass filtering based feature selection (TPF) approach. Initially, in the first pass of the TPF, the SSNG method chooses various informative N-grams from the entire set of N-grams extracted from the corpus. Subsequently, in the second pass, the well-known Chi Square (χ(2)) method is used to select the few most informative N-grams. Further, to classify the documents, two standard classifiers, Multinomial Naive Bayes and Linear Support Vector Machine, were applied to ten standard text data sets. In most of the datasets, the experimental results show that the performance and success rate of the SSNG method using the TPF approach are superior to the state-of-the-art methods, viz. Mutual Information, Information Gain, Odds Ratio, Discriminating Feature Selection and χ(2).
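The second-pass χ(2) scoring reduces to a 2x2 contingency table per N-gram/class pair: an N-gram whose distribution across classes matches the class proportions (the "symmetrical" case) scores zero, while a concentrated one scores high. A self-contained sketch with invented document counts:

```python
def chi_square_term(term_docs, class_docs, both, n_total):
    """Chi-square statistic of a 2x2 term/class contingency table:
    term_docs  = documents containing the N-gram,
    class_docs = documents belonging to the class,
    both       = documents satisfying both,
    n_total    = documents in the corpus."""
    a = both                  # term present, in class
    b = term_docs - both      # term present, other classes
    c = class_docs - both     # term absent, in class
    d = n_total - a - b - c   # term absent, other classes
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return n_total * (a * d - b * c) ** 2 / den if den else 0.0

# An N-gram concentrated in one class is discriminating; one spread
# exactly in proportion to class size (symmetrical) scores zero.
discriminating = chi_square_term(50, 100, 48, 1000)
symmetrical = chi_square_term(50, 100, 5, 1000)  # 5 = 50*100/1000
```

Ranking all candidate N-grams by this statistic and keeping the top-scoring ones is the standard χ(2) feature-selection step the abstract refers to.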

  14. Active Learning for Text Classification

    OpenAIRE

    Hu, Rong

    2011-01-01

    Text classification approaches are used extensively to solve real-world challenges. The success or failure of text classification systems hangs on the datasets used to train them, without a good dataset it is impossible to build a quality system. This thesis examines the applicability of active learning in text classification for the rapid and economical creation of labelled training data. Four main contributions are made in this thesis. First, we present two novel selection strategies to cho...

  15. Quantification of acrylamide in foods selected by using gas chromatography tandem mass spectrometry

    Directory of Open Access Journals (Sweden)

    Delević Veselin M.

    2016-01-01

    Full Text Available Acrylamide is a toxic and probably carcinogenic compound formed during high-temperature thermal treatment of carbohydrate-rich foodstuffs. In this article, an improved method is presented for the extraction and quantification of acrylamide in foods produced from corn flour that are common in our traditional diet. Acrylamide extraction was carried out using a reduced volume of saturated bromine water, and a GC-MS method for quantification is described. Quantification of acrylamide was preceded by sample homogenization, extraction of acrylamide with water, extract purification by solid-phase extraction, bromination using a reduced volume of bromine water, removal of excess bromine with sodium thiosulfate, and dehydrobromination of 2,3-dibromopropanamide to 2-bromopropenamide using triethylamine. Regression and correlation analysis were applied at the 0.05 probability level. Calibration was performed in the concentration range 5-80 μg/kg, with a detection limit of 6.86 μg/kg, a limit of quantification of 10.78 μg/kg, and a coefficient of determination R2 > 0.999. The calibration curve obtained was y = 0.069x + 0.038. Recovery values averaged 97 to 110%. The proposed GC-MS method is simple, precise and reliable for the determination of acrylamide in samples of thermally treated foods. Our results show that the tested foods contained acrylamide at concentrations of 18 to 77 μg/kg, depending on whether the food was prepared by cooking or baking.
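The routine use of such a calibration curve, back-calculating concentration from a detector response, and the common 3.3σ/10σ convention for detection and quantification limits can be sketched as follows (the blank standard deviation below is a hypothetical value chosen only to illustrate the rule, not a figure from the study):

```python
SLOPE, INTERCEPT = 0.069, 0.038  # calibration curve reported: y = 0.069x + 0.038

def concentration(response):
    """Back-calculate acrylamide content (ug/kg) from a detector
    response using the reported calibration line."""
    return (response - INTERCEPT) / SLOPE

def lod_loq(sigma_blank, slope):
    """ICH-style detection and quantification limits: 3.3 and 10 times
    the blank standard deviation, divided by the calibration slope."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical blank noise, purely to illustrate the 3.3/10 rule
lod, loq = lod_loq(0.10, SLOPE)
# Response of a 20 ug/kg standard maps back onto the curve exactly
c = concentration(SLOPE * 20.0 + INTERCEPT)
```

Any sample response below the LOQ's corresponding signal would be reported as "detected but not quantifiable" rather than as a concentration.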

  16. Two approaches to gathering text corpora from the WorldWideWeb

    CSIR Research Space (South Africa)

    Botha, G

    2005-11-01

    Full Text Available Many applications of pattern recognition to natural language processing require large text corpora in a specified language. For many of the languages of the world, such corpora are not readily available, but significant quantities of text...

  17. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were determined by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of a penicillamine enantiomer mixture was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)

  18. Recommendations for adaptation and validation of commercial kits for biomarker quantification in drug development.

    Science.gov (United States)

    Khan, Masood U; Bowsher, Ronald R; Cameron, Mark; Devanarayan, Viswanath; Keller, Steve; King, Lindsay; Lee, Jean; Morimoto, Alyssa; Rhyne, Paul; Stephen, Laurie; Wu, Yuling; Wyant, Timothy; Lachno, D Richard

    2015-01-01

    Increasingly, commercial immunoassay kits are used to support drug discovery and development. Longitudinally consistent kit performance is crucial, but the degree to which kits and reagents are characterized by manufacturers is not standardized, nor are the approaches by users to adapt them and evaluate their performance through validation prior to use. These factors can negatively impact data quality. This paper offers a systematic approach to assessment, method adaptation and validation of commercial immunoassay kits for quantification of biomarkers in drug development, expanding upon previous publications and guidance. These recommendations aim to standardize and harmonize user practices, contributing to reliable biomarker data from commercial immunoassays, thus, enabling properly informed decisions during drug development.

  19. Track benchmarking method for uncertainty quantification of particle tracking velocimetry interpolations

    International Nuclear Information System (INIS)

    Schneiders, Jan F G; Sciacchitano, Andrea

    2017-01-01

    The track benchmarking method (TBM) is proposed for uncertainty quantification of particle tracking velocimetry (PTV) data mapped onto a regular grid. The method provides statistical uncertainty for a velocity time-series and can in addition be used to obtain instantaneous uncertainty at increased computational cost. Interpolation techniques are typically used to map velocity data from scattered PTV (e.g. tomographic PTV and Shake-the-Box) measurements onto a Cartesian grid. Recent examples of these techniques are the FlowFit and VIC+ methods. The TBM approach estimates the random uncertainty in dense velocity fields by performing the velocity interpolation using a subset of typically 95% of the particle tracks and by considering the remaining tracks as an independent benchmarking reference. In addition, a bias introduced by the interpolation technique is identified. The numerical assessment shows that the approach is accurate when particle trajectories are measured over an extended number of snapshots, typically on the order of 10. When only short particle tracks are available, the TBM estimate overestimates the measurement error. A correction to TBM is proposed and assessed to compensate for this overestimation. The experimental assessment considers the case of a jet flow, processed both by tomographic PIV and by VIC+. The uncertainty obtained by TBM provides a quantitative evaluation of the measurement accuracy and precision and highlights the regions of high error by means of bias and random uncertainty maps. In this way, it is possible to quantify the uncertainty reduction achieved by advanced interpolation algorithms with respect to standard correlation-based tomographic PIV. The use of TBM for uncertainty quantification and comparison of different processing techniques is demonstrated. (paper)
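The hold-out idea at the heart of TBM, interpolating from most of the scattered samples and benchmarking against the withheld remainder, can be sketched in a simplified 1-D analogue (linear interpolation stands in for FlowFit/VIC+, and the noisy sine profile is invented test data):

```python
import numpy as np

def track_benchmark(x, v, holdout_frac=0.05, seed=0):
    """1-D analogue of TBM: withhold a fraction of scattered samples,
    interpolate the velocity from the remaining ones, and evaluate the
    error at the withheld locations. Returns (random uncertainty, bias)."""
    rng = np.random.default_rng(seed)
    n = x.size
    test = rng.choice(n, size=max(1, int(holdout_frac * n)), replace=False)
    train = np.setdiff1d(np.arange(n), test)
    order = np.argsort(x[train])
    v_hat = np.interp(x[test], x[train][order], v[train][order])
    errors = v_hat - v[test]
    return errors.std(), errors.mean()

# Scattered "particle track" samples of a smooth profile plus noise
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 2.0 * np.pi, 2000)
v = np.sin(x) + rng.normal(0.0, 0.05, x.size)
sigma, bias = track_benchmark(x, v)
```

Because the withheld samples carry their own measurement noise, the raw hold-out error overstates the interpolation error, the same overestimation for short tracks that the abstract's correction addresses.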

  20. Mining consumer health vocabulary from community-generated text.

    Science.gov (United States)

    Vydiswaran, V G Vinod; Mei, Qiaozhu; Hanauer, David A; Zheng, Kai

    2014-01-01

    Community-generated text corpora can be a valuable resource to extract consumer health vocabulary (CHV) and link them to professional terminologies and alternative variants. In this research, we propose a pattern-based text-mining approach to identify pairs of CHV and professional terms from Wikipedia, a large text corpus created and maintained by the community. A novel measure, leveraging the ratio of frequency of occurrence, was used to differentiate consumer terms from professional terms. We empirically evaluated the applicability of this approach using a large data sample consisting of MedLine abstracts and all posts from an online health forum, MedHelp. The results show that the proposed approach is able to identify synonymous pairs and label the terms as either consumer or professional term with high accuracy. We conclude that the proposed approach provides great potential to produce a high quality CHV to improve the performance of computational applications in processing consumer-generated health text.
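A frequency-ratio measure of the kind described, labeling a term by how much more often it occurs in community text than in professional text, can be sketched with smoothed log ratios. This is an illustrative scoring with invented counts, not the paper's exact measure:

```python
import math

def term_score(freq_community, freq_professional, smoothing=1.0):
    """Log ratio of (smoothed) occurrence frequencies in a community
    corpus vs a professional corpus. Positive scores suggest a
    consumer term, negative a professional term."""
    return math.log((freq_community + smoothing)
                    / (freq_professional + smoothing))

def label(freq_community, freq_professional):
    return ("consumer" if term_score(freq_community, freq_professional) > 0
            else "professional")

# Hypothetical per-million term counts (MedHelp-like vs MedLine-like)
labels = {
    "nose job": label(120, 2),      # frequent in community text
    "rhinoplasty": label(15, 340),  # frequent in professional text
}
```

With synonym pairs mined from patterns (e.g. "X, also known as Y"), scoring both members this way separates the consumer variant from its professional counterpart.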

  1. Nonverbatim Captioning in Dutch Television Programs: A Text Linguistic Approach

    Science.gov (United States)

    Schilperoord, Joost; de Groot, Vanja; van Son, Nic

    2005-01-01

    In the Netherlands, as in most other European countries, closed captions for the deaf summarize texts rather than render them verbatim. Caption editors argue that in this way television viewers have enough time to both read the text and watch the program. They also claim that the meaning of the original message is properly conveyed. However, many…

  2. Texting As A Discursive Approach For The Production Of Agricultural Solutions

    Directory of Open Access Journals (Sweden)

    Ronan G. Zagado

    2015-08-01

    Full Text Available This paper demonstrates how the short messaging service (SMS), popularly known as texting, has facilitated the production of solutions to farm issues, using the Farmers' Text Centre (FTC) of the Philippine Rice Research Institute (PhilRice) as the case study. Text messages registered in the FTC database in 2010, covering one cropping season, were discourse-analyzed. Interpretive qualitative research, particularly Grounded Theory, was employed to interpret and theorize said data. Since texting is a newly emerging discourse in agricultural development, Grounded Theory allows the explication of theoretical accounts that explain its existence and impact. Results indicate that timing (queries received within working days from 8am to 5pm get a speedy response), content (the easier the question, the faster it gets a reply), length (the shorter the message, the better) and clarity of the query/text message, as well as cultural factors such as greetings and terms of respect, are all important governing factors in texting for farm use. Moreover, analysis reveals that the series of text messages sent back and forth by farmers and agricultural specialists in the FTC suggests a dynamic process of negotiation rather than passive information sharing. The analysis further reveals that texting has allowed farmers to have access to negotiated knowledge rather than a standard scientific recommendation vis-à-vis the solution to their farm issues. The term negotiated implies that farmers are actively involved in knowledge production via texting. "Textholder" is coined in this paper to describe farmers and agricultural specialists as co-creators of knowledge in texting, as opposed to their traditional roles as knowledge generator and user, respectively. From the analysis, reflections, implications and theoretical contributions are drawn in relation to the value of SMSing in agricultural extension and communication.

  3. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C, 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR: ddPCR and ddPCR-Tail) with standard methods for the titration of NGS libraries. The ddPCR-Tail approach is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  4. Sources of hydrocarbons in urban road dust: Identification, quantification and prediction.

    Science.gov (United States)

    Mummullage, Sandya; Egodawatta, Prasanna; Ayoko, Godwin A; Goonetilleke, Ashantha

    2016-09-01

    Among urban stormwater pollutants, hydrocarbons are a significant environmental concern due to their toxicity and relatively stable chemical structure. This study focused on the identification of hydrocarbon contributing sources to urban road dust and approaches for the quantification of pollutant loads to enhance the design of source control measures. The study confirmed the validity of the use of mathematical techniques of principal component analysis (PCA) and hierarchical cluster analysis (HCA) for source identification and principal component analysis/absolute principal component scores (PCA/APCS) receptor model for pollutant load quantification. Study outcomes identified non-combusted lubrication oils, non-combusted diesel fuels and tyre and asphalt wear as the three most critical urban hydrocarbon sources. The site specific variabilities of contributions from sources were replicated using three mathematical models. The models employed predictor variables of daily traffic volume (DTV), road surface texture depth (TD), slope of the road section (SLP), effective population (EPOP) and effective impervious fraction (EIF), which can be considered as the five governing parameters of pollutant generation, deposition and redistribution. Models were developed such that they can be applicable in determining hydrocarbon contributions from urban sites enabling effective design of source control measures. Copyright © 2016 Elsevier Ltd. All rights reserved.
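
The PCA/APCS receptor-modeling step can be sketched on synthetic data: principal components identify the sources, and absolute principal component scores (obtained by subtracting the score of a true-zero sample) are regressed against measured concentrations to apportion loads. Source profiles and contributions below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic road-dust dataset: two 'sources' (think lubrication oils, tyre
# wear) mixing into measured hydrocarbon species concentrations at 100 sites.
n_sites, n_species = 100, 6
source_profiles = np.array([[5, 3, 1, 0, 0, 1],
                            [0, 1, 2, 4, 3, 1]], dtype=float)
contrib = rng.uniform(0.5, 3.0, size=(n_sites, 2))
C = contrib @ source_profiles + rng.normal(0, 0.1, (n_sites, n_species))

# PCA on standardized concentrations (source identification step).
Z = (C - C.mean(0)) / C.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                      # two retained components

# APCS: rescale scores by subtracting the score of a true-zero sample,
# then regress each species on the APCS to apportion source loads.
z0 = (0.0 - C.mean(0)) / C.std(0)
apcs = scores - z0 @ Vt[:2].T
design = np.c_[np.ones(n_sites), apcs]
coef, *_ = np.linalg.lstsq(design, C, rcond=None)
reconstructed = design @ coef
r2 = 1 - ((C - reconstructed) ** 2).sum() / ((C - C.mean(0)) ** 2).sum()
print(f"variance of species explained by 2 APCS sources: R^2 = {r2:.3f}")
```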

  5. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  6. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  7. Critical assessment of digital PCR for the detection and quantification of genetically modified organisms.

    Science.gov (United States)

    Demeke, Tigst; Dobnik, David

    2018-07-01

    The number of genetically modified organisms (GMOs) on the market is steadily increasing. Because of regulation of cultivation and trade of GMOs in several countries, there is pressure for their accurate detection and quantification. Today, DNA-based approaches are more popular for this purpose than protein-based methods, and real-time quantitative PCR (qPCR) is still the gold standard in GMO analytics. However, digital PCR (dPCR) offers several advantages over qPCR, making this new technique appealing also for GMO analysis. This critical review focuses on the use of dPCR for the purpose of GMO quantification and addresses parameters which are important for achieving accurate and reliable results, such as the quality and purity of DNA and reaction optimization. Three critical factors are explored and discussed in more depth: correct classification of partitions as positive, correctly determined partition volume, and dilution factor. This review could serve as a guide for all laboratories implementing dPCR. Most of the parameters discussed are applicable to fields other than purely GMO testing. Graphical abstract: There are generally three different options for absolute quantification of genetically modified organisms (GMOs) using digital PCR: droplet- or chamber-based and droplets in chambers. All have in common the distribution of reaction mixture into several partitions, which are all subjected to PCR and scored at the end-point as positive or negative. Based on these results, GMO content can be calculated.
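
The partition statistics at the heart of dPCR reduce to a Poisson correction: the fraction of negative partitions estimates exp(-λ). A minimal sketch, with an assumed droplet volume of 0.85 nl:

```python
import math

def dpcr_copies_per_partition(positives, total):
    """Poisson-corrected mean copies per partition from end-point counts.

    With random distribution of targets over partitions, the fraction of
    negative partitions is exp(-lambda), so lambda = -ln(1 - p_positive).
    """
    p = positives / total
    return -math.log(1.0 - p)

def copies_per_microlitre(positives, total, partition_volume_nl):
    lam = dpcr_copies_per_partition(positives, total)
    return lam / (partition_volume_nl * 1e-3)   # nl -> µl

# Example: 7,000 of 20,000 droplets positive, assumed 0.85 nl droplet volume.
lam = dpcr_copies_per_partition(7000, 20000)
conc = copies_per_microlitre(7000, 20000, 0.85)
print(f"lambda={lam:.3f} copies/partition, {conc:.0f} copies/ul")
```

This also shows why the review's three critical factors matter: misclassified partitions change `p`, and an error in the partition volume scales the concentration directly.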

  8. Tracing Knowledge Transfer from Universities to Industry: A Text Mining Approach

    DEFF Research Database (Denmark)

    Woltmann, Sabrina; Alkærsig, Lars

    2017-01-01

    This paper identifies transferred knowledge between universities and industry by proposing the use of a computational linguistic method. Current research on university-industry knowledge exchange often relies on formal databases and indicators, such as patents, collaborative publications and license agreements, to assess the contribution to the socioeconomic surroundings of universities. We, on the other hand, use the texts of university abstracts to identify university knowledge and compare them with texts from firm webpages. We use these text data to identify common key words and thereby identify overlapping content among the texts. As method we use a well-established word ranking method from the field of information retrieval, term frequency-inverse document frequency (TFIDF), to identify commonalities between the texts. Examining the outcomes of the TFIDF statistic is the first step to enable the identification of common knowledge and knowledge transfer via text mining and to increase its measurability.
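
A minimal TFIDF-overlap sketch of the idea, on toy stand-ins for a university abstract and firm webpage texts:

```python
import math

def tfidf(doc_tokens, corpus):
    """Term frequency-inverse document frequency scores for one document."""
    scores = {}
    for term in set(doc_tokens):
        tf = doc_tokens.count(term) / len(doc_tokens)
        df = sum(term in d for d in corpus)      # document frequency
        scores[term] = tf * math.log(len(corpus) / df)
    return scores

# Toy stand-ins: a university abstract, a firm webpage, an unrelated page.
docs = [
    "enzyme catalysis for biofuel production".split(),
    "our company scales biofuel production plants".split(),
    "annual report on company finances".split(),
]
uni, firm = tfidf(docs[0], docs), tfidf(docs[1], docs)
shared = sorted(set(uni) & set(firm),
                key=lambda t: (uni[t] + firm[t], t), reverse=True)
print(shared)   # overlapping vocabulary, highest joint TFIDF first
```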

  9. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.

    Science.gov (United States)

    Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H

    2016-12-01

    Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach, recurrence quantification analysis (RQA), via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). %DET decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.

  10. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values are calculated of the model residuals and stored in a 'three dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the larger public is not interested in in-depth information on the uncertainty in the predicted water levels, but only in information on the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.) each with their advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is by probabilistic flood mapping. These maps give a representation of the
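
The error-matrix construction can be sketched as follows: residuals from a synthetic forecast archive are split into classes of forecasted water level and lead time, and per-class percentiles form the look-up structure (nearest-class lookup here instead of the paper's 3D interpolation; all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic forecast archive: residual spread grows with level and lead time.
n = 20000
level = rng.uniform(0.0, 5.0, n)            # forecasted water level (m)
lead = rng.uniform(0.0, 48.0, n)            # lead time (h)
residual = rng.normal(0.0, 0.05 + 0.04 * level + 0.01 * lead)

# Build the 'error matrix': residual percentiles per (level, lead) class.
level_edges = np.linspace(0, 5, 6)
lead_edges = np.linspace(0, 48, 5)
qs = [5, 50, 95]
error_matrix = np.full((len(qs), 5, 4), np.nan)
for i in range(5):
    for j in range(4):
        m = ((level >= level_edges[i]) & (level < level_edges[i + 1]) &
             (lead >= lead_edges[j]) & (lead < lead_edges[j + 1]))
        error_matrix[:, i, j] = np.percentile(residual[m], qs)

def uncertainty_band(new_level, new_lead):
    """Nearest-class lookup (the paper interpolates in the 3D matrix)."""
    i = int(np.clip(np.searchsorted(level_edges, new_level) - 1, 0, 4))
    j = int(np.clip(np.searchsorted(lead_edges, new_lead) - 1, 0, 3))
    return dict(zip(["p5", "p50", "p95"], error_matrix[:, i, j]))

band = uncertainty_band(4.2, 36.0)
print(band)   # a wide band for a high forecast at long lead time
```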

  11. Ct shift: A novel and accurate real-time PCR quantification model for direct comparison of different nucleic acid sequences and its application for transposon quantifications.

    Science.gov (United States)

    Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I

    2017-01-20

    There are numerous applications of quantitative PCR for both diagnostic and basic research. As in many other techniques the basis of quantification is that comparisons are made between different (unknown and known or reference) specimens of the same entity. When the aim is to compare real quantities of different species in samples, one cannot escape their separate precise absolute quantification. We have established a simple and reliable method for this purpose (Ct shift method) which combines the absolute and the relative approach. It requires a plasmid standard containing both sequences of amplicons to be compared (e.g. the target of interest and the endogenous control). It can serve as a reference sample with equal copies of templates for both targets. Using the ΔΔCt formula we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied for transposon gene copy measurements, as well as for comparison of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of results even without the need for real reference samples can contribute to the universality of the method and comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
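
The ΔΔCt arithmetic behind the Ct shift method can be sketched in a few lines, assuming amplification efficiencies near 2 and invented Ct values:

```python
def ct_shift_ratio(ct_target_ref, ct_control_ref,
                   ct_target_sample, ct_control_sample, eff=2.0):
    """Ct-shift estimate of the target:control template ratio in a sample.

    The reference is a plasmid carrying one copy of each amplicon, so its
    Ct difference calibrates out amplicon-specific offsets; the ddCt
    formula then yields the absolute copy ratio in the unknown sample
    (assuming equal amplification efficiencies, here eff = 2).
    """
    d_ref = ct_target_ref - ct_control_ref        # 1:1 ratio by construction
    d_sample = ct_target_sample - ct_control_sample
    ddct = d_sample - d_ref
    return eff ** (-ddct)

# Plasmid standard: target amplifies 0.5 cycles later than control at 1:1.
# Unknown sample: target comes up 1.5 cycles later than control.
ratio = ct_shift_ratio(20.5, 20.0, 24.5, 23.0)
print(f"target:control copy ratio = {ratio:.2f}")  # → 0.50
```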

  12. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two

  13. Aerosol-type retrieval and uncertainty quantification from OMI data

    Science.gov (United States)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model
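
The Bayesian model averaging step can be sketched with hypothetical per-LUT evidences and AOD posteriors; weights are proportional to model evidence, and the mixture variance adds within-model and between-model spread:

```python
import math

# Hypothetical per-model outputs for one retrieval: marginal likelihoods
# ("model evidence") and AOD posterior means/stds for three aerosol LUTs.
models = {
    "weakly_absorbing": {"evidence": 1.0e-3, "aod": 0.42, "std": 0.05},
    "biomass_burning":  {"evidence": 6.0e-4, "aod": 0.47, "std": 0.07},
    "dust":             {"evidence": 5.0e-5, "aod": 0.60, "std": 0.10},
}

# Posterior model probabilities (equal priors): weights ∝ evidence.
z = sum(m["evidence"] for m in models.values())
weights = {k: m["evidence"] / z for k, m in models.items()}

# BMA mixture mean and variance (within-model + between-model spread).
mean = sum(weights[k] * m["aod"] for k, m in models.items())
var = sum(weights[k] * (m["std"] ** 2 + (m["aod"] - mean) ** 2)
          for k, m in models.items())

for k, w in weights.items():
    print(f"{k:18s} weight={w:.3f}")
print(f"BMA AOD = {mean:.3f} +/- {math.sqrt(var):.3f}")
```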

  14. Combination of Complement-Dependent Cytotoxicity and Relative Fluorescent Quantification of HLA Length Polymorphisms Facilitates the Detection of a Loss of Heterozygosity

    Directory of Open Access Journals (Sweden)

    Klaus Witter

    2014-01-01

    Full Text Available Loss of heterozygosity (LOH) is a common event in malignant cells. In this work we introduce a new approach to identify patients with loss of heterozygosity in the HLA region either at first diagnosis or after HLA-mismatched allogeneic HSCT. Diagnosis of LOH requires a high purity of recipient target cells. FACS-based enrichment is time consuming and also frequently prevented by a rather nonspecific or unknown immune phenotype. The approach for recipient cell enrichment is based on HLA-targeted complement-dependent cytotoxicity (CDC). Relative fluorescent quantification (RFQ) analysis of HLA intron length polymorphisms then allows analysis of HLA heterozygosity. The approach is exemplified in recent clinical cases illustrating the detection of an acquired allele loss. As illustrated in one case with DPB1, distinct HLA loci in donor and patient were sufficient for both proof of donor cell removal and evaluation of allele loss in the patient's leukemic cells. Results were confirmed using HLA-B RFQ analysis and leukemia-associated aberrant immunophenotype (LAIP) based cell sorting. Both results confirmed suspected loss of HLA heterozygosity. Our approach complements or substitutes for FACS-based cell enrichment; hence it may be further developed as a novel routine diagnostic tool. This allows rapid recipient cell purification and testing for loss of HLA heterozygosity before and after allogeneic HSCT in easily accessible peripheral blood samples.

  15. The Effects of Using Multimodal Approaches in Meaning-Making of 21st Century Literacy Texts Among ESL Students in a Private School in Malaysia

    Directory of Open Access Journals (Sweden)

    Malini Ganapathy

    2016-04-01

    Full Text Available In today’s globalised digital era, students are inevitably engaged in various multimodal texts due to their active participation in social media and frequent usage of mobile devices on a daily basis. Such daily activities advocate the need for a transformation in the teaching and learning of ESL lessons in order to promote students’ capabilities in making meaning of the different literacy texts which students come across in their ESL learning activities. This paper puts forth the framework of Multimodality in the restructuring of the teaching and learning of ESL with the aim of investigating its effects and students’ perspectives on the use of multimodal approaches underlying the Multiliteracies theory. Using focus group interviews, this qualitative case study examines the effectiveness of ESL teaching and learning using multimodal approaches to literacy and meaning-making among 15 students in a private school in Penang, Malaysia. The results confirm the need to reorientate the teaching and learning of ESL with the focus on multimodal pedagogical practices as it promotes positive learning outcomes among students. The implications of this study suggest that the multimodal approaches integrated in the teaching and learning of ESL have the capacity to promote students’ autonomy in learning, improve motivation to learn and facilitate various learning styles. Keywords: Multimodal Approaches; Multiliteracies; Monomodal; Flipped Classroom; Literacy; Multimodal texts; Ipad

  16. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost
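
Computing the diagonal of a matrix that is only available through matrix-vector products (such as an inverse covariance accessed via linear solves) is commonly done with a stochastic estimator. A minimal Hutchinson-type sketch, not the paper's specific method:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_diagonal(matvec, n, n_samples=400):
    """Stochastic diagonal estimator: diag(A) ≈ E[v * (A v)] for Rademacher v.

    Only matrix-vector products are needed, so for an inverse covariance
    matrix 'matvec' can be a linear solve; no factorization is required.
    """
    acc = np.zeros(n)
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)
        acc += v * matvec(v)
    return acc / n_samples

# Demo on an explicit SPD matrix; in practice A would be an implicit operator.
n = 50
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
diag_est = estimate_diagonal(lambda v: A @ v, n)
rel_err = np.abs(diag_est - np.diag(A)).max() / np.diag(A).min()
print(f"max relative error ~ {rel_err:.3f}")
```

The cost is a handful of matrix-vector products instead of the cubic cost of a factorization, which is the trade-off the abstract alludes to.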

  17. Techniques for quantification of liver fat in risk stratification of diabetics; Techniken zur Leberfettquantifizierung bei der Risikostratifikation von Diabetikern

    Energy Technology Data Exchange (ETDEWEB)

    Kuehn, J.P.; Spoerl, M.C.; Mahlke, C.; Hegenscheid, K. [Universitaetsmedizin Greifswald, Abteilung Experimentelle Radiologie, Institut fuer Diagnostische Radiologie und Neuroradiologie, Greifswald (Germany)

    2015-04-01

    Fatty liver disease plays an important role in the development of type 2 diabetes. Accurate techniques for detection and quantification of liver fat are essential for clinical diagnostics. Chemical shift-encoded magnetic resonance imaging (MRI) is a simple approach to quantify liver fat content. Liver fat quantification using chemical shift-encoded MRI is influenced by several bias factors, such as T2* decay, T1 recovery and the multispectral complexity of fat. The confounder corrected proton density fat fraction is a simple approach to quantify liver fat with comparable results independent of the software and hardware used. The proton density fat fraction is an accurate biomarker for assessment of liver fat. An accurate and reproducible quantification of liver fat using chemical shift-encoded MRI requires a calculation of the proton density fat fraction. (orig.)
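
The proton density fat fraction itself is a simple ratio of the confounder-corrected water and fat signal amplitudes; the signal values and the steatosis threshold mentioned in the comment are illustrative:

```python
def proton_density_fat_fraction(fat_signal, water_signal):
    """PDFF = F / (W + F), computed after confounder correction
    (T2* decay, T1 recovery, multipeak spectral model of fat).

    Inputs are proton-density-weighted signal amplitudes from chemical
    shift-encoded MRI; a PDFF around 5-6% is a commonly cited threshold
    for hepatic steatosis (illustrative here, not from this paper).
    """
    return 100.0 * fat_signal / (water_signal + fat_signal)

# Hypothetical corrected signal amplitudes for one liver ROI:
pdff = proton_density_fat_fraction(fat_signal=12.0, water_signal=88.0)
print(f"PDFF = {pdff:.1f}%")  # → 12.0%
```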

  18. Biomarker Identification Using Text Mining

    Directory of Open Access Journals (Sweden)

    Hui Li

    2012-01-01

    Full Text Available Identifying molecular biomarkers has become one of the important tasks for scientists to assess the different phenotypic states of cells or organisms correlated to the genotypes of diseases from large-scale biological data. In this paper, we propose a text-mining-based method to discover biomarkers from PubMed. First, we construct a database based on a dictionary, and then we use a finite state machine to identify the biomarkers. Our method of text mining provides a highly reliable approach to discover the biomarkers in the PubMed database.

  19. Using text mining for study identification in systematic reviews: a systematic review of current approaches.

    Science.gov (United States)

    O'Mara-Eves, Alison; Thomas, James; McNaught, John; Miwa, Makoto; Ananiadou, Sophia

    2015-01-14

    The large and growing number of published studies, and their increasing rate of publication, makes the task of identifying relevant studies in an unbiased way for inclusion in systematic reviews both complex and time consuming. Text mining has been offered as a potential solution: through automating some of the screening process, reviewer time can be saved. The evidence base around the use of text mining for screening has not yet been pulled together systematically; this systematic review fills that research gap. Focusing mainly on non-technical issues, the review aims to increase awareness of the potential of these technologies and promote further collaborative research between the computer science and systematic review communities. Five research questions led our review: what is the state of the evidence base; how has workload reduction been evaluated; what are the purposes of semi-automation and how effective are they; how have key contextual problems of applying text mining to the systematic review field been addressed; and what challenges to implementation have emerged? We answered these questions using standard systematic review methods: systematic and exhaustive searching, quality-assured data extraction and a narrative synthesis to synthesise findings. The evidence base is active and diverse; there is almost no replication between studies or collaboration between research teams and, whilst it is difficult to establish any overall conclusions about best approaches, it is clear that efficiencies and reductions in workload are potentially achievable. On the whole, most suggested that a saving in workload of between 30% and 70% might be possible, though sometimes the saving in workload is accompanied by the loss of 5% of relevant studies (i.e. a 95% recall). Using text mining to prioritise the order in which items are screened should be considered safe and ready for use in 'live' reviews. The use of text mining as a 'second screener' may also be used cautiously

  20. Characterization and quantification of preferential flow in fractured rock systems, using resistivity tomography

    CSIR Research Space (South Africa)

    May, F

    2010-11-01

    Full Text Available F May, N Jovanovic and A Rozanov (University of Stellenbosch and the Council for Scientific and Industrial Research, CSIR). The study characterizes slow and fast flowing pathways in fractured rock using resistivity tomography surveys. [Table 1 lists the date, time and weather conditions during the resistivity tomography survey (survey no., date, start time, end time, precipitation in mm, description); e.g. survey KB001 on 8/27/2010, 12H00 to 13H40, 0.0 mm precipitation, sunny.]

  1. Application of Nessler reagent in the ammonium quantification used in the fermentations of biotechnology products

    Directory of Open Access Journals (Sweden)

    Dinorah Torres-Idavoy

    2015-08-01

    Full Text Available Ammonium salts are used in fermentations to supplement deficient amounts of nitrogen and to stabilize the pH of the culture medium. Excess ammonium ion exerts a detrimental effect on the fermentation process, inhibiting microbial growth. An analytical method based on Nessler reagent was developed for monitoring and controlling the concentration of ammonium during the fermentation process. The test was standardized by selecting the measuring equipment and the reaction time, as well as by comparing standards of ammonium salts. The method was characterized by evaluating the following parameters: specificity, linearity and range, quantification limit, accuracy and precision. The method proved to be specific. Two linear curves were defined in the concentration ranges of ammonium chloride (2-20 μg/ml) and ammonium sulfate (5-30 μg/ml); the limits of quantification were the lowest points of each curve. The method proved to be accurate and precise. The assay was applied to culture samples of the yeast Saccharomyces and the bacterium E. coli, respectively. A novel microplate method for the quantification and analytical control of ammonium was thus developed. The method is used to control this fundamental chemical component in fermentations and to optimize the culture medium, so that appropriate expression of recombinant proteins and proper vaccine candidates for clinical use are achieved.
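
Quantification against a standard curve, as in this assay, reduces to a linear calibration and its inversion. The absorbance values below are invented; only the concentration range follows the text:

```python
import numpy as np

# Hypothetical calibration of the Nessler microplate assay: absorbance of
# ammonium chloride standards in the paper's linear range (2-20 ug/mL).
conc_std = np.array([2.0, 5.0, 10.0, 15.0, 20.0])     # ug/mL NH4+
abs_std = np.array([0.11, 0.26, 0.52, 0.79, 1.04])    # absorbance (made up)

# Ordinary least-squares line: A = slope * c + intercept.
slope, intercept = np.polyfit(conc_std, abs_std, 1)
r = np.corrcoef(conc_std, abs_std)[0, 1]

def ammonium_conc(absorbance):
    """Invert the calibration to quantify ammonium in a fermentation sample."""
    return (absorbance - intercept) / slope

print(f"r^2 = {r**2:.4f}, A = 0.60 -> {ammonium_conc(0.60):.1f} ug/mL")
```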

  2. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis and emphysema benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice; an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U, n = 9; 0.5 U, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly, until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes, compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
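
The core of such an aerated-volume quantification is counting lung-mask voxels inside an "aerated" intensity window; a minimal sketch on a synthetic volume (the window and voxel size are illustrative, not the study's calibration):

```python
import numpy as np

def aerated_lung_volume(ct_volume, voxel_volume_mm3, lung_mask,
                        hu_low=-1000, hu_high=-200):
    """Aerated lung volume (mL) from a segmented micro-CT scan.

    Voxels inside the lung mask whose intensity falls in the aerated HU
    window are counted; fibrotic consolidation shifts voxels above the
    window (lower aerated volume), while emphysematous destruction
    enlarges the aerated volume. The HU window is an assumed choice.
    """
    aerated = (ct_volume >= hu_low) & (ct_volume <= hu_high) & lung_mask
    return aerated.sum() * voxel_volume_mm3 / 1000.0   # mm^3 -> mL

# Synthetic 'scan': air-filled lung (~-850 HU) with one consolidated region.
rng = np.random.default_rng(0)
ct = np.full((50, 50, 50), -850.0) + rng.normal(0, 20, (50, 50, 50))
ct[:10] = -50.0                          # dense, non-aerated patch
mask = np.ones(ct.shape, dtype=bool)     # trivial whole-volume lung mask
vol = aerated_lung_volume(ct, voxel_volume_mm3=0.001, lung_mask=mask)
print(f"aerated volume = {vol:.3f} mL")
```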

  3. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximate effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists in deriving and solving the energy quantification condition (EQC); this is a simple real equation, composed of trigonometric and hyperbolic functions, and solving it requires no programming effort or sophisticated machine. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of the GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructures, and to build its subbands point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
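The record does not reproduce the energy quantification condition itself, but for a periodic well/barrier superlattice the standard Kronig-Penney-type condition has the same trigonometric/hyperbolic form the abstract describes. The sketch below is an illustrative stand-in, not the authors' EQC: it uses reduced units with ħ = m = 1, and the well width `w`, barrier width `b`, and barrier height `V0` are made-up parameters rather than the GaAs/Ga{sub 0.5}Al{sub 0.5}As values. It scans the energy axis and keeps the points where the condition admits a real superlattice wave vector, building the subband "point by point".

```python
import math

def kp_condition(E, V0, w, b):
    """Right-hand side of a Kronig-Penney-type energy quantification
    condition for E < V0. An energy E is allowed when |f(E)| <= 1,
    because only then does cos(q*(w+b)) = f(E) have a real solution q."""
    k = math.sqrt(2.0 * E)             # wave number in the well (hbar = m = 1)
    kappa = math.sqrt(2.0 * (V0 - E))  # decay constant in the barrier
    return (math.cos(k * w) * math.cosh(kappa * b)
            + (kappa ** 2 - k ** 2) / (2.0 * k * kappa)
            * math.sin(k * w) * math.sinh(kappa * b))

def minibands(V0, w, b, n=2000):
    """Build the subband point by point: allowed energies in (0, V0)."""
    return [V0 * i / n for i in range(1, n)
            if abs(kp_condition(V0 * i / n, V0, w, b)) <= 1.0]

allowed = minibands(V0=5.0, w=2.0, b=0.3)
```

The gaps between consecutive runs of allowed energies are the minigaps of the superlattice band structure.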

  4. Application of laboratory and portable attenuated total reflectance infrared spectroscopic approaches for rapid quantification of alpaca serum immunoglobulin G

    Science.gov (United States)

    Burns, Jennifer B.; Riley, Christopher B.; Shaw, R. Anthony; McClure, J. Trenton

    2017-01-01

    The objective of this study was to develop and compare the performance of laboratory grade and portable attenuated total reflectance infrared (ATR-IR) spectroscopic approaches, in combination with partial least squares regression (PLSR), for the rapid quantification of alpaca serum IgG concentration and the identification of low IgG concentrations using laboratory grade and portable ATR-IR spectrometers. Various pre-processing strategies were applied to the ATR-IR spectra, which were linked to corresponding RID IgG concentrations and then randomly split into two sets: a calibration (training) set and a test set. PLSR was applied to the calibration set to develop calibration models, and the test set was used to assess the accuracy of the analytical method. For the test set, the Pearson correlation coefficient between the IgG measured by RID and that predicted by both the laboratory grade and portable ATR-IR spectrometers was 0.91. The average differences between reference serum IgG concentrations and the two IR-based methods were 120.5 mg/dL and 71 mg/dL for the laboratory and portable ATR-IR-based assays, respectively. Adopting a low-IgG concentration cut-off, the sensitivity, specificity and accuracy of the portable ATR-IR assay were 95, 99 and 99%, respectively. These results suggest that the two ATR-IR assays performed similarly for rapid qualitative evaluation of alpaca serum IgG and for diagnosis of low IgG; the portable ATR-IR spectrometer performed slightly better and provides more flexibility for potential application in the field. PMID:28651006

  5. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    Science.gov (United States)

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed here, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies for determining PFM strength were described, including digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry, techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by its inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  6. A quick survey of text categorization algorithms

    Directory of Open Access Journals (Sweden)

    Dan MUNTEANU

    2007-12-01

    Full Text Available This paper contains an overview of basic formulations and approaches to text classification. It surveys the algorithms used in text categorization: handcrafted rules, decision trees, decision rules, on-line learning, linear classifiers, Rocchio's algorithm, k Nearest Neighbor (kNN), and Support Vector Machines (SVM).
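As a concrete illustration of one of the surveyed algorithms, here is a minimal k Nearest Neighbor text classifier over bag-of-words vectors with cosine similarity; the toy corpus and its labels are invented for the example, not drawn from the survey.

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words vector as a token -> count mapping."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_classify(train, text, k=3):
    """Majority label among the k training documents most similar to `text`."""
    q = bow(text)
    ranked = sorted(train, key=lambda dl: cosine(q, bow(dl[0])), reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

corpus = [
    ("kick the ball into the goal", "sports"),
    ("the team won the match", "sports"),
    ("score a goal in the match", "sports"),
    ("bake bread in the oven", "cooking"),
    ("mix flour and water to bake", "cooking"),
    ("the oven roasts the chicken", "cooking"),
]
```

For example, `knn_classify(corpus, "bake the bread")` votes among the three nearest toy documents and returns `"cooking"`.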

  7. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  8. A simple fluorescence based assay for quantification of human immunodeficiency virus particle release

    Directory of Open Access Journals (Sweden)

    Heuser Anke-Mareil

    2010-04-01

    Full Text Available Abstract Background The assembly and release of human immunodeficiency virus (HIV) particles from infected cells represent attractive, but not yet exploited, targets for antiretroviral therapy. The availability of simple methods to measure the efficiency of these replication steps in tissue culture would facilitate the identification of host factors essential for these processes as well as the screening for lead compounds acting as specific inhibitors of particle formation. We describe here the development of a rapid cell-based assay for quantification of human immunodeficiency virus type 1 (HIV-1) particle assembly and/or release. Results Using a fluorescently labelled HIV derivative, which carries an eYFP domain within the main viral structural protein Gag in the complete viral protein context, the release of virus-like particles could be monitored by directly measuring the fluorescence intensity of the tissue culture supernatant. Intracellular Gag was quantitated in parallel by direct fluorescence analysis of cell lysates, allowing us to normalize for Gag expression efficiency. The assay was validated by comparison with p24 capsid ELISA measurements, a standard method for quantifying HIV-1 particles. Optimization of conditions allowed the robust detection of particle amounts corresponding to 50 ng p24/ml in medium by fluorescence spectroscopy. Further adaptation to a multi-well format rendered the assay suitable for medium- or high-throughput screening of siRNA libraries to identify host cell factors involved in late stages of HIV replication, as well as for random screening approaches to search for potential inhibitors of HIV-1 assembly or release. Conclusions The fast and simple fluorescence-based quantification of HIV particle release yielded reproducible results which were comparable to the well established ELISA measurements, while in addition allowing the parallel determination of intracellular Gag expression.
The protocols described here

  9. New Detection Modality for Label-Free Quantification of DNA in Biological Samples via Superparamagnetic Bead Aggregation

    Science.gov (United States)

    Leslie, Daniel C.; Li, Jingyi; Strachan, Briony C.; Begley, Matthew R.; Finkler, David; Bazydlo, Lindsay L.; Barker, N. Scott; Haverstick, Doris; Utz, Marcel; Landers, James P.

    2012-01-01

    Combining DNA and superparamagnetic beads in a rotating magnetic field produces multiparticle aggregates that are visually striking, and enables label-free optical detection and quantification of DNA at levels in the picogram per microliter range. DNA in biological samples can be quantified directly by simple analysis of optical images of microfluidic wells placed on a magnetic stirrer without DNA purification. Aggregation results from DNA/bead interactions driven either by the presence of a chaotrope (a nonspecific trigger for aggregation) or by hybridization with oligonucleotides on functionalized beads (sequence-specific). This paper demonstrates quantification of DNA with sensitivity comparable to that of the best currently available fluorometric assays. The robustness and sensitivity of the method enable a wide range of applications, illustrated here by counting eukaryotic cells. Using widely available and inexpensive benchtop hardware, the approach provides a highly accessible low-tech microscale alternative to more expensive DNA detection and cell counting techniques. PMID:22423674

  10. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    Science.gov (United States)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
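Target concentrations in droplet digital PCR are derived from the fraction of negative droplets via Poisson statistics, the published basis of droplet-based readouts. A minimal sketch of the per-target calculation follows; the ~0.85 nL nominal droplet volume is the commonly cited figure for Bio-Rad's QX platforms and should be treated as an assumption here, not a value from this study.

```python
import math

def ddpcr_copies_per_ul(total_droplets, positive_droplets,
                        droplet_volume_ul=0.00085):
    """Poisson-corrected target concentration in copies/uL of reaction.

    lambda = -ln(fraction of negative droplets) is the mean number of
    target copies per droplet; dividing by the droplet volume converts
    it to a concentration.
    """
    negatives = total_droplets - positive_droplets
    if negatives == 0:
        raise ValueError("all droplets positive: concentration is saturated")
    lam = -math.log(negatives / total_droplets)
    return lam / droplet_volume_ul
```

In a multiplex assay the same formula is applied per target once the droplet clusters have been assigned, which is the step the authors' Shiny tool automates for the 4-plex case.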

  11. Quantification bias caused by plasmid DNA conformation in quantitative real-time PCR assay.

    Science.gov (United States)

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has a significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to ensure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification.
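Absolute qPCR quantification rests on the standard curve discussed above: the quantification cycle Cq is regressed against log10 of standard copy number, and amplification efficiency follows from the slope. A minimal sketch with invented, noiseless numbers (the -3.3219 slope corresponds to perfect doubling per cycle):

```python
import math

def fit_standard_curve(copies, cqs):
    """Least-squares fit of Cq = slope * log10(copies) + intercept."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cqs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cqs))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def efficiency(slope):
    """Amplification efficiency; 1.0 means perfect doubling per cycle."""
    return 10.0 ** (-1.0 / slope) - 1.0

def quantify(cq, slope, intercept):
    """Copy number of an unknown sample from its measured Cq."""
    return 10.0 ** ((cq - intercept) / slope)
```

A conformation-dependent shift of the standard curve changes the intercept, so the same measured Cq maps to a different copy number; this is exactly the over- or under-estimation mechanism the study describes.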

  12. High-Performance Thin-Layer Chromatographic Quantification of Rosmarinic Acid and Rutin in Abnormal Savda Munziq

    Directory of Open Access Journals (Sweden)

    S. G. Tian

    2013-01-01

    Full Text Available A high-performance thin-layer chromatographic (HPTLC) method has been established for simultaneous analysis of rosmarinic acid and rutin in Abnormal Savda Munziq (ASMq). A methanol extract of ASMq was used for quantification. The compounds were separated on a silica gel H thin-layer plate with ethyl acetate-formic acid-acetic acid-water 15 : 1 : 1 : 1.5 (v/v) as the developer and trichloroethanol as the color reagent. The plates were scanned at 365 nm. The linear calibration ranges of rosmarinic acid and rutin were 0.0508 to 0.2540 μg (r = 0.9964) and 0.2707 to 1.35354 μg (r = 0.9981), respectively. The recovery rate of rosmarinic acid was 99.17% (RSD = 2.92%) and that of rutin was 95.24% (RSD = 2.38%). The method enables rapid screening and precise, selective, and sensitive quantification for pharmaceutical analysis.

  13. A comparison of the performance of a fundamental parameter method for analysis of total reflection X-ray fluorescence spectra and determination of trace elements, versus an empirical quantification procedure

    Science.gov (United States)

    Węgrzynek, Dariusz; Hołyńska, Barbara; Ostachowicz, Beata

    1998-01-01

    The performance of two different quantification methods, namely the commonly used empirical quantification procedure and a fundamental parameter approach, has been compared for the determination of the mass fractions of elements in particulate-like sample residues on a quartz reflector measured in the total reflection geometry. In the empirical quantification procedure, the spectrometer system needs to be calibrated with samples containing known concentrations of the elements. On the basis of the intensities of the X-ray peaks and the known concentration or mass fraction of an internal standard element, the concentrations or mass fractions of the elements are calculated using the relative sensitivities of the spectrometer system. The fundamental parameter approach does not require any calibration of the spectrometer system. However, in order to account for the unknown mass per unit area of a sample and for sample nonuniformity, an internal standard element is added. The concentrations or mass fractions of the elements to be determined are calculated by fitting a modelled X-ray spectrum to the measured one. The two quantification methods were applied to determine the mass fractions of elements in the cross-sections of a peat core and in biological standard reference materials, and to determine the concentrations of elements in samples prepared from an aqueous multi-element standard solution.
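The empirical procedure described above reduces to a simple internal standard calculation: each analyte intensity is scaled by the spectrometer's relative sensitivity for that element and referenced to the internal standard. A minimal sketch; the element, intensities, and sensitivities below are invented illustration values, not data from the study.

```python
def mass_fraction(intensity, sensitivity,
                  is_intensity, is_sensitivity, is_mass_fraction):
    """Empirical TXRF quantification against an internal standard:

        w_i = (I_i / S_i) / (I_IS / S_IS) * w_IS

    where I are net peak intensities, S relative sensitivities, and
    w_IS the known mass fraction of the added internal standard.
    """
    return ((intensity / sensitivity)
            / (is_intensity / is_sensitivity) * is_mass_fraction)

# Invented example: Ga added as internal standard at 10 mg/kg.
w_fe = mass_fraction(intensity=5000.0, sensitivity=1.25,
                     is_intensity=2000.0, is_sensitivity=1.0,
                     is_mass_fraction=10.0)
```

Because both analyte and internal standard sit in the same residue, the unknown sample mass per unit area cancels out of the ratio, which is why the internal standard is added in both quantification methods.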

  14. Quantification of osteolytic bone lesions in a preclinical rat trial

    Science.gov (United States)

    Fränzle, Andrea; Bretschi, Maren; Bäuerle, Tobias; Giske, Kristina; Hillengass, Jens; Bendl, Rolf

    2013-10-01

    In breast cancer, most patients who die have developed bone metastases during disease progression. Bone metastases in breast cancer are mainly bone-destructive (osteolytic). To understand the pathogenesis and to analyse response to different treatments, animal models, in our case rats, are examined. For assessment of treatment response to bone remodelling therapies, exact segmentations of osteolytic lesions are needed. Manual segmentations are not only time-consuming but also lack reproducibility. Computerized segmentation tools are therefore essential. In this paper we present an approach for the computerized quantification of osteolytic lesion volumes using a comparison to a healthy reference model. The presented qualitative and quantitative evaluation of the reconstructed bone volumes shows that the automatically segmented lesion volumes complete missing bone in a reasonable way.

  15. Uncertainty Quantification of Fork Detector Measurements from Spent Fuel Loading Campaigns

    International Nuclear Information System (INIS)

    Vaccaro, S.; De Baere, P.; Schwalbach, P.; Gauld, I.; Hu, J.

    2015-01-01

    With increasing activities at the end of the fuel cycle, the requirements for the verification of spent nuclear fuel for safeguards purposes are continuously growing. In the European Union we are experiencing a dramatic increase in the number of cask loadings for interim dry storage. This is caused by the progressive shut-down of reactors, related to facility ageing but also to politically motivated phase-out of nuclear power. On the other hand, there are advanced plans for the construction of encapsulation plants and geological repositories. The cask loading or the encapsulation process will provide the last occasion to verify the spent fuel assemblies. In this context, Euratom and the US DOE have carried out a critical review of the widely used Fork measurement method for irradiated assemblies. The Nuclear Safeguards directorates of the European Commission's Directorate-General for Energy and Oak Ridge National Laboratory have collaborated to improve the Fork data evaluation process and simplify its use for inspection applications. Within the Commission's standard data evaluation package CRISP, we included a SCALE/ORIGEN-based irradiation and depletion simulation of the measured assembly and modelled the Fork transfer function to calculate expected count rates based on the operator's declarations. The complete acquisition and evaluation process has been automated to compare expected (calculated) with measured count rates. This approach allows a physics-based improvement of the data review and evaluation process. At the same time, the new method provides the means for better measurement uncertainty quantification. The present paper will address the implications of the combined approach involving measured and simulated data for the quantification of measurement uncertainty, and the consequences of these uncertainties for the possible use of the Fork detector as a partial defect detection method. (author)

  16. Identification and Quantification of Carbonate Species Using Rock-Eval Pyrolysis

    Directory of Open Access Journals (Sweden)

    Pillot D.

    2013-03-01

    Full Text Available This paper presents a new, reliable and rapid method to characterise and quantify carbonates in solid samples, based on monitoring the CO2 flux emitted by the progressive thermal decomposition of carbonates during programmed heating. The different decomposition peaks allow determination of the different types of carbonates present in the analysed sample. The quantification of each peak gives the respective proportions of these different types of carbonates in the sample. In addition to the chosen procedure presented in this paper, which uses a standard Rock-Eval 6 pyrolyser, characteristic calibration profiles are also presented for the most common carbonates in nature. This method should allow different types of application in different disciplines, both academic and industrial.

  17. Systematic characterizations of text similarity in full text biomedical publications.

    Science.gov (United States)

    Sun, Zhaohui; Errami, Mounir; Long, Tara; Renard, Chris; Choradia, Nishant; Garner, Harold

    2010-09-15

    Computational methods have been used to find duplicate biomedical publications in MEDLINE. Full text articles are becoming increasingly available, yet the similarities among them have not been systematically studied. Here, we quantitatively investigated the full text similarity of biomedical publications in PubMed Central. 72,011 full text articles from PubMed Central (PMC) were parsed to generate three different datasets: full texts, sections, and paragraphs. Text similarity comparisons were performed on these datasets using the text similarity algorithm eTBLAST. We measured the frequency of similar text pairs and compared it among different datasets. We found that high abstract similarity can be used to predict high full text similarity with a specificity of 20.1% (95% CI [17.3%, 23.1%]) and sensitivity of 99.999%. Abstract similarity and full text similarity have a moderate correlation (Pearson correlation coefficient: -0.423) when the similarity ratio is above 0.4. Among pairs of articles in PMC, method sections are found to be the most repetitive (frequency of similar pairs, methods: 0.029, introduction: 0.0076, results: 0.0043). In contrast, among a set of manually verified duplicate articles, results are the most repetitive sections (frequency of similar pairs, results: 0.94, methods: 0.89, introduction: 0.82). Repetition of introduction and methods sections is more likely to be committed by the same authors (odds of a highly similar pair having at least one shared author, introduction: 2.31, methods: 1.83, results: 1.03). There is also significantly more similarity in pairs of review articles than in pairs containing one review and one nonreview paper (frequency of similar pairs: 0.0167 and 0.0023, respectively). While quantifying abstract similarity is an effective approach for finding duplicate citations, a comprehensive full text analysis is necessary to uncover all potential duplicate citations in the scientific literature and is helpful when
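eTBLAST itself is a dedicated text-similarity service, but the kind of pairwise similarity ratio such studies compute can be approximated with Jaccard similarity over word shingles. This is a stand-in sketch, not the algorithm used in the study:

```python
def shingles(text, n=3):
    """Set of overlapping word n-grams ('shingles') of a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def similarity_ratio(a, b, n=3):
    """Jaccard similarity of shingle sets: 1.0 for identical texts,
    0.0 for texts sharing no word n-gram."""
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb)
```

Applied section by section (abstract vs. abstract, methods vs. methods), a ratio like this yields exactly the kind of per-section repetition frequencies the abstract reports.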

  18. Difficulties in translation of socio-political texts

    Directory of Open Access Journals (Sweden)

    Артур Нарманович Мамедов

    2013-12-01

    Full Text Available Since Russian socio-political texts belong to the publicistic style, a functional approach should guide the search for the most adequate linguistic means of conveying the pragmatic meaning of the source text. Intralinguistic meaning can be only partially preserved in the translation of German texts. Lexical and grammatical transformations help preserve the semantic-syntactic structure of the target text, so that the translation achieves the same communicative effect as the source text.

  19. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code physical models, i.e. the input information needed by the code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantage of yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
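The core ingredient above, a non-parametric estimate of the model-error pdf from separate-effect test samples, can be sketched with a Gaussian kernel density estimator. This is a generic KDE with Silverman's rule-of-thumb bandwidth, not the authors' estimator, and the error samples are invented:

```python
import math

def kde(samples, bandwidth=None):
    """Gaussian kernel density estimate of a 1-D error distribution."""
    n = len(samples)
    if bandwidth is None:
        mean = sum(samples) / n
        std = math.sqrt(sum((x - mean) ** 2 for x in samples) / n)
        bandwidth = 1.06 * std * n ** -0.2  # Silverman's rule of thumb
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))

    def pdf(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return pdf

# Invented void-fraction "errors" (code prediction minus experiment):
errors = [-0.04, -0.02, -0.01, 0.0, 0.01, 0.01, 0.03]
error_pdf = kde(errors)
```

In the methodology described above, one such pdf would be estimated per cluster of system conditions, since the clustering step establishes that the error distribution is condition-dependent.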

  20. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2014-04-30

    This report describes research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  1. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Braatz, Brett G.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2013-09-01

    This report describes the status of ongoing research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  2. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)

    2017-03-15

    The purpose of this research is to introduce the technical standard for accident sequence precursor (ASP) analysis and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the identification of high-risk/low-frequency accident scenarios among all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state of the plant and the actions of the operator in an accident situation for risk quantification. This approach holds significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, comparatively static PSA model (the so-called static PSA, S-PSA). We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.

  3. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2017-01-01

    The purpose of this research is to introduce the technical standard for accident sequence precursor (ASP) analysis and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the identification of high-risk/low-frequency accident scenarios among all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state of the plant and the actions of the operator in an accident situation for risk quantification. This approach holds significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, comparatively static PSA model (the so-called static PSA, S-PSA). We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.

  4. Optimization of SPECT calibration for quantification of images applied to dosimetry with iodine-131

    International Nuclear Information System (INIS)

    Carvalho, Samira Marques de

    2018-01-01

    SPECT system calibration plays an essential role in the accuracy of image quantification. In the first stage of this work, an optimized SPECT calibration method was proposed for 131I studies, considering the partial volume effect (PVE) and the position of the calibration source. In the second stage, the study aimed to investigate the impact of count density and reconstruction parameters on the determination of the calibration factor and on image quantification in dosimetry studies, considering the reality of clinical practice in Brazil. In the final stage, the study aimed to evaluate the influence of several factors on the calibration for absorbed dose calculation, using Monte Carlo (MC) simulations with the GATE code. Calibration was performed by determining a calibration curve (sensitivity versus volume) obtained by applying different thresholds; the calibration factors were then determined with an exponential function adjustment. Images were acquired with high and low count densities for several source positions within the phantom. To validate the calibration method, the calibration factors were used for absolute quantification of the total reference activities. The images were reconstructed using two sets of parameters commonly used for patient images. The methodology developed for the calibration of the tomographic system was easier and faster to implement than other procedures suggested to improve the accuracy of the results. The study also revealed the influence of the location of the calibration source, demonstrating better precision in the absolute quantification when the location of the target region is considered during the calibration of the system. The study, applied to the Brazilian thyroid protocol, suggests revision of the calibration of the SPECT system, including different positions for the reference source, as well as acquisitions considering the signal-to-noise ratio (SNR) of the images.
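The calibration curve described above (sensitivity versus volume, adjusted with an exponential function to capture the partial volume effect) can be sketched as a one-parameter grid search with a closed-form amplitude, since the abstract gives no details of the authors' fitting procedure. The functional form S(V) = A·(1 − exp(−B·V)) and the volume/sensitivity numbers below are assumptions made for illustration:

```python
import math

def fit_sensitivity_curve(volumes, sensitivities):
    """Fit S(V) = A * (1 - exp(-B * V)) by grid search over B, with the
    least-squares optimal A computed in closed form for each candidate B."""
    best = None
    for i in range(1, 301):
        b = i * 0.01                      # candidate decay constants
        f = [1.0 - math.exp(-b * v) for v in volumes]
        a = (sum(s * fi for s, fi in zip(sensitivities, f))
             / sum(fi * fi for fi in f))  # optimal amplitude for this B
        sse = sum((s - a * fi) ** 2 for s, fi in zip(sensitivities, f))
        if best is None or sse < best[0]:
            best = (sse, a, b)
    _, a, b = best
    return a, b

# Invented noiseless data generated with A = 10, B = 0.5:
vols = [1, 2, 3, 4, 5, 6, 8, 10]
sens = [10.0 * (1.0 - math.exp(-0.5 * v)) for v in vols]
A, B = fit_sensitivity_curve(vols, sens)
```

In this sketch the fitted A plays the role of a volume-independent calibration factor (counts per unit activity), while the exponential term corrects small-volume sensitivity losses due to the PVE.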
Finally, the doses obtained with the

  5. Empirical Studies On Machine Learning Based Text Classification Algorithms

    OpenAIRE

    Shweta C. Dharmadhikari; Maya Ingle; Parag Kulkarni

    2011-01-01

    Automatic classification of text documents has become an important research issue nowadays. Proper classification of text documents requires information retrieval, machine learning and natural language processing (NLP) techniques. Our aim is to focus on important approaches to automatic text classification based on machine learning techniques, viz. supervised, unsupervised and semi-supervised. In this paper we present a review of various text classification approaches under the machine learning paradig...

  6. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    Science.gov (United States)

    Chen, Wei-Shing

    2011-04-01

    The aim of the article is to answer the question of whether the dynamics of the Taiwan unemployment rate are generated by a non-linear deterministic process. This paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns in the unemployment rate of Taiwan. The case study uses time series data on Taiwan's unemployment rate during the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
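    The recurrence techniques referred to above start from a binary recurrence matrix built on a time-delay embedding of the series. A minimal sketch follows; the embedding dimension, delay and threshold are illustrative choices, not those used in the study, and the synthetic series stands in for the unemployment data.

```python
# Sketch of a recurrence plot and the simplest RQA measure (recurrence
# rate) for a univariate time series. Parameters are illustrative.
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=0.1):
    """Binary matrix R[i, j] = 1 if embedded states i and j are within eps."""
    n = len(x) - (dim - 1) * delay
    # Time-delay embedding: each row is one reconstructed state vector.
    states = np.array([x[i:i + dim * delay:delay] for i in range(n)])
    # Pairwise Euclidean distances between state vectors.
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    return (dists <= eps).astype(int)

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.05 * rng.standard_normal(200)
R = recurrence_matrix(series, dim=3, delay=2, eps=0.3)
recurrence_rate = R.sum() / R.size  # fraction of recurrent point pairs
print(f"recurrence rate: {recurrence_rate:.3f}")
```

    A deterministic process shows up as diagonal line structures in R; further RQA measures (determinism, laminarity) are statistics over those lines.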

  7. Learning From Short Text Streams With Topic Drifts.

    Science.gov (United States)

    Li, Peipei; He, Lu; Wang, Haiyan; Hu, Xuegang; Zhang, Yuhong; Li, Lei; Wu, Xindong

    2017-09-18

    Short text streams such as search snippets and microblogs have become popular on the Web with the emergence of social media. Unlike traditional normal text streams, these data present the characteristics of short length, weak signal, high volume, high velocity, topic drift, etc. Short text stream classification is hence a very challenging and significant task. However, this challenge has received little attention from the research community. Therefore, a new feature extension approach is proposed for short text stream classification with the help of a large-scale semantic network obtained from a Web corpus. It is built on an incremental ensemble classification model for efficiency. First, more semantic contexts based on the senses of terms in short texts are introduced to make up for the data sparsity using the open semantic network, in which all terms are disambiguated by their semantics to reduce the noise impact. Second, a concept cluster-based topic drift detection method is proposed to effectively track hidden topic drifts. Finally, extensive studies demonstrate that, compared to several well-known concept drift detection methods for data streams, our approach can detect topic drifts effectively, and it handles short text streams effectively while maintaining efficiency comparable to several state-of-the-art short text classification approaches.

  8. Quantification of fructo-oligosaccharides based on the evaluation of oligomer ratios using an artificial neural network

    Energy Technology Data Exchange (ETDEWEB)

    Onofrejova, Lucia; Farkova, Marta [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Preisler, Jan, E-mail: preisler@chemi.muni.cz [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic)

    2009-04-13

    The application of an internal standard in quantitative analysis is desirable in order to correct for variations in sample preparation and instrumental response. In mass spectrometry of organic compounds, the internal standard is preferably labelled with a stable isotope, such as ¹⁸O, ¹⁵N or ¹³C. In this study, a method for the quantification of fructo-oligosaccharides using matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI TOF MS) was proposed and tested on raftilose, a partially hydrolysed inulin with a degree of polymerisation of 2-7. The tetrasaccharide nystose, which is chemically identical to the raftilose tetramer, was used as an internal standard rather than an isotope-labelled analyte. Two mathematical approaches used for data processing, conventional calculations and artificial neural networks (ANN), were compared. The conventional data processing relies on the assumption that a constant oligomer dispersion profile will change after the addition of the internal standard, followed by some simple numerical calculations. On the other hand, ANN was found to compensate for a non-linear MALDI response and variations in the oligomer dispersion profile with raftilose concentration. As a result, the application of ANN led to lower quantification errors and excellent day-to-day repeatability compared to the conventional data analysis. The developed method is feasible for MS quantification of raftilose in the range of 10-750 pg with errors below 7%. The content of raftilose was determined in dietary cream; the application can be extended to other similar polymers. It should be stressed that no special optimisation of the MALDI process was carried out. A common MALDI matrix and sample preparation were used and only the basic parameters, such as sampling and laser energy, were optimised prior to quantification.
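    The "conventional calculations" mentioned above reduce to classical internal-standard calibration: the analyte amount is read off a response ratio (analyte signal over internal-standard signal) against a calibration line. A minimal sketch with illustrative numbers, not data from the study:

```python
# Classical internal-standard quantification: fit a linear calibration of
# response ratio vs. known analyte amount, then invert it for unknowns.
# All numbers below are illustrative.
import numpy as np

# Calibration: known analyte amounts (pg) with a fixed spike of internal standard.
amounts = np.array([10, 50, 100, 250, 500, 750], dtype=float)
ratios = np.array([0.05, 0.26, 0.49, 1.27, 2.48, 3.80])  # analyte/IS peak areas

slope, intercept = np.polyfit(amounts, ratios, 1)  # linear response assumed

def quantify(ratio):
    """Invert the calibration line to estimate the analyte amount (pg)."""
    return (ratio - intercept) / slope

print(f"estimated amount at ratio 1.0: {quantify(1.0):.1f} pg")
```

    The ANN approach replaces this linear inversion with a learned, non-linear mapping from the whole oligomer intensity profile to concentration.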

  9. Comparison of viable plate count, turbidity measurement and real-time PCR for quantification of Porphyromonas gingivalis.

    Science.gov (United States)

    Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P

    2015-01-01

    The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.
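    For context, TaqMan-based absolute quantification typically rests on a standard curve in which the quantification cycle (Cq) is linear in log10 of the starting copy number, with amplification efficiency derived from the slope. A hedged sketch with illustrative values, not the assay's actual calibration:

```python
# qPCR standard-curve quantification: Cq is linear in log10(copies);
# efficiency = 10^(-1/slope) - 1. Values below are illustrative.
import numpy as np

log_copies = np.array([3, 4, 5, 6, 7], dtype=float)  # log10 copies per reaction
cq = np.array([31.2, 27.9, 24.5, 21.1, 17.8])        # measured Cq values

slope, intercept = np.polyfit(log_copies, cq, 1)
efficiency = 10 ** (-1 / slope) - 1                  # ~1.0 means 100% efficient

def copies_from_cq(c):
    """Interpolate the standard curve to estimate starting copy number."""
    return 10 ** ((c - intercept) / slope)

print(f"amplification efficiency: {efficiency:.2%}")
print(f"sample at Cq 25.0: {copies_from_cq(25.0):.3g} copies")
```

    A slope near -3.32 corresponds to 100% efficiency (a perfect doubling per cycle), which is one reason qPCR outperforms VPC in repeatability for slow-growing anaerobes.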

  10. [11C]Harmine Binding to Brain Monoamine Oxidase A: Test-Retest Properties and Noninvasive Quantification.

    Science.gov (United States)

    Zanderigo, Francesca; D'Agostino, Alexandra E; Joshi, Nandita; Schain, Martin; Kumar, Dileep; Parsey, Ramin V; DeLorenzo, Christine; Mann, J John

    2018-02-08

    Inhibition of the isoform A of monoamine oxidase (MAO-A), a mitochondrial enzyme catalyzing deamination of monoamine neurotransmitters, is useful in treatment of depression and anxiety disorders. [11C]harmine, a MAO-A PET radioligand, has been used to study mood disorders and antidepressant treatment. However, [11C]harmine binding test-retest characteristics have to date only been partially investigated. Furthermore, since MAO-A is ubiquitously expressed, no reference region is available, thus requiring arterial blood sampling during PET scanning. Here, we investigate [11C]harmine binding measurements test-retest properties; assess effects of using a minimally invasive input function estimation on binding quantification and repeatability; and explore binding potentials estimation using a reference region-free approach. Quantification of [11C]harmine distribution volume (VT) via kinetic models and graphical analyses was compared based on absolute test-retest percent difference (TRPD), intraclass correlation coefficient (ICC), and identifiability. The optimal procedure was also used with a simultaneously estimated input function in place of the measured curve. Lastly, an approach for binding potentials quantification in absence of a reference region was evaluated. [11C]harmine VT estimates quantified using arterial blood and kinetic modeling showed average absolute TRPD values of 7.7 to 15.6%, and ICC values between 0.56 and 0.86, across brain regions. Using simultaneous estimation (SIME) of the input function resulted in VT estimates close to those obtained using the arterial input function (r = 0.951, slope = 1.073, intercept = -1.037), with numerically but not statistically higher test-retest differences (range 16.6 to 22.0%), but with overall poor ICC values, between 0.30 and 0.57. Prospective studies using [11C]harmine are possible given its test-retest repeatability when binding is quantified using arterial blood. Results with SIME of

  11. Quantification of the recovered oil and water fractions during water flooding laboratory experiments

    DEFF Research Database (Denmark)

    Katika, Konstantina; Halim, Amalia Yunita; Shapiro, Alexander

    2015-01-01

    the volume might be less than a few microliters. In this study, we approach the determination of the oil volumes in flooding effluents using predetermined amounts of the North Sea oil with synthetic seawater. The UV/visible spectroscopy method and low-field NMR spectrometry are compared for this determination, and an account of advantages and disadvantages of each method is given. Both methods are reproducible with high accuracy. The NMR method was capable of direct quantification of both oil and water fractions, while UV/visible spectroscopy quantifies only the oil fraction using a standard curve.

  12. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks, and incorporating performance shaping factor multipliers upon those nominal error rates.
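    The SPAR-H calculation pattern described above can be sketched in a few lines: a nominal HEP for the task type is multiplied by the performance shaping factor multipliers. The nominal values below follow the method's published defaults (0.01 for diagnosis, 0.001 for action); the multipliers and the simple cap at 1.0 are illustrative simplifications, not the full method (which also applies an adjustment formula when three or more negative PSFs are present).

```python
# Simplified SPAR-H-style HEP calculation: nominal error rate for the
# task type times the product of PSF multipliers, capped at 1.0.
def spar_h_hep(task_type, psf_multipliers):
    nominal = {"diagnosis": 0.01, "action": 0.001}[task_type]
    hep = nominal
    for m in psf_multipliers:
        hep *= m
    # The full method uses a beta-like adjustment for >= 3 negative PSFs;
    # this sketch simply caps the product at 1.0.
    return min(hep, 1.0)

# Example: diagnosis task under high stress (x2) and poor ergonomics (x10).
print(spar_h_hep("diagnosis", [2, 10]))  # 0.01 * 2 * 10 = 0.2
```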

  13. multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows

    Science.gov (United States)

    Turnquist, Brian; Owkes, Mark

    2017-11-01

    Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability about inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them unrealistic. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity, but making UQ cost effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.
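    The intrusive approach rests on a polynomial chaos expansion: a stochastic variable u(ξ) is expanded in orthogonal polynomials of a standard Gaussian ξ, and its mean and variance follow directly from the expansion coefficients. The sketch below computes the coefficients by projection for brevity; an intrusive solver such as the one described would instead evolve the coefficients inside the governing equations. The test function exp(ξ) is illustrative.

```python
# Polynomial chaos expansion in probabilists' Hermite polynomials He_k:
# u(xi) ~ sum_k c_k He_k(xi), mean = c_0, variance = sum_{k>=1} k! c_k^2.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_coefficients(f, order):
    """Project f(xi), xi ~ N(0,1), onto He_0..He_order by quadrature."""
    nodes, weights = hermegauss(order + 8)   # Gauss-Hermite (He weight) rule
    weights = weights / np.sqrt(2 * np.pi)   # normalise to the N(0,1) measure
    coeffs = []
    for k in range(order + 1):
        basis = hermeval(nodes, [0] * k + [1])   # He_k at quadrature nodes
        # E[He_k^2] = k!, so divide the projection by k!.
        coeffs.append(np.sum(weights * f(nodes) * basis) / math.factorial(k))
    return np.array(coeffs)

# Example: u = exp(xi); the exact mean is e^{1/2}, variance e^2 - e.
c = pce_coefficients(np.exp, order=8)
mean = c[0]
variance = sum(math.factorial(k) * c[k] ** 2 for k in range(1, len(c)))
print(f"PCE mean {mean:.4f} (exact {math.e ** 0.5:.4f})")
```

    With moments available in closed form from the coefficients, no Monte Carlo sampling loop is needed, which is the cost advantage the abstract points to.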

  14. Quantification of endogenous metabolites by the postcolumn infused-internal standard method combined with matrix normalization factor in liquid chromatography-electrospray ionization tandem mass spectrometry.

    Science.gov (United States)

    Liao, Hsiao-Wei; Chen, Guan-Yuan; Wu, Ming-Shiang; Liao, Wei-Chih; Tsai, I-Lin; Kuo, Ching-Hua

    2015-01-02

    Quantification of endogenous metabolites has enabled the discovery of biomarkers for diagnosis and provided for an understanding of disease etiology. The standard addition and stable isotope labeled-internal standard (SIL-IS) methods are currently the most widely used approaches to quantifying endogenous metabolites, but both have some limitations for clinical measurement. In this study, we developed a new approach for endogenous metabolite quantification by the postcolumn infused-internal standard (PCI-IS) method combined with the matrix normalization factor (MNF) method. MNF was used to correct the difference in MEs between standard solution and biofluids, and PCI-IS additionally tailored the correction of the MEs for individual samples. Androstenedione and testosterone were selected as test articles to verify this new approach to quantifying metabolites in plasma. The repeatability (n=4 runs) and intermediate precision (n=3 days) in terms of the peak area of androstenedione and testosterone at all tested concentrations were all less than 11% relative standard deviation (RSD). The accuracy test revealed that the recoveries were between 95.72% and 113.46%. The concentrations of androstenedione and testosterone in fifty plasma samples obtained from healthy volunteers were quantified by the PCI-IS combined with the MNF method, and the quantification results were compared with the results of the SIL-IS method. The Pearson correlation test showed that the correlation coefficient was 0.98 for both androstenedione and testosterone. We demonstrated that the PCI-IS combined with the MNF method is an effective and accurate method for quantifying endogenous metabolites. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. On ISSM and leveraging the Cloud towards faster quantification of the uncertainty in ice-sheet mass balance projections

    Science.gov (United States)

    Larour, E.; Schlegel, N.

    2016-11-01

    With the Amazon EC2 Cloud becoming available as a viable platform for parallel computing, Earth System Models are increasingly interested in leveraging its capabilities towards improving climate projections. In particular, faced with long wait periods on high-end clusters, the elasticity of the Cloud presents a unique opportunity of potentially "infinite" availability of small-sized clusters running on high-performance instances. Among specific applications of this new paradigm, we show here how uncertainty quantification in climate projections of polar ice sheets (Antarctica and Greenland) can be significantly accelerated using the Cloud. Indeed, small-sized clusters are very efficient at delivering sensitivity and sampling analyses, core tools of uncertainty quantification. We demonstrate how this approach was used to carry out an extensive analysis of ice-flow projections on one of the largest basins in Greenland, the North-East Greenland Glacier, using the Ice Sheet System Model, the public-domain NASA-funded ice-flow modeling software. We show how errors in the projections were accurately quantified using Monte-Carlo sampling analysis on the EC2 Cloud, and how a judicious mix of high-end parallel computing and Cloud use can best leverage existing infrastructures and significantly accelerate delivery of potentially ground-breaking climate projections, and in particular enable uncertainty quantification that was previously impossible to achieve.

  16. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count is coupled with the spectral count to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with better dynamic range.
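    The spectral-count strategy described above is in the spirit of the normalised spectral abundance factor (NSAF): counts normalised by protein length and by the whole run. freeQuant's actual algorithm additionally handles shared peptides and ion intensities, which this illustrative sketch omits.

```python
# NSAF-style label-free quantification: spectral count divided by protein
# length, normalised so the abundances of one run sum to 1. Illustrative
# sketch only; shared-peptide and ion-intensity handling are omitted.
def nsaf(spectral_counts, lengths):
    """Normalised spectral abundance factor for each protein in a run."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Example: three proteins with counts 50, 10, 40 and lengths 500, 100, 400.
abundances = nsaf([50, 10, 40], [500, 100, 400])
print([round(a, 3) for a in abundances])  # abundances sum to 1.0
```

    The length normalisation matters because a longer protein yields more identifiable peptides, and hence more spectra, at equal abundance.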

  17. Word2vec and dictionary based approach for uyghur text filtering

    Science.gov (United States)

    Tohti, Turdi; Zhao, Yunxing; Musajan, Winira

    2017-08-01

    With the emergence of deep learning, the representation of words in computers has made major breakthroughs, and the performance of text processing based on word vectors has also improved significantly. This paper first maps all patterns into a more abstract vector space using a Uyghur-Chinese dictionary and the deep learning tool Word2vec. Secondly, similar patterns are found according to the characteristics of the original patterns. Finally, texts are filtered using the Wu-Manber algorithm. Experiments show that this method clearly improves the filtering accuracy and recall of Uyghur text.

  18. Secreted cerberus1 as a marker for quantification of definitive endoderm differentiation of the pluripotent stem cells.

    Directory of Open Access Journals (Sweden)

    Hidefumi Iwashita

    Full Text Available To date, CXCR4 and E-cadherin double-positive cells detected by flow cytometry have been used to identify the differentiation of embryonic stem (ES) cells or induced pluripotent stem (iPS) cells into definitive endoderm (DE) lineages. Quantification of DE differentiation from ES/iPS cells by using flow cytometry is a multi-step procedure including dissociation of the cells, antibody reaction, and flow cytometry analysis. To establish a quick assay method for quantification of ES/iPS cell differentiation into the DE without dissociating the cells, we examined whether the secreted Cerberus1 (Cer1) protein could be used as a marker. Cer1 is a secreted protein expressed first in the anterior visceral endoderm and then in the DE. The amount of Cer1 secreted correlated with the proportion of CXCR4+/E-Cadherin+ cells that differentiated from mouse ES cells. In addition, we found that human iPS cell-derived DE also expressed secreted CER1 and that the expression level correlated with the proportion of SOX17+/FOXA2+ cells present. Taken together, these results show that Cer1 (or CER1) serves as a good marker for quantification of DE differentiation of mouse and human ES/iPS cells.

  19. Calibration of a Sensor Array (an Electronic Tongue) for Identification and Quantification of Odorants from Livestock Buildings

    Directory of Open Access Journals (Sweden)

    Jens Jørgen Lønsmann Iversen

    2007-01-01

    Full Text Available This contribution serves a dual purpose. The first purpose was to investigate the possibility of using a sensor array (an electronic tongue) for on-line identification and quantification of key odorants representing a variety of chemical groups at two different acidities, pH 6 and 8. The second purpose was to simplify the electronic tongue by decreasing the number of electrodes from 14, which was the number of electrodes in the prototype. Different electrodes were used for identification and quantification of different key odorants. A total of eight electrodes were sufficient for identification and quantification, in micromolar concentrations, of the key odorants n-butyrate, ammonium and phenolate in test mixtures also containing iso-valerate, skatole and p-cresolate. The limited number of electrodes decreased the standard deviation and the relative standard deviation of triplicate measurements in comparison with the array comprising 14 electrodes. The electronic tongue was calibrated using 4 different test mixtures, each comprising 50 different combinations of key odorants in triplicate, a total of 600 measurements. Back-propagation artificial neural networks, partial least squares and principal component analysis were used in the data analysis. The results indicate that the electronic tongue has promising potential as an on-line sensor for odorants absorbed in the bioscrubber used in livestock buildings.

  20. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantification and updating of the APS Fire (it also covers floods and earthquakes). With the application, fire scenarios are quantified in the plant, integrating the tasks performed during the APS Fire. This paper describes the main features of the program that allow quantification of an APS Fire. (Author)

  1. Quantification of Eosinophilic Granule Protein Deposition in Biopsies of Inflammatory Skin Diseases by Automated Image Analysis of Highly Sensitive Immunostaining

    Directory of Open Access Journals (Sweden)

    Peter Kiehl

    1999-01-01

    Full Text Available Eosinophilic granulocytes are major effector cells in inflammation. Extracellular deposition of toxic eosinophilic granule proteins (EGPs, but not the presence of intact eosinophils, is crucial for their functional effect in situ. As even recent morphometric approaches to quantify the involvement of eosinophils in inflammation have been only based on cell counting, we developed a new method for the cell‐independent quantification of EGPs by image analysis of immunostaining. Highly sensitive, automated immunohistochemistry was done on paraffin sections of inflammatory skin diseases with 4 different primary antibodies against EGPs. Image analysis of immunostaining was performed by colour translation, linear combination and automated thresholding. Using strictly standardized protocols, the assay was proven to be specific and accurate concerning segmentation in 8916 fields of 520 sections, well reproducible in repeated measurements and reliable over 16 weeks observation time. The method may be valuable for the cell‐independent segmentation of immunostaining in other applications as well.

  2. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients.

  3. Nanodiamond arrays on glass for quantification and fluorescence characterisation.

    Science.gov (United States)

    Heffernan, Ashleigh H; Greentree, Andrew D; Gibson, Brant C

    2017-08-23

    Quantifying the variation in emission properties of fluorescent nanodiamonds is important for developing their wide-ranging applicability. Directed self-assembly techniques show promise for positioning nanodiamonds precisely enabling such quantification. Here we show an approach for depositing nanodiamonds in pre-determined arrays which are used to gather statistical information about fluorescent lifetimes. The arrays were created via a layer of photoresist patterned with grids of apertures using electron beam lithography and then drop-cast with nanodiamonds. Electron microscopy revealed a 90% average deposition yield across 3,376 populated array sites, with an average of 20 nanodiamonds per site. Confocal microscopy, optimised for nitrogen vacancy fluorescence collection, revealed a broad distribution of fluorescent lifetimes in agreement with literature. This method for statistically quantifying fluorescent nanoparticles provides a step towards fabrication of hybrid photonic devices for applications from quantum cryptography to sensing.

  4. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations

    Energy Technology Data Exchange (ETDEWEB)

    Schryvers, D., E-mail: nick.schryvers@uantwerpen.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Salje, E.K.H. [Department of Earth Sciences, University of Cambridge, Cambridge CB2 3EQ (United Kingdom); Nishida, M. [Department of Engineering Sciences for Electronics and Materials, Faculty of Engineering Sciences, Kyushu University, Kasuga, Fukuoka 816-8580 (Japan); De Backer, A. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Idrissi, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université Catholique de Louvain, Place Sainte Barbe, 2, B-1348, Louvain-la-Neuve (Belgium); Van Aert, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2017-05-15

    The present contribution reviews recent quantification work on atom displacements, atom site occupations and levels of crystallinity in various systems, based on aberration-corrected HR(S)TEM images. Depending on the case studied, picometer-range precision for individual distances can be obtained, boundary widths determined at the unit cell level, or statistical evolutions of the fractions of ordered areas calculated. In all of these cases, these quantitative measures imply new routes for the applications of the respective materials. - Highlights: • Quantification of picometer displacements at a ferroelastic twin boundary in CaTiO₃. • Quantification of kinks in a meandering ferroelectric domain wall in LiNbO₃. • Quantification of column occupation at an anti-phase boundary in Co-Pt. • Quantification of atom displacements at a twin boundary in Ni-Ti B19′ martensite.

  5. Text Manipulation Techniques and Foreign Language Composition.

    Science.gov (United States)

    Walker, Ronald W.

    1982-01-01

    Discusses an approach to teaching second language composition which emphasizes (1) careful analysis of model texts from a limited, but well-defined perspective and (2) the application of text manipulation techniques developed by the word processing industry to student compositions. (EKN)

  6. Relative quantification of protein-protein interactions using a dual luciferase reporter pull-down assay system.

    Directory of Open Access Journals (Sweden)

    Shuaizheng Jia

    Full Text Available The identification and quantitative analysis of protein-protein interactions are essential to the functional characterization of proteins in the post-proteomics era. The methods currently available are generally time-consuming, technically complicated, insensitive and/or semi-quantitative. The lack of simple, sensitive approaches to precisely quantify protein-protein interactions still prevents our understanding of the functions of many proteins. Here, we develop a novel dual luciferase reporter pull-down assay by combining a biotinylated Firefly luciferase pull-down assay with a dual luciferase reporter assay. The biotinylated Firefly luciferase-tagged protein enables rapid and efficient isolation of a putative Renilla luciferase-tagged binding protein from a relatively small amount of sample. Both of these proteins can be quantitatively detected using the dual luciferase reporter assay system. Protein-protein interactions, including Fos-Jun located in the nucleus; MAVS-TRAF3 in cytoplasm; inducible IRF3 dimerization; viral protein-regulated interactions, such as MAVS-MAVS and MAVS-TRAF3; IRF3 dimerization; and protein interaction domain mapping, are studied using this novel assay system. Herein, we demonstrate that this dual luciferase reporter pull-down assay enables the quantification of the relative amounts of interacting proteins that bind to streptavidin-coupled beads for protein purification. This study provides a simple, rapid, sensitive, and efficient approach to identify and quantify relative protein-protein interactions. Importantly, the dual luciferase reporter pull-down method will facilitate the functional determination of proteins.

  7. High-speed broadband nanomechanical property quantification and imaging of life science materials using atomic force microscope

    Science.gov (United States)

    Ren, Juan

    Nanoscale morphological characterization and mechanical properties quantification of soft and biological materials play an important role in areas ranging from nano-composite material synthesis and characterization, cellular mechanics to drug design. Frontier studies in these areas demand the coordination between nanoscale morphological evolution and mechanical behavior variations through simultaneous measurement of these two aspects of properties. Atomic force microscope (AFM) is very promising in achieving such simultaneous measurements at high-speed and broadband owing to its unique capability in applying force stimuli and then, measuring the response at specific locations in a physiologically friendly environment with pico-newton force and nanometer spatial resolution. Challenges, however, arise as current AFM systems are unable to account for the complex and coupled dynamics of the measurement system and probe-sample interaction during high-speed imaging and broadband measurements. In this dissertation, the creation of a set of dynamics and control tools to probe-based high-speed imaging and rapid broadband nanomechanical spectroscopy of soft and biological materials are presented. Firstly, advanced control-based approaches are presented to improve the imaging performance of AFM imaging both in air and in liquid. An adaptive contact mode (ACM) imaging scheme is proposed to replace the traditional contact mode (CM) imaging by addressing the major concerns in both the speed and the force exerted to the sample. In this work, the image distortion caused by the topography tracking error is accounted for in the topography quantification and the quantified sample topography is utilized in a gradient-based optimization method to adjust the cantilever deflection set-point for each scanline closely around the minimal level needed for maintaining a stable probe-sample contact, and a data-driven iterative feedforward control that utilizes a prediction of the next

  8. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  9. Quantification of heterogeneity as a biomarker in tumor imaging: a systematic review.

    Directory of Open Access Journals (Sweden)

    Lejla Alic

    Full Text Available BACKGROUND: Many techniques are proposed for the quantification of tumor heterogeneity as an imaging biomarker for differentiation between tumor types, tumor grading, response monitoring and outcome prediction. However, in clinical practice these methods are barely used. This study evaluates the reported performance of the described methods and identifies barriers to their implementation in clinical practice. METHODOLOGY: The Ovid, Embase, and Cochrane Central databases were searched up to 20 September 2013. Heterogeneity analysis methods were classified into four categories, i.e., non-spatial methods (NSM), spatial grey level methods (SGLM), fractal analysis (FA) methods, and filters and transforms (F&T). The performance of the different methods was compared. PRINCIPAL FINDINGS: Of the 7351 potentially relevant publications, 209 were included. Of these studies, 58% reported the use of NSM, 49% SGLM, 10% FA, and 28% F&T. Differentiation between tumor types, tumor grading and/or outcome prediction was the goal in 87% of the studies. Overall, the reported area under the curve (AUC) ranged from 0.5 to 1 (median 0.87). No relation was found between the performance and the quantification methods used, or between the performance and the imaging modality. A negative correlation was found between the tumor-feature ratio and the AUC, which is presumably caused by overfitting in small datasets. Cross-validation was reported in 63% of the classification studies. Retrospective analyses were conducted in 57% of the studies without a clear description. CONCLUSIONS: In a research setting, heterogeneity quantification methods can differentiate between tumor types, grade tumors, predict outcome, and monitor treatment effects. To translate these methods to clinical practice, more prospective studies are required that use external datasets for validation: these datasets should be made available to the community to facilitate the development of new and improved

  10. Dataset exploited for the development and validation of automated cyanobacteria quantification algorithm, ACQUA

    Directory of Open Access Journals (Sweden)

    Emanuele Gandola

    2016-09-01

    Full Text Available The estimation and quantification of potentially toxic cyanobacteria in lakes and reservoirs are often used as a proxy of risk for water intended for human consumption and recreational activities. Here, we present data sets collected from three volcanic Italian lakes (Albano, Vico, Nemi) that host filamentous cyanobacteria strains in different environments. The presented data sets were used to estimate the abundance and morphometric characteristics of potentially toxic cyanobacteria, comparing manual vs. automated estimation performed by ACQUA (“ACQUA: Automated Cyanobacterial Quantification Algorithm for toxic filamentous genera using spline curves, pattern recognition and machine learning” (Gandola et al., 2016) [1]). This strategy was used to assess the algorithm performance and to set up the denoising algorithm. Abundance and total length estimations were used for software development; to this aim we evaluated the efficiency of the statistical tools and mathematical algorithms described here. Image convolution with the Sobel filter was chosen to denoise input images from background signals; then spline curves and the least squares method were used to parameterize detected filaments and to recombine crossing and interrupted sections, aiming at precise abundance estimations and morphometric measurements. Keywords: Comparing data, Filamentous cyanobacteria, Algorithm, Denoising, Natural sample
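The Sobel-based denoising step can be sketched as follows; the synthetic image and values are illustrative assumptions, not part of the ACQUA dataset or pipeline:

```python
import numpy as np
from scipy import ndimage

# Hedged sketch: the Sobel gradient magnitude suppresses smooth background
# signal and highlights filament edges (synthetic image for illustration).
img = np.zeros((64, 64))
img += np.linspace(0.0, 0.5, 64)        # smooth background gradient ("noise")
img[30:34, 5:60] = 2.0                  # a bright horizontal "filament"

gx = ndimage.sobel(img, axis=1)         # horizontal derivative
gy = ndimage.sobel(img, axis=0)         # vertical derivative
edges = np.hypot(gx, gy)                # gradient magnitude
```

The slowly varying background produces a near-zero gradient, while the filament boundary yields a strong response, which downstream spline fitting can then trace.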

  11. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang , Min; Yang , Le; Zhao , Huizhong; Zhang , Leijie; Zhong , Zhiyou; Liu , Yanling; Chen , Jianhua

    2009-01-01

    International audience; A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of apple fruit at different points. It took into account the heat exchange of a representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  12. The Voice of Chinese Health Consumers: A Text Mining Approach to Web-Based Physician Reviews.

    Science.gov (United States)

    Hao, Haijing; Zhang, Kunpeng

    2016-05-10

    skills and bedside manner, general appreciation from patients, and description of various symptoms. To the best of our knowledge, our work is the first study using an automated text-mining approach to analyze a large amount of unstructured textual data of Web-based physician reviews in China. Based on our analysis, we found that Chinese reviewers mainly concentrate on a few popular topics. This is consistent with the goal of Chinese online health platforms and demonstrates the health care focus in China's health care system. Our text-mining approach reveals a new research area on how to use big data to help health care providers, health care administrators, and policy makers hear patient voices, target patient concerns, and improve the quality of care in this age of patient-centered care. Also, on the health care consumer side, our text mining technique helps patients make more informed decisions about which specialists to see without reading thousands of reviews, which is simply not feasible. In addition, our comparison analysis of Web-based physician reviews in China and the United States also indicates some cultural differences.

  13. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  14. Application of the homology method for quantification of low-attenuation lung region in patients with and without COPD

    Directory of Open Access Journals (Sweden)

    Nishio M

    2016-09-01

    Full Text Available Mizuho Nishio,1 Kazuaki Nakane,2 Yutaka Tanaka3 1Clinical PET Center, Institute of Biomedical Research and Innovation, Hyogo, Japan; 2Department of Molecular Pathology, Osaka University Graduate School of Medicine and Health Science, Osaka, Japan; 3Department of Radiology, Chibune General Hospital, Osaka, Japan Background: Homology is a mathematical concept that can be used to quantify degree of contact. Recently, image processing with the homology method has been proposed. In this study, we used the homology method and computed tomography images to quantify emphysema. Methods: This study included 112 patients who had undergone computed tomography and pulmonary function test. Low-attenuation lung regions were evaluated by the homology method, and homology-based emphysema quantification (b0, b1, nb0, nb1, and R) was performed. For comparison, the percentage of low-attenuation lung area (LAA%) was also obtained. Relationships between emphysema quantification and pulmonary function test results were evaluated by Pearson’s correlation coefficients. In addition to the correlation, the patients were divided into the following three groups based on guidelines of the Global initiative for chronic Obstructive Lung Disease: Group A, nonsmokers; Group B, smokers without COPD, mild COPD, and moderate COPD; Group C, severe COPD and very severe COPD. The homology-based emphysema quantification and LAA% were compared among these groups. Results: For forced expiratory volume in 1 second/forced vital capacity, the correlation coefficients were as follows: LAA%, -0.603; b0, -0.460; b1, -0.500; nb0, -0.449; nb1, -0.524; and R, -0.574. For forced expiratory volume in 1 second, the coefficients were as follows: LAA%, -0.461; b0, -0.173; b1, -0.314; nb0, -0.191; nb1, -0.329; and R, -0.409. Between Groups A and B, the difference in nb0 was significant (P-value = 0.00858), and those in the other types of quantification were not significant. Conclusion: Feasibility of the
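As a rough illustration of the homology idea (an assumed simplification, not the paper's exact pipeline), the zeroth Betti number b0 of the low-attenuation region is simply its number of connected components, countable with a labeling pass over the thresholded image:

```python
import numpy as np
from scipy import ndimage

# Synthetic CT slice in Hounsfield units; the -950 HU threshold is the value
# commonly used to define low-attenuation (emphysematous) lung area.
hu = np.full((8, 8), -700)              # normal lung parenchyma
hu[1:3, 1:3] = -980                     # emphysematous cluster 1
hu[5:7, 4:7] = -960                     # emphysematous cluster 2

low = hu <= -950                        # low-attenuation mask
_, b0 = ndimage.label(low)              # b0 = number of connected components
laa_pct = 100.0 * low.mean()            # LAA%: percent low-attenuation area
```

Unlike LAA%, which only measures area, b0 captures how fragmented the low-attenuation region is, which is the extra spatial information the homology quantification exploits.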

  15. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry

    Directory of Open Access Journals (Sweden)

    L.P.S. Alves

    Full Text Available The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: (i) cell permeabilization, (ii) Nile red staining, and (iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2 = 0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.

  16. Quantification of the least limiting water range in an oxisol using two methodological strategies

    Directory of Open Access Journals (Sweden)

    Wagner Henrique Moreira

    2014-12-01

    Full Text Available The least limiting water range (LLWR) has been used as an indicator of soil physical quality as it represents, in a single parameter, the soil physical properties directly linked to plant growth, with the exception of temperature. The usual procedure for obtaining the LLWR involves determination of the water retention curve (WRC) and the soil resistance to penetration curve (SRC) in soil samples with undisturbed structure in the laboratory. Determination of the WRC and SRC using field measurements (in situ) is preferable, but requires appropriate instrumentation. The objective of this study was to determine the LLWR from the data collected for determination of the WRC and SRC in situ using portable electronic instruments, and to compare those determinations with the ones made in the laboratory. Samples were taken from the 0.0-0.1 m layer of a Latossolo Vermelho distrófico (Oxisol). Two methods were used for quantification of the LLWR: the traditional, with measurements made in soil samples with undisturbed structure; and in situ, with measurements of water content (θ), soil water potential (Ψ), and soil resistance to penetration (SR) through the use of sensors. The in situ measurements of θ, Ψ and SR were taken over a period of four days of soil drying. At the same time, samples with undisturbed structure were collected for determination of bulk density (BD). Due to the limitations of measurement of Ψ by tensiometer, additional determinations of θ were made with a psychrometer (in the laboratory) at the Ψ of -1500 kPa. The results show that it is possible to determine the LLWR by the θ, Ψ and SR measurements using the suggested approach and instrumentation. The quality of fit of the SRC was similar in both strategies. In contrast, the θ and Ψ in situ measurements, associated with those measured with a psychrometer, produced a better WRC description. The estimates of the LLWR were similar in both methodological strategies. The quantification of
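Once the WRC and SRC are fitted, the LLWR computation itself reduces to combining four critical water contents; the sketch below uses the commonly cited limits (field capacity, 10% air-filled porosity, wilting point, and the water content at 2 MPa penetration resistance) with hypothetical values in m3/m3:

```python
# Sketch of the LLWR definition (illustrative values, not the study's data):
# the upper limit is the drier of field capacity and the 10% air-filled-
# porosity limit; the lower limit is the wetter of the wilting point and the
# water content at which penetration resistance reaches 2 MPa.
def llwr(theta_fc, theta_afp, theta_wp, theta_sr):
    upper = min(theta_fc, theta_afp)
    lower = max(theta_wp, theta_sr)
    return max(upper - lower, 0.0)      # zero width if the limits cross

w = llwr(theta_fc=0.34, theta_afp=0.38, theta_wp=0.20, theta_sr=0.24)
```

In this hypothetical case the penetration-resistance limit (0.24) is more restrictive than the wilting point, so it sets the lower bound and the LLWR is 0.34 - 0.24 = 0.10 m3/m3.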

  17. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
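The underlying quantification step can be sketched with the Beer-Lambert law; the attenuation coefficient and geometry below are assumed illustrative values, not the calibration of the Penn State system:

```python
import numpy as np

# Hedged sketch of attenuation-based water quantification: given the neutron
# attenuation coefficient of water mu (cm^-1, an assumed illustrative value),
# the water thickness along a ray follows from the Beer-Lambert law.
mu_water = 3.5                       # cm^-1, illustrative thermal-neutron value
I0 = 1000.0                          # open-beam (dry) intensity
I = I0 * np.exp(-mu_water * 0.2)     # simulated transmission through 0.2 cm water

t = -np.log(I / I0) / mu_water       # recovered water thickness (cm)
volume = t * 1.0                     # times pixel area (cm^2) -> water volume (cm^3)
```

Summing such per-pixel thicknesses over the reconstructed slices yields the total water volume, which can then be checked against the known fill volume of the test cylinder.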

  18. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...
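A minimal sketch of GP-based deconvolution, under stated assumptions (a squared-exponential prior to encode the smoothness mentioned above, a known noise level, and a simulated arterial input function; this is not the paper's implementation):

```python
import numpy as np

# Treat the IRF as a Gaussian process and take the posterior mean given the
# measured concentration curve c = A r + noise, where A is the convolution
# matrix built from the arterial input function (AIF).
n = 40
t = np.arange(n, dtype=float)
aif = t * np.exp(-t / 3.0)                        # assumed AIF shape
A = np.array([[aif[i - j] if i >= j else 0.0      # lower-triangular
               for j in range(n)] for i in range(n)])

r_true = np.exp(-t / 8.0)                         # "true" smooth IRF
rng = np.random.default_rng(0)
c = A @ r_true + 0.05 * rng.standard_normal(n)    # noisy concentration curve

# Squared-exponential covariance (length scale and variance are assumptions)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 4.0 ** 2)
sigma2 = 0.05 ** 2
r_post = K @ A.T @ np.linalg.solve(A @ K @ A.T + sigma2 * np.eye(n), c)
```

The smoothness prior regularizes the otherwise ill-posed deconvolution, which is the core advantage the GPD approach offers over unregularized inversion.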

  19. Validated LC-MS/MS Method for the Quantification of Ponatinib in Plasma: Application to Metabolic Stability.

    Directory of Open Access Journals (Sweden)

    Adnan A Kadi

    Full Text Available In the current work, a rapid, specific, sensitive and validated liquid chromatography tandem mass spectrometric method was developed for the quantification of ponatinib (PNT) in human plasma and rat liver microsomes (RLMs), with application to metabolic stability. Chromatographic separation of PNT and vandetanib (IS) was accomplished on an Agilent Eclipse Plus C18 analytical column (50 mm × 2.1 mm, 1.8 μm particle size) maintained at 21 ± 2 °C. The flow rate was 0.25 mL min-1 with a run time of 4 min. The mobile phase consisted of solvent A (10 mM ammonium formate, pH adjusted to 4.1 with formic acid) and solvent B (acetonitrile). Ions were generated by electrospray ionization (ESI), and multiple reaction monitoring (MRM) was used as the basis for quantification. The results revealed a linear calibration curve in the range of 5-400 ng mL-1 (r2 ≥ 0.9998), with a lower limit of quantification (LOQ) and lower limit of detection (LOD) of 4.66 and 1.53 ng mL-1 in plasma, and 4.19 and 1.38 ng mL-1 in RLMs. The intra- and inter-day precision and accuracy in plasma ranged from 1.06 to 2.54% and -1.48 to -0.17%, respectively, whereas in RLMs they ranged from 0.97 to 2.31% and -1.65 to -0.3%. The developed procedure was applied to the quantification of PNT in human plasma and RLMs to study the metabolic stability of PNT. PNT disappeared rapidly in the first 10 minutes of RLM incubation and the disappearance plateaued for the rest of the incubation. The in vitro half-life (t1/2) was 6.26 min and the intrinsic clearance (CLin) was 15.182 ± 0.477.
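Metabolic-stability figures like these come from a standard first-order analysis; the sketch below reproduces the arithmetic with simulated data (the rate constant, incubation volume, and protein amount are assumptions, not the study's raw data):

```python
import numpy as np

# The elimination rate constant k is the negative slope of ln(% remaining)
# versus incubation time; t1/2 = ln 2 / k, and intrinsic clearance scales k
# by the assumed incubation volume per mg of microsomal protein.
t_min = np.array([0.0, 2.5, 5.0, 7.5, 10.0])
remaining = 100.0 * np.exp(-0.11 * t_min)        # simulated % drug remaining

k = -np.polyfit(t_min, np.log(remaining), 1)[0]  # first-order rate constant (1/min)
t_half = np.log(2) / k                           # in vitro half-life (min)

# CLint = k * (incubation volume / protein); 1 mL and 0.5 mg are assumptions
clint = k * (1.0 / 0.5) * 1000.0                 # uL / (min * mg protein)
```

With the assumed k of 0.11 min-1 the half-life works out to about 6.3 min, on the same order as the 6.26 min reported above.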

  20. Verification Validation and Uncertainty Quantification for CGS

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The overall conduct of verification, validation and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling including the turbulence problem in the coarse grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration and UQ, as well as the difference between UQ and sensitivity analysis. The discussion in this chapter is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.

  1. Tentacle: distributed quantification of genes in metagenomes.

    Science.gov (United States)

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.
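The master-worker streaming pattern can be sketched in miniature with threads standing in for Tentacle's networked worker nodes; the stub aligner and reference set below are invented for illustration and bear no relation to the real aligners Tentacle supports:

```python
import queue
import threading
from collections import Counter

# Toy master-worker gene quantification: the master streams DNA fragments
# into a queue, workers map each fragment to a reference gene (stub
# alignment by substring match) and accumulate per-gene counts.
REFS = {"geneA": "ACGTACGT", "geneB": "TTGGCCAA"}    # hypothetical references

def align(fragment):
    """Stub aligner: report the first reference containing the fragment."""
    for name, seq in REFS.items():
        if fragment in seq:
            return name
    return None

def worker(q, counts, lock):
    while True:
        frag = q.get()
        if frag is None:                 # poison pill: no more fragments
            q.task_done()
            return
        hit = align(frag)
        if hit:
            with lock:
                counts[hit] += 1
        q.task_done()

q, counts, lock = queue.Queue(), Counter(), threading.Lock()
threads = [threading.Thread(target=worker, args=(q, counts, lock))
           for _ in range(4)]
for th in threads:
    th.start()
for frag in ["ACGT", "GGCC", "TACG", "AAAA", "TTGG"]:   # streamed fragments
    q.put(frag)
for _ in threads:
    q.put(None)                          # one poison pill per worker
for th in threads:
    th.join()
```

Because fragments are consumed independently from the queue, adding workers scales throughput in the same way Tentacle scales by adding worker nodes.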

  2. Novel approaches to study low-energy electron-induced damage to DNA oligonucleotides

    International Nuclear Information System (INIS)

    Rackwitz, Jenny; Bald, Ilko; Ranković, Miloš Lj; Milosavljević, Aleksandar R

    2015-01-01

    The novel approach of DNA origami structures as templates for precise quantification of various well-defined oligonucleotides provides the opportunity to determine the sensitivity of complex DNA sequences towards low-energy electrons. (paper)

  3. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  4. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  5. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  6. Preclinical in vivo imaging for fat tissue identification, quantification and functional characterization

    Directory of Open Access Journals (Sweden)

    Pasquina Marzola

    2016-09-01

    Full Text Available Localization, differentiation and quantitative assessment of fat tissues have always attracted the interest of researchers. Nowadays, these topics are even more relevant as obesity (the excess of fat tissue) is considered a real pathology requiring in some cases pharmacological and surgical approaches. Several weight loss medications, acting either on the metabolism or on the central nervous system, are currently under preclinical or clinical investigation. Animal models of obesity have been developed and are widely used in pharmaceutical research. The assessment of candidate drugs in animal models requires non-invasive methods for longitudinal assessment of efficacy, the main outcome being the amount of body fat. Fat tissues can be either quantified in the entire animal or localized and measured in selected organs/regions of the body. Fat tissues are characterized by peculiar contrast in several imaging modalities, for example Magnetic Resonance Imaging (MRI), which can distinguish between fat and water protons thanks to their different magnetic resonance properties. Since fat tissues have a higher carbon/hydrogen content than other soft tissues and bones, they can be easily assessed by Computed Tomography (CT) as well. Interestingly, MRI also discriminates between white and brown adipose tissue (BAT); the latter has long been regarded as a potential target for anti-obesity drugs because of its ability to enhance energy consumption through increased thermogenesis. Positron Emission Tomography (PET) performed with 18F-FDG as glucose analogue radiotracer reflects well the metabolic rate in body tissues and consequently is the technique of choice for studies of BAT metabolism. This review will focus on the main non-invasive imaging techniques (MRI, CT and PET) that are fundamental for the assessment, quantification and functional characterization of fat deposits in small laboratory animals. The contribution of optical techniques, which are currently regarded

  7. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.

  8. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation of the concentration uncertainties.
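The core idea, minimizing the quadratic difference between an experimental spectrum and a parametric analytical model, can be sketched as follows; a single Gaussian peak on a linear background is an assumed toy model, far simpler than POEMA's physics-based spectrum prediction:

```python
import numpy as np
from scipy.optimize import least_squares

# Fit a parametric spectrum model to a noisy "experimental" spectrum by
# least squares over the model parameters.
E = np.linspace(0.0, 10.0, 200)                       # energy axis (keV)
true = [5.0, 3.2, 0.15, 0.4, 0.02]                    # amp, center, width, b0, b1

def model(p, E):
    amp, mu, sig, b0, b1 = p
    return amp * np.exp(-0.5 * ((E - mu) / sig) ** 2) + b0 + b1 * E

rng = np.random.default_rng(1)
spectrum = model(true, E) + 0.02 * rng.standard_normal(E.size)

fit = least_squares(lambda p: model(p, E) - spectrum,  # residual vector
                    x0=[1.0, 4.0, 0.5, 0.0, 0.0])      # rough initial guess
```

In POEMA the optimized parameters are tied to physical quantities (concentrations, detector and coating parameters), so their fitted values and covariance directly yield concentrations with uncertainties.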

  9. A novel immunological assay for hepcidin quantification in human serum.

    Directory of Open Access Journals (Sweden)

    Vasiliki Koliaraki

    Full Text Available BACKGROUND: Hepcidin is a 25-aminoacid cysteine-rich iron-regulating peptide. Increased hepcidin concentrations lead to iron sequestration in macrophages, contributing to the pathogenesis of anaemia of chronic disease, whereas decreased hepcidin is observed in iron deficiency and primary iron overload diseases such as hereditary hemochromatosis. Hepcidin quantification in human blood or urine may provide further insights into the pathogenesis of disorders of iron homeostasis and might prove a valuable tool for clinicians in the differential diagnosis of anaemia. This study describes a specific and non-operator-demanding immunoassay for hepcidin quantification in human sera. METHODS AND FINDINGS: An ELISA assay was developed for measuring hepcidin serum concentration using a recombinant hepcidin25-His peptide and a polyclonal antibody against this peptide, which was able to identify native hepcidin. The ELISA assay had a detection range of 10-1500 microg/L and a detection limit of 5.4 microg/L. The intra- and interassay coefficients of variation ranged from 8-15% and 5-16%, respectively. Mean linearity and recovery were 101% and 107%, respectively. Mean hepcidin levels were significantly lower in 7 patients with juvenile hemochromatosis (12.8 microg/L) and 10 patients with iron deficiency anaemia (15.7 microg/L), and higher in 7 patients with Hodgkin lymphoma (116.7 microg/L), compared to 32 age-matched healthy controls (42.7 microg/L). CONCLUSIONS: We describe a new simple ELISA assay for measuring hepcidin in human serum with sufficient accuracy and reproducibility.
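Back-calculating a serum concentration from an ELISA optical density typically runs through a four-parameter logistic (4PL) calibration curve; the parameter values below are hypothetical illustrations, not this assay's fitted calibration:

```python
# 4PL calibration curve and its analytic inverse (all values hypothetical).
A, D = 0.05, 2.0          # lower/upper optical-density asymptotes
C, B = 100.0, 1.2         # inflection concentration (microg/L) and slope

def od(conc):
    """Predicted optical density at a given concentration."""
    return D + (A - D) / (1.0 + (conc / C) ** B)

def conc(od_value):
    """Invert the 4PL curve: concentration for a measured optical density."""
    return C * ((A - D) / (od_value - D) - 1.0) ** (1.0 / B)

x = conc(od(42.7))        # round trip through the calibration curve
```

Standards of known concentration fix the four parameters; unknown sera are then read off the inverted curve, which is how mean levels such as 42.7 microg/L are obtained from raw plate readings.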

  10. Identification and accurate quantification of structurally related peptide impurities in synthetic human C-peptide by liquid chromatography-high resolution mass spectrometry.

    Science.gov (United States)

    Li, Ming; Josephs, Ralf D; Daireaux, Adeline; Choteau, Tiphaine; Westwood, Steven; Wielgosz, Robert I; Li, Hongmei

    2018-06-04

    Peptides are an increasingly important group of biomarkers and pharmaceuticals. The accurate purity characterization of peptide calibrators is critical for the development of reference measurement systems for laboratory medicine and quality control of pharmaceuticals. The peptides used for these purposes are increasingly produced through peptide synthesis. Various approaches (for example mass balance, amino acid analysis, qNMR, and nitrogen determination) can be applied to accurately assign the purity value of peptide calibrators. However, all purity assessment approaches require a correction for structurally related peptide impurities in order to avoid biases. Liquid chromatography coupled to high resolution mass spectrometry (LC-hrMS) has become the key technique for the identification and accurate quantification of structurally related peptide impurities in intact peptide calibrator materials. In this study, LC-hrMS-based methods were developed and validated in-house for the identification and quantification of structurally related peptide impurities in a synthetic human C-peptide (hCP) material, which served as a study material for an international comparison looking at the competencies of laboratories to perform peptide purity mass fraction assignments. More than 65 impurities were identified, confirmed, and accurately quantified by using LC-hrMS. The total mass fraction of all structurally related peptide impurities in the hCP study material was estimated to be 83.3 mg/g with an associated expanded uncertainty of 3.0 mg/g (k = 2). The calibration hierarchy concept used for the quantification of individual impurities is described in detail.
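
The total-impurity bookkeeping can be illustrated with a minimal, hypothetical calculation: individual impurity mass fractions are summed and their standard uncertainties combined in quadrature, then expanded with a coverage factor k = 2 as in the abstract. The numbers below are invented, not the hCP study data.

```python
import math

# Hypothetical impurities: (mass fraction in mg/g, standard uncertainty in mg/g).
impurities = [(12.1, 0.4), (7.9, 0.3), (3.3, 0.2)]

# Total impurity mass fraction is the plain sum of the components.
total = sum(w for w, _ in impurities)

# Uncorrelated standard uncertainties combine in quadrature; the expanded
# uncertainty uses a coverage factor k = 2 (~95% coverage).
u_combined = math.sqrt(sum(u ** 2 for _, u in impurities))
U_expanded = 2 * u_combined

print(round(total, 2), round(U_expanded, 2))
```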

  11. ROMA: representation and quantification of module activity from target expression data

    Directory of Open Access Journals (Sweden)

    Loredana eMartignetti

    2016-02-01

    Full Text Available In many analyses of high-throughput data in systems biology, there is a need to quantify the activity of a set of genes in individual samples. A typical example is the case where it is necessary to estimate the activity of a transcription factor (which is often not directly measurable) from the expression of its target genes. We present here ROMA (Representation and quantification Of Module Activities) Java software, designed for fast and robust computation of the activity of gene sets (or modules) with coordinated expression. ROMA activity quantification is based on the simplest uni-factor linear model of gene regulation, which approximates the expression data of a gene set by its first principal component. The proposed algorithm implements novel functionalities: it provides several method modifications for principal component computation, including weighted, robust and centered methods; it distinguishes overdispersed modules (based on the variance explained by the first principal component) and coordinated modules (based on the significance of the spectral gap); finally, it computes the statistical significance of the estimated module overdispersion or coordination. ROMA can be applied in many contexts, from estimating differential activities of transcription factors to finding overdispersed pathways in single-cell transcriptomics data. We describe here the principles of ROMA, providing several practical examples of its use. ROMA source code is available at https://github.com/sysbio-curie/Roma.
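
The core of the module-activity computation can be sketched in the simplest case ROMA describes: module activity as the first principal component of a small genes-by-samples matrix, found here by plain power iteration. This is an illustration with invented numbers, not the ROMA Java code.

```python
# Genes x samples matrix, deliberately coordinated (rows are proportional),
# so the first principal component should explain all the variance.
rows = [
    [1.0, 2.0, 3.0, 4.0],
    [2.0, 4.0, 6.0, 8.0],
    [0.5, 1.0, 1.5, 2.0],
]

# Center each gene (row) on its mean.
centered = [[x - sum(r) / len(r) for x in r] for r in rows]

# Samples-by-samples scatter matrix C = X^T X.
n = len(centered[0])
C = [[sum(centered[g][i] * centered[g][j] for g in range(len(centered)))
      for j in range(n)] for i in range(n)]

# Power iteration for the leading eigenvector (start off the symmetric axis).
v = [1.0] + [0.0] * (n - 1)
for _ in range(50):
    w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

# Fraction of variance explained by PC1 = leading eigenvalue / trace(C).
lam = sum(v[i] * sum(C[i][j] * v[j] for j in range(n)) for i in range(n))
explained = lam / sum(C[i][i] for i in range(n))
print(round(explained, 4))  # rows proportional -> PC1 explains all variance
```

Per-sample activities are then the projections of the centered data onto `v`; ROMA's overdispersion test compares `explained` against a null distribution.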

  12. Cadmium voltametric quantification in table chocolate produced in Chiquinquira-Boyaca, Colombia

    Directory of Open Access Journals (Sweden)

    Paola Andrea Vargas Moreno

    2017-04-01

    Full Text Available Bioaccumulation of heavy metals such as cadmium has been a major concern for scientific communities and international food organizations, given the great toxicological risk to the consumer, and in many places there is no detailed record of its actual content. Hence the need to study and record the concentration of this metal in products such as table chocolate, which is widely consumed at the regional and national level. Likewise, effective quantification tools and a reliable and affordable method are needed to achieve the aim of this research. In this research, the cadmium content in powdered and granulated table chocolate produced and marketed in the municipality of Chiquinquirá, Boyacá, Colombia, was determined using the differential pulse voltammetric method of anodic redissolution (DPVMAR). The parameters of this method were first evaluated, with satisfactory results for selectivity (potential range of 0.54 to 0.64 V), sensitivity (ppb level), linearity (R2 > 0.95), precision and accuracy (%CV 80%). Analysis of variance showed no significant statistical differences (P < 0.05) between the results. Cadmium quantification in samples of granulated and powdered chocolate showed concentrations between 214 and 260 ppb, with the highest concentrations in powdered chocolate. The cadmium level did not exceed the tolerable weekly intake limit for this type of food.

  13. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    Science.gov (United States)

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
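
The random-positioning null model mentioned above can be sketched as a small Monte Carlo simulation: place both cell types uniformly at random many times and ask how often the simulated contact count reaches the observed one. Geometry, cell counts and the observed contact count below are all invented for illustration.

```python
import random

random.seed(42)  # deterministic for reproducibility
CONTACT_RADIUS = 5.0           # distance (a.u.) below which two cells "touch"
N_STROMAL, N_HEMATOPOIETIC = 20, 50

def random_cells(n):
    # Uniform random positions on a 100 x 100 field (no tissue structure).
    return [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(n)]

def count_contacts(stromal, hematopoietic):
    return sum(1 for (sx, sy) in stromal for (hx, hy) in hematopoietic
               if (sx - hx) ** 2 + (sy - hy) ** 2 <= CONTACT_RADIUS ** 2)

observed = 40  # contacts counted in the (hypothetical) images

# Null distribution of contact counts under random placement.
null = [count_contacts(random_cells(N_STROMAL), random_cells(N_HEMATOPOIETIC))
        for _ in range(200)]
p_value = sum(c >= observed for c in null) / len(null)
print(p_value < 0.05)
```

A small empirical p-value indicates that the observed co-localization exceeds what random positioning would produce; the real analysis additionally respects tissue geometry and cell sizes.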

  14. Reengineering the Cardiac Catheterization Lab Processes: A Lean Approach

    Directory of Open Access Journals (Sweden)

    Venkatesh Raghavan

    2010-01-01

    Full Text Available This paper presents a cross-functional effort in a US community hospital for an overall process improvement in its Cardiac Catheterization Lab (CCL. One of the key system performance metrics identified was the patient turnaround time. The objective of this study was to identify the sources of delays in the system that lead to prolonged patient turnaround time using a structured lean approach. A set of qualitative recommendations was proposed and implemented. Quantification of some of these recommendations and certain additional ‘what-if’ scenarios were evaluated using Discrete Event Simulation (DES. The simulation results showed that significant reduction in patient turnaround time could be achieved if the proposed recommendations were implemented. This study demonstrated the benefits of adopting the lean philosophy in the continuous process improvement journey in the healthcare delivery arena.
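
A discrete event simulation of patient turnaround can be reduced to a toy single-room model to show the idea: turnaround is waiting time plus procedure time, and shortening the procedure (a lean improvement) shortens the queue as well. Arrival and procedure times below are invented, not taken from the study.

```python
# Toy single-server model of the CCL: patients arrive at known times and are
# served first-come-first-served by one room; turnaround = wait + procedure.
def average_turnaround(arrivals, service_time):
    room_free_at = 0.0
    turnarounds = []
    for t in arrivals:
        start = max(t, room_free_at)      # wait if the room is busy
        room_free_at = start + service_time
        turnarounds.append(room_free_at - t)
    return sum(turnarounds) / len(turnarounds)

arrivals = [0, 30, 60, 90, 120]            # minutes between scheduled cases
before = average_turnaround(arrivals, 45)  # baseline procedure time
after = average_turnaround(arrivals, 30)   # after hypothetical lean improvements
print(before, after)
```

With a 45-minute procedure every 30 minutes, delays compound down the schedule; cutting the procedure to 30 minutes removes the queueing entirely in this toy case. Real DES tools add stochastic arrivals, multiple rooms, and staffing constraints.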

  15. Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.

    Science.gov (United States)

    Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G

    2018-06-01

    Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin adsorption to laboratory glassware surfaces, after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing a LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% of ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators, with good correlation of the results. Finally, we recommend a LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).
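
The isotope-dilution principle behind ID-LC-MS/MS can be illustrated with a bare-bones calibration sketch: every calibrator and sample receives the same amount of isotope-labelled internal standard, and the analyte concentration is read from the analyte/label peak-area ratio against a calibration line. All concentrations and peak-area ratios below are invented.

```python
# Hypothetical calibrators: (hepcidin-25 concentration in ug/L, analyte/label
# peak-area ratio). The internal-standard spike is identical in every run, so
# the area ratio is proportional to concentration in this idealized sketch.
calibrators = [(0.5, 0.05), (5.0, 0.50), (40.0, 4.00)]

# One-parameter least-squares fit of ratio = k * concentration (through origin).
k = sum(r * c for c, r in calibrators) / sum(c * c for c, _ in calibrators)

def concentration(area_ratio):
    """Concentration of a sample from its measured analyte/label area ratio."""
    return area_ratio / k

print(round(concentration(1.2), 2))  # ug/L
```

Because analyte and labelled standard co-elute and ionize alike, losses during sample preparation cancel in the ratio, which is what gives isotope dilution its accuracy.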

  16. A new approach to the classification of African oral texts | Kam ...

    African Journals Online (AJOL)

    All these reasons have led to a fresh examination of the different oral genres in the African context and to a proposed division of these texts into five broad categories. Keywords: oral literature, oral genres, oral texts, discourse, utterances, joking relationships, oral literature researchers. Tydskrif vir Letterkunde ...

  17. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  18. A Study of Readability of Texts in Bangla through Machine Learning Approaches

    Science.gov (United States)

    Sinha, Manjira; Basu, Anupam

    2016-01-01

    In this work, we have investigated text readability in Bangla language. Text readability is an indicator of the suitability of a given document with respect to a target reader group. Therefore, text readability has huge impact on educational content preparation. The advances in the field of natural language processing have enabled the automatic…

  19. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
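
The qNMR quantification step relies on a simple proportionality that can be written down directly: the analyte mass follows from the integral ratio between an analyte signal and an internal-standard signal, normalized by the number of protons each signal represents. The sketch below assumes a hypothetical maleic acid internal standard and invented integrals.

```python
# Standard qNMR relation: mass_analyte =
#   (I_a / N_a) / (I_s / N_s) * (M_a / M_s) * mass_standard
# where I = signal integral, N = number of protons behind the signal,
# M = molar mass. The specific standard and numbers here are illustrative.
def qnmr_mass(i_analyte, n_analyte, i_std, n_std, m_analyte, m_std, mass_std):
    return (i_analyte / n_analyte) / (i_std / n_std) * (m_analyte / m_std) * mass_std

# Anatoxin-a (M = 165.23 g/mol) against a maleic acid internal standard
# (M = 116.07 g/mol, one singlet for 2 equivalent protons), 1.00 mg added.
mass = qnmr_mass(i_analyte=0.50, n_analyte=1, i_std=1.00, n_std=2,
                 m_analyte=165.23, m_std=116.07, mass_std=1.00)
print(round(mass, 3))  # mg of anatoxin-a in the standard solution
```

The appeal for expensive toxins is that only the internal standard needs to be weighed accurately; the analyte amount follows from integrals alone.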

  20. A novel quantification method of pantaprazole sodium monohydrate in sesquihydrate by thermogravimetric analyzer.

    Science.gov (United States)

    Reddy, V Ranga; Rajmohan, M Anantha; Shilpa, R Laxmi; Raut, Dilip M; Naveenkumar, Kolla; Suryanarayana, M V; Mathad, Vijayavitthal T

    2007-04-11

    Studies were conducted to demonstrate the applicability of the thermogravimetric analyzer as a tool for the quantification of pantaprazole sodium monohydrate in the sesquihydrate. Thermal analysis (DSC, TGA), crystallographic (PXRD) and spectroscopic (FT-IR) techniques were used for the characterization of the polymorphs. Thermogravimetric analysis (TGA) was explored with high-resolution dynamic (Hi-Res-dynamic) and high-resolution modulated (Hi-Res-modulated) test procedures to quantify the hydrate polymorphic mixtures. The two polymorphic forms exhibited significant differences and good resolution in the second-derivative thermogram generated by the Hi-Res-modulated test procedure. Thus, TGA with the Hi-Res-modulated test procedure was considered for the quantification of monohydrate in sesquihydrate. The calibration plot was constructed from known mixtures of the two polymorphs by plotting the peak area of the second-derivative thermogram against the weight percent of monohydrate. Using this novel approach, a 1 wt% limit of detection (LOD) was achieved. The polymorphic purity results obtained by TGA with the Hi-Res-modulated test procedure were found to be in good agreement with the results predicted by FT-IR and were comparable with the actual values of the known polymorphic mixtures. The Hi-Res-modulated TGA technique is very simple and the analysis easy to perform.
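
The calibration-plot step translates to ordinary least squares: peak area of the second-derivative thermogram against weight percent of monohydrate, inverted to predict unknowns. The sketch below uses synthetic (invented) peak areas, not the paper's data.

```python
# Known mixtures: wt% monohydrate vs. second-derivative peak area (a.u.).
xs = [0.0, 2.0, 5.0, 10.0, 20.0]   # weight percent of monohydrate
ys = [0.1, 2.2, 5.0, 10.1, 19.9]   # synthetic peak areas

# Ordinary least-squares line through the calibration points.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

def predict_wt_percent(area):
    # Invert the calibration line to quantify an unknown mixture.
    return (area - intercept) / slope

print(round(slope, 3), round(predict_wt_percent(7.5), 2))
```

In practice the LOD (1 wt% in the abstract) is established from the scatter of the calibration residuals near zero, not from the fit alone.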

  1. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  2. Dynamic Chemical Model for $\text{H}_2$/$\text{O}_2$ Combustion Developed Through a Community Workflow

    KAUST Repository

    Oreluk, James; Needham, Craig D.; Baskaran, Sathya; Sarathy, Mani; Burke, Michael P.; West, Richard H.; Frenklach, Michael; Westmoreland, Phillip R.

    2018-01-01

    Elementary-reaction models for $\text{H}_2$/$\text{O}_2$ combustion were evaluated and optimized through a collaborative workflow, establishing accuracy and characterizing uncertainties. Quantitative findings were the optimized model, the importance of $\text{H}_2 + \text{O}_2(1\Delta) = \text{H} + \text{HO}_2$ in high-pressure flames, and the inconsistency of certain low-temperature shock-tube data. The workflow described here is proposed to be even more important because the approach and publicly available cyberinfrastructure allows future community development of evolving improvements. The workflow steps applied here were to develop an initial reaction set using Burke et al. [2012], Burke et al. [2013], Sellevag et al. [2009], and Konnov [2015]; test it for thermodynamic and kinetics consistency and plausibility against other sets in the literature; assign estimated uncertainties where not stated in the sources; select key data targets (

  3. Dynamic Chemical Model for $\text{H}_2$/$\text{O}_2$ Combustion Developed Through a Community Workflow

    KAUST Repository

    Oreluk, James

    2018-01-30

    Elementary-reaction models for $\text{H}_2$/$\text{O}_2$ combustion were evaluated and optimized through a collaborative workflow, establishing accuracy and characterizing uncertainties. Quantitative findings were the optimized model, the importance of $\text{H}_2 + \text{O}_2(1\Delta) = \text{H} + \text{HO}_2$ in high-pressure flames, and the inconsistency of certain low-temperature shock-tube data. The workflow described here is proposed to be even more important because the approach and publicly available cyberinfrastructure allows future community development of evolving improvements. The workflow steps applied here were to develop an initial reaction set using Burke et al. [2012], Burke et al. [2013], Sellevag et al. [2009], and Konnov [2015]; test it for thermodynamic and kinetics consistency and plausibility against other sets in the literature; assign estimated uncertainties where not stated in the sources; select key data targets (

  4. Text messaging approach improves weight loss in patients with nonalcoholic fatty liver disease: A randomized study.

    Science.gov (United States)

    Axley, Page; Kodali, Sudha; Kuo, Yong-Fang; Ravi, Sujan; Seay, Toni; Parikh, Nina M; Singal, Ashwani K

    2018-05-01

    Nonalcoholic fatty liver disease (NAFLD) is emerging as the most common liver disease. The only effective treatment is 7%-10% weight loss. Mobile technology is increasingly used in weight management. This study was performed to evaluate the effects of a text messaging intervention on weight loss in patients with NAFLD. Thirty well-defined NAFLD patients (mean age 52 years, 67% females, mean BMI 38) were randomized 1:1 to a control group: counselling on healthy diet and exercise, or an intervention group: text messages in addition to healthy lifestyle counselling. The NAFLD text messaging program sent weekly messages for 22 weeks on healthy lifestyle education. The primary outcome was change in weight. Secondary outcomes were changes in liver enzymes and lipid profile. The intervention group lost an average of 6.9 lbs. (P = .03) compared to a gain of 1.8 lbs. in the control group (P = .45). The intervention group also showed a decrease in ALT level (-12.5 IU/L, P = .035) and improvement in serum triglycerides (-28 mg/dL, P = .048). There were no changes in the control group in serum ALT level (-6.1 IU/L, P = .46) or serum triglycerides (-20.3 mg/dL, P = .27). Using one-way analysis of variance, the change in outcomes in the intervention group compared to the control group was significant for weight (P = .02) and BMI (P = .02). Text messaging on healthy lifestyle is associated with reduction in weight in NAFLD patients. Larger studies are suggested to examine benefits on liver histology and assess the long-term impact of this approach in patients with NAFLD. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Review of Polynomial Chaos-Based Methods for Uncertainty Quantification in Modern Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Arun Kaintura

    2018-02-01

    Full Text Available Advances in manufacturing process technology are key ensembles for the production of integrated circuits in the sub-micrometer region. It is of paramount importance to assess the effects of tolerances in the manufacturing process on the performance of modern integrated circuits. The polynomial chaos expansion has emerged as a suitable alternative to standard Monte Carlo-based methods that are accurate, but computationally cumbersome. This paper provides an overview of the most recent developments and challenges in the application of polynomial chaos-based techniques for uncertainty quantification in integrated circuits, with particular focus on high-dimensional problems.
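
A one-variable polynomial chaos expansion, the simplest instance of the technique this review surveys, can be written in a few lines. The "circuit response" below is a stand-in polynomial, and the quadrature rule is the standard 3-point Gauss-Hermite rule for a standard normal input; everything specific is invented for illustration.

```python
import math

def f(x):
    # Stand-in for a circuit response driven by one Gaussian process parameter.
    return x * x + 2 * x + 1

# 3-point Gauss-Hermite rule for the standard normal weight (exact to degree 5).
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

# Probabilists' Hermite polynomials He_0, He_1, He_2, with E[He_k^2] = k!.
hermite = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]
factorial = [1.0, 1.0, 2.0]

# PCE coefficients c_k = E[f(X) He_k(X)] / k!, via quadrature.
coeffs = [sum(w * f(x) * h(x) for x, w in zip(nodes, weights)) / factorial[k]
          for k, h in enumerate(hermite)]

# Mean and variance follow directly from the coefficients (no Monte Carlo).
mean = coeffs[0]
variance = sum(coeffs[k] ** 2 * factorial[k] for k in (1, 2))
print(mean, variance)  # f(x) = (x + 1)^2 has mean 2 and variance 6
```

This is the efficiency argument of the abstract in miniature: three model evaluations reproduce statistics that Monte Carlo would need thousands of samples to estimate.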

  6. Analytical approaches for the characterization and quantification of nanoparticles in food and beverages.

    Science.gov (United States)

    Mattarozzi, Monica; Suman, Michele; Cascio, Claudia; Calestani, Davide; Weigel, Stefan; Undas, Anna; Peters, Ruud

    2017-01-01

    Estimating consumer exposure to nanomaterials (NMs) in food products and predicting their toxicological properties are necessary steps in the assessment of the risks of this technology. To this end, analytical methods have to be available to detect, characterize and quantify NMs in food and materials related to food, e.g. food packaging and biological samples following metabolization of food. The challenge for the analytical sciences is that the characterization of NMs requires chemical as well as physical information. This article offers a comprehensive analysis of methods available for the detection and characterization of NMs in food and related products. Special attention was paid to the crucial role of sample preparation methods since these have been partially neglected in the scientific literature so far. The currently available instrumental methods are grouped as fractionation, counting and ensemble methods, and their advantages and limitations are discussed. We conclude that much progress has been made over the last 5 years but that many challenges still exist. Future perspectives and priority research needs are pointed out. Graphical Abstract Two possible analytical strategies for the sizing and quantification of Nanoparticles: Asymmetric Flow Field-Flow Fractionation with multiple detectors (allows the determination of true size and mass-based particle size distribution); Single Particle Inductively Coupled Plasma Mass Spectrometry (allows the determination of a spherical equivalent diameter of the particle and a number-based particle size distribution).

  7. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the TB's evolution. Methods for quantification of chest abnormalities are usually performed on computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm was developed in Matlab for computational processing of the exams, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in strong correlation. Applying Bland and Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)
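
The Bland and Altman analysis mentioned above reduces to a bias and 95% limits of agreement computed from the paired differences between the two quantification methods. The paired values below are invented for illustration.

```python
import statistics

# Hypothetical paired lesion quantifications from the two methods.
radiograph = [10.2, 15.1, 20.3, 24.8, 30.5]
ct         = [10.0, 15.5, 19.8, 25.2, 30.0]

diffs = [a - b for a, b in zip(radiograph, ct)]
bias = statistics.mean(diffs)      # systematic difference between methods
sd = statistics.stdev(diffs)       # sample standard deviation of differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

# Agreement claim of the abstract: all paired differences fall within the LoA.
within = sum(loa[0] <= d <= loa[1] for d in diffs)
print(round(bias, 3), within)
```

A Bland-Altman plot is simply these differences plotted against the pair means, with horizontal lines at `bias` and the two `loa` values.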

  8. A Computational Approach to the Quantification of Animal Camouflage

    Science.gov (United States)

    2014-06-01

    Another approach, albeit logistically difficult, would be to transport cuttlefish native to a chromatically poor habitat to a chromatically rich habitat. Many such challenges remain in the field of sensory ecology, not just of cephalopods in marine habitats but many ...

  9. XPS quantification of the hetero-junction interface energy

    International Nuclear Information System (INIS)

    Ma, Z.S.; Wang Yan; Huang, Y.L.; Zhou, Z.F.; Zhou, Y.C.; Zheng Weitao; Sun, Chang Q.

    2013-01-01

    Highlights: ► Quantum entrapment or polarization dictates the performance of dopants, impurities, interfaces, alloys and compounds. ► Interface bond energy, energy density, and atomic cohesive energy can be determined using XPS and our BOLS theory. ► Presents a new and reliable method for catalyst design and identification. ► Entrapment makes CuPd a p-type catalyst and polarization makes AgPd an n-type catalyst. - Abstract: We present an approach for quantifying the heterogeneous interface bond energy using X-ray photoelectron spectroscopy (XPS). Firstly, by analyzing the XPS core-level shifts of the elemental surfaces, we obtained the energy levels of an isolated atom and their bulk shifts for the constituent elements as a reference; then we measured the energy shifts of the specific energy levels upon interface alloy formation. Subtracting the referential spectrum from that collected from the alloy, we can distil the interface effect on the binding energy. Calibrated against the energy levels and their bulk shifts derived from the elemental surfaces, we can derive the bond energy, energy density, atomic cohesive energy, and free energy in the interface region. This approach has enabled us to clarify the dominance of quantum entrapment at the CuPd interface and the dominance of polarization at the AgPd and BeW interfaces as the origin of the interface energy change. The developed approach not only enhances the power of XPS but also enables quantification of the interface energy at the atomic scale, which has been a long-standing challenge.
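
The subtraction step described in the abstract can be shown schematically: removing the elemental reference spectrum from the alloy spectrum leaves the interface component of the binding-energy signal. The spectra below are arbitrary made-up intensity arrays on a shared binding-energy grid, not XPS data.

```python
# Intensities (counts) on a common binding-energy grid; values are invented.
alloy_spectrum     = [100, 140, 220, 180, 120]
reference_spectrum = [100, 120, 150, 160, 120]

# Channel-by-channel subtraction distils the interface contribution.
interface_component = [a - r for a, r in zip(alloy_spectrum, reference_spectrum)]

# Channel where the interface effect on binding energy is strongest.
shift_channel = max(range(len(interface_component)),
                    key=interface_component.__getitem__)
print(interface_component, shift_channel)
```

In the actual method the residual peak position relative to the bulk component (toward higher binding energy for entrapment, lower for polarization) is what classifies the interface.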

  10. Dependency Analysis Guidance Nordic/German Working Group on Common Cause Failure analysis. Phase 2, Development of Harmonized Approach and Applications for Common Cause Failure Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Guenter; Johanson, Gunnar; Lindberg, Sandra; Vaurio, Jussi

    2009-03-15

    The Regulatory Code SSMFS 2008:1 of the Swedish Radiation Safety Authority (SSM) includes requirements regarding the performance of probabilistic safety assessments (PSA), as well as PSA activities in general. Therefore, the follow-up of these activities is part of the inspection tasks of SSM. According to SSMFS 2008:1, the safety analyses shall be based on a systematic identification and evaluation of such events, event sequences and other conditions which may lead to a radiological accident. The research report Nordic/German Working Group on Common Cause Failure analysis. Phase 2 project report: Development of Harmonized Approach and Applications for Common Cause Failure Quantification has been developed under a contract with the Nordic PSA Group (NPSAG) and its German counterpart VGB, with the aim to create a common experience base for defence against and analysis of dependent failures, i.e. Common Cause Failures (CCF). Phase 2 of this project is a deepened data analysis of CCF events and a demonstration of how the so-called impact vectors can be constructed and how CCF parameters are estimated. The word Guidance in the report title is used to indicate a common methodological guidance accepted by the NPSAG, based on the current state of the art concerning the analysis of dependent failures and adapted to conditions relevant for Nordic sites. This will make it possible for the utilities to perform cost-effective improvements and analyses. The report presents a common attempt by the authorities and the utilities to create a methodology and experience base for defence against and analysis of dependent failures. The performed benchmark application has shown how important the interpretation of base data is for obtaining robust CCF data and data analysis results. Good features were found in all benchmark approaches. The obtained experiences and approaches should now be used in harmonised procedures. A next step could