WorldWideScience

Sample records for accurate quantitative information

  1. Toward 3D structural information from quantitative electron exit wave analysis

    International Nuclear Information System (INIS)

    Borisenko, Konstantin B; Moldovan, Grigore; Kirkland, Angus I; Wang, Amy; Van Dyck, Dirk; Chen, Fu-Rong

    2012-01-01

    Simulations show that using a new direct imaging detector and accurate exit-wave restoration algorithms allows nearly quantitative restoration of the electron exit-wave phase, which can be regarded as only qualitative for conventional indirect imaging cameras. This opens up the possibility of extracting accurate information on the 3D atomic structure of the sample even from a single projection.

  2. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    Science.gov (United States)

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative-stain quantitation methods have been cited as inaccurate and non-reproducible, with detection limits too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high-throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to the viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding.

    Science.gov (United States)

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in a trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding.

  4. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    Directory of Open Access Journals (Sweden)

    Yong-Bi Fu

    2017-07-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in a trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding.

  5. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    Science.gov (United States)

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in a trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding. PMID:28729875

  6. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    Science.gov (United States)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information can bring negative effects, especially when it is delayed: travelers prefer the route reported to be in the best condition, but delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, decreasing capacity, increasing oscillations, and driving the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR: when the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps to improve efficiency in terms of capacity, oscillation, and the gap from the system equilibrium.
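    The effect of the BR threshold is simple enough to sketch in code. The following minimal Python simulation is an illustration only: using route load as the feedback quantity, and the traveler count, delay, and threshold values, are our assumptions, not the authors' exact model.

        import random

        def choose_route(info_a, info_b, br):
            # Boundedly rational rule: if the reported difference between
            # the two routes is below the threshold BR, the routes are
            # indistinguishable to the traveler and are chosen with equal
            # probability; otherwise the reportedly better route is chosen.
            if abs(info_a - info_b) < br:
                return random.choice("AB")
            return "A" if info_a < info_b else "B"

        def simulate(n_travelers=1000, steps=200, delay=3, br=0.1):
            history = [(0.5, 0.5)]  # route loads, used here as the cost proxy
            for _ in range(steps):
                # Travelers see information that is `delay` steps old.
                seen_a, seen_b = history[max(0, len(history) - delay)]
                n_a = sum(choose_route(seen_a, seen_b, br) == "A"
                          for _ in range(n_travelers))
                history.append((n_a / n_travelers, 1 - n_a / n_travelers))
            return history

    With br = 0 and a nonzero delay, the loads flip between the two routes from step to step; a positive BR damps these oscillations, which is the qualitative behavior the abstract reports.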

  7. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    Science.gov (United States)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Until now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes have been the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi response and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can lead to relatively large uncertainty in the results. Owing to efficient FRET and a fluorescence-background-free measurement, highly sensitive and accurate sensing has been achieved, featuring a response of 3.56 per unit change in pHi over the range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
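    As a purely illustrative companion to the reported sensitivity figure, the Python sketch below shows the shape of a self-ratiometric readout. The calibration anchor ratio_at_ph7 is a hypothetical parameter, and the assumed linear, increasing ratio-versus-pH response is our simplification, not a detail taken from the paper.

        def ph_from_upconversion(i_475, i_645, ratio_at_ph7, slope=3.56):
            # Self-ratiometric readout: the FRET-modulated 475 nm band is
            # normalized by the pH-insensitive 645 nm reference band, and
            # pHi is read from a linear calibration. The slope mirrors the
            # reported sensitivity (3.56 per pH unit over pHi 3.0-7.0);
            # ratio_at_ph7 is a hypothetical anchor measured on buffers.
            ratio = i_475 / i_645
            return 7.0 - (ratio_at_ph7 - ratio) / slope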

  8. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NARCIS (Netherlands)

    Li, C.; Zuo, J.; Zhang, L.; Chang, Y.; Zhang, Y.; Tu, L.; Liu, X.; Xue, B.; Li, Q.; Zhao, H.; Zhang, H.; Kong, X.

    2016-01-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Until now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes have been the two main factors impeding accurate quantitative detection of pHi.

  9. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Knowledge of the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method, or by optical microscopy. Chemical analysis, mostly performed by X-ray fluorescence (XRF), therefore forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, and proportioning cement mill feed. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste-recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy, not the elemental composition, and the deficiencies and inherent errors of the Bogue method, as well as of microscopic point counting, are well known. With XRD and Rietveld analysis, a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently, several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software, such as complexity, low performance and severe numerical instability, were prohibitive for automated use. The latest developments in on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also fully automated online phase analysis for production control and quality management, free of any human interaction.

  10. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality.

  11. Quantitative Information Flow as Safety and Liveness Hyperproperties

    Directory of Open Access Journals (Sweden)

    Hirotoshi Yasuoka

    2012-07-01

    We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self composition.

  12. Highly sensitive capillary electrophoresis-mass spectrometry for rapid screening and accurate quantitation of drugs of abuse in urine.

    Science.gov (United States)

    Kohler, Isabelle; Schappler, Julie; Rudaz, Serge

    2013-05-30

    The combination of capillary electrophoresis (CE) and mass spectrometry (MS) is particularly well adapted to bioanalysis due to its high separation efficiency, selectivity, and sensitivity; its short analytical time; and its low solvent and sample consumption. For clinical and forensic toxicology, a two-step analysis is usually performed: first, a screening step for compound identification, and second, confirmation and/or accurate quantitation in cases of presumed positive results. In this study, a fast and sensitive CE-MS workflow was developed for the screening and quantitation of drugs of abuse in urine samples. A CE with a time-of-flight MS (CE-TOF/MS) screening method was developed using a simple urine dilution and on-line sample preconcentration with pH-mediated stacking. The sample stacking allowed for a high loading capacity (20.5% of the capillary length), leading to limits of detection as low as 2 ng/mL for drugs of abuse. Compound quantitation of positive samples was performed by CE-MS/MS with a triple quadrupole MS equipped with an adapted triple-tube sprayer and an electrospray ionization (ESI) source. The CE-ESI-MS/MS method was validated for two model compounds, cocaine (COC) and methadone (MTD), according to the Guidance of the Food and Drug Administration. The quantitative performance was evaluated for selectivity, response function, the lower limit of quantitation, trueness, precision, and accuracy. COC and MTD detection in urine samples was determined to be accurate over the ranges of 10-1000 ng/mL and 21-1000 ng/mL, respectively. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Reverse transcription-quantitative polymerase chain reaction: description of a RIN-based algorithm for accurate data normalization

    Directory of Open Access Journals (Sweden)

    Boissière-Michot Florence

    2009-04-01

    Background: Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the gold standard technique for mRNA quantification, but appropriate normalization is required to obtain reliable data. Normalization to accurately quantitated RNA has been proposed as the most reliable method for in vivo biopsies. However, this approach does not correct for differences in RNA integrity. Results: In this study, we evaluated the effect of RNA degradation on the quantification of the relative expression of nine genes (18S, ACTB, ATUB, B2M, GAPDH, HPRT, POLR2L, PSMB6 and RPLP0) that cover a wide expression spectrum. Our results show that RNA degradation could introduce up to 100% error in gene expression measurements when RT-qPCR data were normalized to total RNA. To achieve greater resolution of small differences in transcript levels in degraded samples, we improved this normalization method by developing a corrective algorithm that compensates for the loss of RNA integrity. This approach allowed us to achieve higher accuracy, since the average error for quantitative measurements was reduced to 8%. Finally, we applied our normalization strategy to the quantification of EGFR, HER2 and HER3 in 104 rectal cancer biopsies. Taken together, our data show that normalizing gene expression measurements while also taking RNA degradation into account allows much more reliable sample comparison. Conclusion: We developed a new normalization method for RT-qPCR data that compensates for loss of RNA integrity and therefore allows accurate gene expression quantification in human biopsies.
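    The paper's corrective algorithm is RIN-based, but its exact form is not reproduced in this record, so the Python sketch below only illustrates the general shape of such a correction under an assumed (hypothetical) model in which measured Ct rises linearly as integrity drops.

        def corrected_expression(ct, rin, slope_per_rin, rin_ref=10.0,
                                 efficiency=2.0):
            # Illustrative RIN-based correction (NOT the authors' exact
            # algorithm): shift the measured Ct back to the value expected
            # for intact RNA (RIN = rin_ref), assuming Ct increases by
            # slope_per_rin cycles per unit of RIN lost, then convert to a
            # relative quantity with the given amplification efficiency.
            ct_corrected = ct - slope_per_rin * (rin_ref - rin)
            return efficiency ** (-ct_corrected)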

  14. A unique charge-coupled device/xenon arc lamp based imaging system for the accurate detection and quantitation of multicolour fluorescence.

    Science.gov (United States)

    Spibey, C A; Jackson, P; Herick, K

    2001-03-01

    In recent years the use of fluorescent dyes in biological applications has dramatically increased. The continual improvement in the capabilities of these fluorescent dyes demands increasingly sensitive detection systems that provide accurate quantitation over a wide linear dynamic range. In the field of proteomics, the detection, quantitation and identification of very low abundance proteins are of extreme importance in understanding cellular processes. Therefore, the instrumentation used to acquire an image of such samples, for spot picking and identification by mass spectrometry, must be sensitive enough not only to maximise the sensitivity and dynamic range of the staining dyes but, as importantly, to adapt to the ever-changing portfolio of fluorescent dyes as they become available. Just as the available fluorescent probes are improving and evolving, so are the users' application requirements. Therefore, the instrumentation chosen must be flexible enough to address and adapt to those changing needs. As a result, a highly competitive market for the supply and production of such dyes, and the instrumentation for their detection and quantitation, has emerged. The instrumentation currently available is based on either laser/photomultiplier tube (PMT) scanning or lamp/charge-coupled device (CCD) based mechanisms. This review briefly discusses the advantages and disadvantages of both system types for fluorescence imaging, gives a technical overview of CCD technology and describes in detail a unique xenon arc lamp/CCD based instrument, from PerkinElmer Life Sciences. The Wallac-1442 ARTHUR is unique in its ability to scan large areas at high resolution and to give accurate, selectable excitation over the whole of the UV/visible range. It operates by filtering both the excitation and emission wavelengths, providing optimal and accurate measurement and quantitation of virtually any available dye, and allows excellent spectral resolution between different fluorophores.

  15. Ultra-accurate collaborative information filtering via directed user similarity

    Science.gov (United States)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge in collaborative filtering (CF) information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users are larger than those in the opposite direction, the large-degree users' selections are recommended extensively by traditional second-order CF algorithms. By considering the direction of user similarity and the second-order correlations to depress the influence of mainstream preferences, we present a directed second-order CF (HDCF) algorithm that specifically addresses the challenge of accuracy and diversity in CF. Numerical results for two benchmark data sets, MovieLens and Netflix, show that the accuracy of the new algorithm outperforms state-of-the-art CF algorithms. Compared with the CF algorithm based on random walks proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an improvement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, diversity, precision and recall are also greatly enhanced. Without relying on any context-specific information, tuning the similarity direction of CF algorithms can yield accurate and diverse recommendations. This work suggests that the direction of user similarity is an important factor in improving personalized recommendation performance.
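    The core idea, an asymmetric user-user similarity, can be sketched compactly in Python. The normalization below (dividing the co-selection count by the target user's degree) is one simple way to make the similarity directed; the HDCF paper's exact weighting and second-order combination may differ.

        import numpy as np

        def directed_similarity(A):
            # A is a binary users-by-items matrix. s[i, j] counts the items
            # selected by both i and j, normalized by the degree of j, so
            # s[i, j] != s[j, i]: small-degree users are more similar to
            # large-degree users than the reverse.
            overlap = A @ A.T
            degrees = np.maximum(A.sum(axis=1), 1)
            return overlap / degrees[None, :]

        def recommend_scores(A):
            # Score items by similarity-weighted votes of the other users;
            # already-selected items should be masked before ranking.
            S = directed_similarity(A).astype(float)
            np.fill_diagonal(S, 0.0)
            return S @ A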

  16. Differential contribution of visual and auditory information to accurately predict the direction and rotational motion of a visual stimulus.

    Science.gov (United States)

    Park, Seoung Hoon; Kim, Seonjin; Kwon, MinHyuk; Christou, Evangelos A

    2016-03-01

    Visual and auditory information are critical for perception and enhance the ability of an individual to respond accurately to a stimulus. However, it is unknown whether visual and auditory information contribute differentially to identifying the direction and rotational motion of a stimulus. The purpose of this study was to determine the ability of an individual to accurately predict the direction and rotational motion of a stimulus based on visual and auditory information. In this study, we recruited 9 expert table-tennis players and used the table-tennis serve as our experimental model. Participants watched recorded serves with different levels of visual and auditory information. The goal was to anticipate the direction of the serve (left or right) and its rotational motion (topspin, sidespin, or cut). We recorded their responses and quantified the following outcomes: (i) directional accuracy and (ii) rotational motion accuracy. Response accuracy was the number of accurate predictions relative to the total number of trials. The ability of the participants to predict the direction of the serve accurately increased with additional visual information but not with auditory information. In contrast, the ability of the participants to predict the rotational motion of the serve accurately increased with the addition of auditory information to visual information, but not with additional visual information alone. In conclusion, these findings demonstrate that visual information enhances the ability of an individual to accurately predict the direction of a stimulus, whereas additional auditory information enhances the ability to accurately predict its rotational motion.

  17. Accurate quantitation of D+ fetomaternal hemorrhage by flow cytometry using a novel reagent to eliminate granulocytes from analysis.

    Science.gov (United States)

    Kumpel, Belinda; Hazell, Matthew; Guest, Alan; Dixey, Jonathan; Mushens, Rosey; Bishop, Debbie; Wreford-Bush, Tim; Lee, Edmond

    2014-05-01

    Quantitation of fetomaternal hemorrhage (FMH) is performed to determine the dose of prophylactic anti-D (RhIG) required to prevent D immunization of D- women. Flow cytometry (FC) is the most accurate method. However, maternal white blood cells (WBCs) can give high background by binding anti-D nonspecifically, compromising accuracy. Maternal blood samples (69) were sent for FC quantitation of FMH after positive Kleihauer-Betke test (KBT) analysis and RhIG administration. Reagents used were BRAD-3-fluorescein isothiocyanate (FITC; anti-D), AEVZ5.3-FITC (anti-varicella zoster [anti-VZ], negative control), anti-fetal hemoglobin (HbF)-FITC, and the blended two-color reagents BRAD-3-FITC/anti-CD45-phycoerythrin (PE; anti-D/L) and BRAD-3-FITC/anti-CD66b-PE (anti-D/G). PE-positive WBCs were eliminated from analysis by gating. Full blood counts were performed on maternal samples and female donors. Elevated numbers of neutrophils were present in 80% of patients. Red blood cell (RBC) indices varied widely in maternal blood. D+ FMH values obtained with anti-D/L, anti-D/G, and anti-HbF-FITC were very similar (r = 0.99, p < 0.001). Correlation between KBT and anti-HbF-FITC FMH results was low (r = 0.716). Inaccurate FMH quantitation using the current method (anti-D minus anti-VZ) occurred in 71% of samples having less than 15 mL of D+ FMH (RBCs), with insufficient RhIG calculated for 9%. Using the two-color reagents and anti-HbF-FITC, approximately 30% of patients had elevated F cells, 26% had no fetal cells, 6% had D- FMH, 26% had 4 to 15 mL of D+ FMH, and 12% had more than 15 mL of D+ FMH (RBCs), requiring more than 300 μg of RhIG. Without accurate quantitation of D+ FMH by FC, some women would receive inappropriate or inadequate anti-D prophylaxis; the latter may be at risk of immunization leading to hemolytic disease of the newborn. © 2013 American Association of Blood Banks.

  18. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    Science.gov (United States)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging, used to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images, introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and by in vivo measurements of anterior and
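    In outline, the estimator described above reduces to a table lookup followed by a maximization. The Python sketch below assumes a pre-computed likelihood table from the Monte-Carlo step; the grid layout and variable names are illustrative, not the authors' implementation.

        import numpy as np

        def map_retardation(meas_ret, meas_snr, true_grid,
                            ret_grid, snr_grid, pdf):
            # pdf[k, i, j] ~ P(measured retardation ret_grid[i], measured
            # SNR snr_grid[j] | true retardation true_grid[k]), tabulated
            # beforehand by Monte-Carlo simulation of the JM-OCT signal
            # model. The MAP estimate picks the most probable true bin
            # given the observed (retardation, SNR) pair.
            i = np.abs(ret_grid - meas_ret).argmin()
            j = np.abs(snr_grid - meas_snr).argmin()
            k = pdf[:, i, j].argmax()
            return true_grid[k]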

  19. Accurate protein structure modeling using sparse NMR data and homologous structure information.

    Science.gov (United States)

    Thompson, James M; Sgourakis, Nikolaos G; Liu, Gaohua; Rossi, Paolo; Tang, Yuefeng; Mills, Jeffrey L; Szyperski, Thomas; Montelione, Gaetano T; Baker, David

    2012-06-19

    While information from homologous structures plays a central role in X-ray structure determination by molecular replacement, such information is rarely used in NMR structure determination because it can be incorrect, both locally and globally, when evolutionary relationships are inferred incorrectly or there has been considerable evolutionary structural divergence. Here we describe a method that allows robust modeling of protein structures of up to 225 residues by combining 1HN, 13C, and 15N backbone and 13Cβ chemical shift data, distance restraints derived from homologous structures, and a physically realistic all-atom energy function. Accurate models are distinguished from inaccurate models generated using incorrect sequence alignments by requiring that (i) the all-atom energies of models generated using the restraints are lower than those of models generated in unrestrained calculations and (ii) the low-energy structures converge to within 2.0 Å backbone rmsd over 75% of the protein. Benchmark calculations on known structures and blind targets show that the method can accurately model protein structures, even with very remote homology information, to a backbone rmsd of 1.2-1.9 Å relative to the conventionally determined NMR ensembles and of 0.9-1.6 Å relative to X-ray structures for well-defined regions of the protein structures. This approach facilitates the accurate modeling of protein structures using backbone chemical shift data without the need for side-chain resonance assignments and extensive analysis of NOESY cross-peak assignments.

  20. A method to extract quantitative information in analyzer-based x-ray phase contrast imaging

    International Nuclear Information System (INIS)

    Pagot, E.; Cloetens, P.; Fiedler, S.; Bravin, A.; Coan, P.; Baruchel, J.; Haertwig, J.; Thomlinson, W.

    2003-01-01

    Analyzer-based imaging is a powerful phase-sensitive technique that generates improved contrast compared to standard absorption radiography. Numerically combining two images taken on either side of the rocking curve at ±1/2 of its full width at half-maximum (FWHM) provides images of 'pure refraction' and of 'apparent absorption'. In this study, a similar approach is taken by combining images that are symmetric with respect to the peak of the analyzer rocking curve but at general positions ±α·FWHM. These two approaches do not treat the ultra-small-angle scattering produced by the object independently, which can lead to inconsistent results. An accurate way to separately retrieve the quantitative information intrinsic to the object is proposed. It is based on a statistical analysis of the local rocking curve, and allows one to overcome the problems encountered with the previous approaches.
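    For context, the conventional two-image combination that this work improves on can be written in a few lines. Linearizing the rocking curve R(θ) around the two symmetric working points gives I± = IR·(R(θ±) + R'(θ±)·Δθ), which is solved pixel-wise. The Python sketch below shows that baseline; it ignores ultra-small-angle scattering, which is exactly the limitation the paper addresses.

        import numpy as np

        def dei_two_image(i_lo, i_hi, r_lo, r_hi, dr_lo, dr_hi):
            # i_lo, i_hi: images taken at the two rocking-curve working
            # points. r_lo, r_hi: rocking-curve values R(theta) there;
            # dr_lo, dr_hi: slopes R'(theta) there. Solves the linearized
            # pair of equations for apparent absorption and refraction.
            denom = i_lo * dr_hi - i_hi * dr_lo
            refraction = (i_hi * r_lo - i_lo * r_hi) / denom    # delta-theta
            absorption = denom / (r_lo * dr_hi - r_hi * dr_lo)  # I_R
            return absorption, refraction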

  1. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that captures the information flow. The uncertainty of the information is then quantified using Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction of operators under input information overload.

  2. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
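    The 2D Gaussian localization step maps directly onto standard tooling. Below is a minimal Python sketch of sub-pixel column fitting; patch extraction and initial guesses are assumed to come from a coarse peak search, and this is not the authors' code.

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss2d(coords, amp, x0, y0, sx, sy, offset):
            # Elliptical 2D Gaussian model of one atom-column peak.
            x, y = coords
            g = amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                               + (y - y0) ** 2 / (2 * sy ** 2))) + offset
            return g.ravel()

        def locate_column(patch, x_guess, y_guess):
            # Refine an atom-column position to sub-pixel precision by
            # least-squares fitting of a 2D Gaussian to a small patch.
            ny, nx = patch.shape
            x, y = np.meshgrid(np.arange(nx), np.arange(ny))
            p0 = (patch.max() - patch.min(), x_guess, y_guess,
                  2.0, 2.0, patch.min())
            popt, _ = curve_fit(gauss2d, (x, y), patch.ravel(), p0=p0)
            return popt[1], popt[2]  # refined (x0, y0)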

  3. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    Science.gov (United States)

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physicochemical properties of a molecular system, yet its accurate calculation and prediction remain an unresolved problem in the literature. In this work, we propose to make use of quantities from the information-theoretic (IT) approach in density functional reactivity theory to provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and applicable to other systems as well. © 2017 Wiley Periodicals, Inc.

  4. Computationally efficient and quantitatively accurate multiscale simulation of solid-solution strengthening by ab initio calculation

    International Nuclear Information System (INIS)

    Ma, Duancheng; Friák, Martin; Pezold, Johann von; Raabe, Dierk; Neugebauer, Jörg

    2015-01-01

    We propose an approach for the computationally efficient and quantitatively accurate prediction of solid-solution strengthening. It combines the 2-D Peierls–Nabarro model and a recently developed solid-solution strengthening model. Solid-solution strengthening is examined with Al–Mg and Al–Li as representative alloy systems, demonstrating a good agreement between theory and experiments within the temperature range in which the dislocation motion is overdamped. Through a parametric study, two guideline maps of the misfit parameters against (i) the critical resolved shear stress, τ0, at 0 K and (ii) the energy barrier, ΔEb, against dislocation motion in a solid solution with randomly distributed solute atoms are created. With these two guideline maps, τ0 at finite temperatures is predicted for other Al binary systems and compared with available experiments, achieving good agreement.

  5. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    Science.gov (United States)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, with the ability to map minor and trace elements very accurately thanks to the larger detector area and higher count-rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to occur in their images, and are especially helpful for examining possible interface artefacts. The post-processing techniques to improve the quantitation of X-ray map data and the development of post-processing techniques for improved characterisation are covered in this paper.

  6. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    International Nuclear Information System (INIS)

    Wuhrer, R; Moran, K

    2014-01-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, with the ability to map minor and trace elements very accurately thanks to the larger detector area and higher count-rate detectors. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to occur in their images, and are especially helpful for examining possible interface artefacts. The post-processing techniques to improve the quantitation of X-ray map data and the development of post-processing techniques for improved characterisation are covered in this paper.

  7. Highly accurate fluorogenic DNA sequencing with information theory-based error correction.

    Science.gov (United States)

    Chen, Zitian; Zhou, Wenxiong; Qiao, Shuo; Kang, Li; Duan, Haifeng; Xie, X Sunney; Huang, Yanyi

    2017-12-01

    Eliminating errors in next-generation DNA sequencing has proved challenging. Here we present error-correction code (ECC) sequencing, a method to greatly improve sequencing accuracy by combining fluorogenic sequencing-by-synthesis (SBS) with an information theory-based error-correction algorithm. ECC embeds redundancy in sequencing reads by creating three orthogonal degenerate sequences, generated by alternate dual-base reactions. This is similar to encoding and decoding strategies that have proved effective in detecting and correcting errors in information communication and storage. We show that, when combined with a fluorogenic SBS chemistry with raw accuracy of 98.1%, ECC sequencing provides single-end, error-free sequences up to 200 bp. ECC approaches should enable accurate identification of extremely rare genomic variations in various applications in biology and medicine.

  8. Using electronic health records and Internet search information for accurate influenza forecasting.

    Science.gov (United States)

    Yang, Shihao; Santillana, Mauricio; Brownstein, John S; Gray, Josh; Richardson, Stewart; Kou, S C

    2017-05-08

    Accurate influenza activity forecasting helps public health officials prepare and allocate resources for unusual influenza activity. Traditional flu surveillance systems, such as the Centers for Disease Control and Prevention's (CDC) influenza-like illness reports, lag behind real time by 1 to 2 weeks, whereas information contained in cloud-based electronic health records (EHR) and in Internet users' search activity is typically available in near real time. We present a method that combines the information from these two data sources with historical flu activity to produce national flu forecasts for the United States up to 4 weeks ahead of the publication of CDC's flu reports. We extend a method originally designed to track flu using Google searches, named ARGO, to combine information from EHR and Internet searches with historical flu activity. Our regularized multivariate regression model dynamically selects the most appropriate variables for flu prediction every week. The model is assessed for the flu seasons within the period 2013-2016 using multiple metrics, including root mean squared error (RMSE). Our method reduces the RMSE of the publicly available alternative (Healthmap flutrends) method by 33, 20, 17 and 21% for the four time horizons: real time and one, two, and three weeks ahead, respectively. These accuracy improvements are statistically significant at the 5% level. Our real-time estimates correctly identified the peak timing and magnitude of the studied flu seasons. Our method significantly reduces the prediction error when compared to historical publicly available Internet-based prediction systems, demonstrating that (1) the method of combining data sources is as important as data quality and (2) effectively extracting information from cloud-based EHR and Internet search activity leads to accurate flu forecasts.
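    The modeling core, a weekly re-fit regularized regression on search, EHR, and autoregressive features, can be sketched briefly in Python. The feature layout, the LASSO penalty value, and the single-split fit below are illustrative simplifications of the published dynamic, sliding-window procedure.

        import numpy as np
        from sklearn.linear_model import Lasso

        def argo_style_forecast(X_search, X_ehr, y_lags, y, alpha=0.1):
            # One row per week. Train on past weeks to predict the next
            # week's flu activity from search volumes, EHR signals, and
            # lagged flu values; the L1 penalty does variable selection.
            X = np.hstack([X_search, X_ehr, y_lags])
            model = Lasso(alpha=alpha).fit(X[:-1], y[1:])
            return model.predict(X[-1:])  # forecast for the latest week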

  9. Analysis of an Internet Community about Pneumothorax and the Importance of Accurate Information about the Disease.

    Science.gov (United States)

    Kim, Bong Jun; Lee, Sungsoo

    2018-04-01

    The huge improvements in the speed of data transmission and the increasing amount of data available as the Internet has expanded have made it easy to obtain information about any disease. Since pneumothorax frequently occurs in young adolescents, patients often search the Internet for information on pneumothorax. This study analyzed an Internet community for exchanging information on pneumothorax, with an emphasis on the importance of accurate information and doctors' role in providing such information. This study assessed 599,178 visitors to the Internet community from June 2008 to April 2017. There was an average of 190 visitors, 2.2 posts, and 4.5 replies per day. A total of 6,513 posts were made, and 63.3% of them included questions about the disease. The visitors mostly searched for terms such as 'pneumothorax,' 'recurrent pneumothorax,' 'pneumothorax operation,' and 'obtaining a medical certification of having been diagnosed with pneumothorax.' However, 22% of the pneumothorax-related posts by visitors contained inaccurate information. Internet communities can be an important source of information. However, incorrect information about a disease can be harmful for patients. We, as doctors, should try to provide more in-depth information about diseases to patients and to disseminate accurate information about diseases in Internet communities.

  10. Quantitative contrast-enhanced first-pass cardiac perfusion MRI at 3 tesla with accurate arterial input function and myocardial wall enhancement.

    Science.gov (United States)

    Breton, Elodie; Kim, Daniel; Chung, Sohae; Axel, Leon

    2011-09-01

    To develop, and validate in vivo, a robust quantitative first-pass perfusion cardiovascular MR (CMR) method with accurate arterial input function (AIF) and myocardial wall enhancement. A saturation-recovery (SR) pulse sequence was modified to sequentially acquire multiple slices after a single nonselective saturation pulse at 3 Tesla. In each heartbeat, an AIF image is acquired in the aortic root with a short time delay (TD) of 50 ms, followed by the acquisition of myocardial images with longer TD values (~150-400 ms). Longitudinal relaxation rates (R1 = 1/T1) were calculated using an ideal saturation-recovery equation based on the Bloch equations, and the corresponding gadolinium contrast concentrations were calculated assuming the fast water exchange condition. The proposed method was validated against a reference multi-point SR method by comparing their respective R1 measurements in the blood and left ventricular myocardium, before and at multiple time points following contrast injection, in 7 volunteers. R1 measurements with the proposed method and the reference multi-point method were strongly correlated (r > 0.88, P < 10^-5) and in good agreement (mean difference ± 1.96 standard deviations: 0.131 ± 0.317/0.018 ± 0.140 s^-1 for blood/myocardium, respectively). The proposed quantitative first-pass perfusion CMR method measured accurate R1 values for quantification of AIF and myocardial wall contrast agent concentrations in 3 cardiac short-axis slices, in a total acquisition time of 523 ms per heartbeat. Copyright © 2011 Wiley-Liss, Inc.
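    The ideal saturation-recovery relation named above inverts in closed form, which is what makes the short-TD AIF measurement quantitative. A small Python sketch follows; the relaxivity value is a typical literature number at 3 T (an assumption) and must be replaced by the value for the agent actually used.

        import numpy as np

        def r1_from_sr(s_td, s0, td):
            # Ideal saturation recovery: S(TD) = S0 * (1 - exp(-TD * R1)),
            # inverted for R1. td in seconds; s0 is the fully relaxed signal.
            return -np.log(1.0 - s_td / s0) / td

        def gd_concentration(r1_post, r1_pre, relaxivity=4.5):
            # Fast water exchange: R1_post = R1_pre + r1 * [Gd], so the
            # concentration follows from the R1 increase. relaxivity in
            # s^-1 mM^-1 (assumed typical value, agent-dependent).
            return (r1_post - r1_pre) / relaxivity

        # Example: an AIF pixel at TD = 50 ms recovering to 35% of S0
        r1 = r1_from_sr(s_td=0.35, s0=1.0, td=0.050)  # about 8.6 s^-1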

  11. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a nondestructive testing method that furnishes quantitative information, permitting the detection and accurate localization of defects, the measurement of internal dimensions, and the measurement and mapping of density distributions. CT is highly versatile, imposing no restrictions on the form, size or composition of the object. A tomographic system designed and constructed in our laboratory is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  12. Quantitative approaches to information recovery from black holes

    Energy Technology Data Exchange (ETDEWEB)

    Balasubramanian, Vijay [David Rittenhouse Laboratory, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Czech, Bartlomiej, E-mail: vijay@physics.upenn.edu, E-mail: czech@phas.ubc.ca [Department of Physics and Astronomy, University of British Columbia, 6224 Agricultural Road, Vancouver, BC V6T 1Z1 (Canada)

    2011-08-21

    The evaporation of black holes into apparently thermal radiation poses a serious conundrum for theoretical physics: at face value, it appears that in the presence of a black hole, quantum evolution is non-unitary and destroys information. This information loss paradox has its seed in the presence of a horizon causally separating the interior and asymptotic regions in a black hole spacetime. A quantitative resolution of the paradox could take several forms: (a) a precise argument that the underlying quantum theory is unitary, and that information loss must be an artifact of approximations in the derivation of black hole evaporation, (b) an explicit construction showing how information can be recovered by the asymptotic observer, (c) a demonstration that the causal disconnection of the black hole interior from infinity is an artifact of the semiclassical approximation. This review summarizes progress on all these fronts. (topical review)

  13. Augmenting Amyloid PET Interpretations With Quantitative Information Improves Consistency of Early Amyloid Detection.

    Science.gov (United States)

    Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M

    2017-08-01

    Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists, as the availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five (18)F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using standardized quantitative analysis software (MIMneuro) to generate whole-brain and region-of-interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined visual and quantitative data together ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated". These changed interpretations showed lower plaque quantification than those initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases the consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
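    The quantitative-read rule is stated explicitly in the abstract and is trivial to encode; the combination step below is a simplified stand-in for the raters' final judgment, not the study's protocol.

        def quantitative_read(suvr_by_roi, threshold=1.1, min_rois=2):
            # "Elevated" if at least 2 of the 6 regions of interest have
            # an SUV ratio above 1.1 (rule taken from the abstract).
            n_elevated = sum(s > threshold for s in suvr_by_roi)
            return "elevated" if n_elevated >= min_rois else "nonelevated"

        def visq_read(visual, quantitative):
            # Simplified combination: concordant reads stand; discordant
            # reads are flagged for re-review by the rater.
            return visual if visual == quantitative else "re-review"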

  14. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    Science.gov (United States)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field-of-view (FoV) scanning, often relying on mechanical translation, which not only slows down the measuring speed but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. In order to achieve high-accuracy quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning, controlled by on/off state coding of the DMD. Measurements were implemented using biological samples as well as a USAF resolution target, proving the high resolution of quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed that the DMD-based PIE technique provides a potential solution for medical observation and measurement.
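    The engine itself is the standard PIE/ePIE update, which is compact enough to sketch in Python; the DMD simply supplies the illumination patterns that replace mechanical translation. This is a generic textbook-style update, not the authors' code.

        import numpy as np

        def pie_update(obj, probe, measured_amp, pos, alpha=1.0):
            # One update at scan position `pos` (a pair of slices): apply
            # the measured diffraction amplitude in Fourier space, then
            # feed the correction back into the object estimate.
            view = obj[pos]
            exit_wave = view * probe
            fw = np.fft.fft2(exit_wave)
            fw = measured_amp * np.exp(1j * np.angle(fw))  # keep the phase
            corrected = np.fft.ifft2(fw)
            step = alpha * np.conj(probe) / (np.abs(probe) ** 2).max()
            obj[pos] = view + step * (corrected - exit_wave)
            return obj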

  15. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of the security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from real business environment.
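    The economic metrics involved are of the standard quantitative-risk kind; the following Python lines show two of them (annualized loss expectancy and return on security investment) as generic illustrations, not the paper's full model.

        def ale(sle, aro):
            # Annualized Loss Expectancy = Single Loss Expectancy times
            # the Annual Rate of Occurrence of the threat.
            return sle * aro

        def rosi(ale_before, ale_after, annual_cost):
            # Return on Security Investment: risk reduction net of the
            # measure's cost, relative to that cost.
            return (ale_before - ale_after - annual_cost) / annual_cost

        # A measure costing 20k that cuts expected loss from 100k to 30k:
        print(rosi(ale(50_000, 2.0), ale(15_000, 2.0), 20_000))  # 2.5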

  16. Assessing reference genes for accurate transcript normalization using quantitative real-time PCR in pearl millet [Pennisetum glaucum (L.) R. Br.].

    Directory of Open Access Journals (Sweden)

    Prasenjit Saha

    Pearl millet [Pennisetum glaucum (L.) R. Br.], a close relative of Panicoideae food crops and bioenergy grasses, offers an ideal system for functional genomics studies related to C4 photosynthesis and abiotic stress tolerance. Quantitative real-time reverse transcription polymerase chain reaction (qRT-PCR) provides a sensitive platform for such gene expression analyses. However, the lack of suitable internal control reference genes for accurate transcript normalization during qRT-PCR analysis in pearl millet is a major limitation. Here, we conducted a comprehensive assessment of 18 reference genes on 234 samples, which included an array of different developmental tissues, hormone treatments and abiotic stress conditions from three genotypes, to determine appropriate reference genes for accurate normalization of qRT-PCR data. Analyses of Ct values using the Stability Index, BestKeeper, ΔCt, NormFinder, geNorm and RefFinder programs ranked PP2A, TIP41, UBC2, UBQ5 and ACT as the most reliable reference genes for accurate transcript normalization under different experimental conditions. Furthermore, we validated the specificity of these genes for precise quantification of relative gene expression, and provide evidence that a combination of the best reference genes is required to obtain optimal expression patterns for both endogenous genes and transgenes in pearl millet.
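    Of the ranking tools listed, geNorm's stability measure M is simple enough to sketch: it is the average standard deviation of the pairwise log-ratios of a candidate gene with every other candidate (lower M means a more stable reference). A minimal Python version:

        import numpy as np

        def genorm_m(expr):
            # expr: samples-by-genes matrix of relative quantities.
            log_expr = np.log2(expr)
            n_genes = expr.shape[1]
            m = np.empty(n_genes)
            for j in range(n_genes):
                ratios = log_expr[:, [j]] - log_expr     # log2(a_j / a_k)
                sds = ratios.std(axis=0, ddof=1)
                m[j] = np.delete(sds, j).mean()          # exclude k == j
            return m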

  17. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    Science.gov (United States)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behavior and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgical quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out a combined correction of plasma temperature and spectral intensity using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.

  18. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  19. QUAIL: A Quantitative Security Analyzer for Imperative Code

    DEFF Research Database (Denmark)

    Biondi, Fabrizio; Wasowski, Andrzej; Traonouez, Louis-Marie

    2013-01-01

    Quantitative security analysis evaluates and compares how effectively a system protects its secret data. We introduce QUAIL, the first tool able to perform an arbitrary-precision quantitative analysis of the security of a system depending on private information. QUAIL builds a Markov chain model of the system's behavior as observed by an attacker, and computes the correlation between the system's observable output and the behavior depending on the private information, obtaining the expected number of bits of the secret that the attacker will infer by observing the system. QUAIL is able to evaluate the safety of randomized protocols depending on secret data, allowing verification of a security protocol's effectiveness. We experiment with a few examples and show that QUAIL's security analysis is more accurate and revealing than the results of other tools.

  20. Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

    Science.gov (United States)

    Mantel, Bruno; Stoffregen, Thomas A.; Campbell, Alain; Bardy, Benoît G.

    2015-01-01

    Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate. PMID:25856410

  1. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  2. Media and Information Literacy (MIL) in journalistic learning: strategies for accurately engaging with information and reporting news

    Science.gov (United States)

    Inayatillah, F.

    2018-01-01

    In the era of digital technology, abundant information is available from various sources. This ease of access needs to be accompanied by the ability to engage with information wisely; thus, media and information literacy is required. Preliminary observations found that students of Universitas Negeri Surabaya majoring in Indonesian Literature who take the journalism course lack media and information literacy (MIL) skills. Therefore, they need to be equipped with MIL. The method used is descriptive qualitative, which includes data collection, data analysis, and presentation of data analysis. Observation and documentation techniques were used to obtain data on MIL's impact on journalistic learning. This study aims to describe the important role of MIL for journalism students and its impact on journalistic learning for students of Indonesian Literature, batch 2014. The results indicate that journalism is an essential subject for students because it affects how a person perceives news reports. Through the reinforcement of the course, students can avoid hoaxes. MIL-based journalistic learning makes students more skillful at absorbing, processing, and presenting information accurately. The subject influences how students engage with information so that they can report news credibly.

  3. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Full Text Available Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked, proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remain challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data, as a means of data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study, and present data that support and validate previously published results on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  4. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  5. Activity assays and immunoassays for plasma Renin and prorenin: information provided and precautions necessary for accurate measurement

    DEFF Research Database (Denmark)

    Campbell, Duncan J; Nussberger, Juerg; Stowasser, Michael

    2009-01-01

    … into focus the differences in information provided by activity assays and immunoassays for renin and prorenin measurement and has drawn attention to the need for precautions to ensure their accurate measurement. CONTENT: Renin activity assays and immunoassays provide related but different information … provided by these assays and of the precautions necessary to ensure their accuracy.

  6. Perceived relevance and information needs regarding food topics and preferred information sources among Dutch adults: results of a quantitative consumer study

    NARCIS (Netherlands)

    Dillen, van S.M.E.; Hiddink, G.J.; Koelen, M.A.; Graaf, de C.; Woerkum, van C.M.J.

    2004-01-01

    Objective: For more effective nutrition communication, it is crucial to identify sources from which consumers seek information. Our purpose was to assess perceived relevance and information needs regarding food topics, and preferred information sources by means of quantitative consumer research.

  7. The Limitations of Quantitative Social Science for Informing Public Policy

    Science.gov (United States)

    Jerrim, John; de Vries, Robert

    2017-01-01

    Quantitative social science (QSS) has the potential to make an important contribution to public policy. However, it also has a number of limitations. The aim of this paper is to explain these limitations to a non-specialist audience and to identify a number of ways in which QSS research could be improved to better inform public policy.

  8. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  9. Pattern decomposition and quantitative-phase analysis in pulsed neutron transmission

    International Nuclear Information System (INIS)

    Steuwer, A.; Santisteban, J.R.; Withers, P.J.; Edwards, L.

    2004-01-01

    Neutron diffraction methods provide accurate quantitative insight into material properties, with applications ranging from fundamental physics to applied engineering research. Neutron radiography and tomography, on the other hand, are useful tools for non-destructive spatial imaging of materials or engineering components, but are less accurate with respect to quantitative analysis. It is possible to combine the advantages of diffraction and radiography using pulsed neutron transmission in a novel way. Using a pixellated detector at a time-of-flight source, it is possible to collect 2D 'images' containing a great deal of interesting information in the thermal regime. This, together with the unprecedented intensities available at spallation sources and improvements in computing power, allows a re-assessment of transmission methods. It opens the possibility of simultaneous imaging of diverse material properties such as strain or temperature, as well as the variation in attenuation, and can assist in the determination of phase volume fractions. Spatial and time resolution (for dynamic experiments) are limited only by the detector technology and the intensity of the source. In this example, phase information contained in the cross-section is extracted from Bragg edges using an approach similar to pattern decomposition.
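
    A Bragg edge appears at the wavelength λ = 2d_hkl where diffraction from the (hkl) lattice planes cuts off, and at a time-of-flight source the wavelength follows directly from the flight time. A minimal sketch of that conversion is shown below; the flight path and edge time are illustrative values.

```python
# Convert neutron time of flight to wavelength and locate a Bragg edge:
# lambda = h * t / (m_n * L); at a Bragg edge, lambda_edge = 2 * d_hkl.
# Flight path and edge time below are illustrative placeholders.

H_OVER_MN = 3.956e-7  # h / m_n for the neutron, in m^2/s

def tof_to_wavelength_angstrom(t_seconds, flight_path_m):
    """Neutron wavelength (angstrom) for a given time of flight."""
    return H_OVER_MN * t_seconds / flight_path_m * 1e10

L = 10.0          # m, source-to-detector flight path (illustrative)
t_edge = 5.05e-3  # s, observed edge position (illustrative)

lam = tof_to_wavelength_angstrom(t_edge, L)
d_hkl = lam / 2.0  # Bragg edge condition: lambda = 2 * d
print(f"lambda = {lam:.3f} A, d_hkl = {d_hkl:.3f} A")
```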

  10. Accurate screening for synthetic preservatives in beverage using high performance liquid chromatography with time-of-flight mass spectrometry

    International Nuclear Information System (INIS)

    Li Xiuqin; Zhang Feng; Sun Yanyan; Yong Wei; Chu Xiaogang; Fang Yanyan; Zweigenbaum, Jerry

    2008-01-01

    In this study, liquid chromatography time-of-flight mass spectrometry (HPLC/TOF-MS) is applied to the qualitative and quantitative analysis of 18 synthetic preservatives in beverages. Identification by HPLC/TOF-MS is accomplished using the accurate mass (and the empirical formula generated from it) of the protonated molecules [M + H]+ or the deprotonated molecules [M - H]-, along with the accurate masses of their main fragment ions. In order to obtain sufficient sensitivity for quantitation (using the protonated or deprotonated molecule) and additional qualitative mass spectral information from the fragment ions, a segmented program of fragmentor voltages is designed in positive and negative ion mode, respectively. Accurate mass measurements are highly useful in complex sample analyses, since they allow a high degree of specificity, often needed when other interferents are present in the matrix. The mass accuracy typically obtained is routinely better than 3 ppm. The 18 compounds behave linearly in the 0.005-5.0 mg kg-1 concentration range, with correlation coefficients >0.996. The recoveries at tested concentrations of 1.0-100 mg kg-1 are 81-106%, with coefficients of variation … . The limits of detection are far below the required maximum residue level (MRL) for these preservatives in foodstuffs. The method is suitable for routine quantitative and qualitative analyses of synthetic preservatives in foodstuffs.
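
    Identification by accurate mass rests on a simple figure of merit: the deviation between measured and theoretical m/z in parts per million. A minimal sketch follows, using benzoic acid (a common beverage preservative, [M - H]- at theoretical m/z 121.0295) with an illustrative instrument reading.

```python
# Mass-accuracy check used in accurate-mass identification:
# ppm error = (m_measured - m_theoretical) / m_theoretical * 1e6.
# The measured value below is an illustrative placeholder.

def ppm_error(m_measured, m_theoretical):
    """Signed mass error in parts per million."""
    return (m_measured - m_theoretical) / m_theoretical * 1e6

m_theo = 121.0295  # benzoic acid [M - H]-, C7H5O2-
m_meas = 121.0298  # illustrative instrument reading

err = ppm_error(m_meas, m_theo)
print(f"{err:+.1f} ppm")  # +2.5 ppm
assert abs(err) < 3.0, "outside the ~3 ppm window reported for this method"
```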

  11. Perceived Physician-informed Weight Status Predicts Accurate Weight Self-Perception and Weight Self-Regulation in Low-income, African American Women.

    Science.gov (United States)

    Harris, Charlie L; Strayhorn, Gregory; Moore, Sandra; Goldman, Brian; Martin, Michelle Y

    2016-01-01

    Obese African American women under-appraise their body mass index (BMI) classification and report fewer weight loss attempts than women who accurately appraise their weight status. This cross-sectional study examined whether physician-informed weight status could predict weight self-perception and weight self-regulation strategies in obese women. A convenience sample of 118 low-income women completed a survey assessing demographic characteristics, comorbidities, weight self-perception, and weight self-regulation strategies. BMI was calculated during nurse triage. Binary logistic regression models were performed to test hypotheses. The odds of obese accurate appraisers having been informed about their weight status were six times greater than those of under-appraisers. The odds of those using an "approach" self-regulation strategy having been physician-informed were four times greater compared with those using an "avoidance" strategy. Physicians are uniquely positioned to influence accurate weight self-perception and adaptive weight self-regulation strategies in underserved women, reducing their risk for obesity-related morbidity.

  12. Optimisation of information influences on problems of consequences of Chernobyl accident and quantitative criteria for estimation of information actions

    International Nuclear Information System (INIS)

    Sobaleu, A.

    2004-01-01

    The consequences of the Chernobyl NPP accident are still very important for Belarus. About 2 million Belarusians live in districts polluted by Chernobyl radionuclides. Modern approaches to solving post-Chernobyl problems in Belarus assume more active use of information and educational actions to foster a new radiological culture. This will allow the internal radiation dose to be reduced without spending much money or other resources. Experience of information work with the population affected by Chernobyl from 1986 to 2004 has shown that information and educational influences do not always reach their final aim: application of the received knowledge on radiation safety in practice and a change in lifestyle. Taking into account limited funds and facilities, information work should be optimized. Optimization can be achieved on the basis of quantitative estimations of the effectiveness of information actions. Two parameters can be used for these quantitative estimations: 1) the increase in knowledge of the population and experts on radiation safety, calculated by a new method based on the applied theory of information (the Mathematical Theory of Communication of Claude E. Shannon), and 2) the reduction of the internal radiation dose, calculated on the basis of measurements with a human irradiation counter (HIC) before and after an information or educational influence. (author)

  13. Quantitative predictions from competition theory with incomplete information on model parameters tested against experiments across diverse taxa

    OpenAIRE

    Fort, Hugo

    2017-01-01

    We derive an analytical approximation for making quantitative predictions for ecological communities as a function of the mean intensity of inter-specific competition and the species richness. This method, using only a fraction of the model parameters (carrying capacities and competition coefficients), is able to accurately predict empirical measurements covering a wide variety of taxa (algae, plants, protozoa).

  14. Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review

    Science.gov (United States)

    Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael

    2014-01-01

    Background: A systematic literature review was conducted in the mixed-methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken based on a predefined protocol. The references were identified using inclusion and…

  15. Towards accurate emergency response behavior

    International Nuclear Information System (INIS)

    Sargent, T.O.

    1981-01-01

    Nuclear reactor operator emergency response behavior has persisted as a training problem through a lack of information. The industry needs an accurate definition of operator behavior in adverse stress conditions, and training methods which will produce the desired behavior. Newly assembled information from fifty years of research into human behavior under both high and low stress provides a more accurate definition of appropriate operator response, and supports training methods which will produce the needed control room behavior. The research indicates that operator response in emergencies is divided into two modes: conditioned behavior and knowledge-based behavior. Methods which assure accurate conditioned behavior, and provide for the recovery of knowledge-based behavior, are described in detail.

  16. Quantitative study of FORC diagrams in thermally corrected Stoner–Wohlfarth nanoparticles systems

    International Nuclear Information System (INIS)

    De Biasi, E.; Curiale, J.; Zysler, R.D.

    2016-01-01

    The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to achieve quantitative information, requires an appropriate model of the studied system; for that reason most FORC studies are used for qualitative analysis. In magnetic systems, thermal fluctuations 'blur' the signatures of the anisotropy, volume and particle-interaction distributions, so thermal effects in nanoparticle systems conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner–Wohlfarth (easy axes along the external field orientation) nanoparticle systems. The starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature and measurement time. In order to study the quantitative degree of accuracy of our model, we built FORC diagrams for different archetypical cases of magnetic nanoparticles. Our results show that, from the quantitative information obtained from the diagrams under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy improves at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive-field profile. Indeed, our simulations predict that the volume distribution plays a secondary role, with its mean value and deviation being the only important parameters; it is therefore possible to obtain an accurate result for the inversion and interaction fields regardless of the features of the volume distribution. - Highlights: • Quantifies the degree of accuracy of the information obtained using FORC diagrams.
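
    For readers new to the technique, the FORC distribution itself is the mixed second derivative of the magnetization measured along first-order reversal curves, ρ(Ha, Hb) = -(1/2) ∂²M/∂Ha∂Hb. A minimal numerical sketch on a synthetic M(Ha, Hb) grid is given below; the magnetization surface is illustrative, not output of the authors' model.

```python
# FORC distribution rho(Ha, Hb) = -0.5 * d^2 M / (dHa dHb), computed by
# finite differences on a grid of magnetization values.
# The synthetic magnetization surface below is illustrative.
import numpy as np

Ha = np.linspace(-100.0, 0.0, 101)    # reversal fields (mT)
Hb = np.linspace(-100.0, 100.0, 201)  # measurement fields (mT)
HA, HB = np.meshgrid(Ha, Hb, indexing="ij")

# Illustrative response whose switching field shifts with the reversal field
M = np.tanh((HB - 0.3 * HA - 20.0) / 15.0)

dM_dHb = np.gradient(M, Hb, axis=1)           # first derivative along Hb
rho = -0.5 * np.gradient(dM_dHb, Ha, axis=0)  # then along Ha, with -1/2 factor

print(rho.shape, rho.max())  # the peak marks the dominant coercivity/bias pair
```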

  17. Providing Quantitative Information and a Nudge to Undergo Stool Testing in a Colorectal Cancer Screening Decision Aid: A Randomized Clinical Trial.

    Science.gov (United States)

    Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M

    2017-08-01

    Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not ( P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not ( P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not ( P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT ( P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.

  18. Liquid chromatography-mass spectrometry-based quantitative proteomics.

    Science.gov (United States)

    Linscheid, Michael W; Ahrends, Robert; Pieper, Stefan; Kühn, Andreas

    2009-01-01

    During the last decades, the molecular sciences revolutionized biomedical research and gave rise to the biotechnology industry. During the next decades, the application of the quantitative sciences (informatics, physics, chemistry, and engineering) to biomedical research will bring about the next revolution, improving human healthcare and certainly creating new technologies, since there is no doubt that small changes can have great effects. It is not a question of "yes" or "no," but of "how much," to make best use of the medical options we will have. In this context, the development of accurate analytical methods must be considered a cornerstone, since the understanding of biological processes will be impossible without information about the minute changes induced in cells by interactions of cell constituents with all sorts of endogenous and exogenous influences and disturbances. The first quantitative techniques developed allowed monitoring of relative changes only, but they clearly showed the significance of the information obtained. The recent advent of techniques claiming to quantify proteins and peptides not only relative to each other, but also in an absolute fashion, promised another quantum leap, since knowing the absolute amount will allow comparison of even unrelated species, and the definition of parameters will permit modeling of biological systems far more accurately than before. To bring these promises to life, several approaches are under development at this point in time, and this review is focused on those developments.

  19. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  20. Understanding the information needs of people with haematological cancers. A meta-ethnography of quantitative and qualitative research.

    Science.gov (United States)

    Atherton, K; Young, B; Salmon, P

    2017-11-01

    Clinical practice in haematological oncology often involves difficult diagnostic and treatment decisions. In this context, understanding patients' information needs and the functions that information serves for them is particularly important. We systematically reviewed qualitative and quantitative evidence on haematological oncology patients' information needs to inform how these needs can best be addressed in clinical practice. PsycINFO, Medline and CINAHL Plus electronic databases were searched for relevant empirical papers published from January 2003 to July 2016. Synthesis of the findings drew on meta-ethnography and meta-study. Most quantitative studies used a survey design and indicated that patients are largely content with the information they receive from physicians, however much or little they actually receive, although a minority of patients are not content with information. Qualitative studies suggest that a sense of being in a caring relationship with a physician allows patients to feel content with the information they have been given, whereas patients who lack such a relationship want more information. The qualitative evidence can help explain the lack of association between the amount of information received and contentment with it in the quantitative research. Trusting relationships are integral to helping patients feel that their information needs have been met. © 2017 John Wiley & Sons Ltd.

  1. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  2. A general method for bead-enhanced quantitation by flow cytometry

    Science.gov (United States)

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that adding known quantities of polystyrene fluorescence standardization beads to samples permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
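
    The arithmetic behind bead-enhanced counting is a simple ratio: because the number of spiked-in beads is known, the ratio of cell events to bead events scales directly to an absolute concentration. A minimal sketch under that assumption follows; all event counts and volumes are illustrative.

```python
# Bead-enhanced absolute counting: cells/uL from the ratio of cell events
# to bead events, given a known number of spiked-in beads.
# All numbers below are illustrative placeholders.

def absolute_count_per_ul(cell_events, bead_events, beads_added, sample_volume_ul):
    """Absolute cell concentration (cells/uL) from bead-enhanced cytometry."""
    cells_in_sample = cell_events / bead_events * beads_added
    return cells_in_sample / sample_volume_ul

# 12,000 CD4+ events and 4,000 bead events, with 50,000 beads added to 100 uL:
print(absolute_count_per_ul(12_000, 4_000, 50_000, 100.0))  # 1500.0 cells/uL
```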

  3. Identifying Contributors of DNA Mixtures by Means of Quantitative Information of STR Typing

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2012-01-01

    Estimating the weight of evidence in forensic genetics is often done in terms of a likelihood ratio, LR. The LR evaluates the probability of the observed evidence under competing hypotheses. Most often, probabilities used in the LR only consider the evidence from the genomic variation identified using polymorphic genetic markers. However, modern typing techniques supply additional quantitative data, which contain very important information about the observed evidence. This is particularly true for cases of DNA mixtures, where more than one individual has contributed to the observed biological stain. This article presents a method for including the quantitative information of short tandem repeat (STR) DNA mixtures in the LR. Also, an efficient algorithmic method for finding the best matching combination of DNA mixture profiles is derived and implemented in an on-line tool for two…
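
    The LR itself is simply the ratio of the probability of the evidence under two competing hypotheses; the article's contribution is to let quantitative peak heights enter those probabilities. A deliberately toy sketch of the idea follows, using a naive Gaussian peak-height model; the hypotheses, expected heights and variance are hypothetical and are not the authors' statistical model.

```python
# Toy illustration of a quantitative LR for a two-person DNA mixture at one
# locus. Observed peak heights are modeled as Gaussian around the heights
# expected under each hypothesis. This is a naive sketch, not the authors'
# model; every number below is hypothetical.
import math

def gaussian_likelihood(observed, expected, sigma=50.0):
    """Product of independent normal densities over the allele peaks."""
    lik = 1.0
    for o, e in zip(observed, expected):
        lik *= math.exp(-0.5 * ((o - e) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    return lik

observed = [900.0, 850.0, 310.0]     # peak heights (RFU) for alleles A, B, C

# Hp: suspect (A,B) is the major contributor, an unknown minor adds C
expected_hp = [900.0, 900.0, 300.0]  # illustrative expectations
# Hd: two unknown contributors in roughly equal proportions
expected_hd = [600.0, 600.0, 600.0]  # illustrative expectations

lr = gaussian_likelihood(observed, expected_hp) / gaussian_likelihood(observed, expected_hd)
print(f"LR = {lr:.2e}")  # LR > 1 favors Hp under this toy model
```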

  4. The controlled incorporation of foreign elements in metal surfaces by means of quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1977-01-01

    Quantitative ion implantation is a powerful new method for the doping of metal surfaces with accurately known quantities of an element or one of its isotopes. It can be applied for the preparation of standards for various uses in instrumental methods of surface and bulk analysis. This paper provides selected information on some theoretical and practical aspects of quantitative ion implantation with the object of promoting the application of the method and stimulating further purposeful research on the subject. (Auth.)

  5. Quantitative Study of Emotional Intelligence and Communication Levels in Information Technology Professionals

    Science.gov (United States)

    Hendon, Michalina

    2016-01-01

    This quantitative, non-experimental, correlational study analyzes the relationship between emotional intelligence and communication, addressing the lack of such research on information technology professionals in the U.S. One hundred and eleven (111) participants completed a survey that measures both emotional intelligence and communication…

  6. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integration methods, a novel vibration-signal integration method based on feature-information extraction is proposed. The method takes full advantage of the self-adaptive filtering characteristic and waveform-correction feature of ensemble empirical mode decomposition (EEMD) in dealing with nonlinear and nonstationary signals. It merges the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction, combining the values of these four indexes into a feature vector. The connotative characteristic components in the vibration signal are then accurately extracted by a Euclidean-distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is effectively removed. The large cumulative error of traditional time-domain integration is overcome, and the large low-frequency error of traditional frequency-domain integration is avoided. Compared with traditional integration methods, this method is outstanding at removing noise while retaining useful feature information, and shows higher accuracy.
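
    For context, the classical frequency-domain integration that this method is benchmarked against divides the acceleration spectrum by iω; its weakness is exactly the low-frequency error noted above. A minimal sketch of that baseline follows, with an illustrative test signal.

```python
# Classical frequency-domain integration of an acceleration signal:
# V(w) = A(w) / (i*w), with the DC bin zeroed. This is the baseline whose
# low-frequency error the feature-extraction method above is designed to avoid.
# The test signal is illustrative.
import numpy as np

fs = 1000.0  # Hz, sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
accel = 2.0 * np.cos(2 * np.pi * 50 * t)  # known answer: velocity amplitude 2/(2*pi*50)

A = np.fft.rfft(accel)
w = 2 * np.pi * np.fft.rfftfreq(len(accel), d=1.0 / fs)
V = np.zeros_like(A)
V[1:] = A[1:] / (1j * w[1:])  # divide by i*w, skipping the DC bin
velocity = np.fft.irfft(V, n=len(accel))

print(np.max(np.abs(velocity)))  # ~ 2 / (2*pi*50) = 0.00637
```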

  7. Accurate Information, Virtual Reality, Good Librarianship Doğru Bilgi, Sanal Gerçeklik, İyi Kütüphanecilik

    Directory of Open Access Journals (Sweden)

    M. Tayfun Gülle

    2010-03-01

    Full Text Available Departing from the idea that the internet, which has become a deep information tunnel, causes a problem in accessing "accurate information", it is argued that societies are imprisoned within a world of "virtual reality" by web 2.0/web 3.0 technologies and social media applications. In order to diagnose this problem correctly, the media used from past to present for accessing information are briefly described as "social tools". Furthermore, it is emphasised, from an editorial viewpoint, that the means of reaching accurate information can be increased via the channel of freedom of expression that "good librarianship" practices will bring forth. The IFLA principles on freedom of expression and good librarianship are referred to at the end of the editorial.

  8. Qualitative and Quantitative Data on the Use of the Internet for Archaeological Information

    Directory of Open Access Journals (Sweden)

    Lorna-Jane Richardson

    2015-04-01

    Full Text Available These survey results are from an online survey of 577 UK-based archaeological volunteers, professional archaeologists and archaeological organisations. These data cover a variety of topics related to how and why people access the Internet for information about archaeology, including demographic information, activity relating to accessing information on archaeological topics, archaeological sharing and networking and the use of mobile phone apps and QR codes for public engagement. There is wide scope for further qualitative and quantitative analysis of these data.

  9. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    Directory of Open Access Journals (Sweden)

    Upasna Saluja

    2012-06-01

    Full Text Available Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm, most organizations end up choosing the more flexible, easier-to-deploy and customizable qualitative models of risk assessment. In practice, such models often call upon practitioners to make qualitative judgments on a relative rating scale, which leaves considerable room for errors, biases and subjectivity. Under the quantitative risk analysis approach, on the other hand, estimation of risk is connected with the application of numerical measures of some kind. Medical risk management models lend themselves as ideal candidates for deriving lessons for information security risk management: the considerably developed understanding of risk from the medical field, especially survival analysis, can be applied to the risks that information infrastructures face. Similarly, the financial risk management discipline, with market risk and credit risk, prides itself on perhaps the most quantifiable of risk models; information security risk management can make risk measurement more objective and quantitative by referring to the approach of credit risk. During the recent financial crisis, many investors and financial institutions lost money or went bankrupt because they did not apply the basic principles of risk management. Learning from the financial crisis provides some valuable lessons for information risk management.

  10. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today's models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to "future-proof" their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today's climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in communicating climate model output: a language which accurately states that models are "better", have "improved", and now "include" and "simulate" relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  11. Some exercises in quantitative NMR imaging

    International Nuclear Information System (INIS)

    Bakker, C.J.G.

    1985-01-01

    The articles presented in this thesis result from a series of investigations that evaluate the potential of NMR imaging as a quantitative research tool. The first article explores the possible use of the proton spin-lattice relaxation time T1 in tissue characterization, tumor recognition and monitoring of tissue response to radiotherapy. The next article addresses the question of whether water proton spin-lattice relaxation curves of biological tissues are adequately described by a single time constant T1, and analyzes the implications of multi-exponentiality for quantitative NMR imaging. In the third article the use of NMR imaging as a quantitative research tool is discussed on the basis of phantom experiments. The fourth article describes a method which enables unambiguous retrieval of sign information in a set of magnetic resonance images of the inversion recovery type. The next article shows how this method can be adapted to allow accurate calculation of T1 pictures on a pixel-by-pixel basis. The sixth article, finally, describes a simulation procedure which enables a straightforward determination of NMR imaging pulse sequence parameters for optimal tissue contrast. (orig.)
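
    The sign retrieval and pixel-wise T1 calculation described above rest on the inversion-recovery signal equation S(TI) = S0(1 - 2 exp(-TI/T1)); magnitude images discard the sign of S, which is why it must be restored before fitting. A minimal per-pixel fitting sketch under that standard model follows; the inversion times and data are illustrative.

```python
# Pixel-wise T1 estimation from inversion-recovery data, assuming the
# standard model S(TI) = S0 * (1 - 2*exp(-TI/T1)) with the sign restored.
# Inversion times and the synthetic "measurement" are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def ir_signal(ti, s0, t1):
    """Inversion-recovery signal with correct sign (negative at short TI)."""
    return s0 * (1.0 - 2.0 * np.exp(-ti / t1))

ti = np.array([50.0, 150.0, 400.0, 800.0, 1600.0, 3200.0])  # ms
rng = np.random.default_rng(0)
true_s0, true_t1 = 1000.0, 600.0  # illustrative pixel values
signal = ir_signal(ti, true_s0, true_t1) + rng.normal(0.0, 5.0, ti.size)

(s0_fit, t1_fit), _ = curve_fit(ir_signal, ti, signal, p0=[900.0, 500.0])
print(f"T1 = {t1_fit:.0f} ms (true {true_t1:.0f} ms)")
```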

  12. A multi-method approach to evaluate health information systems.

    Science.gov (United States)

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is challenging to weigh the contribution of various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on a comprehensive literature review and the author's own practice of evaluating health information systems, a multi-method approach is proposed that incorporates both quantitative and qualitative measurement and is centered on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of a HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  13. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    Science.gov (United States)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess the attitudes of future non-scientists towards science, as well as basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes towards science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends in each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  14. Do we need 3D tube current modulation information for accurate organ dosimetry in chest CT? Protocols dose comparisons.

    Science.gov (United States)

    Lopez-Rendon, Xochitl; Zhang, Guozhi; Coudyzer, Walter; Develter, Wim; Bosmans, Hilde; Zanca, Federica

    2017-11-01

    To compare the lung and breast dose associated with three chest protocols: standard, organ-based tube current modulation (OBTCM) and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-) TCM versus the 3D TCM in Monte Carlo (MC) simulations for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and fast-speed protocols was matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC simulations with both the z-TCM and the 3D TCM modelled, and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction in breast dose (up to 5.1%) was observed for two of the three female cadavers with OBTCM in comparison to the standard protocol. For both adult and paediatric cadavers, using the z-TCM data alone for organ dose estimation resulted in accuracy within 10.0% for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D TCM needed if accurate (<10.0%) organ dosimetry is desired. • The z-TCM information is sufficient for accurate dosimetry for standard protocols. • The z-TCM information is sufficient for accurate dosimetry for fast-speed scanning protocols. • For organ-based TCM schemes, the 3D TCM information is necessary for accurate dosimetry. • At identical CTDIvol, the fast-speed scanning protocol delivered the highest doses. • Lung dose was higher with XCare than with the standard protocol at identical CTDIvol.

  15. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture

    Directory of Open Access Journals (Sweden)

    Zhiquan Gao

    2015-09-01

    Full Text Available Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth-camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still obtain an accurate estimate even when significant occlusion occurs. Because human motion is temporally coherent, we adopt a learning analysis to mine the temporal information across posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform accurate pose estimation of the human body with the constraint of information from the temporal domain.
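
    The abstract does not spell out the fusion step, but the intuition behind using two sensors can be made concrete with a deliberately naive baseline: after registering both sensors into one coordinate frame, average each joint's 3-D position weighted by per-sensor tracking confidence, so an occluded view contributes little. The sketch below is that baseline only, not the authors' learning-based method; all data are illustrative.

```python
# Naive two-sensor skeleton fusion: per-joint confidence-weighted averaging
# of 3-D positions, assuming both sensors are already registered into one
# coordinate frame. Illustrative baseline, not the paper's method.
import numpy as np

def fuse_joints(pos_a, conf_a, pos_b, conf_b):
    """pos_*: (n_joints, 3) positions; conf_*: (n_joints,) weights in [0, 1]."""
    w_a, w_b = conf_a[:, None], conf_b[:, None]
    total = np.clip(w_a + w_b, 1e-6, None)  # avoid division by zero
    return (w_a * pos_a + w_b * pos_b) / total

# One joint seen well by sensor A but occluded for sensor B (illustrative):
pos_a = np.array([[0.10, 1.20, 2.00]])
pos_b = np.array([[0.30, 1.10, 2.10]])
conf_a = np.array([0.9])
conf_b = np.array([0.1])  # low confidence: occluded view

print(fuse_joints(pos_a, conf_a, pos_b, conf_b))  # stays close to sensor A
```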

  16. Presenting efficacy information in direct-to-consumer prescription drug advertisements.

    Science.gov (United States)

    O'Donoghue, Amie C; Sullivan, Helen W; Aikin, Kathryn J; Chowdhury, Dhuly; Moultrie, Rebecca R; Rupert, Douglas J

    2014-05-01

    We evaluated whether presenting prescription drug efficacy information in direct-to-consumer (DTC) advertising helps individuals accurately report a drug's benefits and, if so, which numerical format is most helpful. We conducted a randomized, controlled study of individuals diagnosed with high cholesterol (n=2807) who viewed fictitious prescription drug print or television ads containing either no drug efficacy information or efficacy information in one of five numerical formats. We measured drug efficacy recall, drug perceptions and attitudes, behavioral intentions, and drug risk recall. Individuals who viewed absolute frequency and/or percentage information more accurately reported drug efficacy than participants who viewed no efficacy information. Participants who viewed relative frequency information generally reported drug efficacy less accurately than participants who viewed other numerical formats. Adding efficacy information to DTC ads, both in print and on television, may potentially increase an individual's knowledge of a drug's efficacy, which may improve patient-provider communication and promote more informed decisions. Providing quantitative efficacy information in a combination of formats (e.g., absolute frequency and percent) may help patients remember information and make decisions about prescription drugs. Published by Elsevier Ireland Ltd.

  17. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was twofold: to demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defects. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  18. Validating quantitative precipitation forecast for the Flood ...

    Indian Academy of Sciences (India)

    In order to issue an accurate warning for flood, a better or appropriate quantitative forecasting of precipitation is required. In view of this, the present study intends to validate the quantitative precipitation forecast (QPF) issued during the southwest monsoon season for six river catchments (basins) under the flood meteorological ...

  19. Quantitative remote visual inspection in nuclear power industry

    International Nuclear Information System (INIS)

    Stone, M.C.

    1992-01-01

    A borescope is an instrument that is used within the power industry to visually inspect remote locations. It is typically used for inspections of heat exchangers, condensers, boiler tubes, and steam generators and in many general inspection applications. The optical system of a borescope, like the human eye, does not have a fixed magnification. When viewing an object close up, it appears large; when the same object is viewed from afar, it appears small. Humans, though, have two separate eyes and a brain that process information to calculate the size of an object. These attributes are considered secondary information. Until now, making a measurement using a borescope has been an educated guess. There has always been a need to make accurate measurements from borescope images. The realization of this capability would make remote visual inspection a quantitative nondestructive testing method versus a qualitative one. For nuclear power plants, it is an excellent technique for maintaining radiation levels as low as reasonably achievable. Remote visual measurement provides distance and limits the exposure time needed to make accurate measurements. The design problem, therefore, was to develop the capability to make accurate and repeatable measurements of objects or physical defects with a borescope-type instrument. The solution was achieved by designing a borescope with a novel shadow projection mechanism, integrated with an electronics module containing the video display circuitry and a measurement computer

  20. Quantitative digital radiography with two dimensional flat panels

    International Nuclear Information System (INIS)

    Dinten, J.M.; Robert-Coutant, C.; Darboux, M.

    2003-01-01

    Purpose: The attenuation law relates radiographic images to the thickness and chemical composition of the irradiated object. Film radiography exploits this property only qualitatively for diagnosis. Digital radiographic flat panels present the large dynamic range, reproducibility and linearity that open the door to quantification. We present, through two applications (mammography and bone densitometry), an approach to extract quantitative information from digital 2D radiographs. Material and method: The main difficulty for quantification is X-ray scatter, which is superimposed on the acquired data. Because of multiple scattering and its dependence on 3D geometry, it cannot be described directly by an exact analytical model. We have therefore developed an approach for its estimation and subtraction from medical radiographs, based on approximations and derivations of analytical models of scatter formation in human tissues. Results: In digital mammography, the objective is to build a map of glandular tissue thickness. Its separation from fat tissue is based on two equations: the height of compression and the attenuation; the latter requires X-ray scatter correction. In bone densitometry, physicians look for quantitative bone mineral density. Today's clinical DEXA systems use collimated single or linear detectors to eliminate scatter, a scanning technology that yields poor image quality. By applying our scatter correction approach, we have developed a bone densitometer using a digital flat panel (Lexxos, DMS) which provides accurate and reproducible measurements while presenting radiological image quality. Conclusion: These applications show how information processing, and especially X-ray scatter processing, enables quantitative information to be extracted from digital radiographs. This approach, combined with computer-aided diagnosis or reconstruction algorithms, gives access to information useful for diagnosis. (author)
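
    The mammography example reduces to a 2x2 linear system: the compression height constrains the sum of the glandular and fat thicknesses, while the scatter-corrected attenuation constrains their weighted sum. A minimal sketch of that solve follows; the attenuation coefficients and measurements are illustrative, not the paper's calibration values.

```python
# Glandular/fat thickness separation from the two equations in the abstract:
#   t_g + t_f = h                       (compression height)
#   mu_g*t_g + mu_f*t_f = ln(I0 / I)    (attenuation, after scatter correction)
# Attenuation coefficients and measurements below are illustrative.
import numpy as np

mu_g = 0.80      # 1/cm, glandular tissue at a given beam quality (illustrative)
mu_f = 0.45      # 1/cm, fat tissue (illustrative)
h = 5.0          # cm, compressed breast height
log_atten = 3.0  # ln(I0 / I) for one pixel, scatter already subtracted

A = np.array([[1.0, 1.0], [mu_g, mu_f]])
b = np.array([h, log_atten])
t_g, t_f = np.linalg.solve(A, b)
print(f"glandular: {t_g:.2f} cm, fat: {t_f:.2f} cm")  # 2.14 cm, 2.86 cm
```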

  1. Quantitative cone beam X-ray luminescence tomography/X-ray computed tomography imaging

    International Nuclear Information System (INIS)

    Chen, Dongmei; Zhu, Shouping; Chen, Xueli; Chao, Tiantian; Cao, Xu; Zhao, Fengjun; Huang, Liyu; Liang, Jimin

    2014-01-01

    X-ray luminescence tomography (XLT) is an imaging technology based on X-ray-excitable materials. The main purpose of this paper is to obtain quantitative luminescence concentrations using the structural information from X-ray computed tomography (XCT) in a hybrid cone beam XLT/XCT system. A multi-wavelength luminescence cone beam XLT method with structural a priori information is presented to relieve the severe ill-posedness problem in cone beam XLT. Nanophosphor and phantom experiments were undertaken to assess the linearity of the system response. Then, an in vivo mouse experiment was conducted. The in vivo experimental results show that a recovered concentration error as low as 6.67%, with a location error of 0.85 mm, can be achieved. The results demonstrate that the proposed method can accurately recover the nanophosphor inclusion and realize quantitative imaging.
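
    The "ill-posedness" mentioned above has a standard remedy pattern: augment the least-squares reconstruction with a regularizer built from the structural prior, so that the XCT segmentation constrains the luminescence image. The sketch below shows that generic pattern only; the forward matrix, segmentation and penalty are toy stand-ins, not the authors' multi-wavelength formulation.

```python
# Generic structure-regularized least squares, the usual pattern for relieving
# ill-posedness with anatomical priors: minimize ||A x - b||^2 + lam * ||L x||^2,
# where L penalizes differences only between neighbors in the same segmented
# region. A, b and the segmentation are toy stand-ins, not the authors' model.
import numpy as np

rng = np.random.default_rng(0)
n = 30
A = rng.normal(size=(20, n))   # underdetermined forward model (toy)
x_true = np.zeros(n)
x_true[10:15] = 1.0            # one "inclusion"
b = A @ x_true + rng.normal(scale=0.01, size=20)

labels = np.zeros(n, dtype=int)
labels[10:15] = 1              # toy "XCT segmentation"
pairs = [(i, i + 1) for i in range(n - 1) if labels[i] == labels[i + 1]]
L = np.zeros((len(pairs), n))
for r, (i, j) in enumerate(pairs):
    L[r, i], L[r, j] = 1.0, -1.0  # smoothness within each region only

lam = 0.1
x = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)
print(np.round(x[8:17], 2))    # approximately recovers the inclusion plateau
```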

  2. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    Science.gov (United States)

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk because patients transfer between clinical areas using both the old and new systems. We propose a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we show how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting orders were evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not be ascertained through clinical judgment alone, and such methods could contribute significantly to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition, and it provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
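
    The core idea, choosing a roll-out order that minimises patient flow from areas already on the new system back to areas still on the old one, can be sketched as a small combinatorial search over a ward-to-ward transfer matrix. A toy sketch (brute force, so practical only for a handful of wards; the matrix values are invented):

```python
from itertools import permutations
import numpy as np

# flow[i, j] = number of patient transfers from ward i to ward j (invented data)
flow = np.array([[0, 30, 5],
                 [10, 0, 20],
                 [2, 8, 0]])

def backward_flow(order, flow):
    """Total new->old transfers: flow from wards rolled out earlier to wards
    rolled out later (i.e. still on the old system at that time)."""
    total = 0
    for k, src in enumerate(order):
        for dst in order[k + 1:]:
            total += flow[src, dst]
    return total

best = min(permutations(range(len(flow))), key=lambda o: backward_flow(o, flow))
print(best, backward_flow(best, flow))
```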

  3. Three-dimensional Hessian matrix-based quantitative vascular imaging of rat iris with optical-resolution photoacoustic microscopy in vivo

    Science.gov (United States)

    Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo

    2018-04-01

    For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of the vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive to endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, advanced vascular quantification algorithms are still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to process iris vasculature images obtained with a custom-built optical-resolution photoacoustic microscopy system (OR-PAM). For the first time, we demonstrate in vivo the 3-D vascular structure of a rat iris with a label-free imaging method and accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
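
    The Hessian-based quantification step can be sketched as follows: smooth the volume at a vessel-scale sigma, build the 3-D Hessian from second derivatives, and score each voxel by its eigenvalues (for bright tubes, one eigenvalue near zero and two strongly negative). A generic Frangi-style sketch, not the authors' exact algorithm:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness_3d(vol, sigma=2.0, alpha=0.5, beta=0.5, c=15.0):
    """Frangi-style vesselness from 3-D Hessian eigenvalues (illustrative)."""
    sm = gaussian_filter(vol.astype(float), sigma)
    grads = np.gradient(sm)
    H = np.empty(vol.shape + (3, 3))
    for i in range(3):
        gi = np.gradient(grads[i])
        for j in range(3):
            H[..., i, j] = gi[j]
    lam = np.linalg.eigvalsh(H)                          # per-voxel eigenvalues
    lam = np.take_along_axis(lam, np.argsort(np.abs(lam), axis=-1), axis=-1)
    l1, l2, l3 = lam[..., 0], lam[..., 1], lam[..., 2]   # |l1| <= |l2| <= |l3|
    eps = 1e-10
    ra = np.abs(l2) / (np.abs(l3) + eps)                 # plate vs line
    rb = np.abs(l1) / (np.sqrt(np.abs(l2 * l3)) + eps)   # blob vs line
    s = np.sqrt(l1**2 + l2**2 + l3**2)                   # second-order structure
    v = (1 - np.exp(-ra**2 / (2 * alpha**2))) \
        * np.exp(-rb**2 / (2 * beta**2)) \
        * (1 - np.exp(-s**2 / (2 * c**2)))
    return np.where((l2 < 0) & (l3 < 0), v, 0.0)         # keep bright tubes only
```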

  4. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, a developing method for the study of proteins and the identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, and magnetic sensors, have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method owing to its high specificity and its flexibility in allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor fabrication processes, enabling low-cost, small-size devices for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  5. On an efficient and accurate method to integrate restricted three-body orbits

    Science.gov (United States)

    Murison, Marc A.

    1989-01-01

    This work is a quantitative analysis of the advantages of the Bulirsch-Stoer (1966) method, demonstrating that this method is certainly worth considering when working with small N dynamical systems. The results, qualitatively suspected by many users, are quantitatively confirmed as follows: (1) the Bulirsch-Stoer extrapolation method is very fast and moderately accurate; (2) regularization of the equations of motion stabilizes the error behavior of the method and is, of course, essential during close approaches; and (3) when applicable, a manifold-correction algorithm reduces numerical errors to the limits of machine accuracy. In addition, for the specific case of the restricted three-body problem, even a small eccentricity for the orbit of the primaries drastically affects the accuracy of integrations, whether regularized or not; the circular restricted problem integrates much more accurately.
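
    The core of the Bulirsch-Stoer scheme is the modified-midpoint rule combined with Richardson extrapolation of the results to step size zero. A minimal sketch of one extrapolated macro-step (no step-size control and no regularization, both of which a production integrator would need):

```python
import numpy as np

def modified_midpoint(f, t, y, H, n):
    """Advance y over one macro-step H using n midpoint substeps."""
    h = H / n
    z0, z1 = y, y + h * f(t, y)
    for k in range(1, n):
        z0, z1 = z1, z0 + 2 * h * f(t + k * h, z1)
    return 0.5 * (z0 + z1 + h * f(t + H, z1))

def bs_step(f, t, y, H, seq=(2, 4, 6, 8, 10, 12)):
    """One Bulirsch-Stoer macro-step: extrapolate the midpoint results to
    h -> 0 in powers of h^2 (Aitken-Neville tableau)."""
    T = []
    for i, n in enumerate(seq):
        T.append(modified_midpoint(f, t, y, H, n))
        for k in range(i - 1, -1, -1):
            r = (seq[i] / seq[k]) ** 2
            T[k] = T[k + 1] + (T[k + 1] - T[k]) / (r - 1)
    return T[0]

# Example: harmonic oscillator y'' = -y written as a first-order system.
f = lambda t, y: np.array([y[1], -y[0]])
print(bs_step(f, 0.0, np.array([1.0, 0.0]), 0.5))   # ~ [cos(0.5), -sin(0.5)]
```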

  6. Methodology development for quantitative optimization of security enhancement in medical information systems -Case study in a PACS and a multi-institutional radiotherapy database-.

    Science.gov (United States)

    Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari

    2002-01-01

    The target of our study is to establish a methodology for analyzing the level of security requirements, for finding suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expressions are introduced wherever possible, to allow easy follow-up of security procedures and easy evaluation of security outcomes. System analysis by fault tree analysis (FTA) clarified that subdividing system elements in detail contributes to a much more accurate analysis. The subdivided composition factors depended strongly on staff behavior, interactive terminal devices, kinds of service, and network routes. In conclusion, we found methods to analyze the level of security requirements for each medical information system employing FTA, basic events for each composition factor, and combinations of basic events. Methods for finding suitable security measures were established: risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements were identified. A method to optimize the security measures for each medical information system was proposed: an optimum distribution of risk factors in terms of basic events was worked out, and comparison between medical information systems became possible.
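
    The FTA machinery referred to here can be illustrated with a tiny evaluator that propagates basic-event probabilities through AND/OR gates, assuming independent events; the tree and probabilities below are invented:

```python
def evaluate(node):
    """Probability of the top event of a fault tree with independent
    basic events. A node is either ('basic', p) or (gate, [children])."""
    kind = node[0]
    if kind == 'basic':
        return node[1]
    probs = [evaluate(child) for child in node[1]]
    if kind == 'AND':                       # all children must fail
        out = 1.0
        for p in probs:
            out *= p
        return out
    if kind == 'OR':                        # any child failing suffices
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(kind)

# Invented example: data leak if (weak password AND no audit) OR stolen terminal.
tree = ('OR', [('AND', [('basic', 0.05), ('basic', 0.20)]),
               ('basic', 0.01)])
print(evaluate(tree))    # ~0.0199
```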

  7. Quantitative Evaluation of Defect Based on Ultrasonic Guided Wave and CHMM

    Directory of Open Access Journals (Sweden)

    Chen Le

    2016-01-01

    The axial length of a pipe defect is not linearly related to the reflection coefficient, which makes it difficult to identify the axial length of a defect by the reflection-coefficient method. A Continuous Hidden Markov Model (CHMM) is proposed to accurately classify the axial length of defects, achieving the objective of a preliminary quantitative evaluation. Firstly, wavelet packet decomposition is used to extract the characteristic information of the guided wave signal, and Kernel Sliced Inverse Regression (KSIR) is used to reduce the dimension of the feature set. Then, a variety of CHMM models are trained for classification. Finally, the trained models are used to identify artificial corrosion defects on the outer surface of a pipe. The results show that the CHMM model is robust and can accurately identify axial defects.
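
    The classification stage can be sketched with the hmmlearn package: train one Gaussian-emission HMM per defect-length class on its feature sequences, then label a new sequence by the model giving the highest log-likelihood. The wavelet-packet/KSIR feature extraction is replaced here by placeholder arrays:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

def train_class_model(sequences, n_states=3):
    """Fit one continuous-emission HMM on all training sequences of a class."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
    model.fit(X, lengths)
    return model

# Placeholder features standing in for wavelet-packet/KSIR vectors.
train = {"short_defect": [rng.normal(0.0, 1.0, (50, 4)) for _ in range(5)],
         "long_defect":  [rng.normal(2.0, 1.0, (50, 4)) for _ in range(5)]}
models = {label: train_class_model(seqs) for label, seqs in train.items()}

# Classify a new sequence by maximum log-likelihood across class models.
test_seq = rng.normal(1.9, 1.0, (50, 4))
print(max(models, key=lambda label: models[label].score(test_seq)))
```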

  8. [A new method of processing quantitative PCR data].

    Science.gov (United States)

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on this basis developed a quantitative PCR technique for the PE7700 and PE5700. But the error of this technique is too great to satisfy the needs of biotechnology development and clinical research; a better quantitative PCR technique is needed. The mathematical model presented here draws on results from related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be carried out using this functional relation: the accumulated PCR product quantity can be related back to the initial template number. Using this model for quantitative PCR analysis, the result error depends only on the accuracy of the fluorescence intensity measurement, i.e. on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the accuracy of the quantitative result will exceed 99%. Result errors differ markedly when the same conditions and the same instrument are used but the analysis method differs; processing the data with this quantitative PCR analysis system yields results roughly 80 times more accurate than the CT method.
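
    The kind of functional relation described can be sketched with the standard exponential amplification model F_c = k·N0·(1+E)^c: fitting the log-linear early phase of the fluorescence curve yields the efficiency E and, given the scaling constant k, the initial template number N0. A generic sketch, not the paper's exact equations:

```python
import numpy as np

# Simulated fluorescence: F_c = k * N0 * (1 + E)**c over the exponential phase.
k, N0_true, E_true = 1e-9, 5e4, 0.95
cycles = np.arange(12, 22)
F = k * N0_true * (1 + E_true) ** cycles

# In log space the model is linear: log(F) = log(k*N0) + c*log(1+E).
slope, intercept = np.polyfit(cycles, np.log(F), 1)
E_est = np.exp(slope) - 1
N0_est = np.exp(intercept) / k          # requires the calibration constant k
print(f"E = {E_est:.3f}, N0 = {N0_est:.3g}")   # recovers 0.95 and 5e4
```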

  9. Quantitation of Proteinuria in Women With Pregnancy Induced ...

    African Journals Online (AJOL)

    This creates the need for a more accurate method for early detection and quantitation of proteinuria. Objective:To compare the accuracy of the Spot urine Protein to Creatinine ratio with that of Dipstick Tests in the quantitation of proteinuria in Nigerian women with Pregnancy Induced Hypertension. Methods: A cross-sectional ...

  10. Mental models accurately predict emotion transitions.

    Science.gov (United States)

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.

  11. Mental models accurately predict emotion transitions

    Science.gov (United States)

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373

  12. A study on the quantitative model of human response time using the amount and the similarity of information

    International Nuclear Information System (INIS)

    Lee, Sung Jin

    2006-02-01

    The mental capacity to retain or recall information, or memory, is related to human performance during the processing of information. Although a large number of studies have been carried out on human performance, little is known about the similarity effect. The purpose of this study was to propose and validate a quantitative, predictive model of human response time in the user interface, built on the basic concepts of information amount, similarity, and degree of practice. Human performance is difficult to explain by similarity or information amount alone, and there were two difficulties: constructing a quantitative model of human response time and validating the proposed model experimentally. A quantitative model based on Hick's law, the law of practice, and similarity theory was developed, as sketched below. The model was validated under various experimental conditions by measuring participants' response times on a computer-based display. Human performance improved with similarity and practice in the user interface. We also found an age-related effect: performance degraded with increasing age. The proposed model may be useful for training operators who will handle such interfaces and for predicting how human performance changes with system design.
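
    A minimal sketch of such a model: a Hick-type term scaled by a similarity factor and attenuated by a power law of practice. The functional form and coefficients are illustrative, not the thesis's fitted model:

```python
import math

def response_time(n_alternatives, similarity, trials,
                  a=0.2, b=0.15, gamma=0.3, alpha=0.25):
    """Illustrative RT model: Hick's law term b*log2(n+1), inflated when
    alternatives are similar (harder to discriminate), reduced by practice
    following a power law. All coefficients are invented."""
    hick = b * math.log2(n_alternatives + 1)
    discrimination = 1.0 + gamma * similarity      # similarity in [0, 1]
    practice = trials ** (-alpha)                  # law of practice
    return a + hick * discrimination * practice

print(response_time(n_alternatives=8, similarity=0.7, trials=1))   # novice
print(response_time(n_alternatives=8, similarity=0.7, trials=50))  # practiced
```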

  13. Quantitative diagnosis of skeletons with demineralizing osteopathy

    International Nuclear Information System (INIS)

    Banzer, D.

    1979-01-01

    The quantitative diagnosis of bone diseases must be assessed according to the accuracy of the applied method; the expense in apparatus, personnel and financial resources; and the comparability of results. Nuclide absorptiometry, and in the future perhaps computed tomography, represent the most accurate methods for determining the mineral content of bones; because of their cost, their application remains the prerogative of clinics. Morphometry provides quantitative information, in particular for follow-up, and enables an objective judgement of visual images. It requires little expenditure and should be combined with microradioscopy. Direct comparability of findings between different working groups is easiest in morphometry; it depends on the equipment in computed tomography and is still hardly possible in nuclide absorptiometry. For fundamental physical reasons, it will hardly be possible to produce a low-cost, fast and easy-to-handle instrument for determining the mineral salt concentration in bones. Instead, there is rather a trend towards more expensive equipment, e.g. CT instruments; the universal use of these instruments, however, will help to promote quantitative diagnosis. (orig.)

  14. A newly developed maneuver, field change conversion (FCC), improved evaluation of the left ventricular volume more accurately on quantitative gated SPECT (QGS) analysis

    International Nuclear Information System (INIS)

    Tajima, Osamu; Shibasaki, Masaki; Hoshi, Toshiko; Imai, Kamon

    2002-01-01

    The purpose of this study was to investigate whether a newly developed maneuver that halves the reconstruction area evaluates left ventricular (LV) volume more accurately on quantitative gated SPECT (QGS) analysis. The subjects were 38 patients who underwent left ventricular angiography (LVG) followed by G-SPECT within 2 weeks. Acquisition was performed with a general-purpose collimator and a 64 x 64 matrix. On QGS analysis, the field size was 34 cm for the original image (ORI) and was changed from 34 cm to 17 cm to enlarge the reconstructed image (field change conversion: FCC). End-diastolic volume (EDV) and end-systolic volume (ESV) of the left ventricle were also obtained using LVG. EDV was 71±19 ml, 83±20 ml and 98±23 ml for ORI, FCC and LVG, respectively (p<0.001: ORI versus LVG; p<0.001: ORI versus FCC; p<0.001: FCC versus LVG). ESV was 28±12 ml, 34±13 ml and 41±14 ml for ORI, FCC and LVG, respectively (p<0.001: ORI versus LVG; p<0.001: ORI versus FCC; p<0.001: FCC versus LVG). FCC was better than ORI for calculating LV volume in clinical cases and is a useful method for accurately measuring LV volume on QGS analysis. (author)

  15. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    Directory of Open Access Journals (Sweden)

    Pricila da Silva Cunha

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans, accounting for 0.5-0.7% of all cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we chose two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By using these two genes simultaneously we were able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagents.

  16. Quantitative prediction of drug side effects based on drug-related features.

    Science.gov (United States)

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict these quantitative scores, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. We then consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generated different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.

  17. MetaFluxNet: the management of metabolic reaction information and quantitative metabolic flux analysis.

    Science.gov (United States)

    Lee, Dong-Yup; Yun, Hongsoek; Park, Sunwon; Lee, Sang Yup

    2003-11-01

    MetaFluxNet is a program package for managing information on metabolic reaction networks and for quantitatively analyzing metabolic fluxes in an interactive and customized way. It allows users to interpret and examine metabolic behavior in response to genetic and/or environmental modifications. As a result, quantitative in silico simulations of metabolic pathways can be carried out to understand the metabolic status and to design metabolic engineering strategies. The main features of the program include a well-developed model construction environment, a user-friendly interface for metabolic flux analysis (MFA), comparative MFA of strains having different genotypes under various environmental conditions, and automated pathway layout creation. http://mbel.kaist.ac.kr/ A manual for MetaFluxNet is available as a PDF file.
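
    The flux-analysis core of such a package can be sketched in a few lines: at steady state the stoichiometric matrix S and flux vector v satisfy S·v = 0, and combining these balances with measured exchange fluxes gives a linear system solvable by least squares. A toy network with invented measurements:

```python
import numpy as np

# Toy network, metabolites x reactions: A_in -> A, A -> B, A -> C, B_out, C_out
S = np.array([[ 1, -1, -1,  0,  0],   # A balance
              [ 0,  1,  0, -1,  0],   # B balance
              [ 0,  0,  1,  0, -1]])  # C balance

# Measured fluxes: uptake v0 = 10, secretion of B v3 = 6 (invented values).
M = np.zeros((2, 5)); M[0, 0] = 1; M[1, 3] = 1
m = np.array([10.0, 6.0])

# Stack steady-state balances (S v = 0) with measurements (M v = m).
A = np.vstack([S, M])
b = np.concatenate([np.zeros(S.shape[0]), m])
v, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(v, 3))     # expect [10, 6, 4, 6, 4]
```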

  18. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    Science.gov (United States)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information of the tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm-2 has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from slope equal to unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm-2 within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information of iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
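
    After scatter correction, extracting the iodine projected thickness from a dual-energy pair reduces to inverting a 2x2 linear system per pixel: each energy gives one log-attenuation equation in the tissue and iodine thicknesses. A minimal sketch with invented attenuation coefficients:

```python
import numpy as np

# Effective attenuation coefficients at the low and high energy (invented,
# not calibrated values): tissue in 1/cm, iodine in cm^2/mg.
mu = np.array([[0.80, 0.030],    # low energy
               [0.50, 0.012]])   # high energy

def iodine_thickness(att_low, att_high):
    """Solve the per-pixel system  att_E = mu_tissue(E)*t + mu_iodine(E)*q
    for tissue thickness t (cm) and iodine projected thickness q (mg/cm^2)."""
    t, q = np.linalg.solve(mu, np.array([att_low, att_high]))
    return t, q

# Forward-simulate a pixel with 4 cm tissue and 2 mg/cm^2 iodine, then invert.
att = mu @ np.array([4.0, 2.0])
print(iodine_thickness(*att))   # -> (4.0, 2.0)
```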

  19. Sequence- vs. chip-assisted genomic selection: accurate biological information is advised.

    Science.gov (United States)

    Pérez-Enciso, Miguel; Rincón, Juan C; Legarra, Andrés

    2015-05-09

    The development of next-generation sequencing technologies (NGS) has made the use of whole-genome sequence data for routine genetic evaluations possible, which has triggered considerable interest in the animal and plant breeding fields. Here, we investigated whether complete or partial sequence data can improve upon existing SNP (single nucleotide polymorphism) array-based selection strategies by simulation, using a mixed coalescence and gene-dropping approach. We simulated 20 or 100 causal mutations (quantitative trait nucleotides, QTN) within 65 predefined 'gene' regions, each 10 kb long, within a genome composed of ten 3-Mb chromosomes. We compared prediction accuracy by cross-validation using a medium-density chip (7.5 k SNPs), a high-density chip (HD, 17 k) and sequence data (335 k). Genetic evaluation was based on a GBLUP method. The simulations showed: (1) a law of diminishing returns with increasing number of SNPs; (2) a modest effect of SNP ascertainment bias in arrays; (3) a small advantage of using whole-genome sequence data vs. HD arrays, i.e. ~4%; (4) a minor effect of NGS errors except when imputation error rates are high (≥20%); and (5) if QTN were known, prediction accuracy approached 1. Since this is obviously unrealistic, we explored milder assumptions. We showed that, if all SNPs within causal genes were included in the prediction model, accuracy could also dramatically increase by ~40%. However, this criterion was highly sensitive to either misspecification (including wrong genes) or to the use of an incomplete gene list; in these cases, accuracy fell rapidly towards that reached when all SNPs from sequence data were blindly included in the model. Our study shows that, unless an accurate prior estimate of the functionality of SNPs can be included in the predictor, there is a law of diminishing returns with increasing SNP density. As a result, use of whole-genome sequence data may not result in a highly increased selection response over high-density arrays.
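
    The GBLUP evaluation used in such simulations can be sketched in its standard ridge-regression form: build a genomic relationship matrix from centred genotypes and solve the mixed-model equations for genomic breeding values. A minimal sketch on simulated data (the variance ratio is assumed known):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 200, 1000                                   # individuals, SNPs
freq = rng.uniform(0.1, 0.9, m)
Z = rng.binomial(2, freq, (n, m)) - 2 * freq       # centred genotypes
u_true = Z @ rng.normal(0, 0.05, m)                # true breeding values
y = u_true + rng.normal(0, 1.0, n)                 # phenotypes

# GBLUP: G = ZZ'/k, solve (G + lambda*I) alpha = y, u_hat = G alpha.
G = Z @ Z.T / (2 * np.sum(freq * (1 - freq)))      # VanRaden-style scaling
lam = 1.0                                          # var_e / var_u (assumed)
alpha = np.linalg.solve(G + lam * np.eye(n), y - y.mean())
u_hat = G @ alpha
print(np.corrcoef(u_true, u_hat)[0, 1])            # prediction accuracy
```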

  20. An efficient polyenergetic SART (pSART) reconstruction algorithm for quantitative myocardial CT perfusion

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Yuan, E-mail: yuan.lin@duke.edu; Samei, Ehsan [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, 2424 Erwin Road, Suite 302, Durham, North Carolina 27705 (United States)

    2014-02-15

    Purpose: In quantitative myocardial CT perfusion imaging, the beam hardening effect due to dense bone and highly concentrated iodinated contrast agent can result in visible artifacts and inaccurate CT numbers. In this paper, an efficient polyenergetic Simultaneous Algebraic Reconstruction Technique (pSART) is presented to eliminate the beam hardening artifacts and to improve the quantitative imaging ability of CT. Methods: Our algorithm makes three a priori assumptions: (1) the human body is composed of several base materials (e.g., fat, breast, soft tissue, bone, and iodine); (2) images can be coarsely segmented into two types of regions, i.e., nonbone regions and noniodine regions; and (3) each voxel can be decomposed into a mixture of the two most suitable base materials according to its attenuation value and its corresponding region type. Based on these assumptions, energy-independent accumulated effective lengths of all base materials can be computed quickly in the forward ray-tracing process and used repeatedly to obtain accurate polyenergetic projections, with which a SART-based equation can correctly update each voxel in the backward projecting process to iteratively reconstruct artifact-free images. This approach effectively reduces the influence of polyenergetic x-ray sources, and it further enables monoenergetic images to be reconstructed at any arbitrarily preselected target energies. A series of simulation tests were performed on a size-variable cylindrical phantom and a realistic anthropomorphic thorax phantom. In addition, a phantom experiment was performed on a clinical CT scanner to further quantitatively validate the proposed algorithm. Results: The simulations with the cylindrical phantom and the anthropomorphic thorax phantom showed that the proposed algorithm completely eliminated beam hardening artifacts and enabled quantitative imaging across different materials, phantom sizes, and spectra, as the absolute relative errors were reduced

  1. An efficient polyenergetic SART (pSART) reconstruction algorithm for quantitative myocardial CT perfusion

    International Nuclear Information System (INIS)

    Lin, Yuan; Samei, Ehsan

    2014-01-01

    Purpose: In quantitative myocardial CT perfusion imaging, the beam hardening effect due to dense bone and highly concentrated iodinated contrast agent can result in visible artifacts and inaccurate CT numbers. In this paper, an efficient polyenergetic Simultaneous Algebraic Reconstruction Technique (pSART) is presented to eliminate the beam hardening artifacts and to improve the quantitative imaging ability of CT. Methods: Our algorithm makes three a priori assumptions: (1) the human body is composed of several base materials (e.g., fat, breast, soft tissue, bone, and iodine); (2) images can be coarsely segmented into two types of regions, i.e., nonbone regions and noniodine regions; and (3) each voxel can be decomposed into a mixture of the two most suitable base materials according to its attenuation value and its corresponding region type. Based on these assumptions, energy-independent accumulated effective lengths of all base materials can be computed quickly in the forward ray-tracing process and used repeatedly to obtain accurate polyenergetic projections, with which a SART-based equation can correctly update each voxel in the backward projecting process to iteratively reconstruct artifact-free images. This approach effectively reduces the influence of polyenergetic x-ray sources, and it further enables monoenergetic images to be reconstructed at any arbitrarily preselected target energies. A series of simulation tests were performed on a size-variable cylindrical phantom and a realistic anthropomorphic thorax phantom. In addition, a phantom experiment was performed on a clinical CT scanner to further quantitatively validate the proposed algorithm. Results: The simulations with the cylindrical phantom and the anthropomorphic thorax phantom showed that the proposed algorithm completely eliminated beam hardening artifacts and enabled quantitative imaging across different materials, phantom sizes, and spectra, as the absolute relative errors were reduced
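
    The key step of pSART, turning the accumulated per-material path lengths of a ray into an accurate polyenergetic projection, can be sketched directly: given a spectrum and energy-dependent attenuation per base material, the detected intensity follows by summing over energy bins. The spectrum and attenuation values below are invented:

```python
import numpy as np

# Energy bins (keV), normalised spectrum weights, and per-material linear
# attenuation coefficients mu[material, energy] in 1/cm (invented values).
energies = np.array([40.0, 60.0, 80.0, 100.0])
spectrum = np.array([0.20, 0.40, 0.30, 0.10])
mu = np.array([[0.35, 0.25, 0.21, 0.19],    # soft tissue
               [0.90, 0.55, 0.40, 0.35],    # bone
               [2.50, 1.10, 0.60, 0.45]])   # iodine mixture

def polyenergetic_projection(path_lengths):
    """-log of detected/incident intensity for one ray, given the accumulated
    effective length (cm) traversed in each base material."""
    line_integrals = mu.T @ path_lengths            # per energy bin
    detected = np.sum(spectrum * np.exp(-line_integrals))
    return -np.log(detected)

# A ray crossing 10 cm tissue, 2 cm bone, 0.5 cm iodine-enhanced blood:
print(polyenergetic_projection(np.array([10.0, 2.0, 0.5])))
```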

  2. Fluorescence correlation spectroscopy analysis for accurate determination of proportion of doubly labeled DNA in fluorescent DNA pool for quantitative biochemical assays.

    Science.gov (United States)

    Hou, Sen; Sun, Lili; Wieczorek, Stefan A; Kalwarczyk, Tomasz; Kaminski, Tomasz S; Holyst, Robert

    2014-01-15

    Fluorescent double-stranded DNA (dsDNA) molecules labeled at both ends are commonly produced by annealing complementary single-stranded DNA (ssDNA) molecules labeled with fluorescent dyes at the same (3' or 5') end. Because the labeling efficiency of ssDNA is below 100%, the resulting dsDNA molecules carry two dyes, one dye, or none. Existing methods are insufficient to measure the percentage of the doubly labeled dsDNA component in a fluorescent DNA sample, and it is even difficult to distinguish the doubly labeled component from the singly labeled one. Accurate measurement of the percentage of the doubly labeled dsDNA component is a critical prerequisite for quantitative biochemical measurements, and the problem has puzzled scientists for decades. We established a fluorescence correlation spectroscopy (FCS) system to measure the percentage of doubly labeled dsDNA (PDL) in the total fluorescent dsDNA pool. The method is based on comparative analysis of the given sample and a reference dsDNA sample prepared by adding a certain amount of unlabeled ssDNA to the original ssDNA solution. From the FCS autocorrelation functions, we obtain the number of fluorescent dsDNA molecules in the focal volume of the confocal microscope and hence PDL; we also calculate the labeling efficiency of the ssDNA. The method requires a minimal amount of material: DNA concentrations in the nanomolar range and volumes of tens of microliters. We verified the method by using the restriction enzyme Hind III to cleave the fluorescent dsDNA; the kinetics of the reaction depend strongly on PDL, a critical parameter for quantitative biochemical measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
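
    The bookkeeping underlying the analysis is binomial: if each strand carries a dye with probability p, annealing yields doubly labeled duplexes with probability p² and singly labeled ones with probability 2p(1-p), so the doubly labeled fraction of the fluorescent pool is p/(2-p). A sketch of this arithmetic, including the effect of spiking in unlabeled ssDNA as in the reference sample:

```python
def duplex_fractions(p):
    """Fractions of dsDNA carrying two, one, or zero dyes when each strand
    is independently labeled with probability p."""
    return p * p, 2 * p * (1 - p), (1 - p) ** 2

def pdl(p):
    """Doubly labeled fraction of the *fluorescent* dsDNA pool:
    p^2 / (p^2 + 2p(1-p)) = p / (2 - p)."""
    return p / (2 - p)

def pdl_after_spike(p, unlabeled_ratio):
    """PDL after mixing the labeled ssDNA stock with `unlabeled_ratio` parts
    of unlabeled ssDNA (the reference-sample trick described above)."""
    p_eff = p / (1 + unlabeled_ratio)
    return pdl(p_eff)

print(pdl(0.8))                 # 0.667 for 80% labeling efficiency
print(pdl_after_spike(0.8, 1))  # 0.25 in the 1:1 diluted reference sample
```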

  3. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    International Nuclear Information System (INIS)

    Malik, Afshan N.; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana; Laftah, Abas; Cunningham, Phil

    2011-01-01

    Highlights: → Mitochondrial dysfunction is central to many diseases of oxidative stress. → 95% of the mitochondrial genome is duplicated in the nuclear genome. → Dilution of untreated genomic DNA leads to dilution bias. → Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real-time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved, and we have identified that current methods have at least one of the following three problems: (1) as much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) the use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference between mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method that uses unique regions of the human mitochondrial genome not duplicated in the nuclear genome, a unique single-copy region of the nuclear genome, and template treatment to remove dilution bias, in order to accurately quantify MtDNA from human samples.
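
    With pseudogene-free primers and pretreated template, the Mt/N ratio itself follows from the standard efficiency-corrected ΔCt calculation; a minimal sketch:

```python
def mt_to_n_ratio(ct_mito, ct_nuclear, eff_mito=1.0, eff_nuclear=1.0):
    """Mitochondrial-to-nuclear genome ratio from qPCR threshold cycles,
    with per-assay amplification efficiencies (1.0 = perfect doubling)."""
    return (1 + eff_nuclear) ** ct_nuclear / (1 + eff_mito) ** ct_mito

# Example: the mito target crosses threshold ~7 cycles before the single-copy
# nuclear target, suggesting on the order of 2^7 = 128 MtDNA copies per genome.
print(mt_to_n_ratio(ct_mito=18.0, ct_nuclear=25.0))
```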

  4. Quantitative stain-free and continuous multimodal monitoring of wound healing in vitro with digital holographic microscopy.

    Directory of Open Access Journals (Sweden)

    Dominik Bettenworth

    Impaired epithelial wound healing has significant pathophysiological implications in several conditions including gastrointestinal ulcers, anastomotic leakage and venous or diabetic skin ulcers. Promising drug candidates for accelerating wound closure are commonly evaluated in in vitro wound assays. However, staining procedures and discontinuous monitoring are major drawbacks hampering accurate assessment of wound assays. We therefore investigated digital holographic microscopy (DHM) as a means to appropriately monitor wound healing in vitro and, secondly, to provide multimodal quantitative information on morphological and functional cell alterations as well as on motility changes upon cytokine stimulation. Wound closure, as reflected by proliferation and migration of Caco-2 cells in wound healing assays, was studied and assessed in time-lapse series over 40 h in the presence of stimulating epidermal growth factor (EGF) and inhibiting mitomycin C. Digital holograms were recorded continuously every thirty minutes. Morphological changes including cell thickness, dry mass and tissue density were analyzed from quantitative digital holographic phase microscopy data. Stimulation of Caco-2 cells with EGF or mitomycin C resulted in significant morphological changes during wound healing compared with control cells. In conclusion, DHM allows accurate, stain-free and continuous multimodal quantitative monitoring of wound healing in vitro and could be a promising new technique for the assessment of wound healing.

  5. Automated selected reaction monitoring software for accurate label-free protein quantification.

    Science.gov (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  6. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O`Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100g/min; right = 25.6 ± 7.0 μmol/100g/min) was slightly reduced compared with the ipsilateral hemispheric rate (left = 30.4 ± 6.8 μmol/100g/min; right = 29.5 ± 7.2 μmol/100g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared with right DLPFC, although our results will need to be replicated in a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  7. Quantitative FDG in depression

    International Nuclear Information System (INIS)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D.

    1998-01-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100g/min; right = 25.6 ± 7.0 μmol/100g/min) was slightly reduced compared with the ipsilateral hemispheric rate (left = 30.4 ± 6.8 μmol/100g/min; right = 29.5 ± 7.2 μmol/100g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared with right DLPFC, although our results will need to be replicated in a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals

  8. Accurate overlaying for mobile augmented reality

    NARCIS (Netherlands)

    Pasman, W; van der Schaaf, A; Lagendijk, RL; Jansen, F.W.

    1999-01-01

    Mobile augmented reality requires accurate alignment of virtual information with objects visible in the real world. We describe a system for mobile communications to be developed to meet these strict alignment criteria using a combination of computer vision, inertial tracking and low-latency

  9. Current status and assessment of quantitative and qualitative one leg ...

    African Journals Online (AJOL)

    ... of only a quantitative assessment. These findings indicate that, when evaluating the one leg balance in children aged 3-6 years, a quantitative and qualitative assessment should be used in combination together to assure a more accurate assessment. (S. African J. for Research in Sport, Physical Ed. and Recreation: 2001 ...

  10. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  11. Quantitative analysis of gender stereotypes and information aggregation in a national election.

    Directory of Open Access Journals (Sweden)

    Michele Tumminello

    By analyzing a database of questionnaire responses from a large majority of the candidates, both elected and not, in a parliamentary election, we quantitatively verify that (i) female candidates on average present political profiles which are more compassionate and more concerned with social welfare issues than male candidates, and (ii) the voting procedure acts as a process of information aggregation. Our results show that information aggregation proceeds along at least two distinct paths. In the first case, candidates characterize themselves with a political profile aiming to describe the profile of the majority of voters; this is typically the case for candidates of political parties competing for the center of the various political dimensions. In the second case, candidates choose a political profile manifesting a clear difference from the opposite political profiles endorsed by candidates of a political party positioned at the opposite extreme of some political dimension.

  12. Generation of accurate peptide retention data for targeted and data independent quantitative LC-MS analysis: Chromatographic lessons in proteomics.

    Science.gov (United States)

    Krokhin, Oleg V; Spicer, Vic

    2016-12-01

    The emergence of data-independent quantitative LC-MS/MS analysis protocols further highlights the importance of high-quality, reproducible chromatographic procedures. Knowing, controlling and being able to predict the effect of the multiple factors that alter peptide RP-HPLC separation selectivity is critical for successful data collection for the construction of ion libraries. Proteomic researchers have often regarded RP-HPLC as a "black box", while a vast amount of research on peptide separation is readily available. In addition to obvious parameters, such as the type of ion-pairing modifier, stationary phase and column temperature, we describe the "mysterious" effects of gradient slope, column size and flow rate on peptide separation selectivity. Retention time variations due to these parameters are governed, at the peptide level, by linear solvent strength (LSS) theory through the value of the slope S in the basic LSS equation, a parameter that can be accurately predicted. Thus, the application of shallower gradients, higher flow rates, or smaller columns will each increase the relative retention of peptides with higher S values (long species with multiple positively charged groups). Simultaneous changes to these parameters that each drive shifts in separation selectivity in the same direction should be avoided. The unification of terminology represents another pressing issue in this field of applied proteomics that should be addressed to facilitate further progress. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
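
    The role of S is easiest to see in the classical LSS gradient retention equation; a sketch of that textbook formula shows how changing the gradient time selectively shifts peptides with large S relative to those with small S (all parameter values below are illustrative):

```python
import math

def gradient_retention(S, log_k0, t0_min, td_min, tg_min, dphi):
    """Classical LSS gradient retention time (Snyder):
    b = S*dphi*t0/tg;  tR = (t0/b)*log10(2.3*k0*b + 1) + t0 + td.
    Illustrative use only; real work needs measured k0 and S per peptide."""
    b = S * dphi * t0_min / tg_min
    k0 = 10.0 ** log_k0
    return (t0_min / b) * math.log10(2.3 * k0 * b + 1.0) + t0_min + td_min

# Two peptides, one with a larger S (longer, more charged), under a 30 min
# vs a 90 min gradient from 5% to 45% acetonitrile (dphi = 0.40):
for tg in (30.0, 90.0):
    t_small = gradient_retention(S=25, log_k0=9.0, t0_min=1.0, td_min=0.5,
                                 tg_min=tg, dphi=0.40)
    t_large = gradient_retention(S=45, log_k0=9.0, t0_min=1.0, td_min=0.5,
                                 tg_min=tg, dphi=0.40)
    # The spacing between the two peaks changes with gradient slope,
    # i.e. a selectivity shift, not a uniform stretch.
    print(tg, round(t_small, 2), round(t_large, 2), round(t_large - t_small, 2))
```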

  13. Conjugate whole-body scanning system for quantitative measurement of organ distribution in vivo

    International Nuclear Information System (INIS)

    Tsui, B.M.W.; Chen, C.T.; Yasillo, N.J.; Ortega, C.J.; Charleston, D.B.; Lathrop, K.A.

    1979-01-01

    The determination of accurate, quantitative biokinetic distributions of an internally dispersed radionuclide in humans is important for making realistic radiation absorbed dose estimates, studying biochemical transformations in health and disease, and developing clinical procedures indicative of abnormal function. To collect these data, a whole-body imaging system is required which provides both adequate spatial resolution and some means of absolute quantitation. Based on these considerations, a new whole-body scanning system has been designed and constructed that employs the conjugate counting technique. The conjugate whole-body scanning system provides an efficient and accurate means of collecting absolute quantitative organ distribution data of radioactivity in vivo
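
    Conjugate counting pairs anterior and posterior views so that source depth cancels: the geometric mean of the two count rates, corrected by a transmission factor through the body, is proportional to activity regardless of depth. A minimal sketch (the attenuation coefficient and calibration factor are assumed values):

```python
import math

def activity_conjugate(c_ant, c_post, body_thickness_cm,
                       mu_per_cm=0.12, cal_cps_per_mbq=50.0):
    """Conjugate-view activity estimate: depth cancels in the geometric mean.
    C_ant = k*A*exp(-mu*d) and C_post = k*A*exp(-mu*(L-d)), so
    sqrt(C_ant*C_post) = k*A*exp(-mu*L/2).  mu and k are assumed values."""
    transmission = math.exp(-mu_per_cm * body_thickness_cm)
    return math.sqrt(c_ant * c_post / transmission) / cal_cps_per_mbq

# A source at any depth in a 20 cm thick body gives the same estimate:
k, A, L, mu = 50.0, 10.0, 20.0, 0.12
for d in (2.0, 10.0, 18.0):
    c_a = k * A * math.exp(-mu * d)
    c_p = k * A * math.exp(-mu * (L - d))
    print(round(activity_conjugate(c_a, c_p, L), 3))   # -> 10.0 each time
```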

  14. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials and is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based on the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth groundwork was necessary to determine precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables routine quantitative analysis with approximately 20% relative accuracy, which is quite acceptable for an analysis of this nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that yields the sulphiding degree of molybdenum reliably and reproducibly. The use of this method is illustrated by two examples in which XPS spectroscopy provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
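
    Once the response (sensitivity) factors are in hand, routine quantification reduces to normalising each peak area by its factor. The peak areas and factors below are invented, not IFP's calibrated set:

```python
def atomic_fractions(peak_areas, response_factors):
    """XPS surface composition: C_i = (I_i/S_i) / sum_j (I_j/S_j)."""
    weighted = {el: peak_areas[el] / response_factors[el] for el in peak_areas}
    total = sum(weighted.values())
    return {el: w / total for el, w in weighted.items()}

# Invented peak areas and response factors for a sulphided Mo catalyst surface.
areas = {"Mo3d": 1200.0, "S2p": 950.0, "Al2p": 4000.0}
factors = {"Mo3d": 3.2, "S2p": 0.67, "Al2p": 0.54}
comp = atomic_fractions(areas, factors)
print({el: round(x, 3) for el, x in comp.items()})
print("S/Mo ratio:", round(comp["S2p"] / comp["Mo3d"], 2))   # sulphiding proxy
```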

  15. QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.

    Science.gov (United States)

    Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus

    2018-03-01

    Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low-concentration molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improved accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of the exchange rate using varying saturation power or varying saturation time, is improved especially in the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB1 < ksw). The refined 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated in the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller-shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  16. Realizing the quantitative potential of the radioisotope image

    International Nuclear Information System (INIS)

    Brown, N.J.G.; Britton, K.E.; Cruz, F.R.

    1977-01-01

    The sophistication and accuracy of a clinical strategy depend on the accuracy of the results of the tests used; when numerical values are given in the test report, powerful clinical strategies can be developed. The eye is well able to perceive structures in a high-quality grey-scale image. However, the degree of difference in density between two points cannot be estimated quantitatively by eye. This creates a problem particularly when there is only a small difference between the count-rate at a suspicious point or region and the count-rate to be expected there if the image were normal. To resolve this problem, methods of quantitating the amplitude of a feature, defined as the difference between the observed and expected values at the region of the feature, have been developed. The eye can estimate the frequency of light entering it very accurately (perceived as colour). Thus, if count-rate data are transformed into colour in a systematic way, then information about relative count-rate can be perceived. A computer-driven, interactive colour display system is used in which the count-rate range of each colour is computed as a percentage of a reference count-rate value. This can be used to obtain quantitative estimates of the amplitude of an image feature. The application of two such methods to normal and pathological data is described and the results discussed. (author)

  17. A New Variable Selection Method Based on Mutual Information Maximization by Replacing Collinear Variables for Nonlinear Quantitative Structure-Property Relationship Models

    Energy Technology Data Exchange (ETDEWEB)

    Ghasemi, Jahan B.; Zolfonoun, Ehsan [Toosi University of Technology, Tehran (Iran, Islamic Republic of)

    2012-05-15

    Selection of the most informative molecular descriptors from the original data set is a key step in the development of quantitative structure activity/property relationship models. Recently, mutual information (MI) has gained increasing attention in feature selection problems. This paper presents an effective mutual information-based feature selection approach, named mutual information maximization by replacing collinear variables (MIMRCV), for nonlinear quantitative structure-property relationship models. The proposed variable selection method was applied to three different QSPR datasets: soil degradation half-lives of 47 organophosphorus pesticides, GC-MS retention times of 85 volatile organic compounds, and water-to-micellar cetyltrimethylammonium bromide partition coefficients of 62 organic compounds. The obtained results revealed that using MIMRCV as the feature selection method improves the predictive quality of the developed models compared to conventional MI-based variable selection algorithms.

  18. A New Variable Selection Method Based on Mutual Information Maximization by Replacing Collinear Variables for Nonlinear Quantitative Structure-Property Relationship Models

    International Nuclear Information System (INIS)

    Ghasemi, Jahan B.; Zolfonoun, Ehsan

    2012-01-01

    Selection of the most informative molecular descriptors from the original data set is a key step in the development of quantitative structure activity/property relationship models. Recently, mutual information (MI) has gained increasing attention in feature selection problems. This paper presents an effective mutual information-based feature selection approach, named mutual information maximization by replacing collinear variables (MIMRCV), for nonlinear quantitative structure-property relationship models. The proposed variable selection method was applied to three different QSPR datasets: soil degradation half-lives of 47 organophosphorus pesticides, GC-MS retention times of 85 volatile organic compounds, and water-to-micellar cetyltrimethylammonium bromide partition coefficients of 62 organic compounds. The obtained results revealed that using MIMRCV as the feature selection method improves the predictive quality of the developed models compared to conventional MI-based variable selection algorithms.
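
    The gist of MI-based selection with replacement of collinear variables can be sketched with scikit-learn: rank descriptors by mutual information with the property, and when a candidate correlates strongly with an already selected descriptor, skip it in favour of the next best one. A generic sketch of the idea, not the authors' exact MIMRCV algorithm:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def select_descriptors(X, y, n_select=5, corr_threshold=0.9):
    """Greedy MI ranking that skips candidates collinear with already
    selected descriptors (|r| above corr_threshold)."""
    mi = mutual_info_regression(X, y, random_state=0)
    order = np.argsort(mi)[::-1]
    selected = []
    for j in order:
        if any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > corr_threshold
               for k in selected):
            continue                      # collinear: replace with next best
        selected.append(j)
        if len(selected) == n_select:
            break
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=100)     # deliberately collinear pair
y = X[:, 0] + 0.5 * X[:, 5] + 0.1 * rng.normal(size=100)
print(select_descriptors(X, y))    # picks descriptor 0 or 1, but not both
```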

  19. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    Science.gov (United States)

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  20. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields

  1. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI analysis results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.
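
    The record summarizes the SEC normalization only at a high level; the sketch below shows just the generic final step it implies, building a single linear calibration curve from inkjet-printed standards and inverting it for normalized tissue signals (all amounts and intensities are hypothetical placeholders):

```python
import numpy as np

# Signals measured from inkjet-printed standards of known amount
printed_amount = np.array([0.5, 1.0, 2.0, 5.0, 10.0])         # ng per spot (hypothetical)
standard_signal = np.array([120.0, 250.0, 480.0, 1240.0, 2420.0])  # normalized MSI signal

slope, intercept = np.polyfit(printed_amount, standard_signal, 1)

def quantify(signal):
    """Invert the linear calibration to estimate amount from normalized signal."""
    return (signal - intercept) / slope

tissue_signal = np.array([300.0, 860.0, 1500.0])  # SEC-normalized pixel signals
print(quantify(tissue_signal))
```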

  2. Visible light scatter as quantitative information source on milk constituents

    DEFF Research Database (Denmark)

    Melentiyeva, Anastasiya; Kucheryavskiy, Sergey; Bogomolov, Andrey

    2012-01-01

    Fat and protein are two major milk nutrients that are routinely analyzed in the dairy industry. Growing food quality requirements promote the dissemination of spectroscopic analysis, enabling real… analysis. The main task here is to extract individual quantitative information on milk fat and total protein content from spectral data. This is a particularly challenging problem in the case of raw natural milk, where the fat globule sizes may essentially differ depending on source. … A designed set of raw milk samples with simultaneously varying fat, total protein and particle size distribution has been analyzed in the Vis spectral region. The feasibility of raw milk analysis by PLS regression on spectral data has been proved, with root mean-square errors below 0.10% and 0.04% for fat…
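
    A minimal sketch of the PLS-regression step the record describes, predicting fat and total protein from visible spectra (scikit-learn assumed; the spectra, targets and component count here are placeholders, not the authors' data):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: (n_samples, n_wavelengths) Vis spectra; Y: [fat, protein] content in %
X = np.random.rand(60, 200)   # placeholder spectra
Y = np.random.rand(60, 2)     # placeholder reference values

pls = PLSRegression(n_components=8)          # component count would be chosen by CV
Y_cv = cross_val_predict(pls, X, Y, cv=10)   # cross-validated predictions
rmse = np.sqrt(((Y - Y_cv) ** 2).mean(axis=0))
print(f"RMSECV fat: {rmse[0]:.3f}%, protein: {rmse[1]:.3f}%")
```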

  3. Accurate, fully-automated NMR spectral profiling for metabolomics.

    Directory of Open Access Journals (Sweden)

    Siamak Ravanbakhsh

    Full Text Available Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra involving > 50 compounds, show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~90% correct identification and ~10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of…

  4. The Most Likely Nemesis to Timely, Accurate Electronic Information

    Science.gov (United States)

    2002-02-04

    Keywords: networks, training, commercial off-the-shelf, information technology, Internet, communications equipment, electronic information. Abstract (fragment): …infected over 200,000 Internet computers. While the objective appeared to be to create a log-jam on the Internet and not actually alter information on… Brigadier General Robert M. Shea, USMC, Director for Command, Control, Communications and Computers for the Marine Corps, cites information overload problems…

  5. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) in the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on a coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research including the analysis of high resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These techniques for the analysis, modelling and simulation allow optimizing the control and functionality of devices developed using materials under study, and have been tested using data obtained from experimental samples

  6. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity for automated histological analysis offered by whole slide scanners implies an ever-increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  7. Combining qualitative and quantitative research approaches in understanding pain

    DEFF Research Database (Denmark)

    Moore, R.

    1996-01-01

    There are many research issues about validity and especially reliability in regards to qualitative research results. Generalizability is brought into question to any population base from which a relatively small number of informants are drawn. Sensitivity to new discoveries is an advantage of qualitative research while the advantage of quantified survey data is their reliability. This paper argues for combining qualitative and quantitative methods to improve concurrent validity of results by triangulating interviews, observations or focus group data with short surveys for validation of main… findings. Furthermore, with specific scientific assumptions, combining methods can aid in estimating minimum sample size required for theoretical generalizations from even a qualitative sample. This is based on measures of how accurately subjects describe a given social phenomenon and degree of agreement…

  8. Tool for the quantitative evaluation of a Facebook app-based informal training process

    Directory of Open Access Journals (Sweden)

    Adolfo Calle-Gómez

    2017-02-01

    Full Text Available The study of the impact of Facebook in academia has been mainly based on qualitative evaluation of the academic performance and motivation of students. This work takes as its starting point the use of the Facebook app Sigma in the Universidad Técnica de Ambato. Students of this university share educative resources through Sigma, which constitutes an informal learning process. We have proposed to construct Gamma, a tool for the generation of statistics and charts that illustrate the impact of the social network on the resulting learning process. This paper presents the results of a study of how Gamma is valued by those who like to do informal learning. It was checked that (1) Gamma gives feedback about the value of educative resources and social actions, and that (2) it allows the quantitative measurement of the impact of using Facebook in the informal learning process. As an added value, Gamma supports the communication between supporters and detractors of the use of Facebook in academia.

  9. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy from conventional conjugate view methods is limited by overlap of projections from different organs and background activity, and attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates for the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millenium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of 111In-ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method with accurate modelling of the physical factors increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. We conclude that the QPlanar method, based…

  10. Physical characterization and preliminary results of a PET system using time-of-flight for quantitative studies

    International Nuclear Information System (INIS)

    Soussaline, F.; Verrey, B.; Comar, D.; Campagnolo, R.; Bouvier, A.; Lecomte, J.L.

    1984-01-01

    A positron camera was designed to meet the needs for a high-sensitivity, high-resolution, multislice system capable of dynamic imaging at high count rates, for quantitative measurements. The goals of present positron camera design are clearly to provide accurate quantitative images of physiological or biochemical parameters with dramatically improved spatial, temporal and contrast resolutions. The use of time-of-flight (TOF) information, which produces more accurate images with fewer detected events, provides an approach to such identified needs. This paper first presents the physical characterization of this system, called TTVO1, which confirms the TOF system's capabilities and its main advantages over a system without TOF, namely: the improvement of the signal-to-noise ratio due to the better, although approximate, localization of the source position, providing an equivalent gain in sensitivity; the good elimination of accidental (random) coincidences due to the short time window (3 nsec for a whole-body inner ring); and the ability to handle very high count rates without pile-up in the detectors or electronics, due to the short scintillation decay time in fast crystals such as CsF or BaF2 (barium fluoride)

  11. The Quantitative Theory of Information

    DEFF Research Database (Denmark)

    Topsøe, Flemming; Harremoës, Peter

    2008-01-01

    Information Theory as developed by Shannon and followers is becoming more and more important in a number of sciences. The concepts appear to be just the right ones with intuitively appealing operational interpretations. Furthermore, the information theoretical quantities are connected by powerful…

  12. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    Directory of Open Access Journals (Sweden)

    Alves-Ferreira Marcio

    2010-03-01

    Full Text Available Abstract Background: Normalizing through reference genes, or housekeeping genes, can make results from reverse transcription real-time quantitative polymerase chain reaction (qPCR) more accurate and reliable. Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, the selection of suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out on plants. Therefore, qPCR studies on important crops such as cotton have been hampered by the lack of suitable reference genes. Results: By the use of two distinct algorithms, implemented by geNorm and NormFinder, we have assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development and in flower verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion: We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references…

  13. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data.

    Science.gov (United States)

    Artico, Sinara; Nardeli, Sarah M; Brilhante, Osmundo; Grossi-de-Sa, Maria Fátima; Alves-Ferreira, Marcio

    2010-03-21

    Normalizing through reference genes, or housekeeping genes, can make results from reverse transcription real-time quantitative polymerase chain reaction (qPCR) more accurate and reliable. Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, the selection of suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out on plants. Therefore, qPCR studies on important crops such as cotton have been hampered by the lack of suitable reference genes. By the use of two distinct algorithms, implemented by geNorm and NormFinder, we have assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1alpha5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhbetaTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development and in flower verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene expression measures in…
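
    Both records rank candidates with geNorm-style stability measures. A compact sketch of the geNorm M measure under its usual definition (mean pairwise variation of log-ratios across samples; NumPy assumed, expression data are placeholders):

```python
import numpy as np

def genorm_m(expr):
    """geNorm stability measure M for each candidate reference gene.
    expr: (n_samples, n_genes) array of relative expression quantities.
    M[j] is the mean, over all other genes k, of the standard deviation
    of log2(expr_j / expr_k) across samples; lower M = more stable."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    M = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        M[j] = ratios.std(axis=0, ddof=1).mean()
    return M

expr = np.random.lognormal(size=(23, 9))   # 23 samples x 9 candidate genes
print(np.argsort(genorm_m(expr))[:2])      # indices of the two most stable genes
```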

  14. Accurate determination of rates from non-uniformly sampled relaxation data

    Energy Technology Data Exchange (ETDEWEB)

    Stetz, Matthew A.; Wand, A. Joshua, E-mail: wand@upenn.edu [University of Pennsylvania Perelman School of Medicine, Johnson Research Foundation and Department of Biochemistry and Biophysics (United States)

    2016-08-15

    The application of non-uniform sampling (NUS) to relaxation experiments traditionally used to characterize the fast internal motion of proteins is quantitatively examined. Experimentally acquired Poisson-gap sampled data reconstructed with iterative soft thresholding are compared to regular sequentially sampled (RSS) data. Using ubiquitin as a model system, it is shown that 25 % sampling is sufficient for the determination of quantitatively accurate relaxation rates. When the sampling density is fixed at 25 %, the accuracy of rates is shown to increase sharply with the total number of sampled points until eventually converging near the inherent reproducibility of the experiment. Perhaps contrary to some expectations, it is found that accurate peak height reconstruction is not required for the determination of accurate rates. Instead, inaccuracies in rates arise from inconsistencies in reconstruction across the relaxation series that primarily manifest as a non-linearity in the recovered peak height. This indicates that the performance of an NUS relaxation experiment cannot be predicted from comparison of peak heights using a single RSS reference spectrum. The generality of these findings was assessed using three alternative reconstruction algorithms, eight different relaxation measurements, and three additional proteins that exhibit varying degrees of spectral complexity. From these data, it is revealed that non-linearity in peak height reconstruction across the relaxation series is strongly correlated with errors in NUS-derived relaxation rates. Importantly, it is shown that this correlation can be exploited to reliably predict the performance of an NUS-relaxation experiment by using three or more RSS reference planes from the relaxation series. The RSS reference time points can also serve to provide estimates of the uncertainty of the sampled intensity, which for a typical relaxation times series incurs no penalty in total acquisition time.
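
    The rates in question come from fitting exponential decays to reconstructed peak heights across the relaxation series; the record's point is that rate accuracy hinges on consistent (linear) reconstruction across the series rather than on absolute peak heights. A minimal sketch of such a fit (SciPy assumed; delays and intensities are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, I0, R):
    """Single-exponential relaxation model."""
    return I0 * np.exp(-R * t)

t = np.array([0.01, 0.03, 0.05, 0.09, 0.15, 0.25])  # relaxation delays (s)
I = np.array([0.98, 0.90, 0.83, 0.70, 0.55, 0.38])  # reconstructed peak heights

(I0, R), _ = curve_fit(decay, t, I, p0=(1.0, 5.0))
print(f"fitted rate R = {R:.2f} s^-1")
```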

  15. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.

  16. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF
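
    A sketch of the threshold-selection idea in these records: fit a GPD to the exceedances over each candidate threshold and keep the threshold whose fit minimizes the AIC. The papers' exact criterion (applying the AIC with the overall samples) may differ in detail; SciPy assumed, data synthetic:

```python
import numpy as np
from scipy.stats import genpareto

def aic_threshold(samples, candidate_quantiles=np.linspace(0.80, 0.98, 10)):
    """For each candidate threshold, fit a 2-parameter GPD (shape, scale)
    to the exceedances and return the fit minimizing AIC = 2k - 2 logL."""
    best = None
    for q in candidate_quantiles:
        u = np.quantile(samples, q)
        exceedances = samples[samples > u] - u
        c, _, scale = genpareto.fit(exceedances, floc=0.0)
        log_lik = genpareto.logpdf(exceedances, c, loc=0.0, scale=scale).sum()
        aic = 2 * 2 - 2 * log_lik
        if best is None or aic < best[0]:
            best = (aic, u, c, scale)
    return best

x = np.random.exponential(size=5000)          # synthetic tail data
aic, u, c, scale = aic_threshold(x)
print(f"threshold={u:.3f}, shape={c:.3f}, scale={scale:.3f}")
```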

  17. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    Science.gov (United States)

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has increased exponentially. This quantitative research project used a pretest/posttest design and reviewed how an educational electronic documentation system helped nursing students to identify the accurate "related to" statement of the nursing diagnosis for the patient in the case…

  18. Research in health sciences library and information science: a quantitative analysis.

    Science.gov (United States)

    Dimitroff, A

    1992-10-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas.

  19. Quantitative characterization of crosstalk effects for friction force microscopy with scan-by-probe SPMs

    International Nuclear Information System (INIS)

    Prunici, Pavel; Hess, Peter

    2008-01-01

    If the photodetector and cantilever of an atomic force microscope (AFM) are not properly adjusted, crosstalk effects will appear. These effects disturb measurements of the absolute vertical and horizontal cantilever deflections, which are involved in friction force microscopy (FFM). A straightforward procedure is proposed to study quantitatively crosstalk effects observed in scan-by-probe SPMs. The advantage of this simple, fast, and accurate procedure is that no hardware change or upgrade is needed. The results indicate that crosstalk effects depend not only on the alignment of the detector but also on the cantilever properties, position, and detection conditions. The measurements may provide information on the origin of the crosstalk effect. After determination of its magnitude, simple correction formulas can be applied to correct the crosstalk effects and then the single-load wedge method, using a commercially available grating, can be employed for accurate calibration of the lateral force

  20. Quantitative characterization of crosstalk effects for friction force microscopy with scan-by-probe SPMs

    Energy Technology Data Exchange (ETDEWEB)

    Prunici, Pavel [Institute of Physical Chemistry, University of Heidelberg, D-69120 Heidelberg (Germany); Hess, Peter [Institute of Physical Chemistry, University of Heidelberg, D-69120 Heidelberg (Germany)], E-mail: peter.hess@urz.uni-heidelberg.de

    2008-06-15

    If the photodetector and cantilever of an atomic force microscope (AFM) are not properly adjusted, crosstalk effects will appear. These effects disturb measurements of the absolute vertical and horizontal cantilever deflections, which are involved in friction force microscopy (FFM). A straightforward procedure is proposed to study quantitatively crosstalk effects observed in scan-by-probe SPMs. The advantage of this simple, fast, and accurate procedure is that no hardware change or upgrade is needed. The results indicate that crosstalk effects depend not only on the alignment of the detector but also on the cantilever properties, position, and detection conditions. The measurements may provide information on the origin of the crosstalk effect. After determination of its magnitude, simple correction formulas can be applied to correct the crosstalk effects and then the single-load wedge method, using a commercially available grating, can be employed for accurate calibration of the lateral force.
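
    The records mention simple correction formulas without reproducing them. Under a generic linear-mixing assumption, crosstalk correction amounts to inverting a 2 × 2 mixing matrix relating true to measured deflections (the coefficients below are hypothetical, determined experimentally in practice):

```python
import numpy as np

# Linear crosstalk model: measured deflections are a mix of the true ones.
#   [V_meas]   [1     a_vl] [V_true]
#   [L_meas] = [a_lv  1   ] [L_true]
a_vl, a_lv = 0.08, 0.05                  # hypothetical crosstalk coefficients
C = np.array([[1.0, a_vl],
              [a_lv, 1.0]])
C_inv = np.linalg.inv(C)                 # correction matrix

measured = np.array([1.20, 0.35])        # vertical, lateral signals (V)
V_true, L_true = C_inv @ measured        # crosstalk-corrected deflections
print(V_true, L_true)
```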

  1. Relating Maxwell’s demon and quantitative analysis of information leakage for practical imperative programs

    International Nuclear Information System (INIS)

    Anjaria, Kushal; Mishra, Arun

    2017-01-01

    Shannon observed the relation between information entropy and the Maxwell demon experiment to come up with the information entropy formula. Since then, Shannon's entropy formula has been widely used to measure information leakage in imperative programs. In the present work, however, our aim is to go in the reverse direction and try to find a possible Maxwell's demon experimental setup for contemporary practical imperative programs in which variations of Shannon's entropy formula have been applied to measure information leakage. To establish the relation between the second principle of thermodynamics and the quantitative analysis of information leakage, the present work models contemporary variations of imperative programs in terms of Maxwell's demon experimental setup. Five contemporary variations of imperative programs related to information quantification are identified: (i) information leakage in an imperative program, (ii) an imperative multithreaded program, (iii) point-to-point leakage in an imperative program, (iv) an imperative program with infinite observation, and (v) an imperative program in an SOA-based environment. For these variations, the minimal work required by an attacker to gain the secret is also calculated using the historical Maxwell's demon experiment. To model the experimental setup of Maxwell's demon, a non-interference security policy is used. In the present work, imperative programs with one-bit secret information have been considered to avoid complexity. The findings of the present work, drawn from the history of physics, can be utilized in many areas related to the information flow of physical computing, nano-computing, quantum computing, biological computing, energy dissipation in computing, and computing power analysis. (paper)
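
    For a one-bit secret, Shannon-entropy leakage is the mutual information between the secret and the observable output. A minimal sketch over an explicit channel matrix (NumPy assumed; the identity channel below models a program that reveals the secret exactly):

```python
import numpy as np

def shannon_leakage(channel, prior):
    """Mutual information I(S;O) between secret S and output O in bits,
    i.e. the expected number of secret bits leaked per run.
    channel[s, o] = P(O = o | S = s); prior[s] = P(S = s)."""
    joint = prior[:, None] * channel
    p_out = joint.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (prior[:, None] * p_out[None, :]))
    return float(np.nansum(terms))   # 0 * log 0 terms contribute nothing

channel = np.array([[1.0, 0.0],      # one-bit secret, output equals secret
                    [0.0, 1.0]])
prior = np.array([0.5, 0.5])
print(shannon_leakage(channel, prior))   # -> 1.0 bit leaked
```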

  2. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research often lack much-needed procedural information. In an effort to promote researchers' dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  3. The use of semi-structured interviews for collection of qualitative and quantitative data in hydrological studies

    Science.gov (United States)

    O'Keeffe, Jimmy; Buytaert, Wouter; Mijic, Ana; Brozovic, Nicholas

    2015-04-01

    To build an accurate, robust understanding of the environment, it is important to collect not only information describing its physical characteristics, but also information on the drivers which influence it. As environmental change, from increasing CO2 levels to decreasing water levels, is often heavily influenced by human activity, gathering information on anthropogenic as well as environmental variables is extremely important. This can mean collecting qualitative as well as quantitative information. In reality, studies are often bound by financial and time constraints, limiting the depth and detail of the research. It is up to the researcher to determine the best methodology to answer the research questions. Here we present a methodology of collecting qualitative and quantitative information in tandem for hydrological studies through the use of semi-structured interviews. This is applied to a case study in two districts of Uttar Pradesh, North India, one of the most intensely irrigated areas of the world. Here, decreasing water levels, exacerbated by unchecked water abstraction, an expanding population and government subsidies, have put the long-term resilience of the farming population in doubt. Through random selection of study locations, combined with convenience sampling of the participants therein, we show how the data collected can provide valuable insight into the drivers which have led to the current water scenario. We also show how reliable quantitative information can, using the same methodology, be effectively and efficiently extracted for modelling purposes, which, along with developing an understanding of the characteristics of the environment, is vital in coming up with realistic and sustainable solutions for water resource management in the future.

  4. Detection and quantitation of trace phenolphthalein (in pharmaceutical preparations and in forensic exhibits) by liquid chromatography-tandem mass spectrometry, a sensitive and accurate method.

    Science.gov (United States)

    Sharma, Kakali; Sharma, Shiba P; Lahiri, Sujit C

    2013-01-01

    Phenolphthalein, an acid-base indicator and laxative, is important as a constituent of widely used weight-reducing multicomponent food formulations. Phenolphthalein is a useful reagent in forensic science for the identification of blood stains of suspected victims and for apprehending erring officials accepting bribes in graft or trap cases. The pink-colored alkaline hand washes originating from phenolphthalein-smeared notes can easily be determined spectrophotometrically. But in many cases, the colored solution turns colorless with time, which renders the genuineness of bribe cases doubtful to the judiciary. Until now, no method has been known for the detection and identification of phenolphthalein in colorless forensic exhibits with positive proof. Liquid chromatography-tandem mass spectrometry was found to be the most sensitive and accurate method capable of detection and quantitation of trace phenolphthalein in commercial formulations and colorless forensic exhibits with positive proof. The detection limit of phenolphthalein was found to be 1.66 pg/μL (i.e., ng/mL), and the calibration curve shows good linearity (r(2) = 0.9974). © 2012 American Academy of Forensic Sciences.

  5. Chemometric study of Andalusian extra virgin olive oils Raman spectra: Qualitative and quantitative information.

    Science.gov (United States)

    Sánchez-López, E; Sánchez-Rodríguez, M I; Marinas, A; Marinas, J M; Urbano, F J; Caridad, J M; Moalem, M

    2016-08-15

    Authentication of extra virgin olive oil (EVOO) is an important topic for the olive oil industry. Fraudulent practices in this sector are a major problem affecting both producers and consumers. This study analyzes the capability of FT-Raman spectroscopy combined with chemometric treatments to predict fatty acid content (quantitative information), using gas chromatography as the reference technique, and to classify diverse EVOOs as a function of harvest year, olive variety, geographical origin and Andalusian PDO (qualitative information). The optimal number of PLS components that summarizes the spectral information was introduced progressively. For the estimation of the fatty acid composition, the lowest error (both in fitting and prediction) corresponded to MUFA, followed by SAFA and PUFA, though such errors were close to zero in all cases. As regards the qualitative variables, discriminant analysis allowed a correct classification of 94.3%, 84.0%, 89.0% and 86.6% of samples for harvest year, olive variety, geographical origin and PDO, respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Quantitative assessment of hematopoietic chimerism by quantitative real-time polymerase chain reaction of sequence polymorphism systems after hematopoietic stem cell transplantation.

    Science.gov (United States)

    Qin, Xiao-ying; Li, Guo-xuan; Qin, Ya-zhen; Wang, Yu; Wang, Feng-rong; Liu, Dai-hong; Xu, Lan-ping; Chen, Huan; Han, Wei; Wang, Jing-zhi; Zhang, Xiao-hui; Li, Jin-lan; Li, Ling-di; Liu, Kai-yan; Huang, Xiao-jun

    2011-08-01

    Analysis of changes in recipient and donor hematopoietic cell origin is extremely useful for monitoring the effect of hematopoietic stem cell transplantation (HSCT) and of sequential adoptive immunotherapy by donor lymphocyte infusions. We developed a sensitive, reliable and rapid real-time PCR method based on sequence polymorphism systems to quantitatively assess hematopoietic chimerism after HSCT. A panel of 29 selected sequence polymorphism (SP) markers was screened by real-time PCR in 101 HSCT patients with leukemia and other hematological diseases. The chimerism kinetics of bone marrow samples from 8 HSCT patients in remission and relapse situations were followed longitudinally. Recipient genotype discrimination was possible in 97.0% (98 of 101) with a mean number of 2.5 (1-7) informative markers per recipient/donor pair. Using serial dilutions of plasmids containing specific SP markers, a linear correlation (r) of 0.99, slopes between -3.2 and -3.7 and a sensitivity of 0.1% were shown to be reproducible. With this method, it was possible to very accurately detect autologous signals in the range from 0.1% to 30%. The accuracy of the method in the very important range of autologous signals below 5% was extraordinarily high (standard deviation …). The advantage of this real-time PCR method over short tandem repeat PCR chimerism assays is the absence of PCR competition and plateau biases, with demonstrated greater sensitivity and linearity. Finally, we prospectively analyzed bone marrow samples of 8 patients who received allografts and present the chimerism kinetics of remission and relapse situations that illustrate the sensitivity level and the promising clinical application of this method. This SP-based real-time PCR assay provides a rapid, sensitive, and accurate quantitative assessment of mixed chimerism that can be useful in predicting graft rejection and early relapse.
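
    A sketch of the standard-curve arithmetic behind such slopes: a slope near -3.32 corresponds to 100% PCR amplification efficiency via E = 10^(-1/slope) - 1, and the fitted line is inverted to quantify unknowns (the Ct values below are illustrative, chosen to give a slope in the reported -3.2 to -3.7 range):

```python
import numpy as np

# Ct values measured on serial plasmid dilutions of an SP marker
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
ct = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

slope, intercept = np.polyfit(log10_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # perfect doubling: slope -3.32, E = 1.0
print(f"slope={slope:.2f}, efficiency={100 * efficiency:.0f}%")

def copies_from_ct(ct_value):
    """Invert the calibration line to estimate input copy number."""
    return 10 ** ((ct_value - intercept) / slope)

print(copies_from_ct(26.0))
```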

  7. Quantitation of circulating tumor cells in blood samples from ovarian and prostate cancer patients using tumor-specific fluorescent ligands.

    Science.gov (United States)

    He, Wei; Kularatne, Sumith A; Kalli, Kimberly R; Prendergast, Franklyn G; Amato, Robert J; Klee, George G; Hartmann, Lynn C; Low, Philip S

    2008-10-15

    Quantitation of circulating tumor cells (CTCs) can provide information on the stage of a malignancy, onset of disease progression and response to therapy. In an effort to more accurately quantitate CTCs, we have synthesized fluorescent conjugates of 2 high-affinity tumor-specific ligands (folate-AlexaFluor 488 and DUPA-FITC) that bind tumor cells >20-fold more efficiently than fluorescent antibodies. Here we determine whether these tumor-specific dyes can be exploited for quantitation of CTCs in peripheral blood samples from cancer patients. A CTC-enriched fraction was isolated from the peripheral blood of ovarian and prostate cancer patients by an optimized density gradient centrifugation protocol and labeled with the aforementioned fluorescent ligands. CTCs were then quantitated by flow cytometry. CTCs were detected in 18 of 20 ovarian cancer patients (mean 222 CTCs/ml; median 15 CTCs/ml; maximum 3,118 CTCs/ml), whereas CTC numbers in 16 gender-matched normal volunteers were negligible (mean 0.4 CTCs/ml; median 0.3 CTCs/ml; maximum 1.5 CTCs/ml; p < 0.001, χ2). CTCs were also detected in 10 of 13 prostate cancer patients (mean 26 CTCs/ml, median 14 CTCs/ml, maximum 94 CTCs/ml) but not in 18 gender-matched healthy donors (mean 0.8 CTCs/ml, median 1, maximum 3 CTCs/ml; p < 0.0026, χ2). Tumor-specific fluorescent antibodies were much less efficient in quantitating CTCs because of their lower CTC labeling efficiency. Use of tumor-specific fluorescent ligands to label CTCs in peripheral blood can provide a simple, accurate and sensitive method for determining the number of cancer cells circulating in the bloodstream.

  8. Quantitative SPECT reconstruction of iodine-123 data

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1991-01-01

    Many clinical and research studies in nuclear medicine require quantitation of iodine-123 (123I) distribution for the determination of kinetics or localization. The objective of this study was to implement several reconstruction methods designed for single-photon emission computed tomography (SPECT) using 123I and to evaluate their performance in terms of quantitative accuracy, image artifacts, and noise. The methods consisted of four attenuation and scatter compensation schemes incorporated into both the filtered backprojection/Chang (FBP) and maximum likelihood-expectation maximization (ML-EM) reconstruction algorithms. The methods were evaluated on data acquired of a phantom containing a hot sphere of 123I activity in a lower level background 123I distribution and nonuniform density media. For both reconstruction algorithms, nonuniform attenuation compensation combined with either scatter subtraction or Metz filtering produced images that were quantitatively accurate to within 15% of the true value. The ML-EM algorithm demonstrated quantitative accuracy comparable to FBP and smaller relative noise magnitude for all compensation schemes
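
    The ML-EM algorithm mentioned in the record has a compact multiplicative update. A toy sketch with a tiny system matrix (NumPy assumed; this omits the attenuation/scatter compensation modelling used in the study):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Basic ML-EM for emission tomography.
    A: (n_detectors, n_voxels) system matrix; y: measured projections.
    Update: x <- x * A^T(y / Ax) / A^T 1."""
    x = np.ones(A.shape[1])
    sensitivity = A.sum(axis=0)                  # A^T 1
    for _ in range(n_iter):
        projection = A @ x
        ratio = y / np.maximum(projection, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x

# Toy 2-voxel, 3-detector system
A = np.array([[1.0, 0.2], [0.5, 0.5], [0.2, 1.0]])
x_true = np.array([4.0, 1.0])
y = A @ x_true                                   # noiseless projections
print(mlem(A, y))                                # converges toward x_true
```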

  9. Absolute quantitation of proteins by acid hydrolysis combined with amino acid detection by mass spectrometry

    DEFF Research Database (Denmark)

    Mirgorodskaya, Olga A; Körner, Roman; Kozmin, Yuri P

    2012-01-01

    Amino acid analysis is among the most accurate methods for absolute quantification of proteins and peptides. Here, we combine acid hydrolysis with the addition of isotopically labeled standard amino acids and analysis by mass spectrometry for accurate and sensitive protein quantitation...

  10. A method for improved clustering and classification of microscopy images using quantitative co-localization coefficients

    LENUS (Irish Health Repository)

    Singan, Vasanth R

    2012-06-08

    Background: The localization of proteins to specific subcellular structures in eukaryotic cells provides important information with respect to their function. Fluorescence microscopy approaches to determine localization distribution have proved to be an essential tool in the characterization of unknown proteins, and are now particularly pertinent as a result of the wide availability of fluorescently-tagged constructs and antibodies. However, there are currently very few image analysis options able to effectively discriminate proteins with apparently similar distributions in cells, despite this information being important for protein characterization. Findings: We have developed a novel method for combining two existing image analysis approaches, which results in highly efficient and accurate discrimination of proteins with seemingly similar distributions. We have combined image texture-based analysis with quantitative co-localization coefficients, a method that has traditionally only been used to study the spatial overlap between two populations of molecules. Here we describe and present a novel application for quantitative co-localization, as applied to the study of Rab family small GTP binding proteins localizing to the endomembrane system of cultured cells. Conclusions: We show how quantitative co-localization can be used alongside texture feature analysis, resulting in improved clustering of microscopy images. The use of co-localization as an additional clustering parameter is non-biased and highly applicable to high-throughput image data sets.
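
    A sketch of the combination the record describes: append quantitative co-localization coefficients (e.g., Pearson and a Manders coefficient) to a per-image texture feature vector before clustering. scikit-learn assumed; the mean/std features below merely stand in for proper texture descriptors, and the images are random placeholders:

```python
import numpy as np
from sklearn.cluster import KMeans

def pearson_coloc(ch1, ch2):
    """Pearson co-localization coefficient between two channels."""
    return np.corrcoef(ch1.ravel(), ch2.ravel())[0, 1]

def manders_m1(ch1, ch2, thresh=0.0):
    """Manders' M1: fraction of channel-1 signal overlapping channel 2."""
    mask = ch2 > thresh
    return ch1[mask].sum() / ch1.sum()

rng = np.random.default_rng(0)
images = [(rng.random((64, 64)), rng.random((64, 64))) for _ in range(20)]

# Per-image feature vector = [texture features ..., co-localization coefficients]
features = np.array([[img1.mean(), img1.std(),                 # stand-in texture features
                      pearson_coloc(img1, img2),
                      manders_m1(img1, img2, thresh=0.5)]
                     for img1, img2 in images])
print(KMeans(n_clusters=3, n_init=10).fit_predict(features))
```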

  11. Quantitative microanalysis with a nuclear microprobe

    International Nuclear Information System (INIS)

    Themner, Klas.

    1989-01-01

    The analytical techniques of particle induced X-ray emission (PIXE) and Rutherford backscattering (RBS), together with the nuclear microprobe, form a very powerful tool for performing quantitative microanalysis of biological material. Calibration of the X-ray detection system in the microprobe set-up has been performed and the accuracy of the quantitative procedure using RBS for determination of the areal mass density was investigated. The accuracy of the analysis can be affected by alteration in the elemental concentrations during irradiation due to the radiation damage induced by the very intense beams of ionizing radiation. Loss of matrix elements from freeze-dried tissue sections and polymer films has been studied during proton and photon irradiation and the effect on the accuracy discussed. Scanning the beam over an area of the target, with e.g. 32x32 pixels, in order to produce an elemental map yields a great deal of information and, to make an accurate quantification, a fast algorithm using descriptions of the different spectral contributions is needed. The production of continuum X-rays by 2.55 MeV protons has been studied and absolute cross-sections for the bremsstrahlung production from thin carbon and some polymer films determined. For the determination of the bremsstrahlung background, knowledge of the amounts of the matrix elements is important, and a fast program for the evaluation of spectra of proton back- and forward scattering from biological samples has been developed. Quantitative microanalysis with the nuclear microprobe has been performed on brain tissue from rats subjected to different pathological conditions. Increases in calcium levels and decreases in potassium levels for animals subjected to cerebral ischaemia and for animals suffering from epileptic seizures were observed coincidentally with, or in some cases before, visible signs of cell necrosis. (author)

  12. Fast and accurate edge orientation processing during object manipulation

    Science.gov (United States)

    Flanagan, J Randall; Johansson, Roland S

    2018-01-01

    Quickly and accurately extracting information about a touched object’s orientation is a critical aspect of dexterous object manipulation. However, the speed and acuity of tactile edge orientation processing with respect to the fingertips as reported in previous perceptual studies appear inadequate in these respects. Here we directly establish the tactile system’s capacity to process edge-orientation information during dexterous manipulation. Participants extracted tactile information about edge orientation very quickly, using it within 200 ms of first touching the object. Participants were also strikingly accurate. With edges spanning the entire fingertip, edge-orientation resolution was better than 3° in our object manipulation task, which is several times better than reported in previous perceptual studies. Performance remained impressive even with edges as short as 2 mm, consistent with our ability to precisely manipulate very small objects. Taken together, our results radically redefine the spatial processing capacity of the tactile system. PMID:29611804

  13. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    Science.gov (United States)

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  14. Accurate core-electron binding energy shifts from density functional theory

    International Nuclear Information System (INIS)

    Takahata, Yuji; Marques, Alberto Dos Santos

    2010-01-01

    This review covers density functional methods for the calculation of accurate core-electron binding energies (CEBEs) of second- and third-row atoms, and applications of calculated CEBEs and CEBE shifts (ΔCEBEs) to the elucidation of topics such as hydrogen bonding, the peptide bond, polymers, DNA bases, Hammett substituent (σ) constants, inductive and resonance effects, quantitative structure-activity relationships (QSAR), and solid-state effects (WD). The review limits itself mainly to the work of Chong and his coworkers in the period after 2002. It is not a fully comprehensive account of the current state of the art.

  15. Camouflage target detection via hyperspectral imaging plus information divergence measurement

    Science.gov (United States)

    Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Ji, Yiqun; Shen, Weimin

    2016-01-01

    Target detection is one of the most important applications in remote sensing. Nowadays, accurate discrimination of camouflaged targets often relies on the spectral imaging technique, owing to its ability to acquire high-resolution spectral/spatial information and to the abundance of available data processing methods. In this paper, the hyperspectral imaging technique together with a spectral information divergence measure is used to solve the camouflage target detection problem. A self-developed visible-band hyperspectral imaging device is used to collect data cubes of an experimental scene, after which spectral information divergences are computed so as to discriminate camouflage targets and anomalies. Full-band information divergences are measured to evaluate the target detection performance visually and quantitatively. Information divergence measurement proves to be a low-cost and effective tool for target detection tasks and can be further developed for other target detection applications beyond the spectral imaging technique.
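
    Spectral information divergence (SID) treats each normalized spectrum as a probability vector and sums the two relative entropies between them. A minimal sketch (the spectra below are illustrative):

```python
import numpy as np

def sid(x, y, eps=1e-12):
    """Spectral Information Divergence between two spectra:
    normalize each spectrum to a probability vector, then return the
    symmetric sum of the two Kullback-Leibler divergences."""
    p = x / (x.sum() + eps) + eps
    q = y / (y.sum() + eps) + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

target = np.array([0.12, 0.30, 0.45, 0.60, 0.52])   # reference target spectrum
pixel = np.array([0.10, 0.28, 0.47, 0.63, 0.50])    # pixel under test
print(sid(target, pixel))   # small value -> spectrally similar to the target
```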

  16. Radionuclide angiocardiography. Improved diagnosis and quantitation of left-to-right shunts using area ratio techniques in children

    International Nuclear Information System (INIS)

    Alderson, P.O.; Jost, R.G.; Strauss, A.W.; Boonvisut, S.; Markham, J.

    1975-01-01

    A comparison of several reported methods for detection and quantitation of left-to-right shunts by radionuclides was performed in 50 children. Count ratio (C2/C1) techniques were compared with the exponential extrapolation and gamma function area ratio techniques. C2/C1 ratios accurately detected shunts and could reliably separate shunts from normals, but there was a high rate of false positives in children with valvular heart disease. The area ratio methods provided more accurate shunt quantitation and a better separation of patients with valvular heart disease than did the C2/C1 ratio. The gamma function method showed a higher correlation with oximetry than the exponential method, but the difference was not statistically significant. For accurate shunt quantitation and a reliable separation of patients with valvular heart disease from those with shunts, area ratio calculations are preferable to the C2/C1 ratio

  17. Quantitative evaluation methods of skin condition based on texture feature parameters

    Directory of Open Access Journals (Sweden)

    Hui Pang

    2017-03-01

    Full Text Available In order to quantitatively evaluate the improvement of the skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and then the locations of hairy pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray values of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After the above pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of the skin condition. Experiments show that evaluating the skin condition with this method is both in line with biochemical skin evaluation methods and fully consistent with human visual experience. This method overcomes the shortcomings of the biochemical evaluation method (skin damage and long waiting times) as well as the subjectivity and fuzziness of visual evaluation, achieving a non-destructive, rapid and quantitative evaluation of the skin condition. It can be used for health assessment or classification of the skin condition, and can also quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.

  18. Quantitative evaluation methods of skin condition based on texture feature parameters.

    Science.gov (United States)

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after using skin care products or beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hairy pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray values of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that the skin condition evaluated with this method agrees both with biochemistry-based evaluation methods and with human visual assessment. The method overcomes the shortcomings of the biochemical evaluation method, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, and achieves non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can also quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.

  19. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel with the secret as the input and an output observable by the attacker as the output, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system’s output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially... and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...
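
    For intuition, the Shannon leakage of a channel matrix C (rows: secret values, columns: observables) under a prior on the secret is the mutual information I(S;O) = H(S) − H(S|O). The snippet below is a minimal sketch of that computation and is not tied to QUAIL or its attacker model.

```python
import numpy as np

def shannon_leakage(channel, prior):
    """Mutual information I(S;O) in bits, for channel[s, o] = P(o | s)."""
    joint = prior[:, None] * channel              # P(s, o)
    p_o = joint.sum(axis=0)                       # marginal P(o)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = joint / (prior[:, None] * p_o[None, :])
        terms = np.where(joint > 0, joint * np.log2(ratio), 0.0)
    return float(terms.sum())

# A deterministic identity channel leaks the one-bit secret completely.
C = np.array([[1.0, 0.0],
              [0.0, 1.0]])
print(shannon_leakage(C, np.array([0.5, 0.5])))   # -> 1.0 bit
```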

  20. A Study of the DeLone & McLean Information System Success Model among Users of the Accurate Accounting Information System in Sukabumi City

    OpenAIRE

    Hudin, Jamal Maulana; Riana, Dwiza

    2016-01-01

    Accurate accounting information system is one of the accounting information systems used in the six companies in the city of Sukabumi. The DeLone and McLean information system success model is a suitable model to measure the success of the application of information systems in an organization or company. This study analyzes the factors of the DeLone & McLean information systems success model among the users of the Accurate accounting information systems in six companies in the city of Sukabumi. ...

  1. Quantitative versus semiquantitative MR imaging of cartilage in blood-induced arthritic ankles: preliminary findings

    International Nuclear Information System (INIS)

    Doria, Andrea S.; Zhang, Ningning; Lundin, Bjorn; Hilliard, Pamela; Man, Carina; Weiss, Ruth; Detzler, Garry; Blanchette, Victor; Moineddin, Rahim; Eckstein, Felix; Sussman, Marshall S.

    2014-01-01

    Recent advances in hemophilia prophylaxis have raised the need for accurate noninvasive methods for assessment of early cartilage damage in maturing joints to guide initiation of prophylaxis. Such methods can either be semiquantitative or quantitative. Whereas semiquantitative scores are less time-consuming to perform than quantitative methods, they are prone to subjective interpretation. To test the feasibility of a manual segmentation and a quantitative methodology for cross-sectional evaluation of articular cartilage status in growing ankles of children with blood-induced arthritis, as compared with a semiquantitative scoring system and clinical-radiographic constructs. Twelve boys, 11 with hemophilia (A, n = 9; B, n = 2) and 1 with von Willebrand disease (median age: 13; range: 6-17), underwent physical examination and MRI at 1.5 T. Two radiologists semiquantitatively scored the MRIs for cartilage pathology (surface erosions, cartilage loss) with blinding to clinical information. An experienced operator applied a validated quantitative 3-D MRI method to determine the percentage area of denuded bone (dAB) and the cartilage thickness (ThCtAB) in the joints' MRIs. Quantitative and semiquantitative MRI methods and clinical-radiographic constructs (Hemophilia Joint Health Score [HJHS], Pettersson radiograph scores) were compared. Moderate correlations were noted between erosions and dAB (r = 0.62, P = 0.03) in the talus but not in the distal tibia (P > 0.05). Whereas substantial to high correlations (r range: 0.70-0.94, P < 0.05) were observed between erosions, cartilage loss, HJHS and Pettersson scores both at the distal tibia and talus levels, moderate/borderline substantial (r range: 0.55-0.61, P < 0.05) correlations were noted between dAB/ThCtAB and clinical-radiographic constructs. Whereas the semiquantitative method of assessing cartilage status is closely associated with clinical-radiographic scores in cross-sectional studies of blood-induced arthropathy

  2. Quantitative versus semiquantitative MR imaging of cartilage in blood-induced arthritic ankles: preliminary findings

    Energy Technology Data Exchange (ETDEWEB)

    Doria, Andrea S. [The Hospital for Sick Children, Department of Diagnostic Imaging, Toronto, ON (Canada); University of Toronto, Department of Medical Imaging, Toronto, ON (Canada); Zhang, Ningning [Children' s Hospital, Department of Radiology, Beijing (China); Lundin, Bjorn [Skaane University Hospital and Lund University, University Hospital of Lund, Center for Medical Imaging and Physiology, Lund (Sweden); Hilliard, Pamela [The Hospital for Sick Children, Department of Rehabilitation Services, Toronto, ON (Canada); Man, Carina; Weiss, Ruth; Detzler, Garry [The Hospital for Sick Children, Department of Diagnostic Imaging, Toronto, ON (Canada); Blanchette, Victor [The Hospital for Sick Children, Department of Hematology, Toronto, ON (Canada); Moineddin, Rahim [Family and Community Medicine, Department of Public Health, Toronto, ON (Canada); Eckstein, Felix [Paracelsus Medical University, Institute of Anatomy and Musculoskeletal Research, Salzburg (Austria); Chondrometrics GmbH, Ainring (Germany); Sussman, Marshall S. [University of Toronto, Department of Medical Imaging, Toronto, ON (Canada); University Health Network, Department of Medical Imaging, Toronto, ON (Canada)

    2014-05-15

    Recent advances in hemophilia prophylaxis have raised the need for accurate noninvasive methods for assessment of early cartilage damage in maturing joints to guide initiation of prophylaxis. Such methods can either be semiquantitative or quantitative. Whereas semiquantitative scores are less time-consuming to perform than quantitative methods, they are prone to subjective interpretation. To test the feasibility of a manual segmentation and a quantitative methodology for cross-sectional evaluation of articular cartilage status in growing ankles of children with blood-induced arthritis, as compared with a semiquantitative scoring system and clinical-radiographic constructs. Twelve boys, 11 with hemophilia (A, n = 9; B, n = 2) and 1 with von Willebrand disease (median age: 13; range: 6-17), underwent physical examination and MRI at 1.5 T. Two radiologists semiquantitatively scored the MRIs for cartilage pathology (surface erosions, cartilage loss) with blinding to clinical information. An experienced operator applied a validated quantitative 3-D MRI method to determine the percentage area of denuded bone (dAB) and the cartilage thickness (ThCtAB) in the joints' MRIs. Quantitative and semiquantitative MRI methods and clinical-radiographic constructs (Hemophilia Joint Health Score [HJHS], Pettersson radiograph scores) were compared. Moderate correlations were noted between erosions and dAB (r = 0.62, P = 0.03) in the talus but not in the distal tibia (P > 0.05). Whereas substantial to high correlations (r range: 0.70-0.94, P < 0.05) were observed between erosions, cartilage loss, HJHS and Pettersson scores both at the distal tibia and talus levels, moderate/borderline substantial (r range: 0.55-0.61, P < 0.05) correlations were noted between dAB/ThCtAB and clinical-radiographic constructs. Whereas the semiquantitative method of assessing cartilage status is closely associated with clinical-radiographic scores in cross-sectional studies of blood

  3. Electronic imaging systems for quantitative electrophoresis of DNA

    International Nuclear Information System (INIS)

    Sutherland, J.C.

    1989-01-01

    Gel electrophoresis is one of the most powerful and widely used methods for the separation of DNA. During the last decade, instruments have been developed that accurately quantitate in digital form the distribution of materials in a gel or on a blot prepared from a gel. In this paper, I review the various physical properties that can be used to quantitate the distribution of DNA on gels or blots and the instrumentation that has been developed to perform these tasks. The emphasis here is on DNA, but much of what is said also applies to RNA, proteins and other molecules. 36 refs

  4. High accurate time system of the Low Latitude Meridian Circle.

    Science.gov (United States)

    Yang, Jing; Wang, Feng; Li, Zhiming

    In order to obtain a highly accurate time signal for the Low Latitude Meridian Circle (LLMC), a new GPS-based accurate time system was developed, which includes a GPS receiver, a 1 MHz frequency source and a self-made clock system. The GPS second (1 PPS) signal is used to synchronize the clock system, and the information can be collected automatically by a computer. The difficulty of eliminating the time keeper can be overcome by using this system.

  5. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can easily secure traceability to the SI unit system, and discussions of its accuracy and uncertainty have also begun. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual analysis cases in quantitative NMR. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it is possible to obtain precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method avoids contaminating the sample and allows sample recovery, there are many reported cases related to the quantitative analysis of biologically related samples and highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussions are also progressing on how to adopt this analytical method as an official method in various countries around the world. In Japan, this method is listed in the Pharmacopoeia and the Japanese Standard of Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analytical method that can ensure traceability to the SI unit system. (A.O.)

  6. Chlorophyll fluorescence imaging accurately quantifies freezing damage and cold acclimation responses in Arabidopsis leaves

    Directory of Open Access Journals (Sweden)

    Hincha Dirk K

    2008-05-01

    Full Text Available Abstract Background Freezing tolerance is an important factor in the geographical distribution of plants and strongly influences crop yield. Many plants increase their freezing tolerance during exposure to low, nonfreezing temperatures in a process termed cold acclimation. There is considerable natural variation in the cold acclimation capacity of Arabidopsis that has been used to study the molecular basis of this trait. Accurate methods for the quantitation of freezing damage in leaves that include spatial information about the distribution of damage and allow screening of large populations of plants are necessary, but currently not available. In addition, currently used standard methods such as electrolyte leakage assays are very laborious and therefore not easily applicable for large-scale screening purposes. Results We have performed freezing experiments with the Arabidopsis accessions C24 and Tenela, which differ strongly in their freezing tolerance, both before and after cold acclimation. Freezing tolerance of detached leaves was investigated using the well established electrolyte leakage assay as a reference. Chlorophyll fluorescence imaging was used as an alternative method that provides spatial resolution of freezing damage over the leaf area. With both methods, LT50 values (i.e. the temperature at which 50% damage occurred) could be derived as quantitative measures of leaf freezing tolerance. Both methods revealed the expected differences between acclimated and nonacclimated plants and between the two accessions, and LT50 values were tightly correlated. However, electrolyte leakage assays consistently yielded higher LT50 values than chlorophyll fluorescence imaging. This was in large part due to the incubation of leaves for electrolyte leakage measurements in distilled water, which apparently led to secondary damage, while this pre-incubation was not necessary for the chlorophyll fluorescence measurements. Conclusion Chlorophyll
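
    An LT50 of the kind used here can be estimated by fitting a sigmoid to damage (%) versus freezing temperature and reading off the curve's midpoint. The following is a minimal scipy sketch with invented example data, not the assay pipeline from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(temp, lt50, slope):
    """Damage (%) rising from 0 to 100 as temperature drops past LT50."""
    return 100.0 / (1.0 + np.exp(slope * (temp - lt50)))

# Hypothetical damage readings (e.g. from electrolyte leakage or chlorophyll
# fluorescence) at a series of freezing temperatures in degrees Celsius.
temps = np.array([-2.0, -4.0, -6.0, -8.0, -10.0, -12.0])
damage = np.array([3.0, 10.0, 35.0, 70.0, 92.0, 98.0])

(lt50, slope), _ = curve_fit(logistic, temps, damage, p0=(-7.0, 1.0))
print(f"LT50 = {lt50:.1f} degC")   # temperature at which 50% damage occurs
```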

  7. Supramolecular assembly affording a ratiometric two-photon fluorescent nanoprobe for quantitative detection and bioimaging.

    Science.gov (United States)

    Wang, Peng; Zhang, Cheng; Liu, Hong-Wen; Xiong, Mengyi; Yin, Sheng-Yan; Yang, Yue; Hu, Xiao-Xiao; Yin, Xia; Zhang, Xiao-Bing; Tan, Weihong

    2017-12-01

    Fluorescence quantitative analyses for vital biomolecules are in great demand in biomedical science owing to their unique detection advantages with rapid, sensitive, non-damaging and specific identification. However, available fluorescence strategies for quantitative detection are usually hard to design and achieve. Inspired by supramolecular chemistry, a two-photon-excited fluorescent supramolecular nanoplatform (TPSNP) was designed for quantitative analysis with three parts: host molecules (β-CD polymers), a guest fluorophore of sensing probes (Np-Ad) and a guest internal reference (NpRh-Ad). In this strategy, the TPSNP possesses the merits of (i) improved water-solubility and biocompatibility; (ii) increased tissue penetration depth for bioimaging by two-photon excitation; (iii) quantitative and tunable assembly of functional guest molecules to obtain optimized detection conditions; (iv) a common approach to avoid the limitation of complicated design by adjustment of sensing probes; and (v) accurate quantitative analysis by virtue of reference molecules. As a proof-of-concept, we utilized the two-photon fluorescent probe NHS-Ad-based TPSNP-1 to realize accurate quantitative analysis of hydrogen sulfide (H2S), with high sensitivity and good selectivity in live cells, deep tissues and ex vivo-dissected organs, suggesting that the TPSNP is an ideal quantitative indicator for clinical samples. What's more, TPSNP will pave the way for designing and preparing advanced supramolecular sensors for biosensing and biomedicine.

  8. The importance of accurate meteorological input fields and accurate planetary boundary layer parameterizations, tested against ETEX-1

    International Nuclear Information System (INIS)

    Brandt, J.; Ebel, A.; Elbern, H.; Jakobs, H.; Memmesheimer, M.; Mikkelsen, T.; Thykier-Nielsen, S.; Zlatev, Z.

    1997-01-01

    Atmospheric transport of air pollutants is, in principle, a well understood process. If information about the state of the atmosphere is given in all details (infinitely accurate information about wind speed, etc.) and infinitely fast computers are available then the advection equation could in principle be solved exactly. This is, however, not the case: discretization of the equations and input data introduces some uncertainties and errors in the results. Therefore many different issues have to be carefully studied in order to diminish these uncertainties and to develop an accurate transport model. Some of these are e.g. the numerical treatment of the transport equation, accuracy of the mean meteorological input fields and parameterizations of sub-grid scale phenomena (as e.g. parameterizations of the 2nd and higher order turbulence terms in order to reach closure in the perturbation equation). A tracer model for studying transport and dispersion of air pollution caused by a single but strong source is under development. The model simulations from the first ETEX release illustrate the differences caused by using various analyzed fields directly in the tracer model or using a meteorological driver. Also different parameterizations of the mixing height and the vertical exchange are compared. (author)

  9. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
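
    To make two of the listed approaches concrete for a failure probability, a conjugate Beta prior can be fitted by the method of moments from past failure-rate observations, after which the Bayesian update is simple count addition. This is a schematic sketch under that conjugacy assumption, with invented numbers, not the paper's exact formulation.

```python
import numpy as np

def beta_prior_moments(past_rates):
    """Method-of-moments Beta(a, b) fit to past failure-rate observations."""
    m, v = np.mean(past_rates), np.var(past_rates, ddof=1)
    common = m * (1.0 - m) / v - 1.0      # valid only while v < m(1-m)
    return m * common, (1.0 - m) * common

# Hypothetical failure rates observed on comparable past systems.
a, b = beta_prior_moments([0.02, 0.05, 0.03, 0.04])

# Conjugate posterior update with new data: k failures in n trials.
k, n = 3, 100
a_post, b_post = a + k, b + (n - k)
print(f"posterior mean failure probability: {a_post / (a_post + b_post):.4f}")
```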

  10. The value of quantitative MRI using 1.5 T magnet in diagnosis of carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Mohammad Fouad Abdel Baki Allam

    2017-03-01

    Conclusion: Quantitative 1.5 T MRI is an accurate diagnostic tool in CTS. The increase in MN ADC value from proximal to distal with an ADC ratio cutoff value of 1 is highly accurate in diagnosing CTS.

  11. ExSTA: External Standard Addition Method for Accurate High-Throughput Quantitation in Targeted Proteomics Experiments.

    Science.gov (United States)

    Mohammed, Yassene; Pan, Jingxi; Zhang, Suping; Han, Jun; Borchers, Christoph H

    2018-03-01

    Targeted proteomics using MRM with stable-isotope-labeled internal-standard (SIS) peptides is the current method of choice for protein quantitation in complex biological matrices. Better quantitation can be achieved with the internal standard-addition method, where successive increments of the synthesized natural form (NAT) of the endogenous analyte are added to each sample, a response curve is generated, and the endogenous concentration is determined at the x-intercept. Internal NAT-addition, however, requires multiple analyses of each sample, resulting in increased sample consumption and analysis time. To compare the following three methods, an MRM assay for 34 high-to-moderate abundance human plasma proteins is used: classical internal SIS-addition, internal NAT-addition, and external NAT-addition, with the response curve generated in buffer using NAT and SIS peptides. Using endogenous-free chicken plasma, the accuracy is also evaluated. The internal NAT-addition outperforms the other two in precision and accuracy. However, the curves derived by internal vs. external NAT-addition differ by only ≈3.8% in slope, providing comparable accuracies and precision with good CV values. While the internal NAT-addition method may be "ideal", this new external NAT-addition can be used to determine the concentration of high-to-moderate abundance endogenous plasma proteins, providing a robust and cost-effective alternative for clinical analyses or other high-throughput applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
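
    The standard-addition readout described above (endogenous concentration at the x-intercept of response versus added NAT) reduces to a straight-line fit. A schematic numeric sketch with invented concentrations and response ratios:

```python
import numpy as np

# Added NAT amounts and the corresponding measured NAT/SIS response ratios
# (all values hypothetical, units arbitrary).
added = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
response = np.array([0.52, 0.77, 1.03, 1.55, 2.58])

slope, intercept = np.polyfit(added, response, 1)   # linear response curve
endogenous = abs(-intercept / slope)                # magnitude of x-intercept
print(f"endogenous concentration ~ {endogenous:.1f} (same units as 'added')")
```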

  12. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    Science.gov (United States)

    Chakraborty, Monisha; Ghosh, Dipak

    2018-04-01

    An accurate prognostic tool to identify the severity of Arrhythmia is yet to be investigated, owing to the complexity of the ECG signal. In this paper, we have shown that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficacy of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series are collected from the MIT-BIH database. Normal ECG time series are acquired using the POLYPARA system. Both time series are analyzed in the light of the non-linear approach following the method of "Rescaled Range Analysis". The quantitative parameter, "Fractal Dimension" (D), is obtained from both types of time series. The major finding is that Arrhythmia ECG shows lower values of D as compared to normal. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction of prognosis, and adequate software may be developed for use in medical practice.
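
    The quantitative parameter in question comes from rescaled-range analysis: the slope of log(R/S) against log(window size) estimates the Hurst exponent H, and the fractal dimension of the series is then commonly taken as D = 2 − H. The sketch below is a bare-bones version of that computation, not the authors' software.

```python
import numpy as np

def hurst_rs(series, window_sizes=(16, 32, 64, 128, 256)):
    """Hurst exponent via rescaled-range (R/S) analysis; returns (H, D)."""
    series = np.asarray(series, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            dev = np.cumsum(w - w.mean())       # cumulative deviation profile
            r, s = dev.max() - dev.min(), w.std()
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    h = np.polyfit(log_n, log_rs, 1)[0]         # slope of the log-log fit
    return h, 2.0 - h                           # fractal dimension D = 2 - H

h, d = hurst_rs(np.random.randn(4096))          # white noise: H ~ 0.5, D ~ 1.5
```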

  13. Quantitative pre-surgical lung function estimation with SPECT/CT

    International Nuclear Information System (INIS)

    Bailey, D. L.; Willowson, K. P.; Timmins, S.; Harris, B. E.; Bailey, E. A.; Roach, P. J.

    2009-01-01

    Full text: Objectives: To develop methodology to predict lobar lung function based on SPECT/CT ventilation and perfusion (V/Q) scanning in candidates for lobectomy for lung cancer. Methods: This combines two development areas from our group: quantitative SPECT based on CT-derived corrections for scattering and attenuation of photons, and SPECT V/Q scanning with lobar segmentation from CT. Eight patients underwent baseline pulmonary function testing (PFT) including spirometry, measurement of DLCO and cardio-pulmonary exercise testing. A SPECT/CT V/Q scan was acquired at baseline. Using in-house software, each lobe was anatomically defined using CT to provide lobar ROIs which could be applied to the SPECT data. From these, the individual lobar contribution to overall function was calculated from counts within the lobe, and post-operative FEV1, DLCO and VO2 peak were predicted. This was compared with the quantitative planar scan method using 3 rectangular ROIs over each lung. Results: Post-operative FEV1 most closely matched that predicted by the planar quantification method, with SPECT V/Q over-estimating the loss of function by 8% (range: -7 to +23%). However, post-operative DLCO and VO2 peak were both accurately predicted by SPECT V/Q (average error of 0 and 2% respectively) compared with planar. Conclusions: More accurate anatomical definition of lobar anatomy provides better estimates of post-operative loss of function for DLCO and VO2 peak than traditional planar methods. SPECT/CT provides the tools for accurate anatomical definitions of the surgical target as well as being useful in producing quantitative 3D functional images for ventilation and perfusion.
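
    The prediction step itself reduces to count accounting: the predicted post-operative value is the baseline value scaled by the fraction of total counts lying outside the lobe to be resected. A sketch of that arithmetic with invented counts and lobe names:

```python
# Counts per lobe from the segmented SPECT V/Q study (hypothetical values).
lobe_counts = {"RUL": 180e3, "RML": 90e3, "RLL": 260e3,
               "LUL": 230e3, "LLL": 240e3}
resect = "RLL"                                 # lobe planned for resection

fraction_lost = lobe_counts[resect] / sum(lobe_counts.values())
baseline_fev1 = 2.4                            # litres, from baseline spirometry
ppo_fev1 = baseline_fev1 * (1.0 - fraction_lost)
print(f"predicted post-op FEV1 ~ {ppo_fev1:.2f} L "
      f"({fraction_lost:.0%} of counts in {resect})")
```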

  14. Accurate and efficient spin integration for particle accelerators

    International Nuclear Information System (INIS)

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; Barber, Desmond P.

    2015-01-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.

  15. Accurate and efficient spin integration for particle accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Abell, Dan T.; Meiser, Dominic [Tech-X Corporation, Boulder, CO (United States); Ranjbar, Vahid H. [Brookhaven National Laboratory, Upton, NY (United States); Barber, Desmond P. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2015-01-15

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.

  16. Indirect iodometric procedure for quantitation of Sn(II) in radiopharmaceutical kits

    International Nuclear Information System (INIS)

    Muddukrishna, S.N.; Chen, A.; Sykes, T.R.; Noujaim, A.A.; Alberta Univ., Edmonton, AB

    1994-01-01

    A method of quantitating stannous ion [Sn(II)] suitable for radiopharmaceutical kits, based on indirect iodometric titration, is described. The method is based on the oxidation of Sn(II) using a known excess of iodine; the excess unreacted iodine is then determined by potentiometric titration with thiosulphate. The titration cell is a beaker and the titrations are done conveniently under air using an autotitrator in approx. 4 min. The method is accurate and is linear in the range of approx. 10 μg to approx. 6 mg of Sn(II). Several radiopharmaceutical kits were analysed for their Sn(II) content using the method, including those containing antibodies or other proteins. The studies indicate that the procedure is rapid, simple and accurate for routine quantitative estimation of Sn(II) in radiopharmaceutical preparations during development, manufacture and storage. (Author)
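
    The back-titration arithmetic implied here follows from the two reactions Sn²⁺ + I₂ → Sn⁴⁺ + 2I⁻ and I₂ + 2S₂O₃²⁻ → 2I⁻ + S₄O₆²⁻: moles of Sn(II) equal moles of iodine added minus half the moles of thiosulphate consumed. A worked sketch with invented concentrations and volumes:

```python
# Hypothetical titration inputs.
iodine_added_mol = 0.0100 * 5.00e-3     # 0.0100 M I2, 5.00 mL added
thio_used_mol = 0.0200 * 3.80e-3        # 0.0200 M thiosulphate, 3.80 mL used

# Sn2+ + I2 -> Sn4+ + 2I-   and   I2 + 2 S2O3(2-) -> 2 I- + S4O6(2-)
excess_i2_mol = thio_used_mol / 2.0
sn_mol = iodine_added_mol - excess_i2_mol
sn_mass_mg = sn_mol * 118.71 * 1e3      # molar mass of Sn ~ 118.71 g/mol
print(f"Sn(II) ~ {sn_mass_mg:.2f} mg")  # ~1.42 mg, inside the linear range
```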

  17. An Accurate liver segmentation method using parallel computing algorithm

    International Nuclear Information System (INIS)

    Elbasher, Eiman Mohammed Khalied

    2014-12-01

    Computed Tomography (CT or CAT scan) is a noninvasive diagnostic imaging procedure that uses a combination of X-rays and computer technology to produce horizontal, or axial, images (often called slices) of the body. A CT scan shows detailed images of any part of the body, including the bones, muscles, fat and organs. CT scans are more detailed than standard X-rays. CT scans may be done with or without "contrast". Contrast refers to a substance taken by mouth and/or injected into an intravenous (IV) line that causes the particular organ or tissue under study to be seen more clearly. CT scans of the liver and biliary tract are used in the diagnosis of many diseases of the abdominal structures, particularly when another type of examination, such as X-rays, physical examination or ultrasound, is not conclusive. Unfortunately, the presence of noise and artifacts in the edges and fine details of CT images limits the contrast resolution and makes the diagnostic procedure more difficult. This experimental study was conducted at the College of Medical Radiological Science, Sudan University of Science and Technology and Fidel Specialist Hospital. The study sample included 50 patients. The main objective of this research was to study an accurate liver segmentation method using a parallel computing algorithm, and to segment the liver and adjacent organs using image processing techniques. The main segmentation technique used in this study was the watershed transform. The scope of image processing and analysis applied to medical applications is to improve the quality of the acquired image and extract quantitative information from medical image data in an efficient and accurate way. The results of this technique agreed with the results of Jarritt et al. (2010), Kratchwil et al. (2010), Jover et al. (2011), Yomamoto et al. (1996), Cai et al. (1999), and Saudha and Jayashree (2010), who used different segmentation filtering based on methods of enhancing the computed tomography images. Another

  18. [A new method of calibration and positioning in quantitative analysis of multicomponents by single marker].

    Science.gov (United States)

    He, Bing; Yang, Shi-Yan; Zhang, Yan

    2012-12-01

    This paper aims to establish a new method of calibration and positioning in quantitative analysis of multicomponents by a single marker (QAMS), using Shuanghuanglian oral liquid as the research object. Relative correction factors were established with chlorogenic acid as the reference for 11 other active components (neochlorogenic acid, cryptochlorogenic acid, caffeic acid, forsythoside A, scutellarin, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, baicalin, phillyrin and wogonoside) in Shuanghuanglian oral liquid by 3 correction methods (multipoint correction, slope correction and quantitative factor correction). At the same time, chromatographic peaks were positioned by the linear regression method. Only one standard was used to determine the content of 12 components in Shuanghuanglian oral liquid, instead of needing many reference substances for quality control. The results showed that, within the linear ranges, no significant differences were found in the quantitative results for the 12 active constituents in 3 batches of Shuanghuanglian oral liquid determined by the 3 correction methods and the external standard method (ESM) or standard curve method (SCM). This method is simpler and quicker than literature methods, and the results were accurate and reliable, with good reproducibility. Positioning chromatographic peaks by linear regression was also more accurate than using the relative retention times reported in the literature. Using slope correction and quantitative factor correction to control the quality of traditional Chinese medicine is feasible and accurate.
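
    In QAMS, a relative correction factor ties each analyte's detector response to that of the single marker so that one standard quantifies all components. Conventions differ between papers, so the sketch below is one common formulation with invented numbers, not necessarily the exact one used here: f = (A_ref/C_ref)/(A_k/C_k) from a calibration run, then C_k = f · A_k · C_ref / A_ref in the sample.

```python
# Calibration run: peak areas at known concentrations (hypothetical numbers).
A_ref_cal, C_ref_cal = 1.50e6, 50.0   # single marker (e.g. chlorogenic acid)
A_k_cal, C_k_cal = 0.90e6, 40.0       # one analyte (e.g. forsythoside A)

f_k = (A_ref_cal / C_ref_cal) / (A_k_cal / C_k_cal)  # relative correction factor

# Sample run: only the marker's concentration needs to be known.
A_ref, C_ref = 1.42e6, 48.0           # marker peak area and concentration
A_k = 0.75e6                          # analyte peak area in the sample
C_k = f_k * A_k * C_ref / A_ref       # analyte concentration, same units as C_ref
print(f"f = {f_k:.3f}, C = {C_k:.1f}")
```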

  19. Characterization of 3D PET systems for accurate quantification of myocardial blood flow

    OpenAIRE

    Renaud, Jennifer M.; Yip, Kathy; Guimond, Jean; Trottier, Mikaël; Pibarot, Philippe; Turcotte, Éric; Maguire, Conor; Lalonde, Lucille; Gulenchyn, Karen; Farncombe, Troy; Wisenberg, Gerald; Moody, Jonathan; Lee, Benjamin; Port, Steven C.; Turkington, Timothy G

    2016-01-01

    Three-dimensional (3D) mode imaging is the current standard for positron emission tomography-computed tomography (PET-CT) systems. Dynamic imaging for quantification of myocardial blood flow (MBF) with short-lived tracers, such as Rb-82-chloride (Rb-82), requires accuracy to be maintained over a wide range of isotope activities and scanner count-rates. We propose new performance standard measurements to characterize the dynamic range of PET systems for accurate quantitative...

  20. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs such as:Comparative approaches (graph similarity or distance)Graph measures to characterize graphs quantitat

  1. High-performance liquid chromatographic quantitation of desmosine plus isodesmosine in elastin and whole tissue hydrolysates

    International Nuclear Information System (INIS)

    Soskel, N.T.

    1987-01-01

    Quantitation of desmosine and isodesmosine, the major crosslinks in elastin, has been of interest because of their uniqueness and use as markers of that protein. Accurate measurement of these crosslinks may allow determination of elastin degradation in vivo and elastin content in tissues, obviating lengthy extraction procedures. We have developed a method of quantitating desmosine plus isodesmosine in hydrolysates of tissue and insoluble elastin using high-performance liquid chromatographic separation and absorbance detection that is rapid (21-35 min) and sensitive (accurate linearity from 100 pmol to 5 nmol). This method has been used to quantitate desmosines in elastin from bovine nuchal ligament and lung and in whole aorta from hamsters. The ability to completely separate [3H]lysine from desmosine plus isodesmosine allows the method to be used to study incorporation of lysine into crosslinks in elastin

  2. Quantitative in situ magnetization reversal studies in Lorentz microscopy and electron holography

    International Nuclear Information System (INIS)

    Rodríguez, L.A.; Magén, C.; Snoeck, E.; Gatel, C.; Marín, L.; Serrano-Ramón, L.

    2013-01-01

    A generalized procedure for the in situ application of magnetic fields by means of the excitation of the objective lens for magnetic imaging experiments in Lorentz microscopy and electron holography is quantitatively described. A protocol for applying magnetic fields with arbitrary in-plane magnitude and orientation is presented, and a freeware script for Digital Micrograph™ is provided to assist the operation of the microscope. Moreover, a method to accurately reconstruct hysteresis loops is detailed. We show that the out-of-plane component of the magnetic field cannot always be neglected when performing quantitative measurements of the local magnetization. Several examples are shown to demonstrate the accuracy and functionality of the methods. - Highlights: • Generalized procedure for application of magnetic fields with the TEM objective lens. • Arbitrary in-plane magnetic field magnitude and orientation can be applied. • Method to accurately reconstruct hysteresis loops by electron holography. • Out-of-plane field component should be considered in quantitative measurements. • Examples to illustrate the method in Lorentz microscopy and electron holography

  3. Accurate method of the magnetic field measurement of quadrupole magnets

    International Nuclear Information System (INIS)

    Kumada, M.; Sakai, I.; Someya, H.; Sasaki, H.

    1983-01-01

    We present an accurate method for measuring the magnetic field of quadrupole magnets. The method of obtaining information on the field gradient and the effective focussing length is given. A new scheme to obtain information on the skew field components is also proposed. The relative accuracy of the measurement was 1 × 10⁻⁴ or less. (author)

  4. Photogrammetry of the human brain: a novel method for 3D the quantitative exploration of the structural connectivity in neurosurgery and neurosciences.

    Science.gov (United States)

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomical awareness of the brain's structural connectivity is mandatory for neurosurgeons in selecting the most effective approaches for brain resections. Although standard micro-dissection is a validated technique for investigating the different white matter (WM) pathways and for verifying results coming from tractography, the possibility of an interactive exploration of the specimens and of a reliable acquisition of quantitative information has not been described so far. Photogrammetry is a well-established technique allowing accurate metrology on highly defined 3D models. The aim of this work is to propose the application of the photogrammetric technique for supporting 3D exploration and quantitative analysis of cerebral WM connectivity. The main peri-sylvian pathways, including the superior longitudinal fascicle (SLF) and the arcuate fascicle (AF), were exposed using Klingler's technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference MRI of the specimen. All the acquisitions were co-registered into an open-source model. We analyzed five steps, including: the cortical surface, the short intergyral fibers, the indirect posterior and anterior SLF, and the AF. The co-registration between the MRI mesh and the point-cloud models proved highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during the post-dissection analysis. These results open many new promising neuroscientific and educational perspectives, and can also help optimize the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Label-free characterization of ultra violet-radiation-induced changes in skin fibroblasts with Raman spectroscopy and quantitative phase microscopy.

    Science.gov (United States)

    Singh, S P; Kang, Sungsam; Kang, Jeon Woong; So, Peter T C; Dasari, Ramanchandra Rao; Yaqoob, Zahid; Barman, Ishan

    2017-09-07

    Minimizing morbidities and mortalities associated with skin cancers requires sustained research with the goal of obtaining fresh insights into disease onset and progression under specific stimuli, particularly the influence of ultraviolet rays. In the present study, label-free profiling of skin fibroblasts exposed to time-bound ultra-violet radiation has been performed using quantitative phase imaging and Raman spectroscopy. Statistically significant differences in quantifiable biophysical parameters, such as matter density and cell dry mass, were observed with phase imaging. Accurate estimation of changes in the biochemical constituents, notably nucleic acids and proteins, was demonstrated through a combination of Raman spectroscopy and multivariate analysis of spectral patterns. Overall, the findings of this study demonstrate the promise of these non-perturbative optical modalities in accurately identifying cellular phenotypes and responses to external stimuli by combining molecular and biophysical information.

  6. Quantitative self-assembly prediction yields targeted nanomedicines

    Science.gov (United States)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  7. Using pseudoalignment and base quality to accurately quantify microbial community composition.

    Directory of Open Access Journals (Sweden)

    Mark Reppell

    2018-04-01

    Full Text Available Pooled DNA from multiple unknown organisms arises in a variety of contexts, for example microbial samples from ecological or human health research. Determining the composition of pooled samples can be difficult, especially at the scale of modern sequencing data and reference databases. Here we propose a novel method for taxonomic profiling in pooled DNA that combines the speed and low-memory requirements of k-mer based pseudoalignment with a likelihood framework that uses base quality information to better resolve multiply mapped reads. We apply the method to the problem of classifying 16S rRNA reads using a reference database of known organisms, a common challenge in microbiome research. Using simulations, we show the method is accurate across a variety of read lengths, with different length reference sequences, at different sample depths, and when samples contain reads originating from organisms absent from the reference. We also assess performance in real 16S data, where we reanalyze previous genetic association data to show our method discovers a larger number of quantitative trait associations than other widely used methods. We implement our method in the software Karp, for k-mer based analysis of read pools, to provide a novel combination of speed and accuracy that is uniquely suited for enhancing discoveries in microbial studies.
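
    The core of such a base-quality likelihood can be sketched simply: each base's Phred score Q gives an error probability e = 10^(−Q/10), and the probability of a read given a candidate reference is the product of (1 − e) over matching bases and e/3 over mismatching ones. This toy version (not Karp's implementation) shows the idea on two hypothetical references:

```python
import numpy as np

def read_log10_likelihood(read, ref, quals):
    """log10 P(read | ref) from per-base Phred qualities."""
    err = 10.0 ** (-np.asarray(quals, dtype=float) / 10.0)
    match = np.frombuffer(read.encode(), np.uint8) == \
            np.frombuffer(ref.encode(), np.uint8)
    return float(np.where(match, np.log10(1.0 - err),
                          np.log10(err / 3.0)).sum())

# A single low-quality mismatch costs far less than a high-quality one,
# so base quality helps resolve reads that map to several references.
print(read_log10_likelihood("ACGTACGT", "ACGTACGT", [30] * 8))
print(read_log10_likelihood("ACGTACGT", "ACGTTCGT",
                            [30, 30, 30, 30, 10, 30, 30, 30]))
```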

  8. The accurate assessment of small-angle X-ray scattering data.

    Science.gov (United States)

    Grant, Thomas D; Luft, Joseph R; Carter, Lester G; Matsui, Tsutomu; Weiss, Thomas M; Martel, Anne; Snell, Edward H

    2015-01-01

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations is defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  9. Quantitative X-ray microanalysis of biological specimens

    International Nuclear Information System (INIS)

    Roomans, G.M.

    1988-01-01

    Quantitative X-ray microanalysis of biological specimens requires an approach that is somewhat different from that used in the materials sciences. The first step is deconvolution and background subtraction on the obtained spectrum. The further treatment depends on the type of specimen: thin, thick, or semithick. For thin sections, the continuum method of quantitation is most often used, but it should be combined with an accurate correction for extraneous background. However, alternative methods to determine local mass should also be considered. In the analysis of biological bulk specimens, the ZAF-correction method appears to be less useful, primarily because of the uneven surface of biological specimens. The peak-to-local background model may be a more adequate method for thick specimens that are not mounted on a thick substrate. Quantitative X-ray microanalysis of biological specimens generally requires the use of standards that preferably should resemble the specimen in chemical and physical properties. Special problems in biological microanalysis include low count rates, specimen instability and mass loss, extraneous contributions to the spectrum, and preparative artifacts affecting quantitation. A relatively recent development in X-ray microanalysis of biological specimens is the quantitative determination of local water content

  10. Accurate and efficient spin integration for particle accelerators

    Directory of Open Access Journals (Sweden)

    Dan T. Abell

    2015-02-01

    Full Text Available Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code gpuSpinTrack. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.

  11. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    Science.gov (United States)

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  12. Comparison of Variable Number Tandem Repeat and Short Tandem Repeat Genetic Markers for Qualitative and Quantitative Chimerism Analysis Post Allogeneic Stem Cell Transplantation

    International Nuclear Information System (INIS)

    Mossallam, G.I.; Smith, A.G.; Mcfarland, C.

    2005-01-01

    amplified STR products offered the advantages of more rapid and accurate quantitative assessment of chimerism. The choice between these two techniques depends on the need for quantitative or qualitative information, the availability of equipment, and the cost

  13. Semi-quantitative prediction of a multiple API solid dosage form with a combination of vibrational spectroscopy methods.

    Science.gov (United States)

    Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T

    2016-05-30

    Quality control (QC) in the pharmaceutical industry is a key activity in ensuring medicines have the required quality, safety and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products but also all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (API) [bisoprolol, hydrochlorothiazide] in different commercially available dosages were analysed using Raman and NIR spectroscopy. The goal was to define multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and predict their dosage (semi-quantitative). Furthermore, the combination of spectroscopic techniques was investigated. Therefore, two different multiblock techniques based on PLS have been applied: multiblock PLS (MB-PLS) and sequential-orthogonalised PLS (SO-PLS). NIRS showed better results compared to Raman spectroscopy for both identification and quantitation. The multiblock techniques investigated showed that each spectroscopy contains information not present or captured with the other spectroscopic technique, thus demonstrating that there is a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Quantitation of specific binding ratio in 123I-FP-CIT SPECT: accurate processing strategy for cerebral ventricular enlargement with use of 3D-striatal digital brain phantom.

    Science.gov (United States)

    Furuta, Akihiro; Onishi, Hideo; Amijima, Hizuru

    2018-06-01

    This study aimed to evaluate the effect of ventricular enlargement on the specific binding ratio (SBR) and to validate the cerebrospinal fluid (CSF)-Mask algorithm for quantitative SBR assessment of 123I-FP-CIT single-photon emission computed tomography (SPECT) images with the use of a 3D-striatum digital brain (SDB) phantom. Ventricular enlargement was simulated by three-dimensional extensions in a 3D-SDB phantom comprising segments representing the striatum, ventricle, brain parenchyma, and skull bone. The Evans Index (EI) was measured in 3D-SDB phantom images of an enlarged ventricle. Projection data sets were generated from the 3D-SDB phantoms with blurring, scatter, and attenuation. Images were reconstructed using the ordered subset expectation maximization (OSEM) algorithm and corrected for attenuation, scatter, and resolution recovery. We bundled DaTView (Southampton method) with the CSF-Mask processing software for SBR. We assessed SBR with the use of various coefficients (f factor) of the CSF-Mask. Specific binding ratios of 1, 2, 3, 4, and 5 corresponded to SDB phantom simulations with true values. Measured SBRs were underestimated by > 50% compared with the true SBR as EI increased, and this trend was most pronounced at low SBR. The CSF-Mask improved the 20% underestimates and brought the measured SBR closer to the true values at an f factor of 1.0, despite the increase in EI. We connected the linear regression function (y = -3.53x + 1.95; r = 0.95) between the EI and f factor using root-mean-square error. Processing with CSF-Mask generates accurate quantitative SBR from dopamine transporter SPECT images of patients with ventricular enlargement.
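
    At its simplest, the specific binding ratio is SBR = (C_striatal − C_reference)/C_reference, computed from counts in a striatal VOI and a nonspecific reference region; the Southampton method refines this with total VOI counts to limit partial-volume effects. The sketch below is the simple mean-counts form with synthetic numbers, not the DaTView implementation:

```python
import numpy as np

def specific_binding_ratio(striatal_voxels, reference_voxels):
    """SBR from mean counts in striatal and nonspecific reference VOIs."""
    c_str = np.mean(striatal_voxels)    # striatal VOI counts per voxel
    c_ref = np.mean(reference_voxels)   # e.g. occipital reference VOI
    return (c_str - c_ref) / c_ref

# Hypothetical voxel samples drawn for illustration only.
rng = np.random.default_rng(0)
sbr = specific_binding_ratio(rng.normal(60, 5, 500), rng.normal(15, 3, 2000))
print(f"SBR ~ {sbr:.2f}")               # ~3 for these made-up counts
```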

  15. Quantitative interpretation of the magnetic susceptibility frequency dependence

    Science.gov (United States)

    Ustra, Andrea; Mendonça, Carlos A.; Leite, Aruã; Jovane, Luigi; Trindade, Ricardo I. F.

    2018-05-01

    Low-field mass-specific magnetic susceptibility (MS) measurements using multifrequency alternating fields are commonly used to evaluate the concentration of ferrimagnetic particles in the transition from superparamagnetic (SP) to stable single domain (SSD). In classical palaeomagnetic analyses, this measurement serves as a preliminary assessment of rock samples, providing rapid, non-destructive, economical and easily obtained information on magnetic properties. The SP-SSD transition is relevant in environmental studies because it has been associated with several geological and biogeochemical processes affecting magnetic mineralogy. MS is a complex function of mineral type and grain-size distribution, as well as measuring parameters such as external field magnitude and frequency. In this work, we propose a new technique to obtain quantitative information on grain-size variations of magnetic particles in the SP-SSD transition by inverting frequency-dependent susceptibility. We introduce a descriptive parameter named the `limiting frequency effect' that provides an accurate estimation of MS loss with frequency. Numerical simulations show the methodology's capability to provide data fitting and model parameters in many practical situations. Real-data applications with magnetite nanoparticles and core samples from sediments of the Poggio le Guaine section of the Umbria-Marche Basin (Italy) provide additional information not clearly recognized when interpreting cruder MS data. Caution is needed when interpreting frequency dependence in terms of single relaxation processes, which are not universally applicable and depend upon the nature of the magnetic mineral in the material. Nevertheless, the proposed technique is a promising tool for SP-SSD content analyses.

  16. Quantitative measurements of shear displacement using atomic force microscopy

    International Nuclear Information System (INIS)

    Wang, Wenbo; Wu, Weida; Sun, Ying; Zhao, Yonggang

    2016-01-01

    We report a method to quantitatively measure local shear deformation with high sensitivity using atomic force microscopy. The key point is to simultaneously detect both torsional and buckling motions of atomic force microscopy (AFM) cantilevers induced by the lateral piezoelectric response of the sample. This requires the quantitative calibration of torsional and buckling response of AFM. This method is validated by measuring the angular dependence of the in-plane piezoelectric response of a piece of piezoelectric α-quartz. The accurate determination of the amplitude and orientation of the in-plane piezoelectric response, without rotation, would greatly enhance the efficiency of lateral piezoelectric force microscopy.

  17. Quantitation in planar renal scintigraphy: which μ value should be used?

    International Nuclear Information System (INIS)

    Hindie, E.; Jeanguillaume, C.; Galle, P.; Prigent, A.

    1999-01-01

    The attenuation coefficient value μ used by different authors for quantitation in planar renal scintigraphy varies greatly, from the theoretical value of 0.153 cm -1 (appropriate for scatter-free data) down to 0.099 cm -1 (empirical value assumed to compensate for both scatter and attenuation). For a 6-cm-deep kidney, such variations introduce up to 30% differences in absolute measurement of kidney activity. Using technetium-99m phantom studies, we determined the μ values that would yield accurate kidney activity quantitation for different energy windows corresponding to different amounts of scatter, and when using different image analysis approaches similar to those used in renal quantitation. With the 20% energy window, it was found that the μ value was strongly dependent on the size of the region of interest (ROI) and on whether background subtraction was performed: the μ value thus varied from 0.119 cm -1 (loose ROI, no background subtraction) to 0.150 cm -1 (kidney ROI and background subtraction). When using data from an energy window that could be considered scatter-free, the μ value became almost independent of the image analysis scheme. It is concluded that: (1) when performing background subtraction, which implicitly reduces the effect of scatter, the μ value to be used for accurate quantitation is close to the theoretical μ value; (2) if the acquired data were initially corrected for scatter, the appropriate μ value would then be the theoretical μ value, whatever the image analysis scheme. (orig.)
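
    The role of μ is easy to see in the first-order depth correction used in planar quantitation. A minimal sketch (the function, default sensitivity and the simple exponential model are our assumptions; the μ values are those discussed above):

    ```python
    import math

    def kidney_activity(counts_roi: float, depth_cm: float,
                        mu_per_cm: float = 0.153,
                        sensitivity_counts_per_mbq: float = 1.0) -> float:
        """First-order attenuation correction for planar renal quantitation:
        true activity ~ measured ROI counts * exp(mu * depth) / sensitivity.
        The default mu is the theoretical narrow-beam value for 99mTc,
        appropriate once scatter has been removed."""
        return counts_roi * math.exp(mu_per_cm * depth_cm) / sensitivity_counts_per_mbq

    # For a 6-cm-deep kidney, the theoretical and empirical mu values differ
    # by exp((0.153 - 0.099) * 6) ~ 1.38x in estimated absolute activity --
    # the tens-of-percent spread in kidney activity discussed above.
    print(kidney_activity(1000.0, 6.0, 0.153) / kidney_activity(1000.0, 6.0, 0.099))
    ```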

  18. Automatic and accurate segmentation of cerebral tissues in fMRI datasets with a combination of image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in medical science. One application is multimodality imaging, especially the fusion of structural imaging with functional imaging, which includes CT, MRI and newer modalities such as optical imaging to obtain functional images. The fusion process requires precisely extracted structural information in order to register the functional image to it. Here we used image enhancement and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM), on 5 fMRI head image datasets. We then utilized a convolutional neural network to perform automatic segmentation of the images in a deep learning manner. This approach greatly reduced the processing time compared to manual and semi-automatic segmentation, and its speed and accuracy improve as more samples are learned. The contours of the borders of the different tissues on all images were accurately extracted and visualized in 3D. This can be used in low-level light therapy and in optical simulation software such as MCVM. We obtained a precise three-dimensional distribution of the brain, which offers doctors and researchers quantitative volume data and detailed morphological characterization for precise personalized medicine of cerebral atrophy/expansion. We hope this technique can bring convenience to medical visualization and personalized medicine.
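
    As an illustration of the classical pre-processing stage described above, a minimal contour-extraction sketch using scikit-image (the synthetic slice, threshold choice and morphology parameters are our assumptions; the paper's CNN stage is not reproduced here):

    ```python
    import numpy as np
    from skimage import filters, measure, morphology

    def extract_tissue_contours(slice_2d: np.ndarray):
        """Otsu threshold + morphological cleanup, then marching-squares
        contours -- a simple stand-in for the enhancement/morphometry step
        that precedes CNN training in the pipeline described above."""
        mask = slice_2d > filters.threshold_otsu(slice_2d)
        mask = morphology.remove_small_objects(mask, min_size=64)
        mask = morphology.binary_closing(mask, morphology.disk(2))
        return measure.find_contours(mask.astype(float), 0.5)

    rng = np.random.default_rng(1)
    fake_slice = rng.normal(0.2, 0.05, (128, 128))
    fake_slice[32:96, 32:96] += 0.6  # bright block standing in for tissue
    print(len(extract_tissue_contours(fake_slice)), "contour(s) found")
    ```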

  19. MR Fingerprinting for Rapid Quantitative Abdominal Imaging.

    Science.gov (United States)

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas

    2016-04-01

    To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were consistent with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). In summary, a rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue properties.
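
    Parameter estimation in MR fingerprinting reduces to a dictionary search over simulated signal evolutions. A minimal sketch of that matching step (the toy dictionary, time points and parameter triples are illustrative, not the authors' FISP simulations):

    ```python
    import numpy as np

    def mrf_match(signal: np.ndarray, dictionary: np.ndarray, params: np.ndarray):
        """Return the (T1, T2, B1) triple of the dictionary entry whose
        normalized signal evolution best matches the measured voxel signal
        (maximum magnitude inner product)."""
        d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
        s = signal / np.linalg.norm(signal)
        return params[np.argmax(np.abs(d @ np.conj(s)))]

    dic = np.array([[1, .8, .6, .5], [1, .5, .3, .2], [1, .9, .8, .7]], dtype=complex)
    pars = np.array([[745, 31, 1.0], [253, 77, 1.0], [1314, 47, 1.0]])  # T1, T2, B1
    print(mrf_match(np.array([1, .52, .31, .19], dtype=complex), dic, pars))
    # -> [253. 77. 1.], the entry the measured evolution most resembles
    ```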

  20. A quantitative method to analyze the quality of EIA information in wind energy development and avian/bat assessments

    International Nuclear Information System (INIS)

    Chang, Tony; Nielsen, Erik; Auberle, William; Solop, Frederic I.

    2013-01-01

    The environmental impact assessment (EIA) has been a tool for decision makers since the enactment of the National Environmental Policy Act (NEPA). Since that time, few analyses have been performed to verify the quality of information and content within EIAs. High quality information within assessments is vital in order for decision makers, stakeholders, and the public to understand the potential impact of proposed actions on the ecosystem and wildlife species. Low quality information has been a major cause for litigation and economic loss. Since 1999, wind energy development has seen exponential growth with unknown levels of impact on wildlife species, in particular bird and bat species. The purpose of this article is to: (1) develop, validate, and apply a quantitative index to review avian/bat assessment quality for wind energy EIAs; and (2) assess the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. This research presents the development and testing of the Avian and Bat Assessment Quality Index (ABAQI), a new approach to quantify information quality of ecological assessments within wind energy development EIAs in relation to avian and bat species based on review areas and factors derived from 23 state wind/wildlife siting guidance documents. The ABAQI was tested through a review of 49 publicly available EIA documents and validated by identifying high variation in avian and bat assessment quality for wind energy developments. Of all the reviewed EIAs, 66% failed to provide high levels of preconstruction avian and bat survey information, compared to recommended factors from state guidelines. This suggests the need for greater consistency in recommended state guidelines, and mandatory compliance by EIA preparers to avoid possible habitat and species loss, wind energy development shutdowns, and future lawsuits. - Highlights: ► We developed, validated, and applied a quantitative index to review avian/bat assessment quality

  1. A quantitative method to analyze the quality of EIA information in wind energy development and avian/bat assessments

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Tony, E-mail: tc282@nau.edu [Environmental Science and Policy Program, School of Earth Science and Environmental Sustainability, Northern Arizona University, 602 S Humphreys P.O. Box 5694, Flagstaff, AZ, 86011 (United States); Nielsen, Erik, E-mail: erik.nielsen@nau.edu [Environmental Science and Policy Program, School of Earth Science and Environmental Sustainability, Northern Arizona University, 602 S Humphreys P.O. Box 5694, Flagstaff, AZ, 86011 (United States); Auberle, William, E-mail: william.auberle@nau.edu [Civil and Environmental Engineering Program, Department of Civil and Environmental Engineering, Northern Arizona University, 2112 S Huffer Ln P.O. Box 15600, Flagstaff, AZ, 86011 (United States); Solop, Frederic I., E-mail: fred.solop@nau.edu [Political Science Program, Department of Politics and International Affairs, Northern Arizona University, P.O. Box 15036, Flagstaff, AZ 86001 (United States)

    2013-01-15

    The environmental impact assessment (EIA) has been a tool for decision makers since the enactment of the National Environmental Policy Act (NEPA). Since that time, few analyses have been performed to verify the quality of information and content within EIAs. High quality information within assessments is vital in order for decision makers, stakeholders, and the public to understand the potential impact of proposed actions on the ecosystem and wildlife species. Low quality information has been a major cause for litigation and economic loss. Since 1999, wind energy development has seen exponential growth with unknown levels of impact on wildlife species, in particular bird and bat species. The purpose of this article is to: (1) develop, validate, and apply a quantitative index to review avian/bat assessment quality for wind energy EIAs; and (2) assess the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. This research presents the development and testing of the Avian and Bat Assessment Quality Index (ABAQI), a new approach to quantify information quality of ecological assessments within wind energy development EIAs in relation to avian and bat species based on review areas and factors derived from 23 state wind/wildlife siting guidance documents. The ABAQI was tested through a review of 49 publicly available EIA documents and validated by identifying high variation in avian and bat assessment quality for wind energy developments. Of all the reviewed EIAs, 66% failed to provide high levels of preconstruction avian and bat survey information, compared to recommended factors from state guidelines. This suggests the need for greater consistency in recommended state guidelines, and mandatory compliance by EIA preparers to avoid possible habitat and species loss, wind energy development shutdowns, and future lawsuits. - Highlights: ► We developed, validated, and applied a quantitative index to review avian/bat assessment quality
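
    To make the index construction concrete, a minimal weighted-scoring sketch (the factor names, 0-3 scale and weights are illustrative placeholders; the actual ABAQI factors and weighting derive from the 23 state guidance documents):

    ```python
    def abaqi_like_score(factor_scores: dict, weights: dict) -> float:
        """Weighted mean of per-factor review scores -- a generic quality
        index in the spirit of the ABAQI described above."""
        total_weight = sum(weights.values())
        return sum(weights[k] * factor_scores[k] for k in weights) / total_weight

    scores = {"preconstruction_surveys": 2, "species_lists": 3, "impact_analysis": 1}
    weights = {"preconstruction_surveys": 3, "species_lists": 1, "impact_analysis": 2}
    print(abaqi_like_score(scores, weights))  # 0-3 scale, higher = better documented
    ```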

  2. Determination of quantitative tissue composition by iterative reconstruction on 3D DECT volumes

    Energy Technology Data Exchange (ETDEWEB)

    Magnusson, Maria [Linkoeping Univ. (Sweden). Dept. of Electrical Engineering; Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Linkoeping Univ. (Sweden). Center for Medical Image Science and Visualization (CMIV); Malusek, Alexandr [Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Linkoeping Univ. (Sweden). Center for Medical Image Science and Visualization (CMIV); Nuclear Physics Institute AS CR, Prague (Czech Republic). Dept. of Radiation Dosimetry; Muhammad, Arif [Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Carlsson, Gudrun Alm [Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Linkoeping Univ. (Sweden). Center for Medical Image Science and Visualization (CMIV)

    2011-07-01

    Quantitative tissue classification using dual-energy CT has the potential to improve accuracy in radiation therapy dose planning as it provides more information about material composition of scanned objects than the currently used methods based on single-energy CT. One problem that hinders successful application of both single- and dual-energy CT is the presence of beam hardening and scatter artifacts in reconstructed data. Current pre- and post-correction methods used for image reconstruction often bias CT attenuation values and thus limit their applicability for quantitative tissue classification. Here we demonstrate simulation studies with a novel iterative algorithm that decomposes every soft tissue voxel into three base materials: water, protein, and adipose. The results demonstrate that beam hardening artifacts can effectively be removed and accurate estimation of the mass fractions of each base material can be achieved. Our iterative algorithm starts by calculating parallel projections from two previously reconstructed DECT volumes obtained from fan-beam or helical projections with a small cone-beam angle. The parallel projections are then used in an iterative loop. Future developments include segmentation of soft and bone tissue and subsequent determination of bone composition. (orig.)
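
    Once base-material attenuation values are known, the per-voxel decomposition reduces to a small linear system. A minimal sketch (all attenuation numbers are illustrative; the paper's iterative beam-hardening correction wrapped around this step is not reproduced):

    ```python
    import numpy as np

    def three_material_decomposition(mu_low, mu_high, base_mu_low, base_mu_high):
        """Solve for (water, protein, adipose) fractions in one voxel from
        dual-energy attenuation values: two linear mixture equations (one
        per energy) plus the constraint that the fractions sum to one."""
        A = np.array([base_mu_low,       # mu at low kVp for the base materials
                      base_mu_high,      # mu at high kVp
                      [1.0, 1.0, 1.0]])  # volume conservation
        b = np.array([mu_low, mu_high, 1.0])
        return np.linalg.solve(A, b)

    f = three_material_decomposition(0.220, 0.180,
                                     base_mu_low=[0.227, 0.290, 0.190],
                                     base_mu_high=[0.184, 0.240, 0.160])
    print(dict(zip(["water", "protein", "adipose"], f.round(3))))
    ```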

  3. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    CERN Document Server

    Vekemans, B; Somogyi, A; Drakopoulos, M; Kempenaers, L; Simionovici, A; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem.
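
    The first, spectrum-deconvolution step is a non-linear least-squares fit of characteristic lines. A toy sketch in that spirit, fitting two Gaussian Fe K lines on a flat background with SciPy (AXIL's real model adds many lines, a parameterized continuum and the detector response):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian_peak(e, area, center, sigma):
        """One characteristic line modelled as a Gaussian of given area."""
        return area * np.exp(-0.5 * ((e - center) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def model(e, a1, a2, bkg):
        """Fe K-alpha (6.40 keV) and K-beta (7.06 keV) on a flat background."""
        return gaussian_peak(e, a1, 6.40, 0.08) + gaussian_peak(e, a2, 7.06, 0.08) + bkg

    energy = np.linspace(5.5, 8.0, 400)  # keV
    rng = np.random.default_rng(0)
    spectrum = rng.poisson(model(energy, 5000, 700, 20)).astype(float)  # counting noise

    popt, _ = curve_fit(model, energy, spectrum, p0=[1000, 100, 10])
    print("fitted line areas and background:", popt.round(1))  # ~[5000, 700, 20]
    ```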

  4. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  5. CMEIAS color segmentation: an improved computing technology to process color images for quantitative microbial ecology studies at single-cell resolution.

    Science.gov (United States)

    Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B

    2010-02-01

    Quantitative microscopy and digital image analysis are underutilized in microbial ecology largely because of the laborious task to segment foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/. This improved computing technology opens new opportunities for imaging applications where discriminating colors really matter most.
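
    The core classification idea can be sketched as nearest-centroid labeling in RGB space seeded by user-selected pixels (a deliberate simplification: the CMEIAS algorithm additionally uses the spatial relationships of the selected pixels to resolve overlapping color ranges; the toy image is our own):

    ```python
    import numpy as np

    def segment_by_color(image_rgb, fg_samples, bg_samples):
        """Label each pixel foreground/background by which sample centroid
        it is closer to in RGB space. Returns a boolean foreground mask."""
        fg_c = fg_samples.reshape(-1, 3).mean(axis=0)
        bg_c = bg_samples.reshape(-1, 3).mean(axis=0)
        px = image_rgb.reshape(-1, 3).astype(float)
        d_fg = np.linalg.norm(px - fg_c, axis=1)
        d_bg = np.linalg.norm(px - bg_c, axis=1)
        return (d_fg < d_bg).reshape(image_rgb.shape[:2])

    # Toy 2x2 image: two greenish cells on a brownish background
    img = np.array([[[40, 160, 60], [150, 120, 90]],
                    [[35, 150, 55], [145, 115, 85]]], dtype=np.uint8)
    fg = np.array([[38, 155, 58]]); bg = np.array([[148, 118, 88]])
    print(segment_by_color(img, fg, bg))
    ```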

  6. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
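
    Quantitation in this framework rests on the number-average fragment length of the DNA population. A minimal sketch of the standard random-breakage (Poisson) calculation (the Ln values are illustrative; gel calibration and dispersion corrections are omitted):

    ```python
    def lesions_per_mb(ln_treated_kb: float, ln_control_kb: float) -> float:
        """Number-average length analysis for alkaline gels: lesion
        frequency = 1/Ln(treated) - 1/Ln(control), with Ln the
        number-average single-stranded length. Input in kb, output per Mb."""
        return (1.0 / ln_treated_kb - 1.0 / ln_control_kb) * 1000.0

    # Example: Ln drops from 48.5 kb to 9.7 kb after treatment
    print(round(lesions_per_mb(9.7, 48.5), 1))  # ~82.5 lesions/Mb
    ```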

  7. An Informed Approach to Improving Quantitative Literacy and Mitigating Math Anxiety in Undergraduates Through Introductory Science Courses

    Science.gov (United States)

    Follette, K.; McCarthy, D.

    2012-08-01

    Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.

  8. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    Science.gov (United States)

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  9. Optical Computed-Tomographic Microscope for Three-Dimensional Quantitative Histology

    Directory of Open Access Journals (Sweden)

    Ravil Chamgoulov

    2004-01-01

    Full Text Available A novel optical computed‐tomographic microscope has been developed allowing quantitative three‐dimensional (3D) imaging and analysis of fixed pathological material. Rather than a conventional two‐dimensional (2D) image, the instrument produces a 3D representation of fixed absorption‐stained material, from which quantitative histopathological features can be measured more accurately. The accurate quantification of these features is critically important in disease diagnosis and the clinical classification of cancer. The system consists of two high NA objective lenses, a light source, a digital spatial light modulator (DMD, by Texas Instruments), an x–y stage, and a CCD detector. The DMD, positioned at the back pupil‐plane of the illumination objective, is employed to illuminate the specimen with parallel rays at any desired angle. The system uses a modification of the convolution backprojection algorithm for reconstruction. In contrast to fluorescent images acquired by a confocal microscope, this instrument produces 3D images of absorption‐stained material. Microscopic 3D volume reconstructions of absorption‐stained cells have been demonstrated. Reconstructed 3D images of individual cells and tissue can be cut virtually with the distance between the axial slices less than 0.5 μm.
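
    For reference, the textbook filtered-backprojection case that convolution-backprojection reconstruction builds on, using scikit-image (the phantom, angle count and scaling are arbitrary choices, and filter_name is the scikit-image ≥0.19 spelling; the instrument's modified algorithm for DMD-selected illumination angles is not reproduced):

    ```python
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import iradon, radon, rescale

    # Forward-project a phantom, then reconstruct by filtered backprojection.
    image = rescale(shepp_logan_phantom(), 0.25)
    angles = np.linspace(0.0, 180.0, 60, endpoint=False)
    sinogram = radon(image, theta=angles)
    reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
    print("RMS reconstruction error:", np.sqrt(np.mean((reconstruction - image) ** 2)))
    ```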

  10. Distributed 3-D iterative reconstruction for quantitative SPECT

    International Nuclear Information System (INIS)

    Ju, Z.W.; Frey, E.C.; Tsui, B.M.W.

    1995-01-01

    The authors describe a distributed three dimensional (3-D) iterative reconstruction library for quantitative single-photon emission computed tomography (SPECT). This library includes 3-D projector-backprojector pairs (PBPs) and distributed 3-D iterative reconstruction algorithms. The 3-D PBPs accurately and efficiently model various combinations of the image degrading factors including attenuation, detector response and scatter response. These PBPs were validated by comparing projection data computed using the projectors with that from direct Monte Carlo (MC) simulations. The distributed 3-D iterative algorithms spread the projection-backprojection operations for all the projection angles over a heterogeneous network of single or multi-processor computers to reduce the reconstruction time. Based on a master/slave paradigm, these distributed algorithms provide dynamic load balancing and fault tolerance. The distributed algorithms were verified by comparing images reconstructed using both the distributed and non-distributed algorithms. Computation times for distributed 3-D reconstructions running on up to 4 identical processors were reduced by a factor of approximately 80-90% of the number of participating processors, compared to those for non-distributed 3-D reconstructions running on a single processor. When combined with faster affordable computers, this library provides an efficient means for implementing accurate reconstruction and compensation methods to improve quality and quantitative accuracy in SPECT images
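
    The distribution pattern is straightforward to sketch with a process pool scattering per-angle projection work (a 2-D toy with a trivial projector; the library's 3-D projectors also model attenuation, detector and scatter response, and its master/slave scheme adds dynamic load balancing and fault tolerance):

    ```python
    import numpy as np
    from multiprocessing import Pool

    def project_one_angle(args):
        """Worker: forward-project the current image estimate at one angle
        by rotating and summing -- a stand-in for the full 3-D projectors."""
        from scipy.ndimage import rotate
        image, angle_deg = args
        return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

    if __name__ == "__main__":
        image = np.zeros((64, 64)); image[24:40, 24:40] = 1.0
        angles = range(0, 180, 6)
        with Pool(processes=4) as pool:  # spread the angles over processors
            projections = pool.map(project_one_angle, [(image, a) for a in angles])
        print(len(projections), "projections of length", len(projections[0]))
    ```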

  11. Local structure in LaMnO3 and CaMnO3 perovskites: A quantitative structural refinement of Mn K-edge XANES data

    International Nuclear Information System (INIS)

    Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D.D.

    2005-01-01

    Hole-doped perovskites such as La 1-x Ca x MnO 3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around Mn ions plays a crucial role in determining these peculiar features. Therefore experimental techniques directly probing the local atomic structure, like x-ray absorption spectroscopy (XAS), have been widely exploited to understand the physics of these compounds in depth. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of the XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K-edge XANES of LaMnO 3 and CaMnO 3 compounds; they are the end compounds of the doped manganite series La 1-x Ca x MnO 3 . The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly achievable from EXAFS data analysis. This work demonstrates that combining the analysis of extended and near-edge regions of Mn K-edge XAS spectra could provide a complete and accurate description of the Mn local atomic environment in these compounds

  12. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  13. Information-theoretic treatment of tripartite systems and quantum channels

    International Nuclear Information System (INIS)

    Coles, Patrick J.; Yu Li; Gheorghiu, Vlad; Griffiths, Robert B.

    2011-01-01

    A Holevo measure is used to discuss how much information about a given positive operator valued measure (POVM) on system a is present in another system b, and how this influences the presence or absence of information about a different POVM on a in a third system c. The main goal is to extend information theorems for mutually unbiased bases or general bases to arbitrary POVMs, and especially to generalize ''all-or-nothing'' theorems about information located in tripartite systems to the case of partial information, in the form of quantitative inequalities. Some of the inequalities can be viewed as entropic uncertainty relations that apply in the presence of quantum side information, as in recent work by Berta et al. [Nature Physics 6, 659 (2010)]. All of the results also apply to quantum channels: For example, if E accurately transmits certain POVMs, the complementary channel F will necessarily be noisy for certain other POVMs. While the inequalities are valid for mixed states of tripartite systems, restricting to pure states leads to the basis invariance of the difference between the information about a contained in b and c.

  14. Entropic information of dynamical AdS/QCD holographic models

    Energy Technology Data Exchange (ETDEWEB)

    Bernardini, Alex E., E-mail: alexeb@ufscar.br [Departamento de Física, Universidade Federal de São Carlos, PO Box 676, 13565-905, São Carlos, SP (Brazil); Rocha, Roldão da, E-mail: roldao.rocha@ufabc.edu.br [Centro de Matemática, Computação e Cognição, Universidade Federal do ABC, UFABC, 09210-580, Santo André (Brazil)

    2016-11-10

    The Shannon based conditional entropy that underlies five-dimensional Einstein–Hilbert gravity coupled to a dilaton field is investigated in the context of dynamical holographic AdS/QCD models. Considering the UV and IR dominance limits of such AdS/QCD models, the conditional entropy is shown to shed some light on the meson classification schemes, which corroborates the existence of light-flavor mesons of lower spins in Nature. Our analysis is supported by a correspondence between statistical mechanics and information entropy that establishes the physical grounds for the Shannon information entropy and provides the specifics needed to extend the entropic discussion accurately to continuous modes of physical systems. From entropic informational grounds, the conditional entropy allows one to identify the lower experimental/phenomenological occurrence of higher spin mesons in Nature. Moreover, it introduces a quantitative theoretical apparatus for studying the instability of high spin light-flavor mesons.

  15. Qualitative and quantitative measurement of human brain activity using pixel subtraction algorithm

    International Nuclear Information System (INIS)

    Lee, Jin Myoung; Jeong, Gwang Woo; Kim, Hyung Joong; Cho, Seong Hoon; Kang, Heoung Keun; Seo, Jeong Jin; Park, Seung Jin

    2004-01-01

    quantification method showed an average error of 0.334±0.007 (%). Thus, the manual counting method gave less accurate quantitative information on brain activation than the FALBA program. The FALBA program is capable of providing accurate quantitative results, including the identification of the brain activation region and lateralization index with respect to the functional and anatomical areas. Also, the processing time was dramatically shortened in comparison with the manual counting method

  16. Qualitative and quantitative measurement of human brain activity using pixel subtraction algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin Myoung; Jeong, Gwang Woo; Kim, Hyung Joong; Cho, Seong Hoon; Kang, Heoung Keun; Seo, Jeong Jin; Park, Seung Jin [School of Medicine, Chonnam National Univ., Kwangju (Korea, Republic of)

    2004-08-01

    manual quantification method showed an average error of 0.334±0.007 (%). Thus, the manual counting method gave less accurate quantitative information on brain activation than the FALBA program. The FALBA program is capable of providing accurate quantitative results, including the identification of the brain activation region and lateralization index with respect to the functional and anatomical areas. Also, the processing time was dramatically shortened in comparison with the manual counting method.

  17. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  18. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  19. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    International Nuclear Information System (INIS)

    Lazariev, A; Graveron-Demilly, D; Allouche, A-R; Aubert-Frécon, M; Fauvelle, F; Piotto, M; Elbayed, K; Namer, I-J; Van Ormondt, D

    2011-01-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1 H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed

  20. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    Science.gov (United States)

    Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
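
    The correction step, maximizing the normalized cross-correlation between a simulated basis signal and the measured one, can be sketched in one dimension (a brute-force integer-point search on magnitude data; QM-QUEST operates on the actual metabolite signals with quantum-mechanically simulated bases):

    ```python
    import numpy as np

    def align_basis_signal(basis: np.ndarray, measured: np.ndarray, max_shift: int):
        """Find the point shift of the basis signal that maximizes its
        normalized cross-correlation with the measured signal."""
        m = (measured - measured.mean()) / measured.std()
        best = (0, -np.inf)
        for s in range(-max_shift, max_shift + 1):
            b = np.roll(basis, s)
            b = (b - b.mean()) / b.std()
            score = float(np.dot(b, m)) / len(m)
            if score > best[1]:
                best = (s, score)
        return best

    x = np.linspace(-5, 5, 512)
    peak = np.exp(-x**2 / 0.02)  # toy resonance line
    print(align_basis_signal(np.roll(peak, -7), peak, max_shift=20))  # (7, ~1.0)
    ```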

  1. The profile of quantitative risk indicators in Krsko NPP

    International Nuclear Information System (INIS)

    Vrbanic, I.; Basic, I.; Bilic-Zabric, T.; Spiler, J.

    2004-01-01

    During the past decade, a strong initiative has been observed aimed at incorporating information on risk into various aspects of nuclear power plant operation. The initiative was observable in activities carried out by regulators as well as by utilities and industry. It resulted in establishing the process, or procedure, often referred to as integrated decision making or risk-informed decision making. In this process, engineering analyses and evaluations that are usually termed traditional, and that rely on considerations of safety margins and defense in depth, are supplemented by quantitative indicators of risk. Throughout the process, plant risk has most commonly been expressed in terms of the likelihood of events involving damage to the reactor core and of events with radiological releases to the environment. These became the two commonly used quantitative indicators, or metrics, of plant risk (or, reciprocally, plant safety). They were evaluated for their magnitude (e.g. the expected number of events per specified time interval) as well as their profile (e.g. the types of contributing events). The information for quantitative risk indicators (to be used in the risk-informing process) is obtained from the plant's probabilistic safety analyses or analyses of hazards, and it depends on issues such as the availability of input data and the quality of the model or analysis. The Krsko nuclear power plant has recently performed a Periodic Safety Review, which was a good opportunity to evaluate and integrate plant-specific information on quantitative risk indicators and their profile. The paper discusses some aspects of the quantitative plant risk profile and its perception. (author)

  2. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    International Nuclear Information System (INIS)

    Vekemans, B.; Vincze, L.; Somogyi, A.; Drakopoulos, M.; Kempenaers, L.; Simionovici, A.; Adams, F.

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem

  3. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    Energy Technology Data Exchange (ETDEWEB)

    Vekemans, B. E-mail: vekemans@uia.ua.ac.be; Vincze, L.; Somogyi, A.; Drakopoulos, M.; Kempenaers, L.; Simionovici, A.; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem.

  4. Quantitative characterization of conformational-specific protein-DNA binding using a dual-spectral interferometric imaging biosensor

    Science.gov (United States)

    Zhang, Xirui; Daaboul, George G.; Spuhler, Philipp S.; Dröge, Peter; Ünlü, M. Selim

    2016-03-01

    DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.

  5. Quantitative measurements in laser-induced plasmas using optical probing. Final report

    International Nuclear Information System (INIS)

    Sweeney, D.W.

    1981-01-01

    Optical probing of laser induced plasmas can be used to quantitatively reconstruct electron number densities and magnetic fields. Numerical techniques for extracting quantitative information from the experimental data are described. A computer simulation of optical probing is used to determine the quantitative information that can be reasonably extracted from real experimental interferometric systems to reconstruct electron number density distributions. An example of a reconstructed interferogram shows a steepened electron distribution due to radiation pressure effects
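
    For a cylindrically symmetric plasma, the classical route from a side-on phase profile to a radial density profile is the Abel inversion. A naive numerical sketch (toy phase profile, no noise regularization, and the phase-to-density conversion factor is omitted):

    ```python
    import numpy as np

    def abel_invert(phase: np.ndarray, dy: float) -> np.ndarray:
        """Crude Abel inversion: f(r) = -(1/pi) * integral_r^R of
        (dphi/dy) / sqrt(y^2 - r^2) dy, discretized by skipping the
        singular point. Real codes regularize, since the derivative
        strongly amplifies measurement noise."""
        dphi = np.gradient(phase, dy)
        y = np.arange(len(phase)) * dy
        out = np.zeros_like(phase)
        for i, r in enumerate(y):
            js = np.arange(i + 1, len(phase))
            if js.size:
                out[i] = -np.sum(dphi[js] / np.sqrt(y[js] ** 2 - r ** 2)) * dy / np.pi
        return out

    phase = np.exp(-np.linspace(0, 3, 200) ** 2)  # toy Gaussian phase profile
    print(abel_invert(phase, dy=1e-5)[:3])
    ```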

  6. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    Science.gov (United States)

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, E max ), quantitative DCE-MRI parameters (volume transfer constant, K trans ; interstitial volume, V e ; and efflux rate constant, K ep ), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the E max values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The K trans values decreased significantly compared with those of the control group from week 3 (p < 0.05). Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter K trans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
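
    The quantitative parameters above (K trans, V e, K ep) are defined by fitting a pharmacokinetic model to the enhancement curves. A minimal sketch of the standard Tofts model in which they live (the arterial input function and all numbers are toy placeholders, not the study's protocol):

    ```python
    import numpy as np

    def tofts_concentration(t, ktrans, ve, cp):
        """Standard Tofts model: Ct(t) = Ktrans * conv(Cp, exp(-kep*t)),
        with kep = Ktrans / ve, discretized on a uniform time grid."""
        dt = t[1] - t[0]
        kernel = np.exp(-(ktrans / ve) * t)
        return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

    t = np.arange(0, 300, 1.0)                 # seconds
    cp = 5.0 * (t / 30.0) * np.exp(-t / 30.0)  # toy arterial input function
    ct = tofts_concentration(t, ktrans=0.05 / 60, ve=0.3, cp=cp)  # Ktrans in 1/s
    print("peak tissue concentration:", round(float(ct.max()), 4))
    ```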

  7. Accurate thermoelastic tensor and acoustic velocities of NaCl

    Energy Technology Data Exchange (ETDEWEB)

    Marcondes, Michel L., E-mail: michel@if.usp.br [Physics Institute, University of Sao Paulo, Sao Paulo, 05508-090 (Brazil); Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Shukla, Gaurav, E-mail: shukla@physics.umn.edu [School of Physics and Astronomy, University of Minnesota, Minneapolis, 55455 (United States); Minnesota supercomputer Institute, University of Minnesota, Minneapolis, 55455 (United States); Silveira, Pedro da [Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Wentzcovitch, Renata M., E-mail: wentz002@umn.edu [Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Minnesota supercomputer Institute, University of Minnesota, Minneapolis, 55455 (United States)

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  8. Heterogeneity mapping of protein expression in tumors using quantitative immunofluorescence.

    Science.gov (United States)

    Faratian, Dana; Christiansen, Jason; Gustavson, Mark; Jones, Christine; Scott, Christopher; Um, InHwa; Harrison, David J

    2011-10-25

    Morphologic heterogeneity within an individual tumor is well-recognized by histopathologists in surgical practice. While this often takes the form of areas of distinct differentiation into recognized histological subtypes, or different pathological grade, often there are more subtle differences in phenotype which defy accurate classification (Figure 1). Ultimately, since morphology is dictated by the underlying molecular phenotype, areas with visible differences are likely to be accompanied by differences in the expression of proteins which orchestrate cellular function and behavior, and therefore, appearance. The significance of visible and invisible (molecular) heterogeneity for prognosis is unknown, but recent evidence suggests that, at least at the genetic level, heterogeneity exists in the primary tumor(1,2), and some of these sub-clones give rise to metastatic (and therefore lethal) disease. Moreover, some proteins are measured as biomarkers because they are the targets of therapy (for instance ER and HER2 for tamoxifen and trastuzumab (Herceptin), respectively). If these proteins show variable expression within a tumor then therapeutic responses may also be variable. The widely used histopathologic scoring schemes for immunohistochemistry either ignore, or numerically homogenize the quantification of protein expression. Similarly, in destructive techniques, where the tumor samples are homogenized (such as gene expression profiling), quantitative information can be elucidated, but spatial information is lost. Genetic heterogeneity mapping approaches in pancreatic cancer have relied either on generation of a single cell suspension(3), or on macrodissection(4). A recent study has used quantum dots in order to map morphologic and molecular heterogeneity in prostate cancer tissue(5), providing proof of principle that morphology and molecular mapping is feasible, but falling short of quantifying the heterogeneity. Since immunohistochemistry is, at best, only semi-quantitative

  9. An efficient and accurate 3D displacements tracking strategy for digital volume correlation

    Science.gov (United States)

    Pan, Bing; Wang, Bo; Wu, Dafang; Lubineau, Gilles

    2014-07-01

    Owing to its inherent computational complexity, practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces important challenges in improving its computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved by using three improvements. First, to eliminate the need of updating the Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure that the 3D IC-GN algorithm converges accurately and rapidly and to avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer an accurate and complete initial guess of deformation to each calculation point from its computed neighbors. Third, to avoid the repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexity of the proposed fast DVC and the existing typical DVC algorithms are first analyzed quantitatively according to necessary arithmetic operations. Then, numerical tests are performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm produces similar precision and slightly higher accuracy at a substantially reduced computational cost.

  10. An efficient and accurate 3D displacements tracking strategy for digital volume correlation

    KAUST Repository

    Pan, Bing

    2014-07-01

    Owing to its inherent computational complexity, the practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces the important challenge of improving computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved through three improvements. First, to eliminate the need to update the Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure that the 3D IC-GN algorithm converges accurately and rapidly, and to avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer an accurate and complete initial guess of the deformation to each calculation point from its computed neighbors. Third, to avoid repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexities of the proposed fast DVC and of existing typical DVC algorithms are first analyzed quantitatively in terms of the necessary arithmetic operations. Numerical tests are then performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm achieves similar precision and slightly higher accuracy at a substantially reduced computational cost. © 2014 Elsevier Ltd.

  11. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique for analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have only qualitatively mapped whether elements are present at specific sites. Obtaining calibrated EDS STEM maps on an absolute scale is a difficult task, and even then, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show, using simulation, how tilting the specimen (or incident probe) can reduce the effects of scattering and provide quantitative information about the specimen. We discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession-averaged EDS STEM mapping.
    Highlights:
    • Signal obtained in EDS STEM maps (of STO) compared to the non-channelling signal.
    • Deviation from the non-channelling signal occurs in on-axis experiments.
    • Tilting the specimen: the signal is close to the non-channelling case but atomic resolution is lost.
    • Tilt-precession series: the non-channelling signal and atomic-resolution features are obtained.
    • Associated issues are discussed.

  12. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    Energy Technology Data Exchange (ETDEWEB)

    Omata, Noriyasu; Shirayama, Susumu, E-mail: omata@nakl.t.u-tokyo.ac.jp, E-mail: sirayama@sys.t.u-tokyo.ac.jp [Department of Systems Innovation, School of Engineering, The University of Tokyo, Hongo 7-3-1, Bunkyo-ku, Tokyo, 113-8656 (Japan)

    2017-10-15

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)

  13. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    International Nuclear Information System (INIS)

    Omata, Noriyasu; Shirayama, Susumu

    2017-01-01

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)

  14. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) in almost all cases. The sub-model method, using partial least squares (PLS) regression, is part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.
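
    A minimal sketch of the sub-model idea follows, using scikit-learn's PLS regression; the composition ranges and blending rule below are illustrative assumptions, not the ChemCam boundaries or weights. A full-range model routes each spectrum, and the predictions of every sub-model whose range contains that rough estimate are averaged.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical composition ranges (wt.% of the element being calibrated);
# the actual ChemCam sub-model boundaries differ.
RANGES = {"low": (0.0, 10.0), "mid": (5.0, 40.0), "high": (30.0, 100.0)}

def train_submodels(X, y, n_components=6):
    """Train one PLS sub-model per restricted composition range, plus a
    full-range model used only to route unknown spectra."""
    models = {name: PLSRegression(n_components=n_components).fit(
                  X[(y >= lo) & (y <= hi)], y[(y >= lo) & (y <= hi)])
              for name, (lo, hi) in RANGES.items()}
    models["full"] = PLSRegression(n_components=n_components).fit(X, y)
    return models

def blended_predict(models, X):
    """Route each spectrum with the full-range model, then average the
    predictions of every sub-model whose range contains that estimate."""
    rough = models["full"].predict(X).ravel()
    blended = np.empty_like(rough)
    for i, r in enumerate(rough):
        preds = [models[name].predict(X[i:i + 1]).ravel()[0]
                 for name, (lo, hi) in RANGES.items() if lo <= r <= hi]
        blended[i] = np.mean(preds) if preds else rough[i]
    return blended
```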

  15. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.

  16. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  17. Development of the Japanese version of an information aid to provide accurate information on prognosis to patients with advanced non-small-cell lung cancer receiving chemotherapy: a pilot study.

    Science.gov (United States)

    Nakano, Kikuo; Kitahara, Yoshihiro; Mito, Mineyo; Seno, Misato; Sunada, Shoji

    2018-02-27

    Without explicit prognostic information, patients may overestimate their life expectancy and make poor choices at the end of life. We sought to design the Japanese version of an information aid (IA) to provide accurate information on prognosis to patients with advanced non-small-cell lung cancer (NSCLC) and to assess the effects of the IA on hope, psychosocial status, and perception of curability. We developed the Japanese version of an IA, which provided information on survival and cure rates as well as numerical survival estimates for patients with metastatic NSCLC receiving first-line chemotherapy. We then assessed the pre- and post-intervention effects of the IA on hope, anxiety, and perception of curability and treatment benefits. A total of 20 (95%) of 21 patients (65% male; median age, 72 years) completed the IA pilot test. Scores on the Distress and Impact Thermometer screening tool for adjustment disorders and major depression tended to decrease (from 4.5 to 2.5; p = 0.204), whereas no significant changes were seen in scores for anxiety on the Japanese version of the Support Team Assessment Schedule or in scores on the Herth Hope Index (from 41.9 to 41.5; p = 0.204). The majority of the patients (16/20, 80%) had high expectations regarding the curative effects of chemotherapy. The Japanese version of the IA appeared to help patients with NSCLC maintain hope, and did not increase their anxiety when they were given explicit prognostic information; however, the IA did not appear to help such patients understand the goal of chemotherapy. Further research is needed to test these findings in a larger sample and to measure the outcomes of explicit prognostic information on hope, psychological status, and perception of curability.

  18. How predictive quantitative modelling of tissue organisation can inform liver disease pathogenesis.

    Science.gov (United States)

    Drasdo, Dirk; Hoehme, Stefan; Hengstler, Jan G

    2014-10-01

    Of the more than 100 liver diseases described, many of those with high incidence rates manifest themselves through histopathological changes, such as hepatitis, alcoholic liver disease, fatty liver disease, fibrosis and, in later stages, cirrhosis, hepatocellular carcinoma, primary biliary cirrhosis and other disorders. Studies of disease pathogeneses are largely based on integrating -omics data pooled from cells at different locations with spatial information from stained liver structures in animal models. Even though this has led to significant insights, the complexity of the interactions, as well as the involvement of processes at many different time and length scales, constrains the possibility of condensing disease processes in illustrations, schemes and tables. The combination of modern imaging modalities with image processing and analysis, and with mathematical models, opens up a promising new approach towards a quantitative understanding of pathologies and disease processes. This strategy is discussed for two examples: ammonia metabolism after drug-induced acute liver damage, and the recovery of liver mass and architecture during the subsequent regeneration process. This interdisciplinary approach permits the integration of biological mechanisms and models of processes contributing to disease progression at various scales into mathematical models. These can be used to perform in silico simulations that help unravel the relation between architecture and function, as illustrated below for liver regeneration, and to bridge from the in vitro situation and animal models to humans. In the near future, novel mechanisms will usually not be elucidated directly by modelling; models will, however, falsify hypotheses and guide towards the most informative experimental design. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  19. Quantitatively accurate activity measurements with a dedicated cardiac SPECT camera: Physical phantom experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pourmoghaddas, Amir, E-mail: apour@ottawaheart.ca; Wells, R. Glenn [Physics Department, Carleton University, Ottawa, Ontario K1S 5B6, Canada and Cardiology, The University of Ottawa Heart Institute, Ottawa, Ontario K1Y4W7 (Canada)

    2016-01-15

    Purpose: Recently, there has been increased interest in dedicated cardiac single photon emission computed tomography (SPECT) scanners with pinhole collimation and improved detector technology, owing to their improved count sensitivity and resolution over traditional parallel-hole cameras. With traditional cameras, energy-based approaches are often used in the clinic for scatter compensation because they are fast and easily implemented. Some of the cardiac cameras use cadmium-zinc-telluride (CZT) detectors, which can complicate the use of energy-based scatter correction (SC) due to the low-energy tail—an increased number of unscattered photons detected with reduced energy. Modified energy-based scatter correction methods can be implemented, but their level of accuracy is unclear. In this study, the authors validated, through physical phantom experiments, the quantitative accuracy and reproducibility of easily implemented correction techniques applied to {sup 99m}Tc myocardial imaging with a CZT-detector-based gamma camera with multiple heads, each with a single-pinhole collimator. Methods: Activity in the cardiac compartment of an Anthropomorphic Torso phantom (Data Spectrum Corporation) was measured through 15 {sup 99m}Tc-SPECT acquisitions. The ratio of activity concentrations in organ compartments resembled a clinical {sup 99m}Tc-sestamibi scan and was kept consistent across all experiments (1.2:1 heart to liver and 1.5:1 heart to lung). Two background activity levels were considered: no activity (cold) and an activity concentration one-tenth that of the heart (hot). A plastic “lesion” was placed inside the septal wall of the myocardial insert to simulate a region without tracer uptake, and contrast in this lesion was calculated for all images. The true net activity in each compartment was measured with a dose calibrator (CRC-25R, Capintec, Inc.). A 10 min SPECT image was acquired using a dedicated cardiac camera with CZT detectors (Discovery NM530c, GE

  20. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In a previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), expressed as the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute over a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze this information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver function. Information was obtained every quarter minute over 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. The effective hepatic blood flow was obtained as the disappearance rate multiplied by the percentage of hepatic uptake, (liver counts)/(total counts of the field). Our analysis method automatically recorded the disappearance and uptake curves on the basis of the heart and the whole liver, respectively, with the computations written in BASIC. This method makes it possible to obtain an image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)
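
    The calculation described here reduces to two quantities: a blood disappearance rate coefficient from the initial slope of the heart curve, and a hepatic uptake fraction. The sketch below restates it in Python under stated assumptions (mono-exponential early blood clearance, counts already decay- and background-corrected); the fitting window and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def effective_hepatic_blood_flow(t_min, heart_counts, liver_counts,
                                 total_counts, fit_window=(1.0, 5.0)):
    """Disappearance rate k from the initial slope of the heart (blood-pool)
    curve on a log scale, multiplied by the hepatic uptake fraction,
    liver counts / total field counts."""
    lo, hi = fit_window
    sel = (t_min >= lo) & (t_min <= hi)
    # Mono-exponential clearance: ln C(t) = ln C0 - k*t, so slope = -k
    k = -np.polyfit(t_min[sel], np.log(heart_counts[sel]), 1)[0]
    uptake_fraction = liver_counts[-1] / total_counts[-1]
    return k * uptake_fraction   # per-minute effective hepatic blood flow index
```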

  1. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    Science.gov (United States)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (flow = 0.5, 1, 2, 3 ml/g/min; cardiac output = 3, 5, 8 L/min). CT acquisitions were simulated with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels for dynamic acquisitions, with scenarios including 1, 2, and 3 s sampling for 30 s at 25, 70, and 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated, and multiple MBF estimation methods were applied to each. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE = ~1.2 ml/g/min for each). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had a comparable impact on MBF estimate fidelity. On average, half-dose acquisitions increased the RMSE of estimates by only 18%, suggesting that substantial dose reductions can be employed in the context of quantitative myocardial
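
    As a sketch of what quantitative kinetic modeling of a time-attenuation curve can look like, the fragment below fits a one-compartment model (flow-scaled convolution of the arterial input function with a mono-exponential residue) using SciPy. It is an illustration under stated assumptions — uniform time sampling, a known arterial input function, unit tissue density — and not the simulator or estimators used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_compartment_tac(t, mbf, k_washout, aif):
    """Myocardial time-attenuation curve as an MBF-scaled convolution of
    the arterial input function with a mono-exponential residue function
    (assumes uniformly sampled t)."""
    dt = t[1] - t[0]
    residue = np.exp(-k_washout * t)
    return mbf * np.convolve(aif, residue)[:len(t)] * dt

def estimate_mbf(t, aif, myo_tac):
    """Least-squares fit of flow and washout to a measured curve; the
    returned flow is in model units (scaling to ml/g/min depends on
    calibration and tissue density)."""
    popt, _ = curve_fit(lambda tt, f, k: one_compartment_tac(tt, f, k, aif),
                        t, myo_tac, p0=[1.0, 0.1], bounds=(0, np.inf))
    return popt[0]
```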

  2. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease.

    Science.gov (United States)

    Margolin, Ezra J; Mlynarczyk, Carrie M; Mulhall, John P; Stember, Doron S; Stahl, Peter J

    2017-06-01

    Non-curvature penile deformities are prevalent and bothersome manifestations of Peyronie's disease (PD), but the quantitative metrics that are currently used to describe these deformities are inadequate and non-standardized, presenting a barrier to clinical research and patient care. To introduce erect penile volume (EPV) and percentage of erect penile volume loss (percent EPVL) as novel metrics that provide detailed quantitative information about non-curvature penile deformities and to study the feasibility and reliability of three-dimensional (3D) photography for measurement of quantitative penile parameters. We constructed seven penis models simulating deformities found in PD. The 3D photographs of each model were captured in triplicate by four observers using a 3D camera. Computer software was used to generate automated measurements of EPV, percent EPVL, penile length, minimum circumference, maximum circumference, and angle of curvature. The automated measurements were statistically compared with measurements obtained using water-displacement experiments, a tape measure, and a goniometer. Accuracy of 3D photography for average measurements of all parameters compared with manual measurements; inter-test, intra-observer, and inter-observer reliabilities of EPV and percent EPVL measurements as assessed by the intraclass correlation coefficient. The 3D images were captured in a median of 52 seconds (interquartile range = 45-61). On average, 3D photography was accurate to within 0.3% for measurement of penile length. It overestimated maximum and minimum circumferences by averages of 4.2% and 1.6%, respectively; overestimated EPV by an average of 7.1%; and underestimated percent EPVL by an average of 1.9%. All inter-test, inter-observer, and intra-observer intraclass correlation coefficients for EPV and percent EPVL measurements were greater than 0.75, reflective of excellent methodologic reliability. By providing highly descriptive and reliable measurements of

  3. An accurate method for quantifying and analyzing copy number variation in porcine KIT by an oligonucleotide ligation assay

    Directory of Open Access Journals (Sweden)

    Cho In-Cheol

    2007-11-01

    Background: Aside from single nucleotide polymorphisms, copy number variations (CNVs) are the most important factors in susceptibility to genetic disorders because they affect the expression levels of genes. In previous studies, pyrosequencing, mini-sequencing, real-time PCR, invader assays and other techniques have been used to detect CNVs. However, the higher the copy number in a genome, the more difficult it is to resolve the copies, so a more accurate method for measuring CNVs and assigning genotypes is needed. Results: PCR followed by a quantitative oligonucleotide ligation assay (qOLA) was developed for quantifying CNVs. The accuracy and precision of the assay were evaluated for porcine KIT, which was selected as a model locus. Overall, the root mean squares of bias and standard deviation of qOLA were 2.09 and 0.45, respectively. These values are less than half of those of the published pyrosequencing assay for analyzing CNV in porcine KIT. Using qOLA combined with another pyrosequencing assay for quantitative analysis of KIT copies with spliced forms, we confirmed the segregation of KIT alleles in 145 F1 animals with pedigree information and verified the correct assignment of genotypes. In a diagnostic test on 100 randomly sampled commercial pigs, there was perfect agreement between the genotypes obtained by grouping observations on a scatter plot and by clustering using the nearest centroid sorting method implemented in PROC FASTCLUS of the SAS package. In a test on 159 Large White pigs, there were only two discrepancies between genotypes assigned by the two clustering methods (98.7% agreement), confirming that the quantitative ligation assay established here makes genotyping possible through the accurate measurement of high KIT copy numbers (>4 per diploid genome). Moreover, the assay is sensitive enough for use on DNA from hair follicles, indicating that DNA from various sources could be used. Conclusion: We have established a high
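
    The nearest-centroid genotype assignment mentioned above (PROC FASTCLUS in the original study) has a close analogue in scikit-learn; the sketch below is a hypothetical, minimal re-creation in which a few pedigree-confirmed animals seed the class centroids.

```python
import numpy as np
from sklearn.neighbors import NearestCentroid

def assign_genotypes(copy_numbers, confirmed):
    """Assign KIT copy-number genotypes by nearest centroid, in the spirit
    of the PROC FASTCLUS step in the study. `copy_numbers` maps animal id
    to a measured copy-number ratio; `confirmed` maps a few
    pedigree-verified animals to their known genotype labels."""
    X_seed = np.array([[copy_numbers[a]] for a in confirmed])
    y_seed = list(confirmed.values())
    clf = NearestCentroid().fit(X_seed, y_seed)
    X_all = np.array([[v] for v in copy_numbers.values()])
    return dict(zip(copy_numbers, clf.predict(X_all)))
```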

  4. What information on measurement uncertainty should be communicated to clinicians, and how?

    Science.gov (United States)

    Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea

    2018-02-02

    The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase, in order to assure the correct interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratory accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result; it expresses the lack of knowledge of the true value and quantifies the uncertainty of a result, incorporating the factors known to influence it. Even though MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as information that should facilitate an appropriate interpretation of quantitative results (quantity values). MU therefore has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of compliance with analytical performance characteristics; for physicians (and patients), it may help in the interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, by providing objective information. Here we describe how MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results, connecting the efforts performed within the laboratory to provide more accurate and reliable results with a more objective tool for their interpretation by physicians. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  5. Tomosynthesis can facilitate accurate measurement of joint space width under the condition of the oblique incidence of X-rays in patients with rheumatoid arthritis.

    Science.gov (United States)

    Ono, Yohei; Kashihara, Rina; Yasojima, Nobutoshi; Kasahara, Hideki; Shimizu, Yuka; Tamura, Kenichi; Tsutsumi, Kaori; Sutherland, Kenneth; Koike, Takao; Kamishima, Tamotsu

    2016-06-01

    Accurate evaluation of joint space width (JSW) is important in the assessment of rheumatoid arthritis (RA). In clinical radiography of bilateral hands, the oblique incidence of X-rays is unavoidable, which may cause perceptional or measurement errors in JSW. The objective of this study was to examine whether tomosynthesis, a recently developed modality, can facilitate a more accurate evaluation of JSW than radiography under the condition of oblique incidence of X-rays. We investigated the quantitative errors derived from the oblique incidence of X-rays by imaging phantoms simulating various finger joint spaces using radiographs and tomosynthesis images. We then compared the qualitative results of the modified total Sharp score for a total of 320 joints from 20 patients with RA between these modalities. The quantitative error was prominent when the location of the phantom was shifted along the JSW direction. Modified total Sharp scores of tomosynthesis images were significantly higher than those of radiography; that is, JSW was regarded as narrower in tomosynthesis than in radiography when finger joints were located where oblique incidence of X-rays is expected in the JSW direction. Tomosynthesis can facilitate accurate evaluation of JSW in the finger joints of patients with RA, even with oblique incidence of X-rays. Accurate evaluation of JSW is necessary for the management of patients with RA. Through phantom and clinical studies, we demonstrate that tomosynthesis may achieve a more accurate evaluation of JSW.

  6. A Technique Using Calibrated Photography and Photoshop for Accurate Shade Analysis and Communication.

    Science.gov (United States)

    McLaren, Edward A; Figueira, Johan; Goldstein, Ronald E

    2017-02-01

    This article reviews the critical aspects of controlling the shade-taking environment and discusses various modalities introduced throughout the years to acquire and communicate shade information. Demonstrating a highly calibrated digital photographic technique for capturing shade information, this article shows how to use Photoshop® to standardize images and extract color information from the tooth and shade tab for use by a ceramist for an accurate shade-matching restoration.

  7. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from the other fields of systems biology. Drawing on substantial resources, quantitative proteomics handles enormous amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing chemical modification of proteins as well as metabolic and enzymatic isotope labeling.

  8. Combining qualitative and quantitative operational research methods to inform quality improvement in pathways that span multiple settings

    Science.gov (United States)

    Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin

    2017-01-01

    Background: Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors, but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors.
    Methods: Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted, along with data visualisation designed to inform service improvement within the context of limited resources.
    Results: A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning.
    Conclusions: When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further
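
    The CART step in this record can be sketched with scikit-learn's decision tree on made-up data; the features, thresholds and outcome coding below are hypothetical stand-ins for the national-audit variables, shown only to illustrate how printed tree rules define patient risk groups at which interventions can be targeted.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical audit-style features; the real national-audit variables differ.
# Columns: [weight_kg, premature (0/1), lesion complexity score]
X = np.array([[3.2, 0, 1], [2.1, 1, 3], [3.8, 0, 2], [1.9, 1, 3],
              [3.5, 0, 1], [2.4, 1, 2], [4.0, 0, 1], [2.0, 1, 3]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])   # 1 = adverse outcome after discharge

tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=2).fit(X, y)
print(export_text(tree, feature_names=["weight_kg", "premature", "complexity"]))
# The printed rules delimit risk groups at which interventions (e.g.,
# enhanced home monitoring) could be targeted.
```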

  9. Accurate location estimation of moving object In Wireless Sensor network

    Directory of Open Access Journals (Sweden)

    Vinay Bhaskar Semwal

    2011-12-01

    One of the central issues in wireless sensor networks is tracking the location of a moving object, which carries the overhead of saving data while accurately estimating the target location under energy constraints. There is no general mechanism to control and maintain these data, and the wireless communication bandwidth is also very limited. Fields using this technique include flood and typhoon detection, forest fire detection, and temperature and humidity monitoring, where the gathered information can be fed back, for example, to a central air conditioning and ventilation system. In this research paper, we propose a protocol based on a prediction-based adaptive algorithm that needs fewer sensor nodes thanks to an accurate estimation of the target location. We show that our tracking method performs well in terms of energy saving regardless of the mobility pattern of the mobile target, and that it extends the lifetime of the network while using fewer sensor nodes. Once a new object is detected, a mobile agent is initiated to track the roaming path of the object.

  10. Quantitative dynamic nuclear polarization‐NMR on blood plasma for assays of drug metabolism

    DEFF Research Database (Denmark)

    Lerche, Mathilde Hauge; Meier, Sebastian; Jensen, Pernille Rose

    2011-01-01

    ‐scan 13C DNP‐NMR. An internal standard is used for the accurate quantification of drug and metabolite. Comparison of quantitative DNP‐NMR data with an established analytical method (liquid chromatography‐mass spectrometry) yields a Pearson correlation coefficient r of 0.99. Notably, all DNP...

  11. BBN based Quantitative Assessment of Software Design Specification

    International Nuclear Information System (INIS)

    Eom, Heung-Seop; Park, Gee-Yong; Kang, Hyun-Gook; Kwon, Kee-Choon; Chang, Seung-Cheol

    2007-01-01

    Probabilistic Safety Assessment (PSA), one of the important methods for assessing the overall safety of a nuclear power plant (NPP), requires quantitative reliability information for safety-critical software, but conventional reliability assessment methods cannot provide enough information for the PSA of an NPP. Therefore, current PSAs that include safety-critical software either do not consider the reliability of the software or use arbitrary values for it. To address this situation, this paper proposes a method that can produce quantitative reliability information on safety-critical software for PSA by making use of Bayesian Belief Networks (BBNs). BBNs have generally been used to model uncertain systems in many research fields, including the safety assessment of software. The proposed method utilizes a BBN that can combine the qualitative and quantitative evidence relevant to the reliability of safety-critical software. The constructed BBN model can infer a conclusion in a formal and quantitative way. A case study was carried out with the proposed method to assess the quality of the software design specification (SDS) of safety-critical software to be embedded in a reactor protection system. The intermediate verification and validation (V&V) results for the software design specification were used as inputs to the BBN model
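
    A toy illustration of the approach — combining qualitative evidence into a quantitative estimate via a BBN — can be written with the pgmpy library. The network structure, node names and all probabilities below are invented for illustration; they are not the model or numbers from the paper.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy two-parent network: development-process quality and V&V evidence both
# influence SDS quality. Structure and all probabilities are invented.
model = BayesianNetwork([("Process", "SDS"), ("VnV", "SDS")])
cpd_process = TabularCPD("Process", 2, [[0.7], [0.3]])   # state 0 = good
cpd_vnv = TabularCPD("VnV", 2, [[0.8], [0.2]])           # state 0 = passed
cpd_sds = TabularCPD(
    "SDS", 2,
    [[0.95, 0.60, 0.50, 0.10],    # P(SDS = good | Process, VnV)
     [0.05, 0.40, 0.50, 0.90]],
    evidence=["Process", "VnV"], evidence_card=[2, 2])
model.add_cpds(cpd_process, cpd_vnv, cpd_sds)
assert model.check_model()

# Combine qualitative V&V evidence into a quantitative quality estimate.
posterior = VariableElimination(model).query(["SDS"], evidence={"VnV": 0})
print(posterior)
```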

  12. Qualitative and quantitative characterization of plasma proteins when incorporating traveling wave ion mobility into a liquid chromatography-mass spectrometry workflow for biomarker discovery: use of product ion quantitation as an alternative data analysis tool for label free quantitation.

    Science.gov (United States)

    Daly, Charlotte E; Ng, Leong L; Hakimi, Amirmansoor; Willingale, Richard; Jones, Donald J L

    2014-02-18

    Discovery of protein biomarkers in clinical samples necessitates significant prefractionation prior to liquid chromatography-mass spectrometry (LC-MS) analysis. Integrating traveling wave ion mobility spectrometry (TWIMS) enables in-line gas phase separation which when coupled with nanoflow liquid chromatography and data independent acquisition tandem mass spectrometry, confers significant advantages to the discovery of protein biomarkers by improving separation and inherent sensitivity. Incorporation of TWIMS leads to a packet of concentrated ions which ultimately provides a significant improvement in sensitivity. As a consequence of ion packeting, when present at high concentrations, accurate quantitation of proteins can be affected due to detector saturation effects. Human plasma was analyzed in triplicate using liquid-chromatography data independent acquisition mass spectrometry (LC-DIA-MS) and using liquid-chromatography ion-mobility data independent acquisition mass spectrometry (LC-IM-DIA-MS). The inclusion of TWIMS was assessed for the effect on sample throughput, data integrity, confidence of protein and peptide identification, and dynamic range. The number of identified proteins is significantly increased by an average of 84% while both the precursor and product mass accuracies are maintained between the modalities. Sample dynamic range is also maintained while quantitation is achieved for all but the most abundant proteins by incorporating a novel data interpretation method that allows accurate quantitation to occur. This additional separation is all achieved within a workflow with no discernible deleterious effect on throughput. Consequently, TWIMS greatly enhances proteome coverage and can be reliably used for quantification when using an alternative product ion quantification strategy. Using TWIMS in biomarker discovery in human plasma is thus recommended.

  13. Quantitation of the human basal ganglia with Positron Emission Tomography

    International Nuclear Information System (INIS)

    Bendriem, B.; Dewey, S.L.; Schlyer, D.J.; Wolf, A.P.; Volkow, N.D.

    1990-01-01

    The accurate measurement of the concentration of a radioisotope in small structures with PET requires a correction for quantitation loss due to the partial volume effect and the effect of scattered radiation. To evaluate the errors associated with measurements in the human basal ganglia (BG), we built a unilateral model of the BG and inserted it in a 20 cm cylinder. The recovery coefficient (RC = measured activity/true activity) for our BG phantom was measured on a CTI tomograph (model 931-08/12) with different background concentrations (contrast) and at different axial locations in the gantry. The BG was visualized on 4 or 5 slices depending on its position in the gantry and on the contrast used. The RC was 0.75 with no background (contrast equal to 1.0). Increasing the relative radioactivity concentration in the background increased the RC from 0.75 to 2.00 when the contrast was -0.7. These results show that accurate RC correction depends not only on the volume of the structure but also on its contrast with its surroundings, as well as on the selection of the ROI. They also demonstrate that the higher the contrast, the more sensitive PET measurements in the BG are to axial positioning. These data provide some information about the variability of PET measurements in small structures like the BG, and we propose some strategies to improve reproducibility. 18 refs., 3 figs., 5 tabs
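
    Once an RC appropriate to the structure's volume, contrast and ROI has been established, applying it is a one-line correction; the sketch below simply restates the definition in code, with made-up numbers.

```python
def recovery_corrected_activity(measured, rc):
    """Correct a PET measurement for partial-volume loss: RC is defined as
    measured/true activity, so the true activity is measured / RC."""
    return measured / rc

# With the abstract's no-background phantom value, RC = 0.75:
print(recovery_corrected_activity(37.5, 0.75))   # -> 50.0 (arbitrary units)
```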

  14. How Novel Methods Can Help Discover More Information about Foodborne Pathogens

    Directory of Open Access Journals (Sweden)

    Mansel W Griffiths

    2000-01-01

    Considerable emphasis is being placed on quantitative risk assessment modelling as a basis for regulating trade in food products. However, for models to be accurate, information about the behaviour of potential pathogens in foods needs to be available. The question is how to obtain this knowledge in a simple and cost-effective way. One technique with great potential is the use of reporter bacteria that have been genetically modified to express a phenotype that can be easily monitored, such as light production in luminescent organisms. Bacteria carrying these (lux) genes can easily be detected using simple luminometers or more sophisticated low-light imaging equipment.

  15. Quantitative Classification of Rice (Oryza sativa L.) Root Length and Diameter Using Image Analysis.

    Science.gov (United States)

    Gu, Dongxiang; Zhen, Fengxian; Hannaway, David B; Zhu, Yan; Liu, Leilei; Cao, Weixing; Tang, Liang

    2017-01-01

    Quantitative study of the root morphological characteristics of plants is helpful for understanding the relationships between their morphology and function. However, few studies have reported detailed and accurate information on root characteristics in fine-rooted plants like rice (Oryza sativa L.). The aims of this study were to quantitatively classify fine lateral roots (FLRs), thick lateral roots (TLRs), and nodal roots (NRs) and to analyze the dynamics of their mean diameter (MD), lengths and surface area percentages (SAPs) across growth stages in the rice plant. Pot experiments were carried out over three years with three rice cultivars, three nitrogen (N) rates and three water regimes. Among the three cultivars, the root length of 'Yangdao 6' was the longest, while the MD of its FLRs was the smallest and the mean diameters of its TLRs and NRs were the largest; the SAP of its TLRs (SAPT) was the highest, indicating that Yangdao 6 has better nitrogen and water uptake ability. A high N rate increased the length of the different root types and the MD of lateral roots, and decreased the SAP of FLRs (SAPF) and of TLRs while increasing the SAP of NRs (SAPN). A moderate decrease in water supply increased root length and diameter; water stress increased SAPF and SAPT but decreased SAPN. These quantitative results indicate that the rice plant tends to increase lateral roots, gaining more surface area for nitrogen and water uptake, when available assimilates are limited under nitrogen and water stress.

  16. Abortion and mental health: quantitative synthesis and analysis of research published 1995-2009.

    Science.gov (United States)

    Coleman, Priscilla K

    2011-09-01

    Given the methodological limitations of recently published qualitative reviews of abortion and mental health, a quantitative synthesis was deemed necessary to represent more accurately the published literature and to provide clarity to clinicians. To measure the association between abortion and indicators of adverse mental health, with subgroup effects calculated based on comparison groups (no abortion, unintended pregnancy delivered, pregnancy delivered) and particular outcomes. A secondary objective was to calculate population-attributable risk (PAR) statistics for each outcome. After the application of methodologically based selection criteria and extraction rules to minimise bias, the sample comprised 22 studies, 36 measures of effect and 877 181 participants (163 831 experienced an abortion). Random effects pooled odds ratios were computed using adjusted odds ratios from the original studies and PAR statistics were derived from the pooled odds ratios. Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour. This review offers the largest quantitative estimate of mental health risks associated with abortion available in the world literature. Calling into question the conclusions from traditional reviews, the results revealed a moderate to highly increased risk of mental health problems after abortion. Consistent with the tenets of evidence-based medicine, this information should inform the delivery of abortion services.
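
    The two quantitative steps named in this abstract — random-effects pooling of adjusted odds ratios and derivation of population-attributable risk — are standard meta-analysis calculations and can be sketched as follows. The DerSimonian-Laird estimator and Levin's PAR formula used here are common textbook choices assumed for illustration; the paper's exact computational details are not reproduced.

```python
import numpy as np

def pooled_or_and_par(ors, ses, p_exposed):
    """DerSimonian-Laird random-effects pooling of log odds ratios,
    followed by Levin's population-attributable-risk formula, treating
    the pooled OR as an approximation to the relative risk."""
    y = np.log(np.asarray(ors))        # per-study log odds ratios
    w = 1.0 / np.asarray(ses) ** 2     # fixed-effect (inverse-variance) weights
    ybar = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - ybar) ** 2)    # Cochran's Q heterogeneity statistic
    tau2 = max(0.0, (q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (np.asarray(ses) ** 2 + tau2)
    pooled_or = np.exp(np.sum(w_re * y) / np.sum(w_re))
    par = p_exposed * (pooled_or - 1.0) / (1.0 + p_exposed * (pooled_or - 1.0))
    return pooled_or, par
```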

  17. A Method for Quantitative Determination of Biofilm Viability

    Directory of Open Access Journals (Sweden)

    Maria Strømme

    2012-06-01

    In this study we present a scheme for the quantitative determination of biofilm viability that offers a significant improvement over existing metabolic-assay methods. Existing metabolic assays for quantifying viable bacteria in biofilms usually rely on calibration curves derived from planktonic bacteria, which can introduce large errors due to significant differences in the metabolic and/or growth rates of biofilm bacteria in the assay media compared with their planktonic counterparts. In the presented method we derive the specific growth rate of Streptococcus mutans biofilm bacteria from a series of metabolic assays using the pH indicator phenol red, and show that this information can be used to more accurately quantify the relative number of viable bacteria in a biofilm. We found that the specific growth rate of S. mutans in the biofilm mode of growth was 0.70 h−1, compared with 1.09 h−1 in planktonic growth. This method should be applicable to other bacteria types, as well as other metabolic assays, and, for example, to quantifying the effect of antibacterial treatments or the performance of bactericidal implant surfaces.
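
    The core calculation — a specific growth rate from the exponential phase of an assay time series — is a log-linear slope, sketched below. How the growth-rate ratio is then folded into the calibration is paraphrased as a simple rescaling, an assumption for illustration rather than the paper's exact procedure.

```python
import numpy as np

def specific_growth_rate(t_hours, biomass_proxy):
    """Specific growth rate mu (1/h) from the slope of ln(X) versus time
    during exponential growth; the proxy could be a metabolic-assay
    readout such as the phenol-red signal."""
    return np.polyfit(t_hours, np.log(biomass_proxy), 1)[0]

# With the study's reported rates, a planktonic calibration curve could be
# rescaled by the growth-rate ratio before being applied to a biofilm
# (a simplifying assumption, not the paper's exact procedure):
mu_biofilm, mu_planktonic = 0.70, 1.09   # 1/h, values from the abstract
correction_factor = mu_planktonic / mu_biofilm
```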

  18. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    Directory of Open Access Journals (Sweden)

    Marcello Manfredi

    2014-07-01

    In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects result mostly from optical and subjective assessment, through comparison of the previous and subsequent states of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI) we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images, while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect morphological damage slightly smaller than 0.3 mm, which would be difficult to detect by eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time.
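
    A natural way to turn two RTI normal maps into a damage map is the per-pixel angle between the before and after normals; the sketch below shows that computation, with the threshold purely illustrative (the paper's own intensity-map derivation may differ).

```python
import numpy as np

def angular_deviation_map(n_before, n_after):
    """Per-pixel angle (degrees) between two H x W x 3 arrays of unit
    surface normals from RTI captures; large angles flag morphological
    change between imaging sessions."""
    dot = np.clip(np.sum(n_before * n_after, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(dot))

# A binary damage mask could then threshold the map (threshold illustrative):
# damage_mask = angular_deviation_map(n0, n1) > 5.0
```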

  19. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  20. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  1. Prediction of Accurate Mixed Mode Fatigue Crack Growth Curves using the Paris' Law

    Science.gov (United States)

    Sajith, S.; Krishna Murthy, K. S. R.; Robi, P. S.

    2017-12-01

    Accurate information regarding crack growth times and structural strength as a function of crack size is mandatory in damage tolerance analysis. Various equivalent stress intensity factor (SIF) models are available for the prediction of mixed-mode fatigue life using Paris' law. In the present investigation these models have been compared to assess how closely their life predictions match experimental findings, since no guidelines or suggestions are available on selecting among these models for accurate and/or conservative predictions of fatigue life. Within the limitations of the available experimental data and of current numerical simulation techniques, the present study attempts to identify models that provide accurate and conservative life predictions.
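
    For reference, the Paris-law life integration underlying such comparisons can be sketched in a few lines: once an equivalent SIF range is chosen, the number of cycles follows from numerically integrating da/dN = C (ΔK)^m between the initial and final crack sizes. The geometry factor, material constants and crack sizes below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def paris_life(a0, af, delta_sigma, C, m, Y=1.0, n=10000):
    """Fatigue life in cycles by numerically integrating Paris' law,
    da/dN = C * (dK)^m, with dK = Y * delta_sigma * sqrt(pi * a).
    Units must be consistent with C and m (here MPa and metres)."""
    a = np.linspace(a0, af, n)                   # crack sizes
    dk = Y * delta_sigma * np.sqrt(np.pi * a)    # SIF range at each size
    return np.trapz(1.0 / (C * dk ** m), a)     # N = integral of da/(C dK^m)

# Illustrative values only (not from the paper):
print(paris_life(a0=1e-3, af=1e-2, delta_sigma=100.0, C=1e-11, m=3.0))
```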

  2. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  3. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    Science.gov (United States)

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative analytical methods are needed as alternatives for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). With an additional fluorescent staining of all proteins immediately after their transfer to the blot membrane, it is possible to visualise simultaneously the antibody binding and the total protein profile, allowing an accurate correction for protein load. Applying this normalisation, we demonstrated that fluorescence-based Western blotting can reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possibilities of employment that go far beyond.
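
    The load-correction step reduces to dividing each target-band intensity by the total-protein signal of its lane; a minimal sketch with hypothetical array inputs:

```python
import numpy as np

def normalized_band_intensity(band, lane_total_protein):
    """Divide each target-band fluorescence by the total-protein stain
    signal of the same lane (loading correction), then express values
    relative to the mean so lanes are comparable across a blot."""
    corrected = np.asarray(band, float) / np.asarray(lane_total_protein, float)
    return corrected / corrected.mean()
```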

  4. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  5. Measuring Accurate Body Parameters of Dressed Humans with Large-Scale Motion Using a Kinect Sensor

    Directory of Open Access Journals (Sweden)

    Sidan Du

    2013-08-01

    Non-contact human body measurement plays an important role in surveillance, physical healthcare, on-line business and virtual fitting. Current methods for measuring the human body without physical contact usually cannot handle humans wearing clothes, which limits their applicability in public environments. In this paper, we propose an effective solution that can measure accurate parameters of the human body with large-scale motion from a Kinect sensor, even when people are wearing clothes. Because motion can make clothes fit the human body loosely or tightly, we adopt a space-time analysis to mine information across the posture variations. Using this information, we recover the human body, regardless of the effect of clothes, and measure the human body parameters accurately. Experimental results show that our system performs more accurate parameter estimation on the human body than state-of-the-art methods.

  6. Parameter Optimization for Quantitative Signal-Concentration Mapping Using Spoiled Gradient Echo MRI

    Directory of Open Access Journals (Sweden)

    Gasser Hathout

    2012-01-01

    Rationale and Objectives: Accurate signal-to-tracer-concentration maps are critical to quantitative MRI. The purpose of this study was to evaluate and optimize spoiled gradient echo (SPGR) MR sequences for the use of gadolinium (Gd-DTPA) as a kinetic tracer.
    Methods: Water-gadolinium phantoms were constructed for a physiologic range of gadolinium concentrations. Observed and calculated SPGR signal-to-concentration curves were generated. Using a percentage error determination, optimal pulse parameters for signal-to-concentration mapping were obtained.
    Results: The accuracy of the SPGR equation is a function of the chosen MR pulse parameters, particularly the time to repetition (TR) and the flip angle (FA). At all experimental values of TR, increasing FA decreases the ratio between the observed and calculated signals. Conversely, for a constant FA, increasing TR increases this ratio. Using optimized pulse parameter sets, it is possible to achieve excellent accuracy (approximately 5%) over a physiologic range of tracer concentrations.
    Conclusion: Optimal pulse parameter sets exist, and their use is essential for deriving accurate signal-to-concentration curves in quantitative MRI.
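
    The SPGR signal equation underlying this work inverts in closed form, which makes the signal-to-concentration mapping easy to sketch: solve for E1 = exp(-TR/T1) from the measured signal, recover T1, and convert to concentration through the relaxivity relation 1/T1 = 1/T10 + r1·C. The r1 value and all inputs below are illustrative assumptions, not the study's calibration.

```python
import numpy as np

def gd_concentration(S, M0, tr_s, fa_deg, t10_s, r1=4.5):
    """Invert the SPGR signal equation for T1 and map T1 to concentration.
    S = M0*sin(a)*(1 - E1)/(1 - cos(a)*E1) with E1 = exp(-TR/T1), so
    E1 = (M0*sin(a) - S)/(M0*sin(a) - S*cos(a)); then
    C = (1/T1 - 1/T10)/r1. r1 (1/s per mM) is an assumed relaxivity."""
    a = np.radians(fa_deg)
    e1 = (M0 * np.sin(a) - S) / (M0 * np.sin(a) - S * np.cos(a))
    t1 = -tr_s / np.log(e1)
    return (1.0 / t1 - 1.0 / t10_s) / r1   # Gd concentration in mM
```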

  7. A simplified and accurate detection of the genetically modified wheat MON71800 with one calibrator plasmid.

    Science.gov (United States)

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2015-06-01

    With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study describes qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 based on a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study supplies a powerful, very simple, and accurate detection strategy for unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid. Copyright © 2014 Elsevier Ltd. All rights reserved.
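
    Because the calibrator plasmid carries both targets, one dilution series yields standard curves for the event-specific and the endogenous assay. The sketch below shows the generic standard-curve arithmetic (Cq versus log copy number) with hypothetical values; it is not the paper's validated protocol:

```python
import numpy as np

def fit_standard_curve(log10_copies, cq):
    """Least-squares fit of Cq = m*log10(copies) + b from a plasmid dilution series."""
    m, b = np.polyfit(log10_copies, cq, 1)
    return m, b

def copies_from_cq(cq, m, b):
    return 10 ** ((cq - b) / m)

# Hypothetical dilution series of the calibrator plasmid
dil = np.array([7, 6, 5, 4, 3, 2])                         # log10 copies per reaction
cq_event = np.array([15.1, 18.4, 21.8, 25.2, 28.6, 32.0])  # MON71800-specific assay
cq_acc   = np.array([14.8, 18.2, 21.6, 25.0, 28.4, 31.8])  # endogenous acc assay

m_e, b_e = fit_standard_curve(dil, cq_event)
m_a, b_a = fit_standard_curve(dil, cq_acc)

# GM content of a test sample as the ratio of event-specific to endogenous copies
gm_ratio = copies_from_cq(27.9, m_e, b_e) / copies_from_cq(24.1, m_a, b_a)
print(f"MON71800 / acc copy ratio: {gm_ratio:.3%}")
```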

  8. Turn-on Fluorescent Probe for Exogenous and Endogenous Imaging of Hypochlorous Acid in Living Cells and Quantitative Application in Flow Cytometry.

    Science.gov (United States)

    Zhan, Zixuan; Liu, Rui; Chai, Li; Li, Qiuyan; Zhang, Kexin; Lv, Yi

    2017-09-05

    Hypochlorous acid (HClO) acts as a dominant microbicidal mediator in the innate immune system, and excess production of hypochlorite is related to a series of diseases. It is therefore important to develop a highly sensitive and selective method for HClO detection in living systems, yet most existing fluorescent probes have been limited to cell imaging. Moreover, accurate quantitative information on HClO in individual cells across a large cell population is extremely important for understanding inflammation and cellular apoptosis. In this work, a turn-on fluorescent probe was synthesized that can selectively and sensitively detect HClO with a fast response time. The probe is almost nonfluorescent, possibly due to both the spirolactam form of fluorescein and unbridged C═N bonds, which can undergo a nonradiative decay process in the excited state. Upon the addition of ClO⁻, the probe was oxidized to its ring-opened fluorescent form and the fluorescence intensity was greatly enhanced. In live-cell experiments, the probe was successfully applied to image exogenous ClO⁻ in HeLa cells and endogenous HClO in RAW 264.7 macrophage cells. In particular, quantitative information on exogenous and endogenous HClO can also be acquired in flow cytometry. Therefore, the probe not only images exogenous and endogenous HClO but also provides a new and promising platform for quantitative HClO detection in flow cytometry.

  9. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    Science.gov (United States)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversion and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
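
    The conversion amounts to undoing the Muhleman-law scaling at each pixel's incidence angle. A hedged sketch follows: the Muhleman constants are the values commonly quoted for Venus, while the DN-to-dB scaling constants (dn_zero, db_per_dn) are placeholders that must be replaced with the values in the Magellan product documentation:

```python
import numpy as np

def muhleman_sigma0(inc_deg, k=0.0118, c=0.111):
    """Muhleman-law backscatter function used to scale Magellan images.

    k and c are the constants commonly quoted for Venus; verify them against
    the data-product documentation before quantitative use.
    """
    th = np.radians(inc_deg)
    return k * np.cos(th) / (np.sin(th) + c * np.cos(th)) ** 3

def dn_to_sigma0(dn, inc_deg, dn_zero=101.0, db_per_dn=0.2):
    """Recover the backscatter coefficient from an image DN value.

    Assumes DN encodes the offset in dB from the Muhleman law at the pixel's
    incidence angle; dn_zero and db_per_dn are placeholder constants.
    """
    offset_db = (np.asarray(dn, dtype=float) - dn_zero) * db_per_dn
    return muhleman_sigma0(inc_deg) * 10 ** (offset_db / 10.0)

# Example: three pixels at 35 degrees incidence
print(dn_to_sigma0([90, 101, 120], inc_deg=35.0))
```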

  10. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    Science.gov (United States)

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
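
    The per-pixel pipeline, normalization against a quadratic calibration, isomeric-interference correction, and limit-of-quantitation filtering, can be sketched as follows. The coefficients, interference fraction, and limits are illustrative, not the authors' values:

```python
import numpy as np

def intensity_to_density(I, a, b, c):
    """Invert a quadratic calibration I = a*d**2 + b*d + c to surface density d,
    taking the non-negative root; a, b, c come from external standards."""
    I = np.asarray(I, dtype=float)
    disc = np.maximum(b**2 - 4*a*(c - I), 0.0)
    return (-b + np.sqrt(disc)) / (2*a)

def quantitative_image(I_img, a, b, c, interference=None, frac=0.0,
                       lloq=0.0, uloq=np.inf):
    """Per-pixel quantitation: interference subtraction, calibration, LOQ filter."""
    I = np.asarray(I_img, dtype=float)
    if interference is not None:
        I = I - frac * np.asarray(interference, dtype=float)  # isomeric overlap
    d = intensity_to_density(np.maximum(I, 0.0), a, b, c)
    d[(d < lloq) | (d > uloq)] = np.nan       # outside the quantifiable range
    return d

img = np.random.default_rng(0).uniform(50, 500, size=(4, 4))
print(quantitative_image(img, a=0.02, b=1.5, c=10.0, lloq=5.0))
```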

  11. Automatic generation of a subject-specific model for accurate markerless motion capture and biomechanical applications.

    Science.gov (United States)

    Corazza, Stefano; Gambaretto, Emiliano; Mündermann, Lars; Andriacchi, Thomas P

    2010-04-01

    A novel approach for the automatic generation of a subject-specific model consisting of morphological and joint location information is described. The aim is to address the need for efficient and accurate model generation for markerless motion capture (MMC) and biomechanical studies. The algorithm applies and expands on previous work on human shape spaces by embedding location information for ten joint centers in a subject-specific free-form surface. The optimal locations of the joint centers in the 3-D mesh were learned through linear regression over a set of nine subjects whose joint centers were known. The model was shown to be sufficiently accurate for both kinematic (joint centers) and morphological (body shape) information to allow accurate tracking with MMC systems. The automatic model generation algorithm was applied to 3-D meshes of different quality and resolution, such as laser scans and visual hulls. The complete method was tested using nine subjects of different gender, body mass index (BMI), age, and ethnicity. Experimental training and cross-validation errors were 19 and 25 mm, respectively, on average over the joints of the subjects analyzed in the study.

  12. Combining qualitative and quantitative operational research methods to inform quality improvement in pathways that span multiple settings.

    Science.gov (United States)

    Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin

    2017-08-01

    Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. A 'Rich Picture' was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. Published by the BMJ Publishing Group
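
    The quantitative arm of the work uses CART to partition patients into risk groups. A minimal scikit-learn sketch on synthetic data (the real audit variables differ) shows how a shallow, interpretable tree of the kind used to steer such discussions can be produced:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in for the national audit data; real predictors would be
# clinical variables such as weight-for-age z-score or diagnosis group.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
risk = (X[:, 0] < -0.5) & (X[:, 2] > 0.3)        # hidden "high-risk" rule
y = (rng.uniform(size=500) < np.where(risk, 0.6, 0.1)).astype(int)

# A shallow tree keeps the risk groups interpretable for discussion
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=50).fit(X, y)
print(export_text(tree, feature_names=["z_weight", "age_months", "severity"]))
```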

  13. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    Liu, Yubin; Yuan, Zhen; Jiang, Huabei

    2016-01-01

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors’ two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, and ex vivo and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media where the diffusion approximation does not describe the photon transport well. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed by the property profiles, where the authors found that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the errors ranged from 0 for relatively small targets to 26% for relatively large targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular, their

  14. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron-absorbing honeycomb collimator. By setting the honeycomb collimator between the object and the imaging system, neutrons scattered in the object were absorbed by the honeycomb material and eliminated before reaching the imaging system, whereas neutrons transmitted through the object without interaction could reach it. The image formed by purely transmitted neutrons provides quantitative information. Two honeycombs, coated with boron nitride and gadolinium oxide respectively, were prepared and evaluated for quantitative application. The relation between the neutron total cross section and the measured attenuation coefficient confirmed that they were in fairly good agreement. Application to quantitative computed tomography was also conducted successfully. The new neutron radiography method using the neutron-absorbing honeycomb collimator to eliminate scattered neutrons remarkably improved the quantitative accuracy of neutron radiography and computed tomography. (author)
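
    With scattered neutrons removed by the collimator, transmission follows Beer-Lambert attenuation, which is what links the measured attenuation coefficient to the total cross section. A worked sketch with illustrative, water-like numbers:

```python
import numpy as np

AVOGADRO = 6.022e23

def macroscopic_cross_section(sigma_total_barn, density_g_cm3, molar_mass_g):
    """Sigma [1/cm] = N * sigma_tot, with N the number density of scatterers."""
    N = density_g_cm3 / molar_mass_g * AVOGADRO       # molecules per cm^3
    return N * sigma_total_barn * 1e-24               # barn -> cm^2

def thickness_from_transmission(I, I0, Sigma):
    """Invert Beer-Lambert, I = I0*exp(-Sigma*t), valid once scattered
    neutrons have been eliminated by the honeycomb collimator."""
    return -np.log(np.asarray(I, dtype=float) / I0) / Sigma

# Illustrative numbers for a water-like material and thermal neutrons
Sigma = macroscopic_cross_section(103.0, density_g_cm3=1.0, molar_mass_g=18.0)
print(Sigma, "1/cm;", thickness_from_transmission(0.35, 1.0, Sigma), "cm")
```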

  15. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [3H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against 3H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [14C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparing densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)

  16. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  17. Improving quantitative precision and throughput by reducing calibrator use in liquid chromatography-tandem mass spectrometry

    International Nuclear Information System (INIS)

    Rule, Geoffrey S.; Rockwood, Alan L.

    2016-01-01

    To improve efficiency in our mass spectrometry laboratories we have made efforts to reduce the number of calibration standards utilized for quantitation over time. We often analyze three or more batches of 96 samples per day, on a single instrument, for a number of assays. With a conventional calibration scheme at six concentration levels, this amounts to more than 5000 calibration points per year. Modern LC-tandem mass spectrometric instrumentation is extremely rugged, however, and isotopically labelled internal standards are widely available. This made us consider whether alternative calibration strategies could be utilized to reduce the number of calibration standards analyzed while still retaining high precision and accurate quantitation. Here we demonstrate how, by utilizing a single calibration point in each sample batch and using the resulting response factor (RF) to update an existing, historical response factor (HRF), we are able to obtain improved precision over a conventional multipoint calibration approach, as judged by quality control samples. The laboratory component of this study was conducted with an existing LC-tandem mass spectrometric method for three androgen analytes in our production laboratory. Using examples from both simulated and laboratory data, we illustrate several aspects of our single-point alternative calibration strategy and compare it with a conventional multipoint calibration approach. We conclude that both the cost and burden of preparing multiple calibration standards with every batch of samples can be reduced while at the same time maintaining, or even improving, analytical quality. - Highlights: • Use of a weighted single point calibration approach improves quantitative precision. • A weighted response factor approach incorporates historical calibration information. • Several scenarios are discussed with regard to their influence on quantitation.
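
    The core arithmetic of the scheme, a batch response factor from the single calibrator, blended into the historical response factor, then used to quantify unknowns, can be sketched as below. The blending weight and all concentrations are hypothetical:

```python
def batch_response_factor(cal_area, cal_is_area, cal_conc):
    """RF from the single calibrator: (analyte/internal-standard area) per unit concentration."""
    return (cal_area / cal_is_area) / cal_conc

def update_hrf(hrf_old, rf_batch, weight=0.2):
    """Blend the batch RF into the historical RF (exponential smoothing);
    the weight is a tunable choice, not the authors' published value."""
    return (1.0 - weight) * hrf_old + weight * rf_batch

def quantify(sample_area, sample_is_area, hrf):
    return (sample_area / sample_is_area) / hrf

hrf = 0.0105                                    # historical response factor
rf = batch_response_factor(52000, 48000, 100)   # calibrator at 100 ng/dL
hrf = update_hrf(hrf, rf)
print(quantify(31000, 47000, hrf))              # concentration of an unknown, ng/dL
```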

  18. Improving quantitative precision and throughput by reducing calibrator use in liquid chromatography-tandem mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rule, Geoffrey S., E-mail: geoffrey.s.rule@aruplab.com [ARUP Institute for Clinical and Experimental Pathology, 500 Chipeta Way, Salt Lake City, UT 84108 (United States); Rockwood, Alan L. [ARUP Institute for Clinical and Experimental Pathology, 500 Chipeta Way, Salt Lake City, UT 84108 (United States); Department of Pathology, University of Utah School of Medicine, 2100 Jones Medical Research Bldg., Salt Lake City, UT 84132 (United States)

    2016-05-05

    To improve efficiency in our mass spectrometry laboratories we have made efforts to reduce the number of calibration standards utilized for quantitation over time. We often analyze three or more batches of 96 samples per day, on a single instrument, for a number of assays. With a conventional calibration scheme at six concentration levels, this amounts to more than 5000 calibration points per year. Modern LC-tandem mass spectrometric instrumentation is extremely rugged, however, and isotopically labelled internal standards are widely available. This made us consider whether alternative calibration strategies could be utilized to reduce the number of calibration standards analyzed while still retaining high precision and accurate quantitation. Here we demonstrate how, by utilizing a single calibration point in each sample batch and using the resulting response factor (RF) to update an existing, historical response factor (HRF), we are able to obtain improved precision over a conventional multipoint calibration approach, as judged by quality control samples. The laboratory component of this study was conducted with an existing LC-tandem mass spectrometric method for three androgen analytes in our production laboratory. Using examples from both simulated and laboratory data, we illustrate several aspects of our single-point alternative calibration strategy and compare it with a conventional multipoint calibration approach. We conclude that both the cost and burden of preparing multiple calibration standards with every batch of samples can be reduced while at the same time maintaining, or even improving, analytical quality. - Highlights: • Use of a weighted single point calibration approach improves quantitative precision. • A weighted response factor approach incorporates historical calibration information. • Several scenarios are discussed with regard to their influence on quantitation.

  19. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiency, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used to calculate an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross-correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements was below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2 D%) and patients with lymphedema (2.05 ± 2.5 D%) was highly significant using prefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with post-thrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)
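
    The correction combines a transmission scan with the emission scan. One common realization of this idea is conjugate-view geometric-mean correction, sketched below for opposed anterior/posterior views; the paper's actual correction-matrix construction may differ in detail:

```python
import numpy as np

def attenuation_factor(I_trans, I0):
    """Per-pixel attenuation from the transmission scan: exp(-mu*d) = I/I0."""
    return np.asarray(I_trans, dtype=float) / I0

def corrected_uptake(anterior, posterior, att):
    """Conjugate-view geometric mean with transmission-based correction;
    sqrt(exp(-mu*d)) cancels the body attenuation along the view axis."""
    geo_mean = np.sqrt(np.asarray(anterior, float) * np.asarray(posterior, float))
    return geo_mean / np.sqrt(att)

att = attenuation_factor(I_trans=0.25, I0=1.0)   # 75 % of the flood beam absorbed
print(corrected_uptake(anterior=800.0, posterior=450.0, att=att))
```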

  20. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    Energy Technology Data Exchange (ETDEWEB)

    Labuda, Aleksander; Proksch, Roger [Asylum Research an Oxford Instruments Company, Santa Barbara, California 93117 (United States)

    2015-06-22

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity—a longstanding goal in the electromechanical community.

  1. Accurate quantification of microRNA via single strand displacement reaction on DNA origami motif.

    Directory of Open Access Journals (Sweden)

    Jie Zhu

    Full Text Available DNA origami is an emerging technology that assembles hundreds of staple strands and one single-stranded DNA into a defined nanopattern. It has been widely used in various fields, including the detection of biological molecules such as DNA, RNA and proteins. MicroRNAs (miRNAs) play important roles in post-transcriptional gene repression as well as many other biological processes such as cell growth and differentiation. Alterations of miRNA expression contribute to many human diseases. However, it is still a challenge to quantitatively detect miRNAs by origami technology. In this study, we developed a novel approach based on a streptavidin and quantum dot binding complex (STV-QDs) labeled single strand displacement reaction on DNA origami to quantitatively detect the concentration of miRNAs. We illustrated a linear relationship between the concentration of an exemplary miRNA (miRNA-133) and the STV-QD hybridization efficiency; the results demonstrated that it is an accurate nano-scale miRNA quantifier motif. In addition, both a symmetrical rectangular motif and an asymmetrical China-map motif were tested. With significant linearity in both motifs, our experiments suggested that a DNA origami motif of arbitrary shape can be utilized in this method. Since this DNA origami-based method offers the unique advantages of being simple, time- and material-saving, potentially capable of multi-target testing in one motif, and relatively accurate even for samples containing impurities, as targets are counted directly by atomic force microscopy rather than inferred from fluorescence signals, it may be widely used in the quantification of miRNAs.
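
    The quantitation rests on a linear calibration between known miRNA concentrations and the AFM-counted hybridization efficiency, which is then inverted for unknown samples. A sketch with hypothetical calibration points:

```python
import numpy as np

# Hypothetical calibration: AFM-counted STV-QD hybridization efficiency (%)
# at known miRNA-133 concentrations (nM)
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
eff  = np.array([2.0, 11.0, 21.0, 39.0, 80.0])

slope, intercept = np.polyfit(conc, eff, 1)   # the reported linear relationship
r = np.corrcoef(conc, eff)[0, 1]

def miRNA_concentration(efficiency):
    """Invert the calibration line to estimate an unknown sample."""
    return (efficiency - intercept) / slope

print(f"fit: eff = {slope:.1f}*C + {intercept:.1f} (r = {r:.3f})")
print(miRNA_concentration(30.0), "nM")
```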

  2. Accurate Quantification of microRNA via Single Strand Displacement Reaction on DNA Origami Motif

    Science.gov (United States)

    Lou, Jingyu; Li, Weidong; Li, Sheng; Zhu, Hongxin; Yang, Lun; Zhang, Aiping; He, Lin; Li, Can

    2013-01-01

    DNA origami is an emerging technology that assembles hundreds of staple strands and one single-stranded DNA into a defined nanopattern. It has been widely used in various fields, including the detection of biological molecules such as DNA, RNA and proteins. MicroRNAs (miRNAs) play important roles in post-transcriptional gene repression as well as many other biological processes such as cell growth and differentiation. Alterations of miRNA expression contribute to many human diseases. However, it is still a challenge to quantitatively detect miRNAs by origami technology. In this study, we developed a novel approach based on a streptavidin and quantum dot binding complex (STV-QDs) labeled single strand displacement reaction on DNA origami to quantitatively detect the concentration of miRNAs. We illustrated a linear relationship between the concentration of an exemplary miRNA (miRNA-133) and the STV-QD hybridization efficiency; the results demonstrated that it is an accurate nano-scale miRNA quantifier motif. In addition, both a symmetrical rectangular motif and an asymmetrical China-map motif were tested. With significant linearity in both motifs, our experiments suggested that a DNA origami motif of arbitrary shape can be utilized in this method. Since this DNA origami-based method offers the unique advantages of being simple, time- and material-saving, potentially capable of multi-target testing in one motif, and relatively accurate even for samples containing impurities, as targets are counted directly by atomic force microscopy rather than inferred from fluorescence signals, it may be widely used in the quantification of miRNAs. PMID:23990889

  3. Accurate quantification of microRNA via single strand displacement reaction on DNA origami motif.

    Science.gov (United States)

    Zhu, Jie; Feng, Xiaolu; Lou, Jingyu; Li, Weidong; Li, Sheng; Zhu, Hongxin; Yang, Lun; Zhang, Aiping; He, Lin; Li, Can

    2013-01-01

    DNA origami is an emerging technology that assembles hundreds of staple strands and one single-stranded DNA into a defined nanopattern. It has been widely used in various fields, including the detection of biological molecules such as DNA, RNA and proteins. MicroRNAs (miRNAs) play important roles in post-transcriptional gene repression as well as many other biological processes such as cell growth and differentiation. Alterations of miRNA expression contribute to many human diseases. However, it is still a challenge to quantitatively detect miRNAs by origami technology. In this study, we developed a novel approach based on a streptavidin and quantum dot binding complex (STV-QDs) labeled single strand displacement reaction on DNA origami to quantitatively detect the concentration of miRNAs. We illustrated a linear relationship between the concentration of an exemplary miRNA (miRNA-133) and the STV-QD hybridization efficiency; the results demonstrated that it is an accurate nano-scale miRNA quantifier motif. In addition, both a symmetrical rectangular motif and an asymmetrical China-map motif were tested. With significant linearity in both motifs, our experiments suggested that a DNA origami motif of arbitrary shape can be utilized in this method. Since this DNA origami-based method offers the unique advantages of being simple, time- and material-saving, potentially capable of multi-target testing in one motif, and relatively accurate even for samples containing impurities, as targets are counted directly by atomic force microscopy rather than inferred from fluorescence signals, it may be widely used in the quantification of miRNAs.

  4. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng; Chrisler, William B.; Gaffrey, Matthew J.; Ansong, Charles; Sussel, Lori; Orr, Galya

    2017-10-04

    Quantitative gene expression analysis in intact single cells can be achieved using single-molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. Thus, the precision of smFISH is often compromised by partial or nonspecific binding of sub-probes and by tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on the blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, that are distinct from the blinking patterns emitted by partially or nonspecifically bound sub-probes and by autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.

  5. Leg mass characteristics of accurate and inaccurate kickers--an Australian football perspective.

    Science.gov (United States)

    Hart, Nicolas H; Nimphius, Sophia; Cochrane, Jodie L; Newton, Robert U

    2013-01-01

    Athletic profiling provides valuable information to sport scientists, assisting in the optimal design of strength and conditioning programmes. Understanding the influence these physical characteristics may have on the generation of kicking accuracy is advantageous. The aim of this study was to profile and compare the lower limb mass characteristics of accurate and inaccurate Australian footballers. Thirty-one players were recruited from the Western Australian Football League to perform ten drop punt kicks over 20 metres to a player target. Players were separated into accurate (n = 15) and inaccurate (n = 16) groups, with leg mass characteristics assessed using whole body dual energy x-ray absorptiometry (DXA) scans. Accurate kickers demonstrated significantly greater relative lean mass (P ≤ 0.004) and significantly lower relative fat mass (P ≤ 0.024) across all segments of the kicking and support limbs, while also exhibiting significantly higher intra-limb lean-to-fat mass ratios for all segments across both limbs (P ≤ 0.009). Inaccurate kickers also produced significantly larger asymmetries between limbs than accurate kickers (P ≤ 0.028), showing considerably lower lean mass in their support leg. These results illustrate a difference in leg mass characteristics between accurate and inaccurate kickers, highlighting the potential influence these may have on technical proficiency of the drop punt.

  6. Qualitative versus quantitative assessment of cerebrovascular reserve capacity

    International Nuclear Information System (INIS)

    Okuguchi, Taku

    2000-01-01

    Quantitative studies of cerebral blood flow (CBF) combined with an acetazolamide (ACZ) challenge have defined a subgroup of patients with symptomatic carotid or middle cerebral artery (MCA) occlusive disease who are at increased risk for stroke. Recent reports suggest that qualitative CBF techniques could also define this high-risk subgroup. To evaluate the accuracy of the qualitative method, we compared qualitative ratios with quantitative CBF data obtained using iodine-123-N-isopropyl-p-iodoamphetamine (IMP) single-photon emission computed tomography (SPECT). We analyzed qualitative and quantitative IMP SPECT images for 50 patients with symptomatic carotid or middle cerebral artery occlusive disease. Quantitative CBF data were measured by the autoradiographic technique. One region-of-interest within each hemisphere was within the MCA territory. Relative cerebrovascular reserve capacity (CVRC) obtained from qualitative images before and after the intravenous administration of 1 g of ACZ was defined as (C_occl/C_non during ACZ) / (C_occl/C_non at baseline), where C_occl and C_non denote counts on the occluded and non-occluded sides. The threshold for abnormal relative CVRC was defined as less than 1.0. Quantitative CBF was considered abnormal when the response to ACZ (percent change) on the symptomatic side (absolute CVRC) was a decrease of more than 10%. Of 39 patients whose relative CVRC was considered abnormal, 29 (74%) were normal in absolute CVRC (i.e., false positives). Two of 12 (17%) who were not considered compromised by qualitative criteria had abnormal absolute CVRC (i.e., false negatives). This study demonstrates that this important subgroup cannot be accurately defined with qualitative methodology. (author)

  7. Quantitative Peptidomics with Five-plex Reductive Methylation labels

    Science.gov (United States)

    Tashima, Alexandre K.; Fricker, Lloyd D.

    2018-05-01

    Quantitative peptidomics and proteomics often use chemical tags to covalently modify peptides with reagents that differ in the number of stable isotopes, allowing quantitation of the relative peptide levels in the original sample based on the peak height of each isotopic form. Different chemical reagents have been used as tags for quantitative peptidomics and proteomics, and all have strengths and weaknesses. One of the simplest approaches uses formaldehyde and sodium cyanoborohydride to methylate amines, converting primary and secondary amines into tertiary amines. Up to five different isotopic forms can be generated, depending on the isotopic forms of the formaldehyde and cyanoborohydride reagents, allowing five-plex quantitation. However, the mass difference between these forms is only 1 Da per methyl group incorporated into the peptide, and for many peptides there is substantial overlap from the natural abundance of 13C and other isotopes. In this study, we calculated the contribution from the natural isotopes for 26 native peptides and derived equations to correct the peak intensities. These equations were applied to data from a study using human embryonic kidney HEK293T cells in which five replicates were treated with 100 nM vinblastine for 3 h and compared with five replicates of cells treated with control medium. The correction equations brought the replicates to the expected 1:1 ratios and revealed significant decreases in the levels of 21 peptides upon vinblastine treatment. These equations enable accurate quantitation of small changes in peptide levels using the reductive methylation labeling approach.
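
    Because the five channels are spaced by only 1 Da, each observed peak contains natural-isotope contributions from the channels below it, so the correction amounts to solving a small triangular linear system. A sketch with an assumed isotope envelope (in practice the envelope is computed from the peptide's elemental composition):

```python
import numpy as np

def correct_channel_overlap(observed, envelope):
    """Remove natural-isotope spill-over between 1-Da-spaced label channels.

    envelope[k] is the relative intensity of the +k Da natural isotopologue
    of a channel's monoisotopic peak; the values used below are illustrative.
    Solves M @ true = observed for the true channel intensities.
    """
    n = len(observed)
    M = np.zeros((n, n))
    for j in range(n):              # channel j spills into channels j, j+1, ...
        for k, f in enumerate(envelope):
            if j + k < n:
                M[j + k, j] = f
    return np.linalg.solve(M, np.asarray(observed, dtype=float))

# Example: a peptide whose +1/+2 Da isotopes are 45 % and 12 % of the monoisotope
envelope = [1.00, 0.45, 0.12]
observed = [1000.0, 1450.0, 1570.0, 1580.0, 1600.0]   # five-plex peak heights
print(correct_channel_overlap(observed, envelope))
```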

  8. Qualitative versus quantitative assessment of cerebrovascular reserve capacity

    Energy Technology Data Exchange (ETDEWEB)

    Okuguchi, Taku [Iwate Medical Univ., Morioka (Japan). School of Medicine

    2000-06-01

    Quantitative studies of cerebral blood flow (CBF) combined with an acetazolamide (ACZ) challenge have defined a subgroup of patients with symptomatic carotid or middle cerebral artery (MCA) occlusive disease who are at increased risk for stroke. Recent reports suggest that qualitative CBF techniques could also define this high-risk subgroup. To evaluate the accuracy of the qualitative method, we compared qualitative ratios with quantitative CBF data obtained using iodine-123-N-isopropyl-p-iodoamphetamine (IMP) single-photon emission computed tomography (SPECT). We analyzed qualitative and quantitative IMP SPECT images for 50 patients with symptomatic carotid or middle cerebral artery occlusive disease. Quantitative CBF data were measured by the autoradiographic technique. One region-of-interest within each hemisphere was within the MCA territory. Relative cerebrovascular reserve capacity (CVRC) obtained from qualitative images before and after the intravenous administration of 1 g of ACZ was defined as (C_occl/C_non during ACZ) / (C_occl/C_non at baseline), where C_occl and C_non denote counts on the occluded and non-occluded sides. The threshold for abnormal relative CVRC was defined as less than 1.0. Quantitative CBF was considered abnormal when the response to ACZ (percent change) on the symptomatic side (absolute CVRC) was a decrease of more than 10%. Of 39 patients whose relative CVRC was considered abnormal, 29 (74%) were normal in absolute CVRC (i.e., false positives). Two of 12 (17%) who were not considered compromised by qualitative criteria had abnormal absolute CVRC (i.e., false negatives). This study demonstrates that this important subgroup cannot be accurately defined with qualitative methodology. (author)

  9. Quantitation of left ventricular dimensions and function by digital video subtraction angiography

    International Nuclear Information System (INIS)

    Higgins, C.B.; Norris, S.L.; Gerber, K.H.; Slutsky, R.A.; Ashburn, W.L.; Baily, N.

    1982-01-01

    Digital video subtraction angiography (DVSA) after central intravenous administration of contrast media was used in experimental animals and in patients with suspected coronary artery disease to quantitate left ventricular dimensions and regional and global contractile function. In animals, measurements of left ventricular (LV) volumes, wall thickness, ejection fraction, segmental contraction, and cardiac output correlated closely with sonocardiometry or thermodilution measurements. In patients, volumes and ejection fractions calculated from mask mode digital images correlated closely with direct left ventriculography. Global and segmental contractile function was displayed in patients by ejection shell images, stroke volume images, and time interval difference images. Central cardiovascular function was also quantitated by measurement of pulmonary transit time and calculation of pulmonary blood volume from digital fluoroscopic images. DVSA was shown to be useful and accurate in the quantitation of central cardiovascular physiology

  10. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small, sequenced and well-annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for S. viridis. We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three selected reference genes.
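
    In practice, normalization against several reference genes uses the geometric mean of their expression levels, which corresponds to the arithmetic mean of their Cq values. A minimal sketch with hypothetical Cq values and an assumed amplification efficiency of 2:

```python
import numpy as np

def relative_expression(cq_target, cq_refs, efficiency=2.0):
    """Expression of a target gene relative to a set of reference genes
    (PP2A, ASPR6 and DUSP in the study), assuming equal amplification
    efficiency; the mean reference Cq corresponds to the geometric mean
    of the reference quantities.
    """
    cq_ref_mean = np.mean(cq_refs)
    return efficiency ** (cq_ref_mean - cq_target)

# Hypothetical Cq values for a CAD gene in one S. viridis tissue
print(relative_expression(cq_target=26.4, cq_refs=[21.2, 22.0, 21.6]))
```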

  11. A study on quantification of the information flow and effectiveness of information aids for diagnosis tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    2004-02-01

    Diagnosis is one of the most complex and mental-resource-demanding tasks in nuclear power plants (NPPs), especially for main control room (MCR) operators. Diagnosis is a crucial part of disturbance control in NPPs, since it is a prerequisite task for initiating operating procedures. In order to design a control room feature for NPPs, three elements need to be considered: 1) the operational tasks that must be performed, 2) a model of human performance for these tasks, and 3) a model of how control room features are intended to support performance. The operational tasks define the classes of performance that must be considered. A model of human performance makes the requirements for accurate and efficient performance more explicit and reveals potential sources of error. Finally, the model of support allows the generation of specific hypotheses about how performance is facilitated in the control room; it needs to be developed based on the human performance model. This paper proposes three approaches for the system design of operator support systems to aid MCR operators' diagnosis tasks in NPPs, considering the above three elements: 1) a quantitative approach to modeling the information flow of diagnosis tasks, 2) strategy-based evaluation of information aids for diagnosis tasks, and 3) quantitative evaluation of NPP decision support systems. As an analysis of diagnosis tasks, this paper presents a method to quantify the cognitive information flow of diagnosis tasks, integrating a stage model (a qualitative approach) with information theory (a quantitative approach). The method includes: 1) constructing the information flow model, which consists of four stages based on operating procedures of NPPs; and 2) quantifying the information flow using Conant's model, an information-theoretic measure. Three experiments were then conducted to evaluate the effectiveness of the proposed approach to predicting human performance, especially in

  12. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    Science.gov (United States)

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge of all parameters of the studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, met. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites) and a qualitative or semi-quantitative description of expression/post-translational modification changes within selected proteins. A quantitative proteomics approach offers the possibility of quantitatively characterizing the entire proteome of a biological system, in terms of both protein titers and their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  13. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    Directory of Open Access Journals (Sweden)

    Agnieszka Gizak

    2016-01-01

    Full Text Available Molecular and cellular biology methodology is traditionally based on the reasoning called “the mechanistic explanation”. In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge of all parameters of the studied system. However, in practice, due to the systems’ complexity, this requirement is rarely, if ever, met. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites) and a qualitative or semi-quantitative description of expression/post-translational modification changes within selected proteins. A quantitative proteomics approach offers the possibility of quantitatively characterizing the entire proteome of a biological system, in terms of both protein titers and their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  14. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easily readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept … detailed information on the estimated health impact in a given exposure situation. These graphs will facilitate the discussions on appropriate risk reduction measures to be taken.

  15. A chemical approach to accurately characterize the coverage rate of gold nanoparticles

    International Nuclear Information System (INIS)

    Zhu, Xiaoli; Liu, Min; Zhang, Huihui; Wang, Haiyan; Li, Genxi

    2013-01-01

    Gold nanoparticles (AuNPs) have been widely used in many areas, and the nanoparticles usually have to be functionalized with certain molecules before use. However, information about the characterization of the functionalization of the nanoparticles is still limited or unclear, which has greatly restricted better functionalization and application of AuNPs. Here, we propose a chemical way to accurately characterize the functionalization of AuNPs. Unlike traditional physical methods, this method, which is based on the catalytic property of AuNPs, can give an accurate coverage rate and derivative information about the functionalization of the nanoparticles with different kinds of molecules. The performance of the characterization has been verified by adopting three independent molecules to functionalize AuNPs, covering both covalent and non-covalent functionalization. Some interesting results are thereby obtained, several of them revealed for the first time. The method may also be further developed as a useful tool for the characterization of a solid surface.

  16. A chemical approach to accurately characterize the coverage rate of gold nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Xiaoli; Liu, Min; Zhang, Huihui [Shanghai University, Laboratory of Biosensing Technology, School of Life Sciences (China); Wang, Haiyan [Nanjing University, State Key Laboratory of Pharmaceutical Biotechnology, Department of Biochemistry (China); Li, Genxi, E-mail: genxili@nju.edu.cn [Shanghai University, Laboratory of Biosensing Technology, School of Life Sciences (China)

    2013-09-15

    Gold nanoparticles (AuNPs) have been widely used in many areas, and the nanoparticles usually have to be functionalized with certain molecules before use. However, information about the characterization of the functionalization of the nanoparticles is still limited or unclear, which has greatly restricted better functionalization and application of AuNPs. Here, we propose a chemical way to accurately characterize the functionalization of AuNPs. Unlike traditional physical methods, this method, which is based on the catalytic property of AuNPs, can give an accurate coverage rate and derivative information about the functionalization of the nanoparticles with different kinds of molecules. The performance of the characterization has been verified by adopting three independent molecules to functionalize AuNPs, covering both covalent and non-covalent functionalization. Some interesting results are thereby obtained, several of them revealed for the first time. The method may also be further developed as a useful tool for the characterization of a solid surface.

  17. Accurate Online Full Charge Capacity Modeling of Smartphone Batteries

    OpenAIRE

    Hoque, Mohammad A.; Siekkinen, Matti; Koo, Jonghoe; Tarkoma, Sasu

    2016-01-01

    Full charge capacity (FCC) refers to the amount of energy a battery can hold. It is the fundamental property of smartphone batteries that diminishes as the battery ages and is charged/discharged. We investigate the behavior of smartphone batteries while charging and demonstrate that the battery voltage and charging rate information can together characterize the FCC of a battery. We propose a new method for accurately estimating FCC without exposing low-level system details or introducing new ...

  18. Thallium-201 infusion imaging and quantitation of experimental reactive hyperemia

    International Nuclear Information System (INIS)

    Alazraki, N.; Kralios, A.C.; Wooten, W.W.

    1985-01-01

    Accurate quantitation of coronary artery blood flow may be important complementary information to the percent vessel stenosis determined by coronary angiography. Whether T1-201 can be used to identify and quantify rapid changes in blood flow through a major coronary artery was examined experimentally in open-chest dogs with a cannulated, servo-perfused circumflex or left anterior descending coronary artery at a constant coronary perfusion pressure of 80 mmHg. Blood flow with T1-201 (5 μCi/cc of blood) through the coronary artery was continuously recorded using a tubular electromagnetic flow probe. A mobile scintillation camera interfaced to a nuclear medicine computer was used to image and record myocardial count accumulation, plotted as a function of time during the T1-201 infusion. Blood flow was calculated as the slope of myocardial count accumulation against time. To simulate total occlusion, perfusion was stopped for several 20-second periods to elicit reactive hyperemic responses. The changes in flow measured by the flow probe and by T1-201 were compared. The results demonstrated that scintillation camera recordings depicted coronary flow changes with a high degree of correlation to electromagnetic flow probe recordings (r = 0.85). Reactive hyperemia reaching a three-fold increase in flow was accurately demonstrated by a three-fold increase in the slope of the T1-201 counts plotted against time. Every flow change detected by T1-201 corresponded in time to a similar flow change in the flow probe recordings. These findings support further development of this technique for eventual clinical use.

  19. An information gap in DNA evidence interpretation.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Full Text Available Forensic DNA evidence often contains mixtures of multiple contributors, or is present in low template amounts. The resulting data signals may appear to be relatively uninformative when interpreted using qualitative inclusion-based methods. However, these same data can yield greater identification information when interpreted by computer using quantitative data-modeling methods. This study applies both qualitative and quantitative interpretation methods to a well-characterized DNA mixture and dilution data set, and compares the inferred match information. The results show that qualitative interpretation loses identification power at low culprit DNA quantities (below 100 pg), but that quantitative methods produce useful information down into the 10 pg range. Thus there is a ten-fold information gap that separates the qualitative and quantitative DNA mixture interpretation approaches. With low quantities of culprit DNA (10 pg to 100 pg), computer-based quantitative interpretation provides greater match sensitivity.

  20. Funnel metadynamics as accurate binding free-energy method

    Science.gov (United States)

    Limongelli, Vittorio; Bonomi, Massimiliano; Parrinello, Michele

    2013-01-01

    A detailed description of the events ruling ligand/protein interaction and an accurate estimation of a drug's affinity to its target are of great help in speeding up drug discovery strategies. We have developed a metadynamics-based approach, named funnel metadynamics, that allows the ligand to enhance the sampling of the target binding sites and its solvated states. This method leads to an efficient characterization of the binding free-energy surface and an accurate calculation of the absolute protein–ligand binding free energy. We illustrate our protocol on two systems, benzamidine/trypsin and SC-558/cyclooxygenase 2. In both cases, the X-ray conformation was found to be the lowest free-energy pose, and the computed protein–ligand binding free energy is in good agreement with experiments. Furthermore, funnel metadynamics unveils important information about the binding process, such as the presence of alternative binding modes and the role of waters. The results, achieved at an affordable computational cost, make funnel metadynamics a valuable method for drug discovery and for dealing with a variety of problems in chemistry, physics, and material science. PMID:23553839

  1. Quantitative Classification of Rice (Oryza sativa L.) Root Length and Diameter Using Image Analysis

    Science.gov (United States)

    Gu, Dongxiang; Zhen, Fengxian; Hannaway, David B.; Zhu, Yan; Liu, Leilei; Cao, Weixing; Tang, Liang

    2017-01-01

    Quantitative study of the root morphological characteristics of plants is helpful for understanding the relationships between root morphology and function. However, few studies have reported detailed and accurate information on root characteristics in fine-rooted plants such as rice (Oryza sativa L.). The aims of this study were to quantitatively classify fine lateral roots (FLRs), thick lateral roots (TLRs), and nodal roots (NRs) and to analyze the dynamics of their mean diameter (MD), length and surface area percentage (SAP) across growth stages in rice. Pot experiments were carried out over three years with three rice cultivars, three nitrogen (N) rates and three water regimes. Among the three cultivars, the root length of ‘Yangdao 6’ was the longest; the MD of its FLRs was the smallest while those of its TLRs and NRs were the largest, and its SAP of TLRs (SAPT) was the highest, indicating that ‘Yangdao 6’ has better nitrogen and water uptake ability. A high N rate increased the length of all root types and the MD of lateral roots, decreased the SAP of FLRs (SAPF) and of TLRs, but increased the SAP of NRs (SAPN). A moderate decrease in water supply increased root length and diameter; water stress increased SAPF and SAPT but decreased SAPN. These quantitative results indicate that rice plants tend to increase lateral roots to gain more surface area for nitrogen and water uptake when available assimilates are limited under nitrogen and water stress. PMID:28103264

  2. Quantitative imaging of a non-combusting diesel spray using structured laser illumination planar imaging

    Science.gov (United States)

    Berrocal, E.; Kristensson, E.; Hottenbach, P.; Aldén, M.; Grünefeld, G.

    2012-12-01

    Due to their transient nature, high degree of atomization, and rapid generation of fine evaporating droplets, diesel sprays have been, and still remain, among the most challenging sprays to analyze and understand fully by means of non-intrusive diagnostics. The main limitation of laser techniques for quantitative measurements of diesel sprays is the detection of multiple light scattering, which results from the high optical density of such a scattering medium. A second limitation is the extinction of the incident laser radiation as it crosses the spray, together with the attenuation of the signal to be detected. These issues have strongly motivated, over the past decade, the use of X-rays instead of visible light for dense-spray diagnostics. However, we demonstrate in this paper that, based on an affordable Nd:YAG laser system, structured laser illumination planar imaging (SLIPI) can provide an accurate quantitative description of a non-reacting diesel spray injected at 1,100 bar into a room-temperature vessel pressurized at 18.6 bar. The technique is used at a λ = 355 nm excitation wavelength with 1.0 mol% TMPD dye concentration, for simultaneous LIF/Mie imaging. Furthermore, a novel dual-SLIPI configuration is tested with Mie scattering detection only. The results confirm that a mapping of both the droplet Sauter mean diameter and the extinction coefficient can be obtained by such complementary approaches. These new insights are provided in this article for late times after injection start. It is demonstrated that the application of SLIPI to diesel sprays provides valuable quantitative information which was not previously accessible.

  3. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    Science.gov (United States)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though many spatial data sets and related information exist, they are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Spatial Data Infrastructures (SDI) have therefore been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the nineties, many SDI initiatives have been launched. Ultimately, all these initiatives aim to enhance the flow of spatial data between the organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes require technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. The objective of the research is therefore to develop a quantitative framework to assess the impact of GI standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research builds upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques, which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reaching the research objectives

  4. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and investigate strategies for learning English. In this paper, quantitative English learning strategies based on a word network and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for the unlearned words and the dynamic updating of the network are studied. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness; in particular, the optimized-weight-first strategy and the segmented strategies outperform the other strategies. The results provide opportunities for researchers and practitioners to reconsider the way English is taught and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
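
    As a toy illustration of this kind of strategy, one can score each unlearned word by combining its usage frequency with the number of already-learned neighbours in the word network and always pick the highest-scoring word next (the weighting below is hypothetical, not the paper's exact scheme):

        import networkx as nx

        def next_word(graph, learned, freq, alpha=0.5):
            """Pick the unlearned word with the highest combined score of
            usage frequency and connectivity to already-learned words."""
            best, best_score = None, float("-inf")
            for w in graph.nodes:
                if w in learned:
                    continue
                known = sum(1 for n in graph.neighbors(w) if n in learned)
                score = alpha * freq.get(w, 0.0) + (1 - alpha) * known
                if score > best_score:
                    best, best_score = w, score
            return best

        g = nx.Graph([("eat", "food"), ("food", "drink"), ("drink", "water")])
        print(next_word(g, learned={"eat"}, freq={"food": 0.9, "water": 0.7}))  # 'food'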

  5. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water-ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  6. The Information Technology Share in Management Information Systems

    Directory of Open Access Journals (Sweden)

    Nur Zeina Maya Sari

    2015-08-01

    Full Text Available Abstract The growth of management information systems has changed the role of managers in decision making through information technology. The prima facie reason for the use of information technology in business is to support the information system so that it operates better (O'Brien & Marakas, 2004). With information technology in a management information system (MIS), company management decision making, which was initially often hampered by non-technical factors, becomes accurate, relevant, complete and timely.

  7. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    Science.gov (United States)

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with a wide dynamic range and an excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl⁻¹ of DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. A simplified edge illumination set-up for quantitative phase contrast mammography with synchrotron radiation at clinical doses

    International Nuclear Information System (INIS)

    Longo, Mariaconcetta; Rigon, Luigi; Lopez, Frances C M; Longo, Renata; Chen, Rongchang; Dreossi, Diego; Zanconati, Fabrizio

    2015-01-01

    This work presents the first study of x-ray phase contrast imaging based on a simple implementation of the edge illumination method (EIXPCi) in the field of mammography with synchrotron radiation. A simplified EIXPCi set-up was utilized to study a possible application to mammography at clinical doses. Moreover, through a novel algorithm capable of separating and quantifying the absorption and phase perturbations of images acquired in EIXPCi modality, it is possible to extract quantitative information from breast images, allowing accurate tissue identification. The study was carried out at the SYRMEP beamline of the Elettra synchrotron radiation facility (Trieste, Italy), where a mastectomy specimen was investigated with the EIXPCi technique. The sample was exposed at three different energies suitable for mammography with synchrotron radiation in order to test the validity of the novel algorithm in extracting values of linear attenuation coefficients integrated over the sample thickness. It is demonstrated that the quantitative data are in good agreement with the theoretical values of linear attenuation coefficients calculated on the hypothesis of a breast of given composition. The results are promising and encourage the current efforts to apply the method in mammography with synchrotron radiation. (note)

  9. Quantitative dissolution of (U, Pu)O2 MOX (0.4% to 44% PuO2) using microwave heating technique

    International Nuclear Information System (INIS)

    Malav, R.K.; Fulzele, Ajit K.; Prakash, Amrit; Afzal, Md.; Panakkal, J.P.

    2011-01-01

    AFFF has fabricated (U, Pu)O2 mixed oxide fuels for PHWRs, BWRs, PFBR and FBTR. Quantitative dissolution of the fuel samples is required in a timely manner for the accurate determination of uranium and plutonium in the chemical quality control laboratory. This paper describes the use of a microwave heating technique for the quantitative dissolution of (U, Pu)O2 MOX (from 0.4% to 44% PuO2). (author)

  10. Left atrial appendage segmentation and quantitative assisted diagnosis of atrial fibrillation based on fusion of temporal-spatial information.

    Science.gov (United States)

    Jin, Cheng; Feng, Jianjiang; Wang, Lei; Yu, Heng; Liu, Jiang; Lu, Jiwen; Zhou, Jie

    2018-05-01

    In this paper, we present an approach for fast multi-phase segmentation of the left atrial appendage (LAA) and quantitative assisted diagnosis of atrial fibrillation (AF) based on 4D-CT data. We take full advantage of the temporal dimension to segment the living, flailing LAA based on a parametric max-flow method and a graph-cut approach, building a 3-D model of each phase. To assist the diagnosis of AF, we calculate the volumes of the 3-D models and then generate a "volume-phase" curve from which the important dynamic metrics are computed: ejection fraction, filling flux, and emptying flux of the LAA's blood by volume. This approach produces more precise results than the conventional approaches that calculate metrics by area, and allows quick analysis of LAA-volume pattern changes within a cardiac cycle. It may also provide insight into individual differences in the lesions of the LAA. Furthermore, we apply support vector machines (SVMs) to achieve a quantitative auto-diagnosis of AF by exploiting seven features derived from volume change ratios of the LAA, and perform multivariate logistic regression analysis for the risk of LAA thrombosis. The 100 cases utilized in this research were acquired on a Philips 256-slice iCT scanner. The experimental results demonstrate that our approach constructs the 3-D LAA geometries robustly compared to manual annotations, and reasonably infers that the LAA undergoes filling, emptying and re-filling, re-emptying within a cardiac cycle. This research provides a potential means for exploring various physiological functions of the LAA and quantitatively estimating the risk of stroke in patients with AF. Copyright © 2018 Elsevier Ltd. All rights reserved.
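
    Read this way, the dynamic metrics fall out of the volume-phase curve directly; a minimal Python sketch under that reading (hypothetical per-phase LAA volumes in ml):

        import numpy as np

        def laa_metrics(volumes):
            """Ejection fraction and filling/emptying fluxes by volume
            from a per-phase LAA volume curve over one cardiac cycle."""
            v = np.asarray(volumes, dtype=float)
            dv = np.diff(v)
            return {
                "ejection_fraction": (v.max() - v.min()) / v.max(),
                "filling_flux": dv[dv > 0].sum(),    # total volume gained
                "emptying_flux": -dv[dv < 0].sum(),  # total volume expelled
            }

        # Fill, empty, re-fill, re-empty over one cycle:
        print(laa_metrics([8.2, 9.5, 7.1, 4.8, 6.0, 8.0]))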

  11. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    Directory of Open Access Journals (Sweden)

    Jeffrey S. Larson

    2010-01-01

    Full Text Available We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2–3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC; HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods, which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH).

  12. A Quantitative Index to Support Recurrence Prevention Plans of Human-Related Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea [KAERI, Daejeon (Korea, Republic of); Kim, Do Sam; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    In Korea, HuRAM+ (Human related event Root cause Analysis Method plus) was developed to scrutinize the causes of human-related events, and the information on the events investigated by the HuRAM+ method has been managed by a database management system, R-tracer. The purpose of accumulating human error cause data is to support plans that reduce recurrences of similar events. However, in spite of the efforts put into the human error database, it was pointed out that the database does not provide a useful empirical basis for establishing recurrence prevention plans, because a framework to interpret the collected data and apply the insights from the data to the prevention plans had not yet been developed. In this paper, in order to support the establishment of recurrence prevention plans, a quantitative index, the Human Error Repeat Interval (HERI), is proposed, its statistics are introduced, and its applications to human error prevention are discussed. These estimates can be employed to evaluate the effects of recurrence prevention plans on human errors. If the mean HERI score is low and the linear trend is not positive, it can be suspected that the recurrence prevention plans applied after each human-related event have not been effectively propagated; to reduce repetitive error causes, the system design or operational culture can then be reviewed. If there is a strong negative trend, a systematic investigation of the root causes behind the trend is required. We expect that the HERI index will provide a significant basis for establishing or adjusting human error prevention plans. More accurate estimation and application of HERI scores is expected once more data have been accumulated. When a scatter plot of HERIs is fitted by two or more models, a statistical model selection method can be employed. Some criteria have been introduced by
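
    Although the record gives no formula, the HERI reads naturally as the elapsed time between successive events sharing a root cause, with a linear fit supplying the trend the authors inspect. A hedged Python sketch of that interpretation (hypothetical event data):

        import numpy as np

        def heri_scores(event_days):
            """Human Error Repeat Intervals: days between successive
            human-related events that share the same root cause."""
            days = np.sort(np.asarray(event_days, dtype=float))
            return np.diff(days)

        def heri_trend(intervals):
            """Slope of a linear fit through the HERI sequence; a positive
            slope suggests recurrences are spacing out over time."""
            idx = np.arange(len(intervals))
            slope, _ = np.polyfit(idx, intervals, 1)
            return slope

        events = [10, 40, 55, 130, 260]     # days of events with one cause
        ivals = heri_scores(events)          # [30, 15, 75, 130]
        print(ivals, heri_trend(ivals))      # positive trend here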

  14. Selection of reference genes for quantitative real-time PCR in bovine preimplantation embryos

    Directory of Open Access Journals (Sweden)

    Van Zeveren Alex

    2005-12-01

    Full Text Available Abstract Background: Real-time quantitative PCR is a sensitive and very efficient technique for examining gene transcription patterns in preimplantation embryos, in order to gain information about embryo development and to optimize assisted reproductive technologies. Critical to the successful application of real-time PCR are careful assay design, reaction optimization and validation to maximize sensitivity and accuracy. In most published studies GAPD, ACTB or 18S rRNA has been used as a single reference gene without prior verification of its expression stability. Normalization of the data using unstable controls can result in erroneous conclusions, especially when only one reference gene is used. Results: In this study the transcription levels of 8 commonly used reference genes (ACTB, GAPD, Histone H2A, TBP, HPRT1, SDHA, YWHAZ and 18S rRNA) were determined at different preimplantation stages (2-cell, 8-cell, blastocyst and hatched blastocyst) in order to select the most stable genes for normalizing quantitative data across preimplantation embryo stages. Conclusion: Using the geNorm application, YWHAZ, GAPD and SDHA were found to be the most stable genes across the examined embryonic stages, while the commonly used ACTB was shown to be highly regulated. We recommend the use of the geometric mean of these 3 reference genes as an accurate normalization factor, which allows small expression differences to be reliably measured.
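
    In practice the recommended normalization factor is simply the geometric mean of the measured quantities of YWHAZ, GAPD and SDHA in each sample; a minimal Python sketch with hypothetical relative quantities:

        import numpy as np

        def normalization_factor(ref_quantities):
            """Geometric mean of reference-gene quantities for one sample."""
            q = np.asarray(ref_quantities, dtype=float)
            return float(np.exp(np.mean(np.log(q))))

        # Relative quantities of YWHAZ, GAPD, SDHA in one embryo sample:
        nf = normalization_factor([1.10, 0.85, 0.95])
        target_normalized = 2.4 / nf  # normalize a target gene's quantity
        print(nf, target_normalized)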

  15. Quantitative and informatics tools for studying the effect of low dose radiation on tissue and cell culture

    International Nuclear Information System (INIS)

    Parvin, B.; Yang, Q.; Fontenay, G.; Barcellos-Hoff, M.H.

    2003-01-01

    Full text: The challenge of the post-genomic era is functional genomics, i.e., understanding how the genome is expressed to produce myriad cell phenotypes. To use genomic information to understand the biology of complex organisms, one must understand the dynamics of phenotype generation and maintenance. A phenotype is the result of selective expression of the genome. In order to define cell 'phenomes', one would track the kinetics and quantities of multiple constituent proteins, their cellular context and morphological features in large populations. Our aim is to extend the development of the BioSig imaging bioinformatics system for understanding how ionizing radiation alters tissue homeostasis and responses in cell culture experiments. Given several thousand antibodies and reagents for differentiating cell-specific protein components, biological heterogeneity, and other variables that affect cellular responses, there is a clear requirement for managing images and information about these images. Our focus is on the development of (1) quantitative methods for protein expression in tissue or cell culture studies, (2) an adequate data model that couples quantitative results with the experimental variables, and (3) browsing and visualization tools that enable exploration of large-scale image data in feature space in the context of biological heterogeneity. The framework provides the basis for studying the effect of low-dose radiation on the cellular microenvironment, inter-cell communication, and the underlying mechanisms. In turn, this information can be used to more accurately predict more complex multicellular biological responses following exposure to different types of inhibitors. The BioSig informatics approach to microscopy and quantitative image analysis has been used to build a more detailed picture of the signaling that occurs between cells, as a result of an exogenous stimulus such as radiation, or as a consequence of endogenous programs leading

  16. A novel dual energy method for enhanced quantitative computed tomography

    Science.gov (United States)

    Emami, A.; Ghadiri, H.; Rahmim, A.; Ay, M. R.

    2018-01-01

    Accurate assessment of bone mineral density (BMD) is critically important in clinical practice, and conveniently enabled via quantitative computed tomography (QCT). Meanwhile, dual-energy QCT (DEQCT) enables enhanced detection of small changes in BMD relative to single-energy QCT (SEQCT). In the present study, we aimed to investigate the accuracy of QCT methods, with particular emphasis on a new dual-energy approach, in comparison to single-energy and conventional dual-energy techniques. We used a sinogram-based analytical CT simulator to model the complete chain of CT data acquisition, and assessed the performance of SEQCT and different DEQCT techniques in the quantification of BMD. We demonstrate a 120% reduction in error when using the proposed dual-energy Simultaneous Equation by Constrained Least-squares method, enabling more accurate bone mineral measurements.
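
    The "Simultaneous Equation by Constrained Least-squares" idea can be pictured as solving the two energy measurements for basis-material densities under physical non-negativity constraints. A hedged Python sketch with purely illustrative attenuation coefficients (not the paper's calibration):

        import numpy as np
        from scipy.optimize import lsq_linear

        # Rows: low/high tube energy; columns: mass attenuation of bone
        # mineral and soft tissue (cm^2/g) -- illustrative values only.
        A = np.array([[0.60, 0.25],
                      [0.30, 0.20]])
        mu = np.array([0.35, 0.22])  # measured attenuation at the two energies

        # Constrained least squares keeps densities physically non-negative.
        res = lsq_linear(A, mu, bounds=(0.0, np.inf))
        rho_bone, rho_soft = res.x
        print(f"BMD estimate: {rho_bone:.3f} g/cm^3, soft tissue: {rho_soft:.3f}")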

  17. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    Full Text Available The current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging and crystalline phase imaging with texture/extinction corrections are carried out with the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code RITS (Rietveld Imaging of Transmission Spectra). Using the single Bragg-edge analysis mode of RITS, evaluations of the crystal lattice plane spacing (d-spacing) relating to macro-strain and of the FWHM (full width at half maximum) of the d-spacing distribution relating to micro-strain can be achieved. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast determination method of grain orientation for Bragg-dip neutron transmission spectra. In this paper, these imaging examples with the spectrum analysis methods, and their reliabilities evaluated by optical/electron microscopy and X-ray/neutron diffraction, are presented. In addition, the status at compact accelerator-driven pulsed neutron sources is also presented.

  18. In vivo quantitative NMR imaging of fruit tissues during growth using Spoiled Gradient Echo sequence

    DEFF Research Database (Denmark)

    Kenouche, S.; Perrier, M.; Bertin, N.

    2014-01-01

    The aim of this study was to design a robust and accurate quantitative measurement method, based on NMR imaging combined with a contrast agent (CA), for mapping and quantifying water transport in growing cherry tomato fruits. A multiple flip-angle Spoiled Gradient Echo (SGE) imaging sequence was used to evaluate...

  19. Current Strategies for Quantitating Fibrosis in Liver Biopsy

    Directory of Open Access Journals (Sweden)

    Yan Wang

    2015-01-01

    Full Text Available Objective: This mini-review updates progress in methodologies based on liver biopsy. Data Sources: Articles on liver fibrosis, liver biopsy or fibrosis assessment published in high-impact peer-reviewed journals from 1980 to 2014. Study Selection: Key articles were selected mainly according to their relevance to this topic and their citations. Results: With the recent mounting progress in chronic liver disease therapeutics comes a pressing need for precise, accurate, and dynamic assessment of hepatic fibrosis and cirrhosis in individual patients. Histopathological information is recognized as the most valuable data for fibrosis assessment. Conventional histology categorical systems describe the changes of fibrosis patterns in liver tissue, but the simplified ordinal digits assigned by these systems cannot reflect fibrosis dynamics with sufficient precision and reproducibility. Morphometric assessment by computer-assisted digital image analysis, such as the collagen proportionate area (CPA), detects changes in the amount of fibrosis in a tissue section as a continuous variable, and has shown independent diagnostic value for the assessment of advanced or late-stage fibrosis. Because of its evident sensitivity to sampling variance, morphometric measurement is best taken as a statistical parameter in studies of large cohorts. Combining state-of-the-art imaging technology and fundamental principles of tissue engineering, structure-based quantitation was recently initiated with a novel proof-of-concept tool, qFibrosis. qFibrosis showed not only superior performance to CPA in accurately and reproducibly differentiating adjacent stages of fibrosis, but also the possibility of facilitating analysis of fibrotic regression and cirrhosis sub-staging. Conclusions: With input from multidisciplinary innovation, liver biopsy assessment as a new "gold standard" is anticipated to substantially support the accelerated

  20. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
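
    As a simple baseline instance of what the review discusses, total-signal normalization scales each sample by its summed intensity to remove differences in total sample amount; a minimal Python sketch (one of many possible normalization methods, not the review's specific recommendation):

        import numpy as np

        def total_signal_normalize(X):
            """Scale each sample (row) so its metabolite intensities sum to
            the mean total signal across samples, removing differences in
            total sample amount."""
            X = np.asarray(X, dtype=float)
            totals = X.sum(axis=1, keepdims=True)
            return X / totals * totals.mean()

        X = np.array([[10.0, 30.0,  60.0],    # sample 1, total 100
                      [25.0, 75.0, 150.0]])   # sample 2, total 250 (more material)
        print(total_signal_normalize(X))       # rows now on a common scale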

  1. UNiquant, a Program for Quantitative Proteomics Analysis Using Stable Isotope Labeling

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xin; Tolmachev, Aleksey V.; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A.; Smith, Richard D.; Chan, Wing C.; Hinrichs, Steven; Fu, Kai; Ding, Shi-Jian

    2011-03-04

    We present UNiquant, a new software program for analyzing stable isotope labeling (SIL) based quantitative proteomics data. UNiquant surpassed the performance of two other platforms, MaxQuant and Mascot Distiller, using complex proteome mixtures having either known or unknown heavy/light ratios. UNiquant is compatible with a broad spectrum of search engines and SIL methods, providing outstanding peptide pair identification and accurate measurement of the relative peptide/protein abundance.

  2. Quantitative determination of blood losses by a whole-body counter

    International Nuclear Information System (INIS)

    Rochna Viola, E.M.; Garreta, A.C. de; Soria, N.; Blanco, E.

    1976-01-01

    A method to quantitate blood losses by determining the 51Cr whole-body retention (WBR) was developed. Autologous red cells labelled with Na2 51CrO4 were given intravenously, and blood losses were simulated by withdrawing blood samples. The percent relative variation (PRV) between real blood losses (blood withdrawal) and blood losses calculated from the WBR was determined. Withdrawal of 60 ml of blood gave a PRV lower than 10.0%. The 51Cr body loss due to elution and red cell death was found to be 0.017 day⁻¹. The method allows the accurate detection of total blood losses of 60 ml or more, and it can be used to quantitate gastrointestinal or gynecological hemorrhages, avoiding the inconvenience of the complete collection of faeces or of towels and tampons. (author)
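
    One way to read the computation: the measured WBR is first corrected for the intrinsic 51Cr loss (0.017 day⁻¹ from elution and red cell death), and any remaining deficit is attributed to hemorrhage. A hedged Python sketch of that reading (hypothetical function and values):

        import math

        def blood_loss_ml(wbr_fraction, days, blood_volume_ml=5000.0,
                          elution_rate=0.017):
            """Estimate blood lost from the 51Cr whole-body retention.

            wbr_fraction : measured WBR as a fraction of injected activity
            days         : days since labelling
            elution_rate : intrinsic 51Cr loss constant (day^-1)
            """
            expected = math.exp(-elution_rate * days)  # retention if no bleeding
            lost_fraction = 1.0 - wbr_fraction / expected
            return max(0.0, lost_fraction * blood_volume_ml)

        # WBR of 0.80 measured 10 days after labelling, 5 L blood volume:
        print(blood_loss_ml(wbr_fraction=0.80, days=10))  # ~260 ml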

  3. A CT-based method for fully quantitative Tl SPECT

    International Nuclear Information System (INIS)

    Willowson, Kathy; Bailey, Dale; Baldock, Clive

    2009-01-01

    Full text: Objectives: To develop and validate a method for quantitative 201Tl SPECT based on corrections derived from X-ray CT data, and to apply the method in the clinic for the quantitative determination of recurrence of brain tumours. Method: A previously developed method for achieving quantitative SPECT with 99mTc, based on corrections derived from X-ray CT data, was extended to apply to 201Tl. Experimental validation was performed on a cylindrical phantom by comparing the known injected activity and measured concentration to the quantitative calculations. Further evaluation was performed on an RSI Striatal Brain Phantom containing three 'lesions' with activity-to-background ratios of 1:1, 1.5:1 and 2:1. The method was subsequently applied to a series of scans from patients with suspected recurrence of brain tumours (principally glioma) to determine an SUV-like measure (Standardised Uptake Value). Results: The total activity and concentration in the phantom were calculated to within 3% and 1% of the true values, respectively. The calculated concentrations of activity in the background and the corresponding lesions of the brain phantom (in increasing ratios) were within 2%, 10%, 1% and 2%, respectively, of the true concentrations. Patient studies showed that an initial SUV greater than 1.5 corresponded to a 56% mortality rate in the first 12 months, as opposed to a 14% mortality rate for those with an SUV less than 1.5. Conclusion: The quantitative technique produces accurate results for the radionuclide 201Tl. Initial investigation in clinical brain SPECT suggests a correlation between quantitative uptake and survival.
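
    The SUV-like measure referred to above is the usual normalization of the reconstructed activity concentration by injected activity per unit body mass; a minimal Python sketch with hypothetical numbers:

        def suv(tissue_kbq_per_ml, injected_mbq, body_mass_kg):
            """Standardised Uptake Value: tissue concentration divided by
            injected activity per gram of body mass (assuming 1 g/ml tissue)."""
            injected_kbq = injected_mbq * 1000.0
            return tissue_kbq_per_ml / (injected_kbq / (body_mass_kg * 1000.0))

        # 4.0 kBq/ml in tumour, 150 MBq injected, 75 kg patient:
        print(suv(4.0, 150.0, 75.0))  # 2.0 -> above the 1.5 threshold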

  4. Leaderless Covert Networks : A Quantitative Approach

    NARCIS (Netherlands)

    Husslage, B.G.M.; Lindelauf, R.; Hamers, H.J.M.

    2012-01-01

    Abstract: Lindelauf et al. (2009a) introduced a quantitative approach to investigate optimal structures of covert networks. This approach used an objective function which is based on the secrecy versus information trade-off these organizations face. Sageman (2008) hypothesized that covert networks

  5. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulas calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gearbox installation, which proves the applicability and effectiveness of the method. The analysis results can serve as a useful reference for setting key quality inspection points and optimizing key processes.

  6. Quantitative Analysis of Bone Scintigrams at the Korle-Bu Teaching Hospital

    International Nuclear Information System (INIS)

    Huguette, E.Y.Y.

    2012-01-01

    Qualitative diagnosis has been the traditional means of diagnosing bone tumours at the Nuclear Medicine Department of the Korle-Bu Teaching Hospital over the years. Although this method is commendable, a more accurate diagnostic means is the quantitative approach. A study of ninety-five patients undergoing bone scans was performed quantitatively using ImageJ. The patients were administered activities ranging from 15 to 30 mCi depending on their weight, and were then scanned with an installed e.Cam SPECT system. A 256 x 1024 matrix size was used in acquiring the bone scans. Quantitative analyses performed with ImageJ revealed that uptake levels in all selected body parts were higher for metastatic tumours than for non-metastatic tumours. The average normalised uptake in the metastatic cases was 1.37332 cts/mm²/mCi and the corresponding uptake in the non-metastatic cases was 0.85230 cts/mm²/mCi. The relatively higher uptake in metastatic tumours is attributed to the higher osteoblastic activity and blood flow in metastatic cases compared to non-metastatic cases. Quantitative assessment of bone scintigrams is recommended for its high accuracy and quicker diagnosis. (author)

  7. Accurate and robust brain image alignment using boundary-based registration.

    Science.gov (United States)

    Greve, Douglas N; Fischl, Bruce

    2009-10-15

    The fine spatial scales of the structures in the human brain represent an enormous challenge to the successful integration of information from different images for both within- and between-subject analysis. While many algorithms to register image pairs from the same subject exist, visual inspection shows their accuracy and robustness to be suspect, particularly when there are strong intensity gradients and/or only part of the brain is imaged. This paper introduces a new algorithm called Boundary-Based Registration, or BBR. The novelty of BBR is that it treats the two images very differently. The reference image must be of sufficient resolution and quality to extract surfaces that separate tissue types. The input image is then aligned to the reference by maximizing the intensity gradient across tissue boundaries. Several lower-quality images can be aligned through their alignment with the reference. Visual inspection and fMRI results show that BBR is more accurate than correlation ratio or normalized mutual information and is considerably more robust to even strong intensity inhomogeneities. BBR also excels at aligning partial-brain images to whole-brain images, a domain in which existing registration algorithms frequently fail. Even in the limit of registering a single slice, we show the BBR results to be robust and accurate.
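
    The essence of the cost being optimized can be sketched simply: sample the input image just inside and just outside each point of the extracted boundary surface and reward strong, consistently signed contrast. The following Python sketch is a simplified illustration of that idea, not the actual FreeSurfer implementation or its exact cost function:

        import numpy as np

        def boundary_contrast_cost(inside, outside):
            """Simplified BBR-style cost: percent contrast across the tissue
            boundary, mapped through tanh so single noisy samples saturate.
            `inside`/`outside` are image intensities sampled at matched
            points just inside and outside the boundary surface."""
            q = 100.0 * (outside - inside) / (0.5 * (outside + inside))
            return float(np.mean(1.0 + np.tanh(-q / 10.0)))  # lower is better

        inside = np.array([80.0, 82.0, 79.0])      # samples on one side
        outside = np.array([120.0, 118.0, 125.0])  # samples on the other
        print(boundary_contrast_cost(inside, outside))  # near 0: good alignment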

  8. Informal information for web-based engineering catalogues

    Science.gov (United States)

    Allen, Richard D.; Culley, Stephen J.; Hicks, Ben J.

    2001-10-01

    Success is highly dependent on the ability of a company to efficiently produce optimal designs. In order to achieve this, companies must minimize time to market and possess the ability to make fully informed decisions in the early phases of the design process. Such decisions may include the choice of components and suppliers, as well as cost and maintenance considerations. Computer modeling and electronic catalogues are becoming the preferred medium for the selection and design of mechanical components. In utilizing these techniques, the designer demands the capability to identify, evaluate and select mechanical components both quantitatively and qualitatively. Quantitative decisions generally encompass performance data included in the formal catalogue representation. It is in the area of qualitative decisions that the use of what the authors call 'Informal Information' is of crucial importance. Thus, 'Informal Information' must often be incorporated into the selection process and selection systems. This would enable more informed decisions to be made more quickly, without the need for information retrieval via discussion with colleagues in the design environment. This paper provides an overview of the use of electronic information in the design of mechanical systems, including a discussion of the limitations of current technology. The importance of Informal Information is discussed and the requirements for associating it with web-based electronic catalogues are developed. The system is based on a flexible XML schema and enables the storage, classification and recall of Informal Information packets. Furthermore, a strategy for the inclusion of Informal Information is proposed, and an example case is used to illustrate the benefits.

  9. A quantitative assessment method for the NPP operators' diagnosis of accidents

    International Nuclear Information System (INIS)

    Kim, M. C.; Seong, P. H.

    2003-01-01

    In this research, we developed a quantitative model of the operators' diagnosis of the situation when an accident occurs in a nuclear power plant. After identifying the occurrence probabilities of accidents, the unavailabilities of the various information sources, and the causal relationships between accidents and information sources, a Bayesian network is used to analyze the change in the occurrence probabilities of accidents as the operators receive information on the status of the plant. The developed method is applied to a simple example case, and it turns out to be a systematic quantitative analysis method that can cope with the complex relationships between accidents and information sources and with variables such as accident occurrence probabilities and the unavailabilities of the various information sources.
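
    The core computation is Bayesian updating of accident probabilities as plant indications arrive. A toy Python sketch with a single indicator and hypothetical numbers (a real application would use a full Bayesian network over many information sources, including their unavailabilities):

        def posterior(prior, likelihoods, observed):
            """Update accident probabilities given one observed indication.

            prior       : {accident: P(accident)}
            likelihoods : {accident: P(indication | accident)}
            observed    : True if the indication (e.g. an alarm) is present
            """
            post = {}
            for acc, p in prior.items():
                l = likelihoods[acc] if observed else 1.0 - likelihoods[acc]
                post[acc] = l * p
            z = sum(post.values())
            return {acc: p / z for acc, p in post.items()}

        prior = {"LOCA": 0.01, "SGTR": 0.02, "none": 0.97}
        alarm_given = {"LOCA": 0.95, "SGTR": 0.60, "none": 0.01}
        print(posterior(prior, alarm_given, observed=True))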

  10. When Information Improves Information Security

    Science.gov (United States)

    Grossklags, Jens; Johnson, Benjamin; Christin, Nicolas

    This paper presents a formal, quantitative evaluation of the impact of bounded-rational security decision-making subject to limited information and externalities. We investigate a mixed economy of an individual rational expert and several naïve near-sighted agents. We further model three canonical types of negative externalities (weakest-link, best shot and total effort), and study the impact of two information regimes on the threat level agents are facing.

  11. Generalized PSF modeling for optimized quantitation in PET imaging.

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modelled PSF kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF

  12. ImaEdge - a platform for quantitative analysis of the spatiotemporal dynamics of cortical proteins during cell polarization.

    Science.gov (United States)

    Zhang, Zhen; Lim, Yen Wei; Zhao, Peng; Kanchanawong, Pakorn; Motegi, Fumio

    2017-12-15

    Cell polarity involves the compartmentalization of the cell cortex. The establishment of cortical compartments arises from the spatial bias in the activity and concentration of cortical proteins. The mechanistic dissection of cell polarity requires the accurate detection of dynamic changes in cortical proteins, but the fluctuations of cell shape and the inhomogeneous distributions of cortical proteins greatly complicate the quantitative extraction of their global and local changes during cell polarization. To address these problems, we introduce an open-source software package, ImaEdge, which automates the segmentation of the cortex from time-lapse movies, and enables quantitative extraction of cortical protein intensities. We demonstrate that ImaEdge enables efficient and rigorous analysis of the dynamic evolution of cortical PAR proteins during Caenorhabditis elegans embryogenesis. It is also capable of accurate tracking of varying levels of transgene expression and discontinuous signals of the actomyosin cytoskeleton during multiple rounds of cell division. ImaEdge provides a unique resource for quantitative studies of cortical polarization, with the potential for application to many types of polarized cells.This article has an associated First Person interview with the first authors of the paper. © 2017. Published by The Company of Biologists Ltd.

  13. Potential Application of Quantitative Prostate-specific Antigen Analysis in Forensic Examination of Seminal Stains

    Directory of Open Access Journals (Sweden)

    Zhenping Liu

    2015-01-01

    Full Text Available The aims of this study are to use quantitative analysis of prostate-specific antigen (PSA) in seminal stain examination and to explore the practical value of this analysis in forensic science. For a comprehensive analysis, vaginal swabs from 48 rape cases were tested both by a PSA fluorescence analyzer (i-CHROMA Reader) and by a conventional PSA strip test. To confirm the results of these PSA tests, seminal DNA was tested following differential extraction. Compared to the PSA strip test, the PSA rapid quantitative fluorescence analyzer provided more accurate and sensitive results. More importantly, individualized schemes based on the quantitative PSA results of samples can be developed to improve the quality and procedural efficiency of the forensic seminal inspection of samples prior to DNA analysis.

  14. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in the 20th century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
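
    Several of the listed methods reduce to simple intensity ratios. As one concrete illustration, the reference-intensity-ratio (RIR) variant of the direct-comparison idea estimates weight fractions as in the hedged Python sketch below (the RIR values are illustrative, not tabulated constants):

        def phase_fractions(intensities, rirs):
            """Weight fractions from the RIR (I/I_corundum) method:
            X_i = (I_i / RIR_i) / sum_j (I_j / RIR_j)."""
            scaled = {ph: i / rirs[ph] for ph, i in intensities.items()}
            total = sum(scaled.values())
            return {ph: s / total for ph, s in scaled.items()}

        # Strongest-peak intensities and illustrative RIR values:
        print(phase_fractions({"quartz": 1500.0, "calcite": 900.0},
                              {"quartz": 3.4, "calcite": 2.0}))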

  15. Quantitative phylogenetic assessment of microbial communities in diverse environments

    Energy Technology Data Exchange (ETDEWEB)

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks,T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.

  16. Integration of hydrothermal-energy economics: related quantitative studies

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    A comparison of ten models for computing the cost of hydrothermal energy is presented. This comparison involved a detailed examination of a number of technical and economic parameters of the various quantitative models, with the objective of identifying the most important parameters in the context of accurate estimates of the cost of hydrothermal energy. Important features of the various models, such as focus of study, applications, market sectors covered, methodology, input data requirements, and output, are compared in the document. A detailed sensitivity analysis of all the important engineering and economic parameters is carried out to determine the effect of non-consideration of individual parameters.

  17. Correlation Feature Selection and Mutual Information Theory Based Quantitative Research on Meteorological Impact Factors of Module Temperature for Solar Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Yujing Sun

    2016-12-01

    Full Text Available The module temperature is the most important parameter influencing the output power of solar photovoltaic (PV) systems, aside from solar irradiance. In this paper, we focus on interdisciplinary research that combines correlation analysis, mutual information (MI) and heat transfer theory, aiming to work out the correlative relations between different meteorological impact factors (MIFs) and PV module temperature from both qualitative and quantitative perspectives. The identification and confirmation of the primary MIFs of PV module temperature are investigated as the first step of this research, from the perspective of the physical meaning and mathematical analysis of the electrical performance and thermal characteristics of PV modules, based on the PV effect and heat transfer theory. Furthermore, the quantitative influence of the MIFs on PV module temperature is mathematically formulated as several indexes using correlation-based feature selection (CFS) and MI theory, to explore the specific impact degrees under four different typical weather statuses, named general weather classes (GWCs). Case studies for the proposed methods were conducted using actual measurement data from a 500 kW grid-connected solar PV plant in China. The results not only verified the knowledge about the main MIFs of PV module temperature but, more importantly, provided the specific ratios of the quantitative impact degrees of these three MIFs through CFS- and MI-based measures under the four different GWCs.
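
    To reproduce the flavour of such an MI-based ranking of MIFs, scikit-learn's mutual-information estimator for continuous targets can be used; the data below are synthetic stand-ins for measured plant data, with the dependence shaped only by heat-transfer intuition:

        import numpy as np
        from sklearn.feature_selection import mutual_info_regression

        rng = np.random.default_rng(0)
        irradiance = rng.uniform(0, 1000, 500)   # W/m^2
        ambient_t = rng.uniform(-5, 35, 500)     # deg C
        wind = rng.uniform(0, 12, 500)           # m/s
        # Synthetic module temperature loosely following heat-transfer intuition:
        module_t = ambient_t + 0.03 * irradiance - 1.5 * wind + rng.normal(0, 1, 500)

        X = np.column_stack([irradiance, ambient_t, wind])
        mi = mutual_info_regression(X, module_t, random_state=0)
        for name, score in zip(["irradiance", "ambient temperature", "wind speed"], mi):
            print(f"{name}: MI = {score:.3f}")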

  18. Quantitative Resistance to Plant Pathogens in Pyramiding Strategies for Durable Crop Protection

    Directory of Open Access Journals (Sweden)

    Marie-Laure Pilet-Nayel

    2017-10-01

    Full Text Available Quantitative resistance has gained interest in plant breeding for pathogen control in low-input cropping systems. Although quantitative resistance frequently has only a partial effect and is difficult to select for, it is considered more durable than major resistance (R) genes. With the exponential development of molecular markers over the past 20 years, resistance QTL have been more accurately detected and better integrated into breeding strategies for resistant varieties with increased potential for durability. This review summarizes current knowledge on the genetic inheritance, molecular basis, and durability of quantitative resistance. Based on this knowledge, we discuss how strategies that combine major R genes and QTL in crops can maintain the effectiveness of plant resistance to pathogens. Combining resistance QTL with complementary modes of action appears to be an interesting strategy for breeding effective and potentially durable resistance. Combining quantitative resistance with major R genes has proven to be a valuable approach for extending the effectiveness of major genes. In the plant genomics era, improved tools and methods are becoming available to better integrate quantitative resistance into breeding strategies. Nevertheless, optimal combinations of resistance loci will still have to be identified to preserve resistance effectiveness over time for durable crop protection.

  19. Quantitative nuclear medicine imaging: application of computers to the gamma camera and whole-body scanner

    International Nuclear Information System (INIS)

    Budinger, T.F.

    1974-01-01

    The following topics are reviewed: properties of computer systems for nuclear medicine quantitation; quantitative information concerning the relation between organ isotope concentration and detected projections of the isotope distribution; quantitation using two conjugate views; three-dimensional reconstruction from projections; quantitative cardiac radioangiography; and recent advances leading to quantitative nuclear medicine of clinical importance. (U.S.)

  20. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    Science.gov (United States)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  1. Can blind persons accurately assess body size from the voice?

    Science.gov (United States)

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, congenitally blind, or had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. © 2016 The Author(s).

  2. Multimodal quantitative phase and fluorescence imaging of cell apoptosis

    Science.gov (United States)

    Fu, Xinye; Zuo, Chao; Yan, Hao

    2017-06-01

    Fluorescence microscopy, utilizing fluorescence labeling, has the capability to observe intracellular changes that transmitted- and reflected-light microscopy techniques cannot resolve. However, the parts without fluorescence labeling are not imaged, so processes happening simultaneously in these parts cannot be revealed. Moreover, fluorescence imaging is 2D imaging in which depth information is missing, so the information in the labeled parts is also incomplete. On the other hand, quantitative phase imaging is capable of imaging cells in 3D in real time through phase calculation. However, its resolution is limited by optical diffraction, and it cannot observe intracellular changes below 200 nanometers. In this work, fluorescence imaging and quantitative phase imaging are combined to build a multimodal imaging system. Such a system has the capability to simultaneously observe detailed intracellular phenomena and 3D cell morphology. In this study the proposed multimodal imaging system is used to observe cell behavior during apoptosis. The aim is to highlight the limitations of fluorescence microscopy and to point out the advantages of multimodal quantitative phase and fluorescence imaging. The proposed multimodal quantitative phase imaging could be further applied in cell-related biomedical research, such as tumor studies.

  3. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    Science.gov (United States)

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method comprising semiquantitative and quantitative assessment of valvular regurgitation was used as a reference method, including ERO area by 2D PISA for assigning severity of regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA P<.001). In valves with multiple jets, however, 3D VCA had a better correlation to assigned severity (ANOVA P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
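
    For context, the 2D PISA quantitation that 3D VCA is compared against computes regurgitant flow through a hemispherical isovelocity shell; a minimal sketch, with illustrative values not taken from the study:

```python
import math

def ero_pisa(radius_cm, aliasing_velocity_cms, peak_velocity_cms):
    """Effective regurgitant orifice area (cm^2) from the hemispheric PISA model."""
    flow_rate = 2.0 * math.pi * radius_cm**2 * aliasing_velocity_cms  # mL/s
    return flow_rate / peak_velocity_cms

# Hypothetical case: PISA radius 0.9 cm, aliasing velocity 38 cm/s, peak MR jet 500 cm/s
ero = ero_pisa(0.9, 38.0, 500.0)
print(f"ERO = {ero:.2f} cm^2")  # ~0.39 cm^2, near the severe-MR threshold
```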

  4. Spectrally accurate contour dynamics

    International Nuclear Information System (INIS)

    Van Buskirk, R.D.; Marcus, P.S.

    1994-01-01

    We present an exponentially accurate boundary integral method for calculating the equilibria and dynamics of piecewise-constant distributions of potential vorticity. The method represents contours of potential vorticity as a spectral sum and solves the Biot-Savart equation for the velocity by spectrally evaluating a desingularized contour integral. We use the technique in both an initial-value code and a Newton continuation method. Our methods are tested by comparing the numerical solutions with known analytic results, and it is shown that for the same amount of computational work our spectral methods are more accurate than other contour dynamics methods currently in use.
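
    The spectral evaluation of periodic contour integrals rests on the exponential convergence of the trapezoidal rule for smooth periodic integrands; a small self-contained demonstration of that convergence (not the authors' code):

```python
import numpy as np

# The periodic trapezoidal rule converges exponentially for smooth periodic
# integrands, which is why spectral contour-dynamics codes evaluate
# (desingularized) contour integrals this way.
def trapezoid_periodic(f, n):
    theta = 2.0 * np.pi * np.arange(n) / n
    return (2.0 * np.pi / n) * np.sum(f(theta))

f = lambda t: np.exp(np.cos(t))       # smooth, 2*pi-periodic test integrand
exact = 2.0 * np.pi * np.i0(1.0)      # integral of exp(cos t) over one period
for n in (4, 8, 16, 32):
    err = abs(trapezoid_periodic(f, n) - exact)
    print(n, err)                      # error drops faster than any power of 1/n
```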

  5. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

    Full Text Available A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, where a new reliability assessment architecture including four aspects with both quantitative data and qualitative knowledge is established. The assessment architecture describes the complex, dynamic cloud computing environment more objectively than traditional methods. In addition, the ER rule, which performs well for multiple attribute decision making problems, is employed to integrate the different types of attributes in the assessment architecture, which yields more accurate assessment results. The assessment results of the case study on an actual cloud computing platform verify the effectiveness and the advantage of the proposed method.
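
    As a sketch of how belief degrees from two assessment aspects can be fused, the following implements the classical Dempster combination for singleton grades; the ER rule used in the paper generalizes this with attribute weights and reliabilities, and the grade labels and masses below are hypothetical:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments over the same singleton grades.

    Classical Dempster's rule restricted to singleton hypotheses; the ER rule
    extends this with attribute weights and source reliabilities.
    """
    grades = m1.keys()
    joint = {g: m1[g] * m2[g] for g in grades}
    conflict = 1.0 - sum(joint.values())
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {g: v / (1.0 - conflict) for g, v in joint.items()}

# Hypothetical belief degrees on reliability grades from two assessment aspects
m_performance = {"high": 0.6, "medium": 0.3, "low": 0.1}
m_fault_data = {"high": 0.5, "medium": 0.4, "low": 0.1}
print(dempster_combine(m_performance, m_fault_data))  # high dominates after fusion
```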

  6. Quantitative spectral and orientational analysis in surface sum frequency generation vibrational spectroscopy (SFG-VS)

    Science.gov (United States)

    Wang, Hong-Fei; Gan, Wei; Lu, Rong; Rao, Yi; Wu, Bao-Hua

    Sum frequency generation vibrational spectroscopy (SFG-VS) has been proven to be a uniquely effective spectroscopic technique in the investigation of molecular structure and conformations, as well as the dynamics of molecular interfaces. However, the ability to apply SFG-VS to complex molecular interfaces has been limited by the ability to extract quantitative information from SFG-VS experiments. In this review, we assess the limitations, issues, techniques and methodologies in quantitative orientational and spectral analysis with SFG-VS. Based on these assessments, we summarize recent developments in methodologies for quantitative orientational and spectral analysis in SFG-VS, and their applications to detailed analysis of SFG-VS data of various vapour/neat liquid interfaces. A rigorous formulation of the polarization null angle (PNA) method is given for accurate determination of the orientational parameter D = ⟨cosθ⟩/⟨cos³θ⟩, and a comparison between the PNA method and the commonly used polarization intensity ratio (PIR) method is discussed. The polarization and incident angle dependencies of the SFG-VS intensity are also reviewed, in the light of how experimental arrangements can be optimized to effectively extract crucial information from the SFG-VS experiments. The values and models of the local field factors in the molecular layers are discussed. In order to examine the validity and limitations of the bond polarizability derivative model, the general expressions for molecular hyperpolarizability tensors and their expression with the bond polarizability derivative model for C3v, C2v and C∞v molecular groups are given in the two appendices. We show that the bond polarizability derivative model can quantitatively describe many aspects of the intensities observed in the SFG-VS spectrum of the vapour/neat liquid interfaces in different polarizations. Using the polarization analysis in SFG-VS, polarization selection rules or
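
    Taking the orientational parameter as D = ⟨cosθ⟩/⟨cos³θ⟩ (angle brackets restored here from the garbled record), a short numerical sketch computes D for a hypothetical Gaussian tilt-angle distribution; whether a sinθ solid-angle weight is included depends on the convention adopted:

```python
import numpy as np

def orientational_parameter(theta_pdf, theta):
    """D = <cos(theta)> / <cos^3(theta)> for a tilt-angle distribution.

    theta in radians; here the averages carry a sin(theta) solid-angle weight.
    """
    w = theta_pdf(theta) * np.sin(theta)
    cos1 = np.trapz(np.cos(theta) * w, theta) / np.trapz(w, theta)
    cos3 = np.trapz(np.cos(theta) ** 3 * w, theta) / np.trapz(w, theta)
    return cos1 / cos3

theta = np.linspace(1e-4, np.pi / 2, 2000)
# Hypothetical Gaussian tilt distribution centered at 30 deg with 10 deg width
mu, sigma = np.radians(30.0), np.radians(10.0)
pdf = lambda t: np.exp(-0.5 * ((t - mu) / sigma) ** 2)
print(orientational_parameter(pdf, theta))  # ~1.4 for this distribution
```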

  7. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    Science.gov (United States)

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system was demonstrated for qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified with multiplex PCR simultaneously. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminal through an amino modification and then analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and the simulated samples. The evaluation in terms of quantitative analysis of RRS provided by this new method was confirmed by comparing our assay results with those of standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dyes. The results showed good coherence between the two methods. This approach demonstrated the possibility of accurate qualitative and quantitative detection of GM plants in a single run.

  8. A feasible, economical, and accurate analytical method for simultaneous determination of six alkaloid markers in Aconiti Lateralis Radix Praeparata from different manufacturing sources and processing ways.

    Science.gov (United States)

    Zhang, Yi-Bei; DA, Juan; Zhang, Jing-Xian; Li, Shang-Rong; Chen, Xin; Long, Hua-Li; Wang, Qiu-Rong; Cai, Lu-Ying; Yao, Shuai; Hou, Jin-Jun; Wu, Wan-Ying; Guo, De-An

    2017-04-01

    Aconiti Lateralis Radix Praeparata (Fuzi) is a traditional Chinese medicine commonly used in the clinic for its potency in restoring yang and rescuing from collapse. Aconiti alkaloids, mainly including monoester-diterpenoid aconitines (MDAs) and diester-diterpenoid aconitines (DDAs), are considered to act as both bioactive and toxic constituents. In the present study, a feasible, economical, and accurate HPLC method for simultaneous determination of six alkaloid markers using the Single Standard for Determination of Multi-Components (SSDMC) approach was developed and fully validated. Benzoylmesaconine was used as the single reference standard. The method proved accurate (recovery varying between 97.5% and 101.8%, with acceptable RSDs) and linear (r > 0.999 9) over the concentration ranges, and was subsequently applied to the quantitative evaluation of 62 batches of samples, among which 45 batches were from good manufacturing practice (GMP) facilities and 17 batches from the drug market. The contents were then analyzed by principal component analysis (PCA) and homogeneity testing. The present study provides valuable information for improving the quality standard of Aconiti Lateralis Radix Praeparata. The developed method also has potential for the analysis of other Aconitum species, such as Aconitum carmichaelii (prepared parent root) and Aconitum kusnezoffii (prepared root). Copyright © 2017 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
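
    The SSDMC idea of quantifying several analytes against one reference standard can be sketched with relative correction factors: the factor for each analyte is fixed during validation and reused in routine runs. All peak areas and concentrations below are hypothetical:

```python
def relative_correction_factor(area_std, conc_std, area_k, conc_k):
    """f_k = (A_s / C_s) / (A_k / C_k), fixed once during method validation."""
    return (area_std / conc_std) / (area_k / conc_k)

def quantify(area_k, f_k, area_std, conc_std):
    """Concentration of analyte k using only the single reference standard."""
    return f_k * area_k * conc_std / area_std

# Hypothetical validation run: reference standard vs. one other alkaloid marker
f_marker = relative_correction_factor(area_std=1500.0, conc_std=20.0,
                                      area_k=900.0, conc_k=15.0)
# Routine sample run: only the single reference standard is injected
print(quantify(area_k=1100.0, f_k=f_marker, area_std=1520.0, conc_std=20.0))
```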

  9. Rhetorical Numbers: A Case for Quantitative Writing in the Composition Classroom

    Science.gov (United States)

    Wolfe, Joanna

    2010-01-01

    Contemporary argument increasingly relies on quantitative information and reasoning, yet our profession neglects to view these means of persuasion as central to rhetorical arts. Such omission ironically serves to privilege quantitative arguments as above "mere rhetoric." Changes are needed to our textbooks, writing assignments, and instructor…

  10. Qualitative and Quantitative Information Flow Analysis for Multi-threaded Programs

    NARCIS (Netherlands)

    Ngo, Minh Tri

    2014-01-01

    In today’s information-based society, guaranteeing information security plays an important role in all aspects of life: governments, military, companies, financial information systems, web-based services etc. With the existence of Internet, Google, and shared-information networks, it is easier than

  11. Can phenological models predict tree phenology accurately under climate change conditions?

    Science.gov (United States)

    Chuine, Isabelle; Bonhomme, Marc; Legave, Jean Michel; García de Cortázar-Atauri, Inaki; Charrier, Guillaume; Lacointe, André; Améglio, Thierry

    2014-05-01

    or compromise dormancy break at the species' equatorward range limits, leading to a delay or even a failure to flower or set new leaves. These models are classically parameterized with flowering or budburst dates only, with no information on the dormancy break date, because this information is very scarce. We evaluated the efficiency of a set of process-based phenological models to accurately predict the dormancy break dates of four fruit trees. Our results show that models calibrated solely with flowering or budburst dates do not accurately predict the dormancy break date. Providing the dormancy break date for model parameterization results in a much more accurate simulation of the latter, albeit with a higher error than that on flowering or budburst dates. Most importantly, we also show that models not calibrated with dormancy break dates can generate significant differences in forecasted flowering or budburst dates when using climate scenarios. Our results call for urgent, large-scale measurement of dormancy break dates in forest and fruit trees to yield more robust projections of phenological changes in the near future.
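
    A minimal sequential chilling/forcing model illustrates why dormancy break dates matter for calibration: the chilling phase fixes the dormancy break day, and only then does forcing accumulation determine budburst. Thresholds and requirements below are illustrative, not calibrated values from the study:

```python
def predict_phenology(daily_temps, t_chill=7.0, chill_req=70.0,
                      t_base=5.0, forcing_req=150.0):
    """Minimal sequential chilling/forcing sketch.

    Returns (dormancy_break_day, budburst_day) as indices into daily_temps;
    None is returned for a stage whose requirement is never met.
    """
    chill, forcing = 0.0, 0.0
    dormancy_break = None
    for day, temp in enumerate(daily_temps):
        if dormancy_break is None:
            chill += 1.0 if temp < t_chill else 0.0   # one chill unit per cold day
            if chill >= chill_req:
                dormancy_break = day
        else:
            forcing += max(temp - t_base, 0.0)        # growing degree days
            if forcing >= forcing_req:
                return dormancy_break, day
    return dormancy_break, None
```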

  12. Machine learning of accurate energy-conserving molecular force fields

    Science.gov (United States)

    Chmiela, Stefan; Tkatchenko, Alexandre; Sauceda, Huziel E.; Poltavsky, Igor; Schütt, Kristof T.; Müller, Klaus-Robert

    2017-01-01

    Using conservation of energy, a fundamental property of closed classical and quantum mechanical systems, we develop an efficient gradient-domain machine learning (GDML) approach to construct accurate molecular force fields using a restricted number of samples from ab initio molecular dynamics (AIMD) trajectories. The GDML implementation is able to reproduce global potential energy surfaces of intermediate-sized molecules with an accuracy of 0.3 kcal mol⁻¹ for energies and 1 kcal mol⁻¹ Å⁻¹ for atomic forces using only 1000 conformational geometries for training. We demonstrate this accuracy for AIMD trajectories of molecules, including benzene, toluene, naphthalene, ethanol, uracil, and aspirin. The challenge of constructing conservative force fields is accomplished in our work by learning in a Hilbert space of vector-valued functions that obey the law of energy conservation. The GDML approach enables quantitative molecular dynamics simulations for molecules at a fraction of the cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods. PMID:28508076

  13. MR urography: Anatomical and quantitative information on ...

    African Journals Online (AJOL)

    Background and Aim: Magnetic resonance urography (MRU) is considered to be the next step in uroradiology. This technique combines superb anatomical images and functional information in a single test. In this article, we aim to present the topic of MRU in children and how it has been implemented in Northern Greece so ...

  14. Quantitative in situ magnetization reversal studies in Lorentz microscopy and electron holography.

    Science.gov (United States)

    Rodríguez, L A; Magén, C; Snoeck, E; Gatel, C; Marín, L; Serrano-Ramón, L; Prieto, J L; Muñoz, M; Algarabel, P A; Morellon, L; De Teresa, J M; Ibarra, M R

    2013-11-01

    A generalized procedure for the in situ application of magnetic fields by means of the excitation of the objective lens for magnetic imaging experiments in Lorentz microscopy and electron holography is quantitatively described. A protocol for applying magnetic fields with arbitrary in-plane magnitude and orientation is presented, and a freeware script for Digital Micrograph™ is provided to assist the operation of the microscope. Moreover, a method to accurately reconstruct hysteresis loops is detailed. We show that the out-of-plane component of the magnetic field cannot always be neglected when performing quantitative measurements of the local magnetization. Several examples are shown to demonstrate the accuracy and functionality of the methods. © 2013 Elsevier B.V. All rights reserved.

  15. Quantitative angle-insensitive flow measurement using relative standard deviation OCT.

    Science.gov (United States)

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-30

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability to perform high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information on flow velocities, and velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information on flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify blood flow velocities as well as map the vascular network in vivo.
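
    The RSD computation itself is simple: the per-pixel standard deviation over repeated B-scans divided by the per-pixel mean. A sketch, assuming a registered stack of linear-scale intensity B-scans (the array shapes are hypothetical):

```python
import numpy as np

def rsd_map(bscans):
    """Relative standard deviation per pixel over repeated OCT B-scans.

    bscans : array of shape (n_repeats, depth, width) of linear-scale intensities
    """
    mean = bscans.mean(axis=0)
    std = bscans.std(axis=0, ddof=1)
    return std / np.maximum(mean, 1e-12)   # guard against division by zero

# Hypothetical stack of 8 repeated B-scans
stack = np.random.rand(8, 512, 256)
flow_index = rsd_map(stack)   # larger RSD ~ faster flow, per the reported linearity
```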

  16. Iridium-191m radionuclide angiocardiography detection and quantitation of left-to-right shunts

    International Nuclear Information System (INIS)

    Treves, S.; Fujii, A.; Cheng, C.; Kuruc, A.

    1983-01-01

    The purpose of this study was to determine whether iridium-191m (Ir-191m) could replace technetium-99m (Tc-99m) in the detection and quantitation of left-to-right shunts. It was demonstrated that Ir-191m radionuclide angiography is a safe, rapid, and accurate method for the detection and quantitation of left-to-right shunts, with a very low radiation dose to the patient. It is also possible with this radiotracer to evaluate other aspects of the anatomy and physiology of the circulation, such as ventricular function, patency of major vessels, and renal and cerebral perfusion. Further improvements in Os-191 production, generator design and gamma cameras would expand the use of this ultrashort-lived radionuclide.

  17. An Accurate Estimate of the Free Energy and Phase Diagram of All-DNA Bulk Fluids

    Directory of Open Access Journals (Sweden)

    Emanuele Locatelli

    2018-04-01

    Full Text Available We present a numerical study in which large-scale bulk simulations of self-assembled DNA constructs have been carried out with a realistic coarse-grained model. The investigation aims at obtaining a precise, albeit numerically demanding, estimate of the free energy for such systems. We then, in turn, use these accurate results to validate a recently proposed theoretical approach that builds on a liquid-state theory, the Wertheim theory, to compute the phase diagram of all-DNA fluids. This hybrid theoretical/numerical approach, based on the lowest-order virial expansion and on a nearest-neighbor DNA model, can provide, in an undemanding way, a parameter-free thermodynamic description of DNA associating fluids that is in semi-quantitative agreement with experiments. We show that the predictions of the scheme are as accurate as those obtained with more sophisticated methods. We also demonstrate the flexibility of the approach by incorporating non-trivial additional contributions that go beyond the nearest-neighbor model to compute the DNA hybridization free energy.

  18. Accurate measurement of mitochondrial DNA deletion level and copy number differences in human skeletal muscle.

    Directory of Open Access Journals (Sweden)

    John P Grady

    Full Text Available Accurate and reliable quantification of the abundance of mitochondrial DNA (mtDNA) molecules, both wild-type and those harbouring pathogenic mutations, is important not only for understanding the progression of mtDNA disease but also for evaluating novel therapeutic approaches. A clear understanding of the sensitivity of mtDNA measurement assays under different experimental conditions is therefore critical, yet it is routinely lacking for most published mtDNA quantification assays. Here, we comprehensively assess the variability of two quantitative Taqman real-time PCR assays, a widely-applied MT-ND1/MT-ND4 multiplex mtDNA deletion assay and a recently developed MT-ND1/B2M singleplex mtDNA copy number assay, across a range of DNA concentrations and mtDNA deletion/copy number levels. Uniquely, we provide a specific guide detailing the numbers of sample and real-time PCR plate replicates necessary for accurately and consistently determining a given difference in mtDNA deletion levels and copy number in homogenate skeletal muscle DNA.
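
    For the MT-ND1/MT-ND4 duplex assay discussed above, the deletion level is commonly derived from the Ct difference between the two targets; a minimal sketch, assuming MT-ND4 lies within the deleted region and near-100% amplification efficiency for both targets:

```python
def mtdna_deletion_level(ct_nd1, ct_nd4):
    """Fraction of deleted mtDNA molecules from a MT-ND1/MT-ND4 duplex qPCR.

    Assumes MT-ND4 sits inside the deleted region, MT-ND1 outside it, and
    2^-Ct scaling (i.e. ~100% efficiency for both targets).
    """
    ratio_nd4_to_nd1 = 2.0 ** (ct_nd1 - ct_nd4)
    return 1.0 - ratio_nd4_to_nd1

# Hypothetical cycle thresholds
print(mtdna_deletion_level(ct_nd1=22.1, ct_nd4=23.6))  # ~0.65 -> 65% deletion
```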

  19. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, must be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients, if any, for interelemental effects are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
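
    A sketch of the regression idea: fit the certified concentrations of one analyte against the measured intensities of several elements, so that the cross-coefficients absorb interelemental effects. The intensities and certified values below are hypothetical:

```python
import numpy as np

# Hypothetical training set: line intensities of 3 elements in 5 standards,
# plus the certified concentration of the analyte in each standard.
intensities = np.array([[1.02, 0.35, 0.10],
                        [0.88, 0.41, 0.15],
                        [1.10, 0.30, 0.08],
                        [0.95, 0.38, 0.12],
                        [1.05, 0.33, 0.11]])
certified = np.array([18.2, 16.0, 19.5, 17.1, 18.8])  # wt% of the analyte

# Least-squares fit of c = b0 + b1*I1 + b2*I2 + b3*I3; b2 and b3 act as
# interelemental (matrix) correction coefficients.
X = np.hstack([np.ones((len(certified), 1)), intensities])
coeffs, *_ = np.linalg.lstsq(X, certified, rcond=None)

unknown = np.array([1.0, 1.00, 0.36, 0.11])  # [1, I1, I2, I3] of an unknown
print(float(unknown @ coeffs))               # predicted concentration (wt%)
```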

  20. Comparative study of quantitative phase imaging techniques for refractometry of optical fibers

    Science.gov (United States)

    de Dorlodot, Bertrand; Bélanger, Erik; Bérubé, Jean-Philippe; Vallée, Réal; Marquet, Pierre

    2018-02-01

    The refractive index difference profile of optical fibers is the key design parameter because it determines, among other properties, the insertion losses and propagating modes. Therefore, an accurate refractive index profiling method is of paramount importance to their development and optimization. Quantitative phase imaging (QPI) is one of the available tools to retrieve structural characteristics of optical fibers, including the refractive index difference profile. Because QPI is non-destructive, several different QPI methods have been developed over the last decades. Here, we present a comparative study of three available QPI techniques, namely the transport-of-intensity equation, quadriwave lateral shearing interferometry and digital holographic microscopy. To assess the accuracy and precision of these QPI techniques, quantitative phase images of the core of a well-characterized optical fiber were retrieved for each of them, and a robust image-processing procedure was applied to retrieve their refractive index difference profiles. As a result, although the raw images from all three QPI methods suffered from different shortcomings, our robust automated image-processing pipeline successfully corrected these. After this treatment, all three QPI techniques yielded accurate, reliable and mutually consistent refractive index difference profiles, in agreement with the accuracy and precision of the refracted near-field benchmark measurement.
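
    Once a quantitative phase image is in hand, a first-order index profile follows from dividing the optical path difference by the local geometric thickness; a sketch for a side-viewed circular fiber (a rigorous reconstruction would use an inverse Abel transform instead):

```python
import numpy as np

def index_profile(phase_rad, x_um, radius_um, wavelength_um):
    """Path-averaged refractive-index difference from a transverse phase line.

    Assumes a straight, circularly symmetric fiber imaged side-on, so the
    geometric path length at lateral offset x is t(x) = 2*sqrt(R^2 - x^2);
    phase(x) = (2*pi/lambda) * integral of delta_n along the chord.
    """
    t = 2.0 * np.sqrt(np.maximum(radius_um**2 - x_um**2, 1e-9))
    return phase_rad * wavelength_um / (2.0 * np.pi * t)

# Hypothetical phase line across a 62.5 um radius fiber at 0.55 um wavelength
x = np.linspace(-60.0, 60.0, 121)
phase = 4.0 * np.exp(-(x / 10.0) ** 2)      # synthetic core phase bump (rad)
delta_n = index_profile(phase, x, 62.5, 0.55)
```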

  1. A method for accurate detection of genomic microdeletions using real-time quantitative PCR

    Directory of Open Access Journals (Sweden)

    Bassett Anne S

    2005-12-01

    Full Text Available Abstract Background Quantitative polymerase chain reaction (qPCR) is a well-established method for quantifying levels of gene expression, but has not been routinely applied to the detection of constitutional copy number alterations of human genomic DNA. Microdeletions or microduplications of the human genome are associated with a variety of genetic disorders. Although clinical laboratories routinely use fluorescence in situ hybridization (FISH) to identify such cryptic genomic alterations, there remains a significant number of individuals in whom constitutional genomic imbalance is suspected, based on clinical parameters, but cannot be readily detected using current cytogenetic techniques. Results In this study, a novel application of real-time qPCR is presented that can be used to reproducibly detect chromosomal microdeletions and microduplications. This approach was applied to DNA from a series of patient samples and controls to validate genomic copy number alteration at cytoband 22q11. The study group comprised 12 patients with clinical symptoms of chromosome 22q11 deletion syndrome (22q11DS), 1 patient trisomic for 22q11 and 4 normal controls. 6 of the patients (group 1) had known hemizygous deletions, as detected by standard diagnostic FISH, whilst the remaining 6 patients (group 2) were classified as 22q11DS negative using the clinical FISH assay. Screening of the patients and controls with a set of 10 real-time qPCR primers, spanning the 22q11.2-deleted region and flanking sequence, confirmed the FISH assay results for all patients with 100% concordance. Moreover, this qPCR enabled a refinement of the region of deletion at 22q11. Analysis of DNA from the chromosome 22 trisomic sample demonstrated genomic duplication within 22q11. Conclusion In this paper we present a qPCR approach for the detection of chromosomal microdeletions and microduplications. The strategic use of in silico modelling for qPCR primer design to avoid regions of repetitive

  2. Quantitative measurement of the cerebral blood flow

    International Nuclear Information System (INIS)

    Houdart, R.; Mamo, H.; Meric, P.; Seylaz, J.

    1976-01-01

    The value of cerebral blood flow (CBF) measurement is outlined, its limits are defined and some future prospects discussed. The xenon-133 brain clearance study is at present the most accurate quantitative method to evaluate the CBF in different regions of the brain simultaneously. The method and the progress it has led to in the physiological, physiopathological and therapeutic fields are described. The major disadvantage of the method is shown to be the need to puncture the internal carotid artery for each measurement. Prospects are discussed concerning methods derived from the same general principle but using a simpler, non-traumatic way to introduce the radiotracer, either by inhalation into the lungs or intravenously [fr]
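
    A minimal sketch of a clearance-curve analysis in the spirit of the xenon-133 method: fit a monoexponential to the washout and scale the rate constant by a tissue/blood partition coefficient. This is a single-compartment simplification (clinical analyses typically use two-compartment fits), and the numbers below are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

def clearance(t, a, k):
    """Monoexponential head-curve model C(t) = a * exp(-k t)."""
    return a * np.exp(-k * t)

# Synthetic clearance samples (counts vs. minutes), 2% multiplicative noise
t = np.arange(0.5, 10.5, 0.5)
counts = 1000.0 * np.exp(-0.5 * t) * (1 + 0.02 * np.random.randn(t.size))

(a, k), _ = curve_fit(clearance, t, counts, p0=(1000.0, 0.4))
lam = 1.0                 # brain/blood partition coefficient (mL/g), tissue dependent
cbf = 100.0 * lam * k     # flow index in mL/100 g/min
print(f"k = {k:.2f} 1/min -> CBF ~ {cbf:.0f} mL/100g/min")
```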

  3. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    International Nuclear Information System (INIS)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging: validating the quantitative integrity of scanners and developing imaging techniques and image-analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes combined research efforts from physicists working in CAD and those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more, from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  4. Separation and quantitation of polyethylene glycols 400 and 3350 from human urine by high-performance liquid chromatography.

    Science.gov (United States)

    Ryan, C M; Yarmush, M L; Tompkins, R G

    1992-04-01

    Polyethylene glycol 3350 (PEG 3350) is useful as an orally administered probe to measure in vivo intestinal permeability to macromolecules. Previous methods to detect polyethylene glycol (PEG) excreted in the urine have been hampered by inherent inaccuracies associated with liquid-liquid extraction and turbidimetric analysis. For accurate quantitation by previous methods, radioactive labels were required. This paper describes a method to separate and quantitate PEG 3350 and PEG 400 in human urine that is independent of radioactive labels and is accurate in clinical practice. The method uses sized regenerated cellulose membranes and mixed ion-exchange resin for sample preparation and high-performance liquid chromatography with refractive index detection for analysis. The 24-h excretion for normal individuals after an oral dose of 40 g of PEG 3350 and 5 g of PEG 400 was 0.12 +/- 0.04% of the original dose of PEG 3350 and 26.3 +/- 5.1% of the original dose of PEG 400.

  5. Improved quantitation and reproducibility in multi-PET/CT lung studies by combining CT information.

    Science.gov (United States)

    Holman, Beverley F; Cuplov, Vesna; Millner, Lynn; Endozo, Raymond; Maher, Toby M; Groves, Ashley M; Hutton, Brian F; Thielemans, Kris

    2018-06-05

    Matched attenuation maps are vital for obtaining accurate and reproducible kinetic and static parameter estimates from PET data. With increased interest in PET/CT imaging of diffuse lung diseases for assessing disease progression and treatment effectiveness, understanding the extent of the effect of respiratory motion and establishing methods for correction are becoming more important. In a previous study, we showed that using the wrong attenuation map leads to large errors due to density mismatches in the lung, especially in dynamic PET scans. Here, we extend this work to the case where the study is sub-divided into several scans, e.g. for patient comfort, each with its own CT (cine-CT and 'snap shot' CT). A method to combine multi-CT information into a combined-CT has been developed, which averages the CT information from each study section to produce composite CT images with lung density more representative of that in the PET data. This combined-CT approach was applied to nine patients with idiopathic pulmonary fibrosis, imaged with dynamic 18F-FDG PET/CT, to determine the improvement in the precision of the parameter estimates. Using XCAT simulations, errors in the influx rate constant were found to be as high as 60% in multi-PET/CT studies. Analysis of patient data identified displacements between study sections in the time-activity curves, which led to an average standard error in the estimates of the influx rate constant of 53% with conventional methods. This reduced to within 5% after use of combined-CTs for attenuation correction of the study sections. Use of combined-CTs to reconstruct the sections of a multi-PET/CT study, as opposed to using the individually acquired CTs at each study stage, produces more precise parameter estimates and may improve discrimination between diseased and normal lung.
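
    The core averaging step can be sketched in a few lines, assuming the per-section CT volumes are already co-registered and a lung mask is available (a simplification of the paper's combined-CT construction):

```python
import numpy as np

def combined_ct(ct_volumes, lung_mask):
    """Average co-registered CT volumes inside the lung to build a combined-CT.

    ct_volumes : list of HU arrays, one per study section, already registered
    lung_mask  : boolean array; voxels outside the lung keep the first CT's values
    """
    stack = np.stack(ct_volumes, axis=0)
    combined = ct_volumes[0].astype(float).copy()
    combined[lung_mask] = stack[:, lung_mask].mean(axis=0)
    return combined

# usage sketch: combined = combined_ct([ct_section1, ct_section2, ct_section3], mask)
```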

  6. Involving the expert in the delivery of environmental information from the web

    International Nuclear Information System (INIS)

    Wanner, Leo

    2013-01-01

    Automatic provision of accurate environmental information tailored to user needs and profiles is in increasing demand. However, it is a challenging task with many facets. Among them is the difficulty of compiling, and casting into a formal representation, all the expert knowledge needed to accurately and exhaustively acquire, assess and process the data in order to be able to reliably judge their relevance to the user and to produce an adequate summary and recommendations. Studies in Human-Computer Interaction show that both the satisfaction of users with an application and the objective performance of a service increase if the users (in particular, experts) are assigned an active role in the system. Based on this insight, we propose a largely interactive environmental information acquisition and generation framework, which has been realized in the FP7 project "Personalized Environmental Service Configuration and Delivery Orchestration" (PESCaDO). The PESCaDO service involves the experts in four central tasks: (i) determination of criteria for the search of environmental nodes in the web; (ii) assessment of the relevance of the identified nodes; (iii) assessment of the quality of the data provided by the nodes; and (iv) selection of the content to be communicated to the user. Quantitative evaluations and user trials show that the performance of the system is good and satisfaction with the service is high. (orig.)

  7. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce this imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, these data should be actively used to increase the efficiency of operations and, ultimately, to reduce plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. Continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making.

  8. Using quantitative structure-activity relationships (QSAR) to predict toxic endpoints for polycyclic aromatic hydrocarbons (PAH).

    Science.gov (United States)

    Bruce, Erica D; Autenrieth, Robin L; Burghardt, Robert C; Donnelly, K C; McDonald, Thomas J

    2008-01-01

    Quantitative structure-activity relationships (QSAR) offer a reliable, cost-effective alternative to the time, money, and animal lives necessary to determine chemical toxicity by traditional methods. Additionally, humans are exposed to tens of thousands of chemicals in their lifetimes, necessitating the determination of chemical toxicity and screening for those posing the greatest risk to human health. This study developed models to predict toxic endpoints for three bioassays specific to several stages of carcinogenesis. The ethoxyresorufin O-deethylase (EROD) assay, the Salmonella/microsome assay, and a gap junction intercellular communication (GJIC) assay were chosen for their ability to measure toxic endpoints specific to activation-, induction-, and promotion-related effects of polycyclic aromatic hydrocarbons (PAH). Shape-electronic, spatial, information content, and topological descriptors proved to be important descriptors in predicting the toxicity of PAH in these bioassays. Bioassay-based toxic equivalency factors (TEF(B)) were developed for several PAH using the quantitative structure-toxicity relationships (QSTR) thus derived. Predicting toxicity for a specific PAH compound, such as a bioassay-based potential potency (PP(B)) or a TEF(B), is possible by combining the predicted behavior from the QSTR models. These toxicity estimates may then be incorporated into a risk assessment for compounds that lack toxicity data. Accurate toxicity predictions are made by examining each type of endpoint important to the process of carcinogenicity, and a clearer understanding of the relationship between composition and toxicity can be obtained.

  9. A Qualitative and Quantitative Comparative Analysis of Commercial and Independent Online Information for Hip Surgery: A Bias in Online Information Targeting Patients?

    Science.gov (United States)

    Kelly, Martin J; Feeley, Iain H; O'Byrne, John M

    2016-10-01

    Direct to consumer (DTC) advertising, targeting the public over the physician, is an increasingly pervasive presence in medical clinics. It is trending toward a format of online interaction rather than that of traditional print and television advertising. We analyze patient-focused Web pages from the top 5 companies supplying prostheses for total hip arthroplasties, comparing them to the top 10 independent medical websites. Quantitative comparison is performed using the Journal of the American Medical Association (JAMA) benchmark and DISCERN criteria, and for comparative readability, we use the Flesch-Kincaid grade level, the Flesch reading ease, and the Gunning fog index. Content is analyzed for information on type of surgery and surgical approach. There is a statistically significant difference between the independent and DTC websites in both the mean DISCERN score (independent 74.6, standard deviation [SD] = 4.77; DTC 32.2, SD = 10.28; P = .0022) and the mean JAMA score (independent 3.45, SD = 0.49; DTC 1.9, SD = 0.74; P = .004). The difference between the readability scores is not statistically significant. The commercial content is found to be heavily biased in favor of the direct anterior approach and minimally invasive surgical techniques. We demonstrate that the quality of information on commercial websites is inferior to that of the independent sites. The advocacy of surgical approaches by industry to the patient group is a concern. This study underlines the importance of future regulation of commercial patient education Web pages. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Quantitating cellular immune responses to cancer vaccines.

    Science.gov (United States)

    Lyerly, H Kim

    2003-06-01

    While the future of immunotherapy in the treatment of cancer is promising, it is difficult to compare the various approaches because monitoring assays have not been standardized in approach or technique. Common assays for measuring the immune response need to be established so that these assays can one day serve as surrogate markers for clinical response. Assays that accurately detect and quantitate T-cell-mediated, antigen-specific immune responses are particularly desired. However, to date, increases in the number of cytotoxic T cells through immunization have not been correlated with clinical tumor regression. Ideally, then, a T-cell assay not only needs to be sensitive, specific, reliable, reproducible, simple, and quick to perform, it must also demonstrate close correlation with clinical outcome. Assays currently used to measure T-cell response are delayed-type hypersensitivity testing, flow cytometry using peptide major histocompatibility complex tetramers, the lymphoproliferation assay, the enzyme-linked immunosorbent assay, the enzyme-linked immunospot assay, cytokine flow cytometry, the direct cytotoxicity assay, measurement of cytokine mRNA by quantitative reverse transcriptase polymerase chain reaction, and limiting dilution analysis. The purpose of this review is to describe the attributes of each test and compare their advantages and disadvantages.

  11. Hydrogen atoms can be located accurately and precisely by x-ray crystallography.

    Science.gov (United States)

    Woińska, Magdalena; Grabowsky, Simon; Dominiak, Paulina M; Woźniak, Krzysztof; Jayatilaka, Dylan

    2016-05-01

    Precise and accurate structural information on hydrogen atoms is crucial to the study of energies of interactions important for crystal engineering, materials science, medicine, and pharmacy, and to the estimation of physical and chemical properties in solids. However, hydrogen atoms only scatter x-radiation weakly, so x-rays have not been used routinely to locate them accurately. Textbooks and teaching classes still emphasize that hydrogen atoms cannot be located with x-rays close to heavy elements; instead, neutron diffraction is needed. We show that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A-H) that are in agreement with results from neutron diffraction mostly within a single standard deviation. The precision of the determination is also comparable between x-ray and neutron diffraction results. This has been achieved at resolutions as low as 0.8 Å using Hirshfeld atom refinement (HAR). We have applied HAR to 81 crystal structures of organic molecules and compared the A-H bond lengths with those from neutron measurements for A-H bonds sorted into bonds of the same class. We further show in a selection of inorganic compounds that hydrogen atoms can be located in bridging positions and close to heavy transition metals accurately and precisely. We anticipate that, in the future, conventional x-radiation sources at in-house diffractometers can be used routinely for locating hydrogen atoms in small molecules accurately instead of large-scale facilities such as spallation sources or nuclear reactors.

  12. Information adaptation the interplay between Shannon information and semantic information in cognition

    CERN Document Server

    Haken, Hermann

    2015-01-01

    This monograph demonstrates the interplay between Shannon information and semantic information in cognition. It shows that Shannon's information acts as a driving force for the formation of semantic information, and vice versa, namely, that semantic information participates in the formation of Shannonian information. The authors show that in cognition, Shannonian and semantic information are interrelated as two aspects of a cognitive process termed information adaptation. In the latter, the mind/brain adapts to the environment by deflating and/or inflating the information conveyed by the environment. In the process of information adaptation, quantitative variations in Shannon's information entail different meanings, while different meanings affect the quantity of information. The book illustrates the above conceptually and mathematically by reference to three cognitive processes: pattern recognition, face learning and the recognition of a moving object.

  13. Identification of internal control genes for quantitative expression analysis by real-time PCR in bovine peripheral lymphocytes.

    Science.gov (United States)

    Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice

    2011-09-01

    Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals and under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following the characterization by flow cytometric analysis of the cell populations obtained from blood through a density gradient procedure, three popular software tools were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.
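
    Reference-gene stability is commonly ranked with the geNorm measure M, the average standard deviation of pairwise log-ratios across samples (this mirrors one of the popular tools such studies rely on); a compact sketch with hypothetical expression values:

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability measure M for candidate reference genes.

    expr : (n_samples, n_genes) array of relative quantities (linear scale).
    M_j is the mean, over all other genes k, of the standard deviation across
    samples of log2(expr_j / expr_k); lower M means a more stable gene.
    """
    log = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log[:, j] - log[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)
    return m

# Hypothetical relative quantities for 4 candidate genes across 6 animals
rng = np.random.default_rng(0)
expr = rng.lognormal(mean=0.0, sigma=0.2, size=(6, 4))
print(genorm_m(expr))
```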

  14. Delineating Rearrangements in Single Yeast Artificial Chromosomes by Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Weier, Heinz-Ulrich G.; Greulich-Bode, Karin M.; Wu, Jenny; Duell, Thomas

    2009-09-18

    Cloning of large chunks of human genomic DNA in recombinant systems such as yeast or bacterial artificial chromosomes has greatly facilitated the construction of physical maps, the positional cloning of disease genes and the preparation of patient-specific DNA probes for diagnostic purposes. For this process to work efficiently, the DNA cloning process and subsequent clone propagation need to maintain stable inserts that are neither deleted nor otherwise rearranged. Some regions of the human genome, however, appear to have a higher propensity than others to rearrange in any host system. Thus, techniques to detect and accurately characterize such rearrangements need to be developed. We developed a technique termed 'Quantitative DNA Fiber Mapping (QDFM)' that allows accurate tagging of sequence elements of interest with near-kilobase accuracy and optimized it for delineation of rearrangements in recombinant DNA clones. This paper demonstrates the power of this microscopic approach by investigating YAC rearrangements. In our examples, high-resolution physical maps for regions within the immunoglobulin lambda variant gene cluster were constructed for three different YAC clones carrying deletions of 95 kb and more. Rearrangements within YACs could be demonstrated unambiguously by pairwise mapping of cosmids along YAC DNA molecules. When coverage by YAC clones was not available, distances between cosmid clones were estimated by hybridization of cosmids onto DNA fibers prepared from human genomic DNA. In addition, the QDFM technology provides essential information about clone stability, facilitating closure of the maps of the human genome as well as those of model organisms.

  15. Limitations of quantitative photoacoustic measurements of blood oxygenation in small vessels

    International Nuclear Information System (INIS)

    Sivaramakrishnan, Mathangi; Maslov, Konstantin; Zhang, Hao F; Stoica, George; Wang, Lihong V

    2007-01-01

    We investigate the feasibility of obtaining accurate quantitative information, such as the local blood oxygenation level (sO2), with a spatial resolution of about 50 μm from spectral photoacoustic (PA) measurements. The optical wavelength dependence of the peak values of the PA signals is utilized to obtain the local blood oxygenation level. In our in vitro experimental models, the PA signal amplitude is found to be linearly proportional to the blood optical absorption coefficient when using ultrasonic transducers with central frequencies high enough that the ultrasonic wavelengths are shorter than the light penetration depth into the blood vessels. For an optical wavelength in the 578-596 nm region, with a transducer central frequency above 25 MHz, the sensitivity and accuracy of sO2 inversion is shown to be better than 4%. The effect of the transducer focal position on the accuracy of quantifying blood oxygenation is found to be negligible. In vivo oxygenation measurements of rat skin microvasculature yield results consistent with those from in vitro studies, although factors specific to in vivo measurements, such as the spectral dependence of tissue optical attenuation, dramatically affect the accuracy of sO2 quantification in vivo.
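
    Spectral sO2 estimation reduces to linear unmixing of the absorption coefficient into oxy- and deoxyhemoglobin contributions; a sketch assuming fluence-corrected PA amplitudes proportional to absorption, with illustrative (not tabulated) extinction values:

```python
import numpy as np

def so2_from_pa(pa_amplitudes, eps_hbo2, eps_hb):
    """Least-squares sO2 from fluence-corrected multiwavelength PA amplitudes.

    eps_hbo2 / eps_hb : molar extinction coefficients at the same wavelengths.
    Amplitudes are assumed proportional to the local absorption coefficient.
    """
    E = np.column_stack([eps_hbo2, eps_hb])
    (c_hbo2, c_hb), *_ = np.linalg.lstsq(E, np.asarray(pa_amplitudes), rcond=None)
    return c_hbo2 / (c_hbo2 + c_hb)

# Illustrative extinction values at four wavelengths in the 578-596 nm region
eps_hbo2 = np.array([36000.0, 24000.0, 14000.0, 9000.0])
eps_hb = np.array([28000.0, 27000.0, 26000.0, 25000.0])
pa = 0.8 * eps_hbo2 + 0.2 * eps_hb   # synthetic vessel with true sO2 = 0.8
print(so2_from_pa(pa, eps_hbo2, eps_hb))  # recovers ~0.8
```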

  16. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (P < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis using MRI for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option.

  17. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    Science.gov (United States)

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its chromatographic similarity to the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results, with acceptable within-run relative standard deviations (RSDs), for the quantitation of clobenzorex.
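
    Internal-standard quantitation of the kind described above fits the analyte/IS peak-area ratio against the concentration ratio; a minimal sketch with hypothetical calibration data:

```python
import numpy as np

# Internal-standard calibration: response ratio (analyte area / IS area)
# versus concentration ratio, fit by least squares.
conc_ratio = np.array([0.1, 0.25, 0.5, 1.0, 2.0])      # analyte / internal standard
area_ratio = np.array([0.11, 0.27, 0.52, 1.03, 2.05])  # hypothetical responses

slope, intercept = np.polyfit(conc_ratio, area_ratio, 1)

def quantify(sample_area_ratio, is_conc_ng_ml=100.0):
    """Analyte concentration from a sample's analyte/IS peak-area ratio."""
    return (sample_area_ratio - intercept) / slope * is_conc_ng_ml

print(quantify(0.74))  # ~72 ng/mL with a 100 ng/mL internal standard spike
```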

  18. Scientific and technological information: analysis of periodic publications of information science

    OpenAIRE

    Mayara Cintya do Nascimento Vasconcelos; Gabriela Belmont de Farias

    2017-01-01

    The research analyzes the articles published in national scientific journals of the area of Information Science, classified as Qualis A1, using the term "scientific and technological information" as a parameter. It presents concepts of scientific and technological information and the processes that involve its use, as well as scientific communication, information flows and sources of information. The methodology used is a descriptive study with a quantitative-qualitative approach, using ...

  19. Qualitative and quantitative laser-induced breakdown spectroscopy of bronze objects

    International Nuclear Information System (INIS)

    Tankova, V; Blagoev, K; Grozeva, M; Malcheva, G; Penkova, P

    2016-01-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique for qualitative and quantitative elemental analysis of solids, liquids and gases. In this work, the method was applied to the investigation of archaeological bronze objects. The analytical information obtained by LIBS was used for qualitative determination of the elements in the material used to manufacture the objects under study. Quantitative chemical analysis was also performed after generating calibration curves with standard samples of similar matrix composition. Quantitative estimation of the elemental concentration in the bulk of the samples was performed, together with investigation of the surface layer of the objects. The results of the quantitative analyses gave indications about the manufacturing process of the investigated objects. (paper)

  20. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    Science.gov (United States)

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with the quantitative fat fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments. Key points: • Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.

  1. Allele-sharing models: LOD scores and accurate linkage tests.

    Science.gov (United States)

    Kong, A; Cox, N J

    1997-11-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.

  2. ABRF-PRG07: advanced quantitative proteomics study.

    Science.gov (United States)

    Falick, Arnold M; Lane, William S; Lilley, Kathryn S; MacCoss, Michael J; Phinney, Brett S; Sherman, Nicholas E; Weintraub, Susan T; Witkowska, H Ewa; Yates, Nathan A

    2011-04-01

    A major challenge for core facilities is determining quantitative protein differences across complex biological samples. Although there are numerous techniques in the literature for relative and absolute protein quantification, the majority is nonroutine and can be challenging to carry out effectively. There are few studies comparing these technologies in terms of their reproducibility, accuracy, and precision, and no studies to date deal with performance across multiple laboratories with varied levels of expertise. Here, we describe an Association of Biomolecular Resource Facilities (ABRF) Proteomics Research Group (PRG) study based on samples composed of a complex protein mixture into which 12 known proteins were added at varying but defined ratios. All of the proteins were present at the same concentration in each of three tubes that were provided. The primary goal of this study was to allow each laboratory to evaluate its capabilities and approaches with regard to: detection and identification of proteins spiked into samples that also contain complex mixtures of background proteins and determination of relative quantities of the spiked proteins. The results returned by 43 participants were compiled by the PRG, which also collected information about the strategies used to assess overall performance and as an aid to development of optimized protocols for the methodologies used. The most accurate results were generally reported by the most experienced laboratories. Among laboratories that used the same technique, values that were closer to the expected ratio were obtained by more experienced groups.

  3. Quantitative determination of α and f parameters for κ0 NAA

    International Nuclear Information System (INIS)

    Moon, J. H.; Kim, S. H.; Jeong, Y. S.

    2002-01-01

    Instrumental Neutron Activation Analysis, a representative nuclear analytical technique, has the advantages of non-destructive, simultaneous multi-element analysis with the characteristics of an absolute measurement. Recently, use of the κ0 quantitation method, which is accurate, convenient and user-friendly, has become widespread worldwide. In this study, the α and f parameters, which are indispensable for implementing κ0 NAA, were experimentally measured at the NAA No.1 irradiation hole of the HANARO research reactor. In addition, the aim was to apply the method to routine analysis through the establishment of a reliable and effective measurement procedure.

  4. Analysis of ribosomal RNA stability in dead cells of wine yeast by quantitative PCR.

    Science.gov (United States)

    Sunyer-Figueres, Merce; Wang, Chunxiao; Mas, Albert

    2018-04-02

    During wine production, some yeasts enter a Viable But Not Culturable (VBNC) state, which may influence the quality and stability of the final wine through remnant metabolic activity or by resuscitation. Culture-independent techniques are used for obtaining an accurate estimation of the number of live cells, and quantitative PCR could be the most accurate technique. As a marker of cell viability, rRNA was evaluated by analyzing its stability in dead cells. The species-specific stability of rRNA was tested in Saccharomyces cerevisiae, as well as in three species of non-Saccharomyces yeast (Hanseniaspora uvarum, Torulaspora delbrueckii and Starmerella bacillaris). High temperature and antimicrobial dimethyl dicarbonate (DMDC) treatments were efficient in lysing the yeast cells. rRNA gene and rRNA (as cDNA) were analyzed over 48 h after cell lysis by quantitative PCR. The results confirmed the stability of rRNA for 48 h after the cell lysis treatments. To sum up, rRNA may not be a good marker of cell viability in the wine yeasts that were tested. Copyright © 2018 Elsevier B.V. All rights reserved.
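
    Absolute quantification by qPCR of the kind used here is conventionally done against a standard curve of Ct versus log10 template amount. A minimal sketch of that calculation, with illustrative Ct values rather than the study's data:

      import numpy as np

      # Standard curve: Ct measured for serial dilutions of known copy number
      log10_copies = np.array([7.0, 6.0, 5.0, 4.0, 3.0])
      ct_values = np.array([14.1, 17.5, 20.9, 24.3, 27.8])   # illustrative

      slope, intercept = np.polyfit(log10_copies, ct_values, 1)
      efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 for perfect doubling

      def copies_from_ct(ct):
          """Interpolate template amount for an unknown from its Ct."""
          return 10.0 ** ((ct - intercept) / slope)

      print(f"amplification efficiency: {efficiency:.1%}")
      print(f"sample at Ct 22.4 -> {copies_from_ct(22.4):.3g} copies")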

  5. Accurate Evaluation of Quantum Integrals

    Science.gov (United States)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. Important results are that error estimates are provided, and that one can extrapolate expectation values rather than the wavefunctions to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson's extrapolation, and show that the expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
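
    The technique itself is easy to demonstrate. The sketch below discretizes the 1D harmonic oscillator (exact ground-state energy 0.5 in units hbar = m = omega = 1) with a second-order finite-difference Hamiltonian and applies one Richardson extrapolation step to the eigenvalue; it is a generic illustration of the method, not the authors' code.

      import numpy as np

      def ground_state_energy(n, L=10.0):
          """Ground state of the 1D harmonic oscillator (V = x^2/2, hbar = m = 1)
          from a second-order finite-difference Hamiltonian on n mesh points."""
          x, h = np.linspace(-L / 2, L / 2, n, retstep=True)
          main = 1.0 / h**2 + 0.5 * x**2          # -(1/2) * (-2/h^2) + V(x)
          off = -0.5 / h**2 * np.ones(n - 1)
          H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
          return np.linalg.eigvalsh(H)[0]

      e_h = ground_state_energy(201)              # mesh spacing h = 0.05
      e_h2 = ground_state_energy(401)             # mesh spacing h/2
      e_rich = (4.0 * e_h2 - e_h) / 3.0           # one Richardson step: O(h^4)
      print(e_h, e_h2, e_rich)                    # exact value: 0.5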

  6. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  7. A quantitative lubricant test for deep drawing

    DEFF Research Database (Denmark)

    Olsson, David Dam; Bay, Niels; Andreasen, Jan L.

    2010-01-01

    A tribological test for deep drawing has been developed by which the performance of lubricants may be evaluated quantitatively measuring the maximum backstroke force on the punch owing to friction between tool and workpiece surface. The forming force is found not to give useful information...

  8. Experimental Influences in the Accurate Measurement of Cartilage Thickness in MRI.

    Science.gov (United States)

    Wang, Nian; Badar, Farid; Xia, Yang

    2018-01-01

    Objective: To study the experimental influences on the measurement of cartilage thickness by magnetic resonance imaging (MRI). Design: The complete thicknesses of healthy and trypsin-degraded cartilage were measured by high-resolution MRI under different conditions, using two intensity-based imaging sequences (ultra-short echo [UTE] and multislice-multiecho [MSME]) and three quantitative relaxation imaging sequences (T1, T2 and T1ρ). Other variables included different orientations in the magnet, two soaking solutions (saline and phosphate-buffered saline [PBS]), and external loading. Results: With cartilage soaked in saline, the UTE and T1 methods yielded complete and consistent measurements of cartilage thickness, while the thickness measurements by the T2, T1ρ and MSME methods were orientation dependent. The effect of external loading on cartilage thickness was also sequence and orientation dependent. All variations in cartilage thickness in MRI could be eliminated with the use of 100 mM PBS or by imaging with the UTE sequence. Conclusions: The appearance of articular cartilage and the measurement accuracy of cartilage thickness in MRI can be influenced by a number of experimental factors in ex vivo MRI, from the use of various pulse sequences and soaking solutions to the health of the tissue. T2-based imaging sequences, both proton-intensity and quantitative relaxation, similarly produced the largest variations. With adequate resolution, the accurate measurement of whole cartilage tissue in clinical MRI could be utilized to detect differences between healthy and osteoarthritic cartilage after compression.

  9. Qualitative and quantitative information flow analysis for multi-thread programs

    NARCIS (Netherlands)

    Ngo, Minh Tri

    2014-01-01

    In today's information-based society, guaranteeing information security plays an important role in all aspects of life: communication between citizens and governments, military, companies, financial information systems, web-based services etc. With the increasing popularity of computer systems with

  10. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Full Text Available Quantitative modeling of biological systems has become an essential computational approach in the design of novel and the analysis of existing biological systems. However, kinetic information that describes the system's dynamics needs to be known in order to obtain relevant results with conventional modeling strategies, and this information is often hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modeling approach that can cope with unknown kinetic data and thus produce relevant results even though the dynamic data are incomplete or only vaguely defined. Moreover, the methodology can be used in combination with existing state-of-the-art quantitative modeling strategies in just certain parts of the system, i.e., where the data are missing. The case study of the methodology suggested in this paper is performed on a model of nine genes. We propose a kind of FPN model based on fuzzy sets to handle the quantitative modeling of biological systems. Tests of our model show that it is practical and quite powerful for data imitation and reasoning in fuzzy expert systems.

  11. Quantitative determination and validation of octreotide acetate using 1 H-NMR spectroscopy with internal standard method.

    Science.gov (United States)

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks, referred to as proton exchange, in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including the relaxation delay time, the number of scans, and the pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
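
    The internal standard method rests on the standard qNMR purity equation relating signal integrals, proton counts, molar masses and weighed masses of the analyte and the internal standard. A sketch of that equation as a function; the integrals, weighed masses and standard purity below are invented for illustration, and the molar masses are nominal values for octreotide acetate and gemcitabine hydrochloride rather than the paper's data.

      def qnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
          """Internal-standard qNMR purity equation.
          I: signal integral, N: protons behind the signal, M: molar mass
          (g/mol), m: weighed mass (mg), P: purity; a = analyte, s = standard."""
          return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

      # Illustrative inputs only (not the paper's data).
      purity = qnmr_purity(I_a=1.012, I_s=1.000, N_a=2, N_s=1,
                           M_a=1079.3, M_s=299.66, m_a=10.12, m_s=5.50,
                           P_s=0.998)
      print(f"estimated purity: {purity:.1%}")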

  12. Shedding quantitative fluorescence light on novel regulatory mechanisms in skeletal biomedicine and biodentistry.

    Science.gov (United States)

    Lee, Ji-Won; Iimura, Tadahiro

    2017-02-01

    Digitalized fluorescence images contain numerical information such as color (wavelength), fluorescence intensity and spatial position. However, quantitative analyses of acquired data and their validation remained to be established. Our research group has applied quantitative fluorescence imaging on tissue sections and uncovered novel findings in skeletal biomedicine and biodentistry. This review paper includes a brief background of quantitative fluorescence imaging and discusses practical applications by introducing our previous research. Finally, the future perspectives of quantitative fluorescence imaging are discussed.

  13. Accurate Sybil Attack Detection Based on Fine-Grained Physical Channel Information

    Directory of Open Access Journals (Sweden)

    Chundong Wang

    2018-03-01

    Full Text Available With the development of the Internet-of-Things (IoT), wireless network security has more and more attention paid to it. The Sybil attack is one of the famous wireless attacks that can forge wireless devices to steal information from clients. These forged devices may constantly attack target access points to crush the wireless network. In this paper, we propose a novel Sybil attack detection based on Channel State Information (CSI). This detection algorithm can tell whether the static devices are Sybil attackers by combining a self-adaptive multiple signal classification algorithm with the Received Signal Strength Indicator (RSSI). Moreover, we develop a novel tracing scheme to cluster the channel characteristics of mobile devices and detect dynamic attackers that change their channel characteristics in an error area. Finally, we experiment on mobile and commercial WiFi devices. Our algorithm can effectively distinguish the Sybil devices. The experimental results show that our Sybil attack detection system achieves high accuracy for both static and dynamic scenarios. Therefore, combining the phase and similarity of channel features, the multi-dimensional analysis of CSI can effectively detect Sybil nodes and improve the security of wireless networks.

  14. Accurate Sybil Attack Detection Based on Fine-Grained Physical Channel Information.

    Science.gov (United States)

    Wang, Chundong; Zhu, Likun; Gong, Liangyi; Zhao, Zhentang; Yang, Lei; Liu, Zheli; Cheng, Xiaochun

    2018-03-15

    With the development of the Internet-of-Things (IoT), wireless network security has more and more attention paid to it. The Sybil attack is one of the famous wireless attacks that can forge wireless devices to steal information from clients. These forged devices may constantly attack target access points to crush the wireless network. In this paper, we propose a novel Sybil attack detection based on Channel State Information (CSI). This detection algorithm can tell whether the static devices are Sybil attackers by combining a self-adaptive multiple signal classification algorithm with the Received Signal Strength Indicator (RSSI). Moreover, we develop a novel tracing scheme to cluster the channel characteristics of mobile devices and detect dynamic attackers that change their channel characteristics in an error area. Finally, we experiment on mobile and commercial WiFi devices. Our algorithm can effectively distinguish the Sybil devices. The experimental results show that our Sybil attack detection system achieves high accuracy for both static and dynamic scenarios. Therefore, combining the phase and similarity of channel features, the multi-dimensional analysis of CSI can effectively detect Sybil nodes and improve the security of wireless networks.

  15. Pinpointing Phosphorylation Sites: Quantitative Filtering and a Novel Site-specific x-Ion Fragment

    DEFF Research Database (Denmark)

    Kelstrup, Christian D; Hekmat, Omid; Francavilla, Chiara

    2011-01-01

    Phosphoproteomics deals with the identification and quantification of thousands of phosphopeptides. Localizing the phosphorylation site is however much more difficult than establishing the identity of a phosphorylated peptide. Further, recent findings have raised doubts about the validity of the site assignments. ... We investigated gas-phase phosphate rearrangement reactions during collision-induced dissociation (CID) and used these spectra to devise a quantitative filter that, by comparing signal intensities of putative phosphorylated fragment ions with their nonphosphorylated counterparts, allowed us to accurately pinpoint which fragment ions contain a phosphorylated residue and which ones do not. We also evaluated higher-energy collisional dissociation (HCD) and found this to be an accurate method for correct phosphorylation site localization, with no gas-phase rearrangements observed above noise level. Analyzing a large set of HCD spectra ...

  16. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Science.gov (United States)

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  17. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Directory of Open Access Journals (Sweden)

    Sho Manabe

    Full Text Available In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  18. The effect of volume-of-interest misregistration on quantitative planar activity and dose estimation

    International Nuclear Information System (INIS)

    Song, N; Frey, E C; He, B

    2010-01-01

    In targeted radionuclide therapy (TRT), dose estimation is essential for treatment planning and tumor dose response studies. Dose estimates are typically based on a time series of whole-body conjugate view planar or SPECT scans of the patient acquired after administration of a planning dose. Quantifying the activity in the organs from these studies is an essential part of dose estimation. The quantitative planar (QPlanar) processing method involves accurate compensation for image degrading factors and correction for organ and background overlap via the combination of computational models of the image formation process and 3D volumes of interest defining the organs to be quantified. When the organ VOIs are accurately defined, the method intrinsically compensates for attenuation, scatter and partial volume effects, as well as overlap with other organs and the background. However, alignment between the 3D organ volume of interest (VOIs) used in QPlanar processing and the true organ projections in the planar images is required. The aim of this research was to study the effects of VOI misregistration on the accuracy and precision of organ activity estimates obtained using the QPlanar method. In this work, we modeled the degree of residual misregistration that would be expected after an automated registration procedure by randomly misaligning 3D SPECT/CT images, from which the VOI information was derived, and planar images. Mutual information-based image registration was used to align the realistic simulated 3D SPECT images with the 2D planar images. The residual image misregistration was used to simulate realistic levels of misregistration and allow investigation of the effects of misregistration on the accuracy and precision of the QPlanar method. We observed that accurate registration is especially important for small organs or ones with low activity concentrations compared to neighboring organs. In addition, residual misregistration gave rise to a loss of precision

  19. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Bone is a hard tissue, and it has long been difficult to observe the interior of living bone tissue. With the progress of microscopy and fluorescent-probe technologies in recent years, it has become possible to observe the various activities of the many kinds of cells that form bone. On the other hand, the rapid increase in data volume and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection, and the development of methodologies for processing microscopic images and analyzing the data has been anticipated. In this article, we introduce the research field of bioimage informatics, the boundary area between biology and information science, and then outline basic image-processing technology for the quantitative analysis of live-imaging data of bone.

  20. Stochastic resonance is applied to quantitative analysis for weak chromatographic signal of glyburide in plasma

    International Nuclear Information System (INIS)

    Zhang Wei; Xiang Bingren; Wu Yanwei; Shang Erxin

    2005-01-01

    Based on the theory of stochastic resonance, a new method was developed for the quantitative analysis of the weak chromatographic signal of glyburide in plasma, which is embedded in the noise background; the signal-to-noise ratio (SNR) of HPLC-UV is thereby enhanced remarkably. The method improves the quantification limit to 1 ng/ml, the same as HPLC-MS, and makes it possible to detect the weak signal accurately by HPLC-UV, which was not feasible before. The results showed good recovery and a linear range from 1 to 50 ng/ml of glyburide in plasma, and the method can be used for the quantitative analysis of glyburide.
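
    The underlying effect is easy to reproduce with a toy threshold detector: a sub-threshold sine is invisible at low noise, emerges in the output spectrum at intermediate noise, and drowns again at high noise. This generic sketch illustrates stochastic resonance only; it is not the authors' chromatographic signal processing.

      import numpy as np

      rng = np.random.default_rng(0)
      fs, f0, n = 1000.0, 5.0, 20000        # sampling rate, signal freq, samples
      t = np.arange(n) / fs
      weak_signal = 0.4 * np.sin(2 * np.pi * f0 * t)   # sub-threshold amplitude
      threshold = 1.0

      def output_snr_db(noise_std):
          """SNR (dB) at the drive frequency after a hard threshold detector."""
          x = weak_signal + rng.normal(0.0, noise_std, n)
          y = (x > threshold).astype(float)            # threshold nonlinearity
          spec = np.abs(np.fft.rfft(y - y.mean())) ** 2
          freqs = np.fft.rfftfreq(n, 1.0 / fs)
          k = np.argmin(np.abs(freqs - f0))            # bin of the drive tone
          floor = np.median(np.delete(spec, k))
          return 10 * np.log10((spec[k] + 1e-12) / (floor + 1e-12))

      for sigma in [0.05, 0.2, 0.5, 1.0, 2.0]:
          print(f"noise std {sigma:4.2f} -> output SNR {output_snr_db(sigma):6.1f} dB")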

  1. Quantitative operando visualization of the energy band depth profile in solar cells.

    Science.gov (United States)

    Chen, Qi; Mao, Lin; Li, Yaowen; Kong, Tao; Wu, Na; Ma, Changqi; Bai, Sai; Jin, Yizheng; Wu, Dan; Lu, Wei; Wang, Bing; Chen, Liwei

    2015-07-13

    The energy band alignment in solar cell devices is critically important because it largely governs elementary photovoltaic processes, such as the generation, separation, transport, recombination and collection of charge carriers. Despite the expenditure of considerable effort, the measurement of energy band depth profiles across multiple layers has been extremely challenging, especially for operando devices. Here we present direct visualization of the surface potential depth profile over the cross-sections of operando organic photovoltaic devices using scanning Kelvin probe microscopy. The convolution effect due to finite tip size and cantilever beam crosstalk has previously prohibited quantitative interpretation of scanning Kelvin probe microscopy-measured surface potential depth profiles. We develop a bias voltage-compensation method to address this critical problem and obtain quantitatively accurate measurements of the open-circuit voltage, built-in potential and electrode potential difference.

  2. Accurate evaluation of subband structure in a carrier accumulation layer at an n-type InAs surface: LDF calculation combined with high-resolution photoelectron spectroscopy

    Directory of Open Access Journals (Sweden)

    Takeshi Inaoka

    2012-12-01

    Full Text Available Adsorption on an n-type InAs surface often induces a gradual formation of a carrier-accumulation layer at the surface. By means of high-resolution photoelectron spectroscopy (PES), Betti et al. made a systematic observation of subbands in the accumulation layer in the formation process. Incorporating a highly nonparabolic (NP) dispersion of the conduction band into the local-density-functional (LDF) formalism, we examine the subband structure in the accumulation-layer formation process. Combining the LDF calculation with the PES experiment, we make an accurate evaluation of the accumulated-carrier density, the subband-edge energies, and the subband energy dispersion at each formation stage. Our theoretical calculation can reproduce the three observed subbands quantitatively. The subband dispersion, which deviates downward from that of the projected bulk conduction band with an increase in wave number, becomes significantly weaker in the formation process. Accurate evaluation of the NP subband dispersion at each formation stage is indispensable in making a quantitative analysis of collective electronic excitations and transport properties in the subbands.

  3. Optimized slice-selective 1H NMR experiments combined with highly accurate quantitative 13C NMR using an internal reference method

    Science.gov (United States)

    Jézéquel, Tangi; Silvestre, Virginie; Dinis, Katy; Giraudeau, Patrick; Akoka, Serge

    2018-04-01

    Isotope ratio monitoring by 13C NMR spectrometry (irm-13C NMR) provides the complete 13C intramolecular position-specific composition at natural abundance. It represents a powerful tool to track the (bio)chemical pathway which has led to the synthesis of targeted molecules, since it allows Position-specific Isotope Analysis (PSIA). Due to the very small range of variation of 13C natural abundance values (50‰), irm-13C NMR requires a 1‰ accuracy and thus highly quantitative analysis by 13C NMR. Until now, the conventional strategy to determine the position-specific abundance xi has relied on the combination of irm-MS (isotope ratio monitoring Mass Spectrometry) and quantitative 13C NMR. However, this approach presents a serious drawback since it relies on two different techniques and requires measuring separately the signals of all the carbons of the analyzed compound, which is not always possible. To circumvent this constraint, we recently proposed a new methodology to perform 13C isotopic analysis using an internal reference method and relying on NMR only. The method combines a highly quantitative 1H NMR pulse sequence (named DWET) with a 13C isotopic NMR measurement. However, the recently published DWET sequence is unsuited for samples with short T1, which forms a serious limitation for irm-13C NMR experiments where a relaxing agent is added. In this context, we suggest two variants of the DWET called Multi-WET and Profiled-WET, developed and optimized to reach the same accuracy of 1‰ with better immunity towards T1 variations. Their performance is evaluated on the determination of the 13C isotopic profile of vanillin. Both pulse sequences show a 1‰ accuracy with an increased robustness to pulse miscalibrations compared to the initial DWET method. This constitutes a major advance in the context of irm-13C NMR since it is now possible to perform isotopic analysis with high accuracy.

  4. Can Measured Synergy Excitations Accurately Construct Unmeasured Muscle Excitations?

    Science.gov (United States)

    Bianco, Nicholas A; Patten, Carolynn; Fregly, Benjamin J

    2018-01-01

    Accurate prediction of muscle and joint contact forces during human movement could improve treatment planning for disorders such as osteoarthritis, stroke, Parkinson's disease, and cerebral palsy. Recent studies suggest that muscle synergies, a low-dimensional representation of a large set of muscle electromyographic (EMG) signals (henceforth called "muscle excitations"), may reduce the redundancy of muscle excitation solutions predicted by optimization methods. This study explores the feasibility of using muscle synergy information extracted from eight muscle EMG signals (henceforth called "included" muscle excitations) to accurately construct muscle excitations from up to 16 additional EMG signals (henceforth called "excluded" muscle excitations). Using treadmill walking data collected at multiple speeds from two subjects (one healthy, one poststroke), we performed muscle synergy analysis on all possible subsets of eight included muscle excitations and evaluated how well the calculated time-varying synergy excitations could construct the remaining excluded muscle excitations (henceforth called "synergy extrapolation"). We found that some, but not all, eight-muscle subsets yielded synergy excitations that achieved >90% extrapolation variance accounted for (VAF). Using the top 10% of subsets, we developed muscle selection heuristics to identify included muscle combinations whose synergy excitations achieved high extrapolation accuracy. For 3, 4, and 5 synergies, these heuristics yielded extrapolation VAF values approximately 5% lower than corresponding reconstruction VAF values for each associated eight-muscle subset. These results suggest that synergy excitations obtained from experimentally measured muscle excitations can accurately construct unmeasured muscle excitations, which could help limit muscle excitations predicted by muscle force optimizations.
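
    The synergy-extrapolation procedure can be sketched with standard tools: factor the included channels by non-negative matrix factorization (NMF) to obtain time-varying synergy excitations, then fit non-negative weights for each excluded channel and score the fit by VAF. The code below runs on synthetic excitations and is a conceptual illustration under those assumptions, not the study's pipeline.

      import numpy as np
      from scipy.optimize import nnls
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(1)

      # Synthetic data: 12 muscle excitations driven by 4 shared synergies.
      n_muscles, n_syn, n_samples = 12, 4, 500
      W_true = rng.random((n_muscles, n_syn))
      H_true = np.abs(np.sin(rng.random((n_syn, 1)) * np.arange(n_samples) * 0.05))
      X = W_true @ H_true + 0.01 * rng.random((n_muscles, n_samples))

      included, excluded = list(range(8)), list(range(8, 12))

      # Extract time-varying synergy excitations from the included muscles.
      model = NMF(n_components=n_syn, init="nndsvda", max_iter=1000)
      model.fit(X[included])
      H = model.components_                        # (n_syn, n_samples)

      # Synergy extrapolation: nonnegative weights for each excluded muscle.
      for m in excluded:
          w, _ = nnls(H.T, X[m])                   # solve H.T @ w ~ x_m, w >= 0
          x_hat = H.T @ w
          vaf = 1.0 - np.sum((X[m] - x_hat) ** 2) / np.sum(X[m] ** 2)
          print(f"excluded muscle {m}: extrapolation VAF = {vaf:.3f}")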

  5. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    KAUST Repository

    Marquet, P.

    2016-05-03

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, and is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.

  6. Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure

    Directory of Open Access Journals (Sweden)

    Jaeyong Yee

    2015-01-01

    Full Text Available A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative, and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly, without categorization. Information gain based on an entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric evaluation method for the conditional entropy of a quantitative phenotype associated with a given genotype. Hence, the information gain can be obtained for any phenotype distribution. Because no functional form, such as a Gaussian, is assumed for the entire distribution of a trait or a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait.
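
    A standard nonparametric estimator of this kind is Vasicek's m-spacing entropy estimator, from which an information gain for a genotype partition follows directly. A minimal sketch on simulated data; the trait and genotype values are synthetic, and since they are generated independently the information gain should come out near zero.

      import numpy as np

      def m_spacing_entropy(x, m=None):
          """Vasicek's m-spacing estimator of differential entropy (nats):
          H ~ (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
          with order statistics clamped at the sample edges."""
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          if m is None:
              m = max(1, int(round(np.sqrt(n))))      # common heuristic
          upper = x[np.minimum(np.arange(n) + m, n - 1)]
          lower = x[np.maximum(np.arange(n) - m, 0)]
          return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

      rng = np.random.default_rng(0)
      pheno = rng.normal(0.0, 1.0, 2000)              # quantitative trait
      geno = rng.integers(0, 3, 2000)                 # SNP coded 0/1/2

      # Information gain of the genotype partition:
      # IG = H(pheno) - sum_g p(g) * H(pheno | genotype = g)
      h_total = m_spacing_entropy(pheno)
      h_cond = sum((geno == g).mean() * m_spacing_entropy(pheno[geno == g])
                   for g in range(3))
      print(f"H = {h_total:.3f} nats, IG = {h_total - h_cond:.4f} nats")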

  7. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    Science.gov (United States)

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  8. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  9. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    Science.gov (United States)

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

    sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.

  10. Fuzzy logic algorithm for quantitative tissue characterization of diffuse liver diseases from ultrasound images.

    Science.gov (United States)

    Badawi, A M; Derbala, A S; Youssef, A M

    1999-08-01

    Computerized ultrasound tissue characterization has become an objective means for the diagnosis of liver diseases. It is difficult to differentiate diffuse liver diseases, namely cirrhotic and fatty liver, by visual inspection of the ultrasound images. The visual criteria for differentiating diffuse diseases are rather confusing and highly dependent upon the sonographer's experience. This often introduces bias into the diagnostic procedure and limits its objectivity and reproducibility. Computerized tissue characterization that quantitatively assists the sonographer in making an accurate differentiation, minimizing the degree of risk, is thus justified. Fuzzy logic has emerged as one of the most active areas in classification. In this paper, we present an approach that employs fuzzy reasoning techniques to automatically differentiate diffuse liver diseases using numerical quantitative features measured from the ultrasound images. Fuzzy rules were generated from over 140 cases consisting of normal, fatty, and cirrhotic livers. The input to the fuzzy system is an eight-dimensional vector of feature values: the mean gray level (MGL), the percentile 10%, the contrast (CON), the angular second moment (ASM), the entropy (ENT), the correlation (COR), the attenuation (ATTEN) and the speckle separation. The output of the fuzzy system is one of three categories: cirrhosis, fatty or normal. The steps for differentiating the pathologies are data acquisition and feature extraction, and dividing the input spaces of the measured quantitative data into fuzzy sets. Based on the expert knowledge, the fuzzy rules are generated and applied using fuzzy inference procedures to determine the pathology. Different membership functions were developed for the input spaces. This approach has resulted in very good sensitivity and specificity for classifying diffuse liver pathologies. This classification technique can be used in the diagnostic process, together with the patient's history.
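
    The core mechanics, triangular fuzzification of the measured features followed by min/max rule inference, can be sketched compactly. The membership supports and rules below are invented for illustration and use only two of the eight features; they are not the paper's tuned rule base.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      # Illustrative fuzzy sets for two of the eight features (made-up supports).
      def fuzzify(mgl, atten):
          return {
              "mgl_low":    tri(mgl, 0.0, 20.0, 40.0),
              "mgl_high":   tri(mgl, 30.0, 60.0, 90.0),
              "atten_low":  tri(atten, 0.2, 0.5, 0.8),
              "atten_high": tri(atten, 0.6, 1.0, 1.4),
          }

      def classify(mgl, atten):
          mu = fuzzify(mgl, atten)
          # Illustrative rule base: min for AND, max to combine rules per class.
          scores = {
              "normal":    min(mu["mgl_low"], mu["atten_low"]),
              "fatty":     min(mu["mgl_high"], mu["atten_high"]),
              "cirrhosis": max(min(mu["mgl_high"], mu["atten_low"]),
                               min(mu["mgl_low"], mu["atten_high"])),
          }
          return max(scores, key=scores.get), scores

      label, scores = classify(mgl=55.0, atten=1.1)
      print(label, {k: round(v, 2) for k, v in scores.items()})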

  11. Cross-method validation as a solution to the problem of excessive simplification of measurement in quantitative IR research

    DEFF Research Database (Denmark)

    Beach, Derek

    2007-01-01

    The purpose of this article is to make IR scholars more aware of the costs of choosing quantitative methods. The article first shows that quantification can have analytical ‘costs’ when the measures created are too simple to capture the essence of the systematized concept they were supposed to measure, a point developed in detail based upon a review of the democratic peace literature. I then offer two positive suggestions for a way forward. First, I argue that quantitative scholars should spend more time validating their measures, and in particular should engage in multi-method partnerships with qualitative scholars that have a deep understanding of particular cases in order to exploit the comparative advantages of qualitative methodology, using the more accurate qualitative measures to validate their own quantitative measures. Secondly, quantitative scholars should lower their level of ambition given the often poor ...

  12. 76 FR 9637 - Proposed Information Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity...

    Science.gov (United States)

    2011-02-18

    ... Collection (Veteran Suicide Prevention Online Quantitative Surveys) Activity: Comment Request AGENCY... prevention of suicide among Veterans and their families. DATES: Written comments and recommendations on the.... Abstract: VA's top priority is the prevention of Veterans suicide. It is imperative to reach these at-risk...

  13. 76 FR 27384 - Agency Information Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys...

    Science.gov (United States)

    2011-05-11

    ... Collection Activity (Veteran Suicide Prevention Online Quantitative Surveys) Under OMB Review AGENCY.... Abstract: VA's top priority is the prevention of Veterans suicide. It is imperative to reach these at-risk... families' awareness of VA's suicide prevention and mental health support services. In addition, the surveys...

  14. Applicability of a set of tomographic reconstruction algorithms for quantitative SPECT on irradiated nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson Svärd, Staffan, E-mail: staffan.jacobsson_svard@physics.uu.se; Holcombe, Scott; Grape, Sophie

    2015-05-21

    A fuel assembly operated in a nuclear power plant typically contains 100–300 fuel rods, depending on fuel type, which become strongly radioactive during irradiation in the reactor core. For operational and security reasons, it is of interest to experimentally deduce rod-wise information from the fuel, preferably by means of non-destructive measurements. The tomographic SPECT technique offers such possibilities through its two-step application: (1) recording the gamma-ray flux distribution around the fuel assembly, and (2) reconstructing the assembly's internal source distribution, based on the recorded radiation field. In this paper, algorithms for performing the latter step and extracting quantitative relative rod-by-rod data are accounted for. As compared to application of SPECT in nuclear medicine, nuclear fuel assemblies present a much more heterogeneous distribution of internal attenuation to gamma radiation than the human body, typically with rods containing pellets of heavy uranium dioxide surrounded by cladding of a zirconium alloy placed in water or air. This inhomogeneity severely complicates the tomographic quantification of the rod-wise relative source content, and the deduction of conclusive data requires detailed modelling of the attenuation to be introduced in the reconstructions. However, as shown in this paper, simplified models may still produce valuable information about the fuel. Here, a set of reconstruction algorithms for SPECT on nuclear fuel assemblies is described and discussed in terms of their quantitative performance for two applications: verification of fuel assemblies' completeness in nuclear safeguards, and rod-wise fuel characterization. It is argued that a request not to base the former assessment on any a priori information constrains which reconstruction methods may be used in that case, whereas the use of a priori information on geometry and material content enables highly accurate quantitative results.

  15. Quantitation of the human basal ganglia with positron emission tomography

    International Nuclear Information System (INIS)

    Bendrien, B.; Dewey, S.L.; Schlyer, D.J.; Wolf, A.P.; Volkow, N.D.

    1990-01-01

    The accurate measurement of the concentration of a radioisotope in small structures with PET requires a correction for quantitation loss due to the partial volume effect and the effect of scattered radiation. To evaluate errors associated with measurements in the human basal ganglia (BG), the authors built a unilateral model of the BG and inserted it in a 20 cm cylinder. The recovery coefficient (RC = measured activity/true activity) for the BG phantom was measured on a CTI tomograph (model 931-08/12) with different background concentrations (contrast) and at different axial locations in the gantry. The BG was visualized on 4 or 5 slices depending on its position in the gantry and on the contrast used. The RC was 0.75 with no background (contrast equal to 1.0) and increased as the relative background radioactivity increased, reaching 2.00 when the contrast was -0.7 (BG activity lower than that of the background). This paper also demonstrates that the higher the contrast, the more sensitive PET measurements in the BG are to axial positioning. These data provide some information about the variability of PET measurements in small structures like the BG, and the authors have proposed some strategies to improve reproducibility.

  16. Image interpolation allows accurate quantitative bone morphometry in registered micro-computed tomography scans.

    Science.gov (United States)

    Schulte, Friederike A; Lambers, Floor M; Mueller, Thomas L; Stauber, Martin; Müller, Ralph

    2014-04-01

    Time-lapsed in vivo micro-computed tomography is a powerful tool to analyse longitudinal changes in the bone micro-architecture. Registration can overcome problems associated with spatial misalignment between scans; however, it requires image interpolation, which might affect the outcome of a subsequent bone morphometric analysis. The impact of the interpolation error itself, though, has not been quantified to date. Therefore, the purpose of this ex vivo study was to evaluate the effect of different interpolator schemes [nearest neighbour, tri-linear and B-spline (BSP)] on bone morphometric indices. None of the interpolator schemes led to significant differences between interpolated and non-interpolated images, with the lowest interpolation error found for BSPs (1.4%). Furthermore, depending on the interpolator, the processing order of registration, Gaussian filtration and binarisation played a role. Independent of the interpolator, the present findings suggest that the evaluation of bone morphometry should be done with images registered using greyscale information.
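
    The comparison is straightforward to reproduce with SciPy, where the spline order of the ndimage routines selects the interpolator (0 = nearest neighbour, 1 = tri-linear, 3 = cubic B-spline). The sketch below uses a synthetic binary volume and compares a BV/TV-like bone volume fraction before and after a small rotation; it illustrates the interpolation effect only and is not the study's registration pipeline.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)

      # Synthetic "trabecular" binary volume: thresholded smoothed noise.
      vol = ndimage.gaussian_filter(rng.random((64, 64, 64)), sigma=2) > 0.5
      bvtv_ref = vol.mean()                       # bone volume fraction BV/TV

      # Rotate slightly with each interpolator, re-binarise, compare BV/TV.
      for order, name in [(0, "nearest neighbour"), (1, "tri-linear"),
                          (3, "B-spline")]:
          rot = ndimage.rotate(vol.astype(float), angle=3.0, axes=(0, 1),
                               reshape=False, order=order)
          bvtv = (rot > 0.5).mean()
          print(f"{name:17s}: BV/TV = {bvtv:.4f} "
                f"({100 * (bvtv - bvtv_ref) / bvtv_ref:+.2f}% vs reference)")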

  17. Contribution and perspectives of quantitative genetics to plant breeding in Brazil

    Directory of Open Access Journals (Sweden)

    Fernando Henrique Ribeiro Barrozo Toledo

    2012-12-01

    Full Text Available The purpose of this article is to show how quantitative genetics has contributed to the huge genetic progress obtained in plant breeding in Brazil in the last forty years. The information obtained through quantitative genetics has given Brazilian breeders the possibility of answering innumerable questions in their work in a much better-informed way, such as whether or not to use hybrid cultivars, which segregating population to use, which breeding method to employ, alternatives for improving the efficiency of selection programs, and how to handle the data of progeny and/or cultivar evaluations to identify the most stable ones and thus improve recommendations.

  18. Quantitative measurements in laser induced plasmas using optical probing. Progress report, October 1, 1977--April 30, 1978

    International Nuclear Information System (INIS)

    Sweeney, D.W.

    1978-06-01

    Optical probing of laser induced plasmas can be used to quantitatively reconstruct electron number densities and magnetic fields. Numerical techniques for extracting quantitative information from the experimental data are described and four Abel inversion codes are provided. A computer simulation of optical probing is used to determine the quantitative information that can be reasonably extracted from real experimental systems. Examples of reconstructed electron number densities from interferograms of laser plasmas show steepened electron distributions
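
    Abel inversion of an axisymmetric profile, the core numerical step mentioned here, can be implemented with the classic onion-peeling scheme: the line-of-sight projection is written as an upper-triangular linear system over annular shells and solved directly. A generic sketch (not one of the report's four codes), checked against a uniform cylinder where f(r) = 1 inside radius R and the projection is F(y) = 2·sqrt(R² − y²):

      import numpy as np

      def abel_invert_onion(F, dr):
          """Onion-peeling Abel inversion of a projection F(y) sampled at
          y_i = i * dr, assuming cylindrical symmetry and f = 0 outside r_max.
          F(y_i) = sum_j L[i, j] * f_j, with L[i, j] the chord length of ray i
          through annular shell j; L is upper triangular, so solve directly."""
          n = len(F)
          r = np.arange(n + 1) * dr               # shell boundaries
          L = np.zeros((n, n))
          for i in range(n):
              for j in range(i, n):
                  outer = np.sqrt(r[j + 1] ** 2 - r[i] ** 2)
                  inner = np.sqrt(max(r[j] ** 2 - r[i] ** 2, 0.0))
                  L[i, j] = 2.0 * (outer - inner)
          return np.linalg.solve(L, F)            # f_j, one value per shell

      # Check: uniform cylinder f(r) = 1 for r < R has F(y) = 2*sqrt(R^2 - y^2)
      n, R = 100, 1.0
      y = np.arange(n) * (R / n)
      F = 2.0 * np.sqrt(R ** 2 - y ** 2)
      f = abel_invert_onion(F, R / n)
      print(f[:3], f[-3:])                        # all values close to 1.0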

  19. FTA real-time transit information assessment : white paper on literature review of real-time transit information systems.

    Science.gov (United States)

    Real-time transit information systems are key technology applications within the transit industry designed to provide better customer service by disseminating timely and accurate information. Riders use this information to make various decisions about ...

  20. Collecting data for quantitative research on pluvial flooding

    NARCIS (Netherlands)

    Spekkers, M.H.; Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.

    2011-01-01

    Urban pluvial flood management requires detailed spatial and temporal information on flood characteristics and damaging consequences. There is lack of quantitative field data on pluvial flooding resulting in large uncertainties in urban flood model calculations and ensuing decisions for investments

  1. Easy and accurate reconstruction of whole HIV genomes from short-read sequence data with shiver

    Science.gov (United States)

    Blanquart, François; Golubchik, Tanya; Gall, Astrid; Bakker, Margreet; Bezemer, Daniela; Croucher, Nicholas J; Hall, Matthew; Hillebregt, Mariska; Ratmann, Oliver; Albert, Jan; Bannert, Norbert; Fellay, Jacques; Fransen, Katrien; Gourlay, Annabelle; Grabowski, M Kate; Gunsenheimer-Bartmeyer, Barbara; Günthard, Huldrych F; Kivelä, Pia; Kouyos, Roger; Laeyendecker, Oliver; Liitsola, Kirsi; Meyer, Laurence; Porter, Kholoud; Ristola, Matti; van Sighem, Ard; Cornelissen, Marion; Kellam, Paul; Reiss, Peter

    2018-01-01

    Studying the evolution of viruses and their molecular epidemiology relies on accurate viral sequence data, so that small differences between similar viruses can be meaningfully interpreted. Despite its higher throughput and more detailed minority variant data, next-generation sequencing has yet to be widely adopted for HIV. The difficulty of accurately reconstructing the consensus sequence of a quasispecies from reads (short fragments of DNA) in the presence of large between- and within-host diversity, including frequent indels, may have presented a barrier. In particular, mapping (aligning) reads to a reference sequence leads to biased loss of information; this bias can distort epidemiological and evolutionary conclusions. De novo assembly avoids this bias by aligning the reads to themselves, producing a set of sequences called contigs. However, contigs provide only a partial summary of the reads, misassembly may result in their having an incorrect structure, and no information is available at parts of the genome where contigs could not be assembled. To address these problems we developed the tool shiver to pre-process reads for quality and contamination, then map them to a reference tailored to the sample using corrected contigs supplemented with the user’s choice of existing reference sequences. Run with two commands per sample, it can easily be used for large heterogeneous data sets. We used shiver to reconstruct the consensus sequence and minority variant information from paired-end short-read whole-genome data produced with the Illumina platform, for sixty-five existing publicly available samples and fifty new samples. We show the systematic superiority of mapping to shiver’s constructed reference compared with mapping the same reads to the closest of 3,249 real references: median values of 13 bases called differently and more accurately, 0 bases called differently and less accurately, and 205 bases of missing sequence recovered. We also ...

  2. From inverse problems in mathematical physiology to quantitative differential diagnoses.

    Directory of Open Access Journals (Sweden)

    Sven Zenker

    2007-11-01

    Full Text Available The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses.

  3. From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses

    Science.gov (United States)

    Zenker, Sven; Rubin, Jonathan; Clermont, Gilles

    2007-01-01

    The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses.
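
    The multimodality described above is easy to reproduce in miniature: when the observable constrains only the square of a parameter, Bayesian inversion yields a bimodal posterior even under a unimodal prior. A toy Metropolis sketch follows; the model and all numbers are illustrative, not the paper's cardiovascular model.

      import numpy as np

      rng = np.random.default_rng(42)

      # Toy ill-posed model: the data constrain only the square of the
      # parameter, so two distinct values explain the observation equally well.
      theta_true, sigma_obs = 2.0, 0.2
      y_obs = theta_true ** 2 + rng.normal(0.0, sigma_obs)

      def log_posterior(theta):
          log_prior = -0.5 * (theta / 10.0) ** 2            # broad Gaussian prior
          log_like = -0.5 * ((y_obs - theta ** 2) / sigma_obs) ** 2
          return log_prior + log_like

      # Random-walk Metropolis with an occasional sign-flip proposal (a
      # symmetric move) so the chain can hop between the two posterior modes.
      samples, theta = [], 0.1
      lp = log_posterior(theta)
      for _ in range(50000):
          prop = -theta if rng.random() < 0.2 else theta + rng.normal(0.0, 0.3)
          lp_prop = log_posterior(prop)
          if np.log(rng.random()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta)

      samples = np.array(samples[5000:])                    # discard burn-in
      print(f"mass near +2: {(samples > 0).mean():.2f}, "
            f"mass near -2: {(samples < 0).mean():.2f}")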

  4. Exploring a new quantitative image marker to assess benefit of chemotherapy to ovarian cancer patients

    Science.gov (United States)

    Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-03-01

    Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study showed that total psoas area (TPA) measured on preoperative cross-sectional CT images might be a good image marker for predicting the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of TPA from the CT image is difficult due to the fuzzy boundary or connection of TPA to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme aiming to segment TPA from abdominal CT images more accurately and to assess the feasibility of using this new quantitative image marker to predict the benefit to ovarian cancer patients of receiving Bevacizumab-based chemotherapy. The ICAD scheme is applied to identify a CT image slice of interest located at the level of the L3 vertebra. The cross-sections of the right and left TPA are segmented using a set of adaptively adjusted boundary conditions, and TPA is then quantitatively measured. In addition, recent studies have suggested that muscle radiation attenuation, which reflects fat deposition in the tissue, might be a good image feature for predicting the survival rate of cancer patients. The scheme and the TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparing with manual segmentation results, we found that the ICAD scheme yields higher accuracy and consistency for this task. The new ICAD scheme provides clinical researchers with a useful tool to more efficiently and accurately extract TPA as well as muscle radiation attenuation as new image markers, and allows them to investigate their discriminatory power in predicting progression-free survival and/or overall survival of cancer patients before and after chemotherapy.

  5. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of the detective game or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  6. Comparison of STIM and particle backscattering spectrometry mass determination for quantitative microanalysis of cultured cells

    International Nuclear Information System (INIS)

    Deves, G.; Ortega, R.

    2001-01-01

    In biological sample microanalysis, a mass-normalisation method is commonly used as a quantitative index of elemental concentrations determined by particle-induced X-ray emission (PIXE). The organic mass can be determined using either particle backscattering spectrometry (BS) or scanning transmission ion microscopy (STIM). However, the accuracy of quantitative microanalysis in samples such as cultured cells is affected by beam-induced loss of organic mass during analysis. The aim of this paper is to compare mass measurements determined by particle BS and by STIM. In order to calibrate the STIM and BS analyses, we measured by both techniques the thickness of standard foils of polycarbonate (3 and 6 μm), Mylar® (4 μm), Kapton® (7.5 μm) and Nylon® (15 μm), as well as biological samples of mono-layered cultured cells. Non-damaging STIM analysis of samples before PIXE irradiation is certainly one of the most accurate ways to determine the sample mass; however, it requires demanding experimental handling. On the other hand, BS performed simultaneously with PIXE is the simplest method for determining the local mass in polymer foils, but it appears less accurate in the case of cultured cells.

  7. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, J. M., E-mail: jschmidt@physics.usyd.edu.au; Cairns, Iver H. [School of Physics, Physics Road, Building A28, University of Sydney, NSW 2006 (Australia)

    2016-03-25

    Between 29 November and 1 December 2013, the two widely separated spacecraft STEREO A and B observed a long-lasting, intermittent type II radio burst over the extended frequency range ≈4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations of the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including electron reflection at the shock, growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely, by factors of ≈10^6 and ≈10^3, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 and 20%, for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock and in the sheath. These multiple agreements provide very strong support for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.

  8. Embodied memory allows accurate and stable perception of hidden objects despite orientation change.

    Science.gov (United States)

    Pan, Jing Samantha; Bingham, Ned; Bingham, Geoffrey P

    2017-07-01

    Rotating a scene in a frontoparallel plane (rolling) yields a change in orientation of constituent images. When using only information provided by static images to perceive a scene after orientation change, identification performance typically decreases (Rock & Heimer, 1957). However, rolling generates optic flow information that relates the discrete, static images (before and after the change) and forms an embodied memory that aids recognition. The embodied memory hypothesis predicts that upon detecting a continuous spatial transformation of image structure, that is, seeing the continuous rolling process and the objects undergoing rolling, observers should accurately perceive objects during and after motion. Thus, in this case, orientation change should not affect performance. We tested this hypothesis in three experiments and found that (a) using combined optic flow and image structure, participants identified locations of previously perceived but currently occluded targets with great accuracy and stability (Experiment 1); (b) using combined optic flow and image structure information, participants identified hidden targets equally well with or without 30° orientation changes (Experiment 2); and (c) when the rolling was unseen, identification of hidden targets after orientation change became worse (Experiment 3). Furthermore, when rolling was unseen, although target identification was better when participants were told about the orientation change than when they were not told, performance was still worse than when there was no orientation change. Therefore, combined optic flow and image structure information, not mere knowledge about the rolling, enables accurate and stable perception despite orientation change. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Quantitative assessment of growth plate activity

    International Nuclear Information System (INIS)

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone-seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer-acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies.

  10. LC-MS/MS quantitative analysis of reducing carbohydrates in soil solutions extracted from crop rhizospheres.

    Science.gov (United States)

    McRae, G; Monreal, C M

    2011-06-01

    A simple, sensitive, and specific analytical method has been developed for the quantitative determination of 15 reducing carbohydrates in the soil solution of crop rhizosphere. Reducing carbohydrates were derivatized with 1-phenyl-3-methyl-5-pyrazolone, separated by reversed-phase high-performance liquid chromatography and detected by electrospray ionization tandem mass spectrometry. Lower limits of quantitation of 2 ng/mL were achieved for all carbohydrates. Quantitation was performed using peak area ratios (analyte/internal standard) and a calibration curve spiked in water with glucose-d2 as the internal standard. Calibration curves showed excellent linearity over the range 2-100 ng/mL (10-1,000 ng/mL for glucose). The method has been tested with quality control samples spiked in water and soil solution samples obtained from the rhizosphere of wheat and canola and has been found to provide accurate and precise results.
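
    As a minimal sketch of the internal-standard quantitation described above, the following fits a linear calibration to peak-area ratios and inverts it for an unknown; the concentrations and ratios are invented for illustration, not taken from the paper.

```python
import numpy as np

# Invented calibration data: spiked concentration (ng/mL) versus
# analyte/internal-standard peak-area ratio (glucose-d2 normalization).
conc = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])
ratio = np.array([0.021, 0.052, 0.105, 0.260, 0.510, 1.020])

slope, intercept = np.polyfit(conc, ratio, 1)   # linear calibration fit
unknown_ratio = 0.330                           # measured for a sample
unknown_conc = (unknown_ratio - intercept) / slope
print(f"estimated concentration: {unknown_conc:.1f} ng/mL")
```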

  11. Standard addition strip for quantitative electrostatic spray ionization mass spectrometry analysis: determination of caffeine in drinks.

    Science.gov (United States)

    Tobolkina, Elena; Qiao, Liang; Roussel, Christophe; Girault, Hubert H

    2014-12-01

    Standard addition strips were prepared for the quantitative determination of caffeine in different beverages by electrostatic spray ionization mass spectrometry (ESTASI-MS). The gist of this approach is to dry spots of caffeine solutions with different concentrations on a polymer strip, then to deposit a drop of sample mixed with an internal standard, here theobromine, on each spot, and to measure the mass spectrometry signals of caffeine and theobromine by ESTASI-MS. This strip approach is very convenient and provides quantitative analyses as accurate as the classical standard addition method by MS or liquid chromatography. Copyright © 2014 Elsevier B.V. All rights reserved.
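
    For readers unfamiliar with standard addition, here is a minimal sketch of the underlying arithmetic: fit signal versus added standard and extrapolate to the x-intercept. The spiked amounts and signal ratios are invented for illustration.

```python
import numpy as np

# Invented standard-addition data: caffeine dried on each spot (ug/mL
# equivalents) and the caffeine/theobromine MS signal ratio per spot.
added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
signal = np.array([0.42, 0.61, 0.80, 1.19, 1.95])

slope, intercept = np.polyfit(added, signal, 1)
# In standard addition, the unknown concentration is the magnitude of
# the x-intercept of the fitted line (signal extrapolated to zero).
c_unknown = intercept / slope
print(f"caffeine in sample: {c_unknown:.1f} ug/mL equivalent")
```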

  12. Quantitative Preparation in Doctoral Education Programs: A Mixed-Methods Study of Doctoral Student Perspectives on their Quantitative Training

    Directory of Open Access Journals (Sweden)

    Sarah L Ferguson

    2017-07-01

    Aim/Purpose: The purpose of the current study is to explore student perceptions of their own doctoral-level education and quantitative proficiency. Background: The challenges of preparing doctoral students in education have been discussed in the literature, but largely from the perspective of university faculty and program administrators. The current study directly explores the student voice on this issue. Methodology: Utilizing a sequential explanatory mixed-methods research design, the present study seeks to better understand doctoral-level education students’ perceptions of their quantitative methods training at a large public university in the southwestern United States. Findings: Results from both phases present the need for more application and consistency in doctoral-level quantitative courses. Additionally, there was a consistent theme of internal motivation in the responses, suggesting students perceive their quantitative training to be valuable beyond their personal interest in the topic. Recommendations for Practitioners: Quantitative methods instructors should emphasize practice in their quantitative courses and consider providing additional support for students through the inclusion of lab sections, tutoring, and/or differentiation. Pre-testing statistical ability at the start of a course is also suggested to better meet student needs. Impact on Society: The ultimate goal of quantitative methods in doctoral education is to produce high-quality educational researchers who are prepared to apply their knowledge to problems and research in education. Results of the present study can inform faculty and administrator decisions in doctoral education to best support this goal. Future Research: Using the student perspectives presented in the present study, future researchers should continue to explore effective instructional strategies and curriculum design within education doctoral programs. The inclusion of student voice can strengthen

  13. Quantitative x-ray structure determination of superlattices and interfaces

    International Nuclear Information System (INIS)

    Schuller, I.K.; Fullerton, E.E.

    1990-01-01

    This paper presents a general procedure for quantitative structural refinement of superlattice structures. To analyze a wide range of superlattices, the authors have derived a general kinematical diffraction formula that includes random, continuous and discrete fluctuations from the average structure. By implementing a non-linear fitting algorithm to fit the entire x-ray diffraction profile, refined parameters that describe the average superlattice structure, and deviations from this average, are obtained. The structural refinement procedure is applied to crystalline/crystalline Mo/Ni superlattices and crystalline/amorphous Pb/Ge superlattices. Roughness introduced artificially during growth in Mo/Ni superlattices is shown to be accurately reproduced by the refinement.

  14. The Basis of Distinction Between Qualitative and Quantitative ...

    African Journals Online (AJOL)

    This article examines methodological issues associated with qualitative and quantitative research. In doing this, I begin by briefly outlining the philosophical and conceptual framework that informed the two research methodologies and discuss how ontological and epistemological issues were translated into specific ...

  15. Quantitative tomographic measurements of opaque multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN; O'HERN,TIMOTHY J.; CECCIO,STEVEN L.

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
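
    The conversion from measured mixture conductivity to phase volume fraction mentioned above is commonly done with a Maxwell-type mixture relation. Below is a minimal sketch of the textbook form for a non-conducting dispersed phase; the slightly modified formula the paper reports is not reproduced here, and the numbers are illustrative.

```python
def gas_fraction_maxwell(sigma_mix, sigma_liquid):
    # Textbook Maxwell mixture relation for a non-conducting dispersed
    # phase in a conducting liquid (an assumption of this sketch).
    return 2.0 * (sigma_liquid - sigma_mix) / (sigma_mix + 2.0 * sigma_liquid)

# A 15% drop in measured mixture conductivity maps to roughly a 10%
# dispersed-phase volume fraction under this relation.
print(f"{gas_fraction_maxwell(0.85, 1.0):.3f}")
```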

  16. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: infrared detection (IR), video-oculography (VOG), scleral eye coil, and EOG. Among these recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. EOG recordings were taken while 5 test subjects watched a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.

  17. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative x-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of slopes of two linear plots, designated as the analysis and reference lines, passing through their origins using the least squares method.
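
    As a rough sketch of the ratio-of-slopes idea, the following fits two through-origin least-squares lines and takes the weight fraction from the ratio of their slopes. The intensity data are invented and the actual calibration quantities in the paper may differ; this only illustrates the arithmetic.

```python
import numpy as np

def slope_through_origin(x, y):
    # Least-squares slope of y = m*x with no intercept term.
    return float(x @ y) / float(x @ x)

# Invented data points for the "reference" and "analysis" lines
# (e.g., diffraction intensity ratios versus known mixing ratios).
x_ref = np.array([0.1, 0.2, 0.4, 0.8])
y_ref = np.array([0.15, 0.29, 0.61, 1.18])
x_ana = np.array([0.1, 0.2, 0.4, 0.8])
y_ana = np.array([0.09, 0.19, 0.37, 0.74])

w = slope_through_origin(x_ana, y_ana) / slope_through_origin(x_ref, y_ref)
print(f"estimated weight fraction: {w:.2f}")
```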

  18. Comparison of two methods of quantitation in human studies of biodistribution and radiation dosimetry

    International Nuclear Information System (INIS)

    Smith, T.

    1992-01-01

    A simple method of quantitating organ radioactivity content for dosimetry purposes, based on relationships between organ count rate and the initial whole-body count rate, has been compared with a more rigorous method of absolute quantitation using a transmission scanning technique. Comparisons were on the basis of organ uptake (% administered activity) and resultant organ radiation doses (mGy MBq^-1) in 6 normal male volunteers given a 99mTc-labelled myocardial perfusion imaging agent intravenously at rest and following exercise. In these studies, estimates of individual organ uptakes by the simple method were in error by between +24 and -16% compared with the more accurate method. However, errors on organ dose values were somewhat less, and the effective dose was correct to within 3%. (Author)

  19. An unbiased method for the quantitation of disease phenotypes using a custom-built macro plugin for the program ImageJ

    NARCIS (Netherlands)

    Abd-El-Haliem, A.

    2012-01-01

    Accurate evaluation of disease phenotypes is considered a key step to study plant–microbe interactions, as the rate of host colonization by the pathogenic microbe directly reflects whether the defense response of the plant is compromised. Although several techniques were developed to quantitate the

  20. Current status of accurate prognostic awareness in advanced/terminally ill cancer patients: Systematic review and meta-regression analysis.

    Science.gov (United States)

    Chen, Chen Hsiu; Kuo, Su Ching; Tang, Siew Tzuh

    2017-05-01

    No systematic meta-analysis is available on the prevalence of cancer patients' accurate prognostic awareness and differences in accurate prognostic awareness by publication year, region, assessment method, and service received. To examine the prevalence of advanced/terminal cancer patients' accurate prognostic awareness and differences in accurate prognostic awareness by publication year, region, assessment method, and service received. Systematic review and meta-analysis. MEDLINE, Embase, The Cochrane Library, CINAHL, and PsycINFO were systematically searched on accurate prognostic awareness in adult patients with advanced/terminal cancer (1990-2014). Pooled prevalences were calculated for accurate prognostic awareness by a random-effects model. Differences in weighted estimates of accurate prognostic awareness were compared by meta-regression. In total, 34 articles were retrieved for systematic review and meta-analysis. At best, only about half of advanced/terminal cancer patients accurately understood their prognosis (49.1%; 95% confidence interval: 42.7%-55.5%; range: 5.4%-85.7%). Accurate prognostic awareness was independent of service received and publication year, but highest in Australia, followed by East Asia, North America, and southern Europe and the United Kingdom (67.7%, 60.7%, 52.8%, and 36.0%, respectively; p = 0.019). Accurate prognostic awareness was higher by clinician assessment than by patient report (63.2% vs 44.5%). Overall, only about half of advanced/terminal cancer patients accurately understood their prognosis, with significant variations by region and assessment method. Healthcare professionals should thoroughly assess advanced/terminal cancer patients' preferences for prognostic information and engage them in prognostic discussion early in the cancer trajectory, thus facilitating their accurate prognostic awareness and the quality of end-of-life care decision-making.
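
    For readers unfamiliar with random-effects pooling of prevalences, here is a minimal sketch of the DerSimonian-Laird estimator; the per-study counts are invented and the paper's actual data and software are not reproduced.

```python
import numpy as np

# Invented per-study counts: patients with accurate awareness / sample size.
events = np.array([30, 55, 120, 48, 200])
n = np.array([60, 130, 220, 150, 380])

p = events / n
v = p * (1 - p) / n                      # within-study variance
w = 1.0 / v
p_fixed = (w @ p) / w.sum()              # fixed-effect pooled estimate
Q = w @ (p - p_fixed) ** 2               # Cochran's heterogeneity statistic
C = w.sum() - (w ** 2).sum() / w.sum()
tau2 = max(0.0, (Q - (len(p) - 1)) / C)  # between-study variance (DL)
w_re = 1.0 / (v + tau2)
pooled = (w_re @ p) / w_re.sum()
se = w_re.sum() ** -0.5
print(f"pooled prevalence {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se:.3f}-{pooled + 1.96 * se:.3f})")
```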

  1. Quantitative structure activity relationship (QSAR) of piperine analogs for bacterial NorA efflux pump inhibitors.

    Science.gov (United States)

    Nargotra, Amit; Sharma, Sujata; Koul, Jawahir Lal; Sangwan, Pyare Lal; Khan, Inshad Ali; Kumar, Ashwani; Taneja, Subhash Chander; Koul, Surrinder

    2009-10-01

    Quantitative structure activity relationship (QSAR) analysis of piperine analogs as inhibitors of the efflux pump NorA from Staphylococcus aureus has been performed in order to obtain a highly accurate model enabling prediction of the inhibition of S. aureus NorA by new chemical entities from natural as well as synthetic sources. An algorithm based on the genetic function approximation method of variable selection in Cerius2 was used to generate the model. Among the several types of descriptors considered in generating the QSAR model, viz. topological, spatial, thermodynamic, information content and E-state indices, three descriptors, namely the partial negative surface area of the compounds, the area of the molecular shadow in the XZ plane, and the heat of formation of the molecules, resulted in a statistically significant model with r^2 = 0.962 and cross-validation parameter q^2 = 0.917. The validation of the QSAR models was done by cross-validation, leave-25%-out, and external test set prediction. The theoretical approach indicates that an increase in the exposed partial negative surface area increases the inhibitory activity of the compound against NorA, whereas the area of the molecular shadow in the XZ plane is inversely proportional to the inhibitory activity. The model also explains the relationship of the heat of formation of the compound with the inhibitory activity. The model is not only able to predict the activity of new compounds but also identifies the important regions in the molecules in a quantitative manner.
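
    To illustrate the r^2 and leave-one-out q^2 statistics used to validate such models, here is a minimal sketch with synthetic descriptors and ordinary least squares; the paper itself used genetic function approximation in Cerius2, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for three descriptors (e.g., partial negative
# surface area, XZ shadow area, heat of formation) and activities.
X = rng.normal(size=(20, 3))
y = X @ np.array([1.5, -0.8, 0.4]) + rng.normal(scale=0.2, size=20)

def fit(X, y):
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return coef[0] + X @ coef[1:]

coef = fit(X, y)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - np.sum((y - predict(coef, X)) ** 2) / ss_tot

# Leave-one-out cross-validated q^2 (predictive squared correlation).
press = 0.0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c = fit(X[mask], y[mask])
    press += (y[i] - predict(c, X[i:i + 1])[0]) ** 2
q2 = 1 - press / ss_tot
print(f"r2 = {r2:.3f}, LOO q2 = {q2:.3f}")
```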

  2. Leadership training in a family medicine residency program: Cross-sectional quantitative survey to inform curriculum development.

    Science.gov (United States)

    Gallagher, Erin; Moore, Ainsley; Schabort, Inge

    2017-03-01

    To assess the current status of leadership training as perceived by family medicine residents to inform the development of a formal leadership curriculum. Cross-sectional quantitative survey. Department of Family Medicine at McMaster University in Hamilton, Ont, in December 2013. A total of 152 first- and second-year family medicine residents. Family medicine residents' attitudes toward leadership, perceived level of training in various leadership domains, and identified opportunities for leadership training. Overall, 80% (152 of 190) of residents completed the survey. On a Likert scale (1 = strongly disagree, 4 = neutral, 7 = strongly agree), residents rated the importance of physician leadership in the clinical setting as high (6.23 of 7), whereas agreement with the statement "I am a leader" received the lowest rating (5.28 of 7). At least 50% of residents desired more training in the leadership domains of personal mastery, mentorship and coaching, conflict resolution, teaching, effective teamwork, administration, ideals of a healthy workplace, coalitions, and system transformation. At least 50% of residents identified behavioural sciences seminars, a lecture and workshop series, and a retreat as opportunities to expand leadership training. The concept of family physicians as leaders resonated highly with residents. Residents desired more personal and system-level leadership training. They also identified ways that leadership training could be expanded in the current curriculum and developed in other areas. The information gained from this survey might facilitate leadership development among residents through application of its results in a formal leadership curriculum. Copyright© the College of Family Physicians of Canada.

  3. Evaluation of fourier transform profilometry performance: quantitative waste volume determination under simulated Hanford waste tank conditions

    International Nuclear Information System (INIS)

    Jang, Ping-Rey; Leone, Teresa; Long, Zhiling; Mott, Melissa A.; Perry Norton, O.; Okhuysen, Walter P.; Monts, David L.

    2007-01-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of the Institute for Clean Energy Technology (ICET) at Mississippi State University is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry (FTP). FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We have completed a preliminary performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. Based on a Hanford C-200 series tank with camera access through a riser with significant offset relative to the centerline, we devised a testing methodology that encompassed a range of obstacles likely to be encountered 'in tank'. These test objects were inspected by use

  4. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied in internal audit work that integrates the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a major private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables, it was possible to transform them into quantitative ones through matrix intersection. By means of a case study, in which data were collected via interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work in favor of attaining the objectives of the organization.

  5. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behavior in radiation environments. However, few improvements have been seen in the quantitative analysis of damage microstructure in recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure which could improve the accuracy and efficiency of data sampling and processing and could enable new information to be obtained about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of this system is discussed. (orig.)

  6. Characterization of 3-Dimensional PET Systems for Accurate Quantification of Myocardial Blood Flow.

    Science.gov (United States)

    Renaud, Jennifer M; Yip, Kathy; Guimond, Jean; Trottier, Mikaël; Pibarot, Philippe; Turcotte, Eric; Maguire, Conor; Lalonde, Lucille; Gulenchyn, Karen; Farncombe, Troy; Wisenberg, Gerald; Moody, Jonathan; Lee, Benjamin; Port, Steven C; Turkington, Timothy G; Beanlands, Rob S; deKemp, Robert A

    2017-01-01

    Three-dimensional (3D) mode imaging is the current standard for PET/CT systems. Dynamic imaging for quantification of myocardial blood flow with short-lived tracers, such as 82Rb-chloride, requires accuracy to be maintained over a wide range of isotope activities and scanner counting rates. We proposed new performance standard measurements to characterize the dynamic range of PET systems for accurate quantitative imaging. 82Rb or 13N-ammonia (1,100-3,000 MBq) was injected into the heart wall insert of an anthropomorphic torso phantom. A decaying isotope scan was obtained over 5 half-lives on 9 different 3D PET/CT systems and one 3D/2-dimensional PET-only system. Dynamic images (28 × 15 s) were reconstructed using iterative algorithms with all corrections enabled. Dynamic range was defined as the maximum activity in the myocardial wall with less than 10% bias, from which corresponding dead-time, counting rates, and/or injected activity limits were established for each scanner. Scatter correction residual bias was estimated as the maximum cavity blood-to-myocardium activity ratio. Image quality was assessed via the coefficient of variation measuring nonuniformity of the left ventricular myocardium activity distribution. Maximum recommended injected activity/body weight, peak dead-time correction factor, counting rates, and residual scatter bias for accurate cardiac myocardial blood flow imaging were 3-14 MBq/kg, 1.5-4.0, 22-64 Mcps singles and 4-14 Mcps prompt coincidence counting rates, and 2%-10% on the investigated scanners. Nonuniformity of the myocardial activity distribution varied from 3% to 16%. Accurate dynamic imaging is possible on the 10 3D PET systems if the maximum injected MBq/kg values are respected to limit peak dead-time losses during the bolus first-pass transit. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
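
    A minimal sketch of the dynamic-range idea follows: scan a decaying source over 5 half-lives, model dead-time losses, and find the highest rate with under 10% bias. The paralyzable dead-time model and every number here are assumptions chosen for illustration, not the scanners' measured characteristics.

```python
import numpy as np

# Paralyzable dead-time model (assumed): observed rate m = r * exp(-r*tau).
tau = 4e-9                  # effective dead time (s), invented
half_life = 75.0            # 82Rb physical half-life (s)
r0 = 30e6                   # initial true singles rate (cps), invented

t = np.arange(0.0, 5 * half_life, 15.0)    # 15 s dynamic frames
r = r0 * 0.5 ** (t / half_life)            # true rate during the scan
m = r * np.exp(-r * tau)                   # observed rate with losses

bias = 1.0 - m / r
ok = bias < 0.10                           # frames within the 10% criterion
print(f"max rate with <10% dead-time loss: {r[ok].max() / 1e6:.1f} Mcps")
```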

  7. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Nora E Fritz

    2017-01-01

    Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.

  8. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.

    1982-01-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses.
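
    The core of such an analysis is a linear Beer's Law model solved by least squares. Here is a minimal sketch with synthetic spectra and the simplest baseline treatment (a constant offset estimated as an extra unknown); the paper's four weighted variants are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic pure-component "spectra" (absorptivity x path length) for
# three components over 200 channels, plus a constant baseline offset.
E = np.abs(rng.normal(size=(200, 3)))
c_true = np.array([0.5, 0.3, 0.2])
A = E @ c_true + 0.05 + rng.normal(scale=0.002, size=200)

# Ordinary least squares with the baseline as an extra fitted column.
M = np.column_stack([E, np.ones(len(A))])
sol, *_ = np.linalg.lstsq(M, A, rcond=None)
print("estimated concentrations:", np.round(sol[:3], 3))
print("estimated baseline:", round(float(sol[3]), 3))
```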

  9. Quantitative Surface Chirality Detection with Sum Frequency Generation Vibrational Spectroscopy: Twin Polarization Angle Approach

    International Nuclear Information System (INIS)

    Wei, Feng; Xu, Yanyan; Guo, Yuan; Liu, Shi-lin; Wang, Hongfei

    2009-01-01

    Here we report a novel twin polarization angle (TPA) approach for quantitative chirality detection with surface sum-frequency generation vibrational spectroscopy (SFG-VS). Generally, the achiral contribution dominates the surface SFG-VS signal, and the pure chiral signal is usually two or three orders of magnitude smaller. Therefore, it has been difficult to make quantitative detection and analysis of the chiral contributions to the surface SFG-VS signal. In the TPA method, by varying together the polarization angles of the incoming visible light and the sum-frequency signal at fixed s or p polarization of the incoming infrared beam, the polarization-dependent SFG signal can give not only a direct signature of the chiral contribution in the total SFG-VS signal, but also an accurate measurement of the chiral and achiral components of the surface SFG signal. A general description of the TPA method is presented, along with an experimental test of the approach on SFG-VS from S- and R-limonene chiral liquid surfaces. The most accurate degree-of-chiral-excess values thus obtained for the 2878 cm^-1 spectral peak of the S- and R-limonene liquid surfaces are (23.7±0.4)% and (25.4±1.3)%, respectively.

  10. A method for three-dimensional quantitative observation of the microstructure of biological samples

    Science.gov (United States)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has entered the era of cell and molecular biology, in which researchers seek to study the mechanisms of all kinds of biological phenomena at the microscopic level. Accurate description of the microstructure of biological samples is an urgent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy, notable for its low optical damage, high resolution, deep penetration depth and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3-dimensionally and quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with satisfactory results.
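
    The volume-from-segmentation step reduces to counting voxels inside a binary mask and scaling by the voxel size. A minimal sketch with a synthetic object and an assumed TPLSM voxel spacing:

```python
import numpy as np

# Toy z-stack: volume of a segmented binary object in voxel units,
# converted to cubic micrometres using an assumed voxel spacing.
stack = np.zeros((64, 128, 128), dtype=bool)
zz, yy, xx = np.ogrid[:64, :128, :128]
ellipsoid = ((zz - 32) / 20.0) ** 2 + ((yy - 64) / 30.0) ** 2 \
            + ((xx - 64) / 30.0) ** 2 <= 1.0
stack[ellipsoid] = True

dz, dy, dx = 0.5, 0.2, 0.2        # voxel size in um (assumed)
volume = stack.sum() * dz * dy * dx
print(f"object volume: {volume:.0f} um^3")
```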

  11. Remote In-Situ Quantitative Mineralogical Analysis Using XRD/XRF

    Science.gov (United States)

    Blake, D. F.; Bish, D.; Vaniman, D.; Chipera, S.; Sarrazin, P.; Collins, S. A.; Elliott, S. T.

    2001-01-01

    X-Ray Diffraction (XRD) is the most direct and accurate method for determining mineralogy. The CHEMIN XRD/XRF instrument has shown promising results on a variety of mineral and rock samples. Additional information is contained in the original extended abstract.

  12. Quantitation of hepatitis B virus DNA in plasma using a sensitive cost-effective "in-house" real-time PCR assay

    Directory of Open Access Journals (Sweden)

    Daniel Hubert Darius

    2009-01-01

    Background: Sensitive nucleic acid testing for the detection and accurate quantitation of hepatitis B virus (HBV) is necessary to reduce transmission through blood and blood products and for monitoring patients on antiviral therapy. The aim of this study is to standardize an "in-house" real-time HBV polymerase chain reaction (PCR) for accurate quantitation and screening of HBV. Materials and Methods: The "in-house" real-time assay was compared with a commercial assay using 30 chronically infected individuals and 70 blood donors who were negative for hepatitis B surface antigen, hepatitis C virus (HCV) antibody and human immunodeficiency virus (HIV) antibody. Further, 30 HBV-genotyped samples were tested to evaluate the "in-house" assay's capacity to detect genotypes prevalent among individuals attending this tertiary care hospital. Results: The lower limit of detection of this "in-house" HBV real-time PCR was assessed against the WHO international standard and found to be 50 IU/mL. The interassay and intra-assay coefficients of variation (CV) of this "in-house" assay ranged from 1.4% to 9.4% and 0.0% to 2.3%, respectively. Virus loads as estimated with this "in-house" HBV real-time assay correlated well with the commercial artus HBV RG PCR assay (r = 0.95, P < 0.0001). Conclusion: This assay can be used for the detection and accurate quantitation of HBV viral loads in plasma samples, can be employed for the screening of blood donations, and can potentially be adapted to a multiplex format for simultaneous detection of HBV, HIV and HCV to reduce the cost of testing in blood banks.
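
    Real-time PCR quantitation of this kind rests on a standard curve of threshold cycle (Ct) versus log concentration. Here is a minimal sketch with invented Ct values; the efficiency and back-calculation formulas are the standard ones, not the paper's specific calibration.

```python
import numpy as np

# Invented standard-curve data: log10(IU/mL) versus threshold cycle.
log10_conc = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])

slope, intercept = np.polyfit(log10_conc, ct, 1)
efficiency = 10 ** (-1 / slope) - 1        # amplification efficiency
sample_ct = 27.5                           # measured for a plasma sample
viral_load = 10 ** ((sample_ct - intercept) / slope)
print(f"efficiency = {efficiency:.1%}, estimated load = {viral_load:.2e} IU/mL")
```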

  13. Accurate quantification of TiO2 nanoparticles collected on air filters using a microwave-assisted acid digestion method

    Science.gov (United States)

    Mudunkotuwa, Imali A.; Anthony, T. Renée; Grassian, Vicki H.; Peters, Thomas M.

    2016-01-01

    Titanium dioxide (TiO2) particles, including nanoparticles with diameters smaller than 100 nm, are used extensively in consumer products. In a 2011 current intelligence bulletin, the National Institute of Occupational Safety and Health (NIOSH) recommended methods to assess worker exposures to fine and ultrafine TiO2 particles and associated occupational exposure limits for these particles. However, there are several challenges and problems encountered with these recommended exposure assessment methods involving the accurate quantitation of titanium dioxide collected on air filters using acid digestion followed by inductively coupled plasma optical emission spectroscopy (ICP-OES). Specifically, recommended digestion methods include the use of chemicals, such as perchloric acid, which are typically unavailable in most accredited industrial hygiene laboratories due to highly corrosive and oxidizing properties. Other alternative methods that are used typically involve the use of nitric acid or combination of nitric acid and sulfuric acid, which yield very poor recoveries for titanium dioxide. Therefore, given the current state of the science, it is clear that a new method is needed for exposure assessment. In this current study, a microwave-assisted acid digestion method has been specifically designed to improve the recovery of titanium in TiO2 nanoparticles for quantitative analysis using ICP-OES. The optimum digestion conditions were determined by changing several variables including the acids used, digestion time, and temperature. Consequently, the optimized digestion temperature of 210°C with concentrated sulfuric and nitric acid (2:1 v/v) resulted in a recovery of >90% for TiO2. The method is expected to provide for a more accurate quantification of airborne TiO2 particles in the workplace environment. PMID:26181824

  14. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  15. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.

  17. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    Science.gov (United States)

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  18. Quantitative security analysis for programs with low input and noisy output

    NARCIS (Netherlands)

    Ngo, Minh Tri; Huisman, Marieke

    Classical quantitative information flow analysis often considers a system as an information-theoretic channel, where private data are the only inputs and public data are the outputs. However, for systems where an attacker is able to influence the initial values of public data, these should also be

  19. Melanoma screening: Informing public health policy with quantitative modelling.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

    Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years (a result of the publicly funded mass media campaigns introduced in the early 1980s), mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in
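
    To illustrate the mechanics of such a Markov model, here is a minimal sketch with a hypothetical four-state yearly chain in which surveillance raises the probability that early disease is detected and cured. All states and rates are invented; the paper's model is fitted to Australian incidence and mortality data.

```python
import numpy as np

# Hypothetical yearly transition matrix over four states: healthy,
# early melanoma, advanced melanoma, dead of melanoma (absorbing).
def step_matrix(p_detect):
    # p_detect: yearly probability an early lesion is found and excised
    # (returning the patient to "healthy"); must not exceed 0.95 here.
    return np.array([
        [0.999,    0.001,            0.00, 0.00],  # healthy
        [p_detect, 0.95 - p_detect,  0.05, 0.00],  # early
        [0.000,    0.000,            0.90, 0.10],  # advanced
        [0.000,    0.000,            0.00, 1.00],  # dead (absorbing)
    ])

for p_detect in (0.2, 0.6):                        # low vs high surveillance
    state = np.array([1.0, 0.0, 0.0, 0.0])
    for _ in range(30):                            # 30-year horizon
        state = state @ step_matrix(p_detect)
    print(f"p_detect={p_detect}: 30-yr melanoma mortality = {state[3]:.4%}")
```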

  20. Information logistics: A production-line approach to information services

    Science.gov (United States)

    Adams, Dennis; Lee, Chee-Seng

    1991-01-01

    Logistics can be defined as the process of strategically managing the acquisition, movement, and storage of materials, parts, and finished inventory (and the related information flow) through the organization and its marketing channels in a cost-effective manner. It is concerned with delivering the right product to the right customer in the right place at the right time. The logistics function is composed of inventory management, facilities management, communications, unitization, transportation, materials management, and production scheduling. The relationship between logistics and information systems is clear. Systems such as Electronic Data Interchange (EDI), Point of Sale (POS) systems, and Just in Time (JIT) inventory management systems are important elements in the management of product development and delivery. With improved access to market demand figures, logisticians can decrease inventory sizes and better serve customer demand. However, without accurate, timely information, little, if any, of this would be feasible in today's global markets. Information systems specialists can learn from logisticians. In a manner similar to logistics management, information logistics is concerned with the delivery of the right data, to the right customer, at the right time. As such, information systems are integral components of the information logistics system, charged with providing customers with accurate, timely, cost-effective, and useful information. Information logistics is a management style composed of elements similar to those associated with the traditional logistics activity: inventory management (data resource management), facilities management (distributed, centralized and decentralized information systems), communications (participative design and joint application development methodologies), unitization (input/output system design, i.e., packaging or formatting of the information), and transportation (voice, data, image, and video communication systems).

  1. EQPlanar: a maximum-likelihood method for accurate organ activity estimation from whole body planar projections

    International Nuclear Information System (INIS)

    Song, N; Frey, E C; He, B; Wahl, R L

    2011-01-01

    Optimizing targeted radionuclide therapy requires patient-specific estimation of organ doses. The organ doses are estimated from quantitative nuclear medicine imaging studies, many of which involve planar whole body scans. We have previously developed the quantitative planar (QPlanar) processing method and demonstrated its ability to provide more accurate activity estimates than conventional geometric-mean-based planar (CPlanar) processing methods using physical phantom and simulation studies. The QPlanar method uses the maximum likelihood-expectation maximization algorithm, 3D organ volumes of interest (VOIs), and rigorous models of physical image-degrading factors to estimate organ activities. However, the QPlanar method requires alignment between the 3D organ VOIs and the 2D planar projections and assumes uniform activity distribution in each VOI. This makes application to patients challenging. As a result, in this paper we propose an extended QPlanar (EQPlanar) method that provides independent-organ rigid registration and includes multiple background regions. We have validated this method using both Monte Carlo simulation and patient data. In the simulation study, we evaluated the precision and accuracy of the method in comparison to the original QPlanar method. For the patient studies, we compared organ activity estimates at 24 h after injection with those from conventional geometric mean-based planar quantification using a 24 h post-injection quantitative SPECT reconstruction as the gold standard. We also compared the goodness of fit of the measured and estimated projections obtained from the EQPlanar method to those from the original method at four other time points where gold standard data were not available. In the simulation study, more accurate activity estimates were provided by the EQPlanar method for all the organs at all the time points compared with the QPlanar method. Based on the patient data, we concluded that the EQPlanar method provided a
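
    For contrast with the model-based EQPlanar approach, here is a minimal sketch of the conventional conjugate-view (geometric mean) planar estimate it is compared against. The attenuation coefficient, patient thickness, counts, and calibration factor are all invented for illustration.

```python
import numpy as np

def geometric_mean_activity(ant, post, mu, thickness, cal):
    # Conventional conjugate-view estimate: geometric mean of opposed-view
    # counts, corrected for attenuation through the patient and divided
    # by a system calibration factor (counts per MBq).
    return np.sqrt(ant * post) * np.exp(mu * thickness / 2.0) / cal

# Invented numbers: anterior/posterior organ counts, mu = 0.11 /cm
# (assumed effective attenuation), 22 cm thickness, 5e3 counts/MBq.
print(f"{geometric_mean_activity(1.2e5, 0.9e5, 0.11, 22.0, 5.0e3):.1f} MBq")
```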

  2. HOW TO CALCULATE INFORMATION VALUE FOR EFFECTIVE SECURITY RISK ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Mario Sajko

    2006-12-01

    The actual problem in information security (infosec) risk assessment is determining the value of information property or assets. This is particularly manifested in the use of quantitative methodologies, in which it is necessary to state the information value in quantitative terms. The aim of this paper is to describe possibilities for evaluating business information values and the criteria needed for determining the importance of information. For this purpose, the dimensions of information value are determined, and the ways used to present the importance of information content are studied. There are two basic approaches that can be used in evaluation: qualitative and quantitative. Often they are combined to determine forms of information content. The proposed criterion is a three-dimensional model, which combines existing experience (i.e. possible solutions for information value assessment) with our own criteria. An attempt to structure information value in a business environment is made as well.

  3. The Quantitative Reasoning for College Science (QuaRCS) Assessment: Emerging Themes from 5 Years of Data

    Science.gov (United States)

    Follette, Katherine; Dokter, Erin; Buxner, Sanlyn

    2018-01-01

    The Quantitative Reasoning for College Science (QuaRCS) Assessment is a validated assessment instrument that was designed to measure changes in students' quantitative reasoning skills, attitudes toward mathematics, and ability to accurately assess their own quantitative abilities. It has been administered to more than 5,000 students at a variety of institutions at the start and end of a semester of general education college science instruction. I will begin by briefly summarizing our published work surrounding validation of the instrument and identification of underlying attitudinal factors (composite variables identified via factor analysis) that predict 50% of the variation in students' scores on the assessment. I will then discuss more recent unpublished work, including: (1) Development and validation of an abbreviated version of the assessment (The QuaRCS Light), which results in marked improvements in students' ability to maintain a high effort level throughout the assessment and has broad implications for quantitative reasoning assessments in general, and (2) Our efforts to revise the attitudinal portion of the assessment to better assess math anxiety level, another key factor in student performance on numerical assessments.

  4. Quantitative quenching evaluation and direct intracellular metabolite analysis in Penicillium chrysogenum.

    Science.gov (United States)

    Meinert, Sabine; Rapp, Sina; Schmitz, Katja; Noack, Stephan; Kornfeld, Georg; Hardiman, Timo

    2013-07-01

    Sustained progress in metabolic engineering methodologies has stimulated new efforts toward optimizing fungal production strains, for example through metabolite analysis of industrial-scale Penicillium chrysogenum processes. Accurate intracellular metabolite quantification requires sampling procedures that rapidly stop metabolism (quenching) and avoid metabolite loss via the cell membrane (leakage). When sampling protocols are validated, the quenching efficiency is generally not quantitatively assessed. For fungal metabolomics, quantitative biomass separation using centrifugation is a further challenge. In this study, P. chrysogenum intracellular metabolites were quantified directly from biomass extracts using automated sampling and fast filtration. A master/slave bioreactor concept was applied to provide industrial production conditions. Metabolic activity during sampling was monitored by 13C tracing. Enzyme activities were efficiently stopped and metabolite leakage was absent. This work provides a reliable method for P. chrysogenum metabolomics and will be an essential base for metabolic engineering of industrial processes. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    Science.gov (United States)

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A calcium oxalate monohydrate and dihydrate stone mixture was most commonly encountered, in 35 patients (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium phosphate hexahydrate and xanthine stones. Fourier transform infrared spectroscopy provides an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities, which may prevent and/or delay stone recurrence.

  6. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  7. Optimization of quantitative waste volume determination technique for hanford waste tank closure

    International Nuclear Information System (INIS)

    Monts, David L.; Jang, Ping-Rey; Long, Zhiling; Okhuysen, Walter P.; Norton, Olin P.; Gresham, Lawrence L.; Su, Yi; Lindner, Jeffrey S.

    2011-01-01

    The Hanford Site is currently engaged in an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the uncertainty with which that volume is known. The Institute for Clean Energy Technology (ICET) at Mississippi State University is currently developing a quantitative in-tank imaging system based on Fourier transform profilometry (FTP). FTP is a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP can determine the height (depth) distribution, and hence the volume distribution, of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. FTP therefore has the potential to be used for quantitative determination of residual wastes within Hanford waste tanks. In this paper, efforts to characterize the accuracy and precision of quantitative volume determination using FTP, and the use of these results to optimize the FTP system for deployment within Hanford waste tanks, are described. (author)
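
    As a rough illustration of how FTP recovers height from fringe deformation, the following minimal Python sketch implements a Takeda-style 1-D phase extraction (FFT, band-pass around the fringe carrier, inverse FFT, phase unwrapping). It is not ICET's implementation; the carrier bin, window width, and the linear phase-to-height factor are simplifying assumptions of the basic method.

    ```python
    import numpy as np

    def ftp_height_profile(deformed, reference, carrier_bin, half_width, k_height):
        """Recover a 1-D height profile by Fourier transform profilometry:
        isolate the +1 fringe order in the spectrum, take its phase, and
        convert the object-minus-reference phase difference to height.
        Assumes carrier_bin > half_width so the band stays in the spectrum."""
        def fundamental_phase(signal):
            spec = np.fft.fft(signal)
            window = np.zeros_like(spec)
            lo, hi = carrier_bin - half_width, carrier_bin + half_width + 1
            window[lo:hi] = spec[lo:hi]        # keep only the carrier band (+1 order)
            return np.unwrap(np.angle(np.fft.ifft(window)))

        dphi = fundamental_phase(deformed) - fundamental_phase(reference)
        return k_height * dphi                 # height; sum over pixels for volume
    ```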

  8. Multifunctional skin-like electronics for quantitative, clinical monitoring of cutaneous wound healing.

    Science.gov (United States)

    Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R Chad; Bonifas, Andrew P; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A; Huang, Yonggang; West, Dennis P; Paller, Amy S; Alam, Murad; Yeo, Woon-Hong; Rogers, John A

    2014-10-01

    Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of "epidermal" electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Self-Expression on Social Media: Do Tweets Present Accurate and Positive Portraits of Impulsivity, Self-Esteem, and Attachment Style?

    Science.gov (United States)

    Orehek, Edward; Human, Lauren J

    2017-01-01

    Self-expression values are at an all-time high, and people are increasingly relying upon social media platforms to express themselves positively and accurately. We examined whether self-expression on the social media platform Twitter elicits positive and accurate social perceptions. Eleven perceivers rated 128 individuals (targets; total dyadic impressions = 1,408) on their impulsivity, self-esteem, and attachment style, based solely on the information provided in targets' 10 most recent tweets. Targets were on average perceived normatively and with distinctive self-other agreement, indicating both positive and accurate social perceptions. There were also individual differences in how positively and accurately targets were perceived, which exploratory analyses indicated may be partially driven by differential word usage, such as the use of positive emotion words and self- versus other-focus. This study demonstrates that self-expression on social media can elicit both positive and accurate perceptions and begins to shed light on how to curate such perceptions.

  10. 30 CFR 778.9 - Certifying and updating existing permit application information.

    Science.gov (United States)

    2010-07-01

    ... you have previously applied for a permit and the required information is already in AVS, then you may... information already in AVS is accurate and complete may certify to us by swearing or affirming, under oath and in writing, that the relevant information in AVS is accurate, complete, and up to date. (2) Part of...

  11. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    Science.gov (United States)

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is key to controlling hookworm re-infections in endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, the vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than those estimated by the other two methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  12. Calibration strategy for semi-quantitative direct gas analysis using inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Gerdes, Kirk; Carter, Kimberly E.

    2011-01-01

    A process is described by which an ICP-MS equipped with an Octopole Reaction System (ORS) is calibrated using liquid phase standards to facilitate direct analysis of gas phase samples. The instrument response to liquid phase standards is analyzed to produce empirical factors relating ion generation and transmission efficiencies to standard operating parameters. Empirical factors generated for liquid phase samples are then used to produce semi-quantitative analysis of both mixed liquid/gas samples and pure gas samples. The method developed is similar to the semi-quantitative analysis algorithms in the commercial software, which have here been expanded to include gas phase elements such as Xe and Kr. Equations for prediction of relative ionization efficiencies and isotopic transmission are developed for several combinations of plasma operating conditions, which allows adjustment of limited parameters between liquid and gas injection modes. In particular, the plasma temperature and electron density are calculated from comparison of experimental results to the predictions of the Saha equation. Comparisons between operating configurations are made to determine the robustness of the analysis to plasma conditions and instrument operating parameters. Using the methods described in this research, the elemental concentrations in a liquid standard containing 45 analytes and treated as an unknown sample were quantified accurately to within ±50% for most elements using 133Cs as a single internal reference. The method is used to predict liquid phase mercury within 12% of the actual concentration and gas phase mercury within 28% of the actual concentration. The results verify that the calibration method facilitates accurate semi-quantitative, gas phase analysis of metal species with sufficient sensitivity to quantify metal concentrations lower than 1 ppb for many metallic analytes.
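
    The Saha equation mentioned above links plasma temperature and electron density to the ionization ratio of each element. The following is a minimal sketch of that calculation, not the authors' parameterization; the temperature, electron density, and statistical-weight ratio used in the example are illustrative placeholders.

    ```python
    import numpy as np

    K_B = 1.380649e-23      # Boltzmann constant, J/K
    M_E = 9.1093837e-31     # electron mass, kg
    H   = 6.62607015e-34    # Planck constant, J*s
    EV  = 1.602176634e-19   # 1 eV in J

    def saha_ion_ratio(temp_k, n_e, e_ion_ev, g_ratio=1.0):
        """Ratio n_ion/n_neutral from the Saha equation for a given plasma
        temperature (K), electron density (m^-3), and first ionization
        energy (eV). g_ratio stands in for 2*g_ion/g_neutral."""
        thermal = (2.0 * np.pi * M_E * K_B * temp_k / H**2) ** 1.5
        return g_ratio * thermal / n_e * np.exp(-e_ion_ev * EV / (K_B * temp_k))

    # Illustrative relative ionization in an Ar ICP (hypothetical T and n_e):
    ratio_cs = saha_ion_ratio(7500.0, 1e21, 3.89)   # Cs: easily ionized
    ratio_hg = saha_ion_ratio(7500.0, 1e21, 10.44)  # Hg: poorly ionized
    ```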

  13. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    Science.gov (United States)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolic regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired at 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in mouse primary motor cortex, and find significant variations in cellular density distribution across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
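
    Steps i)-iv) of such a pipeline can be prototyped in a few lines with standard image-processing libraries. A minimal sketch, assuming an Otsu global threshold and a simple size filter in place of the authors' full contour-segmentation stage:

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage import filters, measure

    def detect_cell_centroids(volume, min_voxels=30):
        """Steps i-iii on a 3-D image stack: denoise (Gaussian), binarize
        (Otsu threshold), label connected components, extract centroids."""
        smoothed = ndi.gaussian_filter(volume.astype(float), sigma=1.0)
        binary = smoothed > filters.threshold_otsu(smoothed)
        labels = measure.label(binary)
        return np.array([r.centroid for r in measure.regionprops(labels)
                         if r.area >= min_voxels])

    def laminar_density(centroids, depth_axis=0, n_bins=20):
        """Step iv: cell counts binned along the pia-to-callosum axis."""
        counts, edges = np.histogram(centroids[:, depth_axis], bins=n_bins)
        return counts, edges
    ```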

  14. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    Science.gov (United States)

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
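
    The two model-free measures found to perform well, the lactate time-to-peak and the lactate-to-pyruvate AUC ratio, are straightforward to compute. A minimal sketch, assuming time-resolved pyruvate and lactate signal curves have already been extracted from the spectroscopic images:

    ```python
    import numpy as np

    def lactate_metrics(t, pyruvate, lactate):
        """Model-free summaries of the [1-13C]pyruvate -> lactate exchange:
        time of the lactate peak and the lactate/pyruvate AUC ratio."""
        ttp = t[np.argmax(lactate)]
        auc_ratio = np.trapz(lactate, t) / np.trapz(pyruvate, t)
        return ttp, auc_ratio
    ```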

  15. Quantitative characterization of viscoelastic behavior in tissue-mimicking phantoms and ex vivo animal tissues.

    Directory of Open Access Journals (Sweden)

    Ashkan Maccabi

    Viscoelasticity of soft tissue is often related to pathology, and therefore, has become an important diagnostic indicator in the clinical assessment of suspect tissue. Surgeons, particularly within head and neck subsites, typically use palpation techniques for intra-operative tumor detection. This detection method, however, is highly subjective and often fails to detect small or deep abnormalities. Vibroacoustography (VA) and similar methods have previously been used to distinguish tissue with high-contrast, but a firm understanding of the main contrast mechanism has yet to be verified. The contributions of tissue mechanical properties in VA images have been difficult to verify given the limited literature on viscoelastic properties of various normal and diseased tissue. This paper aims to investigate viscoelasticity theory and present a detailed description of viscoelastic experimental results obtained in tissue-mimicking phantoms (TMPs) and ex vivo tissues to verify the main contrast mechanism in VA and similar imaging modalities. A spherical-tip micro-indentation technique was employed with the Hertzian model to acquire absolute, quantitative, point measurements of the elastic modulus (E), long term shear modulus (η), and time constant (τ) in homogeneous TMPs and ex vivo tissue in rat liver and porcine liver and gallbladder. Viscoelastic differences observed between porcine liver and gallbladder tissue suggest that imaging modalities which utilize the mechanical properties of tissue as a primary contrast mechanism can potentially be used to quantitatively differentiate between proximate organs in a clinical setting. These results may facilitate more accurate tissue modeling and add information not currently available to the field of systems characterization and biomedical research.

  16. Quantitative characterization of viscoelastic behavior in tissue-mimicking phantoms and ex vivo animal tissues.

    Science.gov (United States)

    Maccabi, Ashkan; Shin, Andrew; Namiri, Nikan K; Bajwa, Neha; St John, Maie; Taylor, Zachary D; Grundfest, Warren; Saddik, George N

    2018-01-01

    Viscoelasticity of soft tissue is often related to pathology, and therefore, has become an important diagnostic indicator in the clinical assessment of suspect tissue. Surgeons, particularly within head and neck subsites, typically use palpation techniques for intra-operative tumor detection. This detection method, however, is highly subjective and often fails to detect small or deep abnormalities. Vibroacoustography (VA) and similar methods have previously been used to distinguish tissue with high-contrast, but a firm understanding of the main contrast mechanism has yet to be verified. The contributions of tissue mechanical properties in VA images have been difficult to verify given the limited literature on viscoelastic properties of various normal and diseased tissue. This paper aims to investigate viscoelasticity theory and present a detailed description of viscoelastic experimental results obtained in tissue-mimicking phantoms (TMPs) and ex vivo tissues to verify the main contrast mechanism in VA and similar imaging modalities. A spherical-tip micro-indentation technique was employed with the Hertzian model to acquire absolute, quantitative, point measurements of the elastic modulus (E), long term shear modulus (η), and time constant (τ) in homogeneous TMPs and ex vivo tissue in rat liver and porcine liver and gallbladder. Viscoelastic differences observed between porcine liver and gallbladder tissue suggest that imaging modalities which utilize the mechanical properties of tissue as a primary contrast mechanism can potentially be used to quantitatively differentiate between proximate organs in a clinical setting. These results may facilitate more accurate tissue modeling and add information not currently available to the field of systems characterization and biomedical research.
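
    The Hertzian analysis of spherical-tip indentation data can be sketched as follows. This is an illustrative fit, not the authors' protocol; the tip radius default and the near-incompressibility assumption (ν = 0.5, typical for soft tissue) are stated assumptions.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hertz_force(delta, e_eff, radius):
        """Hertzian force for a rigid spherical tip on an elastic half-space:
        F = (4/3) * E_star * sqrt(R) * delta^1.5, E_star = E / (1 - nu^2)."""
        return (4.0 / 3.0) * e_eff * np.sqrt(radius) * delta**1.5

    def fit_elastic_modulus(delta, force, radius=5e-4, poisson=0.5):
        """Estimate E (Pa) from indentation depth (m) and force (N) data;
        the model is linear in E_star, so the fit is well conditioned."""
        popt, _ = curve_fit(lambda d, e: hertz_force(d, e, radius), delta, force)
        return popt[0] * (1.0 - poisson**2)
    ```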

  17. Comparison of simplified quantitative analyses of FDG uptake

    International Nuclear Information System (INIS)

    Graham, M.M.; Peterson, L.M.; Hayward, R.M.

    2000-01-01

    Quantitative analysis of [18F]fluorodeoxyglucose (FDG) uptake is important in oncologic positron emission tomography (PET) studies to be able to set an objective threshold in determining if a tissue is malignant or benign, in assessing response to therapy, and in attempting to predict the aggressiveness of an individual tumor. The most common method used today for simple, clinical quantitation is standardized uptake value (SUV). SUV is normalized for body weight. Other potential normalization factors are lean body mass (LBM) or body surface area (BSA). More complex quantitation schemes include simplified kinetic analysis (SKA), Patlak graphical analysis (PGA), and parameter optimization of the complete kinetic model to determine FDG metabolic rate (FDGMR). These various methods were compared in a group of 40 patients with colon cancer metastatic to the liver. The methods were assessed by (1) correlation with FDGMR, (2) ability to predict survival using Kaplan-Meier plots, and (3) area under receiver operating characteristic (ROC) curves for distinguishing between tumor and normal liver. The best normalization scheme appears to be BSA with minor differences depending on the specific formula used to calculate BSA. Overall, PGA is the best predictor of outcome and best discriminator between normal tissue and tumor. SKA is almost as good. In conventional PET imaging it is worthwhile to normalize SUV using BSA. If a single blood sample is available, it is possible to use the SKA method, which is distinctly better. If more than one image is available, along with at least one blood sample, PGA is feasible and should produce the most accurate results.
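
    A minimal sketch of three of the quantities compared above: body-weight SUV, a BSA normalization (the Du Bois formula is one common choice; the abstract does not name which formulas were compared), and the Patlak slope. It assumes decay-corrected tissue and plasma time-activity curves are available.

    ```python
    import numpy as np
    from scipy.integrate import cumulative_trapezoid

    def suv_bw(conc_kbq_ml, dose_mbq, weight_kg):
        """SUV normalized by body weight (tissue density taken as 1 g/mL)."""
        return conc_kbq_ml / (dose_mbq * 1000.0 / (weight_kg * 1000.0))

    def bsa_dubois(weight_kg, height_cm):
        """Du Bois body surface area (m^2), one common BSA formula."""
        return 0.007184 * weight_kg**0.425 * height_cm**0.725

    def patlak_slope(t, c_tissue, c_plasma):
        """Patlak graphical analysis: C_t/C_p is linear in the 'normalized
        time' integral(C_p dt)/C_p once equilibration is reached; the slope
        estimates Ki. In practice only the late, linear points are fitted."""
        x = cumulative_trapezoid(c_plasma, t, initial=0.0) / c_plasma
        y = c_tissue / c_plasma
        ki, _intercept = np.polyfit(x, y, 1)
        return ki
    ```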

  18. Vertebral body trabecular density at the thoracolumbar junction using quantitative computed tomography

    International Nuclear Information System (INIS)

    Singer, K.P.; Breidahl, P.D.; Royal Perth Hospital

    1990-01-01

    Quantitative computed tomography was used to assess vertebral trabecular density in 26 post-mortem spines from individuals aged between 14 and 80 years. All vertebrae from T10 to L1 were scanned transversely near the mid-vertebral level with calculations of trabecular density in HUs averaged and referenced to a mineral equivalent phantom. An age-related decline in trabecular density was recorded (r=0.55, p<0.0001). Density measures from the anterior aspect of the vertebral body were significantly greater than from postero-lateral regions. From T10 to L1, there was a significant decrease in trabecular density, whereas density measures multiplied by vertebral body cross-sectional area were constant. Predictions of vertebral compressive strength using quantitative computed tomography may become more accurate by increasing the sampling area per scan and including vertebral body cross-sectional area as part of the radiologic assessment. (orig.)

  19. Microfluorometric mithramycin assay for quantitating the effects of immunotoxicants on lymphocyte activation

    International Nuclear Information System (INIS)

    Quattrone, A.J.; Ranney, D.F.

    1981-01-01

    A semiautomated, microfluorometric assay has been developed for the detection of toxicant-induced changes in lymphocyte DNA content at standard intervals after mitogen activation. DNA is quantitated by solubilizing the cells and determining the fluorescence enhancement that results from formation of the highly specific mithramycin:DNA adduct. The limit of detection is 0.21 μg (30,000 resting cell equivalents) per microtiter well. Correlation with the less sensitive, nonautomatable diphenylamine DNA assay gives a correlation coefficient of r = 0.91. Prototype substances representative of true immunotoxicants (prostaglandin E2) and common interfering substances (thymidine at 10(-4) M) have been tested. The latter substance produces false positive results in the standard [3H]thymidine assay. The mithramycin assay does not inappropriately detect this interfering substance. It has the characteristics of a highly specific, accurate technique for screening and quantitating immunotoxic drugs, agents, and mediators in patient sera and other complex biological fluids.

  20. Quantitative live imaging of endogenous DNA replication in mammalian cells.

    Directory of Open Access Journals (Sweden)

    Andrew Burgess

    Historically, the analysis of DNA replication in mammalian tissue culture cells has been limited to static time points, and the use of nucleoside analogues to pulse-label replicating DNA. Here we characterize for the first time a novel Chromobody cell line that specifically labels endogenous PCNA. By combining this with high-resolution confocal time-lapse microscopy, and with a simplified analysis workflow, we were able to produce highly detailed, reproducible, quantitative 4D data on endogenous DNA replication. The increased resolution allowed accurate classification and segregation of S phase into early-, mid-, and late-stages based on the unique subcellular localization of endogenous PCNA. Surprisingly, this localization was slightly but significantly different from previous studies, which utilized over-expressed GFP tagged forms of PCNA. Finally, low dose exposure to Hydroxyurea caused the loss of mid- and late-S phase localization patterns of endogenous PCNA, despite cells eventually completing S phase. Taken together, these results indicate that this simplified method can be used to accurately identify and quantify DNA replication under multiple and various experimental conditions.

  1. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

    In a radiograph, the value of each pixel is related to the material thickness crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, the quantitative thickness map of the object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process generates several biases which prevent one from obtaining this quantitative information: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In the first section, we propose a complete model of the radiographic image formation process taking these biases into account. In the second section, we present an inversion scheme of this model for a single-material object, which enables the thickness map of the object crossed by the x-rays to be obtained. (author)
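
    Under the stated assumptions (locally linear detector response, single-material object), the offset/gain correction followed by a Beer-Lambert inversion can be sketched as follows; the dark/flat reference images and the attenuation coefficient mu are assumed inputs, and the full bias model of the paper is not reproduced here.

    ```python
    import numpy as np

    def thickness_map(image, dark, flat, mu):
        """Thickness from a radiograph of a single-material object:
        flat-field (offset/gain) correction, then Beer-Lambert inversion
        t = -ln(I/I0) / mu."""
        corrected = (image - dark) / np.maximum(flat - dark, 1e-12)
        corrected = np.clip(corrected, 1e-12, 1.0)   # guard the logarithm
        return -np.log(corrected) / mu
    ```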

  2. Quantitative Digital Tomosynthesis Mammography for Improved Breast Cancer Detection and Diagnosis

    National Research Council Canada - National Science Library

    Zhang, Yiheng

    2008-01-01

    .... When fully developed, the DTM can provide radiologists improved quantitative, three-dimensional volumetric information of the breast tissue, and assist in breast cancer detection and diagnosis...

  3. Identification of Case Content with Quantitative Network Analysis

    DEFF Research Database (Denmark)

    Christensen, Martin Lolle; Olsen, Henrik Palmer; Tarissan, Fabian

    2016-01-01

    the relevant articles. In order to enhance information retrieval about case content, without relying on manual labor and subjective judgment, we propose in this paper a quantitative method that gives a better indication of case content in terms of which articles a given case is more closely associated with...

  4. Information Quality of a Nursing Information System depends on the nurses: A combined quantitative and qualitative evaluation

    NARCIS (Netherlands)

    Michel-Verkerke, M.B.

    2012-01-01

    Purpose Providing access to patient information is the key factor in nurses’ adoption of a Nursing Information System (NIS). In this study the requirements for information quality and the perceived quality of information are investigated. A teaching hospital in the Netherlands has developed a NIS as

  5. Social contagion of correct and incorrect information in memory.

    Science.gov (United States)

    Rush, Ryan A; Clark, Steven E

    2014-01-01

    The present study examines how discussion between individuals regarding a shared memory affects their subsequent individual memory reports. In three experiments, pairs of participants recalled items from photographs of common household scenes, discussed their recall with each other, and then recalled the items again individually. Results showed that after the discussion, individuals recalled more correct items and more incorrect items, with very small non-significant increases, or no change, in recall accuracy. The information people were exposed to during the discussion was generally accurate, although not as accurate as individuals' initial recall. Individuals incorporated correct exposure items into their subsequent recall at a higher rate than incorrect exposure items. Participants who were initially more accurate became less accurate, and initially less-accurate participants became more accurate as a result of their discussion. Comparisons to no-discussion control groups suggest that the effects were not simply the product of repeated recall opportunities or self-cueing, but rather reflect the transmission of information between individuals.

  6. A quantitative framework for estimating water resources in India

    Digital Repository Service at National Institute of Oceanography (India)

    Shankar, D.; Kotamraju, V.; Shetye, S.R

    of information on the variables associated with hydrology, and second, the absence of an easily accessible quantitative framework to put these variables in perspective. In this paper, we discuss a framework that has been assembled to address both these issues...

  7. Quantitative nanometer-scale mapping of dielectric tunability

    Energy Technology Data Exchange (ETDEWEB)

    Tselev, Alexander [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Klein, Andreas [Technische Univ. Darmstadt (Germany); Gassmann, Juergen [Technische Univ. Darmstadt (Germany); Jesse, Stephen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Li, Qian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kalinin, Sergei V. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wisinger, Nina Balke [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-21

    Two scanning probe microscopy techniques—near-field scanning microwave microscopy (SMM) and piezoresponse force microscopy (PFM)—are used to characterize and image tunability in a thin (Ba,Sr)TiO3 film with nanometer-scale spatial resolution. While SMM allows direct probing of tunability by measurement of the change in the dielectric constant, in PFM tunability can be extracted via the electrostrictive response. The near-field microwave imaging and PFM provide similar information about dielectric tunability, with PFM capable of delivering quantitative information on tunability at a higher spatial resolution, close to 15 nm. This is the first time that information about dielectric tunability has been available on such length scales.

  8. Quantitative multimodality imaging in cancer research and therapy.

    Science.gov (United States)

    Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad

    2014-11-01

    Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.

  9. Humanitarian information management and systems

    NARCIS (Netherlands)

    van de Walle, B.A.; van den Eede, G.G.P.; Muhren, W.J.; Loffler, J.; Klann, M.

    2009-01-01

    In times of major disasters such as hurricane Katrina or the Sichuan earthquake, the need for accurate and timely information is as crucial as is rapid and coherent coordination among the responding humanitarian community. Effective humanitarian information systems that provide timely access to

  10. Information Literacy Follow-Through: Enhancing Preservice Teachers' Information Evaluation Skills through Formative Assessment

    Science.gov (United States)

    Seely, Sara Robertson; Fry, Sara Winstead; Ruppel, Margie

    2011-01-01

    An investigation into preservice teachers' information evaluation skills at a large university suggests that formative assessment can improve student achievement. Preservice teachers were asked to apply information evaluation skills in the areas of currency, relevancy, authority, accuracy, and purpose. The study used quantitative methods to assess…

  11. Quantitative assessment of cancer cell morphology and motility using telecentric digital holographic microscopy and machine learning.

    Science.gov (United States)

    Lam, Van K; Nguyen, Thanh C; Chung, Byung M; Nehmetallah, George; Raub, Christopher B

    2018-03-01

    The noninvasive, fast acquisition of quantitative phase maps using digital holographic microscopy (DHM) allows tracking of rapid cellular motility on transparent substrates. On two-dimensional surfaces in vitro, MDA-MB-231 cancer cells assume several morphologies related to the mode of migration and substrate stiffness, relevant to mechanisms of cancer invasiveness in vivo. The quantitative phase information from DHM may accurately classify adhesive cancer cell subpopulations with clinical relevance. To test this, cells from the invasive breast cancer MDA-MB-231 cell line were cultured on glass, tissue-culture treated polystyrene, and collagen hydrogels, and imaged with DHM followed by epifluorescence microscopy after staining F-actin and nuclei. Trends in cell phase parameters were tracked on the different substrates, during cell division, and during matrix adhesion, relating them to F-actin features. Support vector machine learning algorithms were trained and tested using parameters from holographic phase reconstructions and cell geometric features from conventional phase images, and used to distinguish between elongated and rounded cell morphologies. DHM was able to distinguish between elongated and rounded morphologies of MDA-MB-231 cells with 94% accuracy, compared to 83% accuracy using cell geometric features from conventional brightfield microscopy. This finding indicates the potential of DHM to detect and monitor cancer cell morphologies relevant to cell cycle phase status, substrate adhesion, and motility. © 2017 International Society for Advancement of Cytometry.
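
    A minimal sketch of the classification step, assuming phase-derived feature vectors and morphology labels have already been extracted from the DHM reconstructions; the kernel and regularization settings are illustrative, not those reported in the study.

    ```python
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def classify_morphology(phase_features, labels):
        """Train and cross-validate an SVM that separates elongated from
        rounded cells using quantitative-phase features (e.g. mean phase,
        phase volume, aspect ratio)."""
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        scores = cross_val_score(clf, phase_features, labels, cv=5)
        return clf.fit(phase_features, labels), scores.mean()
    ```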

  12. Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.

    Science.gov (United States)

    Well, Caroline; Frank, Oliver; Hofmann, Thomas

    2013-11-27

    Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana Bertoni (Bertoni), have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed enabling the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A by using the corresponding deuterated 16,17-dihydrosteviol glycosides as suitable internal standards. This quantitation not only enables the analysis of the individual steviol glycosides in foods and beverages but also can support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter tasting Stevia extracts.
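
    The SIDA quantitation itself reduces to a peak-area ratio against the co-eluting labelled internal standard. A minimal sketch, with the response factor (determined from calibration mixtures of analyte and labelled standard) as an assumed input:

    ```python
    def sida_concentration(area_analyte, area_istd, conc_istd, response_factor=1.0):
        """Stable isotope dilution analysis: the analyte concentration follows
        from its peak-area ratio to the spiked, isotopically labelled internal
        standard, scaled by a calibration-derived response factor."""
        return (area_analyte / area_istd) * conc_istd * response_factor
    ```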

  13. Quantitative profiling of serum samples using TMT protein labelling, fractionation and LC-MS/MS.

    Science.gov (United States)

    Sinclair, John; Timms, John F

    2011-08-01

    Blood-borne biomarkers are urgently required for the early detection, accurate diagnosis and prognosis of disease. Additionally, improved methods of profiling serum and plasma proteins for biomarker discovery efforts are needed. Herein, we report a quantitative method based on amino-group labelling of serum proteins (rather than peptides) with isobaric tandem mass tags (TMT) and incorporating immune-based depletion, gel-based and strong anion exchange separation of proteins prior to differential endoproteinase treatment and liquid chromatography tandem mass spectrometry. We report a generally higher level of quantitative coverage of the serum proteome compared to other peptide-based isobaric tagging approaches and show the potential of the method by applying it to a set of unique samples that pre-date the diagnosis of pancreatic cancer. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In the case of a hypothetical severe accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the characteristics of the basic O-U-Zr system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated in a (U+Fe+Y+UO2+ZrO2) mix, with a total mass of 2253.7 grams. Several successive heating steps at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a matrix based on a heavy element such as uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  15. Accurate donor electron wave functions from a multivalley effective mass theory.

    Science.gov (United States)

    Pendo, Luke; Hu, Xuedong

    Multivalley effective mass (MEM) theories combine physical intuition with minimal computational cost, but they tend to be insensitive to variations in the wavefunction. However, recent papers suggest that full Bloch functions and suitable central cell donor potential corrections are essential to replicating qualitative and quantitative features of the wavefunction. In this talk, we consider a variational MEM method that can accurately predict both the spectrum and wavefunction of isolated phosphorus donors. As per Gamble et al., we employ a truncated series representation of the Bloch function with a tetrahedrally symmetric central cell correction. We use a dynamic dielectric constant, a feature commonly seen in tight-binding methods. Uniquely, we use a freely extensible basis of either all Slater- or all Gaussian-type functions. With a large basis able to capture the influence of higher-energy eigenstates, this method is well positioned to consider the influence of external perturbations, such as an electric field or applied strain, on the charge density. This work is supported by the US Army Research Office (W911NF1210609).

  16. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    pages of fault trees which would have to be redrawn from the logical relations between the components in the DPPS. On the other hand, the RGGG model for the DPPS can be drawn on only 1 page, and the structure of the model is almost identical to the actual structure of the DPPS. In addition, the RGGG model visually shows the state of information processed by each component. In this sense, I believe that the RGGG method is more intuitive and easier to use. Quantitative analysis of the fault tree model and the RGGG model shows that the two models produce equivalent results. Currently, an identified disadvantage is the calculation time, since many approximation algorithms have already been developed for fault tree analysis, but not for the RGGG method. As a new method for HRA, I develop a quantitative situation assessment model for human operators, since human performance is mainly affected by situation assessment. In contrast to conventional HRA methods, which are mostly developed from expert opinion, the proposed situation assessment model for human operators is developed on the basis of mathematical theories, Bayesian inference and information theory, with the following two assumptions. 1. Human operators can perform Bayesian inference, even though the results cannot be as accurate as mathematical calculations. 2. In knowledge-driven monitoring, the probability that human operators select an indicator as the next indicator to monitor is proportional to the expected information from that indicator. (The expected information from each indicator can be calculated using information theory, as sketched below.) With an experiment, it is shown that the two assumptions are reasonable. The proposed mathematical model for the situation assessment of human operators is expected to be used as the basis for the development of a quantitative model for the situation assessment of actual human operators. By combining the RGGG method and the mathematical model for the situation assessment of human operators, I
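
    Assumption 2 can be made concrete as an expected-information (mutual information) calculation. The sketch below is an illustrative reconstruction, not the thesis' code; the prior over plant states and the per-indicator likelihood tables are assumed inputs.

    ```python
    import numpy as np

    def entropy(p):
        """Shannon entropy in bits, ignoring zero-probability entries."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def indicator_selection_probs(prior, likelihoods):
        """P(select indicator) proportional to its expected information gain.

        prior: P(state), shape (S,)
        likelihoods: list of P(reading | state) arrays, one per indicator,
                     each of shape (S, R_i)."""
        gains = []
        for lk in likelihoods:
            p_reading = prior @ lk                    # P(reading), shape (R_i,)
            h_post = 0.0
            for r, pr in enumerate(p_reading):
                if pr > 0:
                    post = prior * lk[:, r] / pr      # Bayes update of the state
                    h_post += pr * entropy(post)
            gains.append(entropy(prior) - h_post)     # expected information gain
        gains = np.array(gains)
        return gains / gains.sum()
    ```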

  17. Quantitative Market Research Regarding Funding of District 8 Construction Projects

    Science.gov (United States)

    1995-05-01

    The primary objective of this quantitative research is to provide information for more effective decision making regarding the level of investment in various transportation systems in District 8. This objective was accomplished by establishing ...

  18. Calculation of Quantitative Structure-Activity Relationship Descriptors of Artemisinin Derivatives

    Directory of Open Access Journals (Sweden)

    Jambalsuren Bayarmaa

    2008-06-01

    Quantitative structure-activity relationships are based on the construction of predictive models using a set of known molecules and associated activity values. This accurate methodology, developed with adequate mathematical and computational tools, leads to a faster, cheaper and more comprehensive design of new products, reducing experimental synthesis and testing on animals. Preparation of the QSAR models of artemisinin derivatives was carried out by the genetic function algorithm (GFA) method for 91 molecules. The results show some relationships to the observed antimalarial activities of the artemisinin derivatives. The most statistically significant regression equation obtained from the final GFA relates to two molecular descriptors.

  19. Partial volume correction and image segmentation for accurate measurement of standardized uptake value of grey matter in the brain.

    Science.gov (United States)

    Bural, Gonca; Torigian, Drew; Basu, Sandip; Houseni, Mohamed; Zhuge, Ying; Rubello, Domenico; Udupa, Jayaram; Alavi, Abass

    2015-12-01

    Our aim was to explore a novel quantitative method [based upon MRI-based image segmentation that allows actual calculation of grey matter, white matter and cerebrospinal fluid (CSF) volumes] for overcoming the difficulties associated with conventional techniques for measuring the actual metabolic activity of the grey matter. We included four patients with normal brain MRI and fluorine-18 fluorodeoxyglucose (18F-FDG) PET scans (two women and two men; mean age 46±14 years) in this analysis. The time interval between the two scans was 0-180 days. We calculated the volumes of grey matter, white matter and CSF by using a novel segmentation technique applied to the MRI images. We measured the mean standardized uptake value (SUV) representing the whole metabolic activity of the brain from the 18F-FDG PET images. We also calculated the white matter SUV from the upper transaxial slices (centrum semiovale) of the 18F-FDG PET images. The whole brain volume was calculated by summing the volumes of the white matter, grey matter and CSF. The global cerebral metabolic activity was calculated by multiplying the mean SUV by the total brain volume. The whole brain white matter metabolic activity was calculated by multiplying the mean SUV for the white matter by the white matter volume. The global cerebral metabolic activity reflects only the grey matter and the white matter, since that of the CSF is zero. We subtracted the global white matter metabolic activity from that of the whole brain, yielding the global grey matter metabolism alone. We then divided the grey matter global metabolic activity by the grey matter volume to accurately calculate the SUV for the grey matter alone. The brain volumes ranged between 1546 and 1924 ml. The mean SUV for the total brain was 4.8-7. The total metabolic burden of the brain ranged from 5565 to 9617. The mean SUV for white matter was 2.8-4.1. On the basis of these measurements we generated the grey matter SUV, which ranged from 8.1 to 11.3. The
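
    The subtraction arithmetic described above fits in a few lines. A minimal sketch mirroring the abstract's steps (volumes in ml; CSF contributes zero activity); the volume split in the example call is hypothetical.

    ```python
    def grey_matter_suv(mean_suv_brain, vol_brain_ml, mean_suv_wm,
                        vol_wm_ml, vol_gm_ml):
        """Grey-matter SUV by subtracting the white-matter burden from the
        whole-brain metabolic burden (the CSF contribution is zero)."""
        global_activity = mean_suv_brain * vol_brain_ml
        wm_activity = mean_suv_wm * vol_wm_ml
        return (global_activity - wm_activity) / vol_gm_ml

    # e.g. grey_matter_suv(5.5, 1700, 3.2, 600, 700) -> ~10.6
    # (hypothetical volume split, consistent with the 8.1-11.3 range above)
    ```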

  20. The Readability of Electronic Cigarette Health Information and Advice: A Quantitative Analysis of Web-Based Information.

    Science.gov (United States)

    Park, Albert; Zhu, Shu-Hong; Conway, Mike

    2017-01-06

    The popularity and use of electronic cigarettes (e-cigarettes) have increased across all demographic groups in recent years. However, little is currently known about the readability of health information and advice aimed at the general public regarding the use of e-cigarettes. The objective of our study was to examine the readability of publicly available health information as well as advice on e-cigarettes. We compared information and advice available from US government agencies, nongovernment organizations, English-speaking government agencies outside the United States, and for-profit entities. A systematic search for health information and advice on e-cigarettes was conducted using search engines. We manually verified search results and converted them to plain text for analysis. We then assessed the readability of the collected documents using 4 readability metrics, followed by pairwise comparisons of groups with adjustment for multiple comparisons. A total of 54 documents were collected for this study. All 4 readability metrics indicate that all information and advice on e-cigarette use is written at a level higher than that recommended for the general public by National Institutes of Health (NIH) communication guidelines. However, health information and advice written by for-profit entities, many of which were promoting e-cigarettes, were significantly easier to read. A substantial proportion of potential and current e-cigarette users are likely to have difficulty in fully comprehending Web-based health information regarding e-cigarettes, potentially hindering effective health-seeking behaviors. To comply with NIH communication guidelines, government entities and nongovernment organizations would benefit from improving the readability of e-cigarette information and advice. ©Albert Park, Shu-Hong Zhu, Mike Conway. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.01.2017.
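
    The abstract does not name the 4 metrics used; the sketch below scores a document with four widely used ones via the textstat package, as an illustration of the kind of analysis involved.

    ```python
    import textstat  # pip install textstat

    METRICS = {
        "flesch_reading_ease": textstat.flesch_reading_ease,
        "flesch_kincaid_grade": textstat.flesch_kincaid_grade,
        "gunning_fog": textstat.gunning_fog,
        "smog_index": textstat.smog_index,
    }

    def readability_profile(document_text):
        """Score one plain-text document with four common readability metrics."""
        return {name: fn(document_text) for name, fn in METRICS.items()}
    ```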